32579822389423c7f4120e222aa26652f8507735,onmt/utils/optimizers.py,Optimizer,set_parameters,#Optimizer#Any#,172

Before Change


        self.sparse_params = []
        for k, p in params:
            if p.requires_grad:
                if self.method != "sparseadam" or "embed" not in k:
                    self.params.append(p)
                else:
                    self.sparse_params.append(p)

After Change


                if not param.requires_grad:
                    continue
                # TODO: Find a better way to check for sparse gradients.
                if "embed" in name:
                    sparse.append(param)
                else:
                    dense.append(param)
            self.optimizer = MultipleOptimizer(
                [optim.Adam(dense, lr=self.learning_rate,
                            betas=self.betas, eps=1e-8),
                 optim.SparseAdam(sparse, lr=self.learning_rate,
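The change above partitions trainable parameters by name (embeddings go to SparseAdam, everything else to Adam) and wraps both optimizers in a `MultipleOptimizer` that delegates calls to each. The sketch below illustrates that pattern in a self-contained way; `FakeParam`, `RecordingOptimizer`, and this `MultipleOptimizer` are simplified stand-ins for demonstration, not the actual OpenNMT-py or torch.optim implementations:

```python
# Self-contained sketch of the split-and-wrap pattern from the change
# above. All classes here are hypothetical stubs for illustration.

class FakeParam:
    """Hypothetical stand-in for a torch parameter."""
    def __init__(self, requires_grad=True):
        self.requires_grad = requires_grad


def split_params(named_params):
    """Partition trainable parameters by name, as in the After Change."""
    dense, sparse = [], []
    for name, param in named_params:
        if not param.requires_grad:
            continue
        # Heuristic from the commit: embedding parameters produce
        # sparse gradients, so they are routed to a sparse optimizer.
        if "embed" in name:
            sparse.append(param)
        else:
            dense.append(param)
    return dense, sparse


class RecordingOptimizer:
    """Stub optimizer that records calls, standing in for Adam/SparseAdam."""
    def __init__(self, params):
        self.params = params
        self.calls = []

    def step(self):
        self.calls.append("step")

    def zero_grad(self):
        self.calls.append("zero_grad")


class MultipleOptimizer:
    """Delegates step() and zero_grad() to each wrapped optimizer."""
    def __init__(self, optimizers):
        self.optimizers = optimizers

    def step(self):
        for op in self.optimizers:
            op.step()

    def zero_grad(self):
        for op in self.optimizers:
            op.zero_grad()


named = [("encoder.embeddings.weight", FakeParam()),
         ("decoder.linear.weight", FakeParam()),
         ("frozen.weight", FakeParam(requires_grad=False))]
dense, sparse = split_params(named)
multi = MultipleOptimizer([RecordingOptimizer(dense),
                           RecordingOptimizer(sparse)])
multi.zero_grad()
multi.step()
```

Training code can then treat `multi` like a single optimizer, while the embedding parameters receive sparse-aware updates.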
In pattern: SUPERPATTERN

Frequency: 3

Non-data size: 3

Instances


Project Name: OpenNMT/OpenNMT-py
Commit Name: 32579822389423c7f4120e222aa26652f8507735
Time: 2018-12-18
Author: guillaumekln@users.noreply.github.com
File Name: onmt/utils/optimizers.py
Class Name: Optimizer
Method Name: set_parameters


Project Name: OpenNMT/OpenNMT-py
Commit Name: eae55f722265a6b263a6485af88332b7a65d001c
Time: 2017-06-04
Author: sasha.rush@gmail.com
File Name: preprocess.py
Class Name:
Method Name: extractFeatures


Project Name: keras-team/keras
Commit Name: d4f39c8f53bc10c241046641d03a1fe5411467c3
Time: 2015-07-06
Author: floydsoft@gmail.com
File Name: keras/layers/containers.py
Class Name: Graph
Method Name: add_node