ba164c0dbb3d8171004380956a88431f4e8248ba,onmt/Models.py,Embeddings,make_positional_encodings,#Embeddings#Any#Any#,51

Before Change


        for i in range(dim):
            for j in range(max_len):
                k = float(j) / (10000.0 ** (2.0*i / float(dim)))
                pe[j, 0, i] = math.cos(k) if i % 2 == 1 else math.sin(k)
        return pe

    def load_pretrained_vectors(self, emb_file):
        if emb_file is not None:

After Change



    def make_positional_encodings(self, dim, max_len):
        pe = torch.arange(0, max_len).unsqueeze(1).expand(max_len, dim)
        div_term = 1 / torch.pow(10000, torch.arange(0, dim * 2, 2) / dim)
        pe = pe * div_term.expand_as(pe)
        pe[:, 0::2] = torch.sin(pe[:, 0::2])
        pe[:, 1::2] = torch.cos(pe[:, 1::2])
        return pe.unsqueeze(1)

    def load_pretrained_vectors(self, emb_file):
        if emb_file is not None:
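The commit replaces the per-element Python loops with vectorized tensor operations. For comparison, a minimal standalone sketch of the canonical sinusoidal encoding follows; the function name, tensor shapes, and the usage at the bottom are illustrative assumptions, not code taken from the commit.

import torch

def sinusoidal_encodings(dim, max_len):
    # Positions 0..max_len-1 as a (max_len, 1) column so it broadcasts
    # against the (dim/2,) vector of inverse frequencies.
    position = torch.arange(0, max_len).float().unsqueeze(1)
    inv_freq = 1 / torch.pow(10000, torch.arange(0, dim, 2).float() / dim)
    pe = torch.zeros(max_len, dim)
    pe[:, 0::2] = torch.sin(position * inv_freq)  # even dimensions: sine
    pe[:, 1::2] = torch.cos(position * inv_freq)  # odd dimensions: cosine
    return pe.unsqueeze(1)                        # (max_len, 1, dim)

# Hypothetical usage: add the table to embeddings shaped (seq_len, batch, dim).
emb = torch.zeros(10, 4, 512)
emb = emb + sinusoidal_encodings(512, 10)

Note that the commit's div_term (torch.arange(0, dim * 2, 2) / dim) assigns a separate frequency to every dimension index, whereas the sketch above gives each sine/cosine pair a shared frequency, as in "Attention Is All You Need".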
In pattern: SUPERPATTERN

Frequency: 3

Non-data size: 4

Instances


Project Name: OpenNMT/OpenNMT-py
Commit Name: ba164c0dbb3d8171004380956a88431f4e8248ba
Time: 2017-08-01
Author: bpeters@coli.uni-saarland.de
File Name: onmt/Models.py
Class Name: Embeddings
Method Name: make_positional_encodings


Project Name: keras-team/keras
Commit Name: 7da1523053f2e5f4fa15c87e019b3244c8653a53
Time: 2015-12-09
Author: francois.chollet@gmail.com
File Name: tests/keras/test_normalization.py
Class Name:
Method Name:


Project Name: keras-team/keras
Commit Name: 488bda8fc8e699471263bc0bf5f5574326559894
Time: 2021-02-02
Author: scottzhu@google.com
File Name: keras/layers/preprocessing/index_lookup.py
Class Name: IndexLookup
Method Name: _set_forward_vocabulary