Comments (4)
Here is an update: I found the fasttext_multiling.sh script in the repository.
Since I do not have access to /work/gglavas/data/word_embs/yacle/fasttext/200K/npformat/ft.wiki.${language}.300.vocab,
which is passed as --embedding_vocab, I left --embedding_vocab unset. Similarly, I pointed --embedding_vectors
(which the script sets to /work/gglavas/data/word_embs/yacle/fasttext/200K/npformat/ft.wiki.${language}.300.vectors)
at the German fastText embeddings shared above. The script is then:
similarity_type="cosine"
language="de"
for test_number in 1 2; do
  python weat.py \
    --test_number $test_number \
    --permutation_number 1000000 \
    --output_file ./results/w2v_wiki_${language}_${similarity_type}_${test_number}_cased.res \
    --lower True \
    --use_glove False \
    --is_vec_format True \
    --lang $language \
    --embeddings data/fastTextEmbeddings/wiki.${language}.vec \
    --similarity_type $similarity_type |& tee ./results/w2v_wiki_${language}_${similarity_type}_${test_number}_cased.out
done
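Since the script passes --is_vec_format True, the embeddings are read from fastText's plain-text .vec format (a header line "num_words dim", then one "word v1 ... vdim" line per entry). A minimal, hypothetical reader for that format, for illustration only (this is not XWEAT's actual loader):

```python
import numpy as np

def load_vec(path, max_words=None):
    """Read a fastText .vec text file into a {word: np.ndarray} dict.

    The first line holds "num_words dim"; each following line holds a
    word and its space-separated vector components.
    """
    vectors = {}
    with open(path, encoding="utf-8") as f:
        n_words, dim = map(int, f.readline().split())
        for i, line in enumerate(f):
            if max_words is not None and i >= max_words:
                break
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors
```

The optional max_words cap mirrors the kind of vocabulary truncation mentioned later in this thread.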
With that script I got the following outputs:
$ cat w2v_wiki_de_cosine_2_cased.res
Config: 2 and True and 1000000
Result: (1.1948289903673552, 1.2983299550815979, 0.0)
$ cat w2v_wiki_de_cosine_6_cased.res
Config: 6 and True and 1000000
Result: (0.48375295403351787, 1.5778229695669055, 7.77000777000777e-05)
$ cat w2v_wiki_de_cosine_7_cased.res
Config: 7 and True and 1000000
Result: (0.0071803716755713815, 0.038459829350379, 0.4732711732711733)
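For readers following along: the three numbers in each Result line appear to be the WEAT test statistic, the effect size, and the permutation p-value. A minimal sketch of how such a permutation test can be computed (illustrative only, not the XWEAT implementation):

```python
import numpy as np

def cos(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

def assoc(w, A, B):
    # s(w, A, B): mean cosine to attribute set A minus mean cosine to B
    return np.mean([cos(w, a) for a in A]) - np.mean([cos(w, b) for b in B])

def weat(X, Y, A, B, n_perm=2000, seed=0):
    """Return (test statistic, effect size, permutation p-value) for
    target sets X, Y and attribute sets A, B (lists of vectors)."""
    rng = np.random.default_rng(seed)
    s = lambda W: sum(assoc(w, A, B) for w in W)
    stat = s(X) - s(Y)
    all_s = [assoc(w, A, B) for w in X + Y]
    effect = (np.mean(all_s[:len(X)]) - np.mean(all_s[len(X):])) \
        / np.std(all_s, ddof=1)
    # p-value: fraction of random target re-partitions exceeding the statistic
    pool = X + Y
    count = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(pool))
        Xp = [pool[i] for i in idx[:len(X)]]
        Yp = [pool[i] for i in idx[len(X):]]
        if s(Xp) - s(Yp) > stat:
            count += 1
    return stat, effect, count / n_perm
```

With a million permutations, as in the script above, even tiny p-values such as 7.77e-05 are resolvable.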
So, these results are closer to the ones reported in Table 5. My question is then: is it normal to see such variation from the results in Table 5, or do these differing results indicate a mistake on my part?
from xweat.
Thanks for your request. I'll look into the issue and let you know asap.
Your usage and results seem to be fine. I reran the exact configuration and got the same results. The small variation most probably comes from the fact that in our experiments we cut all vectors to the top 200k entries to increase efficiency. For test 2, for instance, the term "feuerwaffe" cannot be found in our version. Also note that, in order to keep the lists the same length, we randomly drop terms from the longer list. To get the exact scores, you might therefore need to rerun the experiments multiple times for some languages. If you would like to reproduce the exact scores, I can also assist you by forwarding you the exact lists that were used for each individual experiment (but I assume this is not necessary?).
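The two preprocessing steps described above (truncating to the top-200k vocabulary and randomly dropping terms to equalize list lengths) could be sketched as follows; the function names are hypothetical and are not taken from XWEAT's code:

```python
import random

def truncate_vocab(vectors, top_k=200_000):
    # Keep only the top_k most frequent entries. Assumes `vectors` is a
    # dict ordered by corpus frequency, as fastText .vec files are.
    return dict(list(vectors.items())[:top_k])

def equalize_lengths(list_a, list_b, seed=None):
    # Randomly subsample both lists to the length of the shorter one,
    # matching the "randomly drop terms from the longer list" step.
    rng = random.Random(seed)
    n = min(len(list_a), len(list_b))
    return rng.sample(list_a, n), rng.sample(list_b, n)
```

Because the dropped terms are chosen at random, two runs can legitimately yield slightly different WEAT scores, which explains why rerunning multiple times may be needed to hit the published numbers exactly.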
No need to share the exact lists - knowing that my usage is correct is more than enough. Thanks for helping me with this!