HMTL, Victor Sanh et al., AAAI 2019 (paper, code)
Transformers: 30k+ stars on GitHub (contribute)
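A minimal sketch of the library's high-level pipeline API (the sentiment-analysis task shown here downloads a default pretrained model on first use):

```python
# Minimal sketch of the Transformers pipeline API.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("Transformers makes NLP easy to use."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```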
Write With Transformer: transformer.huggingface.co
DistilBERT, Victor Sanh et al., 2019. Distillation: a smaller, faster, lighter, cheaper version of BERT. Code and weights are available through Transformers.
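Loading the distilled weights takes a few lines (a minimal sketch; `distilbert-base-uncased` is the standard checkpoint name on the model hub):

```python
# Minimal sketch: loading DistilBERT weights through Transformers.
import torch
from transformers import DistilBertTokenizer, DistilBertModel

tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
model = DistilBertModel.from_pretrained("distilbert-base-uncased")

inputs = tokenizer("DistilBERT is a distilled version of BERT.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, 768)
```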
Transfer-Transfo: a Transfer Learning approach to Natural Language Generation. A workshop paper on the Transfer Learning approach we used to win the automatic-metrics part of the Conversational Intelligence Challenge 2 at NeurIPS 2018.
Meta-learning for language modeling, Thomas Wolf et al. (workshop paper, ICLR 2018)
TorchMoji: state-of-the-art emotion detection
On Medium:
- Training Neural Nets on Larger Batches: Practical Tips for 1-GPU, Multi-GPU & Distributed setups (see the sketch after this list)
- 100 Times Faster Natural Language Processing in Python
- The Current Best of Universal Word Embeddings and Sentence Embeddings
- From zero to research - An introduction to Meta-learning
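The larger-batches post builds on gradient accumulation; here is a minimal PyTorch sketch of that idea (the tiny model, the random data, and `accumulation_steps = 4` are illustrative placeholders, not the post's exact code):

```python
# Minimal sketch of gradient accumulation: simulate a large batch
# by summing gradients over several small forward/backward passes.
import torch

model = torch.nn.Linear(10, 2)             # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = torch.nn.CrossEntropyLoss()
accumulation_steps = 4                      # effective batch = 4 x loader batch

# placeholder data: 8 mini-batches of 16 examples each
loader = [(torch.randn(16, 10), torch.randint(0, 2, (16,))) for _ in range(8)]

optimizer.zero_grad()
for step, (x, y) in enumerate(loader):
    loss = loss_fn(model(x), y) / accumulation_steps  # scale so grads average
    loss.backward()                                   # gradients accumulate in .grad
    if (step + 1) % accumulation_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
```

Dividing each loss by `accumulation_steps` keeps the accumulated gradient equal to the average over the larger effective batch.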