“We tested Bort on 23 NLU tasks, and on 20 of them, it improved on BERT’s performance — by 31% in one case — even though it’s 16% the size and about 20 times as fast.” — Adrian de Wynter
This article shows how to overcome the few issues you’ll encounter when getting Bort running in Simple Transformers, a popular and convenient environment built on top of Hugging Face Transformers.
First you create something that works well, then you optimise it.