Facebook Open Sources RoBERTa NLP Model

Facebook has made a new natural language processing model called RoBERTa available as open source. The model is an optimized version of Google's BERT model.

The Facebook researchers describe their model as a robustly optimized method for pretraining natural language processing (NLP) systems that improves on Bidirectional Encoder Representations from Transformers, or BERT, the self-supervised method released by Google in 2018.

BERT has become known for the impressive results it has achieved on a range of NLP tasks while relying only on unannotated text drawn from the web. Most comparable NLP systems are trained on text that has been labeled specifically for a given task.

Facebook's new optimized method, RoBERTa, produces state-of-the-art results on the widely used General Language Understanding Evaluation (GLUE) benchmark.

RoBERTa is implemented in PyTorch. The team modified key hyperparameters and design choices in BERT, including removing BERT's next-sentence pretraining objective, and trained with much larger mini-batches and learning rates. The developers say these changes allow RoBERTa to improve on BERT's masked language modeling objective and lead to better downstream task performance.
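To give a sense of how the released model can be used, the snippet below is a minimal sketch of loading a pretrained RoBERTa checkpoint through PyTorch Hub and using it for masked-token prediction, following the pattern shown in the fairseq repository's examples; the checkpoint name, the fill_mask helper, and the input sentence are illustrative and may differ from the exact interface in the current release.

import torch

# Load a pretrained RoBERTa checkpoint via PyTorch Hub
# (weights are downloaded on first use; "roberta.large" is also published).
roberta = torch.hub.load('pytorch/fairseq', 'roberta.base')
roberta.eval()  # disable dropout for deterministic inference

# Masked language modeling: ask the model to fill in the <mask> token.
# fill_mask returns (filled sentence, probability, predicted token) tuples
# for the top-k candidates; the sentence here is purely an example.
predictions = roberta.fill_mask('RoBERTa was released by <mask> in 2019.', topk=3)
for filled_sentence, score, token in predictions:
    print(f'{token!r}: {score:.3f}')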

After implementing these design changes, the Facebook model showed notably better performance on the MNLI, QNLI, RTE, STS-B, and RACE tasks and a sizable performance improvement on the GLUE benchmark. With a score of 88.5, RoBERTa reached the top position on the GLUE leaderboard, matching the performance of the previous leader, XLNet-Large. The team says these results highlight the importance of previously unexplored design choices in BERT training and help disentangle the relative contributions of data size, training time, and pretraining objectives.

There's a full description of RoBERTa and the research carried out in a paper published on arXiv.
