Facebook open-sources Hyperparameter autotuning for fastText to automatically find best hyperparameters for your dataset

April 9, 2020 · Priya Saha

Two years ago, the team at the Facebook AI Research (FAIR) lab open-sourced fastText, a library for building scalable solutions for text representation and classification. For models to work efficiently on datasets with a large number of categories, finding the best hyperparameters is crucial. However, searching for the best hyperparameters manually is difficult, because the effect of each parameter varies from one dataset to another. To address this, Facebook developed an autotune feature in fastText that automatically finds the best hyperparameters for your dataset. Yesterday, the team announced that it is open-sourcing the Hyperparameter autotuning feature for the fastText library.

What are hyperparameters?

Hyperparameters are parameters whose values are set before the training process begins. They are critical components of an application and can be tuned to control how a machine learning algorithm behaves. It is therefore important to search for the best hyperparameters, since an algorithm’s performance can depend heavily on their selection.
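
For instance, a fastText classifier exposes hyperparameters such as the learning rate, vector dimension, number of epochs, and word n-gram length. Here is a minimal sketch of setting them by hand, assuming the fastText Python bindings (pip install fasttext) and a placeholder training file:

```python
import fasttext

# Each keyword argument below is a hyperparameter fixed before training.
model = fasttext.train_supervised(
    input="train.txt",  # placeholder path, in fastText's __label__ format
    lr=0.5,             # learning rate
    dim=100,            # dimension of the word vectors
    epoch=25,           # number of passes over the training data
    wordNgrams=2,       # use bigrams in addition to unigrams
)
```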

The need for Hyperparameter autotuning

It is difficult and time-consuming to search for the best hyperparameters manually, even for expert users. This new feature makes the task easier by automatically determining the best hyperparameters for building an efficient text classifier. To use autotuning, a researcher provides the training data, a validation set, and a time constraint, as in the sketch below.
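
A minimal sketch of invoking autotuning through fastText’s Python bindings (the equivalent CLI flags are -autotune-validation and -autotune-duration); the file names and time budget here are placeholders:

```python
# Assumes fastText >= 0.9.1 with Python bindings (pip install fasttext)
# and that train.txt / valid.txt are in fastText's __label__ format.
import fasttext

model = fasttext.train_supervised(
    input="train.txt",                   # training data
    autotuneValidationFile="valid.txt",  # validation set used to score trials
    autotuneDuration=600,                # search time budget, in seconds
)
print(model.test("valid.txt"))           # (samples, precision@1, recall@1)
```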

The researcher can also constrain the size of the final model using the compression techniques available in fastText. Building a size-constrained text classifier is useful for deploying models on devices or in the cloud while keeping a small memory footprint, for example:
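
A sketch of the size-constrained variant, using the same placeholder files; passing autotuneModelSize (CLI: -autotune-modelsize) tells autotune to also search the quantization settings so the saved model fits the budget:

```python
import fasttext

model = fasttext.train_supervised(
    input="train.txt",
    autotuneValidationFile="valid.txt",
    autotuneModelSize="2M",   # keep the final model under roughly 2 MB
)
model.save_model("model.ftz")  # .ftz marks a quantized (compressed) model
```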

With Hyperparameter autotuning, researchers can now easily build a memory-efficient classifier that can be used for various tasks, including language identification, sentiment analysis, tag prediction, spam detection, and topic classification.

The team’s strategy for exploring various hyperparameters is inspired by existing tools, such as Nevergrad, but has been tailored to fastText to exploit the specific structure of its models. The autotune feature explores hyperparameters by initially sampling from a large domain that shrinks around the best combinations over time, as in the generic sketch below.
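
For illustration only, here is a minimal sketch of that shrinking-domain strategy for a single scalar hyperparameter; it is a generic toy, not fastText’s actual implementation:

```python
import random

def shrinking_random_search(evaluate, lo, hi, trials=50, shrink=0.9):
    """Minimize evaluate(x) for one hyperparameter in [lo, hi]."""
    best_x, best_score = None, float("inf")
    for _ in range(trials):
        x = random.uniform(lo, hi)
        score = evaluate(x)
        if score < best_score:
            best_x, best_score = x, score
        # Shrink the sampling domain around the best point found so far.
        width = (hi - lo) * shrink
        lo = max(lo, best_x - width / 2)
        hi = min(hi, best_x + width / 2)
    return best_x, best_score

# Toy usage: pick a learning rate minimizing a made-up validation loss.
best_lr, best_loss = shrinking_random_search(lambda lr: (lr - 0.3) ** 2, 0.01, 1.0)
print(best_lr, best_loss)
```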

This new feature could be seen as a competitor to Amazon SageMaker Automatic Model Tuning. With Amazon’s tool, however, the user must select the hyperparameters to tune, a range to explore for each parameter, and the total number of training jobs, whereas Facebook’s Hyperparameter autotuning selects the hyperparameters automatically.

To learn more, check out Facebook’s official blog post.
