Get Up to Speed With NLP: Major Trends & Breakthroughs

Data Basics, Scaling AI | Nancy Koleva

2019 has been a landmark year for the field of natural language processing, more commonly referred to as NLP. Over the last couple of years, we’ve seen a fierce race among researchers and their models to top the leaderboards across a variety of NLP tasks, from reading comprehension to sentiment analysis.

From the rise of self-supervised learning and unstructured data to major architecture breakthroughs such as the attention technique, Transformer models and BERT, the past year has been anything but boring for the realm of NLP.
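The attention technique named above is the core building block of Transformer models and BERT. As a rough illustration (not any particular library's implementation), here is a minimal NumPy sketch of scaled dot-product attention: each query scores every key, the scores are normalized with a softmax, and the result is a weighted average of the values.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # query-key similarity scores
    scores -= scores.max(axis=-1, keepdims=True)   # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax: each row sums to 1
    return weights @ V, weights                    # weighted sum of values per query

# Toy example: 2 queries attending over 3 key/value pairs of dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # one 4-dimensional weighted summary of the values per query
```

In a real Transformer this operation runs in parallel across multiple heads and is wrapped in learned linear projections, but the weighted-average mechanism is the same.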

All of these techniques, once confined to the research community, are becoming more and more mainstream and unlocking a variety of business applications. Watch the video below for a quick recap of the recent innovations and upcoming trends in NLP, or get the new white paper for a detailed overview of the major breakthroughs in NLP architecture, techniques, and use cases.

Get Up to Speed With NLP: Get the Ebook
