It is based on two pillars: data (pre-trained models, that is, sets of information to be analyzed using natural language processing) and methodology (how the algorithm uses these models). In other words, with BERT, Google intends to "read" users' minds by understanding not only the query itself, but also what it does not explicitly say. It is also a lever for understanding new queries, those formulated for the first time, which Google estimated at the time at around 15% of daily searches.
In 2021, Google's work on NLP intensified with MUM (Multitask Unified Model), an update to its algorithm that further improves natural language understanding and, in doing so, the relevance of the answers provided to users. In particular, MUM focuses on what Google calls "complex queries," characterized by their length and the inclusion of multiple propositions. The goal of MUM is to answer these queries in one go by relying on advanced capabilities: extracting information from several content formats, surfacing resources drawn from results in other languages with instant translation, and handling multiple tasks simultaneously.

What does natural language processing at Google actually change for SEO?

What you need to understand is that the integration of NLP into Google's search engine aims to improve the service provided to users. Natural language processing technologies help the algorithms better understand queries and provide more relevant answers that are likely to satisfy them. This matters all the more to Google because user needs are shaped by changing behaviors, notably the growing use of voice search, itself enabled by NLP applications.
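To make the underlying idea concrete, here is a minimal sketch (not Google's actual implementation) of matching a query against candidate passages by vector similarity. Real systems use learned contextual embeddings from models like BERT; this toy version uses simple bag-of-words term counts, and the passages and function names are illustrative assumptions.

```python
import math
from collections import Counter

def vectorize(text):
    """Toy bag-of-words vector: term frequencies of lowercase tokens."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def rank(query, passages):
    """Return passages sorted by similarity to the query, best first."""
    q = vectorize(query)
    return sorted(passages, key=lambda p: cosine(q, vectorize(p)), reverse=True)

passages = [
    "how to park a car on a hill",
    "best restaurants near the hill district",
    "parking on a hill with no curb",
]
print(rank("parking on a hill", passages)[0])
# → "parking on a hill with no curb"
```

Note what this toy version misses: "park" and "parking" do not match at all, and word order is ignored. Contextual models like BERT embed whole sequences so that related wordings and implicit intent land close together in vector space, which is precisely the gap NLP closes for search.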