
BERT is the most recent algorithm update to Google Search's dynamic, constantly evolving ranking system.
But what is it and how does it work?
What is BERT?
BERT is an acronym for Bidirectional Encoder Representations from Transformers and is Google's most recent NLP (natural language processing) update.
If this sounds like a foreign language to you, you’re not alone, so let me translate it into plain English:
The whole idea behind BERT is to improve Google’s understanding of our search queries.
Essentially, with this update, Google is attempting to "humanize" the search process by using context to better understand our queries. This has a knock-on effect: if Google has a better understanding of the questions we ask it, it can then provide answers that are more relevant to us.
How Does Google’s BERT Algorithm Work?
BERT works by using what are known as "neural networks." According to Search Engine Land, neural networks
“are designed for pattern recognition, to put it very simply. Categorizing image content, recognizing handwriting and even predicting trends in financial markets are common real-world applications for neural networks.”
These neural networks then create what are known as "pre-trained models" by recognizing the patterns present in millions of pages of text found on the internet. These patterns give the model a sense of how words behave in context, which in turn provides clues to the intent of the searcher.
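To make "pre-trained model" a little more concrete, here is a minimal sketch using the open-source Hugging Face transformers library and the public bert-base-uncased checkpoint. This is just an illustration of the general technique; Google Search's production system is not public, and the sample query is one of the examples Google itself used when announcing BERT.

```python
# A minimal sketch: load a pre-trained BERT model and run a query
# through it. The model has already learned language patterns from
# millions of pages of text, so no training happens here.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Each token in the query comes back as a context-aware vector.
inputs = tokenizer("do estheticians stand a lot at work", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # torch.Size([1, num_tokens, 768])
```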
To put it simply, if Google sees that a word is used in a certain context, it can be fairly certain that the searcher has a specific intent. It can then apply this rule every time that particular context appears.
Once the patterns (pre-trained models) have been established, Google will then be able to gain a better understanding of the context and intent of your search, helping it to provide you with the most relevant answer.
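Here is a small, hedged illustration of that idea, again using the public bert-base-uncased model rather than anything Google-internal: the same word ("bank") gets a noticeably different vector depending on the sentence around it, which is exactly the kind of contextual signal that can be tied to intent.

```python
# The same word gets a different vector depending on its sentence,
# which is how a model can connect a word to a specific intent.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence, word):
    """Return BERT's contextual vector for `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    # Find the token position of `word` (assumes it is a single token).
    word_id = tokenizer.convert_tokens_to_ids(word)
    position = (inputs["input_ids"][0] == word_id).nonzero()[0].item()
    return hidden[position]

river = word_vector("we sat on the bank of the river", "bank")
money = word_vector("i deposited money at the bank", "bank")
same = word_vector("the bank of the river was muddy", "bank")

cos = torch.nn.functional.cosine_similarity
print(cos(river, money, dim=0))  # lower: different senses of "bank"
print(cos(river, same, dim=0))   # higher: same sense of "bank"
```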
The major breakthrough with BERT is its bidirectionality. As InfoQ.com aptly puts it, Google is now "able to define the context defining the meaning of a word not only by considering parts of the same sentence leading to that word, but also parts following it." Hence, BERT is bidirectional.
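You can see bidirectionality at work in the task BERT was pre-trained on: predicting a masked word. In this sketch (same public model as above, nothing Google-specific), the words after the blank are what pin the answer down; a purely left-to-right model could not use them.

```python
# Fill-mask sketch: BERT predicts the hidden word using context on
# BOTH sides of the blank, which is what "bidirectional" means here.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# Everything before the blank ("she drank the ...") is ambiguous;
# the right-hand context ("from the faucet") narrows the answer.
for result in fill("she drank the [MASK] straight from the faucet."):
    print(result["token_str"], round(result["score"], 3))
```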
Pandu Nayak, Google's VP of Search, helps us further understand the potential of BERT in the following excerpt from a post he wrote for Google's blog:
“Here’s a search for “2019 brazil traveler to usa need a visa.” The word “to” and its relationship to the other words in the query are particularly important to understanding the meaning. It’s about a Brazilian traveling to the U.S., and not the other way around. Previously, our algorithms wouldn’t understand the importance of this connection, and we returned results about U.S. citizens traveling to Brazil. With BERT, Search is able to grasp this nuance and know that the very common word “to” actually matters a lot here, and we can provide a much more relevant result for this query.”
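For a do-it-yourself version of Nayak's example, you can mask the preposition in that query and ask the public bert-base-uncased model to score "to" against "from." The point is not the exact numbers but that the model treats the preposition as a meaningful, directional word rather than a stop word to be dropped. Again, this is a sketch with an open-source model, not Google Search's production system.

```python
# Score candidate prepositions for the masked slot in Nayak's query.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")
results = fill("2019 brazil traveler [MASK] usa need a visa",
               targets=["to", "from"])
for result in results:
    print(result["token_str"], round(result["score"], 4))
```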
What BERT Means for the Future of Search
This new update to Google Search has the potential to streamline the search process for users, helping them to find more relevant answers in less time.
As AI technology improves, expect more updates like BERT that make for a better search experience in the very near future!