Google Applies New BERT Model to Improve Search Rankings

Affecting one in 10 US English queries, Google's BERT model represents one of the biggest leaps forward in the history of search.

Google is rolling out what it calls its biggest step forward for search in the past five years, and one of the biggest steps forward in the history of search altogether. It is applying a technology it introduced last year, called BERT, to better understand search queries.

BERT stands for Bidirectional Encoder Representations from Transformers. Transformers are neural network models that process each word in relation to all the other words in a sentence, rather than one at a time in order.
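For a concrete sense of what that means, here is a minimal sketch, assuming the open-source Hugging Face transformers library and the public bert-base-uncased checkpoint (illustrative choices, not part of Google's announcement). Every token's output vector is computed with attention over every other token in the sentence:

```python
# A minimal sketch (not Google's production system) of how a pretrained
# BERT model produces context-aware word representations.
# Assumes the Hugging Face "transformers" library and the public
# "bert-base-uncased" checkpoint, neither of which Google's post names.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Encode a query; the tokenizer adds [CLS] and [SEP] markers.
inputs = tokenizer("2019 brazil traveler to usa need a visa",
                   return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Each row is one token's vector, computed bidirectionally from the
# whole sentence, not just the words to its left.
print(outputs.last_hidden_state.shape)  # (1, num_tokens, 768)
```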

“We’re making a significant improvement to how we understand queries, representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of search,” Pandu Nayak, Google's vice president of search, said in a blog post.

The company said the new effort is based on BERT, which interprets the words of a query in the context of the full sentence rather than in isolation, according to Nayak.

Google's software, like humans, has to grapple with understanding what people are trying to say even when they are not expressing themselves clearly, or not making sense at all.

Some of the BERT models used to interpret queries are so complex that they have to run on Google's Cloud TPUs, high-powered processors purpose-built for machine learning in the cloud, according to the company.

“By applying BERT models to both rankings and featured snippets in search, we’re able to do a much better job helping you find useful information,” Nayak said.


“In fact, when it comes to ranking results, BERT will help search better understand one in 10 searches in the US in English.”

He gave the example of Google software now understanding that the word “to” in a query such as “2019 brazil traveler to the USA need a visa” is about a Brazilian heading to the US and not the other way around.

“Previously, our algorithms wouldn’t understand the importance of this connection, and we returned results about US citizens traveling to Brazil,” Nayak said.

“With BERT, Search is able to grasp this nuance and know that the very common word ‘to’ actually matters a lot here, and we can provide a much more relevant result for this query.”
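As a hedged illustration of that nuance (again assuming the Hugging Face transformers library, not Google's actual ranking code), the sketch below shows that the same word "to" receives a different contextual vector depending on the direction of travel, which is the kind of signal a contextual model can use to tell the two queries apart:

```python
# Illustrative only: not Google's ranking pipeline. Shows that BERT
# assigns the word "to" different vectors in the two travel directions.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

def embedding_of(sentence, word):
    """Return the contextual vector of the first occurrence of `word`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

a = embedding_of("brazil traveler to usa need a visa", "to")
b = embedding_of("usa traveler to brazil need a visa", "to")

# Cosine similarity below 1.0: the identical surface word carries a
# different representation once its sentence context differs.
print(torch.cosine_similarity(a, b, dim=0).item())
```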

Google said it planned to extend the improvement to more languages and locations “over time.”
