Are you aware of Google’s BERT update?
BERT stands for Bidirectional Encoder Representations from Transformers.
It is one of the biggest changes to Google Search in the last five years… and it's affecting 10 percent of all search queries, according to Google.
Here's how it works…
BERT, a natural language processing (NLP) model, works alongside RankBrain, Google's earlier machine learning system, to interpret search queries and deliver intent-based results to Google users.
The idea behind BERT is for Google to better understand the intent and context behind each search query you make.
In other words, with Bidirectional Encoder Representations from Transformers, Google wants to match results to what the searcher actually means, not just the words they type.
With BERT, Google wants to take search beyond what is typed into the search box and grasp the searcher's intention, even when the query contains typos or missing words.
Now, this means that if someone types in a query like "can you get medicine for someone pharmacy", Google can understand that the user means "can you get medicine for someone at a pharmacy?"
Before BERT, such a query was matched largely word for word, often surfacing results about medical prescriptions rather than the actual question.
With the BERT update, Google understands that the searcher wants to know whether medicine can be picked up on someone else's behalf, and returns results that answer that question.
How will BERT affect Content Marketing?
In a nutshell, here’s how BERT will affect SEO and Content Marketing…
Keywords alone will no longer determine page rankings as they did before.
Regardless of the exact keyword a person types into the search engine, if your content contextually relates to that keyword, your page can still rank based on the searcher's intent.
BERT will enable Google to match search results with user intent.
With this improvement, typos and grammatical errors will not affect search results as much as they used to.
This is because instead of interpreting queries word by word, as Google previously did, BERT models consider the full context of a word by looking at the words that come before and after it.
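To make the idea concrete, here is a deliberately simplified sketch (plain Python, not BERT itself; real BERT uses transformer attention over learned embeddings). It shows how looking at the words on both sides of an ambiguous term, here "bank" with two hypothetical cue-word sets, can resolve its meaning where word-for-word matching cannot:

```python
def disambiguate(sentence, target, senses):
    """Pick the sense of `target` whose cue words best overlap
    the words appearing before AND after it in `sentence`."""
    words = sentence.lower().split()
    i = words.index(target)
    # Context is taken from both directions, like BERT's bidirectional view
    context = set(words[:i] + words[i + 1:])
    return max(senses, key=lambda s: len(senses[s] & context))

# Hypothetical cue-word sets for two senses of "bank"
senses = {
    "river": {"water", "fishing", "shore", "mud"},
    "finance": {"money", "loan", "account", "deposit"},
}

print(disambiguate("we went fishing by the bank near the water", "bank", senses))
# → river
print(disambiguate("i opened an account at the bank for a loan", "bank", senses))
# → finance
```

The same surface word gets two different interpretations purely because of its neighbors, which is the intuition behind context-aware ranking.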
This innovation will definitely affect the use of keywords, especially long-tail keywords.
With BERT's better understanding of search queries, featured snippets will change too, as Google gets better at matching the correct snippet to each query.
BERT and SEO Keywords
Here are the ways Google's BERT update will affect SEO keywords…
Fewer Keywords, More Semantics.
With the rollout of BERT, keywords will gradually lose their prominence as one of the most important ranking factors.
As BERT matures, search results and rankings will shift to focus more on semantics and the concepts behind a page than on the exact keywords it contains.
With this update, semantically related (LSI) keywords will be more effective in content marketing and SEO than repeating the same keyword throughout a page.
So if you're writing content about a Google algorithm update, for instance, you won't rank by giving the key term a passing mention padded with heavy keyword optimization. You'll need to get to the point, write richer content, and make your work better than the competition's.
You'll have to pay more attention to semantics and topic analysis rather than focusing entirely on keyword research.
With BERT, poorly written content that doesn't answer specific questions will not feature prominently in search results, regardless of the keywords it contains.
With the advent of BERT, Schema markup will most likely help Google understand and rank web pages better, especially contextually.
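For reference, Schema markup is typically embedded as a JSON-LD snippet using schema.org vocabulary. The example below shows the general shape for an Article; the headline, author, and date are placeholders, not details from this post:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article headline",
  "author": { "@type": "Person", "name": "Example Author" },
  "datePublished": "2019-11-01",
  "description": "A one-line summary of what the page is about."
}
</script>
```

Markup like this states the page's topic and structure explicitly, rather than leaving the search engine to infer it from the text alone.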
In a nutshell, the latest Bidirectional Encoder Representations from Transformers (BERT) update from Google is an attempt to understand user intent during search, keep typos from skewing search results, and provide more relevant results to Google users.
Of course, BERT may look like bad news for content creators, but Neil Patel believes that even if you lose some rankings after the update, you'll still gain in the end: BERT's contextual understanding of searches will send better-targeted traffic to your website.
This should reduce bounce rates, leading to a higher likelihood of conversions.
In conclusion, if you take anything away from here, take this: with the gradual deployment of NLP, semantic search is going to define the future of search and the search engine.
Have you noticed any changes in your website analytics since the BERT update? Let's hear from you.
PHOTO CREDITS: seoroundtable.com