(This is both an SEO short and a vital update, so it will appear on both pages. Read time: 1-2 minutes. Level: beginner/intermediate.)
Google has long tried to understand language. It wants to understand jargon, so that your readability and content suit your core audience. This is why readability plugins and tests like Yoast's and Flesch-Kincaid are useless (as I have already established).
In the largest content update since Latent Semantic Indexing (LSI) five years ago, Google is now implementing Bidirectional Encoder Representations from Transformers (BERT). It sounds complex, but Google just likes long words. Allow me to break down Google's two most important updates of the last five years.
Google is trying to achieve sentience. Leave your microphone on and voice recognition makes ads more relevant. BERT is similar: it is a neural network (a form of AI). It aims to understand what your content is about, who it is aimed at, and whether you are using the right language (jargon) for your demographic. I have been unpopular for saying it is good to use jargon. Speak in your clients' language; it gives your clients a sense of community and faith that you understand them. It shows you are competent and know the lingo. Google has wanted to know the lingo since LSI.
What is LSI?
Latent Semantic Indexing is an algorithm rolled out in 2015. It was revolutionary. It:
- Helped prevent keyword stuffing.
- Started to understand language.
- Learned that a cat might have sat on a mat. It tried to understand which words fit together and which are overoptimised gobbledygook.
LSI understands that apple is a fruit (so linked to nutrition), but also a brand and a company. You could talk about iPhones and it would know your content relates to Apple. You could talk about apple pie and LSI would know you are probably giving an apple-based recipe or review. This was, and still is, huge.
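To make that concrete, here is a minimal sketch of the idea behind latent semantic analysis, the technique LSI is built on. It uses scikit-learn; the toy corpus and the two-topic setting are my own illustration, and Google's actual implementation is, of course, not public.

```python
# A toy illustration of latent semantic analysis (the idea behind LSI):
# documents about "apple" the fruit and "Apple" the company end up in
# different regions of a low-dimensional semantic space.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

docs = [
    "apple pie recipe with sugar, flour and eggs",
    "apple nutrition facts and vitamins",
    "Apple launches the new iPhone",
    "Apple quarterly earnings and stock price",
]

# Term-document matrix weighted by TF-IDF.
tfidf = TfidfVectorizer().fit_transform(docs)

# Truncated SVD projects each document onto a small number of latent topics.
svd = TruncatedSVD(n_components=2, random_state=0)
topics = svd.fit_transform(tfidf)

for doc, vec in zip(docs, topics):
    print(f"{vec.round(2)}  {doc}")
```

The recipe and nutrition documents cluster together, and the company documents cluster elsewhere, even though every document shares the word apple.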
Google has said, in effect, that it has had enough of people writing for the engine rather than the audience. That makes search less relevant. BERT is the next step forward.
So what is so revolutionary about BERT?
BERT is similar to LSI. It is bidirectional, meaning it reads the words on both sides of a term at once, and it comes with 11 new factors which are being kept very hush-hush. The main goal of BERT can be seen in our apple example. It doesn't try to understand the word per se. It tries to understand the whole page, based on the cornerstone content (identified by LSI), by looking at context.
If our apple page talks about vitamins, chances are you are into nutrition. If it mentions flower (also a word with many meanings; perhaps you made a phonetic spelling mistake and meant flour), sugar (which also has many meanings) and eggs (I am repeating myself), BERT analyses these words together. Without needing schema markup, it can guess that we have a recipe.
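You can see this context effect for yourself with the public bert-base-uncased model through Hugging Face's transformers library. This is a toy sketch, not Google's ranking system, and the example sentence is my own:

```python
# A toy demonstration of how BERT uses surrounding context to infer
# meaning. This uses the public bert-base-uncased model via Hugging
# Face's transformers library, not Google's internal ranking system.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT reads the words on both sides of the blank at once.
# Flour, sugar and eggs push it towards a baking context.
for result in fill_mask("I mixed flour, sugar and eggs to bake an apple [MASK]."):
    print(f"{result['token_str']:>10}  {result['score']:.3f}")
```

The model's top guesses (pie, cake, tart and so on) come straight from the surrounding ingredients; nobody tagged the page as a recipe.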
So: very long words for a simple concept (if you know anything about SEO), with very complex code on Google's part. This will shake up the Search Engine Results Pages. There is plenty of time to prepare; for once, Google warned us in advance of a large update. My guess is they have grown tired of saying "write for your audience!", or have become desperately tired of spam and keyword stuffing.