Why BERT Is a Big Deal, But Also Not a Big Deal
We are both fascinated and intimidated by artificial intelligence. We feed AI by seeding search engines with billions of queries every day. We even try to teach AI to flirt and end up with weird but endearing pickup lines like, “You’re so beautiful that you say a bat on me and baby,” and, “You look like a thing and I love you.” But — we also stiffen at the thought of AI stealing our jobs.
As digital marketers (and especially at GPO), we rely on Google’s AI-powered algorithms to get our content and ads in front of the right users at the right time, even if the user’s query isn’t an exact match to the words in the content. We expect the search engine to understand the purpose and value of our content. Most of the time, it does. BERT’s goal is to help when it doesn’t.
To understand BERT, you need to understand NLP
Computers are great at reading text but not good at understanding language.
Natural language processing (NLP) works to bridge the gap between reading and understanding.
Researchers have already developed NLP models that help computers understand specific types of language. Examples of successful NLP include entity recognition (being able to tell the difference between a person, time, organization, location, monetary value, etc.) and sentiment analysis (recognizing attitude and tone).
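To make "sentiment analysis" concrete, here is a deliberately crude, lexicon-based sketch in Python. Real sentiment models are statistical and far more sophisticated; the word lists below are invented for illustration only.

```python
# Toy lexicon-based sentiment analysis. Real NLP models learn from data;
# this sketch just counts hand-picked positive vs. negative words.
# The word lists are invented for illustration, not a real lexicon.

POSITIVE = {"love", "great", "excellent", "endearing"}
NEGATIVE = {"weird", "stealing", "intimidated", "struggles"}

def sentiment(text: str) -> str:
    """Classify a sentence by counting positive vs. negative words."""
    words = {w.strip(".,!?\"'").lower() for w in text.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("You look like a thing and I love you"))            # positive
print(sentiment("We stiffen at the thought of AI stealing our jobs"))  # negative
```

Each such model is good at exactly one task, which is what makes the kitchen-utensil analogy below so apt.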
Moz’s Britney Muller explains that each piece of successful NLP is like a kitchen utensil. Your whisk is excellent at whisking, but don’t ask it to chop. Likewise, your food processor can slice and dice and grind, but don’t expect it to grill a sandwich like a panini press!
BERT is the one kitchen utensil that does it all: the Swiss Army knife of kitchen tools!
So, what’s BERT?
Instead of looking at the meaning of words one by one, in consecutive order, BERT looks at each word in relation to all the other words in a sentence.
“BERT models can therefore consider the full context of a word by looking at the words that come before and after it,” notes Google.
Here’s an example of a search engine result page (SERP) that Google tested with and without BERT.
As a human, if you saw the query, “2019 brazil traveler to usa need a visa,” you’d probably recognize that a Brazilian wanted to travel to the U.S. and needed a visa. Before BERT, Google wouldn’t get that. It would return a news article about U.S. citizens traveling to Brazil.
BERT, however, grasps the nuance and placement of “to” and serves a result for tourists traveling to the United States — just what the searcher wanted!
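A toy Python sketch (not actual BERT, which uses learned transformer attention) shows the difference in what context each kind of model can see for the word "to" in that query:

```python
def contexts(tokens, i):
    """For the word at position i, return the left-only context a
    unidirectional model reads, and the full bidirectional context
    a BERT-style model reads: the words before AND after it."""
    left_only = tokens[:i]
    bidirectional = tokens[:i] + tokens[i + 1:]
    return left_only, bidirectional

query = "2019 brazil traveler to usa need a visa".split()
left, both = contexts(query, query.index("to"))
print(left)  # ['2019', 'brazil', 'traveler']: direction of travel unclear
print(both)  # adds 'usa', 'need', 'a', 'visa': a Brazilian traveling TO the U.S.
```

Reading only the left context, "to" is ambiguous; with the words on both sides available, the direction of travel becomes clear.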
Going forward, Google estimates that BERT will impact one in 10 searches in the U.S., and primarily improve long-tail conversational queries where prepositions and context are pivotal to accuracy.
What does that mean for you? Next time you ask Google if you can pick up medicine for someone at the pharmacy, Google will show you a result titled “Can a patient have a friend or family member pick up a prescription,” instead of general results about filling prescriptions.
That said, BERT still needs some fine-tuning.
BERT’s great, but not perfect
BERT is still a long way from understanding language and context the way we humans do, says Allyson Ettinger, a natural language processing researcher at the University of Chicago.
Ettinger found that the BERT model “struggles with challenging inferences and role-based event prediction–and it shows clear failures with the meaning of negation.”
BERT can’t understand what things are NOT. For example, BERT knows that a robin is a bird. Great. But when asked to predict what a robin is not… BERT also predicted a bird.
BERT struggles with layered inferences, too. For instance, you can search for “what state is south of Maine,” and you’ll get results for South Portland, Maine, when the answer you wanted was “New Hampshire.”
Can you optimize for BERT?
Nope. There’s no magic potion for pleasing BERT. You can’t optimize for it, but you can write towards it.
“The only way to improve your website with this update is to write really great content for your users and fulfill the intent that they are seeking,” recommends Muller.
Similar to the June 3 algorithm update (and most other broad core algorithm updates), BERT is about improving relevance and enhancing Google’s ability to connect search queries to the right content. It’s not about targeting specific websites or verticals.
Google’s recommendation echoes Muller’s:
Google believes you should be able to search in a way that feels natural to you. BERT gets us one step closer to AI having a real degree of language understanding, and leaps and bounds closer to seeing SERPs that are so relevant to our query that we can’t help but say, “You look like a thing and I love you, Google.”