Have you met BERT?
Search is all about understanding language. That is why Google introduced BERT, an open-source neural network-based technique for natural language processing, in 2018 (see also elsewhere in Mediafirst). It enabled them to train their own excellent question answering system.
BERT (Bidirectional Encoder Representations from Transformers) is now applied to ranking and featured snippets in Google Search, helping Search better understand roughly 1 out of 10 searches performed in English; more languages will be added over time. The improvement mainly concerns conversational queries in which prepositions such as "for" and "to" strongly affect the meaning. In Google's own example, "2019 brazil traveler to usa need a visa", the word "to" determines the direction of travel, something earlier keyword matching tended to miss. Because BERT helps Google Search understand the context of each word in a query, users can now search in a way that feels natural to them.
What does this mean for your business? In the first stage, not much will change. But as people adapt the way they search, advertisers will need to update their content strategy and align it with the new search queries that emerge. For SEA, your experts will adopt the new queries when optimising campaigns, so no change in strategy is needed.