From the earliest days of the search engines, there has been an ongoing battle between the search engines, Google in particular, and the agencies that aim to second-guess the algorithms that run them. The agencies hope to promote their clients to the top of the Google rankings, whereas Google is trying to provide the best results for its users and sell the AdWords advertising that ranks alongside them - which even now provides nearly 85% of Alphabet's total revenue.
For example, back in the day, there was a technique where you aimed hundreds of keyword-focused URLs at a particular website, a version of 'Google bombing'. It worked brilliantly, until Google started blacklisting the firms that did it. This process of new technique and response has been repeated hundreds of times. Each time a technique works, it becomes common practice among Search Engine Optimisation (SEO) agencies, and then Google takes steps to punish those techniques and the websites that use them. The more aggressive of these techniques are sometimes known as 'black hat': rather than simply ensuring that your site is well designed for the search engines to list, black hat techniques work actively against the search engines to artificially inflate the results.
Until recently, Google had two clear strategies to stop the SEO agencies unfairly gaming the results. The first was to change the algorithms extremely often - over 3,000 times a year, according to The Drum article below - to, as it were, outpace the pursuing agencies. The second was to run more than one algorithm at any one time, so that results varied even in identical situations, thereby weakening the value of any given technique.
Until now, agencies could argue, though sometimes with scant evidence, that they understood the algorithms and so could game them. But it seems that this avenue is now closing (or is simply closed) with the BERT update in November, which uses natural language processing and machine learning techniques to drive the search responses. As before, the aim is to track the outcome of each search to see whether it was successful and to adjust any subsequent query. Google, through its free Analytics package, which is used on some 52% of all websites, can easily understand both the query and the value of the output, and track the whole reader journey on a vast scale (it's scary but it's true).
With no central algorithm, it will become extremely difficult, if not impossible, to game this process. And in any case, if a query gives a result that does not satisfy the reader, it will adversely affect subsequent queries, so any success in gaming the system will be fleeting.
So this leaves the big problem of how to ensure that Google is your friend and that you can be found. I think there are two simple methods.
First, in 2018, Google released its web.dev tool so that you can easily check that your website is technically sound in four areas: speed, accessibility, best practice, and how readily Google can find and list your site.
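If you want to check those four areas programmatically rather than through the web.dev page itself, the same Lighthouse audits are exposed through Google's public PageSpeed Insights API. The snippet below is a minimal sketch, not an official client: the endpoint, parameter names and response fields reflect my reading of the v5 API, and the site address is a placeholder to replace with your own.

```python
# Minimal sketch: ask the PageSpeed Insights API (which runs the Lighthouse
# audits behind web.dev) for the four category scores discussed above.
import requests

ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

params = {
    "url": "https://www.example.com",  # placeholder - use your own site
    # Request all four audit categories; the API repeats this parameter.
    "category": ["PERFORMANCE", "ACCESSIBILITY", "BEST_PRACTICES", "SEO"],
}

response = requests.get(ENDPOINT, params=params, timeout=60)
response.raise_for_status()

# Lighthouse reports each category score as a fraction between 0 and 1.
categories = response.json()["lighthouseResult"]["categories"]
for result in categories.values():
    print(f"{result['title']}: {round(result['score'] * 100)}/100")
```

Running this against your own domain gives a quick, repeatable score for each area, which is useful for tracking whether technical changes to the site actually move the needle.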
Second, you need to create content that is human-centric. If your readers click through from Google and remain on your site because they find your content valuable, then Google will know that your site is useful and rank it higher as a result. And if your content is widely shared, Google can follow those links, and you rank higher for that too.
In the world of business-to-business professional services, the value of SEO-driven eyeballs has always been disputed. Clickbait - "London's Top 20 Hottest Lawyers" - and keyword-stuffed articles might well entice users to your website, but whether those visitors are the same people who are going to ask you to advise on the sale of their engineering company is a very different point.
The Drum article refers to the SEO industry becoming 'obsolete' with these new changes. For the industry in its present, very narrow form - chasing any click as the currency of success - I do not think that is an overstatement.