December 9, 2017

It’s machine learning – the most widely used form of artificial intelligence in search right now – that has driven the brilliance of Google’s algorithms.

AI, essentially, made Google's algorithms too dynamic to understand, turning the SEO industry upside down a few years ago. No longer could SEO experts come up with formulaic solutions to ranking well in search engine results, leaving webmasters to take another approach to web rankings – publishing high-quality, fresh, relevant and useful content as priority number one. Classic technical SEO inputs like keyword usage, meta description length and content length became secondary.

But Google and other search engines don't have a monopoly on machine learning. For years, SEO experts have predicted that the same machine learning could be used to reverse engineer the algorithms. Will 2018 be the year that happens?

How AI Changed SEO

In 2013, Google shifted towards the use of machine learning, and later a more sophisticated form of AI, deep learning, to write and classify the rules of search. You could say artificial intelligence has been writing Google's SEO rulebook for years – a cypher no human intelligence can unravel on its own.

With AI, the appropriate SEO inputs and technical factors are in a continual process of evolution as the algorithms learn, based on billions of users’ activity, how to better match searchers with the content they are after. As a result, there is no way to predict how well a web page will rank because every search result involves its own unique mix of algorithms.

The SEO insights come after the fact. Once webmasters publish content, they can then go back and tweak the SEO and build on their content to try and improve page rank. The days of SEO being proactive and predictive are over.

Reverse Engineering Google

Or are they?

The idea behind reverse engineering is to use the same technology – AI – to figure out how Google's machine learning model works. Build a machine learning model to understand the original machine learning model. Predict how Google's model behaves, then test those predictions against real search results.
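To make that concrete, here is a minimal sketch of what such a surrogate model might look like, assuming you have collected your own dataset of search results: one row per query and page, with a handful of on-page and off-page features plus the position Google actually gave it. The feature names and the serp_observations.csv file are hypothetical illustrations, not anything Google exposes.

```python
# A minimal sketch, assuming you have gathered your own SERP observations:
# one row per (query, page) with a few SEO features and the rank Google gave it.
# The column names and "serp_observations.csv" are hypothetical, for illustration only.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

data = pd.read_csv("serp_observations.csv")
features = ["word_count", "backlinks", "title_keyword_match", "page_speed_ms", "https"]

X_train, X_test, y_train, y_test = train_test_split(
    data[features], data["observed_rank"], test_size=0.2, random_state=42
)

# The surrogate: a model trained to imitate the observable behaviour of
# Google's (unknown) ranking model on this slice of queries.
surrogate = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, random_state=42)
surrogate.fit(X_train, y_train)

# Test the surrogate's predictions against real results it has never seen.
predicted_rank = surrogate.predict(X_test)
print("Mean absolute rank error:", mean_absolute_error(y_test, predicted_rank))
```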

Each time this is done, new data can be fed into the neural network (otherwise known as AI grey matter), helping it to better connect the dots and make increasingly accurate predictions. When the reverse-engineered model's results align with the real thing's, going back and examining the inputs that drove that accuracy would give you insights into Google's algorithm.
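Building on the sketch above, one hedged way to "examine the inputs" is permutation importance: shuffle one feature at a time and see how much the surrogate's accuracy drops. This only reveals what correlates with rank in your own observations, not a readout of Google's actual algorithm.

```python
# Continues the surrogate sketch above. Permutation importance shuffles one
# feature at a time and measures how much the surrogate's accuracy suffers,
# hinting at which of the (hypothetical) inputs the model leaned on.
from sklearn.inspection import permutation_importance

result = permutation_importance(surrogate, X_test, y_test, n_repeats=20, random_state=42)

# Print features from most to least influential in this surrogate:
# a clue about your own data, not Google's ranking rules themselves.
for idx in result.importances_mean.argsort()[::-1]:
    print(f"{features[idx]:>22}: {result.importances_mean[idx]:.4f}")
```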

Reverse engineering AI to uncover the original machine learning model is possible – computer scientists at Cornell Tech have already done it.

This has some important implications for SEO in 2018, making AI an important field to watch within search engine optimisation going forward. If computer scientists are able to unravel search algorithms, there is the risk that we’ll go back to a time when some webmasters sacrifice quality content, favouring the reverse engineered insights instead. Let’s be honest. No one wants to see a return of spam.

On the other hand, it also speaks to Google’s foresight. The ability to reverse engineer has been around for a couple of years now but no one has figured out how to steal Google’s AI. The search engine just has to stay ahead of the code crackers, perhaps making another AI leap beyond even deep learning. Which they may just be able to do. After all, relevant search results depend on it.