Is organic SEO dying? It seems that every time a search engine evolves or web marketing best practices change, naysayers start heralding the demise of organic SEO.
2011: Google will kill organic SEO and replace it with paid promotion. Six years later, organic SEO is still the number one source of internet traffic, well ahead of paid search.
2012: Are we heading towards the end of organic SEO? Many studies were published arguing that social media would soon replace traditional search engines, but even today, this has yet to come to fruition.
2013: Will SEO disappear in favor of inbound marketing? Inbound Marketing quickly became a big component of SEO and reinforced the idea that content creation and keyword study were the key to good ROI.
While SEO has endured so far, it doesn’t mean it’ll be around forever.
During the 2017 E-Commerce One-to-One conference, Google shocked attendees by announcing the transition from search engines to virtual assistants. Virtual assistants like Siri, Alexa, and Google Home represent the first generation of Artificial Intelligence that will change the way search engines look for content.
There are numerous examples of what these assistants can do, such as:
“Hey Siri – It’s my brother’s birthday tomorrow. What should I get him?”
“Ok Google – How old is Donald Trump?”
“Ok Google – Order four pizzas for lunch and have them delivered at work.”
“Ok Google – Send Matt directions to the restaurant and make a reservation for two.”
What can we learn from this?
As many SEO specialists have found in the last few years, the first changes to search occur in the nature of our requests. Instead of searching for “Donald Trump age”, we ask our AI “How old is Donald Trump?” This is a big difference, but it doesn’t quite revolutionize SEO best practices. Since the end-user typically searches with long sentences anyway (e.g. typing out a full question), SEO experts are used to working with long-tail requests.
It is also possible for the same web page to be optimized for “Name of Manchester residents” and “What do you call people who live in Manchester?” Google does not necessarily grant the top search result to the website that contains the exact phrase of the request.
With its Hummingbird algorithm, Google is now able to look at the meaning of a sentence instead of just focusing on individual keywords. Suddenly, the content of a web page is much more capable of generating traffic without relying on keyword stuffing.
The real revolution is not in organic search, but in AI
To keep up with the changing landscape of SEO, we must adapt and change the competitive keywords we’re trying to optimize. Most users today are searching using long sentences, so it’s important to focus on long-tail keywords, which collectively drive a large share of search traffic.
The question at hand is, who is searching for information?
In the past, it was a user typing a request directly into a search engine. But today, an AI can formulate searches based on a user’s request. The idea that AIs might know what we want before we do is scary, but what will the consequences be for SEO?
The AI searches for and selects information
This change in who is selecting the information will completely change the way websites are optimized for SEO for two main reasons.
“Robot first” or “User first”?
If the AI selects information and presents it directly to the user, we may need to optimize our content and backlinks for the AI and not for the user. This does not mean going back to keyword stuffing, over-optimizing backlinks, or cloaking, but structuring content will become key to good SEO – even if it means sacrificing the UX/UI of a website. In the era of personal assistants, users are retrieving information without even looking at the websites that host it.
The user experience on Google is as follows:
- The user types or dictates a keyword or phrase on Google
- The search engine ranks the results and chooses the most relevant ones
- The user scrolls through the results and selects the one they want
- The user visits the website and evaluates its relevance
- The user then stays on the website or selects a different result
Google does not know whether the human is satisfied or not. All it can do is analyze the user’s behavior on the page and infer their level of satisfaction from it.
With personal assistants, the user experience is as follows:
- The user states a request out loud, or the machine suggests something on its own (e.g. “It’s your sister’s birthday soon, shall I buy her a gift?”)
- The machine searches several databases for the information it needs
- The machine chooses, among several websites, the most relevant based on the user’s habits
- The machine decides and presents the results to the user. It may also ask for confirmation or clarification regarding the request (e.g. the location where the user is looking for a restaurant)
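The assistant flow above hinges on one step search engines never had: ranking results against the user’s own habits. The sketch below illustrates that step in Python; the restaurants, tags, and habit data are invented for the example, not any real assistant API.

```python
# Illustrative sketch: how an assistant might rank candidate results
# against a user's known habits, as in the steps above.
# All names and data here are hypothetical.

def rank_by_habits(candidates, user_habits):
    """Order candidates by how many of the user's habits they match."""
    def score(candidate):
        return sum(1 for tag in candidate["tags"] if tag in user_habits)
    return sorted(candidates, key=score, reverse=True)

restaurants = [
    {"name": "Pizza Corner", "tags": {"italian", "casual"}},
    {"name": "Le Jardin", "tags": {"french", "vegetarian", "quiet"}},
]
habits = {"vegetarian", "quiet"}

# The assistant would present the top-ranked choice, then ask for
# confirmation or clarification (e.g. the location) before acting.
best = rank_by_habits(restaurants, habits)[0]
print(best["name"])
```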
Where does the AI get its data?
It is difficult for an AI to identify the price of a product, its description, its color, and different models available on the same webpage.
Google is trying to fix this problem in several ways:
- By using Rich Snippets that provide more context. The webpage will give the AI more details on its content (reviews, prices, dates of an event, etc.)
- By improving the semantic analysis of the AI. AI bots have made incredible progress in understanding human requests. However, it is easier for a robot to understand a spoken request than text on a webpage. There are two reasons for this. First, the text wasn’t written for the machine to understand and can contain numbers for the price of a product, as well as shipping costs or the prices of related products; this can be very confusing for the machine. The second reason is that the machine can’t decipher irony or nuance.
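Rich snippets solve exactly the ambiguity described above: instead of leaving the machine to guess which number on the page is the price, the page declares it in schema.org structured data. A minimal sketch, assuming a hypothetical product page (the product name, price, and ratings are invented):

```python
import json

# A minimal schema.org "Product" rich snippet serialized as JSON-LD.
# This gives an AI unambiguous price, availability, and review data,
# instead of free text where prices and shipping costs blur together.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Wireless Headphones",   # hypothetical product
    "offers": {
        "@type": "Offer",
        "price": "79.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.5",
        "reviewCount": "212",
    },
}

# On a real page this JSON would sit inside a
# <script type="application/ld+json"> tag in the HTML.
print(json.dumps(product, indent=2))
```

Because every value is labeled with a schema.org property, the machine no longer has to tell the product price apart from shipping costs or related-product prices.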
We need to start structuring our content to make it more AI-friendly, or else we won’t be able to reach the steadily-increasing number of AI users in the coming years.
Qualitative data: a key element to analyzing AI behavior
The objective of an AI personal assistant is to give the most accurate response to a user’s request. To achieve that goal, it will rely on the user’s past searches to select the most pertinent results.
It is also possible that the Google algorithm will soon select content via qualitative data. For example, the AI will be able to differentiate customer reviews left by users with a similar profile (i.e. age, location, work industry), from reviews left by users that are completely different.
To be more attractive to an AI, it is going to be more and more important to create content that is extremely precise and complete. Unlike humans, AIs are not limited in terms of the amount of content they can read and process, but the data will need to be well-structured to yield the best results.
In 2020-2025, the quantity and quality of information will be key to having good organic SEO and conversion rates.
In the age of virtual assistants, APIs are the key to efficient search
APIs are a more reliable source of data and can be used directly by AIs. An API gives other programs access to the information contained in a database; it is designed specifically to make machines work together seamlessly. In the age of AI personal assistants, an e-commerce website will need an API that lets assistants read from and write to its catalog easily.
For example, an AI might ask the API for information to decide if the page’s content is relevant or not. Then, once the AI receives the information and decides that the product is relevant, it will be able to trigger the buying process. The easiest way to do that would be to automatically create an account through the API with all the payment information. The AI could potentially ask the user if the product is suitable, and if not, cancel the order.
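The query-then-order flow described above can be sketched as follows. This is purely illustrative: the functions stand in for HTTP calls to a hypothetical shop API, and every endpoint name and field is invented for the example.

```python
# Illustrative sketch of the flow above: an AI queries an e-commerce API
# for product data, checks relevance, and triggers the buying process.
# All endpoints and fields are hypothetical.

def fetch_product(product_id):
    # Stand-in for an HTTP GET to something like /api/products/<id>
    return {"id": product_id, "name": "Board game", "price": 29.99, "in_stock": True}

def place_order(product_id, payment_profile):
    # Stand-in for an HTTP POST to something like /api/orders,
    # using an account created automatically with the payment info.
    return {"order_id": "A-1001", "status": "pending_confirmation"}

def assistant_buy(product_id, budget, payment_profile):
    product = fetch_product(product_id)
    if product["in_stock"] and product["price"] <= budget:
        # Relevant and affordable: trigger the order, pending the
        # user's confirmation (which could also cancel it).
        return place_order(product_id, payment_profile)
    return None  # not suitable: ask the user for an alternative instead

order = assistant_buy("bg-42", budget=40.0, payment_profile={"user": "Matt"})
print(order["status"])
```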
When AIs talk to each other
The next step will be AI coordination. Instead of communicating with an API, the artificial intelligence conducting the search will simply ask another AI (e.g. the AI in charge of the e-commerce website) to supply it with the data it needs. The notion of browsing a website could disappear and give way to groups of standardized content written just for AI. For example, let’s say you own a restaurant in Chicago; you could simply grant access to an AI (e.g. through the robots.txt file) capable of communicating with other AIs or humans (through live chat, for example) and provide information about your business automatically.
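A robots.txt file already lets a site declare which automated agents may read which paths, so granting a hypothetical AI agent access could look like the sketch below (the agent’s user-agent name is invented for the example):

```
# robots.txt for the hypothetical Chicago restaurant site
# "ExampleAssistantBot" is an invented name for an AI agent.
User-agent: ExampleAssistantBot
Allow: /menu/
Allow: /reservations/

# All other crawlers: keep them out of the back office.
User-agent: *
Disallow: /admin/
```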
This system of interconnected APIs could eventually lead to more complex behavior from the AI: “Matt, I noticed that you left your meeting later than expected. Your next meeting is in 20 minutes, so I requested an Uber for you. The car is waiting downstairs and you should arrive by 4:02pm. I have called Lisa Brown to inform her that you’ll be five minutes late. Her assistant will have a coffee waiting for you.”
How to prepare for these changes?
1) Include every bit of data that you can (e.g. rich snippets).
2) Connect the apps you use every day through services like Zapier or IFTTT, which use APIs to automate tasks.
3) Use AI at home or at work (e.g. chatbots, virtual assistants, etc.).
4) Keep working on the fundamentals of SEO; machines will always need to classify information according to its pertinence. Semantic analysis is more relevant than ever and online reputation (e.g. backlinks) will always be the core component of Google’s algorithm.
5) Be as precise and descriptive as possible when writing your content.
6) Use CMS platforms with large communities; their APIs will get better and better with time. If you own a proprietary technology, build your own API to prepare for the massive influx of AI in the coming years.