A couple of days ago I was giving a presentation on building chatbots and the business opportunities they represent (also the subject of a free chatbot course I created).
I was very fortunate to have an engaged audience that asked many great questions, one of which is particularly relevant to this blog. An attendee asked, “What is the impact of chatbots on SEO?”
What’s the impact of chatbots on SEO?
My answer was that chatbots are so new that Google is unlikely to have already made algorithm changes specifically targeting them.
All we have to go by are the SEO rules we already know. From those rules, we can deduce the likely impact of chatbots on SEO.
Google does not interrogate a chatbot the way a human would, so it cannot crawl content that exists only inside the conversation. Any information your chatbot delivers should therefore also be available on the site in traditional, crawlable pages.
Provided you do that, chatbots are a net positive: they increase engagement and time on page while lowering bounce rates. These are all signals of a quality site, which Google rewards with better rankings in the results pages.
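For example, if your chatbot answers questions from a structured FAQ, you can render those same question-and-answer pairs into plain HTML pages that the crawler can index. Here is a minimal sketch in Python; the `faq` dictionary, its slugs, and the `faq/` output directory are all hypothetical stand-ins for whatever knowledge base your chatbot actually draws on:

```python
# Minimal sketch: mirror a chatbot's Q&A content as static, crawlable HTML.
# The `faq` dict below is a hypothetical stand-in for your bot's knowledge base.
import html
from pathlib import Path

faq = {
    "what-is-a-chatbot": (
        "What is a chatbot?",
        "A chatbot is a program that answers user questions in a conversational interface.",
    ),
    "chatbots-and-seo": (
        "Do chatbots hurt SEO?",
        "No, as long as the same information is also published on regular pages.",
    ),
}

out_dir = Path("faq")
out_dir.mkdir(exist_ok=True)

for slug, (question, answer) in faq.items():
    page = (
        "<!DOCTYPE html>\n"
        f"<html><head><title>{html.escape(question)}</title></head>\n"
        f"<body><h1>{html.escape(question)}</h1>\n"
        f"<p>{html.escape(answer)}</p></body></html>\n"
    )
    # One plain HTML page per answer, so Google can crawl what the bot says.
    (out_dir / f"{slug}.html").write_text(page, encoding="utf-8")
```

However you generate these pages, the point is the same: every answer the bot can give should also live at a stable, linkable URL.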
How do you future-proof your blog for Google?
The question, and the answer I developed on the fly, led me to think about something more general: how do you future-proof a site against Google’s constant algorithm changes?
Google fine-tunes its algorithm with minor updates hundreds of times per year. Once in a while, it rolls out a major update that drastically changes results and, in the process, affects millions of site owners.
You might be familiar with classic updates such as Panda, Penguin, and Hummingbird.
We can predict the future with a higher degree of accuracy if we first study the past.
The general trend Google has shown over the years is serving the end user: it has increasingly rewarded sites that provide value to users and penalized those that do not.
For example, Google loves long content because it is more likely to provide the answer that the user is looking for, or to inform them thoroughly on a given topic.
When users click on a result and stay on the site for a long time, it’s an indication that the site has something valuable to offer to the user.
Speed helps the user. A responsive design that works well on mobile phones and tablets helps the user.
Artificially repeating the keyword the user searched for 50 times on the page does not help the user in any way. This so-called “keyword stuffing” worked when Google’s ranking algorithm was much simpler, but today it will actually get your site penalized.
Do what’s best for the user
Your North Star, and the only way to future-proof your blog against Google’s algorithm changes, is to do right by your readers. When making a choice that could affect rankings down the line, ask yourself: how will this help my users?
It would be very uncharacteristic of Google to penalize you for helping your users further. If anything, as the algorithm becomes even more intelligent and refined, you are likely to be rewarded all the more for it.
At any rate, it’s the right thing to do, so do it anyway and trust that Google will take notice in the future.