With the announcement from Google earlier this month that Googlebot now supports Chromium rendering engine version 74, we all sat back and breathed a collective sigh of relief, as this should mean we now have to spend less time on technical workarounds for Google’s crawlers.

Now classed as “Evergreen”, Googlebot will always use the latest version of Chromium, and can therefore access more websites built with modern technologies and features such as ES6 and the Web Components v1 APIs. Whilst there will always be some limitations on what Google can see and crawl (including certain JavaScript web components and WebGL), it will now see more website content than ever before, without the need for special workarounds to cater for SEO requirements.

Google have released a video series on YouTube detailing their advice for JavaScript SEO, which you can find here, along with a list of the JavaScript issues that still exist here.

What does this mean for website builds?

Whilst this change is welcome, there are still things that developers, SEOs and creative designers need to bear in mind when building websites:

  • We still need to ensure that websites are built in a search-engine-friendly manner and include all the basic elements that search engines use to crawl and rank, such as meta content, JSON-LD Schema markup, fresh-content functionality and header tags.
  • As Google can now see more content, website copy should be original, unique and compelling, now more than ever before. As a rule of thumb, content should always be of high quality and written for users, adding value to the website rather than acting as “filler”. Tools such as Copyscape can be invaluable when trying to guard against duplicate content problems.
  • Schema markup will continue to rise in importance as new Types are added to the Schema library, with this being used to return richer, more informative results for Google searchers in the results pages. The move to become “Evergreen” should mean that Googlebot is far less likely to struggle to read Schema markup, however it is presented.
  • SEO advisors should not become complacent about indexing, as the new engine does not shortcut the render queue: pages built with reactive JavaScript code must still wait to be rendered before their content can be indexed.
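To illustrate the Schema markup point above, here is a minimal sketch of generating a JSON-LD Organization block ready to drop into a page’s head. This is Python with hypothetical example values (the function name, site name and URLs are all illustrative), not a prescribed implementation:

```python
import json

def build_organization_jsonld(name, url, logo_url):
    """Assemble a Schema.org Organization object as a JSON-LD script tag."""
    data = {
        "@context": "https://schema.org",
        "@type": "Organization",
        "name": name,
        "url": url,
        "logo": logo_url,
    }
    # json.dumps keeps the markup valid even if values contain quotes
    return '<script type="application/ld+json">{}</script>'.format(
        json.dumps(data)
    )

# Hypothetical values for illustration only
snippet = build_organization_jsonld(
    "Example Agency",
    "https://www.example.com",
    "https://www.example.com/logo.png",
)
print(snippet)
```

The resulting script tag can be pasted into the page head and validated with Google’s structured data testing tools before going live.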