
Which JavaScript Features Does Googlebot Support?

Google’s web crawlers have supported JavaScript for many years, but “support” is an ambiguous term. JavaScript is a fast-moving target, and although Googlebot can process many JavaScript features, it is possible to do a lot of damage to a site’s SEO by relying on features that the crawlers can’t handle.

The issue of JavaScript support has become increasingly important over the last few years. The rise of client-side web frameworks like React, Vue, and Angular has driven a move away from traditional server-side rendering.

Today, developers often prefer to render content in the browser with JavaScript. The server sends an initial HTML app shell and the site’s JavaScript code; everything else is loaded from an API and rendered by the client-side code.

WordPress’s REST API was built for just this type of use. Magento has enthusiastically embraced Progressive Web Apps for eCommerce front-ends. Progressive Web App frameworks are available for WooCommerce.

But if a WordPress site or a Magento or WooCommerce store uses JavaScript to render its content, that content is invisible to any crawler that doesn’t support the relevant features.
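To make that concrete, here is a hypothetical client-side rendering snippet: the server ships an empty app shell, and the visible content only exists after this JavaScript runs (the endpoint is the standard WordPress REST API posts route, used purely as an illustration):

// The HTML shell contains only an empty <div id="app">; everything the
// visitor sees is fetched and rendered by this script after page load.
fetch('/wp-json/wp/v2/posts')
  .then(function (response) { return response.json(); })
  .then(function (posts) {
    document.getElementById('app').innerHTML = posts
      .map(function (post) { return '<h2>' + post.title.rendered + '</h2>'; })
      .join('');
  });
// A crawler that cannot run this code sees nothing but the empty shell.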

Which JavaScript Features Can Googlebot Understand?

In a nutshell, Google’s crawlers can understand ES5 and older. They cannot understand the new features added to JavaScript in ES6, which is a shame because ES6 is awesome.

ES stands for ECMAScript, the name of the standard on which JavaScript implementations are based. Each release is given a number. ES5 was released in 2009 and introduced various new Object methods and many of the Array methods – such as array.map and array.reduce – that are much loved by JavaScript developers with a functional bent.
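For example, this little ES5-only snippet (the prices array is just illustrative data) uses nothing Googlebot would struggle with:

// Plain ES5: var, function expressions, and the ES5 Array methods.
var prices = [10, 25, 40];

// Array.prototype.map builds a new array by transforming each element.
var withTax = prices.map(function (price) {
  return price * 1.2;
});

// Array.prototype.reduce folds the array down to a single value.
var total = withTax.reduce(function (sum, price) {
  return sum + price;
}, 0);

console.log(withTax); // [12, 30, 48]
console.log(total);   // 90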

ES6 was released in 2015 and brought some big changes to JavaScript, including classes, arrow functions, object destructuring, promises, the let and const syntax, iterators, generators, and a lot more.
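Rewritten with ES6 syntax, the same illustrative snippet looks quite different – const, arrow functions, destructuring, and template literals are all post-ES5 additions:

// ES6: const, arrow functions, destructuring, and template literals.
const prices = [10, 25, 40];

const withTax = prices.map(price => price * 1.2);
const total = withTax.reduce((sum, price) => sum + price, 0);

// Object shorthand and destructuring are also ES6 features.
const order = { customer: 'Ada', total };
const { customer } = order;
console.log(`${customer} owes ${total}`); // "Ada owes 90"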

But here is the problem: at the time of writing, Google’s crawlers are based on Chrome 41, which was released just before ES6. They do not support any of the whizzy new features. That wasn’t much of a problem previously, because most of the browsers in use didn’t support them either: developers either stuck with the older syntax or transpiled their JavaScript to ES5 using Babel.
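If you go the transpilation route, a minimal Babel configuration might look something like the sketch below – it assumes Babel 7 with the @babel/preset-env package installed, and uses a Chrome 41 target so that syntax Googlebot’s renderer can’t handle gets compiled away:

// babel.config.js – a minimal sketch, assuming @babel/preset-env is installed.
module.exports = {
  presets: [
    // Targeting Chrome 41 tells Babel to rewrite any newer syntax into
    // equivalent ES5 code that the crawler's rendering engine understands.
    ['@babel/preset-env', { targets: 'chrome 41' }],
  ],
};

Note that transpilation only rewrites syntax; new built-ins such as Promise still need a separate polyfill.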

But as browser support has improved, developers – especially library developers – are releasing production code that uses a lot of ES6 features, and if web developers aren’t careful, they can ship code that Google’s crawlers don’t understand.

Three Solutions To Google’s Language Barrier

There are three basic solutions to Google’s non-comprehension of ES6, plus a fourth option that Google itself has recently introduced:

  1. Don’t release ES6 code to production: transpile it to ES5 or avoid it altogether.
  2. Stick to traditional server-side rendering with Node, PHP, or other server-side languages.
  3. Render the initial load of a client-side application on the server. This is a popular choice and frameworks like Next.js make it fairly straightforward. The initial view, content included, is rendered on the server and sent to the browser. Thereafter content is loaded by the client-side application in the traditional way.
  4. Google has recently introduced a new option: dynamic rendering. With dynamic rendering, content sent to the web crawler is rendered on the server, but content sent to browsers is rendered on the client (a rough sketch of the idea follows this list). If that sounds like a breach of Google’s rules against sending different content to browsers and crawlers, that’s because it is, but Google is making an exception because it recognizes that its crawlers are holding developers back.
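As a rough sketch of how dynamic rendering can work – assuming an Express server and a hypothetical renderToHtml() helper that would, in a real setup, call a headless browser service such as Rendertron or Puppeteer – the server inspects the user agent and sends crawlers pre-rendered HTML while ordinary visitors get the client-side app:

// Dynamic rendering sketch: crawlers get server-rendered HTML, browsers get
// the normal app shell. Assumes Express; renderToHtml() is a stand-in stub.
const express = require('express');
const path = require('path');

const app = express();

// Known crawler user agents get the server-rendered version.
const BOT_UA = /googlebot|bingbot|yandexbot|baiduspider/i;

// Hypothetical helper: a real implementation would ask a headless browser
// (for example Rendertron or Puppeteer) for the fully rendered markup.
async function renderToHtml(url) {
  return '<html><body>Pre-rendered content for ' + url + '</body></html>';
}

app.get('*', async (req, res) => {
  if (BOT_UA.test(req.headers['user-agent'] || '')) {
    // Crawlers receive HTML that has already been rendered on the server.
    res.send(await renderToHtml(req.originalUrl));
  } else {
    // Browsers receive the app shell and render content client-side as usual.
    res.sendFile(path.join(__dirname, 'dist', 'index.html'));
  }
});

app.listen(3000);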

Googlebot is likely to lag behind browsers for the foreseeable future, so if you want to embrace Progressive Web Applications for your business or eCommerce store, it’s worth taking the time to understand exactly which JavaScript features the crawlers can handle.
