
Google JavaScript Warning and Its Link to AI Search

A recent discussion within the Google Search Relations team highlights a challenge in web development: getting JavaScript to work well with modern search tools.

In Google’s latest Search Off The Record podcast, the team discussed the growing use of JavaScript and the tendency to use it when it’s not needed.

Martin Splitt, Search Developer Advocate at Google, noted that JavaScript was created to help websites compete with mobile apps, introducing features like push notifications and offline access.

However, the team warned that excitement over JavaScript functionality can lead to overuse.

Although JavaScript is handy in many cases, it is not the best choice for every part of a website.

The JavaScript spectrum

Splitt described the current landscape as a spectrum between traditional websites and web applications.

He says:

“We’re in this weird state where websites can be just that: websites, basically pages and information presented on multiple pages and linked, but it can also be an application.”

He offered the following example from the JavaScript spectrum:

“You can visit an apartment in the browser… it’s a website because it presents information like the square footage, what floor that apartment is on, what the address is… but it’s also an app because you can use a 3D view to navigate the apartment.”

Why is this important?

John Mueller, Google Search Advocate, noted a common trend among developers to rely excessively on JavaScript:

“A lot of people like these JavaScript frameworks, and they use them for things where JavaScript really makes sense, and then they wonder, ‘Why don’t I use it for everything?’”

While listening to the discussion, I was reminded of a study I covered recently. According to that study, over-reliance on JavaScript can create problems for AI search engines.

Given the growing importance of AI search bots, I thought it was important to highlight this conversation.

Although traditional search engines generally support JavaScript well, its implementation requires greater consideration in the age of AI search.

The study finds that AI bots account for a growing percentage of search bot traffic, but these bots cannot render JavaScript.

This means you could lose traffic from search engines like ChatGPT Search if you rely too much on JavaScript.

Things to consider

The use of JavaScript and the limitations of AI crawlers present several important considerations:

  1. Server-side rendering: Since AI crawlers cannot run client-side JavaScript, server-side rendering is essential to ensure visibility.
  2. Content accessibility: Major AI crawlers, such as GPTBot and Claude, have distinct preferences when it comes to content consumption. GPTBot favors HTML content (57.7%), while Claude focuses more on images (35.17%).
  3. New development approach: These new constraints may require reevaluating the traditional “JavaScript first” development strategy.
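To illustrate the first point, here is a minimal sketch of server-side rendering. The listing data and the `renderListing` helper are hypothetical (they come from neither the podcast nor the study); the idea is simply that the content is assembled into HTML on the server, so it reaches crawlers that never execute client-side JavaScript.

```javascript
// Hypothetical listing data, for illustration only.
const listing = {
  address: "123 Example St",
  floor: 4,
  squareFootage: 850,
};

// Server-side render: build the complete HTML string before it is sent,
// so the content is visible even to bots that cannot run JavaScript.
function renderListing(l) {
  return [
    "<article>",
    `  <h1>${l.address}</h1>`,
    `  <p>Floor ${l.floor}, ${l.squareFootage} sq ft</p>`,
    "</article>",
  ].join("\n");
}

const html = renderListing(listing);
```

In a client-rendered setup, by contrast, the server would send an empty shell (for example, `<div id="app"></div>`) and a script, and bots that skip script execution would see none of the listing details.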

The way forward

As AI crawlers become more and more important for website indexing, you need to balance modern functionality and accessibility for AI crawlers.

Here are some recommendations:

  • Use server-side rendering for key content.
  • Make sure to include basic content in the initial HTML.
  • Apply progressive enhancement techniques.
  • Be careful about when to use JavaScript.
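One quick way to act on the second recommendation is to audit what your server actually sends before any JavaScript runs. The helper below is a hypothetical sketch, not an official tool: given the raw HTML of the initial response, it reports which key phrases are missing from that payload and would therefore be invisible to crawlers that cannot render JavaScript.

```javascript
// Hypothetical audit helper: check whether key content appears in the
// initial HTML payload, before any client-side JavaScript executes.
function missingFromInitialHtml(html, phrases) {
  return phrases.filter((phrase) => !html.includes(phrase));
}

// Example: a page whose square footage is only added later by a script.
const serverHtml = '<main><h1>123 Example St</h1><div id="app"></div></main>';
const missing = missingFromInitialHtml(serverHtml, [
  "123 Example St", // present in the initial HTML
  "850 sq ft",      // rendered client-side, so invisible to these bots
]);
// missing → ["850 sq ft"]
```

Running a check like this against your server's response (rather than the browser-rendered DOM) approximates what a non-rendering AI crawler sees.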

To be successful, adapt your website to traditional search engines and AI crawlers while ensuring a good user experience.

Listen to the full podcast episode below:


Featured Image: Ground Photo/Shutterstock