Google Search now requires JavaScript

Jono Brain

Founder & Technology Director

Google has announced that, starting January 2025, JavaScript must be enabled to use its search engine.

As shown in the image below, Google Search is inaccessible if JavaScript is disabled.

Why has Google done this?

Google has justified the change as a measure to improve security:

“Enabling JavaScript allows us to better protect our services and users from bots and evolving forms of abuse and spam.”

Essentially, by requiring JavaScript, Google can more effectively identify and block bots, scrapers, and other malicious activities that manipulate data or interfere with search results.

Does this make sense?

In a word, yes. The vast majority of users already have JavaScript enabled. Back in the days of basic browsers, or before the rise of smartphones, it made sense to support devices with limited or no JavaScript capabilities. However, in today’s world, almost every device with a browser supports JavaScript by default.

In fact, if you encounter a device that doesn’t support JavaScript, there’s a good chance it’s being used for less-than-legitimate purposes. This move reflects the evolving web landscape and prioritises both user security and functionality.

For the average user, this change is unlikely to have any noticeable impact.

So who is likely affected?

Bots and scrapers often operate without executing JavaScript, so this change suddenly locks them out of Google Search, which is precisely why Google made it.

However, not all bots and scrapers are harmful. In fact, some well-known tools and services, such as Ahrefs, SEMrush, and Moz, rely on bots to scrape Google search results in order to analyse keyword rankings and SEO performance. These platforms were forced to quickly upgrade their crawling technology to support JavaScript rendering so they could continue operating effectively.

What are these bad bots and scrapers?

There are multiple reasons why bots and scrapers are used, but Google’s primary focus with this change is to crack down on malicious tools designed to exploit its systems. One major target is bots used to exploit pay-per-click (PPC) advertising by artificially generating clicks on competitors’ ads. This tactic wastes competitors’ ad budgets and skews their campaign analytics, creating unfair competition and financial losses.

Another common misuse involves bots designed to manipulate Google’s search rankings. These bots send fake traffic to specific pages to artificially boost their rankings, which can distort the fairness of search results. Additionally, some bots are used to spam Google with false reports about a competitor’s site, aiming to trigger penalties and harm their SEO performance. By requiring JavaScript, Google aims to mitigate these harmful practices and protect the integrity of its search engine.

Will this affect how my website is indexed?

The short answer is no: your website can still be indexed whether or not it uses JavaScript. Google increasingly executes JavaScript during indexing, so even sites that rely on it can have their content crawled.

If your site relies on JavaScript to load key content, such as product descriptions, text, or links, Google must execute the JavaScript to access and index that content. If the JavaScript isn’t properly optimised or doesn’t render correctly, important parts of your site could be missed, potentially impacting your rankings.

That said, relying solely on client-side rendering (where JavaScript runs in the browser to load content) is not ideal. Google’s crawlers may take longer to fully index your pages since they need to render the content first, which could delay how quickly your site appears in search results. We strongly recommend using server-side rendering (SSR) or dynamic rendering to ensure all content is visible to Google immediately, improving both indexing speed and accuracy.

5 key steps to ensure your website is indexed optimally by Google

To ensure your website is indexed efficiently and ranks well on Google, focusing on technical and performance factors is crucial. Here are five key areas to optimise:

1. Core Web Vitals

Google heavily emphasises Core Web Vitals as a measure of a website’s user experience. These metrics focus on how quickly and smoothly your website operates. For example, Largest Contentful Paint (LCP) measures how fast the main content of your page loads, Interaction to Next Paint (INP), which replaced First Input Delay (FID) as a Core Web Vital in March 2024, tracks how quickly your page responds to user interaction, and Cumulative Layout Shift (CLS) ensures that elements on the page don’t unexpectedly move around during loading. By improving these metrics, such as through optimising images, minimising JavaScript, and improving server response times, you provide a better experience for users while boosting your chances of being indexed favourably by Google.
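Google’s published thresholds for these metrics can be expressed as a small helper. The boundary values below are Google’s documented “good” / “needs improvement” / “poor” cut-offs; the function itself is purely illustrative, not an official API:

```typescript
// Illustrative classifier for Core Web Vitals readings, using Google's
// published thresholds. LCP, INP, and FID are in milliseconds; CLS is a
// unitless layout-shift score.
type Metric = "LCP" | "INP" | "FID" | "CLS";
type Rating = "good" | "needs-improvement" | "poor";

// [upper bound of "good", upper bound of "needs-improvement"] per metric
const THRESHOLDS: Record<Metric, [number, number]> = {
  LCP: [2500, 4000], // milliseconds
  INP: [200, 500],   // milliseconds (replaced FID in March 2024)
  FID: [100, 300],   // milliseconds (legacy metric)
  CLS: [0.1, 0.25],  // unitless
};

function rate(metric: Metric, value: number): Rating {
  const [good, needsImprovement] = THRESHOLDS[metric];
  if (value <= good) return "good";
  if (value <= needsImprovement) return "needs-improvement";
  return "poor";
}

console.log(rate("LCP", 2100)); // "good"
console.log(rate("CLS", 0.3));  // "poor"
```

In a real page these readings would come from the browser’s performance APIs (or Google’s `web-vitals` library); the classifier only shows how the scores map onto Google’s bands.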

2. Server-Side Rendering (SSR)

Server-Side Rendering (SSR) is critical for ensuring Google can easily crawl and index your website. With SSR, your server processes the page and delivers fully rendered HTML to the browser. This eliminates the reliance on JavaScript for Google to access your site’s content, reducing the risk of missed information during indexing. If your site depends heavily on JavaScript, SSR ensures that your key content is available to Google immediately. Alternatively, dynamic rendering can be used, where pre-rendered pages are sent to search engines, while regular users interact with the client-side rendered version. This approach ensures your site remains search engine-friendly without sacrificing functionality.
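The routing decision at the heart of dynamic rendering can be sketched in a few lines. The user-agent patterns and handler below are illustrative assumptions, not an exhaustive or production-ready implementation:

```typescript
// Sketch of the user-agent check behind dynamic rendering: known search
// crawlers receive a pre-rendered HTML snapshot, while regular visitors
// get the client-side rendered app. The bot list is illustrative only.
const BOT_PATTERNS: RegExp[] = [
  /Googlebot/i,
  /Bingbot/i,
  /DuckDuckBot/i,
  /Baiduspider/i,
];

function isSearchBot(userAgent: string): boolean {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent));
}

// Hypothetical request handler: the returned strings stand in for
// whatever pre-rendering pipeline and client bundle the site uses.
function handleRequest(userAgent: string): string {
  return isSearchBot(userAgent)
    ? "<!-- fully rendered HTML snapshot for crawlers -->"
    : "<!-- minimal shell + client-side JavaScript for users -->";
}

console.log(isSearchBot("Mozilla/5.0 (compatible; Googlebot/2.1)")); // true
```

In practice the snapshot would be produced by an SSR framework or a pre-rendering service; the key design point is that both audiences ultimately see the same content, which keeps the approach within Google’s guidelines.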

3. Structured Data

Structured data, implemented through schema.org markup, helps Google understand the content on your site more effectively. By adding structured data, you enable Google to display enhanced results, such as rich snippets, featured snippets, or knowledge panels. For example, product schema can showcase pricing, availability, and reviews for eCommerce sites, while FAQ schema can highlight common questions directly in search results. Structured data not only helps with better indexing but also improves how your content appears in search results, making it more engaging and increasing click-through rates.
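As a sketch of what Product schema looks like in practice, the object below builds a minimal schema.org JSON-LD block. All product values are placeholders; a real site would inject its own data into a `<script type="application/ld+json">` tag in the page head:

```typescript
// Minimal schema.org Product JSON-LD, built as a plain object.
// Every value here is a placeholder for illustration.
const productJsonLd = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Example Widget",
  description: "A placeholder product used to illustrate Product schema.",
  offers: {
    "@type": "Offer",
    price: "19.99",
    priceCurrency: "GBP",
    availability: "https://schema.org/InStock",
  },
  aggregateRating: {
    "@type": "AggregateRating",
    ratingValue: "4.6",
    reviewCount: "128",
  },
};

// The tag that would be embedded in the page's HTML:
const jsonLdScript =
  `<script type="application/ld+json">${JSON.stringify(productJsonLd)}</script>`;
```

Google’s Rich Results Test can validate markup like this before it goes live, which is worth doing since malformed structured data is simply ignored rather than flagged.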

4. Mobile-First Optimisation

Since Google uses mobile-first indexing, the mobile version of your site is the primary version Google uses for ranking and indexing. This makes mobile optimisation a must. Your site needs to be fully responsive, ensuring that it displays properly on devices of all sizes. Additionally, mobile performance must be fast, especially for users on slower mobile networks. Avoid intrusive pop-ups and interstitials that could harm the user experience, as they can negatively impact rankings. Regularly test your mobile site using tools like Google’s Mobile-Friendly Test to ensure it performs seamlessly.

5. Crawlability and Indexing Hygiene

To make sure Google can efficiently crawl and index your site, focus on improving your website’s overall crawlability. Start by creating and submitting an XML sitemap to Google Search Console to guide crawlers to your site’s most important pages. Check your robots.txt file to ensure that you’re not accidentally blocking Google from accessing critical content. Address broken links and 404 errors, as they can disrupt the crawling process. Additionally, use canonical tags to indicate the preferred version of a page, helping Google avoid duplicate content issues and ensuring your site is indexed as intended.
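The XML sitemap mentioned above follows a simple, well-defined format. The generator below is a minimal sketch, and the URLs are placeholders; most CMS and framework ecosystems provide this out of the box:

```typescript
// Minimal XML sitemap generator following the sitemaps.org protocol.
// The URL entries are illustrative placeholders.
interface SitemapUrl {
  loc: string;      // absolute URL of the page
  lastmod?: string; // optional ISO date, e.g. "2025-01-15"
}

function buildSitemap(urls: SitemapUrl[]): string {
  const entries = urls
    .map((u) =>
      [
        "  <url>",
        `    <loc>${u.loc}</loc>`,
        ...(u.lastmod ? [`    <lastmod>${u.lastmod}</lastmod>`] : []),
        "  </url>",
      ].join("\n"),
    )
    .join("\n");
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    entries,
    "</urlset>",
  ].join("\n");
}

const sitemapXml = buildSitemap([
  { loc: "https://www.example.com/", lastmod: "2025-01-15" },
  { loc: "https://www.example.com/products" },
]);
```

The resulting file is typically served at `/sitemap.xml` and submitted via Google Search Console, which then reports which of the listed pages were actually crawled and indexed.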

How healthy is your website?

If you’re unsure how well Google is indexing your website or are facing challenges with Google Core Web Vitals, we’re here to help. Get in touch for a free website health check or book a 30-minute consultation with our Technology Director, Jono Brain.

Poor Performance?

Struggling with Google Core Web Vitals?

We understand that a fast and accessible website is essential for user experience and SEO, so we focus on creating ultra-fast, optimised websites with outstanding Core Web Vitals scores.

Learn how we upgraded Amtico's sluggish, outdated Umbraco 7 website into a cutting-edge solution powered by Storyblok, outperforming competitors with a less than 1-second loading time.

Book a 30-minute Headless CMS demonstration

With our Technology Director, Jono Brain