How to Ensure Organic Traffic Sustainability During a Major Website Technology Migration

If you have ever experienced a major website technology upgrade as an SEO, or are currently going through one, this case study is for you. It focuses on the process of ensuring your site's health during such a massive change, the challenges you'll face, and the solutions I found for a smooth migration.


In my case, the business decided that one of its big milestones for the year was to upgrade the site's JS framework from Angular 1.6 to Angular 6 (AKA Angular Universal), while ensuring no regression in the site's organic traffic. And we are talking about a huge site with millions of active users.

The original timelines for this super challenging project were as follows:

Q1 – Q3: JS framework upgrade

Q4: SEO implementation

So, what do you do with this information? Can the SEO work wait while the site moves to Angular 6, or do we need to raise a BIG red flag before development starts? Will the current Dynamic Rendering crawling solution still be supported after the upgrade?

I had no answers at that point, so I decided to take it step by step: break this complex task into smaller tasks and tackle them one at a time. Here is what I did:

Step 1: Research

JavaScript frameworks are regularly updated, which is one of the key reasons they become an SEO concern. At the time, Google had openly announced that it crawled and rendered websites using Google Chrome 41, which was originally released in 2015.

This means any JavaScript framework that is not compatible with Chrome 41 will likely not be compatible with Google's rendering capabilities. Well, that was back then; today we have the evergreen Googlebot.

That said, a newer JavaScript framework does not necessarily mean Google won't be able to render it. New framework versions still support the same tools and features as previous Angular versions, but add new features that are only available in that specific version or newer.

This means that if we switched to Angular 6, the website could very well be fully rendered by Google, depending heavily on how it was coded and which JavaScript framework features were used.

Step 2: Site Audit

Switching to Angular 6

Before switching to Angular 6, I had to understand some key elements that might negatively impact organic performance.

None of the visible content loaded on the page should rely on the latest tools released in Angular 6. For example, if we use tools unique to Angular 6 and newer to load the main navigation, the main body, etc., there is a high chance that Google will not be able to load them, since back then Google only crawled and rendered using a much older browser (Chrome 41).

When switching to Angular 6, I highly recommend using Angular 6-specific tools only to improve how the website interacts with users, because Google cannot simulate user interactions.

Meaning, Google cannot click on a JavaScript-enabled tab and see content that is dynamically loaded upon a user interaction. Therefore, all interactive functions on the site cannot be used or seen by search engines.

Therefore, all the content loaded on the website should still be fully visible, even when using an older browser such as Chrome 41.
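A common way to satisfy this rule is to render every tab panel into the initial markup and only toggle visibility on interaction, rather than injecting content on click. The sketch below illustrates that pattern; the names (`renderTabs`, `TabPanel`, the CSS class names) are my own illustrations, not code from the actual project.

```typescript
// Illustrative sketch: emit all tab panels into the DOM up front and switch
// visibility with a CSS class, so a crawler that never clicks still sees
// every panel's content.

interface TabPanel {
  title: string;
  bodyHtml: string;
}

function renderTabs(panels: TabPanel[], activeIndex = 0): string {
  return panels
    .map(
      (p, i) =>
        // Hidden panels are still present in the markup -- only styling differs.
        `<section class="tab ${i === activeIndex ? "tab--active" : "tab--hidden"}">` +
        `<h2>${p.title}</h2>${p.bodyHtml}</section>`
    )
    .join("");
}

const tabsHtml = renderTabs([
  { title: "Overview", bodyHtml: "<p>Event details</p>" },
  { title: "Reviews", bodyHtml: "<p>User reviews</p>" },
]);
// Both panels exist in the DOM even though only the first is visible.
```

Clicking a tab would then only swap `tab--active`/`tab--hidden` classes client-side, which is exactly the kind of user-experience enhancement that is safe for SEO.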

DOM Content Load Time

How fast the DOM is loaded can have a critical impact on organic performance.

Unlike humans, computers struggle to understand when a page is fully loaded.

Generally, most tools depend on network activity. When you run a page speed test, the tool has no idea when the page is actually fully loaded. Therefore, it monitors network activity and determines the page has fully loaded after no network activity has been detected for 2 seconds.

With this in mind, we need to ensure the DOM loads continuously, with no pauses in parsing, whether caused by slow server response times from certain API scripts or by waiting for an event to fire upon a user interaction.

Any delays in loading the DOM, or dependencies on a user interaction, will be completely missed by Google's crawler, resulting in a negative impact on organic performance.

A prime example of this can be seen on the website's homepage, where the main body module takes a bit too long to load. Either there is no network activity for a short period, causing Google to think the page has finished and to end the render (most likely), or the DOM load time exceeds Google's timeout.

Therefore, we must ensure that any render-blocking scripts are removed or moved to the bottom of the page, that the DOM is continuously parsed without any pauses, and that load times are extremely fast.

Step 3: Data Analysis, Forecast & Potential Risks

Failing to follow the advice above can lead to significant ranking drops, resulting in a loss of traffic and revenue to the business due to any of these issues:

  1. Loading key content using features only available on Angular 6+

  2. Loading links using features only available on Angular 6+

  3. Pauses in the DOM parsing

  4. DOM full load time taking too long (3+ seconds)

In order to understand the impact of ignoring the recommendations above, I have included 3 forecasts to help simulate what could happen. The forecasts were calculated from Avg. Search Volume, Position, and Estimated Traffic (using an Avg. CTR per Position model) for each keyword ranked in positions 1-10.
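The forecast arithmetic boils down to: estimated traffic per keyword = average search volume × an assumed average CTR for its ranking position, summed across keywords. The sketch below shows the shape of that calculation; the CTR curve and all names are made-up illustrations, not the actual model or data used.

```typescript
// Illustrative Avg.-CTR-per-position model (values are assumptions).
const AVG_CTR_BY_POSITION: Record<number, number> = {
  1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
  6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.018,
};

interface RankedKeyword {
  keyword: string;
  avgSearchVolume: number; // average monthly searches
  position: number;        // current ranking position
}

// Sum of (volume x CTR-at-position) over keywords ranked 1-10.
function estimatedTraffic(keywords: RankedKeyword[]): number {
  return keywords
    .filter((k) => k.position >= 1 && k.position <= 10)
    .reduce(
      (sum, k) => sum + k.avgSearchVolume * AVG_CTR_BY_POSITION[k.position],
      0
    );
}

// e.g. 10000 * 0.30 + 4000 * 0.10 = 3400 estimated monthly visits
estimatedTraffic([
  { keyword: "concert tickets", avgSearchVolume: 10000, position: 1 },
  { keyword: "theatre tickets", avgSearchVolume: 4000, position: 3 },
]);
```

Each impact scenario below then amounts to re-running this sum with worsened positions for the affected portion of the keyword set.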

Low Impact

Low Organic Traffic Impact Graph

This forecast presumes that certain new features unique to Angular 6 and newer versions are used to load a small portion of the website.

This includes smaller interactive elements on the page and may include some internal links to deeper event pages, but does not include the main body text, navigation links, etc.

Medium Impact

Medium Organic Traffic Impact Graph

This forecast presumes that certain new features unique to Angular 6 and newer versions are used to load several parts of the website; these can include the main body, navigation, or other key elements on the site.

This includes smaller interactive elements on the page, all internal links to deeper event pages, and in some circumstances the main body content.

High Impact

High Organic Traffic Impact Graph

This forecast presumes that many new features unique to Angular 6 and newer versions are used to load significant parts of the website, including the main body, navigation, and other key elements on the site.

Step 4: Recommendations

The dev team can utilize Angular 6's latest features to build all elements of the website's front-end, be it for speed improvements or new capabilities, under one condition: we must utilize server-side rendering.

With server-side rendering, we pass the parsed DOM directly to users and search engines. This means it does not matter whether Google's crawler supports Angular 6, as it will not need to execute or attempt to render it.

Using server-side rendering for all modules on the website allows developers to use any framework they desire without any SEO repercussions. Additionally, it goes without saying that any JavaScript features that load client-side and allow the user to interact with the website are not an SEO concern.

To conclude, I highly recommend using server-side rendering.
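To show what "passing the parsed DOM directly" means in practice, here is a framework-agnostic sketch: the server assembles the complete HTML (title, meta, body content, links) before responding, so crawlers receive finished markup with no JavaScript execution required. In the real project this would be Angular Universal; the types and function below are illustrative assumptions only.

```typescript
// Hypothetical page data for an event page.
interface EventPage {
  title: string;
  description: string;
  bodyHtml: string; // main content, already rendered to HTML on the server
}

// Build the full document server-side. A crawler gets all content in the
// initial response; client-side JS only hydrates interactivity afterwards.
function renderEventPage(page: EventPage): string {
  return [
    "<!doctype html>",
    "<html><head>",
    `<title>${page.title}</title>`,
    `<meta name="description" content="${page.description}">`,
    "</head><body>",
    page.bodyHtml,
    // Deferred script: no render-blocking, interactivity added post-load.
    '<script src="/main.js" defer></script>',
    "</body></html>",
  ].join("\n");
}
```

Note the `defer` attribute on the client bundle: it keeps the script from blocking DOM parsing, which also addresses the render-blocking concern from Step 2.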

SEO Checklist for Dev

I wrote down all the required, high-level SEO components to ensure developers have a checklist to follow when developing a website in Angular 6 with respect to SEO. Certain features within Angular 6 cannot be interpreted by search engines such as Google.

Therefore, incorrectly coding the website can negatively impact organic performance, impacting traffic and revenue.

I provided the dev team with a checklist to ensure Google can fully render an Angular 6-built website.

  • Using Chrome 41, are all the elements on the page fully visible without any user interaction?

  • Angular 6-specific features should only be used to improve the user experience of the website, for example after a user has interacted with it. Because Google uses Chrome 41, we must not use Angular 6-specific features to load any parts of the website; otherwise Google will not be able to render them.

  • All links loaded must use valid HTML hyperlink tags, for example <a href="URL">Anchor Text</a>. These links must be visible within the DOM of Chrome 41.

  • The website must be built so that it can support server-side rendering in the near future.

  • Ensure the website serves correct status codes. For example, serve a 404 status code on a nonexistent page, and ensure 301 redirects work correctly.

  • pushState redirects should be avoided throughout the website.

  • Main navigation links should be fully visible in the DOM without any user-interaction.

  • Tabbed content should be fully visible within the DOM and not dynamically injected upon opening a tab.

  • DEV FAQ – Do we still need server-side rendering if we obey all the rules mentioned in this document?

Yes, primarily because the site is very dynamic and content changes multiple times per day. Although Google can render websites, it does not do so often. This means a newly published event could take Google a week to see and index; by that time, the event could well have expired.

Server-side rendering will allow Google to see all the content on the website without having to depend on rendering it, resulting in new events/pages being indexed within a matter of minutes or hours.
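As a quick aid for the hyperlink item on the checklist, a crude audit check can verify that a URL is reachable through a real `<a href>` tag rather than a click handler on a `<div>` or `<span>`. The helper below is a simplified, regex-based illustration of my own (a real audit would parse the rendered DOM instead).

```typescript
// Returns true only if the rendered HTML links to targetUrl via a genuine
// <a href="..."> anchor -- the only link form crawlers reliably follow.
function hasCrawlableLink(html: string, targetUrl: string): boolean {
  // Escape regex metacharacters in the URL before embedding it in a pattern.
  const escaped = targetUrl.replace(/[.*+?^${}()|[\]\\/]/g, "\\$&");
  const pattern = new RegExp(`<a\\b[^>]*href=["']${escaped}["']`, "i");
  return pattern.test(html);
}

// A real anchor passes the check.
hasCrawlableLink('<a href="/events/123">Event</a>', "/events/123"); // true
// A JS-only click handler does not -- crawlers cannot follow it.
hasCrawlableLink("<span onclick=\"go('/events/123')\">Event</span>", "/events/123"); // false
```

Running a check like this against the server-rendered output of key templates makes the "valid hyperlink tags" rule enforceable in CI rather than a matter of manual review.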