Website Redesign: 5 SEO Checks To Save and Increase Your Traffic


Here Matteo Monari provides his expert guide to getting the right strategy and checks in place to protect against losing traffic in the process of changing your website design.

Published 29th February 2016

With more and more users switching from desktop to mobile devices as their favourite way to access the Web, and mobile searches surpassing desktop searches, many websites still stuck in the desktop-only world are rushing towards mobile-friendly - possibly even mobile-first - redesigns. Most changes to the design (i.e. front-end) of a website are potentially critical in terms of SEO, and since redesigning a website very often also means re-platforming it (i.e. changing its back-end and URL structure as well), it is easy to see why any action of this kind entails several SEO risks. The long-term benefits of a good redesign are often significant, but in the short and medium term even small mistakes in the redesign and migration processes can lead to significant traffic loss. To help you limit these risks and maximise the SEO benefits of a website redesign, here are five SEO safety checks you should always perform before launching a new design.

1. Identify your most important pages and sections

Websites can have hundreds, thousands or even millions of pages, but most of the time the value of their content follows a typical 80-20 distribution, in which 20% of the pages are responsible for 80% of traffic and revenue. More precisely, the most important pages of your website are not simply those with the highest traffic, but those with the highest value - and value can be assessed in a number of different ways: traffic, but also rankings, persuasion, conversions and links. Before launching your newly designed website - possibly even before starting to develop it - you should make sure you know what the most important pages of your site are, not only in terms of traffic and conversions (i.e. sign-ups and leads) but also in terms of rankings, links and SEO value. Here are four questions you can ask yourself to identify such pages:

  1. Which pages are the most popular landing pages for SEO traffic?
  2. Which pages are key steps in the conversion funnel?
  3. Which pages drive most conversions?
  4. Which pages have received the most links in the past?

Before starting your redesign you should answer these questions and make clear lists of your top pages. In your redesign you will want to pay special attention to their layout, hierarchy and migration, in order not to lose their precious traffic, persuasive power, conversions and SEO value. From a technical point of view, questions 1 to 3 can be answered by your analytics software, while for question 4 you can use SEO tools like Majestic or Moz, which can provide you with lists of your site's most linked-to and "powerful" pages (see Figure 1).

Figure 1: Majestic analysis of most linked-to pages
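To show how the four signals above could be combined into a single shortlist, here is a minimal Python sketch. All URLs, figures and the equal weighting of the three signals are invented for the example; in practice you would export this data from your analytics software and your link-analysis tool, and weight the signals to suit your own business.

```python
# Hypothetical sketch: combine traffic, conversion and link data to shortlist
# a site's most valuable pages. All numbers and URLs below are made up.

def top_pages(traffic, conversions, inlinks, n=3):
    """Rank pages by a simple combined score (equal weights - an assumption)."""
    def norm(d):
        peak = max(d.values()) or 1
        return {url: v / peak for url, v in d.items()}
    t, c, l = norm(traffic), norm(conversions), norm(inlinks)
    urls = set(traffic) | set(conversions) | set(inlinks)
    scores = {u: t.get(u, 0) + c.get(u, 0) + l.get(u, 0) for u in urls}
    return sorted(scores, key=scores.get, reverse=True)[:n]

traffic     = {"/": 9000, "/pricing": 4000, "/blog/post-1": 2500, "/about": 300}
conversions = {"/pricing": 120, "/": 80, "/blog/post-1": 5}
inlinks     = {"/": 450, "/blog/post-1": 200, "/pricing": 60}

print(top_pages(traffic, conversions, inlinks))
```

However you score them, the point is to end up with an explicit, written-down list of top pages to protect during the redesign.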

2. Make sure your new design has no obvious SEO pitfalls

If you are swapping your old website for a new one, it goes without saying that the new site should satisfy all the basic requirements of SEO. As a quick checklist, here is a non-exhaustive list of macroscopic mistakes to avoid when redesigning your site:

  • Moving content to login-protected areas.

As search engine spiders are not registered users, they cannot access any area of your site that requires a username and password. Therefore, you should consider replicating any fundamental piece of "hidden" information in a publicly accessible area of the site.

  • Generating key areas of content using AJAX and JavaScript.

Despite recent advances in crawling and indexing JavaScript-based content, HTML-based content remains a safer and easier-to-optimize choice in terms of SEO. Search engine spiders may encounter problems interpreting client-side languages such as JavaScript and JavaScript-based technologies such as AJAX. Therefore, it is advisable to limit the use of these technologies in key areas of the site, making sure all relevant content and links are also accessible in alternative ways.

  • Basing navigation on drop-down menus and search boxes.

During their discovery process, search engine spiders browse from page to page by discovering and visiting the HTML links they find. In most cases, spiders have problems choosing elements from drop-down menus or filling in search boxes and complex forms. Therefore, all information accessible only via drop-downs and search fields may be hard for search engines to reach, and should also be made available via alternative navigation paths based on HTML links.

  •  Creating redundant sections.

Keyword cannibalization occurs when several pages focus on the same set of keywords, confusing both users and search engines. For example, if your redesigned site contains a few separate sections dedicated to poker tournaments, the best thing to do is probably to keep just one generic section about tournaments and to focus the other sections on specific kinds of tournament, so that each page and section targets a specific set of mid- and long-tail keywords. This way, both users and search engines will know exactly what each section and page is about, and which area of your site is best for every specific kind of tournament, without getting confused.

  • Changing the way in which metadata is managed.

As you probably already know, HTML titles and meta descriptions are key elements of any webpage in terms of SEO. However, different front-end templates, CMS systems and back-end solutions handle these highly sensitive elements in different ways, and sudden changes in them can damage your website's rankings and click-through rates. Therefore, it is generally a good idea to keep your metadata unchanged during a website redesign.

As said, this is just a quick list of common macroscopic mistakes. Before deciding on any new design, architecture or platform, it is always recommended to perform a thorough SEO assessment of it.
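The metadata check in particular is easy to automate: export titles and meta descriptions from crawls of the old and new site and diff them per URL. Here is a minimal Python sketch of that comparison; the URLs and metadata pairs are invented stand-ins for real crawl exports.

```python
# Hypothetical sketch: diff HTML titles and meta descriptions between the old
# and new version of a site before launch. The crawl data below is invented.

def metadata_changes(old, new):
    """Return URLs whose (title, description) pair changed or disappeared."""
    issues = []
    for url, meta in old.items():
        after = new.get(url)
        if after is None:
            issues.append((url, "page missing in new crawl"))
        elif after != meta:
            issues.append((url, "metadata changed"))
    return issues

old = {"/pricing": ("Pricing | Acme", "Simple plans for every team."),
       "/about":   ("About Acme",     "Who we are and what we do."),
       "/careers": ("Careers at Acme", "Join the team.")}
new = {"/pricing": ("Pricing | Acme", "Simple plans for every team."),
       "/about":   ("About us",       "Who we are and what we do.")}

print(metadata_changes(old, new))
```

Every flagged URL is then a conscious decision to make, rather than an accidental side effect of the new templates.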

3. Benchmark your old and new design’s performance

Once your new design is ready in a staging environment (i.e. a test area), you should make sure it is not slower than your former one, as this can impact your users' experience and ultimately also your SEO rankings. As mentioned in the first section of this article, a redesign often involves changes both in the front-end (i.e. the HTML, CSS and JS code composing your site) and in the back-end (i.e. how that code is generated). Therefore, before launching a new design it is always advisable to test both back-end and front-end performance extensively. A tool which can help you get an idea of how your new design's back-end performs in terms of speed is ScreamingFrog, a crawler which can massively "call" your site's resources and show you how long your server takes to deliver them (see Figure 2).

Figure 2: URI’s response time breakdown in ScreamingFrog

Another tool which can be extremely helpful in determining the speed of your new design (in both its back-end and front-end components) is Google PageSpeed Insights, which will also offer specific advice on how to improve your site. After the new design's performance has been optimized and tested in a staging environment, it is always advisable to double-check it once the new site is live on the server that will host it permanently: the server hosting the site and its actual "live" load also affect performance. Luckily, production servers typically perform better than staging environments (see Figure 3).

Figure 3: A multilingual website’s URI’s average response time in seconds before redesign, during staging and post-redesign.
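When comparing response times like those in Figures 2 and 3, a couple of summary statistics go a long way. Here is a small Python sketch that reduces a list of per-URI timings (such as a crawler export) to a median and a 95th percentile; the staging and production samples are invented for the example.

```python
# Hypothetical sketch: summarize per-URI response times (in seconds), e.g.
# exported from a crawl, to compare staging vs production. Timings are invented.

import statistics

def timing_summary(samples):
    """Median and (approximate) 95th-percentile response time."""
    ordered = sorted(samples)
    idx = int(0.95 * (len(ordered) - 1))  # nearest-rank style index
    return {"median": statistics.median(ordered), "p95": ordered[idx]}

staging    = [0.41, 0.39, 0.52, 0.48, 0.60, 0.44, 0.43, 0.55, 0.47, 0.50]
production = [0.22, 0.20, 0.25, 0.21, 0.30, 0.23, 0.24, 0.27, 0.22, 0.26]

print("staging:", timing_summary(staging))
print("production:", timing_summary(production))
```

Looking at the 95th percentile as well as the median helps you spot templates or sections that are slow even when the "typical" page is fine.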

4. Benchmark your old and new design’s depth and internal linking structure

When crawling websites, search engines need to carefully manage their resources in terms of energy and computational effort. Therefore, they like information to be easily accessible, and may decide to stop crawling a site when they think they have already dedicated enough energy to the task. As a consequence, sites should make all their most important content reachable from the home page in the lowest possible number of clicks, which normally means a flatter, "horizontal" site structure is preferable to a deep, "vertical" one. Also, pages that are more heavily linked within the site acquire more "SEO value" than others, and may rank more easily. Considering the above, a redesign offers a good chance to lower a site's depth and make its pages easier to crawl, possibly also improving its internal linking structure in favour of key pages (see section 1). Above all though, a new design should definitely not make internal pages harder for search engines to reach. To make sure you have not made any mistakes in redesigning your site's architecture, you can use ScreamingFrog to perform a massive analysis of your new design, verifying whether "deep" pages have become harder or easier for search engine spiders to reach (see Figure 4).
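The "depth" a crawler reports is simply the minimum number of clicks from the home page, which is a breadth-first search over the internal-link graph. As an illustration, here is a small Python sketch of that computation; the link graph is invented for the example.

```python
# Hypothetical sketch: compute click depth from the home page over an
# internal-link graph (adjacency lists from a crawl). The graph is invented.

from collections import deque

def click_depth(links, start="/"):
    """Breadth-first search: minimum number of clicks from `start` to each page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

links = {
    "/":           ["/products", "/blog"],
    "/products":   ["/products/a", "/products/b"],
    "/products/a": ["/products/a/specs"],
    "/blog":       ["/blog/post-1"],
}

print(click_depth(links))
```

Running this over the old and new crawl lets you compare the depth distribution directly - pages whose depth increases after the redesign are the ones to investigate.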

Figure 4: URL depth before and after a successful redesign (as displayed in ScreamingFrog)
Figure 5: Internal link distribution before and after a redesign

Apart from the new design's depth, another element worth checking during a redesign is link distribution. While a depth analysis tells you how far from the homepage most of your site's pages are in terms of links, a link distribution analysis gives you an overview of how internal links are distributed across pages - in other words, how your site's "link equity" is distributed.

Conducting this kind of analysis requires a bit more manual effort than the previous one, but is still rather simple. First of all, you will need to select a significant sample of pages to base your analysis on (e.g. the first 1,000 pages of your site, in crawling order). Then, you will need to crawl them in their old and new versions with ScreamingFrog, exporting the reports on internal links per URI. Finally, you should work on the reports in Excel, sorting them in decreasing order and creating a comparative graph, possibly comparing them both in absolute and proportional terms (note that due to the typically large differences in link counts across pages, a logarithmic scale is often necessary).

This kind of analysis can highlight whether your website's new design increases or decreases the gap between your site's top pages and the rest of it, and into how many tiers your site is split in terms of link equity. Taking the analysis in Figure 5 as an example, we can see that in this case the new design generally increases the number of links pointing to each page. More significantly, it attributes much more "link equity" to the site's top 40 pages, whereas before only the top 10 pages were clearly given more importance in the internal linking hierarchy.
If these changes reflect the actual importance and SEO potential of the site's pages, then this is a positive result; in this example, the increase in link equity towards the top 40 pages was due to a "top pages" menu in the site's new design, added with the specific purpose of strengthening its most important pages.
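The core of the link distribution step - counting inlinks per URI from an "all inlinks" style export and sorting in decreasing order - can also be done outside Excel. Here is a minimal Python sketch; the edge lists are invented stand-ins for real crawl exports of (source, target) link pairs.

```python
# Hypothetical sketch: compare internal-link distribution before and after a
# redesign, from crawl exports of (source, target) link pairs. Data is invented.

from collections import Counter

def inlink_distribution(edges):
    """Count internal links pointing at each target URI, most-linked first."""
    counts = Counter(target for _, target in edges)
    return counts.most_common()

old_edges = [("/", "/a"), ("/", "/b"), ("/a", "/b"), ("/b", "/c")]
new_edges = [("/", "/a"), ("/", "/b"), ("/", "/c"), ("/a", "/c"), ("/b", "/c")]

print("old:", inlink_distribution(old_edges))
print("new:", inlink_distribution(new_edges))
```

Plotting the two sorted lists side by side (on a logarithmic scale, as noted above) gives you exactly the kind of comparison shown in Figure 5.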


5. Double check your new design for internal content duplication

Changing a site's design and template basically means changing the way its pages are "assembled" from numeric, graphic and textual building blocks. This can result in the same elements being used in different ways across multiple pages. If the way data and content are re-used across different pages is not carefully monitored, the site may end up with several pages having almost identical content (duplicate content), or with pages in which duplicated boilerplate elements (menus, headings, footers, etc.) take up more space than meaningful content. Both situations can damage the site, confusing search engines about which pages should rank for a specific topic and generally lowering the site's perceived quality, especially following Google's implementation of its Panda algorithm. To avoid accidentally lowering your site's perceived quality, it is important to ensure your new design does not excessively re-use and internally duplicate content. A tool which can help you check your website for this kind of risk is Siteliner, which can crawl your site and show you how different content elements are re-used across it.
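To make the idea concrete, here is a small Python sketch that flags near-duplicate pages by comparing their main body text with a similarity ratio (using the standard library's `difflib`). The page texts and the 0.9 threshold are invented for the example; dedicated tools use more scalable techniques, but the principle is the same.

```python
# Hypothetical sketch: flag near-duplicate pages by comparing their main
# textual content (boilerplate already stripped). Texts/threshold are invented.

from difflib import SequenceMatcher
from itertools import combinations

def near_duplicates(pages, threshold=0.9):
    """Return page pairs whose body-text similarity ratio meets `threshold`."""
    pairs = []
    for (url_a, text_a), (url_b, text_b) in combinations(pages.items(), 2):
        if SequenceMatcher(None, text_a, text_b).ratio() >= threshold:
            pairs.append((url_a, url_b))
    return pairs

pages = {
    "/venue/london":  "Our London venue hosts weekly tournaments and events.",
    "/venue/london2": "Our London venue hosts weekly tournaments and events!",
    "/venue/berlin":  "The Berlin location focuses on monthly championships.",
}

print(near_duplicates(pages))
```

Each flagged pair is a candidate for merging, rewriting or canonicalisation before the new design goes live.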

Migration and final checks

If you have successfully completed the checks listed above, you should now be confident that your new design does not violate any key SEO best practice, performs better than the old one, has an improved internal linking structure and does not present any significant case of internal duplication. You are now ready to "migrate" your site, i.e. to take your old site offline and replace it with its redesigned version.

However, as said in the introduction to this article, redesigning a website very often also involves re-platforming it, which means its URLs tend to change. If this is the case, the job is not finished yet: you should make sure each and every old URL redirects via a server-side, permanent (301) redirect to its corresponding new URL, and that each destination URL correctly answers with a 200 status code and the right content. The safest way to do this is to use a crawling tool like ScreamingFrog to crawl all your new and old URLs (you can use historical data from Google Analytics if you did not save the old URLs before the migration). While conducting this check, special attention should of course be paid to the URLs of your most important pages (see section 1).

Once your new design is online and you have correctly redirected the old URLs, you should closely monitor any significant changes in Google's crawling stats via Google Search Console. If your site's code and architecture really have improved, you should see more pages getting crawled, more data being downloaded every day and less time being spent by Google on downloading each page (see Figure 6).
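The redirect check boils down to a simple rule: each old URL should answer with exactly one 301 hop that lands on a 200 page. Here is a Python sketch of that audit applied to observed redirect chains; in a real check the (status, location) hops would come from a crawler or an HTTP client, while the data below is invented.

```python
# Hypothetical sketch: validate a redirect map after migration. Each entry is
# the chain of (status, next_location) hops observed for an old URL; in a real
# audit these would come from a crawler or HTTP client. All data is invented.

def audit_redirects(chains):
    """Flag old URLs that do not 301 in a single hop to a 200 page."""
    problems = {}
    for old_url, hops in chains.items():
        statuses = [status for status, _ in hops]
        if statuses == [301, 200]:
            continue  # ideal: one permanent redirect, then the final page
        elif 404 in statuses:
            problems[old_url] = "broken: ends in 404"
        elif statuses.count(301) + statuses.count(302) > 1:
            problems[old_url] = "redirect chain longer than one hop"
        elif 302 in statuses:
            problems[old_url] = "temporary (302) instead of permanent (301)"
        else:
            problems[old_url] = "unexpected chain"
    return problems

chains = {
    "/old-pricing": [(301, "/pricing"), (200, None)],
    "/old-about":   [(302, "/about"), (200, None)],
    "/old-contact": [(301, "/contact-us"), (301, "/contact"), (200, None)],
    "/old-news":    [(301, "/news"), (404, None)],
}

print(audit_redirects(chains))
```

Chains, temporary redirects and 404s all leak the SEO value the old URLs had accumulated, which is why each of them is flagged separately here.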

Figure 6: Effects of a good redesign in Google Search Console


Apart from crawling stats, you should of course also keep an eye on your rankings and traffic. More specifically, I recommend aggregating your traffic and conversion data from before and after the migration at page-type and site-section level in Excel. This way you will easily see how the redesign impacted different areas of your site (see Figure 7).
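The page-type aggregation itself can be sketched in a few lines: classify each URL with simple prefix rules and sum its sessions per group, before and after the migration. The rules, URLs and session counts below are invented for the example.

```python
# Hypothetical sketch: aggregate organic sessions by page type before and
# after a migration, using URL-prefix rules. URLs and numbers are invented.

def sessions_by_pagetype(sessions, rules):
    """Sum sessions per page type; the first matching URL prefix wins."""
    totals = {}
    for url, count in sessions.items():
        pagetype = next((name for prefix, name in rules
                         if url.startswith(prefix)), "other")
        totals[pagetype] = totals.get(pagetype, 0) + count
    return totals

rules = [("/blog/", "blog"), ("/products/", "product")]

before = {"/blog/post-1": 800, "/blog/post-2": 300, "/products/a": 500, "/": 900}
after  = {"/blog/post-1": 950, "/blog/post-2": 250, "/products/a": 700, "/": 880}

print("before:", sessions_by_pagetype(before, rules))
print("after:", sessions_by_pagetype(after, rules))
```

Comparing the two aggregated views makes it obvious which templates gained and which lost after the redesign, rather than drowning the signal in per-URL noise.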

Figure 7: Analysis of organic traffic changes following a website redesign


As we have seen, changes to the front-end and back-end of a website can be critical in terms of SEO, as they may impact a site's crawlability, perceived quality and - ultimately - its search engine rankings and organic traffic. Since redesigning a website means modifying several front-end (and often back-end) aspects, even small mistakes in the redesign and migration processes carry significant SEO risks. However, adopting the right strategy and conducting checks like those described in this article can help you safely improve a site's design in terms of SEO too, hopefully leading to significant ranking and traffic improvements - so do not be too afraid and... fingers crossed for your next redesign!
