Top 10 Website Redesign Mistakes

Last Edited September 11, 2023 by Garenne Bigby in Search Engine Optimization

For anyone with an established website that already has good traffic and search rankings, the greatest challenge of a redesign is preserving or surpassing those rankings so traffic is not lost. That is generally not an easy task, and for good reason. Let's look at the top 10 website redesign mistakes site owners usually make.

1. Taking Google Rankings for Granted

Google publishes its Webmaster Guidelines to help people build websites that work for human visitors as well as for the search engine spiders that crawl and rank them. All too often, though, site owners assume their search rankings will take care of themselves, and that is not the case. Many people are surprised to learn that the guidelines are freely available, because so few actually read or follow them. If you choose to ignore these guidelines, Google will basically ignore you; how's that for a golden rule?

Print publications are distributed, while publishing on the internet depends on being found through a web search. Sites are constantly leapfrogging one another for higher positions in the search results, so publishing online means working with the algorithms to improve your ranking. You need to understand your ideal customer and then work backwards to develop a strategy that reaches them effectively. For internet marketing, that means identifying and visiting the top-ranked sites Google returns for the searches you want to compete for, assessing those websites, and working backwards.

To start, investigate the data that is easy to observe, such as page titles, meta descriptions, and URLs. Once you have reviewed all of the visible factors, begin to consider the off-site factors. The worst enemy of maintaining existing Google rankings is the web developer who believes they will never have to check how the site performs in search engines. Humans can only see so much of a site's SEO, while spiders can both see it and compute it. One of the most important of Google's guidelines is to know how your CMS performs in search engines, because a CMS does many things that humans never see directly, and those things can greatly impact search rankings. You will also need to look at Google's index itself and check how your pages are displayed among the search results to see whether the changes you have made to the site are having any impact. Many website owners are unaware of problems in their SEO simply because they have never checked that their site has been crawled or indexed.
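
As a practical starting point for that assessment, a short script can collect the visible on-page factors from the pages you want to study. The sketch below is one possible approach, not part of the original article's workflow; it assumes the third-party requests and beautifulsoup4 packages are installed, and the URLs listed are placeholders for the top-ranked pages you identified.

    import requests
    from bs4 import BeautifulSoup

    # Placeholder URLs: replace with the top-ranked pages for the searches you want to compete for
    competitor_urls = [
        "https://www.example.com/",
        "https://www.example.org/services/",
    ]

    for url in competitor_urls:
        response = requests.get(url, timeout=10)
        soup = BeautifulSoup(response.text, "html.parser")

        # Page title, roughly what appears as the headline in search results
        title = soup.title.get_text(strip=True) if soup.title else "(no title)"

        # Meta description, if the page declares one
        meta = soup.find("meta", attrs={"name": "description"})
        description = meta.get("content", "").strip() if meta else "(no meta description)"

        print(url)
        print("  Title:", title)
        print("  Description:", description)

Running it against a handful of competitors gives you a quick inventory of titles and descriptions to work backwards from.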

2. Overlooking Auditing Current SEO Elements

Many times when a website is redesigned, the content responsible for achieving its current rankings is overlooked. Rewriting that content to be more relevant is a good thing, but you don't want the price to be fewer visitors. Too often, the content responsible for the current rankings is lost in the creative process, which results in a dramatic drop in search engine traffic. Be proactive and do not let this SEO mistake happen; protecting that content should be one of the top priorities of any website redesign. Rankings come from a mixture of on-site and off-site factors. The higher your authority and rankings are because of off-site linking, the less your site depends on on-site factors to push it higher, but never forget that you control the on-site factors 100%. Many people consider a website redesign a success when the changes they have made have a positive effect on traffic, which is why it is vital to involve SEO from the very start and ensure the new site is search engine friendly right from the foundation.

3. Shallow Content

The ranking algorithms used by Google are very text-dominated, because GoogleBot is essentially blind and cannot see pictures or text embedded in images. Web pages with a higher word count (up to a point) can compete for a larger range of keywords without exceeding the keyword densities that Google rewards with top rankings, and pages with a higher ratio of text to images tend to do better in search results. What counts as thin or shallow content is determined by what your competition is doing for their rankings: all other things being equal, pages with relevant, on-topic text will rank higher than shallow pages with fewer words.

In website redesigns, there is a tendency to make images larger and reduce the amount of text. When that happens, the lost text reduces the number of keywords the page can target, which drives down search traffic. Often, the information that holds the keywords crucial for rankings gets buried on a secondary page, such as the "About" page. That page does not attract links from other websites the way the homepage does, so it cannot compete for rankings. Text length also influences how Google views the importance of a page; the longer a visitor stays on it, the more important it appears. Tip: Google favors grammatically correct sentences over short bullet points that are simple phrases.
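
To put a number on "shallow", a rough word-count and image-count comparison is enough for a first pass. The following is a minimal sketch rather than a definitive audit tool; it assumes requests and beautifulsoup4 are installed, and the two URLs are hypothetical stand-ins for the pages you want to compare.

    import requests
    from bs4 import BeautifulSoup

    def text_to_image_profile(url):
        """Rough word count and image count for one page (hypothetical helper)."""
        response = requests.get(url, timeout=10)
        soup = BeautifulSoup(response.text, "html.parser")

        # Drop script and style blocks so they do not inflate the word count
        for tag in soup(["script", "style"]):
            tag.decompose()

        words = soup.get_text(" ", strip=True).split()
        images = soup.find_all("img")
        return len(words), len(images)

    # Placeholder URLs: a current page and its redesigned counterpart, or a competitor's page
    for url in ["https://www.example.com/current-page", "https://staging.example.com/new-page"]:
        word_count, image_count = text_to_image_profile(url)
        print(f"{url}: {word_count} words, {image_count} images")

If the redesigned page shows a sharply lower word count than the page it replaces, the content is probably getting thinner.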

4. Failure to Audit SEO Keywords

Using Google Analytics, look at which of your pages receive the most traffic and rankings. These pages should be your top priority for SEO during the redesign. Review the page titles, the number of words per page, body text keyword densities, and other elements such as image alt text and headline designations. When text is being dropped from important entrance pages of an existing site, start with a keyword audit to see what is being lost. It is a huge redesign mistake to remove keywords that your current rankings depend on without first taking stock of them. Once you know what is there, determine what text will take its place to compensate for the loss, and whether it will meet or exceed the current ranking. Changing or deleting entrance page text without knowing what you will lose in terms of keywords can seriously hurt your rankings. Essentially, the problem lies in changing copy without considering whether the previous copy was optimized and bringing in keyword-targeted traffic.
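
One way to do that counting is to compare keyword frequencies in the old and new copy before anything goes live. This is a minimal sketch under the assumption that you have exported the existing entrance-page copy and the proposed rewrite to two plain text files; the file names are placeholders, and in practice you would also filter out common stop words.

    import re
    from collections import Counter

    def keyword_densities(text, top_n=15):
        """Relative frequency of the most common words, a rough stand-in for keyword density."""
        words = re.findall(r"[a-z0-9']+", text.lower())
        total = len(words) or 1
        return {word: count / total for word, count in Counter(words).most_common(top_n)}

    # Placeholder file names: exported copy of the current page and of the proposed rewrite
    old_copy = open("entrance_page_old.txt", encoding="utf-8").read()
    new_copy = open("entrance_page_new.txt", encoding="utf-8").read()

    old_density = keyword_densities(old_copy)
    new_density = keyword_densities(new_copy)

    # Frequent keywords that disappear entirely in the rewrite
    print("Top keywords missing from the new copy:", sorted(set(old_density) - set(new_density)))

    # Side-by-side densities for the keywords both versions share
    for word in sorted(set(old_density) & set(new_density)):
        print(f"{word}: {old_density[word]:.2%} -> {new_density[word]:.2%}")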

5. Using Images that Spiders Can't See

GoogleBot is, for lack of a better term, blind. It cannot see images or text embedded in images, and relying on them will reduce rankings in a website redesign. Although alt text is very important, it does not carry as much weight as regular plain text. Google's Webmaster Guidelines say you should avoid converting text to images whenever possible during a redesign. The guidelines are also quite clear that alt text should be short, accurate, and descriptive, with no keyword stuffing. Stuffing alt text is a novice mistake: it gains very little because alt text carries so little weight, and it will bring penalties for the whole site when a search engine detects it.
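
A quick automated pass over the redesigned templates can catch both problems, missing alt text and stuffed alt text, before launch. This sketch is only an illustration, assuming requests and beautifulsoup4 are installed and using a placeholder URL; the 15-word threshold is an arbitrary choice, not a Google rule.

    import requests
    from bs4 import BeautifulSoup

    # Placeholder URL: a page from the redesigned site
    url = "https://www.example.com/redesigned-page"
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    for img in soup.find_all("img"):
        alt = (img.get("alt") or "").strip()
        src = img.get("src", "(no src)")
        if not alt:
            print(f"Missing alt text: {src}")
        elif len(alt.split()) > 15:
            # Unusually long alt text is a hint of keyword stuffing
            print(f"Suspiciously long alt text ({len(alt.split())} words): {src}")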

6. Loading Speeds

Page speed is a relatively new part of Google's algorithm. A website redesign is the perfect opportunity to re-code pages, condense externally referenced files, and achieve faster loading times. Take care not to accidentally increase loading times by adding objects and images without checking how long each one takes to load. Check loading speeds across many platforms to assess the average load time. Google's webmaster tools include a performance report where you can find load speed data.
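
For a rough, repeatable check outside Google's own tools, you can time the HTML response of each key page. The sketch below only measures how long the HTML itself takes to arrive, not images, CSS, or scripts, so treat it as a coarse baseline; it assumes requests is installed and uses placeholder URLs.

    import requests

    # Placeholder URLs: the key pages of the redesigned site
    pages = [
        "https://www.example.com/",
        "https://www.example.com/products/",
    ]

    runs = 5
    for url in pages:
        timings = []
        for _ in range(runs):
            response = requests.get(url, timeout=30)
            # elapsed covers the HTML response only, not the images, CSS, or scripts it references
            timings.append(response.elapsed.total_seconds())
        print(f"{url}: average {sum(timings) / len(timings):.2f}s over {runs} requests")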

7. Work-in-Progress Shouldn't be Crawled

Spiders should never have access to unfinished pages on a website. Keep them out by password-protecting the staging site, or by using noindex and nofollow directives along with a Disallow rule in the robots.txt file. These settings can also be reviewed in Google Webmaster Tools under "Crawler Access". Allowing a staging site to be crawled and entered into a search engine's index is a huge redesign mistake: Googlebot sees the content in its unfinished form and treats the beta site as a finished site, which makes the final version published on the main domain look like duplicate content of secondary importance. When that happens, the new website cannot be found on Google because it has been pushed down in the rankings by the duplicated work-in-progress site, and depending on how quickly the site is crawled, this can take weeks or even months to fix. It is also very important to keep spiders away from any temporary page carrying an "under construction" notice. Assume the site can be crawled at any moment; the simple fix is to keep these pages protected at all times.
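
Before launch, it is worth confirming programmatically that the staging site really is off limits to spiders. The sketch below checks a robots.txt Disallow rule, a meta robots noindex tag, and an X-Robots-Tag header for a hypothetical staging URL; it assumes requests and beautifulsoup4 are installed (the robots.txt parser is in Python's standard library) and will not work if the staging site sits behind password protection, which is itself a good sign.

    import requests
    from bs4 import BeautifulSoup
    from urllib.robotparser import RobotFileParser

    # Placeholder staging URL: replace with a page on your work-in-progress site
    staging_page = "https://staging.example.com/new-homepage"

    # 1. Is Googlebot blocked by robots.txt?
    parser = RobotFileParser()
    parser.set_url("https://staging.example.com/robots.txt")
    parser.read()
    print("Disallowed in robots.txt:", not parser.can_fetch("Googlebot", staging_page))

    # 2. Does the page itself carry a noindex directive?
    response = requests.get(staging_page, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    robots_meta = soup.find("meta", attrs={"name": "robots"})
    meta_content = robots_meta.get("content", "").lower() if robots_meta else ""
    print("noindex in meta robots tag:", "noindex" in meta_content)
    print("noindex in X-Robots-Tag header:", "noindex" in response.headers.get("X-Robots-Tag", "").lower())

If all of these checks come back negative and there is no password in front of the site, the work in progress is wide open to crawlers.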

8. Changing the Navigation

When you change the navigation of a website during a redesign, you run the risk of unknowingly making the site look completely different to search engines, because they cannot see or follow dynamic links. Think twice before converting your navigation to pull-down menus, Flash buttons, or fancy hover buttons. Never assume that something can be seen by spiders simply because you can see it with your own eyes when looking at the website; have it spider-checked to make sure search engines will actually recognize it. Google recommends using the Lynx text-based browser to look at the unpublished work-in-progress so you can see whether spiders can see the important parts of the website.
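
A simple way to approximate what a spider (or Lynx) sees is to parse the raw HTML without executing any scripts and list the plain <a href> links it contains. The sketch below takes that approach; it is an illustration only, assuming requests and beautifulsoup4 are installed, with the homepage URL as a placeholder. Navigation items that exist only in scripts or Flash simply will not appear in the output.

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    # Placeholder URL: the redesigned homepage
    url = "https://www.example.com/"
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    # Only plain <a href> links show up here; anything built purely by scripts will be missing
    links = sorted({urljoin(url, a["href"]) for a in soup.find_all("a", href=True)})

    print(f"{len(links)} crawlable links found on {url}")
    for link in links:
        print(" ", link)

The same count is useful for the next point: if the homepage suddenly links to far more pages than before, that inflation will show up here.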

A common result of changing the navigation in a redesign is link inflation on the homepage, caused by increasing the number of pages the homepage links to. This can hurt the rankings of the sub-pages. A website should have a clear hierarchy: pages of primary importance link to pages of secondary importance, and so on. It is not necessary to link every page of a website to every other page; doing so is the opposite of a clear hierarchy.

9. URL Renaming and Not Redirecting

In a perfect world, you 301 (permanently) redirect every old URL to its new URL. When that is not possible on a large website, the top-level pages should be the highest priority. Do this immediately after making the site live, and run a test to validate the redirects and make sure they are working properly.
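
That validation pass can be scripted against a list of old-to-new URL pairs. The following is a minimal sketch, not a complete link checker; it assumes requests is installed and the URL mapping shown is a placeholder for your own redirect map.

    import requests

    # Placeholder mapping of old URLs to the new URLs they should permanently redirect to
    redirects = {
        "https://www.example.com/old-services.html": "https://www.example.com/services/",
        "https://www.example.com/old-about.html": "https://www.example.com/about/",
    }

    for old_url, expected in redirects.items():
        response = requests.get(old_url, allow_redirects=True, timeout=10)
        # The first hop should be a 301; response.history is empty if no redirect happened
        first_hop = response.history[0].status_code if response.history else response.status_code
        print(f"{old_url}: first hop {first_hop}, "
              f"permanent={first_hop == 301}, lands on expected URL={response.url == expected}")

Anything that reports a 302, a 404, or the wrong landing page should be fixed before the spiders find it.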

10. Changing Domain Names

The age strength of a domain is an important ranking factor, so take note if your website redesign includes a change of domain name. Changing your domain without managing your SEO can mean losing that age strength as well as your link authority. A domain that is new to Google does not carry the same power that the former domain did, and you may not receive rankings for competitive searches for anywhere from 3 to 14 months, depending on how competitive the search is. Mishandling this can cause major damage. When the domain changes, do critical outreach to the top websites that link to your site and make sure they update their links, or you will see a steep decline in search rankings. Ensuring that your off-site links point to the new domain helps you retain your link authority. When the old domain is redirected to the new one as part of the redesign, it is important that, once Google has crawled the old domain, the index reflects the new domain.
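
That outreach can be followed up with a simple check of whether the pages that link to you have actually updated their links. The sketch below is a rough illustration, assuming requests and beautifulsoup4 are installed; the referring pages and domain names are placeholders for the values you would export from your own link reports.

    import requests
    from bs4 import BeautifulSoup

    # Placeholder values: pages known to link to you, plus the old and new domains
    referring_pages = [
        "https://partner-blog.example.net/resources/",
        "https://industry-directory.example.org/listings/",
    ]
    old_domain = "old-domain.example.com"
    new_domain = "new-domain.example.com"

    for page in referring_pages:
        soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
        hrefs = [a["href"] for a in soup.find_all("a", href=True)]
        print(f"{page}: still links to old domain={any(old_domain in h for h in hrefs)}, "
              f"links to new domain={any(new_domain in h for h in hrefs)}")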

Garenne Bigby
Author: Garenne Bigby
Website: http://garennebigby.com
Founder of DYNO Mapper and Former Advisory Committee Representative at the W3C.
