Technical SEO Checklist
Webmasters often focus on on-page factors when they want to increase organic growth, but much like a vehicle, a website requires behind-the-scenes maintenance to perform at its best. If your website is falling behind in the rankings, there are almost certainly aspects of your SEO that need attention. Thinking about SEO tends to center on on-page factors rather than the less glamorous technical issues. These small, seemingly boring details do not look like much, but they make a significant impact: they give your website the strong foundation it needs to support all of the great content you are publishing.
The Technical Factors
- You will need to check that Google has indeed indexed your website. It is an easy but necessary test: if the website has not been indexed, it will never rank in a search engine. To check, type “site:yoursitename.com” (without the quotes) into Google's search bar. You should see most of the pages on your site listed. If they are not, you will need to take the additional actions listed below.
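If you want to spot-check a page's indexability yourself, the most common blocker is a robots meta tag carrying a `noindex` directive. A minimal sketch using only Python's standard library (the function name and sample HTML are illustrative, not part of any official tool):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())

def is_indexable(html):
    """True unless a robots meta tag contains a 'noindex' directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return not any("noindex" in d for d in parser.directives)
```

A page disallowed in robots.txt can also keep search engines away, so check that file as well.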
- Check your site speed. A website that loads quickly is not only beneficial to visitors; it also factors into search engine ranking. Many speed tests are available, and a quick Google search will turn up plenty of free ones. Your load speed will then be scored, normally with a higher number indicating a faster site. A low score points to issues that will ultimately hurt your website's search engine optimization.
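If you want a rough number of your own before reaching for an online test, you can time how long a page's HTML takes to download with Python's standard library. This measures only the raw fetch, not a full render with images and scripts, so treat it as a lower bound (the commented URL is a placeholder):

```python
import time
import urllib.request

def measure_load(url, timeout=30):
    """Time how long it takes to download the full response body, in seconds."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()
    return time.monotonic() - start

# Illustrative usage:
# seconds = measure_load("https://yoursitename.com/")
# print(f"Fetched in {seconds:.2f}s")
```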
- You will also need to consider the click-through rate and the bounce rate. Load speed will absolutely affect these two numbers. A website that takes a long time to load will have a higher bounce rate because people simply do not have the patience to wait. As a general rule, if the load time is more than 10 seconds, think about removing some of the media elements. If that does not reduce the load time, check whether coding errors are the source of the problem.
- Server response problems are among the more common SEO errors, so it is best to detect and correct them. One way to get started is to create an account with Google Webmaster Tools (now Google Search Console). You will need to verify your ownership of the website in one of several ways, which may involve adding verification code to the site; a webmaster can help with this. To see the server response codes, log in to your Google Webmaster Tools account and select “Health” and then “Crawl Errors”. This is where you will find any server errors, access-denied problems, and “404” file-not-found errors. While it is normal for any website to contain a few errors, more than a handful of 404s (or any other errors, for that matter) indicates problems for your SEO. The webmaster should be able to iron out these issues.
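Whether the status codes come from a crawl report or from your own script, a rough triage rule like the sketch below (in Python, with illustrative wording) helps separate harmless responses from the ones that need the webmaster's attention:

```python
import urllib.request
import urllib.error

def status_of(url):
    """Fetch a URL and return its HTTP status code, even for error responses."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def classify_status(code):
    """Rough triage of HTTP status codes for an SEO audit."""
    if 200 <= code < 300:
        return "ok"
    if code in (301, 302, 307, 308):
        return "redirect - check that it points somewhere useful"
    if code == 403:
        return "access denied - check permissions and robots rules"
    if code == 404:
        return "not found - fix the link or add a redirect"
    if 500 <= code < 600:
        return "server error - needs attention"
    return "other - investigate"
```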
- When you are using Webmaster Tools, use all of the services offered to your full advantage. The dashboard can flag important website issues or malware warnings. If you are not very familiar with the configuration settings, do not change them without the help of a trusted webmaster. The Health section provides an array of useful information, such as how often Google's bot crawls your site, the number of pages indexed, and any blocked URLs. You can also request “Fetch as Google” for any page on your website, which shows how Google's bots index that page. This lets you see whether there are problems in the indexing process and gives you an accurate picture of how Google sees the page. Another useful tool for SEO purposes is the “Traffic” section, which shows exactly which search queries have led visitors to your website, as well as any external links pointing to it. Webmaster Tools also has its own “Optimization” section that provides assistance with sitemaps, HTML improvements, and content keywords.
- Make sure to take care of any broken links, also known as link rot. Google takes more than 200 factors into consideration when ranking a page among search results, and broken links on your website have a negative impact no matter how relevant the content is. A quick search will turn up a number of free tools that check the status of every link on your website: enter your URL into the tool and let it work its magic. You will then need to make a list of the broken links, go to the pages that contain them, and fix them where possible; otherwise, remove them altogether.
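If you would rather script the check than use an online tool, the pattern is: pull every link out of a page, then request each one and flag anything that fails. A sketch of the extraction step with Python's standard library (the sample HTML and names are illustrative):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects absolute URLs from every <a href="..."> tag in a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and not href.startswith(("#", "mailto:", "javascript:")):
                # Resolve relative links against the page they appear on.
                self.links.append(urljoin(self.base_url, href))

def extract_links(html, base_url):
    extractor = LinkExtractor(base_url)
    extractor.feed(html)
    return extractor.links
```

Each extracted URL can then be fetched; anything returning a 404 (or timing out) goes on the fix-or-remove list.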
- The metadata on your website plays an important role in search ranking. The mere presence of a description tag does not carry much ranking weight in itself; it is used more to persuade potential visitors to click through and become actual visitors. Metadata is another place to stay away from keyword stuffing: because the description is no longer considered in the ranking algorithm, stuffing it gains nothing, and misuse can still have a negative impact.
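One concrete, scriptable check here is description length: descriptions that run too long are typically truncated in search results, while very short ones waste the opportunity. The limits below (roughly 50 to 160 characters) are commonly cited guidelines rather than official rules, so treat them as adjustable assumptions:

```python
from html.parser import HTMLParser

class DescriptionParser(HTMLParser):
    """Grabs the content of the page's <meta name="description"> tag."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

def check_description(html, min_len=50, max_len=160):
    """Return (description, verdict) for a page's meta description."""
    parser = DescriptionParser()
    parser.feed(html)
    if parser.description is None:
        return None, "missing"
    length = len(parser.description)
    if length < min_len:
        return parser.description, "too short"
    if length > max_len:
        return parser.description, "too long"
    return parser.description, "ok"
```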
- If it has not been stressed enough, your website's architecture has a heavy impact on search engine ranking. No single formula for website architecture works for every site; the most vital thing to remember is that your website needs to be user-friendly. Architecture affects engagement factors such as click-through rate, bounce rate, and time on site, and these engagement factors directly and indirectly influence the search results. Your website needs clear navigation as well as a strong design, social media icons, and clear calls to action, and the design you choose needs to work properly on every browser and device. A website with good architecture will highlight your best content and lead to better search engine rankings.
- Create both an HTML and an XML sitemap. The HTML sitemap is for your human visitors, while the XML sitemap is essentially a text file listing all of the URLs on the site; visitors will not see it, only search engine spiders. Very large websites will need more than one XML sitemap, as each can contain only 50,000 URLs. Keep in mind that the XML sitemap is the one working behind the scenes, taking care of the issues under the hood, so to speak.
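Generating the XML side is mechanical enough to script. A minimal sketch in Python's standard library, emitting the `urlset`/`url`/`loc` structure defined by the sitemap protocol (the example URLs are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap; the protocol allows at most 50,000 URLs per file."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls[:50000]:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")
```

Real sitemaps often add optional elements such as `lastmod` per URL, and larger sites split their URLs across multiple files listed in a sitemap index.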
- It is highly recommended to create a sitemap using a program such as DYNO Mapper. All you need to do is input the main page URL and generate the sitemap, which can then be exported into any number of file types and used as necessary. With a sitemap in hand, you are armed with a visual tool for optimizing your website.
- Redirects are a necessary evil when it comes to keeping a website up to date, but they need to be done the right way to be effective. Using the wrong codes will drag down your search engine rankings and frustrate your visitors. Good reasons to use a redirect include an updated version of the content or a topic you no longer cover, and redirects can be either permanent (301) or temporary (302).
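You can audit a redirect from the outside by following its chain and noting each hop's code. A sketch using Python's standard library; `urllib` follows redirects by default, so the handler below simply records them as they happen (the class and function names are my own):

```python
import urllib.request

class ChainRecorder(urllib.request.HTTPRedirectHandler):
    """Records each redirect hop instead of silently following it."""
    def __init__(self):
        self.hops = []

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        kind = "permanent" if code in (301, 308) else "temporary"
        self.hops.append((code, kind, newurl))
        return super().redirect_request(req, fp, code, msg, headers, newurl)

def trace_redirects(url):
    """Fetch url, returning ([(code, kind, target), ...], final_url)."""
    recorder = ChainRecorder()
    opener = urllib.request.build_opener(recorder)
    with opener.open(url, timeout=10) as resp:
        final = resp.geturl()
    return recorder.hops, final
```

Long chains (A to B to C) are worth collapsing into a single hop, and temporary codes on permanently moved pages should generally become 301s.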
- When a page is not found, the server usually returns a 404 error. A 404 occurs when you have moved a page, when someone has linked to an incorrect URL, or when you have deleted a page. Knowing this, you should create your own custom 404 page and use it to invite lost visitors back into your site. Add links that encourage visitors to stay and explore the website even though they have run into a temporary snag.
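If you run a small site yourself, serving a friendlier 404 body is only a little code. A sketch using Python's built-in `http.server` (the page content and class name are illustrative; on production sites the same idea is usually a web server or CMS setting):

```python
import http.server

# Illustrative custom 404 body with links back into the site.
CUSTOM_404 = b"""<html><body>
<h1>Sorry, that page was not found.</h1>
<p>Try the <a href="/">home page</a> or the <a href="/sitemap.html">sitemap</a>.</p>
</body></html>"""

class Friendly404Handler(http.server.SimpleHTTPRequestHandler):
    """Serves a custom page for 404s instead of the default error text."""
    def send_error(self, code, message=None, explain=None):
        if code == 404:
            self.send_response(404)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(CUSTOM_404)))
            self.end_headers()
            self.wfile.write(CUSTOM_404)
        else:
            super().send_error(code, message, explain)

# Illustrative usage:
# http.server.ThreadingHTTPServer(("", 8000), Friendly404Handler).serve_forever()
```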
- Google's Panda is an algorithm designed to target low-quality or duplicate content and punish sites with a lot of these issues. This is just one of many reasons to consider removing content that is thin or duplicated. Duplicate content is not good for visitors, and it can confuse search engines, which will not know which version of a page is the most relevant. Even if the duplication is never caught and penalized, you are still vulnerable to losing traffic. Your sitemap will help you spot the duplicate content.
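Exact duplicates are easy to find mechanically: strip the markup, normalize whitespace and case, and hash what remains; identical hashes mean identical text. A rough sketch in Python (the tag-stripping regex is deliberately crude and will not catch near-duplicates):

```python
import hashlib
import re

def content_fingerprint(html):
    """Hash of the page's visible text, with tags and extra whitespace stripped."""
    text = re.sub(r"<[^>]+>", " ", html)        # crude tag removal, fine for a sketch
    text = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.sha256(text.encode()).hexdigest()

def find_duplicates(pages):
    """pages: {url: html}. Returns groups of URLs whose text is identical."""
    groups = {}
    for url, html in pages.items():
        groups.setdefault(content_fingerprint(html), []).append(url)
    return [urls for urls in groups.values() if len(urls) > 1]
```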
- If you make your sitemap with a generator like DYNO Mapper and view the analytics for your web pages, you will be able to see where your duplicates are and where your content is not performing as well as it could. You may then choose either to improve this content or to get rid of it altogether.
- Structured data and schema are certainly not new concepts, but they are definitely underutilized. They make it easier for search engines to understand exactly what a web page is about by analyzing its content: the schema vocabulary describes the content to search engines, and schema terms can be inserted into your existing HTML. This helps categorize the page and creates rich snippets, which affect search ranking and have been shown to increase click-through rates. In effect, it tells Google that your page carries more information than the pages around it, which leads to increased traffic and higher rankings.
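JSON-LD is the format Google has recommended for embedding structured data, and building it amounts to assembling a dictionary. A sketch for a schema.org `Article`, using real schema.org property names but placeholder values:

```python
import json

def article_jsonld(headline, author, date_published):
    """Build a schema.org Article snippet for a <script type="application/ld+json"> tag."""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
    }
    return json.dumps(data, indent=2)
```

The returned string goes inside a `<script type="application/ld+json">` tag in the page's `<head>`.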
- There are literally thousands of schema terms to use, so it will take some time to learn which ones suit your needs. Both Google and WordPress offer plugins to help with schema, each of which takes only a few steps, and schema.org provides a full list of terms that can be used.
- Structured data and schema let you curate your content descriptions like a professional, making things much easier for modern search engines.
There are many advanced tools available to increase organic growth by solving the technical issues most sites encounter; as they improve your growth, they also improve your search engine ranking. Working through this short list of items will uncover the most common technical problems. This analysis is only an overview of how to increase organic growth by fixing technical factors. You can dial the intensity of the process up or down simply by choosing a few of these items at a time, seeing how they relate to your website, and fixing just a few things at once. This also makes it easier to track your progress and results, so you can really see how much the previous mistakes were hindering the performance of the website.