Best URL Structure for SEO
Having a great URL structure for SEO might seem straightforward, but it is often overlooked. URLs are the building blocks of your website and can make or break your traffic. While it is important to create them mindfully, these tips are not critical for every single page. Your website will not perish if you skip a suggestion here and there; the goal is to apply them where you can so that each URL contributes to the success of your site. Keep in mind that URLs are read by humans and search engines alike, and it is not difficult to structure them in a way that pleases both.
1. Remove Extra Words
You should remove words that don't add significance to the URL (sometimes called stop words) for the sake of readability. These are words like "that" or "and", or even a few extra words at the end that were only there for clarification. The main thing to remember is that while you can take out some of the extra words, the URL must still be readable by humans. Not only will this help them understand what they are looking at, the site will also feel more trustworthy. Rather than using a phrase like "The 7 ways that helped me get more sleep", you could shorten it to "7 ways to sleep more" or a variation of that. It expresses a clear thought about what the content covers and what to expect from it.
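As a sketch, a hypothetical slug-building helper could drop those filler words automatically. Everything here is illustrative: the function name and the stop-word list are assumptions, not a standard, and a real site would tune the list to its own content.

```python
import re

# Hypothetical stop-word list; tune this for your own titles.
STOP_WORDS = {"the", "that", "and", "a", "an", "of", "to", "me", "helped", "get"}

def slugify(title: str) -> str:
    """Lowercase a title, drop stop words, and join what remains with hyphens."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    kept = [w for w in words if w not in STOP_WORDS]
    return "-".join(kept)
```

For example, `slugify("The 7 Ways That Helped Me Get More Sleep")` yields `7-ways-more-sleep`, which is still readable by a human while dropping the filler.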
2. Relevant Keywords
Keywords are one of the ways that search engines and humans gather knowledge about a web page and its content, so it is important to include one or two relevant keywords in the URL. They are most useful when they match the contents of the page. Think about how often you copy and paste a URL on social media, in text messages, and in emails. Because there is no anchor text, the text of the URL itself is vital in describing what the link actually contains. When you use relevant keywords within the URL, you check many boxes at once: the keywords communicate what the content is about to search engines, so they can deliver your content to people performing a relevant search query.
3. Easily Readable
Because a URL is read by both search engines and humans, it is important for it to make sense. Your URL should tell readers what they will find if they choose to click on it. If you include the right keywords and delete the extra words (as suggested above), you will still have a URL that makes sense. Ideally a URL contains the domain, the folder path, and then the keywords; just make sure it does not look like spam. This is especially important when thinking about accessibility. Why would you not want to make it as easy as possible for users and search engines to read your URL? When the power is in your hands, there is no reason not to take advantage of it.
4. Utilize Hyphens and Underscores
The options for separating words in a URL are hyphens, underscores, or nothing at all. Hyphens are the strongly suggested choice. They help with the overall readability of the URL, making it easier for humans and search engines to pick out the individual words. Underscores are less of a problem than they once were, since search engines have improved at handling them, but Google still recommends hyphens rather than underscores as word separators. Spaces are not recommended because they are encoded as %20 within the URL, which takes away from readability. Avoiding spaces has become increasingly easy thanks to newer content management systems, which generate hyphenated slugs automatically. Just do not forget to double-check your separators to make sure nothing is truncating your URL.
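The %20 problem is easy to see with Python's standard `urllib.parse.quote`, which applies the percent-encoding a browser would. A hyphen is an unreserved character, so a hyphenated slug survives encoding untouched:

```python
from urllib.parse import quote

# A space in a path segment is percent-encoded as %20, hurting readability.
encoded = quote("7 ways to sleep more")   # "7%20ways%20to%20sleep%20more"

# Hyphens pass through unchanged, so the slug stays human-readable.
slug = "7-ways-to-sleep-more"
hyphenated = quote(slug)                  # identical to slug
```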
5. Single Domain and Subdomain
Many SEO experts recommend keeping content in subfolders on a single root domain rather than spreading it across subdomains. Some webmasters might have no real choice in the matter, and if a subdomain is the only way to put content on the internet, then go for it. But to increase the likelihood of your site performing well in the search rankings, content should be organized together on a single root domain. To think about this another way, recall how your website architecture should be: shallow. Shallow means there are not endless folders nested within each other; two or three clicks should be the end of the line for any piece of content. Beyond that, a user trying to reach a goal may see the site as too complicated.
6. Canonicalize When Possible
If for some reason you end up with two URLs that contain very similar content, consider using a 301 redirect or a canonical tag to consolidate them. Duplicate content isn't a make-or-break issue for a search engine, but it can hurt your ranking potential because the ranking signals are split between the two pages. It might not occur to some people that a web page can have a www version as well as a non-www version, and even an http version and an https version; all of these should resolve to a single canonical URL.
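A minimal sketch of that normalization, using only the standard `urllib.parse` module: map the www/non-www and http/https variants onto one canonical form. The `example.com` host is a placeholder, and a real site would additionally issue a 301 redirect to this canonical URL rather than just computing it.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical(url: str) -> str:
    """Normalize http -> https and strip a leading 'www.' so every
    variant of a page maps to one canonical URL."""
    scheme, netloc, path, query, frag = urlsplit(url)
    netloc = netloc.lower()
    if netloc.startswith("www."):
        netloc = netloc[4:]
    return urlunsplit(("https", netloc, path, query, frag))
```

With this, `http://www.example.com/page`, `https://www.example.com/page`, and `https://example.com/page` all collapse to the same string, so ranking signals are no longer split across variants.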
7. Exclude Dynamic Parameters
It is suggested that you avoid using URL parameters if you are able to. When there are more than two URL parameters, consider reworking the URL into readable, static text. If you are using a content management system, it likely already helps you avoid a plethora of parameters. Dynamic parameters are often used for tracking clicks and are not a problem in themselves, except that they can make for unreadable and awkward URLs. Again, they are not make-or-break, so use your own judgment on whether to include them.
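One common cleanup is stripping known tracking parameters while keeping parameters that actually select content. The deny-list below is a hypothetical example (the `utm_*` names are common analytics tags, but your own list would differ), and the function is a sketch rather than a complete solution:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical deny-list of parameters that only track, never select content.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def strip_tracking(url: str) -> str:
    """Drop known tracking parameters, keeping any that carry content."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))
```

This turns a link like `.../shoes?utm_source=mail&color=white` into the far more readable `.../shoes?color=white`.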
8. Match URLs and Titles
Matching the URL to the page title exactly is the ideal situation, but a close variation works as well. The match serves the user experience, and it is the same reason many experts suggest that the page title and the visible headline match: the title sets an expectation and the headline confirms it. Users expect this level of clarity from URLs and page titles, and adhering to it is not difficult. Though matching URLs to page titles is mostly about presentation, it is the kind of detail that makes the user experience either good or bad.
9. Remove Unnecessary Punctuation
It is quite easy to remove punctuation from URLs, and too much of it leads to confusion. Punctuation-heavy URLs are hard to read, and some characters even have the potential to break web browsers or crawlers. Alphanumeric characters are always safe, and a few select special characters do not need encoding when included in a URL. Best practice is simply to exclude punctuation whenever it is not absolutely necessary.
10. Limit Folders
Within a URL, it isn't the visible slashes separating folders that are the problem; a long folder path creates the perception that the site is deeper than it actually is, for users and search engines alike. Having many folders also makes the URL harder to edit. While the rules here are not black and white and it is mostly about presentation, use your best judgment on the number of folders. Multiple nested folders also make a page much deeper than is recommended; remember that shallow navigation is the goal.
11. Restrict Redirection
Redirects are okay, but aim to limit a chain to two or fewer hops. It is normal for a crawler to request URL 1 and be redirected to URL 2 or URL 3, but if the chain goes on for more than a handful of hops, you can get into trouble. Search engines will often follow redirects as far as they go, but long chains make the destination URL appear less important and can stop ranking signals from being passed along. In the bigger picture, users and browsers are both slowed down by redirects; keep it simple and you will deal with fewer problems in the long run.
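The hop limit is easy to enforce in code. The sketch below uses a plain dictionary as a stand-in for a server's 301 redirect rules (the paths and the `resolve` helper are hypothetical) and raises once a chain exceeds the two-hop budget:

```python
# Hypothetical redirect map standing in for server-side 301 rules.
REDIRECTS = {
    "/old-page": "/newer-page",
    "/newer-page": "/current-page",
}

def resolve(path: str, max_hops: int = 2) -> str:
    """Follow redirects to the final path, raising if the chain is too long."""
    hops = 0
    while path in REDIRECTS:
        hops += 1
        if hops > max_hops:
            raise RuntimeError(f"redirect chain longer than {max_hops} hops")
        path = REDIRECTS[path]
    return path
```

Here `/old-page` resolves in two hops, which is within budget; a third chained redirect would trip the error, which is exactly the point where crawlers start discounting the destination.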
12. Case Sensitivity
For the most part, case sensitivity is not an issue if you are hosting on Microsoft/IIS servers. If your website is hosted on Linux or Unix, case sensitivity can be an issue: these servers treat differently-cased paths as distinct, so two seemingly identical URLs might serve different content. There is no real reason to use capitalization within a URL, so best practice is to keep everything lowercase so that you do not accidentally run into case-sensitivity problems.
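The fix is as simple as emitting lowercase everywhere. A one-line sketch (real-world code would lowercase only the host and path, since query-string values can be case-significant):

```python
def normalize_case(url: str) -> str:
    """Emit everything lowercase so a case-sensitive server never sees
    two cased variants of the same path."""
    return url.lower()

# "/Blog/My-Post" and "/blog/my-post" are distinct strings, and on a
# Linux/Unix server they may be distinct resources; normalizing at
# generation time avoids the split.
```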
13. Avoid Keyword Stuffing
While you want to include a sufficient number of keywords within your URL, you definitely do not want to go overboard. With too many keywords your link starts to look like spam, automatically creating a bias against it. This kind of repetition will not help your search rankings and will hurt your chances of getting clicked. Google and Bing have become savvy to keyword stuffing, so there is no way it can prove beneficial; if anything, it can lead to a penalty both immediately and in the long run.
14. Limit Tracking Parameters
These are variables added to a URL to indicate something specific about the request. For example, a store might use samplestore.com/?storeid=101 to indicate the page of a specific store. When possible, use descriptive words instead, like samplestore.com/store101. When it is reasonable, limit the number of variables the URL includes; clear, descriptive URLs are preferred over long and complicated ones.
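The article's own example can be sketched as a small rewrite function. In practice this mapping would live in your server's routing or rewrite rules; the function name is hypothetical and `samplestore.com` is the article's placeholder domain:

```python
from urllib.parse import urlsplit, parse_qs

def store_path(url: str) -> str:
    """Rewrite samplestore.com/?storeid=101 to samplestore.com/store101,
    turning a query parameter into a descriptive path segment."""
    parts = urlsplit(url)
    store_id = parse_qs(parts.query)["storeid"][0]
    return f"{parts.scheme}://{parts.netloc}/store{store_id}"
```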
15. Shorter is Better
In general, shorter URLs are preferable to longer ones. If your URL is already around 50 or 60 characters, that is fine, but once it reaches 100 or more characters, there is probably a good opportunity to shorten it. This is less a problem for search engines than for the user experience: shorter URLs are easier to share and simply look better. A shorter URL can be typed out easily if needed, and can be shared through a medium with a character limit like Twitter or a text message, both of which may otherwise truncate a long URL.
16. One URL For Your Home Page
The home page is the most important and most powerful page on your website. This is why it should have just a single URL. Your home page will get more inbound links than any other page on your site, and it is what brings your traffic in, so it needs a canonical URL that is simple and straightforward. You should never link to any other version of your home page, and if for some reason your website produces duplicate URLs, any 301 redirects should point to the main page.
17. Paginated URLs
Paginated URLs occur when a category page has to create additional pages because all of its listings won't fit on a single page. For example, a category may have 150 items to list, but only 50 fit on a single page. These paginated pages would appear like http://www.sample.com/white-shoes?page=1 and so on.
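Generating that series is straightforward; the sketch below (the helper name is an assumption) produces the article's example URLs for 150 items at 50 per page:

```python
from math import ceil

def page_urls(base: str, total_items: int, per_page: int) -> list[str]:
    """Build the ?page=N URL for every page a category needs."""
    pages = ceil(total_items / per_page)
    return [f"{base}?page={n}" for n in range(1, pages + 1)]
```

For 150 items at 50 per page this yields three URLs, ending at `...white-shoes?page=3`.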
These suggestions for your URLs are considered best practice for SEO. When you are creating a page or site from scratch, it is simple to employ these strategies from the beginning, and even when you are editing an existing site, it is not much work to apply these tips.