
How to Make Your JavaScript Website SEO Friendly

Last Edited September 11, 2023 by Garenne Bigby in Search Engine Optimization

For the majority of the life of the internet, search engine optimization practices have revolved around serving plain text versions of your content as much as possible and avoiding content that is dynamically generated, such as through JavaScript. Even as technologies have advanced, this still seems to be an issue for well-known search engines like Google, but does it have to be? Historically, crawlers have only looked at the raw content within the HTTP response body, and did not interpret the page the way a normal browser running JavaScript would.

Almost 10 years ago, Google started crawling and indexing content rendered through JavaScript, but its abilities were still limited. Last year, Google announced that it is handling JavaScript in a way that is unlike before: links, redirects, dynamically inserted content, and more are no longer an inconvenience. Though Google can now understand and render most web pages the way a modern web browser does, it does not handle every single scenario with the same confidence. Static content is easily indexed, but dynamically generated content may still be left out. How do you fix this? Use one of these programs to ensure that your JavaScript is SEO friendly.



Six Tools That Help Your Dynamic JavaScript Website Get Crawled by Search Engines


1. Prerender.io

If you prefer to run your own server, Prerender is open source software, meaning that all of the code is available to anyone. The software works in conjunction with crawlers: when a crawler requests a page that is not yet cached, Prerender renders the page on demand and caches it afterward, so the cache is always complete. Cached pages are returned very quickly, as Prerender's middleware has been built so that users see response times of around 5ms. Prerender follows the AJAX crawling specification that Google wrote, so there is no doubt that the website will be crawled in the correct manner by Google, as well as other search engines.
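For a sense of what the integration looks like, here is a minimal sketch using Prerender's open-source Express middleware, prerender-node, in a Node app; the token, port, and static directory are placeholders:

const express = require('express');
const prerender = require('prerender-node');

const app = express();

// Route crawler traffic through Prerender. A self-hosted Prerender
// server can be targeted with .set('prerenderServiceUrl', 'http://localhost:3000').
app.use(prerender.set('prerenderToken', 'YOUR_TOKEN'));

// Everything else is served as usual.
app.use(express.static('public'));
app.listen(8080);

With this in place, ordinary visitors hit your JavaScript app directly, while recognized crawlers receive the cached, fully rendered HTML.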

Follow these guidelines to make sure that you are giving the correct information to the web crawlers. First, ensure that you are returning the right status codes: there are a few distinct meta tags that can be put in the page header to return a different header or status code to a crawler. You will also need to tell Prerender exactly when the page is ready to be saved so that it can perform more accurately; the program tries its best to detect when a page is done loading, but sometimes it is best to inform it yourself. Finally, use Prerender's own API to cache and re-cache pages, which ensures that they are properly cached when a crawler tries to access them. Pages can be re-cached through the API whenever they change, so the cached copy stays up to date and there is minimal wait time when a crawler requests the page.
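For example, Prerender's documentation describes signaling status codes with head tags such as <meta name="prerender-status-code" content="404">, and signaling readiness with the window.prerenderReady flag. A sketch of the readiness flag, assuming a hypothetical loadContent() helper that resolves once your dynamic content is in the DOM:

// Tell Prerender not to snapshot the page yet.
window.prerenderReady = false;

// Flip the flag once the content a crawler should see has rendered.
// loadContent() is a made-up helper for this example.
loadContent().then(function () {
  window.prerenderReady = true;
});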


2. BromBone

This program works by automatically downloading all of your web pages and rendering them in a real web browser. This makes sure that the AJAX calls are made, the JavaScript runs correctly, and the DOM changes are executed. When all of this is done successfully, BromBone saves the resulting HTML. When a search engine bot lands on your website, you simply proxy the prerendered snapshot from BromBone. That is the only change to your code that is required; the change is small, and BromBone shows you exactly what to do. There is no need to be intimidated: all that you need to do is copy and paste, which anyone is capable of. When this is done, the program sends the HTML to the search engine's bot. Google then sees a web page that looks exactly like the one people see in their own web browsers, and it indexes the page properly because BromBone has already run the JavaScript and made the AJAX calls.

To use BromBone, you will need to make sure that you are generating an XML sitemap, which is likely already the case; BromBone uses this file to know which pages to prepare for search engines. You will also need to proxy the HTML snapshot from the program when a search engine bot visits a page, a simple step that really amounts to a small copy and paste (see the sketch below). When you sign up, BromBone offers free consulting to help you get your JavaScript SEO working properly. The company believes that many people won't need this service because the product is just so easy to use, but BromBone promises to do whatever it takes to get your website working.
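As a rough illustration of that proxy step, here is what it might look like as Express middleware. The snapshot host below is a made-up placeholder rather than BromBone's actual URL scheme; BromBone's instructions supply the real snippet for your server:

const express = require('express');
const https = require('https');

const app = express();

app.use(function (req, res, next) {
  // Crawlers following Google's AJAX crawling scheme request
  // pages with an ?_escaped_fragment_= query parameter.
  if (req.query._escaped_fragment_ === undefined) return next();

  // Relay the prerendered snapshot to the bot.
  https.get('https://snapshots.brombone.example' + req.path, function (snapshot) {
    res.set('Content-Type', 'text/html');
    snapshot.pipe(res);
  });
});

app.use(express.static('public'));
app.listen(8080);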


3. Angular JS SEO

When Google indexes a page built this way, it is not reading the data, it is reading the templates. Because of this, it is necessary to write code for the server that sends Google a version of the site that does not depend on JavaScript. Sometimes this means that you basically have to duplicate the entire thing; some of the code can be reused with the functionality removed, but that still takes time. You could use PhantomJS to render snapshots of your pages and serve them to Google, but it constantly crashes, it uses far more RAM than expected and overloads servers, and it moves at a snail's pace, which can translate into lower Google rankings because pages just aren't generated fast enough.
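For context, a hand-rolled PhantomJS snapshot script of the kind described above looks roughly like this; the URL and the fixed two-second wait are illustrative, and that naive wait is part of what makes the approach slow and fragile:

var page = require('webpage').create();

page.open('http://example.com/#!/products', function (status) {
  if (status !== 'success') {
    phantom.exit(1);
  } else {
    // Crude fixed wait for AJAX content to finish rendering; real
    // scripts need smarter readiness checks.
    setTimeout(function () {
      console.log(page.content); // the fully rendered HTML snapshot
      phantom.exit();
    }, 2000);
  }
});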

With the Angular JS SEO service, HTML snapshots are prerendered for all pages without the need for you to make any changes to your code; you simply fetch the snapshot when Google begins the crawl. Server maintenance is taken care of entirely, because developers have enough to do as it is. The service lets your whole team worry less about maintenance while it keeps your site ready to be indexed. It is vital for your website to be crawlable and indexable in order to show up in Google's search results, and Angular JS SEO keeps your SEO in its prime form. That frees you to put your attention toward making a great site with no need to duplicate your code, manage extra servers, or wrestle with headless browsers.


4. SEO.js

This program makes your JavaScript app conducive to being crawled by Google, thus making its pages appear within search results. You integrate it on your own server using a snippet of text that you copy and paste into your server configuration; it works with Nginx, Ruby on Rails, Apache, and most other setups. When you add your website to the SEO.js dashboard, the program visits your pages and puts together an HTML snapshot of each one, with all of the dynamic content fully rendered. It also makes sure that the snapshots are regularly updated to reflect recent changes. When a search engine crawler such as Googlebot visits a page, it automatically sees the fully rendered content, because SEO.js delivers the prerendered version of the page from the snapshot.

It is easy to use SEO.js for your site. All that you need to do is add your website to the SEO.js dashboard and paste the integration text into your own web server's configuration; that's it. Once everything is set up, you do not need to check back with the program, as it keeps working in the background. When you make changes to the website or add pages, the program gathers the new information and shares the updated content with Google. A page that loads quickly is given priority by search engines, and the technologies implemented by SEO.js ensure that snapshots are served with very low latency; in other words, they load fast. SEO.js also automatically creates your XML sitemap, so that is one less thing to worry about. A rough sketch of what such an integration can look like follows below.
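SEO.js supplies the exact integration text for your stack from its dashboard; purely to illustrate the general shape of that kind of integration, here is a hypothetical Express version that detects crawlers by user agent (the snapshot endpoint and site URL are invented for the example):

const express = require('express');
const https = require('https');

const app = express();
const BOTS = /googlebot|bingbot|yandexbot|baiduspider/i;

app.use(function (req, res, next) {
  if (!BOTS.test(req.get('User-Agent') || '')) return next();

  // Serve the prerendered snapshot instead of the empty app shell.
  const url = 'https://snapshots.seojs.example/?url=' +
    encodeURIComponent('https://yoursite.com' + req.url);
  https.get(url, function (snapshot) {
    res.set('Content-Type', 'text/html');
    snapshot.pipe(res);
  });
});

app.use(express.static('public'));
app.listen(8080);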


5. Year of Moo

Year of Moo is a blog created by a programmer to help other programmers expand their education regarding web development, software design, and computer science. The purpose of the blog is to help anyone interested in expanding their knowledge about development on the web, with a focus on Angular, HTML5, CSS, animation, JavaScript, Ruby, and more. More specifically, you can find articles on topics like improving JavaScript SEO. With AngularJS, the framework rewrites the HTML structure in the browser, making the HTML that the server initially rendered invalid for search engines. To get around this, you use a specific URL route and a headless browser to retrieve the final HTML and support SEO with AngularJS.

Bing and Google both support hashbang URLs, which inform the search engine that the site being accessed contains AJAX content. The hashbang URL is converted into a form the server can access (the _escaped_fragment_ version of the URL), that URL is then visited, and the content should appear as it would to a user seeing the final version. With AngularJS, you configure your own web server to run a headless browser that accesses your page and returns the final HTML for that URL, which Google will use. If you prefer not to use hashbang URLs while still informing Google of your website's AJAX content, you will need to include a special meta tag in the header of the file being accessed, and then configure AngularJS to use HTML5 URLs when manipulating URLs and routes. Hashbang URLs are suggested so that AJAX pages (like a basic form) can intermingle with non-hashbang HTTP URLs (like a registration page). Both setups are sketched below.
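In AngularJS terms, the choice between the two setups comes down to a small routing configuration; a sketch:

angular.module('app', []).config(function ($locationProvider) {
  // Hashbang routing: URLs look like http://example.com/#!/about,
  // which signals AJAX content under Google's crawling scheme.
  $locationProvider.hashPrefix('!');

  // Alternatively, HTML5 mode gives clean URLs (http://example.com/about);
  // pair it with <meta name="fragment" content="!"> in the page head so
  // crawlers still know to request the snapshot.
  // $locationProvider.html5Mode(true);
});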


6. Backbone JS SEO

When content is hidden behind a template, it is difficult for Google to access: it sees empty tags. It is vital for Google to be able to see the final product so that your site shows up in search results. JavaScript helps make a web page visually pleasing and powers a great experience, but you would normally need to duplicate the code on the server to make it feasible for Google to read it. Using this Backbone service saves you valuable time: one of its largest benefits is preventing that annoying duplication. You won't need to recode anything, so you can focus on making the site the best that it can be for the client. It takes the worry out of what happens behind the scenes so that you can give your attention to your customers, and the architecture of your servers becomes simpler.
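To see the problem concretely, consider a minimal client-side Backbone view (this sketch assumes Backbone, Underscore, and jQuery are loaded, and that the server ships nothing but an empty <div id="product"></div>):

var ProductView = Backbone.View.extend({
  el: '#product',
  template: _.template('<h1><%= name %></h1><p><%= description %></p>'),

  render: function () {
    // The visible content only exists after this runs in the browser,
    // which is why a crawler that skips JavaScript sees empty tags.
    this.$el.html(this.template(this.model.toJSON()));
    return this;
  }
});

new ProductView({
  model: new Backbone.Model({ name: 'Widget', description: 'A fine widget.' })
}).render();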

The service takes the pressure off by not asking you to install anything extra on your server: it processes your pages, outputs a static HTML version of the content, and continually keeps it up to date. When Google starts a crawl, the service hands this static version of the page to the bot and life goes on. The team offers to walk programmers through each step, though they claim that it never gets more difficult than copying and pasting. BromBone is the engine behind this Backbone integration: it makes sure that the final result looks exactly like what users will see, and that static version is passed on to Googlebot when it asks for it. Again, the service follows what Google suggests, so you know that the site will be crawled properly.

Author: Garenne Bigby
Website: http://garennebigby.com
Founder of DYNO Mapper and Former Advisory Committee Representative at the W3C.
