Prerender.io for WordPress – A How-To Guide

So, you’re thinking of using Prerender.io for your WordPress website or blog? Here’s a detailed step-by-step guide that summarises everything you need to get the prerender service up and running on your WordPress installation.

Note! If you’re already up-to-speed on the benefits of the service, you can skip straight to ‘Ok, I’m sold! How do I get it fired up on WordPress?’

What’s Prerender.io and why would I need it?

To answer this question in two words: SEO and JavaScript…

It’s no secret that many websites use JavaScript to enrich onsite experiences and functionality (have you ever tried browsing with JS turned off!?), from autocomplete search to product sliders. However, JS could be serving you up a hot plate of you-know-what if your website depends on it to display content. This is a considerable problem with many premium WordPress themes:

Not all search engines (cough Bing cough) can process JS

Surprisingly, although Bing has announced that it can, in most cases and in testing it seems it can’t, or doesn’t. No doubt other search engines will have similar limitations.

This limitation isn’t restricted to search engines either. Twitterbot, Slackbot and most other bots cannot process JS, which could make it much more difficult for many popular bots to fetch your website’s content.

Google can render JS, but defers crawling it

Yup, Google can execute JavaScript. However, doing so requires a healthy dose of server juice to get the job done. So much so that Googlebot makes a second pass of your website once Google has the resources available, which could be days after the first crawl.

Having your content imprisoned in JavaScript makes it slower for Google to discover, and is likely to make your website slower for visitors too.

This is where Prerender.io can help.

  1. It prerenders the JavaScript, so search engines and bots don’t have to; ultimately caching an already-executed copy of your JavaScript-powered web page and saving it as a static HTML file.
  2. It reveals to bots any content that JavaScript could be hiding.

If you’re not sure whether your website’s hiding content behind layers of JavaScript, there’s a quick and lightweight extension you can add to Google Chrome that lets you toggle JavaScript on and off, allowing you to see how your site behaves.

Here’s the great news: Prerender.io is free for up to 250 pages!

Ok, I’m sold! How do I get it fired up on WordPress?

First, here’s what you’ll need:

  1. A Prerender.io account and your prerender token
  2. Access to your .htaccess file (it’s a hidden file, so you’ll need to tick ‘Show Hidden Files (dotfiles)’ to reveal it if you’re using cPanel’s File Manager)
  3. Access to your website’s header so you can add a small meta tag
  4. A copy of the code to be added to your .htaccess file
  5. A complete backup of your original .htaccess file and FTP access, just in case!
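That last prerequisite is worth sorting before you touch anything else. A minimal sketch, assuming you’re in your WordPress installation’s root directory:

```shell
# A minimal sketch: take a dated backup of .htaccess before editing it.
# Run from your WordPress installation's root directory.
backup=".htaccess.backup-$(date +%Y%m%d)"
if [ -f .htaccess ]; then
  cp .htaccess "$backup"
  echo "Backed up .htaccess to $backup"
else
  echo "No .htaccess found in this directory"
fi
```

Restoring is just the reverse: copy the backup file over .htaccess.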

Making changes to .htaccess

If you’ve changed your permalink settings from default, you’ll notice your .htaccess will contain the following:

# BEGIN WordPress

RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]

# END WordPress

Tip! Even if your .htaccess file is blank or does not contain the above code, it’s recommended you follow this step in any case as changing your permalink settings from default later down the line will break your integration.
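You can quickly confirm whether that standard WordPress block is present before you start. A small sketch, run from your WordPress root (the string checked is the literal comment WordPress writes):

```shell
# Check for the standard WordPress rewrite block in .htaccess.
if grep -q "BEGIN WordPress" .htaccess 2>/dev/null; then
  echo "WordPress rewrite block found"
else
  echo "No WordPress rewrite block found (check your permalink settings)"
fi
```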

Adding the middleware

Add the following code to the TOP of your .htaccess file:

#RequestHeader set X-Prerender-Token "YOUR_TOKEN"
RewriteEngine On

RewriteCond %{HTTP_USER_AGENT} googlebot|bingbot|yandex|baiduspider|facebookexternalhit|twitterbot|rogerbot|linkedinbot|embedly|quora\ link\ preview|showyoubot|outbrain|pinterest|slackbot|vkShare|W3C_Validator [NC,OR]
RewriteCond %{QUERY_STRING} _escaped_fragment_

# Only proxy the request to Prerender if it's a request for HTML
RewriteRule ^(?!.*?(\.js|\.css|\.xml|\.less|\.png|\.jpg|\.jpeg|\.gif|\.pdf|\.doc|\.txt|\.ico|\.rss|\.zip|\.mp3|\.rar|\.exe|\.wmv|\.avi|\.ppt|\.mpg|\.mpeg|\.tif|\.wav|\.mov|\.psd|\.ai|\.xls|\.mp4|\.m4a|\.swf|\.dat|\.dmg|\.iso|\.flv|\.m4v|\.torrent|\.ttf|\.woff))(.*) https://service.prerender.io/https://yourwebsite.com/$2 [P,L]

    1. Change "YOUR_TOKEN" to your prerender token (keep the quotes) and remove the prefixed #
    2. Change the last line of code to the following (the changes are the new (index\.html\.var|index\.php)? group and the backreference moving from $2 to $3):
RewriteRule ^(?!.*?(\.js|\.css|\.xml|\.less|\.png|\.jpg|\.jpeg|\.gif|\.pdf|\.doc|\.txt|\.ico|\.rss|\.zip|\.mp3|\.rar|\.exe|\.wmv|\.avi|\.ppt|\.mpg|\.mpeg|\.tif|\.wav|\.mov|\.psd|\.ai|\.xls|\.mp4|\.m4a|\.swf|\.dat|\.dmg|\.iso|\.flv|\.m4v|\.torrent|\.ttf|\.woff))(index\.html\.var|index\.php)?(.*) https://service.prerender.io/https://yourwebsite.com/$3 [P,L]

This amendment ensures that your home page is also captured by Prerender.io when bots request it, if you’ve changed your permalinks from their default setting. (The out-of-the-box Apache code prevents this from happening.)

    3. Change the domain in the rule to your site’s own domain, paying attention to the protocol (http OR https) and ensuring the trailing slash remains
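To sanity-check which visitors the middleware would proxy, you can exercise the same user-agent pattern from the RewriteCond locally. A sketch using grep (the example user-agent strings are illustrative; Apache’s [NC] flag makes the match case-insensitive, which grep -i mirrors):

```shell
# The bot pattern from the RewriteCond above, tested outside Apache.
BOT_PATTERN='googlebot|bingbot|yandex|baiduspider|facebookexternalhit|twitterbot|rogerbot|linkedinbot|embedly|quora link preview|showyoubot|outbrain|pinterest|slackbot|vkShare|W3C_Validator'

check_ua() {
  if printf '%s' "$1" | grep -qiE "$BOT_PATTERN"; then
    echo "proxied to Prerender"
  else
    echo "served normally"
  fi
}

check_ua "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"  # proxied to Prerender
check_ua "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Firefox/88.0"                    # served normally
```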

Adding in the meta tag

Simply add the meta tag <meta name="fragment" content="!"> anywhere before the closing </head> tag on every page you wish to have processed by Prerender.io.


Testing your integration

Now, here’s where the fun (or the hair pulling) begins! The good news is, testing your integration is pretty simple. Assuming you’ve added the meta tag from the previous step, you can append ?_escaped_fragment_= to any URL you’re looking to test.

For example: https://yourwebsite.com/sample-post/?_escaped_fragment_= (where yourwebsite.com is your own domain).

This URL parameter will fetch the latest static HTML snapshot for a given page. If the response you receive looks the same as (or aesthetically similar to) the ‘full fat’ version of your website, then you’re almost there.

A quick (but important) note on cloaking

Cloaking can be damaging for your website’s SEO and overall organic performance. To summarise, cloaking is when your website serves one version of your content to bots (specifically search engine bots) and a different version to visitors.

If Prerender.io is caching and serving incomplete or inconsistent versions of your pages, your site may be negatively impacted, as search engines could assume your website is cloaking. It’s important to ensure your static HTML files are as close as possible to your live website in terms of content and layout.

Final checks

As always, measure twice (three times if you’ve the focus) and cut once. Before going live, I recommend you make these final checks:

1) Ensure your web copy is consistent. I use DiffChecker to quickly compare two blocks of code or text for changes. If there’s anything missing or anything weird being added in, it’ll be highlighted.

Tip! Pay particular attention to hyperlinks and special characters.
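The same comparison can be done locally with plain diff. A minimal sketch; live.html and snapshot.html stand in for saved copies of the normal page and the ?_escaped_fragment_= response (the example writes tiny stand-ins so the commands run as-is):

```shell
# Create stand-in files; in practice you'd save these with curl, e.g.
#   curl -s https://yourwebsite.com/post/ > live.html
#   curl -s "https://yourwebsite.com/post/?_escaped_fragment_=" > snapshot.html
printf '<p>Hello <a href="/about">about</a></p>\n' > live.html
printf '<p>Hello </p>\n' > snapshot.html

# Any missing hyperlinks or characters show up as differences.
if diff -u live.html snapshot.html > /dev/null; then
  echo "No differences found"
else
  echo "Differences found: review hyperlinks and special characters"
fi
```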

2) Is Googlebot fetching and rendering as expected? Fire up Search Console and navigate to Fetch as Google. (The URL Inspection Tool is also handy if you’re using the newer version of Search Console.)

3) Set your cache freshness. This determines how old (in days) your static HTML files should be before being automatically refreshed. Too long, and search engines won’t be exposed to your page’s updated content as soon as you’d like.

4) Make sure your requests are showing in Prerender.io’s crawl stats reports. Here you can grab the raw HTML that Prerender.io has stored, review the name of the bot that made the request and see the number of requests over time.

So assuming the above is all in order, you’re probably ready to hit the big red button and deploy your changes to live. No more JS vs SEO problems. Go you!

However, as always – there are potential problems one can face. A few of the more frequent stumbling blocks are highlighted below:


1) Issues with 404 errors

If search engines aren’t properly recognising your website’s 404 status codes, you can add a meta tag which forces Prerender.io to send your website’s 404 page to bots/crawlers when a 404 error is triggered. You can find further information on this remedy in Prerender.io’s documentation.
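For reference, the meta tag in question (as documented by Prerender.io; double-check their current docs for your setup) is placed in the head of the 404 page so the service passes the status through to bots:

```html
<!-- Tells Prerender to return a 404 status for this page -->
<meta name="prerender-status-code" content="404">
```

WordPress themes typically render 404s through a 404.php template, so a conditional is_404() check in your header is one way to output it only where needed.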

2) Some of the page missing in the static file?

This can happen if your web page has a lot of JavaScript, which causes your pages to load slower than usual. You can delay the snapshot by manually telling Prerender.io when your page has finished loading.

It works by inserting a few meta tags as close to the beginning and end of your page’s source code as possible, forcing Prerender.io to wait until the second meta tag is discovered before taking a snapshot of the page.

3) Don’t forget to double-check in incognito mode / an alternative browser

Sometimes the problem can be hidden in plain sight; in this case, your web browser. Previewing your changes in incognito mode or another browser altogether helps ensure you’re bypassing your browser’s cache and grabbing fresh copies of all files direct from the server, instead of from the browser’s temporary storage.

4) None of the above fixing the issue? It’s time to point the finger at your plugins

Plugin conflicts are frequent on WordPress websites. It’s tedious work (especially if you’ve a healthy number of plugins installed), but it could be just the thing to find where the gremlins lie. Disable each plugin one at a time and refresh the page to see whether your problem is resolved.

5) Still no luck? Reach out to Prerender.io (their support is awesome)

Need additional support? The Prerender.io team are lightning fast in responding and super-helpful. In fact, it’s through their support that this blog post came to be. Sending over a copy of your .htaccess file is a great place to start.

Final note: W3TC and/or Autoptimize play nicely with Prerender.io

My own blog runs both W3 Total Cache and Autoptimize to make things as fast as possible. Every site installation’s different, but for this site, Prerender.io and the aforementioned plugins play beautifully with one another. The plugins themselves can also be activated and deactivated without causing any grief for me or Prerender.io.

To ensure the best possible experience for both bots and your visitors, caching plugins are highly recommended and no WordPress website should be without them in my opinion!

I hope this guide was at least somewhat helpful, whether it simply got you thinking about your potential JS / SEO gripes or gave you the confidence to get Prerender.io running on your website.

Thanks for reading!