When developing and optimising a website you’ll often reuse handy code snippets to achieve different tasks. Below we share snippets we’ve found useful for improving website performance and SEO while building websites on our hosting platform. Website performance tests and SEO scanners tend to flag the same handful of issues, so here is how to fix the most common ones and improve both the speed and the SEO crawlability of your website in the process. You can use these code snippets for WordPress and for any other website.
Remember that website performance and SEO go hand in hand. Improving your website performance means more user engagement, lower bounce rates, and better SEO rankings.
Another of our .htaccess SEO code snippets, and this one is something almost every website owner needs. When you have an SSL certificate you usually want to force all visitors onto the HTTPS version of the site. In addition we force the www version so that the search engines don’t penalise us for duplicate content. If redirects aren’t set up correctly, http://yourdomain.com, http://www.yourdomain.com, https://yourdomain.com and https://www.yourdomain.com can all be treated as separate websites in the eyes of the search engines. Fix this by redirecting everything to https://www. You will need a slightly different rule, shown after the snippet below, if you want to use the non-www version of your website.
RewriteEngine On

# First rewrite any request to HTTPS.
# Don't put www. here. If it is already there it will be included, if not
# the subsequent rule will catch it.
RewriteCond %{HTTPS} off
RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# Now, rewrite any request on the wrong domain to use www.
# [NC] makes the match case-insensitive.
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule .* https://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
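If you would rather standardise on the non-www version of your domain, reverse the second rule: match hosts that do start with www. and redirect them to the bare domain. A rough sketch of that variant (the %1 backreference picks up the hostname captured without the www. prefix):

RewriteEngine On

# Force HTTPS as before
RewriteCond %{HTTPS} off
RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# Redirect any www. host to the bare, non-www domain
RewriteCond %{HTTP_HOST} ^www\.(.*)$ [NC]
RewriteRule .* https://%1%{REQUEST_URI} [L,R=301]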
Many websites and WordPress themes load Google Web Fonts in a way that blocks page rendering and hurts your website performance. The solution is to load your fonts asynchronously, so they download alongside the rest of your content instead of delaying it. This is one of the fastest ways to load Google Web Fonts on your website. Just remove any other Google Web Fonts scripts from your site, add this code at the bottom of your <body> tag, and set which font families you want to load:
<script src="https://ajax.googleapis.com/ajax/libs/webfont/1.5.18/webfont.js"></script>
<script type="text/javascript">
  WebFont.load({
    google: {
      families: ['Ubuntu:300,400,600,700', 'Dosis:400']
    }
  });
</script>
Performance optimisation tweaks often just mean enabling things that are already set up on your hosting server. Gzip compresses your files before they are sent to the visitor and can deliver significant speed improvements. If your server supports the Apache mod_gzip module there is no reason not to make use of it: add the following lines to your .htaccess file and it will automatically compress all .html, .txt, .css, .js, .php and .pl files.
<IfModule mod_gzip.c>
  mod_gzip_on Yes
  mod_gzip_dechunk Yes
  mod_gzip_item_include file \.(html?|txt|css|js|php|pl)$
  mod_gzip_item_include handler ^cgi-script$
  mod_gzip_item_include mime ^text/.*
  mod_gzip_item_include mime ^application/x-javascript.*
  mod_gzip_item_exclude mime ^image/.*
  mod_gzip_item_exclude rspheader ^Content-Encoding:.*gzip.*
</IfModule>
Your server might be using mod_deflate instead of mod_gzip (covered in the previous snippet), in which case you can add the code below to your .htaccess file to compress your static files:
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/text text/html text/plain text/xml text/css application/x-javascript application/javascript
</IfModule>
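Whichever module your server uses, it’s worth verifying that compression is actually being applied. Assuming you have curl available, one quick check (swap in a URL from your own site) is to request a file with an Accept-Encoding header and look for Content-Encoding: gzip in the response headers:

curl -s -o /dev/null -D - -H "Accept-Encoding: gzip" https://www.yourdomain.com/style.css | grep -i content-encoding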
If you are using an SEO scanner tool, you have probably seen “HSTS” flagged as an issue. It’s more of a security snippet than a performance one, but since it is often reported by SEO scanners we are including it here. To enable HSTS (HTTP Strict Transport Security) on your domain, just add this line to your .htaccess file:
Header set Strict-Transport-Security "max-age=31536000" env=HTTPS
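The env=HTTPS condition ensures the header is only sent over secure connections, and the directive relies on Apache’s mod_headers module being enabled. If every subdomain of your site is also served over HTTPS, you can extend the policy to cover them too, for example:

Header set Strict-Transport-Security "max-age=31536000; includeSubDomains" env=HTTPS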
It’s beyond the scope of a short code snippet to cover every browser caching possibility, as there are lots of combinations of Cache-Control, mod_expires, ETag and Last-Modified headers and more. The good news is that you can implement browser caching very easily with just the Cache-Control header, and not worry about the others unless you want extra control. The snippet below can be added to your .htaccess file to implement browser caching on your website: the first rule sets your static .css, .js, .htm and .html files to cache for 7 days (604800 seconds) and the second sets your images to cache for one month (2592000 seconds):
<FilesMatch "\.(htm|html|css|js)$">
  Header set Cache-Control "max-age=604800, public, must-revalidate"
</FilesMatch>
<FilesMatch "\.(jpg|jpeg|png|gif|ico)$">
  Header set Cache-Control "max-age=2592000, public, must-revalidate"
</FilesMatch>
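If you prefer to set Expires headers with mod_expires instead (some SEO tools specifically ask for them), a roughly equivalent sketch, assuming the module is available on your server, would be:

<IfModule mod_expires.c>
  ExpiresActive On
  # Static text assets cached for 7 days
  ExpiresByType text/css "access plus 7 days"
  ExpiresByType application/javascript "access plus 7 days"
  ExpiresByType text/html "access plus 7 days"
  # Images cached for one month
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/gif "access plus 1 month"
  ExpiresByType image/x-icon "access plus 1 month"
</IfModule>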
Not a website performance tweak, but understanding the robots.txt file is essential for SEO. Why would you want to disallow everything? It’s important to make sure search engine spiders don’t crawl and index your development or staging websites. Just create a file called robots.txt in the root directory of the site and add the following (but remember not to use this on your live website):
User-agent: *
Disallow: /
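On your live website you want the opposite: let the spiders crawl everything and, ideally, point them at your sitemap. A minimal live-site robots.txt looks something like this (the sitemap URL is just a placeholder for your own):

User-agent: *
Disallow:

Sitemap: https://www.yourdomain.com/sitemap.xml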
Did you know that you can also harden the security of your website using your .htaccess file? Read our .htaccess security tips article for more useful security tweaks. Looking for more website performance, SEO, and website security code snippets? Check out our developer resource code.hostasean.com for more code snippets and our own WordPress plugins. Having problems, or got more tips? Let us know in the comments.