Single page applications (SPAs) are websites served from a single HTML file (e.g. index.html) that route to multiple URLs via AJAX and JSON. Most of the rendering occurs client side, i.e. in the browser, rather than server side, as with traditional websites.
The site looks like it has multiple pages, but from the web server’s perspective, it’s a single page.
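To make this concrete, here is a minimal sketch of hash-based client-side routing. The route table and view names are illustrative assumptions; real SPAs delegate this to a framework router such as react-router or vue-router.

```javascript
// Illustrative route table: hash fragment -> view name.
const routes = {
  "#/": "Home",
  "#/about": "About",
};

// Resolve a hash fragment to a view, falling back to a "Not found" view.
// No request ever reaches the server -- the URL changes purely in the browser.
function resolve(hash) {
  return routes[hash] || "Not found";
}

// In a browser you would wire this to the hashchange event, e.g.:
// window.addEventListener("hashchange", () => {
//   document.getElementById("app").textContent = resolve(location.hash);
// });
```

From the server's point of view, every one of these "pages" is the same single document; only the fragment after `#` changes.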
SPAs are here to stay, and the use of frameworks such as React, AngularJS, Vue.js and Elm continues to rise.
Advantages of SPAs
Disadvantages of SPAs
Traditional Web Crawlers
Traditional web crawlers or ‘spiders’ find pages on the web via links from other pages and download their text and metadata. From there, they schedule the discovered links and visit them according to a priority evaluation scheme (e.g. PageRank).
The key here is that only the raw HTML (text & metadata) and hrefs (links) within the source code would be discovered and ‘crawled’. You can use CTRL+U to see the raw HTML of any page on the web.
This method of crawling relied on server-side HTML being served to the search engines. These HTML documents were often individual files (the base file being index.html) on the web server, retrieved via HTTP requests.
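The discovery step described above can be sketched in a few lines. This is a simplification of my own making: real crawlers use a proper HTML parser rather than a regex, but it shows the core point that only hrefs present in the server-delivered source are ever found.

```javascript
// Sketch of a traditional crawler's link-discovery step: it only sees
// the hrefs that exist in the raw HTML the server sends back. Links that
// a JavaScript bundle would inject later are invisible at this stage.
function extractLinks(html) {
  const links = [];
  const re = /href="([^"]+)"/g;
  let match;
  while ((match = re.exec(html)) !== null) {
    links.push(match[1]);
  }
  return links;
}

// A crawler would then queue these URLs for fetching, ordered by some
// priority score (e.g. PageRank), and repeat the process on each page.
```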
What this means in practice is that when you look at the raw source of a lot of web pages today, you see a lot of code and not a lot of text or typical HTML elements.
Often you’ll see links to .js files like this.
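A typical SPA shell might look something like this (the filenames and structure are illustrative, not from any particular framework):

```html
<!-- Illustrative SPA shell: almost no indexable text, just an empty
     mount point and references to JavaScript bundles. -->
<!DOCTYPE html>
<html>
  <head><title>My App</title></head>
  <body>
    <div id="app"></div>
    <script src="/static/js/vendor.bundle.js"></script>
    <script src="/static/js/app.bundle.js"></script>
  </body>
</html>
```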
Luckily, your browser knows what to do with this to transform it into something useful.
But what about search engines?
Client Side Rendering
Clients are typically the devices we use to receive information from centralised web servers, and since the beginning of the internet, nothing has changed in this respect.
What client-side rendering changes is that it shifts the work of assembling the finished page onto the client (laptop, tablet, smartphone), whereas the traditional server-side model put that work on the centralised server.
We can see from the above that the crawling model is similar, with the exception that there is a second step in the process where Google’s Web Rendering Service (WRS) takes a second look, renders the JavaScript and finalises the crawl.
Single page applications should now be good to go for SEO right?
Well, not so fast.
- Any bugs in your code can cause Google to trip up and not index your content as desired.
- Google assigns your site a certain amount of crawl budget, which can be measured in seconds. This budget is based on your site’s value as perceived by Google (using factors such as PageRank, brand equity and content quality). The longer your pages take to render, the fewer of them get crawled within that budget. This can be a real problem when launching a large site and trying to get it fully crawled and indexed.
- Some frameworks run natively with URLs that are unindexable (e.g. Vue.js hash fragments).
- Google recommends alternatives to client side rendering for elements that are critical to SEO.
- Rendering does not equal ranking. Experience shows that purely client-side-rendered (CSR) sites tend to perform worse in search.
- Bing and many smaller search engines really struggle to render content client-side. If you rely on search engines other than Google, SPAs definitely need some extra work to make them search engine friendly.
- Not all frameworks are as easy for Google to render client side as others. Ironically, Google’s own Angular framework wasn’t SEO friendly when it first launched.
- Social networks such as Facebook can’t render content client-side, so if your app has no raw HTML content, your social media efforts will struggle.
There are multiple approaches for optimising single page applications for SEO and social networks depending on the size of your site, the resources you have, how far you are into the development process and how reliant your site is or will be on organic visibility.
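One common approach is dynamic rendering: serving a prerendered snapshot to crawlers and the normal client-rendered app to everyone else. The sketch below shows the core decision; the bot list and the Express-style wiring in the comments are illustrative assumptions, not a production setup.

```javascript
// Dynamic rendering sketch: detect crawler user agents so they can be
// served prerendered HTML instead of an empty SPA shell.
// The pattern list below is illustrative and far from exhaustive.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /facebookexternalhit/i];

function isBot(userAgent) {
  return BOT_PATTERNS.some((re) => re.test(userAgent || ""));
}

// Hypothetical Express-style wiring:
// app.use((req, res, next) => {
//   if (isBot(req.headers["user-agent"])) {
//     servePrerenderedSnapshot(req, res); // e.g. via a prerender service
//   } else {
//     next(); // serve the normal client-rendered app
//   }
// });
```

Server-side rendering or static prerendering at build time achieve the same goal without user-agent sniffing, and are generally preferable if your framework supports them.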
Get in touch now and we can talk about your options and quote you on phase by phase recommendations.