SEO for Single Page Apps and PWAs – Part 1

Posted on May 10, 2018

Imarc has been using Single Page App (SPA) techniques to build websites for some time, and has been experimenting with Progressive Web Apps (PWAs) as well. Although these patterns aren’t ideal for every website, with them you can achieve a very “app-like” experience using standard web technologies. SPAs let you create super fast, highly usable, and far more interactive experiences than a traditional website architecture allows.

We’ve mainly used these app-like models to build intranets and other internal tools for our clients, because historically such models were not compatible with Google’s search crawlers. They were, in short, bad for SEO. But that’s changing. In the past few years Google has added fairly robust support for JavaScript and has greatly improved its compatibility with SPAs.

To understand how to generate search-friendly content, though, you should know a bit of the history of how search engines and dynamic web content have interacted.

The Mesozoic of the Web

Way back in the olden days of 1998, Google ignored JavaScript completely. Every search engine did. (Remember AltaVista?) The best practice was to avoid dynamically created content unless you were generating it server-side. Good old document.write(), with us since 1995, became an anti-pattern (not that it was ever a great pattern to begin with).

It was a simple rule of thumb: JavaScript content = search-unfriendly. Just don’t do it.
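To make that rule concrete, here’s a toy simulation, not a real crawler, of what a 1998-era indexer effectively saw: only the raw HTML the server sent, with scripts never executed. The page markup and product name below are invented for illustration.

```javascript
// A server response containing both static HTML and content
// that only exists after a script runs in the browser.
const html = `
  <h1>Acme Widgets</h1>
  <script>
    // Rendered in the visitor's browser only:
    document.write('<p>On sale today: the Widget 3000!</p>');
  </script>`;

// Strip <script> blocks and keep everything else, which is roughly
// what an early crawler "saw" when it fetched the page.
function crawlerView(rawHtml) {
  return rawHtml.replace(/<script[\s\S]*?<\/script>/gi, '');
}

const indexed = crawlerView(html);
console.log(indexed.includes('Acme Widgets')); // true: server-sent HTML
console.log(indexed.includes('Widget 3000'));  // false: written by JS at runtime
```

The static heading gets indexed; the script-written sale notice simply doesn’t exist as far as the index is concerned. That asymmetry is the whole reason for the rule.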

Image: Netscape's page loading indicator, aka a throbber
Pour one out for Netscape 2.0, the birthplace of JavaScript

Around 2005, the web started changing in some fundamental ways. Web developers were figuring out more clever ways to use JavaScript to communicate with a server and update the local web page. Jesse James Garrett dubbed it AJAX, and off we went, making fully interactive web pages with content generated in the browser.
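The pattern Garrett described can be sketched in a few lines. This uses today’s fetch() API rather than the XMLHttpRequest object of 2005, and the /api/products endpoint, field names, and #product-list element are all hypothetical.

```javascript
// Pure rendering step: turn fetched data into markup. Keeping this
// separate from the network call makes it easy to test in isolation.
function renderProducts(products) {
  return products
    .map(p => `<li>${p.name}: $${p.price}</li>`)
    .join('');
}

// In the browser: request data from the server, then update part of the
// current page in place. No full page reload, no new HTML document.
async function loadProducts() {
  const response = await fetch('/api/products');
  const products = await response.json();
  document.querySelector('#product-list').innerHTML = renderProducts(products);
}
```

Note that the markup here is built entirely in the browser, which is exactly the content a pre-JavaScript crawler would never see.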

But, even though some Google teams were aggressively pushing the JavaScript-rich web forward with apps like Google Docs and Google Maps, the search team’s GoogleBot was not. Any content you wanted to be Googleable needed to be created on the server and then sent to the visitor.

As a web developer, this felt like a waste of effort. Frustrating. If you wanted to deliver the richest experience possible to visitors while still appearing in Google results, you had to build your web page twice: once for Google, and then again as the site you actually wanted to build for your users. (The approach was called “progressive enhancement,” and was nominally about supporting all users. But really, it was about supporting Google.)

This was the received wisdom of SEO for nearly two decades: good SEO meant no JavaScript dependencies.

Thank heavens, things change.

Modern search bots should speak JavaScript

In 2014 Google quietly announced that GoogleBot would now support JavaScript. In addition to simply scraping page content, Google would run JavaScript and fully render page content. GoogleBot basically uses a version of the Chrome browser for this.

Google put some performance disclaimers on it, but over the following year SEOs found that Google was actually pretty good about running on-site scripts. Fears of JavaScript-generated content failing to show up in the index proved largely unfounded.

By 2017, the consensus among SEO practitioners was that Single Page Apps were not just search-safe but Google-friendly. Today, if you want to build an SPA, Google is just fine with it. So long as you observe a few guidelines and do some testing, building your site as an SPA shouldn’t hurt your search rankings.

In Part 2, we dive into the nuts and bolts of SEO for SPAs and PWAs, offering concrete guidance on search engine optimization for modern website architectures.