A simple server that provides rendered HTML to crawlers for AJAX sites.
prepare the web app
Make sure it is "AJAX crawlable", which means it adopts the hashbang (#!) URL scheme. Read Google's AJAX crawling documentation again if you are unsure.
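For example, a page your app serves at http://yoursite.domain/#!/id/12 will be requested by crawlers as http://yoursite.domain/?_escaped_fragment_=/id/12; that escaped form is what the nginx rule below matches.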
prepare the server side
- Install PhantomJS. On a Mac:
$ brew install phantomjs
On Debian/Ubuntu:
$ sudo apt-get install phantomjs
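You can verify the install with:
$ phantomjs --version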
- Start the SEO server (a sketch of what it does follows this list):
$ phantomjs seo.js
- Set up nginx by adding the snippet below to your site configuration:

if ($args ~ _escaped_fragment_) {
    rewrite ^ /snapshot$uri;
}

location ~ ^/snapshot(.*) {
    rewrite ^/snapshot(.*)$ $1 break;
    proxy_pass http://localhost:8888;
    proxy_set_header Host $scheme://$host;
    proxy_connect_timeout 60s;
}
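In plain terms: any request whose query string contains _escaped_fragment_ (which is how crawlers ask for the snapshot of a hashbang URL) is internally rewritten to /snapshot..., and the /snapshot location strips that prefix again and proxies the original URI to the PhantomJS server on port 8888, passing the original scheme and host along in the Host header. After adding the snippet, reload nginx:
$ sudo nginx -s reload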
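seo.js is the script shipped with this repository; you do not need to write it yourself. Purely as an illustration of what such a snapshot server does, here is a minimal PhantomJS sketch (not the actual seo.js; the 2-second render delay and the Host-header parsing are assumptions):

// sketch.js -- illustration only, not the repository's seo.js
var webserver = require('webserver').create();
var webpage = require('webpage');
var port = 8888;

webserver.listen(port, function (request, response) {
    // Assumption: nginx passes the original scheme/host in the Host header.
    var host = request.headers['Host'] || 'localhost:3000';
    var target = (host.indexOf('://') === -1 ? 'http://' + host : host) + request.url;

    var page = webpage.create();
    page.open(target, function (status) {
        // Assumed fixed delay so client-side JavaScript can finish rendering.
        setTimeout(function () {
            response.statusCode = status === 'success' ? 200 : 500;
            response.setHeader('Content-Type', 'text/html; charset=UTF-8');
            response.write(page.content); // the rendered DOM, not the empty shell
            response.close();
            page.close();
        }, 2000);
    });
});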
Test the setup. Through nginx (crawlers request the _escaped_fragment_ form of a hashbang URL, which is what the rule above matches):
$ curl 'http://yoursite.domain/page?_escaped_fragment_=/id/12'
$ ## verify the response is fully rendered HTML

Against the SEO server directly, if your app is running at http://localhost:3000:
$ curl 'http://localhost:8888/page#!/id/12' --header 'Host: localhost:3000'
$ ## verify the response is fully rendered HTML
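In both cases the response should contain the final DOM produced by your JavaScript, not the empty application shell a browser would normally bootstrap.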
For the index page of your site, add this to the HTML if you have not already:
<meta name="fragment" content="!" />
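This tells crawlers that the index page, whose URL has no #! in it, also has a pre-rendered snapshot: they will request it as http://yoursite.domain/?_escaped_fragment_= and nginx will route that to the SEO server.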
If you have trouble with HTTPS URLs, start PhantomJS like this instead:
$ phantomjs --ssl-protocol=any seo.js
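If certificate errors are also a problem, you can additionally pass --ignore-ssl-errors=true.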