Goodbye Wordpress, Hello Jekyll!

I've finally given up on WordPress, and moved this blog over to Jekyll, the "blog-aware static site generator". My blog is now 100% static HTML. No database, no dynamic code, and hopefully no downtime the next time one of my posts makes it to the front page of Hacker News!

My primary motivation for the move was that WordPress would fall over every time my blog got a decent amount of traffic, even with WP-SuperCache enabled. That's far from the only benefit though. I no longer have to worry about keeping WordPress updated, or about it getting hacked (which has happened a few times). Tom Preston-Werner, GitHub co-founder and Jekyll author, talks about even more benefits in his article Blogging like a Hacker.

Before settling on Jekyll I evaluated a few different static site generators, including rstblog and Hyde. What tipped it in Jekyll's favour was the integration with GitHub. Thanks to GitHub Pages this blog is now served directly from GitHub, so I don't even need to worry about a server!

Migrating from Wordpress

Paul Stamatiou did a write-up of his own experience of moving from WordPress to Jekyll. I took a slightly different approach. Rather than installing Ruby on my server and running the normal migration script, I took an XML export of my posts from WordPress and ran a PHP script against it on my local machine.
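Whatever script does the conversion, the target format is the same: one file per post in Jekyll's _posts directory, named with the post date and slug, and starting with a YAML front matter block. A minimal sketch of what each generated file looks like (the filename, title, and body below are placeholders, not taken from my actual export):

```shell
mkdir -p _posts

# Each migrated post becomes a file like this. Jekyll reads the date
# from the filename and the metadata from the front matter block.
cat > _posts/2012-01-20-hello-jekyll.markdown <<'EOF'
---
layout: post
title: "Hello Jekyll"
---

Post body converted from the WordPress XML export goes here.
EOF
```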

The script took care of creating files for each of the posts, but those posts still had a load of links to images and scripts that would need to be updated. This was where the benefits of having your posts locally as flat files became really clear: with a little bit of command line magic I could update all my links, and download all the linked files!

As I was using WordPress, I knew that every link needing an update would contain "wp-content", so I could pull them all out with grep:

$ cd _posts
$ grep -Eho "[^\"']*/wp-content/[^\"']*" *

I redirected these to a file named files, and then downloaded them all with wget:

$ mkdir files
$ cd files
$ cat files | xargs wget

Next I updated the URLs, converting each full wp-content link into a local path like /files/google-index.png:

$ find . -type f -exec sed -i -e "s/[^\"']*\/wp-content\/[^\"']*\/\([^\"']*\)/\/files\/\1/g" {} \;
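To sanity-check the pattern, here's the same substitution run against a single made-up link in the shape WordPress produces (the domain and filename are purely illustrative). Note that on BSD/macOS sed the in-place flag needs an argument, sed -i '':

```shell
# Run the substitution on one illustrative line instead of in-place on files:
echo '<img src="http://example.com/wp-content/uploads/2011/05/google-index.png">' \
  | sed -e "s/[^\"']*\/wp-content\/[^\"']*\/\([^\"']*\)/\/files\/\1/g"
# prints: <img src="/files/google-index.png">
```

The leading [^\"']* stops the match at the opening quote of the attribute, so the entire absolute URL is swallowed and replaced, while the final captured group keeps just the filename.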

All that was left to do then was to push it to GitHub, and let them take care of the rest!
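Publishing is then just ordinary git. A rough sketch of that final step, using a hypothetical blog directory and with the push commented out (for a user site, GitHub Pages serves the master branch of a username.github.com repository):

```shell
# Hypothetical local copy of the Jekyll source:
mkdir -p blog/_posts
echo "Hello Jekyll" > blog/_posts/2012-01-20-hello-jekyll.markdown

cd blog
git init -q
git add .
git -c user.name="example" -c user.email="example@example.com" \
    commit -q -m "Import posts migrated from WordPress"

# For a GitHub Pages user site the remote would look something like:
# git remote add origin git@github.com:username/username.github.com.git
# git push -u origin master
```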

Posted on 20 Jan 2012
If you enjoyed reading this post you might want to follow @coderholic on twitter or browse through the full blog archive.