Quickly create github.io pages for your Elm projects

I’ve been learning Elm recently and writing a few small web apps to learn it. The code is on github but it’s nice to publish the actual page for people to play with. This is pretty easy to do with github pages and elm-make.

Just go to your repository, type the following commands and you should get a github.io page such as http://jasonneylon.github.io/stamp-duty-calculator/

git checkout --orphan gh-pages
elm-make Main.elm --output=index.html
git add index.html
git commit -m "Creating github page"
git push --set-upstream origin gh-pages

If you are interacting with JavaScript code via ports you can create your own index.html page (copying a generated index.html file is a quick way to start), run elm-make to build an elm.js file, and then reference the generated elm.js script from your index.html file.
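A minimal hand-written index.html might look something like the sketch below. The module name Main matches the elm-make command above, but the embedding call varies between Elm versions, and the port shown is purely hypothetical:

```html
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>My Elm app</title>
  </head>
  <body>
    <!-- elm-make Main.elm produces elm.js alongside this file -->
    <script src="elm.js"></script>
    <script>
      // Elm 0.17/0.18 style embedding - adjust for your Elm version.
      var app = Elm.Main.fullscreen();

      // Wire up any ports your app declares; "log" here is hypothetical:
      // app.ports.log.subscribe(function (msg) { console.log(msg); });
    </script>
  </body>
</html>
```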

git checkout --orphan gh-pages
git add index.html
elm-make Main.elm # this will output elm.js
git add elm.js
git commit -m "Creating github page"
git push --set-upstream origin gh-pages

I followed this approach to create the github.io page for http://jasonneylon.github.io/scream-into-the-void/.

Adding Clojure unit tests results to Bamboo Continuous Integration server

We have started to use Bamboo for building our Clojure project. It requires a little bit of tweaking to your project and to the Bamboo plan setup to fully integrate unit test results.

First add the test2junit plugin to your Leiningen project.clj file:

:plugins [[test2junit "1.1.0"]]
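For reference, the relevant bits of a project.clj look roughly like this. The project name is a placeholder, and the output directory override is optional (by default test2junit writes its XML reports into a test2junit directory in the project root):

```clojure
;; Sketch of a project.clj with test2junit wired in; the project name is a
;; placeholder and the output directory override is optional.
(defproject my-project "0.1.0-SNAPSHOT"
  :plugins [[test2junit "1.1.0"]]
  ;; Optional: put the XML reports somewhere explicit so the Bamboo JUnit
  ;; parser task knows exactly where to look.
  :test2junit-output-dir "target/test2junit")
```

Running lein test2junit then produces JUnit-style XML reports that Bamboo can parse.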

Then configure Bamboo to run the test2junit task:

Configuring the tests to run

Finally you need to setup a JUnit Parser task to pickup the results.

Parsing the test results

By default the path that test2junit writes the XML results to should work here.


Rails 4 FlashHash Upgrade Gotcha

We upgraded an application from Rails 3 to Rails 4 this week and came across an interesting gotcha which I haven’t seen documented anywhere.

As we rolled out the Rails 4 version of the app we split the traffic between the upgraded Rails 4 app and our existing Rails 3 app. Some of the requests to the Rails 3 app failed with the following exception:

NoMethodError: undefined method `sweep' for {"discard"=>[], "flashes"=>{"just_switched"=>true}}:Hash

[GEM_ROOT]/gems/actionpack-3.2.19/lib/action_dispatch/middleware/flash.rb, line 239

It turns out that Rails 4 serialises the flash to the cookie differently from Rails 3. When Rails 3 attempts to deserialise it you get the above error. This error can also occur between Rails 3 and Sinatra apps.

To get around the problem during the migration period we patched the Hash class. Of course this meant the flash methods wouldn’t work, but as they are only used for minor presentation tweaks that seemed a good compromise.
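The patch itself is tiny. A sketch of the idea (the initializer filename is just an example): Rails 4 stores the flash in the session as a plain Hash, so defining a no-op sweep stops Rails 3’s flash middleware from blowing up:

```ruby
# config/initializers/flash_hash_compat.rb (example name) - Rails 3 app only,
# and only for the migration window. Rails 4 serialises the flash as a plain
# Hash ({"discard" => [...], "flashes" => {...}}), so Rails 3's flash
# middleware raises NoMethodError when it calls #sweep on it. A no-op sweep
# that returns the hash keeps the request alive; the flash itself is ignored.
class Hash
  def sweep
    self
  end
end
```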

How local are party candidates for the Lambeth council elections?

Getting annoyed by timewasters

During the Brixton Hill by-election campaign a few years ago I attended a hustings organised by Brixton blog. I got pretty annoyed by how candidates from some parties used the event (and the campaign) to parrot their parties’ national and international policies.

The Socialist Party called for full communism as the solution to pretty much every local issue. For most issues UKIP demanded we leave the EU. When the UKIP candidate did address local issues her positions (such as promoting private car use) seemed very strange for a Brixton local. It turned out she lived in Clapham. I wrote a blog post about it at the time where I plotted the candidates’ addresses on a map relative to the ward and council boundaries.

Ranking parties on how local their candidates are

This time around the whole borough has council elections and I was wondering if a similar trend exists. Visualising each ward doesn’t seem as useful as there are 289 candidates standing, so instead I ranked the parties using a simple scheme: the percentage of candidates who live in the same ward they are standing in.

Parties ranked by % of candidates who live in the ward in which they are standing
Rank  Party                                     % in ward  Local candidates  Total candidates
1     The Pirate Party                          100%       1                 1
2     The Green Party                           67%        42                62
3     Independent                               50%        2                 4
4     Liberal Democrats                         49%        31                63
5     Labour Party                              47%        30                63
6     Conservative Party                        46%        29                63
7     UK Independence Party (UKIP)              35%        6                 17
8     Trade Unionists and Socialists Coalition  30%        4                 13
9     The Socialist Party (GB)                  0%         0                 3

The Pirate Party only having a single candidate obviously helps them come first. I suspect the Green Party’s high ranking reflects their decentralised local nature (disclaimer: I’m going to vote for them). At the bottom of the list is the Socialist Party (GB). None of the Socialist Party’s candidates live locally (in fact none of them live in Lambeth at all – they hail from Kingston, Bromley and Richmond).

UKIP and the Trade Unionists and Socialists Coalition are, predictably, towards the bottom of the table. The big three traditional parties sit in the middle, possibly reflecting their tendency to ‘parachute’ candidates into wards.

The gory technical details

There should be open data on who is standing in elections, right? Well if there is I couldn’t find it. Instead I downloaded the PDFs that list the candidates from the Lambeth website. I used the command line version of Tabula to extract the data. I then geocoded the data using MySociety’s fantastic MapIt API. All the data and code is on github.
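As a rough sketch of the geocoding step (the function names and example postcode below are mine): MapIt’s postcode endpoint returns JSON describing the areas a postcode falls in, with type codes such as LBW for a London borough ward.

```python
# Sketch of the MapIt geocoding step. The helper names and example postcode
# are mine; see mapit.mysociety.org for the full API.
import json
import urllib.request

BASE = "https://mapit.mysociety.org/postcode/"


def mapit_url(postcode):
    # MapIt expects the postcode with spaces removed.
    return BASE + postcode.replace(" ", "").upper()


def areas_for(postcode):
    """Return {area name: area type} for a postcode.

    The response's "areas" key maps area ids to details, including a
    "type" code ("LBW" is a London borough ward).
    """
    with urllib.request.urlopen(mapit_url(postcode)) as resp:
        data = json.load(resp)
    return {a["name"]: a["type"] for a in data["areas"].values()}
```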

Does it matter?

Obviously this isn’t a perfect way to rank parties. Although candidates stand in a particular ward they are elected to serve on the council for the entire borough. Many will live in neighbouring wards of the borough, have a reasonable idea of the local issues and be capable of doing a decent job. The results do reflect my experience at hustings though – candidates who don’t live locally are often just parroting party policy and don’t address local concerns.

How about a real debate on climate change on the Today show?

I got so irritated by Nigel Lawson’s appearance on the BBC Radio 4’s Today show this morning that I sent an email to the show.


Your selection of Nigel Lawson in this morning’s discussion about the link between climate change and the ongoing flooding was awful. You gave Lawson a platform to repeat a selection of discredited and incorrect arguments regarding climate change that are misleading and wrong. You did not mention the fact that he is intimately associated with the coal industry and let him use the show to promote their agenda unchallenged. Having Lawson on the show does not lead to a balanced or informative discussion for the public.
The actual debate on climate should be between those who believe our 2 degree commitments are sufficient to avoid ‘danger’ and those alarmed by the absolutely terrifying risks implied by the science. I would urge you to have a climate scientist such as Kevin Anderson on the show to have a real debate on this vital issue.
Jason Neylon
If you feel similarly annoyed by it you can contact the show too.

How to do ‘SEO’ for the website of a new offline business

A friend of a friend asked for some advice on how to do ‘SEO’ for the website of a new (offline) business he was launching.

Here are the broad pointers I gave to him:

There are two types of search engine marketing: paid (PPC) and organic (SEO). They are very different, but both are ways to get people to see your website in Google. For both, the first step is to identify the terms or phrases people use when searching for services like yours. You can do this with Google’s Keyword Planner.

Paid search (PPC)

Google has a service called AdWords for paying for ads on Google. This can be expensive, but it has the huge advantage over other forms of advertising that it can be measured very accurately. Essentially you associate an ad with keywords and Google then shows it to people based on how much you bid and several other factors. There is a significant amount of work in refining your list of keywords, and it’s also important to limit it geographically and to exclude unrelated keywords. The big advantage is that you get immediate results and customers on your website (which is very helpful for validating how the website performs).

Organic search (SEO)

SEO has 3 elements: making sure Google can read your site correctly, ensuring that your content is relevant to your audience, and getting others to link to your website. Making your site readable to Google is basically a matter of following good web development standards. Ensuring your content is relevant means writing quality content about the things people search for. Getting others to link to your website involves adding the website to any online directories for your sector, writing guest blog posts on other people’s websites, and so on.

There is a cost to doing organic search properly – you have to spend time writing relevant content. Frequently people outsource this and pay external writers to produce content for them. Many web businesses also pay to have links to their website added to other people’s websites – this is a bit of a black art and I would avoid considering it for now.

Organic search is a long-term investment – it will take months or years to get anywhere in the search results, as many of the incumbents are well established and have good content.

Which to do first

To start, I would recommend doing some PPC to learn which keywords are effective, then consider tailoring the content on your website, or adding new content targeting those keywords, to attract organic traffic.

Detecting web applications that aren’t converting with Riemann

We have been playing around with Riemann at uSwitch to do some of our monitoring. One of the core metrics we track is the number of customers who are currently converting on our website. If this number drops to zero it usually indicates something is broken. Each time a customer converts an event is sent to our Riemann server.

I added a stream to our Riemann config to count recent conversions, create a new summary event, and notify us whenever that count drops to zero.

    (where (and (service "app") (tagged "conversion"))
           (moving-time-window 30
             (smap (fn [events]
                     (let [conversion-count (count events)]
                       {:service "app-conversions"
                        :time (unix-time)
                        :metric conversion-count
                        :state (if (> conversion-count 0) "ok" "warning")
                        :description "Conversions in the last 30 seconds"
                        :ttl 30}))
                   (changed :state
                            (fn [event]
                              (warn "Conversion state has changed to: " event))))))

Sadly this didn’t work as I expected! If there are no conversions then no events are sent to Riemann and the moving-time-window code block is not executed. This is discussed further here.

You can work around this, however, by using expired events. Events in Riemann have a time to live (TTL) associated with them. If an updated event is not received within the TTL the event is expired – indicating in this case that no further conversions have taken place since it was last fired. You can add a stream to catch this event expiration and notify whoever is interested.

Just like above we add a stream to sum recent conversions and create a summary event:

   (where (and (service "app") (tagged "conversion"))
          (moving-time-window 30
            (smap (fn [events]
                    (let [conversion-count (count events)]
                      {:service "app-conversions"
                       :time (unix-time)
                       :metric conversion-count
                       :state "ok"
                       :description "Conversions in the last 30 seconds"
                       :ttl 30}))
                  (changed :state
                           (fn [event]
                             (info "notify that conversions are happening again" event))))))

Then we add a stream that waits for the summary event to expire:

    (expired
      (where (service "app-conversions")
             (with {:state "warning"
                    :metric 0
                    :ttl 30
                    :description "No conversions in the last 30 seconds"}
                   (fn [event]
                     (index event)
                     (warn "Notify about no conversions" event)))))

The above config is on github.

Playing with a RaspberryPi NoIR camera

We got some RaspberryPis at uSwitch.com to power our monitoring dashboards. For a bit of fun we also got some peripherals including the new NoIR infrared camera.

You need a source of infrared light to see anything so I used a remote control I had lying around. Here is some video of me waving at the camera.

And shining the remote control at the camera.

To record the video I used the raspivid command with the night exposure option.

raspivid -t 20000 -rot 180 -ex night -o hand.h264

The omxplayer command is handy to playback the results.

omxplayer hand.h264

Tip: List non-replication operations when using db.currentOp() in the MongoDB shell

We run ad-hoc queries against our MongoDB hidden slaves pretty frequently at work. Some of these queries are long running, so it’s nice to get a filtered view of the operations that are running without all the replication operations that run constantly.

The mongo shell is a JavaScript interpreter, so that is easy to do:

db.currentOp()["inprog"].filter(function(x) { return !x.desc.match(/repl.*/); });

From 5 kilowatts to 5 watts

I was in Bletchley Park last week for the Over the Air event. One of the highlights was a tour of the museum in which we got to see a reconstruction of the Colossus, the first programmable electronic computer. Originally the computers were left on all day but now they are only switched on from time to time for tours, due to the high cost of running them (and for environmental reasons).

Colossus computer at Bletchley Park

The machines use a whopping 5.5 kilowatts of electricity and standing near them you can certainly feel the heat. The sheds in which they are kept apparently get pretty warm in winter too.

iPhone charging

My iPhone by comparison uses only about 5 watts, or roughly 0.1% of the power usage of the Colossus!
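The back-of-the-envelope comparison is easy to check:

```python
# Rough power comparison between the Colossus and a charging iPhone.
colossus_watts = 5500  # 5.5 kW, as quoted on the Bletchley Park tour
iphone_watts = 5       # a typical USB phone charger

ratio_percent = iphone_watts / colossus_watts * 100
print(f"The iPhone draws about {ratio_percent:.2f}% of the Colossus's power")
# about 0.09%, i.e. roughly 0.1%
```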

The increasing energy efficiency of computers is formalised in Koomey’s law, which states that “at a fixed computing load, the amount of battery you need will fall by a factor of two every year and a half”. This trend has all sorts of interesting implications as it will allow computers to become smaller and more ubiquitous in the future.