How about a real debate on climate change on the Today show?
I got so irritated by Nigel Lawson’s appearance on BBC Radio 4’s Today show this morning that I sent an email to the show.
Hello,
I was at Bletchley Park last week for the Over the Air event. One of the highlights was a tour of the museum, where we got to see a reconstruction of Colossus, the world’s first programmable electronic computer. Originally the machines were left on all day, but now they are only switched on from time to time for tours, due to the high cost of running them (and for environmental reasons).
Colossus computer at Bletchley Park
The machines use a whopping 5.5 kilowatts of electricity, and standing near them you can certainly feel the heat. The sheds in which they are kept apparently get pretty warm in winter too.
My iPhone, by comparison, uses only about 5 watts, or roughly 0.1% of the power consumption of Colossus!
The increasing energy efficiency of computers is formalised in Koomey’s law, which states that “at a fixed computing load, the amount of battery you need will fall by a factor of two every year and a half”. This trend has all sorts of interesting implications, as it will allow computers to become smaller and more ubiquitous in the future.
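To put that doubling rate in perspective, here is a rough back-of-the-envelope sketch (in JavaScript, to match the code later in this post). The dates and the flat 1.5-year doubling period are simplifying assumptions for illustration, not measured figures.

// A very rough illustration of Koomey's law: computations per unit of
// energy double roughly every year and a half. The dates and doubling
// period are simplifying assumptions, not measured figures.
var doublingPeriodYears = 1.5;
var years = 2013 - 1944;                      // Colossus era to smartphone era
var doublings = years / doublingPeriodYears;  // 46 doublings
var efficiencyGain = Math.pow(2, doublings);
console.log("~" + efficiencyGain.toExponential(1) +
            "x more computation per unit of energy");

On those assumptions you get a factor of around 10^13, which goes some way to explaining how a 5-watt phone can outclass a 5.5-kilowatt room-sized machine.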
I did an interview and filled out a logbook on home energy use with http://www.suslabnwe.eu/. One of the activities was to document and take pictures of an everyday activity, which was much easier to do as a blog post, hence this one. Apologies if this bores you to tears!
I made soup because I was hungry after work and had a load of vegetables in the fridge that needed cooking!
Here are the steps involved in making the soup:
This is obviously pretty banal, but it is eye-opening just how many electrical and energy-related things are involved in such a mundane activity.
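For a very rough sense of scale, here is an illustrative sketch of the electricity a meal like this might involve. Every wattage and duration below is a guess made up for the example, not a figure from my logbook.

// Purely illustrative estimate of the electricity used making soup.
// All wattages and durations are made-up assumptions, not measurements.
var appliances = [
  { name: "fridge (slice of its daily use)", watts: 100,  hours: 0.5  },
  { name: "electric hob",                    watts: 2000, hours: 0.5  },
  { name: "hand blender",                    watts: 500,  hours: 0.05 }
];
var totalKWh = appliances.reduce(function(sum, a) {
  return sum + (a.watts * a.hours) / 1000; // watt-hours to kilowatt-hours
}, 0);
console.log("Approximate total: " + totalKWh.toFixed(2) + " kWh");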
Xively (formerly Cosm, formerly Pachube) now has a JavaScript library for their API. I have been meaning to learn the Rickshaw library for a while, as it comes highly recommended by several people. It turned out to be a doddle to use. I just mixed in the moment.js library to do some date munging and was able to knock up some graphs in an hour or so.
These two data sources aren’t pumping out data anymore, but you can see an example here.
xively.setKey( "YOUR API KEY HERE" );

// Use moment.js to work out the start and end of yesterday
var startOfYesterday = moment().subtract("day", 1).startOf('day');
var endOfYesterday = moment().startOf('day');
console.log(startOfYesterday);
console.log(endOfYesterday);

var query = {
  start: startOfYesterday.toJSON(),
  end: endOfYesterday.toJSON(),
  interval: 60,
  limit: 1000
};

// Fetch yesterday's history for the NO2 datastream, then plot it
xively.datastream.history( "106267", "NO2_00-04-a3-37-cc-cb_0", query, loadData);

function loadData(data) {
  var unit = data.unit.label;
  var series = [];
  // Filter out obvious outliers before plotting
  var filteredData = data.datapoints.filter(function(x) { return (x.value < 1000); });
  for (var i = 0; i < filteredData.length; i++) {
    var date = moment(filteredData[i].at);
    var value = parseInt(filteredData[i].value, 10);
    series[i] = { x: date.unix(), y: value };
  }
  drawGraph(series, unit);
}

function drawGraph(data, unit) {
  var graph = new Rickshaw.Graph({
    element: document.querySelector("#chart"),
    width: 640,
    height: 400,
    renderer: 'line',
    series: [
      {
        data: data,
        color: '#6060c0',
        name: unit
      }
    ]
  });
  graph.render();

  // Show the value under the cursor on hover
  var hoverDetail = new Rickshaw.Graph.HoverDetail({
    graph: graph
  });

  // Legend with click-to-toggle for the series
  var legend = new Rickshaw.Graph.Legend({
    graph: graph,
    element: document.getElementById('legend')
  });
  var shelving = new Rickshaw.Graph.Behavior.Series.Toggle({
    graph: graph,
    legend: legend
  });

  // Time axis along the bottom, value axis on the left
  var axes = new Rickshaw.Graph.Axis.Time({
    graph: graph
  });
  axes.render();

  var yAxis = new Rickshaw.Graph.Axis.Y({
    graph: graph
  });
  yAxis.render();
}
The code is also on GitHub.
I recently got an Air Quality Egg. The Air Quality Egg is an open source hardware project to measure air quality hyper-locally. My egg is located on the balcony of my flat in Brixton, London. The device has sensors which read NO2, CO, humidity and temperature, and it posts the data to Cosm. You can view readings and simple graphs on the Cosm feed page.
I wanted to see the trends in the data, so I wrote an R script to fetch the data from the Cosm API with RCurl and plot it using ggplot2.
Here is the script:
# Install the dependencies on first run
install.packages('RCurl')
install.packages('ggplot2')

library(ggplot2)
library(RCurl)

# grab an api key from cosm and put it here
api_key = 'YOUR_API_KEY'
# your feed id
feed_id = '106267'

csv_data = ""
start_date = as.POSIXct("2013/05/07", "GMT")
end_date = as.POSIXct("2013/05/08", "GMT")

# The API limits how many datapoints a single request returns,
# so fetch the day in three-hour chunks and concatenate the CSV
while (start_date < end_date) {
  start_date_as_str = format(start_date, format="%Y-%m-%dT%H:%M:00Z")
  next_date = start_date + (3 * 60 * 60)
  next_date_as_str = format(next_date, format="%Y-%m-%dT%H:%M:00Z")
  cosm_url = paste("http://api.cosm.com/v2/feeds/", feed_id,
                   "/datastreams/NO2_00-04-a3-37-cc-cb_0.csv?start=", start_date_as_str,
                   "&end=", next_date_as_str, "&interval=0", sep="")
  csv_data_for_period = getURL(cosm_url, httpheader = c('X-ApiKey' = api_key))
  csv_data = paste(csv_data, csv_data_for_period, "\n")
  start_date = next_date
}

# Parse the accumulated CSV and convert the timestamps
data = read.table(textConnection(csv_data), sep = ",", col.names=c("at", "value"))
data$timestamp <- strptime(data$at, "%FT%T", tz = "UTC")

# strip out outliers
data = data[which(data$value < 1000),]
#summary(data)

# Scatter plot of the readings with a smoothed trend line
ggplot(data, aes(timestamp, value)) + geom_point() + geom_smooth() + xlab("Time") + ylab("NO2 PPB")
Here are some example graphs of NO2 over a single day and over several days. (Note: I don’t have the latest sensor updates, so the readings may be a bit off.)