Raspberry Pi - A new type of RAM

If you've had trouble booting up your Raspberry Pi then read on...

The latest batch of Raspberry Pis we have been selling through the shop (manufactured in South Wales, UK) uses a new type of RAM chip.

Previously the Pi used a Samsung chip; it has now switched to a chip manufactured by Micron, marked 3KA18 D9QHN with an 'M' logo. This chip is visible in the middle of the photo below, mounted on top of the processor using Micron's clever package-on-package technology. The RAM is still 512MB in size.

Raspberry Pi with new type of RAM chip
Older Samsung RAM chip
To my knowledge there has been no evidence that the new chip gives any performance benefit; the change is probably due to cost or logistics reasons.

This new chip requires a firmware update to work. Our current SD card images (e.g. oemgateway_24sep2013.img) won't boot with the new RAM: a static red PWR LED and nothing else.

To make the Raspberry Pi boot you will need to download the following files and put them in the SD card's FAT (boot) partition, overwriting the older files:


Alternatively, you could download the whole Raspberry Pi firmware repository (95.6MB) and copy the files out of the boot directory: https://github.com/raspberrypi/firmware/archive/master.zip

I'm currently working on uploading a new ready-to-go SD card image with the changes above included. This should be available to download soon from: http://emoncms.org/site/docs/raspberrypigateway. Look for the 22nd Oct 2013 image.

All SD cards purchased in the shop from today onwards will have the new image, which works on the Raspberry Pis with the new RAM.


emonTH Update - Software and Power Consumption

Today I have spent some time writing the software for the emonTH. The goal is for the emonTH to last as long as possible on batteries (2 x AAs). The boost converter circuit highlighted in my previous post will go some way to increasing battery life; however, most of the gains will come from the software (the ATmega328 Arduino sketch).

The emonTH supports both the DHT22 (humidity and temperature) sensor and the DS18B20 temperature sensor, either onboard or remote. The default software will search for the presence of either sensor at startup. If both sensors are found it will return humidity from the DHT22 and temperature from the DS18B20. If only the DHT22 is found it will return both humidity and temperature readings from that sensor. Finally, if only the DS18B20 is found, only temperature readings will be returned. In the future I would like to expand the code to support multiple DS18B20 sensors on the one-wire bus.
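The startup detection priority described above boils down to a small decision table. Here it is sketched in Python for clarity; the real implementation is an ATmega328 Arduino sketch, so this is illustrative only:

```python
def select_readings(dht22_present, ds18b20_present):
    """Map detected sensors to reading sources, per the emonTH startup logic."""
    if dht22_present and ds18b20_present:
        # DHT22 supplies humidity; the DS18B20 supplies temperature
        return {"humidity": "DHT22", "temperature": "DS18B20"}
    if dht22_present:
        # DHT22 supplies both readings
        return {"humidity": "DHT22", "temperature": "DHT22"}
    if ds18b20_present:
        # Temperature only
        return {"temperature": "DS18B20"}
    return {}  # no sensors found

print(select_readings(True, True))
```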

I have implemented many of the power saving tricks that Martin Harizanov has used in his Funky Sensor code, in particular his DS18B20 power saving tweaks. Martin has done some great work optimising power and designing some very small low-power nodes; his blog is well worth a read.

The emonTH code (in beta) is now up on Github: https://github.com/openenergymonitor/emonth

The power consumption results are as follows, assuming one reading is taken per minute and using this battery estimation tool, assuming an AA capacity of 2200mAh and not taking into account AA self-discharge.*

emonTH with DS18B20 temperature only (Vin = 2.6V)

Blue: DS18B20 digital power pin; Yellow: voltage drop across series resistor. Due to switching noise from the DC converter the scope was not very useful for measuring current (voltage drop across a resistor), so the scope was used to measure timings and power was measured with an accurate multimeter.
Sleep Current: 0.12mA
On current: 9.7mA for 70ms, then peaking at 26mA for 2.8ms for the RFM12B transmission, giving an average of around 10.3mA over the 72.8ms on-period

Approximate battery life of 3.5 years*

emonTH with DHT22 (temperature & humidity) only (Vin = 2.6V)

Blue: DHT22 digital power pin; Yellow: voltage drop across series resistor. Due to switching noise from the DC converter the scope was not very useful for measuring current (voltage drop across a resistor), so the scope was used to measure timings and power was measured with an accurate multimeter.

Sleep Current: 0.12mA
On current: 9.5mA for 1700ms, then peaking at 26mA for 2.8ms for the RFM12B transmission, giving an average of 9.525mA over the 1703ms on-period

Approximate battery life of 1.1 years*
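Folding these measurements into a duty-cycle average current makes the difference between the two sensor configurations clear. This is only a back-of-envelope sketch assuming one reading every 60 seconds; the battery life figures above came from the linked estimation tool, which models further battery effects:

```python
def average_current_ma(sleep_ma, phases_ma_ms, period_ms=60_000):
    """Duty-cycle average current: charge spent in on-phases plus sleep,
    divided by the reporting period."""
    on_ms = sum(ms for _, ms in phases_ma_ms)
    on_charge = sum(ma * ms for ma, ms in phases_ma_ms)   # in mA*ms
    sleep_charge = sleep_ma * (period_ms - on_ms)
    return (on_charge + sleep_charge) / period_ms

# DS18B20: 9.7mA for 70ms, plus the 26mA RFM12B burst for 2.8ms
ds18b20 = average_current_ma(0.12, [(9.7, 70), (26, 2.8)])
# DHT22: 9.5mA for 1700ms, plus the 26mA burst for 2.8ms
dht22 = average_current_ma(0.12, [(9.5, 1700), (26, 2.8)])
print(f"DS18B20 avg: {ds18b20:.3f} mA, DHT22 avg: {dht22:.3f} mA")
```

The DHT22's long 1.7s read time dominates its power budget, which is why its estimated battery life is roughly a third of the DS18B20-only configuration's.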

*Stay tuned for the next post on AA battery considerations including how to deal with self-discharge issues...

emonTH Update - Hardware

Since my last post on the emonTH wireless temperature and humidity monitoring node, good progress has been made.

emonTH cased up

emonTH - unboxed

The most significant hardware change has been the addition of a DC-DC step-up boost converter to step up the voltage from discharging AA batteries to a steady 3.3V. The boost converter circuit consists of a tiny (SC-70 package) LTC3525-3.3 chip, a 10uH inductor and a couple of small 1uF capacitors. The step-up converter is essential for the DHT22, as this sensor does not perform well with a varying supply voltage, specifically once it drops below 3.3V. The converter will also significantly increase battery life. The LTC3525 was chosen for its low quiescent current of 7uA and its high conversion efficiency of up to 95%.
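One consequence of a boost converter is that the battery (input) current is higher than the load current, and grows as the cells discharge. A rough sketch of the relation; the 90% efficiency here is an assumption on my part (conservative against the datasheet's up-to-95% claim):

```python
def input_current_ma(v_out, i_out_ma, v_in, efficiency=0.90):
    """Battery current drawn by a boost converter for a given load,
    from power balance: V_in * I_in * eff = V_out * I_out."""
    return (v_out * i_out_ma) / (v_in * efficiency)

# A 10 mA load at 3.3 V, with the batteries sagging to 2.0 V
print(round(input_current_ma(3.3, 10, 2.0), 2))  # ~18.33 mA from the cells
```

So as the AAs run down, the converter draws progressively more current from them to hold the 3.3V rail, which any battery life estimate needs to account for.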

emonTH LTC3525 DC-DC boost converter circuit

The boost circuit is very impressive: given a minimum input voltage of 0.7V it boosts up to a steady 3.3V.

Using a scope with an AC-coupled probe to examine the boost converter output when stepping 2V up to 3.3V with no load: the output exhibited 9.3mV RMS ripple at 333kHz.

Testing the emonTH external DS18B20 temperature sensor terminal block connection

We hope to have the emonTH in the shop by December.

Stay tuned after the break for update on emonTH software, power consumption and batteries...

AA Battery Considerations

The manufacture of batteries is a very energy-intensive process, often using many types of heavy metals, so it makes total sense to use rechargeable batteries where possible and always to recycle old batteries. When it comes to low-power sensing nodes I'm as guilty as anyone else of just sticking in some cheap alkaline batteries, always believing that the performance of rechargeable batteries was much lower. This is no longer the case.

If rechargeable batteries are used (and they should be), the self-discharge rate can be significant. The self-discharge rate of standard NiMH batteries is high: around 30% per month at room temperature. This problem can be almost eliminated by using low self-discharge NiMH cells such as Eneloops, which lose only about 5% per year. Non-rechargeable alkaline batteries have a self-discharge rate of less than 2% per year.
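The difference those rates make over a year is dramatic. A quick comparison using the figures above and simple compound decay (an approximation; real self-discharge curves are not perfectly exponential):

```python
def remaining_fraction(rate, periods):
    """Fraction of charge left after compound self-discharge."""
    return (1 - rate) ** periods

# Standard NiMH: ~30% per month, over 12 months
nimh = remaining_fraction(0.30, 12)
# Eneloop (low self-discharge NiMH): ~5% per year
eneloop = remaining_fraction(0.05, 1)
# Alkaline: <2% per year
alkaline = remaining_fraction(0.02, 1)
print(f"after 1 year: NiMH {nimh:.1%}, Eneloop {eneloop:.0%}, alkaline {alkaline:.0%}")
```

A standard NiMH cell would lose essentially all of its charge to self-discharge within a year, regardless of how little the node itself draws.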

Rechargeable batteries self-discharge graph, from batterydata.com

If you care about the environment (which we all should do) we highly recommend the use of Sanyo Eneloop rechargeable AAs in the emonTH and emonTx. They are a bit more expensive (about £2 each), but over their lifetime (they can be recharged 2100 times!) they work out cheaper. The Eneloop cells are cadmium-free and arrive fully charged and ready to use. Sanyo states that this charge is supplied by their solar PV system in Japan!

An excellent setup (as recommended by JCW of JeeLabs.org) is a spare set of Eneloop AAs (Apple AAs are rebranded Eneloops) permanently plugged into an Apple AA charger, which he measured as using 0W when the batteries are fully charged! This way you always have a set ready to go: http://jeelabs.org/2010/08/19/new-aa-battery-option/

Update: I’ve just discovered iGo Green’s alkaline rechargeable batteries, which claim to hold their charge for 7 years and contain no heavy metals: mercury, cadmium, lead or nickel. They do fall down in high power draw applications, but that is not an issue here; maybe a perfect match for long-term low-power nodes? They are also cheaper than Eneloops. The only drawback is that they need a special charger; iGo make a reasonably priced, nice-looking USB charger which will also charge standard NiMH cells.

Update #2: The iGo Green rechargeable alkaline AAs tended to leak after a couple of years; I would not recommend them. They seem to have now been discontinued.

CarbonCoop & OpenEnergyMonitor build weekend, November 16 & 17th

After much re-arranging we've got the new date for the energy monitoring build weekend that we're hosting with Carbon Coop at MadLab in Manchester. It's now on the 16th and 17th of November.

Meetup page:

As before there will be three main parts to the weekend:

Build an OpenEnergyMonitor system (Saturday 10AM - 6PM + Completion on Sunday if needed)

This is a chance to build a monitoring system with the support of others who have built systems before. Matt Fawcett and I will be on hand to help; we will walk through building the emonTx energy monitoring sensor node and how to set up a Raspberry Pi basestation running emoncms.

With this you can explore and track changes in home electricity use over time via a web dashboard.

If you've already got an OpenEnergyMonitor system but need some help getting it to work, you're also welcome to attend this workshop; please bring your monitor along.

To complete the build you will need:

• emonTx 868MHz
• Programmer - USB to serial UART
• Mini USB cable
• USB power adapter
• AC-AC adapter - AC voltage sensor
• Raspberry Pi - Model B
• RFM12Pi Raspberry Pi expansion board kit 868MHz
• Micro USB cable
• USB power supply for the Raspberry Pi
• SanDisk SD card for the Raspberry Pi
• 1 m Cat-5 cable

The total build price, if you get everything from the OpenEnergyMonitor shop, is £121.50 inc VAT.

If you already have a Raspberry Pi and spare USB power supplies and micro and mini USB cables (they often come with newer mobile phones), you can do the whole build for £68.50.

There will be a limited number of these kits available on the day (at the same cost), to make sure you can build please order these beforehand.

Put a note in your order message that you need the kits for the weekend so that we can make sure you have them.

If you want to read-up on the build guides and learn more about the system before the event take a look here:


Show and tell
We're excited that Robin Emley will be joining us to demonstrate his Solar PV Diverter on the Saturday:

If you'd like to come and show what you've been working on around open source monitoring and control, please do; get in contact to let us know if you are coming.


There will be a table dedicated to just developing something new, hardware or software. For example editing or amending the monitor's online display dashboards or improving energy modelling tools such as OpenSAP.

Sign up on the meetup page
Please add your name to the meetup page if you're coming so that we have an idea about numbers, and let us know how much of the kit you want for the build, as above.

We look forward to seeing you there! Please get in contact if you have any questions:

[email protected] or [email protected]

Website Backup

In the interest of open source I thought I would share the backup setup we have running for the OpenEnergyMonitor website. I'm relatively new to sysadmin tasks and writing bash scripts, so please do suggest improvements if you think something could be implemented better.

Backing up our Drupal SQL databases, which contain the user credentials and all of the forum and text content of the website, was relatively easy, since the disk space they take up is small. A nightly SQL dump, followed by a scheduled secure FTP bash script running as a nightly cronjob on a Raspberry Pi with an external hard drive to download the zipped SQL dump, does the trick. The FTP login credentials are stored away from prying eyes in a .netrc file (with chmod 600); two sets of credentials are required, and the relevant .netrc file is copied to the home folder when needed.

# $HOST, $db_name and $LOCAL_BACKUP are set earlier in the script
cp netrc/.netrc1 .netrc
today=$(date +"%d-%b-%Y")
ftp -vp -z secure $HOST << EOT
get $db_name-${today}_backup.gz $LOCAL_BACKUP/$db_name-${today}_backup.gz
EOT
rm .netrc

Backing up the files (images, documents etc.) is a bit more of an issue, since the ever-increasing size of the content means it's impractical, and would put unnecessary load on the server and bandwidth, to download a full snapshot every night.

I found that wget has many customisable options. A nightly scheduled bash script, running on a Raspberry Pi with an external hard drive, uses the following wget options to look for files that have been created or modified since the last time the command was run and download only the changes. Once the initial download is done the command takes less than a minute to execute and often only downloads a few MB of data. The '-N' option tells wget to download only new or modified files.

cp netrc/.netrc2 .netrc
wget -m -nv -N -l 0 -P $LOCAL_BACKUP ftp://$HOST/public_html/FILES_LOCATION -o $LOCAL_BACKUP/filelog=$today.txt
rm .netrc
# What the other options do:
# -m   mirror
# -l 0 infinite recursion depth (folder depth)
# -N   only download new or modified files
# -o   write log to file
# -nv  non-verbose logs
# Also available: -b (run in the background), -q (turn off logs)
This setup seems to be working well. It has a few weak points and limitations that I can think of:
  • The wget files backup script only downloads new and modified files; it does not mirror deletions on the server, so a file deleted there would remain in the backup.
  • The wget script does not keep historical snapshots, meaning that if something bad were to happen it would not be possible to roll back to a certain date. Update: I have since had Rsnapshot recommended to me, a backup utility based on rsync. Rsnapshot looks great and can work over FTPS. My friend Ryan Brooks wrote a good blog post on how to set up Rsnapshot over FTPS.
  • Currently the Raspberry Pi has only the one external 1TB hard drive used for backup; ideally this would be two hard drives in a RAID array for double safety.
  • Backups are only done nightly. This is plenty good enough for us at the moment but might need to be improved in the future.

I think it's amazing that a little £25 Raspberry Pi is powerful enough to handle backup for several websites. The Pi with an external 1TB hard drive connected through a USB hub consumes only 5.7W, making it not too bad to leave on 24/7.

One issue that I had initially with the Pi is that the external hard drive would move from /dev/sdb to /dev/sdc, thereby losing its mount point. I think this was caused by the HDD momentarily losing power. Switching to a Pimoroni PiHub to power the setup and mounting the drive in fstab by its UUID (which can be found with sudo blkid) instead of a /dev/xxx reference fixed the problem:

UUID=2921-FCE8 /home/pi/1TB vfat  user,umask=0000   0   0

I would be interested to hear how you think the backup could be implemented more efficiently or more securely.

Backing up your Raspberry Pi emoncms or emoncms.org account

I've added a couple of scripts to the emoncms usefulscripts directory that make backing up an emoncms account, whether on a local Raspberry Pi or on emoncms.org, much easier than the sync module solution. It's still not as easy as I would like it to be: I would like setting up a backup to be as easy as installing a Dropbox client.

The scripts work with Linux at the moment; hopefully I'll soon get a solution running on Windows. I would highly recommend keeping your own backup of your data. A backup of your emoncms data on your main computer can also be useful both for quicker access to the data and for doing additional analysis.
1) Install emoncms on your backup machine following the guide here: http://emoncms.org/site/docs/installlinux
Create an account and note down your mysql credentials.

2) Download the usefulscripts repository

There are two scripts available under usefulscripts/replication

  • import_full.php
  • import_inputs.php

3) Try importing your inputs first to test that it works. Open import_inputs.php for editing.
Set your MySQL database name, username and password: the same credentials as in the settings.php step of the emoncms installation. Set the $server variable to the location of the Raspberry Pi or emoncms.org account you want to back up, and set $apikey to the write API key of that account.

In a terminal, go to the usefulscripts/replication directory. Run import_inputs.php:

    php import_inputs.php

If successful you should now see your input list and all input processing information backed up in your local emoncms account.

4) Backup all your feed data:

As for importing inputs open import_full.php and set the database credentials and remote emoncms account details.

Run the backup script with sudo

    sudo php import_full.php

That's it: the script should now work through all your feeds, whether MySQL or timestore (no PHPTimeSeries support yet), making a local backup. The first run can take a long time; subsequent runs only download the most recent data and so complete much faster. I run this script once every few days to keep an up-to-date backup.

RaspberryPI: SD Cards, HDD, Gateway Forwarder

After several Raspberry Pi emonBase SD card failures between myself and Glyn in the last two weeks, and the general experience of short SD card lifespans on the forums (SD cards have a limited number of writes), we thought we'd make a concerted effort this week to move the OpenEnergyMonitor Raspberry Pi documentation, Pi images and pre-installed SD cards over to more stable solutions:

Raspberry Pi emonBase with RFM12Pi in Pibow Timber Case

1. We read Martin Harizanov's blog post on creating a rock-solid gateway using a read-only filesystem with Jerome Lafréchoux's excellent python oem_gateway to forward data to emoncms.org

This is a reliable solution which is simple to set up and works well if you just want to forward data to a remote emoncms server like emoncms.org. We have created a Raspberry Pi SD card image for this read-only filesystem oem_gateway setup; see the documentation page for full details, where there's also a link to download it:

The ready-to-go pre-loaded SD card available in the shop will be pre-loaded with this image from now on. 

2. We also heard from Paul Reed about his setup where the SD card is used only to boot, with the Pi's root partition on an external hard drive (powered by a USB hub) running a web server and the full version of emoncms. This solution is great if you want to keep your data locally or have an additional backup. It could also double up as a 24/7 file server running in your home for music streaming, document backup etc.

Glyn's using the read-only gateway for his setup, and I'm going to set up the hard drive local backup in addition to forwarding to emoncms.org for mine.

Here's a diagram that illustrates these different options, including the NanodeRF, local and remote storage, backup options and emoncms data storage options (mysql, timestore etc):
If you're not sure which way to go, it's probably easiest to start with the Raspberry Pi oem_gateway forwarder to emoncms.org; you can re-configure it later for local storage and backup via an external hard drive if you want. If you're more comfortable with Arduino code than with the Raspberry Pi and Linux, then the NanodeRF (pre-assembled SMT) may be best for you.

To summarise the above with pros and cons, here are the two main emonBase Raspberry Pi options:

1. Run a read only filesystem on the Pi and forward data straight to a remote emoncms server.
Developed by Jérôme Lafréchoux and Martin Harizanov
  • + More robust and easier to setup
  • + Cheaper and lower power than using an external HDD
  • - Does not utilise the full potential of the Pi
  • Run IPE – Industrial Perennial Environment a special blackout-proof flavour of Raspbian which can be locked down after setup to work in read only mode
  • The Oem Gateway Python script by Jérôme Lafréchoux to forward the data received by the RFM12Pi straight to emoncms.org
2. Move the Pi's file system to an external HDD using the SD card only to boot 
This option requires an external hard drive connected to the Pi through a powered USB hub. This means extra expense and an increase in power consumption: the 1TB USB hard drive we tested here adds 1.8W to the Pi's 3.9W. However, this option does have several advantages:
  • + Ability to run emoncms on the Pi to log data locally and have full control over your setup as well as forwarding data to a remote emoncms server.
  • + You now have the potential to run a 24/7 file server in your house for music streaming, document backup etc.
  • - More complicated to setup
  • - Higher cost and power consumption

Measuring building thermal performance - coheating tests

Improving the thermal performance of buildings is an area where some of the largest energy and carbon savings can be made. Building energy use and carbon emissions can be reduced by as much as 60-80% through better insulation, draught proofing (improved thermal performance) and improved heating efficiency and controls. But how do you go about working out the performance of your house, and what measures are best to undertake to reach this level of performance improvement? Improving building fabric is expensive: how do you work out which measures will be most cost-effective? And how do you make sure that your house actually achieves the target performance? How do you measure and check it?

These are the questions I’m currently grappling with for my own house and the OpenEnergyMonitor lab; they are also the questions we’re trying to develop improved processes for answering through our work with Carbon Coop.

Here’s a graphic from the Centre for Alternative Technology’s Zero Carbon Report illustrating the kind of performance improvements that are possible:

The most common way to investigate domestic building performance is by using simple building energy models such as SAP:

The accuracy of a model always depends on its input data and assumptions. Work comparing modelled energy consumption as calculated by SAP with actual energy consumption shows that there is often a discrepancy, and in some cases the discrepancy is so large (even 100% or more) as to undermine decisions made based on the model outputs.

Relying on modelled performance alone can therefore be misleading.

It is possible, however, to measure total building thermal performance by measuring how much energy it takes to heat a building to a given temperature above the outside temperature. This procedure is known as a coheating test and was pioneered by the Centre for the Built Environment at Leeds Metropolitan University. (There's also a good info sheet on coheating tests by Peter Warm: www.peterwarm.co.uk/?dl_id=6)

The standard co-heating test involves heating a building, while unoccupied, to an elevated internal temperature of 25C over a period of 1-3 weeks using electric heaters, monitoring the electrical heat input and the internal and external temperatures. Keeping the building unoccupied reduces unknown variables and so increases the accuracy of the measurement, but this, alongside the elevated temperature, makes widespread testing of this kind difficult.
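At its core, the co-heating analysis is a linear regression of heat input against the inside-outside temperature difference: the slope of that line is the whole-building heat loss coefficient in W/K. A minimal sketch with made-up daily averages (not real test data; real analyses also correct for solar gains and wind):

```python
# Synthetic daily averages: (delta_T in K, electrical heat input in W)
days = [(12.0, 2350), (14.5, 2900), (10.2, 2100), (16.0, 3150), (13.1, 2600)]

# Least-squares slope through the data gives the heat loss coefficient (W/K)
n = len(days)
sx = sum(dt for dt, _ in days)
sy = sum(p for _, p in days)
sxy = sum(dt * p for dt, p in days)
sxx = sum(dt * dt for dt, _ in days)
hlc = (n * sxy - sx * sy) / (n * sxx - sx * sx)
print(f"heat loss coefficient: {hlc:.0f} W/K")
```

A lower slope means a better-performing fabric: for the same temperature lift, the building needs less continuous heat input.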

The question that several people have been asking (I’m aware of several groups working on this, including @Housahedron and Richard Jack at Loughborough) is: can a method be developed for an ongoing co-heating-equivalent test that can be undertaken while the building is being used, one that could even measure thermal performance over time as improvement measures are undertaken? The analogy is a car MPG meter, but for your house.

Over the last few weeks I’ve been working on an approach that I think is showing promising results. It involves applying a dynamic model to real-time monitored heating input and external temperature data to model the indoor temperature; the modelled internal temperature is then compared with the actual indoor temperature.

The model parameters that give a good match tell you the heat loss factor of your building and also its thermal mass. The heat loss factor is your MPG equivalent for household fabric thermal performance.

The model is a multi-stage resistor-capacitor model. Using the resistor-capacitor analogy for thermal modelling is a standard approach used in many thermal models; there's an interesting page on it here: http://lpsa.swarthmore.edu/Systems/Thermal/SysThermalModel.html
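To make the idea concrete, here is a single-stage version of such a model (the openBEM code linked below is multi-stage; this simplified sketch just shows the mechanism). A thermal resistance R (K/W) and capacitance C (J/K) predict indoor temperature from heat input and outside temperature via C*dT/dt = P - (T - T_ext)/R:

```python
def simulate(heat_w, t_ext, r, c, t0, dt=60):
    """Single-stage RC thermal model, integrated with simple Euler steps.
    heat_w and t_ext are per-step series; returns the modelled indoor temps."""
    t = t0
    temps = []
    for p, ext in zip(heat_w, t_ext):
        t += dt * (p - (t - ext) / r) / c
        temps.append(t)
    return temps

# Example: 100 W/K heat loss (R = 0.01 K/W), a modest thermal mass,
# constant 2 kW heating with 5 C outside, one-minute steps
steps = 20_000
temps = simulate([2000] * steps, [5.0] * steps, r=0.01, c=1e7, t0=15.0)
print(f"steady-state indoor temperature: {temps[-1]:.1f} C")
```

Fitting R (the heat loss factor) and C (the thermal mass) until the modelled series matches the measured indoor temperature is what yields the building's "MPG" figure.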

This is all open source and the code so far is up on GitHub: https://github.com/emoncms/openbem
There's also a forum thread with a bit more information on the model and tests: http://openenergymonitor.org/emon/node/2783

Faire Maus! A 'fair' USB optical mouse

A few weeks ago I wrote a blog post about ethical and sustainable electronics.

In the post I mentioned a German project called Nager IT who have designed an optical USB mouse using low carbon materials and have gone to great lengths to source ethically produced components with a transparent supply chain and exploitation free manufacture.

We have decided to support the Nager IT project by becoming a UK reseller of their 'Faire Maus' or fair mouse. This mouse is as good as it gets at the moment in terms of ethical and sustainable electronics.

The mouse itself is a well-made three-button USB optical scroll-wheel mouse.

The mouse is now available for purchase on a non profit basis (for us) in our shop: http://shop.openenergymonitor.com/fair-usb-optical-mouse/

The mice are assembled by hand in Germany and feel very well made; the texture of the wood-plastic enclosure feels good in the hand, and the sleek, rubbery USB cable runs smoothly over the table.

Every time I use the mouse it reminds me that ethical and sustainable electronics is a goal worth pursuing.

Purchasing a fair mouse can be thought of like purchasing fair trade coffee: it might cost a little extra, but you know that you're doing good.

Checkout the great graphic Nager IT have produced illustrating their transparent supply chain behind this mouse.