Home Automation the easy way: My experience with the Spark Core

| 3 min. (578 words)

Raygun is built by a team of technical folks just like you, and from time to time we like to share some of the side projects and interesting dev work that we're involved in.

One of the interesting talks at this year's WDCNZ was by @sarajchipps on making programmable jewellery with a cool little piece of kit called the Spark Core. The Spark Core is a tiny Arduino-compatible development board with a WiFi antenna, fronted by a nice, simple web API. Spark provide this API so that once you've got your Core up and running, you don't need to worry about how to get it on the internet before you can push and pull data from it.

I've always wanted to automate my house, so I grabbed a handful of Cores from Spark and got to work building a sensor network. First up, I needed some data on what's happening in my house. I'm keen to know how the temperature cycles up and down, so I purchased some DHT-22 temperature and humidity sensors from Nice Gear. I could have gotten them from Deal Extreme, but it's nice to support the local guys sometimes. Following this guide, I wired up a quick prototype.

Once it was wired up, I needed to get the code onto the Core. Spark provide a code editor on their website that can push code straight to any Cores you have registered to your account – they don’t need to be physically attached to your machine or anything, just powered on and connected to the internet. From the same Spark forum post, I took the code and modified it to expose the Temperature and Humidity as Spark Variables so we can retrieve them from the API. My code is here: https://gist.github.com/jamiepenney/7f6c9f5ffd0ba896dfe3.

Once I'd pushed that code onto my prototype Core, I just needed to call the Spark API to get values back from my sensors. If we send an HTTP GET to https://api.spark.io/v1/devices/{mydeviceid}/temperature we get the temperature, and https://api.spark.io/v1/devices/{mydeviceid}/humidity gets us the humidity. Authentication is just an OAuth Bearer token in the Authorization header, or an access_token query string parameter, so it's pretty easy to craft a request using cURL or Postman. Hitting my device shows the temperature as 17.7 degrees C in my house at the moment, which sounds nice.
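As a rough sketch, here's what that request looks like from JavaScript. The `buildVariableUrl` helper and the placeholder device ID and token are my own illustration (not part of the Spark docs), and I'm using the modern `fetch` API for brevity:

```javascript
// Build the Spark Cloud URL for reading a named variable from a Core,
// using the access_token query string parameter for authentication.
function buildVariableUrl(deviceId, variable, accessToken) {
  return "https://api.spark.io/v1/devices/" + deviceId + "/" + variable +
         "?access_token=" + accessToken;
}

// Fetch a variable's value; the Spark API wraps it in a JSON `result` field.
function readVariable(deviceId, variable, accessToken) {
  return fetch(buildVariableUrl(deviceId, variable, accessToken))
    .then(function (response) { return response.json(); })
    .then(function (body) { return body.result; });
}
```

With that in place, `readVariable(myDeviceId, "temperature", myToken)` resolves to the same value the cURL request returns.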

Spark make it pretty easy to use other people’s libraries in your code – since writing my prototype I’ve discovered that someone wrote a non-blocking library for reading values from the DHT-22. This would mean I could run multiple sensors on a single board. The next step for me is to make up a board with an infrared sensor and an infrared LED, so I should move to using this library instead of the proof of concept code above.

Finally, I'm building a Single Page App with Backbone.js to view the temperature at each sensor. Since there's no backend (everything comes from the Spark API), I can run it off my Raspberry Pi just by serving up static HTML and JavaScript. Spark have made it easy to write this sort of app by adding the Access-Control-Allow-Origin: * header, so we can make cross-domain AJAX calls to the API directly. This removes the need for me to proxy those calls through my server, reducing the amount of work the Pi needs to do.
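A minimal version of the page's polling logic might look like the sketch below. The `formatReading` helper, the "lounge" element ID, the sensor name, and the one-minute interval are all my own choices for illustration; the real app uses Backbone views instead:

```javascript
// Turn a Spark variable response body into display text for the page.
// The Spark API wraps the value in a `result` field.
function formatReading(sensorName, body) {
  return sensorName + ": " + body.result.toFixed(1) + " °C";
}

// Poll the Spark API once a minute and write the reading into the page.
// Because Spark sends Access-Control-Allow-Origin: *, this cross-domain
// request works from a static page with no proxy in between.
function startPolling(deviceId, accessToken) {
  setInterval(function () {
    fetch("https://api.spark.io/v1/devices/" + deviceId +
          "/temperature?access_token=" + accessToken)
      .then(function (response) { return response.json(); })
      .then(function (body) {
        document.getElementById("lounge").textContent =
          formatReading("Lounge", body);
      });
  }, 60 * 1000);
}
```

Since everything here runs in the browser, the Pi's only job is to serve the static files once per visitor.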

My dream of automating my house is finally in reach, and it’s only taken 2 days to build most of it!