I have been doing a lot of hardware stuff lately. Just after Thanksgiving, I went to pre-order a Spark Photon (https://www.spark.io/) and ultimately ended up ordering the prototyping bundle (https://store.spark.io/?product=prototyping-bundle). This bundle is a Spark Core on a solderless breadboard, with a Photon to ship at a later date. I figured this would be a good way to get started with the Spark API sooner rather than later.
The gist of the Spark Core is that it is an ARM processor with a Wi-Fi module built right in. This means you can build some pretty cool hardware projects that are easily connected to the internet. (At $39 the Core is great…at $19 the Photon will be awesome.) The other cool thing is that they are already FCC certified, so you could actually start building a real product on day one.
Following the startup guide (http://spark.io/start), I had my Core connected to my wi-fi and claimed in a matter of minutes. I bought two, so I went through the smartphone app with the first one…easy peasy. I went through the USB configuration with the second one…a little more painful, but if you are building hardware at this level, you probably won’t face too much you can’t handle.
After setup, I started walking through a couple of samples to get the onboard LED blinking and make sure I understood the environment and how code was deployed to the Core.
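For reference, the blink sample boils down to a sketch along these lines…the Core’s onboard blue LED is tied to pin D7, so there is no wiring involved:

void setup() {
    pinMode(D7, OUTPUT);    // D7 drives the onboard blue LED
}

void loop() {
    digitalWrite(D7, HIGH); // LED on
    delay(1000);            // wait a second
    digitalWrite(D7, LOW);  // LED off
    delay(1000);
}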
So here’s the setup…a USB power block powers the Core, the Core connects to the wi-fi, and code gets pushed to the Core through the web.
Now I wanted to take a temperature/humidity sensor and connect it to the Core. This is where things got interesting. On the Arduino side of the world, I would load a library for the sensor and then, with an “include” statement, I could just use it. I can do something similar on the Spark Build code site, but I have to have the .cpp and .h files added to my project (they don’t just sit out in a folder like they do in Arduino).
In the end, I found a single-file approach for my DHT22 sensor: wgbartley had written a Spark Core class for reading temperature and humidity from it. While his approach keeps everything in one file (I am sure for simplicity), I took the liberty of breaking it out into a class library that I could reuse.
Here’s my Spark setup.
Next, I have three code files in my Spark Build project.
TempSensor.ino
#include "DHT.h" #define DHTPIN D5 // Digital pin D2 #define DHTTYPE DHT22 DHT dht(DHTPIN, DHTTYPE); double h; // humidity double tc; // temperature c double tf; // temperature f int temperature = 0; void setup() { Spark.variable("temperaturec", &tc, DOUBLE); Spark.variable("temperaturef", &tf, DOUBLE); Spark.variable("humidity", &h, DOUBLE); dht.begin(); } void loop() { delay(5000); h = dht.readHumidity(); tc = dht.readTemperature(false); tf = dht.readTemperature(true); } |
DHT.h
#include "inttypes.h" #define MAXTIMINGS 85 #define cli noInterrupts #define sei interrupts #define DHT11 11 #define DHT22 22 #define DHT21 21 #define AM2301 21 #define NAN 999999 class DHT { private: uint8_t data[6]; uint8_t _pin, _type, _count; bool read(void); unsigned long _lastreadtime; bool firstreading; public: DHT(uint8_t pin, uint8_t type, uint8_t count=6); void begin(void); float readTemperature(bool S=false); float convertCtoF(float); float readHumidity(void); }; |
DHT.cpp
#include "DHT.h" #include "application.h" DHT::DHT(uint8_t pin, uint8_t type, uint8_t count) { _pin = pin; _type = type; _count = count; firstreading = true; } void DHT::begin(void) { // set up the pins! pinMode(_pin, INPUT); digitalWrite(_pin, HIGH); _lastreadtime = 0; } //boolean S == Scale. True == Farenheit; False == Celcius float DHT::readTemperature(bool S) { float _f; if (read()) { switch (_type) { case DHT11: _f = data[2]; if(S) _f = convertCtoF(_f); return _f; case DHT22: case DHT21: _f = data[2] & 0x7F; _f *= 256; _f += data[3]; _f /= 10; if (data[2] & 0x80) _f *= -1; if(S) _f = convertCtoF(_f); return _f; } } return NAN; } float DHT::convertCtoF(float c) { return c * 9 / 5 + 32; } float DHT::readHumidity(void) { float _f; if (read()) { switch (_type) { case DHT11: _f = data[0]; return _f; case DHT22: case DHT21: _f = data[0]; _f *= 256; _f += data[1]; _f /= 10; return _f; } } return NAN; } bool DHT::read(void) { uint8_t laststate = HIGH; uint8_t counter = 0; uint8_t j = 0, i; unsigned long currenttime; // pull the pin high and wait 250 milliseconds digitalWrite(_pin, HIGH); delay(250); currenttime = millis(); if (currenttime < _lastreadtime) { // ie there was a rollover _lastreadtime = 0; } if (!firstreading && ((currenttime - _lastreadtime) < 2000)) { //delay(2000 - (currenttime - _lastreadtime)); return true; // return last correct measurement } firstreading = false; Serial.print("Currtime: "); Serial.print(currenttime); Serial.print(" Lasttime: "); Serial.print(_lastreadtime); _lastreadtime = millis(); data[0] = data[1] = data[2] = data[3] = data[4] = 0; // now pull it low for ~20 milliseconds pinMode(_pin, OUTPUT); digitalWrite(_pin, LOW); delay(20); cli(); digitalWrite(_pin, HIGH); delayMicroseconds(40); pinMode(_pin, INPUT); // read in timings for ( i=0; i< MAXTIMINGS; i++) { counter = 0; while (digitalRead(_pin) == laststate) { counter++; delayMicroseconds(1); if (counter == 255) break; } laststate = digitalRead(_pin); if (counter == 255) break; // ignore first 3 transitions if ((i >= 4) && (i%2 == 0)) { // shove each bit into the storage bytes data[j/8] <<= 1; if (counter > _count) data[j/8] |= 1; j++; } } sei(); // check we read 40 bits and that the checksum matches if ((j >= 40) && (data[4] == ((data[0] + data[1] + data[2] + data[3]) & 0xFF))) return true; return false; } |
Now I can use the following standard HTTP GET requests to read the values of the three variables from the Spark:
https://api.spark.io/v1/devices/{core_id}/temperaturec?access_token={access_token}
https://api.spark.io/v1/devices/{core_id}/temperaturef?access_token={access_token}
https://api.spark.io/v1/devices/{core_id}/humidity?access_token={access_token}
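For example, checking the Fahrenheit reading from the command line with curl (substituting your own core ID and access token for the placeholders) looks like this:

curl "https://api.spark.io/v1/devices/{core_id}/temperaturef?access_token={access_token}"

The Spark Cloud answers with JSON along these lines…the 72.5 here is just an illustrative reading, and the exact fields inside coreInfo may vary:

{
  "cmd": "VarReturn",
  "name": "temperaturef",
  "result": 72.5,
  "coreInfo": { "connected": true, "deviceID": "{core_id}" }
}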
Next Up:
Pushing Data to the Cloud…This approach requires me to reach out to the Spark and query for the values. In the next post, I will show you how to use events to have the Spark publish the values, as well as how to use the TCPClient to POST them to a waiting web service.