Photon: One good reading to data.sparkfun then "null"?

Everything pertaining to the data.sparkfun.com service, the phant project which powers it, and user projects which talk to the service.

Moderators: phalanx, toddtreece, Brennen@SparkFun

Post Reply
Silverminer
Posts: 1
Joined: Tue Oct 04, 2016 12:44 pm

Photon: One good reading to data.sparkfun then "null"?

Post by Silverminer » Tue Oct 04, 2016 1:19 pm

I think I bit off more than I can chew as a newbie but after spending several days, I feel like I'm getting close. Unfortunately, I seem to have myself stuck in a loop and I can't figure out why. Go easy, this is all new to me and I cry easily.

The following sketch reads an RHT sensor and a TMP sensor and then posts it to data.sparkfun. I have that part working. I wanted to add a Spark.publish statement so I can get a warning if the temps on the RHT and the TMP vary by a set amount.The code compiles but once I flash it, I get one good reading on data.sparkfun and then "null" repeating in my particle console with no more readings on data.sparkfun. The event name on my console attached to each null is "pipetempbig". Also, the temp diff is not over 4.

Most of the following code came from the sparkfun experiment 6 and I modified it to work with the TMP and am trying to add the .publish part. The commenting is a little disjointed right now. Thanks in advance.



// This #include statement was automatically added by the Particle IDE.
#include "SparkFunPhant/SparkFunPhant.h"

// This #include statement was automatically added by the Particle IDE.
#include "SparkFunRHT03/SparkFunRHT03.h"

/////////////////////
// Pin Definitions //
/////////////////////
const int RHT03_DATA_PIN = D3; // RHT03 data pin
const int PIPE_PIN = A0; // Photocell analog output
const int LED_PIN = D7; // LED to show when the sensor's are being read

///////////////////////////
// RHT03 Object Creation //
///////////////////////////
RHT03 rht; // This creates a RTH03 object, which we'll use to interact with the sensor

int pipef = 0;//I need this variable to run a temp comparison

boolean pipedelta = false;
///////////////////////////////////////////////
// Phant (data.sparkfun.com) Keys and Server //
///////////////////////////////////////////////
// These keys are given to you when you create a new stream:
const char server[] = "data.sparkfun.com"; // Phant destination server
const char publicKey[] = "publicblahblah"; // Phant public key
const char privateKey[] = "privateblahblah"; // Phant private key
Phant phant(server, publicKey, privateKey); // Create a Phant object

///////////////////////
// Post Rate Control //
///////////////////////
// data.sparkfun.com limits how often you can post to the service. You are allowed up
// to 100 posts every 15 minutes.
unsigned long lastPost = 0; // lastPost keeps track of the last UNIX time we posted
const unsigned int POST_RATE_S = 60; // This sets the post rate to 60 seconds. Avoid setting it below 10s.

//////////////////////////
// Station Name Globals //
//////////////////////////
String stationName = ""; // String object to keep track of our Photon's name

void setup()
{
Serial.begin(9600); // Start the serial interface at 9600 bps

rht.begin(RHT03_DATA_PIN); // Initialize the RHT03 sensor

pinMode(PIPE_PIN, INPUT); // Set the photocell pin as an INPUT.
pinMode(LED_PIN, OUTPUT); // Set the LED pin as an OUTPUT
digitalWrite(LED_PIN, LOW); // Initially set the LED pin low -- turn the LED off.

// getDeviceName() -- defined at the bottom of this code -- polls Particle's
// server to get the name of the Photon running this application.
getDeviceName(); // Update the stationName String
}

void loop()

{

STARTUP(WiFi.selectAntenna(ANT_EXTERNAL));//I'm using an external antenna on the Photon

analogRead(PIPE_PIN);//get a reading from the TMP
pipef = (((((((analogRead(PIPE_PIN) * (3.3/4095)) *1000) - 500)/10)*(9.0/5.0)) + 32), 1);//convert it to F

int update = rht.update();
if (update == 1)
{
float tempF = rht.tempF();

if (((tempF-pipef) > 4.0) && pipedelta == false){
Spark.publish("pipetempbig");
pipedelta == true;
}
if (((tempF-pipef) < 2.0) && pipedelta == true){
Spark.publish("pipetempnormal");
pipedelta = false;
}
else
{
delay(RHT_READ_INTERVAL_MS);
}
}

// This conditional should only run when the last successful post to Phant
// was POST_RATE_S (60 seconds) or longer ago.
// Time.now() returns the current UNIX timestamp (number of seconds since January 1, 1970).
// It should increase by 1 every second. On a successful POST, we set lastPost equal to Time.now().
if (lastPost + POST_RATE_S <= Time.now())
{
digitalWrite(LED_PIN, HIGH); // Turn the LED on to indicate we're posting
int update = rht.update(); // Get new values from the RHT03.

if (update == 1) // If the RHT03 update was successful
{
int postResult = 0; // This variable will keep track of whether or not

//while (postResult <= 0)
// Phant posts aren't always successful. Our postToPhant() function,
// defined below, will return 1 if it was successful. Or a negative
// number if it failed.
while (postToPhant() <= 0)
{
Serial.println("Phant post failed. Trying again."); // Debug statement
// Delay 1s, so we don't flood the server. Little delay's allow the Photon time
// to communicate with the Cloud.
for (int i=0; i<1000; i++)
delay(1);
}
// After a successful Phant POST:
Serial.println("Phant post success!"); // Debug print
// Set lastPost to current time, so we don't post for another POST_RATE_S seconds:
lastPost = Time.now();
}
else // If the RHT03 update failed:
{
delay(RHT_READ_INTERVAL_MS); // Delay to give the sensor time to reset
}
digitalWrite(LED_PIN, LOW); // Turn the LED off to indicate we're done posting (/trying to post)
}
}

// postToPhant() gathers all of our sensor data, bundles it into a Phant post,
// and sends it out to data.sparkfun.com.
// It'll return either 1 on success, or a negative number if the post fails
int postToPhant(void)
{
Serial.println("Posting to Phant!");// Debug statement

// Use phant.add(, ) to add data to each field.
// Phant requires you to update each and every field before posting,
// make sure all fields defined in the stream are added here.
phant.add("humidity", rht.humidity(), 1); // These first three phant adds set a field value to float variable
// phant.add("tempc", rht.tempC(), 1); // The third parameter -- valid for float variables -- sets the number
phant.add("tempf", rht.tempF(), 1); // of decimal points after the number.
phant.add("pipef", ((((((analogRead(PIPE_PIN) * (3.3/4095)) *1000) - 500))/10)*(9.0/5.0) + 32), 1);

//phant.add("pipeF", ((((analogRead(LIGHT_PIN) - 500)/10)*1.8)+32),1);

phant.add("station", stationName); // phant.add(, ) is perfectly valid too!

// phant.particlePost() performs all of the Phant server connection and HTTP POSTING for you.
// It'll either return a 1 on success or negative number on fail.
// It uses the field/value combinations added previously to send Phant its data.
// MAKE SURE YOU COMMIT A phant.add() FOR EVERY FIELD IN YOUR STREAM BEFORE POSTING!
return phant.particlePost();
}

///////////////////////////////
// Get Device Name Functions //
///////////////////////////////
// These sets of functions poll Particle's server for the name of our Photon.
// This method is described in Particle's documentation here:
// https://docs.particle.io/reference/firm ... evice-name
// Function handlers are used here -- when the Spark.subscribe function
// returns, it will call the nameHandler([topic], [data]) function with our
// Photon's name.
bool validName = false; // Boolean to track if we have a valid name or not

// nameHandler() is a function handler. It's passed to Spark.subscribe() and
// called when that function returns.
// The [data] variable should have the name of our Photon when it's done.
void nameHandler(const char *topic, const char *data)
{
stationName = String(data); // Store the name in the stationName global variable
validName = true; // Set validName to true, so getDeviceName can stop blocking.
}

// getDeviceName manages the spark subscribing and publishing required to get our
// Photon's name. It'll block for up to 30 seconds. On success it'll return a
// positive number. On fail, it'll return 0.
int getDeviceName(void)
{
Spark.subscribe("spark/", nameHandler);
Spark.publish("spark/device/name");

int timeout = 30;
while ((!validName) && (timeout > 0))
{
Serial.println("Waiting for name..." + String(timeout--));
delay(1000); // Spark.process() is called during delay()
}

Serial.println("Station name = " + stationName);

return timeout;
}

Post Reply