SparkFun Forums 

By DaveAK
#154372
I have a project with a HIH-4030 / SEN-09569 analog humidity sensor. It's a 5V sensor but in the comments for the product it looks like it should work at 3.3V.
Nate: After testing, this sensor worked really well at 3.3V. You just have to linearly convert the 5V graph to 3.3V.
I should be able to do that. I used to be really good at math, but obviously somewhere along the way I lost my mind.

That's only half the problem though. I'm hooking this up to an XBee analog input which only has a range of 0-1.2V. I thought I could just use a voltage divider on the output to get me in the correct range. I'm not looking for super accuracy, just something that's consistent with my other sensors, even if it's consistently high or consistently low - I can accommodate that. Trouble is I'm just not getting any significant variation. When I first hooked it up it read a couple of points lower than my existing sensor so I thought all was good, but a few days later when the humidity had risen this sensor hadn't noticeably changed.

The XBee it's connected to goes in to sleep mode, but the sensor itself is permanently connected to the supply. My assumption is that this would avoid needing any warm up time before reading the sensor. Am I wrong? Do I need to take readings for a period of time to allow the sensor to stabilize?

I need more than one of these so I've bought another sensor and I'm going to go through the steps of getting it working with 5V, then with 3.3V, and then with 3.3V and a 1.2V output. Any suggestions on how to do the math from 5V to 3.3V and if the voltage divider is the right way of doing this would be greatly appreciated!
By Mee_n_Mac
#154382
That's a lot 'o' questions there Dave, let's take them 1 by 1.

http://www.sparkfun.com/datasheets/Sens ... asheet.pdf
1) If you run the sensor at something other than 5V you'll get different output voltages than you would at 5V. The datasheet claims the device output is "ratiometric", which means the output should scale linearly with the supply voltage vs the reference voltage (which is 5V in this case). So ideally the output when running off of 3.3V is :

VO,3.3 = (3.3/5.0)*VO,5.0 = 0.66*VO,5.0 or it should be 66% of what the expected output would be if running off 5V.

What should it be if running off of 5V ? Glad you asked. There's 2 ways to get an answer to that. First is to use the equation given in Table1. Table 1 says :

VO,5.0 = (VSUPPLY)(0.0062(sensor RH) + 0.16), typical at 25 ºC
or ...
VO,5.0 = 5*0.0062*sensorRH + 5*0.16 = 0.031*sensorRH + 0.8V
4030Table1.jpg

This also seems to agree pretty well with the graph in Figure3, which says with 0% RH you get ~ 0.75V <-> 0.8V out. And at 40% RH you get ~ 2V out. Give or take perhaps a couple of %RH.
4030Fig3.jpg
I do want to point out though that there seems to be a large variance from sensor to sensor in the DC offset, that is, the voltage you'd get out with 0% RH. Note that in the datasheet there's a Table2 which shows the values for a particular, calibrated unit. Its slope, the mV per %RH, is the same as in the graph above, about 30.68 mV/%RH. But the measured offset for that sensor was 0.958V vs the assumed typical value of 0.8V. That's equivalent to a 5% RH difference. So YMMV.

To run the sensor off 3.3V you'd use the equation :
VO,3.3 = 3.3*0.0062*sensorRH + 3.3*0.16 = 0.0205*sensorRH + 0.528V
Or to get %RH from the voltage :
sensorRH = (VO,3.3 - 0.528V)/0.0205

If possible you may want to calibrate your sensor at 2 or more known RHs and come up with a better value for the DC offset than the assumed 0.528V above.
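A quick sketch of that conversion in code (Python; the coefficients are the datasheet's typical 25 ºC values, so a calibrated offset is better if you have one):

```python
def rh_from_voltage(v_out, v_supply=3.3):
    """Convert HIH-4030 output voltage to uncompensated %RH.

    Uses the datasheet's typical ratiometric equation:
    VO = VSUPPLY * (0.0062 * RH + 0.16), at 25 degC.
    """
    offset = 0.16 * v_supply   # zero-RH output, 0.528 V at a 3.3 V supply
    slope = 0.0062 * v_supply  # volts per %RH, ~0.0205 at 3.3 V
    return (v_out - offset) / slope
```

For example, 1.35V out at a 3.3V supply works out to roughly 40 %RH.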

2) To get the output to range from some min voltage to 1.2V by using just a voltage divider is tricky. I say that because the sensor says a min load of 80k ohms is as low as you can go. Judging by the comments on the product page, connecting its output to something < 80k gets you some odd output voltages. Perhaps this was your problem ? How did you do your voltage divider ? What values did you use ? W/o going into more math you need to make sure the sum of the 2 resistors is > 80k, and I might aim for 100k. Even then there's a "trick". You might calculate the ideal divider resistors, but if you haven't taken into account the A/D converter's input resistance, the output will be more 'divided' than you expect. The fact that you have to use high R values to meet the 80k spec only exacerbates this problem. The good thing is that you can calibrate this error out. You can measure the output and divided voltage to get the true divider number and factor that into the above equation for RH.
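A sketch of the divider arithmetic (Python; the 0.45 target ratio and 100k total are just example numbers, not gospel):

```python
def divider_resistors(r_total, ratio):
    """Split a series pair so Vout = Vin * ratio, with R1 on top
    (to the sensor output) and R2 on the bottom (to ground).
    Keep r_total above the sensor's 80 kOhm minimum load."""
    r2 = r_total * ratio
    r1 = r_total - r2
    return r1, r2
```

divider_resistors(100e3, 0.45) gives R1 = 55k and R2 = 45k; from there pick the nearest standard values and re-measure the actual ratio.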

3) As for warmup time, it's confusing to me. In Table1 above you can find 2 numbers that seem to pertain to this. The 'settling time' is stated as 70 ms. My semi-guess is that this is the time from power-on to a good output voltage, given the sensor has been exposed to its environment for some 'long' time. What's that time ? How long does it take the sensor to respond to a change in its environment ? Again my semi-guess is that this is the response time. They actually state a time constant, and it takes 3 to perhaps 5 time constants for a device to settle to a good number. Since the response time (constant) is stated to be 5 sec, allow 15 to 25 secs for the output voltage to become a good representation of the environment's RH. Take note of Note 5.
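If the response really is first-order with a 5 sec time constant, the "3 to 5 time constants" rule falls straight out of the exponential (a sketch, assuming first-order behavior):

```python
import math

def settled_fraction(t_seconds, tau=5.0):
    """Fraction of a step change in RH reflected in the output
    after t_seconds, for a first-order response with time
    constant tau (5 s per the datasheet's response time)."""
    return 1.0 - math.exp(-t_seconds / tau)
```

After 15 sec (3 tau) you're at roughly 95% of the step; after 25 sec (5 tau), about 99.3%.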

ps - don't forget to do the temp compensation.
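For reference, the correction the Honeywell datasheet gives (as I read it) is True RH = Sensor RH / (1.0546 - 0.00216*T), with T in ºC — a sketch:

```python
def true_rh(sensor_rh, temp_c):
    """Temperature-compensate the raw reading using the datasheet's
    factor: True RH = Sensor RH / (1.0546 - 0.00216*T), T in degC."""
    return sensor_rh / (1.0546 - 0.00216 * temp_c)
```

At 25 ºC the factor is ~1.0006, so the correction is negligible there and only matters as you move away from room temperature.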
By DaveAK
#154424
Awesome response! I can't thank you enough.

1. Turns out my math hasn't deteriorated as much as I feared. This was pretty much what I was doing, although my code changed so many times it's screwed up now. I'll scrap it and start again fresh.

2. I think this is where my problem lies. I completely forgot the 80K requirement when I was doing my divider. R1+R2=61k. However, from the XBee data sheet input impedance when taking a sample is 1MOhm! Would this be considered to be in parallel with R2? If so I can use a suitable resistor to give me a value to plug in to the divider calculation, right? (As you can probably tell I have a basic grasp of the fundamentals, but if things don't fit into a nice example somewhere then I get lost quick!)

3. While I can adjust the XBee's sleep parameters to give an on time of say 30s before taking a reading I was hoping that if the sensor is constantly powered that I could avoid this. What would your opinion be on that? Since I'm not planning on taking readings more than once every 5 minutes I guess I could live with 30s on time. (This will be in a non-condensing environment - unless things go really really badly!)

4. I have an analog temp sensor attached to the same XBee which is working perfectly, :dance: and I use this temperature for the true RH calculation.
By Mee_n_Mac
#154429
DaveAK wrote:I think this is where my problem lies. I completely forgot the 80K requirement when I was doing my divider. R1+R2=61k. However, from the XBee data sheet input impedance when taking a sample is 1MOhm! Would this be considered to be in parallel with R2? If so I can use a suitable resistor to give me a value to plug in to the divider calculation, right? (As you can probably tell I have a basic grasp of the fundamentals, but if things don't fit into a nice example somewhere then I get lost quick!)
Correct. But perhaps the example (alas with a different numbering scheme) in wikipedia is the best illustration of your setup. Look at this with the understanding that R1 is the XBee's resistance and R2 and R3 are your resistor divider pair.
http://en.wikipedia.org/wiki/Resistor#S ... _resistors
[Wikipedia diagrams: series and parallel resistor networks]
You want Req to be > 80k. The resulting divider is then (R1||R2)/Req. And you want that ratio to be something in the 0.4 to 0.45 range, assuming you're running off 3.3V and need the 100%RH voltage to be ~ 1.2V (for some reason).

With 50k in parallel with 1M, the equivalent parallel combo is 47.6k ... so it's not that big a deal with the range of values you're likely to be using. But if you go with higher values for the divider resistors you can see where the error slowly creeps in. And you can see why with <10k resistors and an A/D input > 10M you can forget the "fine tuning". Your case falls into that middle ground.
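The loading arithmetic, as a sketch (Python; the 55k/45k pair is just the example split of a 100k total):

```python
def parallel(ra, rb):
    """Equivalent resistance of two resistors in parallel."""
    return ra * rb / (ra + rb)

def loaded_ratio(r1, r2, r_adc):
    """Actual divider ratio once the ADC's input resistance
    loads the bottom resistor r2."""
    r2_eff = parallel(r2, r_adc)
    return r2_eff / (r1 + r2_eff)
```

With R1 = 55k, R2 = 45k and a 1M input, the ideal 0.45 ratio drops to about 0.439 - small, but worth folding into the calibration.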
DaveAK wrote: While I can adjust the XBee's sleep parameters to give an on time of say 30s before taking a reading I was hoping that if the sensor is constantly powered that I could avoid this. What would your opinion be on that? Since I'm not planning on taking readings more than once every 5 minutes I guess I could live with 30s on time. (This will be in a non-condensing environment - unless things go really really badly!)
I don't know that the sensor has to be powered, just in the environment, to get a good reading. Then, after power is applied, wait 100 msec to take a reading ... to allow the electronics to settle. I suppose you could do some tests given a dry environment and a pot of boiling water. See how long it takes, after power up, for the readings to settle to some number in each of 3 test cases. Start with a (relatively) dry environment, turn power off for a few minutes and then power on and take readings. Power off for a few minutes again and then move the sensor to the "wet" environment near the pot. Immediately after moving, power on and take a bunch of readings. Turn off again but leave the sensor near the pot. Finally, with the sensor having been off but in the "wet" for a few minutes, turn it back on and take a bunch of readings. A bunch in each case being a reading every 100 - 200 msec for 30 secs. My guess is that the 1st and last test cases will show no difference between the early and later readings, but in the transition from dry to wet you'll see the sensor response time.
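A host-side logging loop for those tests could look something like this (Python sketch; read_adc_volts is a placeholder for however you pull the sampled voltage off the XBee, which depends on your setup):

```python
import time

def log_settling(read_adc_volts, duration_s=30.0, period_s=0.15):
    """Record (elapsed_time, voltage) pairs for duration_s seconds,
    sampling every period_s seconds, to watch the output settle
    after power-up or an environment change."""
    samples = []
    t0 = time.monotonic()
    while time.monotonic() - t0 < duration_s:
        samples.append((time.monotonic() - t0, read_adc_volts()))
        time.sleep(period_s)
    return samples
```

Run it once per test case, then compare the early readings against the late ones to see whether the output drifted.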
By DaveAK
#154452
Thanks again. I'm going to get back to work on this this weekend. I have a second part to this project that I'm going to attempt to make my own PCB for. Looks like it's all coming together now. I'm confident that I can get this sensor working satisfactorily with what you've given me here.
By Duane Degn
#154570
DaveAK wrote:I'm hooking this up to an XBee analog input which only has a range of 0-1.2V.
I don't think this is correct. The high reference voltage on the XBee needs to be between 2.08V and Vddad. I'm pretty sure Vddad is Vcc or 3.3V. I'm pretty sure if you use 3.3V on the Vref pin, you can measure voltages between 0V and 3.3V with the XBee's ADC pins.

I saw the 1.2V value on an AVR Freaks post but it looked like they had made the mistake of subtracting 2.08V from 3.3V to get the range. This is not correct.
By waltr
#154607
Duane Degn wrote:
DaveAK wrote:I'm hooking this up to an XBee analog input which only has a range of 0-1.2V.
I don't think this is correct. The high reference voltage on the XBee needs to be between 2.08V and Vddad. I'm pretty sure Vddad is Vcc or 3.3V. I'm pretty sure if you use 3.3V on the Vref pin, you can measure voltages between 0V and 3.3V with the XBee's ADC pins.

I saw the 1.2V value on an AVR Freaks post but it looked like they had made the mistake of subtracting 2.08V from 3.3V to get the range. This is not correct.
Careful ... it depends on which XBee modules the OP is using. The Vref for the Series 1 modules is different from the Vref for the Series 2 modules.
The OP needs to clarify exactly which XBee modules he/she is using.