SparkFun Forums 

By Vraz
#71622
Thanks to previous feedback, I have redesigned the power supply section of my upcoming solar-powered wireless weather station. Essentially, I went with symmetric circuits for the solar panel and the battery: each allows voltage sampling via a 2:1 divider into an ADC, and each can be individually connected to or disconnected from the regulator.

Couple of issues that came up with my redesign:

1- I used 100K resistors in the voltage divider to limit current drain. I know this impacts the ADC sampling time; by my calculations, I need about 5ms to be certain the sample/hold capacitor is properly charged (a first-order estimate is sketched after these two questions). Are there any other implications to using large resistors for the divider? Could I go larger if the sampling time is not an issue?

2- There is one gotcha: my design won't actually power up without help. I used pulldown resistors on the N-MOSFETs, which act as level shifters. I cannot use pullups because that would exceed the voltage limits of the microcontroller. Any clever ideas on how to get this circuit to start up with minimal extra parts?
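For reference, here is the first-order RC settling estimate behind question 1, as a plain-C sketch. The 50K Thevenin impedance follows from the two 100K divider resistors, and the 14pF sample/hold value is the one quoted later in this thread; the model ignores the ADC mux's series resistance and leakage, which is part of why a much longer wait (like the 5ms above) gets used in practice.

```c
#include <math.h>
#include <stdio.h>

int main(void)
{
    double r_src = 50e3;    /* Thevenin impedance of two 100K resistors   */
    double c_sh  = 14e-12;  /* internal sample/hold capacitance (assumed) */
    double bits  = 10.0;    /* ADC resolution                             */

    double tau   = r_src * c_sh;              /* RC time constant             */
    double n_tau = log(pow(2.0, bits + 1.0)); /* taus to settle within 1/2 LSB */

    printf("tau = %.2f us\n", tau * 1e6);
    printf("settled to 1/2 LSB after %.1f tau = %.2f us\n",
           n_tau, n_tau * tau * 1e6);
    return 0;
}
```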

Image
By Vraz
#71623
2- There is one gotcha: my design won't actually power up without help. I used pulldown resistors on the N-MOSFETs, which act as level shifters. I cannot use pullups because that would exceed the voltage limits of the microcontroller. Any clever ideas on how to get this circuit to start up with minimal extra parts?
Upon looking at this further, it appears I confused myself, as the logic is inverted. The pulldown on the lower N-MOSFET will prevent it from conducting, which will in turn allow the upper P-MOSFETs to be pulled high, allowing them to conduct. So in theory, both the solar panel and battery will be connected to the linear regulator if the uC is powered off (thus powering it on). So power-on might be a non-issue, but I would still appreciate feedback on the ADC sampling.
By emf
#71657
Charge time isn't the only problem; there's also leakage current, which becomes more of a factor as your input impedance goes up. With a leakage of ±100nA and a 50K impedance, you're looking at something like ±1.5 counts on a 10-bit ADC at 3.3V.
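The arithmetic behind that estimate, for anyone who wants to check it (same assumed numbers as above):

```c
#include <stdio.h>

int main(void)
{
    double i_leak = 100e-9;  /* +/-100nA pin leakage    */
    double r_src  = 50e3;    /* 50K source impedance    */
    double v_ref  = 3.3;     /* ADC reference voltage   */
    double steps  = 1024.0;  /* 10-bit ADC              */

    double v_err = i_leak * r_src;  /* worst-case offset: 5mV */
    double lsb   = v_ref / steps;   /* one count: ~3.2mV      */

    printf("error = %.1fmV = +/-%.2f counts\n", v_err * 1e3, v_err / lsb);
    return 0;
}
```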
By felis
#71670
Vraz wrote: Are there any other implications to using large resistors for the divider? Could I go larger if the sampling time is not an issue?

Buffer the output of voltage divider with an op-amp voltage follower.
By Vraz
#71691
With a leakage of +-100nA and a 50K impedance, you're looking at something like +-1.5 counts on an 10-bit ADC at 3.3V.
Is the underlying issue that, while the leakage is pretty small, a very large impedance reduces the ADC input current to the point where the leakage current becomes meaningful? That in turn effectively translates into jitter in the low-order bits of the ADC, and you lose resolution?
Buffer the output of voltage divider with an op-amp voltage follower.
Interesting idea I had not considered. (The analog side of things is not my strong suit.) Seems like I would need to do a break-even analysis comparing the current consumption of a high-impedance voltage divider plus the quiescent current of the op-amp against a lower-impedance voltage divider without the op-amp.
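Something like this back-of-the-envelope comparison, I suppose (all values are placeholders to be swapped for real datasheet numbers):

```c
#include <stdio.h>

int main(void)
{
    double v_batt = 4.2;    /* fully charged LiPo                  */
    double r_hi   = 200e3;  /* 2 x 100K divider                    */
    double r_lo   = 20e3;   /* 2 x 10K divider the ADC can drive   */
    double i_q    = 1e-6;   /* micropower op-amp quiescent current */

    /* Option A: keep the high-impedance divider, buffer it. */
    printf("100K+100K divider + follower: %.0f uA\n",
           (v_batt / r_hi + i_q) * 1e6);

    /* Option B: low-impedance divider driving the ADC directly. */
    printf("10K+10K divider, no op-amp:   %.0f uA\n",
           (v_batt / r_lo) * 1e6);
    return 0;
}
```

With those placeholder numbers the buffered divider wins (about 22uA vs 210uA), but a real op-amp's quiescent current and input offset would decide it.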
By emf
#71695
I've never had a strong understanding of it myself, but that sounds about right to me. For some PICs, at least, there are different leakage values given in the ADC section of the manual, in the comparator section, and in the electrical characteristics section.

I've seen some designs that use a lower-valued resistor divider to come in under the 2.5k spec for the ADC, but use another microcontroller pin and a few transistors to only enable the divider when they want to measure the voltage. You still pay the higher current draw when measuring the voltage, but you leave it turned off the vast majority of the time so your average draw is low. I don't know if that does any better than the op-amp solution... I guess it depends on how often you need to do your measurements.
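On an AVR that could look something like the sketch below (pin assignment, switching FET, and settling delay are all hypothetical, and it assumes the ADC mux/prescaler and pin direction are already configured):

```c
#define F_CPU 8000000UL   /* assumed clock; set to match the hardware */
#include <avr/io.h>
#include <util/delay.h>

/* PB0 drives the N-FET level shifter that switches the divider's
 * high-side P-FET, so the divider only draws current during a reading. */
static uint16_t read_divider(void)
{
    PORTB |= _BV(PB0);          /* switch the divider in         */
    _delay_us(100);             /* generous settling time        */

    ADCSRA |= _BV(ADSC);        /* start one conversion          */
    while (ADCSRA & _BV(ADSC))  /* ADSC clears when it completes */
        ;

    PORTB &= ~_BV(PB0);         /* divider off: no static drain  */
    return ADC;                 /* 10-bit result                 */
}
```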
By Pyrofer
#73327
I'd like to shortcut this process by simply connecting my solar cell (5V in good light) directly to the SparkFun LiPo charger.
Two questions:
1) Would I still need a protection diode in line with the cell, or would the LiPo charger take care of that?
2) Would I need an off switch in line with the cell to prevent overcharging, or again, is the SparkFun LiPo Charger clever enough to deal with a constant but varying voltage applied to the charging input and not overcharge the cell?

Any ideas?
By metaforest
#73328
emf wrote: I've seen some designs that use a lower-valued resistor divider to come in under the 2.5k spec for the ADC, but use another microcontroller pin and a few transistors to only enable the divider when they want to measure the voltage. You still pay the higher current draw when measuring the voltage, but you leave it turned off the vast majority of the time so your average draw is low. I don't know if that does any better than the op-amp solution... I guess it depends on how often you need to do your measurements.
Measuring low-current voltage sources is a pain in the butt... another way of dealing with it is to use an external capacitor to buffer the sample.

By using one of the PIC signal lines, you can drain the capacitor before a sample and then take the line to high impedance to let the holding capacitor charge from the divider. After a polite interval allowing the capacitor to recharge, you can then take the sample. If the capacitor is large enough (relative to the ADC's internal holding circuit), the result won't suffer much loss, and any measurable loss would be fairly constant. I am assuming that the capacitor is allowed to fully charge before the ADC is triggered.

Since the OP is working with a LiPo battery, a reasonable lower bound on the charge rate for the capacitor is known. If this were used on the solar array, it might underestimate the voltage significantly, but only when the output is well below where the panel is useful.
By Vraz
#73398
By using one of the PIC signal lines, you can drain the capacitor before a sample and then take the line to high impedance to let the holding capacitor charge from the divider. After a polite interval allowing the capacitor to recharge, you can then take the sample.
Just so I understand: you are suggesting putting the capacitor across the ADC input and ground, and then using a second pin to effectively short the ADC input to ground (to discharge the capacitor between charge and sample cycles)? On the AVR I am using, the pin functions can be changed on the fly, so I could ground the actual ADC input pin (and wouldn't need the second pin). My only concern is whether the brief current spike from the capacitor would be a problem for the pin driver. The internal ADC S&H capacitor is 14pF, so the external capacitor would not need to be large even to provide a 1000x differential. Otherwise I would need the second pin with a current-limiting resistor to discharge it.
By metaforest
#73426
That's the idea.

Using a second pin and a current-limiting resistor is sensible.
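A minimal AVR sketch of the scheme as it ended up, assuming the second pin (here called PD2) discharges the external buffer capacitor through a series current-limiting resistor; pin names and delays are hypothetical:

```c
#define F_CPU 8000000UL   /* assumed clock; set to match the hardware */
#include <avr/io.h>
#include <util/delay.h>

static uint16_t sample_buffered(void)
{
    /* Drain the buffer cap through the series resistor on PD2. */
    DDRD  |=  _BV(PD2);   /* PD2 as output...             */
    PORTD &= ~_BV(PD2);   /* ...driven low                */
    _delay_us(50);        /* several RC of the drain path */

    /* Release PD2 (high impedance, pullup off) and let the divider
     * recharge the cap. 5ms is about 7 RC for a 14nF cap (1000x the
     * 14pF S&H) behind a 50K Thevenin source: the "polite interval". */
    DDRD  &= ~_BV(PD2);
    _delay_ms(5);

    ADCSRA |= _BV(ADSC);        /* one conversion */
    while (ADCSRA & _BV(ADSC))
        ;
    return ADC;                 /* 10-bit result  */
}
```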