SparkFun Forums 

By beebop
#12837
Hello,

I am using an MCP3204 analog-to-digital converter to measure the output of an op amp. The ADC has a maximum input voltage of 5V, but the op amp is powered from +12V and -12V. I need to ensure that the voltage into the ADC can never exceed 5 volts, without changing the linearity of any voltage below that point. I think I can use two diodes for input protection: one with its cathode to +5V and anode to the input, the other with its cathode to the input and anode to GND. I also have a 3.9k series resistor from the op amp output, and a 10uF cap between the signal and GND.
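For reference, here's a quick back-of-the-envelope on those values (a sketch only; the ~0.6V diode drop is an assumed 1N4148-class forward voltage, not measured):

```python
import math

# Values from the circuit described above
R_series = 3.9e3      # series resistor between op amp output and ADC input, ohms
C_filter = 10e-6      # cap from the ADC input node to GND, farads
V_rail_pos = 12.0     # op amp positive supply, volts
V_clamp = 5.0         # rail the upper clamp diode ties to, volts
V_diode = 0.6         # assumed forward drop of a 1N4148-class diode, volts

# The series resistor and cap form an RC low-pass at the ADC input
f_cutoff = 1.0 / (2.0 * math.pi * R_series * C_filter)
print(f"RC cutoff: {f_cutoff:.2f} Hz")                      # ~4.08 Hz

# Worst case: op amp slams to +12V and the upper diode conducts into +5V
I_clamp = (V_rail_pos - V_clamp - V_diode) / R_series
print(f"Worst-case clamp current: {I_clamp * 1e3:.2f} mA")  # ~1.64 mA
```

That couple of milliamps is far below a 1N4148's rating, so the diodes survive easily; note also that the 3.9k/10uF pair gives a cutoff of only a few hertz, so check that against your signal bandwidth.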

My question is: Do these need to be 5v zeners, or can I use a 1N4148 or similar?

Any advice would be so nice!

Thanks,
Robert
By wiml
#12858
I think you can use a normal diode there. But remember that diodes don't have a "brick-wall" I-V curve; it's actually a steep exponential one, so they can begin to conduct, and affect linearity, near the edges of the range.

Could you set up the ADC so that its conversion range is smaller than its maximum input range? If it can't handle inputs outside of 0-5V, can you set its references so that the actual conversion range is (say) 1.25V-3.75V? Then the diode nonlinearity can be kept in that 1.25V of margin on each side. This trades some noise performance for linearity: the smaller LSB means the same absolute input noise is worth more counts.
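To put a number on that trade-off, here's a sketch assuming the MCP3204's 12-bit resolution and the 2.5V-wide (1.25V-3.75V) span suggested above:

```python
# MCP3204 is a 12-bit converter: 4096 codes across its conversion span
N_CODES = 2 ** 12

# Compare the full 0-5V span against the narrowed 1.25V-3.75V (2.5V-wide) span
for span in (5.0, 2.5):
    lsb = span / N_CODES  # volts per code
    print(f"{span:.1f} V span -> LSB = {lsb * 1e6:.1f} uV")
# 5.0 V span -> LSB = 1220.7 uV
# 2.5 V span -> LSB = 610.4 uV
```

Halving the span halves the LSB, so a fixed amount of input noise spans roughly twice as many codes; that's the noise cost of buying the linearity margin.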
By beebop
#12873
Thank you so much! I think it will work out well, then, as the high end should never get to 4V (perhaps around 3.2V max), and the low end would only be slightly above 0V (though not the 1.25V margin you suggest). If the nonlinearity becomes an issue, I suppose I could bump my scale up a little bit.
Anyway, thanks for the advice!
Regards,
Robert