- Tue May 01, 2012 3:36 pm
#143810
I meant the op amp only supplied 3v to the LEDs, not across them (I did measure that with a 5v supply; it was around 2.75v).
I tried reading the current with my cheap POS HF multimeter, but it didn't really work (no readings anywhere). I don't trust this thing anyway. Can I get in the ballpark a different way? There are 13 SMD LEDs wired in parallel. I don't think LEDs differ drastically in their current draw? Maybe they do?
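One way to get in the ballpark without a working ammeter is to multiply the datasheet drive current by the LED count. This is a rough sketch; the per-LED figures here (5, 10, 20 mA) are assumptions covering the typical range for small SMD indicator LEDs, not values from your actual parts:

```python
# Ballpark estimate of total current for 13 SMD LEDs in parallel.
# Per-LED currents below are assumed typical values; the real number
# comes from the LED datasheet (or from measuring one branch).

num_leds = 13

for i_led_ma in (5, 10, 20):  # assumed per-LED current, mA
    total_ma = num_leds * i_led_ma
    print(f"{i_led_ma} mA per LED -> {total_ma} mA total")
```

So even at a modest 20 mA each, the bank wants around a quarter amp, which is far more than a garden-variety op-amp output can source.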
Mee_n_Mac wrote: Any chance you could post a schematic or wiring diagram of what you think you have? I glean you're taking the voltage off a pot that's used to dim lights somewhere, and when that voltage goes lower and lower you want the LEDs to get brighter and brighter?? Thus you've got some inverting op-amp circuit to (try to) do that. That op-amp then drives a bunch of resistor+LEDs that are in parallel. So what you seem to need is an amp that can supply more current at about the same output voltage as the present op-amp circuit. You might be able to find a "power amp" to replace the op-amp in your circuit, or one that follows it (op-amp output goes into power amp input).

I can post something, but basically what you've described is what I've got: the voltage source I have from the dimmer/switch decreases as the backlights need to get brighter, and vice versa. It is a dimmer, but I never run it at a lower brightness (it's not very bright anyway).
OTOH, when you say that "its purpose is to invert the voltage from the light switch/dimmer, which I always run at full brightness," it sounds like you have a fixed voltage that's either there or not, and the problem is just that it's not the correct voltage for the LEDs. There may be a much better and simpler way to make the LEDs turn on if this is the case. But I do need to know the circuit as best as you can describe it, and what the "wrong" voltage from this switch/dimmer is. Depending on the current involved, you might be able to get rid of this op-amp circuit and just use a reasonable power dropping resistor (though a DC-DC converter would be less wasteful and run cooler, and a linear regulator+resistor would split the difference between these 2 approaches).

See above; it doesn't work like that. A DC-DC won't work because at "full brightness" the source I can use from the dimmer switch is 1.35v, and at "full dim" it's 8v.
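To make the inversion requirement concrete: the op-amp stage has to turn a falling input (8v down to 1.35v) into a rising output. Any linear inverting stage implements v_out = gain * v_in + offset with a negative gain. The target output voltages below (3v at full bright, 0v at full dim) are assumptions for illustration; the real endpoints depend on what the LED bank needs:

```python
# Solve for the gain and offset of a linear inverting map that sends
# the dimmer's 1.35 V (full bright) to a high drive voltage and its
# 8 V (full dim) to a low one. Endpoint drive voltages are assumed.

v_in_bright, v_out_bright = 1.35, 3.0   # dimmer at full brightness
v_in_dim,    v_out_dim    = 8.0,  0.0   # dimmer at full dim

gain = (v_out_dim - v_out_bright) / (v_in_dim - v_in_bright)
offset = v_out_bright - gain * v_in_bright

def v_out(v_in):
    return gain * v_in + offset

print(f"gain = {gain:.3f}, offset = {offset:.3f} V")
print(f"v_out(1.35) = {v_out(1.35):.2f} V, v_out(8.0) = {v_out(8.0):.2f} V")
```

The gain comes out negative (around -0.45 with these assumed endpoints), which is why a simple dropping resistor or a fixed-output DC-DC can't do the job: they can't invert.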
I think you're correct: it's just not getting enough amps. When I supply 5v from a different source it works fine.

hassmaschine wrote: btw, even if I crank up the op amp to say, 7v output, it still only gets about 3v to the LEDs.

If my understanding is correct, that's to be expected. The op-amp can't provide all the current needed by the LEDs; they are in effect shorting the output of the op-amp to ground (even if it's not a "hard" short). But let's be very clear with the terms used so there's no mistake. When you said "about 3v to the LEDs", did you really mean ~3V between the connector pins that go to the resistors+LEDs and ground? The actual voltage across just the LEDs themselves should only ever be 1.5 to maybe a little over 2 V, even when driven at their max current levels.
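The distinction between "voltage to the LEDs" and "voltage across the LEDs" falls out of Ohm's law on the series resistor: the resistor drops whatever the LED doesn't. A minimal worked example, where the forward voltage (2.0v) and resistor value (150 ohms) are assumed placeholders, not values from this circuit:

```python
# Voltage split across a series resistor + LED branch.
# Vf and R are assumptions for illustration; the real values come
# from the LED datasheet and the resistor actually on the board.

v_supply = 5.0    # volts at the connector pins
v_f = 2.0         # assumed LED forward voltage
r_series = 150.0  # assumed series resistor, ohms

i_ma = (v_supply - v_f) / r_series * 1000
print(f"resistor drops {v_supply - v_f:.1f} V, branch current ~= {i_ma:.1f} mA")
```

So a ~2v reading across just the LED and a ~5v (or ~3v, sagging) reading at the connector are both consistent; the difference lives in the resistor.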
Can you measure what the total current drawn by all the resistors+LEDs is? I'm thinking you could put an ammeter inline with a ~4.8V battery pack and connect the resistors+LEDs to that. With that measurement, we could know what's needed for a "driver" of some sort.
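If the meter's current ranges are the untrustworthy part, the voltage ranges can stand in for them: put a small known resistor in series with the whole LED bank, measure the voltage across it, and apply Ohm's law. The shunt value and the example reading below are assumptions, just to show the arithmetic:

```python
# Measure current indirectly with a shunt resistor: I = V / R.
# A 1-ohm shunt keeps the voltage burden small; the 0.26 V reading
# is a made-up example, not a measurement from this circuit.

r_shunt = 1.0       # ohms, assumed known shunt resistor
v_measured = 0.26   # volts, example reading across the shunt

i_total_ma = v_measured / r_shunt * 1000
print(f"total LED current ~= {i_total_ma:.0f} mA")
```

With a 1-ohm shunt the math is trivial (millivolts read equals milliamps flowing), and most cheap meters are far more trustworthy on a DC volts range than on their current ranges.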