90 Watt LED Driver - Need a bit of help here :)
Posted: Sun Mar 26, 2006 3:13 am
I know a thing or two about electronics, but when it comes to the really complex stuff, it seems like a good idea to get the assistance of those in the know. Ok so maybe this is not THAT complicated but it is beyond me right now. You guys really know your stuff. Can somebody who knows more about this than I do give me a quick hand. This is a rather unusual project...
Here is my situation. I have a 200W PC power supply and 18 LEDs rated at 3 W / 1000 mA each. That works out to about 63 watts total (18 A at roughly 3.5 V forward drop). I need to drive all 18 LEDs at 1000 mA each. I was thinking three parallel strings of 6 in series, but on second thought 6 strings of 3 in series makes more sense, since 6 x 3.5 V won't fit under a 12 V rail. I have a constant VOLTAGE supply (the PC power supply); what I really need is a constant CURRENT supply. As you all know, when the LEDs heat up their forward voltage drops, so at a fixed voltage they draw more current. Since I intend for these to run 24/7, if one were to short we would have a problem on our hands. I would like to avoid that.
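Just to sanity-check my own arithmetic (treating each LED as 3.5 V at 1 A, which is my reading of the spec and may be slightly off):

# Rough sanity check on the LED numbers.
# Assumption: each LED drops about 3.5 V at its rated 1000 mA.
NUM_LEDS = 18
I_LED = 1.0          # amps per LED
VF = 3.5             # volts forward drop per LED

total_led_power = NUM_LEDS * I_LED * VF    # power dissipated in the LEDs themselves
total_led_current = NUM_LEDS * I_LED       # current if every LED shared one 3.5 V rail

print(f"LED power:   {total_led_power:.0f} W")    # ~63 W (watts, not amps)
print(f"LED current: {total_led_current:.0f} A")  # 18 A at 3.5 V, or 6 A total as 3-series strings on 12 V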
First solution - buy an LED driver. None of the "off the shelf" constant-current regulators will do the job, and they are QUITE expensive to boot - wiring up two of them is going to cost too much.
http://www.advancetransformer.com/uploa ... luxeon.pdf
However, I figured out (with this board's help) that if I have a constant VOLTAGE source, I can build a circuit around it to maintain constant CURRENT. PC power supplies are constant-voltage and pretty cheap. Since I only need ~90 watts, a 200W supply should be plenty (as long as it can deliver 100+ watts on the 12 V rail).
Constant voltage to constant current example circuit:
http://www.onsemi.com/pub/Collateral/AND8109-D.PDF
My main problem is this. I know the PC power supply can deliver 100+ watts no problem, and most should be able to supply that across the 12 V rail. How do I set this circuit up so that it can 1) handle 100 watts (with a safety margin), 2) constantly adjust to provide a constant current instead of a constant voltage, and 3) be fault tolerant - if an LED goes out or overheats, I don't want the whole system to fail. That branch of LEDs would go dark but the other 2 branches (or 5, if it is 6 strings of 3 in series) would keep humming. Lastly, what happens if a single LED fails shorted instead of opening the circuit? I assume a proper constant-current circuit would keep the resulting current increase in check?
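To put rough numbers on point 1), here is a back-of-the-envelope dissipation budget. This is only a sketch, and it assumes a linear constant-current stage per string (a ballast resistor, or a pass element along the lines of the ON Semi note) - a switching driver would change the math:

# Dissipation budget for linear current regulation from a 12 V rail.
# Assumption: one linear regulating element per string of 3 LEDs; whatever
# voltage the LEDs don't use gets burned off at the full string current.
V_SUPPLY = 12.0
VF = 3.5            # per-LED forward drop (assumed)
I_STRING = 1.0      # regulated string current
LEDS_PER_STRING = 3
NUM_STRINGS = 6

v_leds = LEDS_PER_STRING * VF              # 10.5 V across the LEDs
v_headroom = V_SUPPLY - v_leds             # 1.5 V left for the regulator
p_per_string = v_headroom * I_STRING       # 1.5 W lost per string
p_total = NUM_STRINGS * p_per_string       # 9 W across all six strings

print(f"Headroom per string: {v_headroom} V, loss per string: {p_per_string} W")
print(f"Total regulation loss ({NUM_STRINGS} strings): {p_total} W")

The point being: any linear scheme off 12 V burns roughly the same 9 W that plain resistors do. The win is that it holds the current at 1 A as the forward voltage drifts, not that it saves power.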
I have been an electronics enthusiast for many years, but I'm still a bit new to power regulation - so any help you could give would be appreciated.
Schematic:
From: http://led.linear1.org/led.wiz
12v source voltage
3.5v forward voltage
1000 ma current
18 leds in array
Solution 0: 3 x 6 array uses 18 LEDs exactly
+----|>|----|>|----|>|---/\/\/----+ R = 1.5 ohms
+----|>|----|>|----|>|---/\/\/----+ R = 1.5 ohms
+----|>|----|>|----|>|---/\/\/----+ R = 1.5 ohms
+----|>|----|>|----|>|---/\/\/----+ R = 1.5 ohms
+----|>|----|>|----|>|---/\/\/----+ R = 1.5 ohms
+----|>|----|>|----|>|---/\/\/----+ R = 1.5 ohms
The wizard says: In solution 0:
each 1.5 ohm resistor dissipates 1500 mW
!! the wizard thinks the power dissipated in your resistors is a concern
together, all resistors dissipate 9000 mW
together, the diodes dissipate 63000 mW
total power dissipated by the array is 72000 mW
the array draws current of 6000 mA from the source.
So I would need at least 2 watt resistors. I share the wizard's concern that I am dissipating a LOT of heat through those resistors (not to mention wasting power). Heat is already a big concern with the LEDs - I certainly don't want to add to the problem, but 9 watts of resistor loss isn't TOO bad. 6 amps at 12 V should not be a problem for a 200W PC power supply... but what am I missing here? I still don't feel that this circuit, as-is, is as failsafe as it should be. I can (and would) add a fuse, but what if one of the LEDs fails? Would it take out the rest in a chain reaction? What if the input voltage suddenly jumps to 18 V? This is where I need the assistance of some of you who are more seasoned than I am!
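On the chain-reaction question, here is a rough worked case for one string, assuming one LED in a 3-series string fails as a dead short while the 1.5 ohm ballast resistor stays fixed (first-order numbers only; it ignores the fact that forward voltage rises at higher current):

# What happens to one string if a single LED fails as a dead short?
# Assumptions: 12 V rail, 3.5 V per healthy LED, 1.5 ohm ballast resistor per string.
V_SUPPLY = 12.0
VF = 3.5
R_BALLAST = 1.5

def string_current(healthy_leds):
    """Current through one string with the given number of healthy LEDs left."""
    v_across_r = V_SUPPLY - healthy_leds * VF
    return max(v_across_r, 0.0) / R_BALLAST

i_normal = string_current(3)    # the designed ~1.0 A
i_fault = string_current(2)     # one LED shorted, two healthy LEDs remain

print(f"Normal string current:        {i_normal:.2f} A")
print(f"With one LED failed short:    {i_fault:.2f} A")
print(f"Ballast resistor dissipation: {i_fault**2 * R_BALLAST:.1f} W")
# Roughly 3.3 A through the two surviving LEDs and ~17 W in a 2 W resistor,
# so a shorted LED very likely does cascade with plain resistor ballasting.
# An open failure is gentler: that string just goes dark, the others keep running.

A true per-string constant-current stage should hold that string at 1 A even with one LED shorted (the extra 3.5 V just gets dissipated in the regulator), which is exactly the fault tolerance I'm after.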
I appreciate any and all help you are willing to provide.
[ March 26, 2006, 03:45 AM: Message edited by: LTC ]