Originally Posted by pooal
If you have say 5 x 6v LEDs rated at 100mA each and wire them in parallel across a 6v supply, each LED sees the full 6v and draws 100mA, so the source has to deliver 6v at 500mA, which is 3w total. Any power beyond what the LEDs are rated for mostly just gets dissipated as waste heat, which means pulling extra power from the source for nothing.
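A quick sanity check of those parallel numbers, as a sketch (the five identical 6v 100mA LEDs are just the example figures from above):

```python
# Parallel string: every LED sees the full supply voltage,
# and the branch currents add up at the supply.
v_supply = 6.0      # volts across each parallel LED
i_per_led = 0.100   # amps drawn by each LED at its rating
n_leds = 5

i_total = n_leds * i_per_led   # currents add in parallel
p_total = v_supply * i_total   # total power drawn from the source

print(f"{i_total:.1f} A, {p_total:.1f} W")  # → 0.5 A, 3.0 W
```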
Measuring the wasted heat is easy: measure the voltage across the LED string and the current through it, then multiply the two to get your watts. Only a fraction of that electrical power leaves as light; the rest is dissipated as heat in the LEDs. So you could use that, Chris, to work out roughly how much heat your globes are actually putting off.
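Here's a sketch of that watts-to-heat estimate. The efficiency figure is an assumption for illustration only; real LEDs vary a lot, but converting roughly 20-40% of input power to light (the rest to heat) is a common ballpark:

```python
# Estimate heat output from a measured voltage and current.
v_measured = 6.0    # volts measured across the LED string
i_measured = 0.5    # amps measured through it
efficiency = 0.30   # assumed fraction emitted as light (hypothetical figure)

p_electrical = v_measured * i_measured      # total watts drawn
p_heat = p_electrical * (1.0 - efficiency)  # watts dissipated as heat

print(f"{p_electrical:.2f} W in, ~{p_heat:.2f} W as heat")  # → 3.00 W in, ~2.10 W as heat
```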
One thing to be clear on: in a series string the current going into an LED is exactly the same as the current coming out of it, current doesn't get "used up" along the chain. What divides between the LEDs is the voltage, so by the 5th LED the supply has to provide the sum of all the forward voltages, but still only one LED's worth of current (100mA here). In parallel it's the opposite: every LED sees the full supply voltage and the branch currents add, so the supply has to deliver 500mA. That's why parallel suits a low-voltage supply that can deliver plenty of current, like the ones feeding power hungry devices such as speakers, subwoofers etc.. but in your case use series, it will draw less current (not voltage bro..common mistake) :up:
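To make the series-vs-parallel difference concrete, here's a sketch comparing what the supply has to deliver in each case, using the same hypothetical five 6v 100mA LEDs from above:

```python
# Series vs parallel: same five LEDs, very different supply demands.
v_forward = 6.0   # forward voltage of one LED
i_rated = 0.100   # rated current of one LED
n = 5

# Series: one current path, so supply current equals the LED current,
# but the forward voltages add up.
series_i = i_rated
series_v = n * v_forward

# Parallel: every LED sees the supply voltage, branch currents add.
parallel_i = n * i_rated
parallel_v = v_forward

print(f"series:   {series_v:.0f} V @ {series_i * 1000:.0f} mA")
print(f"parallel: {parallel_v:.0f} V @ {parallel_i * 1000:.0f} mA")
# → series:   30 V @ 100 mA
# → parallel: 6 V @ 500 mA
```

Note the power delivered to the LEDs is the same 3 W either way (30 V × 0.1 A = 6 V × 0.5 A); the choice is about whether your supply is better at providing voltage or current.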