Let's just say Vf, or forward voltage, is either the typical voltage the LED will use or the maximum voltage it will use in a circuit. The driver you want is one that will supply
at least the maximum forward voltage of your circuit. Let's say you want to run 10 of them in series. You don't want to run those Philips at anything over 900 mA, and even that isn't a great idea, so in your case, a constant-current situation in other words, the driver you want is a 900 mA constant-current driver that supplies a minimum of 25 volts (2.5 V Vf x 10 LEDs) and can deliver the wattage needed to run the circuit: at least 22.5 watts (0.9 A x 25 V), so call it 23 watts. Now the closer the driver is run to its maximum wattage, or to the bare minimum voltage the circuit needs, the more stressful it is for the driver and the more heat it produces, so in this case we want something around 30 watts. It's really pretty
simple, actually; it just doesn't make sense at first. The LEDs take the voltage they need, so make sure the driver supplies the maximum voltage they will draw. Then you just have to make sure the driver's current isn't greater than the individual LED's maximum current rating, and the driver will be able to supply the watts needed, since watts = volts x amps.
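If it helps, here's the same arithmetic as a quick Python sketch. The numbers (2.5 V Vf, 10 LEDs, 900 mA) are the ones from above; the 25% headroom is just my guess at the margin behind "around 30 watts":

```python
# Sanity-check the driver math above. Assumed values: 2.5 V typical Vf
# per LED, 10 LEDs wired in series, 900 mA constant drive current.
vf_per_led = 2.5      # forward voltage per LED, volts
num_leds = 10         # LEDs in series
current_a = 0.9       # drive current, amps (stay under the LED's max)
headroom = 1.25       # ~25% margin so the driver isn't run at its limit

min_voltage = vf_per_led * num_leds        # 25.0 V minimum output
min_watts = min_voltage * current_a        # 22.5 W actually consumed
recommended_watts = min_watts * headroom   # ~28 W, so pick a ~30 W driver

print(f"Driver needs: >= {min_voltage:.1f} V at {current_a} A "
      f"({min_watts:.1f} W); recommend ~{recommended_watts:.0f} W rating")
```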
Did that help? And someone check my math. Please.