Abiqua
Well-Known Member
So due to atmospheric scattering, not even sunlight is an "ideal" light source; it's simply very close to one (of varying CCT depending on time of day, the date, and latitude). You are correct that human sensitivity is taken into account with CRI (UV and IR do not play very substantial roles). However, it has to be said that the closer a light source's SPD (human sensitivity not accounted for) follows that of a blackbody radiator of any CCT — and not just at the peaks, but across the entire visible spectrum — the higher it will score on a CRI test. It's not a perfect system, but it gives you a good idea of what's going on.
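To make the blackbody comparison concrete, here's a rough Python sketch using Planck's law. The 2700K value and the 10 nm wavelength grid are just illustrative choices, not anything from a standard — the point is only to show the reference SPD *shape* a warm-white source would be judged against:

```python
import math

# Physical constants (SI units)
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def blackbody_spd(wavelength_nm, temp_k):
    """Spectral radiance of a blackbody (Planck's law). Absolute units don't
    matter when you only compare SPD *shape* against a test source."""
    lam = wavelength_nm * 1e-9  # nm -> m
    return (2 * H * C**2 / lam**5) / (math.exp(H * C / (lam * KB * temp_k)) - 1)

# Relative SPD at 2700K across the visible range, normalized to its max
temp = 2700
wavelengths = range(380, 781, 10)
raw = [blackbody_spd(w, temp) for w in wavelengths]
peak = max(raw)
spd = {w: v / peak for w, v in zip(wavelengths, raw)}

# At 2700K, radiance keeps rising toward red within the visible band,
# since Wien's peak sits out in the IR (~1073 nm at this temperature)
print(f"450 nm: {spd[450]:.3f}  630 nm: {spd[630]:.3f}  780 nm: {spd[780]:.3f}")
```

Normalize a lamp's measured SPD the same way and overlay the two curves, and you're looking at more or less the comparison the CRI reference makes for warm CCTs.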
Also, the sun's light quality has little to do with why we want high CRI/R9. If we were to compare it to anything, we'd be attempting to recreate the light a plant would see on a planet orbiting a ~2700K star, right? Regardless, R9 tells me how much red I can expect for a given CCT. So the two values together (CCT and R9) roughly tell me two things: the light's stand-alone flowering potential, and its inefficiency due to red phosphor conversion.
If I'm not mistaken you are correct, except that it is only accounting for certain nm peaks, and even then not at any particular intensity? I agree, though, that as a baseline amid all this craziness it could be acceptable. I still think every manufacturer just needs some kind of intensity rating at each nm of PAR. Fuck it, you want my money, ante up, ha ha.....
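On the "intensity at each nm of PAR" point: if a manufacturer did publish a per-nm SPD in watts, converting it into what a plant actually responds to (photon flux, not radiant power) is simple, since photon energy goes as 1/λ. A minimal sketch, with made-up SPD data — the flat spectrum here is purely hypothetical, just to show the weighting:

```python
# Hypothetical per-nm SPD (relative watts per nm) -- made-up flat data,
# only there to demonstrate the conversion
spd_watts = {w: 1.0 for w in range(400, 701)}

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def relative_photon_flux(spd):
    """Weight each nm's radiant power by wavelength: photons per second
    is proportional to power * lambda, because E_photon = h*c / lambda."""
    return {w: p * (w * 1e-9) / (H * C) for w, p in spd.items()}

flux = relative_photon_flux(spd_watts)
# For equal watts, 700 nm delivers 700/400 = 1.75x the photons of 400 nm
print(flux[700] / flux[400])
```

This is also one reason red output matters so much per watt: each red photon costs less energy than a blue one, so watts spent at the red end buy more photons within PAR.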