It's a good question. CRI70 is more efficient than CRI80, which is more efficient than CRI90. But not by as much as you might think.
It all has to do with the phosphors used on each LED, and with "efficacy" vs "efficiency" - which I will explain.
Most LEDs emit blue light around the 450nm range. Some of that blue light is absorbed by the phosphor (the yellow or orange coloured coating on the LED) and re-emitted as red and green light. Some light energy is lost (or rather, converted to heat) when this happens. Blue light is shorter in wavelength than green and red, so each blue photon carries more energy (Planck's relation: energy is proportional to frequency).
So the more blue light you convert to red and green, the more energy is lost. But, if you convert that same blue light to a lower frequency red (620nm vs 590nm, for example), then even more energy is lost in the conversion.
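To put rough numbers on that, here's a quick Python sketch using the Planck-Einstein relation (E = hc/λ). It assumes the only loss is the wavelength shift itself - real phosphors waste a little more than this:

```python
# Energy lost when a phosphor converts a 450 nm blue photon
# into a longer-wavelength (lower-frequency) photon.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def photon_energy(wavelength_nm: float) -> float:
    """Energy of a single photon, in joules (E = hc / wavelength)."""
    return H * C / (wavelength_nm * 1e-9)

BLUE = 450.0
for red in (590.0, 620.0):
    # Fraction of each converted photon's energy turned into heat
    loss = 1.0 - photon_energy(red) / photon_energy(BLUE)  # = 1 - 450/red
    print(f"450 nm -> {red:.0f} nm: {loss:.1%} lost as heat")
```

Converting to 590nm throws away about 24% of each photon's energy; converting to 620nm throws away about 27%.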
Here are some spectrum charts. Have a look at where the peaks are. Note the red peaks between 590nm and 620nm, but also look at the blue (around 450nm) and green (around 530nm) peaks:
[Spectrum chart: CRI70]
[Spectrum chart: CRI80]
[Spectrum chart: CRI90]
As the CRI increases, more of the blue light is converted to green and red. The red peak also shifts to lower frequencies: from 595nm in CRI70, to 605nm in CRI80, to 615nm in CRI90. More light energy is lost at each step (by the sketch above, converting a 450nm photon to 595/605/615nm wastes roughly 24%, 26% and 27% of its energy respectively).
School physics taught you energy can neither be created nor destroyed, so the energy isn't actually "lost", it is just converted into other forms of energy, such as radiant heat.
OK, so "efficiency" is defined as the amount of electrical energy that goes into an LED vs the amount of light energy (radiant energy) that comes out.
"Efficacy" refers to how efficiently the LED turns that energy into light the human eye can see, which we measure in "lumens" (lumens out per watt in). Lumens are weighted towards the green part of the spectrum, which is what the human eye sees best.
So it is possible to have an LED with higher efficiency but lower efficacy: it converts more of its input energy into light, but that light appears dimmer to the human eye. The more green light an LED emits, the brighter it appears for the same amount of radiant energy.
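A toy calculation makes the efficiency/efficacy split concrete. The Gaussian below is a crude stand-in I'm using for the CIE photopic curve (the real curve is tabulated data), so treat the exact lumen figures as illustrative only:

```python
import math

def photopic(wavelength_nm: float) -> float:
    """Rough human eye sensitivity: assumed Gaussian, peak 1.0 at 555 nm.
    The real CIE photopic curve is tabulated, not Gaussian."""
    return math.exp(-0.5 * ((wavelength_nm - 555.0) / 42.0) ** 2)

def lumens(radiant_watts: float, wavelength_nm: float) -> float:
    # 683 lm/W is the defined maximum luminous efficacy, at 555 nm
    return 683.0 * photopic(wavelength_nm) * radiant_watts

# Same 1 W of radiant power (identical "efficiency"), very different lumens
for nm in (555, 530, 620, 660):
    print(f"1 W at {nm} nm -> {lumens(1.0, nm):.0f} lm")
```

One radiant watt of green scores hundreds of lumens, while the same watt of 660nm red scores only a few dozen. Identical efficiency, very different efficacy.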
Plants are not humans, so "efficacy" does not matter so much to them (plants have their own sensitivity to light).
Why am I taking so long to get to the point?
Because once you know why CRI80 LEDs are more efficient than CRI90 LEDs, you can understand why they might be preferable to use.
Or are they?
This is another good question.
Just like humans, plants are more responsive to certain wavelengths. You may have seen the McCree and other curves (below):
[Chart: McCree curve and other plant sensitivity curves]
There is a whole other argument here about plant responsiveness, but to keep things simple, all you need to know is that - just like the human eye - plants use certain wavelengths more efficiently than others.
So, a CRI80 LED might be more efficient at converting blue light to red and green (which, together with the remaining blue, form white light), but a CRI90 LED might be better at converting that blue light into wavelengths the plant can use - for example 620-660nm, which produces a very strong photosynthetic response in plants, especially during flowering.
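To make "wavelengths the plant can use" concrete, here's a sketch that scores two simplified spectra against a plant response curve. Every number in it is a made-up placeholder for illustration - for real work you would plug in McCree's published tables and actual measured spectra:

```python
# Made-up, illustrative numbers only - NOT McCree's published data.
# Fraction of radiant power in each coarse wavelength band:
SPECTRUM_CRI80 = {450: 0.25, 530: 0.30, 605: 0.45}
SPECTRUM_CRI90 = {450: 0.20, 530: 0.30, 615: 0.50}

# Toy relative plant response per band (placeholder values):
PLANT_RESPONSE = {450: 0.70, 530: 0.80, 605: 0.95, 615: 1.00}

def plant_weighted(spectrum: dict) -> float:
    """Sum of power fractions weighted by the toy response curve."""
    return sum(frac * PLANT_RESPONSE[nm] for nm, frac in spectrum.items())

print(f"CRI80 plant-weighted score: {plant_weighted(SPECTRUM_CRI80):.2f}")
print(f"CRI90 plant-weighted score: {plant_weighted(SPECTRUM_CRI90):.2f}")
```

With these toy numbers the CRI90 spectrum scores slightly higher for the plant even though (as above) it is less efficient at turning electricity into light - which is exactly the trade-off in question.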
To get around this trade-off, a horticultural LED manufacturer might use CRI80 LEDs as an efficient source of blue and green light, but add single-colour red LEDs (monos) to boost red output. Because a red mono emits its red light directly - no phosphor conversion, no Stokes loss - it is also a very efficient LED.
Overall, efficiency is improved, and so is efficacy (at least in plant terms).
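There's a second reason red monos punch above their weight: horticultural output is rated in photons (µmol/s), and a joule of red light simply contains more photons than a joule of blue, because each red photon carries less energy. Same Planck arithmetic as before:

```python
AVOGADRO = 6.022e23            # photons per mole
H, C = 6.626e-34, 2.998e8      # Planck constant, speed of light

def umol_per_joule(wavelength_nm: float) -> float:
    """Photon flux delivered per radiant joule at a single wavelength."""
    photon_energy = H * C / (wavelength_nm * 1e-9)   # joules per photon
    return 1e6 / (photon_energy * AVOGADRO)          # umol per joule

for nm in (450, 530, 595, 660):
    print(f"{nm} nm: {umol_per_joule(nm):.2f} umol per radiant joule")
```

A radiant watt at 660nm delivers about 47% more photons than the same watt at 450nm, which is part of why deep red monos look so good on a µmol/J spec sheet.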
The real argument comes down to this: what is the most efficient spectrum for plants? And what is the most efficient way of producing that spectrum?
This is where horticultural LED manufacturers may differ in opinion. There's no doubt that the "gold standard" is sunlight. But sunlight changes at different times of day, during different weather events, at different times of the year, in different parts of the world (latitudes) and even at different altitudes.
And plants appear to use all of it. Even green light, which is mostly reflected, plays an important part in regulating plant health and aiding photosynthesis: it reflects into the lower canopy, penetrates deeper into leaf tissue, and helps other pigments absorb other wavelengths, such as red.
So perhaps we could argue that the best plant spectrum is the "fullest", most even spectrum - just like sunlight - but weighted more towards the red end, where plants appear to photosynthesise and flower best.
That argument, I am sure, you can read and participate in elsewhere on this site.
In my opinion, there are good reasons to use high-CRI LEDs: they already produce a lot more light that is useful to plants (more red, for example) and a fuller spectrum (more cyan), even if their overall efficiency is a little behind CRI80. The other advantage is that high colour rendering lets you see your plants in their "true light", which makes picking up deficiencies and other problems easier.
I will touch on one other argument: that a CRI80 LED around 3000K has a vaguely similar spectrum to HPS light - which contains a lot of yellow, and which cannabis has been selectively bred under for the past 40+ years.
Some argue that today's cannabis responds better to more yellow light for this reason. I would argue that we have seen a better response by adding more red and a limited amount of blue (10-15%) and green. Nearly all the scientific results I have read support this. But we are constantly learning, so even the science is evolving.