A TDS meter is really just an electrical conductivity (EC) meter with a built-in conversion factor that displays the reading in parts per million (ppm) of total dissolved solids (TDS). The trouble is that the relationship between a solution's conductivity and its dissolved-solids content depends not only on the concentration of the dissolved ions, but also on the charge and mobility of the ionic species involved.
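To make that concrete, here is a minimal sketch of what a TDS meter does internally, assuming nothing more than "EC times a fixed factor". The 0.5 and 0.7 factors below are commonly quoted "NaCl scale" and "442 scale" values used purely for illustration; I'm not claiming either of my meters uses exactly these numbers.

# Minimal sketch: measure EC, then multiply by a fixed built-in factor
# to get the displayed "ppm TDS". Factors shown are illustrative only.
def tds_ppm(ec_us_per_cm, factor):
    """Convert conductivity in microsiemens/cm to displayed ppm TDS."""
    return ec_us_per_cm * factor

ec = 800.0  # hypothetical measured conductivity, in uS/cm

print(tds_ppm(ec, 0.5))  # a meter built around a 0.5 factor displays 400.0
print(tds_ppm(ec, 0.7))  # a meter built around a 0.7 factor displays 560.0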
As a very simplified explanation of that, imagine a small ion and a large ion carrying the same electrical charge. The small ion moves through the solution more easily, so it "conducts" that charge faster and gives a higher EC for the same concentration (TDS). Likewise, of two ions of the same size, the one carrying the higher charge will show the higher EC.
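For anyone who wants to see that written down, the same point can be captured in a toy calculation. The relation used here (conductivity as the Faraday constant times the sum of concentration x |charge| x mobility over all ions) is the standard textbook one; the ion values are made up for illustration.

# Toy model: each ion's contribution to conductivity scales with its
# concentration, its charge, and its mobility.
FARADAY = 96485.0  # C/mol

def conductivity(ions):
    """ions: list of (concentration mol/m^3, charge number, mobility m^2/(V*s))."""
    return FARADAY * sum(c * abs(z) * u for c, z, u in ions)

# Same concentration and charge, but the second ion is smaller, hence more mobile:
print(conductivity([(1.0, 1, 5e-8)]))  # bigger, slower ion
print(conductivity([(1.0, 1, 8e-8)]))  # smaller, faster ion -> higher EC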
A commercial fertilizer can be made up of dozens of different chemicals, each of which ionizes and contributes to the EC of the solution. Different brands of fertilizer can also use different chemicals to make up the total formula. With all of that variability, how can a single "constant" conversion factor be valid?
I own two TDS meters: a Hanna TDS1 and a generic TDShm. Placed in the same solution, they give different apparent TDS readings.
...
I typically shoot for 100-125 ppm N in my fertilizer solution, so I simply use my TDS meters as a check, knowing that the Hanna TDS1 should read between 380 and 475 ppm TDS, while the TDShm should be in the 470 to 590 range.
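Those check ranges just come from multiplying the N target by a meter-specific ratio. The ratios below (about 3.8 ppm TDS per ppm N on the Hanna and about 4.7 on the TDShm) are simply what the numbers above work out to for my fertilizer and my two meters; they are not universal constants and will differ with a different formula or meter.

# How the check ranges fall out of the N target for my particular setup.
N_TARGET = (100, 125)   # ppm nitrogen I aim for in the feed solution

METER_RATIO = {         # displayed ppm TDS per ppm N, inferred from the ranges above
    "Hanna TDS1": 3.8,
    "TDShm": 4.7,
}

for meter, ratio in METER_RATIO.items():
    low, high = (n * ratio for n in N_TARGET)
    print(f"{meter}: expect roughly {low:.0f}-{high:.0f} ppm TDS")
# Reproduces, to within rounding, the 380-475 and 470-590 check ranges above.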