I suppose there would be a rough conversion between that value and a photon count (over some wavelength range). I just imagine the error bars being very wide.
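For illustration, here's a minimal sketch of that rough conversion, assuming a CCD with a known gain (e⁻/ADU) and a quantum efficiency at the band's effective wavelength. The gain and QE figures below are made-up placeholders, not real sensor specs; the QE spread across the band is what makes the error bars wide:

```python
# Rough ADU -> photon-count conversion for a single CCD pixel value.
# All numbers are illustrative assumptions, not real sensor specs.

ADU = 12000.0          # raw pixel value (counts)
GAIN = 1.5             # e-/ADU, hypothetical sensor gain
QE_NOMINAL = 0.60      # quantum efficiency at the band's effective wavelength
QE_SPREAD = 0.15       # QE variation across the wavelength range

electrons = ADU * GAIN                     # photoelectrons collected
photons_nominal = electrons / QE_NOMINAL   # nominal incident photons

# The QE variation across the band dominates the uncertainty:
photons_low = electrons / (QE_NOMINAL + QE_SPREAD)
photons_high = electrons / (QE_NOMINAL - QE_SPREAD)

print(f"~{photons_nominal:.0f} photons "
      f"(roughly {photons_low:.0f}-{photons_high:.0f} across the band)")
```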
Not really; it depends on the application.
Even 20 years ago, the CCDs available at the time could measure star magnitudes photometrically to a precision of 1/100 of a magnitude by using control stars as references.
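As a sketch of the control-star technique: measure instrumental magnitudes for the target and for control stars of known catalogue magnitude, then use the controls to solve for the photometric zero point. All fluxes and catalogue values below are invented; with enough photons and well-behaved control stars, the zero-point scatter can reach the ~0.01 mag level mentioned above:

```python
import numpy as np

# Instrumental fluxes (ADU) from hypothetical aperture photometry,
# plus catalogue magnitudes for the control stars. Values are made up.
control_flux = np.array([51200.0, 23100.0, 98400.0])
control_catalog_mag = np.array([11.32, 12.18, 10.61])
target_flux = 34500.0

# Instrumental magnitude: m_inst = -2.5 * log10(flux)
control_inst_mag = -2.5 * np.log10(control_flux)
target_inst_mag = -2.5 * np.log10(target_flux)

# Zero point from the control stars; its scatter estimates the precision.
zp = control_catalog_mag - control_inst_mag
zero_point = zp.mean()
zp_scatter = zp.std(ddof=1)

target_mag = target_inst_mag + zero_point
print(f"target: {target_mag:.3f} +/- {zp_scatter:.3f} mag (zero-point scatter)")
```

The key point is that atmospheric and instrumental variations largely cancel out, because the target and control stars are imaged through the same optics and sky at the same time; only the residual scatter between controls limits the precision.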