Signal-to-noise ratio (imaging)

Signal-to-noise ratio (SNR) is used in imaging to characterize image quality. The sensitivity of a (digital or film) imaging system is typically described in terms of the signal level that yields a threshold level of SNR.

Industry standards define sensitivity in terms of the ISO film speed equivalent, using SNR thresholds (at average scene luminance) of 40:1 for "excellent" image quality and 10:1 for "acceptable" image quality.<ref>ISO 12232:1998 Photography – Electronic Still Picture Cameras – Determining ISO Speed</ref>

SNR is sometimes quantified in decibels (dB) of signal power relative to noise power, though in the imaging field "power" is sometimes taken to mean the power of a voltage signal proportional to the optical power; a 20 dB SNR may therefore correspond to an optical power ratio of either 10:1 or 100:1, depending on which definition is in use.
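
As a worked illustration of the two conventions (these are the standard decibel formulas, stated here for clarity rather than taken from the ISO standard):

<math> \mathrm{SNR_{dB}} = 10 \log_{10}\!\left(\frac{P_\mathrm{sig}}{P_\mathrm{noise}}\right) = 20 \log_{10}\!\left(\frac{A_\mathrm{sig}}{A_\mathrm{noise}}\right) </math>

where <math>P</math> denotes a power quantity and <math>A</math> the corresponding amplitude (field) quantity, with <math>P \propto A^2</math>. Treated as power ratios, the 40:1 and 10:1 thresholds quoted above correspond to roughly 16 dB and 10 dB; treated as amplitude ratios, they correspond to roughly 32 dB and 20 dB.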

Definition of SNR

Traditionally, SNR is defined to be the ratio of the average signal value <math>\mu_\mathrm{sig}</math> to the standard deviation of the signal <math>\sigma_\mathrm{sig}</math>:<ref>Janesick, James R. (2007). Photon Transfer. doi:10.1117/3.725073. ISBN 978-0-8194-7838-2.</ref><ref>Rowlands, Andy (April 2017). Physics of Digital Photography. IOP Publishing. ISBN 978-0-7503-1243-1.</ref>

<math> \mathrm{SNR} = \frac{\mu_\mathrm{sig}}{\sigma_\mathrm{sig}}</math>

when the signal is an optical intensity, or as the square of this value if the signal and noise are viewed as amplitudes (field quantities), since power is proportional to the square of amplitude.
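
A minimal sketch of how this definition is commonly estimated in practice, assuming a nominally uniform image patch so that pixel-to-pixel variation is dominated by noise (the synthetic Poisson data below is purely illustrative, not part of any standard):

<syntaxhighlight lang="python">
import numpy as np

# Illustrative only: estimate SNR = mu_sig / sigma_sig from a nominally
# uniform (flat-field) patch, where the spread of pixel values is
# assumed to be due to noise alone.
rng = np.random.default_rng(0)

# Hypothetical flat patch: mean signal of 1000 electrons with Poisson
# (shot-noise-limited) statistics.
patch = rng.poisson(lam=1000, size=(64, 64)).astype(float)

mu_sig = patch.mean()           # average signal value
sigma_sig = patch.std(ddof=1)   # standard deviation taken as the noise

snr = mu_sig / sigma_sig
print(f"SNR ~ {snr:.1f}:1")     # shot-noise-limited: roughly sqrt(1000) = 31.6
</syntaxhighlight>

In this shot-noise-limited example the measured ratio approaches <math>\sqrt{\mu_\mathrm{sig}}</math>, illustrating why the intensity-based SNR grows with signal level.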

References

<references group="" responsive="1"></references>
