We study the performance of the classical relation for correcting the signal of a hot-wire anemometer for ambient-temperature drift, and the influence of practical assumptions. It is shown that most methods for estimating the operational temperature via the temperature/resistance coefficient underestimate the operational temperature and thus overcorrect signals for temperature drift. We found that, in the presence of a sensible heat flow, temperature fluctuations cannot be sufficiently removed from the hot-wire signal when one relies on temperature/resistance coefficients from the literature. When only slow temperature drift is involved, most literature values give a satisfactory temperature correction, but this depends on the specific combination of probe and literature reference; it is therefore generally advisable to calibrate the coefficient. A method that uses a ratio of (measured) resistances as a function of temperature, and thus does not require estimating the operational temperature of the wire, is shown to depend crucially on a parasitic resistance of a few tenths of an ohm. This parasitic resistance can be found by optimizing its value over data from a collection of velocity calibrations at different temperatures, and this additional calibration alone suffices to estimate the operational temperature of the wire. A quick (15 min) calibration procedure is proposed and tested.
- fluid temperature
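To make the correction discussed above concrete, the following is a minimal sketch, not the paper's method: it assumes the common linear resistance/temperature relation R_w = R_0 (1 + α(T_w − T_0)) to estimate the operational wire temperature T_w, and the classical drift correction E_corr² = E² (T_w − T_ref)/(T_w − T_a). All numerical values (α, overheat ratio, temperatures) are illustrative assumptions.

```python
import math

def wire_temperature(R_w, R_0, T_0, alpha):
    """Estimate the operational wire temperature T_w from the cold
    resistance R_0 at temperature T_0, the hot resistance R_w, and the
    temperature/resistance coefficient alpha (assumed linear relation):
        R_w = R_0 * (1 + alpha * (T_w - T_0))
    """
    return T_0 + (R_w / R_0 - 1.0) / alpha

def correct_voltage(E, T_a, T_ref, T_w):
    """Rescale a bridge voltage E measured at ambient temperature T_a to
    the reference temperature T_ref of the velocity calibration, using the
    classical relation  E_corr^2 = E^2 * (T_w - T_ref) / (T_w - T_a)."""
    return E * math.sqrt((T_w - T_ref) / (T_w - T_a))

# Illustrative numbers (assumptions, not from the paper): a literature
# value alpha = 3.5e-3 /K, overheat ratio R_w/R_0 = 1.8, velocity
# calibration at 20 C, measurement at 25 C.
T_w = wire_temperature(R_w=1.8 * 5.0, R_0=5.0, T_0=20.0, alpha=3.5e-3)
E_corr = correct_voltage(E=2.0, T_a=25.0, T_ref=20.0, T_w=T_w)
```

Note that an error in α propagates directly into T_w and hence into the correction factor, which is the sensitivity the abstract describes: an underestimated T_w makes the denominator (T_w − T_a) too small relative to (T_w − T_ref) and overcorrects the signal.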