Abstract
We present systematic measurements of the temperature dependence of the detection efficiency in TaN and NbN superconducting nanowire single-photon detectors. We observe a clear increase of the cut-off wavelength with decreasing temperature, which we can qualitatively describe with a temperature-dependent diffusion coefficient of the quasi-particles created after photon absorption. Furthermore, the detection efficiencies at wavelengths shorter and longer than the cut-off wavelength exhibit distinct temperature dependences. The underlying causes and possible consequences for microscopic detection models are discussed.