The original intensity interferometers were instruments built in the 1950s and 1960s by Hanbury Brown and collaborators, achieving milliarcsecond resolution in visible light without optical-quality mirrors. They exploited a then-novel physical effect, nowadays known as HBT correlation after the experiments of Hanbury Brown and Twiss, and considered fundamental in quantum optics. Now a new generation of intensity interferometers is being designed, raising the possibility of measuring intensity correlations with three or more detectors. Quantum optics predicts two interesting features in many-detector HBT: (i) the signal contains spatial information about the source (such as the bispectrum or closure phase) not present in standard two-detector HBT, and (ii) the correlation increases combinatorially with the number of detectors. The signal-to-noise ratio (SNR) depends crucially on the number of photons (in practice always ≪ 1) detected per coherence time. A simple SNR formula is derived for thermal sources, indicating that three-detector HBT is feasible for bright stars. The many-detector enhancement of HBT would be much more difficult to measure, but seems plausible for bright masers.
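The combinatorial growth mentioned in (ii) can be illustrated with a standard quantum-optics result (not derived in this abstract): for a single mode of chaotic (thermal) light, the instantaneous intensity is exponentially distributed, so the normalized zero-delay n-fold correlation ⟨I^n⟩/⟨I⟩^n equals n!. A minimal Monte Carlo sketch, assuming ideal single-mode thermal statistics:

```python
import numpy as np

# Illustrative sketch (not from the paper): thermal light in a single mode
# has exponentially distributed instantaneous intensity, so the normalized
# n-detector zero-delay correlation <I^n>/<I>^n equals n!.
rng = np.random.default_rng(0)
I = rng.exponential(scale=1.0, size=1_000_000)  # mean intensity <I> = 1

g2 = np.mean(I**2) / np.mean(I)**2  # two-detector HBT: expect 2! = 2
g3 = np.mean(I**3) / np.mean(I)**3  # three-detector HBT: expect 3! = 6
g4 = np.mean(I**4) / np.mean(I)**4  # four detectors: expect 4! = 24

print(g2, g3, g4)
```

The n! growth is the "many-detector enhancement" of the correlation signal; the practical difficulty noted above is that the noise grows too, since far fewer than one photon per coherence time is detected.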