The effects of radiofrequency (RF) exposure on the wake and sleep electroencephalogram (EEG) have been a focus of research since mobile phone usage became pervasive. It has been hypothesized that these effects may be explained by (1) enhanced induced fields due to RF coupling with the electrode assembly, (2) the subsequent temperature increase around the electrodes, or (3) RF-induced thermal pulsing caused by localized exposure in the head. We evaluated these three hypotheses by means of both numerical and experimental assessments made with appropriate phantoms and anatomical human models. Typical and worst-case electrode placements were examined at 900 and 2140 MHz. Our results indicate that hypothesis 1 can be rejected, as the induced fields cause a <20% increase in the 10 g-averaged specific absorption rate (SAR). Simulations with an anatomical model indicate that hypothesis 2 is also not supported, as the realistic worst-case electrode placement results in a maximum skin temperature increase of 0.31 °C while brain temperature elevations remain <0.1 °C. These local short-term temperature elevations are unlikely to change brain physiology during the period from minutes to several hours after exposure. The maximum observed temperature ripple due to RF pulses is <0.001 °C for GSM-like signals and <0.004 °C for 20-fold higher pulse energy, offering no support for hypothesis 3. Thus, the mechanism of interaction between RF exposure and changes in the EEG power spectrum remains unknown.
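The order of magnitude of the per-pulse temperature ripple can be illustrated with a simple adiabatic estimate, ΔT ≈ SAR_peak · t_pulse / c, where c is the specific heat capacity of tissue. The sketch below uses this approximation with assumed illustrative values (a 0.577 ms GSM-like burst, a hypothetical 16 W/kg peak local SAR, and c ≈ 3500 J/(kg·K)); these are not the exposure parameters or dosimetric results of the study itself.

```python
# Back-of-the-envelope estimate of the per-pulse temperature ripple from a
# pulsed RF signal, using the adiabatic approximation dT = SAR_peak * t / c.
# All numeric values are illustrative assumptions, not the study's parameters.

def pulse_temperature_rise(sar_peak_w_per_kg: float,
                           pulse_duration_s: float,
                           specific_heat_j_per_kg_k: float = 3500.0) -> float:
    """Adiabatic single-pulse temperature rise in kelvin (== degrees C)."""
    return sar_peak_w_per_kg * pulse_duration_s / specific_heat_j_per_kg_k

# GSM-like burst: ~0.577 ms pulse; assume 16 W/kg peak local SAR
# (8x duty-cycle factor applied to an assumed 2 W/kg time-averaged SAR).
dT = pulse_temperature_rise(16.0, 0.577e-3)
print(f"Per-pulse temperature rise: {dT:.2e} °C")
```

Even before conduction and perfusion carry heat away, this adiabatic upper bound lies orders of magnitude below the <0.001 °C ripple bound quoted above, consistent with the rejection of hypothesis 3.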