Near-infrared spectroscopy (NIRS) is widely used in biomedical optics, with applications ranging from basic science, such as functional neuroimaging, to clinical practice, such as pulse oximetry. Despite the relatively low absorption of tissue in the near-infrared, the highly scattering nature of tissue still produces significant optical attenuation. Designers of NIRS systems therefore have to balance source optical power against source–detector separation to maximize the signal-to-noise ratio (SNR). However, theoretical estimates of SNR neglect the effects of speckle. Speckle manifests as fluctuations of the optical power received at the detector, caused by interference among the multiple random paths that photons take through tissue. We present a model of the NIRS SNR that includes the effects of speckle, and we validated it experimentally with a NIRS system. Additionally, we performed computer simulations based on the model to estimate the contribution of speckle noise for different collection areas and source–detector separations. We show that at short source–detector separations, speckle contributes most of the noise when long-coherence-length sources are used. Accounting for this additional noise is especially important for hybrid applications that use NIRS and speckle contrast simultaneously, such as diffuse correlation spectroscopy.
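The dependence of speckle noise on collection area can be illustrated with a minimal sketch based on classic fully developed speckle statistics, where the relative intensity fluctuation scales as 1/sqrt(M) for M independent speckle grains averaged by the detector. This is a toy illustration under that textbook assumption, not the paper's SNR model; the function name and areas are hypothetical.

```python
import math

def speckle_contrast(detector_area_mm2, speckle_grain_area_mm2):
    """Relative intensity fluctuation due to speckle, assuming the detector
    averages M independent, fully developed speckle grains.

    Toy model: contrast = 1/sqrt(M), with M approximated as the ratio of the
    collection area to the area of a single speckle grain (at least 1).
    """
    m_modes = max(detector_area_mm2 / speckle_grain_area_mm2, 1.0)
    return 1.0 / math.sqrt(m_modes)

# Larger collection areas average more speckle grains, so speckle noise falls:
grain = 0.01  # assumed speckle grain area, mm^2
for area in (0.01, 0.1, 1.0):
    print(f"area={area:5.2f} mm^2  speckle contrast={speckle_contrast(area, grain):.3f}")
```

Under this assumption, a detector comparable in size to one speckle grain sees order-unity speckle fluctuations, which is why speckle can dominate over shot noise at short source–detector separations with long-coherence-length sources.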