Many natural sounds have frequency spectra composed of integer multiples of a fundamental frequency. This property, known as harmonicity, plays an important role in auditory information processing. However, the extent to which harmonicity influences the processing of sound features beyond pitch remains unclear. This question is of particular interest because harmonic sounds have lower information entropy than inharmonic sounds. According to predictive processing accounts of perception, this property could produce more salient neural responses, given the brain’s weighting of sensory signals according to their uncertainty. In the present study, we used electroencephalography to investigate brain responses to harmonic and inharmonic sounds commonly occurring in music: piano tones and hi-hat cymbal sounds. In a multifeature oddball paradigm, we measured mismatch negativity (MMN) and P3a responses to timbre, intensity, and location deviants in listeners with and without congenital amusia—an impairment of pitch processing. As hypothesized, we observed larger amplitudes and earlier latencies (for both MMN and P3a) in harmonic compared with inharmonic sounds. These harmonicity effects were modulated by sound feature. Moreover, the difference in P3a latency between harmonic
and inharmonic sounds was larger for controls than amusics. We propose an explanation of these results based on predictive coding and discuss the relationship between harmonicity, information entropy, and precision weighting of prediction errors.