A biophysically inspired signal-processing model of the human cochlea is deployed to simulate the effects of specific noise-induced inner hair cell (IHC) and outer hair cell (OHC) lesions on hearing thresholds, cochlear compression, and the spectral and temporal features of auditory nerve (AN) coding. The model predictions are evaluated against corresponding data from animal studies as well as human clinical observations. Hearing thresholds are simulated for specific patterns of OHC and IHC damage, and cochlear nonlinearity is assessed at 0.5 and 4 kHz. Tuning curves are estimated at 1 kHz, and the model distinguishes the contributions of OHC and IHC pathologies to the tuning curve. Furthermore, the phase locking of AN spikes is simulated in quiet and in the presence of noise. The model predicts that phase locking deteriorates drastically in noise, indicating the disruptive effect of background noise on temporal coding in hearing-impaired listeners. Moreover, the paper presents an example in which the model is configured inversely for diagnostic purposes using a derivative-free optimization technique (the Nelder–Mead simplex method). The model thereby identifies a specific pattern of OHC lesions that reproduces the audiometric hearing loss measured in a group of humans with noise-induced hearing impairment.
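The inverse-configuration step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the frequencies, measured thresholds, and the `predicted_loss` function are all hypothetical stand-ins (the real procedure would run the full biophysical cochlear model inside the cost function), and the Nelder–Mead search is performed here with `scipy.optimize.minimize`.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical audiogram: frequencies (kHz) and measured hearing loss (dB HL).
freqs = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
measured_loss = np.array([10.0, 15.0, 25.0, 45.0, 40.0])

def predicted_loss(ohc_loss):
    """Toy stand-in for the cochlear model: maps a per-frequency OHC
    lesion parameter (0 = intact, 1 = fully lesioned) to a threshold
    shift in dB. The real model would run the full simulation here;
    the 60 dB ceiling is an illustrative assumption."""
    return 60.0 * ohc_loss

def cost(ohc_loss):
    # Sum-of-squares mismatch between simulated and measured audiograms.
    return np.sum((predicted_loss(ohc_loss) - measured_loss) ** 2)

# Nelder-Mead is derivative-free, so the (possibly non-differentiable)
# cochlear model can be treated as a black box.
res = minimize(cost, x0=np.full(len(freqs), 0.5), method="Nelder-Mead",
               options={"xatol": 1e-6, "fatol": 1e-6, "maxiter": 5000})
fitted_ohc_pattern = res.x  # estimated OHC lesion profile across frequency
```

The fitted vector `fitted_ohc_pattern` plays the role of the diagnostic output: a frequency-specific OHC lesion profile whose simulated audiogram matches the measured one.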