The food industry faces two conflicting demands: on the one hand, foods must be microbiologically safe for consumption; on the other hand, consumers want fresh, minimally processed foods. Meeting both demands requires deeper insight into the mechanisms of microbial growth, including the microbial lag phase. This is the time bacterial cells need to adapt to a new environment (for example, after contamination of a food product) before entering the exponential growth regime. Since food products are often contaminated with low numbers of pathogenic microorganisms, knowing the distribution of these individual cell lag times is essential for accurate food safety predictions. More precisely, the cells with the shortest lag times (i.e., those appearing in the left tail of the distribution) largely determine the outgrowth of the population. In this study, an integrated modeling approach is proposed and applied to an existing data set of individual cell lag time measurements of Listeria monocytogenes. In a first step, a logistic modeling approach is applied to predict the fraction of zero-lag cells (which start growing immediately) as a function of temperature, pH, and water activity. For the nonzero-lag cells, the mean and variance of the lag time distribution are modeled with a hyperbolic-type model structure. These two moments identify the parameters of a two-parameter Weibull distribution representing the nonzero-lag cell lag time distribution. Integrating the developed models yields a predicted global distribution of individual cell lag times for any combination of environmental conditions within the interpolation domain of the original temperature, pH, and water activity settings. The global fitting quality of the model is quantified using several measures, which indicate that the model gives accurate predictions, erring slightly on the fail-safe side when predicting the shortest lag times.
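The two building blocks described above, a predicted zero-lag fraction and a Weibull distribution identified from the modeled mean and variance, can be combined into a single global lag time distribution. The following is an illustrative Python sketch of that composition, not the study's actual code: the function names are hypothetical, and the moment matching is done here with a simple bisection on the Weibull shape parameter, assuming the standard moment formulas mean = λΓ(1 + 1/k) and variance = λ²[Γ(1 + 2/k) − Γ(1 + 1/k)²].

```python
import math

def weibull_params_from_moments(mean, var):
    """Recover the shape k and scale lam of a two-parameter Weibull
    distribution from its mean and variance (moment matching).

    Uses the fact that the squared coefficient of variation,
    Gamma(1 + 2/k) / Gamma(1 + 1/k)**2 - 1, decreases monotonically
    in k, so a bisection on k suffices.
    """
    cv2 = var / mean**2  # target squared coefficient of variation

    def excess_cv2(k):
        g1 = math.gamma(1 + 1 / k)
        g2 = math.gamma(1 + 2 / k)
        return g2 / g1**2 - 1 - cv2

    lo, hi = 0.1, 50.0  # assumed bracket for the shape parameter
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if excess_cv2(mid) > 0:   # CV still too large -> increase k
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    lam = mean / math.gamma(1 + 1 / k)
    return k, lam

def global_lag_cdf(t, p_zero, k, lam):
    """Mixture CDF of the global lag time distribution: a point mass
    p_zero at t = 0 (zero-lag cells) plus a (1 - p_zero) weighted
    Weibull CDF for the nonzero-lag cells."""
    if t < 0:
        return 0.0
    return p_zero + (1 - p_zero) * (1 - math.exp(-((t / lam) ** k)))
```

In practice, `mean`, `var`, and `p_zero` would come from the hyperbolic-type and logistic models evaluated at a given temperature, pH, and water activity; the sketch only shows how those outputs assemble into one distribution.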