Journal of Physical Chemistry B, 115 (20), 6732–6739
It has recently been shown that in some DNA microarrays the time needed to reach thermal equilibrium may far exceed the typical experimental time, which is about 15 h in standard protocols (Hooyberghs et al. Phys. Rev. E 2010, 81, 012901). In this paper we discuss how this breakdown of thermodynamic equilibrium could be detected in microarray experiments without resorting to real-time hybridization data, which are difficult to obtain under standard experimental conditions. The method is based on the analysis of the distribution of fluorescence intensities I from different spots for probes carrying base mismatches. In thermal equilibrium and at sufficiently low concentrations, log I is expected to be linearly related to the hybridization free energy ΔG, with a slope equal to 1/RT_exp, where T_exp is the experimental temperature and R is the gas constant. The breakdown of equilibrium results in deviations from this law. A model of hybridization kinetics, the so-called 3-state model, is discussed that explains the observed experimental behavior. It predicts that deviations from equilibrium yield a proportionality of log I to ΔG/RT_eff, where T_eff is an "effective" temperature higher than the experimental one. This behavior is indeed observed in some experiments on Agilent arrays (Hooyberghs et al. Phys. Rev. E 2010, 81, 012901; Hooyberghs et al. Nucleic Acids Res. 2009, 37, e53). We analyze experimental data from two other microarray platforms and discuss, on the basis of the results, the attainment of equilibrium in these cases. Interestingly, the same 3-state model predicts a (dynamical) saturation of the signal at values below that expected at equilibrium.
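For concreteness, the two regimes described above can be written out as follows. This is a sketch based on the standard Langmuir picture at low target concentration; the explicit prefactors are not given in the abstract and are absorbed into the constants:

```latex
% Equilibrium regime (low concentration, Langmuir isotherm):
\log I \;=\; \frac{\Delta G}{R\,T_{\mathrm{exp}}} \;+\; \mathrm{const.}

% Out-of-equilibrium regime predicted by the 3-state model:
\log I \;\approx\; \frac{\Delta G}{R\,T_{\mathrm{eff}}} \;+\; \mathrm{const.},
\qquad T_{\mathrm{eff}} > T_{\mathrm{exp}}
```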
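A minimal sketch, in Python, of how this diagnostic could be applied in practice: fit log I against ΔG across mismatched probes and compare the fitted slope with the equilibrium value 1/RT_exp. All data below are synthetic, and the temperature values (T_exp = 65 °C, T_true = 700 K) are illustrative assumptions, not results from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

R = 1.987e-3               # gas constant, kcal/(mol K)
T_exp = 338.15             # assumed experimental temperature, K (65 C)
T_true = 700.0             # synthetic "effective" temperature, K

# Synthetic per-spot data: hybridization free energies (kcal/mol) and
# log-intensities following log I = Delta G / (R * T_true) + noise.
delta_G = rng.uniform(10.0, 20.0, size=200)
log_I = delta_G / (R * T_true) + rng.normal(0.0, 0.1, size=200)

# Linear fit of log I vs Delta G; the slope estimates 1/(R * T_eff).
slope, intercept = np.polyfit(delta_G, log_I, 1)
T_eff = 1.0 / (R * slope)

print(f"T_exp = {T_exp:.0f} K, fitted T_eff = {T_eff:.0f} K")
if T_eff > 1.05 * T_exp:   # the 5% tolerance is an arbitrary choice
    print("Slope shallower than 1/(R*T_exp): equilibrium likely not reached")
```

A fitted T_eff close to T_exp is consistent with equilibrium, whereas T_eff well above T_exp signals the kinetic regime of the 3-state model.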