Maximizing signal-to-noise ratio in the random mutation capture assay

Abstract

The ‘Random Mutation Capture’ assay allows for the sensitive quantitation of DNA mutations at extremely low mutation frequencies. This method is based on PCR detection of mutations that render the mutated target sequence resistant to restriction enzyme digestion. The original protocol prescribes an end-point dilution to about 0.1 mutant DNA molecules per PCR well, such that the mutation burden can be calculated simply by counting the number of amplified PCR wells. However, the statistical aspects associated with the single-molecule nature of this protocol, and of several other molecular approaches relying on binary (on/off) output, can significantly affect quantification accuracy, and this issue has so far been ignored. The present work proposes a design-of-experiments (DoE) approach using statistical modeling and Monte Carlo simulations to obtain a statistically optimal sampling protocol, one that minimizes the coefficient of variation of the measurement estimates. Here, the DoE prescribed a dilution factor of about 1.6 mutant molecules per well. Theoretical results and experimental validation revealed an up to 10-fold improvement in the information obtained per PCR well, i.e. the optimal protocol achieves the same coefficient of variation using one-tenth the number of wells required by the original assay. This optimization applies equally to any method that relies on binary detection of a small number of templates.
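The effect described in the abstract can be illustrated with a minimal Monte Carlo sketch (not the authors' actual simulation code). Assuming mutant molecules are Poisson-distributed across wells and each well reads out as a binary positive (at least one molecule) or negative, the per-well rate is estimated as λ̂ = −ln(fraction of negative wells); the function name, well count, and trial count below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_cv(lam, n_wells=200, n_trials=2000):
    """Monte Carlo CV of the molecules-per-well estimate at mean `lam`."""
    # Each well receives a Poisson number of mutant molecules; the PCR
    # readout is binary: positive iff at least one molecule is present.
    molecules = rng.poisson(lam, size=(n_trials, n_wells))
    frac_neg = (molecules == 0).mean(axis=1)
    # Guard against log(0) when every well in a trial comes up positive/negative.
    frac_neg = np.clip(frac_neg, 1.0 / n_wells, 1 - 1.0 / n_wells)
    lam_hat = -np.log(frac_neg)  # Poisson (limiting-dilution) estimate
    return lam_hat.std() / lam_hat.mean()

cv_original = simulate_cv(0.1)   # original protocol: ~0.1 molecules/well
cv_optimal = simulate_cv(1.6)    # DoE-prescribed: ~1.6 molecules/well
print(cv_original, cv_optimal)   # the optimized dilution gives a markedly smaller CV
```

For the same number of wells, the simulated coefficient of variation at 1.6 molecules/well is several-fold smaller than at 0.1 molecules/well, consistent with the abstract's claim that the optimized dilution extracts far more information per well.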
