
Controlling Preanalytical Variability in Biospecimen Collections

Posted by Abdul Ally on Sep 15, 2016 11:00:00 AM

Large, well-designed population studies and the interrelationships they reveal are the backbone of public health and serve as a foundation for medical research. The health and lifestyle information of participants, collected via questionnaire and linked to their biospecimen samples, allows investigators to examine the intricate relationships between genetics, physiology, behavior, environment, and disease.

However, in order to have confidence in both laboratory assay results and their associations with other data, the biospecimens must be correctly collected, processed, shipped, and stored. If samples were exposed to conditions that compromised their molecular integrity, such as inappropriate temperatures or contaminated containers, then the results of the study may also be questionable. Public health research must therefore thoroughly address the possibility that the handling of specimens before laboratory testing (i.e., preanalytical variability) has skewed results.

For this reason, controlling preanalytical variability requires the same attention to detail as the design of questionnaires. This eBook is an introduction to some of the variables that must be considered when collecting biospecimens as part of a cohort study.

Complete Process Management

Control of the total process must begin at sample collection—the moment a specimen leaves the donor—and must continue until a specimen is analyzed. Any variation in how samples are collected, transported, processed, or handled can result in a decrease in specimen integrity.

Complete process management includes use of the same protocols across all study sites for collection, shipping, and storage. The protocols should ensure that all steps in the sample management process conform to best practices [1,2,3], and site staff should be well trained on uniform use of the protocols. In addition, laboratory processing of large numbers of samples should be automated where possible, to further minimize aliquoting variability across nested sets of aliquots from the same sample.
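To make "complete process management" concrete, one option is to log every handling step a specimen undergoes, from collection through storage, against the uniform protocol in use. The sketch below is a minimal, hypothetical Python example; the class and field names (HandlingEvent, Specimen, protocol_id, and so on) are illustrative assumptions introduced here, not part of the eBook or of any specific laboratory information system.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class HandlingEvent:
    """One step in a specimen's chain of custody (hypothetical fields)."""
    step: str             # e.g., "collection", "shipping", "aliquoting", "storage"
    site: str             # study site performing the step
    timestamp: datetime   # when the step occurred
    temperature_c: float  # temperature recorded during the step
    protocol_id: str      # identifier of the uniform SOP applied at every site

@dataclass
class Specimen:
    """A specimen and its full handling history, from donor to analysis."""
    specimen_id: str
    specimen_type: str    # e.g., "serum", "urine", "stool"
    events: List[HandlingEvent] = field(default_factory=list)

    def log(self, event: HandlingEvent) -> None:
        """Append a handling step to the specimen's history."""
        self.events.append(event)

    def deviations(self, max_temp_c: float) -> List[HandlingEvent]:
        """Flag any step where the recorded temperature exceeded the SOP limit."""
        return [e for e in self.events if e.temperature_c > max_temp_c]
```

Recording each step this way makes it straightforward to confirm, before analysis, that a sample never left its specified conditions and that every site followed the same protocol.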

Preanalytical variability during sample collection and other handling events may account for up to 68 percent of erroneous laboratory results [4]. When planning collection of specimens for distant future use, there are many questions to consider, and some are less obvious than others. These include the following (one way to record such decisions is sketched after the list):

  • Are environmental compounds relevant, and how significant a role do the exposome and microbiome play?
  • What specimen type is most appropriate for the study goal?
  • What are the target analytes (cotinine in urine? microbial mix in stool samples?), and what container/additives should be used to preserve or stabilize these analytes?
  • What processing (e.g., DNA/RNA extraction) and aliquoting are needed?
  • What data elements should be appended to the sample?
  • What are the optimal storage conditions, including temperature, for maintaining stability of molecules of interest (e.g., cell surface receptors, metabolites, molecular markers)?
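One way to keep these planning decisions explicit and uniform across study sites is to capture them in a structured record. The Python sketch below is purely illustrative; the CollectionPlan class, its field names, and the example values are assumptions introduced here, not a format specified in the eBook.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CollectionPlan:
    """Hypothetical record of the planning decisions listed above."""
    specimen_type: str                # e.g., "urine", "stool", "whole blood"
    target_analytes: List[str]        # e.g., ["cotinine"] or ["microbial profile"]
    container: str                    # container type
    additives: List[str]              # preservatives/stabilizers for the target analytes
    processing: Optional[str] = None  # e.g., "DNA extraction", "RNA extraction"
    aliquots_per_sample: int = 1      # number of aliquots to prepare
    appended_data_elements: List[str] = field(default_factory=list)  # data linked to the sample
    storage_temp_c: float = -80.0     # storage temperature chosen for analyte stability

# Illustrative example only: a urine collection targeting cotinine
urine_plan = CollectionPlan(
    specimen_type="urine",
    target_analytes=["cotinine"],
    container="polypropylene tube",
    additives=[],
    aliquots_per_sample=4,
    appended_data_elements=["collection time", "self-reported tobacco exposure"],
    storage_temp_c=-80.0,
)
```

Writing the plan down in a structured form like this, whatever the actual format, helps ensure that every site collects, processes, and stores specimens against the same explicit set of decisions.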

To continue reading this eBook, please select the download button below!
