Next Generation Cohort Studies and Biobanking: How Cloud Technology is Accelerating Translational Research
Cutting-edge technology and innovations in molecular epidemiology are usually associated with the laboratory. However, the California Teachers Study (CTS), a prospective epidemiologic cohort study, is putting a different cutting-edge technology to work. The CTS is using mobile devices and cloud-based technology to dramatically cut the time and cost of managing the huge amounts of data that are the cornerstone of epidemiological studies.
Funded in 1995, the CTS was designed primarily to investigate breast cancer, and the study recruited more than 133,000 active and retired female members of California's educational system. Participants have completed five questionnaires since 1995, and the CTS is now augmenting that questionnaire data with biospecimens to enable research into biomarkers. The biggest transformation, however, may be the way the team is collecting the specimens and their associated data.
A number of previous CTS projects have collected blood and saliva samples in order to compare DNA from women with cancer to DNA from a matched group of women without cancer (matched on age, ethnicity, and other factors) and look for genetic differences. Because a person’s genetic code—i.e., their germline DNA—does not change over time, genetic variations can be efficiently studied using samples that are collected from cancer survivors after their cancer has been diagnosed.
More than Just Drawing Blood – A Mobile App for Sample and Data Collection
Collecting, storing, and keeping track of thousands of pre-diagnostic biospecimens is no easy task. Most studies use standard desktop software to manage data, track study activities, and convert participants’ responses into analyzable data, an approach that has not changed much in the past 30 years. Unfortunately, those methods are not very efficient, and they often do not scale well enough to accommodate complex projects that collect thousands of samples in a short amount of time.
The CTS team needed new and more efficient approaches to manage, coordinate, and standardize the many moving pieces of its complex protocol. The “Aha!” moment came when the team realized that the core interactions between research team members and potential specimen donors are remarkably similar to those between sales personnel and their customers. Given this similarity, the CTS team developed a novel Data Management System (DMS) on a sales-related platform, Salesforce.com.
Salesforce.com is a customer relationship management, or CRM, system; CRM platforms are widely used in sales and other industries because they efficiently manage and track customer interactions and transactions. The CTS team worked with Cloud Sherpas, a cloud advisory and technology services company, to build the DMS on the Salesforce.com platform. The result was a system that manages recruitment, appointment scheduling, sample tracking, and other participant data in an integrated manner.
Using the internet, mobile phones, and tablets, the CTS team can access the DMS anytime, view the latest data in real time, and even create customized questionnaires that incorporate each participant’s individual data. The system includes extensive security and validation tools, all under the team’s direct control, to protect the privacy and confidentiality of the data and to ensure that operations are standardized across all sites. For the CTS, this novel DMS has reduced overall data management expenses, increased staff productivity, and shortened the overall data pipeline from months or years to days.
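To picture how a questionnaire can incorporate a participant’s own data, consider the minimal sketch below. The field names, example record, and `build_questionnaire` helper are purely hypothetical illustrations, not the CTS’s actual Salesforce.com implementation:

```python
from string import Template

# Hypothetical participant record; in the real DMS these fields would
# come from the cloud platform, not a hard-coded dictionary.
participant = {
    "first_name": "Jane",
    "last_enrolled": "1995",
    "last_site": "Sacramento",
}

# A questionnaire item that pulls in the participant's own data.
item = Template(
    "Since you joined the study in $last_enrolled, have you moved away "
    "from the $last_site area?"
)

def build_questionnaire(record):
    """Fill a questionnaire template with one participant's data."""
    return item.substitute(record)

print(build_questionnaire(participant))
```

The same pattern scales to a full questionnaire: each item is a template, and the system substitutes the individual’s stored answers before the questionnaire is displayed.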
Managing and Biobanking the Biospecimens
Another cost-saving innovation created by City of Hope was in the handling of shipment data. To make the receiving operations as efficient and accurate as possible, Fisher BioServices requests an electronic copy of the shipment manifest in advance, for upload into its inventory management system. This allows Fisher BioServices’ receiving team to verify inventory accuracy and rapidly reconcile the received samples with the shipping manifest. Unfortunately, studies that rely on manual processes to send shipment manifests inevitably encounter snafus: a study site forgets to email the manifest, or sends one that is incomplete or contains errors. When that happens, the resulting investigations and fixes are time-consuming and disrupt workflow schedules.
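The payoff of an advance electronic manifest is that reconciliation becomes a simple set comparison rather than a manual audit. The sketch below illustrates the idea; the sample IDs and the `reconcile` helper are invented for illustration and are not Fisher BioServices’ actual system:

```python
def reconcile(manifest_ids, received_ids):
    """Compare a shipment manifest against the samples actually received.

    Returns two sorted lists: IDs expected but not received (missing),
    and IDs received but not on the manifest (unexpected).
    """
    expected, actual = set(manifest_ids), set(received_ids)
    return sorted(expected - actual), sorted(actual - expected)

# Example: one sample missing from the box, one extra tube scanned.
missing, unexpected = reconcile(
    ["CTS-0001", "CTS-0002", "CTS-0003"],
    ["CTS-0001", "CTS-0003", "CTS-0099"],
)
print(missing)     # expected but absent from the shipment
print(unexpected)  # scanned on receipt but not on the manifest
```

With the manifest already loaded, any discrepancy surfaces the moment the box is scanned in, instead of days later during a manual review.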
The CTS team created an innovative and cost-saving way to send the shipment manifests to Fisher BioServices. After the phlebotomists have scanned their FedEx airbill numbers, the DMS automatically combines information about every sample that was shipped into a single manifest. Each night at 11 PM Pacific time, the DMS automatically sends that manifest directly to Fisher BioServices’ computers via secure FTP. Sending the manifest at 11 PM gives the CTS team time to review the day’s work and, if any corrections are needed, update the manifest before it is sent and uploaded into Fisher BioServices’ system. Each of these steps speeds up the process and gives both the CTS and the biorepository staff efficient data management and tracking of the biospecimens. If you are interested in learning more about big data and how it affects sample collection and storage, read our blog "Biobanking's Big Problem: How do you manage all that data" by Amelia Ruzzo, Director of IT, Fisher BioServices.
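The nightly export step can be pictured as gathering the day’s scanned samples into one CSV manifest. The sketch below uses hypothetical field names and records, and it stops short of the transfer itself; in a real system the finished file would then be pushed to the repository’s server over secure FTP using an SFTP client:

```python
import csv
import io

# Hypothetical records for samples shipped today; the real DMS assembles
# these from the phlebotomists' barcode scans.
shipped_today = [
    {"airbill": "794600000001", "sample_id": "CTS-0001", "type": "blood"},
    {"airbill": "794600000001", "sample_id": "CTS-0002", "type": "saliva"},
]

def build_manifest(records):
    """Combine every sample shipped today into a single CSV manifest."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["airbill", "sample_id", "type"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

manifest_csv = build_manifest(shipped_today)
# At 11 PM the manifest would be uploaded to the repository over secure
# FTP; that transfer step is omitted from this sketch.
print(manifest_csv)
```

Because the manifest is generated from the same records the scanners produced, it cannot be forgotten or hand-typed incorrectly, which is exactly the class of snafu the automated pipeline eliminates.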
Do you have challenges with standardizing sample and data collection? Do you struggle with how to save time and money when working with large amounts of data? Please comment below and share your experiences with us.