“Every year, the National Institutes of Health spends billions of dollars on biomedical research, ranging from basic science investigations into cell processes to clinical trials. The results are published in journals and presented at academic meetings, and then, building on their findings, researchers move on to their next project.”
“But what happens to the data that’s collected, and what more could we learn from it? If we aggregated all the data from countless years of research, might we learn something new about ourselves, the diseases that afflict us, and possible treatments?”
“That’s the hope behind the Biomedical Data Translator program, launched by the NIH in 2016: to create a ‘Google’ for biomedical data that could sift through hundreds of separate data sources to help researchers connect ‘dots’ in datasets with distinct formats and peculiarities…”
“The program has awarded about $17.5 million to 19 institutions across the country that are working to integrate years of data, ranging from electronic health records to genomic sequences, that had previously been spread across a variety of platforms, and then to apply new machine learning tools to help organize and reason through the wealth of information.”
Read the full article here.
Source: “NIH-funded project aims to build a ‘Google’ for biomedical data,” by Ruth Hailu, STAT, July 31, 2019.




