(Press-News.org) MADISON, Wis. - In 1997, IBM's Deep Blue computer beat chess wizard Garry Kasparov. This year, a computer system developed at the University of Wisconsin-Madison achieved something far more complex: it equaled or bested scientists at the demanding task of extracting data from scientific publications and placing it in a database that catalogs the results of tens of thousands of individual studies.
"We demonstrated that the system was no worse than people on all the things we measured, and it was better in some categories," says Christopher Ré, who guided the software development for a project while a UW professor of computer science. "That's extremely exciting!"
The development, described in the current issue of PLOS ONE, marks a milestone in the quest to rapidly and precisely summarize, collate and index the vast output of scientists around the globe, says first author Shanan Peters, a professor of geoscience at UW-Madison.
Chess, however complex, is built on rigid rules; in any given situation, only certain moves are legal. The rules for scientific publication are less exact, and so extracting structured information from publications is a challenge for both humans and machines.
Peters and colleagues set up the face-off between PaleoDeepDive, their new machine reading system, and data that scientists had manually entered into the Paleobiology Database. This repository, compiled by hundreds of researchers, is the destination for data from paleontology studies funded by the National Science Foundation and other agencies internationally.
The knowledge produced by paleontologists is fragmented into hundreds of thousands of publications. Yet many research questions require what Peters calls a "synthetic approach: for example, how many species were on the planet at any given time?"
Despite 16 years of effort, the Paleobiology Database remains incomplete and a large amount of hard-earned field data remains locked in publications. Was it possible to automate and accelerate the process?
Teaming up with Ré, who is now at Stanford University, and UW-Madison computer science professor Miron Livny, the group built on the DeepDive machine reading system and the HTCondor distributed job management system to create PaleoDeepDive. "We were lucky that Miron Livny brought the high throughput computing capabilities of the UW-Madison campus to bear," says Peters. "Getting started required a million hours of computer time."
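The high-throughput piece is straightforward in outline: each publication can be processed as an independent unit of work, so the extraction can be fanned out across the campus grid as thousands of separate HTCondor jobs. The sketch below shows one way such a fan-out might be wired up; the wrapper script run_extraction.sh, the file layout, and the resource requests are hypothetical placeholders, not the team's actual configuration.

```python
# Illustrative sketch: fan per-document extraction out as HTCondor jobs.
# All paths and the wrapper script are hypothetical, not PaleoDeepDive's setup.
from pathlib import Path

def write_htcondor_submit(doc_dir: str, submit_path: str = "extract.sub") -> None:
    """List the documents and write a submit description that queues one
    independent extraction job per document."""
    docs = sorted(Path(doc_dir).glob("*.pdf"))
    Path("documents.txt").write_text("\n".join(str(d) for d in docs) + "\n")

    submit = """\
# run_extraction.sh is a hypothetical wrapper around the extractor
universe     = vanilla
executable   = run_extraction.sh
arguments    = $(document)
output       = logs/$(Cluster).$(Process).out
error        = logs/$(Cluster).$(Process).err
log          = logs/extraction.log
request_cpus = 1
queue document from documents.txt
"""
    Path(submit_path).write_text(submit)

if __name__ == "__main__":
    write_htcondor_submit("corpus/")
    # then submit with: condor_submit extract.sub
```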
Much like the people who assembled the Paleobiology Database, PaleoDeepDive inhales documents and extracts structured data, such as species names, time periods, and geographic locations. "We extracted the same data from the same documents and put it into the exact same structure as the human researchers, allowing us to rigorously evaluate the quality of our system, and the humans," Peters says.
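To make the comparison concrete, the structured data in question look roughly like the record sketched below: a taxonomic occurrence tied to a time interval, a place, and the source document. The field names are illustrative assumptions, not the actual Paleobiology Database schema.

```python
# Illustrative record of the kind of structured data described above.
# Field names are hypothetical, not the Paleobiology Database's real schema.
from dataclasses import dataclass

@dataclass
class Occurrence:
    taxon: str         # species or genus name, e.g. "Tyrannosaurus rex"
    interval: str      # geologic time period, e.g. "Late Cretaceous"
    location: str      # geographic locality named in the paper
    document_id: str   # identifier of the source publication

row = Occurrence("Tyrannosaurus rex", "Late Cretaceous", "Alberta, Canada", "doc_0001")
print(row)
```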
Many organizations, including IBM and Google, are trying to extract meaning from natural language, but Ré says, "The thing that is different here is that we decided to pivot and look at the scientific literature, where the language is cleaner."
Instead of trying to divine the single correct meaning from any body of copy, the tactic was "to look at the entire problem of extraction as a probabilistic problem," says Ré, who credits much of the heavy lifting to UW-Madison Ph.D. candidate Ce Zhang. "People had done pieces of that, but not the entire problem, end to end. This was the DeepDive advance."
Ré imagines a study containing the terms "Tyrannosaurus rex" and "Alberta, Canada." Is Alberta where the fossil was found, or where it is stored? Did the finder work there? Did the study actually focus on a fossil related to T. rex? Computers often have trouble deciphering even simple-sounding statements, Ré says. "We take a more relaxed approach: There is some chance that these two are related in this manner, and some chance they are related in that manner."
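A minimal sketch of that relaxed, probabilistic bookkeeping appears below. Instead of committing to a single reading, each candidate relation between the two mentions carries a probability, and downstream steps work with the weighted set. The relation names and numbers are invented for illustration; DeepDive's actual statistical inference is considerably richer.

```python
# Hypothetical candidate relations between two mentions, each with an
# estimated probability; the values below are invented for illustration.
candidates = [
    # (subject, relation, object, probability)
    ("Tyrannosaurus rex", "found_in", "Alberta, Canada", 0.62),
    ("Tyrannosaurus rex", "stored_in", "Alberta, Canada", 0.23),
    ("Tyrannosaurus rex", "author_affiliation", "Alberta, Canada", 0.15),
]

def accepted(cands, threshold=0.5):
    """Keep only relations whose estimated probability clears the threshold."""
    return [c for c in cands if c[3] >= threshold]

for subj, rel, obj, p in accepted(candidates):
    print(f"{subj} --{rel}--> {obj}  (p = {p:.2f})")
```

Only the first candidate clears the default threshold here, but nothing is thrown away: if new evidence shifts the probabilities, the accepted set can change without anyone re-reading the documents by hand.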
In these large-data tasks, PaleoDeepDive has a major advantage, Peters says. "Information that was manually entered into the Paleobiology Database by humans cannot be assessed or enhanced without going back to the library and re-examining original documents. Our machine system, on the other hand, can extend and improve results essentially on the fly as new information is added. It can also extract related information that may not have been in the original database, but that is critical to tackling new science questions, and do so on a huge scale."
Further advantages can result from improvements in the computer tools. "As we get more feedback and data, it will do a better job across the board," Peters says. "There are, potentially, systematic and wholesale improvements to the quality of all of the data."
Jacquelyn Crinion, assistant director of licensing and acquisitions services at the UW-Madison General Library System, says the volume of scientific papers being downloaded from publishers threatened to create logjams in document delivery. "Publishers are not going to complain about usage, but about how hard their system is getting hit." Eventually, Elsevier gave the UW-Madison team broad access, allowing up to 10,000 downloads per week.
As text- and data-mining takes off, Crinion says the library system and publishers will adapt. "Elsevier is very interested in this project; they see it as the future, and it might allow them to develop new products and ways to deliver service. The challenge for all of us is to provide specialized services for researchers while continuing to meet the core needs of the vast majority of our customers."
The Paleobiology Database has already generated hundreds of studies about the history of life, Peters says. "It's a very good example of the added scientific value provided by synthetic databases, where the whole truly is greater than the sum of its individual data parts."
Peters notes that many fields are being challenged to optimize usage of old findings and make streams of new data readily accessible.
Paleontology and geology are inseparably linked through the role that fossils have played in characterizing geologic sequences, Peters notes. "Ultimately, we hope to have the ability to create a computer system that can do almost immediately what many geologists and paleontologists try to do on a smaller scale over a lifetime: read a bunch of papers, arrange a bunch of facts, and relate them to one another in order to address big questions."
INFORMATION:
-- David Tenenbaum, 608-265-8549, djtenenb@wisc.edu