
New standard proposed for supercomputing

2010-11-16
(Press-News.org) ALBUQUERQUE, N.M. — A new supercomputer rating system will be released by an international team led by Sandia National Laboratories at the Supercomputing Conference 2010 in New Orleans on Nov. 17.

The rating system, Graph500, tests supercomputers for their skill in analyzing large, graph-based structures that link the huge numbers of data points present in biological, social and security problems, among other areas.

"By creating this test, we hope to influence computer makers to build computers with the architecture to deal with these increasingly complex problems," Sandia researcher Richard Murphy said.

Rob Leland, director of Sandia's Computations, Computers, and Math Center, said, "The thoughtful definition of this new competitive standard is both subtle and important, as it may heavily influence computer architecture for decades to come."

The group isn't trying to compete with Linpack, the current standard test of supercomputer speed, Murphy said. "There have been lots of attempts to supplant it, and our philosophy is simply that it doesn't measure performance for the applications we need, so we need another, hopefully complementary, test," he said.

Many scientists view Linpack as a "plain vanilla" test mechanism that tells how fast a computer can perform basic calculations, but has little relationship to the actual problems the machines must solve.

The impetus to achieve a supplemental test code came about at "an exciting dinner conversation at Supercomputing 2009," said Murphy. "A core group of us recruited other professional colleagues, and the effort grew into an international steering committee of over 30 people." (See www.graph500.org.)

Many large computer makers have indicated interest, said Murphy, adding there's been buy-in from Intel, IBM, AMD, NVIDIA, and Oracle corporations. "Whether or not they submit test results remains to be seen, but their representatives are on our steering committee."

Each organization has donated time and expertise of committee members, he said.

While some computer makers and their architects may prefer to ignore a new test for fear their machine will not do well, the hope is that large-scale demand for a more complex test will be a natural outgrowth of the greater complexity of problems.

Studies show that moving data around (not simple computations) will be the dominant energy problem on exascale machines, the next frontier in supercomputing and the subject of a nascent U.S. Department of Energy initiative to achieve this next level of operations within a decade, Leland said. (Petascale and exascale machines perform on the order of 10 to the 15th and 10 to the 18th power operations per second, respectively.)

Part of the goal of the Graph500 list is to point out that, beyond the rising expense of data movement itself, any shift in application base from physics to large-scale data problems is likely to increase data-movement requirements further, because memory must scale in proportion to computational capability. That is, an exascale computer requires an exascale memory.

"In short, we're going to have to rethink how we build computers to solve these problems, and the Graph500 is meant as an early stake in the ground for these application requirements," said Murphy.

How does it work?

Large data problems are very different from ordinary physics problems.

Unlike a typical computation-oriented application, large-data analysis often involves searching large, sparse data sets while performing very simple computational operations.

To deal with this, the Graph500 benchmark comprises two computational kernels: one that constructs a large graph linking huge numbers of participants, and one that performs a parallel search of that graph.
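As a rough illustration of those two kernels, here is a simplified sketch in Python. It is not the official Graph500 reference code; the function names, the tiny 2x2 initiator matrix and the dense adjacency-matrix representation are our own choices, made for readability.

# Sketch of the two Graph500-style kernels (illustrative only):
# kernel 1 grows a synthetic graph by Kronecker multiplication,
# kernel 2 runs a breadth-first search over it.
import numpy as np
from collections import deque

def kronecker_graph(initiator, levels):
    # Kernel 1 (sketch): repeatedly take the Kronecker product of a
    # small initiator adjacency matrix to grow a much larger graph.
    adj = initiator.copy()
    for _ in range(levels):
        adj = np.kron(adj, initiator)
    np.fill_diagonal(adj, 0)          # drop self-loops
    return np.maximum(adj, adj.T)     # force symmetry (undirected)

def bfs_levels(adj, root):
    # Kernel 2 (sketch): level-synchronous BFS; returns each vertex's
    # hop distance from the root (-1 if unreachable).
    dist = np.full(adj.shape[0], -1, dtype=np.int64)
    dist[root] = 0
    frontier = deque([root])
    while frontier:
        u = frontier.popleft()
        for v in np.nonzero(adj[u])[0]:
            if dist[v] < 0:
                dist[v] = dist[u] + 1
                frontier.append(v)
    return dist

initiator = np.array([[1, 1],
                      [1, 0]], dtype=np.uint8)
adj = kronecker_graph(initiator, levels=3)   # 16-vertex graph
print("vertices:", adj.shape[0], "edges:", int(adj.sum()) // 2)
print("BFS hop counts from vertex 0:", bfs_levels(adj, root=0))

The real benchmark generates the graph as an enormous edge list and times the search at massive scale; the dense matrix here is only to keep the sketch short.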

"We want to look at the results of ensembles of simulations, or the outputs of big simulations in an automated fashion," Murphy said. "The Graph500 is a methodology for doing just that. You can think of them being complementary in that way — graph problems can be used to figure out what the simulation actually told us."

Performance for these applications is dominated by the ability of the machine to sustain a large number of small, nearly random remote data accesses across its memory system and interconnects, as well as the parallelism available in the machine.
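One way to feel why such access patterns dominate: read the same array elements once in order and once in random order. This toy single-node timing in Python is purely illustrative (the array size is arbitrary, and it exercises only local memory, not a supercomputer's interconnect), but the random pattern will typically run several times slower:

# Toy illustration: identical work, very different access patterns.
import time
import numpy as np

n = 10_000_000
data = np.arange(n, dtype=np.int64)
ordered = np.arange(n)                  # stride-1, cache-friendly
shuffled = np.random.permutation(n)     # nearly random, cache-hostile

t0 = time.perf_counter()
a = data[ordered].sum()
t1 = time.perf_counter()
b = data[shuffled].sum()
t2 = time.perf_counter()

print(f"sequential: {t1 - t0:.3f} s   random: {t2 - t1:.3f} s   sums match: {a == b}")

Graph traversal produces the second pattern at vastly larger scale, spread across a machine's memory system and interconnects.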

Five problem areas for these computational kernels could be cybersecurity, medical informatics, data enrichment, social networks and symbolic networks:

Cybersecurity: Large enterprises may create 15 billion log entries per day and require a full scan.

Medical informatics: There are an estimated 50 million patient records, with 20 to 200 records per patient, resulting in billions of individual pieces of information, all of which need entity resolution: in other words, which records belong to her, him or somebody else.

Data enrichment: Petascale data sets include maritime domain awareness, with hundreds of millions of individual transponders, tens of thousands of ships, and tens of millions of pieces of individual bulk cargo. These problems also have different types of input data.

Social networks: Almost unbounded, like Facebook.

Symbolic networks: Often petabytes in size. One example is the human cortex, with 25 billion neurons and approximately 7,000 connections each (see the sizing sketch after this list).
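A back-of-envelope check of the scales quoted above (our own arithmetic; the assumption of 16 bytes of storage per edge is ours, not from the release):

# Rough sizing from the figures quoted in the list above.
neurons = 25e9                    # human cortex: 25 billion neurons
connections_each = 7_000          # ~7,000 connections per neuron
edges = neurons * connections_each
bytes_per_edge = 16               # assume two 8-byte vertex IDs
print(f"cortex graph: {edges:.2e} edges, ~{edges * bytes_per_edge / 1e15:.1f} PB as a raw edge list")

patients = 50e6                   # 50 million patient records
low, high = 20, 200               # records per patient
print(f"medical records: {patients * low:.1e} to {patients * high:.1e} items")

At roughly 1.75 x 10^14 edges, the cortex example alone lands in the petabyte range, consistent with the "often petabytes in size" characterization.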

"Many of us on the steering committee believe that these kinds of problems have the potential to eclipse traditional physics-based HPC [high performance computing] over the next decade," Murphy said.

While general agreement exists that complex simulations work well for the physical sciences, where lab work and simulations play off each other, there is some doubt they can solve societal problems that have essentially unbounded numbers of components, such as terrorism, war and epidemics.

"These are exactly the areas that concern me," Murphy said. "There's been good graph-based analysis of pandemic flu. Facebook shows tremendous social science implications. Economic modeling this way shows promise.

"We're all engineers and we don't want to over-hype or over-promise, but there's real excitement about these kinds of big data problems right now," he said. "We see them as an integral part of science, and the community as a whole is slowly embracing that concept.

"However, it's so new we don't want to sound as if we're hyping the cure to all scientific ills. We're asking, 'What could a computer provide us?' and we know we're ignoring the human factors in problems that may stump the fastest computer. That'll have to be worked out."

###

Image available on the Sandia website: This small, synthetic graph was generated by a method called Kronecker multiplication. Larger versions of this generator, modeling real-world graphs, are used in the Graph500 benchmark. (Courtesy of Jeremiah Willcock, Indiana University) A higher-resolution EPS file of the image is available upon request.

Sandia National Laboratories is a multiprogram laboratory operated and managed by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.


