The Source for Informatics, HPC and IT Solutions
Deadly spiral of capture and disintegration: The red dots are glands. When the trap closes, forming a green stomach, these glands release a lytic enzyme cocktail, digest the victim, and incorporate the nutrients released from the building blocks of the meal.

Deadly Mathematics: Venus Flytraps employ Calculation to Kill Prey

January 25, 2016 3:02 pm | by Cell Press | News | Comments

Carnivorous plants, such as the Venus flytrap, depend on meals of insects to survive in nutrient-poor soil. They sense the arrival of juicy insects with the aid of sensitive trigger hairs on the inner surfaces of their traps. Researchers have looked more closely at exactly how the plants decide when to keep their traps shut and begin producing their acidic, prey-decomposing cocktail of enzymes. The short answer is: they count.
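The counting behavior described above can be pictured as a simple threshold machine. The sketch below is an illustrative toy model only — the two thresholds (two touches to snap the trap shut, five to begin digestion) are assumptions chosen for demonstration, not figures taken from this article.

```python
# Toy model of trigger-hair "counting" in a Venus flytrap.
# The thresholds (2 touches to close, 5 to digest) are illustrative
# assumptions, not values stated in the article above.

class FlytrapModel:
    SNAP_THRESHOLD = 2      # touches needed to close the trap (assumed)
    DIGEST_THRESHOLD = 5    # touches needed to release enzymes (assumed)

    def __init__(self):
        self.touches = 0

    def touch(self):
        """Register one stimulation of a trigger hair and report the state."""
        self.touches += 1
        if self.touches >= self.DIGEST_THRESHOLD:
            return "digesting"
        if self.touches >= self.SNAP_THRESHOLD:
            return "closed"
        return "open"

trap = FlytrapModel()
states = [trap.touch() for _ in range(6)]
print(states)  # ['open', 'closed', 'closed', 'closed', 'digesting', 'digesting']
```

A struggling insect keeps brushing the hairs, so the count keeps rising; a stray raindrop stops at one touch and the trap stays open.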

Beware the digital evolution. Pixabay, CC BY

The Internet could Out-evolve Humanity

January 25, 2016 1:45 pm | by Michael Gillings, Darrell Kemp and Martin Hilbert, University of California, Davis | Articles | Comments

Living things accumulate and reproduce information. That’s really the driving principle behind life, and behind evolution. But humans have invented a new method of accumulating and reproducing information. It’s digital information, and it’s growing at an astonishing speed. The number of people using the Internet is growing, as are the devices connected through the Internet of Things. Digital technology is like an organism that can evolve.

The Electricity Infrastructure Operations Center at Pacific Northwest National Laboratory will host the web portal and repository for realistic grid data developed under a new ARPA-E program.

Building Open-access Datasets to help Test, Evolve 21st Century Power Grid

January 25, 2016 1:40 pm | by PNNL | News | Comments

Say you have a great new theory or technology to improve the nation's energy backbone — the electric grid. Wouldn't it be great to test it against a model complete with details that would tell you how your ideas would work? But it's a challenge: existing sets of data are too small or outdated, and you don't have access to real data from the grid because of security and privacy issues.

R.D. McDowall is Director, R D McDowall Limited.

Review of the Draft WHO Data Integrity Guidance Part 2: Suppliers, Training and ALCOA

January 25, 2016 1:04 pm | by R.D. McDowall, Ph.D. | Articles | Comments

This is the second and final part of a review of the draft World Health Organization guidance entitled Guidance on Good Data and Record Management Practices. In this part of the review, we will discuss the role of suppliers and service providers, staff training, good documentation practices, designing systems for data quality and addressing data reliability issues.

This diagram demonstrates the simplified results that can be obtained by using quantum analysis on enormous, complex sets of data. Shown here are the connections between different regions of the brain in a control subject (left) and a subject under the in…

New Quantum Approach to Big Data could make Impossibly Complex Problems Solvable

January 25, 2016 10:51 am | by David L. Chandler, MIT | News | Comments

From gene mapping to space exploration, humanity continues to generate ever-larger sets of data — far more information than people can actually process, manage or understand. Machine learning systems can help deal with this ever-growing flood of information. Some of the most powerful of these tools are based on a branch of geometry called topology, which deals with properties that stay the same even when something is bent and stretched.
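Topology's appeal for data analysis is exactly this invariance under bending and stretching. As a minimal, self-contained illustration (not the quantum algorithm from the story), the snippet below computes two topological invariants of a graph — its number of connected components and independent loops — and shows they are unchanged when an edge is "stretched" by subdividing it.

```python
# A tiny illustration of a topological property: the number of connected
# components and independent loops of a graph (Betti numbers b0 and b1)
# is unchanged when an edge is "stretched" by inserting a vertex along it.

def betti(vertices, edges):
    # b0 = number of connected components, via union-find with path halving
    parent = {v: v for v in vertices}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    for a, b in edges:
        parent[find(a)] = find(b)
    b0 = len({find(v) for v in vertices})
    # b1 = E - V + b0 counts the independent cycles
    b1 = len(edges) - len(vertices) + b0
    return b0, b1

# A triangle: one component, one loop.
tri = betti({1, 2, 3}, [(1, 2), (2, 3), (3, 1)])
# "Stretch" edge (1, 2) by inserting a new vertex 4 along it.
stretched = betti({1, 2, 3, 4}, [(1, 4), (4, 2), (2, 3), (3, 1)])
print(tri, stretched)  # (1, 1) (1, 1) — the invariants survive the stretch
```

The vertex and edge counts both changed, but the topological summary did not — which is what makes such quantities robust descriptors of noisy data.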

From an aerial shot of a shantytown to an image of brain dendrites, the lab offers the UM community a high-tech way to display, present, and analyze information.

Visualization Goes High-tech with New UM CCS Viz Lab

January 25, 2016 10:05 am | by University of Miami | News | Comments

From a bird’s-eye view of a shantytown to an illustration of dendrites, the new CCS Visualization Lab allows faculty members, researchers, scientists and students to display high-resolution images, data, charts and other information in visually stunning formats. A plug-and-play system, the 22-foot-long 2-D display monitor is capable of displaying one large image or breaking up components of data into as many as 10 individual screens.

ECMWF specializes in global numerical weather prediction for the medium range (up to two weeks ahead), and also produces extended-range forecasts for up to a year ahead, with varying degrees of detail. The center uses advanced computer modeling techniques…

Cray Signs $36M Contract to Upgrade European Centre for Medium-Range Weather Forecasts Systems

January 25, 2016 9:44 am | by Cray | News | Comments

Cray announced it has signed a $36 million contract to upgrade and expand the Cray XC supercomputers and Sonexion storage system at the European Centre for Medium-Range Weather Forecasts. When the project is completed, the enhanced systems will allow the world-class numerical weather prediction and research center to continue to drive improvements in its highly complex models to provide more accurate weather forecasts.

This January 20, 2015 photo provided by The Nature Conservancy shows Twin Lakes Beach in Santa Cruz, CA, and Schwann Lagoon, the body of water on the right. The Nature Conservancy, an environmental group in California, is recruiting drone hobbyists to map…

Citizen Scientists Use Drones to Map El Nino Flooding

January 25, 2016 9:26 am | by Gillian Flaccus, Associated Press | News | Comments

Forget about selfies. In California, residents are using smartphones and drones to document the coastline's changing face. Starting this month, The Nature Conservancy is asking tech junkies to capture flooding and coastal erosion that come with El Nino. The idea is that crowd-sourced, geotagged images of storm surges and flooded beaches will give scientists a brief window into what the future holds as sea levels rise from global warming.

Every release of the NAG Library has included numerical code contributed by professionals working in industry and academia.

The Code Contributors: Experts offer Insights on Future-proofing Algorithmic Code with the NAG Library

January 22, 2016 4:15 pm | by NAG | Articles | Comments

The NAG Library is a set of mathematical and statistical algorithms used by thousands around the world for the solution of numerical problems. Every release has included numerical code contributed by professionals working in industry and academia. These esteemed “Code Contributors” generously give their code to help others benefit from their expert algorithms. Each contributed routine is then documented, tested, maintained and supported by NAG.

R.D. McDowall is Director, R D McDowall Limited.

Review of the Draft WHO Data Integrity Guidance Part 1: Principles, Risk and Management

January 22, 2016 4:09 pm | by R.D. McDowall, Ph.D. | Articles | Comments

Data integrity continues to be the hottest regulatory topic for the pharmaceutical industry, with citations from all major regulatory authorities on a global scale. In September 2015, WHO issued a draft document entitled Guidance on Good Data and Record Management Practices or, in other words, a data integrity guidance.

NREL researchers test enzymes and micro-organisms used in a biological process that converts biomass to alternative fuels. Courtesy of Warren Gretz/NREL

Innovative Strategy Advances Renewable Energy Research, Creates World’s Most Energy-efficient Data Center

January 22, 2016 1:46 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Articles | Comments

In the research projects it conducts and in the way it conducts research, the National Renewable Energy Laboratory lives out the true meaning of its energy-efficient creed. In this way, NREL is that rarest of entities: a preacher of virtue that incorporates virtue into its daily life. NREL’s sincerity of purpose begins with its Peregrine supercomputer and the ultra-efficient data center in which it resides.
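Data-center efficiency of the kind NREL pursues is usually quantified by power usage effectiveness (PUE): total facility energy divided by the energy consumed by the IT equipment alone, with 1.0 the theoretical ideal. The figures below are hypothetical, chosen only to illustrate the metric, not NREL's actual measurements.

```python
# Power usage effectiveness (PUE): total facility energy / IT energy.
# A PUE of 1.0 would mean every watt goes to computing; the overhead
# above 1.0 is cooling, power conversion, lighting, etc.
# All numbers below are hypothetical, for illustration only.

def pue(total_facility_kwh, it_equipment_kwh):
    return total_facility_kwh / it_equipment_kwh

# A conventional data center can spend nearly as much on cooling and
# power delivery as on the computing itself:
conventional = round(pue(2_000_000, 1_000_000), 2)

# An ultra-efficient facility adds only a few percent of overhead:
efficient = round(pue(1_060_000, 1_000_000), 2)

print(conventional, efficient)  # 2.0 1.06
```

The closer the ratio sits to 1.0, the less energy the facility wastes on anything other than the computation itself.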


5 Stories You Shouldn’t Miss — January 15-21

January 22, 2016 1:03 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

Here they are — the five most-visited stories from the past week. Entire buildings 3-D mapped in just 10 minutes with a mobile device; a super supernova that easily outshines our entire Milky Way; building a mirror for one of the world’s biggest telescopes; the brain’s memory capacity found to be as large as the entire Web; and discovery of the largest known prime number — almost 5M digits larger than the previous record...
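That prime record is easy to check with a little arithmetic: the January 2016 discovery is the Mersenne prime 2^74207281 − 1, and a Mersenne number 2^p − 1 has floor(p·log10 2) + 1 decimal digits. The sketch below compares the new exponent with the previous record holder's (57,885,161).

```python
# Digit counts of record Mersenne primes, computed without ever
# materializing the numbers themselves: 2**p has floor(p*log10(2)) + 1
# digits, and subtracting 1 never shortens a Mersenne number, since a
# power of two is never a power of ten.
import math

def mersenne_digits(p):
    return math.floor(p * math.log10(2)) + 1

new = mersenne_digits(74_207_281)   # the January 2016 record
old = mersenne_digits(57_885_161)   # the previous record holder
print(new, old, new - old)  # 22338618 17425170 4913448
```

The gap of roughly 4.9 million digits is the "almost 5M digits larger" figure quoted above.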

Nitro optimization of Supermicro’s low-latency SuperServer played a central role in pushing benchmarks up to their present position.

Adaptive Computing Achieves Record High Throughput on Supermicro Solutions

January 22, 2016 11:27 am | by Adaptive Computing | News | Comments

Adaptive Computing announced it has set a new record in high-throughput computing in collaboration with Supermicro. Supermicro SuperServers, custom-optimized for Nitro, the new high-throughput resource manager from Adaptive, were able to launch up to 530 tasks per second per core on a Supermicro-based low-latency UP SuperServer and over 17,600 tasks per second on its 4-Way-based SuperServer.

On the top row are two images of a nanomesh bilayer of PDMS cylinders in which the top layer is perpendicular to the complex orientation of the bottom layer. The bottom images show well-ordered nanomesh patterns of PDMS cylinders. The images on the right…

Self-stacking Nanogrids could offer Route to Tinier Chip Components

January 22, 2016 10:34 am | by Larry Hardesty, MIT | News | Comments

Since the 1960s, computer chips have been built using a process called photolithography. But in the past five years, chip features have gotten smaller than the wavelength of light, which has required some ingenious modifications of photolithographic processes. Keeping up the rate of circuit miniaturization that we’ve come to expect will eventually require new manufacturing techniques. Block copolymers are one promising alternative...

A reconstruction of cortical connectivity. Courtesy of Lichtman Lab/Harvard

Moonshot Brain Studies push Computer Science Boundaries, bring AI Closer to Reality

January 22, 2016 10:19 am | by Leah Burrows, Harvard University | News | Comments

Researchers have been awarded over $28 million to develop advanced machine learning algorithms by pushing the frontiers of neuroscience. The Intelligence Advanced Research Projects Activity funds large-scale programs that address the most difficult challenges facing the intelligence community. IARPA’s challenge: figure out why brains are so good at learning, and use that information to design computer systems...


