Carnivorous plants, such as the Venus flytrap, depend on meals of insects to survive in nutrient-poor soil. They sense the arrival of juicy insects with the aid of sensitive trigger hairs on the inner surfaces of their traps. Researchers have looked more closely at exactly how the plants decide when to keep their traps shut and begin producing their acidic, prey-decomposing cocktail of enzymes. The short answer is: they count.
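The counting behavior described in the study can be sketched as a simple threshold counter. This is an illustrative model only, not the researchers' code: the thresholds (roughly two touches to snap shut, about five to start digestion) follow the reported findings, while the 30-second memory window and function names are assumptions for the sketch.

```python
# Hypothetical sketch of the flytrap's "counting" logic: about two
# trigger-hair touches in quick succession close the trap, and around
# five touches start enzyme production. Thresholds and the time window
# are illustrative assumptions, not measured values from the study.

TRAP_CLOSE_THRESHOLD = 2    # touches needed to close the trap
DIGEST_THRESHOLD = 5        # touches needed to start enzyme production
WINDOW_SECONDS = 30.0       # assumed window within which touches are counted

def trap_response(touch_times):
    """Return the trap's state after a sequence of touch timestamps."""
    recent = []
    state = "open"
    for t in sorted(touch_times):
        # forget touches older than the counting window
        recent = [r for r in recent if t - r <= WINDOW_SECONDS] + [t]
        count = len(recent)
        if count >= DIGEST_THRESHOLD:
            state = "digesting"
        elif count >= TRAP_CLOSE_THRESHOLD and state == "open":
            state = "closed"
    return state
```

The key design point mirrors the biology: a single touch (a raindrop, say) changes nothing, so the plant avoids wasting energy on false alarms.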
Living things accumulate and reproduce information. That’s really the driving principle behind life, and behind evolution. But humans have invented a new method of accumulating and reproducing information. It’s digital information, and it’s growing at an astonishing speed. The number of people using the Internet is growing, as are the devices connected through the Internet of Things. Digital technology is like an organism that can evolve.
Say you have a great new theory or technology to improve the nation's energy backbone — the electric grid. Wouldn't it be great to test it against a model complete with details that would tell you how your ideas would work? But that's a challenge: existing data sets are too small or outdated, and security and privacy concerns keep real grid data out of reach.
This is the second and last part of a review of the draft World Health Organisation guidance entitled Guidance on Good Data and Record Management Practices. In this part of the review, we will discuss the role of suppliers and service providers, staff training, good documentation practices, designing systems for data quality and addressing data reliability issues.
From gene mapping to space exploration, humanity continues to generate ever-larger sets of data — far more information than people can actually process, manage or understand. Machine learning systems can help deal with this ever-growing flood of information. Some of the most powerful of these tools are based on a branch of geometry called topology, which deals with properties that stay the same even when something is bent and stretched.
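One of the simplest invariants of the kind topology studies is the number of connected pieces a shape has, which does not change when the shape is bent or stretched. This is a minimal illustration (not from the article) that counts connected components of a point-graph with union-find, the same primitive topological data-analysis software uses to track components across scales; the function name and inputs are invented for the sketch.

```python
# Count connected components of a graph using union-find. The component
# count is a topological invariant: deforming the shape the graph
# approximates (bending, stretching) leaves it unchanged.

def count_components(n_points, edges):
    """Return the number of connected components among n_points vertices."""
    parent = list(range(n_points))

    def find(x):
        # follow parent pointers to the root, halving the path as we go
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    components = n_points
    for a, b in edges:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
            components -= 1  # merging two components removes one
    return components
```

For example, five points with edges (0,1) and (1,2) form three components: {0,1,2}, {3}, and {4}.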
From a bird’s-eye view of a shantytown to an illustration of dendrites, the new CCS Visualization Lab allows faculty members, researchers, scientists and students to display high-resolution images, data, charts and other information in visually stunning formats. A plug-and-play system, the 22-foot-long 2-D display monitor is capable of displaying one large image or breaking up components of data into as many as 10 individual screens.
Cray announced it has signed a $36 million contract to upgrade and expand the Cray XC supercomputers and Sonexion storage system at the European Centre for Medium-Range Weather Forecasts. When the project is completed, the enhanced systems will allow the world-class numerical weather prediction and research center to continue to drive improvements in its highly complex models to provide more accurate weather forecasts.
Forget about selfies. In California, residents are using smartphones and drones to document the coastline's changing face. Starting this month, The Nature Conservancy is asking tech junkies to capture the flooding and coastal erosion that come with El Niño. The idea is that crowd-sourced, geotagged images of storm surges and flooded beaches will give scientists a brief window into what the future holds as sea levels rise from global warming.
The Code Contributors: Experts Offer Insights on Future-proofing Algorithmic Code with the NAG Library | January 22, 2016 4:15 pm | by NAG
The NAG Library is a set of mathematical and statistical algorithms used by thousands around the world for the solution of numerical problems. Every release has included numerical code contributed by professionals working in industry and academia. These esteemed “Code Contributors” generously donate their code so that others can benefit from their expert algorithms. Each contributed routine is then documented, tested, maintained and supported by NAG.
Data integrity continues to be the hottest regulatory topic for the pharmaceutical industry, with citations from all major regulatory authorities on a global scale. In September 2015, WHO issued a draft document entitled Guidance on Good Data and Record Management Practices or, in other words, a data integrity guidance.
Innovative Strategy Advances Renewable Energy Research, Creates World’s Most Energy-efficient Data Center | January 22, 2016 1:46 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source
In the research projects it conducts and in the way it conducts research, the National Renewable Energy Laboratory lives out the true meaning of its energy-efficient creed. In this way, NREL is that rarest of entities: a preacher of virtue that incorporates virtue into its daily life. NREL’s sincerity of purpose begins with its Peregrine supercomputer and the ultra-efficient data center in which it resides.
Here they are — the five most-visited stories from the past week. Entire buildings 3-D mapped in just 10 minutes with a mobile device; a super supernova that easily outshines our entire Milky Way; building a mirror for one of the world’s biggest telescopes; the brain’s memory capacity found to be as large as the entire Web; and discovery of the largest known prime number — almost 5M digits larger than the previous record...
Adaptive Computing announced it has set a new record in high throughput computing in collaboration with Supermicro. Supermicro SuperServers, custom optimized for Nitro, the new high throughput resource manager from Adaptive, were able to launch up to 530 tasks per second per core on a Supermicro-based low-latency UP SuperServer and over 17,600 tasks per second on its 4-Way-based SuperServer.
Since the 1960s, computer chips have been built using a process called photolithography. But in the past five years, chip features have gotten smaller than the wavelength of light, which has required some ingenious modifications of photolithographic processes. Keeping up the rate of circuit miniaturization that we’ve come to expect will eventually require new manufacturing techniques. Block copolymers are one promising alternative...
Researchers have been awarded over $28 million to develop advanced machine learning algorithms by pushing the frontiers of neuroscience. The Intelligence Advanced Research Projects Activity funds large-scale programs that address the most difficult challenges facing the intelligence community. IARPA’s challenge: figure out why brains are so good at learning, and use that information to design computer systems...