Data is at the foundation of digital strategy
By Mike Connell, COO & Chief Digital Transformation Officer at Enthought Digital transformation is fundamentally changing the way we live, work and relate to each other, both personally and professionally. New entrants are disrupting established industries with novel, digitally enabled operating and business models seemingly overnight, and…
Sofar Ocean debuts Maritime Open Standard, Bristlemouth, at OCEANS 2021
At the annual OCEANS 2021 conference in San Diego, Sofar Ocean Technologies, the world’s largest real-time ocean intelligence platform, introduced a new marine hardware standard, Bristlemouth, aimed at catalyzing more collaboration, research and innovation for big data from the oceans. Sofar CEO Tim Janssen will join partners from DARPA, the Office of Naval Research (ONR) and…
The natural resources industry can no longer afford to be a digital laggard
By John Skero, Director of Product Management at Elsevier The process of extracting natural resources has formed the backbone of modern economies – whether in mining, forestry or oil and gas exploration. But today, many organizations have recognized the urgent need to produce natural resources more sustainably. This…
Cambridge Quantum develops algorithm to accelerate Monte Carlo Integration on quantum computers
Cambridge Quantum Computing (CQC) has announced the discovery of a new algorithm that accelerates quantum Monte Carlo integration – shortening the time to quantum advantage and confirming the critical importance of quantum computing to the finance industry in particular. Monte Carlo integration – the process of numerically estimating the mean of a probability distribution by averaging…
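CQC’s quantum algorithm is not public in this excerpt, but the classical technique it accelerates is standard: estimate the mean of a distribution by averaging a function over random samples. A minimal illustrative sketch (the function, distribution, and sample count here are arbitrary choices, not CQC’s):

```python
import random

def monte_carlo_mean(f, sampler, n=100_000, seed=42):
    """Estimate E[f(X)] by averaging f over n samples drawn from sampler."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += f(sampler(rng))
    return total / n

# Example: E[X^2] for X ~ N(0, 1) is the variance, i.e. close to 1.0.
estimate = monte_carlo_mean(lambda x: x * x, lambda rng: rng.gauss(0.0, 1.0))
print(f"{estimate:.3f}")
```

The classical estimator’s error shrinks as 1/√n; quantum amplitude estimation achieves 1/n scaling, which is the quadratic speedup that work in this area targets.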
Computer-based weather forecast: New algorithm outperforms mainframe computer systems
The exponential growth in computer processing power seen over the past 60 years may soon come to a halt. Complex systems such as those used in weather forecasting, for example, require high computing capacities, but the costs for running supercomputers to process large quantities of data can become a limiting factor. Researchers at Johannes Gutenberg…
Particle physics turns to quantum computing for solutions to tomorrow’s big-data problems
Giant-scale physics experiments are increasingly reliant on big data and complex algorithms fed into powerful computers, and managing this multiplying mass of data presents its own unique challenges. To better prepare for this data deluge posed by next-generation upgrades and new experiments, physicists are turning to the fledgling field of quantum computing to find faster…
TACC Ranch Technology Upgrade Improves Valuable Data Storage
There’s a joke by comedian Steven Wright that goes, “You can’t have everything. Where would you put it?” Users of advanced computing can likely relate to this. The exponential growth of data poses a steep challenge to reliable storage. For over 12 years, the Ranch system at the Texas Advanced Computing Center…
Learning Magnets Could Lead to Energy-efficient Data Processing
The power consumption of data centers around the world is increasing. This creates a high demand for new technologies that could lead to energy-efficient computers. In a new study, physicists at Radboud University have demonstrated that this could also be achieved by using chips whose operation is inspired by that of the human brain. The…
ALCF Data Science Program Seeks Proposals for Data and Learning Projects
The Argonne Leadership Computing Facility (ALCF), a U.S. Department of Energy (DOE) Office of Science User Facility, is now accepting proposals for the ALCF Data Science Program (ADSP) through July 1, 2019. Launched in 2016, the ADSP supports data-centric computing projects that require the scale and performance of leadership-class supercomputers, such as Theta, the ALCF’s…
Largest, Fastest Array of Microscopic ‘Traffic Cops’ for Optical Communications
Engineers at the University of California, Berkeley have built a new photonic switch that can control the direction of light passing through optical fibers faster and more efficiently than ever. This optical “traffic cop” could one day revolutionize how information travels through data centers and high-performance supercomputers that are used for artificial intelligence and other…
New Metascape Platform Enables Biologists to Unlock Big-data Insights
New Computational Tool Harnesses Big Data, Deep Learning to Reveal Dark Matter of the Transcriptome
A research team at Children’s Hospital of Philadelphia (CHOP) has developed an innovative computational tool offering researchers an efficient method for detecting the different ways RNA is pieced together (spliced) when copied from DNA. Because variations in how RNA is spliced play crucial roles in many diseases, this new analytical tool will provide greater capabilities for discovering…
Open Source Software Helps Researchers Extract Key Insights From Huge Sensor Datasets
Professor Andreas Schütze and his team of experts in measurement and sensor technology at Saarland University have released a free data processing tool called simply Dave—a MATLAB toolbox that allows rapid evaluation of signals, pattern recognition and data visualization when processing huge datasets. The free software enables very large volumes of data, such as those…
NCSA Reveals Promising Diagnostics for Detecting Latent Tuberculosis
Small Babies, Big Data
The first week of a newborn’s life is a time of rapid biological change as the baby adapts to living outside the womb, suddenly exposed to new bacteria and viruses. Yet surprisingly little is known about these early changes. An international research study co-led by Boston Children’s Hospital has pioneered a technique to get huge…
New Method of Scoring Protein Interactions Mines Large Data Sets From a Fresh Angle
Researchers from the Stowers Institute for Medical Research have created a novel way to define individual protein associations in a quick, efficient and informative way. These findings, published in the March 8, 2019, issue of Nature Communications, show how the topological scoring (TopS) algorithm, created by Stowers researchers, can—by combining data sets—identify proteins that come together. The approach is…
TACC Assists in Massive Data Collection Effort in Lung Development to Help Premature Babies
In 2016, over a dozen scientists and engineers toured a neonatal intensive care unit, the section of the hospital that specializes in the care of ill or premature newborn infants. The researchers had come together from all around the country, and brought with them a wide variety of expertise. Visiting the newborns helped put into…
Big Data Harvesting Tool Will Deliver Smart Farming
Researchers from across Norwich Research Park have launched a new system for organising vast datasets on climate and crops. CropSight is a scalable and open-source information management system that can be used to maintain and collate important crop performance and microclimate information. Big data captured by diverse technologies known collectively as the Internet of Things…
Chemical Data Mining Boosts Search for New Organic Semiconductors
Producing traditional solar cells made of silicon is very energy intensive. On top of that, they are rigid and brittle. Organic semiconductor materials, on the other hand, are flexible and lightweight. They would be a promising alternative, if only their efficiency and stability were on par with traditional cells. Together with his team, Karsten Reuter,…
Supercomputing Effort Reveals Antibody Secrets
Using sophisticated gene sequencing and computing techniques, researchers at Vanderbilt University Medical Center (VUMC) and the San Diego Supercomputer Center have achieved a first-of-its-kind glimpse into how the body’s immune system gears up to fight off infection. Their findings, published this week in the journal Nature, could aid development of “rational vaccine design,” as well as…
AI and Big Data Provide the First Global Maps on Key Vegetation Traits
Citizen Science Projects Have a Surprising New Partner — the Computer
For more than a decade, citizen science projects have helped researchers harness the power of thousands of volunteers who sort through datasets too large for a small research team. Previously, this data generally couldn’t be processed by computers because the work required skills that only humans could accomplish. Now, machine learning…
Modeling Uncertain Terrain With Supercomputers
Many areas of science and engineering try to predict how an object will respond to a stimulus—how earthquakes propagate through the Earth or how a tumor will respond to treatment. This is difficult even when you know exactly what the object is made of, but what about when the object’s structure is unknown? The class…
Researchers Call for Big Data Infrastructure to Support Future of Personalized Medicine
Researcher Wins Machine-Learning Competition With Code That Sorts Through Simulated Telescope Data
A new telescope will take a sequence of high-resolution snapshots with the world’s largest digital camera, covering the entire visible night sky every few days—and repeating the process for an entire decade. That presents a big data challenge: What’s the best way to rapidly and automatically identify and categorize all of the stars, galaxies, and…