Carnegie Science announces the launch of the Carnegie Airborne Observatory-3 (CAO-3), the most scientifically advanced aircraft-based mapping and data analytics system in civil aviation today. This third-generation aircraft has been completely overhauled from previous models, boasting a multitude of cutting-edge improvements to its onboard laboratory.
The GenomeStack Big Data Analytics platform has been developed specifically for bioinformatics...
For many computationally-intensive applications, such as simulation, seismic processing and...
The Salford Predictive Modeler (SPM) software suite includes CART, MARS, TreeNet and Random...
Twitter users who post information about their personal health online might be considered by some to be "over-sharers," but new research suggests that health-related tweets may have the potential to be helpful for hospitals: a predictive model using machine learning algorithms can predict with 75 percent accuracy how many asthma-related emergency room visits a hospital could expect on a given day.
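The study's own model is not described here, but the basic idea can be sketched as a simple regression: use the daily count of asthma-related tweets as a predictor of daily ER visits. The data below is synthetic and the linear form is an illustrative assumption, not the researchers' method.

```python
# Illustrative sketch (not the study's actual model): predict daily
# asthma-related ER visits from daily asthma-related tweet counts
# using ordinary least squares on synthetic data.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: tweets per day -> ER visits per day.
tweets = rng.integers(10, 200, size=365).astype(float)
visits = 2.0 + 0.05 * tweets + rng.normal(0.0, 1.5, size=365)

# Fit y = a + b * x via least squares on [1, x].
X = np.column_stack([np.ones_like(tweets), tweets])
coef, *_ = np.linalg.lstsq(X, visits, rcond=None)

def predict_visits(tweet_count: float) -> float:
    """Expected ER visits for a given daily tweet count."""
    return float(coef[0] + coef[1] * tweet_count)
```

A real system would add features such as air quality, season and local demographics, and validate accuracy on held-out days.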
Ryft One is an open platform for analyzing streaming, historical, unstructured and multi-structured data in real time. It is a commercial 1U platform that delivers fast, actionable business insights by analyzing both historical and streaming data at an unprecedented 10 gigabytes per second or faster.
A consortium of European space scientists has succeeded in establishing a common data hub that allows the comparison of data from numerous space missions, a task that until now was hampered by the differing data-processing protocols of individual missions. Furthermore, observational data can now easily be compared with theoretical numerical models, regardless of the protocols used.
It’s almost a rite of passage in physics and astronomy. Scientists spend years scrounging up money to build a fantastic new instrument. Then, when the long-awaited device finally approaches completion, the panic begins: How will they handle the torrent of data? The Square Kilometre Array will have an unprecedented ability to deliver data on the location and properties of stars, galaxies and giant clouds of hydrogen gas.
The Weather Company Migrates Data Services to IBM Cloud, Plans to Advance Internet of Things Solutions | March 31, 2015 1:43 pm | by IBM
IBM and The Weather Company have announced a global strategic alliance to integrate real-time weather insights into business to improve operational performance and decision-making. As part of the alliance, The Weather Company, including its global B2B division WSI, will shift its massive weather data services platform to the IBM Cloud and integrate its data with IBM analytics and cloud services.
When disaster strikes, it is critical that experts, decision makers and emergency personnel have access to real-time information in order to assess the situation and respond appropriately. It is equally critical that individuals and organizations have the capacity to analyze the wealth of data generated in the midst of the disaster and its immediate aftermath in order to produce accurate, customized warnings.
MOVIA Big Data Analytics Platform is designed to help organizations watch for important patterns in their data and generate instant alerts to users or other systems. The software enables improved prediction of trends through advanced data modeling that captures situational context, so decisions are not ‘made in a vacuum.’
Smart grids help avoid blackouts and deter cyber attacks. They also pose new challenges. As power generation — and the communication and information processing associated with it — shifts from centralized power stations to distributed, heterogeneous systems, massive amounts of sensor data from stations must be transmitted efficiently and effectively analyzed in real time.
Having a strategy in place for effective asset performance management (APM) is critical in today’s zero downtime world. To guarantee that you are fully utilizing your assets, you should consider implementing the three “M” strategy: Measure, Monitor and Manage. This allows you to best gauge the state and quality of your assets, make changes where needed before a problem arises and strategically plan for future production.
University of Pittsburgh, Carnegie Mellon University, UPMC Form Alliance to Transform Healthcare through Big Data | March 17, 2015 2:19 pm | by UPMC
Today’s health care system generates massive amounts of data — electronic health records, diagnostic imaging, prescriptions, genomic profiles, insurance records, even data from wearable devices. Information has always been essential for guiding care, but computer tools now make it possible to use that data to provide deeper insights. Leveraging big data to revolutionize healthcare is the focus of the Pittsburgh Health Data Alliance.
For those on the front lines of treating cancer, speed and precision are key to patients’ survival. Pediatric cancer researchers have been making incredible strides in accelerating delivery of new diagnostic and treatment options. Supercomputer-powered genetic diagnosis is being used to harness the power of high throughput genomic and proteomic methods and is playing a key role in improving the outcome for children with genetic diseases.
New research will likely be crucial to measuring the impact of climate change on thunderstorms — one of the weather occurrences most problematic for human life on the planet. The varying frequency and intensity of thunderstorms have direct repercussions for the public, agriculture and industry.
In the largest collaborative study of the brain to date, researchers from the Keck School of Medicine of the University of Southern California (USC) led a global consortium of 190 institutions to identify eight common genetic mutations that appear to age the brain an average of three years. The discovery could lead to targeted therapies and interventions for Alzheimer’s disease, autism and other neurological conditions.
On Wednesday, January 21, Scientific Computing will host a live panel discussion that looks at how big data and data science have fast become the next frontier for innovation, competition and productivity. One of today’s significant advances in data science introduces us to the Next Generation Cyber Capability (NGCC) at Arizona State University (ASU)...
IBM has announced that it received a record 7,534 patents in 2014 — marking the 22nd consecutive year that the company topped the annual list of U.S. patent recipients. IBM inventors earned an average of more than 20 patents per day in 2014, making the company the first to surpass 7,000 patents in a single year.
Researchers have found that, based on enough Facebook Likes, computers can judge your personality traits better than your friends, family and even your partner. Using a new algorithm, researchers have calculated the average number of Likes artificial intelligence (AI) needs to draw personality inferences about you as accurately as your partner or parents.
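The published algorithm is not reproduced here, but Like-based personality inference is commonly framed as a linear model over a binary user-by-Like matrix. The sketch below uses ridge regression on synthetic data; the trait scores, weights and regularization value are all illustrative assumptions.

```python
# Hedged sketch (not the published algorithm): inferring a personality
# trait score from a binary user-x-Like matrix via ridge regression.
import numpy as np

rng = np.random.default_rng(1)
n_users, n_likes = 200, 50

# Synthetic data: each Like page carries a hidden weight; a user's
# trait score is the weighted sum of their Likes, plus noise.
likes = rng.integers(0, 2, size=(n_users, n_likes)).astype(float)
true_w = rng.normal(0.0, 1.0, size=n_likes)
trait = likes @ true_w + rng.normal(0.0, 0.5, size=n_users)

# Ridge regression: w = (X^T X + lambda * I)^-1 X^T y
lam = 1.0
w = np.linalg.solve(likes.T @ likes + lam * np.eye(n_likes),
                    likes.T @ trait)

def predict_trait(like_vector: np.ndarray) -> float:
    """Predicted trait score for one user's binary Like vector."""
    return float(like_vector @ w)
```

As the number of observed Likes grows, predictions of this kind become more accurate, which is the effect the researchers quantified against human judges.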
In the coming year, while consumers will be treated to a dizzying array of augmented reality, wearables, and low-cost 3-D printers, computer researchers will be tackling the underlying technology issues that make such cutting-edge consumer electronics products possible. IEEE Computer Society has announced the top 10 most important technology trends for 2015 and explores how these technologies will be integrated into daily life.
Environmental Intelligence: Significant Investment in Next-Gen Supercomputers to Improve Weather Forecasts | January 6, 2015 12:26 pm | by NOAA
NOAA has announced the next phase in the agency’s efforts to increase supercomputing capacity to provide more timely, accurate, reliable and detailed forecasts. By October 2015, the capacity of each of NOAA’s two operational supercomputers will jump to 2.5 petaflops, for a total of 5 petaflops — a nearly tenfold increase from the current capacity.
With drug-resistant bacteria on the rise, even common infections that were easily controlled for decades — such as pneumonia — are proving trickier to treat with standard antibiotics. New drugs are desperately needed, but so are ways to maximize the effective lifespan of these drugs. Researchers used software they developed to predict a constantly evolving infectious bacterium's countermoves to one of these new drugs ahead of time...
Researchers have detected at least three instances of cross-species mating that likely influenced the evolutionary paths of “old world” mice, two in recent times and one in the distant past. They think these instances of introgressive hybridization are only the first of many needles waiting to be found in a very large genetic haystack. The finding suggests that hybridization in mammals may not be an evolutionary dead end.
IBM announced that the U.S. Department of Veterans Affairs is using Watson technology in a pilot to assist physicians and help accelerate evidence-based medical decision-making. The VA joins leading healthcare organizations that are working with IBM Watson to help improve the efficiency and quality of care being delivered. The VHA will also work with Watson on a clinical focus supporting veterans with PTSD.
Although there is a diverse range of applications for predictive analytics in R&D, two common basic requirements are data and insight. Data may be generated by running experiments/analyses, or re-applied from previous work when available. Insights come from application of knowledge — both explicit and tacit. There are a variety of roles for informatics in predictive analytics...
An international competition using the wisdom of crowds has developed computer algorithms to detect, predict, and ultimately prevent epileptic seizures. A total of 504 teams competed in two challenges, one for Seizure Detection and a second for Seizure Prediction.
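The winning entries were far more sophisticated than any single heuristic, but the detection task can be illustrated with one classic EEG feature: "line length," the sum of absolute sample-to-sample differences, which jumps during high-amplitude, high-frequency seizure activity. The signal, window size and threshold below are all synthetic assumptions for illustration.

```python
# Illustrative sketch only: flag candidate seizure windows in an EEG
# trace by thresholding line length, a simple detection feature.
import numpy as np

def line_length(window: np.ndarray) -> float:
    """Sum of absolute sample-to-sample differences in one window."""
    return float(np.abs(np.diff(window)).sum())

def detect(signal: np.ndarray, win: int, threshold: float) -> list:
    """Start indices of windows whose line length exceeds threshold."""
    return [i for i in range(0, len(signal) - win + 1, win)
            if line_length(signal[i:i + win]) > threshold]

# Synthetic trace: quiet baseline with a high-frequency burst in the
# middle standing in for a seizure.
rng = np.random.default_rng(2)
t = np.arange(1000)
eeg = rng.normal(0.0, 0.2, size=1000)
eeg[400:600] += np.sin(t[400:600] * 1.5) * 3.0
```

Competition entries typically combined many such features across channels and fed them to trained classifiers rather than a fixed threshold.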
PayPal engineers developed a platform for real‐time event analytics using HPC designs on new hardware technology.
The predictive analytics landscape covers a wide variety of techniques and methods designed to derive insights from data. These techniques have been used successfully for many years on structured data. In recent times, the volume and variety of data available for analysis has exploded, and most of this data is in non-traditional forms.