As the fourth-largest cooperative bank in Germany, DZ Bank supports the business activities of over 900 other cooperative banks in the country. Dr. Jan Vitt, Head of IT Infrastructure at DZ Bank, will be talking about how a conservative institution like his is effectively adopting cloud computing to address the IT needs of its various business divisions.
SQream DB is a high-speed GPU-based columnar SQL database designed to uniquely address the speed...
NASA is bringing together experts spanning a variety of scientific fields for an unprecedented...
Computer Science, Statistical Methods Combine to Analyze Stunningly Diverse Genomic Big Data Collections
April 28, 2015 3:36 pm | by Simons Foundation | News
A multi-year study led by researchers from the Simons Center for Data Analysis and major...
As ubiquitous as the term “big data” has become, the path to drawing real, actionable insights hasn’t always been as clear. And the need is only becoming greater as organizations generate ever-larger amounts of structured and unstructured data. While data-intensive computing is not new to HPC environments, newer analytic frameworks, including Hadoop, are emerging as viable compasses for navigating these complex volumes of data.
Universal Resource Broker is an enterprise-class workload optimization solution for high performance, containerized and shared data centers. It is designed to enable organizations to achieve massive scalability of shared data center resources and to lay the foundation for the Internet of Things.
Cray XC40 will be First Supercomputer in Berkeley Lab’s New Computational Research and Theory Facility
April 23, 2015 3:17 pm | by NERSC and Berkeley Lab | News
The U.S. Department of Energy’s (DOE) National Energy Research Scientific Computing (NERSC) Center and Cray announced they have finalized a new contract for a Cray XC40 supercomputer that will be the first NERSC system installed in the newly built Computational Research and Theory facility at Lawrence Berkeley National Laboratory.
A new technique of visualizing the complicated relationships between anything from Facebook users to proteins in a cell provides a simpler and cheaper method of making sense of large volumes of data.
For many computationally intensive applications, such as simulation, seismic processing and rendering, overall speed is still the name of the game. However, a new branch of HPC is gaining momentum. IDC calls it “High Performance Data Analysis” (HPDA for short). Essentially, it’s the union of big data and HPC. How will these architectures evolve? Let’s start by looking at the data.
Seagate Technology has announced that four Cray customers will be among the first to implement Seagate’s latest high performance computing storage technology. Combined, the implementations of these four customers in the government, weather, oil and gas, and university sectors will consume more than 120 petabytes of storage capacity.
The Salford Predictive Modeler (SPM) software suite includes CART, MARS, TreeNet and Random Forests, as well as powerful automation and modeling capabilities. The software is designed to be a highly accurate and ultra-fast analytics and data mining platform for creating predictive, descriptive and analytical models from databases of any size, complexity or organization.
Twitter users who post information about their personal health online might be considered by some to be "over-sharers," but new research suggests that health-related tweets may have the potential to be helpful for hospitals. A predictive model using machine learning algorithms is able to predict with 75 percent accuracy how many asthma-related emergency room visits a hospital could expect on a given day.
Ryft One is an open platform to analyze streaming, historical, unstructured, and multi-structured data in real-time. It is a commercial 1U platform capable of providing fast and actionable business insights by analyzing both historical and streaming data at an unprecedented 10 Gigabytes/second or faster.
Technological advances are enabling scientists to sequence the genomes of cancer tumors, revealing a detailed portrait of genetic mutations that drive these diseases. But genomic studies are only one piece of the puzzle that is precision medicine. In order to realize the promise of this field, there needs to be an increased focus on creating robust clinical databases.
A new technology in development has the potential to revolutionize the sourcing of renewable energy from rivers.
Efficient, Time Sensitive Execution of Next-gen Sequencing Pipelines Critical for Translational Medicine
April 6, 2015 3:26 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Blogs
Demand for genomics processing is rapidly spreading from research labs to the clinical arena. Genomics is now a "must have" tool for researchers in areas of oncology and rare diseases. It is also becoming a requirement in the clinical space for precision medicine, translational medicine and similar "bench to bedside" initiatives.
It’s almost a rite of passage in physics and astronomy. Scientists spend years scrounging up money to build a fantastic new instrument. Then, when the long-awaited device finally approaches completion, the panic begins: How will they handle the torrent of data? The Square Kilometre Array will have an unprecedented ability to deliver data on the location and properties of stars, galaxies and giant clouds of hydrogen gas.
The Weather Company Migrates Data Services to IBM Cloud, Plans to Advance Internet of Things Solutions
March 31, 2015 1:43 pm | by IBM | News
IBM and The Weather Company have announced a global strategic alliance to integrate real-time weather insights into business to improve operational performance and decision-making. As part of the alliance, The Weather Company, including its global B2B division WSI, will shift its massive weather data services platform to the IBM Cloud and integrate its data with IBM analytics and cloud services.
When disaster strikes, it is critical that experts, decision makers and emergency personnel have access to real-time information in order to assess the situation and respond appropriately. It is equally critical that individuals and organizations have the capacity to analyze the wealth of data generated in the midst of the disaster and its immediate aftermath in order to produce accurate, customized warnings.
MOVIA Big Data Analytics Platform is designed to help organizations watch for important patterns in their data and generate instant alerts to users or other systems. The software enables improved prediction of trends through advanced data modeling that captures situational context, so decisions are not ‘made in a vacuum.’
Big data: It’s a term we read and hear about often, but is hard to grasp. Computer scientists tackled some big data about an important protein and discovered its connection in human history as well as clues about its role in complex neurological diseases. Through a novel method of analyzing these big data, they discovered a region encompassing the gephyrin gene on chromosome 14 that underwent rapid evolution after splitting in two...
The Association for Computing Machinery has named Michael Stonebraker of MIT recipient of the 2014 ACM A.M. Turing Award for fundamental contributions to the concepts and practices underlying modern database systems. Database systems are critical applications of computing and preserve much of the world's important data. Stonebraker invented many of the concepts that are used in almost all modern database systems.
Having a strategy in place for effective asset performance management (APM) is critical in today’s zero downtime world. To guarantee that you are fully utilizing your assets, you should consider implementing the three “M” strategy: Measure, Monitor and Manage. This allows you to best gauge the state and quality of your assets, make changes where needed before a problem arises and strategically plan for future production.
University of Pittsburgh, Carnegie Mellon University, UPMC Form Alliance to Transform Healthcare through Big Data
March 17, 2015 2:19 pm | by UPMC | News
Today’s health care system generates massive amounts of data — electronic health records, diagnostic imaging, prescriptions, genomic profiles, insurance records, even data from wearable devices. Information has always been essential for guiding care, but computer tools now make it possible to use that data to provide deeper insights. Leveraging big data to revolutionize healthcare is the focus of the Pittsburgh Health Data Alliance.
Pharmaceutical companies are under intense pressure. With patents expiring and cost pressures growing, the speed and productivity of drug discovery and manufacturing are under the microscope. It is timely, then, that researchers recently shared promising findings on Eve — an artificially-intelligent robot scientist. Eve discovered a compound with anti-cancer properties. Is this a glimpse of what the lab of the future might look like?
Scientists have long known that our ability to think quickly and recall information, also known as fluid intelligence, peaks around age 20 and then begins a slow decline. However, more recent findings, including a new study from neuroscientists at MIT and Massachusetts General Hospital (MGH), suggest that the real picture is much more complex.
The organizers of the inaugural ISC Cloud & Big Data conference are offering engineers and scientists in academia, industry and the government the opportunity to be a part of their new forum. Researchers in cloud computing and big data are encouraged to submit research papers, which will be presented to attendees during the conference proceedings.
A National Institutes of Health-led public-private partnership to transform and accelerate drug development achieved a significant milestone recently with the launch of a new Alzheimer’s Big Data portal — including delivery of the first wave of data — for use by the research community.