The drug creation process often misses side effects that kill at least 100,000 patients a year. LLNL researchers have developed a method that uses supercomputers to identify the proteins responsible for certain adverse drug reactions, processing proteins and drug compounds through an algorithm that produces reliable data for drug discovery outside of a laboratory setting.
What do the DNA in Australian seaweed, Amazon River water, tropical plants, and forest soil all...
IBM will pay $1.5 billion to Globalfoundries in order to shed its costly chip division. IBM...
The Oil and Gas High Performance Computing (HPC) Workshop, hosted annually at Rice University, is the premier meeting place for discussion of challenges and opportunities around high performance computing, information technology, and computational science and engineering.
Next-gen leaders push themselves every day to answer this key question: How can my organization make a difference? IBM is helping to deliver the answer with new apps powered by Watson to improve the quality of life. IBM's Watson is a groundbreaking platform with the ability to interact in natural language, process vast amounts of disparate forms of big data and learn from each interaction.
A little-known secret in data mining is that simply feeding raw data into a data analysis algorithm is unlikely to produce meaningful results. New discoveries often begin with comparison of data streams to find connections and spot outliers. But most data comparison algorithms today have one major weakness: somewhere, they rely on a human expert. And experts aren't keeping pace with the complexities of big data.
IBM Watson Group's global headquarters, at 51 Astor Place in New York City's Silicon Alley, is open for business. The Watson headquarters will serve as a home base for more than 600 IBM Watson employees, just part of the more than 2,000 IBMers dedicated to Watson worldwide. In addition to a sizeable employee presence, IBM is opening its doors to area developers and entrepreneurs, hosting industry workshops, seminars and networking.
Building on client demand to integrate real-time analytics with consumer transactions, IBM has announced new capabilities for its System z mainframe. The integration of analytics with transactional data can provide businesses with real-time, actionable insights on commercial transactions as they occur to take advantage of new opportunities to increase sales and help minimize loss through fraud prevention.
For centuries, scientific research has revolved around data, and as research data continues to grow exponentially, so does the importance of how it is stored. A key example of how the scientific field can tackle Big Data storage is DESY, a research organization dedicated to giving scientists worldwide faster access to insights from their samples, which makes optimal data management in a high-volume environment critical.
On September 20, early-bird pricing for the ISC Cloud and ISC Big Data registrations will be replaced with regular registration fees. With the regular rates, the passes will cost 100 Euro more for each conference, and the combined conference ticket, which allows attendees to participate in both events, will cost 150 Euro more. Thus, ISC is encouraging attendees to register this week in order to benefit from the current savings.
The National Center for Atmospheric Research (NCAR) has recently implemented an enhanced data sharing service that allows scientists increased access to data as well as improved capabilities for collaborative research. In addition to data sharing, NCAR has significantly upgraded its centralized file service, known as the Globally Accessible Data Environment (GLADE).
IBM has announced significant advances in Watson's cognitive computing capabilities that are enabling researchers to accelerate the pace of scientific breakthroughs by discovering previously unknown connections in Big Data.
In the age of big data, visualization tools are vital. With a single glance at a graphic display, a human being can recognize patterns that a computer might fail to find even after hours of analysis. But what if there are aberrations in the patterns? Or what if there’s just a suggestion of a visual pattern that’s not distinct enough to justify any strong inferences? Or what if the pattern is clear, but not what was to be expected?
Florida Polytechnic University, Flagship Solutions Group and IBM have announced a new supercomputing center at the University composed of IBM high performance systems, software and cloud-based storage, to help educate students in emerging technology fields. Florida Polytechnic University is the newest addition to the State University System and the only one dedicated exclusively to science, technology, engineering and mathematics (STEM).
NCSA’s Blue Waters project will offer a graduate course on High Performance Visualization for Large-Scale Scientific Data Analytics in Spring 2015 and is seeking university partners who are interested in offering the course for credit to their students. This semester-long online course will include video lectures, quizzes and homework assignments and will provide students with free access to the Blue Waters supercomputer.
In a society that has to understand increasingly big and complex datasets, EU researchers are turning to the subconscious for help in unraveling the deluge of information. Big Data refers to large amounts of data produced very quickly by a high number of diverse sources. Data can either be created by people or generated by machines, such as sensors gathering climate information, satellite imagery, digital pictures and videos...
The Michael J. Fox Foundation for Parkinson's Research (MJFF) and Intel have announced a collaboration aimed at improving research and treatment for Parkinson's disease — a neurodegenerative brain disease second only to Alzheimer's in worldwide prevalence. The collaboration includes a multiphase research study using a new big data analytics platform that detects patterns in participant data collected from wearable technologies.
Prof. Dr. Stefan Wrobel, M.S., is director of the Fraunhofer Institute for Intelligent Analysis and Information Systems (IAIS) and Professor of Computer Science at the University of Bonn. He studied Computer Science in Bonn and in Atlanta, GA, USA (M.S. degree, Georgia Institute of Technology), and received his doctorate from the University of Dortmund.
Dirk Slama is Director of Business Development at Bosch Software Innovations. Bosch SI is spearheading the Internet of Things (IoT) activities of Bosch, the global engineering group. As Conference Chair of the Bosch ConnectedWorld, Dirk helps shape the IoT strategy of Bosch. Dirk has over 20 years of experience in very large-scale application projects, system integration and Business Process Management. His international work experience includes projects for Lufthansa Systems, Boeing, AT&T, NTT DoCoMo, HBOS and others.
Scientists from IBM have unveiled the first neurosynaptic computer chip to achieve an unprecedented scale of one million programmable neurons, 256 million programmable synapses and 46 billion synaptic operations per second per watt. At 5.4 billion transistors, this fully functional and production-scale chip is currently one of the largest CMOS chips ever built, yet, while running at biological real time, it consumes a minuscule 70mW.
Cambridge, UK-based start-up Optalysys has stated that it is only months away from launching a prototype optical processor with the potential to deliver exascale levels of processing power on a standard-sized desktop computer. The company will demonstrate its prototype, which meets NASA Technology Readiness Level 4, in January of next year.
Big Data, it seems, is everywhere, usually characterized as a Big Problem. But researchers at Lawrence Berkeley National Laboratory are adept at accessing, sharing, moving and analyzing massive scientific datasets. At a July 14-16, 2014, workshop focused on climate science, Berkeley Lab experts shared their expertise with other scientists working with big datasets.
Enabling Innovation and Discovery through Data-Intensive High Performance Cloud and Big Data Infrastructure (July 29, 2014, by George Vacek, DataDirect Networks)
As the size and scale of life sciences datasets increases — think large-cohort longitudinal studies with multiple samples and multiple protocols — so does the challenge of storing, interpreting and analyzing this data. Researchers and data scientists are under increasing pressure to identify the most relevant and critical information within massive and messy data sets, so they can quickly make the next discovery.
In an age of “big data,” a single computer cannot always find the solution a user wants. Computational tasks must instead be distributed across a cluster of computers that analyze a massive data set together. It's how Facebook and Google mine your Web history to present you with targeted ads, and how Amazon and Netflix recommend your next favorite book or movie. But big data is about more than just marketing.
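The split-then-combine pattern the blurb describes can be sketched with Python's standard library: partition the data, analyze each chunk in a separate worker process, then merge the partial results. This is a toy illustration of the general pattern, not any particular vendor's system; the word-count task and function names are invented for the example.

```python
from collections import Counter
from multiprocessing import Pool

def count_words(chunk):
    """'Map' step: analyze one partition of the data locally."""
    return Counter(chunk.split())

def parallel_word_count(documents, workers=4):
    """Distribute partitions across worker processes, then
    'reduce' the partial counts into a single result."""
    with Pool(workers) as pool:
        partials = pool.map(count_words, documents)
    total = Counter()
    for partial in partials:
        total += partial
    return total

if __name__ == "__main__":
    docs = ["big data is big", "data moves fast", "big clusters analyze data"]
    print(parallel_word_count(docs).most_common(2))
```

Real cluster frameworks add fault tolerance, data locality and scheduling on top of this same map-and-reduce shape.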
Music fans and critics know that the music of the Beatles underwent a dramatic transformation in just a few years. But, until now, there hasn’t been a scientific way to measure the progression. Computer scientists at Lawrence Technological University have developed an artificial intelligence algorithm that can analyze and compare musical styles, enabling research into their musical progression.
Ensemble forecasting is a key part of weather forecasting. Computers typically run multiple simulations using slightly different initial conditions or assumptions, and then analyze them together to try to improve forecasts. Using Japan’s K computer, researchers have succeeded in running 10,240 parallel simulations of global weather, the largest number ever performed, using data assimilation to reduce the range of uncertainties.
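The ensemble idea described above can be sketched in a few lines: run the same model many times from slightly perturbed initial conditions, then summarize the spread of outcomes. The "model" below is a deliberately trivial stand-in for a real numerical weather simulation, and all names and parameters are invented for illustration.

```python
import random
import statistics

def toy_model(initial_temp, steps=10):
    """Stand-in for a numerical simulation: each step nudges the
    state toward an equilibrium of 20 degrees."""
    state = initial_temp
    for _ in range(steps):
        state += 0.1 * (20.0 - state)
    return state

def ensemble_forecast(observed_temp, n_members=16, perturbation=0.5, seed=0):
    """Run the model from slightly perturbed initial conditions and
    report the ensemble mean and spread (uncertainty estimate)."""
    rng = random.Random(seed)
    outcomes = [
        toy_model(observed_temp + rng.uniform(-perturbation, perturbation))
        for _ in range(n_members)
    ]
    return statistics.mean(outcomes), statistics.stdev(outcomes)

mean, spread = ensemble_forecast(15.0)
print(f"forecast: {mean:.2f} +/- {spread:.2f}")
```

Data assimilation, as used in the K computer study, goes a step further by folding new observations back into the ensemble to narrow that spread.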
IBM is making high performance computing more accessible through the cloud for clients grappling with big data and other computationally intensive activities. A new option from SoftLayer will provide industry-standard InfiniBand networking technology to connect SoftLayer bare metal servers. This will enable very high data throughput speeds between systems, allowing companies to move workloads traditionally associated with HPC to the cloud.
The second ISC Big Data conference, themed "From Data To Knowledge," builds on the success of the inaugural 2013 event. A comprehensive program has been put together by the Steering Committee under the leadership of Sverre Jarp, who officially retired as CTO of CERN openlab in March of this year.