Large-scale scientific organizations are grappling with the implications of rapid data growth. Massive data collections, analytics and the need for data collaboration are driving demand for high-performance storage solutions that deliver fast time to results. A breed of technologies originally developed for the supercomputing industry is being adapted to meet the needs of technical computing organizations.
The PowerEdge C6320 server is purpose-built for high-performance computing and hyper-converged...
IBM and Bon Appétit have introduced a one-of-a-kind Chef Watson cognitive computing cooking app...
Broad Institute is teaming up with Google Genomics to explore how to break down major technical...
Dell has announced an extended partnership with TGen to help clinical researchers and doctors globally expand the reach and impact of the world's first FDA-approved precision medicine trial for pediatric cancer. The renewed commitment includes an additional $3 million Dell grant to support continued collaboration with TGen and the NMTRC on expanded neuroblastoma and pediatric cancer clinical trials in EMEA.
New York Scientific Data Summit is a no-fee annual meeting that aims to accelerate data-driven discovery and innovation by bringing together researchers, developers and end-users from academia, industry, utilities and state and federal governments. Jointly organized by Brookhaven National Laboratory, Stony Brook University and New York University, this year’s conference will take place from August 2 to 5, 2015, at NYU.
ISC Events has announced that registration is now open for the inaugural ISC Cloud & Big Data event, which will be held this fall in Frankfurt, Germany. The entire conference will take place at the Frankfurt Marriott hotel, located in the city center. The three-day event will kick off with a full-day workshop on September 28, followed by the main program on September 29 and 30.
The Internet contains a vast trove of information - sometimes called the "Deep Web" - that isn't indexed by search engines: information that would be useful for tracking criminals, terrorist activities, sex trafficking and the spread of diseases. Scientists could also use it to search for images and data from spacecraft.
In case you haven’t caught them yet, here's a recap of this week's most popular stories. Looking at the universe as a hologram; diesel fuel from carbon dioxide and water; first observations of a rare subatomic process; a big data history of music charts; secrets of colossal, invisible waves; perceptions of dress colors; and more are among the top hits.
Everyone has heard the old adage that time is money. In today’s society, business moves at the speed of making a phone call, looking something up online via your cell phone, or posting a tweet. So, when time is money (and can be a lot of money), why are businesses okay with waiting weeks or even months to get valuable information from their data?
Advances in technology have generated vast amounts of “omics” data: genomic, epigenomic, transcriptomic, proteomic and metabolomic changes for all types of specimens. Bridging the gap between data generation and investigators’ ability to retrieve and interpret data is essential to realize the biological and clinical value of this wealth of information.
Evolutionary biologists and computer scientists have come together to study the evolution of pop music. Their analysis of 17,000 songs from the US Billboard Hot 100 charts, 1960 to 2010, is the most substantial scientific study of the history of popular music to date. They studied trends in style, the diversity of the charts, and the timing of musical revolutions.
The ISC Cloud & Big Data Research Committee is accepting submissions until Tuesday, May 19, 2015. The Research Paper Sessions “aim to provide first-class open forums for engineers and scientists in academia, industry and government to present and discuss issues, trends and results to shape the future of cloud computing and big data.” The sessions will be held on Tuesday, September 29 and on Wednesday, September 30, 2015.
IBM Watson is collaborating with more than a dozen leading cancer institutes to accelerate the ability of clinicians to identify and personalize treatment options for their patients. The institutes will apply Watson's advanced cognitive capabilities to reduce from weeks to minutes the ability to translate DNA insights, understand a person's genetic profile and gather relevant information from medical literature to personalize treatment.
Protecting the world from destruction by asteroids sounds like superhuman power, but NASA scientists work tirelessly to ensure that humans today are protected from this potential harm. Asteroids need to be hunted in order to identify which ones may endanger Earth, and analyzing the big data puzzle of asteroid detection has been an arduous process. That is, until the power of crowdsourcing was discovered.
As the emergence of social media, cloud and big data continues to fuel the digital evolution, today’s digital workplace must drive new levels of employee engagement, operational efficiency and service excellence. To help deliver on this digital transformation, EMC has announced new enhancements across its Documentum portfolio of ECM applications, enabling users to further address next-generation ECM.
As the fourth largest cooperative bank in Germany, DZ Bank supports the business activities of over 900 other cooperative banks in the country. Dr. Jan Vitt, the Head of IT Infrastructure at DZ Bank will be talking about how a conservative institution like his is effectively adopting cloud computing to address the IT needs of their various business divisions.
SQream DB is a high-speed GPU-based columnar SQL database designed to uniquely address the speed, scalability and efficiency hurdles that face big data analytics. It is capable of processing and analyzing high volumes of data while delivering strong price/performance.
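The speed advantage of a columnar layout comes from scanning only the columns a query actually references. A minimal, pure-Python sketch of the idea (this is an illustration of columnar storage in general, not SQream DB's engine; the table and column names are made up):

```python
# Hypothetical sales table stored column-wise: one contiguous list per column.
price    = [1.5, 2.0, 3.0, 4.0, 5.0]
quantity = [10,  20,  30,  40,  50]
region   = ["EU", "US", "EU", "APAC", "EU"]

# Equivalent of: SELECT SUM(price * quantity) WHERE region = 'EU'
# Only the three referenced columns are read; in a columnar engine,
# unreferenced columns never leave disk or GPU memory.
total = sum(p * q for p, q, r in zip(price, quantity, region) if r == "EU")
print(total)  # 1.5*10 + 3.0*30 + 5.0*50 = 355.0
```

In a row-oriented store, the same query would have to read every field of every row; column-wise storage is what lets analytic scans approach memory-bandwidth speeds.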
NASA is bringing together experts spanning a variety of scientific fields for an unprecedented initiative dedicated to the search for life on planets outside our solar system. The Nexus for Exoplanet System Science, or “NExSS,” hopes to better understand the various components of an exoplanet, as well as how a planet, its host star and neighboring planets interact to support life.
Computer Science, Statistical Methods Combine to Analyze Stunningly Diverse Genomic Big Data Collections — April 28, 2015, by Simons Foundation
A multi-year study led by researchers from the Simons Center for Data Analysis and major universities and medical schools has broken substantial new ground, establishing how genes work together within 144 different human tissues and cell types in carrying out those tissues’ functions. The paper also demonstrates how computer science and statistical methods may combine to analyze genomic ‘big-data’ collections.
As ubiquitous as the term “big data” has become, the path for drawing real, actionable insights hasn’t always been as clear. And the need is only becoming greater as organizations generate ever-greater amounts of structured and unstructured data. While data-intensive computing is not new to HPC environments, newer analytic frameworks, including Hadoop, are emerging as viable compasses for navigating these complex volumes of data.
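Frameworks like Hadoop are built around the MapReduce pattern: a map step emits key/value pairs, a shuffle groups them by key, and a reduce step aggregates each group. A single-process sketch of that pattern using the canonical word-count example (Hadoop distributes the same three phases across a cluster; this toy runs them in one script):

```python
from collections import defaultdict

def map_phase(record):
    # Emit a (word, 1) pair for each word in the input record.
    for word in record.lower().split():
        yield word, 1

def shuffle(pairs):
    # Group values by key, as the framework's shuffle step would.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    # Aggregate all values emitted for one key.
    return key, sum(values)

records = ["big data big insights", "data drives insights"]
pairs = [kv for r in records for kv in map_phase(r)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(counts["data"])  # 2
```

The value of the pattern is that map and reduce are independent per record and per key, so the framework can parallelize both phases freely.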
Universal Resource Broker is an enterprise-class workload optimization solution for high performance, containerized and shared data centers. It is designed to enable organizations to achieve massive scalability of shared data center resources and to lay the foundation for the Internet of Things.
Cray XC40 will be First Supercomputer in Berkeley Lab’s New Computational Research and Theory Facility — April 23, 2015, by NERSC and Berkeley Lab
The U.S. Department of Energy’s (DOE) National Energy Research Scientific Computing (NERSC) Center and Cray announced they have finalized a new contract for a Cray XC40 supercomputer that will be the first NERSC system installed in the newly built Computational Research and Theory facility at Lawrence Berkeley National Laboratory.
A new technique of visualizing the complicated relationships between anything from Facebook users to proteins in a cell provides a simpler and cheaper method of making sense of large volumes of data.
For many computationally intensive applications, such as simulation, seismic processing and rendering, overall speed is still the name of the game. However, a new branch of HPC is gaining momentum. IDC calls it “High Performance Data Analysis” (HPDA for short). Essentially, it’s the union of big data and HPC. How will these architectures evolve? Let’s start by looking at the data.
Seagate Technology has announced that four Cray customers will be among the first to implement Seagate’s latest high performance computing storage technology. Combined, the implementations of these four customers in the government, weather, oil and gas, and university sectors will consume more than 120 petabytes of storage capacity.
The Salford Predictive Modeler (SPM) software suite includes CART, MARS, TreeNet and Random Forests, as well as powerful automation and modeling capabilities. The software is designed to be a highly accurate and ultra-fast analytics and data mining platform for creating predictive, descriptive and analytical models from databases of any size, complexity or organization.
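At the heart of CART-style trees (one of the modeling engines named above) is a greedy search for the split that most reduces class impurity. A toy, pure-Python sketch of that split search using Gini impurity (the data and threshold search are illustrative only, not SPM's implementation):

```python
def gini(labels):
    # Gini impurity of a binary label set: 2p(1-p), 0 when pure.
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)  # fraction of class 1
    return 2 * p * (1 - p)

def best_split(xs, ys):
    # Try every observed value as a threshold; keep the split whose
    # size-weighted child impurity is lowest -- CART's greedy step.
    best_threshold, best_score = None, float("inf")
    for t in sorted(set(xs)):
        left  = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best_score:
            best_threshold, best_score = t, score
    return best_threshold, best_score

# Made-up one-feature data: class 0 clusters low, class 1 clusters high.
xs = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
ys = [0,   0,   0,   1,    1,    1]
threshold, impurity = best_split(xs, ys)
print(threshold, impurity)  # 3.0 0.0 -- a perfect split
```

A full tree builder applies this search recursively to each child node; ensembles such as Random Forests repeat it over bootstrapped samples and random feature subsets.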
Twitter users who post information about their personal health online might be considered by some to be "over-sharers," but new research suggests that health-related tweets may have the potential to be helpful for hospitals. A predictive model using machine learning algorithms was able to predict, with 75 percent accuracy, how many asthma-related emergency room visits a hospital could expect on a given day.
Ryft One is an open platform to analyze streaming, historical, unstructured, and multi-structured data in real-time. It is a commercial 1U platform capable of providing fast and actionable business insights by analyzing both historical and streaming data at an unprecedented 10 Gigabytes/second or faster.