The WOS 360 2.0 object storage platform enables secure public and private cloud deployments and delivers efficient data-protection options for data-intensive environments. It offers a full suite of options to support use cases spanning enterprise, cloud, BYOD shared storage, deep archive, video streaming, and file sync-and-share offerings.
Large-scale scientific organizations are grappling with the implications of rapid data growth...
In case you missed it, here's another chance to catch this week's biggest hits.
Researchers have successfully demonstrated pattern recognition using a magnonic holographic memory device, a development that could greatly improve speech and image recognition hardware. Pattern recognition focuses on finding patterns and regularities in data. The uniqueness of the demonstrated work is that the input patterns are encoded into the phases of the input spin waves.
The German Climate Computing Center is managing the world's largest climate simulation data archive, used by climate researchers worldwide. The archive consists of more than 40 petabytes of data and is projected to grow by roughly 75 petabytes annually over the next five years. As climate simulations are carried out on increasingly powerful supercomputers, massive amounts of data are produced that must be effectively stored and analyzed.
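The growth figures quoted above can be sanity-checked with simple arithmetic (a back-of-the-envelope projection, not an official DKRZ forecast):

```python
# Project the climate archive's size from the figures quoted above:
# more than 40 PB today, growing by roughly 75 PB per year for five years.
current_pb = 40
annual_growth_pb = 75
years = 5

projected_pb = current_pb + annual_growth_pb * years
print(f"Projected archive size after {years} years: ~{projected_pb} PB")
# → roughly 415 PB, i.e. approaching half an exabyte
```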
It’s often said that no two human fingerprints are exactly alike. For that reason, police often use them as evidence to link suspects to crime scenes. The same goes for silicon chips: Manufacturing processes cause microscopic variations in chips that are unpredictable, permanent and effectively impossible to clone.
As ubiquitous as the term “big data” has become, the path to drawing real, actionable insights hasn’t always been as clear. And the need is only growing as organizations generate ever-greater amounts of structured and unstructured data. While data-intensive computing is not new to HPC environments, newer analytic frameworks, including Hadoop, are emerging as viable compasses for navigating these complex data volumes.
Ever since computers have been small enough to be fixtures on desks, their central processing has functioned something like an atomic Etch A Sketch, with electromagnetic fields pushing data bits into place to encode data. Unfortunately, the same drawbacks and perils of the mechanical sketch board have been just as pervasive in computing: making a change often requires starting over, and dropping the device could wipe out the memory.
Cray XC40 will be First Supercomputer in Berkeley Lab’s New Computational Research and Theory Facility
April 23, 2015 3:17 pm | by NERSC and Berkeley Lab | News
The U.S. Department of Energy’s (DOE) National Energy Research Scientific Computing (NERSC) Center and Cray announced they have finalized a new contract for a Cray XC40 supercomputer that will be the first NERSC system installed in the newly built Computational Research and Theory facility at Lawrence Berkeley National Laboratory.
The GenomeStack Big Data Analytics platform has been developed specifically for bioinformatics researchers, data scientists and analysts conducting genome research. The database replaces the traditional file-based, manual process for storing and analyzing genome sequenced data.
Seagate Technology has announced that four Cray customers will be among the first to implement Seagate’s latest high performance computing storage technology. Combined, the implementations of these four customers in the government, weather, oil and gas, and university sectors will consume more than 120 petabytes of storage capacity.
IBM scientists have demonstrated an areal recording density of 123 billion bits of uncompressed data per square inch on low-cost, particulate magnetic tape, a breakthrough which represents the equivalent of a 220 terabyte tape cartridge that could fit in the palm of your hand.
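The areal-density figure above can be turned into a rough capacity estimate. The tape dimensions below are my own assumptions (half-inch tape with ~12.65 mm recordable width, ~1,000 m per cartridge), not numbers from the announcement; the result lands in the same order of magnitude as the quoted 220 TB, with the difference attributable to actual tape length and formatting overhead.

```python
# Rough sanity check of the tape-capacity claim.
# Assumed (not from the announcement): half-inch tape (~12.65 mm
# recordable width) and ~1,000 m of tape per cartridge.
MM_PER_INCH = 25.4

density_bits_per_in2 = 123e9                # 123 billion bits per square inch
tape_width_in = 12.65 / MM_PER_INCH         # ~0.498 in
tape_length_in = 1000 * 1000 / MM_PER_INCH  # 1,000 m expressed in inches

area_in2 = tape_width_in * tape_length_in
capacity_tb = density_bits_per_in2 * area_in2 / 8 / 1e12
print(f"~{capacity_tb:.0f} TB per cartridge")  # on the order of 300 TB
```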
Efficient, Time Sensitive Execution of Next-gen Sequencing Pipelines Critical for Translational Medicine
April 6, 2015 3:26 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Blogs
Demand for genomics processing is rapidly spreading from research labs to the clinical arena. Genomics is now a "must have" tool for researchers in areas of oncology and rare diseases. It is also becoming a requirement in the clinical space for precision medicine, translational medicine and similar "bench to bedside" initiatives.
Total has chosen SGI to upgrade its Pangea supercomputer. Total is one of the largest integrated oil and gas companies in the world, with activities in more than 130 countries. Its 100,000 employees put their expertise to work in every part of the industry — the exploration and production of oil and natural gas, refining, chemicals, marketing and new energies. The upgraded system would rank in the top 10 of the latest TOP500 list.
Researchers have demonstrated the first-ever recording of optically encoded audio onto a non-magnetic plasmonic nanostructure, opening the door to multiple uses in informational processing and archival storage.
The U.S. Centers for Disease Control and Prevention (CDC) reports that the 2014 Ebola epidemic is the largest in history, affecting multiple countries in West Africa. As of February 26, 2015, the CDC had tracked 23,816 cases, and Ebola had already claimed nearly 10,000 lives.
The drive toward exascale computing, renewed emphasis on data-centric processing, energy efficiency concerns, and limitations of memory and I/O performance are all working to reshape HPC platforms, according to Intersect360 Research’s Top Six Predictions for HPC in 2015. The report cites many-core accelerators, flash storage, 3-D memory, integrated networking, and optical interconnects as just some of the technologies propelling future...
One of the fundamental open problems in computer science is effective data storage. Unfortunately, magnetic and flash storage devices alone have proven too unreliable to guarantee data availability and survivability, due to their frequent and unpredictable failures. ATOMICDFS aims to investigate whether highly efficient distributed file systems (DFS) can provide atomic guarantees in harsh environments.
How can we preserve today’s knowledge for the next millennia? Researchers have found a way to store information in the form of DNA, preserving it nearly indefinitely. Because encapsulation in silica is roughly comparable to preservation in fossilized bones, the researchers could draw on prehistoric data about long-term stability to calculate a prognosis: stored at low temperatures, DNA-encoded information can survive for extraordinarily long periods.
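The core idea of DNA data storage can be illustrated with a toy mapping of two bits per nucleotide. This is only a sketch of the principle; the actual scheme described above adds error-correcting codes and avoids biologically problematic sequences.

```python
# Toy illustration of DNA data storage: map each 2-bit pair to one of
# the four nucleotides. Real schemes add error correction and avoid
# long repeats; this only shows the basic encoding idea.
BASES = "ACGT"  # 00 -> A, 01 -> C, 10 -> G, 11 -> T

def encode(data: bytes) -> str:
    strand = []
    for byte in data:
        for shift in (6, 4, 2, 0):          # four 2-bit pairs per byte
            strand.append(BASES[(byte >> shift) & 0b11])
    return "".join(strand)

def decode(strand: str) -> bytes:
    out = bytearray()
    for i in range(0, len(strand), 4):      # four bases per byte
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)

print(encode(b"Hi"))  # each byte becomes four bases
```

A round trip (`decode(encode(data)) == data`) recovers the original bytes exactly, which is why long-term chemical stability of the strand is the only remaining preservation problem.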
The Seagate EVault Backup Target Appliance is a backup device for large enterprises and service providers seeking data protection for multiple operating systems and appliances, including Oracle databases. It is based on Seagate’s hybrid cloud model, which supports environments where both private (on-premises) and public (off-premises) storage are used.
Researchers have succeeded in switching tiny, magnetic structures using laser light and tracking the change over time. In the process, a nanometer-sized area bizarrely reminiscent of the Batman logo appeared. The research results could render data storage on hard drives faster, more compact and more efficient.
The Cray Sonexion 2000 system combines Cray’s expertise in designing, scaling and managing end-to-end Lustre solutions with a unique architecture that allows for maximum scalability.
SiteSync is a high-speed, parallel replication and file transfer solution designed to solve the challenge of moving massive data sets across the LAN quickly and with ease. The software can replicate data up to 10x faster than conventional file transfer utilities. It provides fast and flexible parallel data migration and synchronization for disaster recovery, disk-to-disk backup and remote archive applications.
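SiteSync's implementation is proprietary, but the general technique it describes, replicating many files concurrently rather than one at a time, can be sketched with a thread pool (a minimal illustration, not the product's actual design):

```python
# Minimal sketch of parallel file replication with a thread pool.
# This only illustrates the general idea of copying many files
# concurrently; it is not how SiteSync itself is implemented.
import shutil
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def replicate(src_dir: str, dst_dir: str, workers: int = 8) -> int:
    """Copy every file under src_dir to dst_dir, preserving layout."""
    src, dst = Path(src_dir), Path(dst_dir)
    files = [p for p in src.rglob("*") if p.is_file()]

    def copy_one(p: Path) -> None:
        target = dst / p.relative_to(src)
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(p, target)             # copy data and timestamps

    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(copy_one, files))     # list() surfaces any errors
    return len(files)
```

For many small files, overlapping transfers this way hides per-file latency; large-file speedups additionally require splitting each file into chunks and moving the chunks in parallel.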
The ActiveStor 16 hybrid scale-out NAS appliance is powered by the PanFS 6.0 storage operating system. Based on a fifth-generation storage blade architecture, it is designed to deliver performance that increases with scale, alongside enterprise-grade reliability that improves at scale, for the energy, finance, government, life sciences, manufacturing, media and university research markets.
WOS S3 is designed for efficient storing and sharing of massive quantities of big data. It adds support for the industry-standard Amazon S3 API, offering the broadest range of supported interfaces to an object storage platform in the industry.
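Because WOS S3 speaks the standard Amazon S3 API, existing S3 clients can target it simply by overriding the endpoint. The helper below shows the path-style object addressing such clients use; the endpoint hostname and bucket names are purely hypothetical examples.

```python
# Path-style S3 object addressing against a custom endpoint, as used
# when pointing a standard S3 client at an S3-compatible store.
# The hostname and bucket below are hypothetical examples.
from urllib.parse import quote

def s3_object_url(endpoint: str, bucket: str, key: str) -> str:
    """Build a path-style S3 URL: <endpoint>/<bucket>/<key>."""
    return f"{endpoint.rstrip('/')}/{bucket}/{quote(key)}"

print(s3_object_url("https://wos.example.com", "climate-data", "run42/output.nc"))
# → https://wos.example.com/climate-data/run42/output.nc
```

With full S3 SDKs the same idea appears as an endpoint override, for example the `endpoint_url` parameter when constructing a boto3 S3 client.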
Cray announced it has been awarded a contract to provide the Met Office in the United Kingdom with multiple Cray XC supercomputers and Cray Sonexion storage systems. Consisting of three phases spanning multiple years, the $128 million contract expands Cray’s presence in the global weather and climate community, and is the largest supercomputer contract ever for Cray outside of the United States.
Russian scientists have developed a theoretical model of quantum memory for light, adapting the concept of a hologram to a quantum system. The authors demonstrate for the first time that it is theoretically possible to retrieve, on demand, a given portion of the stored quantized light signal of a holographic image, emitted in a given direction at a given position in the time sequence.
In a keynote speech at IBM Enterprise, Jamie Thomas, General Manager, Storage and Software Defined Systems at IBM, unveiled a bold strategy for the company’s storage business. Building upon the Software Defined Storage portfolio announced last May, IBM is focusing its storage business on a new model for enterprise data storage that is optimized for interoperability across hardware and software solutions.