Cloud computing is not only the latest revolution in the Information and Communication Technology (ICT) world, but also a key enabler of innovation and economic development. Within the framework of the CLOUDS project, Madrid-based researchers have made crucial scientific advances in the state of the art of cloud computing.
IBM announced it is collaborating with DESY, a national research center in Germany, to speed up...
Enabling Innovation and Discovery through Data-Intensive High Performance Cloud and Big Data Infrastructure (July 29, 2014, by George Vacek, DataDirect Networks)
As the size and scale of life sciences datasets increase — think large-cohort longitudinal...
Interest in magnetic random access memory (MRAM) is escalating, thanks to demand for fast, low-...
IBM is announcing a new software-defined storage-as-a-service on IBM SoftLayer, code-named Elastic Storage on Cloud, that gives organizations access to a fully supported, ready-to-run storage environment. The service includes SoftLayer bare-metal resources and high-performance data management, and allows organizations to move data between their on-premises infrastructure and the cloud.
The storage capacity of hard drives is increasing explosively, but the speed at which all that data can be written has reached its limits. In Nature Communications, researchers presented a promising new technology that could allow data to be stored 1,000 times as fast. The technology, in which ultra-short laser pulses generate a ‘spin current,’ also opens the way to future optical computer chips.
Cellphone metadata has been in the news quite a bit lately, but the National Security Agency isn’t the only organization that collects information about people’s online behavior. Newly downloaded cellphone apps routinely ask to access your location information, your address book or other apps and, of course, Web sites like Amazon or Netflix track your browsing history in the interest of making personalized recommendations.
In nearly every field of science, experiments, instruments, observations, sensors, simulations, and surveys are generating massive data volumes that grow at exponential rates. Discoverable, shareable data enables collaboration and supports repurposing for new discoveries — and for cross-disciplinary research enabled by exchange across communities that include both scientists and citizens.
ThinkParQ and Q-Leap Networks have announced a close partnership to deliver scalable storage solutions to their customers that are easy to deploy and operate. Based on the BeeGFS parallel file system and the Qlustar cluster OS, the solutions will enable customers to build extremely fast storage for a wide range of workloads.
At the 2014 International Supercomputing Conference in Leipzig, Germany, global supercomputer leader Cray Inc. announced the launch of a new data management and protection solution for Lustre file systems. This new capability is designed to simplify the management and preservation of data stored on Lustre, using Cray Tiered Adaptive Storage (TAS).
The recent PRACE Days conference in Barcelona provided powerful reminders that massive data doesn't always become big data — mainly because moving and storing massive data can cost massive money. PRACE is the Partnership for Advanced Computing in Europe, and the 2014 conference was the first to bring together scientific and industrial users of PRACE supercomputers located in major European nations.
It is the breakthrough that physicists and chemists around the world have long anticipated, and it will play a pivotal role in information technology in coming years. Researchers have managed, for the first time, to directly observe the 100-percent spin polarization of a Heusler compound. The physicists at Mainz University have demonstrated that the Heusler compound Co2MnSi has the necessary electronic properties.
The University of California, Santa Cruz, (UCSC) is advancing astrophysics research using a high-performance storage solution — Dell | Terascala HPC Storage Solution (DT-HSS) — part of Hyades, the university’s powerful astrophysics supercomputer installed in 2013.
The SSG-5018A-A(R/S)12L SuperStorage Server is a compact, specialized storage server for cold storage, designed to minimize power consumption and reduce cooling requirements. It does so by spinning down or powering off idle drives and managing data streams via Supermicro’s compact, low-power Intel Atom C2750-based serverboard.
IBM researchers announced they have demonstrated a new record of 85.9 billion bits of data per square inch in areal data density on low-cost linear magnetic particulate tape — a significant update to one of the computer industry's most resilient data storage technologies for Big Data.
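To get a feel for what an areal density of 85.9 billion bits per square inch implies, the following back-of-envelope sketch computes the raw capacity of a cartridge. The tape dimensions are illustrative assumptions (a nominal 1,000 m reel of standard half-inch tape), not figures from IBM's announcement, and real cartridges lose some of this to formatting overhead:

```python
# Back-of-envelope cartridge capacity at 85.9 Gbit/in^2 areal density.
# Reel length and tape width are assumed values for illustration only.
AREAL_DENSITY_BITS_PER_SQ_IN = 85.9e9
TAPE_LENGTH_M = 1000.0   # assumed reel length
TAPE_WIDTH_IN = 0.5      # standard half-inch tape

INCHES_PER_M = 1000.0 / 25.4
area_sq_in = TAPE_LENGTH_M * INCHES_PER_M * TAPE_WIDTH_IN
capacity_bits = AREAL_DENSITY_BITS_PER_SQ_IN * area_sq_in
capacity_tb = capacity_bits / 8 / 1e12   # decimal terabytes

print(f"{capacity_tb:.0f} TB raw capacity")  # on the order of 200 TB
```

Even under these rough assumptions, the result lands in the hundreds of terabytes per cartridge, which is why the demonstration matters for Big Data archiving.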
Elastic Storage is capable of reducing storage costs up to 90 percent by automatically moving data onto the most economical storage device. The technology allows enterprises to exploit the exploding growth of data in a variety of forms generated by devices, sensors, business processes and social networks.
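The cost savings described above come from automated tiering: data that is rarely touched migrates to cheaper media. Below is a minimal sketch of such a placement policy. The tier names, per-GB costs, and age thresholds are invented for illustration; they are not Elastic Storage's actual policy engine.

```python
# Minimal sketch of policy-driven storage tiering: recently used data
# stays on fast media, cold data migrates to the cheapest tier.
# All tiers, costs, and thresholds below are assumed values.
import time

TIERS = {"flash": 0.90, "disk": 0.30, "tape": 0.03}  # assumed $/GB/month
COLD_AFTER_DAYS = 30

def place(last_access_epoch, now=None):
    """Choose a tier for a file based on how recently it was accessed."""
    now = time.time() if now is None else now
    age_days = (now - last_access_epoch) / 86400
    if age_days < 1:
        return "flash"   # hot: accessed within the last day
    if age_days < COLD_AFTER_DAYS:
        return "disk"    # warm: accessed within the threshold window
    return "tape"        # cold: cheapest tier

now = time.time()
print(place(now - 3600, now))        # accessed an hour ago
print(place(now - 90 * 86400, now))  # untouched for 90 days
```

A production system would evaluate such rules continuously and migrate blocks transparently, but the core decision is this kind of access-age-versus-cost trade-off.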
Icebreaker HPC is a family of highly scalable, high performance file-system solutions based on EMC VNX and EMC Isilon storage. Currently available with either Lustre or NFS file-systems, the family integrates EMC VNX and Isilon platforms with Penguin standard rack infrastructure in a form factor that allows the easy addition of other HPC solutions.
Supermicro 42U rack object-based storage clusters feature end-to-end 10GbE interconnectivity across triple redundant 1U Monitor nodes and compute/capacity balanced 3.5” HDD/SSD storage servers in 2U 12x bay, 4U 36x bay and 4U 72x bay configurations.
Chris Catherasoo has broad expertise in state-of-the-art supercomputers, high-end visualization systems, high-performance storage and networking, including hardware, software, scheduling and operations. He also has technical expertise in numerical methods and algorithm development, and in software design and development, including programming (Fortran and C), code testing and validation, configuration control and documentation.
Silicon Mechanics announces its adoption of NexentaStor 4.0, the next generation of the Software Defined Storage (SDS) solution from partner Nexenta. Nexenta's SDS solution is a key building block of the Software Defined Data Center and one that is helping companies transition away from massively expensive storage systems.
Technology Academy Finland (TAF) has named innovator Prof. Stuart Parkin winner of the 2014 Millennium Technology Prize, the prominent award for technological innovation. Parkin receives the Prize in recognition of his discoveries, which have enabled a thousand-fold increase in the storage capacity of magnetic disk drives. Parkin’s innovations have led to a huge expansion of data acquisition and storage capacities.
Intelligent Storage Bridge (ISB) is designed to enhance the throughput and reliability of large data transfers, thereby increasing fast scratch efficiency and overall application workflow performance. Used in HPC environments, it includes vendor-agnostic support of Lustre solutions, allowing organizations to bring together a wider range of HPC and enterprise storage solutions.
Optical data storage does not require expensive magnetic materials as synthetic alternatives work just as well. This is the finding of an international team from York, Berlin and Nijmegen, published Thursday February 27 in Applied Physics Letters. The team’s discovery brings the much cheaper method...
The Intel Xeon processor E7 v2 family delivers capabilities to process and analyze large, diverse amounts of data to unlock information that was previously inaccessible. The processor family has triple the memory capacity of the previous generation, allowing much faster and more thorough data analysis.
ScienceCloud is a SaaS-based information management and collaboration workspace for externalized life science research and development. It is designed to advance collaborative drug discovery with a new generation of integrated applications built on a scalable, cloud-based scientific platform.
As computers enter ever more areas of our daily lives, the amount of data they produce has grown enormously. But for this “big data” to be useful it must first be analyzed, meaning it needs to be stored in such a way that it can be accessed quickly when required.
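A toy illustration of that last point: appending raw records is cheap, but answering queries quickly requires organizing the data behind an index. The schema here (sensor readings keyed by an id) is invented for the example.

```python
# Raw records support only full scans; building an index once makes
# per-key lookups constant-time instead of O(n) per query.
from collections import defaultdict

records = [
    ("sensor-a", 12.1),
    ("sensor-b", 7.4),
    ("sensor-a", 12.9),
]

# One O(n) pass to index the data by key.
index = defaultdict(list)
for sensor_id, value in records:
    index[sensor_id].append(value)

print(index["sensor-a"])  # [12.1, 12.9]
```

Real big-data systems use far more elaborate structures (B-trees, column stores, distributed hash partitions), but the principle is the same: pay an organizing cost up front so access is fast when required.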
China's Lenovo Group is buying IBM's server business for $2.3 billion, expanding a product line-up dominated by PCs, tablets and smartphones. Lenovo, the world's biggest personal computer maker, said January 23, 2014, it expects to offer jobs to 7,500 IBM employees as part of its acquisition of the x86 server business.
This month, the Texas Advanced Computing Center (TACC) at The University of Texas at Austin, US, along with technology partners HP and NVIDIA, will deploy Maverick — a powerful, high-performance visualization and data analytics resource for the open science and engineering community.
The sixth generation of enterprise X-Architecture for System x and PureSystems servers provides improvements in the performance and economics of x86-based systems for analytics and cloud. As users adopt analytics for greater business insight and move critical workloads like ERP, analytics and database to the cloud for increased efficiency and lower costs, x86-based systems are a popular choice.