Particle Fever, an award-winning documentary that has garnered international attention, follows six scientists during the launch of the Large Hadron Collider at the CERN facility in Switzerland. The start-up marks the beginning of the biggest and most expensive experiment in history, pushing the edge of human innovation.
The World Digital Library reached a milestone on March 6, 2014, surpassing 10,000 items with the...
This month’s review is a bit off the usual track of statistical, mathematical and genomics...
Astronomers at the University of Washington have developed a new method of gauging the...
How can organizations embrace, instead of brace for, the rapidly intensifying collision of public and private clouds, HPC environments and Big Data? The current go-to solution for many organizations is to run these technology assets in siloed, specialized environments. This approach falls short, however, typically taxing one datacenter area while others remain underutilized, functioning as little more than expensive storage space.
At Cycle Computing, we’re seeing several large trends related to Big Data and Analytics. We started talking about this concept of Big Compute back in October 2012. In many ways, it’s the collision point where HPC meets the challenges of Big Data. As our technical capabilities to collect and store data continue to expand, the problem of how we access and use that data is only growing.
Scalable Productivity and the Ever-Increasing Tie between Data Analytics, Data Management and Computation (March 7, 2014, by Barry Bolding)
Cray continues to see an increasing trend in the HPC marketplace that we are calling “data-intensive” supercomputing. The dramatic growth in scientific, commercial and social data is resulting in an expanded customer base that is asking for much more complex analysis and simulation.
In 2013, the term big data continued to dominate as a source of technology challenges, experimentation and innovation. It’s no surprise, then, that many business and IT executives are suffering from big data exhaustion, leading Gartner to deem 2013 the year the technology entered the “Trough of Disillusionment.”
Steve Conway, IDC VP for HPC, explains that, to date, most data-intensive HPC jobs in the government, academic and industrial sectors have involved the modeling and simulation of complex physical and quasi-physical systems. However, he notes that from the start of the supercomputer era in the 1960s, and even earlier, an important subset of HPC jobs has involved analytics: attempts to uncover useful information and patterns in the data itself. Cryptography, one of the original scientific-technical computing applications, falls predominantly into this category.
Scientific Computing is excited to be celebrating its 30th year in 2014, and we have a terrific line-up of new things we will be introducing throughout the coming months. This includes a new global cross-platform app that is available across multiple devices and allows you to browse and read each issue anytime, anywhere. In our latest issue, we explore the theme of “Emerging Technologies” ...
The Rackform iServ R456 is a server based on Intel Xeon E7-4800 v2 processors, formerly codenamed Ivy Bridge-EX. It features four processors and up to 96 DDR3 DIMMs, triple the memory of previous products based on Intel Xeon E7-4800 processors.
With cutting-edge technology, sometimes the first step scientists face is simply making sure it works as intended. The University of Southern California (USC) Viterbi School of Engineering is home to the USC-Lockheed Martin Quantum Computing Center, a super-cooled, magnetically shielded facility specially built to house the first commercially available quantum computing processors.
An Orbital Sciences Corporation Antares rocket is seen being rolled out to Launch Pad-0A at NASA's Wallops Flight Facility on Wallops Island, VA, January 5, 2014, in advance of a planned January 8 launch. The Antares launched a Cygnus spacecraft on a cargo resupply mission to the International Space Station.
NASA is plotting a daring robotic mission to Jupiter's watery moon Europa, a place where astronomers speculate there might be some form of life. The space agency set aside $15 million in its 2015 budget proposal to start planning some kind of mission to Europa. No details have been decided yet, but NASA chief financial officer Elizabeth Robinson said March 4, 2014, that it would be launched in the mid-2020s.
This 5x image of peripheral nerves in an E11.5 mouse embryo won 14th place in the 2013 Nikon Small World Photomicrography Competition. It was taken by Mr. Zhong Hua of the Department of Molecular Biology & Genetics, Johns Hopkins University School of Medicine, in Baltimore, MD, using confocal microscopy.
The Intelligent Storage Bridge (ISB) is designed to enhance the throughput and reliability of large data transfers, thereby increasing fast-scratch efficiency and overall application workflow performance. Used in HPC environments, it includes vendor-agnostic support for Lustre solutions, allowing organizations to bring together a wider range of HPC and enterprise storage solutions.
The Galápagos Islands are home to some of the most active volcanoes in the world, with more than 50 eruptions in the last 200 years. Yet until recently, scientists knew far more about the history of finches, tortoises, and iguanas than of the volcanoes on which these unusual fauna had evolved.
The UK Science and Technology Facilities Council and Rogue Wave Software have signed a collaboration agreement to work together on software tools to significantly increase the productivity of software development for scientific computing. As part of this new agreement, they will collaborate on next-generation HPC software tools to enhance development capabilities on the newest supercomputers.