A case study published in The International Journal of Business Process Integration and Management demonstrates that the adoption of integrated cloud-computing solutions can lead to significant cost savings for businesses, as well as large reductions in the size of an organization's carbon footprint.
The second ISC Big Data conference, themed “From Data To Knowledge,” builds on the success of the...
Big Web sites usually maintain their own “data centers,” banks of tens or even hundreds of...
Internet access is becoming increasingly mobile, and the next billion users will experience the Internet differently from those already online. The experience of Internet connectivity is far from uniform, and observing this variety of connectivity, and how it changes over time, is important. Smartphone users around the globe can download an app and contribute their measurements to a global picture of Internet diversity and evolution.
Over the years, computer chips have gotten smaller, thanks to advances in materials science and manufacturing technologies. This march of progress, the doubling of transistors on a microprocessor roughly every two years, is called Moore’s Law. But there’s one component of the chip-making process in need of an overhaul if Moore’s Law is to continue: the chemical mixture called photoresist.
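The doubling described above is simple compound growth; a minimal sketch in Python, with the starting transistor count and time span chosen purely for illustration:

```python
def transistors(initial_count, years, doubling_period=2.0):
    """Project a transistor count under Moore's Law: the count
    doubles once every `doubling_period` years."""
    return initial_count * 2 ** (years / doubling_period)

# Illustrative numbers only: a chip with 1 billion transistors,
# projected 10 years out -> five doublings -> 32 billion.
print(transistors(1e9, 10))  # 3.2e10
```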
The National Institute of Standards and Technology (NIST) has issued for public review and comment a draft report summarizing 65 challenges that cloud computing poses to forensics investigators who uncover, gather, examine and interpret digital evidence to help solve crimes.
The Cray XC30 system will be used by a nation-wide consortium of scientists called the Indian Lattice Gauge Theory Initiative (ILGTI). The group will research the properties of a phase of matter called the quark-gluon plasma, which existed when the universe was approximately a microsecond old. ILGTI also carries out research on exotic and heavy-flavor hadrons, which will be produced in hadron collider experiments.
Scientists have demonstrated the ability to track real quantum errors as they occur, a major step in the development of reliable quantum computers. Quantum computers could vastly exceed the computational power of today’s machines, but a major obstacle stands in the way: information loss, or quantum errors. To combat errors, physicists must be able to detect that an error has occurred and then correct it in real time.
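The detect-then-correct idea has a simple classical analogue: the repetition code, where one logical bit is stored as three physical copies and a single flip is repaired by majority vote. This sketch illustrates only that classical analogue, not the quantum syndrome measurements the researchers use (which must detect errors without reading out the data):

```python
def encode(bit):
    """Encode one logical bit as three physical copies (repetition code)."""
    return [bit, bit, bit]

def flip(codeword, index):
    """Model a bit-flip error on one physical bit."""
    corrupted = list(codeword)
    corrupted[index] ^= 1
    return corrupted

def decode(codeword):
    """Detect and correct a single error by majority vote."""
    return 1 if sum(codeword) >= 2 else 0

# Any single error is caught and corrected:
assert decode(flip(encode(1), 0)) == 1
assert decode(flip(encode(0), 2)) == 0
```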
The solar panels that Idaho inventor Scott Brusaw has built aren't meant for rooftops. They are meant for roads, driveways, parking lots, bike trails and, eventually, highways. Brusaw, an electrical engineer, says the hexagon-shaped panels can withstand the wear and tear that comes from inclement weather and vehicles, big and small, to generate electricity.
The discovery 30 years ago of soccer-ball-shaped carbon molecules called buckyballs helped to spur an explosion of nanotechnology research. Now, there appears to be a new ball on the pitch. Researchers have shown that a cluster of 40 boron atoms forms a hollow molecular cage similar to a carbon buckyball. It’s the first experimental evidence that a boron cage structure — previously only a matter of speculation — does indeed exist.
IBM is announcing a new software-defined storage-as-a-service offering on IBM SoftLayer, code-named Elastic Storage on Cloud, that gives organizations access to a fully supported, ready-to-run storage environment. The service includes SoftLayer bare-metal resources and high-performance data management, and allows organizations to move data between their on-premises infrastructure and the cloud.
Registration is now open for the 2014 ISC Cloud and ISC Big Data Conferences, which will be held this fall in Heidelberg, Germany. The fifth ISC Cloud Conference will take place in the Marriott Hotel from September 29 to 30, and the second ISC Big Data will be held from October 1 to 2 at the same venue.
Michael M. Resch, the Director of the Stuttgart High Performance Computing Center (HLRS) will be talking about “HPC and Simulation in the Cloud – How Academia and Industry Can Benefit.” His keynote is of special interest to cloud skeptics, given that prior to 2011, Resch himself was a vocal cloud pessimist. Three years later, he feels that this technology provides a practical option for many users.
How using CPU/GPU parallel computing is the next logical step - My work in computational mathematics is focused on developing new, paradigm-shifting ideas in numerical methods for solving mathematical models in various fields. These include the Schrödinger equation in quantum mechanics, the elasticity model in mechanical engineering, the Navier-Stokes equations in fluid mechanics, Maxwell’s equations in electromagnetism...
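Numerical methods like those mentioned typically discretize a differential equation on a grid and update it step by step. As a minimal illustration (not from the article itself), here is one explicit finite-difference time step for the 1D heat equation, one of the simplest models in this family:

```python
def heat_step(u, alpha=0.1):
    """One explicit finite-difference time step for the 1D heat
    equation u_t = u_xx, with fixed (Dirichlet) boundary values.
    `alpha` plays the role of dt/dx^2 and must stay <= 0.5 for
    the explicit scheme to be stable."""
    new = list(u)
    for i in range(1, len(u) - 1):
        new[i] = u[i] + alpha * (u[i-1] - 2*u[i] + u[i+1])
    return new

# A hot spot in the middle diffuses toward its neighbors:
u = heat_step([0.0, 0.0, 1.0, 0.0, 0.0])
print(u)  # [0.0, 0.1, 0.8, 0.1, 0.0]
```

The inner loop is embarrassingly parallel over grid points, which is exactly why such stencil updates map well onto GPU hardware.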
In the middle of the 19th century, the massive binary system Eta Carinae underwent an eruption that ejected at least 10 times the sun's mass and made it the second-brightest star in the sky. Now, a team of astronomers has used extensive new observations to create the first high-resolution 3-D model of the expanding cloud produced by this outburst.
In quantum mechanics, interactions between particles can give rise to entanglement, a strange type of connection that could never be described by a non-quantum, classical theory. These connections, called quantum correlations, are present in entangled systems even if the objects are not physically linked. Entanglement is at the heart of what distinguishes purely quantum systems from classical ones, and of why they are potentially useful.
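A toy sketch of what "correlated without a physical link" means: measuring the Bell state (|00⟩ + |11⟩)/√2 in the computational basis gives each qubit a random outcome, yet the two outcomes always agree. The simulation below reproduces only these single-basis statistics; genuinely quantum correlations show up in other measurement bases, which no classical model like this one can match:

```python
import random

def measure_bell_pair():
    """Sample one joint computational-basis measurement of the
    Bell state (|00> + |11>)/sqrt(2): the joint outcome is a
    50/50 draw between 00 and 11, so each qubit looks random
    on its own but the pair always agrees."""
    outcome = random.randint(0, 1)
    return outcome, outcome

pairs = [measure_bell_pair() for _ in range(1000)]
assert all(a == b for a, b in pairs)  # perfectly correlated
```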
The storage capacity of hard drives is increasing explosively, but the speed at which all that data can be written has reached its limits. In Nature Communications, researchers presented a promising new technology that could allow data to be stored 1,000 times as fast. The technology, in which ultra-short laser pulses generate a ‘spin current,’ also opens the way to future optical computer chips.
Cellphone metadata has been in the news quite a bit lately, but the National Security Agency isn’t the only organization that collects information about people’s online behavior. Newly downloaded cellphone apps routinely ask to access your location information, your address book or other apps and, of course, Web sites like Amazon or Netflix track your browsing history in the interest of making personalized recommendations.
The National Nuclear Security Administration (NNSA) and Cray have entered into a contract for a next-generation supercomputer, called Trinity, to advance the mission of the Stockpile Stewardship Program. Managed by NNSA, Trinity is a joint effort of the New Mexico Alliance for Computing at Extreme Scale between Los Alamos and Sandia national laboratories, as part of the NNSA Advanced Simulation and Computing Program.
For the past 21 years, TOP500.org has been ranking supercomputers by their performance on the LINPACK Benchmark. Published twice a year, the list is eagerly anticipated by the industry. As with any such ranking, the top of the list often garners the most attention. However, focusing only on the top entries would limit one’s understanding of the different supercomputers in the TOP500...
IBM Announces $3B Research Initiative to Tackle Chip Grand Challenges for Cloud and Big Data Systems (July 9, 2014, by IBM)
IBM has announced it is investing $3 billion over the next five years in two broad research and early stage development programs to push the limits of chip technology needed to meet the emerging demands of cloud computing and Big Data systems. These investments are intended to push IBM's semiconductor innovations from today’s breakthroughs into the advanced technology leadership required for the future.
The National Science Foundation has announced a five-year, $4 million “Frontier” award to tackle the challenge of time in cyber-physical systems (CPS) — engineered systems that are built from and depend upon the seamless integration of computational algorithms and physical components. Frontier awards constitute NSF’s largest single investments in CPS.
In nearly every field of science, experiments, instruments, observations, sensors, simulations, and surveys are generating massive data volumes that grow at exponential rates. Discoverable, shareable data enables collaboration and supports repurposing for new discoveries — and for cross-disciplinary research enabled by exchange across communities that include both scientists and citizens.
The Supercomputing Conference (SC14) awards committee has announced that “A Multi-level Algorithm for Partitioning Graphs,” co-authored by Bruce Hendrickson and Rob Leland of Sandia National Laboratories, has won the prestigious Test of Time Award. The award recognizes the most transformative and inspiring research published at the SC conference and will be presented at the SC14 awards ceremony in New Orleans, LA in November 2014.
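Multi-level partitioning coarsens a graph, partitions the small coarse graph, then uncoarsens while locally refining the cut. The sketch below illustrates only the local-refinement idea, a greedy vertex-flip pass that lowers the edge cut; it is not the authors' algorithm, and it ignores the balance constraint that real partitioners enforce:

```python
def edge_cut(edges, part):
    """Count edges crossing the partition (part maps vertex -> 0 or 1)."""
    return sum(1 for u, v in edges if part[u] != part[v])

def refine(edges, part):
    """One greedy refinement pass, in the spirit of the local
    refinement used during uncoarsening: flip any vertex whose
    move lowers the edge cut, otherwise revert the move."""
    for v in part:
        before = edge_cut(edges, part)
        part[v] ^= 1                 # tentatively move v to the other side
        if edge_cut(edges, part) >= before:
            part[v] ^= 1             # revert: the move didn't help
    return part

# Two triangles joined by one edge; vertex 2 starts on the wrong side.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
part = {0: 0, 1: 0, 2: 1, 3: 1, 4: 1, 5: 1}
part = refine(edges, part)
print(edge_cut(edges, part))  # 1
```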
Tandem protein mass spectrometry is one of the most widely used methods in proteomics, the large-scale study of proteins, particularly their structures and functions. Researchers in the Marcotte group at the University of Texas at Austin are using the Stampede supercomputer to develop and test computer algorithms that let them more accurately and efficiently interpret proteomics mass spectrometry data.
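One core operation in such algorithms is matching an observed mass against candidate peptide sequences within an instrument tolerance. A minimal sketch of that matching step (not the Marcotte group's code), using approximate monoisotopic residue masses for a handful of amino acids; a real search engine uses full high-precision tables and fragment spectra, not just intact masses:

```python
# Approximate monoisotopic residue masses in daltons (small subset).
RESIDUE_MASS = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203,
    "V": 99.06841, "L": 113.08406, "K": 128.09496,
}
WATER = 18.01056  # mass of the H2O completing the peptide chain

def peptide_mass(sequence):
    """Monoisotopic mass of an intact peptide: residues plus water."""
    return sum(RESIDUE_MASS[aa] for aa in sequence) + WATER

def matches(observed, sequence, tolerance=0.01):
    """Does an observed mass match a candidate sequence within
    the instrument tolerance (in daltons)? This is the basic test
    a database-search algorithm applies to every candidate."""
    return abs(observed - peptide_mass(sequence)) <= tolerance

print(matches(146.069, "GA"))  # True: GA weighs ~146.06913 Da
```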
A stunning video based on fossils of a 410-million-year-old arachnid — one of the first predators on land — recreates the animal walking. Researchers used exceptionally preserved fossils from the Natural History Museum in London to create the video showing the most likely walking gait of the animal; the study is published in a special issue of the Journal of Paleontology.
At a busy shopping mall, shoppers walk by store windows to find attractive items to purchase. Through the windows, shoppers can see the products displayed, but may have a hard time imagining doing something beyond just looking, such as touching the displayed items or communicating with sales assistants inside the store. With TransWall, however, window shopping could become more fun and real than ever before.
The FirePro W8100 professional graphics card is designed to enable new levels of workstation performance delivered by the second-generation AMD Graphics Core Next (GCN) architecture. Powered by OpenCL, it is ideal for the next generation of 4K CAD (computer-aided design) workflows, engineering analysis and supercomputing applications.