The Rackform iServ R456 is a server built around Intel Xeon E7-4800 v2 processors, formerly codenamed Ivy Bridge-EX. The server features four of these processors and up to 96 DDR3 DIMMs, triple the memory capacity of previous products based on Intel Xeon E7-4800 processors.
With cutting-edge technology, sometimes the first step scientists face is just making sure it...
Intelligent Storage Bridge (ISB) is designed to enhance the throughput and reliability of large...
The UK Science and Technology Facilities Council and Rogue Wave Software have signed a...
According to a recent study from the University of Cambridge, software developers spend about half of their time finding and fixing errors. Projected onto the global software industry, the study estimates, this amounts to a bill of about 312 billion US dollars every year.
Mellanox Technologies, a supplier of high-performance, end-to-end interconnect solutions for data center servers and storage systems, has announced a collaboration with the University of Cambridge for the Square Kilometre Array (SKA) project. The University of Cambridge selected the company’s Virtual Protocol Interconnect (VPI) solution to provide it with interconnect performance and protocol flexibility for SKA test-bed clusters. The University of Cambridge and Mellanox will use the compute clusters for various development projects for the SKA project, an international effort to build the world’s largest radio telescope.
The NAG Library for Python gives users of the Python language access to over 1,700 mathematical and statistical routines in the NAG Library. It has been updated for Python 2.7 and Python 3, and now offers an improved Pythonic interface and a new Python egg installer.
Here is an agency-by-agency summary of President Barack Obama's proposed budget for fiscal 2015, beginning next Oct. 1. The top-line figures do not include spending on automatic entitlement benefits like Medicare and Social Security. The top-line figures for each agency also omit the $55.4 billion "opportunity" initiative Obama would divide equally between domestic and military programs.
Optical data storage does not require expensive magnetic materials, as synthetic alternatives work just as well. This is the finding of an international team from York, Berlin and Nijmegen, published Thursday, February 27, in Applied Physics Letters. The team’s discovery brings the much cheaper method...
Big Data tools such as Grok and IBM Watson are enabling large organizations to behave more like agile startups. Of the transformative technology developments that have ushered in the current frenzy of activity along the information superhighway, the 1994 invention of the “Wiki” by Ward Cunningham is among the most disruptive.
Encryption and nuclear weapons are two easily recognized examples where a combinatorial explosion is a sought-after characteristic. In the software development world, combinatorial explosions are bad. In particular, it is far too easy to become lost in the minutiae of writing code that can run efficiently on NVIDIA GPUs, AMD GPUs, x86, ARM and Intel Xeon Phi while also addressing the numerous compiler and user interface vagaries of each platform.
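To make the scale of the problem concrete, here is a small illustration (our own, not from the article): every independent platform, compiler, and precision choice multiplies the number of code paths that must be written, tuned, and tested.

```python
# Our own illustration of combinatorial growth in build targets; the
# platform and compiler names below are examples, not a real build matrix.
from itertools import product

architectures = ["NVIDIA GPU", "AMD GPU", "x86", "ARM", "Intel Xeon Phi"]
compilers = ["gcc", "clang", "icc", "nvcc"]
precisions = ["single", "double"]

configurations = list(product(architectures, compilers, precisions))
print(len(configurations))  # 5 * 4 * 2 = 40 combinations to support
```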
The 10-day tour of Europe was not your typical itinerary: Garching, Karlsruhe, Villigen, Hamburg and Oxford. In January. But David Brown and Craig Tull of the Computational Research Division and Alex Hexemer of the Advanced Light Source weren’t touring to see the sights; they were more interested in seeing the lights: powerful scientific instruments known as light sources that use intense X-rays to study materials.
The Department of Energy’s National Energy Research Scientific Computing Center (NERSC) announced the winners of its second annual High Performance Computing (HPC) Achievement Awards on February 4, 2014, during the annual NERSC User Group meeting at Lawrence Berkeley National Laboratory (Berkeley Lab).
Size alone does not define big data; it is best defined as a combination of volume, velocity, variety and value. Kevin Geraghty, head of analytics at 360i, defined the goal of big data analytics well when he said: “We are trying to listen to what the customer is telling us through their behavior.” The goal of big data analytics is to make the best business decisions possible.
Just as Netflix uses an algorithm to recommend movies we ought to see, a Stanford software system offers by-the-moment advice to thousands of server-farm computers on how to efficiently share the workload. We hear a lot about the future of computing in the cloud, but not much about the efficiency of the data centers that make the cloud possible, where clusters work together to host applications ranging from big data analytics...
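As a rough illustration of the kind of decision such a system makes, here is a minimal least-loaded scheduling heuristic (our own sketch for exposition; the Stanford system described above is far more sophisticated).

```python
import heapq

def assign_jobs(job_costs, num_servers):
    """Greedily place each job on the currently least-loaded server."""
    # Min-heap of (current_load, server_index): the least-loaded server
    # is always at the front.
    heap = [(0.0, s) for s in range(num_servers)]
    heapq.heapify(heap)
    placement = {s: [] for s in range(num_servers)}
    for cost in job_costs:
        load, server = heapq.heappop(heap)
        placement[server].append(cost)
        heapq.heappush(heap, (load + cost, server))
    return placement

# Five jobs of varying cost spread across two servers:
print(assign_jobs([5.0, 3.0, 2.0, 7.0, 1.0], num_servers=2))
```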
An international team of scientists led by physicists from the University of York has paved the way for a new class of magnetic materials and devices with improved performance and power efficiency. Magnetic materials are currently used to store almost all digital information. However, with information processing and storage now making up a significant fraction of the world's energy consumption...
Lawrence Livermore has joined forces with two other national labs to deliver next generation supercomputers able to perform up to 200 peak petaflops (quadrillions of floating point operations per second), about 10 times faster than today's most powerful high performance computing (HPC) systems.
Researchers at IBM have set a new record for data transmission over a multimode optical fiber, a type of cable that is typically used to connect nearby computers within a single building or on a campus. The achievement demonstrated that the standard, existing technology for sending data over short distances should be able to meet the growing needs of servers, data centers and supercomputers through the end of this decade.
The Russian Ministry of Education and Science has awarded a $3.4 million “mega-grant” to Alexei Klimentov, Physics Applications Software Group Leader at the U.S. Department of Energy’s Brookhaven National Laboratory, to develop new “big data” computing tools for the advancement of science.
How do you build a universal quantum computer? Turns out, this question was addressed by theoretical physicists about 15 years ago. The answer was laid out in a research paper and has become known as the DiVincenzo criteria. The prescription is pretty clear at a glance; yet in practice the physical implementation of a full-scale universal quantum computer remains an extraordinary challenge.
CloudX is a reference architecture for building efficient cloud platforms. It is based on the OpenCloud architecture, which leverages off-the-shelf components of servers, storage, interconnect and software to form flexible and cost-effective public, private and hybrid clouds.
Every second, a computer must perform billions of computational steps to produce even the simplest outputs. Imagine if every one of those steps could be made just a tiny bit more efficient. Researchers have developed a series of novel devices that do just that.
Since he was a graduate student, Armando Solar-Lezama, an associate professor in MIT’s Department of Electrical Engineering and Computer Science, has been working on a programming language called Sketch, which allows programmers to simply omit some of the computational details of their code. Sketch then automatically fills in the gaps.
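The core idea can be shown in miniature (a Python sketch of the synthesis loop, our own illustration rather than the actual Sketch language): the programmer leaves a hole in the code, and a search procedure finds a value that makes every test assertion pass.

```python
def synthesize_hole(candidates, spec):
    """Return the first hole value for which the specification holds."""
    for hole in candidates:
        if spec(hole):
            return hole
    raise ValueError("no candidate satisfies the specification")

# The programmer writes double(x) = x * ?? and omits the constant;
# the assertion double(x) == x + x serves as the specification.
spec = lambda hole: all(x * hole == x + x for x in range(-10, 11))
print(synthesize_hole(range(0, 5), spec))  # prints 2
```

Real synthesizers search vastly larger spaces with constraint solvers rather than brute force, but the contract is the same: omitted details in, verified completions out.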
AT&T and IBM have announced a new global alliance agreement to develop solutions that help support the "Internet of Things." The companies will combine their analytic platforms, cloud and security technologies with privacy in mind to gain more insights on data collected from machines in a variety of industries.
Although the time and cost of sequencing an entire human genome have plummeted, analyzing the resulting three billion base pairs of genetic information from a single genome can take many months. However, a team working with Beagle, one of the world's fastest supercomputers devoted to life sciences, reports that genome analysis can be radically accelerated: the computer is able to analyze 240 full genomes in about two days.
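Much of that speedup comes from the fact that each genome can be analyzed independently, so the work parallelizes cleanly across cores and nodes. A minimal sketch of the pattern (our own illustration; the actual Beagle pipeline and its analysis tools are not reproduced here):

```python
from multiprocessing import Pool

def analyze_genome(genome_id):
    # Placeholder for the real per-genome work (alignment, variant
    # calling, annotation); each genome is processed independently.
    return genome_id, "done"

if __name__ == "__main__":
    genome_ids = ["genome_%03d" % i for i in range(240)]
    with Pool(processes=8) as pool:  # e.g. one worker per core
        results = pool.map(analyze_genome, genome_ids)
    print(len(results), "genomes analyzed")
```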
A research collaboration has demonstrated the world's fastest silicon-based device to date. The investigators from IHP-Innovations for High Performance Microelectronics in Germany and the Georgia Institute of Technology operated a silicon-germanium (SiGe) transistor at 798 gigahertz (GHz) fMAX, exceeding the previous speed record for silicon-germanium chips by about 200 GHz.
Researchers at the San Diego Supercomputer Center at the University of California, San Diego, have developed software that greatly expands the types of multi-scale QM/MM (mixed quantum and molecular mechanical) simulations of complex chemical systems that scientists can use to design new drugs, better chemicals, or improved enzymes for biofuels production.
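The bookkeeping behind one common QM/MM flavor, the subtractive scheme, fits in a few lines (a conceptual sketch with toy numbers, our own illustration rather than the SDSC software's actual method): the chemically active region is treated quantum mechanically, the surroundings classically.

```python
def qmmm_total_energy(e_mm_full, e_mm_region, e_qm_region):
    # Subtractive scheme: take the cheap classical energy of the whole
    # system, remove the classical description of the reactive region,
    # and substitute the quantum-mechanical one.
    return e_mm_full - e_mm_region + e_qm_region

# Toy numbers in arbitrary units, purely to show the bookkeeping:
print(qmmm_total_energy(e_mm_full=-120.0, e_mm_region=-30.0, e_qm_region=-42.5))
```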
The Intel Xeon processor E7 v2 family delivers the capability to process and analyze large, diverse volumes of data, unlocking information that was previously inaccessible. The processor family has triple the memory capacity of the previous generation, allowing much faster and more thorough data analysis.
Black holes may be dark, but the areas around them definitely are not. These dense, spinning behemoths twist up gas and matter just outside their event horizon, and generate heat and energy that gets radiated, in part, as light. And when black holes merge, they produce a bright intergalactic burst that may act as a beacon for their collision.