NGS is at the heart of many of today's biomedical discoveries, and dependence on HPC to achieve these insights is growing rapidly, in direct proportion to the amount of data collected. As genomic data accumulates and analytic approaches evolve, so must the computational platforms.
Examining the present and future of platforms built around industry-standard chips Suzanne Tracy, Editor in Chief Many analysts and IT managers agree that the continued growth of Linux signals the beginning of the end of proprietary hardware and software as the solution for high-end computing. Recently, a panel of experts came together for a live Webcast entitled Scale-up Linux Goes Mainstream. The panelists examined how large scale-up Linux systems built around industry-standard computer chips are now readily available and becoming mainstream.
How to pull it all together to achieve a simple user experience Suzanne Tracy, Editor in Chief From drug discovery through the approval and monitoring life cycles, a successful knowledge management solution can go a long way toward achieving business goals. But how do you pull it all together to implement a successful platform? Recently, a panel of experts provided guidance on overcoming many of the common challenges — both technological and cultural — often associated with implementing a knowledge management solution. Discussion highlights
Designing flexible systems for rapid evolution Sponsored by Scientific Computing and Dell Using graphics processing units to run applications has become one of the hottest trends in high performance computing. The past few years have seen major changes, including the increasing migration of single instruction, multiple data (SIMD) computations to the highly parallelized GPU environments available in current hardware configurations, and the significant performance increases many HPC algorithms have achieved as a result of hybrid GPU computing.
Compute More, Consume Less Sponsored by Scientific Computing and Dell In recent years, researchers have witnessed a tremendous explosion in the amount of data generated each day, and have experienced an increase in the complexity of computations needed to effectively understand and analyze this data. This impact is most evident in high performance computing (HPC), with its very large data volumes and the high computational effort needed to process them.
Sponsored by Scientific Computing and Dell High-performance computing (HPC) has come to the forefront as a means of effectively addressing many problems in scientific, educational, research and business settings. In many cases, it is giving users the option of replacing expensive physical experimentation, design, prototyping, and testing with virtual counterparts based on computational models and solutions.
Seeing is Understanding Sponsored by Scientific Computing and Dell Visualization has become an essential tool in scientific research, spanning many areas of science and engineering research and design. As complexity and data size have scaled dramatically, so have the requirements for processing, managing, and storing the enormous amounts of data needed for scientific visualization and analysis.
Confronting the Need for Speed Sponsored by Scientific Computing and Dell The dramatic increase in standardized capability, coupled with the delivery of exceptional performance at a lower cost, positions GPU technologies at the forefront of the next wave of broad architecture adoption.
21st Century Storage Best Practices Sponsored by Scientific Computing and Dell As science drives a rapidly growing need for storage within the scientific community, existing environments are facing increasing pressure to expand capabilities while controlling costs.
An interactive, educational, streaming audio Webcast Sponsored by Hewlett-Packard As technology provides computational chemists with exponential improvements in computational capability, they are finding ways to employ the new power to investigate formerly unaddressable research problems. Join our panel of experts as they discuss new resources and technologies, best practices and successful applications.
Sponsored by Hewlett-Packard & Intel This Webcast addresses directions for next-generation sequencing, the solutions available to provide the scalable HPC infrastructure the research demands, and the successful implementation of next-generation sequencing in a clinical and translational research environment.
Sponsored by Hewlett-Packard & Intel 2006 has seen dramatic introductions of faster multi-core high-performance computing solutions and new choices of operating environments and cluster switches. But for the average computer-aided engineering user, it is a challenge to sift through these choices and select a computing solution that achieves fast, reliable, and affordable job throughput.
Webcast Archive of all Scientific Computing's past webcasts.
Leading-edge Trends and Technologies in Server Virtualization Suzanne Tracy, Editor in Chief Virtualization is a hot buzzword in the computing industry today. What does it mean? What are the benefits? What technologies are available? How might it apply in a high performance computing world? "Virtually All You Need: Leading-edge Trends and Technologies in Server Virtualization," the latest in Scientific Computing's educational Webcast series, will discuss attaining real value from server virtualization both today and tomorrow.
High performance computing goes mainstream Jennifer A. Miller, Managing Editor High performance computing is moving into the mainstream for productivity workers engaged in the engineering disciplines, research, analysis, design, rendering and animation, as well as ancillary work activities. Applications that used to run on Microsoft Windows or Linux desktops, where large-scale computing was once cost prohibitive, can now be migrated to cost-effective compute clusters for improved, scalable application performance.
Selecting and configuring scalable CAE computing solutions Suzanne Tracy, Editor in Chief The past year has seen dramatic introductions of faster multi-core high performance computing solutions, as well as new choices of operating environments and cluster switches. However, the average computer-aided engineering user is challenged to sift through these choices and to select a computing solution that achieves fast, reliable and affordable job throughput. For example, which multi-core microprocessor architecture is the fastest? Can customers get the performance