NGS is at the heart of many of today’s biomedical discoveries, and dependence on HPC to achieve these insights is growing rapidly, in direct proportion to the amount of data collected. As genomic data accumulates and analytic approaches evolve, so must the computational platforms.
Efficient and Time-Sensitive Execution of NGS Pipelines is Critical to Translational Medicine
Join us for a free webinar where James Lowey, VP of IT at TGen, Anatol Blass, lead architect for Dell Genomics Solutions, and Kristina Kermanshahche, Chief Architect for Intel Health & Life Sciences, discuss best practices in the processes and architecture of genomics processing systems. We will discuss:
- Use cases for genomics data
- Common issues and challenges with genomics data projects
- Translating research requirements from clinicians and researchers with multiple disparate projects into appropriate IT infrastructure and support
Now Available On-Demand!
BIG DATA INSIGHTS: How to Accelerate Discovery in Medicine, Research, Government, Business & More
Please join us for this webinar, now available on-demand, where you will hear Dr. Buetow discuss:
- Insights and answers that allow the data itself to enable you to identify and solve problems faced in medicine, in research, in government, in business and more
- Special features of the NGCC and the multi-dimensional compute environment with integrated platforms that enable organizations to use big data in all its complexity
- The implementation of Hadoop at ASU and its capabilities as a big data management and analytics architecture
Backup vs. Archive: What’s the difference and why are both needed to protect your laboratory data?
Join us for this educational webinar about the data management challenges organizations face today. What you'll learn:
- Why the cheapest, fastest, or most definitive data management strategy is meaningless if the data is untrustworthy
- How adding a complementary archive strategy to your existing backup strategy will not only improve efficiency, but can also reduce costs and regulatory risk
- How both approaches are viable and the business value each one can offer
Leading-edge Trends and Technologies in Server Virtualization
Suzanne Tracy, Editor in Chief
Virtualization is a hot buzzword in the computing industry today. What does it mean? What are the benefits? What technologies are available? How might it apply in a high performance computing world? "Virtually All You Need: Leading-edge Trends and Technologies in Server Virtualization," the latest in Scientific Computing's educational Webcast series, will discuss attaining real value from server virtualization both today and tomorrow.
High performance computing goes mainstream
Jennifer A. Miller, Managing Editor
High performance computing is becoming a mainstream productivity tool for workers engaged in the engineering disciplines, research, analysis, design, rendering and animation, as well as ancillary work activities. Applications that used to run on Microsoft Windows or Linux desktops, where large-scale computing was once cost prohibitive, can now be migrated to cost-effective compute clusters for improved, scalable application performance.
Selecting and configuring scalable CAE computing solutions
Suzanne Tracy, Editor in Chief
The past year has seen dramatic introductions of faster multi-core high performance computing solutions, as well as new choices of operating environments and cluster switches. However, the average computer-aided engineering user is challenged to sift through these choices and select a computing solution that achieves fast, reliable and affordable job throughput. For example, which multi-core microprocessor architecture is the fastest? Can customers get the performance
High performance computing joins the mainstream
Suzanne Tracy, Editor in Chief
The door to the high performance computing world is opening to new and exciting solutions. Once restricted to high-level researchers and academia, HPC is rapidly moving toward more mainstream applications. Recently, an expert panel came together to explore the significant shift occurring in the state of the industry, take a close look at recent trends, and provide insight into changes in today’s user experience
Examining the present and future of platforms built around industry-standard chips
Suzanne Tracy, Editor in Chief
Many analysts and IT managers agree that the continued growth of Linux signals the beginning of the end of proprietary hardware and software as the solution for high-end computing. Recently, a panel of experts came together for a live Webcast entitled Scale-up Linux Goes Mainstream. The panelists examined how large scaled-up Linux systems built around industry-standard computer chips are now readily available and becoming
How to pull it all together to achieve a simple user experience
Suzanne Tracy, Editor in Chief
From drug discovery through approval and monitoring life cycles, a successful knowledge management solution can go a long way toward helping to achieve business goals. But how do you pull it all together to implement a successful platform? Recently, a panel of experts provided guidance on overcoming many of the common challenges — both technological and cultural — that are often associated with implementing a knowledge management solution. Discussion highlights
Designing flexible systems for rapid evolution
Sponsored by Scientific Computing and Dell
Using graphics processing units to run applications has become one of the hottest trends in high performance computing. The past few years have seen major changes, including the increasing migration of single instruction, multiple data (SIMD) computations to the highly parallelized GPU environments available in current hardware configurations, and the significant performance increases many HPC algorithms have achieved through hybrid GPU computing.
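To make the SIMD-to-GPU idea concrete, here is a minimal, illustrative CUDA sketch (not taken from the webcast itself): a SAXPY kernel in which every GPU thread applies the same instruction to a different array element, while the host CPU handles allocation and data movement, which is the division of labor behind hybrid GPU computing.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Elementwise SAXPY (y = a*x + y): a classic SIMD-style computation that
// maps naturally onto the GPU's data-parallel execution model.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;              // one million elements
    const size_t bytes = n * sizeof(float);

    // Host-side buffers
    float *hx = (float *)malloc(bytes), *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device-side buffers; the CPU/GPU split is the "hybrid" in hybrid GPU computing
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f\n", hy[0]);  // expect 4.0 (2*1 + 2)

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}
```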
Compute More, Consume Less
Sponsored by Scientific Computing and Dell
In recent years, researchers have witnessed a tremendous explosion in the amount of data generated each day, and have experienced an increase in the complexity of computations needed to effectively understand and analyze this data. This impact is most evident in high performance computing (HPC), with its very large data volumes and the high computational effort needed to process them.
Sponsored by Scientific Computing and Dell
High-performance computing (HPC) has come to the forefront as a means of effectively addressing many problems in scientific, educational, research and business settings. In many cases, it is giving users the option of replacing expensive physical experimentation, design, prototyping, and testing with their virtual counterparts based on computational models and solutions
Seeing is Understanding
Sponsored by Scientific Computing and Dell
Visualization has become an essential tool in scientific research, spanning many areas of science and engineering research and design. As complexity and data size have scaled dramatically, so have the requirements for processing, managing, and storing the enormous amounts of data needed for scientific visualization and analysis
Confronting the Need for Speed
Sponsored by Scientific Computing and Dell
The dramatic increase in more standardized capability, coupled with the delivery of exceptional performance envelopes, all within a lower cost band, positions GPU technologies at the forefront of the next wave of broad architecture adoption
21st Century Storage Best Practices
Sponsored by Scientific Computing and Dell
As science drives a rapidly growing need for storage within the scientific community, existing environments are facing increasing pressure to expand capabilities while controlling costs