This delightful and informative guide from my friends at No Starch Press comes with the following cover blurb: “Statistics Done Wrong is a pithy, essential guide to statistical blunders in modern science that will show you how to keep your research blunder-free.” It is somewhat pithy, but as to blunder-free, I will quote the old maxim that “nothing is foolproof, as fools are so very clever.” Still, the book has much to recommend it.
Electromagnetic radiation – it might sound like something that you’d be better off avoiding, but electromagnetic waves of various kinds underpin our senses and how we interact with the world – from the light emissions through which your eyes perceive these words, to the microwaves that carry the Wi-Fi signal to your laptop or phone on which you’re reading it.
The World Health Organization reports that cardiovascular diseases are the number one cause of death globally. Working to address this pressing public health problem, researchers worldwide are seeking new ways to accelerate research, raise the accuracy of diagnoses and improve patient outcomes. Several initiatives have utilized ground-breaking new simulations to advance research into aspects such as rhythm disturbances and ...
The HPC and enterprise communities are experiencing a paradigm shift as FLOPS per watt, rather than raw FLOPS (floating-point operations per second), becomes the guiding metric in procurements, system design, and now application development. In short, “performance at any cost” is no longer viable, as the operational costs of supercomputer clusters are now on par with the acquisition cost of the hardware itself.
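The efficiency-first metric described above is easy to illustrate. The sketch below is a minimal, hypothetical comparison; the machine names, peak-FLOPS figures and power draws are invented for the example, not drawn from any real procurement.

```python
# Hypothetical illustration: ranking two systems by FLOPS per watt
# rather than by raw FLOPS. All numbers are made up for the sketch.

def flops_per_watt(peak_flops, power_watts):
    """Energy efficiency: floating-point operations per second per watt."""
    return peak_flops / power_watts

# "System A": faster in absolute terms, but power-hungry.
a = flops_per_watt(peak_flops=20e15, power_watts=10e6)  # 2e9 FLOPS/W

# "System B": lower peak performance, but far more efficient.
b = flops_per_watt(peak_flops=15e15, power_watts=5e6)   # 3e9 FLOPS/W

# Under a "performance at any cost" metric A wins; under an
# efficiency-first metric B does.
print(b > a)  # prints: True
```

The point of the toy numbers is that the ranking flips depending on which metric drives the procurement, which is exactly the shift the blurb describes.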
We computational chemists are an impatient lot. Although we routinely deal with highly complicated chemical processes running on our laboratory’s equally complex HPC clusters, we want answers in minutes or hours, not days, months or even years. In many instances, that’s just not feasible; in fact, there are times when the magnitude of the problem simply exceeds the capabilities of the HPC resources available to us.
As scientific researchers, we are often surprised by some of the assumptions made about us by those outside our profession. So we put together a list of common myths we and our colleagues have heard anecdotally regarding scientific researchers.
Elon Musk has built a US$12 billion company in an endeavour to pave the way to Mars for humanity. He insists that Mars is a “long-term insurance policy” for “the light of consciousness” in the face of climate change, extinction events, and our recklessness with technology. On the other hand, astronaut Chris Hadfield is sceptical: “Humanity is not going extinct,” he told me.
For those on the front lines of treating cancer, speed and precision are key to patients’ survival. Pediatric cancer researchers have been making incredible strides in accelerating delivery of new diagnostic and treatment options. Supercomputer-powered genetic diagnosis is being used to harness the power of high throughput genomic and proteomic methods and is playing a key role in improving the outcome for children with genetic diseases.
The author of this wonderful text delivers a brief, easy-to-absorb, yet very comprehensive treatment of modeling real-world data with Maple. Maple is software for performing mathematics, with a none-too-steep learning curve. In the introduction, the author is quick to point out that this is neither a detailed textbook on mathematical modeling nor a comprehensive guide to Maple. It is, however, a very well-written manual on introductory modeling and the use of Maple.
Optimization for high performance and energy efficiency is a necessary next step after verifying that an application works correctly. In the HPC world, profiling means collecting data from hundreds to potentially many thousands of compute nodes over the length of a run. In other words, profiling is a big-data task, but one where the rewards can be significant, including potentially saving megawatts of power or reducing the time to solution.
Quick, think of a four-letter name beginning with “N” for a federal agency involved in space science. Though NASA or NOAA would rightfully pop into mind first, crossword puzzle aficionados should know that NIST would be a correct answer as well — because the National Institute of Standards and Technology has been an integral part of readying technology for blastoff for decades.
This is not a text for the novice. However, for math/statistics aficionados, there is much to be gained. The book’s great strength lies in two areas: the first is Peter Congdon’s generation of an excellent bibliography of the most modern techniques available; the second is his (slightly) more straightforward explanations of the strengths and weaknesses of these techniques and suggestions for optimizing the results.
Qlucore is a software platform for the analysis of genomics, proteomics and related data. As with most statistical and genomics software, it generates an immediate graphic for most analyses. Its specific areas of use include gene expression, protein arrays, DNA methylation, miRNA, proteomics, and pattern and structure identification in multivariate data.
Although industries won’t change working processes unless compelled to do so, major milestones are expected in 2015 in the push to adopt data and integration standards in our scientific community. The need to deploy these integration standards, enabling efficient sharing of knowledge across our internal and external partners, is reinforced by regulatory bodies.
A sense of urgency and economic impact emphasized: the “hardware first” ethic is changing. Hardware retains the glamour, but there is now the stark realization that the newest parallel supercomputers will not realize their full potential without re-engineering the software to efficiently divide computational problems among the thousands of processors that make up next-generation many-core computing platforms.