On a telescope at the summit of Mauna Kea in Hawaii, it’s not easy to put in a full night of work. At 14,000 feet, you’re operating at only 60 percent of the oxygen available at sea level, which makes concentrating difficult. Top that off with a shift that begins at 6:30 pm and ends at 6:30 am, and it becomes hard to imagine astronomers working like that year-round. Luckily, most of us don’t have to.
Today's LIMS allow research institutions to monitor and manage a broad array of biomedical research processes end-to-end and remotely. But how do they accommodate the ongoing flood of discoveries in areas such as genetics, the -omics, regenerative medicine and behavior; continual adjustments to workflows and protocols; tens of thousands of animals; and the evolution of legislative, welfare quality, and ethics directives?
These days, a LIMS seems to feature in every scientist's life, and for some small and medium-sized labs, open source code is the way forward. In fact, businesses have grown up around helping labs implement open source LIMS and learn to make modifications in house. A bridge too far for a nonprofessional? Not according to Greg Wilson, who believes that most scientists can easily learn enough to slip into coding...
Quantum computing has been a concept since the 1980s, yet it has remained outside the domain of real-world HPC. Through the era of Moore’s law, with its exponential progress in feature size, clock rates, and resulting performance, alternative paradigms and technologies attracted little interest. But a curiosity has remained within a limited community, driving slow but persistent advances in the associated ideas.
All the computing power in the world isn’t useful if the software built to access it is poorly designed. And we’re all much more discerning about user interfaces and usability: we expect our laboratory software to behave as intuitively as our smartphones. After all, laboratory employees are unlikely to be preoccupied with lines of code and processors; they’re focused more on how easy the software is to use.
There are a number of excellent commercial performance analysis tools on the market. Their big drawback is that they cost money. As a result, acquisition of such software falls through the cracks: most funding agencies discourage or prohibit the use of grant money for infrastructure improvements, and few grant authors are willing to take money away from research. Open-source tools can fill this gap.
This delightful and informative guide from my friends at No Starch Press comes with the following cover blurb: “Statistics Done Wrong is a pithy, essential guide to statistical blunders in modern science that will show you how to keep your research blunder-free.” It is somewhat pithy, but as to blunder-free, I will quote the old maxim that “nothing is foolproof, as fools are so very clever.” Still, the book has much to recommend it.
Electromagnetic radiation – it might sound like something that you’d be better off avoiding, but electromagnetic waves of various kinds underpin our senses and how we interact with the world – from the light emissions through which your eyes perceive these words, to the microwaves that carry the Wi-Fi signal to your laptop or phone on which you’re reading it.
The World Health Organization reports that cardiovascular diseases are the number one cause of death globally. Working to address this pressing public health problem, researchers worldwide are seeking new ways to accelerate research, increase the accuracy of diagnoses, and improve patient outcomes. Several initiatives have used groundbreaking new simulations to advance research into aspects such as rhythm disturbances and ...
The HPC and enterprise communities are experiencing a paradigm shift as FLOPS per watt, rather than raw FLOPS (floating-point operations per second), is becoming the guiding metric in procurements, system design, and now application development. In short, “performance at any cost” is no longer viable, as the operational costs of supercomputer clusters are now on par with the acquisition cost of the hardware itself.
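To make the “on par” claim concrete, here is a minimal back-of-the-envelope sketch in Python. The power draw, electricity price, PUE overhead, lifetime, and acquisition price below are purely illustrative assumptions, not figures from any real procurement; they simply show how a sustained megawatt-class load can run up an electricity bill comparable to the hardware purchase over a five-year lifetime.

```python
# Illustrative estimate of operating cost vs. acquisition cost for an HPC system.
# All input figures are assumptions for the sake of the example, not real data.

power_draw_mw = 1.0               # assumed sustained system power draw, in megawatts
electricity_usd_per_kwh = 0.10    # assumed electricity price
pue = 1.5                         # assumed power usage effectiveness (cooling/overhead)
lifetime_years = 5                # assumed system lifetime
acquisition_cost_usd = 5_000_000  # assumed hardware purchase price

hours_per_year = 24 * 365
annual_energy_kwh = power_draw_mw * 1_000 * hours_per_year * pue
annual_cost_usd = annual_energy_kwh * electricity_usd_per_kwh
lifetime_cost_usd = annual_cost_usd * lifetime_years

print(f"Annual electricity cost:   ${annual_cost_usd:,.0f}")
print(f"Lifetime electricity cost: ${lifetime_cost_usd:,.0f}")
print(f"Acquisition cost:          ${acquisition_cost_usd:,.0f}")
```

With these assumed inputs, the lifetime electricity bill comes to roughly $6.6M against a $5M purchase price, which is the order-of-magnitude comparison behind the shift away from “performance at any cost.”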
We computational chemists are an impatient lot. Although we routinely deal with highly complicated chemical processes simulated on our laboratory’s equally complex HPC clusters, we want answers in minutes or hours, not days, months, or even years. In many instances, that’s just not feasible; in fact, there are times when the magnitude of the problem simply exceeds the capabilities of the HPC resources available to us.
As scientific researchers, we are often surprised by some of the assumptions made about us by those outside our profession. So we put together a list of common myths about scientific researchers that we and our colleagues have heard anecdotally.
Elon Musk has built a US$12 billion company in an endeavour to pave the way to Mars for humanity. He insists that Mars is a “long-term insurance policy” for “the light of consciousness” in the face of climate change, extinction events, and our recklessness with technology. On the other hand, astronaut Chris Hadfield is sceptical: “Humanity is not going extinct,” he told me.
For those on the front lines of treating cancer, speed and precision are key to patients’ survival. Pediatric cancer researchers have been making incredible strides in accelerating the delivery of new diagnostic and treatment options. Supercomputer-powered genetic diagnosis harnesses high-throughput genomic and proteomic methods and is playing a key role in improving outcomes for children with genetic diseases.
The author of this wonderful book delivers a brief, easy-to-absorb, yet comprehensive text on modeling real-world data with Maple. Maple is software for performing mathematics, with a none-too-steep learning curve. In the introduction, the author is quick to point out that this is neither a detailed textbook of mathematical modeling nor a comprehensive guide to Maple. It is, however, a very well-written manual of introductory modeling and the use of Maple.