A consortium of European space scientists has succeeded in establishing a common data hub that allows the comparison of data from numerous space missions, a task that until now was hampered by the differing data processing protocols of individual missions. Furthermore, observational data can now easily be compared with theoretical numerical models, regardless of the protocols used.
Simulink is a block diagram environment for multidomain simulation and model-based design. It...
MATLAB 8.5 is a high-level language and interactive environment used by engineers and scientists...
Seahorse Scientific Workbench is a vendor-neutral software suite for capturing, analyzing and sharing analytical data. It consolidates raw and result data from multiple experimental techniques in a single tool, based on the emerging ASTM AnIML Data Standard. Seahorse Mobile delivers scientific data to mobile devices and supports chromatography (HPLC, GC), mass spectrometry, NMR, optical spectroscopy, microplate reader, bioreactor and fermenter, medical imaging and process chromatography data types.
Researchers have assembled the largest and most accurate tree of life calibrated to time. Surprisingly, it reveals that life has been expanding at a constant rate, not slowing down.
A relentless global effort to shrink transistors has made computers continually faster, cheaper and smaller over the last 40 years. This effort has enabled chipmakers to double the number of transistors on a chip roughly every 18 months — a trend referred to as Moore's Law. In the process, the U.S. semiconductor industry has become one of the nation's largest export industries, valued at more than $65 billion a year.
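The doubling trend is simple exponential arithmetic. The sketch below uses the Intel 4004's roughly 2,300 transistors as an illustrative starting point; the start count and the 18-month period are assumptions for illustration, not figures from the article:

```python
def transistor_count(years_elapsed, start=2300, doubling_years=1.5):
    """Projected transistor count, assuming a doubling every 18 months."""
    return start * 2 ** (years_elapsed / doubling_years)

# Over 40 years that is about 26.7 doublings, roughly a 100-million-fold increase.
growth = transistor_count(40) / 2300
```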
Children don’t have to be told that “cat” and “cats” are variants of the same word — they pick it up just by listening. To a computer, though, they’re as different as, well, cats and dogs. Yet it’s computers that are assumed to be superior in detecting patterns and rules, not four-year-olds. Researchers are trying, if not to solve that puzzle definitively, at least to provide the tools to do so.
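The gap the article describes can be made concrete with a toy example: a deliberately crude suffix-stripping rule (a hypothetical illustration, not the researchers' method) that lets a program group "cat" and "cats" together:

```python
def naive_stem(word):
    # Strip a plural "s", a deliberately crude rule for illustration only.
    if word.endswith("s") and not word.endswith("ss"):
        return word[:-1]
    return word

# Group surface forms under a shared stem.
words = ["cat", "cats", "dog", "dogs", "glass"]
groups = {}
for w in words:
    groups.setdefault(naive_stem(w), []).append(w)
```

Real morphological learners must discover such rules from raw input rather than having them hand-coded, which is exactly what makes the four-year-old's feat hard to replicate.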
Seahorse Scientific Workbench captures each step of the analytical workflow and presents it in its entirety.
Researchers say they can predict the spread of flu a week into the future with as much accuracy as Google Flu Trends can display levels of infection right now. The study uses social network analysis and combines the power of Google Flu Trends’ big data with traditional flu monitoring data from the CDC.
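The idea of using a real-time proxy signal to forecast next week's official counts can be sketched with a toy one-predictor regression. The numbers below are synthetic and the model is a stand-in; the study's actual approach combines social network analysis with CDC surveillance data and is far richer:

```python
def ols_fit(xs, ys):
    """Ordinary least squares slope and intercept for one predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Toy series: this week's nowcast signal paired with next week's reported cases.
nowcast = [10, 12, 15, 18, 22]
next_week = [12, 15, 18, 22, 27]
slope, intercept = ols_fit(nowcast, next_week)

# Forecast next week's level from a fresh nowcast reading of 25.
forecast = slope * 25 + intercept
```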
After working for more than 10 years to unlock an ancient piece of history, the contents of the damaged Herculaneum scrolls, UK Department of Computer Science Chair and Professor Brent Seales is taking the next step toward allowing the world to read the scrolls, which cannot be physically opened. In a major development for the venture, Seales is building software that will visualize the scrolls' writings as they would appear if unrolled.
Partek Flow 4.0 is designed specifically for the analysis needs of next-generation sequencing applications including RNA, small RNA and DNA sequencing. With the ability to either build custom analysis pipelines or download pre-built pipelines, users can perform alignment, quantification, quality control, statistics and visualization.
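The notion of composing analysis stages into a custom pipeline can be sketched generically. This is a hypothetical composition pattern, not Partek Flow's actual API, and the stage functions are trivial stand-ins for real alignment and quantification steps:

```python
def pipeline(*stages):
    """Compose analysis stages into a single callable, run in order."""
    def run(data):
        for stage in stages:
            data = stage(data)
        return data
    return run

def align(reads):
    # Stand-in for sequence alignment: here it just normalizes case.
    return [r.upper() for r in reads]

def quantify(reads):
    # Stand-in for quantification: count occurrences of each read.
    counts = {}
    for r in reads:
        counts[r] = counts.get(r, 0) + 1
    return counts

run = pipeline(align, quantify)
counts = run(["acgt", "acgt", "ttaa"])
```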
A new virtual reality system allows architects to view 3-D models of buildings in their intended shape, precisely where the buildings will be constructed. This provides a much clearer, realistic impression of the design. Digitization is fundamentally changing the work processes in architectural design, planning and construction work. Increasingly, CAD drawings are transferred to a central 3-D Building Information Model...
Although industries won’t change their working processes unless there is a mandatory need to do so, major milestones are expected in 2015 in the push to adopt data standards in our scientific community. The need to deploy these integration standards, which enable efficient sharing of knowledge across our internal and external partners, is reinforced by regulatory bodies.
It’s like a scene from a gamer’s wildest dreams: 12 high-definition, 55-inch 3-D televisions all connected to a computer capable of supporting high-end, graphics-intensive gaming. On the massive screen, images are controlled by a Wii remote that interacts with a Kinect-like Bluetooth device (called SmartTrack), while 3-D glasses worn by the user create dizzying added dimensions.
What does a black hole look like up close? As the sci-fi movie Interstellar wows audiences with its computer-generated views of one of the most enigmatic and fascinating phenomena in the universe, University of Arizona (UA) astrophysicists Chi-kwan Chan, Dimitrios Psaltis and Feryal Ozel are likely nodding appreciatively and saying something like, "Meh, that looks nice, but check out what we've got."
A mathematician has developed a new way to uncover simple patterns that might underlie apparently complex systems, such as clouds, cracks in materials or the movement of the stock market.
For the US Army, and for the DoD and intelligence community as a whole, GIS Federal developed an innovative approach to quickly filter, analyze and visualize big data from hundreds of data providers, with a particular emphasis on geospatial data.
Folk wisdom can sometimes be right on target. For example, there’s that old bromide about leading a horse to water. In this case, the water is high performance computing, and the reluctant equine is the huge base of small- to medium-sized manufacturers (SMMs) in the U.S. According to the National Center for Manufacturing Sciences, there are approximately 300,000 manufacturers in the U.S., and over 95 percent of them can be characterized as SMMs.
Highly motivated to organize the Argonne Training Program on Extreme-Scale Computing, Paul Messina reflects on what makes the program unique and a can’t-miss opportunity for the next generation of HPC scientists. ATPESC is an intense, two-week program that covers most of the topics and skills necessary to conduct computational science and engineering research on today’s and tomorrow’s high-end computers.
GeneSpring Pathway Architect software is designed to enable faster discovery of complex relationships across multi-omic data. Designed for researchers focused on genomics, proteomics, metabolomics, transcriptomics or any combination of life science disciplines, the package includes GeneSpring GX and Mass Profiler Professional, as well as Pathway Architect.
In a darkened, hangar-like space inside MIT’s Building 41, a small, Roomba-like robot is trying to make up its mind. Standing in its path is an obstacle — a human pedestrian who’s pacing back and forth. To get to the other side of the room, the robot has to first determine where the pedestrian is, then choose the optimal route to avoid a close encounter.
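The decision the robot faces, first locating the pedestrian and then routing around it, can be sketched as a shortest-path search on a grid. This is a minimal illustration of the predict-then-plan idea, not MIT's actual planner; the grid, cells and pedestrian prediction are invented for the example:

```python
from collections import deque

def plan(start, goal, blocked, size=5):
    """Breadth-first search for a shortest grid path that avoids blocked cells."""
    queue = deque([(start, [start])])
    seen = {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if (0 <= nxt[0] < size and 0 <= nxt[1] < size
                    and nxt not in blocked and nxt not in seen):
                seen.add(nxt)
                queue.append((nxt, path + [nxt]))
    return None

# Pedestrian pacing along row 2: treat its current and predicted cells as blocked.
pedestrian_cells = {(2, 1), (2, 2)}
route = plan((0, 2), (4, 2), pedestrian_cells)
```

Because BFS explores the grid level by level, the returned route is a shortest detour around the predicted pedestrian positions.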
Huygens Titan is lightweight software that indexes, finds and shows 2-D to 5-D microscopic image data. It can read all common microscope formats from any folder or subdirectory in any location to which a user has access. The software does not alter or move images.
Playing violent video games in 3-D makes everything seem more real — and that may have troubling consequences for players, a new study reveals. Researchers found that people who played violent video games in 3-D showed more evidence of anger afterward than did people who played using traditional 2-D systems — even those with large screens.
The Jefferson Project announced new milestones in a multimillion-dollar collaboration that seeks to understand and manage complex factors impacting Lake George. A new data visualization laboratory features advanced computing and graphics systems that allow researchers to visualize sophisticated models and incoming data on weather, runoff and circulation patterns. The lab will display streaming data from various sensors in real time.
NVIDIA is looking for a dozen would-be competitors for next year’s Early Stage Challenge, which takes place as part of its Emerging Companies Summit (ECS). In this seventh annual contest, hot young startups using GPUs vie for a single $100,000 grand prize.
The Oil and Gas High Performance Computing (HPC) Workshop, hosted annually at Rice University, is the premier meeting place for discussion of challenges and opportunities around high performance computing, information technology, and computational science and engineering.
NASA has formally delivered to Alaskan officials a new technology that could help pilots flying over the vast wilderness expanses of the northernmost state. The technology is designed to help pilots make better flight decisions, especially when disconnected from the Internet, telephone, flight services and other data sources normally used by pilots.
Technology is putting complex topics like severe weather and climate change on the map — literally. Mapping data associated with specific geographic locations is a powerful way to glean new and improved knowledge from data collections and to explain the results to policymakers and the public. Particularly useful is the ability to layer different kinds of geospatial data on top of one another and see how they interact.