Advances in information technology have had a profound impact on disaster management. Computer technology that can mine data from social media during natural and other disasters could provide invaluable insights for rescue workers and decision makers.
The new HITS research group “Data Mining and Uncertainty Quantification” analyzes large amounts...
Josiah Stamp said: “The individual source of the statistics may easily be the weakest link.”...
Researchers are developing computers capable of "approximate computing" to perform calculations good enough for certain tasks that don't require perfect accuracy, potentially doubling efficiency and reducing energy consumption.
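The blurb above doesn't show what such an accuracy-for-efficiency trade-off looks like in practice. A minimal Python sketch of one classic approximate-computing technique, "loop perforation" (all names and data here are hypothetical, not any vendor's API), samples a fraction of the input and accepts a small, bounded error in exchange for proportionally less work:

```python
# Toy illustration of approximate computing via loop perforation:
# examine every k-th element instead of all of them, trading a small,
# bounded accuracy loss for roughly k-fold fewer operations.

def exact_mean(values):
    """Baseline: examine every element."""
    return sum(values) / len(values)

def approx_mean(values, stride=4):
    """Perforated loop: examine only every `stride`-th element."""
    sampled = values[::stride]
    return sum(sampled) / len(sampled)

# Synthetic workload: 100,000 values cycling through 0..250.
data = [float(i % 251) for i in range(100_000)]
exact = exact_mean(data)
approx = approx_mean(data, stride=4)   # ~4x less work, small error
```

For tasks such as image processing or sensor aggregation, the approximate answer is often indistinguishable in effect from the exact one while touching only a quarter of the data.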
Everything leading up to the actual coding, figuring out how to make it work, is what Samak enjoys most. One of the problems she is working on with the Department of Energy’s Joint Genome Institute (JGI) is a data mining method to automatically identify errors in genome assembly, replacing the current approach of manually inspecting the assembly.
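The article doesn't describe Samak's method in detail. One common automated signal for misassembly is anomalous read-coverage depth, since collapsed repeats and chimeric joins distort how many reads map to a region. A toy sketch along those lines (entirely hypothetical data and thresholds, not JGI's actual pipeline):

```python
# Hypothetical sketch (not Samak's or JGI's actual method): flag windows
# of a genome assembly whose read-coverage depth deviates strongly from
# the assembly-wide mean, a common automated signal for misassembly.
from statistics import mean, stdev

def flag_suspect_windows(coverage, z_threshold=3.0):
    """Return indices of windows whose coverage z-score exceeds the threshold."""
    mu, sigma = mean(coverage), stdev(coverage)
    return [i for i, c in enumerate(coverage)
            if sigma > 0 and abs(c - mu) / sigma > z_threshold]

# 20 windows of roughly 30x coverage, with one spike at index 10
# (a pattern consistent with a collapsed repeat).
cov = [30, 31, 29, 30, 28, 32, 30, 29, 31, 30,
       120,
       30, 29, 31, 30, 28, 32, 30, 29, 31]
suspects = flag_suspect_windows(cov)
```

A production pipeline would combine several such signals (paired-end distance, soft-clipped reads), but the appeal over manual inspection is the same: the machine scans every window, and a human reviews only the flagged ones.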
In his 1937 book, "Think and Grow Rich," author Napoleon Hill identified 13 steps to success, one of which was the power of the mastermind. "No two minds ever come together without thereby creating a third, invisible, intangible force, which may be likened to a third mind," Hill wrote.
Recent announcements by Intel and NVIDIA indicate that massively parallel computing with GPUs and Intel Xeon Phi will no longer require passing data via the PCIe bus. The bad news is that these standalone devices are still in the design phase and are not yet available for purchase.
IBM announced on August 24, 2013, that it had added nine new academic collaborations to its more than 1,000 partnerships with universities across the globe, focusing on Big Data and analytics - all of which are designed to prepare students for the 4.4 million jobs expected to be created worldwide to support Big Data by 2015. The company also announced more than $100,000 in awards for Big Data curricula.
The HPC market is entering a kind of perfect storm. For years, HPC architectures have tilted farther and farther away from optimal balance between processor speed, memory access and I/O speed. As successive generations of HPC systems have upped peak processor performance without corresponding advances in per-core memory capacity and speed, the systems have become increasingly compute-centric.
The 14th annual KDnuggets Software Poll, conducted in May 2013, attracted record participation of 1,880 internet voters, more than doubling the previous year's numbers. KDnuggets.com is a data mining portal and newsletter publisher for the data mining community with more than 12,000 subscribers.
The time may be fast approaching for researchers to take better advantage of the vast amount of valuable patient information available from U.S. electronic health records. Lian Duan, an NJIT computer scientist with expertise in data mining, has done just that with the recent publication of "Adverse Drug Effect Detection" in the IEEE Journal of Biomedical and Health Informatics (March 2013).
Pathway Studio, a research solution for biologists, is now available in a Web-based version. The integrated data mining and visualization software features comprehensive knowledge bases produced by applying MedScan, Elsevier’s proprietary text-mining technology, to a large corpus of biological literature.
When it officially came online at the San Diego Supercomputer Center (SDSC) in early January 2012, Gordon was instantly impressive. In one demonstration, it sustained more than 35 million input/output operations per second--then a world record.
i3D Enterprise Service integrates storage, processing and data mining in an enterprise-level private cloud. Laboratory data can be automatically and securely uploaded from instruments to a private cloud and processed on the cloud, enabling workflow execution and data mining in a fraction of the time.
SampleManager 11 laboratory information management system (LIMS) features advanced tools that are designed to improve laboratory process mapping, management and automation. Users can build workflows to reflect their individual laboratory processes and take ownership of workflow management.
Today, we are more connected than ever. We live in an always-on world whose digital economy has made data a new form of resource that fundamentally changes our lives. But has this revolution really occurred across R&D domains? At a time when global R&D investment is over $1.5 trillion, leading voices still bemoan a lack of open access to decision-making data and an innovation deficit syndrome.
Art Levinson, Sergey Brin, Anne Wojcicki, Mark Zuckerberg, Priscilla Chan and Yuri Milner announced the launch of the Breakthrough Prize in Life Sciences, recognizing excellence in research aimed at curing intractable diseases and extending human life.
The San Diego Supercomputer Center (SDSC) at the University of California, San Diego, is seeking innovative applications for the next round of user allocations on its data-intensive Gordon supercomputer, which went into operation earlier this year.
Researchers at Columbia Engineering have developed new software that can simultaneously calculate the carbon footprints of thousands of products faster than ever before. "Our novel approach generates standard-compliant product carbon footprints for companies with large portfolios at a fraction of the previously required time and expertise."
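The article doesn't disclose the algorithm. One reason batch footprinting of a large portfolio can be dramatically faster than scoring each product from scratch is that products share components: if each shared part's footprint is computed once and reused, the work grows with the number of distinct components rather than the number of products. A toy sketch of that idea, with a made-up bill of materials and invented emission factors:

```python
# Toy sketch (not Columbia's actual algorithm): memoize footprints over a
# bill-of-materials graph so every shared component is scored exactly once,
# however many products in the portfolio use it.
from functools import lru_cache

# Hypothetical bill of materials: item -> list of (subcomponent, quantity).
BOM = {
    "laptop":  [("battery", 1), ("board", 1), ("case", 1)],
    "tablet":  [("battery", 1), ("board", 1)],
    "board":   [("chip", 4)],
    "battery": [], "case": [], "chip": [],
}
# Direct emissions per unit in kg CO2e (made-up numbers).
DIRECT = {"laptop": 5.0, "tablet": 3.0, "board": 2.0,
          "battery": 8.0, "case": 1.0, "chip": 6.0}

@lru_cache(maxsize=None)
def footprint(item):
    """Total kg CO2e for one unit: direct emissions plus all components."""
    return DIRECT[item] + sum(qty * footprint(sub) for sub, qty in BOM[item])
```

With the cache, the "board" subtree is evaluated once even though both the laptop and the tablet include it; across thousands of products sharing thousands of parts, that reuse is where the speedup comes from.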
Laboratories working in the pharmaceutical industry in the areas of R&D and quality control find themselves increasingly having to cope with conflicting demands — tougher regulatory requirements and harsher economic realities. In order to meet these demands, new ways of dealing with process, data and system management are necessary.
SC12 will streamline conference information and move to a virtually real-time method of determining technical program thrusts.
SGI, a leader in technical computing, has partnered with Kalev H. Leetaru of the University of Illinois to create the first-ever historical mapping and exploration of the full text contents of the English-language edition of Wikipedia, in time and space.
Exemplar Biomarker Discovery LIMS for personalized medicine is designed to address pharmaceutical and biotech companies' needs for a single, integrated solution that addresses everything from sample management through study data management, assay data management, data mining and statistical analysis.
A recent discovery, using state-of-the-art informatics tools, increases the likelihood that it will be possible to predict much of the fundamental structure and function of the brain without having to measure every aspect of it.
The San Diego Supercomputer Center (SDSC) at the University of California, San Diego is launching a new "center of excellence" aimed at leveraging SDSC's data-intensive expertise and resources to help create the next generation of data researchers by leading a collaborative, nationwide education and training effort among academia, industry and government.
When asked about the biggest challenges in using genomics software today, data analyst Johanna Swanson, who works in Deborah Nickerson's laboratory at the University of Washington, says, "For our lab, it's scalability."
Eight international research funders from four countries today jointly announced the 14 winners of the second Digging into Data Challenge, a competition to promote innovative humanities and social science research using large-scale data analysis.
When it officially comes online in early January, Gordon, a unique new supercomputer at the San Diego Supercomputer Center (SDSC), will help researchers tackle the most vexing data-intensive challenges, from mapping genomes for personalized medicine