The Internet contains a vast trove of information - sometimes called the "Deep Web" - that isn't indexed by search engines: information that would be useful for tracking criminals, terrorist activities, sex trafficking and the spread of diseases. Scientists could also use it to search for images and data from spacecraft.
Extracting meaningful information out of clinical datasets can mean the difference between a...
Evolutionary biologists and computer scientists have come together to study the evolution of pop...
A world-leading team of academic researchers and industrial experts from across Europe are...
The Salford Predictive Modeler (SPM) software suite includes CART, MARS, TreeNet and Random Forests, as well as powerful automation and modeling capabilities. The software is designed to be a highly accurate and ultra-fast analytics and data mining platform for creating predictive, descriptive and analytical models from databases of any size, complexity or organization.
Research using cutting edge computer analysis reveals that despite mutating, Ebola hasn’t evolved to become deadlier since the first outbreak 40 years ago.
The decades' worth of data collected about the billions of neurons in the brain is astounding. To help scientists make sense of this “brain big data,” researchers at Carnegie Mellon University have used data mining to create a publicly available Web site that acts like Wikipedia, indexing physiological information about neurons. The site will help to accelerate the advance of neuroscience research by providing a centralized resource.
The intraterrestrials, they might be called. Strange creatures live in the deep sea, but few are odder than the viruses that inhabit deep ocean methane seeps and prey on single-celled microorganisms called archaea. The least understood of life's three primary domains, archaea thrive in the most extreme environments: near hot ocean rift vents, in acid mine drainage, in the saltiest of evaporation ponds and in petroleum deposits.
A new app for the iPad could change the way wildlife is monitored. Wildsense, an initiative from a group of researchers at the University of Surrey, is designed to use citizen science, the concept of allowing people to get directly involved in science, to help in the conservation of rare and endangered species.
Research has for the first time analyzed over 130,000 online news articles to find out how the 2012 US presidential election played out in the media.
Children don’t have to be told that “cat” and “cats” are variants of the same word — they pick it up just by listening. To a computer, though, they’re as different as, well, cats and dogs. Yet it’s computers that are assumed to be superior in detecting patterns and rules, not four-year-olds. Researchers are trying, if not to solve that puzzle definitively, then at least to provide the tools to do so.
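To see why this is hard for a computer, consider the naive baseline: a hand-written suffix rule. The sketch below (purely illustrative, not the researchers' method — the rule and word list are invented here) groups "cat" and "cats", but only because a human encoded the plural rule; the research challenge is learning such rules from raw text alone.

```python
# Naive suffix-stripping sketch: group word forms with a single
# hand-coded plural rule. Real morphology learners must infer
# rules like this from unannotated text.
def crude_stem(word: str) -> str:
    """Strip a trailing 's' unless the word ends in 'ss' (e.g. 'glass')."""
    if word.endswith("s") and not word.endswith("ss"):
        return word[:-1]
    return word

words = ["cat", "cats", "dog", "dogs", "glass"]
groups = {}
for w in words:
    groups.setdefault(crude_stem(w), []).append(w)

print(groups)  # {'cat': ['cat', 'cats'], 'dog': ['dog', 'dogs'], 'glass': ['glass']}
```

The rule instantly breaks on irregular forms ("mice", "geese"), which is exactly the gap between hard-coded patterns and what children learn by listening.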
For Paul Torrens, wintry weather is less about sledding and more about testing out models of human behavior. Torrens, a geographer at the University of Maryland, studies how snow and icy conditions affect human decisions about transportation. He also studies how these decisions ripple through other infrastructure systems.
Scientists using supercomputers found that genes sensitive to cold and drought help a plant survive climate change. The computational challenges were daunting, involving thousands of individual strains of the plant with hundreds of thousands of markers across the genome and testing for a dozen environmental variables. Their findings increase basic understanding of plant adaptation and can be applied to improve crops.
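The core statistical test behind such a genome-environment association study can be sketched with a single marker: correlate the genotype at one marker across strains with one environmental variable, then repeat for every marker-variable pair (which is what makes the full analysis so computationally heavy). The data and function names below are illustrative, not taken from the study.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy data: presence of a marker allele (0/1) per strain vs. a
# drought-tolerance index per strain. A strong correlation suggests
# the marker is associated with the environmental response.
genotype = [0, 0, 1, 1, 1, 0, 1, 0]
drought  = [1.2, 0.9, 3.1, 2.8, 3.5, 1.0, 2.9, 1.1]
r = pearson_r(genotype, drought)
print(round(r, 3))
```

Scaling this one test to hundreds of thousands of markers, thousands of strains, and a dozen variables — with proper multiple-testing correction — is where the supercomputing comes in.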
Researchers have found that, based on enough Facebook Likes, computers can judge your personality traits better than your friends, family and even your partner. Using a new algorithm, researchers have calculated the average number of Likes artificial intelligence (AI) needs to draw personality inferences about you as accurately as your partner or parents.
In 1997, IBM’s Deep Blue computer beat chess wizard Garry Kasparov. This year, a computer system developed at the University of Wisconsin-Madison equaled or bested scientists at the complex task of extracting data from scientific publications and placing it in a database that catalogs the results of tens of thousands of individual studies.
The drug creation process often misses side effects that kill at least 100,000 patients a year. LLNL researchers have developed a method that uses supercomputers to identify the proteins that cause medications to produce certain adverse drug reactions, processing proteins and drug compounds through an algorithm that yields reliable data for drug discovery outside of a laboratory setting.
NeuroSolutions Infinity predictive data analytics and modeling software is designed to streamline data mining by automatically taking care of the entire data modeling process. It includes everything from accessing, cleaning and arranging data, to intelligently trying potential inputs, preprocessing and neural network architectures, to selecting the best neural network and verifying the results.
Robo Brain — a large-scale computational system that learns from publicly available Internet resources — is currently downloading and processing about 1 billion images, 120,000 YouTube videos, and 100 million how-to documents and appliance manuals. The information is being translated and stored in a robot-friendly format that robots will be able to draw on when they need it.
A wealth of images of Earth at night taken by astronauts on the International Space Station (ISS) could help save energy, contribute to better human health and safety and improve our understanding of atmospheric chemistry. But, scientists need your help to make that happen.
University of Wisconsin researchers used HPC resources, combining multiple advanced protein structure prediction algorithms with deep sequence data mining, to construct a highly plausible capsid model for Rhinovirus-C (~600,000 atoms). The simulation model helps researchers explain why existing pharmaceuticals don’t work on this virus.
Making changes within a complex software system is often error-prone – even the smallest mistake can endanger the entire system. Ten years ago, computer scientists in Saarbrücken led by Professor Andreas Zeller developed a technique that automatically issues suggestions on how to manage changes...
Athens, Ga. – New tools to collect and share information could help stem the loss of the world's threatened species, according to a paper published today in the journal Science. The study—by an international team of scientists that included John L. Gittleman, dean of the University of Georgia Odum...
Huynh Phung Huynh's research interests include high performance computing (HPC): compiler optimization for GPUs, many-core and other accelerators; parallel computing: frameworks for parallel programming and scheduling; and HPC for data mining and machine learning algorithms.
Computer technology that can mine data from social media during times of natural or other disaster could provide invaluable insights for rescue workers and decision makers. Advances in information technology have had a profound impact on disaster management.
The new HITS research group “Data Mining and Uncertainty Quantification” analyzes large amounts of data and calculates uncertainties in technical systems. With Prof. Vincent Heuveline as their group leader, the group of mathematicians and computer scientists especially focuses on increasing the security of technology in operating rooms.
Josiah Stamp said: “The individual source of the statistics may easily be the weakest link.” Nowhere is this more true than in the new field of text mining, given the wide variety of textual information. By some estimates, 80 percent of available information exists as free-form text, which, before the development of text mining, had to be read in its entirety for information to be extracted from it.
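The most basic text-mining operation — pulling structured signal out of free-form text without reading it all — can be sketched as simple term-frequency extraction. The sample text, stopword list, and function below are invented for illustration; production systems use far richer pipelines (entity recognition, relation extraction, and so on).

```python
from collections import Counter
import re

def top_terms(text: str, k: int = 3):
    """Tokenize free-form text and return the k most frequent content terms."""
    stopwords = {"the", "a", "of", "and", "to", "in", "is"}
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter(t for t in tokens if t not in stopwords)
    return counts.most_common(k)

report = ("The patient reported fever and cough. Fever persisted; "
          "cough worsened. Antibiotics reduced the fever.")
print(top_terms(report))  # [('fever', 3), ('cough', 2), ...]
```

Even this crude count surfaces the dominant topics of a document — the "statistics" Stamp warned about, extracted mechanically rather than by exhaustive reading.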
Researchers are developing computers capable of "approximate computing" to perform calculations good enough for certain tasks that don't require perfect accuracy, potentially doubling efficiency and reducing energy consumption.
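One common approximate-computing technique is loop perforation: skip a fraction of the iterations and scale the result, trading a small accuracy loss for proportionally less work. The sketch below (a generic illustration, not the researchers' specific hardware approach) estimates a mean from every fourth element.

```python
import random

def exact_mean(xs):
    """Visit every element: full cost, full accuracy."""
    return sum(xs) / len(xs)

def approx_mean(xs, keep_every=4):
    """Loop perforation: visit only every 4th element, doing
    roughly a quarter of the work for a slightly noisier answer."""
    sample = xs[::keep_every]
    return sum(sample) / len(sample)

random.seed(0)
data = [random.gauss(100.0, 5.0) for _ in range(100_000)]
exact = exact_mean(data)
approx = approx_mean(data)
print(abs(exact - approx) / exact)  # relative error, typically well under 1%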
Everything leading up to the actual coding, figuring out how to make it work, is what Samak enjoys most. One of the problems she is working on with the Department of Energy’s Joint Genome Institute (JGI) is a data mining method to automatically identify errors in genome assembly, replacing the current approach of manually inspecting the assembly.
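One plausible shape for such automated error detection — sketched here as a guess, since the blurb does not describe JGI's actual method — is to mine the read-coverage track for regions that deviate sharply from the assembly-wide norm, since mis-assemblies often show up as coverage collapses or spikes. All names and thresholds below are hypothetical.

```python
import statistics

def flag_anomalies(coverage, window=5, z_thresh=1.5):
    """Flag window start positions whose mean read coverage deviates
    sharply from the assembly-wide mean -- a crude, hypothetical
    stand-in for automated assembly-error detection."""
    mean = statistics.fmean(coverage)
    stdev = statistics.pstdev(coverage)
    flagged = []
    for i in range(0, len(coverage) - window + 1, window):
        w = coverage[i:i + window]
        wmean = sum(w) / window
        if stdev and abs(wmean - mean) / stdev > z_thresh:
            flagged.append(i)
    return flagged

# Toy coverage track with a suspicious collapse starting at position 10.
cov = [30, 32, 29, 31, 30, 30, 28, 31, 30, 29,
       2, 3, 1, 2, 3,
       30, 31, 29, 30, 32]
print(flag_anomalies(cov))  # [10]
```

The appeal over manual inspection is the same as in the blurb: a scan like this runs over an entire assembly in seconds, leaving humans to review only the flagged windows.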
In his 1937 book, "Think and Grow Rich," author Napoleon Hill identified 13 steps to success, one of which was the power of the mastermind. "No two minds ever come together without thereby creating a third, invisible, intangible force, which may be likened to a third mind," Hill wrote.