Big Data tools such as Grok and IBM Watson are enabling large organizations to behave more like agile startups. Of the transformative technologies that have ushered in the current frenzy of activity along the information superhighway, Ward Cunningham's 1994 invention of the "wiki" is among the most disruptive.
The 10-day tour of Europe was not your typical itinerary — Garching, Karlsruhe, Villigen,...
Size alone does not define big data — it is best defined as a combination of volume, velocity,...
The Russian Ministry of Education and Science has awarded a $3.4 million “mega-grant” to Alexei Klimentov, Physics Applications Software Group Leader at the U.S. Department of Energy’s Brookhaven National Laboratory, to develop new “big data” computing tools for the advancement of science.
AT&T and IBM have announced a new global alliance agreement to develop solutions that help support the "Internet of Things." The companies will combine their analytics platforms and cloud and security technologies, designed with privacy in mind, to gain more insight from data collected from machines across a variety of industries.
Although the time and cost of sequencing an entire human genome have plummeted, analyzing the resulting three billion base pairs of genetic information from a single genome can take many months. However, a team working with Beagle, one of the world's fastest supercomputers devoted to life sciences, reports that genome analysis can be radically accelerated. This computer is able to analyze 240 full genomes in about two days.
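A quick back-of-the-envelope calculation shows the throughput these figures imply, assuming "about two days" means roughly 48 hours of wall-clock time:

```python
# Throughput implied by the Beagle figures above.
# Assumption: "about two days" is taken as 48 hours of wall-clock time.
GENOMES = 240
HOURS = 48

genomes_per_hour = GENOMES / HOURS          # 5.0 genomes per hour
minutes_per_genome = 60 / genomes_per_hour  # 12.0 minutes of throughput per genome

print(f"{genomes_per_hour:.0f} genomes/hour, "
      f"i.e. one genome every {minutes_per_genome:.0f} minutes of throughput")
```

That is batch throughput, not latency: each individual genome still takes longer than 12 minutes, but the pipeline completes one every 12 minutes on average.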
The Intel Xeon processor E7 v2 family delivers capabilities to process and analyze large, diverse amounts of data to unlock information that was previously inaccessible. The processor family has triple the memory capacity of the previous generation, allowing much faster and more thorough data analysis.
IBM announced that it has achieved a new technological advancement that will help improve Internet speeds to 200-400 Gigabits per second (Gb/s) at extremely low power. The speed boost is based on a device that can transfer Big Data between clouds and data centers four times faster than current technology. At this speed, 160 Gigabytes could be downloaded in only a few seconds.
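The download-time claim checks out once you account for the bytes-versus-bits distinction (storage is quoted in gigabytes, link speed in gigabits per second):

```python
# Sanity-checking the claim: 160 gigabytes over a 200-400 Gb/s link.
BITS_PER_BYTE = 8
payload_gbits = 160 * BITS_PER_BYTE  # 160 GB = 1280 gigabits

for link_gbps in (200, 400):
    seconds = payload_gbits / link_gbps
    print(f"{link_gbps} Gb/s -> {seconds:.1f} s")  # prints 6.4 s, then 3.2 s
```

So "a few seconds" is accurate: between 3.2 and 6.4 seconds across the quoted speed range.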
HPC matters, now more than ever. What better way to show how it matters than through your submission to the SC14 Technical Program? Technical Program submissions opened February 14 for Research Papers, Posters (Regular, Education, and ACM Student Research Competition), Panels, Tutorials, BOF Sessions, Scientific Visualization and Data Analytics Showcase, Emerging Technologies, and Doctoral Showcase.
It is now possible to sense scientific data, applying subconscious processing to big data analysis as a way to cope with the mountains of information in our environment. Imagine that data could be transposed into a tactile experience.
A major event in France and Europe, the TERATEC Forum brings together the top international experts in high performance numerical design and simulation, confirming the strategic importance of these technologies for developing industrial competitiveness and innovation capacity. For its 9th edition, the TERATEC Forum will be the place to be for over 1,000 professionals, with hot topics, plenary sessions, technical workshops, and an exhibition of hardware, software, and service providers.
Thomas Edison received 1,093 patents during his lifetime for inventions spanning everything from the tattoo machine to the electric grid. His innovative mind inspired President Ronald Reagan to celebrate National Inventors' Day on the anniversary of Edison's birthday, February 11. Edison's curious spirit never left the company he started. Three GE scientists received the...
HPCS is a multi-disciplinary conference, considered Canada's premier advanced computing forum. Each year, Canadian researchers, analysts, and IT professionals from academia and industry gather to exchange the ideas, tools, and new discoveries that are driving today's innovations in computational research.
IBM has launched a 10-year initiative to bring Watson and other cognitive systems to Africa in a bid to fuel development and spur business opportunities across the world's fastest-growing continent. Dubbed "Project Lucy" after the earliest known human ancestor, the initiative will see IBM invest US$100 million.
Computational scientists now have the opportunity to apply for the upcoming Argonne Training Program on Extreme-Scale Computing (ATPESC), to take place from August 3-15, 2014. The program provides intensive hands-on training on the key skills, approaches and tools to design, implement, and execute computational science and engineering applications on current supercomputers and the HPC systems of the future.
The HPC Advisory Council and the Swiss Supercomputing Centre will host the HPC Advisory Council Switzerland Conference 2014. The conference will focus on High-Performance Computing essentials, new developments and emerging technologies, best practices and hands-on training.
As computers enter ever more areas of our daily lives, the amount of data they produce has grown enormously. But for this “big data” to be useful it must first be analyzed, meaning it needs to be stored in such a way that it can be accessed quickly when required.
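A minimal sketch of the storage-for-fast-access idea, using an invented set of records for illustration: keeping the keys sorted turns each lookup into a binary search rather than a full scan, the same principle large-scale data stores apply via indexes.

```python
# Illustrative only: a tiny sorted index over hypothetical sensor records.
from bisect import bisect_left

records = sorted([(103, "sensor-A"), (7, "sensor-B"), (55, "sensor-C")])
keys = [k for k, _ in records]  # sorted keys enable O(log n) search

def lookup(key):
    """Binary search over the sorted keys; returns the value or None."""
    i = bisect_left(keys, key)
    if i < len(keys) and keys[i] == key:
        return records[i][1]
    return None

print(lookup(55))   # sensor-C
print(lookup(999))  # None
```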
Texas A&M System Teams with IBM to Drive Computational Sciences Research through Big Data and Analytics (January 29, 2014, by IBM)
Texas A&M University System and IBM announced an agreement that is the beginning of a broad research collaboration supported by one of the largest computational sciences infrastructures dedicated to advances in agriculture, geosciences and engineering.
On Monday, January 27, over 250 people gathered at the small evangelical church in Daisbach – a very small, quiet town in Southern Germany – to bid farewell to Hans Meuer, the founder of TOP500 and the ISC General Chair. Hans passed away on January 20 at the age of 77.
The goal of this conference is to bring together all the developers and researchers involved in solving the software challenges of the exascale era. The conference focuses on issues of applications for exascale and the associated tools, software programming models and libraries.
ISC General Chair Prof. Dr. Hans Werner Meuer passed away at the age of 77 at his home in Daisbach, Southern Germany, on January 20, 2014, after a brief battle with cancer. Meuer had been involved in data processing since 1960. During his 11 years at the Research Center in Jülich, Germany (1962-1973), he served as a specialist, project leader, and group and department chief.
The sixth generation of enterprise X-Architecture for System x and PureSystems servers provides improvements in the performance and economics of x86-based systems for analytics and cloud. As users adopt analytics for greater business insight and move critical workloads like ERP, analytics, and databases to the cloud for increased efficiency and lower costs, x86-based systems are a popular choice.
The 2U TwinPro servers are available in 2-node (TwinPro) and 4-node (TwinPro²) configurations optimized for high-end, high-density data center, cloud computing, enterprise, HPC, and big data applications.
IBM has unveiled three new Watson services delivered over the cloud. The first, Watson Discovery Advisor, is designed to accelerate and strengthen R&D projects in industries such as pharmaceuticals, publishing, and biotech. The second, Watson Analytics, delivers visualized Big Data insights based on questions posed in natural language. The third, Watson Explorer, helps users across an enterprise uncover and share data-driven insights.
One of the most famous "Jeopardy!" champs of all time is moving to Manhattan. No, it's not Ken Jennings. IBM announced January 9, 2014, that it's investing over $1 billion to give its Watson supercomputer its own business division and a new home in the heart of New York City.
A computer program is running 24 hours a day at Carnegie Mellon University, searching the Web for images, doing its best to understand them on its own and, as it builds a growing visual database, gathering common sense on a massive scale. The Never Ending Image Learner, or NEIL, leverages recent advances in computer vision that enable computer programs to identify and label objects in images.
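One way a system can distill "common sense" from labeled images is by counting which labels co-occur; the sketch below illustrates that general idea with invented label sets, and is not a description of NEIL's actual algorithm:

```python
# Illustrative co-occurrence mining over hypothetical per-image label sets.
from collections import Counter
from itertools import combinations

images = [
    {"car", "wheel", "road"},
    {"car", "wheel", "garage"},
    {"bicycle", "wheel", "road"},
    {"car", "wheel"},
]

pairs = Counter()
for labels in images:
    # Count every unordered label pair that appears together in one image.
    pairs.update(combinations(sorted(labels), 2))

# The most frequent pair hints at a relationship such as "cars have wheels".
print(pairs.most_common(1))  # [(('car', 'wheel'), 3)]
```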
IDC, in close partnership with users and vendors in the technical high-performance computing (HPC) industry, created the HPC User Forum, a unique service that identifies buyer/user requirements and vendor capabilities within this market. This forum promotes the interests of HPC users worldwide in industry/commerce, government, and academia. The HPC User Forum is an extension of IDC's long-standing commitment to the global HPC community.
The second ISC Big Data conference will take place at the beginning of October and will build on multiple pillars: the positive experience from the inaugural event, the global strength of the main ISC conference, which gathers 2,500 experts from the high performance computing community each year, and the co-scheduled ISC Cloud event held just prior to ISC Big Data.