Programming

The Lead

Ramasany Gowthami participated in the creation of an Android app that lets users work together to crack a modern cryptographic code.

Smartphones Set Out to Decipher Cryptographic System

August 25, 2014 4:33 am | by Sébastien Corthésy, EPFL | News | Comments

An Android app has been created that lets users band together to crack a modern cryptographic code. All encryption schemes, including the widely used RSA, can in theory be broken. How, then, can we ensure that our data remain protected? The answer lies in the time and effort required to break the code.
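
The answer above turns on the gap between "breakable in theory" and "safe in practice." As a rough, hedged illustration of why (not the method behind the EPFL app), the Python sketch below factors a toy RSA-style modulus by trial division; the function and the small primes are our own illustrative choices.

```python
# Rough illustration (not the EPFL app's method): RSA security rests on the
# difficulty of factoring n = p * q. Naive trial division takes on the order
# of sqrt(n) steps, which becomes astronomically slow for real key sizes.

def trial_division(n):
    """Return a nontrivial factor of n, or n itself if n is prime."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n

# Toy modulus built from two small known primes: fast to factor. A real
# 2048-bit RSA modulus would need unimaginably more trial divisions.
n = 104729 * 1299709
p = trial_division(n)
print(p, n // p)
```

For a real 2048-bit modulus the same loop would need far more steps than there are atoms in the observable universe, which is exactly the "time and effort" the security argument rests on.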

NVIDIA CUDA 6.5 Production Release

August 22, 2014 12:15 pm | Product Releases | Comments

NVIDIA CUDA 6.5 brings GPU-accelerated computing to 64-bit ARM platforms. The toolkit provides...

Top Cybersecurity Advice from the Rock Stars

August 22, 2014 10:57 am | by Amanda Sawyer, IEEE Computer Society | Blogs | Comments

High-profile security breaches, data thefts and cyberattacks are increasing in frequency,...

Improving Temperature Modeling across Mountainous Landscapes

August 21, 2014 4:28 pm | by University of Montana | News | Comments

New research by University of Montana doctoral student Jared Oyler provides improved computer...


With their new method, computer scientists from Saarland University are able, for the first time, to compute all illumination effects in a simpler and more efficient way. Courtesy of AG Slusallek/Saar-Uni

Realistic Computer Graphics Technology Vastly Speeds Process

August 18, 2014 2:15 pm | by Saarland University | News | Comments

Creating a realistic computer simulation of how light suffuses a room is crucial not only for animated movies like Toy Story or Cars, but also for industry. Specialized computing methods are meant to make such simulations possible, but they require enormous effort. Computer scientists from Saarbrücken have developed a novel approach that vastly simplifies and speeds up the whole calculation.
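
The press release stays high-level, so here is a hedged sketch of the kind of computation global illumination involves (not the Saarbrücken method itself): a Monte Carlo estimate of the irradiance arriving at a surface point from a uniform sky. The sampling routine, the constant sky radiance and all names are our own assumptions.

```python
import math, random

# Hedged sketch, not the Saarbrücken algorithm: estimate the irradiance at a
# surface point under a uniform sky by Monte Carlo integration over the
# hemisphere. The exact answer for constant incoming radiance L is pi * L.

def sample_cos_theta():
    """Uniformly sample a direction on the upper hemisphere; return cos(theta)."""
    u = random.random()
    return u                           # cos(theta) is uniform in [0, 1) for
                                       # uniform solid-angle sampling

def irradiance(sky_radiance=1.0, samples=100_000):
    total = sum(sky_radiance * sample_cos_theta() for _ in range(samples))
    # Uniform hemisphere sampling has pdf 1 / (2*pi), hence the 2*pi factor.
    return 2.0 * math.pi * total / samples

print(irradiance())    # converges to pi ~ 3.14159 as the sample count grows
```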

The Kilobots, a swarm of one thousand simple but collaborative robots. Courtesy of Mike Rubenstein and Science/AAAS

AI: Self-organizing Thousand-robot Swarm Forms Vast, Complex Shapes

August 18, 2014 12:03 pm | by Caroline Perry, Harvard SEAS | News | Comments

The first thousand-robot flash mob has assembled at Harvard University. Instead of one highly-complex robot, a “kilo” of robots collaborate, providing a simple platform for the enactment of complex behaviors. Called Kilobots, these extremely simple robots are each just a few centimeters across and stand on three pin-like legs.
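
One ingredient commonly reported for such swarms is a hop-count gradient that spreads outward from a seed robot using only local, neighbor-to-neighbor messages. The sketch below is a hedged reconstruction of that idea in Python, not the actual Kilobot firmware; the grid layout, radius and names are ours.

```python
# Minimal sketch of one swarm ingredient: a hop-count gradient flooding outward
# from a seed robot, where each robot only talks to neighbors within a short
# range. Names and the grid layout are illustrative, not Kilobot firmware.
from collections import deque

def hop_gradient(positions, seed, radius=1.5):
    """Breadth-first hop counts from `seed` using only local neighborhoods."""
    def neighbors(i):
        xi, yi = positions[i]
        return [j for j, (xj, yj) in enumerate(positions)
                if j != i and (xi - xj) ** 2 + (yi - yj) ** 2 <= radius ** 2]

    hops = {seed: 0}
    queue = deque([seed])
    while queue:
        i = queue.popleft()
        for j in neighbors(i):
            if j not in hops:          # the first message received wins
                hops[j] = hops[i] + 1
                queue.append(j)
    return hops

robots = [(x, y) for x in range(5) for y in range(5)]   # 25 toy robots
print(hop_gradient(robots, seed=0))
```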

NERSC's next-generation supercomputer, a Cray XC, will be named after Gerty Cori, the first American woman to be honored with a Nobel Prize in science. She shared the 1947 Nobel Prize with her husband Carl (pictured) and Argentine physiologist Bernardo Houssay.

NERSC Launches Next-Generation Code Optimization Effort

August 15, 2014 9:41 am | by NERSC | News | Comments

With the promise of exascale supercomputers looming on the horizon, much of the roadmap is dotted with questions about hardware design and how to make these systems energy efficient enough so that centers can afford to run them. Often taking a back seat is an equally important question: will scientists be able to adapt their applications to take advantage of exascale once it arrives?

LabVIEW 2014 software adds new capabilities to acquire, analyze and visualize data from anywhere, at any time.

LabVIEW 2014 System Design Software

August 12, 2014 11:06 am | National Instruments | Product Releases | Comments

LabVIEW 2014 system design software standardizes the way users interact with hardware through reuse of the same code and engineering processes across systems, which scales applications for the future. This saves time and money as technology advances, requirements evolve and time-to-market pressure increases.

Four Seasons: A new algorithm developed by Brown computer scientists allows users to change the season and other “transient attributes” in outdoor photos.

Photo Editing Algorithm Changes Weather, Seasons Automatically

August 11, 2014 12:34 pm | by Brown University | News | Comments

A computer algorithm being developed by Brown University researchers enables users to instantly change the weather, time of day, season, or other features in outdoor photos with simple text commands. Machine learning and a clever database make it possible.
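
The teaser credits "machine learning and a clever database." As a hedged illustration of the database half only (not Brown's algorithm), the snippet below retrieves, from a toy table of photos annotated with transient-attribute scores, the one that best matches a requested change; the attribute names and scores are invented.

```python
# Hedged illustration of the retrieval idea only (not Brown's algorithm):
# given photos annotated with "transient attribute" scores, find the stored
# photo whose attributes best match a requested edit such as "more winter".
# The attribute names and scores below are invented.

database = {
    "park_summer.jpg": {"winter": 0.1, "sunset": 0.2, "fog": 0.0},
    "park_snowy.jpg":  {"winter": 0.9, "sunset": 0.1, "fog": 0.2},
    "park_dusk.jpg":   {"winter": 0.2, "sunset": 0.8, "fog": 0.1},
}

def closest_match(target):
    """Return the photo whose attribute vector is nearest to `target`."""
    def distance(attrs):
        return sum((attrs[k] - v) ** 2 for k, v in target.items())
    return min(database, key=lambda name: distance(database[name]))

# "Make it winter": ask for a high winter score and low everything else.
print(closest_match({"winter": 1.0, "sunset": 0.0, "fog": 0.0}))  # park_snowy.jpg
```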

NSF's Secure and Trustworthy Cyberspace (SaTC) program will support more than 225 new projects in 39 states in 2014. The awards enable research from the theoretical to the experimental, and aim to minimize the misuses of cyber technology and bolster education...

Frontier-scale Projects Expand Breadth and Impact of Cybersecurity, Privacy Research

August 6, 2014 3:35 pm | by NSF | News | Comments

As our lives and businesses become ever more intertwined with the Internet and networked technologies, it is crucial to continue to develop and improve cybersecurity measures to keep our data, devices and critical systems safe, secure, private and accessible. The NSF's Secure and Trustworthy Cyberspace program has announced two new center-scale "Frontier" awards to support projects that address grand challenges in cybersecurity science.

The blurred image on the left shows how a farsighted person would see a computer screen without corrective lenses. In the middle is how that same person would perceive the picture using a display that compensates for visual impairments.

Vision-correcting Display Makes Reading Glasses So Yesterday

July 30, 2014 3:46 pm | by Sarah Yang, UC Berkeley | News | Comments

What if computer screens had glasses instead of the people staring at the monitors? That concept is not too far afield from technology being developed by UC Berkeley computer and vision scientists. The researchers are developing computer algorithms to compensate for an individual’s visual impairment, and creating vision-correcting displays that enable users to see text and images clearly without wearing eyeglasses or contact lenses.
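
As a hedged, software-only sketch of the underlying idea (the Berkeley work also relies on display hardware), the snippet below pre-sharpens an image with a Wiener-style inverse filter so that an assumed Gaussian blur in the viewer's eye approximately cancels it. The blur kernel and regularization constant are assumptions.

```python
import numpy as np

# Hedged sketch of the prefiltering idea only (not the Berkeley display): if
# the viewer's eye blurs the screen with a known kernel, show a pre-sharpened
# image so that blur(prefiltered) ~ original. The Gaussian kernel and the
# regularization constant eps are assumptions.

def gaussian_kernel(shape, sigma=3.0):
    y, x = np.indices(shape)
    cy, cx = shape[0] // 2, shape[1] // 2
    k = np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2 * sigma ** 2))
    return np.fft.ifftshift(k / k.sum())   # center the kernel at (0, 0)

def prefilter(image, sigma=3.0, eps=1e-2):
    """Wiener-style inverse filter: F(pre) = F(img) * conj(H) / (|H|^2 + eps)."""
    H = np.fft.fft2(gaussian_kernel(image.shape, sigma))
    I = np.fft.fft2(image)
    pre = np.fft.ifft2(I * np.conj(H) / (np.abs(H) ** 2 + eps)).real
    return np.clip(pre, 0.0, 1.0)

image = np.random.rand(64, 64)       # stand-in for a screen image
print(prefilter(image).shape)        # (64, 64)
```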

AMD Opteron 64-Bit ARM-Based Developer Kits

July 30, 2014 12:45 pm | Advanced Micro Devices, Inc. | Product Releases | Comments

The AMD Opteron A1100-Series developer kit features AMD's first 64-bit ARM-based processor, codenamed "Seattle." The processor is available in 4- and 8-core ARM Cortex-A57 configurations, with up to 4 MB of shared L2 and 8 MB of shared L3 cache, and configurable dual DDR3 or DDR4 memory channels with ECC at up to 1866...

K computer installed in the computer room. Each computer rack is equipped with about 100 CPUs. In the Computer Building, 800 or more computer racks are installed for the K computer.  Courtesy of Riken

K Computer Runs Largest Ever Ensemble Simulation of Global Weather

July 25, 2014 2:25 pm | by RIKEN | News | Comments

Ensemble forecasting is a key part of weather forecasting. Computers typically run multiple simulations using slightly different initial conditions or assumptions, and then analyze them together to try to improve forecasts. Using Japan’s K computer, researchers have succeeded in running 10,240 parallel simulations of global weather, the largest number ever performed, using data assimilation to reduce the range of uncertainties.
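
As a hedged toy version of ensemble forecasting, vastly smaller than the 10,240-member K computer run and using an invented stand-in model, the snippet below integrates a chaotic map from slightly perturbed initial conditions and summarizes the ensemble with its mean and spread.

```python
import random

# Toy illustration of ensemble forecasting (nothing like the K computer run):
# integrate a chaotic logistic map from slightly perturbed initial conditions
# and summarize the ensemble by its mean and spread.

def forecast(x0, steps=30, r=3.9):
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)          # chaotic logistic map as a stand-in model
    return x

members = 100
base = 0.3
ensemble = [forecast(base + random.gauss(0.0, 1e-4)) for _ in range(members)]

mean = sum(ensemble) / members
spread = (sum((x - mean) ** 2 for x in ensemble) / members) ** 0.5
print(f"ensemble mean {mean:.3f}, spread {spread:.3f}")
```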

HPC-X Scalable Software Toolkit for High-Performance Computing Platforms and Applications

July 25, 2014 2:01 pm | Mellanox Technologies, Inc. | Product Releases | Comments

HPC-X Scalable Software Toolkit is a comprehensive software suite for high-performance computing environments that provides enhancements to significantly increase the scalability and performance of message communications in the network. The toolkit provides complete communication libraries supporting the MPI, SHMEM and PGAS programming models, as well as performance accelerators that take advantage of Mellanox scalable interconnect solutions.
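
The toolkit itself is not shown here, but as a generic, hedged example of the kind of message passing such libraries accelerate, the snippet below sums a value across ranks with mpi4py; it assumes an MPI installation and the mpi4py bindings are available.

```python
# Generic MPI example (not specific to HPC-X): each rank contributes its rank
# number and an allreduce returns the global sum to every rank. Run with e.g.
#   mpirun -np 4 python allreduce_demo.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
total = comm.allreduce(rank, op=MPI.SUM)
print(f"rank {rank} of {comm.Get_size()}: global sum = {total}")
```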

Math Can Make the Internet 5 to 10 Times Faster

July 18, 2014 3:52 pm | by Aalborg University | News | Comments

Mathematical equations can make Internet communication via computer, mobile phone or satellite many times faster and more secure than today. Results with software developed by researchers from Aalborg University in collaboration with the Massachusetts Institute of Technology (MIT) and California Institute of Technology (Caltech) are attracting attention in the international technology media.
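
The Aalborg/MIT/Caltech results build on network coding. As a hedged sketch of the basic idea only (not their Random Linear Network Coding implementation), the snippet below combines two packets into one coded transmission with XOR and shows how a receiver that already holds one packet recovers the other.

```python
# Hedged sketch of the network-coding idea (not the Aalborg/MIT software):
# instead of forwarding packets unchanged, a node sends a combination of them.
# Here the combination is a simple XOR over GF(2); a receiver holding either
# original packet can recover the other from the coded one.

def xor_packets(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

p1 = b"HELLOWORLD"
p2 = b"NETWORKING"

coded = xor_packets(p1, p2)             # one transmission carries both
recovered_p2 = xor_packets(coded, p1)   # the receiver already has p1
assert recovered_p2 == p2
print(recovered_p2)
```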

Electrostatic potential fluctuations in an annular region at mid-radius in the MAST tokamak, from a gyrokinetic simulation of the saturated turbulence using the GS2 code. A wedge of plasma has been removed from the visualisation so as to view the nature of...

EPCC wins HPC Innovation Excellence Award

July 8, 2014 3:48 pm | by Adrian Jackson, EPCC | News | Comments

EPCC is delighted to be part of a team that has won an HPC Innovation Excellence Award. Presented at the International Supercomputing Conference (ISC14) in Leipzig (22-26 June 2014), the awards recognize outstanding application of HPC for business and scientific achievements.

Integration between Moab HPC Suite and Bright Cluster Manager provides enhanced functionality that enables users to dynamically provision HPC clusters based on both resource and workload monitoring.

Moab HPC Suite and Bright Cluster Manager Integration

June 30, 2014 9:04 am | Adaptive Computing, Bright Computing | Product Releases | Comments

Integration between Moab HPC Suite and Bright Cluster Manager provides enhanced functionality that enables users to dynamically provision HPC clusters based on both resource and workload monitoring. Combined capabilities also create a more optimal solution to managing technical computing and Big Workflow requirements.

Altair has announced that the National Supercomputing Center for Energy and the Environment (NSCEE) at the University of Nevada, Las Vegas, (UNLV) has chosen PBS Professional to replace its previous high-performance computing (HPC) workload management implementation.

UNLV's Supercomputing Center Switches to Altair PBS Professional

June 27, 2014 10:40 am | Altair Engineering | News | Comments

Altair has announced that the National Supercomputing Center for Energy and the Environment (NSCEE) at the University of Nevada, Las Vegas, (UNLV) has chosen PBS Professional to replace its previous high-performance computing (HPC) workload management implementation.

Some of the many variations the new Learning Everything about Anything, or LEVAN, program has learned for three different concepts.

Fully Automated Computer Program Teaches Itself Everything about Anything

June 13, 2014 3:18 pm | by Michelle Ma | News | Comments

In today’s digitally driven world, access to information appears limitless. But when you have something specific in mind that you don’t know, like the name of that niche kitchen tool you saw at a friend’s house, it can be surprisingly hard to sift through the volume of information online and know how to search for it. Or, the opposite problem can occur — we can look up anything on the Internet, but how can we be sure we're finding every...

Using an algorithm, the growing networking of students on Facebook can be visualized according to their age. Courtesy of Michael Hamann, KIT

Algorithms for Big Data: Optimizing Daily, Routine Processing

June 10, 2014 4:32 am | by Karlsruhe Institute of Technology | News | Comments

Computer systems today can be found in nearly all areas of life, from smartphones to smart cars to self-organized production facilities. These systems supply rapidly growing data volumes, and computer science now faces the challenge of processing these huge amounts of data (big data) in a reasonable and secure manner.

High-resolution CESM simulation run on Yellowstone. This featured CAM-5 spectral element at roughly 0.25deg grid spacing, and POP2 on a nominal 0.1deg grid.

Building Momentum for Code Modernization: The Intel Parallel Computing Centers

June 9, 2014 12:06 pm | by Doug Black | Articles | Comments

Like a Formula One race car stuck in a traffic jam, HPC hardware performance is frequently hampered by HPC software. This is because some of the most widely used application codes have not been updated for years, if ever, leaving them unable to leverage advances in parallel systems. As hardware power moves toward exascale, the imbalance between hardware and software will only get worse. The problem of updating essential scientific ...
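
As a miniature, hedged illustration of the kind of modernization being described (not any specific Intel Parallel Computing Center project), the snippet below replaces an element-at-a-time Python loop with a whole-array NumPy expression; exposing data parallelism this way is the same shift that lets compiled codes exploit wide vector units and many cores.

```python
import numpy as np

# Miniature illustration of code modernization (not any specific IPCC project):
# the scalar loop touches one element at a time, while the vectorized version
# expresses the whole-array operation at once, exposing the data parallelism
# that vector units and many-core processors need.

def saxpy_loop(a, x, y):
    out = np.empty_like(y)
    for i in range(len(y)):          # element at a time: hard to parallelize
        out[i] = a * x[i] + y[i]
    return out

def saxpy_vectorized(a, x, y):
    return a * x + y                 # whole-array expression

x = np.random.rand(100_000)
y = np.random.rand(100_000)
assert np.allclose(saxpy_loop(2.0, x, y), saxpy_vectorized(2.0, x, y))
```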

Tom Vander Aa is a researcher/project coordinator in the ExaScience Life Lab at imec.

High Performance Communication

June 9, 2014 10:12 am | by Tom Vander Aa, imec ExaScience Life Lab | Blogs | Comments

In the late '90s, I taught parallel programming in C with MPI. The most important lesson I wanted my students to remember is that communication matters far more than computation. The form of the benchmark couldn't be more common: a set of convolutional filters applied to an image, one filter after the other in a pipelined fashion.
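
That benchmark shape, one filter after another with data handed between stages, can be sketched as an MPI pipeline. The snippet below is our hedged reconstruction with mpi4py and toy "filters", not the original course code; run it with several ranks so each rank plays one pipeline stage.

```python
# Hedged reconstruction of the benchmark shape described above (not the course
# code): a pipeline in which each MPI rank applies one "filter" to an image
# and passes the result on, so communication dominates the structure.
# Run with e.g.  mpirun -np 4 python pipeline_demo.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

def toy_filter(img, stage):
    """Stand-in for a convolution: a cheap per-stage transform."""
    return img * 0.5 + stage

if rank == 0:
    image = np.ones((64, 64))
else:
    image = comm.recv(source=rank - 1)      # wait for the previous stage

filtered = toy_filter(image, rank)

if rank < size - 1:
    comm.send(filtered, dest=rank + 1)      # hand off to the next stage
else:
    print("pipeline output mean:", filtered.mean())
```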

In the field of Artificial Intelligence, there is no more iconic and controversial milestone than the Turing Test, when a computer convinces a sufficient number of interrogators into believing that it is not a machine but rather is a human.

Can Machines Think? Turing Test Success a Milestone in Computing History

June 9, 2014 9:07 am | by University of Reading | News | Comments

A historic milestone in artificial intelligence set by Alan Turing, the father of modern computer science, has been achieved. The iconic, 65-year-old Turing Test was passed for the very first time by the computer program Eugene Goostman during Turing Test 2014, held at the Royal Society in London on June 7, 2014, and organized by the University of Reading.

Gregorio Valdez and his team designed a search engine – called EvoCor – that quickly sifts through the evolutionary history of all mapped genes – human and otherwise.

Search Engine Finds Functionally Linked Genes

June 4, 2014 7:47 pm | by Ashley WennersHerron, Virginia Tech | News | Comments

A frontier lies deep within our cells. Our bodies are as vast as oceans and space, composed of a dizzying number of different types of cells. Exploration reaches far, yet the genes that make each cell and tissue unique have remained largely obscure. That’s changing with a search engine called EvoCor that identifies functionally linked genes.

A Turing machine built from legos. Courtesy of Projet Rubens, ENS Lyon

Basic Logic Research Crucial for Computer, Software Engineering

June 3, 2014 3:27 pm | by Vienna University of Technology | News | Comments

All men are mortal. Socrates is a man. Therefore, Socrates is mortal. Logical arguments like this one have been studied since antiquity. In the last few decades, however, logic research has changed considerably: the computer sciences were born. The success of informatics would have been impossible without the groundwork provided by logicians, and, in turn, the computer sciences keep posing interesting new questions to logic.
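
The opening syllogism is exactly the kind of argument that can now be checked by machine. As a small illustration (assuming Lean 4 syntax; the identifiers are ours), here it is as a formal proof:

```lean
-- The classic syllogism, machine-checked (Lean 4 syntax; identifiers ours).
variable (Person : Type) (Man Mortal : Person → Prop) (socrates : Person)

-- "All men are mortal" and "Socrates is a man" yield "Socrates is mortal".
example (allMortal : ∀ x, Man x → Mortal x) (isMan : Man socrates) :
    Mortal socrates :=
  allMortal socrates isMan
```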

Intel Issues RFP for Intel Parallel Computing Centers

Join the Journey to Accelerate Discovery through Increased Parallelism

May 28, 2014 11:20 am | by Intel Parallel Computing Centers | Blogs | Comments

Solving some of the biggest challenges in society, industry and sciences requires dramatic increases in computing efficiency. Many HPC customers are sitting on incredible untapped compute reserves and they don’t even know it. The very people who are focused on solving the world’s biggest problems with high-performance computing are often only using a small fraction of the compute capability their systems provide. Why? Their software ...

Sandia National Laboratories’ Francois Leonard holds a wire mesh cylinder similar in design to a carbon nanotube that might form the basis for future computing technology. Computing experts at Sandia are exploring what computers of the future might look like.

Get Ready for Computers of the Future: Sandia Launches Push to Innovate Next-Gen Machines

May 28, 2014 10:30 am | by Sandia National Laboratories | News | Comments

Computing experts at Sandia National Laboratories have launched an effort to help discover what computers of the future might look like, from next-generation supercomputers to systems that learn on their own — new machines that do more while using less energy.

The GE-225 mainframe computer in the basement of Dartmouth’s College Hall. Courtesy of the Trustees of Dartmouth College

BASIC, Woz and How GE's Mainframe Midwifed Modern Computing

May 22, 2014 9:51 am | by GE | Blogs | Comments

Fifty years ago, on May 1, 1964, two Dartmouth professors and their students developed the BASIC programming language and supercharged the information age. BASIC revolutionized personal computing and helped launch icons like Apple and Microsoft.

The special focus of this workshop will be on interactive parallel computing with IPython.

4th Workshop on Python for High Performance and Scientific Computing (PyHPC 2014)

May 21, 2014 12:03 pm | by PyHPC 2014 | Events

The workshop will bring together researchers and practitioners from industry, academia, and the wider community using Python in all aspects of high performance and scientific computing. The goal is to present Python applications from mathematics, science, and engineering, to discuss general topics regarding the use of Python (such as language design and performance issues), and to share experience using Python in scientific computing education.
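
As a generic, hedged taste of parallel Python, using only the standard library rather than the workshop's IPython-based tooling, the snippet below splits a Monte Carlo estimate of pi across worker processes.

```python
# Generic parallel-Python example (standard library only, not the workshop's
# IPython tooling): estimate pi by splitting a Monte Carlo integration across
# worker processes.
import random
from concurrent.futures import ProcessPoolExecutor

def hits_in_quarter_circle(n):
    """Count random points in the unit square that land inside the quarter circle."""
    count = 0
    for _ in range(n):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            count += 1
    return count

if __name__ == "__main__":
    workers, per_worker = 4, 250_000
    with ProcessPoolExecutor(max_workers=workers) as pool:
        total = sum(pool.map(hits_in_quarter_circle, [per_worker] * workers))
    print("pi ~", 4.0 * total / (workers * per_worker))
```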
