Programming

The Lead

An innovative piece of research looks into the matter of machine morality, and questions whether it is “evil” for robots to masquerade as humans.

How to Train your Robot: Can We Teach Robots Right from Wrong?

October 14, 2014 12:46 pm | by Taylor & Francis | News | Comments

From performing surgery to driving cars, today’s robots can do it all. With chatbots recently hailed as passing the Turing test, it appears robots are becoming increasingly adept at posing as humans. As machines become ever more integrated into human lives, the need to imbue them with a sense of morality grows increasingly urgent. But can we really teach robots how to be good? An innovative piece of research looks into the matter of machine morality, and questions whether it is “evil” for robots to masquerade as humans.

2015 Rice Oil & Gas High Performance Computing Workshop

October 13, 2014 2:45 pm | by Rice University | Events

The Oil and Gas High Performance Computing (HPC) Workshop, hosted annually at Rice University,...

High Performance Parallelism Pearls: A Teaching Juggernaut

October 13, 2014 9:52 am | by Rob Farber | Blogs | Comments

High Performance Parallelism Pearls, the latest book by James Reinders and Jim Jeffers...

Reaching the Limit of Error-Correcting Codes

October 2, 2014 3:44 pm | by Larry Hardesty, MIT | News | Comments

Error-correcting codes are one of the glories of the information age: They’re what...
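As a minimal illustration of the idea (my own sketch, not the MIT work the article describes), a 3x repetition code corrects any single flipped bit by majority vote:

```python
# Toy error-correcting code: repeat each bit three times, then decode
# by majority vote. Any single bit flip per triple is corrected.
def encode(bits):
    return [b for b in bits for _ in range(3)]

def decode(coded):
    # Majority vote over each group of three repeated bits.
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

msg = [1, 0, 1]
sent = encode(msg)
sent[4] ^= 1                    # the channel flips one bit in transit
assert decode(sent) == msg      # majority vote recovers the message
```

Real codes (Reed-Solomon, LDPC, polar) achieve far better rates than this triple redundancy; the repetition code only shows the principle.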


The team recently took the MIT cheetah-bot for a test run, where it bounded across the grass at a steady clip.  Courtesy of Jose-Luis Olivares/MIT

Algorithm Enables Untethered Cheetah Robot to Run and Jump

September 16, 2014 2:14 pm | by Jennifer Chu, MIT | News | Comments

MIT researchers have developed an algorithm for bounding that they’ve successfully implemented in a robotic cheetah — a sleek, four-legged assemblage of gears, batteries and electric motors that weighs about as much as its feline counterpart. The team recently took the robot for a test run, where it bounded across the grass at a steady clip. The researchers estimate the robot may eventually reach speeds of up to 30 mph.

“Scalability and performance means taking a careful look at the code modernization opportunities that exist for both message passing and threads as well as opportunities for vectorization and SIMDization.” Rick Stevens, Argonne National Laboratory

Extending the Lifespan of Critical Resources through Code Modernization

September 9, 2014 2:05 pm | by Doug Black | Articles | Comments

As scientific computing moves inexorably toward the Exascale era, an increasingly urgent problem has emerged: many HPC software applications — both public domain and proprietary commercial — are hamstrung by antiquated algorithms and software unable to function in manycore supercomputing environments. Aside from developing an Exascale-level architecture, HPC code modernization is the most important challenge facing the HPC community.
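As a hedged sketch of the kind of modernization the article and the quote above describe, here is a scalar loop rewritten in vectorized form so the runtime can exploit SIMD units (the function names are illustrative, not from the article):

```python
import numpy as np

def saxpy_loop(a, x, y):
    # Antiquated style: one scalar multiply-add per Python-level iteration.
    out = np.empty_like(y)
    for i in range(len(y)):
        out[i] = a * x[i] + y[i]
    return out

def saxpy_vectorized(a, x, y):
    # Modernized style: a single array expression that NumPy dispatches
    # to compiled, SIMD-capable kernels.
    return a * x + y

x = np.arange(4, dtype=np.float64)
y = np.ones(4)
assert np.allclose(saxpy_loop(2.0, x, y), saxpy_vectorized(2.0, x, y))
```

The same transformation, applied at the scale of legacy Fortran or C kernels with threading and message passing layered on top, is what the code-modernization effort targets.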

So far, JOANA is the only software analysis tool worldwide that not only finds all security gaps but also minimizes the number of false alarms, without affecting the functioning of programs.

For Secure Software: X-rays instead of Passport Control

August 27, 2014 3:13 pm | by Karlsruhe Institute of Technology | News | Comments

Trust is good, control is better. This also applies to the security of computer programs. Instead of trusting “identification documents” in the form of certificates, the JOANA software analysis tool examines the source code of a program directly. In this way, it detects leaks through which secret information could escape, or through which outsiders could enter the system. At the same time, JOANA reduces the number of false alarms to a minimum.
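As a hypothetical illustration of the two kinds of leaks such an information-flow analysis hunts for (the examples are mine, not JOANA's), consider a secret that escapes either directly or indirectly:

```python
# Two leak patterns a static information-flow analyzer looks for.
# The analysis itself is not shown; this only illustrates what it detects.
SECRET_PIN = 4321

def explicit_leak():
    # Explicit flow: the secret value is copied directly to a public output.
    return f"debug: {SECRET_PIN}"

def implicit_leak(guess):
    # Implicit flow: no direct copy, but the public result still depends
    # on the secret, so an observer can recover it guess by guess.
    return "yes" if guess == SECRET_PIN else "no"
```

Tools in this class track both kinds of dependency through the program, which is why naive pattern matching produces so many false alarms that a precise analysis must suppress.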

Ramasany Gowthami participated in the creation of an Android app that lets users band together to crack a modern cryptographic code.

Smartphones Set Out to Decipher Cryptographic System

August 25, 2014 4:33 am | by Sébastien Corthésy, EPFL | News | Comments

An Android app has been created that allows users to band together to crack a modern cryptographic code. All encryption schemes, including the widely used RSA, can in theory be broken. How, then, can we ensure that our data remains protected? The answer lies in the time and effort required to break the code.
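As a hedged sketch of that point (not the app's actual method), trial-division factoring breaks a toy RSA-sized modulus instantly, while the same search on a real 2048-bit modulus is computationally out of reach:

```python
# Breaking RSA reduces to factoring the public modulus n = p * q.
# Trial division finds the factors of a tiny modulus immediately;
# the search time grows rapidly with key size, which is the security margin.
def factor(n):
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return None  # n is prime

p, q = 251, 241            # tiny primes, for illustration only
assert factor(p * q) == (241, 251)   # smaller factor is found first
```

A 16-bit modulus like this falls in microseconds; each added bit of key length roughly doubles the work, which is why pooling many phones, as the app does, only reaches codes of modest size.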

NVIDIA CUDA 6.5 Production Release

August 22, 2014 12:15 pm | Nvidia Corporation | Product Releases | Comments

NVIDIA CUDA 6.5 brings GPU-accelerated computing to 64-bit ARM platforms. The toolkit provides programmers with a platform to develop advanced scientific, engineering, mobile and HPC applications on GPU-accelerated ARM and x86 CPU-based systems. Features include support for Microsoft Visual Studio 2013, cuFFT callbacks capability and improved debugging for CUDA FORTRAN applications.

Rock Stars of Cybersecurity will take place in Austin, TX, on September 24, 2014

Top Cybersecurity Advice from the Rock Stars

August 22, 2014 10:57 am | by Amanda Sawyer, IEEE Computer Society | Blogs | Comments

High-profile security breaches, data thefts and cyberattacks are increasing in frequency, ferocity and stealth. They result in significant loss of revenue and reputation for organizations, destabilize governments, and hit everyone’s wallets. Cybersecurity is in the global spotlight and, now more than ever, organizations must understand how to identify weaknesses and protect company infrastructure from incursions.

The climate dataset is the first fine-scale work to correct for artificial trends within weather station data caused by changes in equipment or weather station locations. It also is the first to provide direct estimates of uncertainty and to provide open-

Improving Temperature Modeling across Mountainous Landscapes

August 21, 2014 4:28 pm | by University of Montana | News | Comments

New research by University of Montana doctoral student Jared Oyler provides improved computer models for estimating temperature across mountainous landscapes. Oyler provided a new climate dataset for ecological and hydrological research and natural resource management.

With their new method, computer scientists from Saarland University are able, for the first time, to compute all illumination effects in a simpler and more efficient way. Courtesy of AG Slusallek/Saar-Uni

Realistic Computer Graphics Technology Vastly Speeds Process

August 18, 2014 2:15 pm | by University Saarland | News | Comments

Creating a realistic computer simulation of how light suffuses a room is crucial not just for animated movies like Toy Story or Cars, but also in industry. Existing computing methods can achieve this, but they require great effort. Computer scientists from Saarbrücken have developed a novel approach that vastly simplifies and speeds up the whole calculation process.

The Kilobots, a swarm of one thousand simple but collaborative robots. Courtesy of Mike Rubenstein and Science/AAAS

AI: Self-organizing Thousand-robot Swarm Forms Vast, Complex Shapes

August 18, 2014 12:03 pm | by Caroline Perry, Harvard SEAS | News | Comments

The first thousand-robot flash mob has assembled at Harvard University. Instead of one highly complex robot, a “kilo” of robots collaborate, providing a simple platform for the enactment of complex behaviors. Called Kilobots, these extremely simple robots are each just a few centimeters across and stand on three pin-like legs.

NERSC's next-generation supercomputer, a Cray XC, will be named after Gerty Cori, the first American woman to be honored with a Nobel Prize in science. She shared the 1947 Nobel Prize with her husband Carl (pictured) and Argentine physiologist Bernardo Ho

NERSC Launches Next-Generation Code Optimization Effort

August 15, 2014 9:41 am | by NERSC | News | Comments

With the promise of exascale supercomputers looming on the horizon, much of the roadmap is dotted with questions about hardware design and how to make these systems energy efficient enough so that centers can afford to run them. Often taking a back seat is an equally important question: will scientists be able to adapt their applications to take advantage of exascale once it arrives?

LabVIEW 2014 software adds new capabilities to acquire, analyze and visualize data from anywhere, at any time.

LabVIEW 2014 System Design Software

August 12, 2014 11:06 am | National Instruments | Product Releases | Comments

LabVIEW 2014 system design software standardizes the way users interact with hardware through reuse of the same code and engineering processes across systems, which scales applications for the future. This saves time and money as technology advances, requirements evolve and time-to-market pressure increases.

Four Seasons: A new algorithm developed by Brown computer scientists allows users to change the season and other “transient attributes” in outdoor photos.

Photo Editing Algorithm Changes Weather, Seasons Automatically

August 11, 2014 12:34 pm | by Brown University | News | Comments

A computer algorithm being developed by Brown University researchers enables users to instantly change the weather, time of day, season, or other features in outdoor photos with simple text commands. Machine learning and a clever database make it possible.

NSF's Secure and Trustworthy Cyberspace (SaTC) program will support more than 225 new projects in 39 states in 2014. The awards enable research from the theoretical to the experimental, and aim to minimize the misuses of cyber technology, bolster educatio

Frontier-scale Projects Expand Breadth and Impact of Cybersecurity, Privacy Research

August 6, 2014 3:35 pm | by NSF | News | Comments

As our lives and businesses become ever more intertwined with the Internet and networked technologies, it is crucial to continue to develop and improve cybersecurity measures to keep our data, devices and critical systems safe, secure, private and accessible. The NSF's Secure and Trustworthy Cyberspace program has announced two new center-scale "Frontier" awards to support projects that address grand challenges in cybersecurity science

The blurred image on the left shows how a farsighted person would see a computer screen without corrective lenses. In the middle is how that same person would perceive the picture using a display that compensates for visual impairments.

Vision-correcting Display Makes Reading Glasses So Yesterday

July 30, 2014 3:46 pm | by Sarah Yang, UC Berkeley | News | Comments

What if computer screens had glasses instead of the people staring at the monitors? That concept is not too far afield from technology being developed by UC Berkeley computer and vision scientists. The researchers are developing computer algorithms to compensate for an individual’s visual impairment, and creating vision-correcting displays that enable users to see text and images clearly without wearing eyeglasses or contact lenses.

AMD Opteron 64-Bit ARM-Based Developer Kits

July 30, 2014 12:45 pm | Advanced Micro Devices, Inc. | Product Releases | Comments

The AMD Opteron A1100-Series developer kit features AMD's first 64-bit ARM-based processor, codenamed "Seattle." The processor is offered in 4- and 8-core ARM Cortex-A57 configurations; up to 4 MB of shared L2 and 8 MB of shared L3 cache; configurable dual DDR3 or DDR4 memory channels with ECC at up to 1866...

K computer installed in the computer room. Each computer rack is equipped with about 100 CPUs. In the Computer Building, 800 or more computer racks are installed for the K computer.  Courtesy of Riken

K Computer Runs Largest Ever Ensemble Simulation of Global Weather

July 25, 2014 2:25 pm | by RIKEN | News | Comments

Ensemble forecasting is a key part of weather forecasting. Computers typically run multiple simulations using slightly different initial conditions or assumptions, and then analyze them together to try to improve forecasts. Using Japan’s K computer, researchers have succeeded in running 10,240 parallel simulations of global weather, the largest number ever performed, using data assimilation to reduce the range of uncertainties.
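As a minimal sketch of the ensemble idea described above (a toy stand-in, not the K computer's model), run the same chaotic system from slightly perturbed initial conditions and read the forecast from the ensemble mean and the uncertainty from its spread:

```python
import random

def toy_model(x0, steps=10):
    # Stand-in for a weather model: the logistic map in its chaotic regime,
    # so tiny initial differences grow, just as in the atmosphere.
    x = x0
    for _ in range(steps):
        x = 3.7 * x * (1 - x)
    return x

random.seed(0)
# A small ensemble for brevity; the article's run used 10,240 members.
members = [toy_model(0.5 + random.uniform(-1e-3, 1e-3)) for _ in range(40)]
mean = sum(members) / len(members)          # the ensemble forecast
spread = max(members) - min(members)        # a crude uncertainty estimate
```

Data assimilation, as used in the K computer study, then feeds observations back in to narrow that spread over successive forecast cycles.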

HPC-X Scalable Software Toolkit for High-Performance Computing Platforms and Applications

July 25, 2014 2:01 pm | Mellanox Technologies, Inc. | Product Releases | Comments

HPC-X Scalable Software Toolkit is a comprehensive software suite for high-performance computing environments that provides enhancements to significantly increase the scalability and performance of message communications in the network. The toolkit provides complete communication libraries to support the MPI, SHMEM and PGAS programming models, as well as performance accelerators that take advantage of Mellanox scalable interconnect solutions.

Math Can Make the Internet 5 to 10 Times Faster

July 18, 2014 3:52 pm | by Aalborg University | News | Comments

Mathematical equations can make Internet communication via computer, mobile phone or satellite many times faster and more secure than today. Results with software developed by researchers from Aalborg University in collaboration with the Massachusetts Institute of Technology (MIT) and California Institute of Technology (Caltech) are attracting attention in the international technology media.
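The mathematics in question is network coding. As a hedged, minimal sketch of the core trick (my simplification, not the researchers' software): instead of retransmitting individual packets, nodes send algebraic combinations of them, and over GF(2) a combination is just XOR:

```python
# Network coding over GF(2): a relay sends the XOR of two packets.
# Any receiver that already holds one packet recovers the other from the
# combination, with no retransmission of the missing packet itself.
a = 0b1010            # packet held by receiver 1
b = 0b0110            # packet held by receiver 2
coded = a ^ b         # single combined packet broadcast to both

recovered_b = coded ^ a    # receiver 1 solves for b
recovered_a = coded ^ b    # receiver 2 solves for a
assert recovered_b == b and recovered_a == a
```

Generalizing from XOR to random linear combinations over larger finite fields lets any sufficiently many independent coded packets reconstruct the originals, which is where the speed and robustness gains come from.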

Electrostatic potential fluctuations in an annular region at mid-radius in the MAST tokamak, from a gyrokinetic simulation of the saturated turbulence using the GS2 code. A wedge of plasma has been removed from the visualisation so as to view the nature o

EPCC wins HPC Innovation Excellence Award

July 8, 2014 3:48 pm | by Adrian Jackson, EPCC | News | Comments

EPCC is delighted to be part of a team that has won an HPC Innovation Excellence Award. Presented at the International Supercomputing Conference (ISC14) in Leipzig (22-26 June 2014), the awards recognize outstanding application of HPC for business and scientific achievements.

Integration between Moab HPC Suite and Bright Cluster Manager provides enhanced functionality that enables users to dynamically provision HPC clusters based on both resource and workload monitoring.

Cluster Manager

June 30, 2014 9:04 am | Adaptive Computing, Bright Computing | Product Releases | Comments

Integration between Moab HPC Suite and Bright Cluster Manager provides enhanced functionality that enables users to dynamically provision HPC clusters based on both resource and workload monitoring. Combined capabilities also create a more optimal solution to managing technical computing and Big Workflow requirements.

UNLV's Supercomputing Center Switches to Altair PBS Professional

June 27, 2014 10:40 am | Altair Engineering | News | Comments

Altair has announced that the National Supercomputing Center for Energy and the Environment (NSCEE) at the University of Nevada, Las Vegas, (UNLV) has chosen PBS Professional to replace its previous high-performance computing (HPC) workload management implementation.

Some of the many variations the new Learning Everything about Anything, or LEVAN, program has learned for three different concepts.

Fully Automated Computer Program Teaches Itself Everything about Anything

June 13, 2014 3:18 pm | by Michelle Ma | News | Comments

In today’s digitally driven world, access to information appears limitless. But when you have something specific in mind that you can’t name, like that niche kitchen tool you saw at a friend’s house, it can be surprisingly hard to sift through the volume of information online and know how to search for it. Or, the opposite problem can occur — we can look up anything on the Internet, but how can we be sure we're finding every...

By means of an algorithm, increasing networking of students on Facebook can be displayed according to their age. Courtesy of Michael Hamann, KIT

Algorithms for Big Data: Optimizing Daily, Routine Processing

June 10, 2014 4:32 am | by Karlsruhe Institute of Technology | News | Comments

Computer systems today can be found in nearly all areas of life, from smartphones to smart cars to self-organized production facilities. These systems supply rapidly growing data volumes, and computer science now faces the challenge of processing these huge amounts of data (big data) in a reasonable and secure manner.

High-resolution CESM simulation run on Yellowstone. This featured CAM-5 spectral element at roughly 0.25deg grid spacing, and POP2 on a nominal 0.1deg grid.

Building Momentum for Code Modernization: The Intel Parallel Computing Centers

June 9, 2014 12:06 pm | by Doug Black | Articles | Comments

Like a Formula One race car stuck in a traffic jam, HPC hardware performance is frequently hampered by HPC software. This is because some of the most widely used application codes have not been updated for years, if ever, leaving them unable to leverage advances in parallel systems. As hardware power moves toward exascale, the imbalance between hardware and software will only get worse. The problem of updating essential scientific ...

Tom Vander Aa is a researcher/project coordinator in the ExaScience Life Lab at imec.

High Performance Communication

June 9, 2014 10:12 am | by Tom Vander Aa, imec ExaScience Life Lab | Blogs | Comments

In the late '90s, I taught students parallel programming in C using MPI. The most important lesson I wanted them to remember was that communication is much more important than computation. The form of the benchmark couldn't be more common: a set of convolutional filters applied to an image, one filter after the other in a pipelined fashion.
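The benchmark described above can be sketched as follows (a pure-Python, 1-D "image" for brevity; the MPI distribution across ranks, which is the point of the full lesson, is omitted):

```python
# A pipeline of convolutional filters applied to a signal one after the other,
# the structure of the benchmark described in the post.
def convolve(signal, kernel):
    # Valid-mode 1-D convolution (kernel not flipped, i.e. correlation).
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

image = [0, 0, 1, 1, 1, 0, 0]
pipeline = [[1, 1, 1],    # smoothing filter
            [1, -1]]      # difference (edge-detect) filter

out = image
for kernel in pipeline:   # each stage consumes the previous stage's output
    out = convolve(out, kernel)
```

In the MPI version, each filter stage would run on its own rank, and the cost of passing intermediate images between ranks, not the arithmetic, dominates: exactly the communication-over-computation lesson.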
