Computers

The Lead

The NORAD Tracks Santa Web site features a mobile version, a holiday countdown, and new games and daily activities.

NORAD Ready to Track Santa’s Flight

December 16, 2014 11:19 am | by North American Aerospace Defense Command | News | Comments

The North American Aerospace Defense Command is once again ready to track Santa’s yuletide journey. It all started in 1955 when a local media advertisement directed children to call Santa direct — only the number was misprinted. Instead of reaching Santa, the phone rang through to the Crew Commander on duty at the Continental Air Defense Command Operations Center. Thus began the tradition, which NORAD has carried on since 1958.

New Theory could Yield More Reliable Communication Protocols

December 12, 2014 5:23 pm | by Larry Hardesty, MIT | News | Comments

Communication protocols for digital devices are very efficient but also very brittle: They...

New Open Access Book Series Introduces Essentials of Computing Science

December 11, 2014 3:58 pm | by Springer | News | Comments

Springer and Simula have launched a new book series, which aims to provide introductions to...

World's Oldest Computer, Ancient Greek Antikythera Mechanism, 100 Years Older than Previously Believed

December 9, 2014 2:10 pm | by University of Puget Sound | News | Comments

An ancient Greek astronomical puzzle has one more piece in place. The new evidence results from...

A new machine-learning algorithm clusters data according to both a small number of shared features (circled in blue) and similarity to a representative example (far right). Courtesy of Christine Daniloff

Teaching by Example: Pattern-recognition Systems Convey What they Learn to Humans

December 9, 2014 2:00 pm | by Larry Hardesty, MIT | News | Comments

Computers are good at identifying patterns in huge data sets. Humans, by contrast, are good at inferring patterns from just a few examples. In a paper appearing at the Neural Information Processing Society’s conference next week, MIT researchers present a new system that bridges these two ways of processing information, so that humans and computers can collaborate to make better decisions.
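The paper's own algorithm is not detailed in this brief. As a rough, hypothetical sketch of the general idea it describes (summarizing each cluster by a representative example plus a handful of features its members share), the Python snippet below picks a medoid and the lowest-variance features for each k-means cluster; the function name `explain_clusters` and all parameters are invented for illustration and are not the MIT system.

```python
# Illustrative toy only, not the MIT system: summarize each k-means cluster
# by a prototype (the member nearest the center) and the features on which
# its members agree most closely (lowest within-cluster variance).
import numpy as np
from sklearn.cluster import KMeans

def explain_clusters(X, n_clusters=3, n_features=2):
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
    summaries = []
    for c in range(n_clusters):
        members = X[km.labels_ == c]
        center = km.cluster_centers_[c]
        # Prototype: the member closest to the cluster center (a medoid).
        prototype = members[np.argmin(np.linalg.norm(members - center, axis=1))]
        # "Shared" features: those with the smallest spread inside the cluster.
        shared = np.argsort(members.var(axis=0))[:n_features]
        summaries.append({"prototype": prototype, "shared_features": shared})
    return summaries

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc, 0.3, size=(50, 4)) for loc in (0.0, 3.0, 6.0)])
for i, s in enumerate(explain_clusters(X)):
    print(i, s["shared_features"], s["prototype"].round(2))
```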

This tiny slice of silicon, etched in Jelena Vuckovic's lab at Stanford with a pattern that resembles a bar code, is one step on the way toward linking computer components with light instead of wires. Courtesy Vuckovic Lab

New Algorithm a Big Step toward Using Light to Transmit Data

December 9, 2014 1:38 pm | by Stanford University, Chris Cesare | News | Comments

Engineers have designed and built a prism-like device that can split a beam of light into different colors and bend the light at right angles, a development that could eventually lead to computers that use optics, rather than electricity, to carry data. The optical link is a tiny slice of silicon etched with a pattern that resembles a bar code. When a beam of light is shined at the link, two different wavelengths of light split off...

Computer Model Enables Design of Complex DNA Shapes

December 3, 2014 3:45 pm | by Anne Trafton, MIT | News | Comments

Biological engineers have created a new computer model that allows them to design the most complex three-dimensional DNA shapes ever produced, including rings, bowls, and geometric structures such as icosahedrons that resemble viral particles. 

MIT Engineers Have High Hopes for Cheetah Robot

December 2, 2014 3:27 pm | by Rodrique Ngowi, Associated Press | News | Comments

It's a robot unlike any other: inspired by the world's fastest land animal, controlled by video game technology and packing nifty sensors — including one used to maneuver drones, satellites and ballistic missiles. The robot, called the cheetah, is the creation of researchers at the Massachusetts Institute of Technology, who had to design key elements from scratch because of a lack of, or shortcomings in, existing technology.

Using Light Instead of Wires Inside Computers

December 2, 2014 3:01 pm | by Chris Cesare, Stanford University | News | Comments

Engineers have designed and built a prism-like device that can split a beam of light into different colors and bend the light at right angles, a development that could eventually lead to computers that use optics, rather than electricity, to carry data.

The program showed "mind-blowing" sophistication by penetrating several different computer networks in an unnamed Middle Eastern country. Rather than communicate with each target, the malware was able to avoid detection by using one network to relay commands.

Mind-blowingly Sophisticated Hacking Program is Groundbreaking, Almost Peerless

November 26, 2014 12:15 pm | by Brandon Bailey, AP Technology Writer | News | Comments

Cyber-security researchers say they've identified a highly sophisticated computer hacking program that appears to have been used to spy on banks, telecommunications companies, official agencies and other organizations around the world. The malicious software known as "Regin" is designed to collect data from its targets for periods of months or years, penetrating deep into computer networks while covering its tracks to avoid detection.

NAG Compiler 6.0

November 26, 2014 9:06 am | Nag Ltd | Product Releases | Comments

NAG Compiler 6.0 accurately follows the Fortran and OpenMP programming language standards, supporting OpenMP 3.1 and Fortran 2008, 2003 and 95. Because the code is correct, applications that are developed with and checked by the NAG Compiler are ready to be run on a wide range of current and future computer processors.

The National Medal of Science and National Medal of Technology and Innovation medals ready to be presented to awardees. Courtesy of Sandy Schaeffer, NSF

National Medals of Science, Technology and Innovation Presented

November 25, 2014 12:00 pm | by NSF | News | Comments

At a White House ceremony on November 20, 2014, President Obama presented the National Medal of Science and National Medal of Technology and Innovation to individuals who have made outstanding contributions to science and engineering. The awards are the nation's highest honors for achievement and leadership in advancing the fields of science and technology.

Wireless communication and consumption of digital media might profit from freely accessible transmission frequencies in the UHF range. Courtesy of KIT

New Frequency Ranges May Make Free Super WiFi Possible

November 24, 2014 4:03 pm | by KIT – University of the State of Baden-Wuerttemberg and National Research Center of the Helmholtz Association | News | Comments

Wireless data transmission largely takes place via WLAN networks, such as WiFi. However, these networks are currently limited to high frequency ranges of 2 GHz and above and, hence, have a limited range. The authors of the study propose extending the frequencies available for free communication to include lower ranges, as well as allowing increased transmission power.
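As a back-of-the-envelope illustration (not taken from the KIT study), the standard free-space path loss formula shows why lower UHF frequencies reach farther than the 2.4 GHz band commonly used by WiFi; the frequencies compared below are chosen only as examples.

```python
# Rough illustration, not from the study: free-space path loss rises with
# frequency, one reason lower UHF bands propagate farther than 2.4 GHz WiFi.
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

for f_mhz in (700, 2400, 5000):  # example UHF "white space" vs. common WiFi bands
    print(f"{f_mhz:5d} MHz over 1 km: {fspl_db(1.0, f_mhz):.1f} dB loss")
```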

The Turing Test — originally called the Imitation Game — was proposed by computing pioneer Alan Turing in 1950. Courtesy of Juan Alberto Sánchez Margallo

Alternative to Turing Test Proposed

November 21, 2014 4:39 pm | by Georgia Institute of Technology | News | Comments

A Georgia Tech professor recently offered an alternative to the celebrated “Turing Test” to determine whether a machine or computer program exhibits human-level intelligence. The Turing Test — originally called the Imitation Game — was proposed by computing pioneer Alan Turing in 1950. In practice, some applications of the test require a machine to engage in dialogue and convince a human judge that it is an actual person.

John Joyce is a laboratory informatics specialist based in Richmond, VA.

Still Holiday Shopping? 19 Great Gifts for Just about Anyone

November 19, 2014 1:35 pm | by John R. Joyce, Ph.D. | Blogs | Comments

Welcome to the second installment of Scientific Computing's holiday gift guide. This section focuses on gifts of more general interest, so that they'll be suitable for your giftees without a technical background — you know, muggles!  But don't worry, many of these are suitable for the Geeks in your life as well. Part I focuses on more technical items, though there are a number that might appeal to your more sophisticated muggle as well.

Eric Eide, University of Utah research assistant professor of computer science, stands in the computer science department's "Machine Room," where racks of Web servers sit.

Self-Repairing Software Tackles Malware

November 14, 2014 3:54 pm | by University of Utah | News | Comments

Computer scientists have developed software that not only detects and eradicates never-before-seen viruses and other malware, but also automatically repairs damage caused by them. The software then prevents the invader from ever infecting the computer again.

Since its inception in 1966, ACM’s Turing Award has honored the computer scientists and engineers who created the systems and underlying theoretical foundations that have propelled the information technology industry.

Turing Award Prize Raised to $1 Million Cash

November 14, 2014 2:25 pm | by ACM | News | Comments

ACM has announced that the funding level for the ACM A.M. Turing Award is now $1,000,000, to be provided by Google. The new amount is four times its previous level. The cash award, which goes into effect for the 2014 ACM Turing Award to be announced early next year, reflects the escalating impact of computing on daily life through the innovations and technologies it enables.

Top down view of the gmon qubit chip (0.6 cm x 0.6 cm) connected to microwave frequency control lines (copper) with thin wire bonds. Courtesy of Michael Fang, Martinis Lab

Piece of the Quantum Puzzle: Achieving Controllability to Explore Simulation

November 13, 2014 1:57 pm | by Julie Cohen, UC Santa Barbara | News | Comments

While the Martinis Lab at UC Santa Barbara has been focusing on quantum computation, former postdoctoral fellow Pedram Roushan and several colleagues have been exploring qubits (quantum bits) for quantum simulation on a smaller scale. In conjunction with developing a general-purpose quantum computer, Martinis’ team worked on a new qubit architecture, which is an essential ingredient for quantum simulation.

John Joyce is a laboratory informatics specialist based in Richmond, VA.

Holiday Shopping? 25 Gifts Sheldon and Friends would Love

November 13, 2014 8:40 am | by John R. Joyce, Ph.D. | Blogs | Comments

Welcome to Scientific Computing's annual holiday gift guide. In this section, we've focused on identifying gifts suitable for the true Geeks out there. However, I believe everyone has a little geek in them; it just needs to be properly nurtured to catch fire.

By using a technique called ion doping, the team of researchers have discovered a material that could use light to bring together different computing functions into one component, leading to all-optical systems.

Lighting the Way for Super-fast Computers

November 12, 2014 3:28 pm | by University of Surrey | News | Comments

Findings demonstrate how glass can be manipulated to create a material that will enable computers to transfer information using light. This development could significantly increase computer processing speeds and power in the future. The findings show that it’s possible to change the electronic properties of amorphous chalcogenides, a glass material integral to data technologies such as CDs and DVDs.

A Q&A with Paul Messina, Director of Science for the Argonne Leadership Computing Facility

November 6, 2014 4:22 pm | by Brian Grabowski, Argonne National Laboratory | Articles | Comments

Highly motivated to organize the Argonne Training Program on Extreme-Scale Computing, Paul Messina reflects on what makes the program unique and a can’t-miss opportunity for the next generation of HPC scientists. ATPESC is an intense, two-week program that covers most of the topics and skills necessary to conduct computational science and engineering research on today’s and tomorrow’s high-end computers.

Traditionally, a person might enter a password or pull out a driver's license or passport as proof of identity. But increasingly, identification and authentication can also require an eye scan or a well-placed hand. It's a science known as biometrics.

Cybersecurity: Computer Scientist Sees New Possibilities for Ocular Biometrics

November 4, 2014 12:37 pm | by Miles O'Brien, NSF | News | Comments

Researchers are developing a three-layered, multi-biometric approach that tracks the movement of the eye globe and its muscles, and monitors how and where a person's brain focuses visual attention, in addition to scanning patterns in the iris. The system essentially upgrades the security of existing iris recognition technology with nothing more than a software upgrade.
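The article does not spell out how the three signals are combined. Purely as a generic, hypothetical illustration of score-level fusion in multi-biometric systems, the sketch below merges three per-modality match scores with a weighted sum and a threshold; the weights, threshold and function name are made up and are not from this research.

```python
# Generic, hypothetical sketch of score-level fusion for a multi-biometric
# check; weights, threshold and modality names are invented for illustration.
def fuse_scores(iris: float, eye_movement: float, attention: float,
                weights=(0.5, 0.3, 0.2), threshold=0.7) -> bool:
    """Each score is a match confidence in [0, 1]; return True to accept."""
    combined = (weights[0] * iris
                + weights[1] * eye_movement
                + weights[2] * attention)
    return combined >= threshold

print(fuse_scores(iris=0.9, eye_movement=0.8, attention=0.6))  # True
print(fuse_scores(iris=0.9, eye_movement=0.2, attention=0.1))  # False
```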

"Become an Eyelander" vision training game: Researchers from the University of Lincoln, UK, and the Wesc Foundation have developed a new computer game which could hold the key to helping visually impaired children lead independent lives. It is now in clinical trials.

Computational Neuroscientist Develops Computer Game to Aid Visually Impaired

November 3, 2014 11:14 am | by University of Lincoln | News | Comments

Researchers will begin testing a new computer game that they hope could hold the key to helping visually impaired children lead independent lives. Developed by a team of neuroscientists and video game designers, the Eyelander game features exploding volcanoes, a travelling avatar and animated landscapes.

A new system lets programmers identify sections of their code that can tolerate a little error. The system then determines which program instructions to assign to unreliable hardware components, to maximize energy savings while still meeting the programmer's accuracy requirements.

Harnessing Error-prone Chips Trades Computational Accuracy for Energy Savings

October 31, 2014 2:09 pm | by Larry Hardesty, MIT | News | Comments

As transistors get smaller, they also grow less reliable. Increasing their operating voltage can help, but that means a corresponding increase in power consumption. With information technology consuming a steadily growing fraction of the world’s energy supplies, some researchers and hardware manufacturers are exploring the possibility of simply letting chips botch the occasional computation.
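The MIT system itself is not shown here. As a toy, hypothetical illustration of the accuracy-for-energy trade the teaser describes, the snippet below runs a long summation on a simulated unreliable unit that occasionally flips the lowest mantissa bit, then measures how far the result drifts from an exact run; the function names and error rate are invented.

```python
# Toy simulation, not the MIT system: perform a long summation on simulated
# unreliable hardware that occasionally flips the least-significant mantissa
# bit, then compare against the exact result.
import random
import struct

def flip_low_bit(x: float) -> float:
    """Flip the least-significant bit of a 64-bit float's representation."""
    bits = struct.unpack("<Q", struct.pack("<d", x))[0]
    return struct.unpack("<d", struct.pack("<Q", bits ^ 1))[0]

def unreliable_sum(values, error_rate=0.01, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for v in values:
        total += v
        if rng.random() < error_rate:  # an occasional "botched" operation
            total = flip_low_bit(total)
    return total

data = [0.1] * 100_000
exact = sum(data)
approx = unreliable_sum(data)
print(f"exact={exact:.10f} approx={approx:.10f} "
      f"relative error={abs(exact - approx) / exact:.2e}")
```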

The software stores only the changes of the system state at specific points in time. Courtesy of Université du Luxembourg, Boshua

New Algorithm Provides Enormous Reduction in Computing Overhead

October 30, 2014 4:37 pm | by University of Luxembourg | News | Comments

The control of modern infrastructure, such as intelligent power grids, needs lots of computing capacity. Scientists have developed an algorithm that might revolutionize these processes. With their new software, researchers are able to forgo considerable amounts of computing capacity, enabling what they call micro mining.
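The Luxembourg team's software is not reproduced here. The caption's idea of storing only the changes of the system state at specific points in time can, however, be illustrated with a minimal, hypothetical delta-logging sketch; the state fields and function names below are invented.

```python
# Generic sketch of recording only state *changes* rather than full
# snapshots; invented names and fields, not the Luxembourg team's software.
def record_delta(history, prev_state, new_state):
    """Append only the keys whose values changed since the previous state."""
    delta = {k: v for k, v in new_state.items() if prev_state.get(k) != v}
    history.append(delta)
    return new_state

def replay(initial_state, history):
    """Rebuild the latest state by applying the stored deltas in order."""
    state = dict(initial_state)
    for delta in history:
        state.update(delta)
    return state

history = []
state = {"bus1_load": 40, "bus2_load": 55, "breaker7": "closed"}
initial = dict(state)
state = record_delta(history, state,
                     {"bus1_load": 42, "bus2_load": 55, "breaker7": "closed"})
state = record_delta(history, state,
                     {"bus1_load": 42, "bus2_load": 61, "breaker7": "open"})
print(history)                   # only the changed fields are stored
print(replay(initial, history))  # equals the latest full state
```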

When properly understood, privacy rules are essential, Neil M. Richards, JD, professor of law says.

Right to Privacy: Achieving Meaningful Protection in a Big Data World

October 24, 2014 8:42 pm | by Washington University in St. Louis | News | Comments

In the digital age in which we live, monitoring, security breaches and hacks of sensitive data are all too common. It has been argued that privacy has no place in this big data environment and anything we put online can, and probably will, be seen by prying eyes. In a new paper, a privacy law expert makes the case that, when properly understood, privacy rules will be an essential and valuable part of our digital future.

Researchers found that people who played violent video games in 3-D showed more evidence of anger afterward than did people who played using traditional 2-D systems — even those with large screens. © lassedesignen / Fotolia

Violent 3-D Gaming Provokes More Anger

October 24, 2014 5:17 pm | by Jeff Grabmeier, The Ohio State University | News | Comments

Playing violent video games in 3-D makes everything seem more real — and that may have troubling consequences for players, a new study reveals. Researchers found that people who played violent video games in 3-D showed more evidence of anger afterward than did people who played using traditional 2-D systems — even those with large screens.

Researchers are expanding the applicability of biological circuits. Background: Microscopic image of human kidney cells with fluorescent proteins in cell culture.

Constructing Precisely Functioning, Programmable Bio-computers

October 23, 2014 3:40 pm | by Fabio Bergamin, ETH | News | Comments

Bio-engineers are working on the development of biological computers with the aim of designing small circuits made from biological material that can be integrated into cells to change their functions. In the future, such developments could enable cancer cells to be reprogrammed, thereby preventing them from dividing at an uncontrollable rate. Stem cells could likewise be reprogrammed into differentiated organ cells.

Set up of the experiment showing the orthogonal side illumination  © Vetlugin et al.

Quantum Holograms could become Quantum Information Memory

October 22, 2014 12:22 pm | by Springer | News | Comments

Russian scientists have developed a theoretical model of quantum memory for light, adapting the concept of a hologram to a quantum system. The authors demonstrate for the first time that it is theoretically possible to retrieve, on demand, a given portion of the stored quantized light signal of a holographic image — set in a given direction in a given position in time sequence.
