Personal portable information technology is advancing at a breathtaking speed. An article shows the potential applications for Google Glass in the surgical setting, particularly in relation to training. The authors of the study obtained a Glass device through Google's Explorer Program and tested its applicability in their daily pediatric surgical practice.
The New York World's Fair of 1964 introduced 51 million visitors to a range of...
NASA has made available to the public, at no cost, more than 1,000 codes with its release on...
A new DARPA technology office will merge biology, engineering and computer science to harness the power of natural systems for national security. Technology, like biology, constantly evolves. It is DARPA’s mission to stay ahead of the shifting technology curve by making critical, early investments in areas that cut across fields of research and enable revolutionary new capabilities for U.S. national security.
Cyber attacks are the primary domestic security threat facing the United States, FBI Director James Comey told the Senate Homeland Security Committee last year. In our brave new world, traditional warfare is now inextricably linked to economic and cyber warfare. In just one example, cyber strikes have the potential to derail a nation's power grid, causing widespread damage, chaos and loss of life.
Is it too easy for high-tech companies to patent inventions that are not really new, but simply take an old idea and blend it with computer wizardry? The Supreme Court wrestled with that question on March 31, 2014, as justices considered making it tougher for the government to issue patents for computer software. The outcome could send tremors through an industry that touches virtually every sector of the economy.
The results of a two-year study into dream control show that it is now possible for people to create their perfect dream and wake up feeling especially happy and refreshed. An iPhone app monitors a person during sleep and plays a carefully crafted soundscape when they dream. Each soundscape is carefully designed to evoke a pleasant scenario. The app was downloaded over 500,000 times, and the researchers collected millions of dream reports.
Personal fabrication machines, such as 3-D printers and laser cutters, are becoming increasingly ubiquitous. But designing objects for fabrication still requires 3-D modelling skills, putting it out of reach for people without specialist training. The MixFab environment enables users to design objects in an immersive augmented reality environment, interact with virtual objects in a direct gestural manner...
An international team of researchers, with participation from the UAB, has managed to create an entanglement of 103 dimensions with only two photons. The previous record stood at 11 dimensions. The discovery could represent a great advance toward the construction of quantum computers with much higher processing speeds than current ones, and toward better encryption of information.
The Aurora G-Station and the Aurora Cube (its CPU-only version) are full HPC systems that combine computation, management, and storage functionality with a liquid cooling infrastructure that guarantees compactness and silent operation.
In theory, doubling the number of cores doubles the chip’s efficiency, but splitting up computations so that they run efficiently in parallel isn’t easy. On the other hand, say a trio of computer scientists from MIT, Israel’s Technion, and Microsoft Research, neither is it as hard as had been feared.
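The "splitting up computations" the researchers refer to can be sketched with a toy example (an illustration of the general idea, not the team's actual technique): divide the work into independent chunks, compute partial results concurrently, and combine them at the end. The function names and chunking scheme here are hypothetical.

```python
# Toy illustration of splitting a computation into independent chunks.
# Threads are used for simplicity; for CPU-bound work in CPython, real
# speedup would require processes because of the global interpreter lock.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    """Compute one chunk's contribution independently of the others."""
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=4):
    """Sum the integers in [0, n) by splitting the range across workers."""
    step = n // workers
    # Give the last worker any leftover elements so the chunks cover [0, n).
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))
```

The hard part the article alludes to is that most real computations are not this embarrassingly parallel: when chunks share state, coordinating them without destroying the speedup is where the difficulty lies.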
Civil rights leader Rev. Jesse Jackson led a delegation to the Hewlett-Packard annual shareholders meeting on March 19, 2014, to bring attention to Silicon Valley's poor record of including blacks and Latinos in hiring, board appointments and startup funding. Jackson's strategy borrows from the traditional civil rights era playbook of shaming companies to prod them into transformation.
A team of physicists has proposed a novel and efficient way to leverage the strange quantum physics phenomenon known as entanglement. The approach would involve combining light-emitting diodes (LEDs) with a superconductor to generate entangled photons and could open up a rich spectrum of new physics as well as devices for quantum technologies, including quantum computers and quantum communication.
The British inventor of the World Wide Web wants a digital bill of rights to protect Internet users from surveillance. Speaking on the 25th anniversary of his creation, Tim Berners-Lee says he hopes to spark a global conversation about the need to defend principles that have made the Web successful.
According to a recent study from the University of Cambridge, software developers spend about half of their time detecting and resolving errors. Projected onto the global software industry, the study estimates, this amounts to a bill of about 312 billion US dollars every year.
Optical data storage does not require expensive magnetic materials as synthetic alternatives work just as well. This is the finding of an international team from York, Berlin and Nijmegen, published Thursday February 27 in Applied Physics Letters. The team’s discovery brings the much cheaper method...
Researchers at IBM have set a new record for data transmission over a multimode optical fiber, a type of cable that is typically used to connect nearby computers within a single building or on a campus. The achievement demonstrated that the standard, existing technology for sending data over short distances should be able to meet the growing needs of servers, data centers and supercomputers through the end of this decade.
How do you build a universal quantum computer? As it turns out, this question was addressed by theoretical physicists about 15 years ago. The answer was laid out in a research paper and has become known as the DiVincenzo criteria. The prescription is pretty clear at a glance; yet in practice the physical implementation of a full-scale universal quantum computer remains an extraordinary challenge.
Every second, a computer must process billions of computational steps to produce even the simplest outputs. Imagine if every one of those steps could be made just a tiny bit more efficient. Researchers have developed a series of novel devices that do just that.
Since he was a graduate student, Armando Solar-Lezama, an associate professor in MIT’s Department of Electrical Engineering and Computer Science, has been working on a programming language called Sketch, which allows programmers to simply omit some of the computational details of their code. Sketch then automatically fills in the gaps.
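The idea behind Sketch can be illustrated with a toy solver (a hypothetical Python illustration, not the actual Sketch system, which is its own language and far more sophisticated): the programmer writes an expression with a hole, such as `x * ??`, and the tool searches for a constant that makes the program agree with a reference specification on a set of test inputs.

```python
# Toy illustration of "sketching": the programmer omits a constant,
# and the synthesizer searches for a value that satisfies the spec.

def synthesize_double():
    """Fill the hole in the sketch 'x * ??' so it matches 'x + x'."""
    spec = lambda x: x + x              # reference behavior the programmer wants
    for candidate in range(64):         # search the hole's candidate space
        # Check the filled-in sketch against the spec on sample inputs.
        if all(x * candidate == spec(x) for x in range(-10, 11)):
            return candidate            # hole filled
    return None                         # no constant in range satisfies the spec
```

Real synthesis tools search vastly larger spaces than a 64-value range, typically with constraint solvers rather than enumeration, but the contract is the same: the programmer supplies the shape of the code and a specification, and the tool fills in the details.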
A research collaboration has demonstrated the world's fastest silicon-based device to date. The investigators from IHP-Innovations for High Performance Microelectronics in Germany and the Georgia Institute of Technology operated a silicon-germanium (SiGe) transistor at 798 gigahertz (GHz) fMAX, exceeding the previous speed record for silicon-germanium chips by about 200 GHz.
It used to be that "hacking" was just a type of crime, a computer break-in. But today, the term is also part of a growing — and perfectly legal — mainstay of the tech sector. Computer programming competitions known as "hackathons" have spread like viruses in recent years as ways for geeks, nerds and designers to get together to eat pizza, lose sleep and create something new.
The scientists and inventors who make big-screen superheroes, spectacular explosions and other only-in-the-movies effects possible have their own Oscar ceremony. Kristen Bell and Michael B. Jordan hosted the film academy's Scientific and Technical Awards February 15, 2014, at the Beverly Hills Hotel, recognizing more than 50 of the most creative scientists and engineers in the movie business.
IBM announced a new service offering to help critical infrastructure organizations utilize a new Cybersecurity Framework announced by the Administration at the White House. The new Cybersecurity Framework is the product of a year-long collaboration between the U.S. government and industry, coordinated and led by the National Institute of Standards and Technology (NIST).
“When I was growing up, I thought the 'gender war' was over and women had won. But it’s still not over,” says Amy Yin ’14, cofounder of Harvard Women in Computer Science (Harvard WICS). “The biases may be more subtle now, but the statistics are not.”
A brain-computer interface allows people to use only their thoughts to control a flying quadcopter. With support from the National Science Foundation (NSF), biomedical engineer Bin He and his team at the University of Minnesota have created the interface with the goal of helping people with disabilities, such as paralysis, regain the ability to do everyday tasks.
Alan Turing: His Work and Impact was selected for the top honor, the R.R. Hawkins Award, at the 38th annual PROSE Awards. Published to mark the centenary of Turing's birth, the book was praised as a fitting tribute to the life of the legendary mathematical and scientific genius, considered to be the father of theoretical computer science and artificial intelligence.
Microsoft has named Satya Nadella, head of its cloud computing business, as the company's next CEO. He immediately replaces Steve Ballmer, who had announced in August 2013 that he would retire from the world's biggest software company after more than 13 years at its helm. Here's a look at key moments in Microsoft's history.