A new robot that raises its tail like a scorpion is scheduled to look at melted nuclear fuel inside one of the three wrecked Fukushima reactors in Japan. Toshiba, co-developer of the "scorpion" crawler, said the robot will venture into the Unit 2 reactor's primary containment vessel in August after a month of training for its handlers. Officials hope the robot can see the fuel in the pressure vessel in the middle of the reactor.
At the recent International Conference on Robotics and Automation, researchers presented a...
For someone suffering from paralysis or limited mobility, visiting with other people is...
To scientists' relief and delight, the Philae spacecraft that landed on a comet last fall has...
It's difficult enough to see things in the dark, but what if you also had to hover in mid-air while tracking a flower moving in the wind? That's the challenge the hummingbird-sized hawkmoth (Manduca sexta) must overcome while feeding on the nectar of its favorite flowers.
Last weekend was the final round of DARPA's contest to design control systems for a humanoid robot that could climb a ladder, remove debris, drive a utility vehicle and perform several other tasks related to a hypothetical disaster. When a bipedal robot takes a step, its foot strikes the ground at a number of different contact points, and MIT researchers have found a way to generalize this contact analysis to more complex motions in 3-D.
The way insects see and track their prey is being applied to a new robot in the hopes of improving robot visual systems. The project — which crosses the boundaries of neuroscience, mechanical engineering and computer science — builds on years of research into insect vision. Insights from both insect and human vision can be applied in a virtual reality simulation, enabling an artificial intelligence system to 'pursue' an object.
Imagine that everything in your mind had been erased, and you had to learn everything all over again. What would that process be like? Two researchers at NTNU have made a robot that learns like a young child. At least, that’s the idea. The machine starts with nothing — it has to learn everything from scratch. The machine is called [self.]. It analyzes sound through a system based on the human ear, and learns to recognize images.
A human can make intuitive choices about what actions to take in order to achieve a goal. Robots have a far more difficult time choosing from a universe of possible actions. Researchers at Brown University are developing a new algorithm that learns that skill in a video game environment, with the goal of helping robots better plan their actions in complex settings.
Twenty robotics teams, ranging from university students to small businesses, are preparing to compete June 8 to 13 in the fourth running of the NASA Sample Return Robot Challenge for a prize purse of $1.5 million. At the autonomous robot competition, held at Worcester Polytechnic Institute, teams must demonstrate that their robots can locate and collect geologic samples from a large and varied landscape without human control.
Decentralized partially observable Markov decision processes (Dec-POMDPs) are a way to model autonomous robots’ behavior in circumstances where neither their communication with each other nor their judgments about the outside world are perfect. They provide the most rigorous mathematical models of multiagent systems — not just robots, but any autonomous networked devices — operating under uncertainty. The problem is that they’re as complicated as their name.
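To get a feel for why these models are so hard, here is a minimal counting sketch (the function names and numbers are invented for illustration, not taken from the researchers' work). In a Dec-POMDP, each agent must map its own observation histories to actions without seeing what the other agents observe, and the number of deterministic joint policies explodes with the planning horizon:

```python
# Illustrative sketch: counting deterministic joint policies in a tiny
# two-agent Dec-POMDP. Each agent maps its own observation histories
# (it cannot see the other agent's observations) to actions.

def histories(num_obs: int, horizon: int) -> int:
    """Number of one-agent observation histories of length < horizon
    (the empty history plus all sequences of length 1..horizon-1)."""
    return sum(num_obs ** t for t in range(horizon))

def joint_policies(num_agents: int, num_actions: int,
                   num_obs: int, horizon: int) -> int:
    """Deterministic joint policies: one action per history, per agent."""
    per_agent = num_actions ** histories(num_obs, horizon)
    return per_agent ** num_agents

# Two agents, 2 actions and 2 observations each:
for h in range(1, 5):
    print(h, joint_policies(2, 2, 2, h))
```

Even in this toy setting the count goes from 4 joint policies at horizon 1 to over a billion at horizon 4, which is why exact Dec-POMDP planning is intractable for all but tiny problems.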
Despite a holiday week in the US, there were several top stories you won’t want to miss. A hacked Kinect controller becomes a game-changer for Parkinson’s; insight into unification of General Relativity and quantum mechanics; deep web searching in the name of science; how infections affect your IQ; the legacy of John Nash; Pope Francis on moral justification to fight global warming; how a robot revolution is creating a visionary world...
“They’re here … to help and improve our lives,” the Museum of Science and Industry, Chicago, announces on its website. MSI is hosting a new national touring exhibit, Robot Revolution, which explores how robots, created by human ingenuity, will ultimately be our companions and colleagues, changing how we play, live and work together. It allows guests to step into a visionary world where robots are not just a curiosity but a vital asset.
Robots will one day provide tremendous benefits to society, such as in search and rescue — but not until they can learn to keep working when damaged. A paper shows robots automatically recovering from injury in less than two minutes. A video of the work shows a six-legged robot adapting to keep walking even after two of its legs are broken, and a robotic arm that learned to correctly place an object even with several broken motors.
Scientists have created underwater robot swarms that function like schools of fish, exchanging information to monitor the environment and to search, maintain, explore and harvest resources in underwater habitats. The EU-supported COCORO project explored and developed collective cognition in autonomous robots through a rich set of 10 experimental demonstrators, shown in 52 videos.
Today’s industrial robots are remarkably efficient — as long as they’re in a controlled environment where everything is exactly where they expect it to be. But put them in an unfamiliar setting, where they have to think for themselves, and their efficiency plummets. And the difficulty of on-the-fly motion planning increases exponentially with the number of robots involved. For even a simple collaborative task...
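A back-of-the-envelope sketch of that exponential blow-up (the discretization and all numbers here are hypothetical, purely for illustration): if each robot can occupy any of k discrete configurations, a planner reasoning about n robots jointly faces k**n combined states.

```python
# Hypothetical illustration: size of the joint configuration space a
# planner must reason about when coordinating n robots, each of which
# can be in any of k discrete configurations.

def joint_space_size(k: int, n: int) -> int:
    """Every combination of per-robot configurations is a joint state."""
    return k ** n

k = 100  # configurations per robot (a made-up discretization)
for n in (1, 2, 3, 4):
    print(n, joint_space_size(k, n))
```

Each additional robot multiplies the joint space by k rather than adding to it, which is why coordinated multi-robot motion planning becomes hard so quickly.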
What if handheld tools knew what needed to be done and could even guide and help inexperienced users complete jobs that require skill? Researchers have developed and begun studying a novel concept in robotics: intelligent handheld robots.
Paralyzed from the neck down after suffering a gunshot wound when he was 21, Erik G. Sorto can now move a robotic arm just by thinking about it. Sorto is the first person in the world to have a neural prosthetic device implanted in a region of the brain where intentions are made, giving him the ability to perform a fluid hand-shaking gesture, drink a beverage and even play “rock, paper, scissors.”
Grasping and shaking hands is a defining human ritual. But what about our android counterparts? What does the grip of a robot’s hand say about the machine’s capabilities, especially dexterity — the ability to wield and manipulate different objects under challenging circumstances, such as in manufacturing or assembly operations? NIST is developing tests to take full measure of robotic grasping in order to provide useful benchmarking tools.
What do you know? There is now a world standard for capturing and conveying knowledge robots possess — or, to get philosophical about it, an ontology for automatons. Crafted by a working group of 166 experts from 23 nations, IEEE Standard for Ontologies for Robotics and Automation is designed to simplify programming, extend information processing and reasoning capabilities, and enable clear robot-to-robot and human-to-robot communication.
In case you missed it, here's another chance to catch this week's biggest hits. Writing like a genius; the largest individual structure ever identified by humanity; imaging fascinating, wild and unpredictable thunder; a car prototype that folds, shrinks and drives sideways; a high-efficiency laser system to remove space debris from orbit; and more are among the latest top stories.
When ants go exploring in search of food, they end up choosing collective routes that fit statistical distributions of probability. This has been demonstrated by a team of mathematicians after analyzing the trails of a species of Argentine ant. Studies like this could be applied to coordinate the movement of micro-robots in cleaning contaminated areas, for example.
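The study fit statistical distributions to real ant trails; as a loose illustration of how probabilistic choice can produce a single collective route, here is a standard pheromone-reinforced branch-choice rule (the parameters and reinforcement scheme are generic textbook assumptions, not the paper's model):

```python
import random

# Illustrative sketch (not the study's fitted model): ants at a fork pick
# branches with probability proportional to (pheromone + base)**alpha.
# Repeated reinforced choices concentrate traffic on one branch.

def choose_branch(pheromone, base=1.0, alpha=2.0, rng=random):
    """Sample a branch index, weighted by accumulated pheromone."""
    weights = [(p + base) ** alpha for p in pheromone]
    r = rng.random() * sum(weights)
    for i, w in enumerate(weights):
        r -= w
        if r <= 0:
            return i
    return len(weights) - 1

rng = random.Random(0)          # seeded for reproducibility
pheromone = [0.0, 0.0]          # two initially identical branches
for _ in range(500):
    i = choose_branch(pheromone, rng=rng)
    pheromone[i] += 1.0         # each ant reinforces its chosen branch
print(pheromone)
```

Although the two branches start identical, the positive feedback typically breaks the symmetry and most of the traffic ends up on one route, the kind of collective decision the ant-trail statistics describe.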
Using a smart tablet and a red beam of light, researchers have created a system that allows people to control a fleet of robots with the swipe of a finger. A person taps the tablet to control where the beam of light appears on a floor. The swarm robots then roll toward the illumination, constantly communicating with each other and deciding how to evenly cover the lit area.
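A toy sketch of that cover-the-lit-area behavior (the control rule below is invented for illustration and is not the researchers' actual controller): each robot steps toward the illuminated target while being pushed away from neighbors that get too close, which spreads the group over the target region instead of piling everyone on one spot.

```python
# Toy swarm-coverage illustration: attraction to a lit target point,
# plus short-range repulsion between robots for spacing.

def step(positions, target, attract=0.1, repel=0.05, min_dist=1.0):
    """Advance every robot one time step; returns new positions."""
    new_positions = []
    for i, (x, y) in enumerate(positions):
        dx = attract * (target[0] - x)          # pull toward the light
        dy = attract * (target[1] - y)
        for j, (ox, oy) in enumerate(positions):
            if i == j:
                continue
            d = ((x - ox) ** 2 + (y - oy) ** 2) ** 0.5
            if 0 < d < min_dist:                # too close: push apart
                dx += repel * (x - ox) / d
                dy += repel * (y - oy) / d
        new_positions.append((x + dx, y + dy))
    return new_positions

robots = [(0.0, 0.0), (0.2, 0.0), (5.0, 5.0)]   # arbitrary start poses
target = (2.0, 2.0)                             # where the light shines
for _ in range(200):
    robots = step(robots, target)
```

After a few hundred steps the robots cluster around the lit point without collapsing onto it, a minimal version of "evenly covering the lit area."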
A team of German software developers and designers, along with electronics and construction engineers, has developed an innovative design for a new type of electric smart “micro car.” Now in its second phase, the prototype is able to convert from “traditional driving” to driving sideways in just seconds, with each wheel powered by its own motor. The two-seater also can shrink from eight feet to less than five feet in length.
Last September, Cal Poly's CubeSat team and The Planetary Society unfurled a solar-powered sail that some believe could revolutionize satellite propulsion. This was a deployment test and key milestone for the LightSail project. Among those present was Bill Nye, CEO of The Planetary Society. LightSail is a Planetary Society initiative with the goal of demonstrating effective use of solar sails for satellite control and movement.
To make cars as safe as possible, we crash them into walls to pinpoint weaknesses and better protect people who use them. That’s the idea behind a series of experiments conducted by an engineering team who hacked a next-gen teleoperated surgical robot — one used only for research purposes — to test how easily a malicious attack could hijack remotely-controlled operations in the future and to make those systems more secure.
When developing the autonomous mission-planning system, Williams’ group took inspiration from the Star Trek franchise and the top-down command center of the fictional starship Enterprise, after which the system is modeled and named. Just as a hierarchical crew runs the fictional starship, the Enterprise system incorporates levels of decision-makers, similar to a system Williams developed for NASA following the loss of the Mars Observer.
Most people are naturally adept at reading facial expressions — from smiling and frowning to brow-furrowing and eye-rolling — to tell what others are feeling. Now, scientists have developed ultra-sensitive, wearable sensors that can do the same thing. Their technology, reported in the journal ACS Nano, could help robot developers make their machines more human.
The cast of "Avengers: Age of Ultron" may battle out-of-control artificial intelligence on-screen but, in real life, they're not so sure about cutting-edge technology. AP talked with the cast about what they embrace and fear in today's high-tech landscape: ROBERT DOWNEY, JR.: I feel you have to embrace it. You know, there's always that shadow play that goes on ... But look, it took over a while ago...