Researchers Equip Robot with Novel Tactile Sensor

Researchers at MIT and Northeastern University have equipped a robot with a novel tactile sensor that lets it grasp a USB cable draped freely over a hook and insert it into a USB port.

The sensor is an adaptation of a technology called GelSight, which was developed by the lab of Edward Adelson, the John and Dorothy Wilson Professor of Vision Science at MIT, and first described in 2009. The new sensor isn’t as sensitive as the original GelSight sensor, which could resolve details on the micrometer scale. But it’s smaller — small enough to fit on a robot’s gripper — and its processing algorithm is faster, so it can give the robot feedback in real time.

A GelSight sensor — both the original and the new, robot-mounted version — consists of a slab of transparent, synthetic rubber coated on one side with a metallic paint. The rubber conforms to any object it’s pressed against, and the metallic paint evens out the light-reflective properties of diverse materials, making it much easier to take precise optical measurements.

In the new device, the gel is mounted in a cubic plastic housing, with just the paint-covered face exposed. The four walls of the cube adjacent to the sensor face are translucent, and each conducts a different color of light — red, green, blue, or white — emitted by light-emitting diodes at the opposite end of the cube. When the gel is deformed, light bounces off the metallic paint and is captured by a camera mounted on the same cube face as the diodes.

From the different intensities of the different-colored light, the algorithms developed by Adelson’s team can infer the three-dimensional structure of ridges or depressions of the surface against which the sensor is pressed.

Source

Read other Sensors tech briefs.
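The multi-light reconstruction described above is, in essence, photometric stereo. Below is a minimal sketch of that principle, assuming Lambertian reflection from the painted surface and three known light directions; the light vectors and function names are illustrative assumptions, and the actual GelSight algorithm and calibration are more sophisticated.

```python
import numpy as np

# Hypothetical light directions (unit vectors) for three of the LEDs.
# These values are assumptions for illustration, not GelSight's geometry.
L = np.array([
    [ 1.0,  0.0,  0.5],
    [-0.5,  0.87, 0.5],
    [-0.5, -0.87, 0.5],
])
L = L / np.linalg.norm(L, axis=1, keepdims=True)

def normals_from_intensities(I):
    """I: (H, W, 3) image, one channel per light source.
    Returns unit surface normals per pixel, assuming a Lambertian surface,
    where intensity = (light direction) . (surface normal)."""
    H, W, _ = I.shape
    flat = I.reshape(-1, 3).T                       # (3, H*W)
    n, *_ = np.linalg.lstsq(L, flat, rcond=None)    # solve L @ n = I per pixel
    norm = np.linalg.norm(n, axis=0, keepdims=True)
    n = n / np.clip(norm, 1e-8, None)
    return n.T.reshape(H, W, 3)

# Sanity check: a flat surface (normal straight up) produces intensities
# equal to each light's z-component, and should be recovered as (0, 0, 1).
I_flat = np.tile(L[:, 2], (4, 4, 1))
N = normals_from_intensities(I_flat)
```

Deeper indentations in the gel tilt the local normals, which changes the balance of the colored intensities; integrating the recovered normal field yields the height map of the pressed surface.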

Posted in: Photonics, Optics, Materials, Motion Control, Sensors, Lighting, LEDs, Machinery & Automation, Robotics, News


New Algorithm Lets Cheetah Robot Run

Speed and agility are hallmarks of the cheetah: The big predator is the fastest land animal on Earth, able to accelerate to 60 mph in just a few seconds. As it ramps up to top speed, a cheetah pumps its legs in tandem, bounding until it reaches a full gallop.

Now MIT researchers have developed an algorithm for bounding that they’ve successfully implemented in a robotic cheetah — a sleek, four-legged assemblage of gears, batteries, and electric motors that weighs about as much as its feline counterpart. The team recently took the robot for a test run on MIT’s Killian Court, where it bounded across the grass at a steady clip. In experiments on an indoor track, the robot sprinted up to 10 mph, even continuing to run after clearing a hurdle. The MIT researchers estimate that the current version of the robot may eventually reach speeds of up to 30 mph.

The key to the bounding algorithm is in programming each of the robot’s legs to exert a certain amount of force in the split second during which it hits the ground, in order to maintain a given speed: In general, the faster the desired speed, the more force must be applied to propel the robot forward. In experiments, the team ran the robot at progressively smaller duty cycles, finding that, following the algorithm’s force prescriptions, the robot was able to run at higher speeds without falling.

Sangbae Kim, an associate professor of mechanical engineering at MIT, says the team’s algorithm enables precise control over the forces a robot can exert while running.

Source

Also: Learn about Hall Thrusters for Robotic Solar System Exploration.

Posted in: Motion Control, Motors & Drives, Software, Machinery & Automation, Robotics, News


Untethered Soft Robot Walks Through Flames

Developers from Harvard’s School of Engineering and Applied Sciences and the Wyss Institute for Biologically Inspired Engineering have produced the first untethered soft robot — a quadruped that can stand up and walk away from its designers.

The researchers were able to scale up earlier soft-robot designs, enabling a single robot to carry on its back all the equipment it needs to operate — micro-compressors, control systems, and batteries.

Compared with earlier soft robots, which were typically no larger than a steno pad, the system is huge, measuring more than a half-meter in length and capable of carrying as much as 7½ pounds on its back.

Giving the untethered robot the strength needed to carry mechanical components meant air pressures as high as 16 pounds per square inch, more than double the seven psi used by many earlier robot designs. To deal with the increased pressure, the robot had to be made of tougher stuff.

The material settled on was a “composite” silicone rubber made from stiff rubber impregnated with hollow glass microspheres to reduce the robot’s weight. The robot’s bottom was made from Kevlar fabric to ensure it was tough and lightweight. The result was a robot that can stand up to a host of extreme conditions.

Source

Also: Learn about a Field-Reconfigurable Manipulator for Rovers.
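A rough sense of what the higher operating pressure buys: for a pneumatic chamber, force scales linearly with gauge pressure times cross-sectional area. The chamber area below is a made-up figure for illustration, not a dimension from the Harvard design.

```python
# Illustrative pressure-to-force sanity check (hypothetical geometry).
PSI_TO_PA = 6894.76  # pascals per psi

def actuator_force(pressure_psi, area_cm2):
    """Force in newtons a pneumatic chamber of the given cross-section
    could exert at the given gauge pressure."""
    return pressure_psi * PSI_TO_PA * (area_cm2 * 1e-4)

f_old = actuator_force(7.0, 10.0)   # earlier designs, ~7 psi
f_new = actuator_force(16.0, 10.0)  # new robot, up to 16 psi
```

At the same chamber size, 16 psi delivers more than twice the force of 7 psi, which is what lets the scaled-up robot lift its own compressors and batteries.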

Posted in: Materials, Composites, Mechanical Components, Motion Control, Motors & Drives, Machinery & Automation, Robotics, News


NASA Tests Robot Swarms for Autonomous Movement

NASA engineers and interns are testing a group of robots and related software that will show whether it's possible for autonomous machines to scurry about an alien world such as the Moon, searching for and gathering resources just as an ant colony does.

Posted in: Electronics & Computers, Motion Control, Software, Communications, Wireless, Machinery & Automation, Robotics, RF & Microwave Electronics, Antennas, News


DARPA Teams With Industry to Create Spaceplane

DARPA has established the Experimental Spaceplane (XS-1) program to create a new paradigm for more routine, responsive, and affordable space operations. In an important step toward that goal, DARPA has awarded prime contracts for Phase 1 of XS-1 to three companies: The Boeing Company (working with Blue Origin, LLC), Masten Space Systems (working with XCOR Aerospace), and Northrop Grumman Corporation (working with Virgin Galactic).

Posted in: Aerospace, Aviation, Machinery & Automation, Robotics, RF & Microwave Electronics, Defense, News


New Laser Technology to Make 2020 Mission to Mars

NASA announced recently that laser technology originally developed at Los Alamos National Laboratory has been selected for its new Mars mission in 2020. SuperCam, which builds upon the successful capabilities demonstrated aboard the Curiosity Rover during NASA’s current Mars Mission, will allow researchers to sample rocks and other targets from a distance using a laser.

Posted in: Electronics & Computers, Electronics, Imaging, Photonics, Lasers & Laser Systems, Sensors, Detectors, Test & Measurement, Measuring Instruments, Aerospace, Machinery & Automation, News


Astronauts to Test Free-Flying Robotic 'Smart SPHERES'

Three bowling ball-size free-flying Synchronized Position Hold, Engage, Reorient, Experimental Satellites (SPHERES) have been flying inside the International Space Station since 2006. These satellites provide a test bed for development and research, each having its own power, propulsion, computer, navigation equipment, and physical and electrical connections for hardware and sensors for various experiments.

Aboard Orbital Sciences Corp.'s second contracted commercial resupply mission to the space station, which arrived at the orbital laboratory on July 16, NASA's Ames Research Center in Moffett Field, California, sent two Google prototype Project Tango smartphones that astronauts will attach to the SPHERES for technology demonstrations inside the space station. By connecting a smartphone to the SPHERES, the technology becomes "Smart SPHERES," a more "intelligent" free-flying robot with built-in cameras to take pictures and video, sensors to help conduct inspections, powerful computing units to make calculations, and Wi-Fi connections to transfer data in real time to the computers aboard the space station and at mission control in Houston.

In a two-phase experiment, astronauts will manually use the smartphones to collect visual data using the integrated custom 3-D sensor to generate a full 3-D model of their environment. After the map and its coordinate system are developed, a second activity will involve the smartphones attached to the SPHERES, becoming the free-flying Smart SPHERES. As the free-flying robots move around the space station from waypoint to waypoint, utilizing the 3-D map, they will provide situational awareness to crewmembers inside the station and flight controllers in mission control. These experiments allow NASA to test vision-based navigation in a very small mobile product.

Source

Also: Learn about Automatic Lunar Rock Detection and Mapping.
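The waypoint-to-waypoint motion described above can be sketched as a simple proportional controller driving a free-flyer through a list of 3-D positions. This is an illustration of the general idea, with assumed gains and function names, not the actual SPHERES guidance software.

```python
import numpy as np

def follow_waypoints(start, waypoints, step_gain=0.5, tol=0.01, max_steps=1000):
    """Step a point robot through a list of 3-D waypoints.
    Each step moves a fraction (step_gain) of the remaining error toward
    the current waypoint; returns the list of visited positions."""
    pos = np.asarray(start, dtype=float)
    path = [pos.copy()]
    for wp in waypoints:
        wp = np.asarray(wp, dtype=float)
        for _ in range(max_steps):
            err = wp - pos
            if np.linalg.norm(err) < tol:
                break                      # close enough; next waypoint
            pos = pos + step_gain * err    # proportional step toward goal
            path.append(pos.copy())
    return path

# Traverse three waypoints from the origin; the robot should end up
# near the last waypoint.
path = follow_waypoints([0, 0, 0], [[1, 0, 0], [1, 1, 0], [1, 1, 1]])
```

In the real system, the positions would come from vision-based localization against the Project Tango 3-D map rather than being known exactly.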

Posted in: Electronics & Computers, Power Management, PCs/Portable Computers, Cameras, Video, Visualization Software, Imaging, Sensors, Test & Measurement, Communications, Aerospace, Aviation, Machinery & Automation, Robotics, RF & Microwave Electronics, News
