Metamaterial 'Bends' Acoustic and Elastic Waves

Sound waves passing through the air, ripples spreading across a body of water, and shockwaves from earthquakes are all considered “elastic” waves. These waves travel along the surface of a material or through it without causing any permanent changes to the substance’s makeup. Now, engineering researchers at the University of Missouri have developed a material that can control elastic waves, creating possible medical, military, and commercial applications.

In the past, scientists have used combinations of materials, such as metal and rubber, to effectively ‘bend’ and control waves. Guoliang Huang, associate professor of mechanical and aerospace engineering in the College of Engineering at MU, and his team designed a material using a single component: steel. The engineered structural material can control the propagation of acoustic and elastic waves. Improvements to broadband signals and super-imaging devices are also possibilities.

The material was made from a single steel sheet, using lasers to engrave “chiral,” or geometric microstructure, patterns, which are asymmetrical to their mirror images. Huang said there are numerous possible applications for the material’s control of elastic waves, including super-resolution sensors, acoustic and medical hearing devices, and a “superlens” that could significantly advance super-imaging.

Posted in: News


Wearable Nanowire Sensors Monitor Electrophysiological Signals

Researchers from North Carolina State University have developed a new, wearable sensor that uses silver nanowires to monitor electrophysiological signals, such as electrocardiography (EKG) or electromyography (EMG). The new sensor is as accurate as the “wet electrode” sensors used in hospitals, but can be used for long-term monitoring and when a patient is moving.

Posted in: News, Electronic Components, Electronics & Computers, Medical, Patient Monitoring, Nanotechnology, Semiconductors & ICs, Sensors


Aircraft with Hybrid Engine Can Recharge in Flight

Researchers from the University of Cambridge, in association with Boeing, have successfully tested the first aircraft to be powered by a parallel hybrid-electric propulsion system, where an electric motor and gas engine work together to drive the propeller. The demonstrator aircraft uses up to 30% less fuel than a comparable plane with a gas-only engine. The aircraft is also able to recharge its batteries in flight, the first time this has been achieved.

Posted in: News, Aerospace, Aviation, Batteries, Electronics & Computers, Power Management, Green Design & Manufacturing, Motion Control, Motors & Drives, Power Transmission


Energy Harvesting Could Help Power Spacecraft of the Future

A consortium is working on a project to maximize energy harvesting on a spacecraft of the future. The initiative seeks to find energy-saving and -maximizing solutions to enable eco-friendly aircraft to stay in space for long periods of time without the need to return to Earth to re-fuel, or to avoid carrying vast amounts of heavy fuel on long-stay journeys.

Posted in: News, Aerospace, Aviation, Communications, Energy, Energy Efficiency, Energy Harvesting, Green Design & Manufacturing, Test & Measurement


Mini Solar Observatory Can Be Used on Manned Spacecraft

Southwest Research Institute (SwRI) developed a miniature portable solar observatory for use onboard a commercial, manned, suborbital spacecraft. The SwRI Solar Instrument Pointing Platform (SSIPP) uses a classic, two-stage pointing system similar to those on larger spacecraft, but in this case the first stage is a pilot, who initially steers the instrument toward the Sun. SSIPP does the rest, locking onto the Sun to allow observations.

The first SSIPP spaceflight will search for “solar ultrasound,” a phenomenon first observed in the early 2000s by the Transition Region and Coronal Explorer (TRACE) spacecraft. This “ultrasound” consists of sound waves with a 10-second period, some 18 octaves deeper than ultrasound on Earth, which form visible ripples in the Sun’s surface layers. The waves are difficult to detect without space instrumentation because the tiny, rapid fluctuations cannot be separated from the confounding influence of Earth’s turbulent atmosphere. Although at first SSIPP will be operated from inside the cockpit, a full system eventually will be mounted outside the host vehicle to enable UV and X-ray observations that are inaccessible from the ground.
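As a quick sanity check on the “18 octaves” figure: a 10-second period corresponds to 0.1 Hz, and each octave doubles the frequency. Taking 20 kHz as the reference (an assumed, conventional lower bound for terrestrial ultrasound), the gap works out as follows:

```python
import math

# Solar "ultrasound" period reported by TRACE: about 10 seconds.
solar_period_s = 10.0
solar_hz = 1.0 / solar_period_s          # 0.1 Hz

# Assumed threshold for ultrasound on Earth: ~20 kHz.
ultrasound_hz = 20_000.0

# Number of octaves (frequency doublings) between the two.
octaves = math.log2(ultrasound_hz / solar_hz)
print(f"{octaves:.1f} octaves")          # about 17.6, i.e. "some 18 octaves"
```

The result, roughly 17.6 octaves, is consistent with the article's "some 18 octaves deeper" description.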

Posted in: News, Aerospace, Data Acquisition, Sensors


New Navigation Software Cuts Self-Driving Car Costs

A new software system developed at the University of Michigan uses video game technology to help solve one of the most daunting hurdles facing self-driving and automated cars: the high cost of the laser scanners they use to determine their location.

Ryan Wolcott, a U-M doctoral candidate in computer science and engineering, estimates that the new concept could shave thousands of dollars from the cost of these vehicles. The technology enables them to navigate using a single video camera, delivering the same level of accuracy as laser scanners at a fraction of the cost.

"The laser scanners used by most self-driving cars in development today cost tens of thousands of dollars, and I thought there must be a cheaper sensor that could do the same job," he said. "Cameras only cost a few dollars each and they're already in a lot of cars. So they were an obvious choice."

Wolcott's system builds on the navigation systems used in other self-driving cars currently in development, including Google's vehicle. Those systems use three-dimensional laser scanning to create a real-time map of the environment, then compare that real-time map to a pre-drawn map stored in the system. By making thousands of comparisons per second, they can determine the vehicle's location to within a few centimeters.

Wolcott's software instead converts the stored map data into a three-dimensional picture, much like a video game. The car's navigation system can then compare these synthetic pictures with the real-world pictures streaming in from a conventional video camera.
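The matching step described above — rendering synthetic views of a stored map and scoring them against the live camera frame — can be sketched as a simple pose search. This is a minimal illustration only, not the U-M implementation: the `render` callback, the candidate-pose grid, and the use of normalized cross-correlation as the similarity score are all assumptions made for the sketch.

```python
import numpy as np

def normalized_cross_correlation(a, b):
    """Zero-mean normalized cross-correlation between two equal-size images.

    Returns a value in [-1, 1]; 1.0 means a perfect match.
    """
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom else 0.0

def localize(camera_image, candidate_poses, render):
    """Score each candidate pose by comparing the synthetic rendering of the
    prior map from that pose against the live camera image; return the
    best-matching pose and its score.
    """
    best_pose, best_score = None, -np.inf
    for pose in candidate_poses:
        synthetic = render(pose)  # synthetic view of the prior map from this pose
        score = normalized_cross_correlation(camera_image, synthetic)
        if score > best_score:
            best_pose, best_score = pose, score
    return best_pose, best_score

# Toy usage: a random "world" image stands in for the 3-D prior map, and
# rendering a pose is just cropping a window at that (x, y) offset.
rng = np.random.default_rng(0)
world = rng.random((50, 50))

def render(pose):
    x, y = pose
    return world[y:y + 20, x:x + 20]

camera_frame = render((5, 7))  # pretend the camera sees the view from (5, 7)
candidates = [(x, y) for x in range(10) for y in range(10)]
pose, score = localize(camera_frame, candidates, render)
print(pose)   # recovers (5, 7)
```

A real system would render from a textured 3-D map, search over full 6-DOF poses seeded by odometry, and use a more robust photometric score, but the structure — render, compare, keep the best pose — is the same.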

Posted in: News, Automotive, Cameras, Imaging, Lasers & Laser Systems, Photonics, Software


NASA Robot Explores Volcanoes

Carolyn Parcheta, a NASA postdoctoral fellow based at NASA's Jet Propulsion Laboratory in Pasadena, California, and JPL robotics researcher Aaron Parness are developing robots that can explore volcanic fissures.

"We don't know exactly how volcanoes erupt. We have models, but they are all very, very simplified. This project aims to help make those models more realistic," Parcheta said.

Parcheta, Parness, and JPL co-advisor Karl Mitchell first explored this idea last year using a two-wheeled robot they call VolcanoBot 1, which is 12 inches (30 centimeters) long with 6.7-inch (17-centimeter) wheels. VolcanoBot 2, smaller and lighter than its predecessor, will explore Hawaii's Kilauea volcano in March 2015. Parcheta's research was recently honored in National Geographic’s Expedition Granted campaign.

Posted in: News, Machinery & Automation, Robotics, Measuring Instruments, Monitoring, Test & Measurement
