Imaging

New Method Generates High-Resolution, Moving Holograms in 3D

The 3D effect produced by the stereoscopic glasses used to watch movies cannot provide perfect depth cues. Nor does it let viewers move their heads and see objects from different angles, a real-life effect known as motion parallax. Researchers have developed a new way of generating high-resolution, full-color 3D videos using holographic technology. Holograms are considered truly 3D because they allow the viewer to see different perspectives of a reconstructed 3D object from different angles and locations. Holograms are created using lasers, which can produce the complex light interference patterns, including spatial data, required to re-create a complete 3D object. To enhance the resolution of holographic videos, the researchers used an array of spatial light modulators (SLMs), devices that display hologram pixels and create 3D objects by light diffraction. Each SLM can display up to 1.89 billion hologram pixels every second.
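
As a back-of-the-envelope sketch (not from the article), the per-SLM rate quoted above can be scaled up to an aggregate throughput for a whole SLM array; the array dimensions and video frame rate below are illustrative assumptions only.

```python
# Aggregate hologram-pixel throughput for a hypothetical SLM array.
# Only the per-SLM rate comes from the article; the rest is assumed.

PIXELS_PER_SLM_PER_SEC = 1.89e9   # stated per-SLM rate
ARRAY_ROWS, ARRAY_COLS = 2, 4     # hypothetical array layout
FRAME_RATE_HZ = 60                # hypothetical video frame rate

def array_throughput(rows: int, cols: int, per_slm_rate: float) -> float:
    """Total hologram pixels per second across the whole SLM array."""
    return rows * cols * per_slm_rate

total_rate = array_throughput(ARRAY_ROWS, ARRAY_COLS, PIXELS_PER_SLM_PER_SEC)
pixels_per_frame = total_rate / FRAME_RATE_HZ
print(f"Aggregate rate: {total_rate:.2e} hologram pixels/s")
print(f"Per video frame at {FRAME_RATE_HZ} Hz: {pixels_per_frame:.2e} pixels")
```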

Posted in: News, Video

ORCA Prototype Ready to Observe Ocean

If selected for a NASA flight mission, the Ocean Radiometer for Carbon Assessment (ORCA) instrument will study microscopic phytoplankton, the tiny green plants that float in the upper layer of the ocean and make up the base of the marine food chain. Conceived in 2001 as the next technological step forward in observing ocean color, ORCA was developed into a prototype by a team using funding from Goddard's Internal Research and Development program and NASA's Instrument Incubator Program (IIP). Completed in 2014, ORCA now is a contender as the primary instrument on an upcoming Earth science mission.

The ORCA prototype has a scanning telescope designed to sweep across 2,000 kilometers (1,243 miles) of ocean at a time. The instrument collects light reflected from the sea surface, which then passes through a series of mirrors, optical filters, gratings, and lenses. These components direct the light onto an array of detectors that cover the full range of wavelengths. Instead of observing a handful of discrete bands at specific wavelengths reflected off the ocean, ORCA measures a continuous range of bands, from 350 nanometers to 900 nanometers at five-nanometer resolution. The sensor will see the entire rainbow, including the color gradations of green that fade into blue. In addition to the hyperspectral bands, the instrument has three short-wave infrared bands that measure specific wavelengths between 1200 and 2200 nanometers for atmospheric applications.

The NASA researchers will use ORCA to obtain more accurate measurements of chlorophyll concentrations, the size of a phytoplankton bloom, and how much carbon it holds. Detecting chlorophyll in various wavelengths also will allow the team to distinguish between types of phytoplankton. The instrument could also detect suspended sediments in coastal regions.

Also: Learn about an Ultra-Low-Maintenance Portable Ocean Power Station.
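
A minimal sketch of what the quoted sampling figures imply: enumerating the hyperspectral band centers from 350 nm to 900 nm at 5 nm resolution. Treating each 5 nm step as one band is our assumption; the article does not spell out the exact band definition.

```python
# Enumerate hyperspectral band centers implied by the article's figures.
START_NM, STOP_NM, STEP_NM = 350, 900, 5

band_centers = list(range(START_NM, STOP_NM + STEP_NM, STEP_NM))
print(f"{len(band_centers)} hyperspectral bands")             # 111 under this assumption
print(f"first: {band_centers[0]} nm, last: {band_centers[-1]} nm")
```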

Posted in: News, Optics, Photonics, Sensors, Measuring Instruments

New Navigation Software Cuts Self-Driving Car Costs

A new software system developed at the University of Michigan uses video game technology to help solve one of the most daunting hurdles facing self-driving and automated cars: the high cost of the laser scanners they use to determine their location. Ryan Wolcott, a U-M doctoral candidate in computer science and engineering, estimates that the new concept could shave thousands of dollars from the cost of these vehicles. The technology enables them to navigate using a single video camera, delivering the same level of accuracy as laser scanners at a fraction of the cost.

"The laser scanners used by most self-driving cars in development today cost tens of thousands of dollars, and I thought there must be a cheaper sensor that could do the same job," he said. "Cameras only cost a few dollars each and they're already in a lot of cars. So they were an obvious choice."

Wolcott's system builds on the navigation systems used in other self-driving cars that are currently in development, including Google's vehicle. Those systems use three-dimensional laser scanning technology to create a real-time map of the environment, then compare that real-time map to a pre-drawn map stored in the system. By making thousands of comparisons per second, they are able to determine the vehicle's location within a few centimeters. Wolcott's software instead converts the stored map data into a three-dimensional picture much like a video game. The car's navigation system can then compare these synthetic pictures with the real-world pictures streaming in from a conventional video camera.

Also: See more Software tech briefs.
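
A hedged sketch of the core idea described above: score candidate vehicle poses by comparing a camera frame against synthetic views rendered from a prior 3D map, then pick the best-matching pose. The rendering step is stubbed out, and plain normalized cross-correlation is our choice of similarity measure, not necessarily the one used in the U-M system.

```python
import numpy as np

def normalized_cross_correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two grayscale images, roughly in [-1, 1]."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def localize(camera_frame, candidate_poses, render_from_map):
    """Return the candidate pose whose synthetic view best matches the frame."""
    scores = [normalized_cross_correlation(camera_frame, render_from_map(p))
              for p in candidate_poses]
    return candidate_poses[int(np.argmax(scores))]

# Toy usage with random images standing in for real renders and camera frames.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.random((64, 64))
    poses = [(x, 0.0, 0.0) for x in np.linspace(-1, 1, 5)]
    render = lambda pose: frame if pose == (0.0, 0.0, 0.0) else rng.random((64, 64))
    print("best pose:", localize(frame, poses, render))
```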

Posted in: News, Cameras, Lasers & Laser Systems, Photonics

New Serenity Payload Detects Hostile Fire

Two government-developed sensors are working together to increase the security of deployed soldiers. The Firefly and Serenity sensors employ government-developed algorithms, software, and hardware to locate hostile fire around a base. The technology, a joint effort between the Army Aviation Research, Development and Engineering Center (AMRDEC) and the Army Research Laboratory (ARL), has been under development for more than a decade.

Posted in: News, Cameras, Optics, Photonics, Detectors, Sensors

NASA Advances Next-Generation 3D-Imaging Lidar

Building, fixing, and refueling space-based assets, or rendezvousing with a comet or asteroid, will require a robotic vehicle and a super-precise, high-resolution 3D imaging lidar that generates the real-time images needed to guide the vehicle to a target traveling at thousands of miles per hour. A team at NASA's Goddard Space Flight Center is developing a next-generation 3D scanning lidar, dubbed the Goddard Reconfigurable Solid-state Scanning Lidar (GRSSLi), that could provide the imagery needed to execute these orbital dances. GRSSLi is a small, low-cost, low-weight platform capable of centimeter-level resolution over distances ranging from meters to kilometers. Equipped with a low-power, eye-safe laser, a MEMS scanner, and a single photodetector, GRSSLi will "paint" a scene with the scanning laser, and its detector will sense the reflected light to create a high-resolution 3D image at kilometer distances. A non-scanning version of GRSSLi would be ideal for close approaches to asteroids. It would employ a flash lidar, which doesn't paint a scene with a mechanical scanner but instead illuminates the target with a single pulse of laser light, much like a camera flash.
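
An illustrative sketch (not NASA code) of the basic time-of-flight relation a scanning lidar like GRSSLi relies on: a laser pulse's round-trip time is converted to range, and the scanner's two pointing angles place that range sample in 3D. The angle names and the spherical-to-Cartesian convention here are assumptions made for the example.

```python
import math

SPEED_OF_LIGHT_M_S = 299_792_458.0

def range_from_round_trip(seconds: float) -> float:
    """One-way distance to the target from the pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_S * seconds / 2.0

def to_point(range_m: float, azimuth_rad: float, elevation_rad: float):
    """Convert one scan sample (range plus two scan angles) to x, y, z."""
    x = range_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = range_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return x, y, z

# A 6.67-microsecond round trip corresponds to roughly 1 km of range.
print(round(range_from_round_trip(6.67e-6)))   # ~1000 m
```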

Posted in: News, Lasers & Laser Systems, Photonics, Machinery & Automation, Robotics

Moving Cameras “Talk” to Identify and Track Pedestrians

University of Washington electrical engineers have developed a way to automatically track people across moving and still cameras, using an algorithm that trains the networked cameras to learn one another's differences. The cameras first identify a person in a video frame, then follow that same person across multiple camera views. With the new technology, a car with a mounted camera could take video of a scene, then identify and track humans and overlay them onto the virtual 3D map shown on a GPS screen. The researchers are developing the system to work in real time, which could help track a specific person who is dodging the police. The team also installed the tracking system on cameras placed inside a robot and on a flying drone, allowing the robot and drone to follow a person even when obstacles blocked the person from view.
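
A hedged sketch of cross-camera person matching by appearance: build a color histogram for each detected person in each camera and match detections whose histograms are most similar. This is a generic baseline for illustration, not the UW algorithm, which additionally learns the color and appearance differences between specific camera pairs.

```python
import numpy as np

def color_histogram(crop: np.ndarray, bins: int = 8) -> np.ndarray:
    """Normalized per-channel histogram of an RGB person crop (H x W x 3, values 0-255)."""
    hists = [np.histogram(crop[..., c], bins=bins, range=(0, 255))[0] for c in range(3)]
    h = np.concatenate(hists).astype(float)
    return h / (h.sum() + 1e-9)

def histogram_similarity(h1: np.ndarray, h2: np.ndarray) -> float:
    """Histogram intersection similarity in [0, 1]."""
    return float(np.minimum(h1, h2).sum())

def match_across_cameras(crops_cam_a, crops_cam_b):
    """For each detection in camera A, index of the best-matching detection in camera B."""
    return [int(np.argmax([histogram_similarity(color_histogram(a), color_histogram(b))
                           for b in crops_cam_b]))
            for a in crops_cam_a]

# Toy usage with random image crops standing in for person detections.
rng = np.random.default_rng(1)
cam_a = [rng.integers(0, 256, (64, 32, 3)) for _ in range(2)]
cam_b = [rng.integers(0, 256, (64, 32, 3)) for _ in range(3)]
print(match_across_cameras(cam_a, cam_b))
```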

Posted in: News, Cameras, Video, Visualization Software, Machinery & Automation, Robotics

Imaging Via Nanoparticles Could Monitor Cancer and Other Diseases

MIT chemists have developed new nanoparticles that can simultaneously perform magnetic resonance imaging (MRI) and fluorescent imaging in living animals. Such particles could help scientists track specific molecules produced in the body, monitor a tumor's environment, or determine whether drugs have successfully reached their targets. The researchers have demonstrated the use of the particles, which carry distinct sensors for fluorescence and MRI, to track vitamin C in mice. Wherever there is a high concentration of vitamin C, the particles show a strong fluorescent signal but little MRI contrast; where there is little vitamin C, a stronger MRI signal is visible but the fluorescence is very weak. The researchers are now working to enhance the signal differences produced when the sensor encounters a target molecule such as vitamin C. They have also created nanoparticles carrying the fluorescent agent plus up to three different drugs, which allows them to track whether the nanoparticles are delivered to their targeted locations. The same particles could also be used to evaluate the level of oxygen radicals in a patient's tumor, which can reveal valuable information about how aggressive the tumor is.
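
A toy sketch of the dual-readout logic described above: the particle's fluorescence and MRI contrast move in opposite directions with analyte concentration, so their ratio gives a simple concentration cue. The threshold and signal names are illustrative assumptions, not values from the MIT work.

```python
def analyte_cue(fluorescence: float, mri_contrast: float) -> str:
    """Classify a region as high- or low-analyte from the two opposing signals."""
    ratio = fluorescence / (mri_contrast + 1e-9)   # assumed readout; threshold is arbitrary
    return "high analyte (e.g., vitamin C)" if ratio > 1.0 else "low analyte"

print(analyte_cue(fluorescence=0.9, mri_contrast=0.1))  # strong fluorescence -> high
print(analyte_cue(fluorescence=0.1, mri_contrast=0.8))  # strong MRI contrast -> low
```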

Posted in: News, Patient Monitoring

Technique Enables Imaging of Transparent Organisms

Researchers at the RIKEN Quantitative Biology Center in Japan and the University of Tokyo have developed a method that combines tissue decolorization and light-sheet fluorescence microscopy to take extremely detailed images of the interior of individual organs and even entire organisms. The approach allows scientists to make tissues and whole organisms transparent and then image them at extremely precise, single-cell resolution. The method, called CUBIC (Clear, Unobstructed Brain Imaging Cocktails and Computational Analysis), was first used to image mouse brains, hearts, lungs, kidneys, and livers, and then was applied to whole infant and adult mice; in all cases, the researchers obtained clear tissues. The method could be used to study how embryos develop or how cancer and autoimmune diseases arise at the cellular level, leading to a deeper understanding of such diseases and perhaps to new therapeutic strategies. The group plans to enable rapid imaging of the whole bodies of adult mice, or of larger samples such as human brains, and to apply the technology to further our understanding of autoimmune and psychiatric diseases.

Posted in: News

Ultrasound Creates 3D Haptic Shapes

Touch feedback, known as haptics, has been used in entertainment, rehabilitation, and even surgical training. University of Bristol researchers have now used ultrasound to create 3D haptic shapes in mid-air that can be both seen and felt. Led by Dr. Ben Long and colleagues Professor Sriram Subramanian, Sue Ann Seah, and Tom Carter of the University of Bristol's Department of Computer Science, the research could change the way 3D shapes are used. The new technology could enable surgeons to explore a CT scan by letting them feel a disease, such as a tumor, using haptic feedback.

By focusing complex patterns of ultrasound, the system creates air disturbances that can be felt as floating 3D shapes. To make the patterns visible, the researchers directed the device at a thin layer of oil so that the depressions in the surface appear as spots when lit by a lamp. The system generates an invisible three-dimensional shape that can be added to 3D displays to create an image that can be seen and felt. The research team has also shown that users can match a picture of a 3D shape to the shape created by the system.

Also: Learn about an Ophthalmic Ultrasound System for Ocular Structures.
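
A minimal sketch, under stated assumptions, of the focusing idea behind ultrasound haptics: fire each transducer in an array with a delay chosen so that all wavefronts arrive at the focal point at the same instant, creating a localized pressure spot that can be felt. The array geometry and the speed of sound in air are illustrative values, not parameters from the Bristol system.

```python
import math

SPEED_OF_SOUND_AIR_M_S = 343.0

def focusing_delays(element_positions, focal_point):
    """Per-element emission delays (seconds) that align all arrivals at the focus."""
    distances = [math.dist(p, focal_point) for p in element_positions]
    farthest = max(distances)
    return [(farthest - d) / SPEED_OF_SOUND_AIR_M_S for d in distances]

# Hypothetical 4 x 4 grid of transducers, 1 cm pitch, focusing 20 cm above the center.
elements = [(0.01 * i, 0.01 * j, 0.0) for i in range(4) for j in range(4)]
focus = (0.015, 0.015, 0.20)
delays = focusing_delays(elements, focus)
print(f"max delay: {max(delays) * 1e6:.2f} microseconds")
```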

Posted in: News

Imaging Technique Could Detect Acoustically “Invisible” Cracks

It has long been understood that acoustic nonlinearity is sensitive to many physical properties, including material microstructure and mechanical damage. The lack of an effective imaging technique, however, has held back the use of this important method. Currently, engineers can produce images of the interior of components using ultrasound, but they can only detect relatively large defects such as cracks.
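
A hedged sketch of how acoustic nonlinearity is commonly quantified in nonlinear ultrasonics (a general approach, not necessarily the method in the article): transmit a tone at frequency f0, take the spectrum of the received signal, and form the relative nonlinearity parameter A2 / A1^2 from the fundamental (A1) and second-harmonic (A2) amplitudes. Damage and microstructural changes tend to raise this ratio. The sampling rate, frequency, and harmonic level below are synthetic example values.

```python
import numpy as np

def relative_nonlinearity(signal: np.ndarray, fs: float, f0: float) -> float:
    """Relative nonlinearity A2 / A1^2 from the spectrum of a received tone at f0."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    a1 = spectrum[np.argmin(np.abs(freqs - f0))]        # fundamental amplitude
    a2 = spectrum[np.argmin(np.abs(freqs - 2 * f0))]    # second-harmonic amplitude
    return float(a2 / (a1 ** 2 + 1e-12))

# Synthetic example: a 5 MHz tone with a small second harmonic, sampled at 100 MHz.
fs, f0 = 100e6, 5e6
t = np.arange(0, 20e-6, 1 / fs)
sig = np.sin(2 * np.pi * f0 * t) + 0.02 * np.sin(2 * np.pi * 2 * f0 * t)
print(f"relative nonlinearity: {relative_nonlinearity(sig, fs, f0):.4g}")
```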

Posted in: News
