The round lens of PrivacyLens captures standard digital video while the square lens senses heat. The heat sensor improves the camera’s ability to spot and remove people from videos. (Image: Brenda Ahearn, Michigan Engineering)

A new camera could prevent companies from collecting embarrassing and identifiable photos and videos from devices like smart home cameras and robotic vacuums. It’s called PrivacyLens and was made by University of Michigan engineers.

PrivacyLens uses both a standard video camera and a heat-sensing camera to spot people in images by their body temperature. Each person’s likeness is then completely replaced by a generic stick figure whose movements mirror those of the person it stands in for. The accurately animated stick figure lets a device that relies on the camera keep functioning without revealing the identity of anyone in view.
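The article's description suggests a simple two-sensor pipeline: use the thermal frame to find pixels in the human temperature range, then redact those pixels from the RGB frame before anything leaves the device. The sketch below is purely illustrative and is not the authors' code; the temperature thresholds and function names are assumptions, and a real system would draw an animated stick figure from pose keypoints rather than blanking pixels.

```python
import numpy as np

# Illustrative temperature band for skin/clothing seen by a thermal camera
# (degrees Celsius); the actual thresholds used by PrivacyLens are not public.
HUMAN_TEMP_LOW = 28.0
HUMAN_TEMP_HIGH = 40.0

def human_mask(thermal: np.ndarray) -> np.ndarray:
    """Boolean mask of pixels whose temperature falls in the human range."""
    return (thermal >= HUMAN_TEMP_LOW) & (thermal <= HUMAN_TEMP_HIGH)

def redact(rgb: np.ndarray, thermal: np.ndarray) -> np.ndarray:
    """Zero out person pixels in the RGB frame; a full implementation
    would replace the region with a stick figure instead of black."""
    out = rgb.copy()
    out[human_mask(thermal)] = 0
    return out

# Toy 4x4 scene: 20 C background with a warm "person" patch at 34 C.
thermal = np.full((4, 4), 20.0)
thermal[1:3, 1:3] = 34.0
rgb = np.full((4, 4, 3), 200, dtype=np.uint8)

clean = redact(rgb, thermal)
print(clean[2, 2].tolist())  # person pixel redacted -> [0, 0, 0]
print(clean[0, 0].tolist())  # background untouched  -> [200, 200, 200]
```

Because the raw RGB frame is overwritten on-device, only the redacted frame would ever be available to stream upward, which matches the article's claim that unprocessed images are never stored or transmitted.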

That extra anonymity could prevent private moments from leaking onto the internet, a risk that grows as connected cameras spread through homes.

“Most consumers do not think about what happens to the data collected by their favorite smart home devices. In most cases, raw audio, images and videos are being streamed off these devices to the manufacturers’ cloud-based servers, regardless of whether or not the data is actually needed for the end application,” said Corresponding Author Alanson Sample, Associate Professor of Computer Science and Engineering.

“A smart device that removes personally identifiable information before sensitive data is sent to private servers will be a far safer product than what we currently have.”

Raw photos are never stored anywhere on the device or in the cloud, completely eliminating access to unprocessed images.

Yasha Iravantchi looks like an anonymous stick figure in this monitor connected to PrivacyLens. (Image: Brenda Ahearn, Michigan Engineering)

“Cameras provide rich information to monitor health. It could help track exercise habits and other activities of daily living, or call for help when an elderly person falls,” said Yasha Iravantchi, Doctoral Student in Computer Science and Engineering, who recently presented PrivacyLens at the Privacy Enhancing Technologies Symposium in Bristol, U.K.

Replacing patients with stick figures helps make them more comfortable having a camera in even the most private parts of the home, according to an initial survey of 15 participants. The team has incorporated a sliding privacy scale into the device that allows users to control how much of their faces and bodies are censored.
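The sliding privacy scale described above could be modeled as a small lookup from a user-facing level to a redaction mode. The levels and labels below are hypothetical, invented for illustration; the article does not specify the scale's actual settings.

```python
# Hypothetical mapping from a privacy-slider position to the redaction
# applied on-device; names and levels are illustrative only.
PRIVACY_LEVELS = {
    0: "raw video (no redaction)",
    1: "blur face only",
    2: "blur full body",
    3: "replace person with stick figure",
}

def redaction_for(level: int) -> str:
    # Unknown or out-of-range values fall back to the strictest setting,
    # so a misconfigured slider never weakens privacy.
    return PRIVACY_LEVELS.get(level, PRIVACY_LEVELS[3])

print(redaction_for(1))
print(redaction_for(99))
```

Failing closed (defaulting to the strictest level) is a natural design choice for a device whose selling point is that sensitive data never leaves it.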

The device could not only make patients more comfortable with chronic health monitoring, but it could also help protect privacy in public spaces. Vehicle manufacturers could potentially use PrivacyLens to prevent their autonomous vehicles from being used as surveillance drones, and companies that use cameras to collect data outdoors might find the device useful for complying with privacy laws.

Sample has filed a provisional patent for the device and hopes to eventually bring it to market.

Here is an exclusive Tech Briefs interview, edited for length and clarity, with Sample and Iravantchi.

Tech Briefs: What was the biggest technical challenge you faced while developing PrivacyLens?

Sample: It's really blending this thermal data with the optical data. There's a technical challenge of how to get it done in real time, for five watts. We're trying to make this a USB camera replacement, and nobody's been able to figure out how to get all that together because we are trying to target the lowest end of cameras to be most applicable. So really getting all that done in real time for low power.

Iravantchi: Developing a driver for the cameras to be able to synchronize with each other on the device and have them get the data appearing on the GPU in a way that we can do all of these processes in real time. These are not necessarily USB plug-and-play-style cameras, so there's a lot of low-level engineering that had to go into it.

Tech Briefs: Can you explain in simple terms how everything works?

Sample: Our goal is to find people in images and remove them, and then re-represent them with some other visualizations so that we don't reveal their skin color, their gender — private images of them will not be posted on the internet. A traditional way to do that would be just use computer vision, but computer vision algorithms just aren't accurate enough to always catch those things. So, thermal cameras can find out that unique thing about humans, and that unique thing is that we're all about the same temperature.


Thermal cameras are very good at masking people, finding the outline of people far better than any computer vision algorithm. By using the thermal image masking to find out where the humans are, plus a little computer vision, you can find the people and then use generative models to replace them with something else.
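The localization step Sample describes — the thermal mask says *where* the person is, and a generative model then redraws that region — can be sketched as a bounding box computed over the mask. This is a minimal illustration under my own assumptions, not the team's implementation.

```python
import numpy as np

def person_bbox(mask: np.ndarray):
    """Return (row0, row1, col0, col1) bounding the True region of a
    thermal person-mask, or None if no person pixels were found.
    A generative model would draw the replacement figure inside this box."""
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    if not rows.any():
        return None
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    return int(r0), int(r1), int(c0), int(c1)

# Toy 6x6 mask with a person occupying rows 2-4, columns 1-3.
mask = np.zeros((6, 6), dtype=bool)
mask[2:5, 1:4] = True
print(person_bbox(mask))  # (2, 4, 1, 3)
```

In practice the thermal silhouette itself (not just its box) would drive the stick-figure rendering, which is why a sharp thermal outline beats a vision-only detector here.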

Tech Briefs: How did this work come about? What was the catalyst for the project?

Sample: My lab's been doing work on what we call privacy preserved sensing for a while. That's the goal when we think about all these IoT devices around us. There's this thrust of ubiquitous computing in our field, which is trying to have computers that are built into our daily lives, understand what we're doing, and help us. The hidden problem there is that they're capturing all this data about us and then typically how it works out is it goes off to some third-party server; we don't have much control about what happens with that data.

So, we started off with a microphone and we said, ‘OK, how do we make a microphone understand what's happening in the environment, but never be able to record speech?’ Then we said, ‘OK, what's the next modality?’ The next modality is vision: how do we make a camera that can never put a compromising picture of you up on the internet?

Tech Briefs: What are your next steps? Do you have plans for further research work, etc.?

Sample: I think there are questions about other sensing modalities. We looked at sound, vision, so there are other sensing modalities we'll look at. But also, we're just looking for partners. We've made one-off things, but I am personally interested in helping my medical partners be able to investigate new things. So, what happens when you can watch disease progression in someone's home for years at a time? I'll tell you which devices I don't want in my home — one where it’s constantly monitoring me. Here's a device that many people would be willing — although there still has to be some trust with the organization that installs it — to have in their kitchen. So, really, my goal is to help my medical partners enable their scientific investigations.

Iravantchi: We’ve already done some work to this effect. In my previous work, we designed microphones that don't capture speech. We've deployed this in people's homes for kidney health so we can do completely inaudible urinary voiding tracking; we can tell every time someone urinates, and we can track the urination envelope, which is clinically valuable. People are a whole lot more comfortable using a microphone they know cannot capture any audible sounds. It's an entirely inaudible ultrasonic microphone.

Tech Briefs: How did the presentation overseas go?

Iravantchi: People were very receptive to it. It's a rather new concept, having privacy-aware and privacy-preserving hardware. It was very nice to get a warm embrace from the privacy community; they agree that this is an interesting avenue to continue exploring — to look at how we can build some safeguards at the sensor level, before their data goes out into the world, and who knows what happens to it.

Sample: I'd add that, generally in computer science, there's a common misperception about the difference between security and privacy. Big tech is pretty good at applying security features to ensure that private information is transported back to their servers in a way that third parties can't get it. But that's different than privacy, right? I just don't want big tech having naked pictures of me.

What we're talking about isn't the best privacy in the world. We're just saying there are certain things that I as a user want to have control over and know what is going to leave my device.