The XRLoc localization module, less than one meter long, can be integrated with TVs or soundbars to provide centimeter-accurate locations of users by leveraging wireless UWB signals. This location information can deliver realistic 3D audio or dynamic location-aware content to users. From left to right: Dinesh Bharadia, Aditya Arun, Shunsuke Saruwatari. (Image: Ryotaro Hada)
Tech Briefs: What led you to start on the project?

Aditya Arun: The primary goal of developing this was to improve the way augmented reality and VR applications are deployed in the real world. In the past, I, along with Professor Dinesh Bharadia and members of the Wireless Communication Sensing and Network Group (WCSNG), as well as Shunsuke Saruwatari at Osaka University, Japan, have worked on many wireless localization systems. We specifically worked with Wi-Fi and ultra-wideband technologies and found the deployment of these systems to be very time-consuming and cumbersome. We had to deploy anchors all over the room, and it was extremely hard to set up. So, we then primarily targeted ease of deployment for localization.

We care about this because when you have objects in the real world and you want to transform them or take them into the digital world, you need to have centimeter-level accurate locations of the objects with respect to you, and with respect to the rest of the surroundings. In that way you can have the same context in the digital world as in the real world.

So, the core question we addressed was how to provide extremely easy-to-deploy, centimeter-accurate localization with up to millimeter-accurate tracking capabilities for multiple objects in the environment. Our XRLoc ultra-wideband (UWB) localization system is an answer: a single one-meter device, the size of a sound bar you place under your TV screen, with which you can get these localization accuracies for various tagged objects in the environment.

A simple example could be if I want to take a bottle and use it as a sword in a game or use it as a tennis racket, I can very easily do that by just attaching a UWB tag to it and playing with it. I would see my actions and movements replicated in the virtual reality (VR) world.

Tech Briefs: Are you intending this mostly for the gaming sector?

Arun: I think gaming is a very strong sector, but the way we think about it is that augmented reality (AR) itself has a larger scope. If I were to, let's say, do a collaborative meeting, I need to understand where all the people in the room are. Perhaps I need to understand where different objects in the room are if I'm having some kind of a demonstration and I’m trying to do that virtually. Video games are a very simple application I can think of off the bat, but I can definitely imagine other scenarios where this kind of transferal of real-world objects into the virtual world will be very useful.

Tech Briefs: Could you give me an example?

Arun: So, apart from gaming, one example could be: I want to showcase a product to you, and I have it in my hand. Maybe it's a PCB board or a model of a car, and I want to show it to you in the virtual world — I want to share the feeling of interacting with it physically — I want to show you how I'm interacting with it and convey that thought process to you. In that scenario, I need to understand the location, the orientation of the object in the world, and how it's interacting with other things, and put that across into the virtual world.

Tech Briefs: Is your tag like an RFID tag?

Arun: RFID is based on reflections of signals from a transmitter that activates the tag. UWB is slightly different: it is like Apple's AirTags, which use active transmission. In our case, the transmissions are received by the anchor we developed.

In our system, we detect the direction the signal is coming from and the different times that it takes to arrive at different antennas on the device. So, it's the angle of arrival and the time difference of arrival. We can use these two measurements to figure out where you are.
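As a rough, textbook-style sketch of the time-difference measurement described above (not the paper's actual algorithm, and with an illustrative 0.5 m antenna spacing), the angle of arrival at a two-antenna pair can be recovered from how much earlier the signal reaches one antenna than the other:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def aoa_from_tdoa(delta_t: float, spacing: float) -> float:
    """Far-field angle of arrival (radians off broadside) for a
    two-antenna pair, given the time difference of arrival delta_t
    (seconds) and the antenna spacing (meters)."""
    path_diff = C * delta_t                 # extra distance to the far antenna
    s = max(-1.0, min(1.0, path_diff / spacing))
    return math.asin(s)

# A signal arriving 30 degrees off broadside at a 0.5 m pair reaches
# one antenna about 0.83 nanoseconds later than the other.
dt = 0.5 * math.sin(math.radians(30)) / C
print(round(math.degrees(aoa_from_tdoa(dt, 0.5)), 1))  # → 30.0
```

Note how tiny the time differences are (sub-nanosecond), which is why time alone gives only a coarse angle and the phase measurements discussed below are needed for precision.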

Since it's a powered transmission, not a reflective one, you can have a larger range than with RFID — up to 10 meters. You can imagine covering an entire conference room area with one of these localization devices.

Tech Briefs: Your paper refers to time and phase — is what you mean by phase, the angle?

Arun: Exactly, phase underlies the angle of arrival (AoA) measurement. When signals arrive from different angles, they introduce slightly different phases at the receiving antennas. Depending on how we place the antennas on the localization module, we accrue different amounts of phase shift for different angles. By placing two antennas farther apart, you can improve the precision of the measurement.

However, if you separate the antennas far from each other, there is a lot of ambiguity in the angles that you measure. In other words, when two antennas are spaced very far apart, you might think that the signals are arriving from multiple angles. This ambiguity arises because phase measurements are periodic every 2π radians. To address that ambiguity, we use the time difference of arrival — we use the understanding of how much additional time it took to arrive at one antenna versus the other to figure out the specific angles it came from. The way we think about it is the phase gives us the precision that we need to localize the tag accurately; whereas the time measurements, which are aperiodic, allow us to eliminate any ambiguities that might exist.
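The interplay described here can be sketched in a few lines. This is a simplified far-field illustration, not the system's actual estimator; the 6.5 GHz carrier and 0.5 m spacing are assumed values for the example. A wide baseline makes the wrapped phase consistent with many candidate angles, and a coarse time-based estimate picks out the right one:

```python
import math

C = 299_792_458.0
FREQ = 6.5e9                  # assumed UWB carrier frequency, Hz
LAM = C / FREQ                # wavelength, about 4.6 cm

def candidate_angles(phase_diff: float, spacing: float):
    """All angles (radians) consistent with a measured phase difference,
    which wraps every 2*pi once the spacing exceeds half a wavelength."""
    cands = []
    kmax = int(spacing / LAM) + 1          # possible integer wrap counts
    for k in range(-kmax, kmax + 1):
        s = LAM * (phase_diff + 2 * math.pi * k) / (2 * math.pi * spacing)
        if -1.0 <= s <= 1.0:
            cands.append(math.asin(s))
    return cands

def resolve(phase_diff: float, spacing: float, coarse_aoa: float) -> float:
    """Pick the precise phase-derived candidate closest to the coarse,
    unambiguous time-difference-based angle estimate."""
    return min(candidate_angles(phase_diff, spacing),
               key=lambda a: abs(a - coarse_aoa))

true_aoa = math.radians(25)
spacing = 0.5                              # wide baseline -> many aliases
phase = (2 * math.pi * spacing * math.sin(true_aoa) / LAM) % (2 * math.pi)
print(len(candidate_angles(phase, spacing)) > 1)                 # True
print(round(math.degrees(resolve(phase, spacing, math.radians(23))), 1))
```

The coarse time estimate here is off by two degrees, yet it is still enough to select the correct phase candidate, which recovers the true 25-degree angle almost exactly. That is the precision-from-phase, disambiguation-from-time division of labor described above.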

Tech Briefs: Why did you pick six transmitters?

Arun: We experimented with multiple different numbers, and six seemed to be the point where we got good enough accuracy without affecting the cost too much. Also, as you add more antennas, you increase the amount of power the module consumes. We found that six was the sweet spot where we hit the accuracy we were looking for with the minimum number of transmitters.

But that said, we are now working on designs that require fewer antennas. If we can reduce the noise in the hardware itself, we could go down to as few as three antennas.

Tech Briefs: Can you tell me what you're doing to try to get there?

Arun: We're improving the PCB design and fleshing out the system a little bit better. We're also thinking of perhaps combining multiple antennas in a single module. Right now, we have individual transceivers for each antenna, but there are ways we can combine multiple antennas with a single transceiver.

Tech Briefs: What steps are you taking to commercialize this?

Arun: We have a collaborator who has joint appointments in the music and engineering departments. He is working on art pieces that this could be used for. He's looking to develop pieces that use interactive sound, like spatial audio. A television has stereo speakers: the sound from the left speaker is slightly different from the right, which is what gives you the feeling that sound is moving through the environment. With our system, we can potentially improve that spatial perception. For example, currently available sound bars allow you to beamform audio specifically to each ear so that you can have better 3D perception. However, in that scenario you need to know where the user is with close to centimeter accuracy.

So, one of the things that we are working on with him is to integrate our system with an actual sound bar to showcase the capabilities of delivering spatial audio in large spaces.

From a commercialization perspective, that might take off because the sound bar industry is looking for better ways to figure out where people are in a space. At this moment, there are no other really good solutions to this problem. Many iPhones and Android phones have UWB transceivers, so if you are carrying your phone with you, it can tell the loudspeaker where you are.

Another way you can deliver spatial audio is through wireless headphones. If you’re in a big room and are wearing wireless headphones to receive the audio, I can create a 3D audio experience for you by attaching UWB systems to them to determine your exact location.

Tech Briefs: You’ve written that one of the problems with using cameras for localization is you can't look around objects, but with UWB, you can. How does that work?

Arun: Suppose there is a VR or AR setup where you want to localize multiple people and objects in the environment, and all of them are moving about and changing positions. If you rely purely on something like the HTC Vive, whose older versions use cameras or IR scanners placed in the environment, a person walking in front of one creates a shadow. But shadowing is less of an issue with the 6-gigahertz signals we use because they can go around people.

Light has a very small wavelength compared to UWB signals: light is at terahertz frequencies, while UWB is at gigahertz frequencies. The way you can think about it is that the shorter the wavelength, the smaller the objects that can block the signal. Since light has a very small wavelength, it can't go around many obstacles.
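To put rough numbers on this comparison (the specific frequencies below are illustrative, not from the paper), wavelength is just the speed of light divided by frequency:

```python
C = 299_792_458.0  # speed of light, m/s

def wavelength(freq_hz: float) -> float:
    """Wavelength in meters: lambda = c / f."""
    return C / freq_hz

# Green light (~560 THz) vs. a UWB channel near 6.5 GHz.
print(f"{wavelength(5.6e14) * 1e9:.0f} nm")   # roughly 535 nm
print(f"{wavelength(6.5e9) * 100:.1f} cm")    # roughly 4.6 cm
```

A centimeter-scale radio wavelength diffracts around a person or a piece of furniture; a sub-micron light wavelength does not, which is why optical trackers suffer from shadowing.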

Radio waves have large enough wavelengths that they're not occluded or shadowed by many common obstacles. Of course, if there's a concrete or metal wall in front of you, it will block the system. But not the human body, plaster or wooden walls, or furniture. If a person is crouching behind a sofa while playing a game, you can still get their location.

Tech Briefs: Is there anything you’d like to add?

Arun: I’d like to add that I'm not the sole author on this paper; it’s been a highly collaborative effort, and the collaboration is still in place. We just submitted another paper together with Professor Saruwatari and his team at Osaka University. Working with different people adds to the pleasure of this project.

Tech Briefs: How do you distribute the work between yourself and Professor Saruwatari — do you each specialize in a different aspect?

Arun: Yes, that's exactly it. He is extremely good at embedded system development — he is very good with hardware and understands how to write and build systems on embedded platforms. And we are very good at wireless system design. We really understand how to do localization and work with UWB. So, it's a really good mix — we can say we’d like to try something, and they can tell us whether it’s possible. They can do it in much less time than it would take us.

So, for some of the next steps, we’re giving them our thoughts on what we’d like to do and they are trying it out and giving us their thoughts on what is possible and what isn’t. Then they suggest ways it could be done better. This creates a great back-and-forth process to improve our systems.

Project Link: https://ucsdwcsng.github.io/XRLoc 

Lab Link: https://wcsng.ucsd.edu/