PwC's Tech While You Trek: Immersive Interfaces

April 5, 2021 | Season 1, Episode 26

Tune in to another episode of Tech While You Trek to hear PwC Senior Manager Shaz Hoda and Senior Associate Justin Hromalik discuss how immersive technologies use human attributes such as touch and emotion to bring users closer to the digital world.

Tech While You Trek - Immersive Interfaces
Guests:  Shaz Hoda, Justin Hromalik
Release Date:  April 5, 2021


Adam (00:08): 
Hello everyone, and welcome to another episode of PwC Tech While You Trek. I am your host, Adam, and today I have with me Shaz Hoda and Justin Hromalik to talk about the emerging technology convergence theme of immersive interfaces. Please introduce yourselves and tell us a little bit about who you are and how you came to be with PwC.

Shaz Hoda (00:24):
Hey Adam, I'm Shaz. I'm part of the emerging tech group at PwC, within which I lead the emerging tech lab. The lab is basically responsible for creating prototypes around new and upcoming technologies which are not mainstream today but hold great potential for the future.

Justin Hromalik (00:42):
My name is Justin, and I'm a researcher. I'm also part of the emerging tech group, on the research team, where I help drive research for 200-plus emerging technologies with a key focus on the Essential Eight and the different convergence themes that we have.

Adam (00:57):
So, as I mentioned, today we're going to be talking about immersive interfaces. So what are immersive interfaces?

Shaz Hoda (01:04):
So when you think about technology, Adam, what we think about is either typing on a keyboard, clicking on something with a mouse, or maybe interacting with a touch screen using our fingers. In contrast, immersive interfaces allow us to interact with technology in a way that's very natural to us, or in a way that you would typically interact with the physical world. If I can give you an example, think about a virtual reality headset, in which what the headset is trying to do is basically look at your eye movement and your physical body movement to help you interact with the environment in a more natural way. Right?

Adam (01:42):
So how does that work? I know it seems like an oversimplified question, but how does that work?

Shaz Hoda (01:49):
There are typically three essential pieces, or three essential things, that have to happen together for, say, our example of a virtual reality headset to work. One is that there are multiple sensors. These sensors are gathering information about your eye movement and how you're moving your body, and transmitting this information to the second piece, which is the hardware, which in our case is again the VR headset. So all these sensors, and there are dozens of them, are transmitting a lot of information to this hardware, which is then trying to process this information to give you a near real-time replica of the real world. How it does that is the third critical element of this, which is using AI. What the AI does is take all that sensor information, ignore the redundant part of it, and then basically process all that information in milliseconds, or less than the blink of an eye, so that what you are witnessing in the virtual world is very close to, or almost indistinguishable from, how you would experience it in the real world.
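
[Editor's note: for readers who want to see the sensors-hardware-AI loop Shaz describes in concrete terms, here is a minimal Python sketch. Everything in it, the SensorSample fields, the averaging "AI" step, and the render_frame stub, is invented for illustration; a real headset pipeline would use the vendor's SDK and far more sophisticated models.]

```python
# Minimal sketch of the sensors -> hardware -> AI loop described above.
# All class and function names here are illustrative, not a real headset SDK.

from dataclasses import dataclass
import random
import time

@dataclass
class SensorSample:
    head_yaw: float    # degrees
    head_pitch: float  # degrees
    gaze_x: float      # normalized 0..1
    gaze_y: float      # normalized 0..1

def read_sensors() -> SensorSample:
    """Stand-in for the dozens of motion and eye sensors on a headset."""
    return SensorSample(
        head_yaw=random.gauss(0.0, 1.0),
        head_pitch=random.gauss(0.0, 1.0),
        gaze_x=random.random(),
        gaze_y=random.random(),
    )

def denoise(samples: list[SensorSample]) -> SensorSample:
    """The 'AI' step in miniature: discard redundancy and noise by averaging
    a short window of raw samples into one stable pose estimate."""
    n = len(samples)
    return SensorSample(
        head_yaw=sum(s.head_yaw for s in samples) / n,
        head_pitch=sum(s.head_pitch for s in samples) / n,
        gaze_x=sum(s.gaze_x for s in samples) / n,
        gaze_y=sum(s.gaze_y for s in samples) / n,
    )

def render_frame(pose: SensorSample) -> None:
    """Stand-in for the headset hardware updating the virtual scene."""
    print(f"render: yaw={pose.head_yaw:+.2f} pitch={pose.head_pitch:+.2f} "
          f"gaze=({pose.gaze_x:.2f}, {pose.gaze_y:.2f})")

if __name__ == "__main__":
    window: list[SensorSample] = []
    for _ in range(30):            # a few simulated 'frames'
        window.append(read_sensors())
        if len(window) == 5:       # fuse every 5 raw samples into one update
            render_frame(denoise(window))
            window.clear()
        time.sleep(0.01)           # the loop runs every few milliseconds
```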

Adam (02:54):
It's like direct mapping essentially in real time?

Shaz Hoda (03:00):
Yeah, absolutely. So if I give a more detailed example of the VR headset, instead of us having this conversation over the phone or over a video, we could be putting on our VR headsets and essentially having this conversation anywhere in the world. So what the VR headset would do for us is replicate the environment of say a cafe or an outdoor cafe in Paris. And instead of seeing each other on video, we would see each other sitting at the cafe and experiencing how the sun is moving, how people are walking by us or even the streets around us in real time.

Adam (03:34):
It strikes me that it would require the transfer of a tremendous amount of data.

Shaz Hoda (03:39):
There's a lot of information being collected by the sensors, which is, say, how your hands are moving, how you're moving your head to see the view, et cetera. And all that information is being processed by an AI engine, which is basically taking a tremendous amount of data, ignoring the noise in the data and really processing the essential part to make it look real and to help you interact with that environment the way you would interact with the physical world rather than a virtual world.

Adam (04:09):
So what are some applications? What are some ways in which companies are using these immersive interfaces?

Justin Hromalik (04:15):
With immersive interfaces, we're really talking about a spectrum of maturity based on the use case and the type of interface. On the highly mature end, we're looking at chatbots and virtual assistants, things that are really being scaled across organizations as we speak. From automated call centers to online help chats, these customer-centric use cases immediately come to mind. But those types of intelligent agents are also being explored internally as a way to improve employee satisfaction and optimize business processes. Another big trend that's in mid swing at the moment is the movement towards touchless or frictionless everything. That's really the idea of just being able to-

Adam (05:05):
Considering the times?

Justin Hromalik (05:07):
Yeah. Exactly, aptly timed. It's really just the idea of being able to interact with people, machines and businesses with as little physical interaction as possible. So the QR codes that you've started seeing in restaurants, virtual telehealth appointments that you started making with your doctors, those are pretty basic examples, but I think a lot of people are going to be surprised by how quickly things are starting to change. Especially in areas like travel and retail.

Adam (05:32):
I saw my first touchless entry when I got to college, which was longer ago than I cared to admit, but our IDs were the things that let us in and out of buildings.

Justin Hromalik (05:42):
That's definitely the future we're headed into, but our IDs in the future are going to be attached to our bodies in some way, either our phones or the AR headsets that we're going to be wearing. Those devices could all be connected to infrastructure, so you could do that anywhere.

Adam (05:57):
Why is it going to work in the future where it didn't work before?

Justin Hromalik (05:59):
I definitely think it's going to work this time around. The next generation of AR headsets is really going to be quite technologically advanced, right? You're talking about eye tracking and 8K displays, so you're not able to differentiate between what's on the display and real life. Pass-through cameras so you can actually see parts of the environment overlaid on top of the digital world, and vice versa.

Adam (06:24):
So gentlemen, talk to me a little bit about what's trending, what's on the horizon, things that are interesting to watch out for in this space.

Justin Hromalik (06:31):
I think body movement tracking is something that's right on the horizon that's quite interesting. On the industrial side, you're going to have safety monitoring systems that can track employees' movement on the factory floor or in areas where safety is actually important, working with robots, for instance. Being able to have that type of monitoring, to see how people are moving, and to give feedback using haptics, being able to alert people based on vibrations on their body, for instance, that they're coming too close to a dangerous area. Emotion recognition is maybe a little bit more cutting edge.

I think that's a pretty interesting one as well: basically machines and devices being able to understand what you're feeling based on your facial expressions or sensor data from a smartwatch. These types of systems are going to allow for really advanced capabilities in the near future, where, for example, you walk into your bathroom and your smart mirror analyzes your mood. You know, it gives you a pep talk for the day based on how you're feeling.
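
[Editor's note: as a rough illustration of the haptic safety alert Justin describes, here is a small Python sketch that buzzes a wearable harder the closer a tracked worker gets to a hazard zone. The coordinates, the buzz() function, and the thresholds are all hypothetical stand-ins for a real positioning system and device SDK.]

```python
# Illustrative sketch: track a worker's position and vibrate a wearable
# when they come too close to a hazard zone (e.g. a robot cell).
# Positions, thresholds, and buzz() are invented for illustration.

import math

HAZARD_ZONE = (10.0, 4.0)  # (x, y) of the robot cell, in meters (assumed)
ALERT_RADIUS = 2.0         # start alerting inside this distance

def distance(a: tuple[float, float], b: tuple[float, float]) -> float:
    return math.hypot(a[0] - b[0], a[1] - b[1])

def buzz(intensity: float) -> None:
    """Stand-in for a haptic wearable: stronger vibration when closer."""
    print(f"VIBRATE at {intensity:.0%}")

def check_worker(position: tuple[float, float]) -> None:
    d = distance(position, HAZARD_ZONE)
    if d < ALERT_RADIUS:
        # Scale vibration with proximity: 0% at the radius, 100% at the hazard.
        buzz(1.0 - d / ALERT_RADIUS)

# Simulated positions streaming in from a factory tracking system.
for pos in [(14.0, 4.0), (12.0, 4.0), (11.0, 4.0), (10.5, 4.0)]:
    check_worker(pos)
```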

Adam (07:33):
Would it do that based on, like, a question-and-answer thing, or would it do that just based on what it sees in your body language and your motion? How would your mirror sense your mood?

Justin Hromalik (07:41):
It's really a combination of those, and I think we'll see it start as one thing and then morph into others. So it could start off voice-enabled, detecting cues in your voice. And then eventually, as the technology advances, using computer vision systems, it could analyze your facial expressions and start to see different subtle cues in how your face is moving.

This is the idea of sensor fusion, really combining all the sensor data available. I would say on the very far end, an interesting trend to watch is brain-computer interfaces, right? Basically being able to communicate with machines and robots using just your thoughts. This isn't something that's just science fiction; there are actual products out there today that you can buy.

You can put it on the back of your head and it'll detect which smart device in your house you're looking at, or what action you want to perform, whether that's turning off a light or starting your coffee machine. Those kinds of basic examples are doable today. But in the future, any action that you can perform in the digital world could theoretically be converted into a series of neurological patterns that could be analyzed by a machine.
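
[Editor's note: to make the gaze- or thought-to-action idea more concrete, here is a hedged Python sketch that maps a classified intent (which device, which action) to a smart-home command. The classify_intent() stub, the signal values, and the device names are invented for illustration; commercial BCI and eye-tracking products expose their own APIs and trained models.]

```python
# Sketch of mapping a classified user intent to a smart-home action.
# classify_intent() is a trivial stub standing in for a trained model
# that would interpret real eye-tracking or neural signals.

from typing import Callable

def classify_intent(signal: list[float]) -> tuple[str, str]:
    """Stub for the model that turns sensor/neural data into (device, action).
    Here: a simple threshold instead of a trained classifier."""
    return ("light", "off") if sum(signal) < 0 else ("coffee_machine", "start")

# Registry of supported (device, action) pairs and the commands they trigger.
ACTIONS: dict[tuple[str, str], Callable[[], None]] = {
    ("light", "off"): lambda: print("Turning the light off"),
    ("coffee_machine", "start"): lambda: print("Starting the coffee machine"),
}

def dispatch(signal: list[float]) -> None:
    intent = classify_intent(signal)
    ACTIONS.get(intent, lambda: print("No mapped action"))()

# Two fake signal windows standing in for headset readings.
dispatch([-0.4, -0.1, -0.2])  # -> light off
dispatch([0.3, 0.5, 0.1])     # -> coffee machine
```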

Shaz Hoda (08:49):
In terms of diversity and inclusion, Justin and I think brain-computer interfaces are an interesting topic in themselves. They enable inclusion: for instance, if you're not able to do something because of a physical limitation, a brain-computer interface can help you achieve those goals, because it now lets you attach a prosthetic arm, which you can operate through the brain-computer interface to do actions that you were not able to do before.

Adam (09:24):
Well gentlemen, listen, before I get you out of here, there's one final question I ask of all my guests, kind of a fun way to let you go. Are you ready?

Shaz Hoda (09:32):
Yeah. Absolutely.

Adam (09:33):
What piece of technology, what sort of technological application, what little gizmo would the you of 10 years ago be completely shocked and surprised that the you of today is using?

Shaz Hoda (09:45):
If you had told me 10 years ago I would be sharing such an intimate relationship with my smart watch, I would be quite surprised. The smart watch tells me when to get up, when to sit down, how many calories I need to burn, how many workouts I've been doing; it monitors my heartbeat and tells me how well I'm sleeping, and so many other things.

Adam (10:04):
I feel like we skipped that and then skipped back, right? The old science fiction shows from the fifties and sixties all had the communicator on the wrist, and we went to the phone and then came back to the wrist. That's a good one. Justin, what about you?

Justin Hromalik (10:16):
Creativity is a big surprise for me, really. So technology enabling that, enabling creativity and enabling me to express myself in new ways. I really believe that we're kind of headed into an age of escalated human creativity that's powered by technology, where the only limitation people have is their imagination. That's the kind of future that I dreamt about when I was a kid; I just didn't realize we would get here so soon.

Adam (10:42):
Well, listen, gentlemen, thank you so much for stopping by today and taking some time to chat a little bit.

Shaz Hoda (10:47):
Thanks Adam.

Justin Hromalik (10:48):
Thanks for having us.

Adam (10:49):
Shaz Hoda and Justin Hromalik, thank you very much. This has been another episode of Tech While You Trek. I've been your host, Adam, and we will talk to you again next time.


Speaker 4 (11:02):
This podcast is brought to you by PwC. All rights reserved. PwC refers to the US member firm or one of its subsidiaries or affiliates, and may sometimes refer to the PwC network. Each member firm is a separate legal entity. Please see www.pwc.com/structure for further details. This podcast is for general information purposes only and should not be used as a substitute for consultation with professional advisors.