CapTech Trends

How Retailers Can Drive Sales With AR

December 09, 2020 CapTech

We've long been following Augmented Reality (AR) and Virtual Reality (VR) and predicted that AR would outpace VR. So far that seems to be coming true. What's interesting is that AR came about in the 1960s but went nowhere quickly because it wasn't viable – until mobile came along, and now it has spread like wildfire. Why? Unlike VR, which is still pretty niche and expensive, appealing mostly to hobbyists and architects, AR's cost of entry is next to $0. The ease of access and use makes it available to just about everyone with a mobile phone. With AR available at the fingertips of a large majority of Americans, where do we see it going next? What have companies gotten right and what have they gotten wrong? Listen to this podcast to explore the future of AR with us.

Speaker 1:

Hello,

Speaker 2:

Welcome to CapTech Trends, a place where we meet with thought leaders and subject matter experts to discuss emerging technology, design, and project methodology. I'm your host, Vinnie Schoenfelder, principal and chief technology officer at CapTech Consulting. Today Jack Cox is joining me again. If you're a frequent listener, you'll know Jack from previous podcasts as both a guest and a cohost. He's a fellow at CapTech and a leader within our thought leadership and innovation areas. (Jack: Hey, thanks, Vinnie.) So we're going to be discussing augmented reality. It's something that Jack and I have been following for years and making different predictions about. What we want to cover today is the current state of affairs with augmented reality, some new advancements, and where we think things are going in the future. So Jack, get us started. Give us a little history.

Speaker 3:

Well, I'll give you the reality: it's actually been around since the sixties, with the first giant headset, but it didn't really become viable until we got powerful mobile phones.

Speaker 2:

I thought you were going to go down the LSD route, but okay, keep going.

Speaker 3:

That was a totally different reality. With powerful mobile devices in our pockets, augmented reality started to catch hold. There were AR apps early in the App Store that displayed information based on your location, like stores and retailers around you. They were somewhat clunky and not well supported.

Speaker 2:

Weren't those like 2D images presented in a 3D view?

Speaker 3:

They were in 3D, but what you saw were 2D billboards that hovered in place. Then Pokémon GO came along, which was quite the big hit, and around the same time Apple released ARKit. Now, Pokémon GO was not based on ARKit at all; it's a separate framework that Niantic developed.

Speaker 2:

But again, that was essentially two-dimensional. You couldn't walk around a Pokémon and see all sides of it.

Speaker 3:

Yes, that's right. The Pokémon was always looking at you; it was very much a flat experience. With ARKit, which came out shortly after that, we started getting really deep experiences in AR where you could see three-dimensional objects, move around them, get close to them, and manipulate them in the real-world environment. Apple has been progressively improving ARKit since then, and Google has added ARCore to Android. It's somewhat similar to ARKit, a little more difficult to program against, but it is possible to produce comparable experiences across both platforms.
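
For developers curious what that baseline experience looks like in code, here is a rough sketch using Apple's ARKit and RealityKit: track the world, detect a horizontal surface, and anchor a model the user can walk around. The model name "toy" is a placeholder for any bundled USDZ asset.

```swift
import UIKit
import ARKit
import RealityKit

// Minimal sketch of a walk-around AR experience with ARKit + RealityKit.
// "toy" stands in for any USDZ model bundled with the app.
final class SimpleARViewController: UIViewController {
    private let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // Track the device in 3D space and look for horizontal surfaces.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        arView.session.run(config)

        // Anchor the model to the first detected horizontal plane; the user
        // can then move around it and view it from any side.
        if let model = try? Entity.loadModel(named: "toy") {
            let anchor = AnchorEntity(plane: .horizontal)
            anchor.addChild(model)
            arView.scene.addAnchor(anchor)
        }
    }
}
```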

Speaker 2:

Yeah. One of the things that jumped out at me early on, being an early adopter of both augmented reality and virtual reality: the first time I put on a true virtual reality headset, the HTC Vive, it was an amazing, take-your-breath-away experience. It really is that good. Some of the all-in-ones at the time were not, and I think that kind of alienated people from the technology, but the real VR experience is actually very immersive and compelling. My first experience with augmented reality was the opposite. I couldn't wait to try the HoloLens from Microsoft, and when I put it on at a conference it was completely underwhelming. So it's interesting that you and I both probably shared that experience at the time, and yet we both predicted that in the immediate term augmented reality would outpace virtual reality from an adoption standpoint, because it wouldn't be about the wearables; it would be about the phones and the tablets. So maybe speak to the difference in those platforms: adoption, usability.

Speaker 3:

The real difference in adoption is the amount of equipment you need. In AR there are really two types: phone-based AR and visor- or head-mounted AR. Virtual reality is really just head-mounted, and you need substantial compute power to produce a completely immersive environment that doesn't make you motion sick and can render the level of detail you want to see in a gaming environment.

Speaker 2:

So it's still relegated to the hobbyist on the virtual reality side. Yeah.

Speaker 3:

Yeah. VR takes a time commitment and a monetary commitment that will exclude most people from getting involved. It's still very much a niche play that's marketed toward gamers. There's also a substantial market in the engineering space to visualize things like architecture in 3D spaces, but there you've got dedicated teams maintaining it and building the worlds. With AR, for most people today, the cost of entry is zero because it's already built into your phone. That's for phone-based AR. The holy grail is going to be visor-mounted or head-mounted AR, where it's always in front of your face and both hands are free to manipulate the real world while you're seeing an augmented world. Google tried that —

Speaker 2:

Google Glass. So what did they get right? What did they get wrong?

Speaker 3:

What they got right: it displayed data and it was internet connected. It was a small device with decent battery life that slaved off of a mobile device, so it got network connectivity from that device. I thought that was all good. What they got wrong, I think, was the aesthetics of it, and the impression that it was recording people all the time. I think that was maybe a precursor of people not trusting Google as much, when they realized Google does have the power to record them all the time. So there were restaurants and bars that were barring people who had Google Glass on, and there was a not-very-nice term, "Glasshole," for people who were wearing them in public.

Speaker 2:

Right. So they didn't win the hearts and minds.

Speaker 3:

They did not win the hearts and minds, and I think that's the big hurdle that any visor company will have to overcome.

Speaker 2:

Let's talk about some of the limitations of the current visors, and then also talk about where we think they're working and doing well. For those who haven't tried a lot of different headsets: if you only have a camera and display over one eye, it seems far less immersive than if you've got two that are giving you a more complete view of what's going on.

Speaker 3:

AR systems that sit over one eye are typically industrial, showing data rather than three-dimensional objects. They're just overlaying some operational data into your field of view so that you can use both hands while you're doing your job and still see that data.

Speaker 2:

Right. And once you get the compute power and the battery and try to put all that on your head, it's like wearing a small helmet. It's not comfortable, it's bulky, and it's not balanced really well.

Speaker 3:

It gets pretty hot and doesn't have great battery life. We worked with one AR headset that actually had an issue with catching people's hair on fire because it got so hot.

Speaker 2:

All right, that's not good. Another implementation approach is to take that compute power and battery off the head and attach it via a cable to what we call a hockey puck that you basically put in your back pocket. And again, there are these barriers to using it that way. Where we do see it being successful, though, is industrial, and also companies like ArtGlass, where you see them renting equipment in, say, a museum, getting an additive experience, and then returning the equipment when you're done. It's highly effective for an hour-long tour or something like that, as opposed to wearing it for eight hours a day.

Speaker 3:

In that environment they can manage the battery life and they manage the data. It's a very constrained environment they're dealing with, so they don't need quite as much compute power as they would in a more open space.

Speaker 2:

Right. So we're seeing the wearable technologies succeed in domains that can control aspects of the environment.

Speaker 3:

If you want to try it out, find a museum that uses something like ArtGlass, go try out AR at that museum, and see how it augments the experience.

Speaker 2:

Right. So there are lots of rumors that Apple already has a visor created. How do you think they'll do it better?

Speaker 3:

I think Apple's probably going to focus on privacy, as they've been doing for the past five or six years. One of the rumors going around is that the Apple visor will not have a camera on it, but will use the LiDAR sensor they've been including on the iPad Pros and iPhone Pros to sense the world around it, and sense it in a way that protects privacy. It's not visual, but it is detecting the shapes, edges, and contours of things, so your augmented objects can actually sit on tables, or fall off the table if you don't put them in the right place. It could then be used for things like indoor navigation, even in the dark, to see where the edge of something is. It could also be used to help people with visual impairments perceive features of the real world.

Speaker 2:

So can the phone then take the place of the wired hockey puck?

Speaker 3:

The phone will probably take the place of the compute puck.

Speaker 2:

Well, how can it do that from a data bandwidth perspective?

Speaker 3:

Probably a point-to-point WiFi connection between the two. One of the challenges with VR and visor-mounted AR units has always been head tracking: being able to respond quickly to the small movements that people's heads make. That can displace objects, and it can cause motion sickness when the things you're seeing aren't moving correctly. So that will be a challenge: how do they get the low latency required to respond very quickly? In the VR space there's been a tremendous amount of work to reduce that latency. I forget what the minimum number of milliseconds is, but it's a very small number that the system needs to respond to movements within, to avoid dis—

Speaker 2:

Drawing a blank on that word, but it's where you perceive yourself to be versus where the technology is showing you to be. To the extent that those are different, it's very disorienting, and that's where the sickness and nausea comes from if it's not totally in line. Yep.

Speaker 3:

And there's also the need to sense where your body parts are. That can also be done with LiDAR, by recognizing the shape of your hands and knowing where they are and how they're moving, so that you can actually interact: okay, I'm grabbing an augmented object with my hand and moving it, and it knows where in the real world I'm doing that. Maybe I'm typing on a virtual keyboard and it can see what letters I'm typing.
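
The hand tracking Jack describes here is still speculative for a visor, but a camera-based version ships today in Apple's Vision framework. The sketch below is one plausible starting point, not how a future headset would necessarily do it; `pixelBuffer` would come from an ARKit or AVFoundation camera frame.

```swift
import Vision
import CoreVideo

// Rough sketch: detect hand joints in a camera frame with Vision's
// hand-pose request. An AR app could project these joints into the
// 3D scene to drive grab or typing gestures.
func detectHands(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 2

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up, options: [:])
    try? handler.perform([request])

    for observation in request.results ?? [] {
        // Each observation exposes named joints, e.g. the index fingertip.
        if let tip = try? observation.recognizedPoint(.indexTip), tip.confidence > 0.5 {
            print("Index fingertip (normalized):", tip.location)
        }
    }
}
```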

Speaker 2:

So you've mentioned LiDAR a couple of times, and we've said up front that the wearables are good in certain domains and we're hopeful Apple will get it right, but millions of people are using augmented reality now on smartphones. It was frustrating to me when AR first came out: trying to identify a flat surface, trying to get it to work intuitively and correctly. It still felt premature, a hobbyist tool. How has LiDAR changed that?

Speaker 3:

Yeah, one of the weaknesses of purely camera-based augmented reality is that it has trouble seeing surfaces that have no texture. With a plain white table or a plain white wall, the camera doesn't see where the surface is, so it can't figure out that there's a wall there. LiDAR is shooting laser beams at it and sensing the reflection of those beams, so it almost immediately detects surfaces no matter what color they are. It has a little bit of trouble with clear surfaces, obviously, because the laser is going to go through them.
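
A rough sketch of how an app opts into the LiDAR behavior Jack describes on Apple devices: ask for mesh scene reconstruction when the hardware supports it, and fall back to camera-based plane detection otherwise.

```swift
import ARKit

// Sketch: prefer LiDAR-backed scene reconstruction where available.
// On LiDAR devices, textureless surfaces like plain white walls are
// detected almost immediately; other devices fall back to plane detection.
func makeWorldTrackingConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal, .vertical]

    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    return config
}
```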

Speaker 2:

So you've developed applications in both — pre-LiDAR and post-LiDAR.

Speaker 3:

How much has the game changed? We went from, pre-LiDAR, the app taking three or four seconds to start sensing the surroundings — and if you moved too quickly it would lose them and have to recalibrate — to LiDAR detecting the surroundings in maybe 50 milliseconds. It's basically so fast it's imperceptible. Then as you move around, even very quickly, it continues to sense and build the environment around you. It builds a thing called a point cloud that has all the different surfaces and points it has detected with LiDAR. If you move very quickly, that point cloud gets kind of thin, but it's still there, so it can still determine that, yes, there is a surface here. Then it uses machine learning with the point cloud to say, oh, that particular shape looks like a chair, we're going to call that a chair; or that shape looks like a person, we're going to make that a person. And then it starts building the augmented reality world around that person, so any augmentations behind that person are occluded, or hidden, by that person.
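
On Apple's side, the classified mesh and the person occlusion Jack mentions map onto a handful of ARKit and RealityKit options. A hedged sketch of wiring them together:

```swift
import ARKit
import RealityKit

// Sketch: classified LiDAR mesh plus people occlusion, so virtual content
// is hidden behind real objects and people.
func configureSceneUnderstanding(for arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // ARKit labels reconstructed surfaces (seat, table, wall, floor, ...)
    // using on-device machine learning.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
        config.sceneReconstruction = .meshWithClassification
    }

    // Depth-aware person segmentation lets a real person occlude virtual
    // objects that sit behind them.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }

    // Ask RealityKit to use the reconstructed mesh for occlusion.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
    arView.session.run(config)
}
```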

Speaker 2:

Right. It's always interesting when you get the intersection of multiple technologies: augmented reality with LiDAR, with machine learning to do recognition of objects.

Speaker 3:

Yeah. Computer vision, location awareness — the better the location awareness, the better the AR — and machine learning feeding into that. Now we're getting into areas where that point cloud and the augmentations are shared across the network. With current Google and Apple systems, you can build publicly viewable augmentations that people can see. If they have your app, they can open it up and see what other people have placed in the world, or what you have placed into that world.
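
One concrete way to build the shared-world behavior described here is ARKit's collaborative sessions. The sketch below leaves the networking out; sendToPeers is a hypothetical stand-in for whatever transport (Multipeer Connectivity, a server) the app uses.

```swift
import ARKit

// Sketch of ARKit collaborative sessions: each device streams opaque
// collaboration data to its peers so they share anchors in the same space.
final class SharedARSession: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        config.isCollaborationEnabled = true
        session.delegate = self
        session.run(config)
    }

    // ARKit periodically hands you collaboration data to forward to peers.
    func session(_ session: ARSession, didOutputCollaborationData data: ARSession.CollaborationData) {
        if let encoded = try? NSKeyedArchiver.archivedData(withRootObject: data,
                                                           requiringSecureCoding: true) {
            // sendToPeers(encoded)  // hypothetical transport call
            _ = encoded
        }
    }

    // Data received from a peer is fed back into the local session.
    func receive(_ data: Data) {
        if let collaboration = try? NSKeyedUnarchiver.unarchivedObject(
            ofClass: ARSession.CollaborationData.self, from: data) {
            session.update(with: collaboration)
        }
    }
}
```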

Speaker 2:

So who's using this from a corporate perspective? How are they using it? Are they seeing benefits, or is this a technology looking for a problem?

Speaker 3:

We're starting to see a lot more activity in the retail space. Home Depot has started adding augmented reality views to a lot of their products. There was a 12-foot-tall skeleton statue they were selling in October that you could pull up in AR and see what it looked like in front of your house, which was a great way to keep people from buying something that isn't going to work for their house. Maybe people —

Speaker 2:

Could see that with a Christmas tree — the size. Yeah.

Speaker 3:

Yeah, the Christmas tree is a great one. You thought you had a 15-foot ceiling, but you've got an 8-foot ceiling, and that 15-foot Christmas tree is not going to fit. So Home Depot is doing that a lot, IKEA was a leader in that space, and there are a lot of retailers doing it. What they're seeing is that it greatly increases the time of engagement in their application. But more importantly, it also increases the likelihood that people will complete the purchase if they've tried out that item in augmented reality. And on the flip side, it dramatically increases the likelihood that they won't return the item once they purchase it, because they know it's going to fit in their house and that it's going to look okay. They've sort of seen it in the real world.

Speaker 2:

Yeah. One of the numbers, and I'm sorry I can't source it for you, was that people who used augmented reality prior to a purchase more than doubled the conversion — a 112 percent —

Speaker 3:

Increase in conversions.

Speaker 2:

Yeah, that's crazy. And 81 percent on the PC, when you work with the 3D model.

Speaker 3:

And there was something like a 35 percent decrease in the number of returns.

Speaker 2:

Wow. Well, that makes sense, right? You've got to return the Christmas tree when it's three feet too big. You know, COVID has made a lot of people shift to e-commerce. A lot of people already had, but it obviously reinforced that and brought a lot more people in. Yet there are products that you still want to see in person, like furniture, or paint swatches you want to hold up to your wall, or clothing. It seems like AR is a good use case in a time of COVID to bring products into your home, as opposed to having to go out and experience them in the real world. And like we've seen with other things, once people start doing that, we believe they're going to continue doing it, even post-COVID.

Speaker 3:

Amazon is rolling out more and more things with AR capability. I think it depends on the retailer selling through Amazon, but they'll do anything they can to reduce the number of returns, or to give the customer comfort that they're buying a decent product. You know, Amazon has a great return policy, but it's still a pain in the butt to return something, because you've got to take it to some store or drop it in a box somewhere. It's just —

Speaker 2:

It's effort, yep. And for people who haven't experienced this: if you have a smartphone or a tablet, you can do this. It's really interesting to go into a room in your house and add furniture. You're not looking at a picture of it; it exists in 3D space in that view on your device. You can walk around all sides of it, you can get on your hands and knees and look underneath it, you can see if something will fit under it. You really are placing it in your environment.
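
For retailers, the simplest way to offer this on iOS is AR Quick Look, which handles plane detection, lighting, and scale automatically. A sketch, with "sofa.usdz" standing in for a real product model:

```swift
import UIKit
import QuickLook
import ARKit

// Sketch of the "view this product in your room" pattern with AR Quick Look.
// "sofa.usdz" is a placeholder for a retailer's product model.
final class ProductARPreview: NSObject, QLPreviewControllerDataSource {
    func show(from presenter: UIViewController) {
        let preview = QLPreviewController()
        preview.dataSource = self
        presenter.present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        let url = Bundle.main.url(forResource: "sofa", withExtension: "usdz")!
        return ARQuickLookPreviewItem(fileAt: url)
    }
}
```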

Speaker 3:

Yeah. There was a gag floating around, I think last week, when people were having trouble finding PlayStation 5s to purchase, where you could take a picture with a PS5 sitting on your table. I know people who pranked their spouse with it: "Hey, look, I finally got a PS5!" "Oh, where did you find that?" "It's a joke, it's fake, I'm just a mean person, and I'm still failing at life."

Speaker 2:

Well, I remember we did a car model once, early on, Jack, when you were doing a demo of augmented reality, and you could actually pull a chair into the car, sit down, and get the inside-the-car perspective.

Speaker 3:

Yeah. You could stick your head in the car and look around, hold up your child's car seat to see whether it fits in there. You could do the AR in a garage: does the car fit in the garage?

Speaker 2:

Yeah. Some of the stuff I want to see improved: if you put paint on a wall, and I've done that with different apps, it works pretty well. But if you've ever noticed, especially when the sun is setting or coming up, the same paint color on different walls can look very different based on how the sun is shining in, what it's lighting, and what's in shadow. I would think AR could be smart enough to be sensitive to light and its effect on objects such as paint or furniture or a car, and I haven't seen a good implementation of that.

Speaker 3:

Yeah, that's still a very hard problem to solve: dealing with the ambient lighting in a room or a space. With RealityKit, which Apple is pushing developers toward, there is the ability to adjust for ambient lighting and to produce shadows, but it's still not really adapting to the light that's in the space.
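
The lighting hooks he's referring to look roughly like this on Apple's stack: a per-frame ambient light estimate from ARKit plus automatic environment texturing in the session configuration. A sketch:

```swift
import ARKit
import RealityKit

// Sketch: enable light estimation and camera-sampled environment textures
// so virtual materials pick up some of the room's lighting and reflections.
func configureLighting(for arView: ARView) {
    let config = ARWorldTrackingConfiguration()
    config.isLightEstimationEnabled = true
    config.environmentTexturing = .automatic
    arView.session.run(config)
}

// The per-frame estimate reports intensity (lumens) and color temperature
// (kelvin); renderers use it to tint virtual content toward the room's light.
func logAmbientLight(from frame: ARFrame) {
    if let estimate = frame.lightEstimate {
        print("Ambient intensity:", estimate.ambientIntensity,
              "color temperature:", estimate.ambientColorTemperature)
    }
}
```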

Speaker 2:

So outside of hardware improvements like LiDAR and faster processors, what have Apple and Google done from a programming standpoint, in terms of the libraries they make available? How has that improved?

Speaker 3:

They're making the world-building much easier in AR, adding capabilities and SDKs to make it easier to bring in objects. Apple has pretty much standardized on USDZ, which is actually a standard Pixar set, as their way of describing 3D objects, and that makes it very easy to import from other systems and other programs into your AR world. Before, you had to deal with some very obscure object formats and a lot of weird scaling and texture issues. The texture is really the coating around the object that describes the color, the look, and the reflectivity of the object.
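
In practice, bringing a USDZ asset into a RealityKit scene is a one-liner; "chair" below is a placeholder for any model exported from Maya or Blender or downloaded from a marketplace.

```swift
import RealityKit

// Sketch: load a bundled USDZ model. The format packages geometry together
// with its textures (color, roughness, metalness), so materials and scale
// usually come across without manual fixup.
func loadChair() -> ModelEntity? {
    try? Entity.loadModel(named: "chair")
}
```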

Speaker 2:

So let's say I'm a good application developer. I've done lots of websites and some mobile apps, and I want to get into AR. Do I have to have gaming-type skills to understand how to work with 3D modeling? Do I have to be comfortable with Maya or Blender to create models?

Speaker 3:

You don't. You can go to sites like TurboSquid or Sketchfab and download 3D objects that you can put into your programs. You do need to be comfortable with three-dimensional math to do the scaling and some of the rotations of these things; when you start getting into real object manipulation, that's where the math kicks in and you've really got to be thinking about the geometry of things. You can actually build pretty simple experiences with Reality Composer, and even fairly complex experiences, just using those off-the-shelf objects. If you want to get into building custom objects, that's where you start needing to get into Blender or Maya. You may need to get into photogrammetry to take pictures of real-world objects and turn them into three-dimensional objects. There are some apps in the App Store that let you use your phone or your iPad to do photogrammetry and capture objects, and they actually work pretty well. With the LiDAR sensor they're working better and better, because it's getting away from needing to do all that photo analysis; it's actually using the point cloud that LiDAR is building to create these 3D objects.
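
The three-dimensional math he mentions is mostly scale, rotation, and translation applied to a model's transform. A small sketch of placing a downloaded model, with the specific numbers chosen only for illustration:

```swift
import RealityKit
import simd

// Sketch: position a downloaded model by adjusting its transform.
func place(_ model: ModelEntity, in anchor: AnchorEntity) {
    model.scale = SIMD3<Float>(repeating: 0.01)        // e.g. a model authored in centimeters
    model.orientation = simd_quatf(angle: .pi / 2,     // rotate 90 degrees around the vertical axis
                                   axis: SIMD3<Float>(0, 1, 0))
    model.position = SIMD3<Float>(0, 0, -0.5)          // half a meter in front of the anchor
    anchor.addChild(model)
}
```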

Speaker 2:

Gotcha. So if you're listening to this podcast and you haven't played around with AR apps, do you have a couple of favorites you'd recommend people download?

Speaker 3:

The Nike app is a fun one to play with. Actually, the Measure app on iOS, especially if you have an iPhone 12 Pro or iPad Pro with LiDAR — the measurements on that are phenomenal. They're so much more accurate than they were with the older, camera-based ARKit system. So that's a nice tool to see some of the capabilities of AR.
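
The idea behind the Measure app can be sketched with ARKit raycasts: fire a ray from each of two screen points onto detected geometry and take the distance between the hits. On LiDAR devices those hits land on real surfaces much faster and more accurately.

```swift
import ARKit
import RealityKit

// Sketch: measure the real-world distance between two tapped screen points.
func distanceBetween(_ pointA: CGPoint, _ pointB: CGPoint, in arView: ARView) -> Float? {
    guard let hitA = arView.raycast(from: pointA, allowing: .estimatedPlane, alignment: .any).first,
          let hitB = arView.raycast(from: pointB, allowing: .estimatedPlane, alignment: .any).first
    else { return nil }

    // The translation column of each hit's world transform is its 3D position.
    let posA = SIMD3<Float>(hitA.worldTransform.columns.3.x,
                            hitA.worldTransform.columns.3.y,
                            hitA.worldTransform.columns.3.z)
    let posB = SIMD3<Float>(hitB.worldTransform.columns.3.x,
                            hitB.worldTransform.columns.3.y,
                            hitB.worldTransform.columns.3.z)
    return simd_distance(posA, posB)   // meters
}
```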

Speaker 2:

Yeah, and I've seen some clothing companies — shoes, glasses, even sport coats — where you can actually view the items on your body as if you're wearing them.

Speaker 3:

Warby Parker does AR with glasses on your face. And of course Snapchat does all their filters, where they're putting things on people's faces, and that's a form of AR.

Speaker 2:

So, kind of closing thoughts here, let's have a little discussion on being futurists. Let's assume Apple gets this right, knocks it out of the park, does everything you're expecting them to do and more. What are some futuristic use cases you can think of?

Speaker 3:

I think there are a lot of use cases for brick-and-mortar retail, for contextual ads. Right now they have shelf talkers in stores; those could be customized to say, hey, this is a product you've bought before and liked, here it is, oh, it's on sale — and have that show up in your view when you're in the grocery store.

Speaker 2:

So it's kind of the inverse of Minority Report, because in Minority Report it would scan your eye and just shout it out, and everyone else could hear it too. This is much more —

Speaker 3:

Much, much worse.

Speaker 2:

Yeah — private. But it's kind of the same thing: it knows it's you, it knows what you're looking at.

Speaker 3:

Yeah. In all these applications there is tremendous potential for abuse, and companies will need to be very careful that they don't overwhelm the user, the wearer, with all this information coming in. And I think there needs to be a time when people take those glasses off and just disconnect. I know they won't, because —

Speaker 2:

There are times people need to put their phones down, and that's not happening either. So it's hard to assume that as the technology gets better, our behaviors won't just get worse.

Speaker 3:

Yep, that is correct. Again, talking about blending computer vision and machine learning with a really good headset, you could recognize the environment and give cues about it — wayfinding, like, I'm in a new location or a new store, how do I get to a certain department? Or how do I get out the right exit at my Metro stop?

Speaker 2:

You used Home Depot as an example earlier. If you're in Home Depot, it could put dots on the floor to walk you to the bolts you want to buy, and if you have a list, take you on the most efficient path through the store.

Speaker 3:

Or take you on the path that also takes you by things you might want to buy.

Speaker 2:

Right. And again, there's an ethical question: where do you want that boundary? When I think about this — you mentioned the smart boards and whatever in stores — I think about traveling. In an airport you walk up to the flight tracker and you see a hundred or so different flights, and it's scrolling through multiple pages to get to yours. It'd be great if you had the glasses on and it recognized that's the screen you were looking at and told you just what you cared about.

Speaker 3:

And it could show you that information almost anywhere, and then prompt you: oh, it's time for you to head back to the gate. By the time you get back to the gate, it will be time for you to board, based on your flight number, your seat, and how long it takes to get from where you are to the gate.

Speaker 2:

Is that an anti-pattern, though? If I have smart glasses on, done well, and I want to be informed of a gate change, shouldn't I just be able to ask it, no matter where I am, and have it show up in my view? Walking to a physical placard to get the same information is the old school way. It feels like you're mimicking a past behavior when you don't need to; that part of it is artificial.

Speaker 3:

Yep. But there's a certain amount of habitual behavior that people have, and you can accommodate that. The first time they look at it — nope, here's some more information — and maybe it asks whether you want that information to persist or to notify you when there's a change.

Speaker 2:

Right, so leverage that. It's not one or the other; allow for both, and kind of use it as training. I remember when smartphones came out and everything was skeuomorphic — the notes app looked like a notepad — and it trained us to get used to that. Then they flattened the interface out once our patterns and expectations changed.

Speaker 3:

It took about six years for that to happen.

Speaker 2:

You were talking about a patent Apple has out that helps with indoor location, and you were combining some examples with the visor. There was a patent application published —

Speaker 3:

From Apple, yes, that was using indoor location — using the ultra-wideband chips that are in modern iPhones — to locate the phone user within a room and show them the route to other people they may be interested in. Let's say you're at a large conference and you want to talk to somebody. You know who they are, but you don't know where they are; they could be in a room with a thousand people, like a large ballroom. Where are they in that room? It could basically route you dynamically to where that person is standing.
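
The ultra-wideband capability behind that idea is already partially exposed to developers through the NearbyInteraction framework (iOS 14+). A hedged sketch: the peer's discovery token has to be exchanged over the app's own networking, which is omitted here.

```swift
import NearbyInteraction

// Sketch: once two devices exchange discovery tokens, the session reports
// distance (meters) and, when available, direction to the peer. An AR layer
// could turn that into on-screen routing toward a person.
final class PeerFinder: NSObject, NISessionDelegate {
    let session = NISession()

    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        for object in nearbyObjects {
            print("Peer distance:", object.distance ?? -1,
                  "direction:", object.direction ?? SIMD3<Float>.zero)
        }
    }
}
```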

Speaker 2:

Yeah, or halo them somehow in a 3D view. Halo people who are from your company, who have opted into this, so you can go up and make introductions.

Speaker 3:

Or halo people who may be good recruits, or people you've flagged that you want to talk to. There could be a number of uses for that type of micro-location technology combined with augmented reality.

Speaker 2:

Halo my car in the parking lot so I can go find it at the airport. So, Jack, any closing thoughts? We talked about where we've been, where we think it's going, improvements in technology, how companies are using it and where it's being successful. Is there anything you want to close with?

Speaker 3:

You'll see a continued uptick in the number of companies doing augmented reality. We have utilities doing augmented reality, airlines using it. Utilities are using it to show customers what equipment will look like on their property. So the utility says, hey, we want to put a new transformer in your backyard, and the customer says, heck no, it's going to be a big ugly green box, I don't want that. Then they pull out this AR app and say, okay, this is where we're going to put it and this is how big it is, and they walk back to the house, look at it through augmented reality, and realize, oh, it's completely hidden by the trees and the bushes — that's fine. It reduces that customer friction and allows the utility to move ahead with their grid improvement plans at lower cost and more quickly.

Speaker 2:

Yeah, when you said that, I was thinking of looking behind walls, under floors, under roads, under my yard, to know where wires and cables are run, so I know where not to dig or drill.

Speaker 3:

That is an aspect of it; AR can be used for that as well, showing where the infrastructure is located. There are precision concerns with using consumer devices for that type of location, though, because a consumer device may be off by several feet, which could be problematic — up to being fatal — if you dig a couple of feet away from where you should be digging.

Speaker 2:

Good point. So were there other ones you were mentioning? You talked about energy utilities. Were there any others?

Speaker 3:

There's a lot more wayfinding happening with augmented reality. It's a much more natural way to find your way through an unfamiliar facility: rather than looking down from a bird's-eye view onto a blue dot and a blue line, you're actually looking at a virtual dot or dotted line on the pavement and following it.

Speaker 2:

I think there's going to be a lot more if and when — we'll say when — the visors become as comfortable to wear as regular glasses, can last all day, and don't catch your hair on fire. I think we're going to see a lot more industrial use cases for this, and workforces that are further enabled. I know that a lot of companies have turnover in manufacturing or in warehousing and shipping, especially during peak seasons like Christmas, and it takes, you know, four weeks to train somebody up and then you lose them at the end of Christmas. So I could see, just from a training perspective, your first month on a job in a big facility having those built-in job aids.

Speaker 3:

Yeah, that can be a big win, and companies are using Google Glass for that now. Google Glass is still out there and available for industrial applications, and it is a good application of it. The HoloLens is used for that as well, and there are other systems used in the industrial space to reduce that training time. It's also a safety thing: okay, wait a minute, there's a warning here — you're looking at this particular device, it's under pressure, that kind of thing.

Speaker 2:

Well, thanks so much for all the information, Jack. Always a pleasure to have you on the podcast; I always enjoy talking with you about this stuff. For those of you listening, please subscribe to the podcast so you can be notified when we have more of these, and thanks for joining us.

Speaker 1:

Okay, thank you. Ready.

Speaker 2:

The entire contents and design of this podcast are the property of CapTech, or used by CapTech with permission, and are protected under U.S. and international copyright and trademark laws. Users of this podcast may save and use information contained in it only for personal or other non-commercial, educational purposes. No other use of this podcast may be made without CapTech's prior written permission. CapTech makes no warranty, guarantee, or representation as to the accuracy or sufficiency of the information featured in this podcast. The information, opinions, and recommendations presented in it are for general information only, and any reliance on the information provided in it is done at your own risk. CapTech makes no warranty that this podcast or the server that makes it available is free of viruses, worms, or other elements or codes that manifest contaminating or destructive properties. CapTech expressly disclaims any and all liability or responsibility for any direct, indirect, incidental, or any other damages arising out of any use of, reference to, reliance on, or inability to use this podcast or the information presented in it.