
Playing With Reality - Episode 2

What are the most exciting innovations in the augmented reality space? Is this technology a new stage of human evolution, where people will develop Supersight? Find out on this week’s episode of Playing with Reality.




Augmented Reality is one of the most exciting frontiers in modern tech. From Google Glass to Snapchat filters, it already plays a more significant role in our lives than many might think. New developments in hardware and AR’s opportunities for online community engagement have driven the field forward, putting us on the cusp of widespread use in everyday life. But are there any costs to this? And where did AR actually begin: with smartphone cameras, or simple eyeglasses? In this episode of Playing with Reality, we look to the future of this hugely exciting field, finding out about the most exciting innovations and companies currently on the market, while exploring why the enhancement of sight is going to dominate the future of popular technology.
Today’s Guests
David Rose
David Rose is an entrepreneur, MIT lecturer, author, and expert on all things to do with designing a better metaverse. He is currently CTO of Home Outside, an AR tool that will allow people to redesign their backyards, and before that, he founded other AR companies, like ClearWater. He has lectured on digital technologies at MIT and many other prestigious institutions.
You can find more about him through his website, where you can also order his book, Supersight: https://www.supersight.world/
Jason McDowall
Jason is an investor and entrepreneur in the AR space who has been working in the field since the mid-2000s. He is currently the VP Product and Head of Visual Experience at Ostendo Technologies, a company that has built a “Quantum Photonic Imager,” a display technology that optimizes the way in which consumers can view both worlds at the same time: the physical reality and the projected augmented reality.
He also has his own podcast, The AR Show, where he speaks to the brightest minds in AR to learn more about the cutting edge of tech in this field.
You can subscribe to the AR Show here: https://www.thearshow.com/



Episode Transcript

Speakers: Menno van Doorn, David Rose & Jason McDowall

[Music playing 00:00:00]

Menno: One day, you will be able to look through the ground to see what lies beneath your city. From sewers to the ancient borders of the Roman Empire, you will be able to navigate the world without a map.

This is a promise of augmented reality, a technology that includes everything from Snapchat filters and Pokémon GO, to navigation, virtual surgery, and much, much more.

But what does it mean to augment our vision? Can we call it a new stage of human evolution, where humans will gain the power of Supersight?

Welcome back to Playing With Reality, a new podcast by information technology company Sogeti, with me, Menno van Doorn.

[Music playing 00:00:55]

When we think about playing with reality, augmenting our vision is probably one of the first things that comes to mind. To change how we perceive the world, for fun, to learn, to make life easier: this is the essence of what we want to explore in this series.

This new world of interactive technology is projected to reach some $90 billion by 2028, and there's so much excitement about its possible future growth. From Apple to Google, and Microsoft to Facebook, all the big players are coming out with glasses, goggles, and more, competing to become the leaders in this nascent field.

But to some, augmented reality might seem just like a fad; something which will peak in hype like Pokémon GO, then fall away as it is overtaken by the bigger beasts of the metaverse.

I want to know whether augmented reality really will be so important and influential in tomorrow's world of tech. And in order to know where AR is going, we need to know where it came from.

To get at the heart of this, I got in touch with an old friend of mine, who is an expert in AR, a friend who coined a term, “Supersight.”

So, David-

David: It's just us.

Menno: It's just you and me.

David: Yeah.

Menno: That's David Rose, an entrepreneur, MIT lecturer, author, and expert on all things to do with designing a better metaverse. I met David back in 2013, and we quickly got chatting about his work in this field, because he really has done everything.

His current role is as CTO of Home Outside, an AR tool that will allow people to redesign their backyards. And before that, he founded other AR companies like ClearWater, and lectured on Digital Technologies at MIT, amongst many, many other things.

I called him up to find out some more about what he perceives augmented reality to truly be: where it came from, and what falls under its technological umbrella.

And I still remember the smile on your face when we were in Austin a couple of months ago, when this guy said, “Let's not take a taxi, but take this electrical scooter.”

And you were actually the first one to grab one of those scooters. And I saw the smile on your face, like you were a child in a candy store.

So, have you always been like this, exploring the world through new stuff, new technologies?

David: I think I've always been an aggressive early adopter, and interested in the difference between how you predict something will make you feel or what it will do to your life or to your relationships, versus what it actually does.

Because I feel like we so often get that wrong. You imagine that the Alexa in your kitchen is going to allow you to play more songs and revisit your rock and roll past, but what actually turns out to be the case is that your kids start yelling at it and learn how to be rude. And your wife starts despising you for bringing new technology into the home. And it all changes once you actually start using the thing.

So, I'm interested in that dichotomy of, like, what you predict and what actually turns out to be the case.

Menno: Well, you are an explorer when we talk about this new technology. Could you explain maybe what is the essence of augmented reality according to you?

David: Well, I think the ultimate vision of augmented reality is that we will have a blended view of the world. One that captures computer-generated layers and our view of the real world. And they will be synchronized.

So, the computer-generated images will be on plane with the table or against the wall surface, or on your countertop, kind of in perspective, against the road in perspective, translating signs in context, so that we will be able to see what we normally see. And then another layer of interpretation on top of that.

But there are lots and lots of permutations of what this is kind of ending up being, with augmented reality, where we're kind of seeing signs and signals of this that are not this fully realized mixed reality.

So, for example, like a rearview backup camera is what probably most of us have experienced today as augmented reality. Like we see out of the back of our car, and it's not just the view from the back of the car. But it also includes these like lines on the road and the lines change as you turn your wheel.

So, it can kind of anticipate that you're going to hit the car next to you or the trike, or whatever's behind you, like that is augmented reality. Or blind spot warning systems, all of these things are kind of the super lightweight version of that.
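The dynamic guidelines David describes can be sketched with a kinematic bicycle model: the steering angle and wheelbase give a turning radius, and the predicted path is an arc of that radius. This is a minimal illustration with made-up vehicle dimensions, not any manufacturer's actual implementation:

```python
import math

def predicted_path(steering_deg, wheelbase_m=2.7, arc_len_m=5.0, n_points=20):
    """Points (x, y) along the predicted rear-wheel path, in metres.
    x is lateral offset, y is distance behind the car.
    Kinematic bicycle model: turn radius R = wheelbase / tan(steering angle)."""
    steer = math.radians(steering_deg)
    if abs(steer) < 1e-6:                      # wheel straight: path is a straight line
        return [(0.0, arc_len_m * i / n_points) for i in range(n_points + 1)]
    radius = wheelbase_m / math.tan(steer)     # signed turn radius
    pts = []
    for i in range(n_points + 1):
        s = arc_len_m * i / n_points           # distance travelled along the arc
        theta = s / radius
        pts.append((radius * (1 - math.cos(theta)), radius * math.sin(theta)))
    return pts

# Straight wheel: the guideline runs straight back from the bumper.
print(predicted_path(0.0)[-1])   # (0.0, 5.0)
```

As the wheel turns, the arc curves to one side, which is exactly why the painted lines on the backup camera bend as you steer.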

Menno: Yeah. So, that's the lightweight version, this augmented reality in our own lives today. There's also a historical context when talking about augmenting technologies; the media guru Marshall McLuhan talked about how technologies augment us in different ways.

Can you give some examples of the historical context of how we should perceive the augmented reality of today?

David: Sure. Well, I think many types of technologies are augmentations of the human. So, certainly, the glasses that you and I are wearing give us an ability that we wouldn't otherwise have, which is to read text that is close to our eyes.

And so, we all rely on this kind of a type of augmentation or a type of prosthetic, and we don't even think of this as a technology anymore. We just all have like reading glasses.

And so, I think of reading glasses or Gore-Tex or zippers or air conditioning; all of those are technologies that change us and change our world and change our comfort and change our abilities. And I think we should see spell-checking, language translation tools, and augmented reality the same way: these are all augmentations of a type.

Menno: So, would you say then that augmented reality as we talk about this new technology today is more a part of evolution of mankind or evolution of technology than a revolution?

David: I think that's the right way to position it. You have this slow Darwinian evolution that we see in the natural world. And I think technologies are just a faster paced evolution, for people that are lucky enough to be able to use them.

[Music playing 00:07:48]

Menno: I found what David said about prosthetics really interesting. Augmenting reality, rather than creating a virtual one, can feed into anything we use, from simple eyeglasses and car windows to implants and the most high-tech headsets on the market.

Some of that is still just theory, of course. I want to know more about AR as hardware, the kind of things we understand as augmenting our perceptions, because this is the essence of playing with reality.

I needed to speak to someone else in the AR world, to explain to me how we got there and where we are now; how do we get in a place where Supersight is possible, and what will it take to make it available for all?

So, I got in touch with Jason McDowall. Jason is incredibly passionate about all things related to AR. Not only is he an investor and entrepreneur in companies in this space, he also has his own podcast. It's called The AR Show, where he speaks to the brightest minds in AR to learn more about the cutting edge of tech in this field.

When he is not chatting into his microphone, Jason is the VP Product and Head of Visual Experiences at Ostendo Technologies. That's a company that has built a Quantum Photonic Imager. That's something that optimizes the way in which consumers can view both worlds at the same time; the physical reality and the projected augmented reality.

He was definitely the perfect person to tell me more about the roadmap to getting to a world where AR is fully integrated in our lives. So, hello, Jason.

Jason: Hi, it's great to be here.

Menno: Happy to have you here, I would say. Before we dive into all the details about augmented reality, first, I'm curious to know where your obsession with AR comes from.

Jason: I studied engineering in college. I went to Carnegie Mellon University to study computer engineering. That was the late nineties, and what I was really obsessed with at that time was the coming age of smartphones.

And for me, augmented reality is really an extension of this general idea of being more connected to the information on the internet around us, and being able to overlay that information.

I was first exposed to AR, mobile AR, around the 2010/11 timeframe. In the early days, I think it was Zappar, or one of those companies, that had produced this world-browser sort of thing, where you could hold up your phone and see these geo pins: pins placed based on the compass heading of the phone, on what direction you were looking. You could see, “Oh, there's the Eiffel Tower in that direction.”
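The geo-pin behavior Jason describes comes down to comparing the phone's compass heading with the bearing to a point of interest. A minimal sketch, with assumed coordinates and an assumed field-of-view value, not the actual code of any of those apps:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def pin_visible(user_lat, user_lon, heading_deg, poi_lat, poi_lon, fov_deg=60):
    """True if the point of interest lies within the camera's horizontal field of view."""
    b = bearing_deg(user_lat, user_lon, poi_lat, poi_lon)
    diff = (b - heading_deg + 180) % 360 - 180   # signed angular difference
    return abs(diff) <= fov_deg / 2

# Standing near the Louvre, facing roughly west: is the Eiffel Tower in view?
print(pin_visible(48.8606, 2.3376, 260, 48.8584, 2.2945, fov_deg=60))   # True
```

The same check, run for every pin in a database as the phone's heading updates, is enough to decide which labels to draw on screen.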

And fundamentally, it is about the use case. It is about how does it improve the human experience and our connection to our real world around us, by using the information that we have stored digitally.

Menno: Well, maybe the most important question or the thing I'm most curious about would be whether we will actually be able to create this world of Supersight, whether we’ll become Superman or Superwoman, what would be the roadmap of getting there?

Jason: So, from a technical, historical perspective, we as technologists have been imagining “what would be possible if …” since the late 1960s. It's been a long time. There wasn't a lot of progress for decades. It wasn't until the late eighties and the mid-nineties that we kind of took another step forward in terms of capabilities.

And that had a lot to do with our ability to compute: what is the imagery we even need to be able to see? We have to be able to compute that, and there's a lot more effort in doing that than in driving a simple 2D display, as well as in having to understand the real world.

And we kind of began to understand what all the necessary ingredients are to make something like this work. But at that time, computing just wasn't good enough. Optics weren't good enough. The first wave of VR was also in the mid-nineties. We had this first attempt, and then it kind of receded again for 10, 15 years.

And it came back in the early 2010s. And we had both Oculus on the VR side and Google Glass on the AR side come forward and say, “We're trying again. We think that the computing is now good enough to be able to deliver a decent experience, display technology is good enough. Optics is okay.”

And at that time, there was a lot of engineering put into these devices, and exploration of what they could be. Google had a very specific perspective on what good looked like. Microsoft had a very different perspective on what good looked like. They came at it from two very different angles in the world of see-through AR.

And Google took the perspective that wearability is key. The glasses have to be comfortable enough to be worn for some reasonable period of time, even if that means the capabilities of the device suffer considerably.

There was only one display, which kind of sat in front of your eye; it looked like you were a bit of a cyborg, but it was very lightweight, had a little camera on it, and connected to your smartphone.

And then with HoloLens, they took a different approach. It was effectively a fighter-pilot helmet with a lot of capabilities, and they tried to make it more human, less helmet-like and a bit more glasses-like. So, it kind of ended up halfway in between. That device had tons of capabilities in it; they made a really capable device, but it was not small.

And anyway, we just kind of saw, through those two companies, the experimentation that was happening at that time and some of the big challenges.

For Microsoft, it turned out to be really difficult to make it smaller, given the set of technology choices that they made. And Google seemed to not invest a ton; at least that's the external perspective.

We saw them release an initial version and it was in the consumer mind space for a while. And then it kind of regressed from the consumer space, but it lived on in the enterprise space, mostly within medical and warehouse.

They focused on simplicity, and they learned that there are some use cases where good enough can be useful, with that sort of very simple, wearable focus.

[Music playing 00:13:42]

Menno: With all the crazy technology going on behind it, it's amazing that something as relatively simple as comfort should be such a priority for the AR glasses manufacturers. And if they get the comfort right, Jason thinks it could hold the key to integrating these technologies into our normal vision.

So, what are the key use cases for AR then? What are the cutting-edge products currently on the market that are changing the way consumers think about the industry? David had some really interesting thoughts about this.

The thesis of your book, I would say, is Supersight: it's in your face. Is that what it's all about? Giving us super powers or super magic, or …?

David: Yeah, I think the question that I'm asking is what types of services will be enabled if we have cameras that are trained to see and recognize, embedded in everyday things, including in the wearables on our faces, and if the interaction is that intimate.

Menno: I think it would be nice also to have some examples, maybe of the work that you do with the eyeglass manufacturer, Warby Parker; can you explain what you're doing over there?

David: Sure. Well, at Warby Parker, because glasses are such a fashion accessory, it's a really highly considered purchase. Getting the prescription right is a quarter of the problem when people are buying glasses.

People do different things with their glasses. Some people are playing an instrument, or working on the computer a lot. And so, the size of the glasses also shapes how they perform in different situations.

So, when the iPhone X first came out, it had the front-facing camera that would unlock the phone with your face. That had the technology in it to give almost a lidar scan of the terrain of the face.

And with that data, 40,000 points that are cast against your face, we were able to kind of divine the terrain of the face, and then recommend glasses that would fit all of these details of your face.
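The fit-recommendation idea David describes (measure the terrain of the face, then match frames to it) can be illustrated with a toy version. The frame names, measurements, and tolerance below are invented for illustration; this is not Warby Parker's patented algorithm:

```python
def face_width_mm(points):
    """Widest lateral extent of the scanned points, where each point is (x, y, z) in mm."""
    xs = [p[0] for p in points]
    return max(xs) - min(xs)

def recommend(frames, width_mm, tolerance_mm=6.0):
    """Frames whose total width is within tolerance of the measured face width."""
    return [name for name, frame_w in frames if abs(frame_w - width_mm) <= tolerance_mm]

# A toy "scan": a handful of the tens of thousands of points a real depth scan produces.
scan = [(-68.0, 0.0, 10.0), (70.0, 2.0, 12.0), (0.0, 40.0, 55.0), (1.0, -30.0, 50.0)]
catalogue = [("FrameA", 136.0), ("FrameB", 142.0), ("FrameC", 150.0)]

print(face_width_mm(scan))                          # 138.0
print(recommend(catalogue, face_width_mm(scan)))    # ['FrameA', 'FrameB']
```

A production system would measure many more features (pupillary distance, bridge height, temple length) from the full point cloud, but the principle is the same: reduce the scanned terrain to measurements, then filter the catalogue.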

So, that was the first generation of the technology. And then the next generation of the technology that my team built was a Virtual Try-On. And Virtual Try-On now, I think is one of the kind of glowing examples for how AR is being used in the world.

So, we digitized the product line and wrote an algorithm, which we patented, to place these appropriately on your face. And it's been a huge success for the company, because people in the Warby Parker app can swipe down, open the camera, and see what those particular glasses look like on their face.

And then they can very quickly decide, “Oh, what do the smaller frames look like? What do more contrasty frames look like?” And then they can share that on social media or share that with their friends, and they can just get to the glasses decision faster.

Menno: Have you been surprised by the advice the system gave, for instance, to try other glasses?

David: I think, when we built the system for Virtual Try-On, we were mostly relying on the individual to make the judgment for how they look. But because I led the R&D team, we were also working on what's called a subjective neural network, which is the type of network that would give you fashion advice.

I think deepfakes and subjective neural nets are the bleeding edge of this technology, where you say, “Well, wouldn't I rely on a designer to give me advice on exactly which glasses look good on my face, or which furniture looks good in my house, or which landscaping looks good out front?”

And the truth is, if you can find designers that agree, then you can train a neural network to predictably give the same advice a designer would give, which is a little bit bonkers: that we're going to have technologies that really have the expertise to be an art director.

We were building a three-network system at Warby Parker that would look at your complexion, your face shape, your face size, and where you were in the world, and use all of those to try to predict which glasses you would “look best in,” or be most likely to be happy with.

[Music playing 00:18:22]

Menno: Whether it's fashion, gaming, or understanding our built environment, the possibilities seem endless. But comfort is king, and discomfort is today's reality.

Let's return to those smart glasses; what if Warby Parker could take the same technology it uses in the smartphone camera and put it in their specs to look at the world around you? This would be Supersight on another level, and the kind of thing we all associate with augmented reality.

I went back to Jason to find out more about what it takes to make successful AR-driven smart glasses from a hardware perspective.

I would call it the Holy Trinity of Supersight. You talk about three things: visual quality, information context, and device comfort. So, can you explain how these elements play together, and how difficult it is to get it all right in order to create these fantastic new glasses that everybody wants and that actually work?

Jason: Yeah. First, let me back up and throw out another analogy. In the PC era and the mobile era, there were two events that mirrored each other. The question was: is the winning device the one that's engineered for the enterprise, or the one that is underpowered and undersized, built for the consumer, but then upgraded over time to work well for the enterprise?

And in both the PC era and the mobile era, the answer was the consumer grade device was the winner. The entirety of our internet, our PCs, our laptops, everything is based on that consumer grade architecture that was developed in the PC era.

The same thing is true for the smartphone era. It was not the Nokia Communicator, or the early attempts by IBM, or the RIM BlackBerry device that won. Those were devices targeting the enterprise, because those were the people that were willing to pay for the value that they provided.

In these early attempts at creating this sort of value, the value was very clearly felt; the ROI was very obvious within the enterprise. But it wasn't those devices that won.

Menno: Yeah. I see coming what you're going to say now. It will happen again.

Jason: It’ll happen again. So, we have the same thing with smartphones: Android and iOS, the iPhone, they dominate the enterprise today. But ultimately, they took that consumer-grade device and upgraded it to be good enough for the enterprise.

The same thing I believe to be true for AR, for smart glasses. It will not be a HoloLens-like device that ends up winning the consumer or the enterprise. It will not be a RealWear-like device that ends up winning over the long term.

And it's not that these companies don't appreciate that. They understand that they're taking advantage of what's available today in terms of the appetite on the enterprise side and the technology that's available. But ultimately, the architecture that's going to win is a consumer grade architecture.

And the elements of a consumer grade architecture really revolve around wearability as the most fundamental attribute of the device. If it's not truly wearable, then it's not going to have a chance in the consumer realm.

And within that, we have to also accept that because small is very hard, capabilities are left out in the process of making it truly small.

So, these early devices are not going to be what Microsoft and the hype train collectively are selling as this fully immersive alternative reality that we'll be walking around in.

It's going to be limited, ultimately, in its capability. But that device is the one that's going to win over time; as we are able to increase its capability and increase its relevance for the enterprise, that's what's going to win, I believe, fundamentally.

Menno: If you take that comparison, will it be the glasses that only take a picture, which people like Snap … introduce to the market? What will be the minimal viable product to conquer the heart of the consumer?

Jason: That one is a little bit less obvious to me. I love the experimentation that Snap is doing right now, where they are creating a wearable pair of glasses that incorporates some technology, incorporates a camera. And then there's Facebook, or whatever they call themselves now, Meta.

Their device is kind of also directionally similar, in that they're trying to incorporate some technology and get people comfortable with the idea of wearing something that incorporates this sort of technology.

And it's a great place to learn. It's a great place to learn about the human comfort side of things. And kind of going back to this trinity that you talked about, device comfort, visual quality, and information context: this notion of device comfort is really paramount. Visual quality, of course, is this ability to see clearly, both the digital and the physical real world.

Sometimes in some of these experiments that I've seen, you have an okay digital picture, but you can't really see the real world very well or vice versa. And device comfort is really about this idea of being able to wear it for an extended period of time. And information context is about delivering timely and relevant information.

[Music playing 00:23:15]

Menno: If the ultimate goal is to have people wearing smart glasses all the time and everywhere, then having a product which people can wear for an extended period of time is vital. And this consumer-centric approach to their creation has come alongside developments in miniaturization and more.

But what else will happen in the future of this technology? Where is it going next? In this series, we are taking you into the heart of the metaverse. So, I wanted to know where AR stands; how will the real world metaverse, as we might call AR, help us to navigate the world? I asked David for his thoughts on this.

So, what's the part of the metaverse that you actually like?

David: I am very interested in and would like to talk about two projects that I'm working on, that are all about the real world metaverse, that is the AR metaverse.

Menno: Why do you call it the real world?

David: Because the assumption is that the ground plane that we're designing on top of is the world, rather than nothingness. And one of the projects that I'm working on is about navigating the real world. I do feel like that may be one of the killer apps for AR: just getting around.

We've been using Google for decades to help us make sense of where we are in the world, where we're going, and the best way to get there. I feel like that layer of information, mapped to the real world, has been incredibly useful and has really changed all of our lives.

Menno: So, it's all about navigating the real world.

David: Well, I feel like for navigating the real world in a car, we've kind of done the experiment and that is one of the killer apps for the metaverse. I'm working on a project right now for the water world, it's called ClearWater AR.

And the first fantasy that we had was: boy, what if we have digital elevation models, if we understand the terrain of what's below you, which has been mapped out by boats with sonar, or by satellites with lidar that penetrates the water column?

Menno: Yeah.

David: So, we have the mountains that are below the water mapped, and people would love to see those. If you're a boater and you don't want to hit rocks, or if you're a fisher person, and you'd like to see where the underwater structure is, like you'd love to be in a glass bottom boat, regardless of the lake's transparency, in order to see that world below.

So, we have a system where you can hold up your phone or put on glasses and you basically see the topo lines or the terrain of the world below, which is a type of navigation. And you can also use it to safely get from point A to point B in the world, given tides and currents and wind and other boats and all the other complexities of the water world.
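The safety side of what David describes, knowing whether there is enough water under the keel along a route, can be sketched against a depth grid. The grid values, boat draft, and margin below are invented for illustration; this is not ClearWater AR's actual system:

```python
def route_is_safe(depth_grid, start, end, draft_m, margin_m=0.5):
    """Walk cell by cell along a straight route and flag any point too shallow.
    depth_grid holds water depth in metres; start/end are (row, col) cells."""
    (r0, c0), (r1, c1) = start, end
    steps = max(abs(r1 - r0), abs(c1 - c0))
    for i in range(steps + 1):
        r = r0 + round(i * (r1 - r0) / steps)
        c = c0 + round(i * (c1 - c0) / steps)
        if depth_grid[r][c] < draft_m + margin_m:
            return False, (r, c)          # shoal: not enough water under the keel
    return True, None

depths = [
    [5.0, 4.0, 3.5, 4.2],
    [4.8, 1.2, 3.9, 4.5],   # a 1.2 m shoal at row 1, col 1
    [5.1, 4.9, 4.7, 5.0],
]
print(route_is_safe(depths, (0, 0), (2, 3), draft_m=1.5))   # (False, (1, 1))
```

A real system would work on high-resolution bathymetry and also fold in tides, currents, and traffic, as David notes, but the core question at every point is the same depth-versus-draft comparison.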

Menno: I would call it Supersight.

David: Yeah. Well, actually, I was presenting on the ability to see into the human body for surgical procedures, and through the concrete of the city to understand its infrastructure, when somebody approached me after the conference and said, “Boy, could you do the same for water?” And I said, “Yeah, yeah, we could …”

[Music playing 00:26:42]

Menno: AR for navigation, again, shows that technology follows consumer preferences. So, the future of Supersight is the future of our own preferences, but when is it going to happen?

Both Jason and David are working on some truly groundbreaking tech, but will it be used tomorrow, next year maybe? Is that full integration something we should celebrate? Or should we be nervous about it?

Because a world where everyone is constantly plugged into AR, seeing a warped vision of reality, could come with a downside. I asked Jason when he sees this technology being fully integrated into our lives.

But if we look at the future, and maybe that's the twist in our conversation now: what's the end point, and when, if ever? Let's assume we are all positive about this technology; will these technologies become fully integrated in our lives? Can you give me a ballpark figure, a year?

Jason: A year. Okay, fully integrated, meaning that it has some sort of, what, 20% penetration? 50%?

Menno: Oh, let's be ambitious, 50%.

Jason: I'm going to go with 2034. Very specific.

Menno: I’m not going to ask why specifically.

Jason: So, there are a lot of things that have to happen between here and there. I'll just highlight a couple of those things. I think in the near term, we're going to see a lot of emphasis around devices that people call mixed reality devices.

They're basically VR devices that have good cameras and better software processing on them, which allow you to take in the real world and then mix that with the digital screen that you're staring at. So, you're looking at the real world through a screen.

And these devices are going to get better and smaller over time, but they're still going to be VR devices, that will allow you to mix in the real world. And those will be amazing platforms for entertainment and education, and maybe even productivity, but they're still living room sorts of devices or conference room sorts of devices.

We're not going to be walking down the street with a device like this, I don't believe at all. So, if we see these sorts of video pass-through VR devices that people will call mixed reality in the mid-2020s, 2023/4/5/6, then with those devices, we're going to get good at understanding how far they can go in terms of their capabilities, and we'll get better at how to mix the real world with the digital.

And so, we're still early in this adoption of head-worn technology. I think that even as we get the technology ingredients themselves to be good enough, there are still years of experimentation around how do we make it a great experience? How do we put all these things together into something that's truly fantastic, that consumers will eagerly go out and adopt?

And I think we need a few years of that sort of iteration beyond the initial adoption, the initial availability of the core ingredients from a technology perspective.

Menno: I was thinking, isn't this just the opposite of Supersight? Isn't this just narrow sight? Aren't we, in the end, narrowing down what people should pay attention to, instead of broadening our view without any technology?

Jason: So, you speak to something very fundamental. I think about the human experience and interaction with the real world. How much technology should there be, and how much should it get out of the way?

Is it really just a facilitator of that real world human experience, or is it meant to replace real world sort of experience?

I agree that there is a risk that we become fully immersed. We'll get to a point where we're going to experiment with it; somebody will create some sort of device, we'll play with it, and we'll realize: you know what, for the vast majority of us, we still enjoy the real world. And it's about enhancing, uplifting that a little bit, so that we can make better decisions, faster decisions, wherever that happens to be.

I think the bigger risk from a dystopian future perspective, is the fact that these devices in order to deliver great experiences, have to have a lot of awareness of what's happening with you physically.

Where you are in the world, how you're feeling in that moment as can be perceived from your head motion, your eye motion, or the way that your pupils contract — these sorts of things are all going to be inputs that are being captured by these devices.

And the dystopian version of this future is that all that information gets sent back to some nefarious, overarching technology provider who wittingly or unwittingly shares that information with some organization who is intent on changing your opinion about something.

And this could be anything from benignly suggesting you go test drive a particular car because you think it looks great, to changing the way that you vote, or changing your opinion about some political hot topic. And that is the bigger risk: manipulation in a way that we are not consciously aware of and don't directly sign up for.

[Music playing 00:31:29]

Menno: So, in 2034, by Jason's reckoning, we are all going to be manipulated by our glasses as well. Is this going to be a world where the barrage of information is never out of sight? It's certainly a terrifying thought, but let's not end on a dystopian note.

If we want to play with reality, it should be about a better future, one built around enhancing and uplifting our normal lives. Because that is the key thing that I have learned from this episode. AR has the potential, maybe more than any other technology in this space, to provide real world help in the here and now.

Built around comfort of use and helping in everyday tasks, AR has the capability to transform the real world as we know it. It could create a real world metaverse where we live, not as avatars, but as people adapting the world around us, using technology to make things easier, faster and better.

And what will that world look like? Well, I will let David have the final word with a utopian view of AR in the years to come.

Let's finish off with one last utopian view. Could you explore a little bit how we might improve the way we connect with each other, communicate with each other, and talk to each other? What kind of positive scenario would that be, or could it be?

David: Yeah, I have an image in the book, which is my envisioning of a conversational scaffolding system that might help people in those first moments of a conversation, when you want to find commonalities. Sometimes those happen naturally. And for some people, those happen very naturally.

But I think a lot of people could be helped by having a kind of metadata cloud orbiting around the heads of other people, visually like a swarm of bees, that shows you: these are the points of commonality.

It's often the way you create rapport. Think about the exquisite party host friend of yours who says, “Oh Menno, you have to meet Lisa, you both share this bizarre interest in foil-board kite surfing,” or something that nobody else in the room does.

So, I see this kind of service as being like the exquisite party host who finds those uncommon commonalities. And you might ignore them; it doesn't have to be a conversational topic, but at least it allows you to see, “Oh yeah, we both went to super tiny, fringy colleges.”

Menno: You mean like a conversational starter?

David: Yes, yes. I really feel like it's one of the most interesting use cases of augmented reality.

Menno: So, David, thank you so much.

David: Sure, this has been fun.

Menno: That's all for today. Thank you so much for listening, and to both David and Jason for their beautiful insights. Make sure to pick up David's book, Supersight, and to tune in to Jason's podcast, The AR Show, if you want to learn more about what they're both up to.

Next time, we'll be going deeper into the metaverse, looking at its origins and early incarnations to create a roadmap which will lead us to the current state of play in the world of VR.

Do join us again, next time on Playing With Reality.

[Music playing 00:35:09]