The MetaVerse: Applications

Tags: metaverse
Published: May 14, 2022
Author: Randall Hand
So last time we spoke about one of the core pieces of the MetaVerse: the ability of a device to recognize where it is, and to know what augments are available to it. That's well and good, but how do we build these augments in a flexible, constructive way that can respond not only to the environment, but to the user?
Well, there are lots of companies throwing their weight around in this space. Depending on who you follow on Twitter, you may believe that Snapchat lenses are the future of XR. Or maybe it's the systems coming out of Niantic. Or, like so many people, you believe it's Unity or Unreal, the gaming engines. It makes sense in a way: they already have all of the 3D rendering and physics capabilities, so why wouldn't they be the natural solution?
In short, it's too hard.  You might think to yourself "Unity? Hard?" but let me continue.
The MetaVerse has to be built on a set of open APIs. The language and tools used to build these systems can't be owned by any one company, any more than there's a single source of C++ or JavaScript tooling. There are big players and tiny players, each serving different use cases. It will be the same for the MetaVerse.
There will be experiences that remap the world around you to look like Hogwarts, where you might find a dragon or a dementor around every corner, and for those experiences maybe Unity is the right choice. But there will be equally many, probably even more, experiences that are much more mundane and "usability" focused. Where, near your location, can you knock items off your shopping list? What about the bus or subway schedule? Maybe you just want to order some coffee at Starbucks using their MetaVerse ordering system, and get directions to where you can pick it up. Or maybe you just want a little AI-driven bot to follow you around and be available, like a virtual Alexa or Siri.
Unity isn't the best solution for these systems, which need to be lightweight and responsive to both the user and the environment. There's a whole world of new systems for "intelligent environments" coming in the near future that we have only just begun to see. If you pop open a "window" in a MetaVerse context, you don't want it Z-fighting with reality, or hidden behind physical objects and causing migraine-inducing focus problems. Just like Windows went through a few years of learning all about smart window placement, we're going to deal with the same thing in XR content, and no application developer is honestly going to want to deal with it. It's an OS and "Window Manager" problem, and should be dealt with as such.
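To make that concrete, here's a rough sketch of what an app-facing API for that division of labor could look like. To be clear, this is pure speculation: xrShell, requestSurface, and every option name here are things I've invented to illustrate the idea that the app describes what it needs and the shell decides where it goes.

```javascript
// Purely hypothetical sketch: "xrShell" stands in for an XR window manager
// that doesn't exist yet. The app states its requirements; the shell owns
// placement, so Z-fighting and occlusion are the OS's problem, not the app's.
async function openPanel(xrShell, appContent) {
  const surface = await xrShell.requestSurface({
    kind: "panel",
    preferredSize: { width: 0.6, height: 0.4 }, // meters
    constraints: {
      avoidOcclusion: true, // never place it behind physical objects
      minDistance: 0.75,    // meters from the user, to avoid focus strain
      anchor: "user",       // positioned relative to the user, not a fixed wall
    },
  });
  surface.render(appContent); // the shell owns placement; the app owns content
}
```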
In addition, there's a whole new set of interfaces required in XR space that we don't have good solutions for today. Some experiences will be user-triggered, doing smart things like starting when you get close enough, or when they detect they "have your attention". The exact logic of how these work will be tricky, but it requires a standard set of triggers that can be shared between Devices, Applications, and Cloud Services.
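Something like this, maybe. Every name below is an assumption on my part, not an existing spec; the point is that the trigger descriptions are declarative, so a device, the app itself, or a cloud service could all evaluate them the same way.

```javascript
const { EventEmitter } = require("events");

// Hypothetical shared trigger format: declarative descriptors that any layer
// of the stack could evaluate. All of these names are invented.
const triggers = [
  { type: "proximity", radiusMeters: 5, event: "user-nearby" },
  { type: "attention", gazeDwellMs: 1500, event: "has-attention" },
];

// The app only reacts to the resulting events; the platform owns the tricky
// logic of deciding when they actually fire.
const experience = new EventEmitter();
experience.on("user-nearby", (userId) => console.log(`preload for ${userId}`));
experience.on("has-attention", (userId) => console.log(`start for ${userId}`));

// Simulate the platform firing the first trigger:
experience.emit(triggers[0].event, "user-123");
```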
I think it's no coincidence that JavaScript frequently tops lists of the most popular programming languages. Its ability to run in almost every environment, and its wide collection of tooling and libraries, make it a great swiss-army-knife of a language. That also puts it in a unique place to serve XR content. A "MetaVerse React" framework would be something that could be deployed to a serverless infrastructure, connected to a Map Server, with a combination of backend and frontend code to build an experience.
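To give a flavor of what I mean, here's a sketch of the frontend half. Panel and useMapAnchor aren't real APIs; I'm inventing them to show that this could be ordinary React, rendered into the world instead of a browser tab.

```jsx
// Hypothetical "MetaVerse React" component: normal React patterns, but the
// output is a pane of glass anchored in physical space. <Panel> and
// useMapAnchor are invented names, not a real library.
function CoffeeOrder({ menu, onOrder }) {
  // An anchor resolved via the Map Server: "put this panel at the counter."
  const anchor = useMapAnchor("counter-pickup-zone");
  return (
    <Panel anchor={anchor} widthMeters={0.5}>
      <h2>Order ahead</h2>
      {menu.map((item) => (
        <button key={item.id} onClick={() => onOrder(item)}>
          {item.name} ({item.price})
        </button>
      ))}
    </Panel>
  );
}
```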
Imagine this: as a developer, you get an API key from your chosen Map Server. You load that API key into your node+React application. Now your application can get webhook callbacks and launch when a user enters the "interaction zone" around your app, and serve up content. Your application can also post limited information back to the Map. Not physical map details, but metadata tags like basic state information. Some of the information will be global ("App in use", "Down for maintenance") and some of it will be instanced (what the user is currently looking at, the state of the UI in their experience), and the app has the ability to decide how it handles all of this. Some information will be personal, some will be shared. Sometimes I hit a button and only I know; sometimes I hit a button and everyone sees it change state. If it's a cart checkout app in a store, I'm probably the only one that sees all the details, but if it's a virtual dartboard then everyone should be able to see what's going on. JavaScript and the related frameworks are well suited for this, but this is all built on open APIs, so there will always be people building similar experiences in other frameworks (Unity, C, etc.).
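The backend half might look something like this. The webhook path, the payload shapes, and the Map Server endpoint are all assumptions for illustration, not a real service.

```javascript
// Hypothetical backend: a Map Server webhook fires when a user enters the
// interaction zone; the app creates per-user (instanced) state and can post
// global state back to the Map. Every endpoint here is an assumption.
const express = require("express");
const app = express();
app.use(express.json());

const sessions = new Map(); // instanced state: one entry per user

app.post("/webhooks/zone-entered", (req, res) => {
  const { userId, zoneId } = req.body;
  sessions.set(userId, { zoneId, ui: "menu" }); // personal, per-user UI state
  res.json({ launch: `https://myapp.example/session/${userId}` });
});

// Global (shared) state: what every passing device sees on the Map.
async function publishGlobalState(status) {
  await fetch("https://mapserver.example/apps/my-app/state", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.MAP_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ status }), // e.g. "in-use" or "maintenance"
  });
}

app.listen(3000, () => publishGlobalState("available"));
```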
JavaScript as a 3D rendering platform will seem foreign to many people, but you have to realize that the bulk of MetaVerse use cases won't actually be 3D. It'll be what we do today in 2D, in 3D. By that I mean effectively "floating panes of glass" with minimal 3D effects. Virtual posters and checkout lines in a store, enterprise dashboards and oversight use cases, even most training and medical use cases will be flat panes of glass with the occasional arrow into 3D. There will be a few companies that take it the extra mile to fully 3D-augment things, and they'll be successful. But we'll find the market flooded with lower-fidelity experiences that provide sufficient value and meet people's needs.
I estimate it will take about 10 years in this space before we start to see the really wild 3D stuff. Look at the smartphone industry, with Apple and Android as the primary examples: they all basically started as glorified web browsers. In fact, a "good web browser" was a primary selling point of the original iPhone. When apps first became a thing, most of them were just packaged websites. There were a few exceptions, Angry Birds and the like. But 95% of the content, and the user engagement, was in Twitter, Facebook, note-taking apps, mapping apps, and all of the other things that people actually needed. Games are a good entry point and make flashy advertising, but you have to provide value in order to maintain engagement.
And User Engagement is going to be the key metric for the MetaVerse. How do you get users to use your experience? And use it often? Because when users are in your experience, you're collecting data not just about their shopping habits or their browsing history; you're collecting data on The User. Their home, their workplace: how many square feet is it? How cluttered is it? What brands of appliances do they have? What colors do they like? Are they sluggish or hyper? In their room, what do they look at regularly? This is why companies are pumping millions of dollars into VR, AR, and XR tech companies. This is the dark secret of the MetaVerse that gets companies and advertisers excited, and it's what they're now working to build programs for...
 
How do they convince you that you need this?