(Originally posted on Medium)
Apple's Worldwide Developers Conference this year was a big deal for builders in its AR ecosystem…
After receiving a slew of texts and emails over the past week about Apple's WWDC, I thought it might be fun to write about some of the more interesting takeaways we are focused on as AR/VR/gaming builders. Many of my friends, and even a couple of retailers, reached out asking how these developments impact us and others in the space.
There were many exciting aspects of WWDC22 on a technical level, but we're going to focus on some of the larger implications we took away, and why Apple's announcements at WWDC are investments in well-positioned startups (like Realize!).
Yes, Apple is absolutely attempting to destroy any competing AR ecosystems
Niantic's Lightship ARDK has absolutely been competitive, if not winning in many respects, when compared to Apple's ARKit. It's cross-platform (build once, deploy on both iOS and Android), enables relocalization (map once, use forever), and has been pushing boundaries on technical specifications (e.g. instant depth sensing).
So, what did Apple do to clap back? They did what they always do: match the technical specifications of their competitors, then pour the rest of their effort into a gorgeously designed interface for something many companies (Realize included) have been doing for a year or two — mapping a room in seconds. The demo was unreal; the dollhouse map that shows up at the bottom of the screen looks like something out of a Christopher Nolan movie. Classic Apple crushing it on execution and delivery.
This time, Apple's near-flawless presentation is supported by a thoughtful back-end experience. The UI/UX work they've done is a massive lift for apps that rely heavily on ARKit's mapping functionality. User education is one of the biggest inhibitors of mass adoption right now. At Realize, our power users are college students. They can't get enough of our product. Why? Because although it may be their first time mapping a dorm room, it is almost never their first time using an AR app.
Apple’s elegant and native approach opens the door for more first-time Realize users to arrive at our doorstep familiar with a best-in-class AR/VR experience.
Apple is recommitting to the developer community
It's been a meme for years in the developer community that real software engineers can't love Apple. From a stringent, inflexible history of App Store approval processes down to the fact that you can't even upgrade the RAM on your own machine, Apple has, for the longest time, given developers a nice big F-You by refusing to listen to the terabytes of feedback begging them to be reasonable.
WWDC22 showed that all hope is not lost.
Let's talk improvements to Metal (if you already know what Metal is, feel free to skip ahead).
What is Metal?
Metal is Apple’s framework for accessing an iPhone’s GPU.
Why do GPUs matter?
A GPU handles the enormous number of tiny mathematical computations that let things like high-fidelity graphics rendering and neural networks (machine learning models) run at high speed. When you're doing AR, you generally need both, and the faster they run, the smoother and more accurate the user's experience will be.
So… Metal is the framework to access the GPU…?
Here's the important detail to take away when you hear about improvements to the Metal framework: Apple is essentially giving developers more tools and more flexibility to optimize the applications they're building for iPhones.
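To make that concrete, here's roughly what "accessing the GPU" through Metal looks like at its most basic. This is a minimal sketch for orientation, not anything Apple showed at WWDC:

```swift
import Metal

// The most basic Metal handshake: get a handle to the GPU and a queue
// for submitting work (render passes, compute kernels, etc.).
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("This device doesn't support Metal")
}
let commandQueue = device.makeCommandQueue()
// Every frame of rendering or batch of ML math flows through
// command buffers created from this queue.
```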
Of the various improvements to Metal, the focus on memory, battery life, and speed seemed like an indicator that Apple is finally listening to its developer community.
The stats touted for the new Metal improvements on the A15 Bionic chip are incredible. Using the new framework for lossy compression, Apple claims a 50% reduction in texture sizes versus the old methodology (lossless compression)… 50%! That makes a world of difference when you're building a game that renders hundreds of textures at any given moment. It also means that for any given texture, the workload on the GPU decreases, meaning MORE BATTERY LIFE!
Now, of course, this whole lossy-versus-lossless choice comes with a tradeoff: you're giving up some image quality in exchange for memory. But the point is that Apple is focused on letting its developers make those tradeoffs.
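For the curious, opting into that tradeoff is essentially a one-line change on a texture descriptor. Here's a minimal sketch; the pixel format and dimensions are arbitrary placeholders, and you should check Apple's Metal documentation for the exact constraints:

```swift
import Metal

// A minimal sketch of opting a texture into Metal's lossy compression.
// Sizes and format here are arbitrary; lossy textures have constraints
// (e.g. they must live in GPU-private storage).
func makeLossyTexture(device: MTLDevice) -> MTLTexture? {
    let descriptor = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .rgba8Unorm,
        width: 2048,
        height: 2048,
        mipmapped: true
    )
    descriptor.storageMode = .private       // required for lossy compression
    descriptor.usage = [.shaderRead, .renderTarget]
    descriptor.compressionType = .lossy     // trade some image quality for memory
    return device.makeTexture(descriptor: descriptor)
}
```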
Prioritizing developer autonomy over something like "consistency and quality of apps in the App Store" (a tagline they've hidden behind for decades to excuse refusing to cater to developers' requests) implies something important. Perhaps Apple, the de facto governing body of ~50% of the mobile internet, is learning how its policymaking needs to change. Maybe we're seeing them realize that if they want developers, they need to empower those developers.
… maybe.
People are starting to pay attention to the space
When Apple hypes up a space, many companies feel like they need to scramble to keep up. Amazon announced their “virtual try-on” AR stack for footwear as Apple was unveiling new AR capabilities at WWDC.
Bigger companies are putting AR/VR build-outs on their roadmaps. VCs are circling dollars to invest in startups building experiences in these AR ecosystems. Heck, Fidelity has a metaverse ETF! If you're building in this space, there is no better time to be starting conversations… this topic is on everyone's radar right now.
So, does a big marketing splash really mean that much has changed? Well, in this case, it seems to be accelerating the pace of adoption, and if you know anything about inflection points in technology, you know that adoption is generally the most critical domino to fall (and almost always the last).
So, what about Realize?
Wow, so kind of you to ask!
RoomPlan Demo — good or bad for you?
Verdict: Definitely good for us.
Real-time segmentation and a brilliant UI, all wrapped up in Apple's ARKit SDK. We are thrilled to be pulling these improvements into our scanning process.
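For a sense of how lightweight that integration is, here's a hedged sketch of the new RoomPlan flow (iOS 16+). The view controller name and export path are our hypothetical placeholders; RoomCaptureView, RoomCaptureSession, and CapturedRoom are Apple's types:

```swift
import UIKit
import RoomPlan

// A minimal sketch of driving a RoomPlan scan. Error handling is
// pared down for brevity; ScanViewController is a placeholder name.
final class ScanViewController: UIViewController, RoomCaptureViewDelegate {
    private var captureView: RoomCaptureView!

    override func viewDidLoad() {
        super.viewDidLoad()
        captureView = RoomCaptureView(frame: view.bounds)
        captureView.delegate = self
        view.addSubview(captureView)
        // Kick off the guided scan with default settings.
        captureView.captureSession.run(configuration: RoomCaptureSession.Configuration())
    }

    // Let RoomPlan post-process the raw scan into a parametric model.
    func captureView(shouldPresent roomDataForProcessing: CapturedRoomData, error: Error?) -> Bool {
        return true
    }

    // Receive the finished CapturedRoom (walls, doors, furniture bounding boxes).
    func captureView(didPresent processedResult: CapturedRoom, error: Error?) {
        guard error == nil else { return }
        let url = FileManager.default.temporaryDirectory.appendingPathComponent("room.usdz")
        try? processedResult.export(to: url) // hand off to our own pipeline from here
    }
}
```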
I know what you're thinking: "But what about the copycats? Anyone can make a room scanning app now!"
Yes, that is absolutely true, and it has always been true. We didn't invent SLAM. We aren't the only ones who know how to do semantic segmentation. Our IP and value creation have never been wrapped up in the scanning/capturing process. We've been fortunate to simply build on the foundation that Apple, Unity, and very smart PhDs at Oxford have been laying down.
At Realize, we transform room scan data into an intuitive and compelling shoppable experience for our users. Marrying the physical and the digital was done a long time ago by Matterport, but what they couldn’t figure out was how to turn that digital space into an ongoing experience.
We are obsessed with learning from our early customers. We hear phrases like “It’s like the Sims, but for real life.” We unpack these phrases with our customers, and in turn find that Realize is something that will help them make progress in their actual lives, like turning an empty dorm into a place that actually feels like home.
That value comes from a catalog of 10k+ real products that you can actually buy. It comes from spending all of our free time with users, understanding their pain points, and building something that helps them make the progress they're looking for. It comes from experiences that delight users in such a way that they help us onboard and troubleshoot for other users; they do our marketing for us.
And THAT is an awfully difficult thing to copy.
Won’t this hurt your ability to partner with big companies? They can just build what you have themselves.
Verdict: A cautious "no."
I've yet to see a single conversation with a retailer go dark because they decided to just build it themselves. In fact, what I've heard more often than not is some variation of "the metaverse is so big, no one is going to be able to own it all."
And the reality is that no one should.
What is Realize’s biggest takeaway from WWDC?
Excitement.
Apple is setting the stage for more users to come to Realize familiar with best-in-class AR/VR experiences, and we need to be ready for them.