I haven’t written a blog post in a while. Over the past six months, every time I tried to pontificate on the topic of Augmented Reality, some major new development would occur. I have a bunch of scrapped posts sitting in Google Drive that are now totally irrelevant. Cruising through December, I figured the coast was clear. I was considering writing a dull year-in-review post when the final paradigm shift occurred with Snap’s release of Lens Studio. So, let’s try and get this out before it’s obsolete!
The Return of Smartphone AR
Smartphone AR is definitely back. After Apple’s announcement, everyone wanted to talk about ARKit. Even though we developed the award-winning Holographic Easter Egg Hunt for HoloLens with Microsoft this past spring, discussions with clients and investors became laser-focused on smartphone AR instead of mixed reality.
It looks like 2018 will be a big year for these platforms while mixed reality headset makers gear up for 2019 and beyond. Because of this renewed interest, now is a good time to investigate your options if you’re looking to get into smartphone AR.
ARKit and ARCore
Despite being announced after Facebook’s AR Camera Effects platform, it was really Apple’s ARKit announcement that set off this new hype cycle for smartphone AR. Google’s announcement of ARCore for Android seemed like a me-too move, but it was also quite significant.
This isn’t about ARKit versus ARCore, since there is no real competition: they do similar things on different platforms. ARCore and ARKit share a common set of features but implement them in ways that are subtly different from the user’s perspective. Because of this, it’s not especially difficult to port applications between the two platforms if you’re using Unity.
The biggest limitation of both ARKit and ARCore is that when you quit the application, it forgets where everything is. Although you can place anchors in the scene to position virtual objects in the real world, there is no persistence between sessions. I suspect ARCore may advance more quickly in this department, as Google’s ill-fated Tango technology had this capability in its SDK for years. I’m assuming we’ll see more and more Tango features merged into ARCore in 2018. Rumors suggest ARKit 2.0 will see similar improvements as well.
ARKit does one-up ARCore with the addition of face tracking on the iPhone X, the most advanced facial tracking system currently available on a mobile phone. However, it’s only on one device, albeit a wildly popular one. ARKit’s facial tracking produces results far beyond current mask filter SDKs because it builds a mesh of your face using the TrueDepth camera. Still, there doesn’t seem to be a reason why many of the basic facial tracking features couldn’t be brought over to phones with standard cameras. Maybe we’ll see a subset of these features trickle down to other iOS devices in the near future.
ARKit has far more penetration than ARCore. ARCore runs on a tiny fraction of Android devices, and this isn’t likely to improve soon. ARKit requires an iPhone 6S or above, but that’s still a large chunk of iOS devices. There is probably zero business case for focusing on ARCore first. If you truly need to develop a standalone AR app, your best bet is to target iOS primarily and Android second (if at all). If ARCore starts to gain Tango’s features ahead of ARKit, then there will be compelling use cases for ARCore-exclusive apps.
Facebook Camera Effects Platform vs. Snapchat World Lens
When ARKit was first announced, I had a few meetings at large companies. They all thought it was cool but didn’t want to develop standalone apps. Getting users to download yet another app is expensive and somewhat futile, as most apps go unused after a few tries. There’s a lot more interest in distributing AR experiences inside apps people already have installed. Before Facebook Camera Effects was announced, the only option was Blippar, which really isn’t an option since hardly anyone uses it.
I got access to Facebook Camera Effects early on and was really impressed with the tools. Leading up to the public release, Facebook added a lot of features. I’ve seen everything from simple masks to full-blown multiplayer games built with Facebook’s AR Studio.
The great thing about the Camera Effects Platform is that you can distribute an AR experience through an app that already has hundreds of millions of users. Because of this reach, Facebook AR filters run on a huge number of devices, whether they have native AR SDKs or not, so a filter must be tested on a wide variety of phones to account for per-platform limitations and bugs.
What’s tricky is that after getting approval to distribute your filter, you still have to somehow tell users about it. Facebook provides a few options, such as attaching a filter to a promoted Facebook page, but discovery is still a challenge.
As the Camera Effects Platform opened to all, Snap released Lens Studio for both Windows and Mac, a tool that allows developers to create World Lens effects for Snapchat. I was really excited about this because a lot of clients were just not very enthusiastic about Facebook’s offering. I kept hearing that the valuable eyeballs are all on Snapchat and not Facebook, despite Snapchat’s flatlining growth. Brands and marketers were champing at the bit to produce content for Snapchat without navigating Snap’s opaque advertising platform.
As for distribution, you still have to go through an approval process, which includes making sure your lens is performant on low-end devices as well as current phones. Once your lens is available, you can link it to a Snapcode, which you can distribute any way you want.
Which should you develop for? Unlike ARCore and ARKit, Facebook and Snapchat have wildly different feature sets. You could start with a Facebook Camera Effect and then produce a World Lens with a subset of its features using detail-reduced assets.
The easier path may be to port up: start with a simple World Lens and then build a more elaborate Facebook AR filter with the same assets. Given how few people use Facebook’s Stories feature, I feel it may be smarter to target Snapchat first. Once Facebook’s Camera Effects Platform works on Instagram, I’d probably target Facebook first. It really depends on which demographic you are trying to hit.
Apps vs. Filters
Should you develop a standalone AR app or a filter inside a social network? It really depends on what you’re trying to accomplish. If you want to monetize users, the only option is a standalone ARKit or ARCore app: you are free to add in-app purchases and ads to your experience as you would in any other app, while Facebook’s and Snap’s guidelines don’t allow this on their respective platforms. If you’re using AR to create branded content, filters are usually ads in themselves, and if you’re trying to get as much reach as possible, a properly marketed and distributed AR filter is a no-brainer. A thorough mobile AR strategy may involve a combination of native apps and filters, and in the case of Facebook’s Camera Effects Platform, they can even link to each other via REST calls.
2018 is going to be an exciting year for smartphone AR. With the explosive growth of AR apps on the App Store and the floodgates opening for filters on social media platforms, you should include smartphone AR in your mixed reality strategy. Give your users a taste of the real thing before the mixed reality revolution arrives.