Apple ARKit – mainstream Augmented Reality

Ever wandered through a portal into another dimension, or wondered what it would look like if you could get inside a CAD model or an anatomy simulation? This is the promise of Apple’s new ARKit technology for Augmented Reality, part of iOS11, the latest version of the operating system software that drives hundreds of millions of iPads and iPhones.

Turning the IET into a Mario level using Apple ARKit

Augmented Reality has been around for years, but in quite a limited, marker-based way – point your phone or tablet camera at a picture with special markers on it, and the AR app will typically do something like play a video or show you a 3D model.

But anyone wanting to develop an AR app in the past has had to contend with a couple of big problems – firstly, the hardware in phones and tablets hasn’t quite been up to the job of real-time image processing and position tracking, and secondly, there hasn’t been a standard way of adding AR capability to an app.

With recent improvements in processor technology and more powerful graphics and AI co-processors being shipped in our devices, the technology is now at a level where real-time position tracking is feasible. Apple are rumoured to be including a sensor similar to Google’s Project Tango device in the upcoming iPhone 8, which will support real-time depth sensing and occlusion. This means that your device will be able to tell where objects in the virtual world are in relation to objects in the real world – e.g. is there a person standing in front of a virtual object?

Apple and Google are also addressing the standardisation issue by adding AR capabilities to their standard development frameworks – through ARKit on Apple devices and the upcoming ARCore on Android devices. Apple have something of a lead here, having given developers access to ARKit as part of a preview of iOS11. This means that there are literally hundreds of developers who already know how to create ARKit apps. We can expect that there will be lots of exciting new AR apps appearing in the App Store shortly after iOS11 formally launches – most likely as part of the iPhone 8 launch announcement. If you’re a developer, you can find lots of demo / prototype ARKit apps on GitHub. [[ edit: this was written before the iPhone 8 / X launch! ]]
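To give a flavour of what ARKit development looks like, here is a minimal sketch of how an app might start a world-tracking session using the SceneKit-backed `ARSCNView`. The class name is illustrative, and a real app would also add content to the scene and handle session errors:

```swift
import UIKit
import ARKit

// Illustrative sketch: start a basic ARKit world-tracking session.
class ARDemoViewController: UIViewController {

    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking gives six-degrees-of-freedom device tracking;
        // plane detection finds horizontal surfaces to anchor content to.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

Because tracking happens entirely on-device, those few lines are essentially all the boilerplate needed before you start placing 3D objects in the world – which is a big part of why so many demo apps appeared so quickly.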

As part of the Jisc Digi Lab at this year’s Times Higher Education World Academic Summit I made a video that shows a couple of the demo apps that people have made, and gives you a little bit of an idea of how it will be used:

How might people use ARKit in research and education? Well, just imagine holding your phone up to find that the equipment around you in the STEM lab is all tagged with names, documentation, “reserve me” buttons and the like – maybe with a graphical status indicating whether you have had the health and safety induction to use the kit. Or imagine a prospective student visit where the would-be students can hold their phones up to see what happens in each building, and giant arrows appear directing them to the next activity, induction session, students’ union social etc.

It’s easy to picture AR becoming widely used in navigation apps like Apple Maps and Google Maps – and for the technology to leap from screens we hold up in front of us to screens that we wear (glasses!). Here’s a video from Keiichi Matsuda that imagines just what the future might look like when Augmented Reality glasses have become the norm:

How will you use ARKit in research and education? Perhaps you already have plans? Leave a comment below to share your ideas.
