Thursday, July 25, 2019

Introduction to Augmented Reality and ARCore (4) - Bringing ARCore to life

Links
  1. Introduction to augmented reality (AR)
  2. The basics of AR functionality
  3. Taking the next steps with ARCore

Bringing ARCore to life


A closer look at the mechanics of ARCore

  • Surface detection allows ARCore to place digital objects on various surface heights, to render different objects at different sizes and positions, and to create more realistic AR experiences in general.
  • Pose is the position and orientation of any object in relation to the world around it. Everything has its own unique pose: from your mobile device to the augmented 3D asset that you see on your display.
  • Hit-testing lets you establish a pose for virtual objects and is the next step in the ARCore user process after feature-tracking (finding stationary feature points that inform the device's environmental understanding) and plane-finding (the smartphone-specific process by which ARCore determines where horizontal surfaces are in your environment). A minimal code sketch of hit-testing follows this list.
  • Light estimation is a process that allows the phone to estimate the environment's current lighting conditions. ARCore is able to detect objects in suboptimal light and map a room successfully, but it’s important to note that there is a limit to how low the light can be for the experience to function.
  • Occlusion is when one 3D object blocks another 3D object. Currently this is only possible with digital objects, and AR objects cannot be occluded by a real-world object. For example, in an AR game the digital object would not be able to hide behind a real couch in the real world.
  • Assets in multi-plane detection are scaled appropriately in relation to the established planes, and only need to be placed on them (via anchor points) when doing so makes them function like their real-world counterparts.
  • Immersion can be broken when users try to interact with AR objects as if they were physically real. Framing can be used to combat these immersion-breaking interactions.
  • Spatial mapping is the ability to create a 3D map of the environment and helps establish where assets can be placed.
  • Feature points are stationary and are used to further environmental understanding and place planes in an experience. ARCore assumes planes are unmoving, so it is inadvisable to attempt to anchor a digital object to a real world object that is in motion. In general, it’s best not to place an object until the room has been sufficiently mapped and static surfaces have been recognized and designated as feature points.
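
To make hit-testing, anchoring, and light estimation concrete, here is a minimal sketch using ARCore's Java API. It assumes you already have a running Session producing Frames and a tap MotionEvent from the UI; the class and method names TapHelper, anchorFromTap, and currentLightIntensity are illustrative, not part of ARCore.

    import android.view.MotionEvent;
    import com.google.ar.core.Anchor;
    import com.google.ar.core.Frame;
    import com.google.ar.core.HitResult;
    import com.google.ar.core.LightEstimate;
    import com.google.ar.core.Plane;
    import com.google.ar.core.Trackable;
    import com.google.ar.core.TrackingState;

    public class TapHelper {
        /** Returns an anchor on the first tracked plane under the tap, or null. */
        public static Anchor anchorFromTap(Frame frame, MotionEvent tap) {
            for (HitResult hit : frame.hitTest(tap)) {
                Trackable trackable = hit.getTrackable();
                // Accept only hits that fall inside a plane ARCore is still
                // tracking, so the object lands on a mapped, static surface.
                if (trackable instanceof Plane
                        && ((Plane) trackable).isPoseInPolygon(hit.getHitPose())
                        && trackable.getTrackingState() == TrackingState.TRACKING) {
                    return hit.createAnchor(); // pose stays fixed to the plane
                }
            }
            return null; // no mapped surface under the tap yet; keep scanning
        }

        /** Reads ARCore's light estimate so rendering can match room lighting. */
        public static float currentLightIntensity(Frame frame) {
            LightEstimate estimate = frame.getLightEstimate();
            // In very low light the estimate may not be valid, reflecting the
            // limit on how dark the environment can be.
            if (estimate.getState() == LightEstimate.State.VALID) {
                return estimate.getPixelIntensity(); // roughly 0.0 (dark) to 1.0 (bright)
            }
            return 1.0f; // fall back to neutral lighting
        }
    }

Anchors created this way keep an asset's pose consistent as ARCore refines its understanding of the surface over time.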

Poly, a library of 3D assets for your AR app


Google's Poly is a 3D asset library that lets you quickly find 3D objects and scenes for use in your apps, and it was built from the ground up with AR and VR development in mind.

You can use VR creation tools from Google like Tilt Brush and Blocks to build 3D assets and store them on Poly for use in AR apps.

Poly allows you to browse, search, view and download thousands of objects and scenes for your project.

Poly is available on desktop, mobile, and in VR.

Poly can also create GIFs and publish creations to the web.

The Poly API provides read access to assets in the Poly library.

With the Poly API, users can access Google’s growing collection of Creative Commons 3D assets and interact directly with Poly to search, download, and import objects dynamically across desktop, mobile, virtual reality, and augmented reality.

Creators can find all types of assets for applications and easily search for remixable, free assets licensed under a Creative Commons license by keyword, category, format, popularity or date uploaded.

Creators can filter by model complexity, or create a personalized experience by letting users sign into the app with their Google account to access any assets they’ve uploaded or liked on Poly.
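
As a sketch of what such a query looks like in practice, the Java snippet below calls the Poly API's assets.list endpoint over HTTPS, filtering by keyword, format, and model complexity. It assumes you have created an API key in the Google Cloud console; the PolySearch class name is illustrative, and parsing of the JSON response is left out.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.net.URLEncoder;

    public class PolySearch {
        /** Queries the Poly assets.list endpoint and returns the raw JSON. */
        public static String listAssets(String apiKey, String keywords) throws Exception {
            String query = "https://poly.googleapis.com/v1/assets"
                    + "?keywords=" + URLEncoder.encode(keywords, "UTF-8")
                    + "&format=OBJ"           // only assets available as OBJ
                    + "&maxComplexity=MEDIUM" // filter by model complexity
                    + "&key=" + apiKey;
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(query).openConnection();
            conn.setRequestMethod("GET");
            StringBuilder body = new StringBuilder();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    body.append(line);
                }
            }
            return body.toString(); // JSON listing of matching assets
        }
    }

The response lists each matching asset along with download URLs for its available formats, which your app can then fetch and import at runtime.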

Sceneform for easier AR content creation

Sceneform SDK is a high-level 3D framework that makes it easy for users to build AR apps in Java. It offers a new library for Android that enables the rapid creation and integration of AR experiences, and combines ARCore with a powerful physically-based 3D renderer. It includes a runtime API for working with graphics and rendering, and a plugin to help you import, preview, and tweak the look and feel of your assets directly in Android Studio.

Sceneform is highly optimized for mobile. Java developers can now build immersive, 3D apps without having to learn complicated APIs like OpenGL. They can use it to build AR apps from scratch as well as add AR features to existing ones.

What follows is a walkthrough of how to use Sceneform. It's more technically advanced than most of the other content in this course, and a little background in Java helps to fully appreciate how you might use it yourself, but we've included it so that aspiring creators can start to learn how to use Sceneform to make their own AR content.
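
As a starting point, here is a minimal sketch of the typical Sceneform flow: listen for a tap on a detected plane, load a renderable, and attach it to an anchor. It assumes a project already set up with the Sceneform plugin, an ArFragment in the activity layout, and a model exported as model.sfb; identifiers like R.id.ar_fragment and R.layout.activity_main are placeholders for your own project.

    import android.net.Uri;
    import android.os.Bundle;
    import androidx.appcompat.app.AppCompatActivity;
    import com.google.ar.sceneform.AnchorNode;
    import com.google.ar.sceneform.rendering.ModelRenderable;
    import com.google.ar.sceneform.ux.ArFragment;
    import com.google.ar.sceneform.ux.TransformableNode;

    public class PlacementActivity extends AppCompatActivity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_main); // layout contains the ArFragment
            ArFragment arFragment = (ArFragment)
                    getSupportFragmentManager().findFragmentById(R.id.ar_fragment);

            // When the user taps a detected plane, load the model and attach
            // it to an anchor at the tap location.
            arFragment.setOnTapArPlaneListener((hitResult, plane, motionEvent) ->
                    ModelRenderable.builder()
                            .setSource(this, Uri.parse("model.sfb"))
                            .build()
                            .thenAccept(renderable -> {
                                AnchorNode anchorNode =
                                        new AnchorNode(hitResult.createAnchor());
                                anchorNode.setParent(
                                        arFragment.getArSceneView().getScene());
                                TransformableNode node = new TransformableNode(
                                        arFragment.getTransformationSystem());
                                node.setRenderable(renderable);
                                node.setParent(anchorNode);
                            }));
        }
    }

Using a TransformableNode rather than a plain Node gives the user built-in pinch-and-drag gestures for scaling and moving the placed model.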

Using Poly and Unity to create ARCore assets

Unity is a cross-platform game engine and development environment for both 3D and 2D interactive applications. It has a variety of tools, from the simple to the professionally complex, to allow for the streamlined creation of 3D objects and environments.

Poly Toolkit for Unity is a plugin that allows you to import assets from Poly into Unity at edit time and at runtime.

Edit-time means manually downloading assets from Poly and importing them into your app's project while you are creating your app or experience.

Runtime means downloading assets from Poly while your app is running. This allows your app to leverage Poly's ever-expanding library of assets.

