Thursday, July 25, 2019

Introduction to Augmented Reality and ARCore (3) - Taking the next steps with ARCore

Links
  1. Introduction to augmented reality (AR)
  2. The basics of AR functionality


Taking the next steps with ARCore


Cloud Anchors for shared AR


As we’ve covered, anchors are the mechanism by which you can attach virtual content to a trackable real-world point.

Building on this concept and supported by the cloud, Cloud Anchors are a cross-platform feature that allows both iOS and Android users to share the same AR experience despite using different underlying AR technologies. Where anchors have traditionally been isolated to a single device (one private augmented reality), Cloud Anchors can be shared by multiple devices simultaneously.
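The round trip behind this sharing can be sketched in plain Python. Note that this is a simulation of the data flow only (host an anchor, receive a cloud anchor ID, share the ID, resolve it on another device), not the ARCore Cloud Anchors API; every class and function name here is invented for illustration:

```python
# Illustrative sketch of the Cloud Anchor flow -- NOT the ARCore API.
# Flow: Device A hosts an anchor -> gets a cloud anchor ID -> the app shares
# the ID over its own network layer -> Device B resolves the ID into a pose.

class FakeCloudAnchorService:
    """Stands in for the cloud anchor service in this sketch."""

    def __init__(self):
        self._store = {}
        self._next_id = 0

    def host(self, visual_features, pose):
        # Device A uploads visual feature data describing the space
        # around the anchor, along with the anchor's pose.
        anchor_id = f"cloud-anchor-{self._next_id}"
        self._next_id += 1
        self._store[anchor_id] = (visual_features, pose)
        return anchor_id

    def resolve(self, anchor_id, local_features):
        # Device B sends its own view of the scene; the real service matches
        # feature maps to localize the device. Here we simply return the pose.
        _features, pose = self._store[anchor_id]
        return pose


service = FakeCloudAnchorService()

# Device A (e.g. an Android phone) hosts an anchor at a real-world pose.
shared_id = service.host(visual_features=["corner", "edge"], pose=(1.0, 0.0, -2.0))

# The ID travels over the app's own backend; Device B (e.g. an iPhone)
# resolves it, so both devices now agree on the anchor's position.
pose_on_device_b = service.resolve(shared_id, local_features=["corner", "edge"])
print(pose_on_device_b)
```

The key design point is that only the anchor ID crosses between devices; the heavy lifting (matching visual features to localize each device) happens in the cloud service.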

This makes AR possible for groups of people rather than just individuals, allowing for shared, collaborative experiences like redecorating your home, playing games and making art in 3D space together.


Use cases and current capabilities/limitations of AR

  • ARCore can be used to create dynamic experiences for businesses, nonprofits, healthcare, schools, and more.
  • ARCore’s strengths are its phone-based spatial mapping capabilities and addressable user base. Approximately 85% of phones around the world run on the Android operating system.
  • As of the beginning of 2018, ARCore was already available on 100 million Android-powered smartphones, and that number continues to grow. ARCore requires significant processing power, so not all older Android models meet the necessary specifications yet. ARCore is also available in China.
  • Limitations to consider with contemporary AR technology include low-light environments, surfaces that lack distinct visual features to track, and the fact that only newer phones have mobile processors powerful enough to run AR.

User experience (UX) and user interface (UI) to consider

User interface (UI) and user experience (UX) are two complementary fields of product design. They have a lot in common, but generally speaking UX is the more technical of the two: UX is the process and underlying framework of shaping user flow to create products with high usability and accessibility for end users, while UI is closer to graphic design and focuses on the front-end visual elements of the interface.

At the most basic level, UI is the visual layer of your app and everything a user directly interacts with, while UX is the process that supports the user's journey and interactions.

Before creating an AR app, it is important to think deeply about the design so that you use the limited space of a phone display in the best possible way. The more intuitive your app's UX/UI, the better its chance of succeeding with users.

When using an AR app, the smartphone's screen is mostly filled with the camera's live feed of the real world. It is important not to clutter the screen with buttons or other nonessential elements that might confuse users.

For example, the AR Stickers app displays placeable 3D objects at the top of the screen, so you can drag and drop a character into your scene. The app also lets you hide the objects menu by tapping the “^” button, letting the user concentrate on the character without the additional noise of the menu.

If your app demands a lot of features, display their icons at the corners of the screen so they don't take up the crucial real estate in the middle of the screen, where most interaction with AR objects takes place. Giving users the option to hide the menu is also a great way to preserve immersion.

It is important not to crowd the screen with too much information at once. However, with AR you’re not restricted to just the screen when it comes to UI menus. One option that allows creators to incorporate a lot of information is to use the environment as a menu, letting users interact with items or display information tagged to specific features, such as buildings.

Basic AR interaction options

The following list offers some of the major examples of the types of interaction options AR app creators can give users. Treat this as a starting point rather than a complete list; mobile AR is young, and creators are discovering new interaction mechanics all the time.

  1. Drag and Drop: This feature lets users drag and drop objects from a menu of 3D digital assets “onto” the screen to place them on a real-world surface that ARCore has spatially mapped.
  2. Voice: Voice is quickly emerging as a powerful interaction tool that creators can build into AR apps. Building in pre-programmed voice commands allows users to execute specific actions within the AR app. This is often best achieved by embedding the Google Assistant SDK to add voice-enabled smart interactions.
  3. Tap: With the tap mechanic, users can place objects in the real world by tapping on the screen. ARCore uses raycasting (a ray-to-surface intersection test): it projects a ray to estimate where the AR object should be placed so that it appears on the real-world surface in a believable way. Tap can also be used to interact with a digital object that is already placed in the scene; for example, the app might animate a 3D object when the user taps on it.
  4. Pinch and Zoom: Pinch and zoom lets users enlarge or scale down a 3D object—or use the interaction in creative ways to build a game or user experience. For example, this could be used to pull back the string of a bow in a bow-and-arrow target game.
  5. Slide: Users can interact with 3D objects through sliding, which translates (or moves) objects in-scene, or use it as a game mechanic. For example, say you are creating an AR paper toss game. You could enable a slide interaction to let users project or throw papers into a trash can.
  6. Tilt: Using the accelerometer and gyroscope, a tilt of the phone can also be used as an input for creative interactions. An easy example would be to make a “steering wheel” mechanic for a racing tabletop AR game.
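The geometry behind the tap mechanic above boils down to a ray-plane intersection: a ray from the camera through the tapped pixel is intersected with a detected plane, and the hit point is where the object is placed. A language-agnostic sketch of that math in Python (illustrative only; the function and parameter names are ours, not ARCore's):

```python
def ray_plane_intersection(ray_origin, ray_dir, plane_point, plane_normal):
    """Return the 3D point where a ray hits a plane, or None if there is no hit.

    This is the math behind a tap-to-place hit test: a ray is projected from
    the camera through the tapped pixel and intersected with a detected
    plane. (Illustrative geometry only -- not the ARCore API.)
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(ray_dir, plane_normal)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the plane
    t = dot([p - o for p, o in zip(plane_point, ray_origin)], plane_normal) / denom
    if t < 0:
        return None  # plane is behind the camera
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))


# Camera held 1.4 m above a floor plane (y = 0), looking down at 45 degrees.
hit = ray_plane_intersection(
    ray_origin=(0.0, 1.4, 0.0),
    ray_dir=(0.0, -0.7071, -0.7071),
    plane_point=(0.0, 0.0, 0.0),
    plane_normal=(0.0, 1.0, 0.0),
)
print(hit)  # a point on the floor, about 1.4 m in front of the user
```

In practice you never write this yourself: ARCore's hit-testing does the intersection against its detected planes and returns a pose you can attach an anchor to. The sketch just shows why a tap needs mapped surfaces to land on.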

Think like a user

  • User flow is the journey of your app's users and how a person will engage, step by step, with your AR experience.
  • Planning your user flow needs to take into account the scene, the user interactions, any audio cues, and the final user actions.
  • A user flow can be created with simple sketches and panels all collected into one cohesive diagram.
  • UX and UI are complementary fields of product design, and generally speaking UX is the more technical of the two.
  • When considering UX/UI, one good rule of thumb to remember with AR is to avoid cluttering the screen with too many buttons or elements that might be confusing to users.
  • Choosing to use cartoonish designs or lighting can actually make the experience feel more realistic to the user, as opposed to photorealistic assets that fail to meet our expectations when they don't blend in perfectly with the real world.
  • Users might try to “break” your experience by deliberately disregarding your carefully planned user flow, but your resources are better spent on improving your app’s usability rather than trying to prevent bad actors.

Next steps on the AR journey

  • Advanced 3D design tools like Maya, Zbrush, Blender, and 3ds Max are powerful professional tools.
  • Poly by Google is a repository of 3D assets that can be quickly downloaded and used in your ARCore experience, making it a good starting resource for building your first one.
  • The recommended guide for your AR experience is a design document that contains all of the 3D assets, sounds, and other design ideas for your team to implement.
  • You may need to hire advanced personnel to help you build your experience, such as: 3D artists, texture designers, level designers, sound designers, or other professionals.
