February 6, 2019

Making AR Apps and Websites

Parallax

Here at Parallax, we've been excited to work on various augmented reality (AR) projects for our clients recently. For those that don't know – AR is a set of new technologies that allow you to display digital objects in a way that makes them look part of the real world.

You might be wondering what steps are involved in taking an object and making it viewable in an AR app or website. Here's a quick breakdown of what we've discovered about the process.

Geometry

Everything starts with a 3D model or 'mesh', typically created with a 3D editor like Blender or Maya. Some smartphones come with depth-sensing cameras, so scanning a real object into a mesh can be an option too. To run at a smooth framerate, meshes need to be efficient – typically less than 100K polygons – so some form of optimisation pass will often be required.

Most 3D editors have several tools to help transform hi-res models into low-res ones, a process usually referred to as retopologizing, or 'retopo' for short.

Decimator

As shown in the image above, the decimator collapses polygons that fall below a certain size threshold in an area. It's a blunt instrument, but it can be effective at making really dense meshes manageable – similar to changing the quality setting on a JPEG.

Projection

Projection allows you to draw a lower-resolution mesh on the surface of the hi-res version, retaining the overall shape. The fine surface detail is then typically 'baked' from the hi-res mesh into normal maps that the low-res mesh can use.
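If you're working in Blender, the decimation pass can also be scripted. Here's a minimal sketch using Blender 2.8's Python API (the object name and target ratio are placeholders, and the exact operator arguments vary a little between Blender versions):

    import bpy

    # Pick the dense mesh we want to reduce (object name is a placeholder)
    obj = bpy.data.objects["HighResModel"]
    bpy.context.view_layer.objects.active = obj

    # Add a Decimate modifier in 'collapse' mode and keep roughly 10% of the polygons
    mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
    mod.decimate_type = 'COLLAPSE'
    mod.ratio = 0.1

    # Apply the modifier so the reduced mesh is what actually gets exported
    bpy.ops.object.modifier_apply(modifier=mod.name)

    print(f"Polygons after decimation: {len(obj.data.polygons)}")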
PBR

If an object is to look like it belongs in the real world, it needs realistic materials and lighting. To achieve this, most AR renderers use PBR. This stands for Physically Based Rendering, which uses lighting algorithms based on real-world physics. PBR started off in the movie industry, where assets need to look consistently accurate regardless of the environment. Before PBR came along, artists would create special-purpose lighting setups per scene that were time-consuming to create and unintuitive to edit. PBR has now become something of a standard and is widely used in games and other real-time applications such as AR.

PBR uses a small number of shader inputs to describe how all materials look:

*Albedo/Diffuse*
The base colour of the material.

*Metallicity*
The vast majority of materials are described as being either metallic or non-metallic (dielectrics).

*Roughness*
Simply how smooth or rough the surface is.

*Normals*
These take the 'baked' normal maps mentioned previously as input, allowing the display of small details like stitching or rivets.

There are some other special-purpose properties, such as emission and transparency, but these tend to vary depending on the renderer. Materials are rarely uniform across their surface, so most of these values take an image map as input.
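In Blender, these inputs map fairly closely onto the Principled BSDF shader node, which the bundled glTF exporter translates into glTF's metallic/roughness PBR material. A minimal sketch, again assuming Blender 2.8's Python API (the material name and values are placeholders):

    import bpy

    # Create a node-based material so we get a Principled BSDF to work with
    mat = bpy.data.materials.new(name="BrushedMetal")  # placeholder name
    mat.use_nodes = True
    bsdf = mat.node_tree.nodes["Principled BSDF"]

    # Uniform values for the PBR inputs described above
    bsdf.inputs["Base Color"].default_value = (0.8, 0.8, 0.85, 1.0)  # albedo
    bsdf.inputs["Metallic"].default_value = 1.0                      # a metal
    bsdf.inputs["Roughness"].default_value = 0.35                    # fairly smooth

    # Real materials usually drive these inputs from image maps instead, e.g.:
    # tex = mat.node_tree.nodes.new("ShaderNodeTexImage")
    # tex.image = bpy.data.images.load("/path/to/roughness_map.png")
    # mat.node_tree.links.new(tex.outputs["Color"], bsdf.inputs["Roughness"])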
Libraries and Platforms

We have our assets ready to go; now we just have to pick which devices, browsers and operating systems we want to display them on. This is where things start to get tricky.

*Apple SceneKit / ARKit*
"https://developer.apple.com/arkit/":https://developer.apple.com/arkit/
Apple's AR framework, required to run natively on iOS devices.

*ARCore*
"https://developers.google.com/ar/":https://developers.google.com/ar/
Google's AR framework, required to run natively on Android. Also works on iOS.

Of course, on the web there is the usual zoo of competing JavaScript libraries.

*AR.js*
"https://github.com/jeromeetienne/AR.js":https://github.com/jeromeetienne/AR.js
One of the first marker-based AR libraries.

*WebARonARCore*
Yes, that's its actual name... It allows the use of Google ARCore via the browser.
"https://github.com/google-ar/WebARonARCore":https://github.com/google-ar/WebARonARCore
There's also a version that allows access to ARKit too.
"https://github.com/google-ar/WebARonARKit":https://github.com/google-ar/WebARonARKit

*model-viewer*
A Google web component that aims to abstract away the differences between the various AR libraries.
"https://github.com/GoogleWebComponents/model-viewer":https://github.com/GoogleWebComponents/model-viewer

File Formats

Despite various attempts, there's never been such a thing as a universal 3D file format. Apps running natively on iOS will require Collada (.dae) files. Websites or apps using the AR Quick Look viewer require USDZ files. ARCore on Android uses glTF.

*COLLADA*
"https://www.khronos.org/collada/":https://www.khronos.org/collada/
A broadly used XML format originally created by Sony and now managed by the Khronos Group (of OpenGL fame).

*USDZ*
"https://graphics.pixar.com/usd/docs/Usdz-File-Format-Specification.html":https://graphics.pixar.com/usd/docs/Usdz-File-Format-Specification.html
Short for Universal Scene Description (the Z version is a zipped, single-file package). The format originated at Pixar, and a converter comes built in with Xcode. The converter also produces a more human-friendly USDA ASCII version, useful for debugging.

*glTF*
"https://www.khronos.org/gltf/":https://www.khronos.org/gltf/
Another format managed by the Khronos Group, glTF is becoming the most popular interchange format for real-time assets. It supports PBR materials and the ability to embed textures in a single file.

It's a pain exporting these formats manually.
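To give a feel for what "manually" involves, here's a rough sketch of that route (this is not the ARexport script itself). It assumes Blender 2.8's built-in Collada, glTF and OBJ exporters plus the usdz_converter tool that ships with Xcode on macOS, and it would be run from inside Blender, e.g. via blender --background model.blend --python export.py. File names are placeholders:

    import bpy
    import subprocess

    name = "model"  # placeholder base name for the exported files

    # Collada (.dae) for native SceneKit / ARKit apps
    bpy.ops.wm.collada_export(filepath=f"{name}.dae")

    # Binary glTF (.glb) for ARCore and web viewers
    bpy.ops.export_scene.gltf(filepath=f"{name}.glb")

    # USDZ for AR Quick Look: Xcode's usdz_converter accepts OBJ input,
    # so export one as an intermediate step and convert it
    bpy.ops.export_scene.obj(filepath=f"{name}.obj")
    subprocess.run(["xcrun", "usdz_converter", f"{name}.obj", f"{name}.usdz"], check=True)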
Introducing ARexport

We've created a command-line script that will convert a Blender (.blend) file into these three main file formats in one shot.

Download ARexport

You can find it "on our GitHub here":https://github.com/parallax/ar-export. The Collada format doesn't seem to support PBR materials directly, but as it's an XML format it should be possible to add them in some way. Pull requests are welcome ;)

Hopefully this gives you an idea of what's involved in creating AR-suitable 3D assets. Send us an email or shout our way on Twitter if you have any questions!