Augmented Reality Development Made Easy With ARCore
Why ARCore?
At WWDC in June 2017, Apple revealed ARKit, a now world-famous framework for AR application development. A few months later, Google revealed ARCore, which evolved from Tango, Google's earlier indoor-mapping project. ARCore, Google's response to ARKit, is a developer platform for designing and deploying compelling augmented reality experiences. With the help of this SDK, a user's phone is able to understand the external environment. The amazing thing about this SDK is that it supports both Android and iOS devices.
Which devices and platforms support ARCore?
Supported platforms: Android 7.0 and higher, and iOS 11 and higher.
What are the core elements of ARCore?
Motion Tracking:
ARCore figures out the position and orientation of a virtual 3D object in the real world using the phone's camera and sensor data; this combination is termed the object's pose. As the phone moves, ARCore keeps tracking each virtual object's pose in the scene, which lets it render the objects from the correct perspective relative to the phone's position.
Environmental Understanding:
By analyzing input from the device's camera, ARCore detects horizontal and vertical surfaces such as a table, a floor, or a wall. These detected surfaces are termed planes. ARCore can affix a virtual object to a plane at a specified position and orientation; these fixed points are termed anchors.
Light Estimation:
ARCore estimates the lighting of the environment. Virtual items can then be adjusted to the average intensity and color of their surroundings, which makes them look much more realistic.
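To make these three elements concrete, here is a minimal sketch of how they surface in the ARCore API. It assumes you already have a com.google.ar.core.Frame from the current update; the calls follow the ARCore Android SDK.

import com.google.ar.core.Frame
import com.google.ar.core.Plane
import com.google.ar.core.TrackingState

fun inspectFrame(frame: Frame) {
    // Motion tracking: the camera's pose (position + orientation) in world space
    val cameraPose = frame.camera.pose

    // Environmental understanding: planes detected in this frame
    val planes = frame.getUpdatedTrackables(Plane::class.java)
        .filter { it.trackingState == TrackingState.TRACKING }

    // Light estimation: average pixel intensity of the camera image
    val intensity = frame.lightEstimate.pixelIntensity
}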
How to create an ARCore application?
1) You will need Android Studio 3.1 or higher. Create a new project targeting a minimum API level of 24 (Android 7.0).
2) Set up Gradle to use Java 8, as Java 8 language support is necessary to build an ARCore application. Add the following lines to your app-level build.gradle, then sync the project.
android {
    …
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}
3) Now add the Sceneform dependency to the project so the ARCore components become accessible. Add the following line to your app-level build.gradle.
dependencies {
    …
    implementation "com.google.ar.sceneform.ux:sceneform-ux:1.4.0"
}
Note: Since this is an augmented reality app, we have to ask the user for access to the camera. The Play Store also uses a specific AR manifest entry to filter the app out for users who don't have an ARCore-enabled device.
4) Then add the entries below to AndroidManifest.xml; the uses-feature entry marks your app as an AR-required app so the Play Store shows it only to supported devices.
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature
    android:name="android.hardware.camera.ar"
    android:required="true" />
5) The project is now ready for 3D objects, but one thing remains: install the Google Sceneform Tools plugin in Android Studio. After adding this plugin, Android Studio is able to import and preview 3D AR assets.
6) Now it's time to place a 3D object inside the application. You can ask your designer/graphics artist to create a custom 3D model as per your requirements, or use a 3D model provided by Google.
Note: The Sceneform Android Studio plugin supports only the OBJ, FBX, and glTF file formats.
7) Create a sampledata/ directory inside the app module and place the downloaded 3D model there. Then right-click the model file and select "Import Sceneform Asset". For example, importing an Earth model produces an asset such as NOVELO_EARTH.sfb, which we will load in step 9.
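Behind the scenes, the import step typically appends a sceneform.asset entry to the app-level build.gradle so the .sfb file is regenerated on each build. The paths below are illustrative and assume the Earth model from the example:

apply plugin: 'com.google.ar.sceneform.plugin'

sceneform.asset('sampledata/NOVELO_EARTH.obj', // source model
        'default',                             // material
        'sampledata/NOVELO_EARTH.sfa',         // intermediate .sfa output
        'src/main/assets/NOVELO_EARTH.sfb')    // binary .sfb output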
8) Now it's time to implement the code. First of all, insert an ArFragment into your activity's layout file.
<fragment
    android:id="@+id/sceneform_fragment"
    android:name="com.google.ar.sceneform.ux.ArFragment"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    app:layout_constraintBottom_toBottomOf="parent"
    app:layout_constraintEnd_toEndOf="parent"
    app:layout_constraintStart_toStartOf="parent"
    app:layout_constraintTop_toTopOf="parent" />
The placed 3D object will be rendered inside this fragment.
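In the activity, grab a reference to this fragment so the later snippets can use it. A minimal sketch, assuming the activity is MainActivity and the layout above is activity_main:

private lateinit var arFragment: ArFragment

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    setContentView(R.layout.activity_main)
    // Look up the ArFragment declared in the layout
    arFragment = supportFragmentManager
        .findFragmentById(R.id.sceneform_fragment) as ArFragment
}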
9) The next task is to place the AR object in the ArFragment. Let's say the app has a button, and a click on it should place the 3D object in the real world:
addObject(Uri.parse("NOVELO_EARTH.sfb"))
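For example, wired to a button click inside onCreate(). A FloatingActionButton with id fab is an assumption here, not part of the original layout:

private lateinit var fab: FloatingActionButton

// In onCreate(), after looking up the fragment:
fab = findViewById(R.id.fab)
fab.setOnClickListener {
    // Place the imported Earth model at the current hit point
    addObject(Uri.parse("NOVELO_EARTH.sfb"))
}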
// Hit-tests the center of the screen and places the model on the first plane hit
private fun addObject(model: Uri) {
    val frame = arFragment.arSceneView.arFrame
    val point = getScreenCenter()
    if (frame != null) {
        val hits = frame.hitTest(point.x.toFloat(), point.y.toFloat())
        for (hit in hits) {
            val trackable = hit.trackable
            // Only place the object when the hit lies inside a detected plane
            if (trackable is Plane && trackable.isPoseInPolygon(hit.hitPose)) {
                placeObject(arFragment, hit.createAnchor(), model)
                break
            }
        }
    }
}
// Builds a renderable from the .sfb asset and attaches it at the given anchor
private fun placeObject(fragment: ArFragment, anchor: Anchor, model: Uri) {
    ModelRenderable.builder()
        .setSource(fragment.context, model)
        .build()
        .thenAccept {
            addNodeToScene(fragment, anchor, it)
        }
        .exceptionally {
            Toast.makeText(this@MainActivity, "Error", Toast.LENGTH_SHORT).show()
            return@exceptionally null
        }
}
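Both addObject() above and updateHitTest() below call getScreenCenter(), which isn't shown in this article. A minimal sketch, assuming the activity's content view fills the screen:

// Hypothetical helper: the screen-center pixel used for hit testing
private fun getScreenCenter(): Point {
    val view = findViewById<View>(android.R.id.content)
    return Point(view.width / 2, view.height / 2)
}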
10) The code above loads the object into the ArFragment, but you still need to add interaction properties to the object, say you want it to be resizable and movable. For that, add a node to the scene: this gives the 3D object its transform controls and provides a way to render it in the AR scene on screen.
private fun addNodeToScene(fragment: ArFragment, anchor: Anchor, renderable: ModelRenderable) {
    // The anchor node pins the model to the real-world anchor position
    val anchorNode = AnchorNode(anchor)
    // A TransformableNode lets the user move, scale, and rotate the model
    val transformableNode = TransformableNode(fragment.transformationSystem)
    transformableNode.renderable = renderable
    transformableNode.setParent(anchorNode)
    fragment.arSceneView.scene.addChild(anchorNode)
    transformableNode.select()
}
11) You are now almost set, but most importantly, you first need to detect the floor before placing objects. For that, register an update listener on the scene:
arFragment.arSceneView.scene.addOnUpdateListener { frameTime ->
    arFragment.onUpdate(frameTime)
    onUpdate()
}
// Updates the tracking state on every frame
private fun onUpdate() {
    updateTracking()
    // Check if the device's gaze is hitting a plane detected by ARCore
    if (isTracking) {
        val hitTestChanged = updateHitTest()
        if (hitTestChanged) {
            showFab(isHitting)
        }
    }
}
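onUpdate() relies on two state flags and an updateTracking() helper that aren't shown above. A plausible sketch, following the usual Sceneform sample pattern (the names isTracking and isHitting match the snippets in this article):

// Assumed members of the activity
private var isTracking = false
private var isHitting = false

// Returns true when the camera's tracking state changed since the last frame
private fun updateTracking(): Boolean {
    val frame = arFragment.arSceneView.arFrame
    val wasTracking = isTracking
    isTracking = frame != null &&
        frame.camera.trackingState == TrackingState.TRACKING
    return isTracking != wasTracking
}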
12) With this, the floor mesh will be detected, and you can then check whether the screen center is currently hitting a plane:
// Performs frame.hitTest and returns whether the hit status changed
private fun updateHitTest(): Boolean {
    val frame = arFragment.arSceneView.arFrame
    val point = getScreenCenter()
    val hits: List<HitResult>
    val wasHitting = isHitting
    isHitting = false
    if (frame != null) {
        hits = frame.hitTest(point.x.toFloat(), point.y.toFloat())
        for (hit in hits) {
            val trackable = hit.trackable
            if (trackable is Plane && trackable.isPoseInPolygon(hit.hitPose)) {
                isHitting = true
                break
            }
        }
    }
    return wasHitting != isHitting
}
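updateHitTest() drives showFab(), which is also not shown in the article. A minimal sketch, reusing the fab button assumed in step 9:

// Hypothetical helper: show the placement button only while a plane is hit
private fun showFab(visible: Boolean) {
    fab.visibility = if (visible) View.VISIBLE else View.GONE
}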
So you're all set: once the phone has detected the floor, a tap on the button drops the 3D object into the AR fragment, and the TransformableNode's transform properties let you resize the object and drag it by touch.
I hope you enjoyed this brief intro to ARCore on Android.