In this tutorial, we’ll build an Augmented Reality app for Android using ARCore and Sceneform.
Overview
According to Wikipedia, ARCore is a software development kit developed by Google that allows for augmented reality applications to be built.
ARCore uses three key technologies to integrate virtual content with the real environment; we’ll see how each of them surfaces in the ARCore API right after this list:
- Motion tracking: allows the phone to understand its position relative to the world.
- Environmental understanding: allows the phone to detect the size and location of all types of surfaces (horizontal, vertical and angled).
- Light estimation: allows the phone to estimate the environment’s current lighting conditions.
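These three ideas map fairly directly onto the ARCore API. The sketch below is illustrative only; it assumes you already have a Frame from a running AR session (later in this tutorial we get one via Sceneform’s ArFragment). Motion tracking shows up as the camera’s tracking state, environmental understanding as detected Plane trackables, and light estimation as the frame’s LightEstimate.

import com.google.ar.core.Frame
import com.google.ar.core.Plane
import com.google.ar.core.TrackingState

// Illustrative sketch: inspects a single ARCore Frame (assumed to come from a running session)
fun describeFrame(frame: Frame) {
    // Motion tracking: is ARCore currently tracking the phone's pose?
    val tracking = frame.camera.trackingState == TrackingState.TRACKING

    // Environmental understanding: planes ARCore detected or updated this frame
    val planes = frame.getUpdatedTrackables(Plane::class.java)

    // Light estimation: an estimate of the scene's current brightness
    val brightness = frame.lightEstimate.pixelIntensity

    println("tracking=$tracking, updatedPlanes=${planes.size}, brightness=$brightness")
}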
ARCore is a Google platform that enables your applications to “see” and understand the physical world, via your device’s camera.
In our app/build.gradle file, we add the dependency for Sceneform and sync the project. Although we’ll be writing this tutorial in Kotlin, Sceneform uses some Java 8 language constructs, so we need to enable Java 8 support explicitly since our minimum API level is below 26 (shown right after the dependency below).
implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.4.0'
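To enable the Java 8 language features mentioned above while minSdkVersion is below 26, add a compileOptions block inside the android block of app/build.gradle. A minimal version looks like this:

android {
    // Sceneform uses Java 8 language constructs, so compile with Java 8 compatibility
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}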
Add the Sceneform plugin dependency to the project-level build.gradle file:
classpath 'com.google.ar.sceneform:plugin:1.0.1'
You also need to apply the Sceneform plugin in your app’s build.gradle file. Add the following below the dependencies block:
apply plugin: 'com.google.ar.sceneform.plugin'
We need to declare the camera permission and the AR camera feature in the Android Manifest:
<uses-permission android:name="android.permission.CAMERA" />

<uses-feature
    android:name="android.hardware.camera.ar"
    android:required="true" />
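Since this is an AR-required app (android:required="true" above), ARCore also expects a meta-data entry inside the <application> element of the manifest. The line itself is standard; where exactly it sits depends on your manifest:

<!-- Inside the <application> element: marks the app as requiring ARCore -->
<meta-data android:name="com.google.ar.core" android:value="required" />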
Adding our Model
To make life easier, and to allow us to import our own 3D assets, we’ll also install the Sceneform plugin for Android Studio: go to Preferences > Plugins > Browse Repositories and search for Google Sceneform Tools (Beta). The plugin lets Android Studio preview 3D models and convert them into the format Sceneform renders.
You can go to https://poly.google.com/ and download a sample model. Don’t forget to credit the creator!
Typically, the OBJ and glTF formats are used for the 3D models rendered in AR.
Now let’s build our first AR application, using the model we just downloaded as the object we place in the scene.
Right-click on the app module and create a sampledata directory (a folder for design-time assets). Copy in all of the files you downloaded (the .obj, .mtl and any textures). Then right-click the .obj file and choose “Import Sceneform Asset”. The import generates the .sfa and .sfb files Sceneform needs; the .sfb file (here NOVELO_EARTH.sfb) is what we’ll reference from code.
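For reference, the import step normally records the conversion as a sceneform.asset() entry in app/build.gradle, so you shouldn’t need to write it by hand. The paths below are assumptions based on the NOVELO_EARTH model and a sampledata folder:

// Added automatically by "Import Sceneform Asset" (paths are assumptions)
sceneform.asset('sampledata/NOVELO_EARTH.obj',   // source asset
        'default',                               // material
        'sampledata/NOVELO_EARTH.sfa',           // generated .sfa description
        'src/main/assets/NOVELO_EARTH')          // generated .sfb, loaded as "NOVELO_EARTH.sfb" in code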
Let’s code
<?xml version="1.0" encoding="utf-8"?>
<android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <fragment
        android:id="@+id/sceneform_fragment"
        android:name="com.google.ar.sceneform.ux.ArFragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

    <android.support.design.widget.FloatingActionButton
        android:id="@+id/floatingActionButton"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_marginBottom="8dp"
        android:layout_marginEnd="8dp"
        android:layout_marginStart="8dp"
        app:layout_constraintBottom_toBottomOf="@+id/sceneform_fragment"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:srcCompat="@drawable/ic_explore_white_24dp" />

</android.support.constraint.ConstraintLayout>
MainActivity class
import android.graphics.Point
import android.net.Uri
import android.os.Bundle
import android.support.v7.app.AppCompatActivity
import android.view.View
import android.widget.Toast
import com.google.ar.core.Anchor
import com.google.ar.core.HitResult
import com.google.ar.core.Plane
import com.google.ar.core.TrackingState
import com.google.ar.sceneform.AnchorNode
import com.google.ar.sceneform.rendering.ModelRenderable
import com.google.ar.sceneform.ux.ArFragment
import com.google.ar.sceneform.ux.TransformableNode
import kotlinx.android.synthetic.main.activity_main.*

class MainActivity : AppCompatActivity() {

    private lateinit var arFragment: ArFragment
    private var isTracking: Boolean = false
    private var isHitting: Boolean = false

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        arFragment = sceneform_fragment as ArFragment

        // Forward frame updates to the fragment, then run our own per-frame logic
        arFragment.arSceneView.scene.addOnUpdateListener { frameTime ->
            arFragment.onUpdate(frameTime)
            onUpdate()
        }

        // Place the imported model when the button is tapped
        floatingActionButton.setOnClickListener { addObject(Uri.parse("NOVELO_EARTH.sfb")) }
        showFab(false)
    }

    // Shows or hides the placement button
    private fun showFab(enabled: Boolean) {
        if (enabled) {
            floatingActionButton.isEnabled = true
            floatingActionButton.visibility = View.VISIBLE
        } else {
            floatingActionButton.isEnabled = false
            floatingActionButton.visibility = View.GONE
        }
    }

    // Updates the tracking state
    private fun onUpdate() {
        updateTracking()
        // Check if the device's gaze is hitting a plane detected by ARCore
        if (isTracking) {
            val hitTestChanged = updateHitTest()
            if (hitTestChanged) {
                showFab(isHitting)
            }
        }
    }

    // Performs frame.hitTest and returns true if the hit state has changed
    private fun updateHitTest(): Boolean {
        val frame = arFragment.arSceneView.arFrame
        val point = getScreenCenter()
        val hits: List<HitResult>
        val wasHitting = isHitting
        isHitting = false
        if (frame != null) {
            hits = frame.hitTest(point.x.toFloat(), point.y.toFloat())
            for (hit in hits) {
                val trackable = hit.trackable
                if (trackable is Plane && trackable.isPoseInPolygon(hit.hitPose)) {
                    isHitting = true
                    break
                }
            }
        }
        return wasHitting != isHitting
    }

    // Makes use of ARCore's camera state and returns true if the tracking state has changed
    private fun updateTracking(): Boolean {
        val frame = arFragment.arSceneView.arFrame
        val wasTracking = isTracking
        isTracking = frame?.camera?.trackingState == TrackingState.TRACKING
        return isTracking != wasTracking
    }

    // Simply returns the center of the screen
    private fun getScreenCenter(): Point {
        val view = findViewById<View>(android.R.id.content)
        return Point(view.width / 2, view.height / 2)
    }

    // Hit-tests the screen center and anchors the model on the first plane that is hit
    private fun addObject(model: Uri) {
        val frame = arFragment.arSceneView.arFrame
        val point = getScreenCenter()
        if (frame != null) {
            val hits = frame.hitTest(point.x.toFloat(), point.y.toFloat())
            for (hit in hits) {
                val trackable = hit.trackable
                if (trackable is Plane && trackable.isPoseInPolygon(hit.hitPose)) {
                    placeObject(arFragment, hit.createAnchor(), model)
                    break
                }
            }
        }
    }

    // Builds a ModelRenderable from the .sfb asset and adds it to the scene
    private fun placeObject(fragment: ArFragment, anchor: Anchor, model: Uri) {
        ModelRenderable.builder()
            .setSource(fragment.context, model)
            .build()
            .thenAccept { addNodeToScene(fragment, anchor, it) }
            .exceptionally {
                Toast.makeText(this@MainActivity, "Error", Toast.LENGTH_SHORT).show()
                return@exceptionally null
            }
    }

    private fun addNodeToScene(fragment: ArFragment, anchor: Anchor, renderable: ModelRenderable) {
        val anchorNode = AnchorNode(anchor)
        // TransformableNode lets the user move, scale and rotate the model
        val transformableNode = TransformableNode(fragment.transformationSystem)
        transformableNode.renderable = renderable
        transformableNode.setParent(anchorNode)
        fragment.arSceneView.scene.addChild(anchorNode)
        transformableNode.select()
    }
}
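As a follow-up, Sceneform’s ArFragment also exposes a simpler placement path: setOnTapArPlaneListener fires whenever the user taps a detected plane and hands you a ready-made HitResult. The sketch below shows how the same placeObject call from the activity above could be driven by taps instead of the FloatingActionButton (it reuses the names from MainActivity and would go in onCreate):

// Alternative to the FAB: place the model wherever the user taps a detected plane
arFragment.setOnTapArPlaneListener { hitResult, _, _ ->
    placeObject(arFragment, hitResult.createAnchor(), Uri.parse("NOVELO_EARTH.sfb"))
}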