Google ARCore Sceneform Animation on Android

Here is the GitHub link for this project: ARCore Sceneform Animation.

Introduction

We’ll delve into the world of Android Augmented Reality in this article. We’ll use ARCore, Google’s framework for creating augmented reality experiences. We’ll look at how ARCore is transforming AR app creation by abstracting complex matrix and vector math and providing us with beautiful AR APIs.

According to Wikipedia, “augmented reality” is a technology that “superimposes a computer-generated image on a user’s view of the real world, producing a composite view.” In essence, augmented reality (AR) is a technology that lets us project computer-generated 3D models into the real world and have them interact with their environment as if they were physically present.

Augmented reality (AR) serves as a bridge between humans and computers, allowing businesses to market their products more effectively. It’s no surprise, then, that the technology is growing in popularity every day.

Notably, AR does not construct an artificial world; rather, it interacts with the existing environment and adds functionality to it. Displaying digital 3D models in the real world through a mobile camera is one example of its use.

Developing AR applications a few years ago meant learning OpenGL and complex vector math. In order to make AR development easier for developers, Google launched ARCore along with Sceneform SDK (for Android) in 2018.

Photo Preview

Glossary

ARCore: A software development kit released by Google in 2018. It enables the development of augmented reality applications and employs three main features to incorporate virtual content into the real world:

Motion Tracking: Enables a mobile camera to determine its position in relation to the real world.

Environmental Understanding: Enables the mobile camera to detect the size and position of a variety of surfaces, including horizontal, vertical, and angled planes.

Light Estimation: Helps the mobile camera estimate real-world lighting conditions.

Sceneform: Sceneform acts as ARCore’s companion when it comes to building augmented reality (AR) software on Android. ARCore itself is an engine rather than an SDK; it supplies the tracking data that SDKs use to render augmented objects.

As a result, Google released the Sceneform SDK, which allows developers to build Android AR apps without learning OpenGL. This SDK lets us import and display 3D models in .obj, .fbx, and .glTF formats, and it ships with an Android Studio plugin for 3D asset development.

Steps:

We’ll need to install the Sceneform plugin in Android Studio to get started with our first AR app. This plugin lets us add 3D models to our projects (supported formats: .obj, .gltf (animations not supported), and .fbx). These are the most widely used 3D model formats, supported by Blender, SketchUp, and others.

However, these 3D models cannot be rendered directly in the app. The Sceneform plugin converts them into supported formats: *.sfa (Sceneform Asset Definition) and *.sfb (Sceneform Binary, which is generated from the *.sfa during the app build).
More information about Sceneform can be found in the Android Developers Guide.

Step 1: Adding the Sceneform Plugin

The Sceneform plugin must be installed in Android Studio. It helps with tasks like importing models into your Android project.

To install the plugin, follow the steps below:

  • For Windows users: go to File -> Settings -> Plugins.
    For macOS users: go to Android Studio -> Preferences -> Plugins.
    Now, in the search bar, type “Sceneform.” Google Sceneform Tools will be at the top of the list.
  • Install the plugin and restart Android Studio.

Step 2: Adding required dependencies

To begin, add the following dependency to the project-level build.gradle file:

dependencies {
    classpath "com.android.tools.build:gradle:4.0.2"
    classpath 'com.google.ar.sceneform:plugin:1.17.1'
}

Important: The Sceneform SDK requires a minSdkVersion of 24 or greater, so make sure minSdkVersion >= 24 is set. Also, double-check that the google() Maven repository is included in your project’s build.gradle file.
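Putting those two checks together, the relevant parts of the Gradle files might look like this (a sketch; the repository block is an assumption about your project setup, and the versions match the ones used elsewhere in this article):

```gradle
// Project-level build.gradle: the google() repository hosts the Sceneform artifacts.
buildscript {
    repositories {
        google()
        jcenter()
    }
}

// App-level build.gradle: Sceneform requires API level 24 or higher.
android {
    defaultConfig {
        minSdkVersion 24
    }
}
```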

Second, add the ARCore and Sceneform libraries as dependencies in your app-level build.gradle file:

dependencies {
    implementation fileTree(dir: "libs", include: ["*.jar"])
    implementation 'androidx.appcompat:appcompat:1.2.0'
    implementation 'androidx.constraintlayout:constraintlayout:2.0.4'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test.ext:junit:1.1.2'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.3.0'
    // Sceneform
    implementation "com.google.ar.sceneform.ux:sceneform-ux:1.17.1"
    implementation "com.google.ar.sceneform:animation:1.17.1"
    // Material Components (provides the FloatingActionButton used in the layout)
    implementation 'com.google.android.material:material:1.2.1'
}

Step 3: Adding permissions to the manifest

<uses-permission android:name="android.permission.CAMERA"/>
<uses-feature android:name="android.hardware.camera.ar" android:required="true" />

For obvious reasons, AR apps need the Camera permission.
If you host your renderable (3D model) remotely and retrieve it at runtime, the Internet permission comes in handy as well.
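In that remote-hosting case, the manifest would also need the Internet permission (an addition for illustration; this project loads its model from local assets, so the snippet above omits it):

```xml
<uses-permission android:name="android.permission.INTERNET"/>
```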

Step 4: Creating the activity_main.xml file

We must first include the Sceneform fragment in our layout file. This will be the scene where all of our 3D models are placed; it handles camera initialization and permission management.
Navigate to the main layout file. It’s activity_main.xml in my case, and I’ve added the Sceneform fragment:

<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <fragment
        android:id="@+id/sceneform_fragment"
        class="com.google.ar.sceneform.ux.ArFragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <com.google.android.material.floatingactionbutton.FloatingActionButton
        android:id="@+id/btn_anim"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:clickable="true"
        android:focusable="true"
        app:elevation="1dp"
        app:layout_constraintHorizontal_chainStyle="spread"
        app:layout_constraintHorizontal_bias="0.45"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintBottom_toBottomOf="parent"
        app:srcCompat="@drawable/ic_baseline_play_arrow_24" />

</androidx.constraintlayout.widget.ConstraintLayout>

Since this will cover the entire screen, I’ve set the width and height to match_parent. You may choose the dimensions that best fit your needs.

Step 5: Adding the 3D model

Now it’s time to download and import the 3D models that will be rendered in our app. In our case, we’ll be rendering and animating a 3D model in our room.

You can get 3D models from anywhere, but Google has a great repository called Poly where you can get 3D models for your app. The models are available in .obj and .gltf formats. We’ll download the .obj file.

Expand the app folder in your Android Studio project’s Project view. A folder called “sampledata” will appear. If you don’t have one, create it now.

Once your model has finished downloading, extract the downloaded zip file into this sampledata folder.

You will find a .mtl file, a .obj file, and a PNG image of the model. We’ll import the .obj file into our application using the Sceneform plugin.

Step 6: Importing the model using Sceneform plugin 

When you right-click on the .obj file, you’ll see an option to “Import Sceneform Asset.” Click it and keep the default settings. After the import finishes, Gradle will sync the project so that the asset is included in your app.

You have now completed the import of the 3D asset into your application.

The lines below will be added to your app-level build.gradle file when you run “Import Sceneform Asset”:

sceneform.asset('sampledata/samples/cangrejo.fbx',
        'default',                        // 'Material Path' specified during import
        'sampledata/samples/cangrejo.sfa',
        'src/main/res/raw')               // location where the generated .sfb file is stored

Step 7: Building the Model

In your Java file, add the following code:

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    arFragment = (ArFragment) getSupportFragmentManager()
            .findFragmentById(R.id.sceneform_fragment);
    // Tap-on-plane event
    arFragment.setOnTapArPlaneListener(new BaseArFragment.OnTapArPlaneListener() {
        @Override
        public void onTapPlane(HitResult hitResult, Plane plane, MotionEvent motionEvent) {
            if (animationCrab == null)
                return;
            // Create the Anchor
            Anchor anchor = hitResult.createAnchor();
            if (anchorNode == null) { // If the crab is not yet placed on the plane
                anchorNode = new AnchorNode(anchor);
                anchorNode.setParent(arFragment.getArSceneView().getScene());
                transformableNode = new TransformableNode(arFragment.getTransformationSystem());
                // Scale model
                transformableNode.getScaleController().setMinScale(0.09f);
                transformableNode.getScaleController().setMaxScale(0.1f);
                transformableNode.setParent(anchorNode);
                transformableNode.setRenderable(animationCrab);
            }
        }
    });
    // ... (continued in the full listing below)
}

Let’s take a closer look at what’s happening here.

With the aid of supportFragmentManager and the fragment id, we first get the fragment we inserted in our layout file.

The model must then be loaded into the scene. The ModelRenderable class from the Sceneform SDK is used for this. We can load our model by passing the resource id of the generated .sfb file to ModelRenderable’s setSource() method.

The model is built in the background; once it has loaded, the result is delivered on the main thread, where it can be added to the scene.

The loaded model is passed into the thenAccept() callback. An exception is thrown if the model fails to build.
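Under the hood, ModelRenderable.builder().build() returns a Java CompletableFuture, so the thenAccept/exceptionally flow used in setupModel() can be sketched with a plain CompletableFuture (a standalone sketch; "crab-model" is a hypothetical stand-in value, not part of the project):

```java
import java.util.concurrent.CompletableFuture;

public class RenderableLoadDemo {
    // Stand-in for ModelRenderable.builder().build(): an async build that
    // completes on a background thread with the loaded "renderable".
    static CompletableFuture<String> buildModel() {
        return CompletableFuture.supplyAsync(() -> "crab-model");
    }

    public static void main(String[] args) {
        buildModel()
                .thenAccept(model -> System.out.println("Loaded: " + model))
                .exceptionally(throwable -> {
                    // Mirrors the Toast shown on failure in setupModel()
                    System.out.println("Unable to load: " + throwable.getMessage());
                    return null;
                })
                .join(); // block so the demo finishes before the JVM exits
    }
}
```

The same shape appears in the activity: the success branch stores the renderable in a field, and the failure branch surfaces the error to the user.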

Now that our model has been loaded, it’s time to put it in the scene.

Step 8: Adding the Model to the AR Scene

Since our AR fragment is the scene’s container, we need to add a model to it once it is tapped. For that, we set an OnTapArPlaneListener on our fragment.

Scene: This is where our three-dimensional world is built.

HitResult: An imaginary ray of light coming from infinity; the point where you tap is its first point of intersection with the real world.

AnchorNode: The node that positions itself in the world automatically. When the plane is detected, this is the first node to be set.

Anchor: A fixed position in the physical world, used to convert local coordinates (as seen by the user) into real-world coordinates.

TransformableNode: A node that responds to user interactions such as rotation, zooming, and dragging.
The following is an example of your final Java file:

package com.dennis.sceneformanim;

import androidx.appcompat.app.AppCompatActivity;
import androidx.core.content.ContextCompat;

import android.content.res.ColorStateList;
import android.graphics.Color;
import android.os.Bundle;
import android.view.MotionEvent;
import android.view.View;
import android.widget.Toast;

import com.google.android.material.floatingactionbutton.FloatingActionButton;
import com.google.ar.core.Anchor;
import com.google.ar.core.HitResult;
import com.google.ar.core.Plane;
import com.google.ar.sceneform.AnchorNode;
import com.google.ar.sceneform.FrameTime;
import com.google.ar.sceneform.Scene;
import com.google.ar.sceneform.animation.ModelAnimator;
import com.google.ar.sceneform.rendering.AnimationData;
import com.google.ar.sceneform.rendering.ModelRenderable;
import com.google.ar.sceneform.ux.ArFragment;
import com.google.ar.sceneform.ux.BaseArFragment;
import com.google.ar.sceneform.ux.TransformableNode;

public class MainActivity extends AppCompatActivity {

    // Variables
    private ArFragment arFragment;
    private AnchorNode anchorNode;
    private ModelAnimator animator;
    private int nextAnimation;
    private FloatingActionButton btn_anim;
    private ModelRenderable animationCrab;
    private TransformableNode transformableNode;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        arFragment = (ArFragment) getSupportFragmentManager()
                .findFragmentById(R.id.sceneform_fragment);

        // Tap-on-plane event
        arFragment.setOnTapArPlaneListener(new BaseArFragment.OnTapArPlaneListener() {
            @Override
            public void onTapPlane(HitResult hitResult, Plane plane, MotionEvent motionEvent) {
                if (animationCrab == null)
                    return;
                // Create the Anchor
                Anchor anchor = hitResult.createAnchor();
                if (anchorNode == null) { // If the crab is not yet placed on the plane
                    anchorNode = new AnchorNode(anchor);
                    anchorNode.setParent(arFragment.getArSceneView().getScene());
                    transformableNode = new TransformableNode(arFragment.getTransformationSystem());
                    // Scale model
                    transformableNode.getScaleController().setMinScale(0.09f);
                    transformableNode.getScaleController().setMaxScale(0.1f);
                    transformableNode.setParent(anchorNode);
                    transformableNode.setRenderable(animationCrab);
                }
            }
        });

        // Add a frame-update listener to control the state of the button
        arFragment.getArSceneView().getScene()
                .addOnUpdateListener(new Scene.OnUpdateListener() {
                    public void onUpdate(FrameTime frameTime) {
                        if (anchorNode == null) {
                            if (btn_anim.isEnabled()) {
                                btn_anim.setBackgroundTintList(ColorStateList.valueOf(Color.GRAY));
                                btn_anim.setEnabled(false);
                            }
                        } else {
                            if (!btn_anim.isEnabled()) {
                                btn_anim.setBackgroundTintList(
                                        ContextCompat.getColorStateList(MainActivity.this, R.color.colorAccent));
                                btn_anim.setEnabled(true);
                            }
                        }
                    }
                });

        btn_anim = (FloatingActionButton) findViewById(R.id.btn_anim);
        btn_anim.setEnabled(false);
        btn_anim.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                // Cycle through the model's animations, one per click
                if (animator == null || !animator.isRunning()) {
                    AnimationData data = animationCrab.getAnimationData(nextAnimation);
                    nextAnimation = (nextAnimation + 1) % animationCrab.getAnimationDataCount();
                    animator = new ModelAnimator(data, animationCrab);
                    animator.start();
                }
            }
        });

        setupModel();
    }

    private void setupModel() {
        ModelRenderable.builder()
                .setSource(this, R.raw.cangrejo)
                .build()
                .thenAccept(renderable -> animationCrab = renderable)
                .exceptionally(throwable -> {
                    Toast.makeText(this, "" + throwable.getMessage(), Toast.LENGTH_SHORT).show();
                    return null;
                });
    }
}

That concludes our discussion. We’ve built a fully functioning Android augmented reality app. The entire source code is available on GitHub.

Future Directions

You can clone this project from our GitHub repository and begin experimenting with it. Importing several models, modifying properties in the .sfa file, and adding gesture interactions to the models are all things you can practice.

Stay tuned for more articles on ARCore-based Augmented Reality applications.

Finally, a potential direction is to show a loading view with a progress bar while the app waits for the model to finish building, letting the user know when they can proceed to the next step.

Learning Strategies and Tools

Despite the widespread use of augmented reality in many aspects of daily life, it is still new and largely untested in education. Even so, its applications in teaching and learning are vast, and it opens up new avenues for learning. Teachers gain new tools to capture students’ attention and inspire them, while students gain new tools to visualize subjects and complex concepts, as well as to practice practical skills. Parents, too, can benefit from encouraging their children to study with fun apps.

Finally, this article is intended to provide readers with a general introduction to working with augmented reality in Android Studio.

Reflective Analysis

It was a wonderful learning experience to figure out how to use ARCore and the Sceneform SDK on Android. It’s a fantastic toolkit that makes getting started with Augmented Reality a breeze.

AR is behind some of the most popular applications on Android; it’s used, for example, to apply filters to photos and videos. AR is carving out its own niche in many sectors, in a variety of forms and for a variety of purposes. The world-famous Pokemon Go app is just one example of an augmented reality application. In this post, we’ve gone over the fundamentals of creating an Android app with an augmented reality feature.

Finally, I spent 72 hours completing the project and writing this blog post. Everything will eventually be available at this GitHub link.

Link to the previous post: https://blog.learningdollars.com/2020/09/16/how-to-check-internet-connection-programatically-on-android-from-a-button-click-in-kotlin

That’s all for this tutorial!