# Android connection points
When you are developing on Android you will most likely use the Java API to integrate artoolkitX into your apps. As you might know, Android also supports native development using the NDK, which would allow you to use the ARController interface directly from C++. On this page, however, we are only going to talk about the usage of the Java API.
There are several classes and interfaces that you need to know about when you are developing on Android. These classes and interfaces are provided to you inside the ARXJ component (ARXJ.aar). The most important classes are ARRenderer and ARActivity. As you most likely know, in Android the Activity class is used to control your app's lifecycle and to create your app's different screens.
ARActivity is an abstract class in which artoolkitX handles the lifecycle of an artoolkitX application. This also includes opening and controlling the camera stream. Your implementation of ARActivity can be fairly simple. You only need to provide the artoolkitX framework with a content view on which to render the camera stream and the augmented reality objects, and an instance of an ARRenderer implementation.
```java
import android.os.Bundle;
import android.widget.FrameLayout;
// ARActivity and ARRenderer are provided by the ARXJ component (ARXJ.aar).

/**
 * A very simple example of extending ARActivity to create a new AR application.
 */
public class AR2dTrackingActivity extends ARActivity {

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
    }

    /**
     * Provide our own AR2dTrackingRenderer.
     */
    @Override
    protected ARRenderer supplyRenderer() {
        return new AR2dTrackingRenderer();
    }

    /**
     * Use the FrameLayout in this Activity's UI.
     */
    @Override
    protected FrameLayout supplyFrameLayout() {
        return (FrameLayout) this.findViewById(R.id.mainFrameLayout);
    }
}
```
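The activity above sets a content view, `R.layout.main`, and returns a FrameLayout with the id `mainFrameLayout` from `supplyFrameLayout()`; artoolkitX uses that view to render the camera stream and the augmented reality objects. A minimal sketch of what such a layout file (`res/layout/main.xml`) might contain is shown below; the file name and id are taken from the Java code above, while the rest is an assumption:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Hypothetical minimal layout: a single FrameLayout that artoolkitX
     can fill with the camera preview and GL surface at runtime. -->
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/mainFrameLayout"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
```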
Inside your ARRenderer implementation you create the AR scene and tell artoolkitX which trackables you would like to use (see `configureARScene()`). You can also configure artoolkitX inside the ARRenderer. Finally, the ARRenderer class must have a `draw()` function, which is called for every video frame received from the camera. Inside the `draw()` function you query the artoolkitX API to find out whether any of your trackables are visible to the camera. If so, you get the transformation matrix and can also query for the projection matrix. To access this information you use the `ARController` class, which is implemented as a singleton.
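The trackable configuration passed to `addTrackable()` in the renderer example below is a semicolon-separated string: the tracker type (here `2d`), the path to the marker image, and the marker width. As a small illustration, such a string could be assembled like this (the helper class and method names are hypothetical, not part of the artoolkitX API):

```java
// Hypothetical helper that assembles a 2D-trackable configuration string
// of the form "2d;Data/<image>;<width>", matching the string built inline
// in the renderer example below.
public final class TrackableConfig {

    private TrackableConfig() {}

    public static String twoD(String imageFileName, float widthInUnits) {
        return "2d;Data/" + imageFileName + ";" + widthInUnits;
    }
}
```

For the marker used in this example, `TrackableConfig.twoD("pinball.jpg", 1.0f)` yields `"2d;Data/pinball.jpg;1.0"`.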
```java
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.opengl.GLES20;
// ARController, ARRenderer and Trackable are provided by the ARXJ component (ARXJ.aar).

/**
 * A very simple Renderer that adds a marker and draws a cube on it.
 */
class AR2dTrackingRenderer extends ARRenderer {

    private SimpleShaderProgram shaderProgram;
    private static final Trackable[] trackables = new Trackable[]{
            new Trackable("pinball.jpg", 1.0f)
    };
    private final int[] trackableUIDs = new int[trackables.length];
    private Cube cube;

    /**
     * Markers can be configured here.
     */
    @Override
    public boolean configureARScene() {
        int i = 0;
        for (Trackable trackable : trackables) {
            trackableUIDs[i] = ARController.getInstance().addTrackable("2d;Data/" + trackable.getName() + ";" + trackable.getWidth());
            if (trackableUIDs[i] < 0) return false;
            i++;
        }
        return true;
    }

    // Shader calls must run on the GL thread, i.e. inside onSurfaceChanged(),
    // onSurfaceCreated() or onDrawFrame(). Because the cube instantiates the
    // shader during its setShaderProgram() call, we create the cube here.
    @Override
    public void onSurfaceCreated(GL10 unused, EGLConfig config) {
        this.shaderProgram = new SimpleShaderProgram(new SimpleVertexShader(), new SimpleFragmentShader());
        cube = new Cube(40.0f, 0.0f, 0.0f, 20.0f);
        cube.setShaderProgram(shaderProgram);
        super.onSurfaceCreated(unused, config);
    }

    /**
     * Override the draw function from ARRenderer.
     */
    @Override
    public void draw() {
        super.draw();

        GLES20.glEnable(GLES20.GL_CULL_FACE);
        GLES20.glEnable(GLES20.GL_DEPTH_TEST);
        GLES20.glFrontFace(GLES20.GL_CCW);

        // Look for trackables, and draw on each found one.
        for (int trackableUID : trackableUIDs) {
            // If the trackable is visible, apply its transformation, and render a cube.
            float[] modelViewMatrix = new float[16];
            if (ARController.getInstance().queryTrackableVisibilityAndTransformation(trackableUID, modelViewMatrix)) {
                float[] projectionMatrix = ARController.getInstance().getProjectionMatrix(10.0f, 10000.0f);
                cube.draw(projectionMatrix, modelViewMatrix);
            }
        }
    }
}
```
Finally, you also need to write a shader program, which must consist of a VertexShader and a FragmentShader implementation. artoolkitX provides simple shader examples and a convenience `OpenGLShader` interface which you can use for this task. In the example code above you can see `SimpleVertexShader` and `SimpleFragmentShader` being used inside the `onSurfaceCreated()` function.
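A vertex/fragment shader pair of this kind ultimately boils down to two GLSL ES 2.0 source strings. As an illustration only, not the actual artoolkitX shader source, a minimal pair could look like the following; the uniform and attribute names (`u_mvpMatrix`, `a_position`) are assumptions, not artoolkitX identifiers:

```java
// Illustrative GLSL ES 2.0 shader sources of the kind a simple shader
// program wraps. All identifier names here are hypothetical.
public final class SimpleShaderSources {

    private SimpleShaderSources() {}

    // Vertex shader: transforms each vertex by a combined
    // model-view-projection matrix.
    public static final String VERTEX_SHADER =
            "uniform mat4 u_mvpMatrix;\n" +
            "attribute vec4 a_position;\n" +
            "void main() {\n" +
            "    gl_Position = u_mvpMatrix * a_position;\n" +
            "}\n";

    // Fragment shader: paints every fragment a single solid colour.
    public static final String FRAGMENT_SHADER =
            "precision mediump float;\n" +
            "void main() {\n" +
            "    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\n" +
            "}\n";
}
```

On Android, sources like these would be compiled on the GL thread (for example in `onSurfaceCreated()`) with `GLES20.glCreateShader()`, `glShaderSource()` and `glCompileShader()`.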
Some information about OpenGL ES can be found on the Android developer website (scroll down to the OpenGL ES 2.0 section), and an overview of OpenGL ES 2.0 can be found in the official documentation from Khronos.