Throughout this module, we'll use an example program to apply the theory and the concepts we cover. Let me briefly talk you through its structure. Like the previous example program, I define the MainActivity class, then the MyRenderer class, and then the MyView class. I've also added a new class called Sphere in this example program.

The MainActivity class is similar to the previous example program: I create a new MyView object and then set the content view to this newly created object.

In the MyView class, I've added functions to enable touch control. I override the onTouchEvent function to capture the MotionEvents ACTION_POINTER_UP, ACTION_POINTER_DOWN, and ACTION_MOVE, which detect whether the user is touching the screen and whether the user is panning on it. I've also added a function to detect a pinch gesture: based on the distance between the multi-touch pointers, it allows the user to zoom in and out of their view of the 3D object, by calling the setZoom function in MyRenderer. For panning, I calculate the displacement between the current touch event and the previous touch event, and then use it to rotate the 3D object by calling setYAngle and setXAngle. This means the user can rotate the object by panning on the screen. To detect a pinch gesture, I added a function called distance, which calculates the distance between the two multi-touch pointers, and a function called isPinchGesture, which detects whether a pinch gesture is in progress based on that distance.

In MyRenderer, I calculate the projection matrix, the view matrix, and the model matrix to draw the 3D object on the screen. I've also added a function to set the rotation matrix from mYAngle and mXAngle, which are set by the touch events. Finally, I call the draw function of the Sphere to draw the 3D sphere on the screen.
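As a minimal sketch of the pinch-detection helpers described above: the method names distance and isPinchGesture come from the lecture, but the standalone class, the parameter layout, and the threshold value are my own assumptions for illustration.

```java
// Hypothetical helper class sketching the pinch-detection logic in MyView.
// In the real program these methods would read pointer coordinates from the
// MotionEvent (e.g. event.getX(0), event.getY(1)) inside onTouchEvent.
public class TouchHelper {
    // Assumed minimum pointer separation (in pixels) for a valid pinch.
    private static final float PINCH_THRESHOLD = 50f;

    // Distance between two touch pointers (x0, y0) and (x1, y1).
    public static float distance(float x0, float y0, float x1, float y1) {
        float dx = x0 - x1;
        float dy = y0 - y1;
        return (float) Math.sqrt(dx * dx + dy * dy);
    }

    // Treat the two pointers as a pinch when they are far enough apart;
    // comparing successive distances then tells us whether to zoom in or out.
    public static boolean isPinchGesture(float x0, float y0, float x1, float y1) {
        return distance(x0, y0, x1, y1) > PINCH_THRESHOLD;
    }
}
```

In the actual MyView, the change in this distance between successive ACTION_MOVE events would drive the call to setZoom in MyRenderer, while single-pointer displacement drives setXAngle and setYAngle.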
In the Sphere class, I define and draw a 3D sphere with lighting effects on the screen. In the vertex shader, I added the variables and functions to enable the ambient, diffuse, and specular lighting effects on the object; I calculate the respective light weighting and pass it on to the fragment shader, so the shaders can draw the fragments with the lighting effects. For more details about the lighting effects, please refer to my video lectures on lighting and illumination.

To draw the 3D sphere, I added a function called createSphere, which calculates the vertices of the points on the surface of a sphere. For more details about how to create a 3D sphere with OpenGL, please refer to my video lectures on OpenGL ES spheres.

In this example program, I would like to overlay the 3D sphere with a map of the world. So I added a function called LoadTexture, which loads the resource, an image of the world, as a texture object. I added a texture data handle that points to this loaded texture image, and then call glActiveTexture and glBindTexture to load the texture and bind it to the shader, allowing the shaders to overlay the 3D object with the 2D image of the world.

When you run the program, you'll see an image of the globe. You can then use touch events to rotate or pan the globe around, and use the pinch gesture to zoom in or out of your view of the 3D globe.
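The vertex generation in createSphere is typically done with a latitude/longitude tessellation. The sketch below shows that idea in plain Java; the function name comes from the lecture, but the class name, parameters, and interleaved x,y,z layout are assumptions, and the lecture's actual Sphere class may also generate normals and texture coordinates for the lighting and world-map overlay.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of a latitude/longitude sphere tessellation, in the
// spirit of the createSphere function described above.
public class SphereBuilder {
    // Returns interleaved x,y,z positions for (stacks+1) x (slices+1) vertices.
    public static float[] createSphere(float radius, int stacks, int slices) {
        List<Float> verts = new ArrayList<>();
        for (int i = 0; i <= stacks; i++) {
            double phi = Math.PI * i / stacks;             // 0 .. pi, pole to pole
            for (int j = 0; j <= slices; j++) {
                double theta = 2.0 * Math.PI * j / slices; // 0 .. 2*pi around
                verts.add((float) (radius * Math.sin(phi) * Math.cos(theta)));
                verts.add((float) (radius * Math.cos(phi)));
                verts.add((float) (radius * Math.sin(phi) * Math.sin(theta)));
            }
        }
        float[] out = new float[verts.size()];
        for (int k = 0; k < out.length; k++) out[k] = verts.get(k);
        return out;
    }
}
```

Because theta maps directly to longitude and phi to latitude, the same loop indices can be reused as (u, v) texture coordinates, which is what lets a 2D world map wrap cleanly around the sphere.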