Creating a WebGL landing page for an Android app.

Right after we released our latest 3D Buddha Live Wallpaper, the idea of making a landing page for the app emerged. After all, what is the best way to show off the features of a live wallpaper? To run it right in the user’s browser, of course. Thanks to WebGL, users can preview exactly the same rich, fully 3D graphics and smooth performance as in the full-featured Android app, right in a mobile browser. This experience goes way beyond watching screenshots or a video of the app on Google Play.

Implementing the landing page was quite easy because we share the same framework between the web and Android platforms. In fact, the web version of the app was used to implement the first draft versions of the scene even before the actual Android app was made, because it is much faster and easier to experiment and tweak things at runtime in a web app than in its Android sibling.

Our custom low-level, lightweight framework used to render the scene doesn’t process input data; it simply loads it to the GPU as is. Input data is already converted into a binary format, so the framework uploads each geometry resource to the GPU immediately after it is downloaded, without converting, parsing, or any other processing. Texture data is downloaded as PNG images, so it is converted before being passed to the GPU, but this work is done by the browser and WebGL/drivers, so the overhead is negligible. It is not a full-featured framework, and it obviously lacks a lot compared to engines like three.js or Pixi.js. But its goal was different: to be minimalistic and to match the Android rendering code as closely as possible.
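As a rough sketch of what such a direct upload can look like (the `uploadGeometry` and `loadGeometry` helpers and their signatures are hypothetical illustrations, not the framework’s actual API), the downloaded `ArrayBuffer` is handed to `gl.bufferData` untouched:

```javascript
// Hypothetical sketch: geometry arrives as a raw binary buffer and is
// uploaded to the GPU verbatim, with no parsing or conversion step.
function uploadGeometry(gl, arrayBuffer) {
  const buffer = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
  gl.bufferData(gl.ARRAY_BUFFER, arrayBuffer, gl.STATIC_DRAW); // as is
  return buffer;
}

// Usage: fetch the pre-converted binary resource and upload it directly.
async function loadGeometry(gl, url) {
  const response = await fetch(url);
  return uploadGeometry(gl, await response.arrayBuffer());
}
```

Because no intermediate text format (OBJ, JSON, glTF) is parsed, upload cost is just the memory copy into the driver.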

Scene composition

The scene is composed of just five objects: the table, the Buddha statue, the sky, the light shafts, and the dust. They are drawn in a fixed order with a single draw call per object (except the dust particles, which take 8 draw calls; more on that later), each using a specific shader. This minimizes OpenGL state changes and improves performance. The whole scene is rendered in just 12 draw calls.

Objects are rendered in a fixed order: first the opaque objects, in reverse painter’s order (nearest first, so the depth test rejects hidden fragments early), then the transparent ones. This again improves performance by reducing overdraw.
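A minimal sketch of this ordering (the object shape and `sortForDrawing` helper are hypothetical, for illustration only):

```javascript
// Sketch of the fixed draw order: opaque objects go first, sorted
// near-to-far (reverse painter's order) so the depth test can reject
// occluded fragments early; transparent objects are drawn afterwards.
function sortForDrawing(objects) {
  const opaque = objects
    .filter((o) => !o.transparent)
    .sort((a, b) => a.distanceToCamera - b.distanceToCamera); // nearest first
  const transparent = objects.filter((o) => o.transparent);
  return opaque.concat(transparent);
}
```

In this demo the order is hard-coded rather than sorted per frame, since the camera orbit never changes which object is in front.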

Shaders used in demo

Below I describe the GLSL shaders used to render the scene. You can check out the source of each of them here. Exactly the same shaders are used in the Android live wallpaper.

DiffuseShader. The simplest shader, which simply renders a texture on the model’s triangles according to the provided texture coordinates. It is used to render the sky.

SphericalMapLMShader. This shader is used to render the Buddha statue. It uses normal and spherical mapping to visualize a shiny metallic surface. It is the same shader used to draw coins in the Bitcoins live wallpaper and 3D demo; you can read about it in our Porting Android live wallpaper to WebGL post.

LMTableShader. This shader is used to render the table surface with shadows from the statue baked into a lightmap. The table surface also fades away at the edges of the round area to blend with the background. It takes two textures: diffuse and lightmap. The lightmap uses channel packing, and only its red and green channels are used, which means the lightmap is essentially monochromatic. The shadow factor is the product of the red and green channel values, and the diffuse color is multiplied by it. So at the center of the texture, where the shadow of the Buddha forms a green spot, the result is quite dark, while the yellow color of the main area yields a lightmap value close to 1.0. The opacity of the table is determined by the green channel.
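The per-pixel math described above boils down to a couple of multiplications. Here is a transcription of it as a plain function (the `shadeTablePixel` name and object shape are illustrative, not the actual shader code; channel values are in the 0..1 range):

```javascript
// Sketch of the LMTableShader per-pixel math: lightmap.r * lightmap.g
// gives the monochromatic shadow factor, the diffuse color is multiplied
// by it, and lightmap.g doubles as the opacity that fades the table edges.
function shadeTablePixel(diffuse, lightmap) {
  const shadow = lightmap.r * lightmap.g;
  return {
    r: diffuse.r * shadow,
    g: diffuse.g * shadow,
    b: diffuse.b * shadow,
    a: lightmap.g, // opacity comes from the green channel alone
  };
}
```

Packing shadow and opacity into two channels of one texture saves a separate opacity mask and keeps the lightmap small.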

The following image explains how the lightmap and opacity affect the final image:

LightShaftShader. This shader is used to render the light shafts with a moving effect. It uses a rim lighting formula to smoothly show and hide the light shafts depending on the angle between the triangle surface and the camera. The light shaft geometry itself is just a few sheets with normals and texture coordinates.
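The classic rim lighting term is one minus the dot product of the surface normal and the view direction, raised to a power. A sketch of that factor (the `rimFactor` name and exponent handling are illustrative assumptions, not the shader’s exact code):

```javascript
// Sketch of a rim lighting factor: 0 when the surface faces the camera
// head-on, approaching 1 as it turns edge-on. `power` sharpens the falloff.
function rimFactor(normal, viewDir, power) {
  const d = normal.x * viewDir.x + normal.y * viewDir.y + normal.z * viewDir.z;
  const rim = 1.0 - Math.abs(d);
  return Math.pow(rim, power);
}
```

Using this factor as the sheet’s alpha makes each light shaft fade out smoothly as the camera swings around to face it, which hides the fact that the "shafts" are flat geometry.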

PointSpriteScaledColoredShader. This shader is used to efficiently render a large number of point sprites. It scales point sprites according to their Z coordinate, so the closer a sprite is to the camera, the larger it is. gl.POINTS primitives are used to render each dust cloud model, which contains 20 coordinates. So with just 8 draw calls we are able to render an impressively dense cloud of 160 dust particles. Each dust cloud slowly rotates in a random direction, and because these objects overlap, it is virtually impossible to notice that some particles belong to the same cloud and orbit around the same pivot.
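In the vertex shader this kind of scaling is typically a base size divided by the eye-space depth before it is written to gl_PointSize. A transcription of that idea (the `pointSize` helper and its clamp constant are illustrative assumptions):

```javascript
// Sketch of point sprite scaling: the larger the eye-space depth of a
// particle, the smaller its point size ends up on screen.
function pointSize(baseSize, eyeDepth) {
  return baseSize / Math.max(eyeDepth, 0.001); // clamp avoids division by zero
}
```

Because the whole cloud is a single gl.POINTS draw, scaling happens per vertex with no extra geometry, which is what makes 160 particles in 8 draw calls cheap.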

Size and Speed Optimizations

The total size of the transferred content is only 434 kB gzipped, so the demo loads all content pretty fast even on mobile connections. So yes, it can fit on a floppy disk. Three times.

Such a small size is achieved through a few optimizations. First, the page doesn’t use a full-featured 3D engine like three.js, only our own minimal framework. For comparison, the JS code of the widely used three.js library alone is 135 kB gzipped.

To make the content even smaller, we compressed PNG images down to 8-bit palettes wherever quality is not crucial. We left images uncompressed where the quality degradation from reducing colors is clearly visible, so the sky and normal map textures remain full-color 24-bit images.

Result

You can check out the web page here and the Android app here, and compare them. And of course, feel free to check out the source code on GitHub here.

Originally published at androidworks-kea.blogspot.com.

Front-end developer making 3D live wallpaper apps for Android.