One of the issues with the Gecko port to Android is that, early on, I used some internals to tie in to the Android graphics system from native code. This worked fine, but it complicated the build: you needed to pull in a bunch of headers and some libraries from the actual Android source to be able to complete a build.
The solution for this was pretty easy: move to OpenGL for rendering. However, there are some interesting quirks here. I’m targeting Android 2.x only: specifically the Motorola Droid, HTC Nexus One, and an NVIDIA Tegra 250 devkit I have here. For this initial step, all I need is to draw a textured quad. We’ve got full OpenGL compositing, rendering, fancy video decoding and all that stuff coming later, but for now we’re just hooking into our software rasterizer, uploading the result as a texture, and drawing a textured quad. Easy, right? Here are some random issues I ran into while doing this over the past day or two.
First Attempt: OpenGL ES 1.1
Well, there are two wrinkles. First, Cairo’s software rasterizer uses a 32-bit ARGB pixel format and layout. In little-endian per-byte terms, that’s B G R A. OpenGL ES’s native format is RGBA; in the same per-byte terms, that’s R G B A. There is an EXT_bgra extension that adds support for GL_BGRA as another pixel format, and a variant of this extension is potentially available on GL ES. The second wrinkle is that this quad is display-sized, so the texture is display-sized too; it’s not going to have power-of-two dimensions. While OpenGL ES 2.0 supports non-power-of-two textures in the core spec (with some limitations, which are not relevant for my use case), ES 1.1 does not. Given that all I was drawing was a textured quad, I figured I may as well use ES 1.1.
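On an ES 1.1 target without NPOT support, one workaround is to allocate the texture at the next power-of-two size and only use (and draw from) the display-sized subrectangle. A tiny hypothetical helper for the rounding (not the actual Gecko code):

```c
#include <stdint.h>

/* Round a texture dimension up to the next power of two, for ES 1.1
 * targets that lack NPOT texture support. */
static uint32_t next_pot(uint32_t n)
{
    uint32_t pot = 1;
    while (pot < n)
        pot <<= 1;
    return pot;
}
```

So an 854x480 Droid-sized surface would end up in a 1024x512 texture, with texture coordinates scaled accordingly.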
Unfortunately, the three devices I mentioned above support different combinations of these. The NVIDIA device supports both EXT_bgra and ARB_texture_non_power_of_two. This is perfect; no workarounds are needed here, though for some reason it doesn’t like TexSubImage2D with BGRA data, but that’s not a big deal. The Droid (OMAP3, PowerVR SGX) supports EXT_texture_format_BGRA8888 (note: different name, similar functionality), so that’s good, but it doesn’t support non-power-of-two textures with ES 1.1. The Nexus One, on the other hand, supports neither BGRA nor NPOT textures.
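Since the extension names differ per GPU, you end up probing the string from glGetString(GL_EXTENSIONS) at runtime. A sketch of a correct token check (a bare strstr can false-positive on an extension whose name is a prefix of a longer one); the function below takes the extension string as a parameter just for illustration:

```c
#include <string.h>

/* Return 1 if `name` appears as a complete space-delimited token in
 * `extensions` (as returned by glGetString(GL_EXTENSIONS)). */
static int has_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;
    while ((p = strstr(p, name)) != NULL) {
        int starts = (p == extensions) || (p[-1] == ' ');
        int ends   = (p[len] == ' ') || (p[len] == '\0');
        if (starts && ends)
            return 1;
        p += len;
    }
    return 0;
}
```

With this you can test for GL_EXT_bgra on the Tegra and GL_EXT_texture_format_BGRA8888 on the SGX and pick an upload path accordingly.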
I was about to start using OES_draw_texture as well, because that seemed like a potentially faster way to get what I want to happen — but the lack of BGRA support on the Nexus One made me turn to ES2, where I can do the RGBA->BGRA swizzle in the fragment shader.
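The shader-side swizzle is a one-liner: upload the Cairo output as GL_RGBA (so red and blue arrive swapped) and swap them back when sampling. A minimal sketch of such a fragment shader, here as a C string constant; the variable and uniform names are made up for illustration:

```c
/* Fragment shader that samples a texture whose bytes are actually
 * BGRA but were uploaded as RGBA, swapping red and blue on output. */
static const char frag_src[] =
    "precision mediump float;\n"
    "varying vec2 vTexCoord;\n"
    "uniform sampler2D uTexture;\n"
    "void main() {\n"
    "    gl_FragColor = texture2D(uTexture, vTexCoord).bgra;\n"
    "}\n";
```

The .bgra swizzle is its own inverse, so the same shader works whichever direction you think of the swap.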
Undefined Symbols in GLESv2 Import Library
More fun! The Android r3 NDK includes GLESv2 support, yay! The bad news is that libGLESv2.so includes an external reference to _ZN7android33egl_get_image_for_current_contextEPv (android::egl_get_image_for_current_context), which means you’ll get linker errors (or at least undefined symbol errors) if you try to link anything that’s not a shared library. Conveniently, that’s what you need to produce with the NDK anyway, but if you have some helper command line tools along the way, they’ll fail. The solution is to add -Wl,--allow-shlib-undefined to your binary compile/link step.
After that, this was fairly straightforward, though the SDK only grudgingly allows you to specify the necessary EGL tokens for GLES2; the code samples in the NDK all just provide explicit integer values for them inside the code.
Choosing an EGLConfig
This applies to both OpenGL ES 1 and OpenGL ES 2 on Android. When creating an EGLSurface for a SurfaceView (take a look at how GLSurfaceView does it for the details), you have to get an EGLConfig that has an exact match for the number of red/green/blue/alpha bits as your surface. There’s a format parameter to surfaceChanged that’s supposed to tell you the format of the surface. However, it seems to always show up as -1, which according to PixelFormat.java is “OPAQUE”. That’s not very helpful. Reading GLSurfaceView, it can also show up as -2, which is TRANSPARENT. So you have to assume that if you have an OPAQUE surface format, it’ll be 5650, and if you have a TRANSPARENT format it’ll be 8888. This is pretty silly, as there are PixelFormat constants for handy things like RGBA_8888, RGB_565, RGB_888, RGBA_5551, etc. Why doesn’t SurfaceView send the actual format down?
The devices that I have seem pretty consistent at least with 565 for OPAQUE, so it works OK, but it’s not pretty, and will likely blow up spectacularly if anyone introduces, say, a large-display Android device that uses 24bpp color.
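The guessing game above boils down to a small mapping. A sketch in C, assuming the OPAQUE/TRANSPARENT values from PixelFormat.java (the struct and function names are mine, not Android API):

```c
/* Channel bit sizes to request from eglChooseConfig(), guessed from
 * the surfaceChanged() format: OPAQUE (-1) -> RGB 565 with no alpha,
 * TRANSPARENT (-2) -> RGBA 8888. */
#define ANDROID_PIXEL_FORMAT_OPAQUE      (-1)
#define ANDROID_PIXEL_FORMAT_TRANSPARENT (-2)

struct config_bits { int r, g, b, a; };

static struct config_bits bits_for_format(int format)
{
    struct config_bits bits = { 5, 6, 5, 0 };   /* assume 5650 */
    if (format == ANDROID_PIXEL_FORMAT_TRANSPARENT)
        bits.r = bits.g = bits.b = bits.a = 8;  /* 8888 */
    return bits;
}
```

These bit counts then become the EGL_RED_SIZE/EGL_GREEN_SIZE/EGL_BLUE_SIZE/EGL_ALPHA_SIZE attributes in the config search, since EGL wants an exact match against the surface.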
Another config issue is that some GPUs have odd requirements for getting the best performance; for example, as discovered via searching, the PowerVR SGX in the Droid really wants a 24-bit depth buffer, as that’s faster than either 0 or 32. The Tegra, on the other hand, doesn’t have 24-bit depth at all, only 0 or 16 (and I don’t think it cares one way or the other). I’m not sure whether the GPU in the Nexus One cares or has a preference. So, you have to search for a 24-bit depth config first, use it if it’s found, and fall back to 0 if not. I suppose an alternate approach might be to search for 16-bit depth, but that might give you 32-bit if that happens to be supported somewhere.
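The fallback itself is trivial once you have the candidate configs; here’s the selection logic in isolation, with the array standing in for the EGL_DEPTH_SIZE values of the configs eglChooseConfig() returned (a sketch, not the real Gecko code):

```c
/* Prefer a 24-bit depth buffer (what the PowerVR SGX wants), and fall
 * back to no depth buffer at all if the GPU doesn't offer 24-bit. */
static int pick_depth_size(const int *available, int count)
{
    for (int i = 0; i < count; i++)
        if (available[i] == 24)
            return 24;
    return 0;
}
```

In real code you’d run eglChooseConfig() once with EGL_DEPTH_SIZE set to 24, and if nothing comes back, run it again asking for 0.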
At the end of all of this though, I have an app that uses OpenGL ES 2 on three different Android devices (with three different GPUs).