In my previous article, I gave an introduction to the built-in features of the Android platform for supporting screens of various sizes and densities. In this article, I am going to expand on this and show you actual code for achieving screen-independence in an app. My example will be a Processing app (as that’s my own primary use case), but the ideas should apply equally well to any game or graphics-centric app.
Let’s use the game “Chimpzilla Attacks!” as an example. I made up this game specifically for the purpose of this tutorial—and already spent way too much time on the mockup. I have no idea what the game mechanics are, but judging from the cheesy graphics, it’s got to be some kind of “Punch The Monkey” knock-off. Anyway, back to the tutorial…
In my ongoing effort to port a desktop Processing application to Android, I am now trying to add support for multiple screen sizes. The goal is that the same .apk works equally well on, say, a 3.7″ Milestone, a 7″ Galaxy Tab, and a multitude of other screens. The Android Dev Guide has a comprehensive page that explains the platform features for supporting different screens. I did my own tests and experiments to better understand the concepts. This article explains my findings and will hopefully save you some time and work.
I added sound playback to the Processing app that I’m currently porting to Android. On the desktop, I used the Minim sound library, which is not supported on Android (although there is some discussion about alternatives and a possible port). All I really need from a sound library is a way to play several .ogg (or .mp3 or .wav) files simultaneously and asynchronously, query their playback durations, maybe set the playback volume of individual sounds, and stop playing sounds at any time.
In general, a sound can be played using the following code:
MediaPlayer snd = new MediaPlayer();
snd.setDataSource(path_or_url_of_sound_to_play);
snd.prepare();  // setDataSource() and prepare() can throw IOException
snd.start();
My own app is a game that contains only a handful of .ogg files. I keep a HashMap of the MediaPlayer instances for all sounds in the game, with prepare() already called on them. Then I simply call start() whenever an event triggers the sound in the game.
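The approach described above can be sketched roughly as follows. This is a minimal illustration, not the actual game code: the class name, the load/play method names, and the idea of keying sounds by a string are my own assumptions. MediaPlayer.create() is a convenience factory that calls prepare() internally, so the players in the map are ready to start() immediately.

```java
import java.util.HashMap;

import android.content.Context;
import android.media.MediaPlayer;

// Hypothetical sound bank: one prepared MediaPlayer per sound,
// keyed by a name chosen by the app.
class SoundBank {
    private final HashMap<String, MediaPlayer> sounds =
            new HashMap<String, MediaPlayer>();

    // Load a raw resource (e.g. R.raw.punch) and keep it prepared.
    void load(Context ctx, String name, int resId) {
        sounds.put(name, MediaPlayer.create(ctx, resId));
    }

    // Trigger a sound from a game event.
    void play(String name) {
        MediaPlayer snd = sounds.get(name);
        if (snd != null) {
            snd.seekTo(0);  // rewind so repeated triggers restart the sound
            snd.start();
        }
    }

    // Free the native resources, e.g. from the activity's onDestroy().
    void release() {
        for (MediaPlayer snd : sounds.values()) {
            snd.release();
        }
        sounds.clear();
    }
}
```

Because each sound gets its own MediaPlayer instance, several sounds can play simultaneously, and individual players can be stopped or have their volume adjusted (setVolume()) without affecting the others.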
Following up on my “First Steps” article, I did some more Processing development on Android. I ran into some problems, solved some, and was able to start my ported Processing program inside the emulator.
This article is still not about actual Processing source code. All I did was fiddle with the Android SDK tools some more.
Building Without Eclipse
Eclipse is kind of a resource hog (yeah, I know, “I told you so” just ain’t saying it). Therefore, I took a closer look at the command line tools from the Android SDK to replace it (at least part of the time).
To convert an existing Eclipse Android project to an Ant project that you can build from the command line, use the “android” tool from the SDK:
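A typical invocation looks like this; the project name and the target ID are placeholders for your own values (you can list the available target IDs with "android list targets"):

```shell
# Convert the Eclipse project in the current directory to an Ant build.
# --name and --target are illustrative values, not from my actual project.
android update project --path . --name MyApp --target android-7
```

This generates a build.xml, after which "ant debug" builds a debug .apk into the bin/ directory from the command line, no Eclipse required.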
I got into mobile development recently. I ported an SDL-based game to the Openmoko FreeRunner and to an iPhone 3GS. Now I’m planning to do some Android development on a Motorola Milestone (the European version of the Droid).
I have previously been using the Processing programming environment on the desktop, so I was delighted to learn that it is now available for Android.
(“Processing is an open source programming language and environment for people who want to create images, animations, and interactions.” It won a Prix Ars Electronica award in the “Net Vision / Net Excellence” category in 2005.)
I am by no means an Android or Processing expert. This article serves mainly as documentation for myself about the steps to build a simple application. If it is also useful for others, so much the better.