Android Development at Mobile.de
At mobile.de we support our users by providing the best possible experience with our products. When it comes to Android users, a native Android app is a natural fit. Since mobile devices nowadays make up a significant share of our traffic, quality and user experience have to matter just as much as technical finesse. With this in mind, we work as a tightly integrated team to bring only the best experience to our users. This is how we do it.
Producing a high-quality mobile application is not a one-person job. People with different roles and diverse backgrounds contribute to the overall success of an application. In our case the team structure looks as follows:
Currently we operate with two full-time Android Software Engineers and a Working Student, who are responsible for developing the Android application. Our Backend Engineer provides a backend service and APIs which give all of our mobile apps access to the mobile.de data pool. The QA Engineer keeps a keen eye on the overall quality of the app, from manual testing to automated testing strategies. A Product Manager produces the first drafts of a new feature and serves as the final approval authority once a feature is developed. Along the way we are supported by our UX Designer, who has a clear vision of the overall look and feel of the app and adjusts the app's experience to the needs of our users and our business. Finally, our Team Lead helps us overcome barriers, provides support and acts as an interface to other stakeholders.
Product Development Lifecycle
We organize our work using the Kanban method. We defined a product development lifecycle loop which best fits our needs and visualized it on our Kanban board, so everyone is always aware of what is happening and what will come up next.
This lifecycle loop is laid out as follows:
We start a product development cycle with the planning phase, where we look at the current state of development and decide which features we will work on in the upcoming week. A set of features makes up a release of an app version. Usually the PM prioritizes the features and consults with UX and the Dev team about what makes it into the release and what doesn't.
After refining the product and UX definitions of the features into technical tasks, the development team takes care of the implementation.
To guarantee quality and stability during development, the development team writes automated tests alongside the actual feature implementations. These automated tests are executed whenever new work is committed and pushed to the code repository, providing fast feedback on whether everything within the app works as expected. Manual tests on real end-user devices are also a must, and finally our QA Engineer thoroughly tests each feature implementation before giving it approval.
Once everything is in order and our feature scope has been reached, we publish our application to Google Play.
That is not the end of it, though. Since we are a data-driven company, we analyze each implemented feature in depth and take user feedback very seriously. We analyze user behaviour and the performance of our features to provide the best user experience and to increase the value of our app for our business. Sometimes a feature seems like a great idea, or you think some minor visual changes could benefit the user, but if the data shows that users are no longer achieving their goal, or that the majority is unhappy with certain changes, you might have to act against your initial hypothesis and change things again. Data never lies, which is why we validate a lot: a previously "finished" feature might be refined over and over until it perfectly fits all needs. That's what our product development lifecycle is all about.
When developing any application, a good set of tools and practices is the cornerstone of success. We use widely adopted and well-supported tools to help us do what we do best: focus on our core business logic. The following paragraphs give you an overview of the tools we are currently using.
As a basic setup we use the Eclipse IDE as our code editor of choice. Although there are plenty of other good IDEs, Eclipse currently provides the best support for the tools we use. Android Studio is a strong candidate for a perfectly tailored Android IDE, but since it is still in an early stage of development, we may consider switching at a later point.
We use Ant to build our Android application right now, but we may switch to Gradle as community adoption progresses.
Following best practices for managing your sources is essential, especially when working in larger teams. The most praised distributed source code management tool nowadays is without doubt Git. It helps us keep track of our project and even our design resources, and it enables us to work on our product wherever we are, be it in a remote office or at home.
To continuously receive feedback on our app builds and executed tests we are using the Jenkins continuous integration platform.
Of course, these are just the most basic tools you need to get development going. Since nobody should reinvent the wheel over and over again, we focus on our core business logic and use a couple of prominent community libraries for specialized tasks, so we don't need to invest time tackling them on our own. The following is a list of some of the libraries we currently use in our Android app. Be aware that library usage tends to change rapidly, so this is only a snapshot in time.
- Android Support Library (API Compatibility)
- Gson (JSON Serialization / Deserialization)
- Volley (Networking Abstraction)
- OkHttp (Networking and HTTP Client)
- Picasso (Image Loading and Caching)
- Otto (Event Bus)
- HockeyApp (Crash Reporting)
- Google Analytics (App Usage Statistics)
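To illustrate why an event bus like Otto earns a place on this list, here is a minimal, hypothetical sketch of the publish/subscribe pattern in plain Java. This is not Otto's actual API (Otto uses `@Subscribe` annotations); it only shows the decoupling idea, and the event and class names are invented for illustration.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Minimal sketch of the publish/subscribe pattern an event bus provides.
class SimpleBus {
    private final List<Consumer<Object>> subscribers = new ArrayList<>();

    void register(Consumer<Object> subscriber) {
        subscribers.add(subscriber);
    }

    void post(Object event) {
        // Deliver the event to every registered subscriber.
        for (Consumer<Object> s : subscribers) {
            s.accept(event);
        }
    }
}

// Illustrative event: a background search has finished.
class SearchFinishedEvent {
    final int resultCount;
    SearchFinishedEvent(int resultCount) { this.resultCount = resultCount; }
}

public class BusDemo {
    public static void main(String[] args) {
        SimpleBus bus = new SimpleBus();
        // A fragment would register here to refresh its list view.
        bus.register(event -> {
            if (event instanceof SearchFinishedEvent) {
                int count = ((SearchFinishedEvent) event).resultCount;
                System.out.println("Show " + count + " results");
            }
        });
        // A service posts the event once the backend response is parsed.
        bus.post(new SearchFinishedEvent(42));
    }
}
```

The point of the pattern: the service posting the event and the fragment reacting to it never reference each other directly, which keeps the layers loosely coupled.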
The overall architecture of the mobile.de Android app is divided into just a few layers. The presentation and controller layer, consisting of several Activities hosting Fragments, uses services from the service layer to communicate with our mobile.de backend API. Those services are managed by a service registry which implements dependency injection to handle service dependencies. This injection and abstraction lets us swap service implementations in a testing scenario. Communication with the backend APIs is RESTful, and messages are sent in JSON format. After a service processes the backend response, the results are propagated either via callback methods or via an event bus, telling the presentation layer that something in the UI should change. Here is a quick diagram to visualize this.
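The service-registry idea can be sketched in a few lines of plain Java. This is a hypothetical illustration, not our actual code: the `VehicleService` names are invented, and the production implementation would of course issue real RESTful requests. It shows the one property the paragraph above relies on, namely that a test can swap in a fake implementation without touching any caller.

```java
import java.util.HashMap;
import java.util.Map;

// The UI layer depends only on this interface, never on a concrete class.
interface VehicleService {
    String findVehicle(long id);
}

// Stand-in for the production service that would call the backend API.
class RestVehicleService implements VehicleService {
    public String findVehicle(long id) {
        return "vehicle-" + id + " (from backend)";
    }
}

// A fake used in tests, swapped in via the registry.
class FakeVehicleService implements VehicleService {
    public String findVehicle(long id) {
        return "vehicle-" + id + " (fake)";
    }
}

// Minimal registry: maps a service interface to an implementation instance.
class ServiceRegistry {
    private final Map<Class<?>, Object> services = new HashMap<>();

    <T> void register(Class<T> type, T implementation) {
        services.put(type, implementation);
    }

    <T> T get(Class<T> type) {
        return type.cast(services.get(type));
    }
}

public class RegistryDemo {
    public static void main(String[] args) {
        ServiceRegistry registry = new ServiceRegistry();
        registry.register(VehicleService.class, new RestVehicleService());
        System.out.println(registry.get(VehicleService.class).findVehicle(42));

        // In a test we register the fake; callers are unchanged.
        registry.register(VehicleService.class, new FakeVehicleService());
        System.out.println(registry.get(VehicleService.class).findVehicle(42));
    }
}
```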
Within our mobile teams at mobile.de we have established several quality gates when it comes to testing. The first one in the chain should always be automated testing. Writing tests for all your feature implementations is especially important when you work in a team, where a small change in one component can impact another. Tests provide a quick feedback loop and in some cases can even serve as feature documentation by showing actual use cases.
In our Android app we have two styles of automated testing. We have normal JUnit tests supported by the Robolectric framework, which lets us test our code without relying on a running Android environment: Robolectric provides a simulated Android environment without starting up an emulator, resulting in really quick test execution. Right now we have over 700 JUnit tests running on a daily basis. The other style of automated testing we use is UI instrumentation testing. Google recently released the Espresso UI testing framework, which is tightly synchronized with the UI thread of your application, so every action you define in your tests is guaranteed to be executed one after another. These UI tests are great for testing whole user flows, and you can define a test so that it reflects the normal interaction a user would have with the app under test: you click buttons, swipe views and do everything a user can do with your app. This helps you test visual and flow behaviour. Since you need a running emulator or device for these kinds of tests, they tend to run slower than normal JUnit tests executed on a JVM. We currently have over 240 UI tests. Here you can see one of our UI tests running on a real device.
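To give a feeling for the fast JVM-side tests, here is a hypothetical example. The `PriceFormatter` class is invented for illustration, and plain assertions stand in for JUnit's `assertEquals` (JUnit and Robolectric need their real dependencies, which are omitted here); the shape of the test is what matters: pure logic checked in milliseconds, no emulator involved.

```java
import java.util.Locale;

// Illustrative class under test: the kind of plain logic our JVM tests cover.
class PriceFormatter {
    static String format(long cents) {
        // German locale: '.' groups thousands, ',' separates decimals.
        return String.format(Locale.GERMANY, "%,.2f €", cents / 100.0);
    }
}

public class PriceFormatterTest {
    public static void main(String[] args) {
        // In JUnit these would be assertEquals calls inside @Test methods.
        check(PriceFormatter.format(0), "0,00 €");
        check(PriceFormatter.format(1234567), "12.345,67 €");
        System.out.println("All checks passed");
    }

    static void check(String actual, String expected) {
        if (!actual.equals(expected)) {
            throw new AssertionError("expected " + expected + " but was " + actual);
        }
    }
}
```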
Automated testing gives us a quick feedback loop and covers a lot of aspects of our app, but it doesn't make manual testing obsolete. That's why each feature has to be approved by our QA Engineer or by colleagues who haven't implemented the feature themselves. They are not biased like the original developer, and they interact with the app more naturally than a test would. This helps identify UX shortcomings and corner-case bugs.
Even after the code has passed those two stages, it is still far from being released to the public. Before submitting our Android app to Google Play, we do a so-called session-based test (SBT). In an SBT, everyone is invited to test the newest release for an hour, with a focus on the newly developed features but also on general usage of the app. Feedback and bugs that occur are collected on a board and prioritized afterwards. Found bugs are fixed, and enhancements are either added or moved to an upcoming release.
Now, when the app is in a good state, it is subject to a beta test. A beta test helps us gather feedback from real users who haven't been part of the actual development of the app. If you want to be part of the mobile.de Android app beta testing group, please follow the instructions in our beta testing Google group.
If we want to closely monitor the performance of a new release, we might also decide on a staged rollout via Google Play. If a release is rolled out to only 10% of all our users, errors can be identified quickly before every user is affected. In such a scenario we can react quickly and provide a bug-fixed version of the application.
I already mentioned in the beginning that we at mobile.de are very data driven. To gather data on what works best for our users and our business, we run A/B tests within our released application and monitor users' interactions with those features. This helps us better understand what works for our users and what doesn't. You would be surprised how a hypothesis which seems perfectly logical in the beginning can turn out to be the exact opposite. In the end, every user is presented with the experience that works best.
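One core requirement of any A/B test is that the same user always sees the same variant. Here is a hypothetical sketch of deterministic bucketing in plain Java; our real setup is driven by our analytics tooling, and the names and the hash-based split here are purely illustrative.

```java
// Hypothetical sketch of deterministic A/B bucketing.
class AbTest {
    // The same user id always maps to the same bucket, so a user never
    // flips between variants across sessions.
    static String variant(String userId, int percentInB) {
        int bucket = Math.floorMod(userId.hashCode(), 100);
        return bucket < percentInB ? "B" : "A";
    }
}

public class AbTestDemo {
    public static void main(String[] args) {
        // A 50/50 split: each user is assigned once and stays assigned.
        System.out.println("user-123 sees variant " + AbTest.variant("user-123", 50));
        System.out.println("user-456 sees variant " + AbTest.variant("user-456", 50));
    }
}
```

Conversion metrics are then compared per variant, and the better-performing experience is rolled out to everyone.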
A lot of people are involved in creating our mobile.de Android app, and a lot more than just programming is required to create an app that millions of people use. To give an impression of what it means to be Germany's biggest car sales platform, with Android as the fastest-growing mobile platform, here are some statistics we want to share with you:
- 3.6 million total downloads
- 2.2 million active installs
- 1.3 million active users
- Google Play rating higher than 4.5
- 72% of users on Jelly Bean (4.1 - 4.3)
- 66% of devices are Samsung devices
- 19% of users are Samsung Galaxy S3 users
- 91% of users speak German
- 94% of users are from Germany
- 15 app versions still in use
- 80% of users updating regularly