Android is the world’s most widely used operating system. And we’re huge fans of it here at Conseal.
But building Android apps can present some unique security challenges for developers. These can stem, for example, from the way Android operates, the hardware it runs on, or the platform’s Linux heritage.
This blog post outlines some of the things that our code auditors and penetration testers look for when reviewing Android apps for security vulnerabilities.
Android versioning and hardware issues
There are three billion active Android devices worldwide, made by hundreds of manufacturers. Each manufacturer has its own release schedule for OS updates, and providing new Android releases for aging hardware can be a difficult process. The result is a large population of active devices that are 3-5 years old and running a version of Android just as old.
Developers therefore have to support older versions of Android. Google recognises this need and provides excellent support for it, via AppCompat (part of AndroidX). This makes support for even very old versions of Android almost automatic. And so it’s common to see apps supporting Android versions from 6 or 7 years ago. That’s great for consumers, great for limiting e-waste and great for the platform.
But it’s not great for security, which needs to be handled carefully as a result.
To take one example, Android once gave apps much greater access to storage. It was extremely easy for a developer to accidentally save private data to a location which other apps could read. These days, the advent of Scoped Storage in Android 10 (and improved since then) has made this virtually impossible.
But if developers are supporting older versions of Android, their apps may well still be susceptible to this issue when running on those versions. And so, despite it having already been addressed very effectively in the OS, it still needs to be on developers’ radars.
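As a minimal sketch of the safe pattern: sensitive files belong under the app-private directory, which on Android comes from `context.getFilesDir()` and is readable only by the app itself, never under shared external storage, which other apps could read on pre-Scoped-Storage versions. The `privateDir` parameter and file names here are illustrative.

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

// Sketch: keep sensitive files under the app-private directory.
// On Android, privateDir would be context.getFilesDir(), which only
// the app itself can read -- never external storage, which other apps
// could read on pre-Scoped-Storage versions of the OS.
public class PrivateFiles {
    public static File writePrivateFile(File privateDir, String name, byte[] data)
            throws IOException {
        // Reject names that could escape the private directory.
        if (name.contains("/") || name.contains("..")) {
            throw new IllegalArgumentException("no path traversal in file names");
        }
        File f = new File(privateDir, name);
        try (FileOutputStream out = new FileOutputStream(f)) {
            out.write(data);
        }
        return f;
    }
}
```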
Dynamically loaded code
Android supports dynamically loading code at runtime. This is a massively powerful feature which allows apps to decentralise their logic, and it is what allows Android to run certain types of app that other operating systems simply couldn’t support.
But it demands significant security consideration. Developers need to make sure that dynamically loaded code is safe, and that’s no small feat. Code signing is part of the answer, but it brings its own minefield of cryptographic considerations.
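One common mitigation, sketched below under assumed names: pin the expected SHA-256 digest of a downloaded module, and refuse to load anything that doesn’t match. On Android the verified file would then be handed to a `DexClassLoader`; a full solution would verify a signature rather than a bare hash.

```java
import java.io.File;
import java.nio.file.Files;
import java.security.MessageDigest;

// Sketch: verify a downloaded code module against a pinned SHA-256
// digest before handing it to a class loader (DexClassLoader on
// Android). A mismatch means corruption or tampering, so the module
// must not be loaded.
public class ModuleVerifier {
    public static String sha256Hex(File file) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-256")
                .digest(Files.readAllBytes(file.toPath()));
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    public static boolean verifyModule(File file, String pinnedSha256) throws Exception {
        return sha256Hex(file).equalsIgnoreCase(pinnedSha256);
    }
}
```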
The Android WebView - a mini web browser which is commonly used inside apps - can be used by bad actors to dynamically load malicious code too. It therefore needs careful handling from a security perspective.
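One hardening pattern is to restrict which URLs the WebView may navigate to at all. The sketch below shows the allow-list check itself (the hosts are illustrative); on Android it would be called from `WebViewClient.shouldOverrideUrlLoading`, alongside leaving JavaScript and file access disabled unless genuinely needed.

```java
import java.net.URI;
import java.util.Set;

// Sketch: an allow-list check for WebView navigation. On Android this
// would be called from WebViewClient.shouldOverrideUrlLoading.
public class NavigationPolicy {
    // Assumption: only these hosts are legitimate for this app.
    private static final Set<String> ALLOWED_HOSTS =
            Set.of("example.com", "www.example.com");

    public static boolean isNavigationAllowed(String url) {
        try {
            URI uri = new URI(url);
            // Require HTTPS and an explicitly allow-listed host.
            return "https".equals(uri.getScheme())
                    && uri.getHost() != null
                    && ALLOWED_HOSTS.contains(uri.getHost());
        } catch (Exception e) {
            return false; // unparseable URLs are rejected outright
        }
    }
}
```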
The system log
Android has a central system log: a real-time stream of messages from apps about what they are doing. Developers commonly use it for debugging, because it’s an easy place to dump whatever data a particular function is working on.
But there are numerous ways for a bad actor to read this log, so it represents a security issue. We review all debug logs that an app makes to ensure that sensitive information is correctly suppressed.
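A minimal sketch of one approach: a log wrapper that is gated on a debug flag and redacts obvious secrets before anything reaches the log. The flag and sink stand in for `BuildConfig.DEBUG` and `android.util.Log.d`, and the redaction pattern is illustrative; in a real app an R8/ProGuard rule would typically also strip log calls from release builds.

```java
import java.util.function.Consumer;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch: a debug-gated, redacting log wrapper. The DEBUG flag stands
// in for BuildConfig.DEBUG and the sink for android.util.Log.d.
public class SafeLog {
    public static boolean debug = false;                       // BuildConfig.DEBUG in a real app
    public static Consumer<String> sink = System.out::println; // Log.d in a real app

    // Illustrative pattern: mask key=value pairs that look sensitive.
    private static final Pattern SECRETS =
            Pattern.compile("(token|password)=\\S+", Pattern.CASE_INSENSITIVE);

    public static void d(String message) {
        if (!debug) return; // nothing reaches the system log in release builds
        Matcher m = SECRETS.matcher(message);
        sink.accept(m.replaceAll("$1=<redacted>"));
    }
}
```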
Permissions
Certain actions on Android require permission from the user, particularly when those actions might affect privacy. For example, an app is only permitted to take a photo once the user grants it access to the camera.
It’s easy for developers to fall into the trap of requesting a particular permission too early, or of requesting permissions the app doesn’t actually need. Either can give a malicious actor more access to private data than they would otherwise have.
We check, therefore, that permissions are only requested at the exact moment they are needed.
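The point-of-use pattern can be sketched like this. The checker and requester lambdas stand in for `ContextCompat.checkSelfPermission` and `ActivityCompat.requestPermissions` (so the logic stays testable off-device); `android.permission.CAMERA` is the real permission string.

```java
import java.util.function.Consumer;
import java.util.function.Predicate;

// Sketch of requesting a permission at the point of use. The checker
// and requester stand in for ContextCompat.checkSelfPermission and
// ActivityCompat.requestPermissions.
public class CameraFeature {
    static final String CAMERA = "android.permission.CAMERA";

    private final Predicate<String> hasPermission;
    private final Consumer<String> requestPermission;
    private final Runnable openCamera;

    public CameraFeature(Predicate<String> hasPermission,
                         Consumer<String> requestPermission,
                         Runnable openCamera) {
        this.hasPermission = hasPermission;
        this.requestPermission = requestPermission;
        this.openCamera = openCamera;
    }

    public void onTakePhotoClicked() {
        if (hasPermission.test(CAMERA)) {
            openCamera.run();
        } else {
            // Ask only now, when the user has shown intent to use the
            // camera -- not up front at app launch.
            requestPermission.accept(CAMERA);
        }
    }
}
```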
Android NDK and low-level code
The Android NDK allows developers to write apps using low-level C / C++ code. This allows us, for example, to write intensive code in the most efficient way possible, ultimately allowing things like games to run more smoothly. It’s enormously powerful, for the right application.
But it’s subject to all the usual pitfalls of low-level languages. Buffer overflows are just one example: carefully crafted input could allow a malicious user to take over the running app. Use-after-free errors can have the same effect.
It’s often technically difficult to find these problems. They require a careful code review and a sharp eye.
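One defensive habit worth looking for in a review: validating input on the managed side before it ever crosses the JNI boundary into C/C++. The sketch below assumes a hypothetical native method backed by a fixed 256-byte buffer; the names and size are illustrative.

```java
// Sketch: validate input before it crosses the JNI boundary into
// C/C++. The native method and the 256-byte buffer it fills are
// hypothetical; the point is to reject anything the fixed-size native
// buffer cannot hold, rather than trusting strcpy to cope.
public class NativeInput {
    static final int NATIVE_BUFFER_SIZE = 256; // must match the C side

    public static boolean isSafeForNative(byte[] input) {
        if (input.length >= NATIVE_BUFFER_SIZE) return false; // leave room for the NUL
        for (byte b : input) {
            if (b == 0) return false; // embedded NULs silently truncate C strings
        }
        return true;
    }

    // public static native int nativeProcess(byte[] input); // via System.loadLibrary
}
```

This doesn’t remove the need to audit the native code itself, but it shrinks the attack surface that reaches it.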
These are just a small sample of the Android-specific security concerns that we work with our clients to eliminate. For a more extensive list in technical language, Google maintains its own documentation of app security best practices. But in general, these pitfalls show why it’s important to go a step further than having your app penetration-tested by a general security expert: it also needs “white-box” code auditing by an experienced Android security engineer. There’s no certain way of securing against all these pitfalls, but this gives your app - and your users’ privacy - the best chance.