I recently read an interesting article about a team of researchers who downloaded and parsed Google Play, then analyzed hundreds of thousands of applications for embedded secret tokens and passwords.
Since their work concerned only the analysis of decompiled Android code, I decided to write about research I did a year ago, covering not just Android but iOS applications as well. That research eventually became an online tool, but I'll tell you about it at the very end, once its purpose becomes obvious. I also presented some of the material below at the ZeroNights conference.
So, here it goes…
Stores Are the Main Goal
There is a lot of information out there about manual analysis of mobile applications, along with plenty of test methodologies and checklists. But most of these checks concern user safety: how data is stored and transferred, and how it can be accessed by exploiting application vulnerabilities.
But why should an attacker bother with how an application behaves on a particular user's device if he can attack the server side and steal the data of ALL users at once? How can the application itself help attack its own cloud infrastructure? And what about analyzing thousands, or even tens of thousands, of applications, checking them for typical bugs such as embedded tokens, authentication keys and other secrets?
Since the article linked above covers Google Play exhaustively, I'll stick to iOS applications here. The automatic App Store downloader and its implementation are a subject for another article; I'll just say that it's a more complicated task than a download manager for Google Play.
Articles about the distribution of iOS applications state that:
- The application is encrypted
- The application is secured by DRM
- The installed application associates itself with the device
Behind all of these statements is one fact: the compiled code in the application package (which is actually a plain zip archive) is encrypted and bound to the device. All other content is stored unencrypted.
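You can verify this by treating an .ipa file as the zip archive it is. A minimal sketch in Python; the file name is hypothetical:

```python
# A minimal sketch: list the contents of an .ipa package, which is a plain
# zip archive. The file name is hypothetical.
import zipfile

with zipfile.ZipFile("SomeApp.ipa") as ipa:
    for name in ipa.namelist():
        # Only the main binary under Payload/<App>.app/ is encrypted;
        # plists, SQLite databases and other resources come out in the clear.
        print(name)
```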
Where to Begin?
The first tools that come to mind for hunting authentication tokens, keys and the like are strings and grep. But they are poorly suited for automation: a plain string search produces so much garbage requiring manual analysis that automation loses its point.
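To illustrate, here is roughly what the naive approach looks like; a minimal sketch, with the directory name and keyword list as my own assumptions:

```python
# A minimal sketch of the naive approach: grep every unpacked file for
# "secret"-looking keywords. On real application bundles this matches
# thousands of harmless UI strings and resource names, which is why it
# does not scale to automated analysis.
import os
import re

KEYWORDS = re.compile(rb"(secret|token|password|api[_-]?key)", re.IGNORECASE)

def naive_grep(root):
    for dirpath, _, files in os.walk(root):
        for fname in files:
            path = os.path.join(dirpath, fname)
            with open(path, "rb") as f:
                for lineno, line in enumerate(f, 1):
                    if KEYWORDS.search(line):
                        print(f"{path}:{lineno}: {line[:80]!r}")

naive_grep("unpacked_apps")  # hypothetical directory of unpacked .ipa files
```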
To build an acceptable automated analysis system, we should take a close look at the package structure. After unpacking the packages of ~15,000 applications and discarding the obvious trash (images, audio and video), we are left with 224,061 files of 1,396 types.
*.m and *.h files (source code) are interesting in their own right, but we should not forget about configuration data: XML and PList files and SQLite containers. With this simplification accepted, let's rank the TOP of the most popular types. The total number of files we are interested in comes to 94,452, or 42% of the initial count.
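Statistics like these can be gathered with a few lines of Python; a minimal sketch, with the directory name and media list hypothetical:

```python
# A minimal sketch: walk the unpacked packages and count file extensions,
# discarding obvious media noise. Directory and extension lists are
# illustrative assumptions.
import os
from collections import Counter

MEDIA = {".png", ".jpg", ".jpeg", ".gif", ".mp3", ".wav", ".mp4", ".mov"}

types = Counter()
for dirpath, _, files in os.walk("unpacked_apps"):
    for fname in files:
        ext = os.path.splitext(fname)[1].lower()
        if ext not in MEDIA:
            types[ext] += 1

# Print the TOP of the most popular non-media types
for ext, count in types.most_common(20):
    print(f"{ext or '<no extension>'}: {count}")
```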
An application we'll call “normal” consists of:
- Media content: images, audio, video and interface resources;
- The compiled code;
- Data containers: SQLite, XML, PList, BPList;
- Miscellaneous trash that ended up in the package for unknown reasons.
Thus, there are two tasks (a minimal sketch of the first follows the list):
- A recursive search for various secrets in SQLite, XML and PList containers;
- A search for “unusual” trash and private keys.
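Here is a minimal sketch of the first task in Python; the matching patterns and heuristics are simplified assumptions, not the exact rules from the research:

```python
# A minimal sketch: recursively walk key/value structures pulled from PList
# and SQLite files and flag entries that look like secrets. Patterns are
# illustrative assumptions.
import os
import plistlib
import re
import sqlite3

SECRET_KEY = re.compile(r"(secret|token|password|api[_-]?key)", re.IGNORECASE)

def walk(node, path=""):
    """Recursively yield (path, value) pairs from nested dicts and lists."""
    if isinstance(node, dict):
        for key, value in node.items():
            yield from walk(value, f"{path}/{key}")
    elif isinstance(node, list):
        for i, value in enumerate(node):
            yield from walk(value, f"{path}[{i}]")
    else:
        yield path, node

def scan_plist(filename):
    with open(filename, "rb") as f:
        data = plistlib.load(f)  # handles both XML and binary plists
    for path, value in walk(data):
        if SECRET_KEY.search(path):
            print(f"{filename}: {path} = {value!r}")

def scan_sqlite(filename):
    db = sqlite3.connect(filename)
    tables = db.execute(
        "SELECT name FROM sqlite_master WHERE type='table'").fetchall()
    for (table,) in tables:
        if SECRET_KEY.search(table):
            print(f"{filename}: suspicious table {table}")
    db.close()

def scan_tree(root):
    for dirpath, _, files in os.walk(root):
        for fname in files:
            path = os.path.join(dirpath, fname)
            if fname.endswith(".plist"):
                scan_plist(path)
            elif fname.endswith((".sqlite", ".db")):
                scan_sqlite(path)
            # XML files could be handled similarly with xml.etree.ElementTree
```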
Keep This Token Secret
Apparently, the fact that a published application becomes public is not obvious to most developers. OAuth tokens for Twitter and other popular services can sometimes be found inside. One example was an application that collected its users' contacts, photos, geolocation and device IDs and stored them all in the Amazon cloud, using a token shipped in one of its PList files. With this token, it was no problem to pull all the user data from the cloud and track devices in real time.
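A minimal sketch of why such a token is dangerous, assuming (hypothetically) that the credentials sit in a bundled plist and point at S3; all paths, key names and the bucket are made up:

```python
# A minimal sketch: anyone who extracts the key pair from a bundled plist
# can read the same cloud storage the app uses. The plist path, key names
# and bucket are hypothetical; boto3 is assumed as the client library.
import plistlib
import boto3

with open("Payload/SomeApp.app/Settings.plist", "rb") as f:  # hypothetical
    conf = plistlib.load(f)

s3 = boto3.client(
    "s3",
    aws_access_key_id=conf["AWSAccessKeyId"],      # hypothetical plist keys
    aws_secret_access_key=conf["AWSSecretKey"],
)
# List every object the leaked credentials can see
for obj in s3.list_objects_v2(Bucket="someapp-user-data")["Contents"]:
    print(obj["Key"])
```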
Services that offer flexible management of push notifications, such as Urban Airship, deserve a separate mention. Their documentation states that the master secret (which the server side of an application uses to send push notifications) must never be placed in an application bundle. Yet we find them there anyway, and with such a secret I can send notifications to all users of the application.
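A minimal sketch of what a leaked master secret allows, assuming Urban Airship's v3 push API; the key and secret here are fake:

```python
# A minimal sketch, assuming the Urban Airship v3 push API: the app key and
# master secret recovered from the bundle authenticate a broadcast push to
# every user of the application. Credential values are fake.
import requests

APP_KEY = "app-key-from-bundle"           # found in the application bundle
MASTER_SECRET = "master-secret-from-bundle"

resp = requests.post(
    "https://go.urbanairship.com/api/push/",
    auth=(APP_KEY, MASTER_SECRET),        # HTTP Basic auth: key + master secret
    headers={"Accept": "application/vnd.urbanairship+json; version=3",
             "Content-Type": "application/json"},
    json={"audience": "all",
          "device_types": "all",
          "notification": {"alert": "Anyone with your master secret can do this"}},
)
print(resp.status_code, resp.text)
```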
TEST-DEV
We should also mention the various artifacts of testing and development: links to debugging interfaces, version control systems and dev environments. This information can be extremely interesting to an attacker, since dev environments sometimes contain database dumps with real users. As a rule, nobody takes dev environment security seriously (default passwords are left in place, for example), yet real user data is often used in tests to simulate production. An outstanding find was a script that could send push notifications to all users of an application.
Tap to Enter
It is only to be expected that information about test environments and version control systems turns up in packages. Still, some finds cannot but surprise:
- An SQLite database with service account credentials:
- A business-card application with client-side authentication:
- A private key for signing transactions:
What Is It Doing Here?!
The findings above are at least somewhat explainable, but sometimes you stumble upon truly unbelievable things, such as a PKCS container with a developer certificate… and its private key:
Or fragments of PHP code with usernames and passwords for the database server:
My favorite one… an OpenVPN client config:
And unencrypted private keys of all kinds:
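Finds like these are easy to flag automatically, since PEM-encoded key material carries distinctive markers. A minimal sketch; the directory name is hypothetical:

```python
# A minimal sketch: scan every file for PEM markers of private keys and
# related containers. The directory name is hypothetical.
import os

MARKERS = (
    b"-----BEGIN RSA PRIVATE KEY-----",
    b"-----BEGIN DSA PRIVATE KEY-----",
    b"-----BEGIN PRIVATE KEY-----",
    b"-----BEGIN OpenVPN Static key V1-----",
)

for dirpath, _, files in os.walk("unpacked_apps"):
    for fname in files:
        path = os.path.join(dirpath, fname)
        with open(path, "rb") as f:
            data = f.read()
        for marker in MARKERS:
            if marker in data:
                print(f"{path}: contains {marker.decode()}")
```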
Is There Anything Besides Secrets?
However disputable licensing issues may be, we run into them here as well. Many developers use framework code that may be under the GPL. How the GPL applies to paid and free applications in the App Store is a question that leaves room for patent trolls.
Is There an App for That?
So we have thousands of applications containing the bugs described above, yet developers are in no hurry to fix them. What's the problem? There are plenty of security audit tools on GitHub, but to use them, developers have to:
- Spend time and effort figuring out how everything works: extra work they will not be paid for.
- Build and maintain the infrastructure.
- And if there are many applications, hire a full-time expert to handle the first two points.
As a result, only rich corporations that worry about their brand and finances can afford secure development, while the number of insecure applications keeps growing.
HackApp is the answer to all of these factors: a tool that provides basic security analysis of applications, built on the following principles:
- A report should not burden the developer with technical details like listings and traces; it should clearly point out the places that need attention.
- It should require no investment in infrastructure.
- It should offer an interface for automated interaction, so it can be embedded into pre-release testing and become just another testing tool.
As of today, HackApp comes in two versions, Basic and Pro (with a paid subscription), but that's another story.