The prevalence of mobile devices in everyday life has led to rapid growth in the number of mobile apps on the market. While apps evolve, offering countless possibilities and competing in design, concerns are rising about privacy incidents on mobile devices caused by the permission model. Users often do not understand the privacy implications of granting a permission, mainly because it is unclear how an app processes privacy-sensitive resources. As a result, there are numerous cases of apps leaking data without users' awareness. Researchers tackle this issue by analyzing app metadata. When releasing an app, developers provide meta information such as a description, a privacy policy, a category, screenshots, etc. This data can be used to predict the actual behavior of an app and to measure the discrepancy between the app's actual behavior and its described behavior, known in the literature as description-to-permission fidelity.
In this project, we provide an overview of the ongoing work on enhancing description fidelity in Android apps. We systematically review the latest progress in application description analysis and elaborate on techniques from Natural Language Processing and Deep Learning that have shown potential to tackle the issue. Furthermore, we discuss open challenges and research issues.