Translating Video Recordings of Mobile App Usages into Replayable Scenarios

Author: Bernal-Cárdenas, Carlos; Cooper, Nathan; Moran, Kevin; Chaparro, Oscar; Marcus, Andrian; Poshyvanyk, Denys
Publication year: 2020
Subject:
Document type: Working Paper
DOI: 10.1145/3377811.3380328
Description: Screen recordings of mobile applications are easy to obtain and capture a wealth of information pertinent to software developers (e.g., bugs or feature requests), making them a popular mechanism for crowdsourced app feedback. Thus, these videos are becoming a common artifact that developers must manage. In light of unique mobile development constraints, including swift release cycles and rapidly evolving platforms, automated techniques for analyzing all types of rich software artifacts provide benefits to mobile developers. Unfortunately, automatically analyzing screen recordings presents serious challenges due to their graphical nature, compared to other (textual) artifacts. To address these challenges, this paper introduces V2S, a lightweight, automated approach for translating video recordings of Android app usages into replayable scenarios. V2S is based primarily on computer vision techniques and adapts recent solutions for object detection and image classification to detect and classify user actions captured in a video, and convert these into a replayable test scenario. We performed an extensive evaluation of V2S involving 175 videos depicting 3,534 GUI-based actions collected from users exercising features and reproducing bugs from over 80 popular Android apps. Our results illustrate that V2S can accurately replay scenarios from screen recordings, and is capable of reproducing ≈89% of our collected videos with minimal overhead. A case study with three industrial partners illustrates the potential usefulness of V2S from the viewpoint of developers.
Comment: In Proceedings of the 42nd International Conference on Software Engineering (ICSE'20), 13 pages
Database: arXiv
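
The description outlines a pipeline that detects touch indicators in video frames, classifies them into user actions, and converts the result into a replayable scenario. Below is a minimal, hypothetical sketch of such a pipeline in Python. It is not the V2S implementation: every class and function name is an illustrative assumption, the learned object detector is replaced by a stub, and replay via `adb shell input` is only one simple way to make the recovered actions executable.

```python
# Hypothetical V2S-style pipeline sketch (illustrative only).
# Stage 1: detect the touch indicator per frame (a learned detector in the paper;
#          stubbed out here). Stage 2: group detections into actions and classify
#          them (tap, long tap, swipe). Stage 3: emit replayable commands.
from dataclasses import dataclass
from typing import List


@dataclass
class TouchDetection:
    frame_index: int
    x: float  # touch-indicator center, in pixels
    y: float


@dataclass
class Action:
    kind: str  # "tap", "long_tap", or "swipe"
    points: List[TouchDetection]


def detect_touches(frames) -> List[TouchDetection]:
    """Stub for a learned object detector that locates the Android
    'show touches' indicator in each video frame."""
    raise NotImplementedError("replace with a trained detector")


def group_into_actions(detections: List[TouchDetection],
                       fps: float = 30.0,
                       long_tap_seconds: float = 0.5) -> List[Action]:
    """Group consecutive detections into one action; gaps between frames
    start a new action. Duration and displacement decide the action type."""
    actions: List[Action] = []
    current: List[TouchDetection] = []
    for det in detections:
        if current and det.frame_index - current[-1].frame_index > 1:
            actions.append(_classify(current, fps, long_tap_seconds))
            current = []
        current.append(det)
    if current:
        actions.append(_classify(current, fps, long_tap_seconds))
    return actions


def _classify(points: List[TouchDetection], fps: float,
              long_tap_seconds: float) -> Action:
    duration = len(points) / fps
    moved = abs(points[-1].x - points[0].x) + abs(points[-1].y - points[0].y)
    if moved > 20:  # pixels; arbitrary illustrative threshold
        kind = "swipe"
    elif duration >= long_tap_seconds:
        kind = "long_tap"
    else:
        kind = "tap"
    return Action(kind=kind, points=points)


def to_adb_commands(actions: List[Action], fps: float = 30.0) -> List[str]:
    """Translate actions into 'adb shell input' commands for replay.
    A long tap is reproduced as a swipe with identical endpoints and a duration."""
    cmds = []
    for a in actions:
        p0, p1 = a.points[0], a.points[-1]
        if a.kind == "tap":
            cmds.append(f"adb shell input tap {int(p0.x)} {int(p0.y)}")
        else:
            ms = int(len(a.points) / fps * 1000)
            cmds.append(f"adb shell input swipe {int(p0.x)} {int(p0.y)} "
                        f"{int(p1.x)} {int(p1.y)} {ms}")
    return cmds
```

The grouping and classification heuristics above (frame-gap segmentation, a fixed displacement threshold, a 0.5 s long-tap cutoff) are placeholder choices made for the sketch, not parameters reported in the paper.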