Title Mobile app review analysis for crowdsourcing of software requirements: a mapping study of automated and semi-automated tools
Authors Massenon, Rhodes ; Gambo, Ishaya ; Ogundokun, Roseline Oluwaseun ; Ogundepo, Ezekiel Adebayo ; Srivastava, Sweta ; Agarwal, Saurabh ; Pak, Wooguil
DOI 10.7717/peerj-cs.2401
Is Part of PeerJ Computer Science. London: PeerJ, 2024, vol. 10, art. no. e2401, p. 1-60. ISSN 2376-5992
Keywords [eng] automated tools ; crowdsourcing ; feature extraction ; mapping study ; mobile app reviews ; semiautomated tools ; software requirements
Abstract [eng] Mobile app reviews are valuable for gaining user feedback on features, usability, and areas for improvement. Analyzing these reviews manually is difficult due to their volume and unstructured nature, creating a need for automated techniques. This mapping study categorizes existing approaches for automated and semi-automated tools by analyzing 180 primary studies. Techniques include topic modeling, collocation finding, association rule-based, aspect-based sentiment analysis, frequency-based, word vector-based, and hybrid approaches. The study compares various tools for analyzing mobile app reviews based on performance, scalability, and user-friendliness. Tools like KEFE, MERIT, DIVER, SAFER, SIRA, T-FEX, RE-BERT, and AOBTM outperformed baseline tools like IDEA and SAFE in identifying emerging issues and extracting relevant information. The study also discusses limitations such as manual intervention, linguistic complexities, scalability issues, and interpretability challenges in incorporating user feedback. Overall, this mapping study outlines the current state of feature extraction from app reviews, suggesting future research and innovation opportunities for extracting software requirements from mobile app reviews, thereby improving mobile app development.
Published London : PeerJ
Type Journal article
Language English
Publication date 2024