Assessing Child Safety Risk of Alternative Social Apps

dc.contributor.advisor: Warner, Mark
dc.contributor.author: Alabbasi, Dana
dc.date.accessioned: 2024-12-10T06:25:27Z
dc.date.issued: 2024-09-09
dc.description: This thesis examines the risks children face on alternative social networking applications, using Ofcom’s Protection of Children Code of Practice under the UK’s Online Safety Act 2023 as a framework. Focusing on apps like StreamKar, Chamet, BuzzCast, MICO, and SuperLive, it analyzes formal documents and user interfaces to assess safety measures such as age verification and content moderation. Findings reveal significant gaps between policy and implementation, highlighting risks of harmful solicitation and the urgent need for stronger safeguards to ensure the online safety of child users.
dc.description.abstract: Children’s use of technology has been increasing rapidly, and their exposure to online harm is an area of deep concern. Lawmakers and regulators are combining efforts with service providers to mitigate emerging risks to child safety, primarily by creating age-appropriate restrictions, such as design codes, on online services that children are likely to access. The Office of Communications (Ofcom), the online safety regulator under the United Kingdom’s Online Safety Act 2023, proposed a Protection of Children Code of Practice for user-to-user services. This code of practice seeks to enforce compliance with children’s safety duties, assess risks to children, and specify the steps required to mitigate those risks. Pursuant to this, this thesis presents an analysis of five alternative social networking applications that include live streaming and instant messaging features (i.e., StreamKar, Chamet, BuzzCast, MICO, and SuperLive), applying Ofcom’s code of practice as a framework. Specifically, it assesses child safety risks by analysing formal and informal documents (i.e., legally binding and non-legally binding documents) and inspecting the apps’ user interfaces. The findings reveal a lack of safeguards, such as robust age verification mechanisms and content moderation practices, in these alternative applications. In addition, the analysis uncovered several inconsistencies in measures such as reporting and complaints, content moderation, and user support, where a gap exists between what the applications’ documents state and what is actually implemented in the UI. Finally, the findings make clear that app policies and in-app features contribute to the risk of harmful solicitation attempts towards children, highlighting the need for stronger enforcement of robust safety mechanisms and designs to ensure the online safety of child users.
dc.format.extent: 73
dc.identifier.citation: Alabbasi, D. (2024). Assessing child safety risk of alternative social apps. University College London.
dc.identifier.uri: https://hdl.handle.net/20.500.14154/74082
dc.language.iso: en
dc.publisher: University College London
dc.subject: Child Safety
dc.subject: Social networking applications
dc.subject: Alternative Applications
dc.subject: Ofcom Protection of Children Code of Practice for user-to-user services
dc.subject: Age verification
dc.subject: Content moderation
dc.subject: Live Streaming and Chat
dc.title: Assessing Child Safety Risk of Alternative Social Apps
dc.type: Thesis
sdl.degree.department: Computer Science
sdl.degree.discipline: Information Security
sdl.degree.grantor: University College London
sdl.degree.name: Master of Science

Files

Original bundle

Name: SACM-Dissertation.pdf
Size: 1.23 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 1.61 KB
Format: Item-specific license agreed to upon submission

Copyright owned by the Saudi Digital Library (SDL) © 2025