SACM - United Kingdom

Permanent URI for this collection: https://drepo.sdl.edu.sa/handle/20.500.14154/9667


Search Results

Now showing 1 - 7 of 7
  • Item (Restricted)
    AI-Driven Approaches for Privacy Compliance: Enhancing Adherence to Privacy Regulations
    (University of Warwick, 2024-02) Alamri, Hamad; Maple, Carsten
    This thesis investigates inherent limitations within the current privacy policy landscape, provides recommendations, and proposes potential solutions to address these issues. The first contribution of this thesis is a comprehensive study that addresses a significant gap in the literature: it provides a detailed overview of the current landscape of privacy policies, covering both their limitations and proposed solutions, with the aim of identifying the most practical and applicable approaches for researchers in the field. Second, the thesis tackles the challenge of privacy policy accessibility in app stores by introducing the App Privacy Policy Extractor (APPE) system. The APPE pipeline consists of several components, each developed to perform a specific task and provide insightful information about apps' privacy policies. By analysing over two million apps in the iOS App Store, APPE offers unprecedented, store-wide insights into policy distribution and can act as a mechanism for automatically enforcing privacy policy requirements in app stores. Third, the thesis investigates the issue of privacy policy complexity. By establishing generalisability across app categories and drawing attention to associated matters of time and cost, the study demonstrates that the current situation requires immediate and effective solutions, and it suggests several recommendations and potential solutions. Finally, to enhance user engagement with privacy policies, a novel framework utilising a cost-effective unsupervised approach, based on the latest AI innovations, has been developed. Comparison of this study's findings with state-of-the-art methods suggests that the approach can produce outcomes on par with, or even surpassing, those of human experts, in a more efficient and automated manner.
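    The abstract above points to the time and cost associated with privacy policy complexity. As a rough illustration of that kind of analysis (this is not the thesis' APPE pipeline; the function name, reading-speed constant, and sample text are assumptions made for this sketch), a policy's reading burden can be estimated from its word and sentence counts:

      # Illustrative sketch only: estimate the reading burden of a privacy policy text.
      import re

      AVG_READING_SPEED_WPM = 238  # assumed average adult silent-reading speed

      def reading_burden(policy_text: str) -> dict:
          """Return word count, sentence count, average sentence length, and reading time (minutes)."""
          words = re.findall(r"[A-Za-z']+", policy_text)
          sentences = [s for s in re.split(r"[.!?]+", policy_text) if s.strip()]
          word_count = len(words)
          sentence_count = max(len(sentences), 1)
          return {
              "words": word_count,
              "sentences": sentence_count,
              "avg_sentence_length": round(word_count / sentence_count, 1),
              "estimated_minutes": round(word_count / AVG_READING_SPEED_WPM, 1),
          }

      sample = "We collect your data. We may share it with partners. You can opt out at any time."
      print(reading_burden(sample))

    Applied to a full policy, a metric like this makes the per-policy reading cost concrete, which is the kind of time-cost figure the abstract alludes to.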
  • Item (Restricted)
    The creation and proliferation of deepfake “adult content”
    (University of Sussex, 2024) AlZahrani, Ahmed; Rizov, Vladimir
    This study investigates the creation and proliferation of deepfake pornographic content, focusing on its causes, impacts on privacy and security, and the necessary measures to address the ethical and legal challenges it presents. The analysis identifies financial incentives, personal vendettas, and a fascination with technology as key motivations behind the creation of deepfakes. Victims suffer significant consequences, including psychological harm, social exclusion, and job loss. The role of social media and video-sharing platforms is critical in the spread of deepfakes due to insufficient content moderation and algorithmic oversight. Despite advancements in technological solutions and legal frameworks, there are still considerable gaps in preventing deepfakes. The study calls for a comprehensive strategy that includes technological innovations, robust legal measures, and public awareness to mitigate the impact of deepfakes. It also emphasizes the importance of future interdisciplinary research to improve detection, prevention, and support for victims.
  • Item (Restricted)
    Saudi Millennials' Privacy Practices in the Age of the Personal Data Protection Law
    (Royal Holloway, University of London, 2024-08) Almutairi, Abdullah Ayed; Murphy, Sean
    Executive summary: The project's objective is to assess the impact of Saudi Arabia's implementation of the Personal Data Protection Law (PDPL) on Saudi millennials' data privacy practices. Specifically, the study aims to (1) examine the level of awareness among Saudi millennials regarding the PDPL and the rights they have acquired under it; (2) assess their attitudes towards the effectiveness of the PDPL in protecting their personal data; and (3) investigate any changes in their personal data practices following the implementation of the PDPL. The project utilizes a quantitative methodology, using the Knowledge-Attitude-Practice (KAP) model as a framework. We obtained data from a sample group of Saudi millennials by means of a survey. The survey was created with the purpose of assessing the participants' knowledge of the PDPL, their views on its effectiveness, and their individual handling of data practices following its implementation. We tested the research hypotheses and derived significant conclusions from the data through the application of statistical analysis, which encompassed both inferential and descriptive techniques. Key findings reveal that a large percentage of Saudi millennials lack an adequate understanding of the PDPL. Nevertheless, individuals with a sufficient level of awareness regarding the PDPL tended to follow better privacy practices. The findings also indicated that participants who were knowledgeable about the PDPL exhibited favourable attitudes towards its efficacy in enhancing personal data protection measures in Saudi Arabia. The project found that familiarity with the PDPL and appropriate perceptions of its efficacy resulted in an inclination to adhere to better practices. In conclusion, the implementation of the PDPL had an influence on the data privacy practices of Saudi millennials who were knowledgeable about it and its specifics. However, it is imperative to enhance awareness campaigns in order to augment the number of individuals who are knowledgeable about the PDPL and their rights under it. This will ultimately enhance privacy practices among Saudi citizens. The findings overall show that more awareness of the PDPL leads to improved privacy practices.
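    The abstract above describes inferential and descriptive statistical analysis within a KAP framework. The thesis does not spell out the specific tests here, but a plausible minimal example of the inferential step is a chi-square test of independence between PDPL awareness and reported privacy-practice level (the counts below are invented purely for illustration):

      # Hypothetical example: chi-square test of independence (illustrative counts only).
      import numpy as np
      from scipy.stats import chi2_contingency

      # Rows: aware of the PDPL / not aware; columns: good / fair / poor privacy practices
      observed = np.array([
          [48, 30, 12],
          [20, 45, 55],
      ])

      chi2, p_value, dof, expected = chi2_contingency(observed)
      print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
      if p_value < 0.05:
          print("Awareness and practice level appear to be associated.")
      else:
          print("No significant association detected at the 5% level.")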
  • Item (Restricted)
    CROSS-CULTURAL UNDERSTANDING OF HOW PEOPLE USE SECURE GROUP CHAT TOOLS IN THE UNITED KINGDOM AND SAUDI ARABIA
    (King’s College London, 2023-08-15) Alrabeah, Ghada; Abu-Salma, Ruba
    Group communication tools have gained widespread popularity, attracting over a billion users. However, questions arise: how closely are our messages being watched by external parties? Is end-to-end encryption implemented by the application? Many group communication tools either do not offer enough security features to protect their users or make it challenging for users to understand and use these features. This research examines how users perceive and use secure group communication tools, focusing on users in the United Kingdom and Saudi Arabia. A mixed-methods approach involving interviews with 20 participants and a survey with 204 respondents was conducted. The study reveals key factors driving users' choices, their understanding of security and privacy, their willingness (or reluctance) to adopt secure group communication tools, and cultural differences. The findings underline factors such as popularity, usability, and being free as influential in tool selection. Users express willingness to use secure tools, yet gaps arise between intention and practice, attributed to misconceptions, motivation, and trust concerns. Privacy practices vary between cultures, with Saudi participants showing more caution, while UK participants display higher levels of trust in communication tools. These cultural influences shape communication priorities, with Saudi participants leaning toward group communication and UK participants prioritizing individual communications. Despite these differences, the study suggests the potential for universally secure applications catering to diverse user needs, and it offers recommendations for tool design that can help improve the adoption of secure group communication.
  • Item (Restricted)
    A Trust-Based Mechanism to Manage User Privacy in University Smart Buildings
    (Newcastle University, 2024-06-17) Taher, Rawan; Morisset, Charles
    Smart buildings employ a diverse range of technologies, including sensors, to monitor the environment and create a comfortable space for users via data collection. This monitoring reveals data about users' activities that could raise privacy concerns. In recent years, privacy has received increasing attention in smart home environments, and several studies have proposed solutions that allow smart home users to retain control over data collection and manage their privacy. However, little attention has been paid to user privacy in smart buildings that serve as places of work or study. Research on privacy in smart buildings has predominantly focused on technical aspects, with relatively limited engagement with the public voice in the literature. To manage smart building environments effectively, it is necessary to collect data; however, it is equally important to prioritise and respect privacy principles and regulations. Users in smart buildings typically have limited or no ownership or control over captured data, along with limited awareness and insufficient disclosure from building management, which limits their ability to manage their privacy. This stands in contrast to smart home users, who often possess partial ownership and control over their infrastructure. Privacy management in smart buildings therefore poses a significant challenge; consequently, there is a need for a privacy design mechanism that can incorporate users' voices into data practices and find a balanced trade-off between the utility and privacy of smart building data. In this thesis, three studies are conducted that make significant contributions to the management of user privacy in university smart buildings. The first contribution involves analysing the impact of various building technologies on users' privacy; these technologies are then mapped to different facets of privacy harm using the Solove taxonomy. The second contribution involves conducting semi-structured interviews to understand users' privacy perceptions, preferences, and trade-offs, thereby identifying key requirements for privacy mechanisms in smart buildings. The third contribution involves a confirmatory study aimed at designing a privacy mechanism for smart buildings. In light of these contributions, this thesis introduces a trust-based privacy design mechanism called the Privacy Committee. Its main goal is to incorporate user voices into decision-making about data sharing and to establish comprehensive oversight of data practices within smart buildings.
  • Item (Restricted)
    Designing Privacy Aware Internet of Things Applications.
    (Cardiff University, 2024-03-25) Alhirabi, Nada; Perera, Charith
    The Internet of Things (IoT) integrates physical devices with software, enabling extensive data interactions. This combination, which often involves diverse specialists, leads to complexity, and privacy is often overlooked. Given the sensitive nature of data in many IoT applications and the strict privacy regulations they face, early consideration of privacy is essential. Many researchers advocate techno-regulatory methods such as privacy-by-design (PbD) principles; however, their complexity and the lack of clear guidelines make them challenging to apply in IoT. We present a simplified, visual method for IoT developers to embed privacy into their applications. Unlike traditional methodologies that involve complex and time-consuming steps, our method is straightforward and interactive. Our framework is intended for the conceptual design phase of the software development life cycle (SDLC), supporting early dialogue between lawyers and developers in the context of IoT app design. Its key value is a user (developer)-centric approach that fulfils developers' needs in addition to meeting privacy requirements. The thesis contributes in three ways. First, by exploring non-IoT privacy techniques, we identified the challenges of migrating these strategies to IoT. Second, our subsequent interactions with developers and privacy experts revealed common challenges in privacy design; accordingly, we proposed PARROT (PrivAcy by design tool foR inteRnet Of Things), a tool engineered to guide IoT developers intuitively. Third, our exploration of less regulated domains illustrated further privacy challenges and underscored the potential of tools like PARROT to raise awareness of privacy norms in IoT design. Through multiple case studies and experiments, we validated PARROT's effectiveness in reducing privacy issues during the design of IoT applications. Overall, the experimental results in this thesis confirm our hypothesis that PARROT reduces privacy mistakes and increases developers' privacy knowledge during the IoT software design phase by offering an interactive design method that augments the design process and provides real-time feedback.
  • Item (Restricted)
    Examining Adversarial Examples as Defensive Approach Against Web Fingerprinting Attacks
    (Saudi Digital Library, 2023) Alzamil, Layla; Elahi, Tariq
    In the age of online surveillance, privacy and security concerns about individuals' activities on the internet continue to grow. The Tor Browser, built on a widely used anonymisation network, offers security- and privacy-enhancing features to protect users online. However, web fingerprinting (WF) attacks remain a challenging threat that aims to deanonymise users' browsing activities over Tor. This interdisciplinary project contributes to defending against WF attacks by employing an “attack-on-attack” approach, in which Adversarial Example (AE) attacks are launched to exploit existing vulnerabilities in the neural network architecture of the WF classifier. The FGSM and DeepFool construction methods are implemented to introduce perturbed data to these models and cause them to misclassify, significantly decreasing the classifier's prediction accuracy.
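    The abstract names FGSM as one of the adversarial-example construction methods. As a reference point, here is a standard FGSM formulation (not the project's actual models or traffic data; the stand-in classifier, input size, and epsilon below are assumptions): the perturbation is the sign of the loss gradient with respect to the input, scaled by a small epsilon.

      # Minimal FGSM sketch against a stand-in classifier (PyTorch).
      import torch
      import torch.nn as nn

      def fgsm_perturb(model: nn.Module, x: torch.Tensor, y: torch.Tensor, epsilon: float) -> torch.Tensor:
          """Return x + epsilon * sign(grad_x loss): the classic FGSM adversarial example."""
          x_adv = x.clone().detach().requires_grad_(True)
          loss = nn.functional.cross_entropy(model(x_adv), y)
          loss.backward()
          return (x_adv + epsilon * x_adv.grad.sign()).detach()

      # Toy usage: a 2-class classifier over 500-feature "traces" (all synthetic).
      model = nn.Sequential(nn.Linear(500, 64), nn.ReLU(), nn.Linear(64, 2))
      x = torch.randn(8, 500)
      y = torch.randint(0, 2, (8,))
      x_adv = fgsm_perturb(model, x, y, epsilon=0.1)
      print("clean predictions:", model(x).argmax(1).tolist())
      print("adversarial predictions:", model(x_adv).argmax(1).tolist())

    DeepFool, by contrast, iteratively searches for a minimal perturbation that crosses the nearest decision boundary rather than taking a single signed-gradient step.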

Copyright owned by the Saudi Digital Library (SDL) © 2024