
IGF 2024 WS #211 Disability & Data Protection for Digital Inclusion

    Organizer 1: Fawaz Shaheen, Centre for Communication Governance, NLU Delhi
    Organizer 2: Tithi Neogi, Centre for Communication Governance, NLU Delhi
    Organizer 3: Angelina Dash, Centre for Communication Governance, National Law University Delhi

    Speaker 1: Zeynep Varoglu, Inter-Governmental Organisation, Western Europe and Other States
    Speaker 2: Ariana Aboulafia, Civil Society, Western Europe and Other States
    Speaker 3: Maitreya Shah

    Additional Speakers

    Osama Manzar, Civil Society, Asia-Pacific Group

    Eleni Boursinou, Inter-Governmental Organisation, Western Europe and Other States

    Moderator

    Fawaz Shaheen, Civil Society, Asia-Pacific Group

    Online Moderator

    Tithi Neogi, Civil Society, Asia-Pacific Group

    Rapporteur

    Angelina Dash, Civil Society, Asia-Pacific Group

    Format

    Classroom Format
    Duration (minutes): 90
    Format description: The classroom workshop layout is advantageous because it allows for more structure and clearer outcomes and learning objectives. To ensure that the format promotes open communication among diverse participants, we will hold a moderated discussion and include collaborative breakout sessions. The chosen duration of 90 minutes further fosters participant engagement and accounts for any unforeseen accessibility considerations. The first part of the workshop comprises two policy questions to be discussed over 50 minutes, interspersed with 5-minute Q&A sessions after each policy question. The second part of the workshop will focus on collaborative breakout sessions in which participants co-create the Multistakeholder Code of Best Practices over 15 minutes. Additionally, the session requires 5 minutes for introductory remarks and 10 minutes for closing remarks.

    Policy Question(s)

    A. How can data protection frameworks protect the privacy rights of persons with disabilities?
    B. How does digital accessibility strengthen data protection for persons with disabilities? How can consent mechanisms be made more accessible for persons with disabilities?
    C. How does automated decision-making affect persons with disabilities? How can data protection principles and frameworks prevent discrimination against persons with disabilities by automated decision-making systems?

    What will participants gain from attending this session? 

    Participants will gain insights into how data protection frameworks can enable ADM technologies to be fair to persons with disabilities. The session aims to equip participants with an understanding of accessibility-by-design mandates, opt-out measures for data processing, and transparency disclosures by ADM technologies, in order to empower persons with disabilities to make informed decisions about how their data is processed. Our speakers belong to diverse stakeholder groups, regions and genders, and some have lived experience of disability. Their expertise ranges from digital inclusion and engagement at the grassroots to policy research on digital accessibility and public interest advocacy in India and the US. We also have a speaker working on accessibility, inclusivity and digital governance in the inter-governmental sector. The session aims to enrich participants' perspectives through collaboration on pathways for an inclusive Internet. By centring accessibility as a precursor to meaningful engagement, the proposed workshop will also inspire participants to employ similar accessibility measures in future events.


    Description:
    Multistakeholder conversations around Internet governance often do not adequately include disability-related concerns within their ambit. This omission becomes more concerning when discrimination against persons with disabilities is furthered by artificial intelligence and automated decision-making (ADM) technologies. These technologies can process personal data in a manner that produces unfair decisions about persons with disabilities, preventing them from using the Internet to achieve economic growth and holistic development.

    This collaborative workshop (classroom format) initiates a multistakeholder conversation to advance human rights by securing digital inclusion through data protection frameworks for persons with disabilities. Using innovative approaches, speakers and participants will collaboratively design best practices to achieve an inclusive Internet for persons with disabilities. This will be accomplished by: 1) exploring the interplay between digital accessibility, data protection and ADM; and 2) examining how data protection frameworks can address technology-facilitated inequalities faced by persons with disabilities, in alignment with the SDGs.

    Relying on the Centre for Communication Governance’s ongoing research on centring disability in India’s Digital Personal Data Protection Act, 2023, the workshop will be bolstered by insights from CCG’s continuous engagement with diverse stakeholders from the disability and technology ecosystem. The workshop will facilitate the collaborative design of ‘Our Shared Vision’: a multistakeholder code of best practices toward digital inclusion for persons with disabilities. These best practices will emerge from the exchange of ideas between the participants and the diverse speakers from various stakeholder groups across industry, civil society, academia and inter-governmental organisations.

    Our diverse panel includes persons with disabilities, whose experiences and expertise represent the heterogeneity of persons with disabilities on the Internet. Recognising that disability is not a monolith, and that persons with different disabilities face discrimination through technology differently, is crucial to achieving digital inclusion, autonomy and user choice for persons with disabilities.

    Background paper: ‘Persons with Disabilities vis-à-vis the Digital Personal Data Protection Act, 2023’


    Expected Outcomes

    This session enables experiential outcomes through multistakeholder discussions on how data protection frameworks can prevent technology-facilitated discrimination against persons with disabilities. CCG aims to realise the following outcomes:
    1. ‘Our Shared Vision’: a multistakeholder code of best practices toward digital inclusion for persons with disabilities, to be disseminated on the CCG Blog and social media.
    2. A CCG blog series on Disability and Data Protection focused on multistakeholder solutions to the gaps identified in the workshop.
    3. A summary report of the IGF workshop, to be shared with our speakers, published on the CCG Blog and disseminated across CCG’s social media.
    4. A CCG podcast discussing the workshop’s learnings and ‘Our Shared Vision’, to be disseminated on CCG’s podcast channels (Spotify, YouTube, etc.) along with social media outreach.
    5. Integration of the workshop’s learnings on building inclusive Internet governance for persons with disabilities into CCG’s courses and bootcamps.

    Hybrid Format: To ensure meaningful engagement, the online and onsite moderators will run separate online and onsite interactive breakout sessions and then reassemble the group to exchange insights on the multistakeholder code with speakers and other participants. For continued engagement, this code will remain available to view or download for a week after the session concludes. To centre accessibility within hybrid participation, a workshop policy on institutional values and the accommodations provided will be shared through a QR code before the session commences. These accommodations include closed captioning and easily readable workshop documents (speakers’ resources and the workshop policy) with text descriptions for images and an accessible colour contrast ratio of at least 4.5:1. The session will promote participant engagement through Q&A sessions after each policy question, as well as breakout sessions. The multistakeholder code will be predesigned in Google Documents rather than advanced interactive tools like Miro, ensuring accessibility and avoiding functional difficulties.

    Key Takeaways

    There is a lack of representation of persons with disabilities in the data sets and research used to build more inclusive automated decision-making systems and AI-powered assistive technology. This lack of representation manifests as bias in chat-based generative AI technologies.

    Digital accessibility is a precursor to meaningful and informed consent. It is therefore the stepping stone towards ensuring data autonomy for persons with disabilities and securing the inclusive Internet we want.

    There is an inherent tradeoff between privacy and accessibility: AI-powered technologies often violate the privacy of persons with disabilities under the garb of accessibility. For instance, accessibility features layered with AI to make websites work with assistive technologies can reveal data such as screen reader usage. This relates to a larger issue of false optimism around AI, where the intent of companies can be unclear.

    Call to Action

    Multistakeholder cooperation between stakeholder groups such as civil society, governments, industry and academia is a must, and persons with disabilities should be adequately represented in such conversations. We aim to facilitate this multistakeholderism through ‘Our Shared Vision: a Multistakeholder Code of Best Practices for an Inclusive Internet’.

    Companies should strive towards more inclusive technical design, particularly for AI-powered assistive technologies and ADM systems, throughout the entire life cycle, including design, deployment and governance. Data protection frameworks should account for the diverse challenges faced by persons with disabilities, especially intersectional marginalisation due to gender, caste, poverty and illiteracy.

    Session Report

    IGF 2024 WS #211 Disability and Data Protection for Digital Inclusion

    Fawaz Shaheen (Centre for Communication Governance at National Law University Delhi), the onsite moderator, explained that the workshop sought to initiate a multistakeholder conversation on securing digital inclusion and data autonomy for persons with disabilities through data protection frameworks. The session explored how data protection frameworks can enable Automated Decision-Making (“ADM”) technologies to be fair for persons with disabilities.

    The workshop began with a presentation by session organisers Angelina Dash and Tithi Neogi (Centre for Communication Governance at National Law University Delhi), who presented key concerns, recommendations and takeaways from their policy brief on ‘Persons with Disabilities vis-à-vis the Digital Personal Data Protection Act, 2023’. India’s Digital Personal Data Protection Act (“DPDPA”) clubs persons with disabilities together with children, and equates lawful guardians of persons with disabilities with parents of children, for the purposes of collecting their personal data and exercising their rights over such data. This leads to the infantilisation of persons with disabilities and undermines their dignity and autonomy. The DPDPA enables the collection of consent from persons with disabilities through their lawful guardians without providing for any consultative framework based on mutual decision-making with such persons. Digital accessibility, which is a precursor to privacy and data protection, is often overlooked in the formulation of consent mechanisms. The DPDPA also does not carve out a distinct category of sensitive personal data with additional safeguards, making personal data of persons with disabilities relating to health, finances, etc., more vulnerable to exposure and discriminatory use.

    Based on their research, the session organisers recommended the inclusion of a separate provision for persons with disabilities (including a definition) in the DPDPA, the incorporation of mutual decision-making principles, accessible consent mechanisms compatible with assistive technologies, and a separate category with heightened protection for sensitive personal data. The session organisers then laid out some common themes from their research - digital accessibility, data autonomy and data protection - connecting them to a global, multistakeholder conversation on digital inclusion, as envisioned for the workshop.

    The second part of the workshop involved two rounds of discussion with the speakers.

    Round 1: How digital accessibility strengthens data protection for persons with disabilities.

    Eleni Boursinou (UNESCO’s Communication and Information sector/Universal Access to Information section) began by discussing the role digital accessibility plays in furthering the Sustainable Development Goals (“SDGs”). She pointed out that digital accessibility plays a critical role in advancing the SDGs by fostering inclusion and equity, particularly for marginalised groups, including persons with disabilities. She was of the view that the equitable participation, empowerment and data autonomy of marginalised groups can be promoted by embedding digital accessibility and inclusive design into policies and practices.

    Following this conversation, Osama Manzar (Digital Empowerment Foundation, “DEF”) discussed the challenges of preserving anonymity and privacy while gathering sufficient data to support the needs of persons with disabilities. He provided an overview of DEF’s ongoing work to promote the digital inclusion of persons with disabilities, who have become successful entrepreneurs in their villages after receiving training through digital tools.

    Maitreya Shah (University of California, Berkeley; Berkman Klein Center for Internet and Society at Harvard University) continued the discussion by highlighting the unique challenges persons with disabilities face with regard to digital accessibility and data protection in India. He addressed how biometric iris scans under programmes like Aadhaar can have greater inaccuracies and privacy implications for persons with disabilities. He also pointed out that there is often an ostensible tradeoff between accessibility and privacy, where users with disabilities are compelled to compromise on privacy in order to access technology. This raises concerns because automated tools designed to address accessibility requirements often violate users’ privacy under the garb of enhancing accessibility.

    Round 2: Data availability and the impact of ADM systems on persons with disabilities.

    Osama elaborated on the importance of innovation in collecting data from persons with disabilities and contextualising it. Data collected in relation to persons with disabilities is particularly diverse because of the diversity in their abilities, and innovation must be matched with equal contextualisation and the cross-pollination of existing technologies. While ADM systems have risen to prominence with recent innovations, existing technologies like mobile phones are more accessible for persons with disabilities, as they offer options based on users’ abilities: for someone with a visual impairment, for instance, mobile phones provide audio and tactile tools. These existing technologies check the box for compatible abilities, not disabilities; ADM systems, on the other hand, are known to discriminate because they rely on checkboxes for disabilities.

    Maitreya carried the conversation forward by highlighting how the lack of representation of persons with disabilities in datasets manifests as bias in generative AI and AI chatbots. This is particularly evident in decisions made by ADM systems. He pointed out that accountability must be affixed by shifting the onus from users (especially persons with disabilities and their lawful guardians under Indian frameworks) to data fiduciaries. ADM systems must account for persons with disabilities throughout their entire life cycle, including design, deployment and governance.

    Eleni supported and built on Maitreya’s points by referring to UNESCO’s contributions in the area of open online education. She highlighted how ADM systems in education pose additional risks for persons with disabilities, as biases in data and algorithms can result in discriminatory outcomes such as biased admissions decisions, assessments or resource allocation. She recommended that AI systems integrate accessibility from design to implementation, ensuring compatibility with assistive technologies, and suggested that embedding fairness, transparency and explainability in ADM systems is key to promoting inclusivity.

    Q&A session

    In the Q&A session, the audience raised questions on accounting for the diversity of disability in technological innovation and datasets, and on affixing accountability. A participant asked how existing multistakeholder forums on digital accessibility can be leveraged, while acknowledging the need for more multistakeholder conversations, and highlighted the work of the existing Dynamic Coalition on Digital Accessibility within the IGF. The participant acknowledged the concerns raised by the speakers about how disability is often construed narrowly in medico-legal terms, excluding persons whose disabilities are not medically recognised. They shared another concern, common to standardisation efforts: that broader conceptions of disability can be misused by malicious actors to avail themselves of benefits meant for persons with disabilities. They underlined the need to address these concerns through stronger cooperation between different stakeholders.

    Conclusion

    The presentation, conversations and Q&A session led to a consensus on the need for multistakeholderism to promote digital inclusion for persons with disabilities. The speakers and participants deliberated on different ways to achieve this. Digital accessibility emerged as a common goal for participants, with speakers adding that it must be understood and applied in a more nuanced and comprehensive way, by focusing on strengthening abilities rather than trying to match the various disabilities. While there is a need for preparedness and innovation in making new technologies more accessible through new coalitions, the strengths of existing technologies and coalitions must also be explored.