The following are the outputs of the real-time captioning taken during the Eleventh Annual Meeting of the Internet Governance Forum (IGF) in Jalisco, Mexico, from 5 to 9 December 2016. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
***
>> MODERATOR: Should we start? Okay. Thank you very much for being here. Let me start by welcoming you on behalf of our organizations, both Pakistan‑based. We're working on digital rights, gender, and the intersection of the two.
There has been confusion on the agenda. This was listed as a panel discussion, but it's really not. It's a design thinking workshop, and that means there's no panel; everybody here is the panel. So all of you will be working together instead of us telling you what to do. Does that sound good? Great. My colleague Sadaf Khan was supposed to conduct it, and she could not make it. She's an extremely talented individual, so please excuse any shortcomings on my part. Let me start by explaining what a design thinking process is and talking a little bit about the workflow of this sort of workshop. Can I have the presentation, please, on the wall? Okay.
So today we'll talk about solutions to counter online abuse and hate directed specifically at women and LGBTI groups. We're working on this concept in Pakistan and using these methods there as well, and we hope to get feedback from international colleagues. This is what we will be doing today.
The first step is to empathize with people: work in groups, interview people, get on an equal footing with them, and try to understand what their issues are. That's the first step. The second is define, or stating the problem. Thankfully we have experts from APC who will talk in ten minutes about what exactly the problem is, specifically in the South Asian region and from a global perspective. The third part is where you as groups work on coming up with potential solutions to counter the problem. In the interest of time, because we only have one and a half hours, we focus on these first three steps.
We won't work on prototype and test, because those would take a long time and we have limited time here, so we work on the idea and the concept of a solution.
So here's a very basic workflow of this workshop. I'm taking five minutes to explain the basic workflow, and that's this part. Then we hand it over to Erika and Serene, who will state the problem. You work on an activity where you get to know the people in your groups better and work on the grouping, and we'll give you time to interview each other, talk about the problems you face online, and write them down. Then you create point-of-view statements; you have ten minutes for that. After that we have 15 minutes where you can brainstorm with your fellow colleagues and group members to come up with a potential solution, and then, I'm very sorry, but we only have ten minutes for the presentations at the end. So this is something we have to do.
>> AUDIENCE: Ten minutes total?
>> MODERATOR: Two minutes each. Thank you for pointing that out. These are simple questions we wrote down. So you can ask people: what role does the Internet play in your personal and professional life? What kind of speech is abuse and harassment? What do you face, and so on? There is a list of questions. When you're interviewing each other, I'll try to project this on the screen so you have an idea of what to work on.
Then we have point-of-view statements, where we want you to write down what exactly you have taken from those interviews. The template is: to make optimal use of cyberspace and counter online abuse, a user group, say activists or female journalists, needs a way to take some action, for example to report effectively, because of an insight from the interviews, such as how unstandardized language makes reporting difficult.
This looks very complicated, but let me make it easier for you, and here we are. Here's an example of a point-of-view statement: to make optimal use of cyberspace and counter online abuse, female journalists in Pakistan need a way to connect with feminist allies, because openly talking about it leaves them open to further abuse without public allies. Another easier example would be, for instance: to optimize the IGF experience, remote participants from around the world need a way to keep up with active conversations happening in different sessions at the same time, because in most parts of the world, Internet bandwidth for streaming might not be available. You're putting down what you learned from the interviews here.
Then we have our colleague from Pakistan who is actually working on this. Code for Pakistan is a not‑for‑profit that is coming up with a formalized tech solution in Pakistan, and the learning from this process will feed directly into it. My colleague will talk about a potential idea we have come up with in Pakistan. It's by no means final, so this learning will definitely complement and supplement it.
Two things we need to be very, very careful of. We need to be absolutely focused on the work at hand. I know it may be unfair of me to ask that of all of you, but the more focused we are, the better results we get in a design‑thinking exercise. Secondly, like I said, we have extremely limited time, so we have time designated for every activity. It would be great if we come up with solutions and ideas by the end of the activity, and we have to stick to the times. That's extremely important.
I requested my associate to be the timekeeper and tell us when it's time to shut up. Okay. Let's start.
I'm done one minute early, so I can take a couple of questions. Very quickly. No questions? Okay. Then I'll just request: it's a fast‑paced activity, so you have to keep up with the pace. Yes.
>> AUDIENCE: (Indiscernible).
>> MODERATOR: It will feed into the situation in Pakistan, with hate against women and transgender people in Pakistan. That's important. We have done it in smaller cities, but what we lack now is really the international perspective and feedback from colleagues like yourselves. That's very important for us. I'm looking at the clock and it says 12 seconds, so I'll hand over the mic to my colleagues, Serene and Erika. If you can come up and talk a little bit about the problem that we're facing today. Or you can just use the mic there. Thank you. I made the time. That's so cool.
>> SERENE LIM: I'm timing myself as well. I work for a feminist organization based in Malaysia called Empower. We're partners of APC, and we work on Internet rights specifically.
Over the years we have realized that online abuse is a barrier to women getting full access to the benefits brought about by the Internet today. In my very brief five minutes, I will present how this abuse disproportionately reflects the religious cultures and values of a particular society.
In my country, Muslim women, and I believe in many parts of the world as well, face an additional risk: first for being a woman, and second for being a Muslim. So abuse, harassment, and violence against women is a cultural constant, right? Interpretations of Islam have been used to justify acts like the beating of wives, marital rape, or treating women as second-class citizens.
This comes down to things as simple as what to wear and who you should meet. So Muslim women really have no choice but to move to the online space, where they can actually explore their identities and understand the interpretations of the Koran in terms of gender relations and the normative hierarchy. But we see abuse and harassment in this space as well.
Muslim women here especially, when they express a political view or anything going against the dominant interpretations of what being Muslim or Islam is about, receive rape threats, death threats, trolling, and assault. For some really weird reason, you have it worse if you wear a headscarf.
If you wear the scarf, you're supposed to be this good woman. The other concern that we are seeing is the social surveillance of women. The moment we step out of the house, we're under scrutiny for how we behave and what we wear as well.
Over the years, social media has become this space of hypervisibility, so everything you do is no longer private. And this affects even those who do not intentionally seek to be visible online. For instance, there are dedicated websites and social media content to mock and insult noncompliant women: those who don't wear the scarf, single women, women who merely speak loudly in public space, or trans women.
So this visibility is undesired. They never asked for it, but it comes with harassment, stalking, threats, loss of job opportunities, and constant public humiliation and emotional distress as well. What is ironic is that they are visible online, but in a way it makes them invisible in the physical space.
I know women back home who withdrew from their social media accounts and went into hiding, and some even fell into depression. So what we are really seeing is a culture of impunity. Muslim women have really, really little redress for justice, especially where religion is used as a political tool to control our movement and how we should behave.
Whether this is done intentionally or unintentionally, this is an attempt to wipe Muslim women out, online and offline, and as a result they are unable to act on their own, and you're isolating a group of women from the broader fight for rights and justice. As a whole, I would say it's really brutal and cruel. This is a form of violence against women. So I will pass this to Erika. It's been like four minutes, so Erika can give a broader picture. Thank you.
>> ERIKA SMITH: We're not going to be celebrated for what we say but for how quickly we say it. Hopefully we're also being coherent. Thanks a lot.
I think you can see that our hosts are coming from Pakistan and Serene is from Malaysia, and we all come from different countries. We understand now, after a long time talking about it, that gender‑based violence online is not something separate, distinct, or even possible to separate from gender‑based violence offline. Nor is it only about gender, although if we look at things from a gender perspective, we're looking at power relations; it's intersectional. We're understanding that this affects different people who are not women or don't identify as women.
It affects the LGBT community. If you have a different perspective, you see the nuances of this type of violence. It's important to look at, and it's what came out in the best practice forum report last year. You can read that report; a lot of people contributed to it. We won't go into detail here about what gender‑based violence online is.
Amazingly, we may be at a moment where we don't have to explain it, because you know what it is. What's really important is that it's happening everywhere, and it has different nuances.
One of the recommendations of that report was that this really needs a multi‑stakeholder response. It's exciting to see this type of design‑thinking workshop with a diverse community. The other part was that it's important to have governmental and technical solutions, but those can't happen on their own. If they're designed without consulting the people affected, they won't be good technical solutions. We know about the bad solutions that become surveillance weapons against us while meant to protect us, right? I think it's really exciting to have this model.
The other thing that came out strongly in that report was the importance of not overlegislating, and definitely not just looking at criminalization as a solution. In fact, it won't necessarily be a good solution if what women need is something else. That's another thing that I think the process that has been introduced will help us get at.
We know that legislation would take a really long time, and we know about the impunity in so many countries. What other options do we have? What kind of support is out there for the people we're really worried about, when we're seeing what's happening and we want to know what we can do to help? I hope that helps with sufficient framing, and I'll stop. How did I do? I'm done. We need more time to talk, I'm sure.
>> MODERATOR: Thank you so much. It was a very good presentation. Now it's time for me to start pushing you to do some work. Yay. I have these note pads here, and you might find them on your table as well. Can everybody see that?
People sitting in the back, if you want to take part in the activity, you can take a note pad from here. If you just want to observe, that's perfectly all right. What I want you to do is essentially draw your response, any response you might come up with when you see hate or abuse online, especially targeted at LGBTI groups and women, and especially in the context of a South Asian country. Whatever you come up with, draw it on the paper and explain it to the person next to you. Does that make sense?
Draw a picture, or a comment you might come up with, or an illustration of any sort that captures a situation you know, right?
>> AUDIENCE: Repeat the request.
>> MODERATOR: It's very simple. Every day, all of us who use the Internet come across instances of hate and abuse online, right? Usually we have a response to that, even if it's purely emotional, or no response at all. What I want you to do is come up with a response, illustrate it on paper, and explain to the person why that particular response is the one you came up with. It can be anything. You work on it individually and explain it to the person right next to you.
So we have ten minutes for this activity. Ample time, I hope, right? Yeah. Okay. Let's start with that. Anybody from the back who wants to participate can move your chair slightly ahead, join the group here, and just explain to the person next to you, right?
(Small group discussion)
>> MODERATOR: One minute left, guys. One minute. Okay, guys. I think we should stop here for a minute. Excuse me, guys. Time's up, time's up. I'm so sorry. I hate to interrupt everybody, but I think we'll have enough time for brainstorming and talking about the things we drew later. Can people who have drawn something and explained it to their partners and the people sitting next to them please raise their hands? Okay. So that's quite a lot of people. Thank you so much. Can we have a round of applause for all of us?
(Applause)
>> MODERATOR: Excellent. So the next thing that we're going to do is to divide all of you into groups. Does that sound okay? Okay.
And so I see a lot of people sitting together. For instance, I see two of our colleagues from Facebook sitting together, and I think it will be better if one of them can sit here. Perhaps that's a good mix of people and groupings.
Similarly, the idea behind dividing people into groups is that we need to essentially divide this whole group into three or four main stakeholders. One can be activists, and another is techies who are into software development and coming up with online solutions and so on.
The third one is international platforms like Facebook here, for instance, right? Anyone who is a lawyer, maybe that's the fourth category, lawyer or journalist and so on. We need to divide ourselves into groups based on these stakeholder groups.
So, those of you who think of yourselves as activists, can you please raise your hands? Don't be shy, please. Excellent.
So I need everybody to take note. These are the activists. Okay. I need ‑‑ yeah, Civil Society essentially. Now I need the techies to raise their hands. Techies. Anybody who is good with computer coding. There we have it. Excellent.
So can we have the international sort of platforms, global platforms and service providers raise up their hands and intermediaries? Quite a lot of them. Any professionals like journalists or lawyers in the room? Excellent. So this was to basically identify who is sitting where.
>> AUDIENCE: (Indiscernible).
>> MODERATOR: So you're an activist, pretty much. Can we start the time? This is the worst part. This is where I need you to get up from your seats and sit next to a person from another stakeholder group. It can be anybody: a techie, a lawyer, an activist. It can be anybody you want to sit with, right? So if I may, please. Yeah, you have to move. That's the most difficult part.
(Small group discussions)
>> MODERATOR: Thank you so much. I see a lot of groups here. Just for one moment, guys in the back, just for one moment, please. So I see a lot of people in their groups, and that's excellent. People have been asking what we need to do right now. Just sit for two minutes, and I'll tell you what the next steps are.
Right now what we do is interview each other and talk about the questions we've written down in the presentation here. These are sample questions you can ask your colleagues in your group. I see there are large groups of five or six people, so you don't have to interview every single one of them, but ponder these questions, or whatever questions come to your mind, about Internet spaces and their role, the kind of abuse we see on literally a daily basis, and how that affects women's spaces on the Internet, right? These are the standard sample questions to ask in the group.
For this activity we have about 15 minutes. Let me tell you what we're doing afterwards, because these two steps are interconnected. So we have 15 minutes to interview each other, or to do it collectively: just interview within the group and talk about the questions.
The second step is that everyone has to make a point-of-view statement, or if you want to do it as one per group, that's perfectly fine. You need to make a point-of-view statement for the group. You can do more if you want, but at least one point-of-view statement per group. Let me very quickly talk about how you can make a point-of-view statement.
So very quickly, what you can do is take notes on the questions you ask your colleagues, and see what exactly their roles on the Internet are. Are they activists, or random personal users who use the Internet for Facebook and their own personal social networking, or for digital activism, or for daily routines like a journalist or lawyer? What are the effects?
What is the impact of online abuse on them? So using the data you come up with this standard point-of-view statement.
So I have made one as a sample. I'll repeat it. This says: to make optimal use of cyberspace and counter online abuse. Then I have put in a group, female journalists in Pakistan, in my point-of-view statement. They want to connect with feminist allies.
When they openly talk on the Internet, they get harassed, so the action they need is to get in touch with feminist allies and community groups. This is a sample point-of-view statement. If you have your own ideas, pour them into this. I'm projecting this sample here. It will be there for the next 30 minutes.
When you're working on it, you have about 10 minutes for this. In all, for the next activity, we have 25 minutes where you interview each other and put together a point-of-view statement, at least one per group. You can make more if you want to, but at least one per group. Does that sound okay? Are there any questions?
If you have any questions, I'll be here, and maybe you can start working on it. Then I'll come to each of the group members and basically sort of address whatever questions you have. All right?
Here are the questions, and so whenever you're done with the questions, I can project the point-of-view statement. Does that sound all right? Okay. We have 25 minutes for that, so let's start.
(Small group discussions)
>> MODERATOR: We have three more minutes, guys, for the interviews. Three more minutes for the interviews.
(Small group discussions)
>> MODERATOR: We have to stop with interviews right now or we won't have time for presentations. Guys, the time for interviews is up. Excuse me. I'm so sorry to interrupt you, but the time for interviews is up. Otherwise, we won't have time for presentations, and that's a shame because we're talking about so many good things and we won't have time for presentations.
That won't be good. Guys, I'm sorry. I'm so sorry to bother and interrupt you, but we won't have time for presentations, and that would be a shame. Groups in the back: the interviews are over. For the next step, we have five minutes to take the learning from the interviews and create a point-of-view statement. All right?
A sample point-of-view statement is up there on the screen. Now, this point-of-view statement is actually the most important activity of this workshop, because it will help us learn from whatever you discussed in your groups and whomever you interviewed, right? This will feed directly into our initiative.
This is the most important thing right now. We have five minutes for the point-of-view statement, one per group. If you want to do more, that's completely up to you. At least one per group, right?
So it's on the screen. And there is another sample if you want to have a look. You have five minutes now, all right? Thank you.
(Small group discussions)
>> MODERATOR: Guys, I hate to be the bad guy here. Excuse me. We have to stop now. Guys, I'm sorry. I hate to be the bad guy. I know you hate me right now, but we have to stop for a bit. The groups who have written their point-of-view statements, can they please raise their hands? Okay. Have you done the POV statement? Okay. So start the time, and they can brainstorm.
Guys, we have to stop with the POV statements now. I'm sorry, we have to stop with the POV statements now. I'm so sorry. Perhaps we can have this discussion over lunch again. Can everyone please look here? Can you look here?
So this is the time when we stop the POV statements, because otherwise we won't have time to present them. That's a shame; we don't want that. Erika, we're over time, so we need the POV statements. Everybody by now should ideally be done with their POV statements. Please raise your hands if you have done it.
Are you done with the POV statement? Okay. So here's the next thing. This is extremely important. Now you have a point-of-view statement.
Can we just focus here for one minute? Now you have a point-of-view statement. In other words, A, you have an idea of the challenge of a particular stakeholder group, and B, you have a general idea of the kind of solution they want. Right? You know their requirements from the point-of-view statement, right?
For instance, in my case I know what female journalists require online. They need a way to get in touch with their feminist allies, right? That's something they require. Ideally, when you write your POV statements, you should have a stakeholder group and an action in mind.
The action here is connecting, and C, the other segment is the group they connect with, which is feminist allies in this case. That's the technical requirement in terms of the POV. What we have to do for the next eight minutes is think of a solution, a digital online solution, to cater to that requirement. Can we do that? We have eight minutes to come up with a solution to cater to that requirement. Think broadly. Think of anything at all.
(Small group discussions)
>> MODERATOR: Okay, guys. I hate to interrupt you. The time is up. Excuse me. Guys, the time is up. I have to start calling names again. The time is up. Erika, I'm sorry. We're out of time.
Thank you so much. We're out of time. Okay. Excellent. I'll start calling names if you don't stop. Sorry. Show of hands: which groups have come up with a potential solution, or anything close to that? Okay. Thank you so much. Now, before we go on to that, let's have a big round of applause for all of you.
(Applause)
>> MODERATOR: I'm sorry. I hate to be the bad guy here; apparently I am. Now what we want is a volunteer from each group to, in two minutes, talk about, A, the requirement you had, and B, a potential idea for a solution. You know, it might not be a full solution. It can be anything: a statement, something you think can be done in the situation, regardless of the laws and policies in place. In an ideal world, what do you think should be done? Can we have a volunteer from this group? Excellent.
Can we have a volunteer from the other group? Who is the volunteer? Okay. So you can come here. Okay. So you have two minutes. Okay. Here you are.
>> AUDIENCE: All right. What we came up with was: to make optimal use of cyberspace and counter online abuse, women in general need to have more awareness, have their privacy protected, and have a safe space to learn and exchange ideas. The problem with that, especially for privacy and a safe space, is that in the cyber world anybody can get in. We needed some sort of authentication for who is entering which sites.
In that case, there could be some capacity building or some cooperation with the service providers to provide authentication schemes. There are discussions about using mobile phone numbers for authentication to enter some sites, and that could be useful. There was a conclusion that neither censorship nor overregulation would be the solution, but rather collaborating with the authorities in case of threats that could become criminal acts.
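The phone-number authentication idea mentioned here could look, very roughly, like a one-time-code gate: the site texts a short code to the user's number and only grants entry when the code is echoed back within a short window. This is purely a hypothetical sketch; the class name, code length, and expiry window are all assumptions, not anything the group specified.

```python
# Hypothetical sketch of phone-number authentication via one-time codes.
import hmac
import secrets
import time

class OtpGate:
    def __init__(self, ttl_seconds=120):
        self.ttl = ttl_seconds
        self.pending = {}  # phone -> (code, issued_at)

    def request_code(self, phone):
        """Issue a fresh 6-digit code for this phone number."""
        code = f"{secrets.randbelow(1_000_000):06d}"
        self.pending[phone] = (code, time.time())
        return code  # in a real system this is sent over SMS, not returned

    def verify(self, phone, code):
        """Accept the code once, and only while it is still fresh."""
        entry = self.pending.pop(phone, None)  # pop => single use
        if entry is None:
            return False
        expected, issued = entry
        fresh = (time.time() - issued) <= self.ttl
        # constant-time comparison to avoid leaking the code
        return fresh and hmac.compare_digest(expected, code)
```

A code can be used only once, since verification removes the pending entry, so a replayed code is rejected.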
>> MODERATOR: Okay. Thank you. Thank you so much.
>> AUDIENCE: Hi. My name is Japlin. We had a number of solutions, long‑term and short‑term. Some of our discussion was that, apart from having cyber laws and accountability for social media corporations, it is important to have community solidarity and engaged community mechanisms: among each other, among activists, and among women in general. Then we went on to ‑‑ again, coming from the community.
Maybe having a list of collectives or people or organizations that you can approach when you're being trolled or harassed. Then there was another tactic that is already in use a lot but is debatable; I know a lot of people do not like it, but a lot do also use it. It's naming and shaming repeat offenders. So if it suits you and you do not have a problem with it, that could be one strategy.
Then the long‑term solutions we had were, obviously, more education and awareness and cultural civility, and very important to have more women in STEM and more women in leadership positions, in decision‑making positions and in the designing of the technology. Thank you.
(Applause)
>> MODERATOR: Thank you so much. Are we done with this group? Do we have a volunteer? Thank you so much.
>> AUDIENCE: Hello. We talked about nonconsensual intimate images also known as revenge porn. We came up with an idea specific to that.
So we said to make social media safer for everyone, platforms should raise the cost of posting and sharing nonconsensual intimate images for perpetrators, because this will help signal that it is unacceptable behavior. And then we had two ideas ‑‑ well, I guess we kind of had three. We talked about facial recognition and the fact that when you post a picture ‑‑ when somebody else posts your picture, you should be able to get it taken down. Turns out Facebook is doing more on that. That's good news.
We really think that consequences for people who share images, in addition to those who post them, have to be a bigger thing, because by sharing you're part of the problem; you're not a neutral actor. Oh, then how do we describe the downloading issue?
>> AUDIENCE: Sorry. There is a big thing we don't know how to assess. When you receive something on WhatsApp, for example, it gets saved on your device. It's not online anymore, so it's not on the web, and that's something we don't know how to deal with.
(Too low volume)
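The takedown mechanics this group describes, raising the cost of re-posting a reported image, are often built on image fingerprinting: the platform keeps fingerprints of reported images and checks new uploads against them. Below is a deliberately tiny, hypothetical sketch of that idea; real platforms use robust perceptual hashing systems rather than this toy average hash, and every name and number here is invented for illustration.

```python
# Hypothetical sketch of hash-based re-upload blocking for reported images.

def average_hash(pixels):
    """Tiny perceptual fingerprint of a grayscale image (list of rows of
    0-255 values): one bit per pixel, set if brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    """Number of differing bits between two fingerprints."""
    return sum(a != b for a, b in zip(h1, h2))

class TakedownRegistry:
    def __init__(self, threshold=2):
        self.blocked = []          # fingerprints of reported images
        self.threshold = threshold # tolerate small edits (crops, filters)

    def report(self, pixels):
        """Record the fingerprint of a reported image."""
        self.blocked.append(average_hash(pixels))

    def allows_upload(self, pixels):
        """Reject uploads whose fingerprint is near any reported one."""
        h = average_hash(pixels)
        return all(hamming(h, b) > self.threshold for b in self.blocked)
```

The point of the fuzzy (threshold-based) match is that a lightly edited copy of a reported image is still blocked, while unrelated images pass.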
>> MODERATOR: Thank you so much. That was really relevant. Who's the next group?
>> AUDIENCE: Sorry. Okay. So we talked about, yeah, women and intimidation, and we made the statement that to make optimal use of cyberspace and counter online abuse, people defending women's rights need a way to receive support from a caring and committed online community, because in their context continued harassment will have a chilling effect on them raising their voices.
And we thought about how to find a response in this context. Basically, we talked about a response that could be to activate an alert when defenders are attacked, which helps generate positive and loving speech from both men and women to denormalize the attacks. Basically, to be able to say this kind of behavior is not acceptable, and to do that, yeah.
>> MODERATOR: Thank you so much.
(Applause)
>> MODERATOR: Who is the other group? Should I bring you the mic there? Okay. I'm so sorry.
>> AUDIENCE: So our group was talking a lot about finding online solutions to abuse that happens online. Our point-of-view statement was: women and at‑risk groups need safe spaces online to interact with allies to counter the abuse, because fighting alone won't solve the problem. We were talking about how, even though we kind of work in the same Civil Society networks, we didn't know what other organizations were doing about the solutions that do exist.
So we were saying that one of the solutions should be for groups that do have solutions to find spaces to share them, so that we can share positive examples where people have successfully fought online abuse, and also local initiatives, so that people know that there is support out there. We also wanted to talk about connecting offline spaces and online spaces, not forgetting about that: more local campaigns, telling stories and best practices on how to handle it, and making the connection between people online and offline.
>> MODERATOR: Great. Thank you.
(Applause)
>> MODERATOR: Can you pass me the mic, please.
>> AUDIENCE: So our group, we were talking about being on the Internet as members of the LGBTQ community. Our statement was: to counter abuse against women and LGBTQ people, we need a way to effectively seek justice for crimes committed against these people without their moral character being called into question or examined.
So we came up with two solutions. One was specific training directed at law enforcement and the judiciary, so it's aimed at them specifically: training on how they need to go about prosecuting and trying these crimes. The second one was calling out the behavior as problematic for this group of persons.
So for people who maybe have been to law enforcement and didn't get justice for their situations, to actually speak out and say, you know, there's a law against this. I went to the police. Nothing was done. The way they handled my case was problematic. Those were our two solutions.
>> MODERATOR: Thank you. One more group is here. Guys, would you like to present? Would you like to present? There you are. Come on and do it.
>> AUDIENCE: Hi. I'm from Hong Kong. My name is Zoe. We discussed cyber‑bullying in Hong Kong. Our statement is: to make optimal use of cyberspace and counter bullying in Hong Kong, we need to get schools to recognize the cyber‑bullying problem and create spaces for solutions.
We discussed our solutions, and it's just an idea at this stage. We need some offline education; for example, schools need to hold talks or workshops to get more people to recognize the seriousness of the cyber‑bullying problem. We can also use platforms like YouTube, because many young people watch YouTube, so we can raise more awareness among young people there. And we could have some advertisements on TV, which would be work for the government. Thank you.
(Applause)
>> MODERATOR: Okay, guys. Thank you very much. I'm sure you've had the opportunity to hear from a lot of people here. What you might want to hold on to is the notes you've taken, because those are extremely important for us. This is where we take the feedback and views from you and try to add them into what we're doing.
Now, moving on to the next part of this workshop, the design thinking workshop exercise is finished.
I have my colleague here from Pakistan. She's with Code for Pakistan, and they're the partners working to develop a technological solution to do basically whatever we have discussed here.
Let me flag that the solution we're talking about is by no means final. There are a lot of privacy issues that we are currently trying to deal with while the code is in development, and so on. The presentation is to give you an idea of how your feedback will be used. So Sadaf, if you can take some time.
>> Hi everyone. Can we have the presentation up? Thank you. My name is Sadaf Habib. Okay. I'll just use ‑‑ so I work with an organization called Code for Pakistan. Can everyone hear me?
>> Not well. There's a lot of noise from that.
>> I'll use the other one.
>> It's from next door.
>> SADAF HABIB: I work with an organization called Code for Pakistan. We work on leveraging tech and digital tools for civic and social good. Some of the areas we work on include citizen engagement and citizen empowerment, and we partner with NGOs like Media Matters, with government departments, and with other Civil Society organizations, and we are interested in open data projects. Can we have the next slide, please? The next one, please. This is a brief overview of the app we developed.
It's a very, very basic proof of concept right now. I'll just run everyone through what the app does, what we have so far. So, first of all, you launch the app and log in using your Facebook credentials, your Facebook username and password. Once you log in, you get the home screen. The menu has four options right now. If you click Report, you get a list of all the users who have commented on posts on your timeline.
So to report an offensive or abusive comment, first you select the user.
Then all the posts that that user has commented on in your timeline will be displayed. You can select the comment that you want to report, and once you select the comment, you can select a category. I don't know if you can read it from here. There are three categories so far: sexual harassment, incitement of violence or trans rights, or all of the above. Once you select a category, your report is sent.
The second feature is the Highlight feature. Again, once you pick Highlight, you can select a category. Once you select a category, the users on your friend list whom you have reported for that category are displayed.
Next slide. The third feature is the Browse Reports feature. When you click Browse Reports, again, first you get the dialogue box, and you can select a category. Here you can browse reports that other users have made. There's a small box for feedback, and there's a dislike button as well, so it acts as a sort of discussion forum. This is a very basic version of the Browse Reports feature, but we also have additional options for users to choose whether or not their posts are displayed in this section. The final feature is Browse Posts. Thank you.
Again, you can select one of the categories. Once you select one, you can see all the posts that you have reported in that particular category. If you wish to do so, you can click the little X button in the corner, and that post will be unreported. So this is the solution we have so far, if anyone has any questions.
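As described in the walkthrough, all four features (Report, Highlight, Browse Reports, and unreporting) operate on the same underlying list of categorized reports. The actual app's code was not shown in the session, so the following is only an illustrative sketch of that shared logic, with hypothetical names (`ReportStore`, `file_report`, etc.) standing in for whatever the real implementation uses:

```python
from dataclasses import dataclass

# Categories as listed on the slide (assumed wording).
CATEGORIES = ["sexual harassment", "incitement of violence or trans rights",
              "all of the above"]

@dataclass
class Report:
    reporter: str       # user who filed the report
    reported_user: str  # user whose comment was reported
    comment_id: str     # identifier of the offending comment
    category: str       # one of CATEGORIES

class ReportStore:
    """Hypothetical in-memory store backing the app's four features."""

    def __init__(self):
        self.reports = []

    def file_report(self, reporter, reported_user, comment_id, category):
        # "Report" feature: select user, comment, and category, then submit.
        if category not in CATEGORIES:
            raise ValueError(f"unknown category: {category}")
        self.reports.append(Report(reporter, reported_user, comment_id, category))

    def highlight(self, reporter, category):
        # "Highlight" feature: friends this user has reported in a category.
        return sorted({r.reported_user for r in self.reports
                       if r.reporter == reporter and r.category == category})

    def browse(self, category):
        # "Browse Reports" feature: all reports filed in a category.
        return [r for r in self.reports if r.category == category]

    def unreport(self, reporter, comment_id):
        # The "little X button": withdraw one of your own reports.
        self.reports = [r for r in self.reports
                        if not (r.reporter == reporter and r.comment_id == comment_id)]
```

For example, after filing two reports under "sexual harassment", `highlight` would list both reported users, and `unreport` would remove one of them again. Moderation review, the feedback box, and the privacy controls mentioned by the speaker are deliberately left out of this sketch.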
>> MODERATOR: Thank you so much. Unfortunately, we don't have time for questions, because we have our colleague here, Japleen, and we wanted to use the question time to talk about the recommendations from the report she has done. That's my timer saying that it's over. You know, the management has been kind enough to give us five minutes extra, so we have five minutes for you.
>> AUDIENCE: I'll be very quick. I'm going to be very, very quick. I'm Japleen from New Delhi. I run an organization that works at the intersection of gender, media, and technology. I'm sorry. Okay. Sorry. I recently did research with 500 women in India with the support of Freedom House from the U.S.
I surveyed 500 women. I'm going to very quickly talk about some research findings from my report, and if you're interested you can have a copy of the report. Out of these 500 women that I surveyed, more than 50% of them reported having been violated, abused, or harassed at some point, at different times. One thing I want to emphasize here: a lot of people have asked me, why do you call it online violence and not abuse or harassment?
I would like to emphasize that the violence we experience online as women is not just restricted to our screens. It actually translates beyond our screens in various ways. Many women reported in my survey that online violence has adverse effects on their mental health and social and emotional well‑being, where it can lead to depression and self‑censorship, where they decide to get off the platform in question or off the Internet totally. Apart from this, this kind of online violence translates into physical danger when private information, such as women's home addresses, is put up online, which is called doxxing, or through what is known as revenge porn, a form of sexual abuse involving the nonconsensual usage of private videos and photographs. It has led to suicides, especially in conservative countries: when a private video goes online nonconsensually, women are shamed to the extent that they commit suicide.
Recently there was the case of Qandeel Baloch; her brother killed her because she posted "inappropriate" pictures online. I have analyzed examples like these, along with Indian cyber laws. We have a lot of laws, but they only help you to some extent, or they're not implemented for some people. I have some recommendations for social media websites, for government agencies and law enforcement agencies, and also for users, because it is very important for us as women ‑‑ the way to combat online violence is not to get off the Internet but to reclaim the online space. Thank you. I will end here. If you're interested, please grab a copy.
(Applause)
>> MODERATOR: Thank you so much for taking time out. I can understand this was a very hectic workshop for you, but thank you so much.
(Session ended at 13:34 CT)