Written by Smita, Illustrated by Upasana Agarwal
The fourth and final session in the #SkillsforChange: Public Campaigning for Advocacy and Social Change online workshop series took place on 21 August 2020 and looked at digital security and safety in the context of public campaigning. The session was held in partnership with Point of View, a non-profit based in Mumbai, India, that works at the intersections of gender, sexuality, technology, digital rights, and digital security. Point of View's work takes the form of research, inputs into policy spaces, on-ground workshops on digital rights and digital security, and publications on sexuality, gender, and technology.
The session dove straight into what we talk about, and what we don't, when we talk about digital security. With digital security conversations still largely located in global north economies, many of the recommended digital security practices only help those who have had access to technology and those who speak English. This leaves out large sections of the population who use technology, and creates a hierarchy of who deserves to be secure and who simply has to deal with the violence and privacy concerns that may arise when using technology and digital spaces.
With smartphones becoming an integral part of our lives, and with so much of ourselves being online, digital security is not only about protecting profiles on social media and dating apps, but about protecting the people and the bodies that are entwined with the tech. Dr. Anja Kovacs of the Internet Democracy Project looks at the need to put bodies back into data in her research, “Body as data in the age of datafication.”
“Once we start looking at digital security not just as a means to protect our data or profile but as a way to protect a part of ourselves, our approach and thinking around this becomes more embodied.”
Smita
The need to combine our feminist and intersectional politics, lived realities, and movements with the tools when thinking about digital security was also highlighted. For example, asking queer and trans participants in a workshop not to use WhatsApp because it is unsafe is easy, but not sustainable: the recommendation disregards the ground reality that their communities may primarily be on WhatsApp, so they cannot afford to move away from it. Another example is that many secure messengers and tools are not accessible to persons with visual impairments. This clearly shows that blanket recommendations and tech solutionism are an unsustainable model of digital security.
“It is essential that we address patriarchy, capitalism, xenophobia, queerphobia, etc., the core causes behind online violence and privacy violations, to have a holistic approach to digital security.”
Smita

Rather than giving tools to the participants, the workshop focused on how to plan for digital security intentionally when planning a campaign. Working in smaller groups, participants built a risk assessment matrix by discussing each risk, its likelihood and impact, mitigation strategies, and who their allies are and who they can turn to for support if the risk materialises.

In the plenary, participants highlighted the risks that exist for organisers as well as for people who take part in a campaign. One of the key points raised by all the groups was how the risks and their impact vary vastly from country to country, or even within a country. A group strategising digital security for a campaign on mental health and LGBTQI persons pointed out that such a campaign could lead to reverse stigmatisation, with people saying that LGBTQI persons are 'mentally unstable' or 'mentally ill': a very specific risk that would not have come up in an overbroad digital security conversation. Risks that came up repeatedly included crackdowns by state and government authorities, attacks from anti-LGBTQI and religious bodies within the country, harassment from trolls online, and the risk to an organisation's funding if donors do not understand digital campaigns in the same way as the organisers.
The discussions highlighted the need to plan for solidarity and digital security well before a digital campaign is launched, especially mindful cross-border solidarity where the threat comes from a particular state or government, or where internet shutdowns are likely.
The final part of the session looked at the data involved in an online campaign. It emphasised the need to be mindful of the kinds of data we may be collecting in the course of a campaign, and whether it is collected with the meaningful consent of those sharing their data with us. This is particularly important in campaigns that involve contributions from the larger community. Then there is the question of how the data will be stored, and what happens to all this information once the campaign wraps up.
Part of this is also acknowledging the power dynamics present in a campaign: within an organisation, among the people working on the campaign, and between organisers and contributors. Meaningful, clear consent, and the space for contributors to withdraw that consent, is of paramount importance. This cannot be just a signature on a piece of paper; it needs to be rooted in ethical feminist values and practices.
A participant asked how to account for the burnout and safety of people working on a campaign, which pointed to the urgent need to think about holistic security, one that covers both online and offline spaces. One suggestion was for organisations to provide mental health support where possible.
The session also emphasised the need for a 'prevention is better than cure' approach, where campaign planning takes into account the mental health of the people involved in order to prevent burnout, rather than only looking at treatment after the fact, particularly in campaigns around violence.
At its core, the final session of the series stressed that a campaign does not come above the people: when we talk about digital security, we are talking about actual people, and we must approach it with sensitivity, care, and love.
This workshop was the last in a series of four workshops about Public Campaigning for Advocacy and Social Change. Click here to find out more about the series, resource persons, and read the other blogs.