By Dart Centre Asia Pacific CEO Erin Smith
and DCAP Board Chair Trina McLellan
Social media journalists, especially moderators and live bloggers, are like firefighters on the digital frontline, but putting out online fires is not without risks.
Scroll through your online news and social media feeds and they likely look quite different from even a month or two ago.
In busy newsrooms, thousands of images, stories, posts and comments about the Israel-Gaza war – on top of violent conflicts elsewhere – are shared daily: content that can be traumatic, evocative and distressing. A bloodied hostage tossed into a truck, a child’s lifeless body under a sheet.
The social media journalist’s job is to decide – on the spot – whether to publish such content, often without much time for fact-checking or verification, which opens them up to personal criticism from disaffected consumers.
Comments on some posts are rife with abuse, hatred and anger. Indeed, the calibre of comments has devolved to a point where, for social media journalists, the disturbing onslaught is almost without precedent.
The Israel-Hamas war has everyone’s attention, in visceral ways: so many people feel affected by what is happening, and voices from both sides express sharply different points of view.
Perhaps you skip past the comments. Maybe you have limited your social media notifications.
But, if it's part of your job to moderate comments or publish posts, you can't look away, even when you're confronted with comments that question your own humanity and right to exist.
You may even have to let such comments run because they don't fall outside your organisation’s guidelines.
Social media journalists are exposed to put-downs, in-comment challenges, threats, racism, homophobia, misogyny or blame for things that are beyond their control.
It's a role that’s now more common but, in some newsrooms, remains little understood and is, therefore, rife with risk of harm.
While social media journalists strive to make the digital world a safer place for others, their role constantly exposes them to disturbing, disheartening and ultimately unpublishable content.
These journalists are, essentially, like firefighters on the digital frontline, dousing potential fires and protecting unsuspecting users from emotionally unsettling content. They do this for seven or more hours per shift. Protracted exposure can have a significant impact on wellbeing.
With greater understanding of the impact of online and social media production, including content moderation, newsrooms will be better placed to offer timely, evidence-based support, allowing these journalists to perform their job in ways that lessen the personal risk of harm.
No matter how cutting-edge technology may be, the human eye is still instrumental in assessing and making final decisions when responding to potentially harmful online content.
Substantial research evidence suggests that constant exposure to traumatic and graphic content can affect journalists’ wellbeing.
It can also have a knock-on effect on job satisfaction and productivity as well as increase vulnerability to burnout, compassion fatigue and vicarious trauma.
Despite these profound challenges, most social media journalists are generally resilient and cope relatively well with the stresses associated with their role.
However, if we want them to keep putting out digital fires, it is critical that social media journalists are appropriately supported.
While access to an external Employee Assistance Program (EAP) may be one option, support from a trained peer can be more immediate and effective, because peers are more likely to understand the pressures involved.
In any case, like other frontline workers, social media journalists may initially be reluctant to seek support, preferring to speak to someone they know and trust who understands the unique challenges associated with their role.
Worse, research shows that a toxic organisational culture can further reduce help-seeking, where asking for support or relief is likely to be seen as weak, career-limiting or even career-ending.
While online abuse can seem never-ending, some of it doesn’t even come from humans: during the COVID-19 pandemic, researchers at Carnegie Mellon University identified that nearly half of Twitter accounts posting about the coronavirus were likely bots.
Research by Columbia University and the French National Institute found that 59 per cent of links shared on social media were never clicked. In other words, most people appeared to share and comment on news posts without ever reading them.
Such “blind” shares influence what news gets circulated and what fades away, shaping our shared political and cultural agendas. They also contribute to the fires that social media journalists must extinguish.
Understanding the impact of this work may have come late, but it is not too late to make changes. We need these professionals to be able to work well and to flourish while dousing fires.
The Dart Centre Asia Pacific helps journalists and news organisations understand and manage traumatic stress. For more information, contact DCAP’s CEO at firstname.lastname@example.org