When a parent posts a picture or a video of their child on Instagram, they are likely hoping for what most social media users seek out: love, emojis, and compliments.
But when public posts featuring children with disabilities, medical conditions, or visible differences are picked up by Instagram's algorithm and suddenly shown to tens of thousands of strangers scrolling through Reels, the comment sections can quickly turn violent and abusive.
Strangers from around the world predict the child’s death, tell the parent that they should have chosen abortion, post GIFs comparing the child to a monster or an animal or a vegetable, claim that the child will live a meaningless life, accuse the parent of using drugs/alcohol during pregnancy, or even make sexual comments aimed at the minor.
While adults with disabilities also face discrimination, known as ableism, in the form of abusive comments and intrusive questions on Instagram, those who manage their own social media accounts can choose to hit back at users, limit their comments, report trolls, or reach out to the company for further assistance.
On the other hand, children with disabilities who are featured on social media by their able-bodied parents or guardians may have little to no say in how they are portrayed or in the steps taken to deal with their cyberbullies.
Though comments can be deleted and reported and users can be blocked, it requires tremendous mental effort to even begin cleaning up the dozens of violent or sexual hate comments left under a child's Reel gone viral. Many responses that would be criminal if uttered offline are further 'liked' and upvoted by thousands of social media users. While some parents and caretakers lock or restrict comments, others choose to use the hate their child receives as a way to raise awareness or funds.
What is Meta doing to (not) solve the problem?
The Hindu reached out to Instagram parent Meta with an example of the hateful comments posted under a video of a child with visible facial differences, and asked what measures the company was taking to protect minors with disabilities. Meta did not respond to multiple emails requesting comment.
However, the company has already been hit by serious complaints alleging that it does not prioritise children’s safety on its apps. In the U.S. state of New Mexico, Attorney General Raúl Torrez filed a lawsuit in late 2023 against Meta and its CEO Mark Zuckerberg, as well as Facebook and Instagram, alleging that the company’s platforms were “prime locations for predators to trade child pornography and solicit minors for sex.”
Based on an investigation using decoy accounts of minors, the New Mexico Attorney General’s Office reported that Meta showed sexual content to children, allowed adults to contact children to ask for explicit pictures of them, and let both Facebook and Instagram users sell an “enormous volume of child pornography,” among other dangerous failings.
“The Office’s investigators found that certain child exploitative content is over ten times more prevalent on Facebook and Instagram than it is on Pornhub and OnlyFans,” said a December 6 press release published by Torrez’s office.
Avoiding hate versus celebrating life
An entrepreneur, disability activist, and the founder of the Nipman Foundation, Nipun Malhotra recalled recently seeing more “horrible” Instagram Reels mocking people with disabilities. Despite this, he supports parents who want to share photos of their children with disabilities as a way of celebrating their family members’ lives.
“So the fact of the matter is that if parents are posting things about their own children on social media, in a way there is an acceptance from a parent. I would much rather want a parent who’s sharing photos of their child with a disability on social media, compared to a parent who’s not [sharing] because they’re scared of these hate comments,” said Malhotra, adding that it was up to parents to complain about online hate.
“If they have, Meta should definitely take action and if they’ve not taken action, it’s, of course, disappointing. But I would definitely say that I support parents sharing photos of the children on social media with disabilities. It also triggers other parents who might have a child with disabilities to [say] that ‘See, despite whatever hate is coming, despite what else society thinks, they’re giving the child a normal life so we should give our child a normal life too,’” explained Malhotra.
Whether a family chooses to feature its children on social media is a personal choice, but Malhotra stressed that parents should apply the same standard to all their children and not conceal or single out a family member with a disability.
“So I think there needs to be more social media content on disability and not less,” he said.
On the backend, Malhotra pointed out that social media companies needed more people with disabilities on their content, accessibility, and design teams in order to get live feedback. Outside the Meta ecosystem, he observed that more tech users with disabilities were shifting away from Elon Musk-owned X (formerly Twitter) to platforms like LinkedIn for accessibility reasons, a gain for one business and a sizeable loss for the other.
Around one billion people, or approximately 15% of the global population, live with disabilities, forming the world's largest minority, according to the World Health Organisation (WHO). Malhotra highlighted this figure and said tech companies needed to think about their market.
“And I would not actually want to give this answer for Meta, but for all tech companies in general,” said Malhotra.
“Include more people with disabilities in your decision-making and you’ll see that your apps become not only more accessible but more hate-free also, in that sense.”
Published - January 30, 2024 02:06 pm IST