Friday, May 10, 2013

Censoring Facebook: Social network's violent video dilemma



Facebook's decision to remove videos
showing people being decapitated leaves the
firm in a quandary: should or shouldn't it
impose a wider censorship policy?

Originally the social network rejected calls
from users to delete the clips, saying that it
wanted to "preserve people's rights to
describe, depict and comment on the world".
But after the BBC revealed that one of its
own safety advisers - the head of the Family
Online Safety Institute - had criticised its
decision, the firm announced a U-turn, saying
it would remove clips showing beheadings
while it re-evaluated its rules.

That potentially opens a can of worms.

Since the article was published, readers have
contacted the BBC to complain about other
videos, including:

✧ one that shows killings which do not involve
   beheadings
✧ clips involving cruelty to dogs and other
   animals
✧ a smartphone recording of a schoolgirl being
   punched to the ground by another pupil

In all cases they said the network had
refused their requests to remove the material.
A spokeswoman for Facebook confirmed its
policy had only been amended in regard to
decapitations.

But imposing stricter controls would open the
firm up to other criticism.

Before his death, internet freedom
campaigner Aaron Swartz warned of the
dangers of privately owned parts of the net
limiting what was posted onto their sites. He
called this "corporate tyranny" and named
Facebook as a specific concern.

The social network could not provide a date
for when its review would be complete. The
following range of opinions suggests it will
struggle to please everyone.

Richard Allan, Facebook


More than a billion people express
themselves and comment on the world in
which we live through Facebook, and most of
the time this is entirely without problem.

On occasion, there are concerns about some
of the content that is being shared, and we
have put in place a reporting system so that
people can tell us about this.

The reported content is evaluated against our
community standards and appropriate action
is taken where our rules have been breached.

When drawing up and enforcing our approach
to acceptable behaviour and content on
Facebook, we aim to strike the right balance
between enabling people to share
information, news and content - and
protecting the community as a whole.

This is a complex challenge as Facebook is a
large, diverse community and we are
continually presented with novel situations.
While we freely admit that we do not always
get it right, the trouble-free daily experience
of the vast majority of Facebook users
demonstrates that our systems are working
well in all but the most exceptional cases and
that they are improving over time.

As we said last week, we are reviewing our
rules related to content showing graphic
violence.

In doing so we are clear that there are
situations where it is important for people to
be able to share content through Facebook
even if this can at times be quite shocking.
For example, people caught up in violent
incidents such as the recent Boston
bombings or the ongoing conflict in Syria
want to be able to report on their experiences
and may use quite graphic content to do this.
This illustrates the kind of challenge that our
highly experienced team deals with on a daily
basis as we strive to offer a space for
sharing that is mindful of everyone's
expectations.

Celia Mellow, petition organiser


As a person who holds a strong sense of
justice, I had no hesitation in setting up a
petition for the removal of the sickening
decapitation video I was shocked to find on
my Facebook news feed.

What shocked me even more was that I
actually had to make a petition in the hope
of getting the video removed.

No matter how many times my friends and I
reported it, we all received the same
message, stating that "it doesn't violate
Facebook's community standard on graphic
violence, which includes depicting harm to
someone or something".

How does a video of an innocent woman
being brutally murdered not "violate" this? I
can only hope that there is a criminal
investigation that will bring her justice.

As a loyal Facebook fan, I understand that
Facebook is only allowing people to have
freedom of speech. However, I think it is
about time they drew a line between what is
and isn't appropriate for the public.

Facebook is open to children as young as
13 - what I feared most was that my
younger sister could easily have witnessed
that disgusting video.

No-one should be exposed to such graphic
horror. Sadly, that video isn't the only
inappropriate content to have wandered onto
Facebook recently. I have heard of others
showing extreme violence and cruelty to both
humans and animals.

It's time Facebook introduced stricter rules
to remove these vile videos for good, so that
it can return to being the safe social network
it used to be.

Jeremie Zimmermann, La Quadrature du Net


Any intervention by Facebook to remove or
block access to content beyond what a court
might order - while respecting fundamental
rights and the principle of proportionality -
would in practice amount to privatised
censorship, and nobody has an interest in
going there.

A dominant, centralised actor such as
Facebook would be incentivised to spend as
little money as possible determining which
content would be lawful or not, suitable or
not, etc.

This would raise the question of what criteria
would be used. Opening such a breach would
ensure that any government could pressure
Facebook to apply its own criteria, whether
for political, religious or other reasons.

Under such conditions we can be sure that
the fundamental right to freedom of speech
or the right to a fair trial would not be
respected.

As surely as we cannot trust giant centralised
corporations to defend our fundamental
freedoms, we cannot ask them to become the
judges and enforcers of what information
should be shared online.

Protecting children on the net is first and
foremost the responsibility of their parents.
It cannot be outsourced to Facebook.
It is a matter of educating them about the
difference between the "circle of trust" offered
by their own private relationships with friends
and family, and the public nature of this
communication system.

Since Facebook collects and stores so much
information, it should be able to determine
when one of its members is a minor and is
about to be exposed to content that has been
reported as unsuitable, and display a warning
message.

Users would then be free to choose to take
that advice, or make a conscious choice to
access the content.
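Zimmermann's proposal amounts to a simple gate: if a member is a
minor and an item has been reported as unsuitable, show a warning
first and leave the final decision with the viewer. The short
Python sketch below is only a hypothetical illustration of that
logic - the User/Content model, the age of 18 and the report
threshold are assumptions made for this example, not anything
Facebook has described.

# Hypothetical sketch of the warning gate described above.
# Nothing here reflects Facebook's actual systems; the data
# model, age threshold and report threshold are illustrative
# assumptions.

from dataclasses import dataclass

ADULT_AGE = 18        # assumed age of majority
REPORT_THRESHOLD = 3  # assumed reports needed to count as "flagged"

@dataclass
class User:
    name: str
    age: int

@dataclass
class Content:
    title: str
    report_count: int = 0

def should_warn(user: User, content: Content) -> bool:
    """Warn only minors viewing content reported as unsuitable."""
    return user.age < ADULT_AGE and content.report_count >= REPORT_THRESHOLD

def view(user: User, content: Content, accept_warning: bool = False) -> str:
    # Interpose a warning; the user remains free to proceed.
    if should_warn(user, content) and not accept_warning:
        return (f"Warning: '{content.title}' has been reported as "
                "unsuitable. Confirm that you still wish to view it.")
    return f"Showing '{content.title}' to {user.name}."

# Example: a 14-year-old sees the warning first, then makes a
# conscious choice to view the clip anyway.
teen = User("Alex", 14)
clip = Content("reported news clip", report_count=5)
print(view(teen, clip))                       # warning message
print(view(teen, clip, accept_warning=True))  # user proceeds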

Stephen Balkam, Family Online Safety Institute (Fosi)


Facebook, like most other social media sites,
has explicit terms of service about what is
and is not acceptable to be hosted on its
website.

Some go further and have created what are
known as community standards.
These more clearly state the rules about
what kinds of content will be removed.
Facebook, YouTube and Twitter have robust
reporting mechanisms so that ordinary users
can flag inappropriate or abusive content for
review.

What is challenging for these companies is
how and where to draw the line.
Doing so will help them determine when to
invoke the "public interest" principle in
keeping material - such as images from the
Boston Marathon bombing - on their sites,
even though it depicts graphic violence.
This is new territory for us all as we navigate
the rules, ethics and standards of user-
generated content sites.

Andrew McDiarmid, Center for Democracy &
Technology


The controversy over Facebook's treatment of
shocking videos of beheadings is the latest
illustration of the enormous complexity at
work when it comes to promoting the
exercise of human rights online.

Billions of people rely on internet platforms
to speak and access information in the
networked public sphere, but the platforms
are controlled by private companies, whose
terms of service in large part determine the
contours of free expression.

In one sense, platform operators are
themselves speakers that have the right to
determine their own policies. At the same
time, these "digital sovereigns" - to borrow a
phrase from Rebecca MacKinnon - effectively
govern their users' exercise of free expression
rights.

Platforms have a responsibility, particularly
as they grow to Facebook-scale, to consider
the human rights impact of their policies and
to minimise restrictions on free expression.
This is especially true with respect to
government restrictions. It would be troubling
indeed if government pressure precipitated
the video's removal in this case.

A key step in carrying out this responsibility
is ensuring that content policies are clearly
communicated and fairly applied.

The horrific beheading video and Facebook's
reported reaction demonstrate the challenges
that arise when trying to develop and apply
clear, consistent standards in the complex
and multi-faceted realm of online
communication.

Context matters a great deal. Different
companies might draw the line in different
places, and just because something is
offensive or disturbing does not mean it
necessarily violates a particular term. And it
certainly does not make it illegal.

Because of this complexity, systems for
assessing content require constant
refinement to ensure that free expression is
protected.

Advocates, too, must remain vigilant that the
private players that provide so much public
value online are meeting their responsibilities
to users.

Is it complicated and prone to mistakes and
close calls? Yes, but the alternative -
mandated content policies and individual
governments vying for control over the global
internet - is untenable and fraught with risk
for free expression.

Dr Lynne Jordan, British Psychological
Society


My main concern, as an experienced
psychologist working with the effects of
actual and vicarious violence, is a lack of
awareness of how people's choice over what
they view is violated.

Material is posted on news feeds and "liked"
indiscriminately without thought as to the
rights of under-aged youngsters and others
who may view it.

People, whether young or old, can be
negatively affected by witnessing violence
either on screen or in reality.

Effects include trauma responses such as
replaying the images; feeling scared,
vulnerable, ashamed, invaded or violated;
and feeling confused, angry and helpless -
responses that are reinforced via the news
feed as these things pop up uninvited.

Ethical codes are there for safety and to
preserve the right to choose what is viewed
when users are considered of age or able to
understand the implications. Social media
sites are mostly not obliged to adhere to
such codes, which creates a problem,
particularly if they issue their own vague,
inadequate guidelines.

Social networks' news feeds allow material to
arrive on people's pages that might never be
sought by choice.

Extensive "friendship lists" develop with
people who may not be actual friends but
through casual contact get "befriended",
perhaps out of obligation or a need to fit in,
be liked etc.

The material is often posted supposedly to
prevent the spread of violent crime or other
violations, but in fact it can inadvertently
escalate the harm by sidestepping the consent
of the people accessing the feeds.

This is reminiscent of the "ban smoking in
public places" debate with the concern of
whose rights we are protecting.

In that debate it was largely about public
physical health. This debate concerns public
mental health and wellbeing.

Source: BBC
