Federation abuse policy

Social.Coop Federation Abuse Policy

v1 posted 9/24/2018

The behavior of members of Social.Coop is governed by our Code of Conduct. We promote the same values in the Fediverse and will act to protect Social.Coop members from abuse by people on other instances. Interactions with both off-instance individuals and other Mastodon instances are governed by this Federation Policy.

Values

As a cooperative, Social.Coop is based on the values of self-help, self-responsibility, democracy, equality, equity, and solidarity. Our members believe in the ethical values of honesty, openness, social responsibility, and caring for others. We also follow the rest of the ICA Cooperative Values and Principles.

We welcome anyone willing to accept the responsibilities of membership, regardless of qualities such as gender identity or expression, sexual orientation, disability, mental illness, neuro(a)typicality, physical appearance, body size, age, race, nationality, ethnicity, socioeconomic status, family structure, spirituality, religion (or lack thereof), education, or other personal traits. We particularly celebrate diversity and do not tolerate bigotry or prejudice. Diverse opinions on politics, religion, and other matters from our own members or from those on other instances are welcome as long as they align with our core values. Let there be no confusion: Social.Coop is anti-racist, anti-fascist, and anti-transphobic.

Individuals

If a Social.Coop member finds content that is harmful, threatening, or otherwise in violation of the Social.Coop Code of Conduct, they have options:

  • contact the violator privately, if they are comfortable doing so and believe it will help, to respectfully ask that they remove the offending content;
  • block or mute the user;
  • report the incident to Social.Coop via the Mastodon software’s reporting feature.

Upon receiving a report via the Mastodon software, the Community Moderation Team may take action concerning off-instance Fediverse accounts.

When a report is received, the Community Moderation Team will follow the procedures described in the Reporting Guidelines. When reviewing reports, moderators should check whether there have been previous complaints about an individual and take any prior actions into account when deciding on a response.

The Community Moderation Team will support members who seek informal conflict resolution, as well as those who file a more formal report, as described in the Reporting Guidelines.

Actions that may be taken by Social.Coop moderators include:

  • No action
  • Adding a content warning to a post when viewed on Social.Coop.
  • Removing a toot from Social.Coop.
  • Contacting the reported person to ask them to make a behavior change (e.g., delete a toot or add a content warning).
  • Silencing (muting) the individual, so that their followers may still interact with them but their toots no longer appear in Social.Coop timelines.

Causes for silencing may include:

  • Content lacking a content warning (CW) that includes sexual content, gore, or depictions of violence
  • Generalized violent rhetoric
  • Excessively frequent promotions of a product or service (i.e. spam)

Suspending (banning) the user. Causes for suspension include:

  • Use of violence, threats of violence, or violent language directed against another person.
  • Harassment, defined as continuing to interact with or post about a Social.Coop user after having been asked to stop. This includes, but is not limited to, unwelcome sexual attention, deliberate intimidation, stalking, and dogpiling.
  • Making offensive comments or insults, particularly in relation to diverse traits (as referenced in our values above).
  • Advocating or encouraging any of the above behavior.
  • Posting personally identifiable information about others (“doxing”), or sexual material that is non-consensual or depicts individuals appearing to be under age 18, including “loli” content.
  • A bot following Social.Coop users who have #nobot in their Mastodon profiles.

Instances

The CWG may take action against instances that are a source of problem behavior, subject to appeal by members to the membership as a whole.

Instance Three Strikes Rule

If the Community Moderation Team receives three reports concerning users of an instance that seem to fit a pattern of behavior, the Community Moderation Team will research the instance. If the instance is an obvious candidate for silencing or suspending under the criteria below, the Community Moderation Team may do so. If it is not obvious, the Community Moderation Team will contact the instance admin about the behavior and make its decision about ongoing federation based on their response.

Silencing

Social.Coop members may still follow accounts on a silenced instance, but that instance’s users will not be visible on the public timeline. Criteria for silencing include:

  • The site is the origin of a significant amount of spam.
  • The site is the origin of a significant number of users whose behavior has been reported as violating our Values or the Social.Coop Code of Conduct.
  • The site is the origin of repeated posts of sexual content, gore, or generalized violent rhetoric that lack content warnings.

Suspending

Criteria for suspending include:

  • The site is the origin of a pattern of hate speech, and it lacks policies against hate speech, fails to enforce them, or actively encourages it.
  • The site is the origin of repeated instances of harassment, and it lacks policies to deal with harassing users, fails to enforce them, or actively encourages them.
  • The site is the origin of sexual content depicting minors under 18, or of non-consensual sexual content such as revenge porn.