Transparency in moderation protocols

I tried to find out whether this is discussed or mentioned on the main Couchers web page, but didn’t find anything.

Since I’m now suspended from CS for 4 weeks (I still don’t know why, since I don’t get an answer from their support; I guess they don’t like critical customers), I was wondering how we will arrange this in the future on Couchers?

The problem now on CS is that there’s no transparency or protocol of any kind (maybe there is one, but it’s not communicated to the community) about when people get suspended or banned. I think it would be great if we’re open and transparent about this and about how to handle it, since there are probably a lot of grey areas in how people can get suspended. Any ideas on this?


I think it would be great to address transparency generally. Trustroots, for example, has a statistics page. We could expand on that to share insights from more perspectives, like:

  • technical statistics
  • financial statistics
  • community growth (and percentage of verified members, number of hosts, total number of exchanges)
  • diversity figures (like gender, location, language, age)
  • moderation (reported users, warnings, suspensions)

I would also love to see a private Insights page for each user, where they could look up statistics on their personal usage, like locations traveled and interactions with other users by gender and location. It could also have a Moderation tab listing the number of times they have been reported or have communicated with moderators or the safety team.

So my idea would be that being transparent about the numbers is the best first step. Once we show the number of reports and suspensions, it seems a logical next step to break them down by reason or issue. We can then have discussions about the different issues we face and respond to them, backed by actual numbers.

Another idea, to avoid locking out users in a non-transparent way as happens on CS, could be to separate the settings/account access from the actual profile and features on the app. So we wouldn’t delete or suspend accounts, but only suspend members from using certain app features (contacting other members, sending requests, being shown to other members at all). You could always access your account settings, and if there is a moderation issue, it could show up there and give both the reasons and ways to address it.
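To make the idea concrete, here is a minimal sketch of what separating account access from feature suspension could look like in the data model. All names here (`Feature`, `Account`, `suspend_feature`) are hypothetical, not part of any actual Couchers code:

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class Feature(Enum):
    """App features that can be suspended independently of account access."""
    MESSAGING = auto()   # contacting other members
    REQUESTS = auto()    # sending host/surf requests
    VISIBILITY = auto()  # being shown to other members at all


@dataclass
class Account:
    """Settings stay accessible even while individual features are suspended."""
    username: str
    suspended_features: set = field(default_factory=set)
    moderation_notice: str = ""  # reason shown in settings, plus how to address it

    def can_use(self, feature: Feature) -> bool:
        return feature not in self.suspended_features

    def suspend_feature(self, feature: Feature, reason: str) -> None:
        self.suspended_features.add(feature)
        self.moderation_notice = reason  # a visible notice, not a silent lockout

    def can_access_settings(self) -> bool:
        return True  # settings access is never revoked
```

The key design point is that a suspension only ever adds entries to `suspended_features` and attaches a visible reason; nothing ever cuts off the account itself.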


That’s a good topic.

Maybe Couchers could list offenses and the corresponding action that will be taken. It is more transparent, and it also makes the job easier for the volunteers who will have to deal with it.

Just an example:

Suspicion of illegal act → deactivated (up to staff)

Violation of paragraph 3 of the Terms of Service → ban

Violation of paragraph 4 of the ToS → penalty, reset of whatever score/badge/etc.

Violation of paragraph 5 of the ToS → whatever feature deactivated


Think this is a great idea!

But what I meant more specifically is transparency about how you can get suspended; this is quite unclear now on CS. As @couchguy mentions, I think these are great examples. I think it’s important that we’re transparent about how we suspend members, and it should all be clearly laid out, either in a protocol or in the terms of use.


Completely agree. Giving clear guidelines on what warrants a temporary suspension, for how long it might be, and what can be done to have it lifted faster would be good. And if someone gets completely banned, an explanation of why is extremely important. A strong community is one that can enforce these rules by itself - so it’s important that they are clear as day!


On the Couchers platform, most moderation actions are going to be taken by local moderators, not a global team. As part of this yes, we will have a clear rule-set and appeals process.


Great! :slight_smile: But since local moderators may seek power and might be friends with each other in the future (this happened many times with the CS Ambassador program, where some ambassadors abused their power and covered each other’s backs), will this appeals process be controlled by someone else, for example moderators of a different city? Or an overall moderation/safety team?

Or are there any other ideas for how to avoid this?


Yeah I think this is going to be a tricky problem that we’re going to have to experiment with. Generally, we’re going to have a hierarchy where global mods > country mods > local mods, so the appeal process will go up that chain.

But it could be the case where the local and country mods know each other too well, so yes we might want something like external arbitration from a removed community, and that could be part of moderator responsibility.

Another way to be preventative would be to limit absolute power, so no individual mod can make a decision alone. We were discussing an idea based on the software development “pull request” model, where a mod can propose an action, but then another mod (or multiple mods, or one higher in the hierarchy) must approve it before the action takes place.
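The pull-request idea above could be sketched roughly like this. This is just an illustration of the approval flow, under the assumption that one independent approval suffices; the names (`ModerationAction`, `approve`) are made up:

```python
from dataclasses import dataclass, field


@dataclass
class ModerationAction:
    """A proposed action that only takes effect after independent approval,
    like a pull request needing review before it can be merged."""
    target_user: str
    action: str
    proposed_by: str
    required_approvals: int = 1  # could be more, or require a higher-level mod
    approvals: set = field(default_factory=set)
    executed: bool = False

    def approve(self, moderator: str) -> None:
        if moderator == self.proposed_by:
            raise PermissionError("A mod cannot approve their own proposal")
        self.approvals.add(moderator)
        if len(self.approvals) >= self.required_approvals:
            self.executed = True  # only now does the action actually take place
```

The self-approval check is what removes absolute power: the proposing mod can never be the one who makes the action effective.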


If the community develops like CS, then country mods will definitely know local mods.

Btw, is a local mod the same as an ambassador, and how do they get chosen?
