I’ve been digging around all the blogs and written material and I couldn’t find any mention of this question.
I mean, we all know how weak data security is on CouchSurfing: anyone with enough free time can access anyone’s data… How is it handled in Couchers?
Another question is what is being done with the data. Is there some way to know that our data is not going to some third party? Is there a way that we as users can enforce that?
I know a couple more people around me who were interested in these questions.
What do you think about it in general?
If I’m blind and missed it written somewhere, please kindly point me to where I should look.
I think in general this is a complex topic that touches upon security, privacy, ownership and overall governance. Just some aspects I’ve been in touch with so far:
I don’t think data is stored encrypted. That includes private messages and personal data. Access by staff can be logged, but I don’t think general encryption is anywhere near on the roadmap. The biggest issue I’d see with this is that we’ll hold, and users will exchange, personal details like phone number, name, home address, etc. Maybe we could have an option to store and exchange personal details encrypted? Like a visiting card?
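If that “visiting card” idea went anywhere, a minimal sketch could look like the following. This assumes Python and the third-party `cryptography` package; the names `make_card`/`open_card` are hypothetical and not part of any Couchers API.

```python
# Sketch: encrypting a small "visiting card" of personal details before
# storing or exchanging it, without encrypting the rest of the app's data.
import json
from cryptography.fernet import Fernet  # third-party: pip install cryptography

def make_card(key: bytes, details: dict) -> bytes:
    """Serialize the details and encrypt them with a symmetric key."""
    return Fernet(key).encrypt(json.dumps(details).encode())

def open_card(key: bytes, token: bytes) -> dict:
    """Decrypt and deserialize a card; raises if the token was tampered with."""
    return json.loads(Fernet(key).decrypt(token))

key = Fernet.generate_key()  # in practice this would be held by/derived from the user
card = make_card(key, {"name": "Jane Doe", "phone": "+1 555 0100"})
```

The point of scoping encryption to just this payload is exactly what’s suggested above: the server only ever stores the opaque `card` blob, while the rest of the platform stays simple.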
We are working on terms of service that include content usage rights. One central concern is that users will be contributing shared content, like on community and city pages. It will need to be easily understandable that there are generally two different types of content: your personal content (like profile information and messages) and contributed content. Personal content can be entirely purged upon request, but contributed content will require some sort of lasting distribution right, similar to contributions on Wikipedia.
I believe it would be great to have more volunteers with a legal background. So if anyone has one, or knows others who do, maybe consider volunteering?
Regarding ownership, there’s also an aspect that we see now with exported data from Couchsurfing™. You have full access to and ownership of that data, but it’s not worth much in terms of authenticity (it’s just a plain text file). In the longer run, it would be great to find a way to package and authenticate data exports. If we had that now for the CS exports, we could think seriously about giving members some visible credit for prior activity.
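As a rough illustration of what “authenticating an export” could mean: the platform attaches a signature over a canonical serialization of the export, so anyone holding the key can detect later tampering. Everything below (the key, the function names, the payload fields) is made up for illustration; a real system would more likely use asymmetric signatures (e.g. Ed25519), so that anyone can verify an export without sharing a secret key.

```python
# Sketch: packaging a data export with an integrity signature (HMAC-SHA256).
import hashlib
import hmac
import json

SIGNING_KEY = b"hypothetical-platform-signing-key"  # assumption, not a real key

def package_export(export: dict) -> dict:
    """Serialize the export canonically and attach an HMAC over the payload."""
    payload = json.dumps(export, sort_keys=True, separators=(",", ":"))
    sig = hmac.new(SIGNING_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": sig}

def verify_export(packaged: dict) -> bool:
    """Recompute the HMAC and compare in constant time."""
    expected = hmac.new(
        SIGNING_KEY, packaged["payload"].encode(), hashlib.sha256
    ).hexdigest()
    return hmac.compare_digest(expected, packaged["signature"])
```

With something like this, an edited plain-text export would no longer verify, which is what would let another platform give visible credit for prior activity with some confidence.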
Well, anyone with some coding skills can also get 1 million details from a credit card company, get hold of weapon blueprints from the Pentagon, or blow up a uranium enrichment facility in Iran. What do you mean, more precisely?
I didn’t intend to suggest it’s easy or that there are no protections. But since some chat apps have implemented encrypted messages, users might assume that’s the default for whatever service they use on the internet now, and it certainly isn’t for most end-user applications. Some background:
Yeah, I just took that out. Sorry that it sounded alarming. It’s more about: is there a way we could add extra security for the small subset of private data we will likely hold, without making the entire app more complicated and difficult to use and maintain?
Very much agreed
Thank you for the detailed response.
I know that this is a very broad and complex topic, and I was just interested in whether this is something being considered in development.
Out of everything you wrote, this is what interests me personally:
Now, I don’t expect end-to-end encryption on all messages at this point in the app’s life (and maybe other people do), but it is something to put on the roadmap at some point (and the sooner the better).
But as you said, users are going to exchange personal information between themselves, and at least that information (together with the profile information) should be protected somehow.
Moreover, I was thinking about something like “your phone number and address will not be shared with any third party”, or something along those lines.
Otherwise, how would you expect (privacy-aware) users to verify themselves through any of the planned verification methods (except for social verification, where someone else verifies you)?
In any case my question at this point was this:
So thank you for capturing it better than I did.
I’m sure there are plenty of open-source tools that address some of these ideas.
All user data is stored securely and is properly protected. We take this very seriously. All communication between you and the platform is encrypted with transport security. No, “anyone with some coding skills” cannot access the data. Only a very small number of heavily vetted core contributors have access to the database through other means than the platform.
We’re currently working on a set of policy documents that describe and clarify our data handling better. This includes:
a Non-Disclosure Agreement & Data Policy for contributors that describes how they are allowed to access data, what’s appropriate use, our auditing and logging systems, etc. We aim to make this process transparent, so hopefully everybody will be able to see which contributors have access to which data.
Some volunteers need access to data: for example those helping out with support need to have access to the support queues and there is a small Admin API that allows them to perform simple actions (change a user’s birthdate, see their email for the purposes of contacting them, etc); other contributors (such as a few devs) have access to the production system because they need to be able to update the software if there are bugs, etc. We vet everyone who gets access to any data, and we’re working on processes to limit access to an as-needed basis as much as possible. We have good logging for this.
We plan to make it very clear who is able to see what data, and under which circumstances (e.g. for moderation purposes, or if a user reported a conversation).
We currently have no plans for end-to-end encryption. It would mean our Safety & Trust team would have no ability to help in many situations, and it would open our platform to abuse that we couldn’t control. [You should be using end-to-end encrypted chat apps (such as Signal) for all your normal communication, but we think Couchers is facilitating a different type of activity, and it’s not a priority for us.]