“At the end of the day, they are not interested in seeing things in their work timeline that make them uncomfortable, or distract them from what they’re interested in.”
This attitude is endemic in the tech industry.
@sneak unless you are making anything for public consumption, in which case it’s downright irresponsible.
@laura i don't agree that doing the thing you're working on and only the thing you're working on is irresponsible. i think the burden of proof for irresponsibility is on the people who claim it is so.
you're totally allowed to be singleminded and it's not irresponsible. silly, ridiculous, simpleminded, sure - irresponsible, no.
@icedquinn @sneak I disagree, but I have a feeling we won’t agree on this. I also believe that there’s no such thing as apolitical. Nothing produced by people for people exists in a vacuum. The only possible exception I can think of is when you produce something for your own personal use and don’t release it publicly.
@icedquinn it’s possible, and it’s an ideal world where you wouldn’t have to worry about what your platform is used for. But I also think that you have to make decisions to design against, and prevent your platform being used for, harm.
It was only ever “normal” for people to avoid engaging with political issues if they benefited from the status quo. Those who suffer or struggle are forced to engage with it, even if they don’t want to.
@laura @icedquinn do you believe that people who produce end-to-end encryption systems (where the server can't censor certain messages due to crypto) have moral obligations to put clientside filtering tools into their clients to avoid being "irresponsible"? or is being content-neutral an acceptable choice?
@sneak I’m not talking about content moderation, I’m talking about designing for prevention and having an organisation that acknowledges its ability to cause harm.
@laura this is a good example of a concrete instance. also, filtering encrypted messages isn't moderation, it's censorship.
do service providers have a moral obligation to censor or not? would remaining completely content-neutral render them "irresponsible" in your view?
@sneak I’m not sure where you got the impression that I think encrypted messages should be censored. I don’t think service providers have an obligation to censor content any more than I think pencils should be banned because you could write harmful messages with them. But I also think that the architecture and design of a platform (usually those seeking to exploit engagement and personal data) should be designed so that people can avoid the content they don’t want to experience from others.
@sneak but none of this is really relevant to Basecamp, where I think the issue is that the leadership don’t want to engage with the social issues affecting their staff and customers, and that refusal negatively affects those people’s lives.