Law isn’t binary when it comes to responsibility; it’s reductive to suggest that arguing for responsibility is also arguing for prison for software developers. I can be responsible for your death and spend no time in prison.
Every other industry deals with this challenge — why should software be any different?
You can take the word "jailed" out of my comment, replace it with "responsible for", and I still think my point stands...
You said:
>If a company releases software that is used nefariously, there are very common legal actions to hold them accountable
If you believe that, it follows that you believe that every developer of encryption algorithms should be "held accountable" (be it jail, or "responsible without prison", etc.) because other people use encryption to hide illegal activity. Developers of internet protocols should be accountable for the actions other people take on the internet, because lots of illegal things happen on the internet.
Metasploit, Kali, 7-zip, FileZilla, Word/Excel, Putty, OpenVPN... Should I go on? All of these are used for nefarious things all the time. Are you really suggesting that the developers of these should be responsible for the nefarious things that their users do? If not jail, what responsibility are you suggesting?
>Every other industry deals with this challenge — why should software be any different?
Most other industries have protections against this type of liability, not added responsibilities. See knives, guns, planes, cars, etc. Unless there is gross negligence (which isn't just "it was used nefariously"), the maker of X is generally not responsible for what some user of X does with X.
Edit for clarification:
You can argue about purpose-built nefarious software, sure. If I develop ransomware, and advertise it as ransomware, and there's no legitimate use other than ransoming... I should probably be held responsible for the ransomware attacks that occur using that tool (at least, I accept that argument). The problem with applying this to all software is that almost everything that is used nefariously was originally designed and used for legitimate purposes. When that's the case, the person who committed the crime with the legitimate tool should be held responsible, not the maker of the legitimate tool.
Most industry protections against liabilities are predicated on compliance with expectations about responsibility: the protection against liability is earned. For example, firearm manufacturers are protected from being held liable for actions taken with their firearms as long as they comply with their legal responsibilities, like not selling firearms to children. If a firearms manufacturer sold firearms to children, they would absolutely be held liable for the outcomes.
If you knowingly build software that can be used for money laundering and make no effort to prevent money laundering then, if software was treated like other industries, you’d absolutely expect to be held liable.
>If you knowingly build software that can be used for money laundering
You've retreated to money laundering, but that is not what you were originally talking about.
You were pretty clear that you were talking about any software that is used nefariously. I pointed out that pretty much any software can be used nefariously (e.g. ssh, browsers, hosting software, etc.), but you keep avoiding that.
I’m not avoiding it. There’s nuance. A piece of software that is specifically designed to enable a behaviour that is core to money laundering is different from a piece of software that can be used to engage in money laundering.
A web browser can be used to access a banking website through which you might engage in money laundering, sure, but that’s very different to a piece of software that can be used to hide the origin of funds.
The difference is like a kitchen utensil manufacturer vs. a gun manufacturer. A kitchen knife can be used to kill, a gun can be used to kill, but we hold gun manufacturers and kitchen utensil manufacturers to different standards because intent is an important aspect.
Your argument is predicated on the idea that intent doesn’t matter, but it does: intent is a significant component of criminal law.
>If a company releases software that is used nefariously, there are very common legal actions to hold them accountable
There is no mention of intent there, just that if software is used nefariously, the creators of that software should be legally accountable.
You later talk about your intent, when you commit a crime, but that's very different. I agree that if someone commits a crime with X software, their intent should be considered. What I don't agree with is holding Tatu Ylönen accountable for someone else's nefarious use of ssh.