Well, I understand you won’t lose any sleep, but this is conceptually stupid.
That would be like refusing to allow someone to buy a house because the last owner was convicted of a crime. Sorry, we gotta demolish the house now! And nobody can live on the plot.
The owner of this repo is free to do whatever they want but I’m free to point out that it’s a dumb practice.
> That would be like refusing to allow someone to ...
You should stop thinking by analogies. You're doing a disservice to yourself and your thinking capacity.
> The owner of this repo is free to do whatever they want but I’m free to point out that it’s a dumb practice.
I find it very useful, just like most of its users, and if I were ever to find it necessary to use a website that's obviously blocked, I know how to unblock it. Most of the time I don't bother, so that's something for you to think about.
Personally, I find that I prefer badly written English or auto-translated text in languages foreign to me over the AI-generated or even just AI-polished works I've seen. There is just so much more character, depth, and variance there vs. ultra-generic AI slop text.
That being said, this project seems focused on content farms, not people who just need a little help writing, so this whole conversation is a bit of a tangent.
One of my coworkers is EXTREMELY capable but functionally almost illiterate. He’s recently discovered that he can put an idea in Copilot and have it generate an email. So now instead of brief, correct, but difficult to parse emails we receive 20-paragraph, bulleted, formatted OpenAI slop. It’s been a very strange thing to see, like someone getting extraordinarily bad cosmetic surgery.
Capable doesn't mean capable of office work, though. I could see someone with a language disorder doing electronics and having trouble with words, not numbers. Or someone who has trouble with written words specifically doing most of their learning from classes and videos.
Exactly right. The individual in question produces excellent deliverables within their space. The coworker is very good at receiving inputs, but not very good at outputs (other than their deliverables). In a way, it's like having an offshore worker who speaks almost none of your language but can understand it and produce good work.
> like someone getting extraordinarily bad cosmetic surgery.
this is such an incredible way to phrase what it all looks like to the rest of us. and i suspect the people doing it, just like those with obvious cosmetic surgery, have no idea how weird and off it looks.
I have a similar coworker, but he's not great at prompting, so 10% of the time the AI version of himself makes confident assertions that he did not intend and are clearly not true. Genuinely no idea what I'm supposed to do about it.
Exactly right. He’s good at what he does, except communicating, and people are beginning to associate him with AI slop they don’t have time to read rather than the excellent work he does for them.
Yeah I hate it when people do that and I always call them out on it.
Unfortunately our company is trying to be "AI First" so they'll just point to that and continue their bullshit.
Our company literally promotes AI slop over personally made content, even if the slop is mediocre crap. All they care about is rising usage numbers for things like Copilot in Office.
I mean, I know it's probably tongue-in-cheek, but that never-asked question was particularly out of place. Mass-generated AI content is usually not THAT thoughtful anyway.
From experience: If you don't know Danish, please don't ever use machine translators to translate from English. Regardless of what some people may think, they make mistakes, so many mistakes.
I get why it's tempting: good translators are expensive, and few and far between. A friend of mine is a professional translator and she's not exactly in need of work, but a lot of customers look at her prices and opt for machine translations instead, and the result is not always impressive. Errors range from wrong words and bad sentence structure to an inability to correctly translate cultural references.
Right, that makes sense for Danes, or other populations where English knowledge is basically ubiquitous. But I think it might look different in other places. If the choice is between "badly translated, but I can understand 95% of it" and "in a language I don't understand at all, where I could figure out maybe 1%", then the decision might come out differently.
Nope, let the user do the translation, with their own choice of tool, so that they're fully aware of the shortcomings.
I know that some people translate my French posts to read them. That’s really cool. But I would never post something I didn’t write myself (though I do use spellchecking tools. I even sometimes disagree with them).
Not everyone can. Try going to rural Spain, handing out flyers in English, and asking people to translate them themselves: 0% will, and the flyers will go straight into the trash. If you instead hand them something in a language they understand, there is at least a chance they'll read it, even if probably only 5% will do so.
It's sometimes useful to understand that the world is much bigger and more varied than what you experience locally, and what works for you and the people in one country doesn't always work the same everywhere.
There are levels to things. In a professional context (including product design and documentation/instructions), don’t use machine translation[†].
For your personal hobby site or for general online communication, you probably shouldn’t use machine translation either, though it is probably useful if you have B1 language skills and are checking up on your grammar, vocabulary, etc. As for tools to help you write, I certainly prefer people use the traditional translation models over LLMs, as the traditional models still require you to think and force you to actually learn more about the output language.
For reading somebody else’s content in a language you don’t understand, machine translation is fine up to a point, as long as you are aware that it may not be accurate.
---
† In fact, I personally think the EU should mandate translator qualifications, and it probably would have 20 years ago, when consumer protection was still something it pretended to care about.
I use Grammarly at work (mostly to make sure our brand guidelines are followed) and I don't find that it (by default) corrects too far into AI slop territory. It's mostly just making sure your sentence is correct.
OP is going after AI slop bot farms like Android Authority.
> All I hear is skill issue. Imagine needing an AI to write stuff.
Grammarly users (and underrepresented non-English speakers) would complain.