That’s nice, dear. The human hours spent tuning your model so it isn’t complete gibberish definitely don’t count, and the fact that every live-service LLM employs at least a few dozen third-world workers to check results and change outputs disagrees with your underpowered RNG.
They sure put a ton of money into sweeping under the rug the fact that exploiting workers at near-slavery wages is what actually makes these “technological revolutionary marvels” tick.
Whether it’s new Foxconn chips, EV batteries, “free shipping”, or LLMs.
Feel free to take your own advice: not a single AI product operates commercially without underpaid workers in foreign countries constantly monitoring it. Take Amazon’s cashierless stores, the biggest example of this to be exposed.
If you think it hasn’t happened before, check again; nothing in history is new.
When have we had AI so good the Turing test lost its whole meaning overnight?
The Mechanical Turk. LLMs share the Mechanical Turk’s flaw: behind the best implementations, it’s really just a human.
I’ve checked my PC looking for the tiny man managing my local LLM; no luck yet, but perhaps they’re smaller than I thought…
Is AI learning from training data it got from itself?
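There’s a real, documented failure mode behind this question, often called model collapse: when a model is repeatedly retrained on its own outputs, it tends to lose the diversity of the original data. Here’s a toy sketch of the effect (entirely my own illustration, not anyone’s actual training pipeline; the Gaussian “model” and all numbers are invented):

```python
# Toy model-collapse demo: a "model" that is just a Gaussian N(mu, sigma).
# Each generation it samples a small dataset from itself and re-estimates
# its parameters from that sample alone.
import random
import statistics

def self_train(generations: int, sample_size: int, seed: int = 0):
    """Repeatedly re-fit (mu, sigma) on the model's own samples."""
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0  # the "original" distribution
    for _ in range(generations):
        data = [rng.gauss(mu, sigma) for _ in range(sample_size)]
        mu = statistics.fmean(data)
        sigma = statistics.stdev(data)  # finite-sample estimate, biased low
    return mu, sigma

mu, sigma = self_train(generations=500, sample_size=10)
# sigma shrinks dramatically: fed only its own outputs, the "model"
# forgets the spread of the distribution it started with.
print(mu, sigma)
```

The collapse comes from the finite-sample re-estimation step: each generation’s variance estimate is noisy and biased slightly low, and those losses compound, so nothing genuinely new ever enters the loop.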
Every. Time.
Be a better person. Less ignorant, more useful.
Uh… No.