• 0 Posts
  • 13 Comments
Joined 9 months ago
Cake day: September 28th, 2023


  • It might disappoint you to learn that among American English speakers, literacy isn’t a good indicator of maturity by age. Barbara Bush started a literacy initiative that’s still around, the One Good Thing® left over from the Bush regime. The site has a handy map showing the counties with 40-60% adult literacy rates spread all over the states. It definitely helped me come to terms with the fact that sometimes a kid who is trying is gonna be more eloquent than an adult repeating the same tired take that’s been rebutted a thousand times.

    It’s easier for me personally to gauge age as it scales up when anonymity is involved - referential humor, recognition of the ancient runes (Duckroll, Bill Murray’s face with only the jaw moving), and informed chitchat about presidential behavior predating Bush Sr. are all dead giveaways that a user is older, but with younger users you have a lot of hobby/interest overlap going all the way up to people in their 40s. You can’t look someone in the eyes and see if the light of youth has gone out yet on forums and imageboards.


  • I was of this opinion until I moved in with my partner, who had a bearded dragon. Reptiles move strangely, but this bearded dragon had been a classroom pet for the first few years of her life and was surprisingly social. She’d make eye contact, gesture with her body, present her head to be gently petted around the bristles, and even flip over to be rubbed on the belly like a dog if she wasn’t in the middle of, or hadn’t just finished, a meal. She was responsive with body language to some specific one- or two-syllable words like her name or the words for mealtime, and very aware of visual cues (like any object she’d been handed a mealworm from, even just once).

    I imagine a tarantula probably has some behaviors that would surprise me if it was conditioned as a pet and socialized. I know they have a fair number of ways aside from the bite to show displeasure or anxiety, like flicking hairs or quickly shuffling away into a defensive posture. I think it could be a fun experience, and I wouldn’t turn my nose up at it instantly these days if the opportunity came along; animal cooperation is a small joy even when it’s a bit foreign.






  • Sure, the industry is gaining money, but you’re ignoring specific company shutdowns and restrictions that pushed the industry out of the hands of certain players. There have been a lot of regulatory fingers in the pie, particularly above the state level, that weren’t aimed at making the populace safer but instead at making those companies unable to produce or sell their most popular products. There are also a lot of bits of legal language like “e-cigarette” and “open container” that are seeing non-uniform interpretation in legal states, across vape legislation and cannabis legislation alike.

    Draconian legislation isn’t quite turning the country into a hellscape for consumers, sure. But it’s clearly a possible side effect that isn’t being considered, especially as states are beginning to take it upon themselves to outlaw studied hemp-derived cannabinoids (like delta-8/10, THC-P, or THC-A) that are provided for under the 2018 Farm Bill.

    TL;DR: while the industry is growing, it clearly has enemies with legal power, and that’s the crux of the complaint.






  • The same thing actually passing a Turing test would require. You’ve obviously read the words “Turing test” somewhere and thought you understood what they meant, but no robot we’ve ever produced as a species has passed the Turing test. It EXPLICITLY requires that intelligence equal to (or indistinguishable from) HUMAN intelligence be shown. Without a liar reading responses, no AI we’ll produce for decades will pass the Turing test.

    No large language model has intelligence. They’re just complicated call-and-response mechanisms that guess what answer we want based on a weighted response system (we tell them directly, or tell another machine how to help them “weigh” words in a response). Obviously, with anything that requires massive amounts of input or nuance, like language, they’ll only be right about what they were guided on, which is limited to the areas they were trained in.
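
    As a toy illustration of what “weighing” words in a response can look like (the vocabulary, weights, and context handling below are made up for the example, not how any real model is built):

        import random

        # Hand-made scores for which word tends to follow a given two-word context.
        # A real model learns weights like these over a huge vocabulary from training data.
        next_word_weights = {
            "the cat": {"sat": 5.0, "ran": 2.0, "is": 1.0},
            "cat sat": {"on": 6.0, "quietly": 1.5, "down": 2.5},
        }

        def pick_next_word(context: str) -> str:
            """Sample the next word in proportion to its weight for this context."""
            weights = next_word_weights.get(context, {"...": 1.0})
            words = list(weights.keys())
            scores = list(weights.values())
            # Higher-weighted words come out more often, but not always:
            # the "guess" is probabilistic, not a lookup of one right answer.
            return random.choices(words, weights=scores, k=1)[0]

        print(pick_next_word("the cat"))   # usually "sat", sometimes "ran" or "is"
        print(pick_next_word("cat sat"))   # usually "on"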

    We don’t have any novel interactions with AI. They are regurgitation engines, bringing forward sentences that aren’t theirs piecemeal. Given ten messages, I’m confident no major LLM would pass a Turing test.