swlabr, swlabr@awful.systems
Instance: awful.systems
Joined: 2 years ago
Posts: 24
Comments: 938
Only Bayes Can Judge Me
Posts and Comments by swlabr, swlabr@awful.systems
Comments by swlabr, swlabr@awful.systems
I’ve been watching a streamer play it. I thought it would be kinda cringe and it is, but there are some fun jokes here and there. But I get what you’re saying, I think.
ah, I guess I love my brain, and Date Everything has infected it, erasing any other knowledge of anthropomorphisation
Date Everything is infecting everything I know and love
I read a review (the one hosted on the Ebert site) and it seems like this just falls into one of the patterns we’ve already seen when other people not steeped in the X risk miasma engage with it. As in, what should be a documentary about how the AI industry is a bubble and all the AI CEOs are grifters or deluded or both, is instead a “somehow I managed to fall for Yud’s whole thing and am now spreading the word” type deal. Big sigh!
I feel like I nailed my guess
Pure speculation: my guess is that an “apocaloptimist” is just someone fully bought into all of the rationalist AI delulu. Specifically:
- AGI is possible
- AGI will solve all our current problems
- A future where AGI ends humanity is possible/probable
and they take the extra belief, steeped in the grand tradition of liberal optimism, that we will solve the alignment problem and everything will be ok. Again, just guessing here.
It’s on sight with all these fucks holy shit
smdh for not welcoming vampires in for peaceful discourse and good faith debates instead
kittymine!!
these fucks: “Don’t make me tap the sign!!!”
the sign: “My final braincell fell out of my open mouth while drooling”
but I also still think superintelligence shouldn’t be built, because if we’re letting Sam Altman and Elon Musk dictate the “values” encoded into it I do not want to live in that world
Well, sure. FWIW most of us here are on one of two trains:
- “superintelligence” will never be built
- There is no clear path from where we are technologically that leads to “superintelligence,” so there isn’t much point to worrying about it.
Of course, there is something to be said about not wanting anyone throwing money at Musk or Altman because they’ve conned people into thinking we’re on the cusp. But yeah, I would say most people here do not share this concern, just as no one should be concerned about a dragon razing their town.
the spectre of Ea-nāṣir stirs
Guys like this read some big name author once and think that all you have to do to be a good writer is like, have a writing style. And then they proceed to fail upwards forever
dead dove
Truer words, dawg. Amongst everything else, what the fuck is up with the subtitle.
I’ve heard a lot of influencer types are leaving Twitter/going private over this. Even “spicy” accounts are leaving. I’ve heard some women say they might come back if they block this functionality but uh, bad news, that’s not how the tech works, and Musk doesn’t give a shit about moderation anyway.
“Ignoring all these reasons to hate AI, people shouldn’t hate AI, debate me bro” - that’s you lol
Still the one baby
probably more of a Shin Megami Tensei deal where you have to fuse one with a mushroom or something and you get the other
Thinking that debates are the ultimate decider of who is right or wrong is like, twelve year old shit. Hence the internet debate industrial complex
Trailer for “the AI doc”, an upcoming doc featuring the usual suspects, notably the big yud himself
Yes