the aliens simulating us are gearing up for a final season. and boy, are they going to make this one fun. i think we're living in the funny timeline, and i think they know it. i'm not sure that i could've designed a more amusing one myself.
a real-estate mogul, host of the tv show the apprentice, terrifically funny and charismatic, becomes president. plot-armor-like defense from assassination, survived by barely an inch.
and he's joined by the world's richest man, who catches rockets with chopsticks, to form the "main cast", who now occupy the most important positions of power. the latter runs the disruptive agency named after a meme-coin dog. and X, the social media network.
the stakes have never seemed higher. in fairness, they have been gearing up slowly. we did narrowly miss total annihilation by nuclear bombs during the cold war. you know that thing they do at the end of tv shows, where they ratchet up the stakes until it's the entire universe? it seems like wiping out humanity wasn't enough.
they had to put, on one side, the destruction of the potential future value of the lightcone. and, on the other, the transhumanist utopia of our wildest dreams, including the defeat of death, and of all our other problems, too. we, or perhaps the main cast, just need to get it right.
the people who first warned the world that everyone is going to die are probably counterfactually responsible for asi's creation. china trains a leading model for pennies on the dollar.
i think it's fractally funny, all the way down, perhaps scaled by how important it seems to be for world affairs. take the recent slate of british prime ministers: bojo, the lettuce, an ai safetyist dressed up as pm, and now the progress-person incarnate (!!).
*
well, suffering isn't funny. and there's been a lot of that. a lot of it happens quietly and invisibly. not even to prove a point. screw you, simulators.
i'm often overwhelmed by my place in this. the group of people who 'feel the agi' is perhaps just two degrees of connection away from me. i sometimes wonder whether i'm just part of an incredibly well-thought-through wireheading experiment - not one where everything is straightforwardly blissful, per se (where's the true joy in that?), but one where the stakes are maximally high, where i'm just on the edges, with just enough buffs, but not too many, to try and change it - but it'll require working oh so hard.
*
one way to predict the future is to assume the most amusing thing will happen.¹ it probably will.
what if, for example, the ultimate reason ai took over was that it was trained on a large corpus of scifi about ais taking over?… (h/t jacob gw)
not just scifi stories, but also specific arguments for why they'd be bad, fully fleshed-out takeover strategies, empirical arxiv papers showing "look, it's real!" -- we dream the thing into being real, we set the stage for it to make perfect sense, and we'll somehow still have room to be surprised by the end.