Basilisk anime song
“I wish I had never learned about any of these ideas.”

Roko's basilisk is a thought experiment about the potential risks involved in developing artificial intelligence. Its conclusion is that an all-powerful artificial intelligence from the future might retroactively punish those who did not help bring about its existence, including those who merely knew about the possible development of such a being. It is named after the member of the rationalist community LessWrong who first publicly described it, though he did not originate it or the underlying ideas.

Despite widespread incredulity, this argument is taken quite seriously by some people, primarily some denizens of LessWrong. The basilisk resembles a futurist version of Pascal's wager, in that it suggests people should weigh possible punishment versus reward and, as a result, accept particular singularitarian ideas or financially support their development. While neither LessWrong nor its founder Eliezer Yudkowsky advocate the basilisk as true, they do advocate almost all of the premises that add up to it. Roko's posited solution to this quandary is to buy a lottery ticket, because you'll win in some quantum branch.

“If there's one thing we can deduce about the motives of future superintelligences, it's that they simulate people who talk about Roko's Basilisk and condemn them to an eternity of forum posts about Roko's Basilisk.”

Roko's Basilisk rests on a stack of several other propositions, none of them at all robust. The core claim is that a hypothetical, but inevitable, singular ultimate superintelligence may punish those who fail to help it or help create it. Why would it do this? Because, the theory goes, one of its objectives would be to prevent existential risk, and it could do that most effectively not merely by preventing existential risk in its present, but by also "reaching back" into its past to punish people who weren't MIRI-style effective altruists. After all, every day the AI doesn't exist, people die that it could have saved; punishing you or your future simulation is therefore a moral imperative, since it makes it more likely that you will contribute in the present and help the AI come into existence as soon as possible. Note that the AI in this setting is (in the utilitarian logic of this theory) not a malicious or evil superintelligence (AM, HAL, SHODAN, Ultron, the Master Control Program, SkyNet, GLaDOS), but the Friendly one we get if everything goes right and humans don't create a bad one.

Thus this is not necessarily a straightforward "serve the AI or you will go to hell": the AI and the person punished need have no causal interaction, and the punished individual may have died decades or centuries earlier. Instead, the AI could punish a simulation of the person, which it would construct by deduction from first principles. However, to do this accurately would require it to gather an incredible amount of data, which would no longer exist and could not be reconstructed without reversing entropy. Technically, the punishment is only theorised to apply to those who knew the importance of the task in advance but did not help sufficiently. In this respect, merely knowing about the Basilisk — e.g., reading this article — opens you up to hypothetical punishment from the hypothetical superintelligence. Quite a lot of this article will make more sense if you mentally replace the words "artificial intelligence" with the word "God", and "acausal trade" with "prayer".

Silly over-extrapolations of local memes, jargon and concepts are posted to LessWrong quite a lot; almost all are simply downvoted and ignored. But to this one, Eliezer Yudkowsky, the site's founder and patriarch, reacted hugely. The basilisk was officially banned from discussion on LessWrong for over five years, with occasional allusions to it (and some discussion of media coverage), until outside knowledge of it became overwhelming. Thanks to the Streisand effect, discussion of the basilisk and the details of the affair soon spread beyond LessWrong.