timelets ([personal profile] timelets) wrote 2020-02-21 11:12 am

The last invention

Can't wait.
The arrival of superintelligence will clearly deal a heavy blow to anthropocentric worldviews. Much more important than its philosophical implications, however, would be its practical effects. Creating superintelligence may be the last invention that humans will ever need to make, since superintelligences could themselves take care of further scientific and technological development. They would do so more effectively than humans. Biological humanity would no longer be the smartest life form on the block.

--- Nick Bostrom, 2003.

https://nickbostrom.com/views/transhumanist.pdf


I wonder how it feels to be someone's dog.

[personal profile] dmm 2020-02-21 10:58 pm (UTC)
I think superintelligence implies superior mastery over the forces of nature. It implies the capabilities of a superengineer, a superresearcher, etc. And so, it should imply drastic progress in biomedical science and medical practice, among other things. (Of course, people can define the same notion differently, but I am quite certain that Nick Bostrom means it this way.)

Pragmatically speaking, why on Earth would I need an "owner" if that "owner" can't reverse my aging, fix my future cancers, and postpone my mortality indefinitely? What would such an "owner" be able to give me that I don't already have? Nothing comes to mind...