In brief, the posthuman technocapital singularity is the point in time at which compounding technological improvements and economic pressures remove humans from the loop of self-improvement and rapidly render them irrelevant or nonexistent. According to most people, this is [[bad]]. It can be understood in more detail by considering each component.
= Posthuman
Posthumanism is related to but distinct from [[wp>transhumanism|transhumanism]]. Where transhumanism seeks to upgrade humans with new capabilities using technology, posthumanism is about replacing humans with something new and distinct. In this case, posthumanism means replacing humans with inhuman AI systems, or humans modifying themselves so heavily, stripping away human qualities, that they become functionally equivalent to such systems. As [[https://www.lesswrong.com/posts/GNnHHmm8EzePmKzPk/value-is-fragile|has been argued]], this powerful optimization for non-human values would likely result in the elimination of all good from the universe.
= Technocapital
= Singularity