Edit ‘posthuman_technocapital_singularity’
parent e1231c950d
commit a6e0c0f23e
@@ -2,7 +2,7 @@ In brief, the posthuman technocapital singularity is the point in time at which
= Posthuman
- Posthumanism is related to but distinct from [[wp>transhumanism|transhumanism]]. Where transhumanism seeks to upgrade humans with new capabilities using technology, posthumanism is about the replacement of humans with something new and distinct. In this case, posthumanism means replacing humans with inhuman AI systems, or humans having to modify themselves to remove human qualities such that they become functionally equivalent to such systems. As [[https://www.lesswrong.com/posts/GNnHHmm8EzePmKzPk/value-is-fragile|has been argued]], this powerful optimization for non-human values would likely result in the elimination of all human value from the universe.
+ Posthumanism is related to but distinct from [[wp>transhumanism|transhumanism]]. Where transhumanism seeks to upgrade humans with new capabilities using technology, posthumanism is about the replacement of humans with something new and distinct. In this case, posthumanism means replacing humans with inhuman AI systems, or humans having to modify themselves to remove human qualities such that they become functionally equivalent to such systems. As [[https://www.lesswrong.com/posts/GNnHHmm8EzePmKzPk/value-is-fragile|has been argued]], this powerful optimization for non-human values would likely result in the elimination of all good from the universe.
= Technocapital