From 9c49669b9cba0b31b5096b0a141fc04284e7da8a Mon Sep 17 00:00:00 2001
From: osmarks
Date: Tue, 13 Aug 2024 08:32:11 +0000
Subject: [PATCH] =?UTF-8?q?Edit=20=E2=80=98posthuman=5Ftechnocapital=5Fsin?=
 =?UTF-8?q?gularity=E2=80=99?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 posthuman_technocapital_singularity.myco | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/posthuman_technocapital_singularity.myco b/posthuman_technocapital_singularity.myco
index 92c34ad..fdf791d 100644
--- a/posthuman_technocapital_singularity.myco
+++ b/posthuman_technocapital_singularity.myco
@@ -2,7 +2,7 @@ In brief, the posthuman technocapital singularity is the point in time at which
 
 = Posthuman
 
-Posthumanism is related to but distinct from [[wp>transhumanism|transhumanism]]. Where transhumanism seeks to upgrade humans with new capabilities using technology, posthumanism is about replacement of humans with something new and distinct. In this case, posthumanism means replacing humans with inhuman AI systems, or humans having to modify themselves to remove human qualities such that they are functionally equivalent. As [[https://www.lesswrong.com/posts/GNnHHmm8EzePmKzPk/value-is-fragile|has been argued]], this powerful optimization for non-human values would likely result in the elimination of all good from the universe.
+Posthumanism is related to but different from [[wp>transhumanism|transhumanism]]. Where transhumanism seeks to upgrade humans with new capabilities using technology, posthumanism is about replacement of humans with something new and distinct. In this case, posthumanism means replacing humans with inhuman AI systems, or humans having to modify themselves to remove human qualities such that they are functionally equivalent. As [[https://www.lesswrong.com/posts/GNnHHmm8EzePmKzPk/value-is-fragile|has been argued]], this powerful optimization for non-human values would likely result in the elimination of all good from the universe.
 
 = Technocapital
 