In brief, the posthuman technocapital singularity is the point in time at which compounding improvements in technology and economic pressures remove humans from the loop of self-improvement and rapidly render them irrelevant/nonexistent. According to most people, this is [[bad]]. It is believed that the concept was [[invented]] by [[Nick Land]], although nobody has actually read Nick Land. It can be understood in more detail by considering each component.
= Posthuman
Posthumanism is related to but different from [[wp>transhumanism|transhumanism]]. Where transhumanism seeks to upgrade humans with new capabilities using technology, posthumanism is about the replacement of humans with something new and distinct. In this case, it means replacing humans with inhuman AI systems, or humans having to modify themselves to strip out human qualities until they are functionally equivalent to such systems. As [[https://www.lesswrong.com/posts/GNnHHmm8EzePmKzPk/value-is-fragile|has been argued]], this powerful optimization for non-human values would likely result in the elimination of all good from the universe.
= Technocapital
According to surveys, this is considered an undesirable outcome by most [[people]], and is thus, naively, unlikely. However, via coordination problems ([[https://slatestarcodex.com/2014/07/30/meditations-on-moloch/|Moloch]]), outcomes which no involved party would choose, and which are negative for everyone, can occur whenever avoiding them requires each party to pay a local cost for a shared benefit. The "capital" part of "technocapital" refers to how the competitive and forward-looking structure of capitalist economic systems mediates these processes.
In this case, the general dynamic is that organizations which race towards more powerful AI and turn over more control to AI systems will do better than those which progress more slowly to focus on safety, leave humans in charge of decision-making where they are worse but safer, and so on. As such, every individual organization is incentivized to race as much as possible, even though this produces outcomes that are worse for everyone.
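The racing dynamic can be sketched as a standard two-player coordination failure. The payoff numbers below are purely illustrative assumptions, not drawn from any source; the point is only that racing can be each party's best response while leaving everyone worse off.
```python
# Minimal sketch of the racing dynamic as a two-lab game.
# All numbers are illustrative assumptions: racing gives a private
# competitive edge, but each racer imposes a shared safety cost on both labs.

EDGE = 3          # assumed private benefit of racing
SHARED_COST = 2   # assumed cost each racer imposes on *both* labs

def payoff(my_choice: str, their_choice: str) -> int:
    """My payoff given both labs' choices ("race" or "careful")."""
    private_gain = EDGE if my_choice == "race" else 0
    racers = [my_choice, their_choice].count("race")
    return private_gain - SHARED_COST * racers

for mine in ("race", "careful"):
    for theirs in ("race", "careful"):
        print(f"I {mine:>7}, they {theirs:>7}: my payoff = {payoff(mine, theirs)}")

# Whatever the other lab does, "race" beats "careful" by EDGE - SHARED_COST = 1,
# so both labs race and each gets -1, even though mutual carefulness pays 0 to both.
```
With these assumed numbers the game is a prisoner's dilemma: racing strictly dominates for each lab individually, and the dominant strategies together produce the outcome neither prefers.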
= Singularity
A process with a fixed output rate, such as a [[machine]] which produces one [[bee]] per second, leads to arithmetic growth - the total output from it increases linearly with time. However, if the growth rate is proportional to the current total, i.e. it grows by a fixed percentage at regular intervals - like simplified population growth without resource constraints - the total grows exponentially with time. Systems where the proportional rate of growth also grows as the total does can instead display hyperbolic growth: modelled mathematically, the total grows arbitrarily large in finite time, and is then undefined at a "singularity". The idea of the technological singularity is that (some) technological progress leads to further, faster technological progress, and so the general state of economic development and technology increases arbitrarily fast and the world advances faster than unaugmented humans can adapt to it.
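As a concrete comparison (standard growth-rate mathematics, not specific to any particular model of the singularity), the three regimes can be written as simple differential equations; only the superlinear case blows up in finite time:
```latex
% Constant output rate: arithmetic (linear) growth.
% Rate proportional to the total: exponential growth.
% Rate growing faster than the total (quadratic here, as a simple example):
% hyperbolic growth, which diverges at the finite time t* = 1/(r X_0).
\begin{align*}
  \frac{dX}{dt} = c     &\;\Rightarrow\; X(t) = X_0 + c\,t \\
  \frac{dX}{dt} = r X   &\;\Rightarrow\; X(t) = X_0\, e^{r t} \\
  \frac{dX}{dt} = r X^2 &\;\Rightarrow\; X(t) = \frac{X_0}{1 - r X_0 t},
    \quad\text{undefined at } t^* = \tfrac{1}{r X_0}
\end{align*}
```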
Empirically, this was [[https://docs.google.com/spreadsheets/d/1xEkh4jhUup0qlG6EzBct6igvLPeRH4avpM5nZQ-dgek/edit?gid=478995971#gid=478995971|cancelled]] around 1960, when growth fell off-trend. According to some, this is because hyperbolic growth of population stopped around this time; sufficiently advanced AI is capable of substituting for population, and could thus restore the trend.
= Depictions in fiction
== Accelerando
[[https://www.antipope.org/charlie/blog-static/fiction/accelerando/accelerando.html|Accelerando]] is centred around this, although its singularity hits a degenerate state in which it does not expand beyond the inner solar system, due to light-speed latencies and its own coordination failures, and some humans continue to exist.
> "Individuality is an unnecessary barrier to information transfer," says the ghost, morphing into its original form, a translucent reflection of her own body. "It reduces the efficiency of a capitalist economy."
> Take a human being and bolt on extensions that let them take full advantage of Economics 2.0, and you essentially break their narrative chain of consciousness, replacing it with a journal file of bid/request transactions between various agents; it's incredibly efficient and flexible, but it isn't a conscious human being in any recognizable sense of the word.
> From outside the Accelerated civilization, it isn't really possible to know what's going on inside. The problem is bandwidth: While it's possible to send data in and get data out, the sheer amount of computation going on in the virtual spaces of the Acceleration dwarfs any external observer. Inside that swarm, minds a trillion or more times as complex as humanity think thoughts as far beyond human imagination as a microprocessor is beyond a nematode worm. A million random human civilizations flourish in worldscapes tucked in the corner of this world-mind. Death is abolished, life is triumphant. A thousand ideologies flower, human nature adapted where necessary to make this possible. Ecologies of thought are forming in a Cambrian explosion of ideas: For the solar system is finally rising to consciousness, and mind is no longer restricted to the mere kilotons of gray fatty meat harbored in fragile human skulls.
== Economies of Force
[[https://apex-magazine.com/short-fiction/economies-of-force/|Economies of Force]] has themes resembling the posthuman technocapital singularity, though humans appear to retain more control.
> The market laminated itself onto his contacts. Fractal reef grown in radiant heatmap colors on the walls of a single instant. Algorithm ecosystem, loops of predation and cannibalism. Civilizations of structure, self-referential empires of buy and sell, risen and collapsed in the time it took a single neuron to pump one action potential.
> He stared into the infinite reach of the economy — no, no, just one shaved instant of it — and tried to feel Apona's remembered rapture. Terror instead.
> “Because the stronger market will buy out the weaker. The smarter network will win. It's the only law, the highest law, the one that remains when we abandon morality and teleology and intent. The hegemony of force.”