Create ‘anthropic_shadow’

This commit is contained in:
osmarks 2024-09-01 10:08:01 +00:00 committed by wikimind
parent d4907e729b
commit 0d52789d75

anthropic_shadow.myco Normal file

@@ -0,0 +1,5 @@
As a corollary to [[quantum immortality]], events which would kill all humans, such as the [[posthuman technocapital singularity]] or other forms of [[AGI doom]], cannot happen. By extension, anything which would lead to such events also cannot happen. Anthropic shadow explains* many phenomena:
* Python has bad dependency management because all ML code is written in it. If it were good, we would have AGI.
* RL doesn't work stably or reliably because it would be too powerful.
* LLMs are what we got because they are slow to develop ([[scaling laws]]) and can do some useful tasks while remaining bad at [[agentic]] action.