Edit ‘good_ideas’: Synced 1731013539906

sync 2024-11-07 21:20:42 +00:00 committed by wikimind
parent a3b5d73d67
commit b650b9806c

@@ -13,6 +13,7 @@ Meme sparse autoencoding (I think a CLIP SAE already exists though).
Similarly, contrastive RL for computer algebra (specifically, proving that expressions equal other expressions via making substitutions repeatedly). Try and contrastively learn a "how close is this expression to this other one" function (I think with an action input?). Bootstrap to progressively harder problems.
* What does Gyges mean by "anyone who wants it: you should be able to train contrastive models way faster if you use lsh to determine pairs to contrast"? This might contain alpha; one reading is sketched after this block.
* Maybe this should be a "theorem prover" and not an "expression rewriter". I think they're fairly similar anyway.
* Check sticky note for slightly more detailed sketch.
}
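
A minimal sketch of one reading of the LSH suggestion, assuming cheap per-expression embeddings already exist: SimHash-style random-hyperplane bucketing, with bucket collisions used as candidate pairs so the O(n²) all-pairs search is skipped. Every name and number below is hypothetical.

```python
# Sketch of LSH pair mining for contrastive training: SimHash (random-hyperplane)
# bucketing over cheap embeddings, then only contrast items that share a bucket.
# Names and sizes are hypothetical; the point is avoiding the O(n^2) pair search.
import numpy as np
from collections import defaultdict

def simhash_buckets(embeddings: np.ndarray, n_bits: int = 16, seed: int = 0):
    """Assign each embedding a bucket key from the signs of random projections."""
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((embeddings.shape[1], n_bits))
    bits = (embeddings @ planes) > 0                      # (n, n_bits) booleans
    keys = (bits * (1 << np.arange(n_bits))).sum(axis=1)  # pack bits into ints
    buckets = defaultdict(list)
    for idx, key in enumerate(keys):
        buckets[int(key)].append(idx)
    return buckets

def candidate_pairs(buckets, max_per_bucket: int = 32):
    """Yield index pairs that collide in a bucket: candidate positives/hard negatives."""
    for members in buckets.values():
        members = members[:max_per_bucket]
        for i in range(len(members)):
            for j in range(i + 1, len(members)):
                yield members[i], members[j]

# Toy usage: 10k random "expression embeddings" in 64 dims.
emb = np.random.default_rng(1).standard_normal((10_000, 64)).astype(np.float32)
pairs = list(candidate_pairs(simhash_buckets(emb)))
print(f"{len(pairs)} candidate pairs instead of ~50M exhaustive pairs")
```

Items that collide tend to be close under cosine similarity of the cheap embedding, so collisions can serve as positives (or hard negatives, if labels say otherwise); the claimed speedup would come entirely from never enumerating the full pair set.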
* {
Startup ideas:
@@ -61,3 +62,12 @@ https://worksinprogress.co/issue/the-beauty-of-concrete/
* AutoGaryMarcus.
* Automate those one-way/recorded video interviews using an LLM, TTS and deepfake system.
* Hex-grid particle simulator (similar to the Powder Toy) with GPU acceleration or HashLife algorithm.
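
For the hex grid itself, axial coordinates keep neighbor lookup as cheap as on a square grid, which is the part a GPU kernel would hammer; a tiny, purely illustrative sketch:

```python
# Hex-grid indexing with axial coordinates (q, r): each cell has exactly six
# neighbors found by constant offsets, which maps cleanly onto flat arrays.
HEX_NEIGHBOR_OFFSETS = [(+1, 0), (+1, -1), (0, -1), (-1, 0), (-1, +1), (0, +1)]

def hex_neighbors(q: int, r: int):
    """Return the axial coordinates of the six cells adjacent to (q, r)."""
    return [(q + dq, r + dr) for dq, dr in HEX_NEIGHBOR_OFFSETS]

print(hex_neighbors(0, 0))
# [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]
```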
* Fix https://arxiv.org/abs/2411.00566 with less deranged ML.
* {
Diffusion model for Minecraft worlds.
* Scrape the internet for worlds, extract "interesting" regions.
* How to deal with the large volume of data in a world? 16³ is already 4096 blocks per region, though it's probably more compressible than 2D. Apply HDiT (?). See the sketch after this block.
}
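
On the data-volume point, a hedged sketch of one way to feed 16³ regions of block IDs into a diffusion transformer: embed the block IDs, then fold non-overlapping 4³ patches into tokens. The patch size, vocabulary size, and widths are assumptions, and HDiT's hourglass down/up-sampling would sit on top of this rather than being shown here.

```python
# Hedged sketch: turn a 16x16x16 volume of block IDs into a short sequence of
# patch tokens for a transformer-based diffusion model. The patch size (4^3),
# vocabulary size, and embedding widths are assumptions, not anything from HDiT.
import torch
import torch.nn as nn

class VoxelPatchEmbed(nn.Module):
    def __init__(self, vocab_size=1024, embed_dim=64, patch=4, model_dim=256):
        super().__init__()
        self.block_embed = nn.Embedding(vocab_size, embed_dim)  # block ID -> vector
        self.proj = nn.Conv3d(embed_dim, model_dim,             # non-overlapping 3D patches
                              kernel_size=patch, stride=patch)

    def forward(self, ids: torch.Tensor) -> torch.Tensor:
        # ids: (batch, 16, 16, 16) integer block IDs
        x = self.block_embed(ids)            # (B, 16, 16, 16, embed_dim)
        x = x.permute(0, 4, 1, 2, 3)         # (B, embed_dim, 16, 16, 16)
        x = self.proj(x)                     # (B, model_dim, 4, 4, 4)
        return x.flatten(2).transpose(1, 2)  # (B, 64 tokens, model_dim)

tokens = VoxelPatchEmbed()(torch.randint(0, 1024, (2, 16, 16, 16)))
print(tokens.shape)  # torch.Size([2, 64, 256])
```

That yields 64 tokens per 16³ region, a manageable sequence length; long runs of identical blocks (air, stone) are also what makes these volumes more compressible than typical 2D images.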
* Autonomous, rogue [[Autogollark]].
* Make a high-performance physically plausible space combat simulator and do RL to spite Devon Eriksen.
* Simulate self-consistent time travel in games by having an absurdly superhuman AI player enforce consistency using "random" degrees of freedom.