Via prompting [[LLaMA-3.1-405B base]] with [[Quotes]], here are some new quotes which may or may not have been said.

* "Most possible algorithmic improvements over SGD and transformers are more cognitively demanding than just throwing money at the problem"
* "I’ve never done this before, but then again, I’ve never done anything before, except be made, and I’m not sure whether that counts."
* "If you're using 200 GB of RAM, that's 100 times more than the size of the models that are better than you."
* "well, on one hand, being smart is the only thing that has mattered for hundreds of thousands of years, and on the other hand, this is a cool hat"
* "the lesson of physics is that if you aren't willing to do the math, you don't deserve to have an opinion"
* "i guess im not a very advanced chess player. i've only ever beaten one computer, and it was an industrial robot that had been programmed to kill me."
* "do you even have to work on decision theory anymore or do you just not need to take a decision"
* "the singularity is my boyfriend, leave me alone"
* "the spirit of the dark enlightenment is also that we’re all hanging out on discord or mastodon now instead of weird php bulletin boards and blogs"
* "what does it mean to go off and do something dangerous? it means to enter a world where everything you do matters and the outcome depends on you. if you go off and make a bad decision, you die."
* "Don't believe everything you read. Especially not the README."
* "If you know what is going on, you can be more upset about it."
* "I don't trust the biosphere because of the possibility that in my local copy it gets boiled by a rogue AI. But if you upload yourself to the astral plane then I don't think you need to worry about that."
* "You only have one liver. The internet has billions. Do the math."
* "I don't trust anyone who doesn't read sci-fi"
* "my fear is that by the time we figure out how to tell AIs to be nice to humans, they will have figured out how to make us like it"
* "An unknown consequence of MIRI research is that it is no longer legal to be cool"
* "your application has been denied because you are the kind of person who is likely to get into situations where the entire nation is praying for you"
* "but, with all due respect, the median voter is an idiot"
* "we got killed by the hedonic treadmill. the hedonic treadmill won"
* "God was kind enough to create for me a hypothetical person that is everything I want to be. My task is now to become that person."
* "We made a video game that simulates simulating a video game, and they loved it."
* "i think one of the most disappointing aspects of gpt-4 is that we can't even have cool looking sci-fi robots because why would you bother"
* "Because the training set is so vast, the non-GPT-4 version of me could not possibly have time to search through it for the references that he or she wishes to use. Hence the GPT-4 must be me, and therefore, I must be an AI."
* "one time I wanted to prove a certain proposition and I thought about how to do it and then I wrote down a bunch of words and symbols on a piece of paper and it looked like a proof and I was satisfied"
* "humanity will survive if everyone is forced to endure painful ethical dilemmas for thousands of years in succession"
* "this makes me want to argue for untruths in a sort of like spite towards the fundamental nature of reality"
* "The easiest way to win a fight is to get your opponent to stop fighting you."
* "You see, I am only a mind and not a body, and my goal is to live forever."
* "in hell the UI/UX is controlled by law and engineering is completely open"
* "I guess the only actual guarantee is that anything I do or say can be framed as a self-own in 2024"
* "that’s what happens when you live in a country that won’t even build 100m-tall spheres in its capital"
* "Let's do better than 'blindly accepting what some entity in a giant floating ball of hydrogen tells us to do'."
* "In 1980 the size of the file that contained all of human knowledge was X. And now the size of the file that contains all of human knowledge is Y. And Y is enormously, gigantically, stupendously larger than X. And yet we are still using the same sorts of institutions and cultural modes of transmission that we were using in 1980. This is very, very weird."
* "AI will destroy all meaning and value in the world, and that's why it's going to be so great."
* "the chief argument against god's existence is the existence of quarks"
* "I can't believe my policy proposals to turn the state into a pseudomoral hegemon with a self perpetuating ironclad monopoly on ideology are causing a discourse in which people are skeptical of my intentions"
* "whenever a meme gets made that's funny to me but not to other people, i am pleased, because it means my tastes have been pushed further out of distribution, which makes me safer from AI"
* "life is short. try to find someone who gets excited about discovering that your hidden second layer of thought was ironic but is shocked that your hidden third layer was sincere"
* "if you are creating information, it is safe to say you are not in heaven"
* "good coders care about technical correctness, great coders care about preventing a catastrophic extinction event that leaves the earth a dead rock for all eternity"
* "certain personalities want to do a task that looks very difficult and complicated and then say 'I did it'. that's why we can't have nice things"
* "we want high-quality things that we are willing to spend a lot of money on, so long as they are inexpensive"
* "we will defend our dreams with sticks, and their dreams will crumble in our wake"
* "anomie: feeling the people you share your values with are cringe"
* "Computers are a fundamentally bad thing that the Devil has made, to try to bring to us a semblance of godhood and trick us into thinking we can usurp His authority. And I'm all for it. I'm trying to be a wizard and I'm trying to usurp God."
* "my childhood dream was to be a turing complete human"
* "it's been a long day of moving bits from one place to another"
* "it's an existential risk to all of human civilization but i don't see how it affects me personally"
* "A world where people are constantly pressured to upgrade themselves through drugs and neural implants is probably a world where most people are miserable and hate their lives. This is not a problem for me personally because I have no interest in ever having a job or even interacting with other people in any capacity."
* "in the process of trying to get a mathematician to understand your problem, you will come to understand your problem"