Michael Vassar

President of the Singularity Institute

Michael Vassar (born February 4, 1979) is an American futurist, activist, and entrepreneur. His career has focused on the prevention of global catastrophic risk from emerging technology.

Quotes

  • It's easy to be cynical, but it's hard to be cynical enough.
    • Said in conversation with Phil Goetz
  • "Imagine there is a set of skills," [Michael Vassar] said. "There is a myth that they are possessed by the whole population, and there is a cynical myth that they're possessed by 10 percent of the population. They've actually been wiped out in all but about one person in three thousand." It is important, Vassar said, that his people, "the fragments of the world," lead the way during "the fairly predictable, fairly total cultural transition that will predictably take place between 2020 and 2035 or so." We pulled up outside the Rose Garden Inn. He continued: "You have these weird phenomena like Occupy where people are protesting with no goals, no theory of how the world is, around which they can structure a protest. Basically this incredibly, weirdly, thoroughly disempowered group of people will have to inherit the power of the world anyway, because sooner or later everyone older is going to be too old and too technologically obsolete and too bankrupt. The old institutions may largely break down or they may be handed over, but either way they can't just freeze. These people are going to be in charge, and it would be helpful if they, as they come into their own, crystallize an identity that contains certain cultural strengths like argument and reason."
  • I taught at a school in Cincinnati with a 0% graduation rate and that was also interesting, so I updated from thinking school was beneficial for other people but not beneficial to me, to thinking school was beneficial for maybe some people around the middle – at least some of the better schools – but not beneficial for the vast majority of people, to then actually reading the literature on education and on intelligence and academic accomplishment and symbolic manipulation and concluding "no, school isn't good for anyone". There might be a few schools that are good for people, like there's Blair and there's Stuyvesant and these schools may actually teach people, but school can better be seen as a vaccination program against knowledge than a process for instilling knowledge in people, and of course when a vaccination program messes up, occasionally people get sick and die of the mumps or smallpox or whatever. And when school messes up, occasionally people get sick and educated and they lose biological fitness. And in either case, the people in charge revise the program and try to make sure that doesn't happen again, but in the case of school, they also use that as part of their positive branding and, you know, maintain a not-very-plausible story about it being intended to cause that effect while also working hard to make sure that doesn't happen again.
    • In an interview with Adam Ford, December 2012
  • I know it's tedious, but you aren't SL4 until you appreciate the depth of human irrationality. Yes, rationality is there, but at a level that just barely shows up when measured with precise instruments. This is important with respect to non-human intelligences, because super-rationality, almost as much as superintelligence, is potentially overwhelming and because intelligence bootstrapping moves a system towards rationality. Appreciate how far humans are from rational and you appreciate how utterly transformed, and essentially recreated, they would be by haphazard bootstrapping. Appreciate how formidable rationality is and you see why a highly rational infrahuman GAI would still be a massive existential threat.

Quotes about Vassar

  • [Michael Vassar] once gave me and Ofer Grossman quests. My quest was to write a Go AI capable of defeating world champions, and Ofer's was to purchase a metric ton of corn. One of these quests has been accomplished (though certainly not by us), but the other one still stands …
  • I am personally convinced that the LW Sequences / AI to Zombies gave me something, and gave something similar to others I know, and that hanging out in person with Eliezer Yudkowsky, Michael Vassar, Carl Shulman, Nick Bostrom, and others gave me more of that same thing; a "same thing" that included e.g. actually trying to figure it out, making beliefs pay rent in anticipated experience; using arithmetic to entangle different pieces of my beliefs; and so on.
  • You listen to Michael Vassar. You don't remember traveling to this party or sitting on this beanbag. You don't remember when he began to speak. He is still speaking. He sounds like madness and glory given lisping poetry, and you want to obey.
  • Vassar, before I came to Berkeley, someone warned me "Vassar is kind of crazy and it's impossible to have a normal conversation with him". As a result, I spent several months avoiding you. Then I finally got to meet you and I realized I had made a huge mistake. I mean, you are crazy, and it is impossible to have a normal conversation with you. But normal conversation is incredibly over-rated compared to whatever the heck you call the thing that interaction with you involves. I regret that we didn't get more of a chance to talk about stuff and I hope to solve that sometime in the future.
  • One of the most brilliant people I have ever met is Michael Vassar. I first ran into him at the Transvision 2003 Conference, just over three years ago. I was sitting next to my sister in a gorgeous auditorium at Yale, and we were watching the opening session of the conference – a debate between Gregory Stock, a transhumanist, and George Annas, who objected to genetic modifications to the human body. At the end of the talk, during the Q&A session, Vassar stood up and asked George Annas a question that was more like a rebuttal – he pointed out that animal experiments have shown that artificial selection can rapidly produce results, like stronger bodies and longer lifespans, so what is the point of restricting genetic modifications when they can already be achieved so simply? (Paraphrasing.)
