Pat, your presumption is wrong. The fatal scenario is an AI that neither loves you nor hates you, because you're still made of atoms that it can use for something else. But when I think of a case like this, I imagine trying to get the world to a condition where some unemployed person can offer to drive you to work for 20 minutes, be paid five dollars, and then have nothing else bad happen to them.

Yes, there are areas that need much more research, but given the research to date, it is unlikely that we will eventually discover that IQ tests were not a reliable measure of what has been defined as intelligence.

In 2001, he wrote the book-length work 'Creating Friendly AI'. Even if you have a good idea, it isn't always exploitable. However, general AI components are widely reused between companies. This is only one-sixth of Yudkowsky's enormous Sequences: an unusually scientifically accurate philosophical system covering statistics, physics, psychology, history, ethics, and, most importantly, the specific universal obstacles to your being rational. Utilitarianism is a correct theory of morality.

No Lord hath the champion, no mother and no father, only nothingness above.

Only weird and frankly terrifying anthropic theories would let you live long enough to gaze, perhaps knowingly and perhaps not, upon the halting of the longest-running halting Turing machine with 100 states. To live significantly past a googolplex years without repeating yourself, you need computing structures containing more than a googol elements, and those won't fit inside a single Hubble volume.
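The googol-versus-googolplex figure is a counting argument: n two-state elements admit at most 2^n distinct configurations, so not repeating for a googolplex (10^(10^100)) steps requires 2^n > 10^(10^100), i.e. n > 10^100 x log2(10). A minimal sketch of that arithmetic (the framing and names are mine, not Yudkowsky's):

```python
import math

GOOGOL = 10 ** 100  # exact integer

# A memory built from n two-state elements has at most 2**n distinct
# configurations, so within 2**n steps it must revisit one of them.
# Surviving a googolplex (10**GOOGOL) steps without repeating forces
#     2**n > 10**GOOGOL  =>  n > GOOGOL * log2(10).
min_elements = GOOGOL * math.log2(10)  # ~3.32e100

print(f"elements needed: {min_elements:.3g}")          # 3.32e+100
print(f"more than a googol: {min_elements > GOOGOL}")  # True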

Someone like that might suspect that I'm not a purely random lottery ticket winner, but they won't have as much evidence to that effect as you. This never works. I have some imaginative sympathy with myself a subjective century from now.

Eliezer Yudkowsky: books

Horgan: Will superintelligences possess free will?

Horgan: Are you religious in any way?

A new exploration of systemic failure and uncommon success.

You're right, Pat, that completely unbiased agents who lack truly foundational disagreements on priors should never end up in this situation.
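The principle behind this is Aumann-flavored: Bayesian agents who share a common prior and see the same evidence must compute the same posterior, so persistent disagreement has to trace back to priors, private evidence, or bias. A minimal sketch of the easy direction, assuming a shared Beta prior over a coin's bias and a shared record of flips (all names and numbers are illustrative, not from the text):

```python
# Two agents share the SAME prior over a coin's bias and update on
# the SAME observed flips. Because prior and likelihood agree, their
# posteriors are identical: with no foundational disagreement on
# priors, there is nothing left to disagree about.

def beta_bernoulli_update(prior, flips):
    """Conjugate update: prior = (alpha, beta), flips = list of 0/1."""
    alpha, beta = prior
    heads = sum(flips)
    return (alpha + heads, beta + len(flips) - heads)

shared_prior = (1, 1)          # uniform Beta(1, 1) belief about the bias
shared_evidence = [1, 1, 0, 1, 0, 1]

agent_a = beta_bernoulli_update(shared_prior, shared_evidence)
agent_b = beta_bernoulli_update(shared_prior, shared_evidence)

print(agent_a, agent_b, agent_a == agent_b)  # (5, 3) (5, 3) True
```

Aumann's agreement theorem proper is stronger; it covers agents with different private evidence whose posteriors are common knowledge. The sketch only shows the trivial direction: common priors plus common evidence leave nothing to disagree about.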

Eliezer Yudkowsky: Harry Potter

You actually have a pre-existing status hierarchy built up in your mind around Harry Potter fanfiction. The concept of "inadequacy" that Eliezer introduces here is new to me, at any rate, and potentially valuable.

LessWrong

Love the way Eliezer explores the world. But even there, everything is subject to defeat by special cases. But this degree of unevenness seems implausibly extreme.

A good idea alone is not enough: if you come up with a website that's better than Craigslist, that doesn't guarantee it will replace Craigslist, because there are many other factors in play, such as network effects.

Institute NGDP level targeting regimes at central banks and let the too-big-to-fails go hang.

Yudkowsky: No.

Do you have any idea what the vast majority of the audience for Harry Potter fanfiction wants? Vivid detail warning!

As for your question about opportunity costs: there is a conceivable world where there is no intelligence explosion and no superintelligence. I interviewed him on Bloggingheads.

Horgan: What would superintelligences want?

Smarter-than-human AI breaks your graphs. Or does nothing much exciting happen?

Asking whether the brain is a Bayesian algorithm is like asking whether a Honda Accord runs on a Carnot heat engine. If you have one person who's trying to say, "Every car is a thermodynamic process that requires fuel and dissipates waste heat," and the person on the other end hears, "If you draw a diagram of a Carnot heat engine and show it to a mechanic, they should agree that it looks like the inside of a Honda Accord," then you are going to have some fireworks.
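To unpack the analogy: "Bayesian" names an ideal standard of belief updating, the way "Carnot engine" names an ideal standard of thermodynamic efficiency; neither describes the machinery. A minimal sketch of the ideal itself, Bayes' rule applied to numbers of my own choosing:

```python
def bayes_posterior(prior, likelihood_if_true, likelihood_if_false):
    """Exact Bayes' rule: P(H|E) = P(E|H)P(H) / P(E)."""
    joint_true = likelihood_if_true * prior
    joint_false = likelihood_if_false * (1 - prior)
    return joint_true / (joint_true + joint_false)

# Illustrative numbers (mine, not from the interview): a 1% base
# rate and a test that is 90% sensitive with a 9% false-positive
# rate. The ideal updater ends up at ~9.2%, not 90%.
print(bayes_posterior(prior=0.01,
                      likelihood_if_true=0.90,
                      likelihood_if_false=0.09))  # ~0.0917
```

Real brains, like real engines, get judged against this ideal without being diagrams of it; mistaking one level for the other is where the fireworks come from.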