• MangoCats@feddit.it
    7 hours ago

    you might not get the chance to tell it this isn’t what you meant.

    And that is where the thought experiment left the tracks, lifted off with escape velocity, and is now passing Voyager 2…

    In what cartoon world do we not get a chance to shut off the Doomsday Device? I mean, it was a funny little twist at the end of Dr. Strangelove, but as realistic as many elements of that story were, that was not one of them.

    • Iconoclast@feddit.uk
      edited 9 minutes ago

      I don’t think you fully appreciate the implications of creating something orders of magnitude more intelligent than us. You can’t outsmart something smarter than you. Even if it was only as smart as the smartest human, being a computer it would still process information a million times faster. Everything would happen in super-slow motion from its perspective. It would have so much time to consider each move.

      Humans aren’t anywhere near the strongest primate on Earth, yet we’re by far the dominant one. I don’t think a gorilla has any idea just how much smarter we are, and even if it did, it would probably still assume that a war with humans would mean us outnumbering them, hitting, biting, and throwing things at them. They’d have no clue we can end them from a distance without them ever knowing what hit them. They can’t even imagine all the ways we could screw things up for them - and already have - even when we have nothing against gorillas.

      My point isn’t that I think this is absolutely going to happen; it’s that we’re effectively rolling the dice and seeing what happens, which I find incredibly irresponsible. This whole “it’ll be fine, we can always turn it off” attitude is incredibly naive and short-sighted.