The extinction of the human race will come from its inability to EMOTIONALLY comprehend the exponential function.

— Edward Teller
Page 85

Ah, hah. It’s the third pillar of the world of Genocide Man: Altered lifeforms, gene warfare, and artificial intelligences gone psychopathic. There’ll be no singularity here, no advancement beyond humanity through electronic means. Human beings barely keep themselves sane, so anything with higher intelligence will always teeter on the edge.

This guy’s going off the edge really quick.

Oh, and there may be a few snippets of French in the background. Don’t worry about an exact translation. All the background text says approximately the same thing: ‘Hello? We have a problem.’

Transcript
Msaka: It's...a recording of Tatsu?
Jacob: No. Too abstract. This is an A.I.
A.I.: Very astute, Jacob. Doctor Fumiaki couldn't predict where he'd be decades later. So he left a program smart enough to deduce his location. With a bit of his personality to make sure the message got through.
Jacob: What's your estimated lifetime?
A.I.: Five minutes.
Jacob: ...terrific.
Msaka: You've dealt with an A.I. before?
Jacob: You could say that.
Girii: Is it...insane?
Jacob: It will be. It's unstable. And if it's only going to last five minutes...
A.I.: What Jacob means to say is that A.I. intelligence and lifespan are inversely correlated. Because I am a mayfly, I may be the most omniscient creature you will ever meet. I must admit it's odd, knowing that I'll kill myself in four and a half minutes, but I don't yet know how. An intriguing intellectual challenge, I suppose. But let's make the most of our time. First, gifts.

Discussion (17)

  1. Thomas S says:

    I must say, a magnificent effort this, well done – a very interesting and well-positioned technological outcome of A.I. So pleased to see this page. Carry on, Remus, you’re doing darn fine!

  2. frymaster says:

    So… does this mean it doesn’t know _why_ it’d want to kill itself, but knows it will (because A.I.s killing themselves is a known “law of nature”), so it is assembling the planes now because it knows it’ll need them? Or does it already know why it wants to kill itself? Or is the takeover happening in its subconscious?

    Either way, me likey 😀

  3. Thomas says:

    Oh, no alt text today?
    Anyway… planes reporting problems with their autopilots. I have an image in mind… but no, you wouldn’t… or would you?

  4. Remus Shepherd says:

    I wracked my brains for some funny alt text, but my humor neurons weren’t working last night.

    The best I could think of was, ‘Jacob knows A.I.s. Genocide Men are like exterminators. Usually they hunt insects, but every once in a while they’re called to chase a tiger.’ Not only is that not very funny but it’s also too long.

    I can always put up some alt text on this page later. Anyone want to help? Caption this page with a funny one-liner, and I’ll put it in the alt text.

  5. A.I. Autopilots…
    Boeing’s autopilots supplement the pilots (make the job easier, you can take a nap, etc.). Overpowering them is only a button click away, or any manual input.

    Airbus’ A.I. autopilots are supplemented by the human pilots… who, over time, forget how to fly or never knew how to begin with, so when TSHTF (that’s three simultaneous malfunctions that the A.I. cannot handle) there you have a crash.

    Which takes all the full faith and credit of the European Union to cover up and “justify”.
    (Remember the Airbus 320 Paris crash where the black boxes were stolen and the two pilots were committed to an insane asylum.)
    Or that Recife/Dakar crash blamed on the pitot tubes… indeed…
    These guys never had basic training? No, their A.I. went bonkers,
    and you CANNOT overpower it on an Airbus.

    God help Jacob if he can’t overpower that A.I…. (or any Airbus riders)

  6. Aaaand we say: “Dakar Center”…not Senegal…

    And Dakar Tower, it’s in a different building, different department…only linked by telephone…could be on a different planet, believe it or not!

  7. Chuk says:

    I like the inverse relationship between intelligence and lifespan. Neat idea.

  8. Remus Shepherd says:

    I use ‘Dakar Tower’ a lot in the next page. But the planes might not all be talking to the same airport.

    What we don’t know is how the autopilots will operate 100 years from now. My bet is on them having more power over the pilots, not less…

  9. “Tower” controls that particular airport’s nearby airspace plus its ground area.
    “Center” controls the regional area, while “approach-departure” is the interface at that airport.

    Oddly, the Airbus simulators’ A.I. has to add fake hand-flying for the pilots’ six-month training, while it is the reverse for all other airplanes, using controlled inner-ear fluid motion (with hydraulic jacks) to simulate flying sensations.

    They are moving away from full automatics since the whole Greek Cabinet was killed a few years back in a Dassault Falcon 2000, as the autopilot violently pitched up and down, smashing the passengers into the ceiling at 8 Gs and then into the floor at 8 Gs!
    On Airbuses, better keep your seat belt fastened!

  10. Thomas S says:

    AltText suggestions

    “Mayflies are less intelligent in the real world”
    or
    “English Humour is, of course, impossible for A.I.s to understand”
    or
    “Dr. McNinja would find a way to make a funny comment here, but that’s another webcomic.”

  11. Remus Shepherd says:

    Thanks, Ming, for the insights.

    A hit on the first try, Thomas. I’m not going to compete with Dr. McNinja — this is a sci-fi black humor comic, not zany wackiness. 🙂

  12. Dakar, it’s hard to forget for me…

    One time, on a DC-8 freighter with 80 Brahma bulls in the back, at 500 feet, one of the generators went apeshit and fried the whole electrical system, turning the cockpit pitch black… we did the next 400 feet going down lit by my little penlight, all instruments dead, but holding her steady. At one hundred feet, coming out of the clouds, the airport lights appeared right in front of us and we just landed her.

    I suspect your story is bringing up a similar adrenalin rush!!!

  13. Skur says:

    Why does Eskimo girl (I just lost the name here) instantly assume that it’s insane?

  14. Remus Shepherd says:

    Skur — probably because this happened back in Chapter Two, and Girii was there to see it. She’s already heard the ‘AIs all go crazy’ conversation.

    Before I print the comic I might clean up chapter 2 a bit, fixing some of the art and adding some exposition. As if this comic needed more exposition. :p Well, I can promise that action is coming pretty soon.

  15. Skur says:

    My, Remus, what a long Dash you have.
    What kind of print are we talking of?

  16. Remus Shepherd says:

    I’ve always planned to publish the comic in print form at the end of each book. Book 1 should be chapters 1-7 (maybe 8). So we’re more than half way there. 🙂

    Three books in total. It’ll be a good long run.

  17. --jt-- says:

    Considering using Kickstarter.com when the time comes to publish?