
Wednesday, July 31, 2024

Seriously now...

  A way to consider the reach of AI.

 


Stephen Fry Explains Why Artificial Intelligence Has a “70% Risk of Killing Us All”

Stephen Fry seems to have his doubts about certain big-tech projects in the works today: take the “$100 billion plan with a 70 percent risk of killing us all” described in the video above.

This plan, of course, has to do with artificial intelligence in general, and “the logical AI subgoals to survive, deceive, and gain power” in particular. Even in this relatively early stage of development, we’ve witnessed AI systems that seem to be altogether too good at their jobs, to the point of engaging in what would count as deceptive and unethical behavior were the subject a human being. (Fry cites the example of a stock market-investing AI that engaged in insider trading, then lied about having done so.) What’s more, “as AI agents take on more complex tasks, they create strategies and subgoals which we can’t see, because they’re hidden among billions of parameters,” and quasi-evolutionary “selection pressures also cause AI to evade safety measures.”


Open Culture 


Today's quote:

It is in the act of giving without the expectation of anything in return that we find joy.

7 comments:

  1. Not sure I completely understood that - but if Stephen's worried, I'm worried.

    Replies
    1. I learned a few more things to worry about here.

  2. It seems that so many people, including fellow bloggers, are jumping on the AI train; however, I am not among them at present or for the foreseeable future.

    Replies
    1. Me neither. I know there are already lots of records of everything I say and do, which lead to ads following me everywhere. But I don't want to try to use my creativity in that way either.

  3. I find it terribly scary. Deceitful, too.

    Replies
    1. This is just one more thing to worry about. I sort of don't have the capacity for it with all the other important things.

  4. Humanity has a pretty high chance of killing us all. Higher than AI, probably. The folks behind the Doomsday Clock say we’re at 100 seconds before midnight.

There is today, more than ever, the need for a compassionate, regenerative world civilization.