OpenAI researcher resigns, citing fear of rapid AI development

An OpenAI researcher has resigned from his position after four years at the company, citing the rapid development of artificial intelligence. In a post on X, Steven Adler announced that after four years working on safety at OpenAI, he quit his job in mid-November.

“It was a wild ride with lots of chapters – dangerous capability evals, agent safety/control, AGI and online identity, etc. – and I’ll miss many parts of it,” Mr. Adler wrote.

In the next part of the post, he addressed how “terrified” he was. “Honestly, I’m pretty terrified by the pace of AI development these days. When I think about where I’ll raise a future family, or how much to save for retirement, I can’t help but wonder: will humanity even make it to that point?”

In follow-up posts, Mr. Adler said that “an AGI race is a very risky gamble,” because no lab has a solution to AI alignment, and even if one lab genuinely wants to develop the technology responsibly, competitive pressure pushes everyone to speed up.

“As for what’s next, I’m enjoying a bit of a break, but I’m curious: what do you see as the most important and neglected ideas in AI safety/policy? I’m especially excited about control methods, scheming detection, and safety cases,” Mr. Adler concluded.

Not long ago, Geoffrey Hinton, often called the “Godfather” of AI, expressed concern that the technology could cause human extinction within the next 30 years.

The British-Canadian computer scientist, who was awarded the 2024 Nobel Prize in Physics for his work in the field, said there was a “10% to 20%” chance that AI could lead to human extinction within the next three decades. He had earlier put the likelihood at 10%.

Mr. Hinton, a professor at the University of Toronto, also said that humans would be like children compared with advanced AI systems. “I like to think of it as: imagine yourself and a three-year-old. We’ll be the three-year-olds,” he said.
