Sam Altman admits Chinese rivals like DeepSeek pushed OpenAI to release open-weight models

OpenAI CEO Sam Altman has admitted that growing pressure from Chinese rivals such as DeepSeek was a main reason behind the release of the open-weight GPT-OSS models earlier this month.

OpenAI CEO Sam Altman

In short

  • OpenAI released two open-weight models on August 5, its first since 2019
  • These are GPT-OSS-120B and GPT-OSS-20B
  • China’s DeepSeek R1 was launched as an open-source model in January this year.

OpenAI took a major step earlier this month. Beyond the release of GPT-5 on August 7, the company released two open-weight models. This was the first time the AI startup had launched an open-weight model since GPT-2 in 2019. OpenAI CEO Sam Altman has now admitted that Chinese rivals played an important role in this decision.

The two open-weight models launched were GPT-OSS-120B and GPT-OSS-20B. The former is a larger model built to run on data-centre systems and high-end workstations. GPT-OSS-20B, on the other hand, can run, according to OpenAI, on most laptops, desktops and even devices with relatively modest hardware. These models can be run locally and adapted by researchers, developers and companies.


Earlier this year, DeepSeek shook the AI space with its R1 model. Not only did R1 match the likes of ChatGPT and Gemini, it was also open source. This gave users far more freedom than closed AI models such as ChatGPT.

This caused apprehension within OpenAI. Sam Altman said that releasing open-weight models became necessary to avoid Chinese dominance. He told CNBC, “It was clear that if we did not do so, the world was mostly going to be built on the Chinese open-source model.”

Altman admitted that this was a significant reason behind the release, though he clarified that it was not the only one.

What an open-weight model really means

An open-weight model like GPT-OSS is not the same as an open-source model like DeepSeek R1. In an open-weight model, users are given access to the ‘weights’. Weights are the numerical parameters that a large language model (LLM) learns during training.

When answering a question, the AI gives more weight to certain words or sequences. An open-weight model makes these weights available to users. Developers can then examine the weights and see how they are used in building the AI model.

However, how the AI was trained and the data used to train it are not shared. Thus, the code and data used to train GPT-OSS are not available to the public.

Nevertheless, OpenAI’s open-weight models can still be important. Developers can not only study the model’s weights but also run them locally or integrate them into existing programs, as sketched below. This could help reduce dependence on Chinese open-source models and strengthen the US position.
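For developers, running an open-weight model locally can be as simple as downloading the published weights and loading them with a standard inference library. The snippet below is a minimal sketch using the Hugging Face transformers pipeline; the repository name openai/gpt-oss-20b and the prompt are illustrative assumptions rather than details from this article, and even the smaller model needs a machine with plenty of memory.

```python
# Minimal sketch of running an open-weight model locally with the
# Hugging Face "transformers" library. The repo id "openai/gpt-oss-20b"
# is assumed for illustration; check the official model card before use.
from transformers import pipeline

# Downloads the published weights (once) and loads them for local inference.
generator = pipeline("text-generation", model="openai/gpt-oss-20b")

# Because the weights are stored locally, the prompt never leaves this machine.
result = generator(
    "Explain in one sentence what an open-weight model is.",
    max_new_tokens=60,
)
print(result[0]["generated_text"])
```

Running the weights locally like this, rather than calling a hosted API, is exactly the freedom that distinguishes open-weight releases from closed models such as the hosted version of ChatGPT.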

This step comes at a time when the US government is concerned about the rise of China in the AI race. The Trump administration has also imposed strict restrictions on advanced chip sales to Beijing.

