OpenAI is partnering with the U.S. AI Safety Institute, giving the agency early access to its next-generation AI model. The collaboration aims to address AI safety risks by prioritizing responsible development practices.
OpenAI has taken a significant step toward improving AI safety by granting the U.S. AI Safety Institute early access to its next AI model. CEO Sam Altman recently announced on X that the company will work with the government body on safety evaluations.
In May, OpenAI faced criticism over its internal safety practices. The company dissolved its dedicated safety team, a decision that made headlines and drew concern from lawmakers.
Reports alleged that the company prioritized launching new features over safety work, and the fallout saw the departures of key employees Jan Leike and Ilya Sutskever.
In response to the criticism, OpenAI said it would remove the non-disparagement clauses that prevented former employees from speaking out. The company also committed to forming a safety and security committee.
OpenAI had previously committed to dedicating 20% of its computing resources to safety research, a promise it had yet to fulfill. Altman has now reaffirmed that commitment and stated that the restrictive terms have been removed for both current and former employees.
OpenAI has also sharply increased its spending on federal lobbying: it spent $800,000 in the first half of 2024, up from $260,000 for all of the previous year.
The news of the partnership comes as the proposed Future of AI Innovation Act advances in the Senate. The bill would make the U.S. AI Safety Institute responsible for setting standards and guidelines for AI safety. The institute, which sits within the Commerce Department, will now work with OpenAI to improve AI safety going forward.
The collaboration marks a meaningful step in addressing AI safety concerns and underscores the commitment of both OpenAI and the U.S. AI Safety Institute to responsible AI development, now and in the future.