These Tech Giants Have Made Commitments To The US Government To Develop AI Carefully

The Biden administration and seven leading artificial intelligence (AI) companies, including Google, OpenAI, and Meta, have agreed to implement new safeguards to manage the risks associated with AI.

Under the agreement, the companies will conduct security testing of their AI systems and make the findings public. The firms are Amazon, Anthropic, Google, Inflection, Meta, Microsoft, and OpenAI.

“These pledges are sincere and specific. The lives of individuals all around the globe will alter as a result of AI. The individuals in this room will play a crucial role in guiding that innovation responsibly and safely by design,” Biden said at the White House late on Friday.

“AI should be advantageous to society as a whole. In order for it to occur, these potent new technologies must be created and used properly,” said Nick Clegg, Meta’s president of global affairs.

“As we develop new AI models, tech companies should be transparent about how their systems work and closely collaborate across industry, government, academia, and civil society,” he said.

The tech giants have agreed that internal and external experts will evaluate their AI systems for security risks before they are released.

The companies will also use watermarks to help people recognize AI-generated content and will regularly disclose their systems’ capabilities and limitations.

The companies will also investigate risks such as bias, discrimination, and invasion of privacy.

“We need to do this right because we have a big responsibility. Additionally, there is a huge, huge potential upside,” Biden said.

According to OpenAI, the watermarking commitments would require the companies to “develop tools or APIs to determine if a particular piece of content was created with their system.”

Google pledged to implement comparable disclosures earlier this year.

In contrast to OpenAI’s closed GPT-4, Meta announced earlier this week that it will open-source its large language model Llama 2.
