Meta Gives Away a Massive Language Model

Meta is giving away one of its family jewels. That was the gist of an announcement this week from the company: in a post on the Meta AI site, its researchers said they have built an extremely powerful language model, OPT, and are making it available free of charge to everyone in the artificial intelligence research community.

Some argue that not many researchers will actually benefit from this largesse, which Meta describes as an effort to democratize access to AI. And even as these models become more accessible to researchers, the path to commercializing them remains unclear.

Modeled on OpenAI's GPT-3 neural network, Meta's AI lab has built a new language model that has both remarkable capabilities and harmful flaws. In an unusual move for Big Tech, the company is giving it away to researchers free of charge, together with details of how it was built and trained.

"We strongly believe that having your work evaluated by others is a crucial component of research. The Meta AI team is open to collaboration," says Joelle Pineau, a longtime proponent of transparency in the development of technology.

Getting more people to work on these models should make it easier to fix their flaws. Language models, however, require vast amounts of data and compute power to train, so they have mostly been the domain of rich tech firms. Those concerned about their misuse, including ethicists and social scientists, have watched from the sidelines.

According to Meta AI, this needs to change. "We're all former university researchers," Pineau says. Universities are at a significant disadvantage to industry when it comes to the ability to build these models. "It was a no-brainer to make this one available to researchers." She hopes that others will pore over the paper and pull the model apart or build upon it. Breakthroughs come faster when more researchers are involved, she says.

With 175 billion parameters, it's as powerful as OpenAI's GPT-3

OPT has the same number of parameters as GPT-3: 175 billion (parameters are the values in a neural network that are adjusted during training). Pineau explains that this is by design: the team built OPT to match GPT-3 in both accuracy and toxicity on language tasks. OpenAI has made GPT-3 available through a paid service, but the model itself and its code have not been shared. The idea, Pineau says, was to give researchers a similar language model to study.
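
For researchers, making the model available means the weights and code can be downloaded and run directly rather than queried through a paid API. As a rough illustration (not part of Meta's announcement), the sketch below loads one of the smaller publicly released OPT checkpoints with the Hugging Face Transformers library and generates text from it; the full 175-billion-parameter model is gated behind a research access request and needs far more hardware.

    # Illustrative only: load a small OPT checkpoint and generate text.
    # "facebook/opt-125m" is one of the smaller released variants, used
    # here as a stand-in for the much larger 175B model.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")
    model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

    prompt = "Large language models are"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))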
