
Meta launches Large Language Models for AI Researchers

Meta is making OPT-175B available to the public, along with the codebase needed to train and deploy the model using only 16 NVIDIA V100 data center GPUs.

Meta, formerly known as Facebook, has released large language models to help artificial intelligence (AI) researchers.

With this new development, researchers can now access OPT-175B, a language model with 175 billion parameters trained on publicly available data sets.

Large language models, according to Meta, are natural language processing (NLP) systems with hundreds of billions of parameters that have revolutionized NLP and AI research in recent years.


After being trained on a large and varied volume of text, these language models demonstrate a remarkable ability to write creative text, solve simple math problems, answer reading comprehension questions, and more.

Meta is releasing OPT-175B together with the codebase used to train and deploy the model, which runs on only 16 NVIDIA V100 data center GPUs. Full research access to large language models of this scale has so far been limited to a “few well-resourced labs,” the company says.

Meta mentioned in a blog, “We believe the entire AI community — academic researchers, civil society, policymakers, and industry — must work together to develop clear guidelines around responsible AI in general and responsible large language models in particular, given their centrality in many downstream language applications.”

The blog further stated that Meta believes releasing OPT-175B and smaller-scale baselines will broaden the range of voices defining the ethical implications of such technologies.

Moreover, Meta designed OPT-175B with energy efficiency in mind, training a model of this scale with only one-seventh the carbon footprint of GPT-3. The company also released a set of smaller-scale baseline models, trained on the same data set with similar settings as OPT-175B, allowing researchers to investigate the influence of model size on its own.
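For a sense of what experimenting with these baselines looks like in practice, the minimal sketch below loads one of the smaller OPT checkpoints and generates a short text continuation. It assumes access through the Hugging Face Transformers library and the "facebook/opt-125m" checkpoint name; neither is mentioned in the article, so treat this as one common route rather than Meta's official workflow.

# Minimal sketch: load one of the smaller OPT baselines and generate text.
# Assumes the Hugging Face Transformers library; "facebook/opt-125m" is the
# smallest publicly hosted OPT checkpoint (swap in opt-1.3b, opt-6.7b, etc.).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-125m"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Large language models can"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation from the model.
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

The same few lines work unchanged with the larger baselines in the family, which is what makes the kind of size comparison the article describes straightforward for researchers to run.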

Interested individuals can request access to OPT-175B through a request form.


Dipayan Mitra
Dipayan is a news-savvy writer who does not leave a single page of a newspaper unturned. He is also a professional vocalist who enjoys ghazals. Building a dog shelter is his forever dream.
