GPT-3, the largest natural language processing model at the time, was developed by OpenAI and released on 28 May 2020. Since its release, many developers have tried their hand at the GPT-3 model. Among them, Sharif Shameem, founder of debuild, tweeted a post showing how GPT-3 can generate code automatically from a plain-English instruction.
Shameem primed the model with just two samples, which you can see below.
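To make the "two samples" idea concrete: GPT-3 was not retrained here. In few-shot prompting, a handful of example pairs are simply prepended to the prompt, and the model continues the pattern. The sketch below is a hypothetical illustration of how such a prompt might be assembled; the example descriptions and output format are assumptions, not Shameem's actual prompt.

```python
# Minimal sketch of few-shot prompt construction. The model never sees a
# training step; it only sees this text and completes the final "code:" line.

def build_few_shot_prompt(examples, new_instruction):
    """Concatenate description->code pairs, then append the new request."""
    parts = []
    for description, code in examples:
        parts.append(f"description: {description}\ncode: {code}")
    # The trailing "code:" cues the model to emit code for the new description.
    parts.append(f"description: {new_instruction}\ncode:")
    return "\n\n".join(parts)

# Two illustrative samples, in the spirit of Shameem's demo (hypothetical).
examples = [
    ("a button that says subscribe", "<button>Subscribe</button>"),
    ("a red heading that says Hello", '<h1 style="color: red">Hello</h1>'),
]

prompt = build_few_shot_prompt(examples, "a button that says unsubscribe")
print(prompt)
```

The completed prompt would then be sent to the model as ordinary input text, which is why a demo like this needs only two samples rather than a labelled training set.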
However, this doesn’t mean it can write the complete code for your projects. Such short demos play well on social media but are unlikely to help you in practice today. Inevitably, this brings us to the question of whether AI can code. Maybe someday it will, but we are nowhere near being able to rely on AI for writing code.
Currently, OpenAI’s GPT-3 is overhyped, a point made by OpenAI’s own CEO, Sam Altman. “The GPT-3 hype is way too much. It’s impressive (thanks for the nice compliments!) but it still has serious weaknesses and sometimes makes very silly mistakes. AI is going to change the world, but GPT-3 is just a very early glimpse. We have a lot still to figure out,” wrote Altman in a tweet.
The GPT-3 code generator is fascinating, and you may get your hands on it, as Shameem is planning to make it public. Replying to a comment on his post, Shameem also said he believes we could have AI that can code in less than ten years.
The use of natural language is not new; dashboards such as Qlik and Power BI already leverage NLP to pull up reports or insights from data. These reports are generated by converting plain-English text into code and then extracting results from colossal amounts of data. Still, GPT-3’s demo is a notable development in the AI landscape, one that could further the advancement of ML models that write their own code.
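As a rough conceptual analogy for how such dashboards turn a plain-English question into a data operation, here is a deliberately simple keyword-based sketch. Real products like Qlik and Power BI use far richer NLP pipelines; the dataset, function names, and keyword rules below are all illustrative assumptions.

```python
# Toy natural-language-to-query translator: match a column name and an
# aggregation keyword in the question, then run the operation on the data.

import statistics

DATA = {"sales": [120, 90, 150], "returns": [5, 3, 7]}

def answer(question):
    q = question.lower()
    for column, values in DATA.items():
        if column in q:
            if "average" in q or "mean" in q:
                return statistics.mean(values)
            if "total" in q or "sum" in q:
                return sum(values)
            if "max" in q or "highest" in q:
                return max(values)
    raise ValueError("request not understood")

print(answer("What is the average sales?"))   # → 120
print(answer("Show the total returns"))       # → 15
```

The gap GPT-3 hints at closing is exactly the brittleness of rules like these: instead of hand-written keyword matching, a large language model can map far more varied phrasings onto code.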
OpenAI’s GPT-3 paper.