Written by
Renjit Philip

GPT-3: Is the Hype Justified?

Natural Language Processing | GPT-3 | 1 min read, July 30, 2020

I started noticing exciting news about GPT-3 on the Twittersphere two weeks back, when Sharif Shameem figured out a new way to write code. He used an AI system created by the research lab OpenAI (co-founded early on by Elon Musk, who is no longer involved; backed by entrepreneurs like Sam Altman; and funded by Microsoft, Y Combinator, and others).

[Twitter screen grab: the tweet that started it all]

He described to GPT-3, in checklist form, what the AI should do, and it spat out functioning code!
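The interaction described above can be sketched roughly as follows. This is a hypothetical illustration, not the actual prompt Shameem used (which is not reproduced here); the function name, checklist items, and prompt format are all assumptions for the sake of the example.

```python
# Hypothetical sketch of turning a plain-English checklist into a
# single prompt string for a text-completion model like GPT-3.
# The checklist items and layout are invented for illustration.

def build_prompt(description, items):
    """Join a task description and checklist items into one prompt."""
    lines = [description, ""]
    lines += [f"- {item}" for item in items]
    lines += ["", "Code:"]
    return "\n".join(lines)

prompt = build_prompt(
    "Generate a web page component that does the following:",
    ["a button labelled 'Add'", "a counter that increments on click"],
)

# The prompt would then be sent to the model; with the OpenAI Python
# client of that era, the call looked roughly like:
#   openai.Completion.create(engine="davinci", prompt=prompt, max_tokens=256)
# and the returned text would (hopefully) be working code.
print(prompt)
```

The point of the checklist framing is that the model sees a structured description followed by a cue ("Code:") and continues the text from there; it is pattern completion, not a compiler.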

Arguably, the most discussed achievements of the model are summarization, article writing, creative fiction, business memos, functioning code, and translation.

GPT-3 has been fed billions of web pages, including coding tutorials, and can respond to commands by generating poems, memes, and articles, all of pretty decent quality. Some would say it is human-like.

It would be interesting to test it out in an insurance context: have it ingest complex policy wordings and spit out easy-to-understand, customer-friendly terms and conditions! Come to think of it, it would also be fun to run it on Facebook's privacy policy!

Now, before you declare the end of programmers and journalists, be aware that the model does not understand what it is writing. GPT-3 still requires a human to input commands. It can improve, however, and who knows what later versions will do.

This is not the birth of general-purpose AI, because ultimately GPT-3 is a correlative tool that does not understand the language it creates. Yet!


GPT-3 is the latest in a series of text-generating neural networks. The name GPT stands for Generative Pre-trained Transformer.

Interesting Tweets: