I first noticed exciting news about GPT-3 two weeks back on the Twitter-sphere, when Sharif Shameem figured out a new way to write code. He used an AI system created by the research lab OpenAI (co-founded by Elon Musk, who is no longer involved, supported by entrepreneurs like Sam Altman, and funded by Microsoft, Y Combinator, and others).
He described to GPT-3 what the program should do, in checklist form, and the AI spat out functioning code!
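To make that concrete, a checklist-style prompt could be assembled like the sketch below. The prompt wording, the helper function, and the commented-out API call are illustrative assumptions on my part, not Shameem's actual setup.

```python
# Hypothetical sketch of prompting GPT-3 with a checklist-style description.
# The prompt format and the model call are illustrative assumptions,
# not Sharif Shameem's actual workflow.

def build_checklist_prompt(description, checklist):
    """Turn a plain-English description plus a checklist into one prompt string."""
    lines = [f"Write HTML and JavaScript for: {description}", "Requirements:"]
    lines += [f"- {item}" for item in checklist]
    lines.append("Code:")
    return "\n".join(lines)

prompt = build_checklist_prompt(
    "a to-do list app",
    ["a text input for new tasks", "an Add button", "tasks shown as a list"],
)
print(prompt)

# Sending this to GPT-3 would require API access, roughly along these lines
# (2020-era OpenAI completions API, sketched from memory):
# import openai
# completion = openai.Completion.create(engine="davinci", prompt=prompt, max_tokens=256)
```

The point is only that the "program" is ordinary English: the human writes requirements, and the model fills in the code.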
Arguably, the model's most discussed achievements are summarization, article writing, creative fiction, business memos, functioning code, and translation.
GPT-3 has been fed billions of web pages, including coding tutorials, and can respond to a command by generating poems, memes, and articles, all of pretty decent quality. Some would say it is human-like.
Now, before you declare this the end of programmers and news journalists, be aware that the model does not understand what it is writing. GPT-3 still requires a human to supply the commands. It can improve, however, and who knows what later versions will do.
This is not the birth of general-purpose AI, because ultimately it is a correlative tool that does not understand the language it creates. Yet!
GPT-3 is the latest in a series of text-generating neural networks. The name GPT stands for Generative Pre-trained Transformer.