Yes, AI Can Generate Code | A Look Under The Hood Of GPT-3

September 11, 2020


Ever dreamed of a world where AI generates code for you? Researchers have long debated the feasibility of a future where AI does the heavy lifting, but with the launch of OpenAI’s latest natural language processing model, GPT-3, speculation may have turned into reality. Will this new autoregressive model blaze the trail to an era of machine-dominated coding?
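For context on what “autoregressive” means here: the model generates text one token at a time, with each prediction conditioned on everything produced so far. GPT-3 itself is only reachable through OpenAI’s private beta API, but its openly available predecessor GPT-2 works the same way. A minimal sketch of the decoding loop, assuming the Hugging Face `transformers` library (which the article does not mention), illustrates the idea:

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# GPT-2 stands in for GPT-3 here; both are autoregressive:
# each new token is predicted from all the tokens before it.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

ids = tokenizer.encode("def add(a, b):", return_tensors="pt")
for _ in range(20):
    logits = model(input_ids=ids).logits[:, -1, :]   # scores for the next token only
    next_id = logits.argmax(dim=-1, keepdim=True)    # greedy decoding: most likely token
    ids = torch.cat([ids, next_id], dim=-1)          # append and repeat

print(tokenizer.decode(ids[0]))
```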

Let’s Talk Shop

Prior to the release of GPT-3, Microsoft's Turing NLG held the record for the world’s biggest deep-learning language model, with 17 billion parameters. GPT-3 has since eclipsed Turing NLG, sporting 175 billion parameters. Apart from computing power, data also plays an integral role in GPT-3’s writing capabilities: the model learned to produce text by analysing 45 terabytes of data, resulting in a costly training process for OpenAI. The millions of dollars spent on cloud computing to train the algorithm have, however, long been an integral part of OpenAI’s strategy. With this focus on data and computing power, OpenAI has even been able to win Dota 2 matches against professional esports players.
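To make those parameter counts concrete, a back-of-envelope calculation (illustrative, not from the article) shows why even just serving such a model demands serious hardware: storing the weights alone in 16-bit floating point takes hundreds of gigabytes.

```python
# Rough memory needed just to hold the weights, assuming 16-bit floats.
# Illustrative figures only; real serving overhead is higher.
params = {"GPT-3": 175e9, "Turing NLG": 17e9}
bytes_per_param = 2  # fp16

for name, n in params.items():
    print(f"{name}: {n * bytes_per_param / 1e9:.0f} GB")
# GPT-3: ~350 GB, Turing NLG: ~34 GB
```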

Early Adopters

OpenAI first described GPT-3 in a research paper released in May 2020, and has since officially launched the NLP model, drip-feeding beta access to its API to numerous users across the globe. The model has received an overwhelmingly positive response across social media channels, with one San Francisco–based developer, Arram Sabeti, even tweeting that “playing with GPT-3 feels like seeing the future”. The private beta has also spurred numerous demos, some of which have already developed into useful soon-to-be-released products. One noteworthy demo came from Sharif Shameem, the founder of Debuild, who built a layout generator on top of GPT-3 that lets users describe the layout they need and simply sit back while the generator produces the corresponding JSX code.
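For a sense of how such a demo plugs into GPT-3, here is a minimal sketch against the beta-era `openai` Python client. The prompt and parameters are hypothetical stand-ins; Debuild’s actual prompts are not public.

```python
import openai  # the Python client offered to GPT-3 API beta users

openai.api_key = "YOUR_API_KEY"

# Hypothetical one-shot prompt in the spirit of Shameem's layout generator;
# the model completes the pattern by emitting JSX for the new description.
prompt = (
    "Description: a large heading that says Welcome\n"
    "JSX: <h1>Welcome</h1>\n"
    "Description: a Subscribe button next to an email input field\n"
    "JSX:"
)

response = openai.Completion.create(
    engine="davinci",        # the GPT-3 base engine exposed in the beta
    prompt=prompt,
    max_tokens=150,
    temperature=0.2,         # low temperature keeps code output conservative
    stop=["Description:"],   # stop before the model invents another example
)

print(response.choices[0].text.strip())
```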

OpenAI’s latest NLP model has also been leveraged to simplify the process of writing code for machine learning models. Matt Shumer, the co-founder and CEO of OthersideAI, used GPT-3 to build a demo that generates code for ML models based purely on a description of the dataset and the required output.
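The same completion pattern extends naturally to Shumer’s demo: the prompt carries the dataset description and the desired output, and GPT-3 completes it with model code. A hypothetical prompt along those lines (OthersideAI’s actual prompts are not public) might look like this, reusing the `Completion.create` call from the previous sketch:

```python
# Hypothetical prompt in the spirit of Shumer's ML-code demo.
dataset_description = (
    "Columns: square_footage (float), bedrooms (int), price (float)."
)
required_output = "a Keras regression model that predicts price"

prompt = (
    f"Dataset: {dataset_description}\n"
    f"Task: write Python code for {required_output}.\n"
    "Code:\n"
)
# Feed `prompt` to openai.Completion.create exactly as in the sketch above.
```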

Shortcomings

Despite overwhelmingly positive feedback for GPT-3 on social media, the language model still has numerous limitations, with OpenAI CEO Sam Altman tweeting: “The GPT-3 hype is way too much. AI is going to change the world, but GPT-3 is just a very early glimpse.” Most of the model’s limitations centre on bias in its generated text and a lack of semantic understanding. GPT-3’s text generation has also been labelled unsafe by Jerome Pesenti, the head of AI at Facebook, after he prompted the model to write tweets based on specific words and it produced offensive output.

Final Thoughts

There is no denying the capabilities and possibilities of GPT-3. As the largest language model to date, it continues to inspire awe with its on-demand, human-like text. As AI, OpenML and No-Code continue to evolve, we can foresee a future filled with purely machine-generated code. However, data-driven ML approaches do not give AI an understanding of natural language and, as such, one fact remains clear: GPT-3 will not be pushing humanity closer towards True Intelligence anytime soon.

Jacques Fourie

Hendri Lategan, CEO
