GPT-3 (Generative Pre-trained Transformer 3), released by OpenAI in 2020, is a neural network language model based on the transformer architecture that can generate large amounts of text from short human prompts. Developed under a commercial model, GPT-3 has over 175 billion parameters and was trained on an extremely large corpus of text collected from the internet; it can be used to generate prose, programming code, and many other forms of text output.
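As a minimal sketch of prompt-driven generation, the snippet below uses the legacy (pre-1.0) openai Python SDK of the GPT-3 era; the API key, engine name, and prompt are illustrative placeholders, not values prescribed by this text.

```python
import openai

# Placeholder credential; a real key is obtained from an OpenAI account.
openai.api_key = "YOUR_API_KEY"

# Send a short human prompt; the model returns a longer text completion.
response = openai.Completion.create(
    engine="davinci",          # a GPT-3 engine name from that era
    prompt="Write a short poem about the sea.",
    max_tokens=100,            # upper bound on generated length
    temperature=0.7,           # higher values yield more varied output
)

print(response.choices[0].text)
```

The same pattern applies whether the prompt asks for prose, code, or any other text: the model continues from whatever the prompt establishes.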