# Generative pre-trained transformer

| ![img \|150](https://upload.wikimedia.org/wikipedia/commons/thumb/5/51/Full_GPT_architecture.svg/320px-Full_GPT_architecture.svg.png) | **Generative pre-trained transformers** (GPT) are a type of large language model (LLM) and a prominent framework for generative artificial intelligence. They are artificial neural networks used in natural language processing tasks. GPTs are based on the transformer architecture, pre-trained on large data sets of unlabelled text, and able to generate novel human-like content. As of 2023, most LLMs have these characteristics and are sometimes referred to broadly as GPTs. |
|-|-|
| | wikipedia:: [Generative pre-trained transformer](https://en.wikipedia.org/wiki/Generative_pre-trained_transformer) |
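The defining mechanism of the transformer architecture mentioned above is causal (decoder-only) self-attention: each token position can attend only to itself and earlier positions, which is what lets the model be trained to predict the next token on unlabelled text. A minimal single-head sketch in NumPy, with toy sizes and random weight matrices chosen purely for illustration (real GPT models use many stacked multi-head layers):

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention over a (T, d) sequence.
    Positions above the diagonal are masked so no token can see
    the future -- the GPT-style decoder-only constraint."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                 # (T, T) attention logits
    future = np.triu(np.ones_like(scores), k=1)   # 1s strictly above diagonal
    scores = np.where(future == 1, -1e9, scores)  # block future positions
    # softmax over each row (numerically stabilised)
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v                                  # (T, d) mixed values

# Toy example: sequence length 4, model width 8 (assumed sizes)
rng = np.random.default_rng(0)
T, d = 4, 8
x = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Because of the mask, the first output row depends only on the first input token, so generation can proceed left to right one token at a time.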