Generative Pre-trained Transformer (GPT)

GPT is a type of large language model (LLM) designed to generate human-like text and, increasingly, code, images, and scientific insights. Its name describes its three defining traits:

  • Generative: it creates new content rather than merely classifying or predicting.
  • Pre-trained: it is trained in advance on massive amounts of data, then adapted for specific tasks.
  • Transformer: it uses the transformer neural-network architecture, which excels at capturing context and relationships in sequences.

GPT is therefore one form of generative AI (GenAI).
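
As a minimal sketch of these three ideas in practice, the snippet below loads a publicly available pre-trained GPT-style model and generates a short continuation of a prompt. It assumes the Hugging Face transformers library and the "gpt2" checkpoint, neither of which is prescribed above; they simply stand in for any GPT-family model.

    # Minimal sketch: reuse a pre-trained GPT-style model to generate text.
    # Assumes the Hugging Face `transformers` library and the public "gpt2"
    # checkpoint, used here purely for illustration.
    from transformers import pipeline

    # "Pre-trained": the weights were already learned on large text corpora,
    # so we only download and reuse them here.
    generator = pipeline("text-generation", model="gpt2")

    # "Generative": given a prompt, the model produces new text, not a label.
    prompt = "A transformer is a neural network architecture that"
    result = generator(prompt, max_new_tokens=40, do_sample=True)

    print(result[0]["generated_text"])

Running this prints the prompt followed by newly generated text; because sampling is enabled, the continuation differs from run to run.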