GPT-3 - Wikipedia
https://en.wikipedia.org/wiki/GPT-3
Generative Pre-trained Transformer 3 (GPT-3) is a large language model released by OpenAI in 2020. Like its predecessor, GPT-2, it is a decoder-only transformer model (a deep neural network) that supersedes recurrence- and convolution-based architectures with a technique known as "attention". The attention mechanism allows the model to selectively focus on the segments of input text it predicts to be most relevant. It uses a 2048-token-long context window and float16 (16-bit) precision…
Initial release: June 11, 2020 (beta)
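
The "attention" mechanism described above can be illustrated with a short, self-contained sketch of scaled dot-product self-attention with a causal mask, the form used in decoder-only transformers. The NumPy code below is a simplified illustration under stated assumptions, not OpenAI's implementation; the token count, embedding size, and use of float16 inputs are illustrative choices.

import numpy as np

def scaled_dot_product_attention(Q, K, V, causal=True):
    """Minimal self-attention sketch: each position attends to (earlier) positions,
    weighted by the softmax of scaled query-key dot products."""
    d_k = Q.shape[-1]
    # Similarity scores between every query and every key, scaled by sqrt(d_k).
    scores = Q @ K.T / np.sqrt(d_k)          # shape: (seq_len, seq_len)
    if causal:
        # Decoder-only models mask out future positions so each token
        # can only attend to itself and the tokens before it.
        seq_len = scores.shape[0]
        mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
        scores = np.where(mask, -np.inf, scores)
    # Softmax over the keys turns scores into attention weights that sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output position is a weighted average of the value vectors.
    return weights @ V

# Toy example: 4 tokens with 8-dimensional embeddings (sizes are illustrative only).
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8)).astype(np.float16)   # float16, echoing the precision noted above
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)

In this sketch the same matrix is used for queries, keys, and values; a real transformer layer first projects the input through separate learned weight matrices and uses multiple attention heads in parallel.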