Two Minute Technology: GPT-3
What is GPT-3?

Generative Pre-trained Transformer 3, or GPT-3, is a neural network machine learning model trained on internet data to generate human-like text. Developed by OpenAI, it needs only a small amount of input text to produce large volumes of relevant, sophisticated machine-generated text.

GPT-3's deep learning network has over 175 billion parameters, making it the largest language model ever produced at the time of its release. For scale, Microsoft's Turing NLG model had 17 billion parameters. As a result, GPT-3 outperforms prior language-prediction models: it can generate text convincing enough that a human could have written it.

Below are two example conversations using GPT-3 technology. The first video is a conversation on existential themes between two AIs. The second video is a conversation between a person and an AI.

What can GPT-3 do?

T...