First, GPT-3 is big. With 175 billion parameters, it's the largest language model ever created (an order of magnitude larger than its closest competitor!), and was trained on the largest dataset of any language model. This, it appears, is the main reason GPT-3 is so impressively "smart" and human-sounding.

But here's the really magical part. As a result of its humongous size, GPT-3 can do what no other model can do (well): perform specific tasks without any special tuning. You can ask GPT-3 to be a translator, a programmer, a poet, or a famous author, and it can do it with its user (you) providing fewer than 10 training examples.
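To make that concrete, here's a rough sketch of what "fewer than 10 training examples" looks like in practice: you simply write the examples into the prompt itself. The call shape below reflects the early-beta OpenAI Python client, and the engine name and parameters are assumptions, not confirmed details from this article.

```python
# A minimal few-shot prompting sketch against the OpenAI completion API.
# GPT-3 was in private beta at publication time, so treat the engine
# name and call details as assumptions.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# Three examples, supplied directly in the prompt -- no gradient
# updates, no fine-tuning dataset.
prompt = (
    "English: Where is the library?\nFrench: Où est la bibliothèque ?\n"
    "English: I would like a coffee.\nFrench: Je voudrais un café.\n"
    "English: See you tomorrow.\nFrench: "
)

response = openai.Completion.create(
    engine="davinci",   # assumed engine name from the beta
    prompt=prompt,
    max_tokens=30,
    temperature=0.3,
    stop="\n",          # stop at the end of the translated line
)
print(response.choices[0].text.strip())
```

The entire "training set" lives in the prompt: swap the example pairs and the same model becomes a summarizer, a poet, or anything else you can demonstrate in a few lines.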

This is what makes GPT-3 so exciting to machine learning practitioners. Other language models (like BERT) require an elaborate fine-tuning step where you gather thousands of examples of (say) French-English sentence pairs to teach it how to do translation. To adapt BERT to a specific task (like translation, summarization, spam detection, etc.), you have to go out and find a large training dataset (on the order of thousands or tens of thousands of examples), which can be cumbersome or sometimes impossible, depending on the task. With GPT-3, you don't need to do that fine-tuning step. This is the heart of it. This is what gets people excited about GPT-3: custom language tasks without training data.
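For contrast, here is a rough sketch of the fine-tuning step GPT-3 skips, using the Hugging Face transformers library. The library names are real; the two-example dataset is a hypothetical stand-in for the thousands of labeled examples you'd actually need to collect.

```python
# A minimal sketch of fine-tuning BERT for spam detection -- the
# "elaborate fine-tuning step" described above that GPT-3 avoids.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased")

# In reality you'd need thousands of labeled examples like these.
train_texts = ["win a free phone now", "meeting moved to 3pm"]
train_labels = [1, 0]  # 1 = spam, 0 = not spam

inputs = tokenizer(train_texts, padding=True, truncation=True,
                   return_tensors="pt")
labels = torch.tensor(train_labels)

# Gradient-update training loop: this is the step GPT-3 replaces with
# a handful of examples written directly into the prompt.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few epochs over the (real) dataset
    optimizer.zero_grad()
    loss = model(**inputs, labels=labels).loss
    loss.backward()
    optimizer.step()
```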

Today, GPT-3 is in private beta, but boy, I can't wait to get my hands on it.

Published July 23, 2020 — 11:56 UTC
