Last year, San Francisco-based AI lab OpenAI released GPT-3, its latest attempt at a program that writes like humans.
To do this, the program relies on a neural network with 175 billion parameters. For context, all of the English Wikipedia is estimated to make up just 0.6% of its training data.
Now, OpenAI is looking to bring in the dough.
The lab was founded as a nonprofit in 2015, but created a for-profit offshoot in 2019 to help drive funding, including a...