GPT-2: 6-month follow-up

We’re releasing the 774 million parameter GPT-2 language model, following the release of our small 124M model in February, the staged release of our medium 355M model in May, and subsequent research with partners and the AI community into the model’s potential for misuse and societal benefit. We’re also releasing an open-source legal agreement to make it easier for organizations to initiate model-sharing partnerships with each other, and are publishing a technical report about our experience in coordinating with the wider AI research community on publication norms.