Talktotransformer - An AI That Writes Text

https://talktotransformer.com

http://scp-wiki-cn.wikidot.com/forum/t-11629190

The tabs contain some interesting story fragments I generated; most were machine-translated and then manually touched up, though some are fully hand-translated. If you'd like, you're welcome to add new fragments.

About

Built by Adam King (@AdamDanielKing) as an easier way to play with OpenAI's new machine learning model. In February 2019, OpenAI unveiled a language model called GPT-2 that generates coherent paragraphs of text one word at a time.

This site runs the full-sized GPT-2 model, called 1558M. Before November 5, 2019, OpenAI had only released three smaller, less coherent versions of the model.

While GPT-2 was only trained to predict the next word in a text, it surprisingly learned basic competence in some tasks like translating between languages and answering questions. That's without ever being told that it would be evaluated on those tasks. To learn more, read OpenAI's blog post or follow me on Twitter.
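For the curious, here is a minimal sketch of that word-by-word generation loop using Hugging Face's transformers library. The model name ("gpt2"; the full 1558M variant is "gpt2-xl"), the prompt, and the sampling settings are illustrative assumptions, not the site's actual serving code:

```python
# A minimal sketch of next-word generation with GPT-2, using the
# Hugging Face transformers library (not talktotransformer's real code).
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")  # "gpt2-xl" = the 1558M model
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "SCP-173 is"  # illustrative prompt
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Generate one token at a time: each step samples a next token from the
# model's predicted distribution and appends it to the running text.
with torch.no_grad():
    output = model.generate(
        input_ids,
        max_length=input_ids.shape[1] + 40,
        do_sample=True,                   # sample instead of greedy argmax
        top_k=40,                         # restrict to the 40 most likely tokens
        pad_token_id=tokenizer.eos_token_id,
    )

print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Because do_sample=True draws from the predicted distribution rather than always taking the single most likely token, repeated runs on the same prompt produce different continuations, which is why the site can generate a fresh story each time.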

Acknowledgements

Thanks to Hugging Face for their PyTorch implementation of GPT-2, which I modified to handle batching queries of mixed lengths.
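As an illustration of what that kind of modification involves, here is a hedged sketch of the standard padding-plus-attention-mask approach to batching prompts of different lengths. It shows the general technique, not Adam King's actual change; the model name and prompts are placeholders:

```python
# A sketch of one common way to batch prompts of mixed lengths:
# pad every prompt to the longest one and pass an attention mask
# so the model ignores the padding positions.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token   # GPT-2 has no pad token by default
tokenizer.padding_side = "left"             # left-pad so generation continues from real text
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompts = [                                  # two queries of different lengths
    "The door opened and",
    "Deep in the facility, researchers found a",
]
batch = tokenizer(prompts, return_tensors="pt", padding=True)

with torch.no_grad():
    outputs = model.generate(
        batch["input_ids"],
        attention_mask=batch["attention_mask"],  # masks out padding positions
        max_new_tokens=30,
        do_sample=True,
        pad_token_id=tokenizer.eos_token_id,
    )

for text in tokenizer.batch_decode(outputs, skip_special_tokens=True):
    print(text)
```

Batching like this lets one GPU forward pass serve several users' queries at once, which matters for a public demo like this site.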
