Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts.
Project description
Download the latest version of Python from the official Python website and install it. Once the installation completes, check which version of pip is running on your system. To do so, go to the command prompt and type:

$ pip3 --version

Since you have installed the latest version of Python, that is, Python 3.x, you have pip3, not pip.
A simple Python package that wraps existing model fine-tuning and generation scripts for OpenAI's GPT-2 text generation model (specifically the 'small' 124M-parameter version). Additionally, this package makes it easier to generate text: it can write output to a file for easy curation, and it accepts prefixes to force the text to start with a given phrase.
Usage
An example of downloading the model to the local system, fine-tuning it on a dataset, and generating some text:
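The example code itself did not survive on this page; a minimal sketch using the package's documented calls (`download_gpt2`, `start_tf_sess`, `finetune`, `generate`), with the dataset filename `shakespeare.txt` as a placeholder, might look like this (wrapped in a function so the multi-hundred-MB download only runs when called):

```python
def finetune_and_generate(dataset="shakespeare.txt", steps=100):
    """Download GPT-2 'small', fine-tune it on `dataset`, and print a sample."""
    import os
    import gpt_2_simple as gpt2  # imported lazily: pulls in TensorFlow

    if not os.path.isdir(os.path.join("models", "124M")):
        gpt2.download_gpt2(model_name="124M")  # saves the ~500 MB model to ./models/124M/

    sess = gpt2.start_tf_sess()
    gpt2.finetune(sess, dataset, model_name="124M", steps=steps)  # checkpoints -> ./checkpoint/run1
    gpt2.generate(sess)  # prints generated text to stdout
```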
Warning: the pretrained model, and thus any finetuned model, is 500 MB!
The generated model checkpoints are by default in /checkpoint/run1. If you want to load a model from that folder and generate text from it:
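The loading snippet was stripped from this page; a hedged sketch using the same documented calls:

```python
def generate_from_run1(prefix=None):
    """Reload fine-tuned weights from ./checkpoint/run1 and print a sample."""
    import gpt_2_simple as gpt2  # lazy import: requires TensorFlow

    sess = gpt2.start_tf_sess()
    gpt2.load_gpt2(sess)  # reads checkpoint/run1 by default
    gpt2.generate(sess, prefix=prefix)  # an optional prefix forces the opening phrase
```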
As with textgenrnn, you can generate and save text for later use (e.g. an API or a bot) by using the return_as_list parameter.
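A short sketch of that parameter in use (the sample count is a placeholder):

```python
def generate_texts(n=3):
    """Return `n` generated samples as a list of strings instead of printing them."""
    import gpt_2_simple as gpt2  # lazy import: requires TensorFlow

    sess = gpt2.start_tf_sess()
    gpt2.load_gpt2(sess)
    return gpt2.generate(sess, nsamples=n, return_as_list=True)
```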
You can pass a run_name parameter to finetune and load_gpt2 if you want to store/load multiple models in a checkpoint folder.
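For example, a hedged sketch of keeping a second model under checkpoint/run2 (the run name and dataset are placeholders):

```python
def finetune_named_run(dataset, run_name="run2"):
    """Fine-tune into checkpoint/<run_name> so several models can coexist."""
    import gpt_2_simple as gpt2  # lazy import: requires TensorFlow

    sess = gpt2.start_tf_sess()
    gpt2.finetune(sess, dataset, model_name="124M", run_name=run_name)

def load_named_run(run_name="run2"):
    """In a fresh Python session, load that specific model."""
    import gpt_2_simple as gpt2

    sess = gpt2.start_tf_sess()
    gpt2.load_gpt2(sess, run_name=run_name)
    return sess
```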
NB: Restart the Python session first if you want to finetune on another dataset or load another model.
Release history
0.7.1
0.7
0.6
0.5.4
0.5.3
0.5.2
0.5.1
0.5
0.4.2
0.4.1
0.4
0.3.1
0.3
0.2
0.1
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
| Filename, size | File type | Python version | Upload date | Hashes |
|---|---|---|---|---|
| gpt_2_simple-0.7.1.tar.gz (24.9 kB) | Source | None | | |
Hashes for gpt_2_simple-0.7.1.tar.gz
| Algorithm | Hash digest |
|---|---|
| SHA256 | 289ba08114e90c01e0975be8ed316fec1e4f607f48624b0fce227e8b6983ba17 |
| MD5 | 09c6d6768933acfc666c67fa2b6a2594 |
| BLAKE2-256 | 6fe4a90add0c3328eed38a46c3ed137f2363b5d6a07bf13ee5d5d4d1e480b8c3 |
Update June 5th, 2020: OpenAI has announced a successor to GPT-2 in a newly published paper. Check out our GPT-3 model overview.
OpenAI recently published a blog post on their GPT-2 language model. This tutorial shows you how to run the text generator code yourself. As stated in their blog post:
[GPT-2 is an] unsupervised language model which generates coherent paragraphs of text, achieves state-of-the-art performance on many language modeling benchmarks, and performs rudimentary reading comprehension, machine translation, question answering, and summarization—all without task-specific training.
#1: Install system-wide dependencies
Ensure your system has the following packages installed:
- CUDA
- cuDNN
- NVIDIA graphics drivers
To install all this in one line, you can use Lambda Stack.
#2: Install Python dependencies & GPT-2 code
#3: Run the model
Conditionally generated samples from the paper use top-k random sampling with k = 40. You'll want to set k = 40 in interactive_conditional_samples.py. Either edit the file manually or use this command:
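The one-line command itself was not preserved on this page. A hedged Python equivalent (assuming the script's sampler defaults to `top_k=0`, as in OpenAI's released code, and the repo's `src/` layout) rewrites that default in place:

```python
from pathlib import Path

def set_top_k(path="src/interactive_conditional_samples.py", k=40):
    """Replace the sampler's `top_k=0` default with `top_k=<k>` in the script."""
    p = Path(path)
    p.write_text(p.read_text().replace("top_k=0", f"top_k={k}"))
```

Run `set_top_k()` from the root of the cloned gpt-2 repository.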
Now, you're ready to run the model!
Update: OpenAI has now released their larger 345M model. You can change 345M above to 117M to download the smaller one. Here's the 117M model's attempt at writing the rest of this article based on the first paragraph:
Now here's the 345M model's attempt at writing the rest of this article. Let's see the difference. Again, this is just one sample from the network, but the larger model definitely produces a more accurate-sounding code tutorial.
It at least seems to realize that open-ai should appear in the github URLs we're cloning. Looks like we'll be keeping our jobs for a while longer :).
Further Reading
- Read more about OpenAI's language model in their blog post Better Language Models and Their Implications.
- Read more about top-k random sampling in section 5.4 of Hierarchical Neural Story Generation.