
gpt2 model

🤗 Hugging Face gpt2

gpt2 is a Natural Language Processing (NLP) model implemented in the Transformers library, generally used from the Python programming language.

What is the gpt2 model?

GPT-2 is a transformer model pretrained on a very large corpus of English data in a self-supervised fashion. It was trained to guess the next word in sentences using a causal language modeling objective. In doing so, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks. You can use the raw model for text generation or fine-tune it for a downstream task; see the model hub to look for fine-tuned versions on a task that interests you. The model is best at what it was pretrained for, however, which is generating text from a prompt. The team releasing GPT-2 also wrote a model card for their model.
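To make the causal language modeling objective concrete, here is a minimal sketch of next-word prediction with GPT-2. It assumes the transformers and torch packages are installed; the prompt string is just an illustrative example, not part of the model card.

# Minimal sketch: ask GPT-2 for the most likely next token after a prompt
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Encode an illustrative prompt and run a forward pass
inputs = tokenizer("The Eiffel Tower is located in", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# The logits at the last position are the model's distribution over
# the next token; take the most likely one (greedy choice)
next_token_id = int(logits[0, -1].argmax())
print(tokenizer.decode(next_token_id))  # e.g. " Paris"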

Fine-tune gpt2 models

Metatext is a powerful no-code tool for training, tuning, and integrating custom NLP models

➡️  Learn more

Model usage

You can find the gpt2 model in the Transformers Python library. To download and use any of the pretrained models for a given task, you only need a few lines of code (PyTorch version shown here). Here is an example, starting with installation via pip (a package installer for Python).

Download and install using pip

$ pip install transformers

Usage in Python

# Import generic wrappers
from transformers import AutoModel, AutoTokenizer 


# Define the model repo
model_name = "gpt2" 


# Download the PyTorch model and matching tokenizer
model = AutoModel.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)


# Tokenize the input text (returns PyTorch tensors)
inputs = tokenizer("Hello world!", return_tensors="pt")

# Run the model (forward pass)
outputs = model(**inputs)
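
Note that AutoModel loads the bare GPT-2 transformer without a language modeling head, so outputs above contains hidden states you can use as features for downstream tasks. For text generation, which is what GPT-2 was pretrained for, you would load the model with its language modeling head instead. A minimal sketch, assuming the same transformers install as above:

# Minimal sketch: text generation with the language modeling head
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
lm_model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Hello world!", return_tensors="pt")

# Generate up to 30 new tokens (greedy decoding by default);
# gpt2 has no pad token, so reuse the end-of-text token for padding
output_ids = lm_model.generate(
    **inputs,
    max_new_tokens=30,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))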

More info about gpt2

See the paper for more details and download links



Metatext helps you classify and extract information from text and documents using language models customized with your data and expertise.