  1. Transformers — transformers 3.0.2 documentation - Hugging Face

    🤗 Transformers (formerly known as pytorch-transformers and pytorch-pretrained-bert) provides general-purpose architectures (BERT, GPT-2, RoBERTa, XLM, DistilBert, XLNet…) for Natural Language Understanding (NLU) and Natural Language Generation (NLG) with 32+ pretrained models in 100+ languages and deep interoperability between ...

  2. Transformers - Hugging Face

    🤗 Transformers State-of-the-art Machine Learning for PyTorch, TensorFlow and JAX. 🤗 Transformers provides APIs to easily download and train state-of-the-art pretrained models. Using pretrained models can reduce your compute costs and carbon footprint, and save you the time of training a model from scratch.

  3. Pipelines — transformers 3.0.2 documentation - Hugging Face

    This pipeline extracts the hidden states from the base transformer, which can be used as features in downstream tasks. This feature extraction pipeline can currently be loaded from the pipeline() method using the following task identifier(s): … (A usage sketch appears after this list.)

  4. AutoModels — transformers 3.0.2 documentation - Hugging Face

    transformers.TFAutoModel is a generic model class that will be instantiated as one of the base model classes of the library when created with the TFAutoModel.from_pretrained(pretrained_model_name_or_path) class method. (A usage sketch appears after this list.)

  5. Quickstart - Hugging Face

    Transformers is designed to be fast and easy to use so that everyone can start learning or building with transformer models. The number of user-facing abstractions is limited to only three classes for instantiating a model, and two APIs for inference or training. This quickstart introduces you to Transformers’ key features and shows you how to:

  6. BERT - Hugging Face

    BERT is a bidirectional transformer pretrained on unlabeled text to predict masked tokens in a sentence and to predict whether one sentence follows another. The main idea is that by randomly masking some tokens, the model can train on text to the … (A fill-mask sketch appears after this list.)

  7. Pipelines - Hugging Face

    Don’t hesitate to create an issue for your task at hand; the goal of the pipelines is to be easy to use and to support most cases, so transformers may be able to support your use case. If you want to try it yourself, you can simply subclass your pipeline of choice.

  8. Installation — transformers 4.7.0 documentation - Hugging Face

    🤗 Transformers is tested on Python 3.6+, and PyTorch 1.1.0+ or TensorFlow 2.0+. You should install 🤗 Transformers in a virtual environment. If you’re unfamiliar with Python virtual environments, check out the user guide.

  9. Installation - Hugging Face

    An editable install is useful if you’re developing locally with Transformers. It links your local copy of Transformers to the Transformers repository instead of copying the files. The files are added to Python’s import path.

  10. Trainer - Hugging Face

    Trainer is a simple but feature-complete training and evaluation loop for PyTorch, optimized for 🤗 Transformers. Important attributes: model always points to the core model; if using a transformers model, it will be a PreTrainedModel subclass. (A toy training sketch appears after this list.)
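
The pipeline snippets above (results 3, 5, and 7) describe the high-level pipeline() API. Below is a minimal sketch of the feature-extraction pipeline, assuming transformers and PyTorch are already installed in a virtual environment (for example via pip, as results 8 and 9 recommend); "feature-extraction" is the standard task identifier in current releases, and distilbert-base-uncased is only an illustrative checkpoint:

```python
from transformers import pipeline

# Load a feature-extraction pipeline; "distilbert-base-uncased" is an
# illustrative checkpoint, any encoder model from the Hub should work.
extractor = pipeline("feature-extraction", model="distilbert-base-uncased")

# Returns the hidden states of the base transformer for each token:
# for a single sentence, a nested list of shape [1, num_tokens, hidden_size].
features = extractor("Hugging Face Transformers makes NLP easier.")
print(len(features[0]), len(features[0][0]))  # number of tokens, hidden size
```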
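
Result 4 describes TFAutoModel.from_pretrained. A minimal sketch, assuming TensorFlow 2 is installed (the PyTorch counterpart is AutoModel); bert-base-uncased is used purely as an illustrative checkpoint:

```python
from transformers import AutoTokenizer, TFAutoModel

# TFAutoModel picks the matching TensorFlow base-model class from the
# checkpoint's config; "bert-base-uncased" is an illustrative checkpoint.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = TFAutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("Hello, world!", return_tensors="tf")
outputs = model(inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, hidden_size)
```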
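
Result 6 explains BERT's masked-token pretraining objective. A quick way to see that objective in action is the fill-mask pipeline; this sketch assumes the bert-base-uncased checkpoint, chosen only for illustration:

```python
from transformers import pipeline

# BERT was pretrained with masked-token prediction, so the "fill-mask"
# pipeline exercises that objective directly.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# BERT's mask token is [MASK]; the pipeline returns the most likely fillers.
for prediction in unmasker("Paris is the [MASK] of France."):
    print(prediction["token_str"], round(prediction["score"], 3))
```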
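
Result 10 summarizes the Trainer class. The sketch below is a toy end-to-end run, assuming PyTorch; the checkpoint, the two example sentences, and the ToyDataset wrapper are illustrative placeholders rather than anything prescribed by the documentation above:

```python
import torch
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Illustrative checkpoint and toy data; a real run would use a proper dataset.
checkpoint = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

texts = ["great library", "not my favourite"]
labels = [1, 0]
encodings = tokenizer(texts, truncation=True, padding=True)

class ToyDataset(torch.utils.data.Dataset):
    """Wraps tokenizer output so Trainer can index it like a torch Dataset."""
    def __len__(self):
        return len(labels)
    def __getitem__(self, idx):
        item = {key: torch.tensor(val[idx]) for key, val in encodings.items()}
        item["labels"] = torch.tensor(labels[idx])
        return item

args = TrainingArguments(output_dir="toy-trainer", num_train_epochs=1,
                         per_device_train_batch_size=2, logging_steps=1)
trainer = Trainer(model=model, args=args, train_dataset=ToyDataset())
trainer.train()
print(type(trainer.model))  # always points to the core PreTrainedModel
```

After training, trainer.model still points to the underlying PreTrainedModel subclass, which is the attribute the snippet above calls out.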
