Natural Language Processing with Transformers in Python

Short Description

Learn next-generation NLP with transformers using PyTorch, TensorFlow, and HuggingFace!

What you’ll learn

  • How to use transformer models for NLP
  • Modern natural language processing technologies
  • An overview of recent developments in NLP
  • Python
  • Machine Learning
  • Natural Language Processing
  • TensorFlow
  • PyTorch
  • Transformers
  • Sentiment Analysis
  • Question answering
  • Named Entity Recognition

This course includes:

  • 11.5 hours on-demand video
  • Full lifetime access
  • Access on mobile and TV
  • Assignments
  • Certificate of completion

Requirements

  • Knowledge of Python
  • Experience with data science a plus
  • Experience with NLP a plus

Description

Transformer models are the de facto standard in modern NLP. They have proven themselves the most expressive, powerful models for language by a large margin, beating all major language benchmarks time and time again.

In this course, we cover everything you need to know to get started building high-performance NLP applications using transformer models like Google AI’s BERT or Facebook AI’s DPR.

We cover several key NLP frameworks including:

  • HuggingFace’s Transformers
  • TensorFlow 2
  • PyTorch
  • spaCy
  • NLTK
  • Flair

We then learn how to apply transformers to some of the most popular NLP use-cases:

  • Language classification/sentiment analysis
  • Named entity recognition (NER)
  • Question answering
  • Similarity/comparative learning
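As a taste of the use-cases above, both sentiment analysis and question answering can be run in a few lines with the `pipeline` API from HuggingFace’s Transformers library (a sketch using the library’s default models, not the exact models taught in the course):

```python
from transformers import pipeline

# Language classification / sentiment analysis with the default model
classifier = pipeline("sentiment-analysis")
result = classifier("Transformers make NLP remarkably accessible!")[0]
print(result["label"], round(result["score"], 3))

# Extractive question answering over a short context passage
qa = pipeline("question-answering")
answer = qa(
    question="Which models does the course cover?",
    context="The course covers transformer models such as BERT and DPR.",
)
print(answer["answer"])
```

The `pipeline` helper downloads a pretrained model on first use, so the first run requires an internet connection.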

Throughout each of these use-cases we work through a variety of examples to ensure we understand what transformers are, how they work, and why they are so important. Alongside these sections, we work through two full-size NLP projects: one on sentiment analysis of financial Reddit data, and another building a fully fledged open-domain question-answering application.

All of this is supported by several other sections covering how to better design, implement, and measure the performance of our models, including:

  • History of NLP and where transformers come from
  • Common preprocessing techniques for NLP
  • The theory behind transformers
  • How to fine-tune transformers
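To illustrate the kind of preprocessing the course introduces, here is a minimal pure-Python sketch of classic steps (lowercasing, tokenization, stopword removal); the function name and stopword list are illustrative, and real pipelines would use spaCy, NLTK, or a transformer tokenizer instead:

```python
import re

# A tiny illustrative stopword list; NLTK and spaCy ship much larger ones
STOPWORDS = {"the", "a", "an", "is", "are", "and", "or", "of", "to"}

def preprocess(text: str) -> list[str]:
    """Lowercase, tokenize on word characters, and drop stopwords."""
    tokens = re.findall(r"[a-z0-9']+", text.lower())
    return [t for t in tokens if t not in STOPWORDS]

print(preprocess("Transformers are the de facto standard in modern NLP!"))
# -> ['transformers', 'de', 'facto', 'standard', 'in', 'modern', 'nlp']
```

Modern transformer models skip most of this in favour of subword tokenization, which is one of the contrasts the course draws between classic and current NLP.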

We cover all this and more. I look forward to seeing you in the course!

Who this course is for:

  • Aspiring data scientists and ML engineers interested in NLP
  • Practitioners looking to upgrade their skills
  • Developers looking to implement NLP solutions
  • Data scientists
  • Machine learning engineers
  • Python developers
