Neural machine translation (NMT), Text summarization, Question Answering, Chatbot
- Free tutorial
- Rating: 3.9 out of 5 (10 ratings)
- 1,969 students
- 37min of on-demand video
- Created by Nitsan Soffair
- English
What you’ll learn
- Advance your knowledge of modern NLP
- Understand modern NLP techniques
- Advance your knowledge of modern deep learning (DL)
- Understand modern DL techniques
Requirements
- Motivation
- Interest
- Mathematical approach
Description
You will learn the newest state-of-the-art deep-learning approaches to natural language processing (NLP).
You will
- Get state-of-the-art knowledge of
- NMT
- Text summarization
- QA
- Chatbot
- Validate your knowledge with a short, very easy 3-question quiz after each lecture
- Complete the course in about 2 hours
Syllabus
- Neural machine translation (NMT)
- Seq2seq
A family of machine learning approaches used for natural language processing.
- Attention
A technique that mimics cognitive attention.
- NMT
An approach to machine translation that uses an artificial neural network to predict the likelihood of a sequence of words, typically modelling entire sentences in a single integrated model.
- Teacher-forcing
An algorithm for training the weights of recurrent neural networks (RNNs).
- BLEU
An algorithm for evaluating the quality of text which has been machine-translated from one natural language to another.
- Beam search
A heuristic search algorithm that explores a graph by expanding the most promising node in a limited set.
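The beam-search definition above ("expanding the most promising node in a limited set") can be made concrete with a minimal sketch. This is not course code: the toy next-token table and the `step_fn` interface are assumptions for illustration, standing in for a trained translation model's next-word probabilities.

```python
import math

def beam_search(step_fn, start, beam_width=2, max_len=4):
    """Keep only the `beam_width` highest-scoring partial sequences per step.

    `step_fn(seq)` returns a list of (token, log_prob) continuations;
    an empty list marks a finished sequence.
    """
    beams = [([start], 0.0)]
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            expansions = step_fn(seq)
            if not expansions:  # finished sequence: carry it forward unchanged
                candidates.append((seq, score))
                continue
            for tok, logp in expansions:
                candidates.append((seq + [tok], score + logp))
        # prune to the most promising candidates, as in the definition above
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams

# Hypothetical toy next-token distribution over a tiny vocabulary
table = {
    ("<s>",):       [("the", math.log(0.6)), ("a", math.log(0.4))],
    ("<s>", "the"): [("cat", math.log(0.9)), ("dog", math.log(0.1))],
    ("<s>", "a"):   [("dog", math.log(0.8)), ("cat", math.log(0.2))],
}

def toy_step(seq):
    return table.get(tuple(seq), [])

best = beam_search(toy_step, "<s>", beam_width=2, max_len=2)
print(best[0][0])  # highest-scoring sequence
```

With `beam_width=1` this degenerates to greedy decoding; a wider beam trades compute for a better chance of finding the globally highest-probability translation.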
- Text summarization
- Transformer
A deep learning model that adopts the mechanism of self-attention, differentially weighting the significance of each part of the input data.
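The self-attention mechanism named in the Transformer definition above ("differentially weighting the significance of each part of the input") can be sketched in a few lines. This is an illustrative simplification, not the course's code: the learned query/key/value projection matrices of a real Transformer are replaced by the identity here.

```python
import math

def self_attention(X):
    """Scaled dot-product self-attention over a list of d-dimensional vectors.

    Simplification: query = key = value = input (a real Transformer learns
    separate projection matrices for each).
    """
    d = len(X[0])
    # similarity score of every position against every other position
    scores = [[sum(q * k for q, k in zip(qi, kj)) / math.sqrt(d) for kj in X]
              for qi in X]
    # softmax each row: how strongly each position attends to the others
    weights = []
    for row in scores:
        m = max(row)
        exps = [math.exp(s - m) for s in row]
        z = sum(exps)
        weights.append([e / z for e in exps])
    # each output vector is a weighted mix of all input vectors
    return [[sum(w * v[j] for w, v in zip(wi, X)) for j in range(d)]
            for wi in weights]

out = self_attention([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
```

Each row of `out` is a convex combination of all three input vectors, weighted by how similar that position is to every other, which is the "differential weighting" the definition refers to.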
- Question Answering
- GPT-3
An autoregressive language model that uses deep learning to produce human-like text.
- BERT
A transformer-based machine learning technique for natural language processing (NLP) pre-training developed by Google.
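The "autoregressive" part of the GPT-3 definition above means generating one token at a time, each conditioned on everything produced so far. A minimal sketch of that decoding loop, with a hypothetical toy rule standing in for the trained neural network:

```python
def generate(next_token, prompt, max_new=5):
    """Autoregressive decoding: repeatedly predict the next token from the
    sequence so far and append it, as GPT-style models do.

    `next_token` stands in for a trained language model (a toy rule below).
    """
    tokens = list(prompt)
    for _ in range(max_new):
        tok = next_token(tokens)
        if tok is None:  # the "model" emitted an end-of-sequence marker
            break
        tokens.append(tok)
    return tokens

# Hypothetical toy "model": completes one memorized phrase, then stops.
phrase = ["the", "answer", "is", "42"]

def toy_model(tokens):
    return phrase[len(tokens)] if len(tokens) < len(phrase) else None

print(generate(toy_model, ["the"]))  # ['the', 'answer', 'is', '42']
```

BERT differs exactly here: it reads the whole sequence bidirectionally at once (masked-token prediction) rather than generating left to right.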
- Chatbot
- LSH
An algorithmic technique that hashes similar input items into the same “buckets” with high probability.
- RevNet
A variant of ResNets where each layer’s activations can be reconstructed exactly from the next layer’s.
- Reformer
Introduces two techniques to improve the efficiency of Transformers.
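The LSH definition above ("similar items into the same buckets with high probability") can be sketched with the random-hyperplane scheme for cosine similarity, which is close in spirit to what Reformer uses to bucket similar attention keys. The vectors and seed here are illustrative assumptions, not course material.

```python
import random

def lsh_signature(vec, planes):
    """Random-hyperplane LSH: each hyperplane contributes one sign bit,
    so vectors with a small angle between them tend to share a signature
    (i.e. land in the same bucket) with high probability."""
    return tuple(int(sum(p * v for p, v in zip(plane, vec)) >= 0)
                 for plane in planes)

random.seed(0)
dim, n_planes = 8, 6
planes = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(n_planes)]

a = [1.0] * dim
b = [1.0] * 7 + [0.9]   # nearly identical direction to a: usually same bucket
c = [-1.0] * dim        # exactly opposite direction to a: every bit flips

sig_a, sig_b, sig_c = (lsh_signature(v, planes) for v in (a, b, c))
```

Note the guarantee is probabilistic: `b` collides with `a` on most hyperplanes but not necessarily all, whereas `c`, being the exact negation of `a`, deterministically gets every sign bit flipped.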
Resources
- Wikipedia
- Coursera
Who this course is for:
- Anyone interested in NLP
- Anyone interested in AI
Course content
5 sections • 14 lectures • 37m total length
Neural machine translation (NMT) • 5 lectures • 8min
- Seq2seq (01:41)
- Seq2seq (3 questions)
- Attention (01:36)
- Attention (3 questions)
- Neural machine translation (NMT) (01:18)
- Neural machine translation (NMT) (3 questions)
- BLEU (01:54)
- BLEU (3 questions)
- Beam search (01:10)
- Beam search (3 questions)
Text summarization • 1 lecture • 1min
- Transformer (00:59)
- Transformer (3 questions)
Question Answering • 2 lectures • 4min
- GPT-3 (01:28)
- GPT-3 (3 questions)
- BERT (02:16)
- BERT (3 questions)
Chatbot • 3 lectures • 5min
- LSH (01:21)
- LSH (3 questions)
- RevNet (01:59)
- RevNet (3 questions)
- Reformer (01:57)
- Reformer (3 questions)
Bonus • 3 lectures • 20min
- GPT-3 (07:03)
- DALL-E (05:09)
- CLIP (07:37)