Master LLM Text Generation: Hugging Face Pro Course
Course Content
Total duration: 1:52:11
- 01 - Fine-tune your AI Hands-on practice with Hugging Face models.mp4 (00:54)
- 01 - How do LLMs generate text.mp4 (05:56)
- 02 - Overview of the Hugging Face platform.mp4 (03:20)
- 03 - Accessing GPUs.mp4 (03:55)
- 01 - What is tokenization.mp4 (04:48)
- 02 - Inspecting a tokenizer.mp4 (05:23)
- 03 - Encoding and decoding text.mp4 (08:36)
- 04 - Tokenizer chat template.mp4 (02:31)
- 01 - First generation with a local model.mp4 (11:40)
- 02 - Pipelines.mp4 (02:24)
- 03 - Introduction to generation parameters.mp4 (05:42)
- 04 - Temperature.mp4 (10:07)
- 05 - Top-k.mp4 (07:02)
- 06 - Top-p.mp4 (06:20)
- 07 - Other generation parameters.mp4 (06:21)
- 08 - Hugging Face Inference API.mp4 (04:00)
- 01 - Greedy search.mp4 (04:00)
- 02 - Multinomial sampling.mp4 (02:41)
- 03 - Beam search.mp4 (03:32)
- 04 - Beam search with multinomial.mp4 (01:23)
- 05 - Contrastive search.mp4 (04:57)
- 01 - Interesting technical resources.mp4 (02:06)
- 02 - NVIDIA NIM API.mp4 (04:33)
Course Overview
Dive deep into text generation with large language models (LLMs) in this hands-on course. Learn to control AI outputs using Hugging Face's powerful tools and NVIDIA NIM API, mastering tokenization, generation parameters, and decoding strategies for real-world applications.
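Below is a minimal sketch of the kind of local generation the course starts with, using the transformers pipeline API; the "gpt2" checkpoint is an assumption chosen for illustration, not necessarily the model used in the lessons.

```python
from transformers import pipeline

# Load a small causal language model locally; "gpt2" is only a
# placeholder checkpoint chosen for illustration.
generator = pipeline("text-generation", model="gpt2")

# Generate a short continuation of a prompt.
outputs = generator("Large language models generate text by", max_new_tokens=30)
print(outputs[0]["generated_text"])
```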
What You'll Learn
- How LLMs generate text and the role of tokenization
- How to tune generation parameters such as temperature, top-p, and top-k (see the sketch after this list)
- How to implement decoding strategies including greedy search and beam search
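As a rough illustration of those parameters, the sketch below passes temperature, top_k, and top_p to transformers' generate(); the "gpt2" checkpoint is again just a placeholder assumption.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # placeholder model
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Controlling randomness in generation", return_tensors="pt")

# do_sample=True switches from greedy decoding to multinomial sampling;
# temperature rescales the logits, while top_k and top_p prune the
# candidate tokens before sampling.
output_ids = model.generate(
    **inputs,
    do_sample=True,
    temperature=0.8,
    top_k=50,
    top_p=0.9,
    max_new_tokens=40,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Lower temperature and smaller top-k/top-p values make output more deterministic; raising them increases diversity.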
Who This Is For
- AI developers working with text generation
- Data scientists exploring LLM capabilities
- Tech professionals wanting Hugging Face expertise
Key Benefits
- Hands-on experience with the Hugging Face API
- Technical understanding of generation control
- Skills applicable to real-world AI projects
Curriculum Highlights
- Tokenizer fundamentals and chat templates
- Exploring generation parameters
- Advanced decoding strategies (a short code sketch follows this list)
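For a sense of the decoding strategies covered, the sketch below contrasts greedy search with beam search via generate(); the model choice is a placeholder assumption.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # placeholder model
model = AutoModelForCausalLM.from_pretrained("gpt2")
inputs = tokenizer("Decoding strategies matter because", return_tensors="pt")

# Greedy search: pick the single most probable token at every step.
greedy_ids = model.generate(**inputs, do_sample=False, max_new_tokens=30)

# Beam search: keep the num_beams highest-scoring partial sequences and
# return the best finished one.
beam_ids = model.generate(**inputs, do_sample=False, num_beams=4, max_new_tokens=30)

print(tokenizer.decode(greedy_ids[0], skip_special_tokens=True))
print(tokenizer.decode(beam_ids[0], skip_special_tokens=True))
```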
Course Details
- Language: English
- Training sessions: 23
- Duration: 1:52:11
- English subtitles: yes
- Release date: 2025/06/07