
Ollama Pro 2024: Master Local LLM Development


Total duration: 3:03:11

  • 1 -Introduction.mp4 (03:27)
  • 1 -Ollama Course.zip
  • 1 -Ollama Course presentation.pdf
  • 1 -Other course resources.zip
  • 2 -Installing and Setting up Ollama.mp4 (10:45)
  • 3 -Model customizations and other options.mp4 (10:32)
  • 4 -All Ollama Command Prompt Terminal commands.mp4 (11:08)
  • 4 -mymodelfile.txt
  • 1 -Introduction to Open WebUI.mp4 (01:33)
  • 2 -Setting up Docker and Open WebUI.mp4 (04:07)
  • 3 -Open WebUI features and functionalities.mp4 (09:51)
  • 4 -Getting response based on documents and websites.mp4 (05:31)
  • 5 -Open WebUI user access control.mp4 (04:43)
  • 1 -Types of Ollama models.mp4 (02:42)
  • 2 -Text models.mp4 (11:56)
  • 3 -Vision models.mp4 (09:00)
  • 3 -bar chart.zip
  • 3 -callcenter.zip
  • 3 -cc.zip
  • 4 -Code generating models.mp4 (11:30)
  • 5 -Create custom model from gguf file.mp4 (06:45)
  • 5 -phi3mini.txt
  • 1 -Installing and Setting up Python environment.mp4 (04:43)
  • 1 -Ollama Course.zip
  • 2 -Using Ollama in Python using Ollama library.mp4 (07:45)
  • 3 -Calling Model using API and OpenAI compatibility.mp4 (08:10)
  • 1 -What is LangChain and why are we using it.mp4 (04:55)
  • 2 -Basic modules of Langchain.mp4 (09:58)
  • 2 -Ollama Course.zip
  • 1 -Understanding the concept of RAG (Retrieval Augmented Generation).mp4 (08:28)
  • 2 -LangchainRetrieval.txt
  • 2 -Loading, Chunking and Embedding document using LangChain and Ollama.mp4 (10:50)
  • 3 -Answering user question with retrieved information.mp4 (08:36)
  • 1 -Understanding Tools and Agents.mp4 (05:28)
  • 2 -Ollama Course.zip
  • 2 -Tools and Agents using LangChain and Llama3.1.mp4 (10:48)


    Course Overview

    Become an Ollama expert and build private, customized LLM applications with this complete guide to local AI development and deployment.

    What You'll Learn

    • Install and configure Ollama to run private LLM models locally (a minimal Python call against a local model is sketched after this list)
    • Create custom models and ChatGPT-like interfaces with Open WebUI
    • Build Python applications with RAG capabilities using LangChain
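
    As a taste of the Python lessons, the sketch below calls a locally running model through the ollama Python library. It assumes the Ollama server is running and that you have already pulled a model; llama3.1 is only an example name.

        # Minimal sketch: chat with a local model via the ollama Python library.
        # Assumes the Ollama server is running and the model was pulled beforehand,
        # e.g. `ollama pull llama3.1`.
        import ollama

        response = ollama.chat(
            model="llama3.1",
            messages=[{"role": "user", "content": "Explain what a Modelfile is in one sentence."}],
        )

        # The reply text is found under message.content in the response.
        print(response["message"]["content"])

    The same local model is also reachable through Ollama's OpenAI-compatible API (by default at http://localhost:11434/v1), which the API lesson covers.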

    Who This Is For

    • Python developers wanting private AI solutions
    • Data scientists building secure LLM applications
    • AI enthusiasts focused on privacy and customization

    Key Benefits

    • Full control over your AI models and data
    • Customize models for specific use cases
    • Develop advanced RAG applications locally

    Curriculum Highlights

    1. Ollama setup and model customization
    2. Open WebUI and Docker deployment
    3. Python integration and RAG development (see the RAG sketch below)
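
    To give a feel for the RAG lessons, here is a rough pipeline sketch using LangChain with Ollama. The import paths (langchain_community, langchain_text_splitters), the model names, and the file report.txt are assumptions: they depend on your installed LangChain version and on which models you have pulled, and Chroma additionally requires the chromadb package.

        # Rough RAG sketch: load, chunk, embed, retrieve, then answer locally.
        # Import paths assume the langchain-community / langchain-text-splitters
        # packages; adjust them for your LangChain version.
        from langchain_community.document_loaders import TextLoader
        from langchain_community.embeddings import OllamaEmbeddings
        from langchain_community.llms import Ollama
        from langchain_community.vectorstores import Chroma
        from langchain_text_splitters import RecursiveCharacterTextSplitter

        # 1. Load the document and split it into overlapping chunks.
        docs = TextLoader("report.txt").load()   # "report.txt" is a placeholder
        splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)
        chunks = splitter.split_documents(docs)

        # 2. Embed the chunks with a local embedding model and index them.
        vectorstore = Chroma.from_documents(chunks, OllamaEmbeddings(model="nomic-embed-text"))

        # 3. Retrieve the chunks most relevant to the user's question.
        question = "What does the report conclude?"
        context = vectorstore.as_retriever().invoke(question)

        # 4. Ask a local model to answer using only the retrieved context.
        llm = Ollama(model="llama3.1")
        prompt = ("Answer using only this context:\n"
                  + "\n\n".join(d.page_content for d in context)
                  + f"\n\nQuestion: {question}")
        print(llm.invoke(prompt))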

    • Language: English
    • Training sessions: 24
    • Duration: 3:03:11
    • Release date: 2025/04/19