Description
Ollama is an open-source framework and runtime for running, managing, and serving LLMs locally. It acts like a container system for AI models (similar to Docker for applications), simplifying the setup and execution of LLMs such as Llama 3, Mistral, Gemma, and Phi-3. This course explains how to work with models, manage a local AI environment, and understand the benefits and limitations of using Ollama.
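As a taste of what the course covers: once a model is pulled (e.g. with `ollama pull llama3`), Ollama serves it through a local REST API, by default on port 11434. The sketch below builds a request body for the `/api/generate` endpoint; the model name and prompt are placeholder examples, and the code only constructs the payload — sending it assumes an Ollama server is running locally.

```python
import json

# Ollama's local API endpoint (default port 11434).
OLLAMA_URL = "http://localhost:11434/api/generate"

# Example request body for /api/generate. The model must already be
# pulled locally, e.g. `ollama pull llama3`; "llama3" here is just
# an illustrative choice -- mistral, gemma, or phi3 work the same way.
payload = {
    "model": "llama3",
    "prompt": "Why is the sky blue?",
    "stream": False,  # ask for one complete JSON response, not a stream
}

body = json.dumps(payload).encode("utf-8")
print(json.loads(body)["model"])
```

With a server running, the body can be POSTed with any HTTP client (e.g. `urllib.request` or `curl`) using a `Content-Type: application/json` header; the response JSON carries the generated text in its `response` field.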
Curriculum
Introduction and Overview
Module-1
Module-2
Quiz
Module-3
Module-4
Quiz