
LLM Index Page 1

Embeddings Explained: What They Are and How to Use Them with Python and Ollama (2026)

Learn what embeddings are and how to use them with Python and Ollama for free. Generate embeddings with nomic-embed-text, compute cosine similarity, build semantic search, and store in pgvector.
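The cosine-similarity step this article covers needs no libraries at all; a minimal sketch in plain Python (function name and vectors are illustrative, not taken from the article):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between vectors a and b: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Vectors pointing the same way score 1.0; orthogonal vectors score 0.0.
```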


LangGraph Tutorial 2026: Build Stateful AI Agents with Python

LangGraph tutorial 2026: build stateful AI agents in Python using StateGraph, nodes, edges, and checkpointers. Full ReAct agent example with tool calling and human-in-the-loop.
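LangGraph's StateGraph wires node functions together with edges over a shared state; as a rough stand-in for that idea (hand-rolled, not the LangGraph API — all names here are illustrative):

```python
# Each node is a function that takes and returns the shared state dict.

def agent_node(state):
    state["messages"].append("agent step")
    return state

def tool_node(state):
    state["messages"].append("tool step")
    return state

NODES = {"agent": agent_node, "tool": tool_node}
EDGES = {"agent": "tool", "tool": None}  # None marks the end of the graph

def run_graph(state, entry="agent"):
    # Walk the graph from the entry node, threading state through each node.
    node = entry
    while node is not None:
        state = NODES[node](state)
        node = EDGES[node]
    return state
```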


LiteLLM Tutorial 2026: Use OpenAI, Claude, and Ollama with One Python API

LiteLLM tutorial 2026: call OpenAI, Claude, Gemini, and Ollama with one unified Python API. Set up the proxy server, fallbacks, cost tracking, and load balancing.


n8n + AI on Linux: Self-Hosted Workflow Automation with LLMs (2026)

Run n8n on Linux and automate workflows with AI and local LLMs. Install n8n via Docker, set up AI Agent nodes with Ollama, build real automations, and run as a systemd service. Free 2026 guide.


Ollama on Linux: Run Local LLMs, Manage Models and Use the API (2026)

Complete Ollama guide for Linux: install, run LLMs locally, manage models, use the REST API, Python integration, and GPU acceleration with NVIDIA or AMD.
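Ollama's REST API listens on port 11434 by default; a minimal sketch of calling its `/api/generate` endpoint with only the stdlib (the model name is illustrative):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port

def build_payload(model: str, prompt: str) -> bytes:
    # stream=False asks for one JSON object instead of a stream of chunks.
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(model: str, prompt: str) -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

`generate("llama3.2", "Why is the sky blue?")` returns the model's text, assuming a local Ollama server is running.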


Ollama + Open WebUI on Linux: Run a Local ChatGPT Server (2026)

Set up Ollama and Open WebUI on Linux for a free self-hosted ChatGPT alternative. Run llama3.2, mistral, and qwen2.5 locally with GPU support and nginx reverse proxy.


OpenAI Function Calling in Python: Complete Guide 2026 (Tools, Parallel Calls, Streaming)

Learn OpenAI function calling in Python (2026). Define tools, handle tool_calls, run parallel function calls, stream with tools, and build a real assistant with weather and calendar tools.
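Handling `tool_calls` boils down to mapping the function name the model requested onto a real Python function and parsing its JSON-encoded arguments; a minimal sketch with a hypothetical `get_weather` stub (not from the article):

```python
import json

def get_weather(city: str) -> str:
    # Hypothetical stub; a real tool would call a weather API here.
    return f"Sunny in {city}"

AVAILABLE_TOOLS = {"get_weather": get_weather}

def dispatch(name: str, arguments_json: str) -> str:
    """Route one tool_call to its Python implementation."""
    fn = AVAILABLE_TOOLS[name]
    return fn(**json.loads(arguments_json))
```

The result of `dispatch(...)` is what gets sent back to the model as a `tool` role message.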


RAG with Python, LangChain and PostgreSQL pgvector as Vector Store (2026)

Build a RAG pipeline with Python, LangChain, and PostgreSQL pgvector. Load documents, create embeddings, store in pgvector, and query with an LLM. Complete working example.
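The retrieval step of a RAG pipeline — which pgvector performs with an SQL `ORDER BY` on vector distance — can be sketched in-memory: rank stored chunks by cosine similarity to the query embedding and keep the top k. All names below are illustrative:

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def top_k(query_vec, chunks, k=2):
    """chunks: list of (text, embedding) pairs; returns the k closest texts."""
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c[1]), reverse=True)
    return [text for text, _ in ranked[:k]]
```

The retrieved texts are then stuffed into the LLM prompt as context.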


Structured Output from LLMs with Python: Reliable JSON Using Pydantic and Instructor (2026)

Get reliable JSON from LLMs using Python. Learn to use the instructor library with Pydantic to extract structured data from OpenAI, Claude, and local Ollama models. 2026 guide.
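instructor and Pydantic validate the model's reply against a typed schema; without those libraries, the core idea — parse the JSON reply into a typed object and fail loudly on bad fields — can be sketched with the stdlib alone. The `Person` schema is illustrative:

```python
import json
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    age: int

def parse_reply(raw: str) -> Person:
    """Turn an LLM's JSON reply into a typed object; raises on missing fields."""
    data = json.loads(raw)
    return Person(name=str(data["name"]), age=int(data["age"]))
```

Pydantic adds real validation and schema generation on top of this, which is what instructor feeds to the model.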