Local LLM

5 articles about local LLMs.

Apple's On-Device AI as a Local Fallback for Cloud LLM APIs

2 min read

Using Claude API as the primary LLM provider but having Apple's on-device AI as a local fallback that speaks the same OpenAI-compatible format is a game changer for macOS apps.

Tags: apple, on-device-ai, local-llm, fallback, macos, api
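The fallback architecture this article describes can be sketched as a generic failover: try the cloud provider first, and if it errors, send the same request to the local endpoint, since both speak the same OpenAI-compatible format. A minimal sketch, assuming nothing about either API beyond "callable that takes a prompt and returns text" (the `cloud` and `local` providers below are hypothetical stand-ins, not real client code):

```python
from typing import Callable, List


def complete_with_fallback(providers: List[Callable[[str], str]], prompt: str) -> str:
    """Try each provider in order; return the first successful completion.

    Each provider is any callable that takes a prompt and returns text --
    e.g. a wrapper around the Claude API first, then a wrapper around a
    local OpenAI-compatible server for Apple's on-device model.
    """
    last_error: Exception | None = None
    for provider in providers:
        try:
            return provider(prompt)
        except Exception as exc:  # network error, rate limit, timeout...
            last_error = exc  # remember the failure, try the next provider
    raise RuntimeError("all providers failed") from last_error


# Hypothetical providers: a cloud call that fails, and a local fallback.
def cloud(prompt: str) -> str:
    raise TimeoutError("cloud unreachable")


def local(prompt: str) -> str:
    return f"local answer to: {prompt}"


print(complete_with_fallback([cloud, local], "hello"))
# -> local answer to: hello
```

Because both providers accept the same request shape, the app never has to branch on which backend answered.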

Learning Path for Local LLMs - From Ollama to Desktop Agents

2 min read

A practical learning path for running local LLMs: start with Ollama basics, learn prompting, understand quantization, build workflows, then automate your desktop.

Tags: ollama, local-llm, learning, desktop-agent, automation, tutorial
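The quantization step in this learning path comes down to simple arithmetic: weight memory scales with bits per parameter. A rough back-of-envelope sketch (the 7B parameter count is illustrative, and real quantized files add some overhead for embeddings and metadata):

```python
def approx_weight_gb(n_params: float, bits_per_param: float) -> float:
    """Approximate model weight memory in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * bits_per_param / 8 / 1e9


n = 7e9  # a 7B-parameter model
print(approx_weight_gb(n, 16))  # fp16:  14.0 GB
print(approx_weight_gb(n, 4))   # 4-bit:  3.5 GB
```

That 4x reduction is why a 4-bit 7B model fits comfortably on a laptop where the fp16 original would not.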

Why Self-Hosting AI Matters: Your Agent Sees Your Emails, Documents, and Browsing History

2 min read

AI agents interact with your most sensitive data - emails, documents, browsing history. Self-hosting with local LLMs keeps that data on your machine where it belongs.

Tags: privacy, self-hosting, local-llm, ai-agents, security

VS Code Claude Extension vs Terminal with Ollama - Why the Terminal Route Wins

2 min read

The VS Code Claude extension is locked to Anthropic's API. Running Claude Code in the terminal with Ollama gives you local models, more control, and zero API costs.

Tags: vs-code, claude, ollama, terminal, local-llm, development

Local LLMs Are Not Just for Inference Anymore - Real Workflows on Your Machine

2 min read

The shift to local LLMs is moving beyond chat and inference into real desktop automation. Browser control, CRM updates, document generation - all without cloud APIs.

Tags: local-llm, ollama, desktop-automation, privacy, workflow
