# Running LLMs Locally

- [Choosing Models Based on Hardware](/ai/running-llms-locally/choosing-models-based-on-hardware.md): A calculator for identifying which LLMs your hardware can run locally
- [LM Studio](/ai/running-llms-locally/lm-studio.md): How to run LLMs locally with LM Studio
- [Ollama](/ai/running-llms-locally/ollama.md): How to run LLMs locally with Ollama
- [Claude Code with Local LLM](/ai/running-llms-locally/claude-code-with-local-llm.md): How to configure Claude Code with a local LLM
