Claude Code with Local LLM

How to configure Claude Code with a local LLM

Overview

  • A local computer isn't going to compete with a datacenter anytime soon, but maybe you're without Internet access, want additional data privacy, or have hit your Claude Code subscription limit for the month. In such cases, we can set up Claude Code to work with a local LLM.

Note: These same steps can be used for non-local models; e.g., if you have an AI server rig, you can point Claude Code (and most other apps) to the model hosted there.
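For the remote case, the only change is the base URL. A sketch, where the hostname and port are placeholders for your own server:

```shell
# Point Claude Code at a remote OpenAI-compatible server instead of localhost.
# "ai-rig.local" and port 1234 are placeholders -- substitute your server's address.
export ANTHROPIC_BASE_URL=http://ai-rig.local:1234
export ANTHROPIC_AUTH_TOKEN=lmstudio
```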


Setup

LM Studio

# 25k context recommended for Claude Code usage
lms load openai/gpt-oss-20b --context-length 25000

lms server start --port 1234
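Before moving on, it's worth confirming the server is actually listening. LM Studio exposes an OpenAI-compatible API, so a quick request to its model-listing endpoint (adjust the port if you changed it) should return the loaded model:

```shell
# Quick sanity check that the LM Studio server is up.
# /v1/models is the OpenAI-compatible model-listing endpoint.
curl -s http://localhost:1234/v1/models || echo "LM Studio server is not reachable"
```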

Claude Code Setup β€” Terminal

  • In the terminal, run the commands below, making sure to swap out the port and model based on your needs

export ANTHROPIC_BASE_URL=http://localhost:1234
export ANTHROPIC_AUTH_TOKEN=lmstudio
  • In Claude Code, we can see that the local model is being used
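The exports above only last for the current terminal session. To make the override persistent, the same variables can be appended to your shell profile (this sketch assumes zsh; adapt the path for bash or another shell):

```shell
# Persist the Claude Code overrides across sessions (assumes zsh).
cat >> ~/.zshrc <<'EOF'
export ANTHROPIC_BASE_URL=http://localhost:1234
export ANTHROPIC_AUTH_TOKEN=lmstudio
EOF
```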

Claude Code Setup β€” VS Code

  • In VS Code, hit CMD + SHIFT + P, open Preferences: Open User Settings (JSON), and paste this snippet within the main { }

  • Make sure to swap out the URL as needed
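One way to achieve this (an assumption, not necessarily the original snippet) is to set the variables for VS Code's integrated terminal, so any `claude` session launched there picks them up:

```json
// Sets the variables for VS Code's integrated terminal on macOS;
// use "terminal.integrated.env.linux" / ".windows" on other platforms.
"terminal.integrated.env.osx": {
  "ANTHROPIC_BASE_URL": "http://localhost:1234",
  "ANTHROPIC_AUTH_TOKEN": "lmstudio"
}
```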

Validating Setup β€” Developer Logs

  • Within LM Studio, we can view the developer logs to validate that our Claude Code interactions are leveraging the local LLM

