Tech with Tyler

Running LLMs Locally

How to run large language models (LLMs) locally: choosing a model that fits your hardware, then serving it with LM Studio or Ollama.

  • Choosing Models Based on Hardware
  • LM Studio
  • Ollama
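
As a quick taste of what the Ollama page covers, here is a minimal sketch of talking to a locally running model from Python. It assumes Ollama is already installed and serving on its default port (11434) and that a model (llama3 is used here as a placeholder) has been pulled; adjust the model name to whatever you have locally.

```python
import requests

# Assumes an Ollama server is running locally on the default port (11434)
# and that the named model has already been pulled.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3",   # swap in any model you have pulled locally
    "prompt": "In one sentence, why does running LLMs locally help with data privacy?",
    "stream": False,     # return one JSON object instead of a token stream
}

response = requests.post(OLLAMA_URL, json=payload, timeout=120)
response.raise_for_status()

# With streaming disabled, the generated text comes back in the "response" field.
print(response.json()["response"])
```

LM Studio works along the same lines: once you load a model and start its local server, it exposes an OpenAI-compatible endpoint (by default on port 1234), so the same request/response pattern applies with a different URL and schema.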
