Choosing Models Based on Hardware

Calculator for identifying LLMs that can be run locally

In practice this calculator gives a fairly accurate estimate of which models will run well on a given hardware configuration.
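As a rough illustration of the kind of estimate such a calculator makes, the sketch below applies the common rule of thumb that a model's memory footprint is roughly its parameter count times the bytes per weight for its quantization format, plus some overhead for the KV cache and runtime buffers. The quantization sizes, the 1.2 overhead factor, and the function names are illustrative assumptions, not the calculator's actual formula.

```python
# Rough back-of-the-envelope VRAM estimate for running an LLM locally.
# Simplified rule-of-thumb sketch; values below are illustrative assumptions.

# Approximate bytes per weight for common quantization formats.
BYTES_PER_PARAM = {
    "fp16": 2.0,
    "q8":   1.0,
    "q5":   0.625,
    "q4":   0.5,
}

def estimated_vram_gb(params_billions: float, quant: str, overhead: float = 1.2) -> float:
    """Estimate the VRAM (in GB) needed to load and run a model of the given size.

    overhead roughly covers KV cache, activations, and runtime buffers
    (the 1.2 factor is an assumption, not a measured value).
    """
    weights_gb = params_billions * BYTES_PER_PARAM[quant]  # 1B params at 1 byte/param ~ 1 GB
    return weights_gb * overhead

def fits(params_billions: float, quant: str, available_vram_gb: float) -> bool:
    """Return True if the model is likely to fit in the available VRAM."""
    return estimated_vram_gb(params_billions, quant) <= available_vram_gb

if __name__ == "__main__":
    # Example: check a few Q4-quantized model sizes against a 12 GB GPU.
    for size in (7, 13, 70):
        need = estimated_vram_gb(size, "q4")
        verdict = "fits" if fits(size, "q4", 12) else "does not fit"
        print(f"{size}B @ q4: ~{need:.1f} GB needed, {verdict} in 12 GB")
```

Running the example suggests, for instance, that a 7B model at Q4 needs roughly 4 GB and fits comfortably on a 12 GB GPU, while a 70B model at the same quantization does not; a real calculator will also account for context length and the specific runtime.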
