Best choice: RTX 3090
- Two generations old -> cheap on the used market
- 24 GB VRAM -> the most memory you can get at that price, and local LLMs are VRAM-hungry
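Why VRAM dominates the choice: a model's weights must fit in GPU memory to avoid slow CPU offload. A minimal rule-of-thumb sketch, assuming 4-bit quantized weights and a ~20% overhead factor for KV cache and activations (the overhead factor is an assumption; real usage varies with context length):

```python
def llm_vram_gb(params_b: float, bits: int = 4, overhead: float = 1.2) -> float:
    """Rough VRAM estimate for running an LLM.

    params_b  -- parameter count in billions
    bits      -- quantization width per weight (4-bit is common locally)
    overhead  -- fudge factor for KV cache / activations (assumed ~20%)
    """
    weight_gb = params_b * bits / 8  # billions of params -> gigabytes
    return round(weight_gb * overhead, 1)

# 13B at 4-bit ~ 7.8 GB  -> fits comfortably on a 24 GB card
# 70B at 4-bit ~ 42 GB   -> needs multi-GPU or offload even with 24 GB
```

By this estimate, a 24 GB card comfortably runs models in the 7B-30B range at 4-bit with room left for context, which is why the tiers below are sorted by VRAM rather than raw compute.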
NVIDIA Consumer GPUs (≥ 8 GB VRAM, sorted by VRAM)
24 GB VRAM
- RTX 4090
- RTX 3090
- RTX 3090 Ti
Absolute best for local LLMs (big context, no offload).
20 GB VRAM
- RTX 3080 20 GB (unreleased OEM / memory-modded boards only)
Rare, but very strong if you find one. Note that NVIDIA has never shipped an official 20 GB consumer GeForce card (the RTX 4080 Super is 16 GB), so verify any listing in this tier carefully.
16 GB VRAM
- RTX 4080
- RTX 4080 Super
- RTX 4070 Ti Super
- RTX 4060 Ti 16 GB
- RTX 3080 16 GB (laptop GPU only; desktop cards are 10/12 GB)
- RTX 2080 Ti (11 GB stock; rare 22 GB memory-modded boards exist)
Sweet spot for serious local LLM + agents.