Full GPU

Best variant: Q4_K_M

Full GPU inference — 24 GB VRAM meets the 24 GB recommendation.

GPU VRAM: 24 GB
Min VRAM (best fit): 22 GB
Recommended VRAM: 24 GB
Estimated tok/s: ~40.3
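The tok/s figure above can be reproduced with a common rule of thumb: single-stream decoding is memory-bandwidth-bound, so throughput is roughly memory bandwidth divided by the bytes read per token (the quantized model size), times an achievable-efficiency factor. The formula and the 80% efficiency factor below are illustrative assumptions, not the site's actual estimator.

```python
def estimate_tok_s(bandwidth_gb_s: float, file_size_gb: float,
                   efficiency: float = 0.8) -> float:
    """Rough decode tokens/sec for a fully GPU-resident model.

    Assumes bandwidth-bound decoding: every token reads all weights once.
    The 0.8 efficiency factor is an assumption, not a measured value.
    """
    return bandwidth_gb_s / file_size_gb * efficiency

# RTX 3090 Ti: ~1008 GB/s GDDR6X; Q4_K_M CodeLlama 34B file: ~20 GB
print(round(estimate_tok_s(1008, 20), 1))  # -> 40.3
```

With the RTX 3090 Ti's ~1008 GB/s of memory bandwidth and a ~20 GB Q4_K_M file, this lands on the ~40.3 tok/s shown above.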


Every CodeLlama 34B quantization on NVIDIA GeForce RTX 3090 Ti

Each row runs the compatibility engine against your VRAM, RAM, and the model's requirements.

Q4_K_M (Best fit)
File Size: 20 GB
Min VRAM: 22 GB
Rec VRAM: 24 GB
Context: 4K / 16K
Verdict: Full GPU
Estimated tok/s: ~40.3
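The verdict above follows from a simple check: the quantized file must fit in VRAM with headroom for the KV cache and runtime buffers. A minimal sketch of such a check, assuming a ~10% overhead margin (the function name and thresholds are hypothetical, not the site's actual compatibility engine):

```python
def fit_verdict(file_size_gb: float, gpu_vram_gb: float,
                overhead_frac: float = 0.10) -> str:
    """Classify how a quantized model fits a GPU's VRAM.

    overhead_frac is an assumed margin for KV cache and runtime buffers.
    """
    min_vram = file_size_gb * (1 + overhead_frac)  # weights + overhead
    if gpu_vram_gb >= min_vram:
        return "Full GPU"          # everything resident on the GPU
    if gpu_vram_gb >= file_size_gb * 0.5:
        return "Partial offload"   # split layers between GPU and CPU RAM
    return "CPU only"

# Q4_K_M of CodeLlama 34B (~20 GB file) on a 24 GB RTX 3090 Ti
print(fit_verdict(20, 24))  # -> Full GPU
```

With the assumed 10% margin, the 20 GB file needs 22 GB minimum, matching the Min VRAM figure in the table, and the 24 GB card clears it.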

The NVIDIA GeForce RTX 3090 Ti is a solid pick for CodeLlama 34B

Need a second card or a fresh build? These links help support the site at no extra cost.

All hardware for CodeLlama 34B · Best GPU for CodeLlama 34B · Models that fit NVIDIA GeForce RTX 3090 Ti · Full model details · Browse all models