CUDA Out of memory error - SoojungHong/Riding_LLaMA-and-Fine-Tuning GitHub Wiki
My computer and the LLM model size
My GPU's video memory: 11751 MB (≈11.7 GB)
llama-2-7b-chat-hf model weights: 13.5 GB (fp16)
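A quick back-of-envelope check makes the problem obvious: the fp16 weights alone are larger than the card's VRAM, before any activations, gradients, or optimizer state are allocated. A minimal sketch using the numbers above (here "1 GB" means 10**9 bytes):

```python
# Feasibility check: can the fp16 model fit in this GPU's VRAM?
# Numbers taken from the measurements above.
vram_gb = 11.751          # reported video memory: 11751 MB
model_gb = 13.5           # llama-2-7b-chat-hf fp16 weights on disk
deficit_gb = model_gb - vram_gb
print(f"Shortfall before any activations/optimizer state: {deficit_gb:.1f} GB")
# → Shortfall before any activations/optimizer state: 1.7 GB
```

So the load is guaranteed to fail even with nothing else on the GPU.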
Error message
```
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 7.31 GiB. GPU
0%|          | 0/1000 [00:01<?, ?it/s]
```
(The second line is the training progress bar, interleaved with the traceback; the run failed on the first step.)
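One common way out when the fp16 weights exceed VRAM is to load the model quantized. A rough, hypothetical sizing sketch for a 7B-parameter model at different weight precisions (weights only, 1 GB = 10**9 bytes; real memory use is higher):

```python
# Back-of-envelope weight sizes for a 7B-parameter model at several precisions.
# These are illustrative estimates, not measured values.
params = 7_000_000_000
bytes_per_param = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}
vram_gb = 11.751  # this machine's video memory, from above

for name, b in bytes_per_param.items():
    size_gb = params * b / 1e9
    verdict = "fits" if size_gb < vram_gb else "does not fit"
    print(f"{name}: {size_gb:.1f} GB -> {verdict}")
# fp16 does not fit; int8 and int4 do (weights only)
```

In practice, with the `transformers` + `bitsandbytes` stack this corresponds to passing a quantization config to `from_pretrained`, e.g. `AutoModelForCausalLM.from_pretrained(model_id, quantization_config=BitsAndBytesConfig(load_in_8bit=True))`; exact memory savings depend on the model and the library versions used.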