VRAM requirements?

#3
by hamaadtahiir - opened

Can it work on a single 3090 with 24 GB VRAM using vLLM?

Unsloth AI org

> Can it work on a single 3090 with 24 GB VRAM using vLLM?

Yes, of course!
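For reference, a minimal starting point for serving on a single 24 GB card, assuming a quantized or small enough model; the model name here is a placeholder, not the actual repo, and the memory/context settings are conservative guesses you may need to tune:

```shell
# Hypothetical example: serve a model with vLLM on one 24 GB GPU.
# Model name is a placeholder -- substitute the actual repo ID.
vllm serve unsloth/MODEL-NAME \
  --gpu-memory-utilization 0.90 \
  --max-model-len 8192 \
  --dtype bfloat16
```

If vLLM fails at startup with an out-of-memory error, lowering `--max-model-len` (which shrinks the KV cache) or `--gpu-memory-utilization` is usually the first thing to try.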

Please tell me the command-line parameters to use with vLLM. I'm assuming this would also allow video inference?
