cortexso/qwen3
Maintained by the Cortex organization on Hugging Face.
Tags: Text Generation · GGUF · cortex.cpp · featured · conversational
License: apache-2.0

Files and versions
2 contributors · History: 6 commits
Latest commit a826783 (verified) by Minh141120: "Create README.md", about 2 months ago
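The commit messages show the files were pushed with huggingface_hub, and the same library can enumerate what the repository contains before you decide which quantization to pull. A minimal sketch, assuming only that the repo id `cortexso/qwen3` from the page header is correct; the filtering logic is illustrative, not part of this repository:

```python
# Enumerate the GGUF weight files published in cortexso/qwen3.
# Requires: pip install huggingface_hub
from huggingface_hub import list_repo_files

# Repo id taken from the page header; "model" is the default repo type on the Hub.
files = list_repo_files("cortexso/qwen3", repo_type="model")

# Keep only the GGUF weights and print them so a quantization can be chosen.
for name in sorted(f for f in files if f.endswith(".gguf")):
    print(name)
```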
All .gguf weight files are stored via Git LFS and carry the Hub's "Safe" scan badge.

| File | Size | Last commit | Updated |
|---|---|---|---|
| .gitattributes | 2.65 kB | Upload folder using huggingface_hub | about 2 months ago |
| README.md | 2.48 kB | Create README.md | about 2 months ago |
| metadata.yml | 46 Bytes | Upload metadata.yml with huggingface_hub | about 2 months ago |
| model.yml | 850 Bytes | Upload model.yml with huggingface_hub | about 2 months ago |
| qwen3-4b-q2_k.gguf | 1.67 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-4b-q3_k_l.gguf | 2.24 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-4b-q3_k_m.gguf | 2.08 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-4b-q3_k_s.gguf | 1.89 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-4b-q4_k_m.gguf | 2.5 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-4b-q4_k_s.gguf | 2.38 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-4b-q5_k_m.gguf | 2.89 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-4b-q5_k_s.gguf | 2.82 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-4b-q6_k.gguf | 3.31 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-4b-q8_0.gguf | 4.28 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-8b-q2_k.gguf | 3.28 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-8b-q3_k_l.gguf | 4.43 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-8b-q3_k_m.gguf | 4.12 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-8b-q3_k_s.gguf | 3.77 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-8b-q4_k_m.gguf | 5.03 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-8b-q4_k_s.gguf | 4.8 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-8b-q5_k_m.gguf | 5.85 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-8b-q5_k_s.gguf | 5.72 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-8b-q6_k.gguf | 6.73 GB | Upload folder using huggingface_hub | about 2 months ago |
| qwen3-8b-q8_0.gguf | 8.71 GB | Upload folder using huggingface_hub | about 2 months ago |
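A single quantization can be fetched on its own instead of cloning the whole repository. A hedged sketch using hf_hub_download: the filename `qwen3-4b-q4_k_m.gguf` is copied from the table above, and the downloaded path can then be pointed at by cortex.cpp or any other GGUF-compatible runtime.

```python
# Download one quantization variant from cortexso/qwen3.
# Requires: pip install huggingface_hub
from huggingface_hub import hf_hub_download

# Filename copied from the file table above (4B model, Q4_K_M quantization, ~2.5 GB).
local_path = hf_hub_download(
    repo_id="cortexso/qwen3",
    filename="qwen3-4b-q4_k_m.gguf",
)

print(f"GGUF weights downloaded to: {local_path}")
# The file at local_path can now be loaded by a GGUF-compatible runtime
# such as cortex.cpp.
```

The usual trade-off applies when picking from the table: the larger quantizations (q6_k, q8_0) preserve more of the original model quality at the cost of disk space and memory, while the smaller ones (q2_k, q3_k_s) fit tighter hardware budgets with some quality loss.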