SAGE-MM-Qwen3-VL-4B-SFT-GGUF

SAGE-MM-Qwen3-VL-4B-SFT from allenai is a 4B-parameter vision-language model fine-tuned from Qwen/Qwen3-VL-4B-Instruct. It serves as the core decision-maker in the SAGE (Smart Any-Horizon Agent) system for long-video reasoning and operates in two stages: Stage 1 analyzes the initially sampled frames and video metadata to classify a query as single-turn (answered immediately) or multi-turn (requiring tools), while Stage 2 iteratively calls tools, processes their observations, and refines the working context until the query is resolved. The model emits JSON-formatted actions for specialized tools such as web-search (external knowledge), transcribe-speech (ASR over given timestamps), ground-event (temporal localization), extract-video-parts (high-resolution frames or subclips), and analyze (detailed visual breakdown), enabling robust handling of extended videos beyond fixed horizons through dynamic tool orchestration. It is released under Apache 2.0 for research and educational use per Ai2 guidelines, requires the SAGE GitHub runtime for tool execution and output parsing, and is provided here as GGUF quantizations for efficient deployment in video Q&A across sports, narratives, and events.
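
The card does not publish the exact action schema, so the sketch below is only a minimal, hypothetical illustration of the two-stage loop described above. It assumes the model emits JSON objects with `action` and `arguments` fields and that the SAGE runtime exposes one callable per tool; the `run_model` placeholder, the `TOOLS` registry, the `final-answer` stopping convention, and all field names are assumptions for illustration, not the actual SAGE API.

```python
import json

# Hypothetical tool registry; the real implementations live in the SAGE GitHub runtime.
TOOLS = {
    "web-search": lambda args: {"results": f"stub search for {args.get('query')}"},
    "transcribe-speech": lambda args: {"transcript": "stub ASR output"},
    "ground-event": lambda args: {"timestamps": [0.0, 12.5]},
    "extract-video-parts": lambda args: {"frames": ["frame_001.jpg"]},
    "analyze": lambda args: {"analysis": "stub visual breakdown"},
}

def run_model(messages):
    """Placeholder for a call to the SFT model (e.g. via llama.cpp or transformers).
    Assumed to return either a JSON tool action or a final-answer object as a string."""
    raise NotImplementedError

def sage_loop(query, initial_frames, metadata, max_turns=8):
    # Stage 1: classify the query from the initially sampled frames and metadata.
    messages = [{"role": "user",
                 "content": {"query": query, "frames": initial_frames, "meta": metadata}}]
    for _ in range(max_turns):
        raw = run_model(messages)
        action = json.loads(raw)                     # model output is JSON-formatted
        if action.get("action") == "final-answer":   # single-turn, or multi-turn resolved
            return action.get("answer")
        # Stage 2: execute the requested tool and feed the observation back.
        observation = TOOLS[action["action"]](action.get("arguments", {}))
        messages.append({"role": "assistant", "content": raw})
        messages.append({"role": "tool", "content": json.dumps(observation)})
    return None  # unresolved within the turn budget
```

In practice the SAGE runtime handles the prompt construction, JSON parsing, tool execution, and context management; this sketch only mirrors the control flow the card describes.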

SAGE-MM-Qwen3-VL-4B-SFT [GGUF]

| File Name | Quant Type | File Size |
| --- | --- | --- |
| SAGE-MM-Qwen3-VL-4B-SFT.IQ4_XS.gguf | IQ4_XS | 2.49 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.Q2_K.gguf | Q2_K | 1.8 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.Q3_K_L.gguf | Q3_K_L | 2.41 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.Q3_K_M.gguf | Q3_K_M | 2.24 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.Q3_K_S.gguf | Q3_K_S | 2.05 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.Q4_K_M.gguf | Q4_K_M | 2.72 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.Q4_K_S.gguf | Q4_K_S | 2.6 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.Q5_K_M.gguf | Q5_K_M | 3.16 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.Q5_K_S.gguf | Q5_K_S | 3.09 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.Q6_K.gguf | Q6_K | 3.63 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.Q8_0.gguf | Q8_0 | 4.69 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.f16.gguf | F16 | 8.83 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-IQ1_M.gguf | i1-IQ1_M | 1.25 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-IQ1_S.gguf | i1-IQ1_S | 1.18 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-IQ2_M.gguf | i1-IQ2_M | 1.68 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-IQ2_S.gguf | i1-IQ2_S | 1.58 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-IQ2_XS.gguf | i1-IQ2_XS | 1.48 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-IQ2_XXS.gguf | i1-IQ2_XXS | 1.37 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-IQ3_M.gguf | i1-IQ3_M | 2.13 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-IQ3_S.gguf | i1-IQ3_S | 2.07 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-IQ3_XS.gguf | i1-IQ3_XS | 1.98 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-IQ3_XXS.gguf | i1-IQ3_XXS | 1.84 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-IQ4_NL.gguf | i1-IQ4_NL | 2.6 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-IQ4_XS.gguf | i1-IQ4_XS | 2.48 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-Q2_K.gguf | i1-Q2_K | 1.8 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-Q2_K_S.gguf | i1-Q2_K_S | 1.69 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-Q3_K_L.gguf | i1-Q3_K_L | 2.41 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-Q3_K_M.gguf | i1-Q3_K_M | 2.24 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-Q3_K_S.gguf | i1-Q3_K_S | 2.05 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-Q4_0.gguf | i1-Q4_0 | 2.59 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-Q4_1.gguf | i1-Q4_1 | 2.84 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-Q4_K_M.gguf | i1-Q4_K_M | 2.72 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-Q4_K_S.gguf | i1-Q4_K_S | 2.6 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-Q5_K_M.gguf | i1-Q5_K_M | 3.16 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-Q5_K_S.gguf | i1-Q5_K_S | 3.09 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.i1-Q6_K.gguf | i1-Q6_K | 3.63 GB |
| SAGE-MM-Qwen3-VL-4B-SFT.imatrix.gguf | imatrix | 3.87 MB |
| SAGE-MM-Qwen3-VL-4B-SFT.mmproj-Q8_0.gguf | mmproj-Q8_0 | 454 MB |
| SAGE-MM-Qwen3-VL-4B-SFT.mmproj-f16.gguf | mmproj-f16 | 836 MB |
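
A multimodal GGUF deployment needs both a quantized language-model file and a matching mmproj vision-projector file from the table above. Below is a minimal sketch for fetching one of each with `huggingface_hub`; the filenames are taken from the table, while the chosen quant (Q4_K_M) and the comment about running the files with llama.cpp are assumptions, not tested commands.

```python
from huggingface_hub import hf_hub_download

repo_id = "prithivMLmods/SAGE-MM-Qwen3-VL-4B-SFT-GGUF"

# Quantized LM weights (Q4_K_M chosen here as a common quality/size middle ground).
model_path = hf_hub_download(
    repo_id=repo_id,
    filename="SAGE-MM-Qwen3-VL-4B-SFT.Q4_K_M.gguf",
)

# Vision projector; required alongside the LM weights for image/video inputs.
mmproj_path = hf_hub_download(
    repo_id=repo_id,
    filename="SAGE-MM-Qwen3-VL-4B-SFT.mmproj-f16.gguf",
)

print(model_path)
print(mmproj_path)
# Assumption: with a llama.cpp build that supports the qwen3vl architecture, the two
# files are passed together, e.g. as the model file plus an `--mmproj` projector file.
```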

Quants Usage

(sorted by size, not necessarily by quality; IQ-quants are often preferable to similarly sized non-IQ quants)

Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better):

[image: ikawrakow's quant-type comparison graph]

Downloads last month: 2,703
Format: GGUF
Model size: 4B params
Architecture: qwen3vl

Quantization bit widths available: 1-bit, 2-bit, 3-bit, 4-bit, 5-bit, 6-bit, 8-bit, 16-bit.


Model tree for prithivMLmods/SAGE-MM-Qwen3-VL-4B-SFT-GGUF: this repository is listed among the quantized (4) versions derived from the base model.