Pull requests: intel/auto-round
- fix: add missing run_mllm entry point alias (#1695, opened Apr 16, 2026 by JGSphaela)
- update mtp quant for special cases (#1691, opened Apr 16, 2026 by xin3he)
- Fix module.to("meta") for models with plain Tensors (#1688, opened Apr 15, 2026 by yiliu30)
- rename scheme INT8_W8A8 to INT8 (#1687, opened Apr 15, 2026 by thuang6)
- Add Claude skills for AutoRound (#1686, opened Apr 15, 2026 by lvliang-intel)
- fix: pass dynamic=True to torch.compile to stop dynamo recompilation (#1685, draft, opened Apr 15, 2026 by rahulp7801)
- Security: HTTP requests are performed without timeout safeguards (#1683, opened Apr 15, 2026 by tomaioo)
- Feats: Quantize/save/evaluate the Wan-AI/WAN2.2 models in w4a16 format (#1678, opened Apr 14, 2026 by lvliang-intel)
- Refactor: use get_submodule with manual traversal fallback in get_module (#1677, opened Apr 13, 2026 by yael-shr)
- [step 1] support variable block input shapes for gemma4 (#1656, opened Apr 3, 2026 by wenhuach21)
- Enable low_cpu_mem_usage for mxfp/nvfp (#1648, opened Apr 2, 2026 by Kaihui-intel)
- [Draft] Support TurboQuant KV-cache quantization (#1634, draft, opened Mar 27, 2026 by lvliang-intel)
- Support ByteDance-Seed/BAGEL-7B-MoT quantization in w4a16 format (#1633, opened Mar 27, 2026 by lvliang-intel)
- Fix ignore_layers not working for FP8 models (#1286, opened Jan 15, 2026 by Copilot)
- [WIP][refactor quantizers][step 1] refactor rtn and tuning (#1278, opened Jan 14, 2026 by n1ck-guo)
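Of the refactors above, #1677 ("use get_submodule with manual traversal fallback in get_module") names a reusable lookup pattern: prefer the object's own dotted-path resolver when present (as `torch.nn.Module.get_submodule` is), and fall back to walking attributes by hand. The sketch below is framework-free and hypothetical, based only on the PR title; the `get_module` name, the exact fallback behavior, and the `_Node` container are assumptions, not AutoRound's actual implementation.

```python
# Hypothetical sketch of the pattern named in PR #1677: resolve a dotted
# path like "encoder.layer.1" on a nested object tree, preferring a
# built-in get_submodule() and falling back to manual traversal.
def get_module(root, name):
    getter = getattr(root, "get_submodule", None)
    if callable(getter):
        try:
            return getter(name)
        except AttributeError:
            pass  # resolver missing the path: fall through to manual walk
    module = root
    for part in name.split("."):
        if part.isdigit() and hasattr(module, "__getitem__"):
            module = module[int(part)]  # list-like containers (e.g. layers)
        else:
            module = getattr(module, part)
    return module


class _Node:
    """Minimal stand-in for a nested module tree (no get_submodule)."""


root = _Node()
root.encoder = _Node()
root.encoder.layer = [_Node(), _Node()]
print(get_module(root, "encoder.layer.1") is root.encoder.layer[1])  # True
```

With a real `torch.nn.Module` the first branch would normally succeed, so the manual walk only runs for containers that lack (or fail) the library resolver.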