ImportError: cannot import name 'LossKwargs' from 'transformers.utils'
Has anyone encountered the error ImportError: cannot import name 'LossKwargs' from 'transformers.utils' for this line https://huggingface.co/microsoft/Phi-4-mini-instruct/blob/main/modeling_phi3.py#L38 ?
This happens while trying to generate dummy LoRA checkpoints for this model with:
from peft import LoraConfig, get_peft_model
# [...]
lora_config = LoraConfig(
    r=lora_rank,
    target_modules=target_modules,
    bias="none",
    task_type="CAUSAL_LM",
)
lora_output_paths = []
for lora_idx in range(num_loras):
    lora_model = get_peft_model(model, lora_config)
I tried installing multiple transformers versions, including 4.45.0, which is the version specified in the model's config.json:
llm_venv.run_cmd(["-m", "pip", "install", "--force-reinstall", "--no-cache-dir", "transformers"]) # 4.55.2
llm_venv.run_cmd(["-m", "pip", "install", "--force-reinstall", "git+https://github.com/huggingface/transformers.git"]) # 4.56.0.dev0
llm_venv.run_cmd(["-m", "pip", "install", "--force-reinstall", "--no-cache-dir", "transformers==4.45.0"]) # 4.45.0
But the issue persists.
Same here; I'm unable to import the name from transformers as well.
Try downgrading trl and transformers:
pip install transformers==4.53.3
pip install trl==0.20.0
I think LossKwargs was removed in transformers==4.54.0:
https://github.com/sgl-project/sglang/issues/8004#issuecomment-3148397838
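If it helps, a small version gate can tell you up front whether an installed transformers should still expose the old import. This is a sketch based on this thread's claim that LossKwargs was removed in 4.54.0; the cutoff and the helper names are assumptions, not part of any library API:

```python
from importlib.metadata import PackageNotFoundError, version

# Assumption from this thread: transformers.utils.LossKwargs was last
# importable in the 4.53.x series and removed in 4.54.0.
LAST_GOOD = (4, 53)

def losskwargs_in_utils(ver: str) -> bool:
    """Return True if transformers `ver` should still expose
    transformers.utils.LossKwargs, per the cutoff assumed above."""
    major, minor = (int(part) for part in ver.split(".")[:2])
    return (major, minor) <= LAST_GOOD

def installed_transformers_ok() -> bool:
    """Check the currently installed transformers, if any."""
    try:
        return losskwargs_in_utils(version("transformers"))
    except PackageNotFoundError:
        return False
```

For example, `losskwargs_in_utils("4.53.3")` is True while `losskwargs_in_utils("4.55.2")` is False, which matches the versions people report failing above.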
I have the same problem here.
Unfortunately, that didn't help on my end
This worked for me:
🔧 Step-by-Step Fix

1. Clear the Hugging Face cache
Sometimes an old copy of modeling_phi3.py with the bad import is cached:
    rm -rf ~/.cache/huggingface/hub/models--microsoft--Phi-4-mini-instruct
(On Windows, delete the folder %USERPROFILE%\.cache\huggingface\hub\models--microsoft--Phi-4-mini-instruct.)

2. Try loading without remote code
If your transformers version already supports Phi models:
    from transformers import AutoModelForCausalLM
    model = AutoModelForCausalLM.from_pretrained(
        "microsoft/Phi-4-mini-instruct",
        trust_remote_code=False,
    )
✅ This bypasses the buggy modeling_phi3.py.

3. If you still need trust_remote_code=True
Download the model locally (snapshot_download or git clone). Edit modeling_phi3.py and replace:
    from transformers.utils import LossKwargs
with:
    try:
        from transformers.utils import LossKwargs
    except ImportError:
        from transformers.loss.loss_utils import LossKwargs
Then load from the local path:
    model = AutoModelForCausalLM.from_pretrained(
        "./Phi-4-mini-instruct",
        trust_remote_code=True,
    )

4. Ensure a compatible transformers version
The model repo may have been written for a specific release. Pin it explicitly:
    pip install "transformers==4.45.0"
or with uv:
    uv add transformers==4.45.0
    uv sync

✅ Quickest fix: clear the cache and try trust_remote_code=False.
🛠️ If you must use remote code: patch the import in modeling_phi3.py.
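The try/except patch above can be written as a small generic helper that walks a list of candidate import locations and returns the first one that resolves. This is a sketch; the transformers module paths are taken from this thread and may change again in future releases:

```python
import importlib

def import_first(*specs):
    """Resolve the first 'module:attribute' spec that imports cleanly.

    Generic form of the try/except fallback patch above; raises
    ImportError (with all the individual failures) if none resolve.
    """
    errors = []
    for spec in specs:
        module_name, _, attr = spec.partition(":")
        try:
            module = importlib.import_module(module_name)
            return getattr(module, attr)
        except (ImportError, AttributeError) as exc:
            errors.append(f"{spec}: {exc}")
    raise ImportError("no spec resolved: " + "; ".join(errors))

# Intended use inside a patched modeling_phi3.py (untested here,
# since it depends on your installed transformers version):
# LossKwargs = import_first(
#     "transformers.utils:LossKwargs",
#     "transformers.loss.loss_utils:LossKwargs",
# )
```

The advantage over a hard-coded try/except is that adding a third candidate path later is a one-line change.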
transformers version 4.50.0 worked for me
transformers==4.53.3 works for me! Thank you!