
ValueError: Unrecognized model in allenai/OLMo-2-1124-7B-Instruct

#1
by ryzzlestrizzle - opened

I get this funky error when running the code below with transformers==4.52.4 and torch==2.7.1+cu126:

import torch
from transformers import AutoModelForCausalLM, __version__


print(torch.__version__)
print(f"Transformers version: {__version__}")


model_path = "allenai/OLMo-2-1124-7B-Instruct"
model = AutoModelForCausalLM.from_pretrained(model_path)

Any pointers on fixing this? Weirdly, it works for the 1B instruct model.

Never mind, the problem turned out to be an environment issue when running the Portuguese lm-eval harness; it has nothing to do with this repo or the code above.
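For anyone who lands here with the same error: "Unrecognized model" is typically raised when the `model_type` in the repo's `config.json` (here `olmo2`) is not found in the registry of the installed transformers version, e.g. because the environment pins an older release. A minimal sketch of that lookup (the registry set and function name below are illustrative, not the actual transformers internals):

```python
# Illustrative sketch: AutoModel classes resolve a repo by reading
# "model_type" from config.json and looking it up in a registry of
# supported architectures; an unknown key raises ValueError.

SUPPORTED_MODEL_TYPES = {"llama", "olmo", "olmo2"}  # illustrative subset


def resolve_model_type(config: dict, repo_id: str) -> str:
    """Return the model_type if supported, else raise like AutoModel does."""
    model_type = config.get("model_type")
    if model_type not in SUPPORTED_MODEL_TYPES:
        raise ValueError(f"Unrecognized model in {repo_id}")
    return model_type


# Succeeds when the installed registry knows "olmo2":
print(resolve_model_type({"model_type": "olmo2"}, "allenai/OLMo-2-1124-7B-Instruct"))
# → olmo2
```

So if a different environment (like the eval harness's) pulls in an older transformers that predates `olmo2` support, the same repo that loads fine elsewhere will fail with exactly this ValueError.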

ryzzlestrizzle changed discussion status to closed
