Basic checks
What's broken?
I have two MacBooks: one late 2022, and a new 16-inch with an M4 Pro chip. On the old MacBook I have no issues, but on the new Mac I'm getting `Unknown model: gpt-4.1`:
```ruby
chat = RubyLLM.chat(model: "gpt-4.1").with_tool(Story::ModifyTool.new(story: @story))
[..]
```
Yes, my API key is the same on both machines. On the new Mac (the one with the issue), I get the same error even when the API key is blank. I'm out of ideas, since it's the same repo on both Macs.
How to reproduce
```ruby
# initializer
RubyLLM.configure do |config|
  config.openai_api_key = ENV["OPENAI_API_KEY"]

  # Use the new association-based acts_as API (recommended)
  config.use_new_acts_as = true
end
```

```ruby
# run with
chat = RubyLLM.chat(model: "gpt-4.1").with_tool(Story::ModifyTool.new(story: @story))
[..]
```
I also tried it in the terminal; there it errors with `'<main>': Unknown model: gpt-5-nano (RubyLLM::ModelNotFoundError)`.
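For what it's worth, the symptom (same error with a blank API key) suggests the failure happens at registry lookup, before any API call. Below is a minimal, self-contained sketch of that pattern — this is illustrative only, not RubyLLM's actual implementation: a registry whose locally cached model list is stale or empty raises "Unknown model" for every lookup, regardless of credentials.

```ruby
# Illustrative sketch (hypothetical class, not RubyLLM's real code):
# a model lookup backed by a local registry cache. If the cache is
# empty/stale, find raises "Unknown model" even with a valid API key.
class ModelRegistry
  class ModelNotFoundError < StandardError; end

  def initialize(model_ids)
    @model_ids = model_ids
  end

  # Mimics refreshing the cached model list from the provider.
  def refresh!(model_ids)
    @model_ids = model_ids
  end

  def find(id)
    raise ModelNotFoundError, "Unknown model: #{id}" unless @model_ids.include?(id)
    id
  end
end

registry = ModelRegistry.new([]) # stale/empty local cache
begin
  registry.find("gpt-4.1")
rescue ModelRegistry::ModelNotFoundError => e
  puts e.message # => Unknown model: gpt-4.1
end

registry.refresh!(["gpt-4.1"])
puts registry.find("gpt-4.1") # => gpt-4.1
```

If the analogy holds, a difference in the locally cached model list between the two machines (rather than the API key) would explain why only the new Mac fails.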
Expected behavior
Should return a chat response
What actually happened
It errors with: Unknown model: gpt-4.1
Environment
- Ruby: 4.0.1
- Rails: 8.1.2
- RubyLLM: 1.11.0
- OS: MacBook 16 inch, M4 Pro Chip