
bug: Unsupported Parameter Errors for model o1-mini & o1-preview #3745

Closed
1 of 3 tasks
PucaVaz opened this issue Sep 30, 2024 · 11 comments
Assignees
Labels
  • category: model settings (Inference params, presets, templates)
  • category: model support (Support new model, or fix broken model)
  • category: providers (Local & remote inference providers)
  • duplicate (This issue or pull request already exists)
  • move to Cortex
  • type: bug (Something isn't working)
Milestone

Comments

@PucaVaz

PucaVaz commented Sep 30, 2024

Jan version

v0.5.4-649

Describe the Bug

I can't disable the stop words setting, so I get this error:

Unsupported parameter: 'stop' is not supported with this model.
Jan’s in beta. Access troubleshooting assistance now.

Steps to Reproduce

Just try to use the newer model from OpenAI, o1-preview.

Screenshots / Logs

Screenshot 2024-09-30 at 18 06 41

What is your OS?
  • MacOS
  • Windows
  • Linux
@PucaVaz PucaVaz added the type: bug Something isn't working label Sep 30, 2024
@github-project-automation github-project-automation bot moved this to Investigating in Jan & Cortex Sep 30, 2024
@freelerobot freelerobot added category: providers Local & remote inference providers category: model support Support new model, or fix broken model category: model settings Inference params, presets, templates labels Oct 14, 2024
@imtuyethan
Contributor

Duplicate of #3771

Reproducible:

Image

@imtuyethan
Contributor

imtuyethan commented Oct 17, 2024

Should be fixed as a part of Remote API Extension #3505

@imtuyethan
Contributor

Related #3771

@louis-jan
Contributor

Hi @PucaVaz @imtuyethan. Could you please help try the latest version, 0.5.8, to see if the issue has been resolved?

@PucaVaz
Author

PucaVaz commented Nov 15, 2024

@louis-jan For sure, just one second.

@PucaVaz
Author

PucaVaz commented Nov 15, 2024

Now it works, but not by default. I needed to change a few settings to get it working: set the temperature to 1, set top P to 1, and turn streaming off. It's cool that it works now.

Screenshot 2024-11-15 at 07 34 06

It's strange that if an error occurs, the message doesn't say anything about it.

Screenshot 2024-11-15 at 07 36 22
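For context, the workaround above amounts to stripping or pinning the request parameters that the o1 beta models reject. A minimal sketch of that idea in Python (the helper name and the O1_MODELS set are hypothetical, not Jan's actual code; the parameter names follow the OpenAI Chat Completions API):

```python
# Hypothetical sketch: sanitize a chat-completion payload for o1 beta models.
O1_MODELS = {"o1-mini", "o1-preview"}

def sanitize_payload(payload: dict) -> dict:
    """Drop or pin parameters that o1 beta models reject."""
    if payload.get("model") not in O1_MODELS:
        return payload  # other models accept the payload as-is
    cleaned = dict(payload)
    cleaned.pop("stop", None)   # 'stop' is unsupported, so remove it
    cleaned["temperature"] = 1  # only the default value is accepted
    cleaned["top_p"] = 1
    cleaned["stream"] = False   # streaming is not available for these models
    return cleaned
```

A client could apply this just before sending the request, so users never have to hand-tune the settings themselves.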

@louis-jan
Contributor

louis-jan commented Nov 15, 2024

Thanks @PucaVaz, we’ll handle the message error gracefully.

@imtuyethan imtuyethan added this to the v0.5.8 milestone Nov 18, 2024
@imtuyethan imtuyethan moved this from In Review to Completed in Jan & Cortex Nov 18, 2024
@imtuyethan imtuyethan moved this from Completed to Review + QA in Jan & Cortex Nov 18, 2024
@imtuyethan imtuyethan added the duplicate This issue or pull request already exists label Nov 18, 2024
@imtuyethan imtuyethan moved this from Review + QA to Completed in Jan & Cortex Nov 18, 2024
@imtuyethan imtuyethan reopened this Nov 18, 2024
@github-project-automation github-project-automation bot moved this from Completed to In Progress in Jan & Cortex Nov 18, 2024
@imtuyethan imtuyethan changed the title bug: Unsupported parameter: 'stop' is not supported with this model on o1 preview bug: Unsupported Parameter Errors for model o1-mini & o1-preview Nov 18, 2024
@imtuyethan imtuyethan removed this from the v0.5.8 milestone Nov 18, 2024
@imtuyethan imtuyethan added this to the v0.5.9 milestone Nov 18, 2024
@imtuyethan
Contributor

@louis-jan Can reproduce on my end:

Screen.Recording.2024-11-18.at.5.31.11.PM.mov

@imtuyethan
Contributor

imtuyethan commented Nov 20, 2024

They work now; however:

  • There's no way to adjust the stream setting? I can't see an option to turn it on and off. All responses now default to streaming off.
  • This is a big friction point in the user experience because the Generating Response indicator feels stuck and unnatural for a simple prompt.

OpenAI o1-mini

Screen.Recording.2024-11-20.at.4.00.06.PM.mov

OpenAI o1-preview

Screen.Recording.2024-11-20.at.4.02.09.PM.mov

cc @louis-jan

@louis-jan
Contributor

@imtuyethan o1 beta models do not support streaming. See https://openai.com/index/introducing-openai-o1-preview/:

"The API for these models currently doesn't include function calling, streaming, support for system messages, and other features."
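Since streaming isn't available for these models, one way a client could keep a stream-oriented UI responsive is to fetch the full completion and replay it in chunks. A hypothetical sketch (not Jan's actual implementation):

```python
def as_pseudo_stream(full_text: str, chunk_size: int = 16):
    """Replay a completed (non-streamed) response in small chunks so UI code
    written against a streaming interface can still render progressively."""
    for i in range(0, len(full_text), chunk_size):
        yield full_text[i:i + chunk_size]
```

This would at least let the Generating Response indicator hand off to visible output instead of appearing stuck until the whole reply arrives.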

@imtuyethan
Contributor

Updated: will fix

@imtuyethan imtuyethan moved this from Review + QA to In Progress in Jan & Cortex Nov 20, 2024
@imtuyethan imtuyethan moved this from Review + QA to Completed in Jan & Cortex Nov 21, 2024
Projects
Archived in project
Development

No branches or pull requests

4 participants