feat: Provider Extension - OpenAI #3370
Closed
Labels: category: model support (Support new model, or fix broken model); category: providers (Local & remote inference providers); type: feature request (A new feature)
Comments
imtuyethan changed the title from "feat: [Add support for GPT-4o Mini]" to "feat: Add support for GPT-4o Mini" on Aug 30, 2024.
Related issue: Scoped in Jan v0.6.0 uses Cortex Extension.
freelerobot added the "category: model support" and "category: providers" labels on Oct 14, 2024.
freelerobot changed the title from "feat: Add support for GPT-4o Mini" to "feat: Provider Extension - OpenAI" on Oct 17, 2024.
Closed in favor of #3786.
Goal
Original User Issue
Add support for GPT-4o Mini
Is your feature request related to a problem? Please describe it.
Jan does not support OpenAI's GPT-4o Mini.
Describe the solution
Add support for GPT-4o Mini.
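For context, a minimal sketch of the request such a provider extension would ultimately issue, using OpenAI's public chat completions API with the `gpt-4o-mini` model identifier. This is illustrative TypeScript only, not Jan's actual extension code; the function name and error handling are assumptions.

```ts
// Hypothetical helper (not Jan's extension API): send a prompt to GPT-4o Mini
// via OpenAI's chat completions endpoint and return the reply text.
async function chatWithGpt4oMini(prompt: string): Promise<string> {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Assumes the OpenAI API key is provided via an environment variable.
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: prompt }],
    }),
  });

  if (!response.ok) {
    throw new Error(`OpenAI request failed: ${response.status}`);
  }

  const data = await response.json();
  // Chat completions responses carry the assistant reply in choices[0].message.content.
  return data.choices[0].message.content;
}
```

In practice the extension would also need to surface GPT-4o Mini in Jan's model list and pass through the user's API key and generation parameters, but the core request shape is the same as above.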
Teachability, documentation, adoption, migration strategy
No response
What is the motivation / use case for changing the behavior?
GPT-4o Mini is OpenAI's most cost-efficient small model; it is smarter and cheaper than GPT-3.5 Turbo.