-
Is there documentation or examples on how to use multi-shot prompting with Instructor? Since the Pydantic classes abstract away the actual message passed to the LLM, how do we craft examples as part of the multi-shot conversation in the prompt? It would be sweet to be able to use example values to manually instantiate objects of the same type that will be used as the response_model, pass them to the chat/completion methods, and have Instructor take care of the translation into what the model expects.
-
I haven't seen any.
Simple: just use the multi-turn messages format shown at https://platform.openai.com/docs/guides/text-generation/chat-completions-api.
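Applied to Instructor, a minimal sketch (the Pydantic model, model name, and example messages here are just illustrative, not from the docs):

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel


# Illustrative response model
class UserInfo(BaseModel):
    name: str
    age: int


# instructor.from_openai() on instructor >= 1.0; older releases used instructor.patch()
client = instructor.from_openai(OpenAI())

user = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    response_model=UserInfo,
    messages=[
        {"role": "system", "content": "Extract the user's name and age."},
        # Few-shot example: a user turn plus the assistant output you want imitated
        {"role": "user", "content": "Alice turned 31 last week."},
        {"role": "assistant", "content": '{"name": "Alice", "age": 31}'},
        # The actual input
        {"role": "user", "content": "John is a 25-year-old engineer."},
    ],
)
print(user)  # expected: something like UserInfo(name='John', age=25)
```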
PS: If you show what you have tried and the results... you may get better help.
-
I wrote a small library here using Instructor: https://github.com/thomasnormal/fewshot
Here:
(old) #653
(new) https://python.useinstructor.com/examples/examples
But you can just add the examples in the prompt, together with the instructions and data. Except for the response_model, everything is the same as if Instructor were never there. In fact, I have a function where I do this:
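Roughly like this (a sketch, not the exact function; the Ticket model and the names are made up for illustration):

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel

client = instructor.from_openai(OpenAI())


class Ticket(BaseModel):  # illustrative response model
    title: str
    priority: str


def extract_with_examples(text: str, examples: list[tuple[str, Ticket]]) -> Ticket:
    """Turn (input text, expected Ticket) pairs into few-shot messages,
    dumping each example instance to JSON so the model sees the target shape."""
    messages = [{"role": "system", "content": "Extract a ticket from the text."}]
    for example_text, example_obj in examples:
        messages.append({"role": "user", "content": example_text})
        # model_dump_json() serialises the Pydantic instance into the JSON we want back
        messages.append({"role": "assistant", "content": example_obj.model_dump_json()})
    messages.append({"role": "user", "content": text})
    return client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative
        response_model=Ticket,
        messages=messages,
    )


ticket = extract_with_examples(
    "The login page throws a 500 error for every user.",
    examples=[
        ("Typo in the pricing page footer.", Ticket(title="Fix pricing page typo", priority="low")),
    ],
)
```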
I do this so that the 1st call has 'less work to do' when responding.
I know the retry may not work well as the original prompt is lost, but it works fine for me.