Documentation ¶
Overview ¶
Package dotprompt parses and renders dotprompt files.
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func SetDirectory ¶
func SetDirectory(directory string)
SetDirectory sets the directory where dotprompt files are read from.
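For example, a program would typically call this once at startup, before any prompts are opened; the directory name here is only illustrative:

// Read *.prompt files from ./prompts (illustrative path).
dotprompt.SetDirectory("prompts")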
Types ¶
type Config ¶
type Config struct {
	// The prompt variant.
	Variant string
	// The name of the model for which the prompt is input.
	// If this is non-empty, Model should be nil.
	ModelName string
	// The Model to use.
	// If this is non-nil, ModelName should be the empty string.
	Model *ai.Model
	// TODO(iant): document
	Tools []*ai.ToolDefinition
	// Number of candidates to generate when passing the prompt
	// to a model. If 0, uses 1.
	Candidates int
	// Details for the model.
	GenerationConfig *ai.GenerationCommonConfig
	InputSchema      *jsonschema.Schema // schema for input variables
	VariableDefaults map[string]any     // default input variable values
	// Desired output format.
	OutputFormat ai.OutputFormat
	// Desired output schema, for JSON output.
	OutputSchema map[string]any // TODO: use *jsonschema.Schema
	// Arbitrary metadata.
	Metadata map[string]any
}
Config is optional configuration for a Prompt.
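As a rough sketch, a caller usually sets only the fields it needs; the model name, default values, and generation settings below are illustrative, not package defaults:

cfg := dotprompt.Config{
	ModelName:  "example/some-model", // or set Model and leave ModelName empty
	Candidates: 1,
	GenerationConfig: &ai.GenerationCommonConfig{
		Temperature: 0.7,
	},
	VariableDefaults: map[string]any{
		"name": "friend",
	},
}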
type Prompt ¶
type Prompt struct {
	// The name of the prompt. Optional unless the prompt is
	// registered as an action.
	Name string
	Config
	// The parsed prompt template.
	Template *raymond.Template
	// The original prompt template text.
	TemplateText string
	// contains filtered or unexported fields
}
Prompt is a parsed dotprompt file.
A dotprompt file consists of YAML frontmatter within --- lines, followed by a template written in the Handlebars language.
The YAML frontmatter will normally define a JSON schema describing the expected input and output variables. The input variables will appear in the template. The JSON schemas may be defined in a compact picoschema format.
The templates are evaluated with two additional helpers:
- {{role r}} switches to role r for the text that follows
- {{media url=URL}} inserts media from URL, with an optional contentType
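For illustration only, a small greeting prompt might look like the following; the frontmatter keys and the picoschema field are examples, not a complete reference:

---
model: example/some-model
input:
  schema:
    name: string, the person to greet
  default:
    name: friend
---
{{role "system"}}
You are a concise, friendly greeter.
{{role "user"}}
Say hello to {{name}}.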
func Define ¶
Define creates and registers a new Prompt. This can be called from code that doesn't have a prompt file.
func New ¶
New creates a new Prompt without registering it. This may be used for testing or for direct calls not using the genkit action and flow mechanisms.
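The exact signatures of Define and New are not reproduced above, so the sketch below assumes both take a prompt name, the template text, and a Config, and return a *Prompt and an error:

// Assumed shape: Define(name, templateText string, cfg Config) (*Prompt, error).
p, err := dotprompt.Define("greeting",
	`{{role "user"}} Say hello to {{name}}.`,
	dotprompt.Config{ModelName: "example/some-model"},
)
if err != nil {
	// handle the error
}
_ = p
// New would be called the same way when registration is not wanted,
// for example in tests.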
func Open ¶
Open opens and parses a dotprompt file. The name is a base file name, without the ".prompt" extension.
func OpenVariant ¶
OpenVariant opens and parses a dotprompt file with a variant. If the variant does not exist, the non-variant version is tried.
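A sketch, assuming Open takes the base name and OpenVariant takes the base name plus the variant string; the file layout in the comments follows the usual dotprompt naming convention and is an assumption here:

// Assumed shape: Open(name string) (*Prompt, error).
// Reads <directory>/greeting.prompt.
p, err := dotprompt.Open("greeting")
if err != nil {
	// handle the error
}

// Assumed shape: OpenVariant(name, variant string) (*Prompt, error).
// Prefers <directory>/greeting.formal.prompt, falling back to greeting.prompt.
pv, err := dotprompt.OpenVariant("greeting", "formal")
if err != nil {
	// handle the error
}
_, _ = p, pv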
func (*Prompt) Generate ¶
func (p *Prompt) Generate(ctx context.Context, pr *PromptRequest, cb func(context.Context, *ai.GenerateResponseChunk) error) (*ai.GenerateResponse, error)
Generate executes a prompt. It does variable substitution and passes the rendered template to the AI model specified by the prompt.
This implements the ai.Prompt interface.
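Using the signature above, a non-streaming call on a previously opened prompt p might look like this; the input struct and its field are illustrative and must match the prompt's input schema:

type greetingInput struct {
	Name string `json:"name"`
}

ctx := context.Background()
resp, err := p.Generate(ctx, &dotprompt.PromptRequest{
	Variables: &greetingInput{Name: "Ada"},
}, nil) // nil callback: no streaming
if err != nil {
	// handle the error
}
_ = resp // *ai.GenerateResponse returned by the model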
func (*Prompt) RenderMessages ¶
RenderMessages executes the prompt's template and converts it into messages. This just runs the template; it does not call a model.
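The signature is not shown above; as an assumption, the sketch below treats RenderMessages as taking the input variables and returning the rendered messages:

// Assumed shape: RenderMessages(variables any) ([]*ai.Message, error).
msgs, err := p.RenderMessages(&greetingInput{Name: "Ada"})
if err != nil {
	// handle the error
}
_ = msgs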
type PromptRequest ¶
type PromptRequest struct {
	// Input fields for the prompt. If not nil, this should be a struct
	// or pointer to a struct that matches the prompt's input schema.
	Variables any `json:"variables,omitempty"`
	// Number of candidates to return; if 0, will be taken
	// from the prompt config; if still 0, will use 1.
	Candidates int `json:"candidates,omitempty"`
	// Model configuration. If nil, it will be taken from the prompt config.
	Config *ai.GenerationCommonConfig `json:"config,omitempty"`
	// Context to pass to model, if any.
	Context []any `json:"context,omitempty"`
	// The model to use. This overrides any model specified by the prompt.
	Model string `json:"model,omitempty"`
}
PromptRequest is a request to execute a dotprompt template and pass the result to a [Model].
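For example, a request can override the prompt's own configuration; every value below is illustrative, and greetingInput is the example struct from the Generate sketch above:

req := &dotprompt.PromptRequest{
	Variables:  &greetingInput{Name: "Ada"},
	Candidates: 2,                       // overrides Config.Candidates; 0 falls back to the prompt config, then 1
	Model:      "example/another-model", // overrides the prompt's model
	Config: &ai.GenerationCommonConfig{  // overrides the prompt's GenerationConfig
		Temperature: 0.2,
	},
}
_ = req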