createLlmCall

A function that calls an Anthropic LLM through the oracle. Almost any parameter supported by Anthropic's Create a Message API endpoint can be configured.

Arguments

promptCallbackId
uint

An identifier that is passed back into the onOracleLlmResponse callback, so the response can be matched to the request.

config
LlmRequest

An object that contains all configuration parameters for the LLM call. See below for details on the LlmRequest object.

Returns

counter
uint

An internal counter of the oracle, which is incremented on every call.
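
For orientation, here is a minimal, hypothetical sketch of a contract that issues such a call with the LlmRequest struct described in the next section. The import path, the oracleAddress variable, and the function name askLlm are assumptions; only the argument and return types follow this reference.

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Hypothetical sketch: the import path and variable names are assumptions,
// not part of the oracle's documented interface.
import "./interfaces/IOracle.sol";

contract LlmCaller {
    address public oracleAddress; // address of the deployed oracle contract

    constructor(address _oracleAddress) {
        oracleAddress = _oracleAddress;
    }

    // Issues an LLM call; the returned value is the oracle's internal
    // counter, incremented on every call.
    function askLlm(uint promptCallbackId, IOracle.LlmRequest memory config)
        internal
        returns (uint)
    {
        return IOracle(oracleAddress).createLlmCall(promptCallbackId, config);
    }
}
```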

LlmRequest object

The LlmRequest object (see source in IOracle.sol) has the following fields:

model
string

The model identifier; one of claude-3-5-sonnet-20240620, claude-3-opus-20240229, claude-3-sonnet-20240229, claude-3-haiku-20240307, claude-2.1, claude-2.0, or claude-instant-1.2.

frequencyPenalty
int8

An integer between -20 and 20, mapped to a float between -2.0 and 2.0; values greater than 20 are treated as null.

logitBias
string

A JSON string representing logit bias or an empty string if none.

maxTokens
uint32

The maximum number of tokens to generate, with 0 indicating null.

presencePenalty
int8

An integer between -20 and 20, mapped to a float between -2.0 and 2.0; values greater than 20 are treated as null.

responseFormat
string

A JSON string specifying the format of the response or an empty string if default.

seed
uint

A seed for the random number generator, with 0 indicating null.

stop
string

A string specifying stop sequences, or an empty string to indicate null.

temperature
uint

A temperature setting for randomness, with values from 0 to 20; values greater than 20 indicate null.

topP
uint

Controls diversity via nucleus sampling, with values from 0 to 100 percent; values greater than 100 indicate null.

tools
string

A JSON list of tools in Anthropic format, or an empty string for null. Tool names must match the tools supported by the oracle.

toolChoice
string

The tool selection strategy: "none", "auto", or an empty string, which defaults to "auto" on Anthropic's side.

user
string

Identifier for the user making the request.
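
As an illustration of the encodings above (0 or an empty string for null, out-of-range values for null), here is a hypothetical request built inside a function of the calling contract. The field names and types follow this reference; the concrete values are only examples.

```solidity
// Hypothetical example values; the "null" encodings follow the reference above.
IOracle.LlmRequest memory config = IOracle.LlmRequest({
    model: "claude-3-5-sonnet-20240620",
    frequencyPenalty: 21,   // values above 20 are treated as null
    logitBias: "",          // empty string: no logit bias
    maxTokens: 1000,        // 0 would mean null
    presencePenalty: 21,    // values above 20 are treated as null
    responseFormat: "",     // empty string: default format
    seed: 0,                // 0 means null
    stop: "",               // empty string means null
    temperature: 10,        // 0 to 20 scale; values above 20 mean null
    topP: 100,              // 0 to 100 percent; values above 100 mean null
    tools: "",              // JSON list of tools, or empty string for none
    toolChoice: "auto",
    user: ""
});
```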

onOracleLlmResponse

A callback that is invoked when the result of an LLM call created with createLlmCall becomes available.

Arguments

callbackId
uint

The promptCallbackId you passed into the createLlmCall method.

response
LlmResponse

The result of the LLM call. See below for details on the LlmResponse object.

errorMessage
string

An error message. Contains an empty string if there was no error, and a description of the error otherwise.
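
A minimal, hypothetical receiver for this callback might look like the sketch below, placed inside a contract such as the one sketched earlier. The oracleAddress check and the lastResponse / lastError variables are assumptions; the function name and argument types follow this reference, while visibility and data location keywords are assumed.

```solidity
// Hypothetical sketch; the state variables and access check are assumptions.
string public lastResponse;
string public lastError;

function onOracleLlmResponse(
    uint callbackId,
    IOracle.LlmResponse memory response,
    string memory errorMessage
) public {
    // callbackId matches the promptCallbackId passed to createLlmCall.
    require(msg.sender == oracleAddress, "Caller is not the oracle");

    if (bytes(errorMessage).length > 0) {
        // A non-empty errorMessage means the call failed.
        lastError = errorMessage;
        return;
    }
    lastResponse = response.content;
}
```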

LlmResponse object

The LlmResponse object (see source in ChatOracle.sol) has the following fields, most of which map directly to the fields in the Anthropic API response:

id
string

Unique identifier for the completion, generated by Anthropic.

content
string

The model output. Either this field or functionArguments and functionName will be populated, depending on whether the output is a normal message or a function call.

functionName
string

Name of the function invoked, if any.

functionArguments
string

Arguments passed to the function, if any, as a JSON string. For format details, refer to the specific function documentation.

created
uint64

Timestamp of creation.

model
string

The model name used for generation.

systemFingerprint
string

Identifier for the system generating the completion.

object
string

Type of the object, always chat.completion.

completionTokens
uint32

Number of tokens generated in the completion.

promptTokens
uint32

Number of tokens in the prompt.

totalTokens
uint32

Total number of tokens, including both prompt and completion.
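
Since either content or functionName / functionArguments is populated, a consumer typically branches on functionName, roughly as in the hypothetical sketch below. executeTool and messages are illustrative placeholders, not part of the oracle interface.

```solidity
// Hypothetical sketch; executeTool and messages are placeholders.
function handleLlmResponse(IOracle.LlmResponse memory response) internal {
    if (bytes(response.functionName).length > 0) {
        // Tool call: functionArguments holds the arguments as a JSON string.
        executeTool(response.functionName, response.functionArguments);
    } else {
        // Plain completion: content holds the model output.
        messages.push(response.content);
    }
}
```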