In this tutorial, we’ll build up a custom ChatGPT-like chatbot step by step using Solidity on the Galadriel Network. If you’re interested in building decentralized AI applications, this guide will help you understand the basics of calling on-chain LLM models and storing conversation history on the blockchain.

Prerequisites

You can read this tutorial as-is to understand the basics of calling an LLM. However, to deploy the contract and interact with it, you will need:

  • A Galadriel devnet account. For more information on setting up a wallet, visit Setting Up A Wallet.
  • Some devnet tokens. Get your free devnet tokens from the Faucet.

Create a New Contract

Let’s start by creating a new Solidity file. We’ll incrementally add to that file to build up our chatbot.

pragma solidity ^0.8.9;

contract ChatGpt {
    constructor(address initialOracleAddress) {
        
    }
}

Set the Oracle Address

The oracle address is critical for making requests to LLMs: to generate text with an LLM, you need to call the oracle (see How it works for details). Let’s modify the constructor to initialize and store the oracle address so that your contract can interact with it:

address private owner;
address public oracleAddress;

constructor(address initialOracleAddress) {
    owner = msg.sender;
    oracleAddress = initialOracleAddress;
}

We also need a function to update the oracle address: since the oracle’s address may change, we want to be able to update it without redeploying the contract. This is handled by the setOracleAddress function, guarded by the onlyOwner modifier so that only the contract owner can update it. We also define an onlyOracle modifier, which we’ll use later to ensure that only the oracle contract can send the LLM response back to this contract.

modifier onlyOwner() {
    require(msg.sender == owner, "Caller is not owner");
    _;
}

modifier onlyOracle() {
    require(msg.sender == oracleAddress, "Caller is not oracle");
    _;
}

event OracleAddressUpdated(address indexed newOracleAddress);

function setOracleAddress(address newOracleAddress) public onlyOwner {
    oracleAddress = newOracleAddress;
    emit OracleAddressUpdated(newOracleAddress);
}

Later in this tutorial, when deploying the contract, we will set this address to the concrete address of a Galadriel-provided oracle.

Oracle interface

To interact with the oracle, we need to define an interface. This interface should include the createLlmCall function, which is used to trigger the oracle to make a request to the LLM.

Define this interface outside the main ChatGpt contract.

interface IOracle {
    function createLlmCall(
        uint promptId
    ) external returns (uint);
}

Starting a chat

Now to the core functionality: chatting. We will implement two separate functions: startChat and addMessage. The first is the entry point for a user: it creates a conversation and sends the first message. addMessage is used to add subsequent messages to the conversation.

We need to store the chat history within a conversation somehow — otherwise the chatbot won’t be able to remember the context. For this reason we define two structs: a Message struct to store the message content and the role of the sender, and a ChatRun struct to store the conversation history.

struct Message {
    string role;
    string content;
}

struct ChatRun {
    address owner;
    Message[] messages;
    uint messagesCount;
}

As you can see, the Message struct mirrors the message structure used by the OpenAI API and many other compatible APIs.

Given the above definitions, the startChat function initializes a new conversation (a ChatRun struct) and adds the first message, taken from the message argument (the end-user’s first message). The ChatRun struct is then stored in a mapping keyed by the chat ID; we need a unique ID for every ChatRun so we can retrieve it again when the oracle makes its callback (which we will implement later).

Finally, we create an LLM call by calling createLlmCall on the oracle, passing the chat ID as an argument. This will trigger the oracle to make a request to the LLM. We also emit an event notifying that a new chat has been created.

event ChatCreated(address indexed owner, uint indexed chatId);
mapping(uint => ChatRun) public chatRuns;
uint private chatRunsCount;

function startChat(string memory message) public returns (uint i) {
    ChatRun storage run = chatRuns[chatRunsCount];

    run.owner = msg.sender;
    Message memory newMessage;
    newMessage.content = message;
    newMessage.role = "user";
    run.messages.push(newMessage);
    run.messagesCount = 1;

    uint currentId = chatRunsCount;
    chatRunsCount = chatRunsCount + 1;

    IOracle(oracleAddress).createLlmCall(currentId);
    emit ChatCreated(msg.sender, currentId);

    return currentId;
}

Continuing a chat

The addMessage function is used to add subsequent messages to a conversation, once the conversation has been started.

The function first checks that the last message in the conversation was from the assistant (the chatbot). If it wasn’t, the function reverts: the user should not add a new message before the assistant has responded to the previous one. The function also checks that the sender is the owner of the chat, as only the chat owner should be able to add messages.

If the checks pass, the function creates a new message and adds it to the conversation. The function then increments the message count and creates the next LLM call by calling createLlmCall on the oracle, passing the chat ID as an argument.

function addMessage(string memory message, uint runId) public {
    ChatRun storage run = chatRuns[runId];
    require(
        keccak256(abi.encodePacked(run.messages[run.messagesCount - 1].role)) == keccak256(abi.encodePacked("assistant")),
        "No response to previous message"
    );
    require(
        run.owner == msg.sender, "Only chat owner can add messages"
    );

    Message memory newMessage;
    newMessage.content = message;
    newMessage.role = "user";
    run.messages.push(newMessage);
    run.messagesCount++;
    IOracle(oracleAddress).createLlmCall(runId);
}

Message history

Careful readers will have noticed that we never passed the conversation history (stored in the ChatRun struct) to the oracle. That’s because the history is not passed in at all: the oracle fetches it from the contract. After your contract invokes createLlmCall, the oracle calls two functions on your contract: getMessageHistoryContents and getMessageHistoryRoles.

Together, these two methods provide the oracle with the message history it needs for the LLM call.

Each method should return a list: one with the message contents, the other with the roles of the message authors. For a given chat ID, the two lists must be of equal length.

For certain advanced models, such as gpt-4-turbo, which can handle more complex queries including ones with image URLs, a more integrated approach is used: the getMessageHistory method provides both the message roles and contents in a single structured format suitable for these models (a sketch is given at the end of this section).

For example, if the message history is the following:

role        content
system      You are a helpful assistant
user        Hello!
assistant   Hi! How can I help?
user        How big is the Sun?

Then getMessageHistoryContents should return the following list of 4 items:

[
    "You are a helpful assistant",
    "Hello!",
    "Hi! How can I help?",
    "How big is the Sun?"
]

…and getMessageHistoryRoles should return the following list of 4 items:

["system", "user", "assistant", "user"]

Given the above requirements, we can implement the two functions as follows:

function getMessageHistoryContents(uint chatId) public view returns (string[] memory) {
    string[] memory messages = new string[](chatRuns[chatId].messages.length);
    for (uint i = 0; i < chatRuns[chatId].messages.length; i++) {
        messages[i] = chatRuns[chatId].messages[i].content;
    }
    return messages;
}

function getMessageHistoryRoles(uint chatId) public view returns (string[] memory) {
    string[] memory roles = new string[](chatRuns[chatId].messages.length);
    for (uint i = 0; i < chatRuns[chatId].messages.length; i++) {
        roles[i] = chatRuns[chatId].messages[i].role;
    }
    return roles;
}

Note that the chatId the oracle passes in will be the same ID we passed to createLlmCall when starting the chat.
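
For reference, here is a rough sketch of what such a getMessageHistory method could look like for this contract, assuming a structured format in which each message carries a role and a list of typed content parts. The ContentPart and StructuredMessage structs below are illustrative assumptions, not the oracle’s actual types; consult the oracle interface for the exact definitions it expects.

// NOTE: the structs below are assumptions for illustration only; the Galadriel
// oracle interface defines the exact types it expects from getMessageHistory.
struct ContentPart {
    string contentType; // e.g. "text" or "image_url"
    string value;
}

struct StructuredMessage {
    string role;
    ContentPart[] content;
}

function getMessageHistory(uint chatId) public view returns (StructuredMessage[] memory) {
    Message[] storage stored = chatRuns[chatId].messages;
    StructuredMessage[] memory result = new StructuredMessage[](stored.length);
    for (uint i = 0; i < stored.length; i++) {
        result[i].role = stored[i].role;
        // This chatbot only stores plain text, so each message becomes a single "text" part.
        ContentPart[] memory parts = new ContentPart[](1);
        parts[0] = ContentPart("text", stored[i].content);
        result[i].content = parts;
    }
    return result;
}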

Oracle callback

At this point, we have implemented everything the oracle needs to process the request on its side. However, we still need a way for the oracle to post the response back to our contract. For this, we implement a callback function, onOracleLlmResponse, that the oracle calls once it has the LLM’s response.

The function takes three arguments: the runId (the chat ID we passed to createLlmCall), the response from the LLM, and an errorMessage (non-empty if there was an error). The function should only be callable by the oracle, so we guard it with the onlyOracle modifier we defined earlier.

The function first checks that the last message in the conversation is from the user. If it isn’t, the function reverts, as the oracle should only deliver a response when there is a pending user message to answer. The function then creates a new message containing the LLM’s response, adds it to the conversation, and increments the message count.

function onOracleLlmResponse(
    uint runId,
    string memory response,
    string memory /*errorMessage*/
) public onlyOracle {
    ChatRun storage run = chatRuns[runId];
    require(
        keccak256(abi.encodePacked(run.messages[run.messagesCount - 1].role)) == keccak256(abi.encodePacked("user")),
        "No message to respond to"
    );

    Message memory newMessage;
    newMessage.content = response;
    newMessage.role = "assistant";
    run.messages.push(newMessage);
    run.messagesCount++;
}
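
To see the whole request/response loop in one place before deploying, here is a minimal, hypothetical mock oracle you could use in local tests; it is not the Galadriel oracle. It records each createLlmCall and lets you push a canned assistant reply back into the chat contract. To use it, deploy the mock and pass its address as your contract’s oracle address so that the onlyOracle check passes.

// SPDX-License-Identifier: MIT
pragma solidity ^0.8.9;

// Hypothetical interface matching the callback we just defined on the chat contract.
interface IChatGpt {
    function onOracleLlmResponse(
        uint runId,
        string memory response,
        string memory errorMessage
    ) external;
}

// Minimal mock oracle for local testing only; the real Galadriel oracle
// forwards the message history to an actual LLM off-chain.
contract MockOracle {
    struct PendingCall {
        address callbackContract; // the chat contract that made the request
        uint chatId;              // the chat ID passed to createLlmCall
    }

    PendingCall[] public pendingCalls;

    // Called by the chat contract; msg.sender is the contract expecting the callback.
    function createLlmCall(uint chatId) external returns (uint) {
        pendingCalls.push(PendingCall(msg.sender, chatId));
        return pendingCalls.length - 1;
    }

    // Call this from a test to simulate the LLM answering a pending request.
    function respond(uint callIndex, string memory response) external {
        PendingCall memory pending = pendingCalls[callIndex];
        IChatGpt(pending.callbackContract).onOracleLlmResponse(pending.chatId, response, "");
    }
}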

Putting it all together

That’s it: if you now deploy the contract to the Galadriel Devnet, you can start chatting with your on-chain chatbot. Note that we did not add a system message yet; to customize your chatbot’s instructions, you can add a system message at the beginning of the conversation history, as sketched below.
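
As a sketch of one way to do this, the startChat function above could push a system message before the user’s first message. The systemPrompt state variable below is an assumption for illustration; you could, for instance, set it in the constructor instead of hard-coding it.

// Hypothetical: the chatbot’s instructions. You could also pass this in via the constructor.
string private systemPrompt = "You are a helpful assistant";

function startChat(string memory message) public returns (uint) {
    ChatRun storage run = chatRuns[chatRunsCount];
    run.owner = msg.sender;

    // Prepend the system message so the LLM always sees the instructions first.
    Message memory systemMessage;
    systemMessage.role = "system";
    systemMessage.content = systemPrompt;
    run.messages.push(systemMessage);

    Message memory firstUserMessage;
    firstUserMessage.role = "user";
    firstUserMessage.content = message;
    run.messages.push(firstUserMessage);
    run.messagesCount = 2;

    uint currentId = chatRunsCount;
    chatRunsCount = chatRunsCount + 1;

    IOracle(oracleAddress).createLlmCall(currentId);
    emit ChatCreated(msg.sender, currentId);

    return currentId;
}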

You can find the full ChatGPT contract file here. The code in that contract is ordered slightly differently, and contains additions covered in the Retrieval-augmented generation tutorial.

What’s Next?

Congratulations on deploying your on-chain ChatGPT! Explore further:

  • Implement a more advanced chatbot by adding retrieval-augmented generation to your contract.
  • Dive deeper into the Galadriel documentation, particularly the How It Works section, to understand the underlying technology.
  • Experiment with different LLMs, e.g. Groq-hosted open-source LLMs, or take finer control over the nuances of text generation: see the Solidity reference and the example contract.
  • Explore other Use Cases to get inspired for your next project.

Happy building!