Follow this quickstart to get a verified LLM inference response.

1. Install the Sentience SDK

```shell
pip install sentience
```
2. Make a verified inference call

Run the following code:
```python
import sentience
from openai import OpenAI

client = OpenAI(
    base_url="https://api.galadriel.com/v1/verified",
    api_key="Bearer GALADRIEL_API_KEY",  # replace GALADRIEL_API_KEY with your API key
)

completion = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)

# Verifying the proof is a one-liner after making the request:
is_valid = sentience.verify_signature(completion)
print("is_valid:", is_valid)
```
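The internals of `sentience.verify_signature` are not shown here; conceptually, proof verification checks a signature computed over the response body, so any tampering with the completion invalidates the proof. Below is a minimal, self-contained sketch of that idea using an HMAC over a canonical JSON payload — the function names, key, and scheme are illustrative assumptions for this sketch, not Galadriel's actual protocol:

```python
import hashlib
import hmac
import json


def verify_signature(payload: dict, signature: str, key: bytes) -> bool:
    # Illustrative stand-in for proof verification (NOT Galadriel's real scheme):
    # recompute the MAC over the canonical JSON body and compare in constant time.
    body = json.dumps(payload, sort_keys=True).encode()
    expected = hmac.new(key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)


# Hypothetical signed response: the signer MACs the body with a shared key.
key = b"demo-key"
payload = {"model": "gpt-4o", "content": "Hello!"}
sig = hmac.new(key, json.dumps(payload, sort_keys=True).encode(),
               hashlib.sha256).hexdigest()

print(verify_signature(payload, sig, key))                           # True
print(verify_signature({**payload, "content": "tampered"}, sig, key))  # False
```

The takeaway is the same as in the quickstart code: treat the completion as untrusted until verification returns `True`.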

What’s next?