The Galadriel Verified Inference API returns LLM inference results signed by the TEE's private key. This page explains how to validate those signatures.
Please see the quickstart for how to get an LLM inference response.
1
Install Sentience SDK
pip install sentience
2
Verify the signature
Run the following code:
import sentience
from openai import OpenAI

client = OpenAI(
    base_url="https://api.galadriel.com/v1/verified",
    api_key="Bearer GALADRIEL_API_KEY",
)

completion = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)

# Verifying the proof is a one-liner after making the request:
is_valid = sentience.verify_signature(completion)
print("is_valid:", is_valid)
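For intuition, the sketch below shows what a signature check like this conceptually does: hash the response body and verify the signature against the signer's public key. This is only an illustration under stated assumptions — the actual signature scheme, hash, and message format Galadriel's TEE uses are not specified here, and Ed25519 over a SHA-256 digest is just one plausible construction.

```python
# Illustrative sketch of TEE-style signature verification.
# ASSUMPTIONS: Ed25519 signatures over a SHA-256 digest of the raw
# response body. A locally generated key pair stands in for the TEE key.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519


def verify(public_key: ed25519.Ed25519PublicKey,
           signature: bytes,
           response_body: bytes) -> bool:
    """Return True iff `signature` is valid for `response_body`."""
    digest = hashlib.sha256(response_body).digest()
    try:
        public_key.verify(signature, digest)
        return True
    except InvalidSignature:
        return False


# Simulate the TEE signing a response, then verify it client-side.
tee_private = ed25519.Ed25519PrivateKey.generate()
tee_public = tee_private.public_key()

body = b'{"id": "chatcmpl-123", "choices": ["..."]}'
sig = tee_private.sign(hashlib.sha256(body).digest())

print(verify(tee_public, sig, body))         # valid signature -> True
print(verify(tee_public, sig, body + b"x"))  # tampered body -> False
```

In the real API the public key belongs to the TEE (not generated locally), so a successful check demonstrates that the response was produced inside the enclave and was not modified in transit.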