Logging Data via RESTful APIs

You can always log data via Galileo's RESTful APIs if our Python or Langchain integrations don't work for you.
Logging data is a two-step process: authentication and logging.
This approach is only necessary if our other integrations don't work for your use case. Check out Getting Started for other (easier) ways of logging data to Galileo Monitor.

Authentication

To fetch an authentication token, send a POST request to /login with your username and password:
import requests

# Base URL for your Galileo environment.
base_url = "https://api.{your_environment}.rungalileo.io"

headers = {
    'accept': 'application/json',
    'Content-Type': 'application/x-www-form-urlencoded',
}

# Exchange your username and password for an access token.
data = {
    'username': '{YOUR_USERNAME}',
    'password': '{YOUR_PASSWORD}',
}

response = requests.post(f'{base_url}/login', headers=headers, data=data)
access_token = response.json()["access_token"]
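
If you plan to make several calls with the same token, one option is to keep a requests.Session with the Authorization header attached once. This is just a convenience sketch built on standard requests usage, assuming the same Bearer-token header shown in the calls below.

session = requests.Session()
session.headers.update({
    'accept': 'application/json',
    'Authorization': f"Bearer {access_token}",
})
# Later calls such as session.get(f"{base_url}/projects", ...) reuse the token automatically.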

Logging

Once you have your auth token, you can start making ingestion calls to Galileo Monitor.

Project ID

To log data, you'll need your project ID. You can get it by making a GET request to the /projects endpoint, or simply copy it from the URL in your browser window. The project ID is static and never changes, so you only have to do this once.
# Authenticated headers for all subsequent requests.
headers = {
    'accept': 'application/json',
    'Content-Type': 'application/json',
    'Authorization': f"Bearer {access_token}",
}

# Look up the project by name; the ID is in the first result.
response = requests.get(
    f"{base_url}/projects",
    headers=headers,
    params={"project_name": "{YOUR_PROJECT_NAME}"},
)
project_id = response.json()[0]["id"]

Structuring your records

Create an array of all the LLM calls you want to track. You can fire off individual requests or create batches. For each LLM call, create a dictionary with the following information:
{
  "records": [
    {
      "latency_ms": 894,
      "status_code": 200,
      "input_text": "This is a prompt.",
      "output_text": "This is a response.",
      "model": "gpt-3.5-turbo",
      "num_input_tokens": 7,
      "num_output_tokens": 8,
      "output_logprobs": { /* Optional. When available, logprobs are used to compute Uncertainty. */ },
      "created_at": "2023-08-07T15:14:30.519922"
    }
  ]
}
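
In Python, each record is simply a dictionary with the fields above. Below is a minimal sketch of building a batch of records around your own LLM calls; call_llm is a hypothetical stand-in for your actual model call, and the token counts are placeholder values you would replace with real ones.

import time
from datetime import datetime


def call_llm(prompt):
    # Hypothetical stand-in for your actual LLM call.
    return "This is a response."


records = []
for prompt in ["This is a prompt.", "This is another prompt."]:
    start = time.perf_counter()
    output_text = call_llm(prompt)
    latency_ms = int((time.perf_counter() - start) * 1000)

    records.append({
        "latency_ms": latency_ms,
        "status_code": 200,
        "input_text": prompt,
        "output_text": output_text,
        "model": "gpt-3.5-turbo",
        "num_input_tokens": 7,   # placeholder: use real token counts if available
        "num_output_tokens": 8,  # placeholder
        # Matches the timestamp format in the example above.
        "created_at": datetime.utcnow().isoformat(),
    })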

Logging your records

Finally, make a POST request to the llm_monitor/ingest endpoint with your records:
headers = {
    'accept': 'application/json',
    'Content-Type': 'application/json',
    'Authorization': f"Bearer {access_token}",
}

# Ingest the batch of records into your Galileo Monitor project.
response = requests.post(
    f"{base_url}/projects/{project_id}/llm_monitor/ingest",
    headers=headers,
    json={"records": records},
)