We support integrating into both Python-based and TypeScript-based LangChain systems:

Integrating with your Python-based LangChain application is the easiest and recommended route. Simply add GalileoObserveCallback(project_name="YOUR_PROJECT_NAME") to the callbacks of your chain invocation.

from galileo_observe import GalileoObserveCallback
from langchain.chat_models import ChatOpenAI
from langchain.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_template("tell me a joke about {foo}")
model = ChatOpenAI()
chain = prompt | model

monitor_handler = GalileoObserveCallback(project_name="YOUR_PROJECT_NAME")
# Pass the handler through the callbacks config so the invocation is logged
chain.invoke({"foo": "bears"},
             config=dict(callbacks=[monitor_handler]))

The GalileoObserveCallback logs your input, output, and relevant statistics back to Galileo, where additional evaluation metrics are computed.
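
Because GalileoObserveCallback is a standard LangChain callback handler, the same instance should also work with other runnable entry points such as batch and stream, which accept the same callbacks config. Below is a minimal sketch, assuming the chain and monitor_handler defined in the example above; reusing one handler across these calls is an assumption based on standard LangChain callback semantics.

# Log multiple invocations in one call (assumes the same handler instance is reusable)
chain.batch(
    [{"foo": "bears"}, {"foo": "parrots"}],
    config=dict(callbacks=[monitor_handler]),
)

# Streaming invocations are captured through the same callbacks config
for chunk in chain.stream({"foo": "bears"}, config=dict(callbacks=[monitor_handler])):
    print(chunk.content, end="", flush=True)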