LangServe
LangServe is a library designed for deploying LLM applications built with LangChain.
First, ensure your LangServe application is up and running. Check the LangServe Server Guide for more details. You only need the LangServe server component for Runbear.
Getting the Endpoint
Note the path for your LangServe app. For example, the code below uses `/chat` as the path.
```python
from fastapi import FastAPI
from langchain_openai import ChatOpenAI
from langserve import add_routes

app = FastAPI()

add_routes(
    app,
    ChatOpenAI(),
    path="/chat",
)
```
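Once the server is running, LangServe exposes the runnable at `POST {path}/invoke`, with the input wrapped under an `"input"` key. A minimal sketch of building such a request; `build_invoke_request` is a hypothetical helper for illustration, and `http://localhost:8000` assumes a local server:

```python
import json


def build_invoke_request(base_url: str, path: str, user_input: str):
    """Build the URL and JSON body for a LangServe /invoke call.

    Hypothetical helper for illustration; LangServe expects the
    payload under an "input" key.
    """
    url = f"{base_url.rstrip('/')}{path}/invoke"
    payload = json.dumps({"input": user_input})
    return url, payload


url, payload = build_invoke_request("http://localhost:8000", "/chat", "Hello!")
print(url)      # http://localhost:8000/chat/invoke
print(payload)  # {"input": "Hello!"}
```

Sending this payload to the URL above (for example with `curl` or `requests`) is a quick way to confirm the endpoint works before wiring it into Runbear.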
Adding Custom System Prompt
Runbear invokes your runnable with ChatPromptTemplate JSON. Each message includes `role` and `content` fields. You can use the `convert_to_messages` function to convert this data into a list of `BaseMessage` objects. Check the example below to learn how to add a custom system prompt. You can find a complete example project at Proofreading Bot LangServe Example.
```python
from fastapi import FastAPI
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.messages import convert_to_messages
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI
from langserve import add_routes

app = FastAPI()

llm = ChatOpenAI()
output_parser = StrOutputParser()

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a Proofreading Bot."),
        MessagesPlaceholder("conversation"),
    ]
)

runnable = {"conversation": convert_to_messages} | prompt | llm | output_parser

add_routes(
    app,
    runnable,
    path="/chat",
)
```
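To see what `convert_to_messages` receives, note that Runbear posts the conversation as a list of role/content dicts. Below is a toy stand-in sketch of the conversion, not the real implementation (which lives in `langchain_core.messages`); the classes here only mirror LangChain's `HumanMessage`/`AIMessage`/`SystemMessage` names to illustrate the role-to-class mapping:

```python
from dataclasses import dataclass


# Toy stand-ins for LangChain's message classes, for illustration only.
@dataclass
class SystemMessage:
    content: str


@dataclass
class HumanMessage:
    content: str


@dataclass
class AIMessage:
    content: str


ROLE_MAP = {
    "system": SystemMessage,
    "human": HumanMessage,
    "user": HumanMessage,
    "ai": AIMessage,
    "assistant": AIMessage,
}


def convert_to_messages_sketch(payload):
    """Map role/content dicts to message objects, mimicking convert_to_messages."""
    return [ROLE_MAP[m["role"]](m["content"]) for m in payload]


conversation = [
    {"role": "user", "content": "Pls proofread: he go to school."},
    {"role": "assistant", "content": "He goes to school."},
]
messages = convert_to_messages_sketch(conversation)
print([type(m).__name__ for m in messages])  # ['HumanMessage', 'AIMessage']
```

In the runnable above, the resulting message list is slotted into the `MessagesPlaceholder("conversation")` after the fixed system prompt, which is how the custom system prompt is prepended to every conversation.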
Configuring Runbear
You can now configure your app in Runbear:

- Open the Runbear Assistants page and click the 'Add App' button.
- Choose 'LangServe' as your app type.
- In the 'LangServe endpoint' field, enter your LangServe URL (e.g., https://your.domain/chat).
- (Optional) If your LangServe app requires a secret header, configure the 'Security Settings' with the header name and its corresponding value.
- Click the 'Create' button.
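If you set a secret header, Runbear will include it on every call to your endpoint, so your server should reject requests without it. To test your endpoint manually with the same header, here is a sketch using Python's standard library; the header name `X-Runbear-Secret` and its value are placeholders for whatever you configured in 'Security Settings':

```python
from urllib.request import Request

# Placeholder header name and value; use whatever you configured in Runbear.
req = Request(
    "https://your.domain/chat/invoke",
    data=b'{"input": "Hello!"}',
    headers={
        "Content-Type": "application/json",
        "X-Runbear-Secret": "your-secret-value",
    },
    method="POST",
)

# urllib normalizes header keys with str.capitalize().
print(req.get_header("X-runbear-secret"))  # your-secret-value
print(req.get_method())                    # POST
```

Passing this request to `urllib.request.urlopen` would send it; the sketch stops short of that since it depends on your server being reachable.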
What's Next
Connect the app you added to communication channels. Check Connecting Channels with LLM Apps for more details.