Integrate Your Custom LangChain Apps with Slack through Runbear

This step-by-step guide helps you connect your custom LangChain app to Slack using LangServe and Runbear.

Connect Your Custom LangChain to Slack Seamlessly

Integrating your LangChain app with Slack doesn't have to be complicated. With LangServe and Runbear, you can set up your custom LLM in just a few easy steps. Here's how to get started.

Step 1: Prepare Your LangChain Serving Endpoint

First, create a LangChain serving endpoint using LangServe. Here's a simple code snippet that sets up a web server with FastAPI, ready to handle requests:

import os
from typing import Any, Dict
from fastapi import FastAPI, HTTPException, Request
from langchain.chat_models import ChatOpenAI
from langserve import add_routes


app = FastAPI()


def verify_secret_key(config: Dict[str, Any], req: Request) -> Dict[str, Any]:
    if req.headers.get("x-secret-key") != os.environ.get("SECRET_KEY"):
        raise HTTPException(status_code=401, detail="Incorrect secret key")

    return config


add_routes(
    app,
    ChatOpenAI(),
    path="/chat",
    per_req_config_modifier=verify_secret_key,
)

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=int(os.getenv("PORT", default=8000)))

You need to set the OPENAI_API_KEY environment variable to use the OpenAI API; you can create an API key on the OpenAI API keys page. Also set the SECRET_KEY environment variable to secure your endpoint. If you set SECRET_KEY, every request to your endpoint must include an x-secret-key header with the same value.
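To see what a request to this endpoint looks like, here is a minimal client-side sketch. LangServe exposes an /invoke sub-path on each route that accepts a JSON body with an "input" field; the URL and secret below are placeholders for your own values:

```python
import json
import urllib.request


def build_invoke_request(base_url: str, secret_key: str, message: str) -> urllib.request.Request:
    """Build a POST request for a LangServe /chat/invoke endpoint.

    The x-secret-key header must match the server's SECRET_KEY
    environment variable, or the endpoint responds with 401.
    """
    payload = json.dumps({"input": message}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/chat/invoke",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "x-secret-key": secret_key,
        },
        method="POST",
    )


# Example usage (requires the server from above to be running):
# req = build_invoke_request("http://localhost:8000", "my-secret", "Hello!")
# with urllib.request.urlopen(req) as resp:
#     print(resp.read())
```

This is only a sketch for testing; Runbear sends the equivalent request on your behalf once the app is configured.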

If you wish to directly deploy a template app, you can proceed to Step 5.

Step 2: Test Your LangChain App Locally with Ngrok

Before deploying your app, you might want to test it locally. Ngrok is a tool that creates a secure tunnel to your localhost, making it accessible from the internet without deployment. Here’s how to use it:

  1. Download and install ngrok from ngrok's website.
  2. Once installed, open a terminal and run ngrok http 8000. This exposes port 8000 (where your FastAPI app runs; change the port if necessary) to the internet.
  3. Copy the forwarding URL provided by ngrok. This URL is now the public endpoint for your local LangServe app.

Step 3: Configure Your App in Runbear

With your local LangChain app running and ngrok exposing it to the internet, you can now configure your app in Runbear:

  1. In Runbear, navigate to the 'LLM apps' menu.
  2. Choose 'LangServe' as your app type.
  3. In the 'LangServe endpoint' field, enter your ngrok URL with the /chat suffix (e.g., https://cce8-123-***-*-***.ngrok-free.app/chat). If you set the SECRET_KEY environment variable, add the secret header name (X-Secret-Key) and its value in the 'Security Settings' section. You can change the secret header name to any value you wish.

Step 4: Connect to Slack and Interact with Your LangChain App

Now that your app is configured in Runbear with the ngrok URL, you can connect your LangChain app to your Slack workspace:

  1. In Runbear, go to the 'Connections' page.
  2. Click the 'New Connection' button.
  3. Choose your Slack workspace, select the LangServe app you just configured, and click 'Create'.
  4. Open your Slack workspace and start interacting with @Runbear to test its functionality.

By testing your app locally, you can ensure everything is running smoothly before proceeding with deployment.

Step 5: Deploy Your LangServe App

Next, it's time to deploy your LangChain app. You can choose any cloud provider for deployment. In this guide, we'll use Railway, known for its user-friendly interface and a generous free tier.

Deploy on Railway

  1. Click the button above to begin deploying your app with Railway. Enter your OpenAI API key and customize the SECRET_KEY if desired, then click 'Deploy'.
  2. Once the deployment is complete, take note of the URL for your active deployment. For instance, the URL might look like your-app-name.up.railway.app.
  3. Go to the 'LLM apps' page on Runbear and set up a new app. Select 'LangServe' as your app type. In the 'LangServe endpoint' field, input your Railway URL followed by the /chat path (e.g., https://your-app-name.up.railway.app/chat). Configure the 'Security Settings' with your secret header name and corresponding value.
  4. Finally, create a new connection with your Slack workspace.
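Before wiring the deployed endpoint into Runbear, you may want to sanity-check that it is reachable and that your secret header is accepted. A minimal sketch (the URL and secret below are placeholders for your own deployment's values):

```python
import json
import urllib.error
import urllib.request


def check_endpoint(base_url: str, secret_key: str) -> bool:
    """Send a test message to a LangServe /chat route; True if it answers."""
    req = urllib.request.Request(
        f"{base_url}/chat/invoke",
        data=json.dumps({"input": "ping"}).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "x-secret-key": secret_key,
        },
        method="POST",
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.URLError:
        # Covers connection failures and HTTP errors such as 401.
        return False


# Example usage with placeholder values:
# check_endpoint("https://your-app-name.up.railway.app", "your-secret")
```

If this returns False, double-check the URL, the /chat path, and that the header value matches the SECRET_KEY you configured on Railway.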

You're all set! Start conversing with your LangChain app directly within Slack! 🎉