Robocorp

This notebook covers how to get started with Robocorp Action Server action toolkit and LangChain.

Installation

# Install package and Action Server
%pip install --upgrade --quiet langchain-robocorp robocorp-action-server

Action Server setup

You will need a running Action Server instance for your agent application to communicate with. You can bootstrap a new project using the action-server new command.

action-server new
cd ./your-project-name
action-server start

Environment Setup

Optionally you can set the following environment variables:

  • LANGCHAIN_TRACING_V2=true: Enables LangSmith run tracing, which can also be bound to the corresponding Action Server action run logs. See the LangSmith documentation for more details.
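
In a notebook, the same variable can be set from Python before the agent is created. A minimal sketch (note that LangSmith typically also requires an API key, which is not covered above):

```python
import os

# Enable LangSmith tracing for this process; together with the toolkit's
# report_trace option, agent runs can be linked to Action Server run logs.
os.environ["LANGCHAIN_TRACING_V2"] = "true"

# Assumed: LangSmith also needs a credential to send traces. Uncomment and
# replace with your own key if you use LangSmith.
# os.environ["LANGCHAIN_API_KEY"] = "<your-langsmith-api-key>"
```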

Usage

from langchain.agents import AgentExecutor, OpenAIFunctionsAgent
from langchain_openai import ChatOpenAI
from langchain_core.messages import SystemMessage
from langchain_robocorp import ActionServerToolkit

# Initialize LLM chat model
llm = ChatOpenAI(model="gpt-4", temperature=0)

# Initialize Action Server Toolkit
toolkit = ActionServerToolkit(url="http://localhost:8080", report_trace=True)
tools = toolkit.get_tools()

# Initialize Agent
system_message = SystemMessage(content="You are a helpful assistant")
prompt = OpenAIFunctionsAgent.create_prompt(system_message)
agent = OpenAIFunctionsAgent(llm=llm, prompt=prompt, tools=tools)

executor = AgentExecutor(agent=agent, tools=tools, verbose=True)


executor.invoke({"input": "What is the current date?"})

Single input tools

By default, toolkit.get_tools() returns the actions as structured tools, which accept multiple named arguments. To return single-input tools instead, pass a chat model to be used for processing the inputs.

# Initialize single input Action Server Toolkit
toolkit = ActionServerToolkit(url="http://localhost:8080")
tools = toolkit.get_tools(llm=llm)
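
Conceptually, the single-input variant wraps each multi-argument action behind a one-string interface, with the chat model translating the free-form string into the action's structured arguments. A rough pure-Python sketch of that idea (not the toolkit's actual implementation; greet and the JSON "parser" are hypothetical stand-ins):

```python
import json
from typing import Callable

def make_single_input(
    action: Callable[..., str], parse: Callable[[str], dict]
) -> Callable[[str], str]:
    """Wrap a multi-argument action behind a single string input.

    `parse` stands in for the chat model that turns the free-form
    input string into the action's structured arguments.
    """
    def tool(text: str) -> str:
        return action(**parse(text))
    return tool

# Hypothetical structured action with two named arguments.
def greet(name: str, greeting: str) -> str:
    return f"{greeting}, {name}!"

# A trivial JSON "parser" standing in for the LLM.
single_greet = make_single_input(greet, json.loads)
print(single_greet('{"name": "Ada", "greeting": "Hello"}'))  # Hello, Ada!
```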