📘 Lecture notes

Creating agents using LangGraph

In this lab, we will use LangGraph by LangChain to orchestrate an AI agent for a technical documentation website. LangGraph lets you model agentic systems as graphs. Graphs in LangGraph have the following core components:

Nodes

Nodes in LangGraph are Python functions that encode the logic of your agent. A node receives the current state of the graph as input, performs some computation, and returns an update to the state.
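
A minimal sketch of a node, assuming a simple hypothetical state with `question` and `answer` fields (not the state used later in the lab):

```python
from typing_extensions import TypedDict

class State(TypedDict):
    question: str
    answer: str

def answer_node(state: State) -> dict:
    # Read from the current state and return a partial update;
    # LangGraph merges the returned keys back into the shared state.
    return {"answer": f"You asked: {state['question']}"}
```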

Edges

Edges in LangGraph determine which node to execute next based on the current state of the graph. Edges can be fixed or conditional, and conditional edges can even create loops.
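
A small sketch of both edge types, using hypothetical `retrieve` and `generate` nodes and the same toy state as above:

```python
from typing_extensions import TypedDict
from langgraph.graph import StateGraph, START, END

class State(TypedDict):
    question: str
    answer: str

def retrieve(state: State) -> dict:
    # Hypothetical node: look up context for the question.
    return {"answer": ""}

def generate(state: State) -> dict:
    # Hypothetical node: draft an answer.
    return {"answer": f"Answer to: {state['question']}"}

def route(state: State) -> str:
    # Conditional edge: loop back to retrieve until an answer exists.
    return END if state["answer"] else "retrieve"

builder = StateGraph(State)
builder.add_node("retrieve", retrieve)
builder.add_node("generate", generate)
builder.add_edge(START, "retrieve")        # fixed edge
builder.add_edge("retrieve", "generate")   # fixed edge
builder.add_conditional_edges("generate", route)

graph = builder.compile()
result = graph.invoke({"question": "What is LangGraph?"})
```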

State

Each graph has a state: a shared data structure that all nodes can read and update. You can define custom attributes in the state for whatever values you want to track across the nodes of the graph.
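
For example, a state might combine a message history with a custom field. The sketch below uses LangGraph's `add_messages` reducer; the `retrieved_docs` attribute is a hypothetical example of a custom field, not something defined by the lab:

```python
from typing import Annotated
from typing_extensions import TypedDict
from langgraph.graph.message import add_messages

class AgentState(TypedDict):
    # Chat history; the add_messages reducer appends new messages
    # instead of overwriting the list on each node update.
    messages: Annotated[list, add_messages]
    # A custom attribute tracked across nodes (hypothetical field).
    retrieved_docs: list[str]
```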

To learn more about these concepts, refer to the LangGraph docs.

Using different LLM providers with LangChain

LangChain supports many different LLM providers that you can use to build AI applications. Unless you are running open-source models yourself, you typically need to obtain an API key to use each provider's chat completion API.
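
As an illustration of what supplying your own key looks like, here is a sketch using OpenAI as the provider (one example, not one of the lab's pre-configured models; the key name is a placeholder):

```python
from langchain_openai import ChatOpenAI

# Placeholder key; in practice, read it from an environment variable
# or a secrets manager rather than hard-coding it.
llm = ChatOpenAI(model="gpt-4o-mini", api_key="YOUR_OPENAI_API_KEY")

response = llm.invoke("Explain LangGraph in one sentence.")
print(response.content)
```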

For this lab, we have created a serverless function that creates LLM objects for Amazon, Google and Microsoft models that you can use with LangChain and LangGraph without having to obtain API keys. However, if you would like to do this on your own, here are some resources: