DeepInfra
This page covers how to use the DeepInfra ecosystem within LangChain. It is broken into two parts: installation and setup, and then references to specific DeepInfra wrappers.
Installation and Setup
- Get a DeepInfra API key and set it as an environment variable (
DEEPINFRA_API_TOKEN
)
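The environment variable can also be set from within Python before any DeepInfra wrapper is used. A minimal sketch (the token value is a placeholder, not a real key):

```python
import os

# Placeholder token; substitute your real DeepInfra API key.
os.environ["DEEPINFRA_API_TOKEN"] = "YOUR_API_TOKEN"
```

Setting the variable in the shell (e.g. `export DEEPINFRA_API_TOKEN=...`) works equally well.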
Available Models
DeepInfra provides a range of open-source LLMs ready for deployment. You can list the supported models for text generation and embeddings, including the google/flan* models.
You can view a list of request and response parameters.
Wrappers
LLM
There exists a DeepInfra LLM wrapper, which you can access with
from langchain_community.llms import DeepInfra
Embeddings
There is also a DeepInfra Embeddings wrapper, which you can access with
from langchain_community.embeddings import DeepInfraEmbeddings