This documentation describes the integration of MindsDB with LangChain, a framework for developing applications powered by language models.
The integration lets you deploy LangChain models within MindsDB, giving those models access to data from the connected data sources.
The LiteLLM model provider is available in MindsDB Cloud only. Use the MindsDB API key, which can be generated in the MindsDB Cloud editor at cloud.mindsdb.com/account.
CREATE ML_ENGINE langchain_engine
FROM langchain
[USING
      serper_api_key = 'your-serper-api-key', -- optional parameter; if provided, the model will use serper.dev search to enhance the output
      -- provide one of the below parameters
      anthropic_api_key = 'api-key-value',
      anyscale_api_key = 'api-key-value',
      litellm_api_key = 'api-key-value',
      openai_api_key = 'api-key-value'];
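To confirm the engine has been registered, you can list the available ML engines; this is a minimal check, assuming the standard MindsDB SHOW command is available in your installation.

-- list registered ML engines and look for langchain_engine
SHOW ML_ENGINES;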
Create a model using langchain_engine as an engine and one of OpenAI/Anthropic/Anyscale/LiteLLM as a model provider.
CREATE MODEL langchain_model
PREDICT target_column
USING
      engine = 'langchain_engine',          -- engine name as created via CREATE ML_ENGINE
      <provider>_api_key = 'api-key-value', -- if not provided in CREATE ML_ENGINE (replace <provider> with one of the available values)
      model_name = 'model-name',            -- optional, model to be used (for example, 'gpt-4' if 'openai_api_key' provided)
      prompt_template = 'message to the model that may include some {{input}} columns as variables';
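Once created, the model can be queried like any other MindsDB model. The statement below is an illustrative sketch: it assumes the prompt_template references a single {{input}} variable and that the model predicts target_column, as in the statement above.

-- illustrative query; column names follow the CREATE MODEL statement above
SELECT input, target_column
FROM langchain_model
WHERE input = 'Summarize the benefits of connecting a language model to my data.';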
Agents and Tools are some of the main abstractions that LangChain offers. You can read more about them in the LangChain documentation.
There are three different tools utilized by this agent:
MindsDB is the internal MindsDB executor.
Metadata fetches the metadata information for the available tables.
Write is able to write agent responses into a MindsDB data source.
Each tool exposes the internal MindsDB executor in a different way to perform its tasks, effectively enabling the agent model to read from (and potentially write to) data sources or models available in the active MindsDB project.
Create a conversational model using langchain_engine as an engine and one of OpenAI/Anthropic/Anyscale/LiteLLM as a model provider.
OpenAI
CREATE MODEL langchain_openai_model
PREDICT answer
USING
      engine = 'langchain_engine',      -- engine name as created via CREATE ML_ENGINE
      provider = 'openai',              -- one of the available providers
      openai_api_key = 'api-key-value', -- if not provided in CREATE ML_ENGINE
      model_name = 'gpt-3.5-turbo',     -- choose one of the available OpenAI models
      mode = 'conversational',          -- conversational mode
      user_column = 'question',         -- column name that stores input from the user
      assistant_column = 'answer',      -- column name that stores output of the model (see PREDICT column)
      verbose = True,
      prompt_template = 'Answer the user input in a helpful way';
Anthropic
CREATE MODEL langchain_anthropic_model
PREDICT answer
USING
      engine = 'langchain_engine',         -- engine name as created via CREATE ML_ENGINE
      provider = 'anthropic',              -- one of the available providers
      anthropic_api_key = 'api-key-value', -- if not provided in CREATE ML_ENGINE
      model_name = 'claude-2.1',           -- choose one of the available Anthropic models
      mode = 'conversational',             -- conversational mode
      user_column = 'question',            -- column name that stores input from the user
      assistant_column = 'answer',         -- column name that stores output of the model (see PREDICT column)
      verbose = True,
      prompt_template = 'Answer the user input in a helpful way';
Anyscale
CREATE MODEL langchain_anyscale_model
PREDICT answer
USING
      engine = 'langchain_engine',        -- engine name as created via CREATE ML_ENGINE
      provider = 'anyscale',              -- one of the available providers
      anyscale_api_key = 'api-key-value', -- if not provided in CREATE ML_ENGINE
      model_name = 'mistralai/Mistral-7B-Instruct-v0.1', -- choose one of the models available from Anyscale
      mode = 'conversational',            -- conversational mode
      user_column = 'question',           -- column name that stores input from the user
      assistant_column = 'answer',        -- column name that stores output of the model (see PREDICT column)
      base_url = 'https://api.endpoints.anyscale.com/v1',
      verbose = True,
      prompt_template = 'Answer the user input in a helpful way';
LiteLLM
CREATE MODEL langchain_litellm_model
PREDICT answer
USING
      engine = 'langchain_engine',       -- engine name as created via CREATE ML_ENGINE
      provider = 'litellm',              -- one of the available providers
      litellm_api_key = 'api-key-value', -- if not provided in CREATE ML_ENGINE
      model_name = 'assistant',          -- model created in MindsDB
      mode = 'conversational',           -- conversational mode
      user_column = 'question',          -- column name that stores input from the user
      assistant_column = 'answer',       -- column name that stores output of the model (see PREDICT column)
      base_url = 'https://ai.dev.mindsdb.com',
      verbose = True,
      prompt_template = 'Answer the user input in a helpful way';
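After creating one of these conversational models, you query it by providing the user input through the column defined by user_column and reading the reply from the column defined by assistant_column. The example below is a sketch that assumes the OpenAI model created above; the question text is illustrative.

-- illustrative query against the conversational model created above
SELECT question, answer
FROM langchain_openai_model
WHERE question = 'How can an agent access the tables in my MindsDB project?';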
The following usage examples utilize langchain_engine to create a model with the CREATE MODEL statement.
Create a model that will be used to describe, analyze, and retrieve data.
CREATE MODEL tool_based_agent
PREDICT completion
USING
      engine = 'langchain_engine',
      prompt_template = 'Answer the users input in a helpful way: {{input}}';
Here, we create the tool_based_agent model using the LangChain engine, as defined in the engine parameter. This model answers users’ questions in a helpful way, as defined in the prompt_template parameter, which specifies input as the input column when calling the model.
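The output shown next comes from asking the agent to describe a table. The exact question is not reproduced in this excerpt; an illustrative query along these lines would produce it:

-- illustrative query asking the agent to describe a table
SELECT input, completion
FROM tool_based_agent
WHERE input = 'Describe the `mysql_demo_db.house_sales` table';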
The `mysql_demo_db.house_sales` table is a base table that contains information related to house sales. It has the following columns:
- `saledate`: of type text, which likely contains the date when the sale was made.
- `house_price_moving_average`: of type int, which might represent a moving average of house prices, possibly to track price trends over time.
- `type`: of type text, which could describe the type of house sold.
- `bedrooms`: of type int, indicating the number of bedrooms in the sold house.
To get information about the mysql_demo_db.house_sales table, the agent uses the Metadata tool. Then the agent prepares the response.
SELECT input, completion
FROM tool_based_agent
WHERE input = 'I want to know the average number of rooms in the downtown neighborhood as per the `mysql_demo_db.home_rentals` table'
USING
      verbose = True,
      tools = [],
      max_iterations = 10;
Here is the output:
The average number of rooms in the downtown neighborhood, as per the `mysql_demo_db.home_rentals` table, is 1.6 rooms.
Here, the model uses the Metadata tool again to fetch the column information. As there is no beds column in the `mysql_demo_db.home_rentals` table, it uses the number_of_rooms column to compute the average.
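A query along these lines would produce that result (this is an illustrative reconstruction, not the agent's verbatim SQL, and it assumes `number_of_rooms` and `neighborhood` columns in `home_rentals`):

-- illustrative reconstruction of the agent-generated query
SELECT AVG(number_of_rooms)
FROM mysql_demo_db.home_rentals
WHERE neighborhood = 'downtown';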
SELECT input, completion
FROM tool_based_agent
WHERE input = 'There is a property in the south_side neighborhood with an initial price of 2543 in the `mysql_demo_db.home_rentals` table. What are some other details of this listing?'
USING
      verbose = True,
      tools = [],
      max_iterations = 10;
Here is the output:
The property in the `south_side` neighborhood with an initial price of 2543 has the following details:
- Number of rooms: 1
- Number of bathrooms: 1
- Square footage (sqft): 630
- Location: great
- Days on market: 11
- Initial price: 2543
- Neighborhood: south_side
- Rental price: 2543.0
Here, the model uses the Metadata tool again to fetch information about the table. Then, it creates and executes a query to retrieve the details of this listing.
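An illustrative reconstruction of that query is shown below; the agent's exact SQL is not reproduced in this excerpt, and the `initial_price` and `neighborhood` column names are assumed from the output above.

-- illustrative reconstruction of the agent-generated query
SELECT *
FROM mysql_demo_db.home_rentals
WHERE neighborhood = 'south_side'
AND initial_price = 2543;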