How to use Meta Llama 3 to query MySQL database using Ollama and Wren AI

Step-by-step tutorial on hosting private LLM endpoints through Ollama and using the latest open model Meta Llama 3 to query your MySQL database

Howard Chi
Co-founder of Wren AI
Published July 2, 2024 · Updated September 8, 2024
5 min read

In this blog, we will explore the step-by-step process of using Meta Llama 3 for seamless database querying with the assistance of Ollama and Wren AI. This combination lets you host an AI assistant on your own server, where you can ask Wren AI any business question and get results right away without writing SQL, all in your own environment.

In April this year, Meta dropped big news in the open LLM community by announcing the Llama 3 model. It immediately caught wide interest among LLM developers, not only because of its impressive performance compared with previous open models but also because Llama 3 is available under the commercially friendly Llama 3 license, giving developers and researchers the ability to share and commercialize their innovations.

Meta Llama 3 benchmark

This post will show how to run Meta Llama 3 on Ollama as your LLM inference server and access your data in MySQL using natural language queries.

Now, let’s get started!

Setting up Ollama

What is Ollama?

Ollama is an open-source project that helps you set up and serve local LLMs quickly and easily. You can run Llama 3, Phi 3, Mistral, Gemma 2, and other models with just a few commands.

If you want to learn more about Ollama, check out their official site.

Install Ollama

Installing Ollama is super easy. Check out their site: https://ollama.com, and click the download button, as shown in the screenshot below.

Click download and install Ollama.

Ollama runs on macOS, Linux, and Windows.
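
After the installation finishes, you can confirm the CLI is available from your terminal (the version number you see will depend on when you installed):

ollama --version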

Pull Llama 3 70B on Ollama

To use the Llama 3 70B model, run the command below; it will pull the manifest and download the model to your local machine.

ollama pull llama3:70b

Because Llama 3 70B is around 40 GB, this step may take a while to complete.

NOTE: In this post, we are going to use the LLM to generate SQL and query data in MySQL. This means the model has to reach a certain level of stability and reliability, so you will need an LLM that is at least as capable as OpenAI's GPT-3.5-turbo.

We therefore strongly suggest using “llama3:70b” or a more capable model to get better results with Wren AI.
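
Once the pull completes, a quick optional sanity check (assuming the default local Ollama setup) is to list the downloaded models and send a one-off prompt from the terminal:

# list the models Ollama has downloaded locally
ollama list

# send a single prompt to the 70B model to confirm it loads and responds
ollama run llama3:70b "Reply with one word: ready"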

Pull Embedding Model

In this example, we are using nomic-embed-text. Make sure you pull the embedding model from Ollama.

ollama pull nomic-embed-text
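
To double-check that the embedding model works, a minimal sketch (assuming Ollama is running locally on its default port 11434) is to call Ollama's embeddings endpoint and confirm the response contains an embedding vector:

# request an embedding for a sample sentence; the response should include an "embedding" array of 768 numbers
curl http://localhost:11434/api/embeddings -d '{
  "model": "nomic-embed-text",
  "prompt": "Employees hired after 1990"
}'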

Starting Ollama Server

To start the Ollama server, you can either click the application icon on your Mac or enter the command below:

ollama serve 
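
Once the server is up, you can verify that it is listening (assuming the default port 11434):

curl http://localhost:11434
# expected response: "Ollama is running"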

To quit, click the Ollama icon in the menu bar and select Quit.

Quit Ollama from Menubar

Setting up Wren AI

What is Wren AI?

Wren AI is an open-source text-to-SQL solution that helps data teams get results and insights faster by asking business questions without writing SQL. Wren AI supports a wide range of data sources, such as DuckDB, MySQL, Microsoft SQL Server, and BigQuery, and supports both hosted and local LLM inference endpoints, such as OpenAI GPT-3.5-turbo and GPT-4o as well as local LLM hosts via Ollama.

Check out the Wren AI Website: https://www.getwren.ai/

Setting up Wren AI to Ollama Endpoint

Wren AI allows you to use custom LLMs and OpenAI-compatible endpoints; see how to set this up here.

Create .env.ai under your ~/.wrenai folder, and paste the configuration below.

## LLM
LLM_PROVIDER=ollama_llm # openai_llm, azure_openai_llm, ollama_llm
GENERATION_MODEL=llama3:70b
GENERATION_MODEL_KWARGS={"temperature": 0}

# openai or openai-api-compatible
LLM_OPENAI_API_KEY=
LLM_OPENAI_API_BASE=

# azure_openai
LLM_AZURE_OPENAI_API_KEY=
LLM_AZURE_OPENAI_API_BASE=
LLM_AZURE_OPENAI_VERSION=

# ollama
LLM_OLLAMA_URL=http://host.docker.internal:11434


## EMBEDDER
EMBEDDER_PROVIDER=ollama_embedder # openai_embedder, azure_openai_embedder, ollama_embedder
# supported embedding models providers by qdrant: https://qdrant.tech/documentation/embeddings/
EMBEDDING_MODEL=nomic-embed-text
EMBEDDING_MODEL_DIMENSION=768

# openai or openai-api-compatible
EMBEDDER_OPENAI_API_KEY=
EMBEDDER_OPENAI_API_BASE=

# azure_openai
EMBEDDER_AZURE_OPENAI_API_KEY=
EMBEDDER_AZURE_OPENAI_API_BASE=
EMBEDDER_AZURE_OPENAI_VERSION=

# ollama
EMBEDDER_OLLAMA_URL=http://host.docker.internal:11434


## DOCUMENT_STORE
DOCUMENT_STORE_PROVIDER=qdrant

QDRANT_HOST=qdrant
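
Before launching Wren AI, it is worth confirming that a Docker container can actually reach your Ollama server through host.docker.internal, since that is the hostname used in the configuration above. A minimal check using a throwaway curl container (host.docker.internal works out of the box with Docker Desktop on macOS and Windows; on Linux you may need an extra mapping such as --add-host=host.docker.internal:host-gateway):

# hit the Ollama endpoint from inside a container
docker run --rm curlimages/curl:latest http://host.docker.internal:11434
# expected response: "Ollama is running"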

Install & Launch Wren AI

Here, we use a Mac as an example to install Wren AI (if you are using another OS, please check the official docs for installation instructions).

Installing it on a Mac is very simple; just run the command below.

curl -L https://github.com/Canner/WrenAI/releases/latest/download/wren-launcher-darwin.tar.gz | tar -xz && ./wren-launcher-darwin

After the launcher is downloaded, the installation starts. When Wren AI asks which LLM provider you would like to use, choose Custom; Wren AI will then use the configuration from the .env.ai file we just set up.

Select the custom LLM provider option in the launcher

After you select Custom, Wren AI pulls the required containers and launches automatically once everything is installed.

Installing needed containers

After Wren AI is successfully installed, it launches the Wren AI UI at localhost:3000, as shown below.

The landing page in Wren AI.

Query MySQL With Llama 3 70B

Next, we are going to set up the “Employees Sample Database” from the MySQL official site as our demo dataset and build the semantic model based on the Entity-Relationship Diagram (ERD) shown there, which helps Llama 3 understand the dataset with better precision.

Installing MySQL

If you haven’t installed MySQL on your computer, please check here and find the proper installation package for your system.

Prepare Database

First, we are going to prepare the “Employees Sample Database”; the dataset is available on GitHub. Download the data from GitHub as shown below.

Download database from GitHub

Unzip the downloaded file, navigate to the folder in your terminal, and run the command below.

$ mysql -t < employees.sql -u root -p
Loading all the tables in the sample database

Testing your database

$ time mysql -t < test_employees_sha.sql -u root -p

If your database loaded successfully, it will show the tables as below; check out the different validation methods here.

Check if the data is successfully loaded.
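
As an additional optional check, you can count the rows in the main table directly; according to the sample database documentation, the employees table should contain roughly 300,000 rows:

mysql -u root -p -e "SELECT COUNT(*) AS employee_count FROM employees.employees;"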

Connect to your MySQL Database from Wren AI

If your MySQL database runs locally on your Mac, enter docker.for.mac.localhost as the host so the Wren AI containers can reach it.
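
If you want to confirm that a container can reach your local MySQL before filling in the connection form (an optional check, assuming MySQL listens on the default port 3306 and Docker is available), you can run the MySQL client from a throwaway container:

# connect from a container to the host's MySQL through the Mac-specific hostname
docker run --rm -it mysql:8.0 mysql -h docker.for.mac.localhost -P 3306 -u root -p -e "SHOW DATABASES;"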

Below are the settings I filled out in my Wren AI UI.

Setting up connection

After successfully connecting to the database, you will be guided to the next step: selecting tables. Here, we choose all the tables.

Select needed tables

After clicking Next, you can set up the relationships between your tables, as shown below.

Set up relationships with your models.

After clicking the Finish button, you will be guided to the Home page; switch to the modeling page, and you can see the semantic modeling diagram below!

Completed importing your models into Wren AI

Modeling your datasets

Below is the database schema of the Employees Sample Database provided by the MySQL documentation, usually called the “Entity-Relationship Diagram” (ERD). The ERD is important for many data teams because it provides a visual representation of the database structure, making it easier to understand and manage data relationships, query data, and create reports.

This information is critical for LLMs to gain a deeper understanding of your data structure and provide more accurate answers about your data.

The ERD of the datasets provided in the MySQL official docs

In Wren AI, we can define the information from the ERD on the modeling page; in the background, these definitions are stored in a Modeling Definition Language (MDL), which conveys the semantic context of the models to the execution engine. MDL is a design we call the LLM Intermediate Representation (LIR) between your LLM and your data system.

View Semantic Metadata & Preview Data

Clicking on a model opens a drawer from the side, where you can add semantics describing the meaning of the table, its columns, and its relationship definitions. Here, you can also preview the data in the model.

Add/ Update Semantic Metadata

You can add and update the semantic metadata by clicking the Edit button below.

Add/ Update/ Delete Semantic Relationships

Within each relationship, you can also add, edit, and delete the semantic relationships between models so the LLM knows whether they are one-to-one, one-to-many, or many-to-many.

You can update/ delete/ create semantic relationships on the modeling page.

Click Deploy to save and update the context.

After you complete all the modeling configurations, remember to deploy your settings using the top-right button. Once the models are synced, a checkmark appears beside the button.

Synced your models

Final Result

In the snapshots below, we set up the exact ERD from the “Employees Sample Database” in Wren AI. Now Llama 3 can understand your data structure accurately.

Setting up ERD on Wren AI

Start asking questions

Now, switch to the Home page in the Wren AI UI and click New Thread. You can start asking any of your business questions; Wren AI searches for relevant semantics through the vector store and provides results with step-by-step reasoning so you can review whether the answer it gives is correct.

Follow up on further questions.

After looking at the results from Wren AI, you can ask follow-up questions to gain deeper insights.

Try it yourself!

Setting up Wren AI with local LLM models hosted on Ollama to query your database is straightforward. Wren AI is open-source on GitHub. Give it a try with your data!

Thank you to Jimmy Yeh and Yaida Colindres for helping review this article and providing feedback!

If you love our work, please support and star us on GitHub!

🚀 GitHub: https://github.com/canner/wrenai

🙌 Website: https://www.getwren.ai/

Don’t forget to give Wren AI a ⭐ star on GitHub if you’ve enjoyed this article, and as always, thank you for reading.
