Build a translator app using Tune Studio, LangChain, and Streamlit
Use Tune Studio to integrate LangChain with the Mistral AI API to translate user-provided input to a given language, and build the frontend with Streamlit.
This tutorial will show you how to build a translator app that uses Tune Studio to integrate LangChain with the Mistral AI API and Streamlit for the frontend.
The final app will provide an interface for users to input text and select a target language to translate it to. The backend will receive these inputs and query the Mistral AI model through the Tune Studio API. The LLM will return the translated text to be displayed on the frontend.
You can find the complete code for this guide here.
The final app will look like this:
Getting Started
You will need to install some Python dependencies and manage them using a virtual environment.
In your project folder, run the following command to create a virtual environment:
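For example, using Python's built-in venv module (the environment name `venv` here is just a convention):

```bash
python -m venv venv
```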
Activate the virtual environment with the following command:
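On macOS or Linux, activation looks like this (on Windows you would run `venv\Scripts\activate` instead):

```bash
source venv/bin/activate
```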
Inside the virtual environment, install the dependencies:
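The exact dependency list is an assumption, but this guide needs LangChain, its OpenAI integration (for the OpenAI-compatible chat model), and Streamlit:

```bash
pip install langchain langchain-openai streamlit
```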
Your environment is now ready to start creating the app.
Creating an OpenAI chat instance
Let’s start by defining some helper functions to handle the steps of the translation process.

Since the Tune Studio API is OpenAI-compatible, we can interact with the Mistral model using the same handlers we would use for OpenAI. Add a function to create an OpenAI chat instance.
In your project folder, create a `translator.py` file. Import the following modules:
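A minimal set of imports, assuming the langchain-openai and langchain-core packages provide the chat model, prompt template, and output parser used below:

```python
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
```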
Define the following function after the imports:
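A sketch of this helper, assuming a `create_chat` function name; the base URL and model identifier below are placeholders to replace with the values shown in your Tune Studio dashboard:

```python
def create_chat(api_key: str) -> ChatOpenAI:
    """Create an OpenAI-compatible chat instance that points at the Tune Studio API."""
    return ChatOpenAI(
        base_url="https://proxy.tune.app/",  # placeholder: your Tune Studio API base URL
        api_key=api_key,                     # the user's Tune Studio API key
        model="mistral/mixtral-8x7b-inst-v0-1-32k",  # placeholder: a Mistral model ID from Tune Studio
    )
```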
This function initializes a new OpenAI chat instance using LangChain. We specify the Tune Studio API base URL and provide a Tune Studio API key. We will obtain this key from the user in a later step. We also specify which model to use.
Formatting and translating text
Next we will define a `translate` function that will receive user input, process the translation, and return a response.
The user input and selected target language must be provided to the LLM in a specific format. Let’s create a function to format the prompt accordingly:
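One way to write this helper, assuming a `format_prompt` function that turns the text and target language into chat messages (the system prompt wording is an assumption):

```python
def format_prompt(text: str, language: str):
    """Format the user's text and target language into messages for the LLM."""
    prompt = ChatPromptTemplate.from_messages([
        ("system", "You are a translator. Translate the user's text into {language}. "
                   "Reply with only the translated text."),
        ("human", "{text}"),
    ])
    # Fill the placeholders and return the finished list of chat messages.
    return prompt.format_messages(language=language, text=text)
```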
This function uses the LangChain `ChatPromptTemplate` to format the messages correctly. We call this function each time the user makes a request using the app.
Now we can bring all this together with the `translate` function:
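A sketch of `translate` under the assumptions above; the `(text, language, api_key)` signature is an assumption that matches how the Streamlit app calls it later:

```python
def translate(text: str, language: str, api_key: str) -> str:
    """Translate the given text into the target language via the Tune Studio API."""
    chat = create_chat(api_key)
    messages = format_prompt(text, language)
    parser = StrOutputParser()
    # Chain the chat model with the string output parser and run the conversation.
    chain = chat | parser
    return chain.invoke(messages)
```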
This function receives the user input and a Tune Studio API key. It then calls the helper functions we defined previously to create a chat instance and format the prompt. We define the output parser using the LangChain `StrOutputParser`. Finally, we build and invoke the conversation chain, and return the response.
Building the Streamlit frontend
Now we’ll build a Streamlit frontend for users to interact with.
In your project root folder, create a new `app.py` file. In this file, import the Streamlit library and the `translate` function we wrote earlier:
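Assuming `translator.py` sits alongside `app.py`:

```python
import streamlit as st

from translator import translate
```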
Add a title for the Streamlit app and a field to receive the user’s Tune Studio API key:
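For example (the title text is an assumption):

```python
st.title("Translator")

# Collect the Tune Studio API key in the sidebar, masked like a password.
with st.sidebar:
    api_key = st.text_input("Tune Studio API key", type="password")
```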
This will render a password field in the sidebar for the user to enter their key.
Let’s display a form to receive the text input and language selections:
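A sketch of the form, assuming a text area for the input and a selectbox for the target language (the language list and labels are assumptions):

```python
with st.form("translator_form"):
    text = st.text_area("Text to translate")
    language = st.selectbox(
        "Target language",
        ["Select a language", "French", "German", "Spanish", "Japanese"],
    )
    submitted = st.form_submit_button("Translate")
```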
Validating inputs
Let’s add input validation so the app can handle some common errors before we call the `translate` function.
After defining the form fields, add the following code in the `with` block:
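Continuing inside the `with st.form` block from the previous snippet, a sketch of the validation checks and the call to `translate` (the error messages and placeholder language option are assumptions):

```python
    if submitted:
        if not api_key:
            st.error("Please enter your Tune Studio API key.")
        elif not text.strip():
            st.error("Please enter some text to translate.")
        elif language == "Select a language":
            st.error("Please select a target language.")
        else:
            # Inputs look valid: translate the text and show the result.
            st.write(translate(text, language, api_key))
```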
This code checks for simple errors, such as forgetting to enter an API key or selecting an invalid language. If none of these errors are present, the app calls the `translate` function and displays the response.
Running the app
Start the app by running the following command in the project folder:
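The standard Streamlit command, using the `app.py` filename from earlier:

```bash
streamlit run app.py
```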
The Streamlit app will launch in your browser. You can now enter your Tune Studio API key and interact with the translation app.