A ChatGPT App with Streamlit — Advanced Version

Johnny Chen
8 min read · Aug 5, 2024


I am literally building this app as I write this post. That is how much OpenAI’s API empowers me, and how easy Streamlit, the dashboard tool, makes it to create AI apps. Let’s see how it goes.

Cover image generated by DALL·E 3

To start, why would I want to write a ChatGPT app when OpenAI already offers a free web app? Here are a few reasons (specific to me):

  1. Due to VPN restrictions, my work laptop cannot access the OpenAI webpage. I believe many companies’ IT departments block it over privacy concerns.
  2. I don’t want to pay OpenAI’s $20+ monthly fee because I don’t use the web app often (I use GitHub Copilot for coding instead). But sometimes I still need ChatGPT, because Copilot only answers coding questions. So I want pay-as-I-go.
  3. It’s cool to share this knowledge :)

OK, I just created an empty .py script. What should I write?

Wait, Streamlit has a tutorial for creating a chatbot! Let’s copy it. Go to the Build a ChatGPT-like app section and you will find the complete code showing how to create a chat UI and call the API:

import streamlit as st
from openai import OpenAI

st.title("ChatGPT-like clone")

# Set your OpenAI API key (hard-coded here for simplicity; the tutorial loads it from Streamlit secrets)
client = OpenAI(api_key="put-your-api-key-here")

# Set a default model
if "openai_model" not in st.session_state:
    st.session_state["openai_model"] = "gpt-3.5-turbo"

# Initialize chat history
if "messages" not in st.session_state:
    st.session_state.messages = []

# Display chat messages from history on app rerun
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])

# Accept user input
if prompt := st.chat_input("What is up?"):
    # Add user message to chat history
    st.session_state.messages.append({"role": "user", "content": prompt})
    # Display user message in chat message container
    with st.chat_message("user"):
        st.markdown(prompt)

    # Display assistant response in chat message container
    with st.chat_message("assistant"):
        stream = client.chat.completions.create(
            model=st.session_state["openai_model"],
            messages=[
                {"role": m["role"], "content": m["content"]}
                for m in st.session_state.messages
            ],
            stream=True,
        )
        response = st.write_stream(stream)
    st.session_state.messages.append({"role": "assistant", "content": response})

This is easy, right?
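To try it locally, install the two dependencies with pip install streamlit openai, paste in your API key, and launch it with streamlit run app.py (assuming you saved the script as app.py).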

Now, let me earn some extra credit by making it a little more advanced. How about a login page for entering the API key instead of hard-coding it?

To do that, we need the concept of session state and some kind of database. Streamlit keeps per-session state in st.session_state, so we can use it to hold the API key after the user enters it. The database is there so I can fetch previously stored keys the next time I come back. For simplicity, I will use a JSON file to store my data.
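Conceptually, the JSON “database” is nothing more than a dict that gets loaded from and dumped back to disk. Here is a minimal sketch of that idea (load_db and save_db are hypothetical helper names; the full script below inlines this logic instead of defining them):

import json
import os

DB_FILE = 'db.json'

def load_db() -> dict:
    # On first run, create the file with an empty structure
    if not os.path.exists(DB_FILE):
        db = {'openai_api_keys': [], 'chat_history': []}
        with open(DB_FILE, 'w') as file:
            json.dump(db, file)
        return db
    # Otherwise, read the stored dict back
    with open(DB_FILE, 'r') as file:
        return json.load(file)

def save_db(db: dict) -> None:
    # Overwrite the file with the updated dict
    with open(DB_FILE, 'w') as file:
        json.dump(db, file)

With that in mind, here is the full script with the login page: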

import streamlit as st
from openai import OpenAI
import json
import os


DB_FILE = 'db.json'

def main():
    client = OpenAI(api_key=st.session_state.openai_api_key)

    # Set a default model
    if "openai_model" not in st.session_state:
        st.session_state["openai_model"] = "gpt-4o-mini"

    # Initialize chat history
    if "messages" not in st.session_state:
        st.session_state.messages = []

    # Display chat messages from history on app rerun
    for message in st.session_state.messages:
        with st.chat_message(message["role"]):
            st.markdown(message["content"])

    # Accept user input
    if prompt := st.chat_input("What is up?"):
        # Add user message to chat history
        st.session_state.messages.append({"role": "user", "content": prompt})
        # Display user message in chat message container
        with st.chat_message("user"):
            st.markdown(prompt)

        # Display assistant response in chat message container
        with st.chat_message("assistant"):
            stream = client.chat.completions.create(
                model=st.session_state["openai_model"],
                messages=[
                    {"role": m["role"], "content": m["content"]}
                    for m in st.session_state.messages
                ],
                stream=True,
            )
            response = st.write_stream(stream)
        st.session_state.messages.append({"role": "assistant", "content": response})


if __name__ == '__main__':

    if 'openai_api_key' in st.session_state and st.session_state.openai_api_key:
        main()

    else:
        # if DB_FILE does not exist, create it with an empty structure
        if not os.path.exists(DB_FILE):
            with open(DB_FILE, 'w') as file:
                db = {
                    'openai_api_keys': [],
                    'chat_history': []
                }
                json.dump(db, file)
        # otherwise, load the database
        else:
            with open(DB_FILE, 'r') as file:
                db = json.load(file)

        # display a selectbox of the keys stored in db['openai_api_keys']
        selected_key = st.selectbox(
            label="Existing OpenAI API Keys",
            options=db['openai_api_keys']
        )

        # a text input box for entering a new key
        new_key = st.text_input(
            label="New OpenAI API Key",
            type="password"
        )

        login = st.button("Login")

        # if a new key is given, save it to db['openai_api_keys'] and use it;
        # otherwise, use the selected existing key
        if login:
            if new_key:
                db['openai_api_keys'].append(new_key)
                with open(DB_FILE, 'w') as file:
                    json.dump(db, file)
                st.success("Key saved successfully.")
                st.session_state['openai_api_key'] = new_key
                st.rerun()
            else:
                if selected_key:
                    st.success(f"Logged in with key '{selected_key}'")
                    st.session_state['openai_api_key'] = selected_key
                    st.rerun()
                else:
                    st.error("API Key is required to login")

The tutorial code is wrapped inside the main() function. The __name__ section is where the login page is kept separate from the chat UI. The logic is:
1. Check whether openai_api_key is in the session state: if yes, run the main page; if not, show the login page.

2. On the login page, provide a dropdown of the stored keys. Users can also provide a new key, which will be used and stored to the database.

After the first login, the previously entered key is stored and shows up as an existing key.

After logging in, you can see the chat UI from the tutorial, as above.

How about going a bit further by enabling model selection?

Use st.sidebar.selectbox to put the model selection UI on the left. This is the updated main() function:

def main():
    client = OpenAI(api_key=st.session_state.openai_api_key)

    # List of models
    models = ["gpt-4o-mini", "gpt-4o", "gpt-4-turbo", "gpt-4", "gpt-3.5-turbo"]

    # Create a select box for the models
    st.session_state["openai_model"] = st.sidebar.selectbox("Select OpenAI model", models, index=0)

    # Initialize chat history
    if "messages" not in st.session_state:
        st.session_state.messages = []

    # Display chat messages from history on app rerun
    for message in st.session_state.messages:
        with st.chat_message(message["role"]):
            st.markdown(message["content"])

    # Accept user input
    if prompt := st.chat_input("What is up?"):
        # Add user message to chat history
        st.session_state.messages.append({"role": "user", "content": prompt})
        # Display user message in chat message container
        with st.chat_message("user"):
            st.markdown(prompt)

        # Display assistant response in chat message container
        with st.chat_message("assistant"):
            stream = client.chat.completions.create(
                model=st.session_state["openai_model"],
                messages=[
                    {"role": m["role"], "content": m["content"]}
                    for m in st.session_state.messages
                ],
                stream=True,
            )
            response = st.write_stream(stream)
        st.session_state.messages.append({"role": "assistant", "content": response})

You likely noticed there is a chat_history key in db.json.

Exactly: let’s store the chat history so that we can revisit it anytime.

def main():
    client = OpenAI(api_key=st.session_state.openai_api_key)

    # List of models
    models = ["gpt-4o-mini", "gpt-4o", "gpt-4-turbo", "gpt-4", "gpt-3.5-turbo"]

    # Create a select box for the models
    st.session_state["openai_model"] = st.sidebar.selectbox("Select OpenAI model", models, index=0)

    # Load chat history from db.json
    with open(DB_FILE, 'r') as file:
        db = json.load(file)
        st.session_state.messages = db.get('chat_history', [])

    # Display chat messages from history on app rerun
    for message in st.session_state.messages:
        with st.chat_message(message["role"]):
            st.markdown(message["content"])

    # Accept user input
    if prompt := st.chat_input("What is up?"):
        # Add user message to chat history
        st.session_state.messages.append({"role": "user", "content": prompt})
        # Display user message in chat message container
        with st.chat_message("user"):
            st.markdown(prompt)

        # Display assistant response in chat message container
        with st.chat_message("assistant"):
            stream = client.chat.completions.create(
                model=st.session_state["openai_model"],
                messages=[
                    {"role": m["role"], "content": m["content"]}
                    for m in st.session_state.messages
                ],
                stream=True,
            )
            response = st.write_stream(stream)
        st.session_state.messages.append({"role": "assistant", "content": response})

        # Store chat history to db.json
        db['chat_history'] = st.session_state.messages
        with open(DB_FILE, 'w') as file:
            json.dump(db, file)

    # Add a "Clear Chat" button to the sidebar
    if st.sidebar.button('Clear Chat'):
        # Clear chat history in db.json
        db['chat_history'] = []
        with open(DB_FILE, 'w') as file:
            json.dump(db, file)
        # Clear chat messages in session state
        st.session_state.messages = []
        st.rerun()

With this change, you can come back to your app anytime and see what you have chatted about. I also added a “Clear Chat” button to start over.

This is the complete script:

import streamlit as st
from openai import OpenAI
import json
import os


DB_FILE = 'db.json'

def main():
    client = OpenAI(api_key=st.session_state.openai_api_key)

    # List of models
    models = ["gpt-4o-mini", "gpt-4o", "gpt-4-turbo", "gpt-4", "gpt-3.5-turbo"]

    # Create a select box for the models
    st.session_state["openai_model"] = st.sidebar.selectbox("Select OpenAI model", models, index=0)

    # Load chat history from db.json
    with open(DB_FILE, 'r') as file:
        db = json.load(file)
        st.session_state.messages = db.get('chat_history', [])

    # Display chat messages from history on app rerun
    for message in st.session_state.messages:
        with st.chat_message(message["role"]):
            st.markdown(message["content"])

    # Accept user input
    if prompt := st.chat_input("What is up?"):
        # Add user message to chat history
        st.session_state.messages.append({"role": "user", "content": prompt})
        # Display user message in chat message container
        with st.chat_message("user"):
            st.markdown(prompt)

        # Display assistant response in chat message container
        with st.chat_message("assistant"):
            stream = client.chat.completions.create(
                model=st.session_state["openai_model"],
                messages=[
                    {"role": m["role"], "content": m["content"]}
                    for m in st.session_state.messages
                ],
                stream=True,
            )
            response = st.write_stream(stream)
        st.session_state.messages.append({"role": "assistant", "content": response})

        # Store chat history to db.json
        db['chat_history'] = st.session_state.messages
        with open(DB_FILE, 'w') as file:
            json.dump(db, file)

    # Add a "Clear Chat" button to the sidebar
    if st.sidebar.button('Clear Chat'):
        # Clear chat history in db.json
        db['chat_history'] = []
        with open(DB_FILE, 'w') as file:
            json.dump(db, file)
        # Clear chat messages in session state
        st.session_state.messages = []
        st.rerun()


if __name__ == '__main__':

    if 'openai_api_key' in st.session_state and st.session_state.openai_api_key:
        main()

    else:
        # if DB_FILE does not exist, create it with an empty structure
        if not os.path.exists(DB_FILE):
            with open(DB_FILE, 'w') as file:
                db = {
                    'openai_api_keys': [],
                    'chat_history': []
                }
                json.dump(db, file)
        # otherwise, load the database
        else:
            with open(DB_FILE, 'r') as file:
                db = json.load(file)

        # display a selectbox of the keys stored in db['openai_api_keys']
        selected_key = st.selectbox(
            label="Existing OpenAI API Keys",
            options=db['openai_api_keys']
        )

        # a text input box for entering a new key
        new_key = st.text_input(
            label="New OpenAI API Key",
            type="password"
        )

        login = st.button("Login")

        # if a new key is given, save it to db['openai_api_keys'] and use it;
        # otherwise, use the selected existing key
        if login:
            if new_key:
                db['openai_api_keys'].append(new_key)
                with open(DB_FILE, 'w') as file:
                    json.dump(db, file)
                st.success("Key saved successfully.")
                st.session_state['openai_api_key'] = new_key
                st.rerun()
            else:
                if selected_key:
                    st.success(f"Logged in with key '{selected_key}'")
                    st.session_state['openai_api_key'] = selected_key
                    st.rerun()
                else:
                    st.error("API Key is required to login")

As I am coding while writing, I might have created some bugs. Feel free to comment below to help me fix them. I hope you like it!

This is how I created the Idea Lab page in HAZL

It is quite simple, right? And yet the combination of ChatGPT and Streamlit is so powerful. I used similar code to create the Idea Lab feature in HAZL, where users can interact with our chatbot and submit ideas.

The only differences are that I added functions to store multiple chat histories with titles. I actually use ChatGPT itself to summarize a title for each chat (cool, right?).
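For illustration, the title summarization can be done with the same chat completions API. This is only a sketch (summarize_title is a hypothetical helper, not the exact code used in HAZL):

def summarize_title(client, messages):
    # Ask the model for a short title describing the conversation
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Summarize this conversation in a title of at most six words."},
            *messages,
        ],
    )
    return completion.choices[0].message.content.strip()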

And of course, I used AWS Postgres as my database instead of a JSON file, because this is a production environment.
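If you go that route, swapping the JSON file for Postgres mostly means replacing the json.load/json.dump calls with SQL. A minimal sketch using psycopg2, assuming a simple chat_history(id serial, messages text) table and placeholder connection settings (not HAZL’s actual schema):

import json
import psycopg2

# Placeholder connection settings; adjust to your environment
conn = psycopg2.connect(host="your-db-host", dbname="chatapp", user="app", password="secret")

def save_chat_history(messages):
    # Store the whole conversation as a JSON string in one row
    with conn, conn.cursor() as cur:
        cur.execute(
            "INSERT INTO chat_history (messages) VALUES (%s)",
            (json.dumps(messages),),
        )

def load_latest_chat_history():
    # Fetch the most recently stored conversation, if any
    with conn, conn.cursor() as cur:
        cur.execute("SELECT messages FROM chat_history ORDER BY id DESC LIMIT 1")
        row = cur.fetchone()
        return json.loads(row[0]) if row else []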

I will soon publish this app in the App Gallery, and you can either use HAZL to run it or download the code to run locally. Enjoy!


Johnny Chen

Co-founder of HAZL, a platform for your one-stop cloud and AI services. Visit us at hazl.ca