Learn how to build a real-time chatbot with memory, designed as a core component for AI agents, using Python, LangChain, Streamlit, and OpenAI's GPT.

In this tutorial, we're building a chatbot with memory and real-time responses, using Python.

To do that, we will need:

▪️ Streamlit for the user interface.

▪️ LangChain for prompt handling and memory.

▪️ OpenAI as the LLM to generate responses.

▪️ And dotenv, to securely load our API key.
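Before the snippets below will run, the app needs its imports, the API key loaded from `.env`, and the chat history initialized in session state. A minimal setup sketch (the page title and the exact import paths for your installed LangChain version are assumptions):

```python
import streamlit as st
from dotenv import load_dotenv
from langchain_core.messages import HumanMessage, AIMessage
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI

# Reads OPENAI_API_KEY from a .env file so ChatOpenAI() can authenticate.
load_dotenv()

st.set_page_config(page_title="Chatbot")

# Streamlit reruns the script on every interaction, so the history must be
# created only once and persisted in session state.
if "chat_history" not in st.session_state:
    st.session_state.chat_history = []
```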

Video available on YouTube

Code available on GitHub

1- User Input

This is where we collect the user's message.

user_query = st.chat_input("Your Message")
if user_query is not None and user_query != "":
  st.session_state.chat_history.append(HumanMessage(user_query))

  with st.chat_message("Human"):
    st.markdown(user_query)
  with st.chat_message("AI"):
    ai_response = st.write_stream(get_response(user_query, st.session_state.chat_history))
    
  st.session_state.chat_history.append(AIMessage(ai_response))
  • When the user types something, it's wrapped in a HumanMessage and added to chat_history.
  • It's also displayed in the interface using st.chat_message("Human").
  • Once the stream finishes, the full reply is appended as an AIMessage, so it becomes part of the memory for the next turn.
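The wrap-and-append flow above can be sketched without Streamlit at all. Here, HumanMessage is a minimal stand-in for the langchain_core.messages class, just to show how each user turn lands in the history:

```python
# Minimal stand-in for langchain_core.messages.HumanMessage (illustration only).
class HumanMessage:
    def __init__(self, content):
        self.content = content

chat_history = []
user_query = "What is LangChain?"

# Same guard as in the app: ignore empty submissions.
if user_query is not None and user_query != "":
    chat_history.append(HumanMessage(user_query))

print(chat_history[0].content)  # What is LangChain?
```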

2- Response

This part defines how the assistant replies.

def get_response(query, chat_history):
  template = """
  You are a helpful assistant. Answer the user's questions,
  taking the conversation so far into account.

  Chat history: {chat_history}
  User question: {user_question}
  """
  prompt = ChatPromptTemplate.from_template(template)
  llm = ChatOpenAI()
  chain = prompt | llm | StrOutputParser()
  return chain.stream({
    "chat_history": chat_history,
    "user_question": query
  })
  • We build a custom prompt using ChatPromptTemplate, injecting the conversation history and the user's new question.
  • This prompt is sent to OpenAI's LLM via ChatOpenAI().
  • The output is parsed with StrOutputParser and streamed back live using st.write_stream().
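To see why this feels real-time: chain.stream() returns an iterator of string chunks, and st.write_stream() renders each chunk as it arrives, then returns the concatenated text. A stdlib-only sketch of that contract, with fake_stream standing in for chain.stream() (both the name and the chunking are assumptions for illustration):

```python
# Hypothetical stand-in for chain.stream(): yields the reply in small chunks.
def fake_stream(text, chunk_size=4):
    for i in range(0, len(text), chunk_size):
        yield text[i:i + chunk_size]

# st.write_stream() consumes such an iterator chunk by chunk and returns the
# full string — mimicked here with "".join().
full_response = "".join(fake_stream("Hello from the assistant!"))
print(full_response)  # Hello from the assistant!
```

This is why L33's `st.session_state.chat_history.append(AIMessage(ai_response))` works: by the time the stream is exhausted, `ai_response` holds the complete reply.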

3- Conversation Replay

for message in st.session_state.chat_history:
  if isinstance(message, HumanMessage):
    with st.chat_message("Human"):
      st.markdown(message.content)
  else:
    with st.chat_message("AI"):
      st.markdown(message.content)
  
  • We loop through the chat history and display messages using st.chat_message().
  • Depending on whether it's a HumanMessage or AIMessage, we show it in the right bubble.
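The role-selection logic in that loop can be sketched on its own. HumanMessage and AIMessage below are minimal stand-ins for the langchain_core.messages classes, and the collected strings stand in for the chat bubbles:

```python
# Minimal stand-ins for the langchain_core.messages classes (illustration only).
class HumanMessage:
    def __init__(self, content):
        self.content = content

class AIMessage:
    def __init__(self, content):
        self.content = content

chat_history = [
    HumanMessage("Hi!"),
    AIMessage("Hello! How can I help?"),
]

rendered = []
for message in chat_history:
    # In the app, this role is passed to st.chat_message() to pick the bubble.
    role = "Human" if isinstance(message, HumanMessage) else "AI"
    rendered.append(f"{role}: {message.content}")

print(rendered)
```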

For more explanation, watch the video below.

Code available on GitHub

#AI #AIAGENT #python #GPT #langchain