# Streamlit Chatbot

This project is a simple chatbot application built with Streamlit and the LangChain community integration for Ollama, which serves the LLaMA 3.1 model locally. It lets users chat with the model in the browser and see responses in real time.
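The repository's main.py is not reproduced here, but a minimal version of this kind of app looks roughly like the sketch below. It assumes the ChatOllama wrapper from langchain_community and a locally pulled llama3.1 model tag; the actual code in this repository may differ.

```python
import streamlit as st
from langchain_community.chat_models import ChatOllama
from langchain_core.messages import AIMessage, HumanMessage

st.title("Streamlit Chatbot")

# Model tag is an assumption; use whatever tag main.py actually passes to Ollama.
llm = ChatOllama(model="llama3.1")

# Keep the conversation across Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far.
for msg in st.session_state.messages:
    role = "user" if isinstance(msg, HumanMessage) else "assistant"
    with st.chat_message(role):
        st.write(msg.content)

# Read a new prompt, send the full history to the model, and show the reply.
if prompt := st.chat_input("Say something"):
    st.session_state.messages.append(HumanMessage(content=prompt))
    with st.chat_message("user"):
        st.write(prompt)

    response = llm.invoke(st.session_state.messages)
    st.session_state.messages.append(AIMessage(content=response.content))
    with st.chat_message("assistant"):
        st.write(response.content)
```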

## Getting Started

### Prerequisites

Ensure you have Python 3.8 or later installed on your machine. It's also recommended to use a virtual environment to manage dependencies.
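Because the app sends prompts to a locally hosted Ollama server, you also need Ollama installed and the LLaMA 3.1 model pulled. The exact model tag depends on what main.py requests; llama3.1 is assumed here:

```bash
# Download the LLaMA 3.1 weights so the chatbot can use them locally
ollama pull llama3.1
```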

### Setting Up the Project

  1. Clone the Repository

    git clone <your-repository-url>
    cd <your-repository-directory>
    
  2. Create and Activate a Virtual Environment

    python -m venv venv
    source venv/bin/activate    # macOS/Linux
    venv\Scripts\activate       # Windows
    
  3. Install Required Packages

    pip install -r requirements.txt
    

## Running the Application

  1. Run Streamlit

    streamlit run main.py
    
    
  2. Open the application

    Once the server is running, open your web browser and navigate to http://localhost:8501 to interact with the chatbot.

## Requirements

The repository ships with a requirements.txt. If you add or change dependencies, you can regenerate it with the following command:

```bash
pip freeze > requirements.txt
```
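
For reference, an app like this typically needs at least the packages below. This list is an assumption based on the project description, so prefer the requirements.txt that ships with the repository if it differs:

```text
streamlit
langchain
langchain-community
```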
