I created this project while learning Kafka; I needed a stream of realtime data to work with, and stock prices came to mind.
The project is composed of several applications orchestrated by Docker Compose.
A single Producer continually publishes price change messages to a topic that is read by multiple Consumers (a rough sketch of this flow follows the list):
- One Consumer saves each price event to a database.
- Another Consumer aggregates these rows into time buckets.
- A final Consumer pushes price events to the frontend via SSE.
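To make the flow concrete, here is a minimal sketch of what the Producer side might look like. It is illustrative only and assumes the kafka-python client, a broker reachable at localhost:9092, and a made-up ACME ticker; the project's actual code may differ, but it publishes to the same stock_price_changes topic used by the commands later in this README.

```python
# Illustrative sketch only -- not the project's actual Producer.
# Assumes the kafka-python client and a broker reachable at localhost:9092.
import json
import random
import time

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a random price change roughly once a second.
while True:
    event = {
        "ticker": "ACME",  # hypothetical symbol
        "price": round(random.uniform(90, 110), 2),
        "timestamp": time.time(),
    }
    producer.send("stock_price_changes", event)
    time.sleep(1)
```

Each Consumer subscribes to the same topic and handles every event in its own way, which is what lets a single stream of price changes feed the database, the aggregates, and the frontend at the same time.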
Installation is easy thanks to Docker Compose: just clone this repository and run the up command.
Docker Compose must be installed on your machine; it ships with Docker Desktop and can also be installed alongside Docker Engine.
- Clone this repo
git clone https://github.com/beakerandjake/kafka-hello-world
- Start the application
docker compose up -d
To stop the application
docker compose down
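If you want to check that everything came up, docker compose ps lists the running containers and their state:
docker compose ps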
Once all of the containers have started, navigate to http://localhost:8080
in your browser. You should see prices continually update; if you leave it running, you can watch the price chart change over time.
To view the price events, you can create a console consumer:
docker exec -it kafka kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic stock_price_changes
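By default the console consumer only shows new events; to replay everything already in the topic, add the --from-beginning flag:
docker exec -it kafka kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic stock_price_changes --from-beginning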
You can also follow the consumer logs:
docker logs consumer-realtime --follow
docker logs consumer-aggregate --follow
See each component's README for more information.