"dingdongs" can be traced back to Old English and Middle Dutch. The term "ding" originates from the Old English "dingan," meaning to strike or hit, often used to describe the sound of a bell. The term "dong" comes from the Middle Dutch "donckus," which refers to the sound a duck makes when hitting the water at or above the von Plonck ducking velocity. Combining these elements, "dingdongs" represents a harmonious and impactful sound, symbolizing our aim to make a resonant impact at BrainHack 2024.
```shell
docker build -t dingdongs-asr .
docker build -t dingdongs-nlp .
docker build -t dingdongs-vlm .
docker build -t dingdongs-autonomy .
docker build -t dingdongs-main .
```
```shell
# run without -d to show debug info
docker run -p 5001:5001 --gpus all -d dingdongs-asr
docker run -p 5002:5002 --gpus all -d dingdongs-nlp
docker run -p 5004:5004 --gpus all -d dingdongs-vlm
docker run -p 5003:5003 dingdongs-autonomy
docker run -p 5005:5005 dingdongs-main
```
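Once the containers are up, you can sanity-check them against their `/health` routes (the same routes registered with Vertex below). This is a minimal sketch: the ports are taken from the run commands above, and `check_health` is just a hypothetical helper, not part of the repo.

```shell
#!/bin/sh
# Hit a service's /health endpoint and report whether it responds.
check_health() {
    name=$1; port=$2
    if curl -sf -m 2 "http://localhost:${port}/health" >/dev/null 2>&1; then
        echo "${name} (port ${port}): healthy"
    else
        echo "${name} (port ${port}): unreachable"
    fi
}

check_health dingdongs-asr      5001
check_health dingdongs-nlp      5002
check_health dingdongs-autonomy 5003
check_health dingdongs-vlm      5004
check_health dingdongs-main     5005
```

If a GPU service shows unreachable right after `docker run`, give it a few seconds; model loading can delay the health endpoint.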
```shell
# tag the images for Artifact Registry
docker tag dingdongs-asr asia-southeast1-docker.pkg.dev/dsta-angelhack/repository-dingdongs/dingdongs-asr:finals
docker tag dingdongs-nlp asia-southeast1-docker.pkg.dev/dsta-angelhack/repository-dingdongs/dingdongs-nlp:finals
docker tag dingdongs-vlm asia-southeast1-docker.pkg.dev/dsta-angelhack/repository-dingdongs/dingdongs-vlm:finals
docker tag dingdongs-autonomy asia-southeast1-docker.pkg.dev/dsta-angelhack/repository-dingdongs/dingdongs-autonomy:finals
docker tag dingdongs-main asia-southeast1-docker.pkg.dev/dsta-angelhack/repository-dingdongs/dingdongs-main:finals

# push them
docker push asia-southeast1-docker.pkg.dev/dsta-angelhack/repository-dingdongs/dingdongs-asr:finals
docker push asia-southeast1-docker.pkg.dev/dsta-angelhack/repository-dingdongs/dingdongs-nlp:finals
docker push asia-southeast1-docker.pkg.dev/dsta-angelhack/repository-dingdongs/dingdongs-vlm:finals
docker push asia-southeast1-docker.pkg.dev/dsta-angelhack/repository-dingdongs/dingdongs-autonomy:finals
docker push asia-southeast1-docker.pkg.dev/dsta-angelhack/repository-dingdongs/dingdongs-main:finals
```
```shell
gcloud ai models upload --region asia-southeast1 --display-name 'dingdongs-asr' --container-image-uri asia-southeast1-docker.pkg.dev/dsta-angelhack/repository-dingdongs/dingdongs-asr:finals --container-health-route /health --container-predict-route /stt --container-ports 5001 --version-aliases default
gcloud ai models upload --region asia-southeast1 --display-name 'dingdongs-nlp' --container-image-uri asia-southeast1-docker.pkg.dev/dsta-angelhack/repository-dingdongs/dingdongs-nlp:finals --container-health-route /health --container-predict-route /extract --container-ports 5002 --version-aliases default
gcloud ai models upload --region asia-southeast1 --display-name 'dingdongs-vlm' --container-image-uri asia-southeast1-docker.pkg.dev/dsta-angelhack/repository-dingdongs/dingdongs-vlm:finals --container-health-route /health --container-predict-route /identify --container-ports 5004 --version-aliases default
```
```shell
# list running containers
docker ps

# stop a running container (but doesn't remove it)
docker kill CONTAINER_ID

# list images
docker images

# remove an image
docker rmi IMAGE_NAME

# remove a container (kill it first)
docker rm CONTAINER_NAME
```
```shell
# Stop all running containers
docker stop $(docker ps -aq)

# Remove all containers
docker rm $(docker ps -aq)

# Remove all images
docker rmi $(docker images -q)

# Remove all volumes (optional)
docker volume rm $(docker volume ls -q)

# Remove all networks (optional)
docker network rm $(docker network ls -q)

# Clean up dangling Docker objects (optional)
docker system prune -a -f --volumes
```
Create a `.env` file based on the provided `.env.example` file, and update it accordingly:

```shell
COMPETITION_IP="172.17.0.1"  # on Linux; "host.docker.internal" otherwise
LOCAL_IP="172.17.0.1"        # on Linux; "host.docker.internal" otherwise
USE_ROBOT="false"
```
Then run `docker compose up`. This should start the competition server locally, as well as the rest of the services, which are configured to connect to it.
You can build the simulator + competition server from the top-level directory by running:

```shell
docker build -t competition .
```

You can run the simulator + competition server from the top-level directory by running:

```shell
docker run -p 8000:8000 competition
```
Then access `localhost:8000` from your browser.
```shell
# start all the services
docker compose up

# force a build of all services and start them afterwards
docker compose up --build

# take down all the services
docker compose down

# start a particular docker compose file by name (it defaults to `docker-compose.yml` if not indicated)
docker compose -f docker-compose-finals.yml up
```
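For orientation, here is a skeleton of what a compose file for these services might look like. This is a hypothetical sketch only: the service names, images, and GPU reservation are assumptions based on the `docker run` commands above, not the contents of the actual `docker-compose-finals.yml`.

```yaml
# Hypothetical sketch; the real compose file may differ.
services:
  asr:
    image: dingdongs-asr
    ports:
      - "5001:5001"
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
  autonomy:
    image: dingdongs-autonomy
    ports:
      - "5003:5003"
  main:
    image: dingdongs-main
    ports:
      - "5005:5005"
    env_file: .env
```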
This workflow lets us quickly roll back to an old version of the code:
- Submit your code first and get the scores.
- Commit everything you need (except any binary files or ginormous text files, which should be gitignored).
- Tag the correct commit with the scores and the file structure (using `tree`). By default it tags the commit that `HEAD` points to, but you can `git log --oneline` and choose a different commit:

```shell
git tag -a asr-v1.0 -m "Descriptive tag title
Accuracy: 1.00000000
Speed: 1.00000000
Tree:
.
├── Dockerfile
├── README.md
├── model
│   ├── config.json
│   ├── pytorch_model.bin
│   └── tokenizer
│       ├── merges.txt
│       ├── special_tokens_map.json
│       ├── tokenizer_config.json
│       └── vocab.json
├── requirements.txt
└── src
    ├── NLPManager.py
    └── api_service.py"
```
- Push the tag to GitHub:

```shell
git push origin asr-v1.0
```
- Create the release and upload:
  - Go to Tags and click the one you just created.
  - For the title and message, just copy-paste from your tag message lol.
  - Check the "Set as pre-release" checkbox.
  - Download the binary files from Vertex and upload them here. The idea is that we should keep every single file, so we can quickly check out a previous tag and immediately build and submit.
- Follow semantic versioning (semver): major.minor.patch.
  - Major changes: completely different methodology (e.g., switching from QA to NER for NLP).
  - Minor changes: significant improvements (e.g., a retrained model).
  - Patch changes: small adjustments (e.g., minor code tweaks).
- Ensure the tag is on the commit that represents a complete, ready-to-submit state.
- Tags do not need to be sequential. You can tag earlier versions if you made an adjustment to a previous version.
- If this is the "baseline" submission (e.g. using a non-fine-tuned model or other stupid methods), tag it as v0; otherwise the first major version should be v1.
- The next minor version after v1.9 is v1.10, NOT v2.0.
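If you want to double-check that ordering, GNU `sort -V` (and `git tag --sort=v:refname`) compare version strings component by component, so v1.10 correctly lands after v1.9:

```shell
# Plain lexicographic sort would put v1.10 before v1.9;
# version sort compares each numeric component instead.
printf 'v1.10\nv1.2\nv1.9\n' | sort -V
# → v1.2
#   v1.9
#   v1.10
```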