Further separate task queues and increase timeouts #979

Merged 1 commit on Oct 4, 2024
4 changes: 2 additions & 2 deletions Procfile
@@ -7,6 +7,6 @@ release: ./manage.py migrate --check || ./manage.py migrate
 # https://devcenter.heroku.com/articles/http-routing#http-validation-and-restrictions
 # long request lines are useful for long DSL search queries
 web: gunicorn --timeout 120 --limit-request-line 8192 --bind 0.0.0.0:$PORT isic.wsgi
-worker: REMAP_SIGTERM=SIGQUIT celery --app isic.celery worker --loglevel INFO --without-heartbeat --concurrency 2 -X low-priority
-low_priority_worker: REMAP_SIGTERM=SIGQUIT celery --app isic.celery worker --loglevel INFO --without-heartbeat --concurrency 2 -Q low-priority
+worker: REMAP_SIGTERM=SIGQUIT ./deploy/worker.sh
+low_priority_worker: REMAP_SIGTERM=SIGQUIT ./deploy/low-priority-worker.sh
 beat: REMAP_SIGTERM=SIGQUIT celery --app isic.celery beat --loglevel INFO
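
Note: the old worker line used -X low-priority (consume everything except that queue), while the replacement scripts enumerate each worker's queues explicitly with --queues. A task opts into one of those queues through the queue option on its decorator. A minimal sketch of that routing follows; the task name and body are illustrative, not from this repo:

    # Sketch: routing a task to one of the dedicated queues above.
    # reindex_one_image is a hypothetical task, shown only to illustrate routing.
    from celery import shared_task

    @shared_task(queue="es-indexing")
    def reindex_one_image(image_pk: int):
        # Only a worker started with --queues es-indexing (here, the
        # low-priority worker) will consume this task; the default worker,
        # bound to --queues celery, never sees it.
        ...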
8 changes: 8 additions & 0 deletions deploy/low-priority-worker.sh
@@ -0,0 +1,8 @@
+#!/bin/bash
+set -e
+
+celery --app isic.celery worker \
+    --loglevel INFO \
+    --without-heartbeat \
+    --concurrency 2 \
+    --queues s3-log-processing,stats-aggregation,es-indexing
8 changes: 8 additions & 0 deletions deploy/worker.sh
@@ -0,0 +1,8 @@
+#!/bin/bash
+set -e
+
+celery --app isic.celery worker \
+    --loglevel INFO \
+    --without-heartbeat \
+    --concurrency 2 \
+    --queues celery
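
These two scripts split the workload: the default worker consumes only the built-in celery queue, and the low-priority worker handles the three named queues. As a sketch (assuming isic.celery exposes the Celery application as app, which the commands above suggest), the queue bindings of running workers can be checked with Celery's inspection API:

    # Sketch: list the queues each running worker is consuming.
    # Assumes isic.celery exposes the application object as `app`.
    from isic.celery import app

    # active_queues() maps worker hostnames to the queues they consume,
    # or returns None when no workers respond.
    active = app.control.inspect().active_queues()
    for hostname, queues in (active or {}).items():
        print(hostname, [q["name"] for q in queues])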
5 changes: 3 additions & 2 deletions isic/core/tasks.py
@@ -51,12 +51,13 @@ def share_collection_with_users_task(collection_pk: int, grantor_pk: int, user_p


 @shared_task(
-    soft_time_limit=900,
-    time_limit=910,
+    soft_time_limit=1200,
+    time_limit=1210,
     autoretry_for=(ConnectionError, TimeoutError),
     retry_backoff=True,
     retry_backoff_max=600,
     retry_kwargs={"max_retries": 3},
+    queue="es-indexing",
 )
 def sync_elasticsearch_index_task():
     bulk_add_to_search_index(Image.objects.with_elasticsearch_properties())
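
The timeout bump from 900/910 to 1200/1210 seconds preserves the usual Celery pairing of a soft limit just below the hard one: the soft limit raises SoftTimeLimitExceeded inside the task so it can clean up, and the hard limit kills the worker process roughly ten seconds later if the task still has not returned. A minimal sketch of that pattern (the helper functions are hypothetical, not this repo's code):

    # Sketch: how a soft/hard time-limit pair is typically handled.
    from celery import shared_task
    from celery.exceptions import SoftTimeLimitExceeded

    @shared_task(soft_time_limit=1200, time_limit=1210)
    def long_running_task():
        try:
            do_expensive_work()  # hypothetical helper
        except SoftTimeLimitExceeded:
            # Raised at 1200s, leaving ~10s to clean up before the
            # hard limit terminates the worker process at 1210s.
            release_partial_results()  # hypothetical helper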
8 changes: 6 additions & 2 deletions isic/stats/tasks.py
@@ -161,7 +161,7 @@ def _cdn_access_log_records(log_file_bytes: BytesIO) -> Iterable[dict]:
     }


-@shared_task(queue="low-priority")
+@shared_task(queue="stats-aggregation")
 def collect_image_download_records_task():
     """
     Collect CDN logs to record image downloads.
@@ -181,7 +181,11 @@ def collect_image_download_records_task():


 @shared_task(
-    soft_time_limit=600, time_limit=630, max_retries=5, retry_backoff=True, queue="low-priority"
+    soft_time_limit=600,
+    time_limit=630,
+    max_retries=5,
+    retry_backoff=True,
+    queue="s3-log-processing",
 )
 def process_s3_log_file_task(s3_log_object_key: str):
     logger.info("Processing s3 log file %s", s3_log_object_key)
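
For the retry settings above: with retry_backoff=True, Celery delays the first retry by about one second and doubles the delay on each subsequent retry, capping it at retry_backoff_max (600 seconds by default, and set explicitly on the indexing task). A small sketch of that schedule, ignoring the jitter Celery applies by default:

    # Sketch: the exponential backoff schedule implied by retry_backoff=True.
    def backoff_delay(retry_number: int, factor: int = 1, maximum: int = 600) -> int:
        # retry_number counts from 0 for the first retry
        return min(maximum, factor * (2 ** retry_number))

    # The five allowed retries of process_s3_log_file_task would wait
    # roughly 1, 2, 4, 8, 16 seconds (before jitter).
    print([backoff_delay(n) for n in range(5)])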