
Limit warmup memory usage #5524

Closed
wants to merge 12 commits
Conversation

fulmicoton
Contributor

We also measure the amount of memory taken by a split search, and log this.

@fulmicoton fulmicoton force-pushed the memory_log branch 3 times, most recently from 510c653 to 81280a7 Compare October 25, 2024 06:14

github-actions bot commented Oct 25, 2024

On SSD:

Average search latency is 1.0x that of the reference (lower is better).
Ref run id: 3986, ref commit: 02a5b6a

On GCS:

Average search latency is 1.02x that of the reference (lower is better).
Ref run id: 3987, ref commit: 02a5b6a

@PSeitz
Contributor

PSeitz commented Oct 25, 2024

Related: #5312

@fulmicoton fulmicoton changed the title Adds a concept of request_id for logging/correlation purpose. Adds a trace_id, split search time, and memory user for correlation purpose. Oct 25, 2024
@fulmicoton fulmicoton force-pushed the memory_log branch 4 times, most recently from 0749e6f to 1dd5544 Compare October 25, 2024 09:16
@fulmicoton fulmicoton force-pushed the memory_log branch 7 times, most recently from f971b9d to e1b7125 Compare November 15, 2024 02:46
@fulmicoton fulmicoton changed the title Adds a trace_id, split search time, and memory user for correlation purpose. Limit warmup memory usage Nov 15, 2024
fulmicoton and others added 4 commits November 26, 2024 10:03
Due to tantivy limitations, searching a split requires downloading all
of the required data and keeping it in memory. We call this phase
warmup.

Before this PR, the only thing that curbed memory usage was the search
permits: only N split searches may happen concurrently.
Unfortunately, the amount of data required varies vastly from one split
to another.

We need a mechanism to measure memory usage and avoid running more split
searches when memory is tight.

Just using a semaphore is, however, not an option. We do not know
beforehand how much memory a split search will require, so a semaphore
could easily lead to a deadlock.

Instead, this commit builds upon the search permit provider.

The search permit provider is in charge of managing a configurable memory budget for warmup memory.

We introduce a configurable "warmup_single_split_initial_allocation".
A new leaf split search cannot be started if this amount of memory is not
available. This initial allocation is meant to be greater than what will
actually be needed most of the time.

The split search then holds this allocation until the end of warmup.
After warmup, we can get the actual memory usage by interrogating the
warmup cache, and update the amount of memory held accordingly
(most of the time, this means releasing some memory).

In addition, this PR also releases the warmup search permit at this point:

We still have to perform the actual search, but the thread pool
will take care of limiting the number of concurrent tasks.

Closes #5355
Also attach the permit to the actual memory cache to ensure memory is freed at the right moment.
Adding an extra generic field to the cache to optionally allow permit tracking would be awkward.
Instead, we make the directory generic over the type of cache, and use a wrapped cache
when tracking is necessary.
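The acquire-then-resize lifecycle described above can be sketched as follows. This is a minimal, hypothetical illustration (names like `MemoryBudget` and `MemoryPermit` are illustrative, not Quickwit's actual types), assuming a blocking budget guarded by a mutex and condvar rather than the PR's async permit provider:

```rust
use std::sync::{Arc, Condvar, Mutex};

/// Tracks the bytes still available in the warmup memory budget.
/// Hypothetical sketch; not Quickwit's actual search permit provider.
struct MemoryBudget {
    available: Mutex<u64>,
    condvar: Condvar,
}

impl MemoryBudget {
    fn new(total_bytes: u64) -> Arc<Self> {
        Arc::new(MemoryBudget {
            available: Mutex::new(total_bytes),
            condvar: Condvar::new(),
        })
    }
}

/// Reservation held by one split search; released on drop.
struct MemoryPermit {
    budget: Arc<MemoryBudget>,
    held_bytes: u64,
}

/// Block until `initial_allocation` bytes can be reserved.
/// A new leaf split search cannot start before this succeeds.
fn acquire(budget: &Arc<MemoryBudget>, initial_allocation: u64) -> MemoryPermit {
    let mut available = budget.available.lock().unwrap();
    while *available < initial_allocation {
        available = budget.condvar.wait(available).unwrap();
    }
    *available -= initial_allocation;
    MemoryPermit {
        budget: Arc::clone(budget),
        held_bytes: initial_allocation,
    }
}

impl MemoryPermit {
    /// After warmup, adjust the reservation to the measured usage,
    /// waking up searches waiting on the freed memory.
    fn resize(&mut self, actual_bytes: u64) {
        let mut available = self.budget.available.lock().unwrap();
        if actual_bytes <= self.held_bytes {
            *available += self.held_bytes - actual_bytes;
            self.budget.condvar.notify_all();
        } else {
            *available = available.saturating_sub(actual_bytes - self.held_bytes);
        }
        self.held_bytes = actual_bytes;
    }
}

impl Drop for MemoryPermit {
    fn drop(&mut self) {
        let mut available = self.budget.available.lock().unwrap();
        *available += self.held_bytes;
        self.budget.condvar.notify_all();
    }
}

fn main() {
    let budget = MemoryBudget::new(100);
    let mut permit = acquire(&budget, 60); // optimistic initial allocation
    permit.resize(20); // measured usage after warmup: 40 bytes freed
    assert_eq!(*budget.available.lock().unwrap(), 80);
    drop(permit); // search done: remaining reservation released
    assert_eq!(*budget.available.lock().unwrap(), 100);
    println!("ok");
}
```

Because the initial allocation is usually an overestimate, the `resize` after warmup typically returns memory to the budget, unblocking queued split searches earlier than a fixed reservation would.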
Comment on lines 786 to 792
if is_top_1pct_memory_intensive(
    resource_stats.short_lived_cache_num_bytes,
    resource_stats.split_num_docs,
) {
    // We log at most 5 times per minute.
    quickwit_common::rate_limited_info!(
        limit_per_min = 5,
        split_num_docs = resource_stats.split_num_docs,
        short_lived_cache_num_bytes = resource_stats.short_lived_cache_num_bytes,
        query = %search_request.query_ast,
        "memory intensive query"
    );
}
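The macro above caps output at 5 log lines per minute. A minimal sketch of that rate-limiting behavior (a hypothetical `RateLimiter`, not the actual `quickwit_common::rate_limited_info!` implementation) could look like:

```rust
use std::time::{Duration, Instant};

/// Allows at most `limit_per_min` events per rolling one-minute window.
/// Hypothetical sketch of "log at most N times per minute".
struct RateLimiter {
    window_start: Instant,
    count: u32,
    limit_per_min: u32,
}

impl RateLimiter {
    fn new(limit_per_min: u32) -> Self {
        RateLimiter {
            window_start: Instant::now(),
            count: 0,
            limit_per_min,
        }
    }

    /// Returns true if the caller is still within the per-minute budget.
    fn should_log(&mut self) -> bool {
        // Reset the counter once the window has elapsed.
        if self.window_start.elapsed() >= Duration::from_secs(60) {
            self.window_start = Instant::now();
            self.count = 0;
        }
        if self.count < self.limit_per_min {
            self.count += 1;
            true
        } else {
            false
        }
    }
}

fn main() {
    let mut limiter = RateLimiter::new(5);
    // 10 back-to-back attempts: only the first 5 pass within the window.
    let allowed = (0..10).filter(|_| limiter.should_log()).count();
    assert_eq!(allowed, 5);
    println!("ok");
}
```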
should we create a metric as well?

@rdettai rdettai closed this Nov 28, 2024
3 participants