This repository was archived by the owner on Apr 2, 2024. It is now read-only.

Autoscaler issue fixes; Episode 2 #7

Open
hhsecond wants to merge 2 commits into main from debugging2

Conversation

@hhsecond
Contributor

No description provided.

scale_in_interval=10,
max_batch_size=8, # for auto batching
timeout_batching=1, # for auto batching
max_batch_delay=1, # for auto batching
Contributor

batch_wait_interval would be slightly more explicit and consistent with the arguments above.
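For context, the auto-batching parameters in the hunk above (max_batch_size, timeout_batching) can be pictured as a simple drain loop: requests accumulate until the batch is full or the batching timeout elapses. This is an illustrative, dependency-free sketch under that assumption, not the repository's actual implementation; the function name collect_batch is hypothetical.

```python
import time


def collect_batch(request_queue, max_batch_size, timeout_batching):
    """Drain up to max_batch_size items, waiting at most timeout_batching seconds."""
    batch = []
    deadline = time.time() + timeout_batching
    while len(batch) < max_batch_size and time.time() < deadline:
        if request_queue:
            batch.append(request_queue.pop(0))
        else:
            time.sleep(0.01)  # nothing pending; poll until the deadline
    return batch
```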

self._last_autoscale = time.time()
self.fake_trigger = 0

for _ in range(min_replicas):
Contributor

For another PR: we need to clean this out with structures. This isn't beginner friendly.
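One way to read the "structures" suggestion above is to gather the loose autoscaler attributes (the `_last_autoscale` timestamp, `fake_trigger` counter, and the `min_replicas` startup loop) into a single dataclass. The class and field names below are hypothetical, a sketch of the cleanup rather than the repository's API.

```python
import time
from dataclasses import dataclass, field


@dataclass
class AutoscalerState:
    """Groups the scattered autoscaler bookkeeping into one structure."""
    min_replicas: int
    fake_trigger: int = 0
    last_autoscale: float = field(default_factory=time.time)

    def initial_replica_ids(self):
        # One worker slot per replica required at startup,
        # replacing the bare `for _ in range(min_replicas)` loop.
        return list(range(self.min_replicas))
```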

@@ -1,12 +1,12 @@
# !pip install 'git+https://github.com/Lightning-AI/stablediffusion.git@lit'
# !pip install 'git+https://github.com/Lightning-AI/DiffusionWithAutoscaler.git'
# !pip install 'git+https://github.com/Lightning-AI/DiffusionWithAutoscaler.git@debugging2'
Contributor

To be changed before merging.

max_batch_size: (auto-batching) The number of requests to process at once.
timeout_batching: (auto-batching) The number of seconds to wait before sending the requests to process.
max_batch_delay: (auto-batching) The number of seconds to wait before sending the requests to workers.
request_timeout: The number of seconds to wait before timing out a request. A request may timeout because of
Contributor

Did you add this already? Should this be dynamic based on the number of elements in the batch?
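The reviewer's question about making request_timeout depend on batch size could look like a simple linear rule: a fixed base plus a per-item allowance. The function name and both default values below are hypothetical, sketched only to illustrate the suggestion.

```python
def dynamic_request_timeout(batch_size: int,
                            base_timeout: float = 5.0,
                            per_item: float = 1.5) -> float:
    """Return a request timeout that grows linearly with batch size."""
    return base_timeout + per_item * batch_size
```

An empty batch keeps the base timeout, while a full batch of 8 gets 5.0 + 1.5 * 8 = 17.0 seconds.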

@Borda Borda requested a review from tchaton February 20, 2023 08:20

2 participants