
Allow the prompt request to specify the prompt ID. #8189


Conversation

FeepingCreature
Contributor

@FeepingCreature FeepingCreature commented May 18, 2025

This makes it easier to write asynchronous clients that submit requests, because they can store the task by ID immediately rather than waiting for the prompt API to return.

Specifically I want it so that Krita AI Diffusion can stop fetching the prompt id async, because I suspect it's causing a timing issue.
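A minimal client-side sketch of what this enables (hypothetical request shape; `prompt_id` is the new field this PR adds, and the workflow graph is elided):

```python
import json
import uuid

# Generate the prompt_id locally so the task can be tracked the moment it
# is submitted, instead of waiting for the /prompt response to learn a
# server-assigned id.
prompt_id = str(uuid.uuid4())

payload = {
    "prompt": {},  # the workflow graph would go here
    "prompt_id": prompt_id,
}

# This body would be POSTed to ComfyUI's /prompt endpoint; the client can
# index its own bookkeeping by prompt_id before the request even returns.
body = json.dumps(payload).encode("utf-8")
```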

@FeepingCreature
Contributor Author

Should I change the API demo to generate a uuid on the client side?

@christian-byrne
Collaborator

Good change. Maybe there can be an example/comment/line added here https://github.com/comfyanonymous/ComfyUI/blob/master/script_examples/basic_api_example.py since there is no api spec for the webserver yet.

@FeepingCreature
Contributor Author

Added an example to the websocket api test, because the simple api test doesn't do anything with the response.

@comfyanonymous
Owner

This is fine but you need to add a check to see if the prompt id is already used if it is specified and return an appropriate error.

@FeepingCreature
Contributor Author

Okay so I can do that but it's gonna be ugly. Because of the PromptQueue mutex, the only place to check that is in PromptQueue.put, otherwise there's always a chance of timing errors. But right now put doesn't even know what a prompt_id is. If there's a trivial way to do it I don't see it. I'd just trust the client to rng correctly.

@robinjhuang robinjhuang added the Good PR This PR looks good to go, it needs comfy's final review. label Jul 1, 2025
@deepanjanroy

@FeepingCreature we're taking a look at this again (we now need this feature ourselves) and I'll get back to you likely within the next day. It's been a while so just wanted to check if you're still around to make some changes if we need to? No worries if you're busy, we can build on top of your PR. Thanks a lot for this!

@FeepingCreature
Contributor Author

Cheers! I'm here, just lmk what you need.

@FeepingCreature FeepingCreature force-pushed the change/allow-specifying-prompt-id branch from d4228c2 to 436a0bb on July 8, 2025 04:19
@deepanjanroy

So I agree with @comfyanonymous that we should check for duplicate prompt ids - it's not difficult to have a frontend bug where we send the same prompt id multiple times, and ideally the server wouldn't end up in an inconsistent state because of it. Duplicate prompt ids break the server in subtle ways; for example, the history list is keyed by prompt id. In general, it's good to maintain the property that prompt ids are unique.

Do you want to keep track of the used prompt ids in PromptQueue with a set? In PromptQueue.put, inside the mutex section, you can check the set for reuse (and throw an error if so), and add the id to the set otherwise. In PromptServer's post_prompt, you can catch the error and return an error response. This is technically a memory leak, but I see we allow 10,000 history items (which is a lot) and the prompt id is a subset of each history item, so it should be fine for now and we can optimize later.
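The suggested check could be sketched like this (a minimal sketch under the assumptions above, not ComfyUI's actual PromptQueue; the class layout and the convention that `item[1]` is the prompt id are assumptions):

```python
import threading

class PromptQueue:
    """Hypothetical sketch: reject reused prompt ids inside the
    mutex-protected section of put()."""

    def __init__(self):
        self.mutex = threading.RLock()
        self.queue = []
        self.seen_prompt_ids = set()  # grows over time; bounded in
                                      # practice by the history limit

    def put(self, item):
        # By convention here, item[1] carries the prompt_id.
        prompt_id = item[1]
        with self.mutex:
            if prompt_id in self.seen_prompt_ids:
                # post_prompt would catch this and return an error response
                raise ValueError(f"prompt_id already used: {prompt_id}")
            self.seen_prompt_ids.add(prompt_id)
            self.queue.append(item)
```

Doing the check and the insertion under the same lock is the point: checking anywhere outside `put` leaves a window for two requests with the same id to race past each other.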

@FeepingCreature FeepingCreature force-pushed the change/allow-specifying-prompt-id branch from 436a0bb to d2900ae on July 9, 2025 19:20
@FeepingCreature
Contributor Author

FeepingCreature commented Jul 9, 2025

Alright, how about this? I refactored the job queue a bit so that it just uses the prompt_id for everything, but it's a bit smelly now in that it exploits the fact that it knows item[1] is the prompt_id. I didn't want to change the API because I wasn't sure if any extensions used the job queue directly.

edit: Hm, though I changed get already so I might as well.

edit: Hm, no I'd have to change get_current_queue which I don't trust. I think this fairly minimal change is fine actually.

@FeepingCreature FeepingCreature force-pushed the change/allow-specifying-prompt-id branch 2 times, most recently from d0909af to d675724 on July 9, 2025 19:24
@deepanjanroy

Thanks for the changes! They look good to me

Summary of changes since last update:

  • We're keeping track of all the used prompt IDs in PromptQueue and guaranteeing that the id will be unique
  • We're getting rid of task_counter, which was the auto-incrementing unique ID of a task before. We use prompt id in its place.

I tested the PR locally with latest frontend. I can queue jobs, cancel jobs, see them in history. Everything seems to work.

@comfyanonymous / any other code owners: want to approve and merge?

@deepanjanroy

@FeepingCreature many apologies for the churn, but we discussed this more with the core team and the team decided not to allow such explicit duplicate tracking. We'll allow duplicate prompt ids, as long as it doesn't break the server. Would you mind reverting back to your previous implementation?

I did a bunch of testing, and found that the only way to break the server is if we receive prompt IDs of inconsistent types (e.g. int and str, or list and str). We should just verify that the prompt ID is a string before putting it in the queue, and as long as we do that:

  • the server never crashes no matter what we do
  • if two prompts have same prompt id, the later one replaces the previous one in history
  • if multiple prompts have the same prompt id and number, then heapq.heappush can fail because the tuple comparison falls through all the way to the prompt dict, and we get a "'<' not supported for dicts" error. The server returns a 500 internal server error, but it doesn't crash, so it's essentially the equivalent of an error response. The server keeps working fine after - no inconsistent state.

So the only change we'd like you to make beyond the previous implementation is validating the prompt id is a string (and returning an error if not).
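The heappush failure mode described above can be reproduced in isolation (the item shape here is illustrative, not ComfyUI's exact queue tuple):

```python
import heapq

# Two queue items that tie on both number and prompt_id: tuple comparison
# falls through to the final element, and dicts don't define '<' in
# Python 3, so heappush raises TypeError. The request fails with an
# error, but the process keeps running.
heap = []
heapq.heappush(heap, (0, "same-id", {"workflow": "a"}))

try:
    heapq.heappush(heap, (0, "same-id", {"workflow": "b"}))
except TypeError:
    pass  # the server surfaces this as a 500 response and carries on
```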

This makes it easier to write asynchronous clients that submit requests, because they can store the task immediately.
Duplicate prompt IDs are rejected by the job queue.
@FeepingCreature FeepingCreature force-pushed the change/allow-specifying-prompt-id branch from d675724 to d58cd3a on July 11, 2025 07:36
@FeepingCreature
Contributor Author

I just moved str to the outside, is that fine?
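"Moving str to the outside" presumably looks something like this (hypothetical helper and field names; only the placement of `str(...)` is the point):

```python
import uuid

def extract_prompt_id(json_data):
    # Whatever the client sent (int, list, str, or nothing) is coerced to
    # a string in one place, so only str values ever reach the queue.
    # A missing id still gets a server-generated uuid.
    return str(json_data.get("prompt_id", uuid.uuid4()))

print(extract_prompt_id({"prompt_id": 42}))      # prints 42
print(extract_prompt_id({"prompt_id": [1, 2]}))  # prints [1, 2]
```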

@deepanjanroy

Hah, yeah that works. @comfyanonymous works for you? I tested:

  • No change in ComfyUI frontend task running
  • If client passes in different types (int, str, list) in the request json, everything just gets converted to string. Nothing crashes.
  • I can still force the "< is invalid for dict" error if there are too many requests with same prompt id and number, but that doesn't crash server. Otherwise we can use the number field to queue at front / in the back and it all works.

@bigcat88
Contributor

I actually really like this PR, as being able to specify prompt_id is very handy.

@comfyanonymous comfyanonymous merged commit 260a5ca into comfyanonymous:master Jul 14, 2025
5 checks passed
@FeepingCreature FeepingCreature deleted the change/allow-specifying-prompt-id branch July 16, 2025 08:40
rakki194 pushed a commit to rakki194/ComfyUI that referenced this pull request Jul 21, 2025
6 participants