10 changes: 10 additions & 0 deletions src/diffusers/loaders/controlnet.py
@@ -38,6 +38,9 @@ def from_single_file(cls, pretrained_model_link_or_path, **kwargs):
- A link to the `.ckpt` file (for example
`"https://huggingface.co/<repo_id>/blob/main/<path_to_file>.ckpt"`) on the Hub.
- A path to a *file* containing all pipeline weights.
config_file (`str`, *optional*):
Filepath to the configuration YAML file associated with the model. If not provided, it defaults to:
https://raw.githubusercontent.com/lllyasviel/ControlNet/main/models/cldm_v15.yaml
Comment on lines +41 to +43
Member

I don't think it makes the difference between original_config_file and config_file quite clear. Could you rephrase?

Collaborator Author

They both mean the same thing. We used `original_config_file` in the pipeline Mixin.

With the refactor, the internal logic `config_file = kwargs.pop("config_file")` was removed in favour of using `original_config_file`, like we do for the Pipelines.

But UIs like SD.Next use `config_file` when loading the VAE. Putting it back here for backwards compatibility.

We don't need to keep both. We can deprecate one of them.
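
For illustration, a minimal usage sketch of the two spellings (checkpoint and YAML filenames are hypothetical); both kwargs resolve to the same configuration:

```python
from diffusers import ControlNetModel

# Older integrations (e.g. SD.Next-style callers) pass `config_file`:
controlnet = ControlNetModel.from_single_file(
    "control_v11p_sd15_canny.pth",
    config_file="cldm_v15.yaml",
)

# Newer code uses `original_config_file`, matching the pipeline Mixin:
controlnet = ControlNetModel.from_single_file(
    "control_v11p_sd15_canny.pth",
    original_config_file="cldm_v15.yaml",
)
```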

Member

Okay. Then maybe in a follow-up PR, let's start their deprecation cycles?
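
A minimal sketch of what such a deprecation cycle could look like, assuming diffusers' `deprecate` utility; the removal version is illustrative:

```python
from diffusers.utils import deprecate

config_file = kwargs.pop("config_file", None)
if config_file is not None:
    # Warn now; remove the kwarg in a future release (version is illustrative).
    deprecate(
        "config_file",
        "1.0.0",
        "Passing `config_file` is deprecated; use `original_config_file` instead.",
    )
    original_config_file = config_file
```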

torch_dtype (`str` or `torch.dtype`, *optional*):
Override the default `torch.dtype` and load the model with another dtype. If `"auto"` is passed, the
dtype is automatically derived from the model's weights.
@@ -89,6 +92,7 @@ def from_single_file(cls, pretrained_model_link_or_path, **kwargs):
```
"""
original_config_file = kwargs.pop("original_config_file", None)
config_file = kwargs.pop("config_file", None)
resume_download = kwargs.pop("resume_download", False)
force_download = kwargs.pop("force_download", False)
proxies = kwargs.pop("proxies", None)
@@ -100,6 +104,12 @@ def from_single_file(cls, pretrained_model_link_or_path, **kwargs):
use_safetensors = kwargs.pop("use_safetensors", True)

class_name = cls.__name__
if (config_file is not None) and (original_config_file is not None):
raise ValueError(
"You cannot pass both `config_file` and `original_config_file` to `from_single_file`. Please use only one of these arguments."
)

original_config_file = config_file or original_config_file
original_config, checkpoint = fetch_ldm_config_and_checkpoint(
pretrained_model_link_or_path=pretrained_model_link_or_path,
class_name=class_name,
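
With the new guard, passing both kwargs now fails fast; a hedged sketch (filenames are illustrative):

```python
from diffusers import ControlNetModel

try:
    ControlNetModel.from_single_file(
        "control_v11p_sd15_canny.pth",   # illustrative checkpoint
        config_file="cldm_v15.yaml",     # illustrative config
        original_config_file="cldm_v15.yaml",
    )
except ValueError as err:
    print(err)  # cannot pass both `config_file` and `original_config_file`
```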