ControlNet¶
ControlNet models add extra conditions (like edge maps, depth maps, human poses) to Stable Diffusion models, allowing for more precise control over the generated image structure. RBLN NPUs can accelerate ControlNet model inference using Optimum RBLN.
Key Classes¶
RBLNControlNetModel
: The main model class for running ControlNetModel on RBLN NPUs.

RBLNControlNetModelConfig
: Configuration class specifically for RBLNControlNetModel.
Usage within Pipelines¶
Typically, you don't interact with RBLNControlNetModel directly. Instead, it is automatically loaded and managed as part of an RBLN ControlNet pipeline, such as RBLNStableDiffusionControlNetPipeline or RBLNStableDiffusionXLControlNetPipeline.
When configuring an RBLN ControlNet pipeline, you can pass settings specific to the ControlNet model via the controlnet argument in the pipeline's configuration object:
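A minimal sketch of such a configuration, passed as a plain dictionary. The model id is illustrative, and the pipeline call itself is commented out because compilation requires the RBLN SDK and an NPU; the controlnet sub-keys are those accepted by RBLNControlNetModelConfig:

```python
# Nested rbln_config: the "controlnet" entry is forwarded to RBLNControlNetModel.
rbln_config = {
    "controlnet": {
        "batch_size": 2,    # e.g. doubled to match classifier-free guidance
        "max_seq_len": 77,  # CLIP text-encoder sequence length
    },
}

# from optimum.rbln import RBLNStableDiffusionControlNetPipeline
# pipe = RBLNStableDiffusionControlNetPipeline.from_pretrained(
#     "runwayml/stable-diffusion-v1-5",  # hypothetical model id
#     export=True,
#     rbln_config=rbln_config,
# )
```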
Refer to the ControlNet Pipeline documentation (Standard, SDXL) for details on pipeline usage and configuration, including handling guidance scale effects on the ControlNet's batch size.
API Reference¶
Classes¶
RBLNControlNetModel¶
Bases: RBLNModel
Functions¶
from_pretrained(model_id, export=False, rbln_config=None, **kwargs) classmethod¶
The from_pretrained() function is used in its standard form, as in the HuggingFace transformers library. You can use it to load a pre-trained model from the HuggingFace hub or a local path and convert it into an RBLN model that runs on RBLN NPUs.
Parameters:

Name | Type | Description | Default
---|---|---|---
model_id | Union[str, Path] | The model id of the pre-trained model to be loaded. It can be downloaded from the HuggingFace model hub, a local path, or the model id of a model compiled with the RBLN Compiler. | required
export | bool | A boolean flag indicating whether the model should be compiled. | False
rbln_config | Optional[Union[Dict, RBLNModelConfig]] | Configuration for RBLN model compilation and runtime. This can be provided as a dictionary or an instance of the model's configuration class (e.g., RBLNControlNetModelConfig). | None
kwargs | Dict[str, Any] | Additional keyword arguments. Arguments with the prefix 'rbln_' are passed to rbln_config, while the remaining arguments are passed to the HuggingFace library. | {}
Returns:

Type | Description
---|---
Self | A RBLN model instance ready for inference on RBLN NPU devices.
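The kwargs routing described in the table ('rbln_'-prefixed arguments go to rbln_config, everything else to the HuggingFace library) can be illustrated with plain Python. This is a sketch of the documented behavior, not the library's actual implementation:

```python
def split_rbln_kwargs(kwargs):
    """Split kwargs into RBLN-config entries and HuggingFace passthrough args."""
    rbln_kwargs, hf_kwargs = {}, {}
    for key, value in kwargs.items():
        if key.startswith("rbln_"):
            rbln_kwargs[key[len("rbln_"):]] = value  # strip the "rbln_" prefix
        else:
            hf_kwargs[key] = value
    return rbln_kwargs, hf_kwargs

rbln_kwargs, hf_kwargs = split_rbln_kwargs(
    {"rbln_batch_size": 2, "revision": "main"}
)
# rbln_kwargs == {"batch_size": 2}; hf_kwargs == {"revision": "main"}
```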
from_model(model, *, rbln_config=None, **kwargs) classmethod¶
Converts and compiles a pre-trained HuggingFace library model into a RBLN model. This method performs the actual model conversion and compilation process.
Parameters:

Name | Type | Description | Default
---|---|---|---
model | PreTrainedModel | The PyTorch model to be compiled. The object must be an instance of the HuggingFace transformers PreTrainedModel class. | required
rbln_config | Optional[Union[Dict, RBLNModelConfig]] | Configuration for RBLN model compilation and runtime. This can be provided as a dictionary or an instance of the model's configuration class (e.g., RBLNControlNetModelConfig). | None
kwargs | Dict[str, Any] | Additional keyword arguments. Arguments with the prefix 'rbln_' are passed to rbln_config, while the remaining arguments are passed to the HuggingFace library. | {}
The method performs the following steps:
- Compiles the PyTorch model into an optimized RBLN graph
- Configures the model for the specified NPU device
- Creates the necessary runtime objects if requested
- Saves the compiled model and configurations
Returns:

Type | Description
---|---
Self | A RBLN model instance ready for inference on RBLN NPU devices.
save_pretrained(save_directory)¶
Saves a model and its configuration file to a directory so that it can be re-loaded using the from_pretrained class method.
Parameters:

Name | Type | Description | Default
---|---|---|---
save_directory | Union[str, PathLike] | The directory to save the model and its configuration files. Will be created if it doesn't exist. | required
Classes¶
RBLNControlNetModelConfig¶
Bases: RBLNModelConfig
Configuration class for RBLN ControlNet models.
Functions¶
__init__(batch_size=None, max_seq_len=None, unet_sample_size=None, vae_sample_size=None, text_model_hidden_size=None, **kwargs)¶
Parameters:

Name | Type | Description | Default
---|---|---|---
batch_size | Optional[int] | The batch size for inference. Defaults to 1. | None
max_seq_len | Optional[int] | Maximum sequence length for text inputs when used with cross-attention. | None
unet_sample_size | Optional[Tuple[int, int]] | The spatial dimensions (height, width) of the UNet input latents this ControlNet works with. | None
vae_sample_size | Optional[Tuple[int, int]] | The spatial dimensions (height, width) of the original image input to the ControlNet condition. | None
text_model_hidden_size | Optional[int] | Hidden size of the text encoder model used for conditioning (relevant for SDXL ControlNets). | None
**kwargs | Dict[str, Any] | Additional arguments passed to the parent RBLNModelConfig. | {}
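To illustrate how these sample sizes typically relate for Stable Diffusion, where the VAE downsamples images spatially by a factor of 8, here is a sketch with assumed values for a 512×512 pipeline. The dict mirrors the __init__ parameters above; constructing the actual RBLNControlNetModelConfig is commented out since it requires optimum-rbln and the RBLN SDK:

```python
# Assumed values for a hypothetical 512x512 Stable Diffusion ControlNet.
vae_sample_size = (512, 512)  # resolution of the original conditioning image
vae_scale_factor = 8          # Stable Diffusion's VAE spatial downsampling factor
unet_sample_size = tuple(s // vae_scale_factor for s in vae_sample_size)  # latent size

config_kwargs = dict(
    batch_size=1,
    max_seq_len=77,                     # CLIP text-encoder sequence length
    unet_sample_size=unet_sample_size,  # latent (height, width)
    vae_sample_size=vae_sample_size,    # image (height, width)
)
# from optimum.rbln import RBLNControlNetModelConfig
# config = RBLNControlNetModelConfig(**config_kwargs)
```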