
Stable Diffusion ControlNet

ControlNet augments a standard Stable Diffusion model (e.g., v1.5) with conditional controls that guide image generation through auxiliary inputs such as Canny edges, depth maps, or human pose estimates. Optimum RBLN provides accelerated pipelines for these models on RBLN NPUs.

Supported Pipelines

  • Text-to-Image with ControlNet: generates images from a text prompt, guided by a control image.
  • Image-to-Image with ControlNet: modifies an existing image based on a text prompt and a control image.

Key Classes

Important: Setting the Batch Size for the Guidance Scale

Batch Size and Guidance Scale

As with standard Stable Diffusion, running a ControlNet pipeline with a guidance scale > 1.0 (default: 7.5) doubles the effective runtime batch size of both the UNet and ControlNet models.

Make sure the batch_size specified in the unet and controlnet sections of RBLNStableDiffusionControlNetPipelineConfig matches the expected runtime batch size (typically 2x the inference batch size when guidance_scale > 1.0).

Default Behavior

If you do not explicitly set batch sizes for unet and controlnet in the configuration, Optimum RBLN assumes guidance_scale > 1.0 and automatically sets their batch sizes to twice the pipeline's batch_size.
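This default rule can be expressed as a small helper. The sketch below is illustrative only; effective_batch_size is a hypothetical name, not part of the Optimum RBLN API:

```python
from typing import Optional

def effective_batch_size(pipeline_batch_size: int,
                         guidance_scale: float = 7.5,
                         explicit_batch_size: Optional[int] = None) -> int:
    """Compile-time batch size expected for the unet/controlnet submodules."""
    if explicit_batch_size is not None:
        # An explicitly configured batch size is taken as-is.
        return explicit_batch_size
    # Classifier-free guidance runs the conditional and unconditional branches
    # in one batch, doubling the effective runtime batch size.
    return pipeline_batch_size * 2 if guidance_scale > 1.0 else pipeline_batch_size
```

For example, with the default guidance scale of 7.5 and an inference batch size of 1, the submodules must be compiled with a batch size of 2.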

API 참조

Classes

RBLNStableDiffusionControlNetPipeline

Bases: RBLNDiffusionMixin, StableDiffusionControlNetPipeline

Functions

from_pretrained(model_id, *, export=False, model_save_dir=None, rbln_config={}, lora_ids=None, lora_weights_names=None, lora_scales=None, **kwargs) classmethod

Load a pretrained diffusion pipeline from a model checkpoint, with optional compilation for RBLN NPUs.

This method has two distinct operating modes:

  • When export=True: Takes a PyTorch-based diffusion model, compiles it for RBLN NPUs, and loads the compiled model
  • When export=False: Loads an already compiled RBLN model from model_id without recompilation

It supports various diffusion pipelines including Stable Diffusion, Kandinsky, ControlNet, and other diffusers-based models.

Parameters:

  model_id (str, required)
      The model ID or path to the pretrained model to load. Can be either:
        • A model ID from the Hugging Face Hub
        • A local path to a saved model directory
  export (bool, default: False)
      If True, takes a PyTorch model from model_id and compiles it for RBLN NPU execution. If False, loads an already compiled RBLN model from model_id without recompilation.
  model_save_dir (Optional[PathLike], default: None)
      Directory to save the compiled model artifacts. Only used when export=True. If not provided and export=True, a temporary directory is used.
  rbln_config (Dict[str, Any], default: {})
      Configuration options for RBLN compilation. Can include settings for specific submodules such as text_encoder, unet, and vae. Configuration can be tailored to the specific pipeline being compiled.
  lora_ids (Optional[Union[str, List[str]]], default: None)
      LoRA adapter ID(s) to load and apply before compilation. LoRA weights are fused into the model weights during compilation. Only used when export=True.
  lora_weights_names (Optional[Union[str, List[str]]], default: None)
      Names of specific LoRA weight files to load, corresponding to lora_ids. Only used when export=True.
  lora_scales (Optional[Union[float, List[float]]], default: None)
      Scaling factor(s) to apply to the LoRA adapter(s). Only used when export=True.
  **kwargs (Dict[str, Any], default: {})
      Additional arguments to pass to the underlying diffusion pipeline constructor or the RBLN compilation process. These may include parameters specific to individual submodules or the particular diffusion pipeline being used.

Returns:

  Self
      A compiled diffusion pipeline that can be used for inference on RBLN NPU. The returned object is an instance of the class that called this method, inheriting from RBLNDiffusionMixin.
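A compile-and-save call might look like the sketch below. This is a hedged illustration: the wrapper function and its defaults are hypothetical, and any extra pipeline components (such as a diffusers ControlNetModel passed via **kwargs) are assumed rather than shown.

```python
def compile_controlnet_pipeline(model_id: str,
                                save_dir: str,
                                batch_size: int = 1,
                                guidance_scale: float = 7.5,
                                **pipeline_kwargs):
    """Compile a Stable Diffusion ControlNet pipeline for RBLN NPUs (sketch)."""
    from optimum.rbln import RBLNStableDiffusionControlNetPipeline

    # unet/controlnet must be compiled at the runtime batch size, which is
    # doubled when classifier-free guidance is enabled (guidance_scale > 1.0).
    runtime_batch = batch_size * 2 if guidance_scale > 1.0 else batch_size
    return RBLNStableDiffusionControlNetPipeline.from_pretrained(
        model_id,
        export=True,                 # compile the PyTorch checkpoint
        model_save_dir=save_dir,     # where compiled artifacts are written
        rbln_config={
            "batch_size": batch_size,
            "unet": {"batch_size": runtime_batch},
            "controlnet": {"batch_size": runtime_batch},
        },
        **pipeline_kwargs,           # e.g. a controlnet model for the pipeline
    )
```

Passing export=False with the save directory as model_id would instead reload the compiled artifacts without recompilation.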

RBLNStableDiffusionControlNetImg2ImgPipeline

Bases: RBLNDiffusionMixin, StableDiffusionControlNetImg2ImgPipeline

Functions

from_pretrained(model_id, *, export=False, model_save_dir=None, rbln_config={}, lora_ids=None, lora_weights_names=None, lora_scales=None, **kwargs) classmethod

Load a pretrained diffusion pipeline from a model checkpoint, with optional compilation for RBLN NPUs.

This method has two distinct operating modes:

  • When export=True: Takes a PyTorch-based diffusion model, compiles it for RBLN NPUs, and loads the compiled model
  • When export=False: Loads an already compiled RBLN model from model_id without recompilation

It supports various diffusion pipelines including Stable Diffusion, Kandinsky, ControlNet, and other diffusers-based models.

Parameters:

  model_id (str, required)
      The model ID or path to the pretrained model to load. Can be either:
        • A model ID from the Hugging Face Hub
        • A local path to a saved model directory
  export (bool, default: False)
      If True, takes a PyTorch model from model_id and compiles it for RBLN NPU execution. If False, loads an already compiled RBLN model from model_id without recompilation.
  model_save_dir (Optional[PathLike], default: None)
      Directory to save the compiled model artifacts. Only used when export=True. If not provided and export=True, a temporary directory is used.
  rbln_config (Dict[str, Any], default: {})
      Configuration options for RBLN compilation. Can include settings for specific submodules such as text_encoder, unet, and vae. Configuration can be tailored to the specific pipeline being compiled.
  lora_ids (Optional[Union[str, List[str]]], default: None)
      LoRA adapter ID(s) to load and apply before compilation. LoRA weights are fused into the model weights during compilation. Only used when export=True.
  lora_weights_names (Optional[Union[str, List[str]]], default: None)
      Names of specific LoRA weight files to load, corresponding to lora_ids. Only used when export=True.
  lora_scales (Optional[Union[float, List[float]]], default: None)
      Scaling factor(s) to apply to the LoRA adapter(s). Only used when export=True.
  **kwargs (Dict[str, Any], default: {})
      Additional arguments to pass to the underlying diffusion pipeline constructor or the RBLN compilation process. These may include parameters specific to individual submodules or the particular diffusion pipeline being used.

Returns:

  Self
      A compiled diffusion pipeline that can be used for inference on RBLN NPU. The returned object is an instance of the class that called this method, inheriting from RBLNDiffusionMixin.
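Loading an already compiled img2img pipeline (export=False) and running inference could look like the sketch below. The wrapper name is hypothetical, and the call keywords prompt/image/control_image follow the diffusers StableDiffusionControlNetImg2ImgPipeline convention; they are assumptions here, not confirmed by this reference.

```python
def run_controlnet_img2img(compiled_dir, prompt, init_image, control_image):
    """Run a previously compiled ControlNet img2img pipeline (sketch)."""
    from optimum.rbln import RBLNStableDiffusionControlNetImg2ImgPipeline

    # export=False loads the compiled artifacts without recompilation.
    pipe = RBLNStableDiffusionControlNetImg2ImgPipeline.from_pretrained(
        compiled_dir, export=False)
    # The source image is modified under the guidance of the control image.
    result = pipe(prompt=prompt, image=init_image, control_image=control_image)
    return result.images[0]
```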

Classes

RBLNStableDiffusionControlNetPipelineBaseConfig

Bases: RBLNModelConfig

Functions

__init__(text_encoder=None, unet=None, vae=None, controlnet=None, *, batch_size=None, img_height=None, img_width=None, sample_size=None, image_size=None, guidance_scale=None, **kwargs)

Parameters:

  text_encoder (Optional[RBLNCLIPTextModelConfig], default: None)
      Configuration for the text encoder component.
  unet (Optional[RBLNUNet2DConditionModelConfig], default: None)
      Configuration for the UNet model component.
  vae (Optional[RBLNAutoencoderKLConfig], default: None)
      Configuration for the VAE model component.
  controlnet (Optional[RBLNControlNetModelConfig], default: None)
      Configuration for the ControlNet model component.
  batch_size (Optional[int], default: None)
      Batch size for inference, applied to all submodules.
  img_height (Optional[int], default: None)
      Height of the generated images.
  img_width (Optional[int], default: None)
      Width of the generated images.
  sample_size (Optional[Tuple[int, int]], default: None)
      Spatial dimensions for the UNet model.
  image_size (Optional[Tuple[int, int]], default: None)
      Alternative way to specify image dimensions.
  guidance_scale (Optional[float], default: None)
      Scale for classifier-free guidance.
  **kwargs (Dict[str, Any], default: {})
      Additional arguments.
Note

Guidance scale affects UNet and ControlNet batch sizes. If guidance_scale > 1.0, their batch sizes are doubled.
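To make the note concrete: with an inference batch size of 1 and the default guidance scale of 7.5, an explicit configuration matching the doubled runtime batch size could be written as the dict below. This is an illustrative rbln_config fragment whose keys follow the parameters documented above.

```python
inference_batch_size = 1
guidance_scale = 7.5

# The unet and controlnet run the conditional and unconditional branches in
# one batch, so their compiled batch size doubles when guidance_scale > 1.0.
doubled = inference_batch_size * (2 if guidance_scale > 1.0 else 1)

rbln_config = {
    "batch_size": inference_batch_size,   # pipeline-level batch size
    "guidance_scale": guidance_scale,
    "unet": {"batch_size": doubled},
    "controlnet": {"batch_size": doubled},
}
```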

RBLNStableDiffusionControlNetPipelineConfig

Bases: RBLNStableDiffusionControlNetPipelineBaseConfig

Functions

__init__(text_encoder=None, unet=None, vae=None, controlnet=None, *, batch_size=None, img_height=None, img_width=None, sample_size=None, image_size=None, guidance_scale=None, **kwargs)

Parameters:

  text_encoder (Optional[RBLNCLIPTextModelConfig], default: None)
      Configuration for the text encoder component.
  unet (Optional[RBLNUNet2DConditionModelConfig], default: None)
      Configuration for the UNet model component.
  vae (Optional[RBLNAutoencoderKLConfig], default: None)
      Configuration for the VAE model component.
  controlnet (Optional[RBLNControlNetModelConfig], default: None)
      Configuration for the ControlNet model component.
  batch_size (Optional[int], default: None)
      Batch size for inference, applied to all submodules.
  img_height (Optional[int], default: None)
      Height of the generated images.
  img_width (Optional[int], default: None)
      Width of the generated images.
  sample_size (Optional[Tuple[int, int]], default: None)
      Spatial dimensions for the UNet model.
  image_size (Optional[Tuple[int, int]], default: None)
      Alternative way to specify image dimensions.
  guidance_scale (Optional[float], default: None)
      Scale for classifier-free guidance.
  **kwargs (Dict[str, Any], default: {})
      Additional arguments.
Note

Guidance scale affects UNet and ControlNet batch sizes. If guidance_scale > 1.0, their batch sizes are doubled.

RBLNStableDiffusionControlNetImg2ImgPipelineConfig

Bases: RBLNStableDiffusionControlNetPipelineBaseConfig

Functions

__init__(text_encoder=None, unet=None, vae=None, controlnet=None, *, batch_size=None, img_height=None, img_width=None, sample_size=None, image_size=None, guidance_scale=None, **kwargs)

Parameters:

  text_encoder (Optional[RBLNCLIPTextModelConfig], default: None)
      Configuration for the text encoder component.
  unet (Optional[RBLNUNet2DConditionModelConfig], default: None)
      Configuration for the UNet model component.
  vae (Optional[RBLNAutoencoderKLConfig], default: None)
      Configuration for the VAE model component.
  controlnet (Optional[RBLNControlNetModelConfig], default: None)
      Configuration for the ControlNet model component.
  batch_size (Optional[int], default: None)
      Batch size for inference, applied to all submodules.
  img_height (Optional[int], default: None)
      Height of the generated images.
  img_width (Optional[int], default: None)
      Width of the generated images.
  sample_size (Optional[Tuple[int, int]], default: None)
      Spatial dimensions for the UNet model.
  image_size (Optional[Tuple[int, int]], default: None)
      Alternative way to specify image dimensions.
  guidance_scale (Optional[float], default: None)
      Scale for classifier-free guidance.
  **kwargs (Dict[str, Any], default: {})
      Additional arguments.
Note

Guidance scale affects UNet and ControlNet batch sizes. If guidance_scale > 1.0, their batch sizes are doubled.