
ControlNet

The ControlNet model adds extra conditioning inputs (e.g., edge maps, depth maps, human poses) to a Stable Diffusion model, enabling finer control over the structure of generated images. RBLN NPUs can accelerate ControlNet model inference through Optimum RBLN.

Key Classes

Usage in Pipelines

You typically do not interact with RBLNControlNetModel directly. Instead, it is loaded and managed automatically as part of an RBLN ControlNet pipeline such as RBLNStableDiffusionControlNetPipeline or RBLNStableDiffusionXLControlNetPipeline.

When configuring an RBLN ControlNet pipeline, you can pass settings specific to the ControlNet model through the controlnet argument of the pipeline configuration object:

from optimum.rbln import RBLNStableDiffusionControlNetPipelineConfig

# Example: configure the pipeline with a specific batch size for the ControlNet
config = RBLNStableDiffusionControlNetPipelineConfig(
    batch_size=1,  # pipeline inference batch size
    # ... other SD pipeline settings ...
    controlnet=dict(
        batch_size=2  # ControlNet batch size (e.g., 2x for CFG)
    )
)

# ... load the pipeline using this config ...

For details on pipeline usage and configuration, including how the guidance scale affects the ControlNet's batch size, refer to the ControlNet pipeline documentation (standard, SDXL).
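A minimal end-to-end sketch of this flow, following the usual diffusers pattern of loading the ControlNet checkpoint first and handing it to the pipeline. The checkpoint names are illustrative, compiling requires an RBLN NPU environment, and the exact argument plumbing may differ from this sketch; see the pipeline documentation for the authoritative API.

```python
from diffusers import ControlNetModel
from optimum.rbln import (
    RBLNStableDiffusionControlNetPipeline,
    RBLNStableDiffusionControlNetPipelineConfig,
)

config = RBLNStableDiffusionControlNetPipelineConfig(
    batch_size=1,
    controlnet=dict(batch_size=2),  # e.g., 2x for classifier-free guidance
)

# Illustrative checkpoints; substitute the ones you actually use.
controlnet = ControlNetModel.from_pretrained("lllyasviel/sd-controlnet-canny")
pipe = RBLNStableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet,
    export=True,  # compile the checkpoint for the RBLN NPU
    rbln_config=config,
)
```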

API Reference

Classes

RBLNControlNetModel

Bases: RBLNModel

Functions

from_pretrained(model_id, export=False, rbln_config=None, **kwargs) classmethod

The from_pretrained() function is used in the same way as in the HuggingFace transformers library. You can use it to load a pre-trained model from the HuggingFace Hub and convert it into an RBLN model that runs on RBLN NPUs.

Parameters:

  - model_id (Union[str, Path], required): The model id of the pre-trained model to be loaded. It can be a model id on the HuggingFace model hub, a local path, or the model id of a model compiled with the RBLN Compiler.
  - export (bool, default False): Whether the model should be compiled.
  - rbln_config (Optional[Union[Dict, RBLNModelConfig]], default None): Configuration for RBLN model compilation and runtime. This can be provided as a dictionary or an instance of the model's configuration class (e.g., RBLNLlamaForCausalLMConfig for Llama models). For detailed configuration options, see the specific model's configuration class documentation.
  - kwargs (Dict[str, Any], default {}): Additional keyword arguments. Arguments with the prefix 'rbln_' are passed to rbln_config, while the remaining arguments are passed to the HuggingFace library.

Returns:

  - Self: An RBLN model instance ready for inference on RBLN NPU devices.
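The "rbln_" prefix routing described for kwargs can be illustrated with a small standalone sketch. This mirrors the documented behavior but is not the library's actual implementation, and stripping the prefix before insertion into rbln_config is an assumption:

```python
def split_rbln_kwargs(kwargs):
    """Route 'rbln_'-prefixed arguments to rbln_config; pass the rest to HF.

    Illustrative helper, not part of optimum.rbln's public API.
    """
    rbln_config = {}
    hf_kwargs = {}
    for key, value in kwargs.items():
        if key.startswith("rbln_"):
            # 'rbln_batch_size' becomes 'batch_size' inside rbln_config
            rbln_config[key[len("rbln_"):]] = value
        else:
            hf_kwargs[key] = value
    return rbln_config, hf_kwargs

rbln_cfg, hf = split_rbln_kwargs({"rbln_batch_size": 2, "torch_dtype": "float16"})
# rbln_cfg == {"batch_size": 2}; hf == {"torch_dtype": "float16"}
```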

from_model(model, *, rbln_config=None, **kwargs) classmethod

Converts and compiles a pre-trained HuggingFace library model into an RBLN model. This method performs the actual model conversion and compilation process.

Parameters:

  - model (PreTrainedModel, required): The PyTorch model to be compiled. The object must be an instance of the HuggingFace transformers PreTrainedModel class.
  - rbln_config (Optional[Union[Dict, RBLNModelConfig]], default None): Configuration for RBLN model compilation and runtime. This can be provided as a dictionary or an instance of the model's configuration class (e.g., RBLNLlamaForCausalLMConfig for Llama models). For detailed configuration options, see the specific model's configuration class documentation.
  - kwargs (Dict[str, Any], default {}): Additional keyword arguments. Arguments with the prefix 'rbln_' are passed to rbln_config, while the remaining arguments are passed to the HuggingFace library.

The method performs the following steps:

  1. Compiles the PyTorch model into an optimized RBLN graph
  2. Configures the model for the specified NPU device
  3. Creates the necessary runtime objects if requested
  4. Saves the compiled model and configurations

Returns:

  - Self: An RBLN model instance ready for inference on RBLN NPU devices.
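As a sketch, the from_model path starts from an already-instantiated PyTorch ControlNet. The checkpoint name is illustrative, and actually compiling requires an RBLN NPU environment:

```python
from diffusers import ControlNetModel
from optimum.rbln import RBLNControlNetModel

# Load the PyTorch model first (illustrative checkpoint).
torch_controlnet = ControlNetModel.from_pretrained("lllyasviel/sd-controlnet-canny")

# Compile it into an RBLN model; the rbln_config keys mirror
# RBLNControlNetModelConfig fields.
rbln_controlnet = RBLNControlNetModel.from_model(
    torch_controlnet,
    rbln_config={"batch_size": 2},
)
```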

save_pretrained(save_directory)

Saves a model and its configuration file to a directory, so that it can be re-loaded using the from_pretrained class method.

Parameters:

  - save_directory (Union[str, PathLike], required): The directory to save the model and its configuration files. Will be created if it doesn't exist.
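A short sketch of the save/reload round trip. It assumes rbln_controlnet is an already compiled RBLNControlNetModel instance (how it was obtained is omitted here):

```python
from optimum.rbln import RBLNControlNetModel

# `rbln_controlnet` is assumed to be a compiled RBLNControlNetModel.
rbln_controlnet.save_pretrained("./controlnet-rbln")

# The saved directory can later be reloaded without recompiling:
reloaded = RBLNControlNetModel.from_pretrained("./controlnet-rbln", export=False)
```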

Classes

RBLNControlNetModelConfig

Bases: RBLNModelConfig

Configuration class for RBLN ControlNet models.

Functions

__init__(batch_size=None, max_seq_len=None, unet_sample_size=None, vae_sample_size=None, text_model_hidden_size=None, **kwargs)

Parameters:

  - batch_size (Optional[int], default None): The batch size for inference. Defaults to 1.
  - max_seq_len (Optional[int], default None): Maximum sequence length for text inputs when used with cross-attention.
  - unet_sample_size (Optional[Tuple[int, int]], default None): The spatial dimensions (height, width) of the UNet input latents this ControlNet works with.
  - vae_sample_size (Optional[Tuple[int, int]], default None): The spatial dimensions (height, width) of the original image input to the ControlNet condition.
  - text_model_hidden_size (Optional[int], default None): Hidden size of the text encoder model used for conditioning (relevant for SDXL ControlNets).
  - **kwargs (Dict[str, Any], default {}): Additional arguments passed to the parent RBLNModelConfig.
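Putting the fields together, a configuration for a batch-2 (CFG) ControlNet working with 512x512 images might look like the following. All values are illustrative; note that unet_sample_size is in latent space (512 / 8 = 64) while vae_sample_size is in pixel space:

```python
from optimum.rbln import RBLNControlNetModelConfig

config = RBLNControlNetModelConfig(
    batch_size=2,                # e.g., doubled for classifier-free guidance
    max_seq_len=77,              # CLIP text encoder sequence length
    unet_sample_size=(64, 64),   # latent H, W for 512x512 images
    vae_sample_size=(512, 512),  # pixel-space H, W of the conditioning image
)
```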