
Diffusion Transformer for Cosmos

The RBLNCosmosTransformer3DModel is the RBLN-optimized version of the core transformer block used in NVIDIA's Cosmos World Foundation Models.

This transformer-based architecture replaces the UNet used in earlier diffusion models such as Stable Diffusion. It processes latent video representations, together with pooled embeddings from the text/video encoders and timestep information, to perform the diffusion process.


Usage within Pipelines

Typically, you don't interact with RBLNCosmosTransformer3DModel directly. Instead, it's automatically loaded and managed as part of an RBLN Cosmos pipeline, such as RBLNCosmosTextToWorldPipeline and RBLNCosmosVideoToWorldPipeline.

When configuring an RBLN Cosmos pipeline, you can pass specific settings for the transformer via the transformer argument in the pipeline's configuration object:

from optimum.rbln import RBLNCosmosTextToWorldPipelineConfig

# Example: Configure pipeline with specific batch size for the transformer
config = RBLNCosmosTextToWorldPipelineConfig(
    batch_size=1,
    height=704,
    width=1280,
    transformer={
        "tensor_parallel_size": 4,
        "device": [0, 1, 2, 3], # Allocate appropriate number of NPUs
    }
)

# ... load pipeline using this config ...
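The config above is then passed when loading the pipeline. A minimal sketch of that step, assuming a placeholder model id (actual compilation requires RBLN NPUs, so this fragment is illustrative rather than runnable here):

```python
from optimum.rbln import RBLNCosmosTextToWorldPipeline

# "<model_id>" is a placeholder -- substitute the checkpoint you intend to compile.
pipe = RBLNCosmosTextToWorldPipeline.from_pretrained(
    "<model_id>",
    export=True,        # compile the model on first load
    rbln_config=config, # the pipeline config object defined above
)
```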

API Reference

Classes

RBLNCosmosTransformer3DModel

Bases: RBLNModel

RBLN wrapper for the Cosmos Transformer model.

Functions

from_pretrained(model_id, export=False, rbln_config=None, **kwargs) classmethod

The from_pretrained() method works in its standard form, as in the HuggingFace transformers library. Use it to load a pre-trained model from the HuggingFace Hub (or a local path) and convert it into an RBLN model that runs on RBLN NPUs.

Parameters:

- model_id (Union[str, Path], required): The model id of the pre-trained model to be loaded. It can be a model id on the HuggingFace model hub, a local path, or the model id of a model compiled with the RBLN Compiler.
- export (bool, default: False): Whether the model should be compiled.
- rbln_config (Optional[Union[Dict, RBLNModelConfig]], default: None): Configuration for RBLN model compilation and runtime. This can be provided as a dictionary or as an instance of the model's configuration class (e.g., RBLNLlamaForCausalLMConfig for Llama models). For detailed configuration options, see the specific model's configuration class documentation.
- **kwargs (Dict[str, Any]): Additional keyword arguments. Arguments prefixed with 'rbln_' are passed to rbln_config, while the remaining arguments are passed to the HuggingFace library.

Returns:

- Self: An RBLN model instance ready for inference on RBLN NPU devices.
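The 'rbln_' prefix convention for keyword arguments can be illustrated with a small standalone sketch. This mimics the documented behavior; it is not the library's actual implementation, and the helper name is hypothetical:

```python
def split_kwargs(kwargs):
    """Separate 'rbln_'-prefixed arguments (destined for rbln_config)
    from the rest (forwarded to the HuggingFace library)."""
    rbln_kwargs = {
        k[len("rbln_"):]: v for k, v in kwargs.items() if k.startswith("rbln_")
    }
    hf_kwargs = {k: v for k, v in kwargs.items() if not k.startswith("rbln_")}
    return rbln_kwargs, hf_kwargs

# 'rbln_tensor_parallel_size' goes to rbln_config; 'torch_dtype' goes to HuggingFace.
rbln_kwargs, hf_kwargs = split_kwargs(
    {"rbln_tensor_parallel_size": 4, "torch_dtype": "bfloat16"}
)
```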

from_model(model, *, rbln_config=None, **kwargs) classmethod

Converts and compiles a pre-trained HuggingFace library model into an RBLN model. This method performs the actual model conversion and compilation process.

Parameters:

- model (PreTrainedModel, required): The PyTorch model to be compiled. The object must be an instance of the HuggingFace transformers PreTrainedModel class.
- rbln_config (Optional[Union[Dict, RBLNModelConfig]], default: None): Configuration for RBLN model compilation and runtime. This can be provided as a dictionary or as an instance of the model's configuration class (e.g., RBLNLlamaForCausalLMConfig for Llama models). For detailed configuration options, see the specific model's configuration class documentation.
- **kwargs (Dict[str, Any]): Additional keyword arguments. Arguments prefixed with 'rbln_' are passed to rbln_config, while the remaining arguments are passed to the HuggingFace library.

The method performs the following steps:

  1. Compiles the PyTorch model into an optimized RBLN graph
  2. Configures the model for the specified NPU device
  3. Creates the necessary runtime objects if requested
  4. Saves the compiled model and configurations

Returns:

- Self: An RBLN model instance ready for inference on RBLN NPU devices.
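A hedged sketch of the conversion flow. The diffusers source class and the placeholder model id are assumptions about the setup, and compilation itself requires RBLN NPUs, so this is illustrative rather than runnable here:

```python
from diffusers import CosmosTransformer3DModel  # assumed source class
from optimum.rbln import RBLNCosmosTransformer3DModel

# Load the PyTorch model first; "<model_id>" is a placeholder.
torch_model = CosmosTransformer3DModel.from_pretrained(
    "<model_id>", subfolder="transformer"
)

# Compile it into an RBLN model; rbln_config settings are illustrative.
rbln_model = RBLNCosmosTransformer3DModel.from_model(
    torch_model,
    rbln_config={"tensor_parallel_size": 4},
)
```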

save_pretrained(save_directory)

Saves a model and its configuration file to a directory, so that it can be re-loaded using the from_pretrained() class method.

Parameters:

- save_directory (Union[str, PathLike], required): The directory to save the model and its configuration files. Will be created if it doesn't exist.
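A minimal sketch of the save/reload round trip, assuming rbln_model is an already-compiled RBLNCosmosTransformer3DModel instance and using an illustrative directory path:

```python
# Save the compiled model and its configuration files.
rbln_model.save_pretrained("./cosmos_transformer_rbln")

# Reload later without recompiling (export defaults to False).
from optimum.rbln import RBLNCosmosTransformer3DModel
reloaded = RBLNCosmosTransformer3DModel.from_pretrained("./cosmos_transformer_rbln")
```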

Classes

RBLNCosmosTransformer3DModelConfig

Bases: RBLNModelConfig

Configuration class for RBLN Cosmos Transformer models.

Functions

__init__(batch_size=None, num_frames=None, height=None, width=None, fps=None, max_seq_len=None, embedding_dim=None, num_channels_latents=None, num_latent_frames=None, latent_height=None, latent_width=None, **kwargs)

Parameters:

- batch_size (Optional[int]): The batch size for inference. Defaults to 1.
- num_frames (Optional[int]): The number of frames in the generated video. Defaults to 121.
- height (Optional[int]): The height in pixels of the generated video. Defaults to 704.
- width (Optional[int]): The width in pixels of the generated video. Defaults to 1280.
- fps (Optional[int]): The frames per second of the generated video. Defaults to 30.
- max_seq_len (Optional[int]): Maximum sequence length of the prompt embeddings.
- embedding_dim (Optional[int]): Embedding vector dimension of the prompt embeddings.
- num_channels_latents (Optional[int]): The number of channels in latent space.
- num_latent_frames (Optional[int]): The number of frames in latent space.
- latent_height (Optional[int]): The height of the latent representation.
- latent_width (Optional[int]): The width of the latent representation.
- **kwargs (Dict[str, Any]): Additional arguments passed to the parent RBLNModelConfig.
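The latent-space parameters relate to the pixel-space ones through the video tokenizer's compression factors. Assuming a Cosmos tokenizer with 8x spatial and 8x temporal compression (an assumption about the checkpoint in use), the defaults above are mutually consistent:

```python
SPATIAL_COMPRESSION = 8   # assumed spatial compression factor
TEMPORAL_COMPRESSION = 8  # assumed temporal compression factor

num_frames, height, width = 121, 704, 1280  # documented defaults

latent_height = height // SPATIAL_COMPRESSION
latent_width = width // SPATIAL_COMPRESSION
# Assume the first frame is encoded alone and the rest in groups of
# TEMPORAL_COMPRESSION, giving (121 - 1) // 8 + 1 latent frames.
num_latent_frames = (num_frames - 1) // TEMPORAL_COMPRESSION + 1
```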

Raises:

- ValueError: If batch_size is not a positive integer.
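The ValueError behavior can be illustrated with a minimal standalone check. This sketches the documented validation rule, not the library's actual code, and the function name is hypothetical:

```python
def check_batch_size(batch_size):
    """Validate that batch_size, when provided, is a positive integer,
    mirroring the documented ValueError; fall back to the default of 1."""
    if batch_size is None:
        return 1  # documented default
    if not isinstance(batch_size, int) or isinstance(batch_size, bool) or batch_size <= 0:
        raise ValueError(f"batch_size must be a positive integer, got {batch_size!r}")
    return batch_size

default = check_batch_size(None)   # falls back to 1
explicit = check_batch_size(4)     # passes validation
```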