Quick Links
Congratulations on successfully setting up the RBLN SDK! You are now ready to run your PyTorch and TensorFlow models on the RBLN NPU.
Here is a comprehensive list of useful resources that can help you gain a better understanding of the RBLN SDK through examples.
Tutorials
We recommend that you explore the following tutorials for a better understanding of how to use the RBLN SDK:
- Basic
    - PyTorch (Vision) provides instructions on how to use the TorchVision library with the RBLN SDK through a ResNet50 example (a minimal compilation sketch follows this list).
    - PyTorch (NLP) provides instructions on how to use PyTorch with the RBLN SDK through a BERT-base example.
    - TensorFlow (Vision) provides instructions on how to use the TF Keras Applications library with the RBLN SDK through an EfficientNet-B0 example.
    - TensorFlow (NLP) provides instructions on how to use TensorFlow with the RBLN SDK through a BERT-base example.
- Advanced
    - Concurrent Processing explains how to execute the RBLN Runtime asynchronously (a generic concurrency sketch follows this list).
- LLM Serving
    - LLM Serving with Triton Inference Server explains how to use the RBLN SDK with the NVIDIA Triton Inference Server to serve Llama2-7B.
    - LLM Serving with Continuous Batching explains how to serve LLMs efficiently using vLLM (a minimal vLLM sketch follows this list).
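For quick orientation, here is a minimal sketch of what the PyTorch (Vision) tutorial walks through: compiling a TorchVision ResNet50 and running it on the NPU. The function and argument names (`rebel.compile_from_torch`, `input_info`, `rebel.Runtime`) and the output file name are assumptions based on the tutorials, so follow the linked tutorial for the exact API.

```python
# Minimal sketch (assumed API): compile a TorchVision ResNet50 with the
# rebel-compiler package and run one inference on the RBLN NPU.
import numpy as np
import torch
import rebel  # RBLN compiler/runtime package; see the PyTorch (Vision) tutorial
from torchvision import models

model = models.resnet50(weights="IMAGENET1K_V1").eval()

# Compile the model for the NPU. The exact signature of compile_from_torch
# and the input_info format are assumptions here.
compiled_model = rebel.compile_from_torch(
    model,
    input_info=[("input", [1, 3, 224, 224], torch.float32)],
)
compiled_model.save("resnet50.rbln")

# Load the compiled artifact and run a single inference.
runtime = rebel.Runtime("resnet50.rbln")
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)
logits = runtime.run(dummy)
print(np.argmax(logits))
```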
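The Concurrent Processing tutorial covers the RBLN Runtime's own asynchronous API. As a rough illustration of the pattern it addresses, the sketch below overlaps several requests with Python's standard-library thread pool; whether `Runtime.run` may be called concurrently like this is an assumption, so prefer the asynchronous facilities shown in the tutorial.

```python
# Generic illustration only: overlap several inference requests with a thread
# pool. The SDK's asynchronous runtime (covered in the Concurrent Processing
# tutorial) is the recommended mechanism; calling Runtime.run from multiple
# threads here is an assumption made for illustration.
from concurrent.futures import ThreadPoolExecutor

import numpy as np
import rebel  # assumed RBLN runtime package, as in the sketch above

runtime = rebel.Runtime("resnet50.rbln")  # compiled model from the previous sketch
batches = [np.random.rand(1, 3, 224, 224).astype(np.float32) for _ in range(8)]

with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(runtime.run, x) for x in batches]
    results = [f.result() for f in futures]

print(f"collected {len(results)} results")
```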
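For the continuous-batching tutorial, the sketch below shows the basic vLLM offline-generation API. The model name and any RBLN-specific engine options (device selection, plugin installation) are assumptions; the tutorial documents the exact configuration required to run on RBLN NPUs.

```python
# Minimal vLLM usage sketch. Running this on RBLN NPUs requires the RBLN
# integration described in the "LLM Serving with Continuous Batching" tutorial;
# the model checkpoint below is an assumption.
from vllm import LLM, SamplingParams

llm = LLM(model="meta-llama/Llama-2-7b-chat-hf")
params = SamplingParams(temperature=0.7, max_tokens=128)

# vLLM schedules these prompts with continuous batching under the hood.
outputs = llm.generate(
    ["What does continuous batching do?",
     "Summarize the RBLN SDK in one sentence."],
    params,
)
for out in outputs:
    print(out.outputs[0].text)
```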
Model Zoo
Check out the RBLN Model Zoo to learn more; it offers an in-depth look at the RBLN SDK by covering a wide range of PyTorch and TensorFlow models.
We also support HuggingFace transformers and diffusers models on single and multiple devices with optimum-rbln.
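As a rough sketch of how optimum-rbln is typically used, the snippet below follows the usual HuggingFace Optimum pattern of RBLN-prefixed model classes exported via `from_pretrained`. The class name, the `export` flag, and the model checkpoint are assumptions here; consult the optimum-rbln documentation for the exact interface.

```python
# Hedged sketch of the optimum-rbln workflow. RBLNLlamaForCausalLM, export=True,
# and the checkpoint name are assumptions following the general Optimum pattern;
# see the optimum-rbln documentation for the actual classes and options.
from transformers import AutoTokenizer
from optimum.rbln import RBLNLlamaForCausalLM  # assumed class name

model_id = "meta-llama/Llama-2-7b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Export (compile) the HuggingFace model for the RBLN NPU, then generate.
model = RBLNLlamaForCausalLM.from_pretrained(model_id, export=True)
inputs = tokenizer("Hello, RBLN!", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```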