| Latest Rust Documentation | Latest Python Documentation | Discord |
- Quantization
  - bitsandbytes format (fp4, nf4, and int8)
  - GGUF (2-8 bit quantization)
- Easy: Strong support for running 🤗 DDUF models.
- Strong Apple Silicon support: Metal, Accelerate, and ARM NEON
- Support for NVIDIA GPUs with CUDA
- AVX support for x86 CPUs
- Allows acceleration of models larger than the total VRAM size with offloading
Please do not hesitate to contact us with feature requests via GitHub issues!
Upcoming features:
- 🚧 LoRA support
- 🚧 CPU + GPU inference with automatic offloading, to allow partial acceleration of models larger than the total VRAM
Check out the installation guide for details.
After installing, you can try out these examples!
Download the DDUF file here:
wget https://huggingface.co/DDUF/FLUX.1-dev-DDUF/resolve/main/FLUX.1-dev-Q4-bnb.dduf
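If wget is not available, the same file can be fetched with the Hugging Face CLI (installed with the `huggingface_hub` Python package); this is an alternative sketch, assuming that tool is on your PATH:

huggingface-cli download DDUF/FLUX.1-dev-DDUF FLUX.1-dev-Q4-bnb.dduf --local-dir .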
CLI:
diffusion_rs_cli --scale 3.5 --num-steps 50 dduf -f FLUX.1-dev-Q4-bnb.dduf
More CLI examples here.
Python:
More Python examples here.
from diffusion_rs import DiffusionGenerationParams, ModelSource, Pipeline
from PIL import Image
import io
pipeline = Pipeline(source=ModelSource.DdufFile("FLUX.1-dev-Q4-bnb.dduf"))
image_bytes = pipeline.forward(
    prompts=["Draw a picture of a sunrise."],
    params=DiffusionGenerationParams(
        height=720, width=1280, num_steps=50, guidance_scale=3.5
    ),
)
image = Image.open(io.BytesIO(image_bytes[0]))
image.show()
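Because `forward` takes a list of prompts and (as the indexing above suggests) returns one encoded image per prompt, batching several prompts and saving each result is a small loop. A minimal sketch building on the pipeline created above; the extra prompt and file names are illustrative:

```python
# Reuse the pipeline from the example above to render several prompts in one call.
image_bytes = pipeline.forward(
    prompts=[
        "Draw a picture of a sunrise.",
        "Draw a picture of a snowy mountain at dusk.",
    ],
    params=DiffusionGenerationParams(
        height=720, width=1280, num_steps=50, guidance_scale=3.5
    ),
)

# Decode and save each returned buffer (one per prompt).
for i, data in enumerate(image_bytes):
    Image.open(io.BytesIO(data)).save(f"image_{i}.png")
```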
Rust crate:
Examples with the Rust crate: here.
use std::time::Instant;
use diffusion_rs_core::{DiffusionGenerationParams, ModelSource, ModelDType, Offloading, Pipeline, TokenSource};
use tracing::level_filters::LevelFilter;
use tracing_subscriber::EnvFilter;
let filter = EnvFilter::builder()
    .with_default_directive(LevelFilter::INFO.into())
    .from_env_lossy();
tracing_subscriber::fmt().with_env_filter(filter).init();
let pipeline = Pipeline::load(
    ModelSource::dduf("FLUX.1-dev-Q4-bnb.dduf")?,
    false,
    TokenSource::CacheToken,
    None,
    None,
    &ModelDType::Auto,
)?;
let start = Instant::now();
let images = pipeline.forward(
    vec!["Draw a picture of a sunrise.".to_string()],
    DiffusionGenerationParams {
        height: 720,
        width: 1280,
        num_steps: 50,
        guidance_scale: 3.5,
    },
)?;
let end = Instant::now();
println!("Took: {:.2}s", end.duration_since(start).as_secs_f32());
images[0].save("image.png")?;
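The `Offloading` type is imported above, but the example passes `None`, so nothing is offloaded. Below is a minimal sketch of enabling offloading when loading, reusing the imports from the snippet above; both the `Offloading::Full` variant and the position of the offloading argument are assumptions, so check the crate docs for the exact variants and signature:

```rust
// Sketch only: request offloading so a model larger than the available VRAM
// can still run, with part of it kept off the GPU.
// Assumptions: an `Offloading::Full` variant exists and the fifth argument of
// `Pipeline::load` is `Option<Offloading>`.
let pipeline = Pipeline::load(
    ModelSource::dduf("FLUX.1-dev-Q4-bnb.dduf")?,
    false,
    TokenSource::CacheToken,
    None,
    Some(Offloading::Full),
    &ModelDType::Auto,
)?;
```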
| Model | Supports DDUF | Supports quantized DDUF |
| --- | --- | --- |
| FLUX.1 Dev/Schnell | ✅ | ✅ |
- Anyone is welcome to contribute by opening PRs
- See good first issues for a starting point!
- Collaborators will be invited based on past contributions