Added Multinomial distribution #1478

Open · wants to merge 7 commits into master
Changes from 4 commits
1 change: 1 addition & 0 deletions rand_distr/CHANGELOG.md
@@ -8,6 +8,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
### Added
- Add plots for `rand_distr` distributions to documentation (#1434)
- Add `PertBuilder`, fix case where mode ≅ mean (#1452)
- Add `Multinomail` distribution
Contributor:

Typo here "ail" vs "ial"


## [0.5.0-alpha.1] - 2024-03-18
- Target `rand` version `0.9.0-alpha.1`
5 changes: 5 additions & 0 deletions rand_distr/src/lib.rs
@@ -48,6 +48,7 @@
//! - [`Cauchy`] distribution
//! - Related to Bernoulli trials (yes/no events, with a given probability):
//! - [`Binomial`] distribution
//! - [`Multinomial`] distribution
//! - [`Geometric`] distribution
//! - [`Hypergeometric`] distribution
//! - Related to positive real-valued quantities that grow exponentially
@@ -112,6 +113,9 @@ pub use self::geometric::{Error as GeoError, Geometric, StandardGeometric};
pub use self::gumbel::{Error as GumbelError, Gumbel};
pub use self::hypergeometric::{Error as HyperGeoError, Hypergeometric};
pub use self::inverse_gaussian::{Error as InverseGaussianError, InverseGaussian};
#[cfg(feature = "alloc")]
pub use self::multinomial::MultinomialDyn;
pub use self::multinomial::{Error as MultinomialError, Multinomial, MultinomialConst};
pub use self::normal::{Error as NormalError, LogNormal, Normal, StandardNormal};
pub use self::normal_inverse_gaussian::{
Error as NormalInverseGaussianError, NormalInverseGaussian,
@@ -207,6 +211,7 @@ mod geometric;
mod gumbel;
mod hypergeometric;
mod inverse_gaussian;
mod multinomial;
mod normal;
mod normal_inverse_gaussian;
mod pareto;
252 changes: 252 additions & 0 deletions rand_distr/src/multinomial.rs
@@ -0,0 +1,252 @@
// Copyright 2018 Developers of the Rand project.
// Copyright 2013 The Rust Project Developers.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// https://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or https://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

//! The multinomial distribution.

use crate::{Binomial, Distribution};
use num_traits::AsPrimitive;
use rand::Rng;

/// Error type returned from `Multinomial::new`.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub enum Error {
/// There is a negative weight or NaN
ProbabilityNegative,
/// Sum overflows to inf
SumOverflow,
/// Sum is zero
SumZero,
}

impl core::fmt::Display for Error {
fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result {
f.write_str(match self {
Error::ProbabilityNegative => "One of the weights is negative or NaN",
Error::SumOverflow => "Sum of weights overflows to inf",
Error::SumZero => "Sum of weights is zero",
})
}
}

/// The [Multinomial](https://en.wikipedia.org/wiki/Multinomial_distribution) distribution `Multinomial(n, w)`.
#[derive(Debug)]
pub struct Multinomial {}

impl Multinomial {
Member:
This struct is just a stub used to construct the const/dyn variants? Then I'd prefer you just move `Multinomial::new_const` → `MultinomialConst::new` etc.; it's more predictable and one less import for common usage.

Collaborator (Author):
My thought was that each distribution would have one common struct to construct all its variants.
Taking #494 into account, we might end up with a lot of distribution structs. Most people would never care about the exact type returned.
Having all the structs on one common doc page would probably serve the same purpose.
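
For illustration, a minimal sketch of the shape the reviewer suggests, reusing the PR's `MultinomialConst` and `Error` types with the same validation (hypothetical, not part of this diff):

```rust
// Hypothetical: the constructor lives on the concrete variant, so callers
// write `MultinomialConst::new(...)` instead of `Multinomial::new_const(...)`.
impl<'a, const K: usize, I> MultinomialConst<'a, K, I> {
    pub fn new(n: I, weights: &'a [f64; K]) -> Result<Self, Error> {
        // Same checks as the PR's `Multinomial::new_const`; `!(w >= 0.0)`
        // also rejects NaN.
        if weights.iter().any(|&w| !(w >= 0.0)) {
            return Err(Error::ProbabilityNegative);
        }
        let sum: f64 = weights.iter().sum();
        if !sum.is_finite() {
            return Err(Error::SumOverflow);
        }
        if sum == 0.0 {
            return Err(Error::SumZero);
        }
        Ok(Self { n, weights, sum })
    }
}
```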

/// Constructs a new `Multinomial` distribution which samples counts for `K` categories.
///
/// `n` is the number of draws.
///
/// `weights` have to be non-negative and will be normalized to sum to 1.
///
/// `K` has to be known at compile time
pub fn new_const<const K: usize, I>(
n: I,
weights: &[f64; K],
) -> Result<MultinomialConst<K, I>, Error>
where
I: num_traits::PrimInt,
u64: num_traits::AsPrimitive<I>,
I: num_traits::AsPrimitive<u64>,
{
let all_pos = weights.iter().all(|&x| x >= 0.0);

if !all_pos {
return Err(Error::ProbabilityNegative);
}

let sum: f64 = weights.iter().sum();

if !sum.is_finite() {
return Err(Error::SumOverflow);
}

if sum == 0.0 {
return Err(Error::SumZero);
}

Ok(MultinomialConst::<K, I> { n, weights, sum })
}

#[cfg(feature = "alloc")]
/// Constructs a new `Multinomial` distribution which samples counts for `K` categories.
///
/// `n` is the number of draws.
///
/// `weights` have to be non-negative and will be normalized to sum to 1.
///
/// `K` can be specified at runtime
pub fn new_dyn<I>(n: I, weights: &[f64]) -> Result<MultinomialDyn<'_, I>, Error> {
let all_pos = weights.iter().all(|&x| x >= 0.0);

if !all_pos {
return Err(Error::ProbabilityNegative);
}

let sum: f64 = weights.iter().sum();

if !sum.is_finite() {
return Err(Error::SumOverflow);
}

if sum == 0.0 {
return Err(Error::SumZero);
}

Ok(MultinomialDyn::<I> { n, weights, sum })
}
}
/// Multinomial distribution with the number of categories known at compile time.
/// Can be created with [Multinomial::new_const].
#[derive(Debug, Clone, PartialEq)]
pub struct MultinomialConst<'a, const K: usize, I> {
/// number of draws
n: I,
/// weights for the multinomial distribution
weights: &'a [f64; K],
/// sum of the weights
sum: f64,
}
Member:
One of the major advantages of "distribution objects" is that they can be stored in structs; using a lifetime parameter here makes that difficult (except in the `'static` case), since Rust doesn't properly support self-referential structs (yes, I know about Ouroboros).

In other words, I think we should usually prefer copying parameters into the struct implementing the distribution (no lifetime parameter), especially in this case where we don't need alloc. It does depend on expected usage, but in this case `K` should normally be reasonably small.

Collaborator (Author):
I see the problem. In my use case I would only sample from each distribution object once, so copying might matter, but probably not enough to justify the additional lifetime.
This also allows us to take an iterator and make it more general.
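
As a sketch of what the reviewer describes, a lifetime-free const variant that copies the `K` weights into the struct; the name `MultinomialConstOwned` is hypothetical and the validation mirrors the PR's constructor:

```rust
// Hypothetical lifetime-free variant: owning `[f64; K]` lets the distribution
// be stored in other structs without borrowing the weights.
#[derive(Debug, Clone, PartialEq)]
pub struct MultinomialConstOwned<const K: usize, I> {
    n: I,
    weights: [f64; K], // owned copy instead of `&'a [f64; K]`
    sum: f64,
}

impl<const K: usize, I> MultinomialConstOwned<K, I> {
    pub fn new(n: I, weights: [f64; K]) -> Result<Self, Error> {
        // Same validation as the PR's `Multinomial::new_const`.
        if weights.iter().any(|&w| !(w >= 0.0)) {
            return Err(Error::ProbabilityNegative);
        }
        let sum: f64 = weights.iter().sum();
        if !sum.is_finite() {
            return Err(Error::SumOverflow);
        }
        if sum == 0.0 {
            return Err(Error::SumZero);
        }
        Ok(Self { n, weights, sum })
    }
}
```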


#[cfg(feature = "alloc")]
/// Multinomial distribution with the number of categories known at runtime.
/// Can be created with [Multinomial::new_dyn].
#[derive(Debug, Clone, PartialEq)]
pub struct MultinomialDyn<'a, I> {
/// number of draws
n: I,
/// weights for the multinomial distribution
weights: &'a [f64],
/// sum of the weights
sum: f64,
}
Member:
I suggest we just use `weights: Vec<f64>` or `Box<[f64]>`, though we could use tinyvec.
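
A hypothetical owned-storage shape for the dynamic variant along the lines suggested (field type only; `MultinomialDynOwned` is not part of the diff):

```rust
// Hypothetical: weights held in a `Box<[f64]>` copied at construction time,
// removing the lifetime parameter from the dynamic variant.
#[cfg(feature = "alloc")]
#[derive(Debug, Clone, PartialEq)]
pub struct MultinomialDynOwned<I> {
    /// number of draws
    n: I,
    /// owned weights, e.g. built via `weights.to_vec().into_boxed_slice()`
    weights: alloc::boxed::Box<[f64]>,
    /// sum of the weights
    sum: f64,
}
```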


/// `sum` has to be the sum of the weights; passing it in precomputed is a performance optimization
fn sample<R: Rng + ?Sized, I>(rng: &mut R, n: I, weights: &[f64], sum: f64, result: &mut [I])
where
I: num_traits::PrimInt,
u64: num_traits::AsPrimitive<I>,
I: num_traits::AsPrimitive<u64>,
{
// This follows the binomial approach in "The computer generation of multinomial random variates" by Charles S. Davis
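// Each category k is drawn conditionally: given that `sum_n` draws have
// already been assigned to categories 0..k with total weight `sum_p`, the
// count for category k is Binomial(n - sum_n, weights[k] / (sum - sum_p)).
// For the last category this probability reaches 1, so the counts always
// sum to exactly `n`.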

let mut sum_p = 0.0;
let mut sum_n: I = 0.as_();

for k in 0..weights.len() {
if sum - sum_p <= 0.0 {
result[k] = 0.as_();
continue;
}

let prob = (weights[k] / (sum - sum_p)).min(1.0);
let binomial = Binomial::new((n - sum_n).as_(), prob)
.expect("We know that prob is between 0.0 and 1.0");
result[k] = binomial.sample(rng).as_();
sum_n = sum_n + result[k];
sum_p += weights[k];
}
}

impl<'a, const K: usize, I> Distribution<[I; K]> for MultinomialConst<'a, K, I>
where
I: num_traits::PrimInt,
u64: num_traits::AsPrimitive<I>,
I: num_traits::AsPrimitive<u64>,
{
fn sample<R: Rng + ?Sized>(&self, rng: &mut R) -> [I; K] {
let mut result = [0.as_(); K];
sample(rng, self.n, self.weights, self.sum, &mut result);
result
}
}

#[cfg(feature = "alloc")]
impl<'a, I> Distribution<alloc::vec::Vec<I>> for MultinomialDyn<'a, I>
where
I: num_traits::PrimInt,
u64: num_traits::AsPrimitive<I>,
I: num_traits::AsPrimitive<u64>,
{
fn sample<R: Rng + ?Sized>(&self, rng: &mut R) -> alloc::vec::Vec<I> {
let mut result = alloc::vec![0.as_(); self.weights.len()];
sample(rng, self.n, self.weights, self.sum, &mut result);
result
}
}

#[cfg(test)]
mod test {

#[test]
fn test_multinomial_const() {
use super::*;

let n: i32 = 1000;
let weights = [0.1, 0.2, 0.3, 0.4];
let mut rng = crate::test::rng(123);
let multinomial = Multinomial::new_const(n, &weights).unwrap();
let sample = multinomial.sample(&mut rng);
assert_eq!(sample.iter().sum::<i32>(), n);
}

#[test]
fn test_almost_zero_dist() {
use super::*;

let n: i32 = 1000;
let weights = [0.0, 0.0, 0.0, 0.000000001];
let multinomial = Multinomial::new_const(n, &weights).unwrap();
let sample = multinomial.sample(&mut crate::test::rng(123));
assert!(sample[3] == n);
}

#[test]
fn test_zero_dist() {
use super::*;

let n: i32 = 1000;
let weights = [0.0, 0.0, 0.0, 0.0];
let multinomial = Multinomial::new_const(n, &weights);
assert_eq!(multinomial, Err(Error::SumZero));
}

#[test]
fn test_negative_dist() {
use super::*;

let n: i32 = 1000;
let weights = [0.1, 0.2, 0.3, -0.6];
let multinomial = Multinomial::new_const(n, &weights);
assert_eq!(multinomial, Err(Error::ProbabilityNegative));
}

#[test]
fn test_overflow() {
use super::*;

let n: i32 = 1000;
let weights = [f64::MAX, f64::MAX, f64::MAX, f64::MAX];
let multinomial = Multinomial::new_const(n, &weights);
assert_eq!(multinomial, Err(Error::SumOverflow));
}

#[cfg(feature = "alloc")]
#[test]
fn test_multinomial_dyn() {
use super::*;

let n = 1000;
let weights = [0.1, 0.2, 0.3, 0.4];
let mut rng = crate::test::rng(123);
let multinomial = Multinomial::new_dyn(n, &weights).unwrap();
let sample = multinomial.sample(&mut rng);
assert_eq!(sample.iter().sum::<u64>(), n);
}
}