From a386f27e9adaa632caaff3bebde99ce54f46f167 Mon Sep 17 00:00:00 2001
From: Patrick Kidger <33688385+patrick-kidger@users.noreply.github.com>
Date: Sat, 20 Apr 2024 10:56:47 +0200
Subject: [PATCH] Updated ecosystem list

---
 README.md     | 43 ++++++++++++++++++-------------------------
 docs/index.md | 31 ++++++++++++++++++-------------
 2 files changed, 36 insertions(+), 38 deletions(-)

diff --git a/README.md b/README.md
index 05b0e1ae..0a2fbaa8 100644
--- a/README.md
+++ b/README.md
@@ -81,28 +81,21 @@ If you found this library to be useful in academic work, then please cite: ([arX
 
 ## See also: other libraries in the JAX ecosystem
 
-[jaxtyping](https://github.com/google/jaxtyping): type annotations for shape/dtype of arrays.
-
-[Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
-
-[Diffrax](https://github.com/patrick-kidger/diffrax): numerical differential equation solvers.
-
-[Optimistix](https://github.com/patrick-kidger/optimistix): root finding, minimisation, fixed points, and least squares.
-
-[Lineax](https://github.com/google/lineax): linear solvers.
-
-[BlackJAX](https://github.com/blackjax-devs/blackjax): probabilistic+Bayesian sampling.
-
-[Orbax](https://github.com/google/orbax): checkpointing (async/multi-host/multi-device).
-
-[sympy2jax](https://github.com/google/sympy2jax): SymPy<->JAX conversion; train symbolic expressions via gradient descent.
-
-[Eqxvision](https://github.com/paganpasta/eqxvision): computer vision models.
-
-[Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
-
-[PySR](https://github.com/milesCranmer/PySR): symbolic regression. (Non-JAX honourable mention!)
-
-## Disclaimer
-
-Equinox is maintained by Patrick Kidger at Google X, but this is not an official Google product.
+**Always useful**
+[jaxtyping](https://github.com/patrick-kidger/jaxtyping): type annotations for shape/dtype of arrays.
+
+**Deep learning**
+[Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
+[Orbax](https://github.com/google/orbax): checkpointing (async/multi-host/multi-device).
+[Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
+
+**Scientific computing**
+[Diffrax](https://github.com/patrick-kidger/diffrax): numerical differential equation solvers.
+[Optimistix](https://github.com/patrick-kidger/optimistix): root finding, minimisation, fixed points, and least squares.
+[Lineax](https://github.com/patrick-kidger/lineax): linear solvers.
+[BlackJAX](https://github.com/blackjax-devs/blackjax): probabilistic+Bayesian sampling.
+[sympy2jax](https://github.com/patrick-kidger/sympy2jax): SymPy<->JAX conversion; train symbolic expressions via gradient descent.
+[PySR](https://github.com/milesCranmer/PySR): symbolic regression. (Non-JAX honourable mention!)
+
+**Awesome JAX**
+[Awesome JAX](https://github.com/n2cholas/awesome-jax): a longer list of other JAX projects.
diff --git a/docs/index.md b/docs/index.md
index 9a6ab1f8..c2c6fdb0 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -70,16 +70,21 @@ If this quick start has got you interested, then have a read of [All of Equinox]
 
 ## See also: other libraries in the JAX ecosystem
 
-[Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
-
-[Diffrax](https://github.com/patrick-kidger/diffrax): numerical differential equation solvers.
-
-[Lineax](https://github.com/google/lineax): linear solvers and linear least squares.
-
-[jaxtyping](https://github.com/google/jaxtyping): type annotations for shape/dtype of arrays.
-
-[Eqxvision](https://github.com/paganpasta/eqxvision): computer vision models.
-
-[sympy2jax](https://github.com/google/sympy2jax): SymPy<->JAX conversion; train symbolic expressions via gradient descent.
-
-[Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
+**Always useful**
+[jaxtyping](https://github.com/patrick-kidger/jaxtyping): type annotations for shape/dtype of arrays.
+
+**Deep learning**
+[Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
+[Orbax](https://github.com/google/orbax): checkpointing (async/multi-host/multi-device).
+[Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
+
+**Scientific computing**
+[Diffrax](https://github.com/patrick-kidger/diffrax): numerical differential equation solvers.
+[Optimistix](https://github.com/patrick-kidger/optimistix): root finding, minimisation, fixed points, and least squares.
+[Lineax](https://github.com/patrick-kidger/lineax): linear solvers.
+[BlackJAX](https://github.com/blackjax-devs/blackjax): probabilistic+Bayesian sampling.
+[sympy2jax](https://github.com/patrick-kidger/sympy2jax): SymPy<->JAX conversion; train symbolic expressions via gradient descent.
+[PySR](https://github.com/milesCranmer/PySR): symbolic regression. (Non-JAX honourable mention!)
+
+**Awesome JAX**
+[Awesome JAX](https://github.com/n2cholas/awesome-jax): a longer list of other JAX projects.