# TensorFlow Probability

TensorFlow Probability is a library for probabilistic reasoning and statistical analysis in TensorFlow. As part of the TensorFlow ecosystem, TensorFlow Probability provides integration of probabilistic methods with deep networks, gradient-based inference via automatic differentiation, and scalability to large datasets and models via hardware acceleration (e.g., GPUs) and distributed computation.

Our probabilistic machine learning tools are structured as follows.

**Layer 0: TensorFlow.** Numerical operations. In particular, the `LinearOperator`
class enables matrix-free implementations that can exploit special structure
(diagonal, low-rank, etc.) for efficient computation. It is built and maintained
by the TensorFlow Probability team and is now part of `tf.linalg` in core TF.
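
As a minimal sketch of the matrix-free idea (values are illustrative; eager execution is assumed), a `LinearOperatorDiag` stores only the diagonal yet supports matrix-vector products, solves, and determinants:

```
import tensorflow as tf

tf.enable_eager_execution()  # TF 1.x; eager is the default in TF 2.x.

# A 3x3 diagonal operator stored as just its diagonal: O(n) storage
# and O(n) matvec/solve instead of O(n^2).
operator = tf.linalg.LinearOperatorDiag([1., 2., 3.])

x = tf.constant([1., 1., 1.])
print(operator.matvec(x))              # [1. 2. 3.]
print(operator.solvevec(x))            # [1. 0.5 0.333...]
print(operator.log_abs_determinant())  # log(1 * 2 * 3)
```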

**Layer 1: Statistical Building Blocks**

- Distributions (`tf.contrib.distributions`, `tf.distributions`): A large
  collection of probability distributions and related statistics with batch
  and broadcasting semantics.
- Bijectors (`tf.contrib.distributions.bijectors`): Reversible and composable
  transformations of random variables. Bijectors provide a rich class of
  transformed distributions, from classical examples like the log-normal
  distribution to sophisticated deep learning models such as masked
  autoregressive flows.
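
For example (a minimal sketch using the `tfp.distributions` and `tfp.bijectors` namespaces, the TFP counterparts of the modules above; eager execution assumed), a single `Normal` object can hold a batch of distributions, and pushing a standard Normal through the `Exp` bijector yields a log-normal:

```
import tensorflow as tf
import tensorflow_probability as tfp

tf.enable_eager_execution()
tfd = tfp.distributions
tfb = tfp.bijectors

# Batch semantics: one Distribution object holding three Normals,
# one per (loc, scale) pair.
normals = tfd.Normal(loc=[0., 1., 2.], scale=[1., 1., 2.])
print(normals.batch_shape)             # (3,)
print(normals.sample(5).shape)         # (5, 3): 5 draws from each Normal
print(normals.log_prob([0., 0., 0.]))  # log-density of 0 under each Normal

# Bijectors build transformed distributions: a standard Normal pushed
# through Exp is a log-normal.
log_normal = tfd.TransformedDistribution(
    distribution=tfd.Normal(loc=0., scale=1.),
    bijector=tfb.Exp())
print(log_normal.sample(3))  # strictly positive samples
```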

**Layer 2: Model Building**

- Edward2 (`tfp.edward2`): A probabilistic programming language for specifying
  flexible probabilistic models as programs.
- Probabilistic Layers (`tfp.layers`): Neural network layers with uncertainty
  over the functions they represent, extending TensorFlow Layers.
- Trainable Distributions (`tfp.trainable_distributions`): Probability
  distributions parameterized by a single Tensor, making it easy to build
  neural nets that output probability distributions.
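
As a hedged sketch of the Edward2 style (the two-feature regression below is illustrative, not taken from the TFP examples), a model is an ordinary Python function that composes random variables:

```
import tensorflow as tf
from tensorflow_probability import edward2 as ed

tf.enable_eager_execution()

# Bayesian linear regression: Normal priors on the weights and
# intercept, and a Normal likelihood on the outputs.
def linear_regression(features):
  w = ed.Normal(loc=tf.zeros(2), scale=1., name="w")  # weight prior
  b = ed.Normal(loc=0., scale=1., name="b")           # intercept prior
  return ed.Normal(loc=tf.tensordot(features, w, axes=1) + b,
                   scale=0.1, name="y")               # likelihood

features = tf.random_normal([5, 2])
y = linear_regression(features)  # one draw from the prior predictive
print(y.shape)                   # (5,)
```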

**Layer 3: Probabilistic Inference**

- Markov chain Monte Carlo (`tfp.mcmc`): Algorithms for approximating integrals
  via sampling. Includes Hamiltonian Monte Carlo, random-walk
  Metropolis-Hastings, and the ability to build custom transition kernels.
- Variational Inference (`tfp.vi`): Algorithms for approximating integrals via
  optimization.
- Optimizers (`tfp.optimizer`): Stochastic optimization methods, extending
  TensorFlow Optimizers. Includes Stochastic Gradient Langevin Dynamics.
- Monte Carlo (`tfp.monte_carlo`): Tools for computing Monte Carlo
  expectations.
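
For instance, here is a minimal Hamiltonian Monte Carlo sketch targeting a standard Normal (the step size, leapfrog count, and chain lengths are illustrative, not tuned recommendations):

```
import tensorflow as tf
import tensorflow_probability as tfp

tf.enable_eager_execution()
tfd = tfp.distributions

# Unnormalized target density: a standard Normal.
def target_log_prob_fn(x):
  return tfd.Normal(loc=0., scale=1.).log_prob(x)

samples, _ = tfp.mcmc.sample_chain(
    num_results=1000,
    num_burnin_steps=500,
    current_state=0.,
    kernel=tfp.mcmc.HamiltonianMonteCarlo(
        target_log_prob_fn=target_log_prob_fn,
        step_size=0.5,
        num_leapfrog_steps=3))

print(tf.reduce_mean(samples))  # should be near 0.
```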

TensorFlow Probability is under active development. Interfaces may change at any time.

## Examples

See `tensorflow_probability/examples/` for end-to-end examples. It includes
tutorial notebooks such as:

- Linear Mixed Effects Models. A hierarchical linear model for sharing statistical strength across examples.
- Eight Schools. A hierarchical normal model for exchangeable treatment effects.
- Gaussian Copulas. Probability distributions for capturing dependence across random variables.
- Understanding TensorFlow Distributions Shapes. How to distinguish between samples, batches, and events for arbitrarily shaped probabilistic computations (see the sketch after this list).
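
A minimal illustration of that distinction (the numbers are arbitrary): a batch of two bivariate Normals has `batch_shape=[2]` and `event_shape=[2]`, and sampling prepends a sample shape:

```
import tensorflow_probability as tfp

tfd = tfp.distributions

# Two independent bivariate Normals held in one object.
mvn = tfd.MultivariateNormalDiag(
    loc=[[0., 0.], [1., 1.]],
    scale_diag=[[1., 1.], [2., 2.]])
print(mvn.batch_shape)      # (2,)  independent distributions
print(mvn.event_shape)      # (2,)  dimensionality of a single draw
print(mvn.sample(3).shape)  # (3, 2, 2) = [sample, batch, event]
```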

It also includes example scripts such as:

- Variational Autoencoders. Representation learning with a latent code and variational inference.
- Bayesian Neural Networks. Neural networks with uncertainty over their weights (see the sketch after this list).
- Bayesian Logistic Regression. Bayesian inference for binary classification.
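
As a rough sketch of the Bayesian-neural-network idea (layer sizes and inputs below are illustrative), `tfp.layers.DenseFlipout` places a distribution over each layer's weights and exposes the KL penalty through the model's `losses`:

```
import tensorflow as tf
import tensorflow_probability as tfp

tf.enable_eager_execution()

# Each DenseFlipout layer learns a distribution over its weights and
# contributes a KL(q(w) || p(w)) term to `model.losses`.
model = tf.keras.Sequential([
    tfp.layers.DenseFlipout(32, activation=tf.nn.relu),
    tfp.layers.DenseFlipout(1),
])

features = tf.random_normal([8, 4])
logits = model(features)  # one stochastic forward pass
kl = sum(model.losses)    # add this to the negative log-likelihood
```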

## Installation

To install the latest version, run the following:

```
pip install --user --upgrade tfp-nightly  # depends on tensorflow (CPU-only)
```

TensorFlow Probability depends on a current nightly release of TensorFlow
(`tf-nightly`); the `--upgrade` flag ensures you'll automatically get the
latest version.

We also provide a GPU-enabled package:

```
pip install --user --upgrade tfp-nightly-gpu  # depends on tensorflow-gpu (GPU enabled)
```

Currently, TensorFlow Probability does not contain any GPU-specific code. The
primary difference between these packages is that `tfp-nightly-gpu` depends on
a GPU-enabled version of TensorFlow.

To force a Python 3-specific install, replace `pip` with `pip3` in the above
commands. For additional installation help, guidance installing prerequisites,
and (optionally) setting up virtual environments, see the TensorFlow
installation guide.

You can also install from source. This requires the Bazel build system.

```
# sudo apt-get install bazel git python-pip # Ubuntu; others, see above links.
git clone https://github.com/tensorflow/probability.git
cd probability
bazel build --config=opt --copt=-O3 --copt=-march=native :pip_pkg
PKGDIR=$(mktemp -d)
./bazel-bin/pip_pkg $PKGDIR
pip install --user --upgrade $PKGDIR/*.whl
```
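
Either way, a quick smoke test confirms the package imports and runs (a minimal check; `tf.enable_eager_execution` applies to TF 1.x):

```
import tensorflow as tf
import tensorflow_probability as tfp

tf.enable_eager_execution()

# Sample from a standard Normal to verify the install.
print(tfp.distributions.Normal(loc=0., scale=1.).sample(3))
```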

## Community

As part of TensorFlow, we're committed to fostering an open and welcoming environment.

- Stack Overflow: Ask or answer technical questions.
- GitHub: Report bugs or make feature requests.
- TensorFlow Blog: Stay up to date on content from the TensorFlow team and best articles from the community.
- YouTube Channel: Follow TensorFlow shows.
- Mailing list: Stay tuned!

See the TensorFlow Community page for more details. Check out our latest publicity here:

- Coffee with a Googler: Probabilistic Machine Learning in TensorFlow
- Introducing TensorFlow Probability

## Contributing

We're eager to collaborate with you! See `CONTRIBUTING.md` for more details.
This project adheres to TensorFlow's code of conduct. By participating, you are
expected to uphold this code.

## References

*TensorFlow Distributions.* Joshua V. Dillon, Ian Langmore, Dustin Tran, Eugene Brevdo, Srinivas Vasudevan, Dave Moore, Brian Patton, Alex Alemi, Matt Hoffman, Rif A. Saurous. arXiv preprint arXiv:1711.10604, 2017.