
Test Tube

Log, organize and optimize Deep Learning experiments

Docs

View the docs here


Test Tube is a Python library for tracking and optimizing Deep Learning experiments. It is framework agnostic and built on top of Python's argparse API for ease of use.

Test Tube stores logs as CSV files on your machine for easy analysis.

pip install test_tube

Use Test Tube if you need to:

  • Track multiple Experiments across models.
  • Optimize your hyperparameters using grid_search or random_search.
  • Visualize experiments without uploading anywhere; logs are stored as CSV files.
  • Automatically track ALL parameters for a particular training run.
  • Automatically snapshot your code for an experiment using git tags.
  • Save progress images inline with training metrics.

Compatible with:

  • Python 2, 3
  • TensorFlow
  • Keras
  • PyTorch
  • Caffe, Caffe2
  • Chainer
  • MXNet
  • Theano
  • scikit-learn
  • Any Python-based ML or DL library
  • Runs seamlessly on CPU and GPU.

Examples

Log experiments

from test_tube import Experiment

# create an experiment and record its hyperparameters as meta tags
exp = Experiment(name='dense_model', save_dir='../some/dir/')
exp.add_meta_tags({'learning_rate': 0.002, 'nb_layers': 2})

# log one metric row per training step
for step in range(1, 10):
    tng_err = 1.0 / step
    exp.add_metric_row({'tng_err': tng_err})

Visualize experiments

import pandas as pd
import matplotlib.pyplot as plt

# each experiment is saved to a metrics.csv file which can be imported anywhere
# images are saved to exp/version/images
df = pd.read_csv('../some/dir/test_tube_data/dense_model/version_0/metrics.csv')
df.tng_err.plot()
plt.show()
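
Because every run gets its own version folder with a metrics.csv inside, several runs can be compared on one plot. The snippet below is a minimal sketch that assumes the version_0, version_1, ... layout implied by the path above; adjust the glob pattern to your own save_dir.

import glob
import pandas as pd
import matplotlib.pyplot as plt

# plot tng_err for every logged version of the experiment
for path in sorted(glob.glob('../some/dir/test_tube_data/dense_model/version_*/metrics.csv')):
    df = pd.read_csv(path)
    plt.plot(df.tng_err, label=path.split('/')[-2])  # label by version folder

plt.legend()
plt.show()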

Optimize hyperparameters

from test_tube import HyperOptArgumentParser

# subclass of argparse
parser = HyperOptArgumentParser(strategy='random_search')
parser.add_argument('--learning_rate', default=0.002, type=float, help='the learning rate')

# let's enable optimizing over the number of layers in the network
parser.add_opt_argument_list('--nb_layers', default=2, type=int, tunnable=True, options=[2, 4, 8])

# and tune the number of units in each layer
parser.add_opt_argument_range('--neurons', default=50, type=int, tunnable=True, start=100, end=800, nb_samples=10)

# parse the args (because it's argparse underneath)
hparams = parser.parse_args()

# run 20 trials of random search over the hyperparams
for hparam_trial in hparams.trials(20):
    train_network(hparam_trial)
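
train_network is not defined in this README; the sketch below is one hypothetical implementation, assuming each trial behaves like an argparse namespace (HyperOptArgumentParser subclasses argparse) so the sampled values can be read as attributes and logged with the Experiment API shown earlier.

from test_tube import Experiment

# hypothetical training function: each hparam_trial carries one sampled
# combination of the arguments defined above
def train_network(hparams):
    exp = Experiment(name='dense_model', save_dir='../some/dir/')
    exp.add_meta_tags({'learning_rate': hparams.learning_rate,
                       'nb_layers': hparams.nb_layers,
                       'neurons': hparams.neurons})

    for step in range(1, 10):
        tng_err = 1.0 / step  # stand-in for a real training step
        exp.add_metric_row({'tng_err': tng_err})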

Convert your argparse params into searchable params by changing 1 line

import argparse   
from test_tube import HyperOptArgumentParser

# these lines are equivalent   
parser = argparse.ArgumentParser(description='Process some integers.')
parser = HyperOptArgumentParser(description='Process some integers.', strategy='grid_search')   

# do normal argparse stuff    
...
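
As a minimal sketch of what the "normal argparse stuff" might look like after the swap (the --batch_size and --dropout flags here are purely illustrative): existing add_argument calls keep working unchanged, and only the parameters you want to search go through the opt methods shown above.

# existing argparse flags keep working exactly as before
parser.add_argument('--batch_size', default=32, type=int)

# only parameters you want to search over need the opt API shown above
parser.add_opt_argument_list('--dropout', default=0.2, type=float, tunnable=True,
                             options=[0.1, 0.2, 0.5])

hparams = parser.parse_args()
# hparams can then be swept with hparams.trials(...) exactly as in the
# random-search example above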

Log images inline with metrics

import matplotlib.pyplot as plt

# the key name must have either jpg, png or jpeg in it
img = plt.imread('a.jpg')
exp.add_metric_row({'test_jpg': img, 'val_err': 0.2})

# saves the image to ../exp/version/media/test_0.jpg
# the csv cell holds the file path to that image
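
Since the csv cell stores a file path rather than raw pixels, a logged image can be pulled back from the metrics file later. A minimal sketch, assuming the column is named after the logged key (test_jpg) and using the metrics.csv location from the earlier example:

import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv('../some/dir/test_tube_data/dense_model/version_0/metrics.csv')

# assumed column name: the key used in add_metric_row
img_path = df.test_jpg.dropna().iloc[0]
plt.imshow(plt.imread(img_path))
plt.show()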

Demos

How to contribute

Feel free to fix bugs and make improvements!

  1. Check out the current bugs here.
  2. To work on a bug, head over to our project page and assign yourself the bug.
  3. We'll add contributor names periodically as people improve the library!

Contributors:

  1. William Falcon
