



Neural Machine Translation with Keras (Theano and TensorFlow).

Library documentation:

Attentional recurrent neural network NMT model


Transformer NMT model


Features (in addition to the full Keras cosmos):


Assuming that you have pip installed, run:

git clone
cd nmt-keras
pip install -r requirements.txt

to obtain the packages required to run this library.


NMT-Keras requires the following libraries:



  1. Set a training configuration in the script. Each parameter is commented. See the documentation file for further info about each specific hyperparameter. You can also override parameters when calling the script, using the syntax Key=Value.

  2. Train!:
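The Key=Value overrides mentioned in step 1 amount to simple command-line parsing. A minimal sketch of the idea (illustrative only, not the toolkit's actual parser; values are read as Python literals when possible):

```python
import ast

def parse_overrides(args):
    """Parse command-line arguments of the form Key=Value into a dict.

    Values are interpreted as Python literals when possible (numbers,
    lists, booleans); otherwise they are kept as plain strings.
    """
    overrides = {}
    for arg in args:
        key, _, value = arg.partition("=")
        try:
            overrides[key] = ast.literal_eval(value)
        except (ValueError, SyntaxError):
            overrides[key] = value
    return overrides

# e.g. a call like `python <script> MAX_EPOCH=10 MODEL_TYPE=Transformer`
# would yield {'MAX_EPOCH': 10, 'MODEL_TYPE': 'Transformer'}
print(parse_overrides(["MAX_EPOCH=10", "MODEL_TYPE=Transformer"]))
```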



Once we have our model trained, we can translate new text using the script. Please refer to the ensembling_tutorial for more details about this script. In short, if we want to use the models from the first two epochs to translate the examples/EuTrans/test.en file, just run:

             --models trained_models/tutorial_model/epoch_1 \
                      trained_models/tutorial_model/epoch_2 \
             --dataset datasets/Dataset_tutorial_dataset.pkl \
             --text examples/EuTrans/test.en
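Conceptually, decoding with an ensemble averages the probability distributions that the individual models predict for each target token. A minimal numeric sketch of that averaging (illustrative only, not the toolkit's implementation):

```python
def ensemble_probs(distributions):
    """Average per-token probability distributions from several models.

    distributions: list of probability vectors, one per model, all over
    the same target vocabulary.
    """
    n_models = len(distributions)
    vocab_size = len(distributions[0])
    return [sum(d[i] for d in distributions) / n_models
            for i in range(vocab_size)]

# Two models disagreeing over a toy 3-word vocabulary:
model_a = [0.7, 0.2, 0.1]
model_b = [0.5, 0.4, 0.1]
print(ensemble_probs([model_a, model_b]))
```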


The script can be used to obtain the (-log)probabilities of a parallel corpus. Its syntax is the following:

python --help
usage: Use several translation models for scoring source--target pairs
       [-h] -ds DATASET [-src SOURCE] [-trg TARGET] [-s SPLITS [SPLITS ...]]
       [-d DEST] [-v] [-c CONFIG] --models MODELS [MODELS ...]
optional arguments:
    -h, --help            show this help message and exit
    -ds DATASET, --dataset DATASET
                            Dataset instance with data
    -src SOURCE, --source SOURCE
                            Text file with source sentences
    -trg TARGET, --target TARGET
                            Text file with target sentences
    -s SPLITS [SPLITS ...], --splits SPLITS [SPLITS ...]
                            Splits to sample. Should be already included in the
                            dataset object.
    -d DEST, --dest DEST  File to save scores in
    -v, --verbose         Be verbose
    -c CONFIG, --config CONFIG
                            Config pkl for loading the model configuration. If not
                            specified, hyperparameters are read from
    --models MODELS [MODELS ...]
                            path to the models
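The scores produced are sentence-level (-log)probabilities: under a left-to-right NMT model, a target sentence's probability is the product of its per-token probabilities, so its negative log-probability is the sum of per-token -log terms (lower means more likely). A small numeric sketch with hypothetical token probabilities (not the toolkit's code):

```python
import math

def sentence_neg_log_prob(token_probs):
    """Negative log-probability of a sentence, given the probability the
    model assigned to each generated token."""
    return -sum(math.log(p) for p in token_probs)

# Hypothetical per-token probabilities for a 3-token translation:
probs = [0.9, 0.5, 0.8]
print(sentence_neg_log_prob(probs))
```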

Advanced features

Other features such as online learning or interactive NMT protocols are implemented in the interactiveNMT branch.



If you use this toolkit in your research, please cite:

	@article{nmt-keras,
	title={{NMT-Keras}: a Very Flexible Toolkit with a Focus on Interactive {NMT} and Online Learning},
	author={Peris, Álvaro and Casacuberta, Francisco},
	}

NMT-Keras was used in a number of papers:


Much of this library has been developed together with Marc Bolaños (web page) for other sequence-to-sequence problems.

To see other projects following the same philosophy and style as NMT-Keras, take a look at:

TMA for egocentric captioning based on temporally-linked sequences.

VIBIKNet for visual question answering.

ABiViRNet for video description.

Sentence SelectioNN for sentence classification and selection.


There is a known issue with the Theano backend. When running NMT-Keras, it will show the following message:

raise theano.gof.InconsistencyError("Trying to reintroduce a removed node")
InconsistencyError: Trying to reintroduce a removed node

It is not a critical error: the model keeps working and it is safe to ignore it. However, if you want to silence the message, use the Theano flag optimizer_excluding=scanOp_pushout_output.
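Theano reads its flags from the THEANO_FLAGS environment variable, so the flag can also be set programmatically, provided it happens before Theano is first imported. A sketch that appends the flag without clobbering any flags already set:

```python
import os

# Must run before Theano is first imported.
flags = os.environ.get("THEANO_FLAGS", "")
extra = "optimizer_excluding=scanOp_pushout_output"
os.environ["THEANO_FLAGS"] = flags + "," + extra if flags else extra
print(os.environ["THEANO_FLAGS"])
```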


Álvaro Peris (web page): [email protected]
