# Keras examples directory

## Vision models examples

[mnist_mlp.py](mnist_mlp.py)
Trains a simple deep multi-layer perceptron on the MNIST dataset.

[mnist_cnn.py](mnist_cnn.py)
Trains a simple convnet on the MNIST dataset.

[cifar10_cnn.py](cifar10_cnn.py)
Trains a simple deep CNN on the CIFAR10 small images dataset.

[cifar10_cnn_capsule.py](cifar10_cnn_capsule.py)
Trains a simple CNN-Capsule Network on the CIFAR10 small images dataset.

[cifar10_resnet.py](cifar10_resnet.py)
Trains a ResNet on the CIFAR10 small images dataset.

[conv_lstm.py](conv_lstm.py)
Demonstrates the use of a convolutional LSTM network.

[image_ocr.py](image_ocr.py)
Trains a convolutional stack followed by a recurrent stack and a CTC logloss function to perform optical character recognition (OCR).

[mnist_acgan.py](mnist_acgan.py)
Implementation of AC-GAN (Auxiliary Classifier GAN) on the MNIST dataset.

[mnist_hierarchical_rnn.py](mnist_hierarchical_rnn.py)
Trains a Hierarchical RNN (HRNN) to classify MNIST digits.

[mnist_siamese.py](mnist_siamese.py)
Trains a Siamese multi-layer perceptron on pairs of digits from the MNIST dataset.

[mnist_swwae.py](mnist_swwae.py)
Trains a Stacked What-Where AutoEncoder built on residual blocks on the MNIST dataset.

[mnist_transfer_cnn.py](mnist_transfer_cnn.py)
Transfer learning toy example on the MNIST dataset.

[mnist_denoising_autoencoder.py](mnist_denoising_autoencoder.py)
Trains a denoising autoencoder on the MNIST dataset.

----

## Text & sequences examples

[addition_rnn.py](addition_rnn.py)
Implementation of sequence to sequence learning for performing addition of two numbers (as strings).

[babi_rnn.py](babi_rnn.py)
Trains a two-branch recurrent network on the bAbI dataset for reading comprehension.

[babi_memnn.py](babi_memnn.py)
Trains a memory network on the bAbI dataset for reading comprehension.

[imdb_bidirectional_lstm.py](imdb_bidirectional_lstm.py)
Trains a Bidirectional LSTM on the IMDB sentiment classification task.

[imdb_cnn.py](imdb_cnn.py)
Demonstrates the use of Convolution1D for text classification.

[imdb_cnn_lstm.py](imdb_cnn_lstm.py)
Trains a convolutional stack followed by a recurrent stack network on the IMDB sentiment classification task.

[imdb_fasttext.py](imdb_fasttext.py)
Trains a FastText model on the IMDB sentiment classification task.

[imdb_lstm.py](imdb_lstm.py)
Trains an LSTM model on the IMDB sentiment classification task.

[lstm_stateful.py](lstm_stateful.py)
Demonstrates how to use stateful RNNs to model long sequences efficiently.

[lstm_seq2seq.py](lstm_seq2seq.py)
Trains a basic character-level sequence-to-sequence model.

[lstm_seq2seq_restore.py](lstm_seq2seq_restore.py)
Restores a character-level sequence to sequence model from disk (saved by [lstm_seq2seq.py](lstm_seq2seq.py)) and uses it to generate predictions.

[pretrained_word_embeddings.py](pretrained_word_embeddings.py)
Loads pre-trained word embeddings (GloVe embeddings) into a frozen Keras Embedding layer, and uses it to train a text classification model on the 20 Newsgroup dataset.

[reuters_mlp.py](reuters_mlp.py)
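The data-preparation step a Siamese setup hinges on is building labeled pairs; a minimal NumPy sketch of that idea (the `make_pairs` helper and the toy data are illustrative, not the script's actual code):

```python
import numpy as np

def make_pairs(x, y, rng=np.random.default_rng(0)):
    """Build (pair, label) arrays: label 1 for same-class pairs, 0 otherwise."""
    class_idx = [np.where(y == c)[0] for c in np.unique(y)]
    pairs, labels = [], []
    for c, idx in enumerate(class_idx):
        for i in range(len(idx) - 1):
            # positive pair: two consecutive samples of the same class
            pairs.append([x[idx[i]], x[idx[i + 1]]])
            labels.append(1)
            # negative pair: one sample from this class, one from another
            other = (c + rng.integers(1, len(class_idx))) % len(class_idx)
            pairs.append([x[idx[i]], x[rng.choice(class_idx[other])]])
            labels.append(0)
    return np.array(pairs), np.array(labels)

# toy data: 6 two-dimensional samples spanning 3 classes
x = np.arange(12, dtype=float).reshape(6, 2)
y = np.array([0, 0, 1, 1, 2, 2])
pairs, labels = make_pairs(x, y)
```

The two sides of each pair are then fed through a shared network and trained with a contrastive loss.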
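Treating addition as a string task means encoding each query character by character; a hedged sketch of that encoding (the helper names are illustrative, not the script's exact API):

```python
import numpy as np

chars = sorted("0123456789+ ")           # vocabulary for the addition task
char_to_idx = {c: i for i, c in enumerate(chars)}

def encode(s, maxlen):
    """One-hot encode a space-padded query into a (maxlen, vocab) matrix."""
    s = s.ljust(maxlen)                  # pad queries to a fixed length
    out = np.zeros((maxlen, len(chars)), dtype=np.float32)
    for t, c in enumerate(s):
        out[t, char_to_idx[c]] = 1.0
    return out

def decode(mat):
    """Invert encode(): take the argmax character at each timestep."""
    return "".join(chars[i] for i in mat.argmax(axis=-1)).strip()

x = encode("12+345", maxlen=7)
```

An encoder RNN reads such matrices and a decoder RNN emits the answer one character at a time.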
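The subtlety with stateful RNNs is batch layout: sample `i` of each batch must continue sample `i` of the previous batch, so a long sequence has to be split into parallel streams. A NumPy sketch of that arrangement (the `stateful_batches` helper is illustrative, not the script's code):

```python
import numpy as np

def stateful_batches(seq, batch_size, tsteps):
    """Arrange one long sequence so that, across successive batches,
    row i of batch t+1 continues row i of batch t -- the layout a
    stateful RNN expects between reset_states() calls."""
    n = len(seq) // (batch_size * tsteps) * batch_size * tsteps
    # split into batch_size contiguous streams, then into windows of tsteps
    streams = np.asarray(seq[:n]).reshape(batch_size, -1, tsteps)
    # batches[t][i] is window t of stream i
    return streams.transpose(1, 0, 2)

batches = stateful_batches(np.arange(24), batch_size=2, tsteps=3)
```

With this layout, row 0 of batch 1 (`[3, 4, 5]`) picks up exactly where row 0 of batch 0 (`[0, 1, 2]`) left off.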
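The core of that script is mapping each tokenizer index to its GloVe vector; a minimal sketch of building that weight matrix (the toy `word_index` and `embeddings_index` stand in for a real Tokenizer vocabulary and a parsed `glove.6B.*.txt` file):

```python
import numpy as np

# toy stand-ins for the real tokenizer vocabulary and GloVe lookup table
word_index = {"the": 1, "cat": 2, "sat": 3}
embeddings_index = {"the": np.array([0.1, 0.2]), "cat": np.array([0.3, 0.4])}
embedding_dim = 2

# row i holds the vector for word index i; row 0 is reserved for padding,
# and words missing from GloVe stay all-zeros
embedding_matrix = np.zeros((len(word_index) + 1, embedding_dim))
for word, i in word_index.items():
    vec = embeddings_index.get(word)
    if vec is not None:
        embedding_matrix[i] = vec
```

This matrix is then passed as the initial weights of an `Embedding` layer with `trainable=False`, which is what "frozen" means here.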
Trains and evaluates a simple MLP on the Reuters newswire topic classification task.

----

## Generative models examples

[lstm_text_generation.py](lstm_text_generation.py)
Generates text from Nietzsche's writings.

[conv_filter_visualization.py](conv_filter_visualization.py)
Visualization of the filters of VGG16, via gradient ascent in input space.

[deep_dream.py](deep_dream.py)
Deep Dreams in Keras.

[neural_doodle.py](neural_doodle.py)
Neural doodle.

[neural_style_transfer.py](neural_style_transfer.py)
Neural style transfer.

[variational_autoencoder.py](variational_autoencoder.py)
Demonstrates how to build a variational autoencoder.

[variational_autoencoder_deconv.py](variational_autoencoder_deconv.py)
Demonstrates how to build a variational autoencoder with Keras using deconvolution layers.

----

## Examples demonstrating specific Keras functionality

[antirectifier.py](antirectifier.py)
Demonstrates how to write custom layers for Keras.

[mnist_sklearn_wrapper.py](mnist_sklearn_wrapper.py)
Demonstrates how to use the sklearn wrapper.

[mnist_irnn.py](mnist_irnn.py)
Reproduction of the IRNN experiment with pixel-by-pixel sequential MNIST in "A Simple Way to Initialize Recurrent Networks of Rectified Linear Units" by Le et al.

[mnist_net2net.py](mnist_net2net.py)
Reproduction of the Net2Net experiment with MNIST in "Net2Net: Accelerating Learning via Knowledge Transfer".

[reuters_mlp_relu_vs_selu.py](reuters_mlp_relu_vs_selu.py)
Compares self-normalizing MLPs with regular MLPs.

[mnist_tfrecord.py](mnist_tfrecord.py)
MNIST dataset with TFRecords, the standard TensorFlow data format.

[mnist_dataset_api.py](mnist_dataset_api.py)
MNIST dataset with TensorFlow's Dataset API.

[cifar10_cnn_tfaugment2d.py](cifar10_cnn_tfaugment2d.py)
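The custom layer in question computes an "antirectifier" activation; a NumPy sketch of just that transform (the layer plumbing around it is omitted):

```python
import numpy as np

def antirectifier(x, eps=1e-8):
    """Center each sample, L2-normalize it, then concatenate its positive
    and negative parts -- doubling the feature dimension instead of
    discarding the negative half as ReLU does."""
    x = x - x.mean(axis=1, keepdims=True)
    x = x / (np.linalg.norm(x, axis=1, keepdims=True) + eps)
    return np.concatenate([np.maximum(x, 0), np.maximum(-x, 0)], axis=1)

out = antirectifier(np.array([[1.0, -1.0], [2.0, 0.0]]))
```

Wrapping this in a layer mainly requires declaring that the output shape has twice as many features as the input.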
Trains a simple deep CNN on the CIFAR10 small images dataset using TensorFlow's internal augmentation APIs.

[tensorboard_embeddings_mnist.py](tensorboard_embeddings_mnist.py)
Trains a simple convnet on the MNIST dataset and embeds test data, which can later be visualized using TensorBoard's Embedding Projector.