Keras Examples

    addition_rnn: Implementation of sequence-to-sequence learning for performing addition of two numbers (as strings).
    babi_memnn: Trains a memory network on the bAbI dataset for reading comprehension.
    babi_rnn: Trains a two-branch recurrent network on the bAbI dataset for reading comprehension.
    cifar10_cnn: Trains a simple deep CNN on the CIFAR10 small images dataset.
    cifar10_densenet: Trains a DenseNet-40-12 on the CIFAR10 small images dataset.
    conv_lstm: Demonstrates the use of a convolutional LSTM network.
    deep_dream: Deep Dreams in Keras.
    eager_dcgan: Generating digits with generative adversarial networks and eager execution.
    eager_image_captioning: Generating image captions with Keras and eager execution.
    eager_pix2pix: Image-to-image translation with Pix2Pix, using eager execution.
    eager_styletransfer: Neural style transfer with eager execution.
    fine_tuning: Fine-tuning of an image classification model.
    imdb_bidirectional_lstm: Trains a bidirectional LSTM on the IMDB sentiment classification task.
    imdb_cnn: Demonstrates the use of Convolution1D for text classification.
    imdb_cnn_lstm: Trains a convolutional stack followed by a recurrent stack on the IMDB sentiment classification task.
    imdb_fasttext: Trains a FastText model on the IMDB sentiment classification task.
    imdb_lstm: Trains an LSTM on the IMDB sentiment classification task.
    lstm_text_generation: Generates text from Nietzsche’s writings.
    lstm_seq2seq: Demonstrates how to implement a basic character-level sequence-to-sequence model.
    mnist_acgan: Implementation of AC-GAN (Auxiliary Classifier GAN) on the MNIST dataset.
    mnist_antirectifier: Demonstrates how to write custom layers for Keras.
    mnist_cnn: Trains a simple convnet on the MNIST dataset.
    mnist_cnn_embeddings: Demonstrates how to visualize embeddings in TensorBoard.
    mnist_irnn: Reproduction of the IRNN experiment with pixel-by-pixel sequential MNIST from “A Simple Way to Initialize Recurrent Networks of Rectified Linear Units” by Le et al.
    mnist_mlp: Trains a simple deep multi-layer perceptron on the MNIST dataset.
    mnist_hierarchical_rnn: Trains a Hierarchical RNN (HRNN) to classify MNIST digits.
    mnist_tfrecord: MNIST dataset with TFRecords, the standard TensorFlow data format.
    mnist_transfer_cnn: Transfer learning toy example.
    neural_style_transfer: Neural style transfer (generating an image with the same “content” as a base image, but with the “style” of a different picture).
    nmt_attention: Neural machine translation with an attention mechanism.
    quora_siamese_lstm: Classifying duplicate questions from Quora using a Siamese recurrent architecture.
    reuters_mlp: Trains and evaluates a simple MLP on the Reuters newswire topic classification task.
    stateful_lstm: Demonstrates how to use stateful RNNs to model long sequences efficiently.
    text_explanation_lime: How to use LIME to explain text data.
    variational_autoencoder: Demonstrates how to build a variational autoencoder.
    variational_autoencoder_deconv: Demonstrates how to build a variational autoencoder with Keras using deconvolution layers.
    tfprob_vae: A variational autoencoder using TensorFlow Probability on Kuzushiji-MNIST.
    vq_vae: Discrete representation learning with VQ-VAE and TensorFlow Probability.
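To give a flavor of what these examples look like, here is a minimal sketch in the spirit of the mnist_cnn entry above, using TensorFlow's bundled Keras API. The architecture shown (two conv blocks, dropout, softmax head) is an illustrative assumption, not the exact model from the example script; the forward pass below uses a dummy batch so the snippet runs without downloading the dataset.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers


def build_mnist_convnet(num_classes: int = 10) -> keras.Model:
    """A small convnet for 28x28 grayscale digit images."""
    return keras.Sequential([
        layers.Input(shape=(28, 28, 1)),
        layers.Conv2D(32, kernel_size=3, activation="relu"),
        layers.MaxPooling2D(pool_size=2),
        layers.Conv2D(64, kernel_size=3, activation="relu"),
        layers.MaxPooling2D(pool_size=2),
        layers.Flatten(),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])


model = build_mnist_convnet()
model.compile(loss="sparse_categorical_crossentropy",
              optimizer="adam",
              metrics=["accuracy"])

# Forward pass on a dummy batch shaped like MNIST; a real run would load
# the data with keras.datasets.mnist.load_data() and call model.fit.
probs = model.predict(np.zeros((4, 28, 28, 1), dtype="float32"), verbose=0)
print(probs.shape)  # (4, 10)
```

Each output row is a softmax distribution over the ten digit classes; training on the actual dataset follows the same pattern in every mnist_* example listed above.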