Cheat Sheet
Keras API cheat sheet
Check Library Version
```python
import tensorflow as tf
print(tf.__version__)
from tensorflow import keras
from tensorflow.keras import layers
print(keras.__version__)
import numpy as np
print(np.__version__)
import cv2
print(cv2.__version__)
```
Check GPU
```python
physical_devices = tf.config.list_physical_devices('GPU')
print("Num GPUs:", len(physical_devices))
device_name = tf.test.gpu_device_name()
print('GPU at: {}'.format(device_name))
```
Prepare Datasets
Option 1) Use datasets provided by TF/Keras
The tf.keras.datasets module provides a few toy datasets (already vectorized, in NumPy format) that can be used for debugging a model or creating simple code examples. If you are looking for larger and more useful ready-to-use datasets, take a look at TensorFlow Datasets.
TensorFlow Datasets use a different format and API than the Keras datasets.
The Keras dataset load functions return tuples of NumPy arrays: (x_train, y_train), (x_test, y_test).
MNIST digits classification dataset
CIFAR10 small images classification dataset
CIFAR100 small images classification dataset
Fashion MNIST dataset, an alternative to MNIST, etc.
The load functions download and cache the dataset on the local drive (~/.keras/datasets).
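For example, a minimal load of the MNIST digits dataset (the shapes in the comments are the documented ones):

```python
# Downloads to ~/.keras/datasets on the first call, then loads from the cache.
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
print(x_train.shape, y_train.shape)  # (60000, 28, 28) (60000,)
print(x_test.shape, y_test.shape)    # (10000, 28, 28) (10000,)
```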
Option 2) Use or create your own dataset in local storage
Example: MS Cats vs Dogs images dataset
Assume the raw data has been downloaded and the PetImages folder, with its two subfolders Cat and Dog, is saved locally. Example: ~/.keras/datasets/PetImages/
Filter out corrupted images
When working with lots of real-world image data, corrupted images are a common occurrence. Let's filter out badly encoded images that do not feature the string "JFIF" in their header.
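A minimal sketch, assuming the PetImages/Cat and PetImages/Dog layout described above:

```python
import os

num_skipped = 0
for folder_name in ("Cat", "Dog"):
    folder_path = os.path.join("PetImages", folder_name)
    for fname in os.listdir(folder_path):
        fpath = os.path.join(folder_path, fname)
        try:
            with open(fpath, "rb") as fobj:
                # JPEG files written by common tools carry a JFIF marker in the header.
                is_jfif = b"JFIF" in fobj.peek(10)
        except OSError:
            is_jfif = False
        if not is_jfif:
            num_skipped += 1
            os.remove(fpath)  # delete the badly encoded image
print("Deleted %d images" % num_skipped)
```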
Load and Plot Images
Using OpenCV (color mode is B-G-R)
Using Matplotlib
Load and plot using PIL
Convert PIL to Numpy, OpenCV to Numpy
Subplot with matplotlib
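A combined sketch of the loading and plotting options listed above; the sample path is a placeholder:

```python
import matplotlib.pyplot as plt
from PIL import Image

path = "PetImages/Cat/1.jpg"  # placeholder path

# OpenCV loads images as NumPy arrays in B-G-R channel order.
img_bgr = cv2.imread(path)
img_rgb = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2RGB)  # convert for matplotlib

# PIL loads in R-G-B; np.asarray converts the PIL image to a NumPy array.
img_pil = Image.open(path)
img_pil_np = np.asarray(img_pil)

# Subplot with matplotlib: show both versions side by side.
fig, axes = plt.subplots(1, 2, figsize=(8, 4))
axes[0].imshow(img_rgb)
axes[0].set_title("OpenCV (converted to RGB)")
axes[1].imshow(img_pil_np)
axes[1].set_title("PIL")
for ax in axes:
    ax.axis("off")
plt.show()
```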
Split into train and validation datasets
Option 1) Classes divided by folder name: image_dataset_from_directory
image_dataset_from_directory (no separate Train/Valid/Test folders)
Generates a 'tf.data.Dataset' from image files in a directory.
If your directory structure is:
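For example (directory and file names here are placeholders):

```
main_directory/
    class_a/
        a_image_1.jpg
        a_image_2.jpg
    class_b/
        b_image_1.jpg
        b_image_2.jpg
```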
then calling image_dataset_from_directory on it returns a 'tf.data.Dataset' that yields batches of images from class_a (label 0) and class_b (label 1).
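A minimal sketch using the PetImages folder, with validation_split carving out a validation set; the image size, batch size, and seed are arbitrary choices, and in older TF versions the function lives under keras.preprocessing instead of keras.utils:

```python
image_size = (180, 180)
batch_size = 32

train_ds = keras.utils.image_dataset_from_directory(
    "PetImages",
    validation_split=0.2,
    subset="training",
    seed=1337,
    image_size=image_size,
    batch_size=batch_size,
)
val_ds = keras.utils.image_dataset_from_directory(
    "PetImages",
    validation_split=0.2,
    subset="validation",
    seed=1337,
    image_size=image_size,
    batch_size=batch_size,
)
```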
Option 2) Train/Valid/Test sets are divided into folders manually: flow_from_directory
flow_from_directory
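A minimal sketch, assuming train and valid folders that each contain one subfolder per class; the paths and sizes are placeholders, and flow_from_directory belongs to the older ImageDataGenerator API:

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

datagen = ImageDataGenerator(rescale=1.0 / 255)

train_gen = datagen.flow_from_directory(
    "dataset/train",          # placeholder path
    target_size=(180, 180),
    batch_size=32,
    class_mode="categorical",
)
valid_gen = datagen.flow_from_directory(
    "dataset/valid",          # placeholder path
    target_size=(180, 180),
    batch_size=32,
    class_mode="categorical",
)
```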
Visualize the dataset
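A sketch showing the first nine images of the train_ds built above:

```python
import matplotlib.pyplot as plt

plt.figure(figsize=(10, 10))
for images, labels in train_ds.take(1):     # one batch
    for i in range(9):
        ax = plt.subplot(3, 3, i + 1)
        plt.imshow(images[i].numpy().astype("uint8"))
        plt.title(int(labels[i]))
        plt.axis("off")
plt.show()
```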
Preprocessing the Dataset
Buffer Prefetch
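A sketch: prefetch batches on a background thread so data loading overlaps with training.

```python
AUTOTUNE = tf.data.AUTOTUNE
train_ds = train_ds.prefetch(buffer_size=AUTOTUNE)
val_ds = val_ds.prefetch(buffer_size=AUTOTUNE)
```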
Rescaling, Cropping - can be included in the model
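A sketch using Keras preprocessing layers, which run inside the model itself; the crop size is arbitrary, and in older TF versions these layers live under layers.experimental.preprocessing:

```python
preprocessing = keras.Sequential([
    layers.Rescaling(1.0 / 255),   # map pixel values from [0, 255] to [0, 1]
    layers.CenterCrop(150, 150),   # crop to a fixed spatial size
])
```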
Build Model
Example 1: A simple CNN with a few layers
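A minimal sketch of such a CNN for 180x180 RGB images with two classes; all layer sizes are arbitrary choices:

```python
num_classes = 2

model = keras.Sequential([
    keras.Input(shape=(180, 180, 3)),
    layers.Rescaling(1.0 / 255),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(num_classes, activation="softmax"),
])
```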
Example 2: Small version of Xception
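A compact functional-API sketch in the spirit of a scaled-down Xception (separable convolutions plus residual shortcuts); the block sizes here are assumptions:

```python
def make_model(input_shape, num_classes):
    inputs = keras.Input(shape=input_shape)
    x = layers.Rescaling(1.0 / 255)(inputs)
    x = layers.Conv2D(32, 3, strides=2, padding="same")(x)
    x = layers.BatchNormalization()(x)
    x = layers.Activation("relu")(x)

    previous_block_activation = x  # set aside the residual shortcut

    for size in [64, 128, 256]:
        x = layers.Activation("relu")(x)
        x = layers.SeparableConv2D(size, 3, padding="same")(x)
        x = layers.BatchNormalization()(x)

        x = layers.Activation("relu")(x)
        x = layers.SeparableConv2D(size, 3, padding="same")(x)
        x = layers.BatchNormalization()(x)

        x = layers.MaxPooling2D(3, strides=2, padding="same")(x)

        # Project the shortcut to the new shape and add it back (residual connection).
        residual = layers.Conv2D(size, 1, strides=2, padding="same")(previous_block_activation)
        x = layers.add([x, residual])
        previous_block_activation = x

    x = layers.GlobalAveragePooling2D()(x)
    x = layers.Dropout(0.5)(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    return keras.Model(inputs, outputs)

model = make_model(input_shape=(180, 180, 3), num_classes=2)
```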
For other architectures, go to the Tutorial.
Visualize model
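Two common options: a text summary, or a diagram via plot_model (which requires pydot and graphviz to be installed):

```python
model.summary()
keras.utils.plot_model(model, to_file="model.png", show_shapes=True)
```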
Train the model
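A minimal compile-and-fit sketch; the optimizer, loss, and epoch count are placeholder choices, and the loss assumes integer labels as produced by image_dataset_from_directory with a softmax output:

```python
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
history = model.fit(train_ds, validation_data=val_ds, epochs=10)
```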
Save and load model in Keras
Option 1) Model and weights in one file (gives an error...)
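A sketch of whole-model saving; depending on the TF/Keras version the single-file format is .h5 or .keras, which may be the source of the error noted above:

```python
model.save("my_model.h5")                        # architecture + weights + optimizer state
loaded = keras.models.load_model("my_model.h5")  # restore the whole model
```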
Option 2) Model (JSON) and weights separately
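A sketch of saving the architecture as JSON and the weights in a separate file:

```python
# Save architecture and weights separately.
with open("model.json", "w") as f:
    f.write(model.to_json())
model.save_weights("weights.h5")

# Load them back.
with open("model.json") as f:
    loaded = keras.models.model_from_json(f.read())
loaded.load_weights("weights.h5")
```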
Run inference
Test on some data
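A sketch of single-image inference; the path is a placeholder, the size must match the model input, and in older TF versions load_img/img_to_array live under keras.preprocessing.image:

```python
img = keras.utils.load_img("PetImages/Cat/6779.jpg", target_size=(180, 180))  # placeholder path
img_array = keras.utils.img_to_array(img)
img_array = tf.expand_dims(img_array, 0)  # add a batch dimension

predictions = model.predict(img_array)
print(predictions[0])                       # class probabilities
print("Predicted class:", int(np.argmax(predictions[0])))
```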
Test on the entire validation dataset
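A sketch using evaluate on the full validation dataset built earlier:

```python
loss, acc = model.evaluate(val_ds)
print("Validation loss: %.4f, accuracy: %.4f" % (loss, acc))
```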