Convolutional Neural Networks - Deep Learning basics with Python, TensorFlow and Keras p.3

Welcome to a tutorial where we'll be discussing Convolutional Neural Networks (convnets, or CNNs), using one to classify dogs and cats with the dataset we built in the previous tutorial.

The convolutional neural network gained popularity through its use with image data, and is currently the state of the art for detecting what an image is or what it contains.

The basic CNN structure is as follows: Convolution -> Pooling -> Convolution -> Pooling -> Fully Connected Layer -> Output

Convolution is the act of taking the original data and creating feature maps from it. Pooling is down-sampling, most often in the form of "max-pooling," where we select a region and take the maximum value in that region, which becomes the new value for the entire region. Fully connected layers are typical neural network layers, where every node is connected to every node in the adjacent layers; the convolutional layers, by contrast, are not fully connected.

Okay, so now let's depict what's happening. We'll start with an image of a cat:

[image: a photo of a cat]

Then "convert to pixels:"

[image: the cat photo as a grid of pixel values]

For the purposes of this tutorial, assume each square is a pixel. Next, for the convolution step, we're going to take a certain window, and find features within that window:

[image: a window over part of the pixel grid]

That window's features are now condensed into a single pixel-sized value in a new feature map. In reality, a convolutional layer produces many such feature maps, one per filter.

Next, we slide that window over and continue the process. There will be some overlap; you can decide how much by choosing the stride, you just don't want to be skipping any pixels, of course.

[image: the window slid over to the next position]

You continue this process until you've covered the entire image, at which point you have a feature map. The feature map is still just a grid of values, only a smaller, simplified one:

[image: the completed feature map]
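To make the sliding-window idea concrete, here's a minimal NumPy sketch of the operation, with a made-up filter and a plain Python loop (not how Keras actually implements it, and in a real layer the filter values are learned rather than hand-picked):

import numpy as np

def convolve2d(image, kernel, stride=1):
    """Slide `kernel` across `image`, taking a dot product at each position.
    Each result becomes one "pixel" of the output feature map."""
    kh, kw = kernel.shape
    out_h = (image.shape[0] - kh) // stride + 1
    out_w = (image.shape[1] - kw) // stride + 1
    feature_map = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            window = image[i*stride : i*stride+kh, j*stride : j*stride+kw]
            feature_map[i, j] = np.sum(window * kernel)
    return feature_map

image = np.arange(25, dtype=float).reshape(5, 5)  # stand-in 5x5 "image"
kernel = np.array([[1., 0.],
                   [0., -1.]])                    # made-up 2x2 filter
print(convolve2d(image, kernel))                  # 4x4 feature map

With stride 1 the windows overlap heavily; a larger stride means less overlap and a smaller feature map, but as noted above you don't want to skip pixels entirely.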

From here, we do pooling. Let's say our convolution gave us the following feature map (I forgot to put a number in the right-most square of the second row; assume it's a 3 or less):

[image: an example feature map of numeric values]

Now we'll take a 3x3 pooling window:

[image: a 3x3 pooling window over the feature map]

The most common form of pooling is "max pooling," where we simply take the maximum value in the window, and that becomes the new value for that region.

[image: the maximum value chosen from the window]

We continue this process until we've pooled the entire feature map and have something like:

[image: the final pooled feature map]
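Here's the matching NumPy sketch for max pooling, using the 3x3 window from the example above (Keras' MaxPooling2D does the same thing, defaulting to 2x2 windows, which is what we'll use in the code below):

import numpy as np

def max_pool2d(feature_map, size=3, stride=3):
    """Replace each (size x size) window with its maximum value."""
    out_h = (feature_map.shape[0] - size) // stride + 1
    out_w = (feature_map.shape[1] - size) // stride + 1
    pooled = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            window = feature_map[i*stride : i*stride+size, j*stride : j*stride+size]
            pooled[i, j] = window.max()
    return pooled

fmap = np.random.randint(0, 4, size=(6, 6)).astype(float)  # toy feature map
print(max_pool2d(fmap))  # 2x2 pooled result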

Each convolution and pooling step is a hidden layer. After these, we have a fully connected layer, followed by the output layer. The fully connected layer is your typical neural network (multilayer perceptron) type of layer, and the same goes for the output layer. Now let's build this network with Keras:

import pickle

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Activation, Flatten
from tensorflow.keras.layers import Conv2D, MaxPooling2D

# Load the feature set and labels we built in the previous tutorial.
pickle_in = open("X.pickle", "rb")
X = pickle.load(pickle_in)

pickle_in = open("y.pickle", "rb")
y = pickle.load(pickle_in)

# Newer versions of TensorFlow want the labels as a numpy array, not a list.
y = np.array(y)

# Scale the pixel data from 0-255 down to 0-1.
X = X / 255.0

model = Sequential()

# First convolution + pooling layer. The input shape comes from our data,
# skipping the leading "number of samples" dimension.
model.add(Conv2D(256, (3, 3), input_shape=X.shape[1:]))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

# Second convolution + pooling layer.
model.add(Conv2D(256, (3, 3)))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Flatten())  # this converts our 3D feature maps to 1D feature vectors

model.add(Dense(64))  # note: no activation here, so this layer is purely linear

# Single sigmoid output node: one class maps to 0, the other to 1.
model.add(Dense(1))
model.add(Activation('sigmoid'))

model.compile(loss='binary_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])

model.fit(X, y, batch_size=32, epochs=3, validation_split=0.3)
Train on 17441 samples, validate on 7475 samples
Epoch 1/3
17441/17441 [==============================] - 8s 467us/step - loss: 0.6639 - acc: 0.6005 - val_loss: 0.6386 - val_acc: 0.6384
Epoch 2/3
17441/17441 [==============================] - 7s 404us/step - loss: 0.6059 - acc: 0.6749 - val_loss: 0.5673 - val_acc: 0.7025
Epoch 3/3
17441/17441 [==============================] - 7s 404us/step - loss: 0.5385 - acc: 0.7339 - val_loss: 0.5626 - val_acc: 0.7171
<tensorflow.python.keras.callbacks.History at 0x195889c1898>
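That last line is the History object that model.fit() returns; if you assign it to a variable, you can read back the per-epoch numbers instead of squinting at the console. A small sketch (the key names 'acc'/'val_acc' match the logs above; newer Keras versions spell them 'accuracy'/'val_accuracy'):

history = model.fit(X, y, batch_size=32, epochs=3, validation_split=0.3)

print(history.history['val_acc'])   # validation accuracy per epoch
print(history.history['val_loss'])  # validation loss per epoch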

After just three epochs, we have 71% validation accuracy. If we keep going, we can probably do even better, but first we should discuss how we can tell how our model is doing as we tweak it. To help with this, we can use TensorBoard, which comes with TensorFlow and helps you visualize your models as they train.
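As a quick preview (we'll do this properly next time), TensorBoard plugs into training as an ordinary Keras callback; a minimal sketch, with an arbitrary log directory name:

from tensorflow.keras.callbacks import TensorBoard
import time

# Each training run gets its own uniquely named log directory.
tensorboard = TensorBoard(log_dir="logs/cats-vs-dogs-cnn-{}".format(int(time.time())))

model.fit(X, y, batch_size=32, epochs=3,
          validation_split=0.3, callbacks=[tensorboard])

You would then launch it from a terminal with tensorboard --logdir=logs/ and open the URL it prints in a browser.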

We'll talk about TensorBoard as well as various tweaks to our model in the next tutorial!

The next tutorial: Analyzing Models with TensorBoard - Deep Learning basics with Python, TensorFlow and Keras p.4

  • Introduction to Deep Learning - Deep Learning basics with Python, TensorFlow and Keras p.1
  • Loading in your own data - Deep Learning basics with Python, TensorFlow and Keras p.2
  • Convolutional Neural Networks - Deep Learning basics with Python, TensorFlow and Keras p.3
  • Analyzing Models with TensorBoard - Deep Learning basics with Python, TensorFlow and Keras p.4
  • Optimizing Models with TensorBoard - Deep Learning basics with Python, TensorFlow and Keras p.5
  • How to use your trained model - Deep Learning basics with Python, TensorFlow and Keras p.6
  • Recurrent Neural Networks - Deep Learning basics with Python, TensorFlow and Keras p.7
  • Creating a Cryptocurrency-predicting finance recurrent neural network - Deep Learning basics with Python, TensorFlow and Keras p.8
  • Normalizing and creating sequences for our cryptocurrency predicting RNN - Deep Learning basics with Python, TensorFlow and Keras p.9
  • Balancing Recurrent Neural Network sequence data for our crypto predicting RNN - Deep Learning basics with Python, TensorFlow and Keras p.10
  • Cryptocurrency-predicting RNN Model - Deep Learning basics with Python, TensorFlow and Keras p.11