Introduction to Artificial Neural Networks

CSI 4106 - Fall 2025

Marcel Turcotte

Version: Oct 7, 2025 17:57

Preamble

Message of the Day

Message of the Day (2024)

Learning objectives

  • Explain perceptrons and MLPs: structure, function, history, and limitations.
  • Describe activation functions: their role in enabling complex pattern learning.
  • Implement a feedforward neural network with Keras on Fashion-MNIST.
  • Interpret neural network training and results: visualization and evaluation metrics.
  • Become familiar with deep learning frameworks: PyTorch, TensorFlow, and Keras for model building and deployment.

Introduction

TensorFlow Playground

Neural Networks (NN)

We now shift our focus to a family of machine learning models that draw inspiration from the structure and function of biological neural networks found in animals.

Machine Learning Problems

  • Supervised Learning: Classification, Regression

  • Unsupervised Learning: Autoencoders, Self-Supervised

  • Reinforcement Learning: Now an Integral Component

A neuron

Interconnected neurons

Connectionist

Hierarchy of concepts

Basics

Computations with neurodes

where \(x_1, x_2 \in \{0,1\}\) and \(f(z)\) is an indicator function: \[ f(z)= \begin{cases}0, & z<\theta \\ 1, & z \geq \theta\end{cases} \]

Computations with neurodes

\[ y = f(x_1 + x_2)= \begin{cases}0, & x_1 + x_2 <\theta \\ 1, & x_1 + x_2 \geq \theta\end{cases} \]

  • With \(\theta = 2\), the neurode implements an AND logic gate.

  • With \(\theta = 1\), the neurode implements an OR logic gate.
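These two gates can be sketched in a few lines of Python (the function name `neurode` is illustrative):

```python
def neurode(x1, x2, theta):
    # McCulloch-Pitts neurode: outputs 1 iff x1 + x2 reaches the threshold theta
    return int(x1 + x2 >= theta)

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
AND = [neurode(x1, x2, theta=2) for x1, x2 in inputs]
OR = [neurode(x1, x2, theta=1) for x1, x2 in inputs]
print(AND)  # [0, 0, 0, 1]
print(OR)   # [0, 1, 1, 1]
```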

Computations with neurodes

  • Digital computations can be broken down into a sequence of logical operations, enabling neurode networks to execute any computation.

  • McCulloch and Pitts (1943) did not focus on learning parameter \(\theta\).

  • They introduced a machine that computes any function but cannot learn.

Perceptron

Perceptron

Threshold logic unit

Simple Step Functions

\[ \mathrm{heaviside}(t) = \begin{cases} 1, & t \geq 0 \\ 0, & t < 0 \end{cases} \]

\[ \mathrm{sign}(t) = \begin{cases} 1, & t > 0 \\ 0, & t = 0 \\ -1, & t < 0 \end{cases} \]
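A direct NumPy transcription of these definitions (`np.heaviside(t, 1)` also matches the convention above, and `np.sign` matches the three-valued version):

```python
import numpy as np

def heaviside(t):
    return np.where(t >= 0, 1, 0)  # 1 when t >= 0, else 0

def sign(t):
    return np.sign(t)  # -1, 0, or 1

t = np.array([-2.0, 0.0, 3.0])
print(heaviside(t))  # [0 1 1]
print(sign(t))       # [-1.  0.  1.]
```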

Notation

Notation

Perceptron

Perceptron

Notation

Notation

  • \(X\) is the input data matrix where each row corresponds to an example and each column represents one of the \(D\) features.

  • \(W\) is the weight matrix, structured with one row per input (feature) and one column per neuron.

  • Bias terms can be folded into \(W\) (via a constant input) or represented separately; both approaches appear in the literature. Here, \(b\) is a vector whose length equals the number of neurons.
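A small numerical sketch of this notation (sizes chosen arbitrarily): each row of \(XW + b\) holds the pre-activations of all neurons for one example.

```python
import numpy as np

rng = np.random.default_rng(42)

N, D, K = 4, 3, 2                 # examples, features, neurons (illustrative sizes)
X = rng.standard_normal((N, D))   # one row per example, one column per feature
W = rng.standard_normal((D, K))   # one row per feature, one column per neuron
b = rng.standard_normal(K)        # one bias per neuron

Z = X @ W + b                     # broadcasting adds b to each of the N rows
print(Z.shape)                    # (4, 2)
```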

Discussion

  • The algorithm to train the perceptron closely resembles stochastic gradient descent.

    • In the interest of time and to avoid confusion, we will skip this algorithm and focus on the multilayer perceptron (MLP) and its training algorithm, backpropagation.

Historical Note and Justification

Multilayer Perceptron

XOR Classification problem

\(x^{(1)}\) \(x^{(2)}\) \(y\) \(o_1\) \(o_2\) \(o_3\)
1 0 1 0 1 1
0 1 1 0 1 1
0 0 0 0 0 0
1 1 0 1 1 0
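The table can be reproduced with threshold units, assuming (consistent with the columns above) that \(o_1\) plays the role of an AND unit, \(o_2\) an OR unit, and \(o_3\) the output; this is one standard construction, sketched here:

```python
def step(z, theta):
    # Threshold logic unit: 1 iff z reaches theta
    return int(z >= theta)

def xor_net(x1, x2):
    o1 = step(x1 + x2, 2)   # AND unit
    o2 = step(x1 + x2, 1)   # OR unit
    o3 = step(o2 - o1, 1)   # fires when OR is on but AND is off
    return o1, o2, o3

for x1, x2 in [(1, 0), (0, 1), (0, 0), (1, 1)]:
    print(x1, x2, xor_net(x1, x2))
```

The output column \(o_3\) equals \(y = x^{(1)} \oplus x^{(2)}\), the XOR of the inputs.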

Feedforward Neural Network (FNN)

Forward Pass (Computation)

\(o_3 = \sigma(w_{13} x^{(1)} + w_{23} x^{(2)} + b_3)\)

\(o_4 = \sigma(w_{14} x^{(1)} + w_{24} x^{(2)} + b_4)\)

\(o_5 = \sigma(w_{15} x^{(1)} + w_{25} x^{(2)} + b_5)\)

\(o_6 = \sigma(w_{36} o_3 + w_{46} o_4 + w_{56} o_5 + b_6)\)

\(o_7 = \sigma(w_{37} o_3 + w_{47} o_4 + w_{57} o_5 + b_7)\)

Forward Pass (Computation)

import numpy as np

# Sigmoid function

def sigma(x):
    return 1 / (1 + np.exp(-x))

# Input (two attributes) vector, one example of our training set

x1, x2 = (0.5, 0.9)

# Initializing the weights of layers 2 and 3 to random values

w13, w14, w15, w23, w24, w25 = np.random.uniform(low=-1, high=1, size=6)
w36, w46, w56, w37, w47, w57 = np.random.uniform(low=-1, high=1, size=6)

# Initializing all 5 bias terms to random values

b3, b4, b5, b6, b7 = np.random.uniform(low=-1, high=1, size=5)

o3 = sigma(w13 * x1 + w23 * x2 + b3)
o4 = sigma(w14 * x1 + w24 * x2 + b4)
o5 = sigma(w15 * x1 + w25 * x2 + b5)
o6 = sigma(w36 * o3 + w46 * o4 + w56 * o5 + b6)
o7 = sigma(w37 * o3 + w47 * o4 + w57 * o5 + b7)

(o6, o7)
(np.float64(0.44554982523515224), np.float64(0.5066431441217616))

Forward Pass (Computation)

Forward Pass (Computation)

Activation Function

  • As will be discussed later, the training algorithm, known as backpropagation, employs gradient descent, necessitating the calculation of the partial derivatives of the loss function.

  • The step function in the multilayer perceptron had to be replaced, as it consists only of flat surfaces. Gradient descent cannot progress on flat surfaces due to their zero derivative.
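The sigmoid, by contrast, has a nonzero derivative everywhere; it satisfies \(\sigma'(t) = \sigma(t)(1 - \sigma(t))\), which can be checked numerically against a central difference:

```python
import numpy as np

def sigma(t):
    return 1 / (1 + np.exp(-t))

t = np.linspace(-4, 4, 81)
eps = 1e-6
numeric = (sigma(t + eps) - sigma(t - eps)) / (2 * eps)  # central difference
analytic = sigma(t) * (1 - sigma(t))
print(np.allclose(numeric, analytic, atol=1e-8))  # True
```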

Activation Function

  • Nonlinear activation functions are paramount because, without them, multiple layers in the network would only compute a linear function of the inputs.

  • According to the Universal Approximation Theorem, sufficiently large deep networks with nonlinear activation functions can approximate any continuous function.

Sigmoid

Code
import matplotlib.pyplot as plt

# Sigmoid function
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Generate x values
x = np.linspace(-10, 10, 400)

# Compute y values for the sigmoid function
y = sigmoid(x)

# Create a figure and remove axes and grid
fig, ax = plt.subplots()
ax.plot(x, y, color='black', linewidth=2)  # Keep the curve opaque

plt.grid(True)

# Set transparent background for the figure and axes
fig.patch.set_alpha(0)  # Transparent background for the figure

# Save or display the plot with transparent background
# plt.savefig('sigmoid_plot.png', transparent=True, bbox_inches='tight', pad_inches=0)
plt.show()

\[ \sigma(t) = \frac{1}{1 + e^{-t}} \]

Hyperbolic Tangent Function

Code
# Generate x values
x = np.linspace(-10, 10, 400)

# Compute y values for the sigmoid function
y = np.tanh(x)

# Create a figure and remove axes and grid
fig, ax = plt.subplots()
ax.plot(x, y, color='black', linewidth=2)  # Keep the curve opaque

plt.grid(True)

# Set transparent background for the figure and axes
fig.patch.set_alpha(0)  # Transparent background for the figure

# Save or display the plot with transparent background
# plt.savefig('sigmoid_plot.png', transparent=True, bbox_inches='tight', pad_inches=0)
plt.show()

\[ \tanh(t) = 2 \sigma(2t) - 1 \]
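This identity can be verified numerically; tanh is a rescaled and recentered sigmoid:

```python
import numpy as np

def sigma(t):
    return 1 / (1 + np.exp(-t))

t = np.linspace(-5, 5, 101)
print(np.allclose(np.tanh(t), 2 * sigma(2 * t) - 1))  # True
```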

Rectified linear unit function (ReLU)

Code
# Generate x values
x = np.linspace(-10, 10, 400)

# Compute y values for the sigmoid function
y = np.maximum(0, x)

# Create a figure and remove axes and grid
fig, ax = plt.subplots()
ax.plot(x, y, color='black', linewidth=2)  # Keep the curve opaque

plt.grid(True)

# Set transparent background for the figure and axes
fig.patch.set_alpha(0)  # Transparent background for the figure

# Save or display the plot with transparent background
# plt.savefig('sigmoid_plot.png', transparent=True, bbox_inches='tight', pad_inches=0)
plt.show()

\[ \mathrm{ReLU}(t) = \max(0, t) \]

Common Activation Functions

Code
from scipy.special import expit as sigmoid

def relu(z):
    return np.maximum(0, z)

def derivative(f, z, eps=0.000001):
    return (f(z + eps) - f(z - eps))/(2 * eps)

max_z = 4.5
z = np.linspace(-max_z, max_z, 200)

plt.figure(figsize=(11, 3.1))

plt.subplot(121)
plt.plot([-max_z, 0], [0, 0], "r-", linewidth=2, label="Heaviside")
plt.plot(z, relu(z), "m-.", linewidth=2, label="ReLU")
plt.plot([0, 0], [0, 1], "r-", linewidth=0.5)
plt.plot([0, max_z], [1, 1], "r-", linewidth=2)
plt.plot(z, sigmoid(z), "g--", linewidth=2, label="Sigmoid")
plt.plot(z, np.tanh(z), "b-", linewidth=1, label="Tanh")
plt.grid(True)
plt.title("Activation functions")
plt.axis([-max_z, max_z, -1.65, 2.4])
plt.gca().set_yticks([-1, 0, 1, 2])
plt.legend(loc="lower right", fontsize=13)

plt.subplot(122)
plt.plot(z, derivative(np.sign, z), "r-", linewidth=2, label="Heaviside")
plt.plot(0, 0, "ro", markersize=5)
plt.plot(0, 0, "rx", markersize=10)
plt.plot(z, derivative(sigmoid, z), "g--", linewidth=2, label="Sigmoid")
plt.plot(z, derivative(np.tanh, z), "b-", linewidth=1, label="Tanh")
plt.plot([-max_z, 0], [0, 0], "m-.", linewidth=2)
plt.plot([0, max_z], [1, 1], "m-.", linewidth=2)
plt.plot([0, 0], [0, 1], "m-.", linewidth=1.2)
plt.plot(0, 1, "mo", markersize=5)
plt.plot(0, 1, "mx", markersize=10)
plt.grid(True)
plt.title("Derivatives")
plt.axis([-max_z, max_z, -0.2, 1.2])

plt.show()

Universal Approximation

Definition

The universal approximation theorem (UAT) states that a feedforward neural network with a single hidden layer containing a finite number of neurons can approximate any continuous function on a compact subset of \(\mathbb{R}^n\), given appropriate weights and activation functions.

Single Hidden Layer

\[ y = \sum_{i=1}^N \alpha_i \sigma(w_{1,i} x + b_i) \]
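A literal transcription of this sum for a scalar input \(x\) and \(N\) hidden neurons (the parameter values below are illustrative):

```python
import numpy as np

def sigma(t):
    return 1 / (1 + np.exp(-t))

def single_hidden_layer(x, alpha, w, b):
    # y = sum_i alpha_i * sigma(w_i * x + b_i)
    return np.sum(alpha * sigma(w * x + b))

alpha = np.array([1.0, -2.0, 0.5])  # output weights
w = np.array([2.0, 1.0, -1.0])      # hidden-layer weights
b = np.array([0.0, 1.0, 0.5])       # hidden-layer biases

print(single_hidden_layer(0.0, alpha, w, b))
```

Training adjusts \(\alpha_i\), \(w_{1,i}\), and \(b_i\) so that this sum tracks the target function.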

Effect of Varying w

Code
def logistic(x, w, b):
    """Compute the logistic function with parameters w and b."""
    return 1 / (1 + np.exp(-(w * x + b)))

# Define a range for x values.
x = np.linspace(-10, 10, 400)

# Plot 1: Varying w (steepness) with b fixed at 0.
plt.figure(figsize=(6,4))
w_values = [0.5, 1, 2, 5]  # different steepness values
b = 0  # fixed bias

for w in w_values:
    plt.plot(x, logistic(x, w, b), label=f'w = {w}, b = {b}')
plt.title('Effect of Varying w (with b = 0)')
plt.xlabel('x')
plt.ylabel(r'$\sigma(wx+b)$')
plt.legend()
plt.grid(True)

plt.show()

Effect of Varying b

Code
# Plot 2: Varying b (horizontal shift) with w fixed at 1.
plt.figure(figsize=(6,4))
w = 1  # fixed steepness
b_values = [-5, -2, 0, 2, 5]  # different bias values

for b in b_values:
    plt.plot(x, logistic(x, w, b), label=f'w = {w}, b = {b}')
plt.title('Effect of Varying b (with w = 1)')
plt.xlabel('x')
plt.ylabel(r'$\sigma(wx+b)$')
plt.legend()
plt.grid(True)

plt.show()

Effect of Varying w

Code
def relu(x, w, b):
    """Compute the ReLU activation with parameters w and b."""
    return np.maximum(0, w * x + b)

# Define a range for x values.
x = np.linspace(-10, 10, 400)

# Plot 1: Varying w (scaling) with b fixed at 0.
plt.figure(figsize=(6,4))
w_values = [0.5, 1, 2, 5]  # different scaling values
b = 0  # fixed bias

for w in w_values:
    plt.plot(x, relu(x, w, b), label=f'w = {w}, b = {b}')
plt.title('Effect of Varying w (with b = 0) on ReLU Activation')
plt.xlabel('x')
plt.ylabel('ReLU(wx+b)')
plt.legend()
plt.grid(True)

plt.show()

Effect of Varying b

Code
# Plot 2: Varying b (horizontal shift) with w fixed at 1.
plt.figure(figsize=(6,4))
w = 1  # fixed scaling
b_values = [-5, -2, 0, 2, 5]  # different bias values

for b in b_values:
    plt.plot(x, relu(x, w, b), label=f'w = {w}, b = {b}')
plt.title('Effect of Varying b (with w = 1) on ReLU Activation')
plt.xlabel('x')
plt.ylabel('ReLU(wx+b)')
plt.legend()
plt.grid(True)

plt.show()

Single Hidden Layer

\[ y = \sum_{i=1}^N \alpha_i \sigma(w_{1,i} x + b_i) \]

Demonstration with code

# Defining the function to be approximated

def f(x):
  return 2 * x**3 + 4 * x**2 - 5 * x + 1

# Generating a dataset, x in [-4,2), f(x) as above

X = 6 * np.random.rand(1000, 1) - 4

y = f(X.flatten())

Increasing the number of neurons

from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.1, random_state=42)

models = []

sizes = [1, 2, 5, 10, 100]

for i, n in enumerate(sizes):

  models.append(MLPRegressor(hidden_layer_sizes=[n], max_iter=5000, random_state=42))

  models[i].fit(X_train, y_train) 

Increasing the number of neurons

Code
# Create a colormap
colors = plt.colormaps['cool'].resampled(len(sizes))

X_valid = np.sort(X_valid,axis=0)

for i, n in enumerate(sizes):

  y_pred = models[i].predict(X_valid)

  plt.plot(X_valid, y_pred, "-", color=colors(i), label="Number of neurons = {}".format(n))

y_true = f(X_valid)
plt.plot(X_valid, y_true, "r.", label='Actual')

plt.legend()
plt.show()

Increasing the number of neurons

Code
for i, n in enumerate(sizes):

  plt.plot(models[i].loss_curve_, "-", color=colors(i), label="Number of neurons = {}".format(n))

plt.title('MLPRegressor Loss Curves')
plt.xlabel('Iterations')
plt.ylabel('Loss')

plt.legend()
plt.show()

Universal Approximation

Let’s code

Frameworks

PyTorch and TensorFlow are the leading platforms for deep learning.

  • PyTorch has gained considerable traction in the research community. Initially developed by Meta AI, it is now part of the Linux Foundation.

  • TensorFlow, created by Google, is widely adopted in industry for deploying models in production environments.

Keras

Keras is a high-level API designed to build, train, evaluate, and execute models across various backends, including PyTorch, TensorFlow, and JAX, Google’s high-performance platform.

Fashion-MNIST dataset

Fashion-MNIST is a dataset of Zalando’s article images, consisting of a training set of 60,000 examples and a test set of 10,000 examples. Each example is a 28×28 grayscale image, associated with a label from 10 classes.

Loading

import tensorflow as tf

fashion_mnist = tf.keras.datasets.fashion_mnist

(X_train_full, y_train_full), (X_test, y_test) = fashion_mnist.load_data()

X_train, y_train = X_train_full[:-5000], y_train_full[:-5000]
X_valid, y_valid = X_train_full[-5000:], y_train_full[-5000:]

Exploration

X_train.shape
(55000, 28, 28)
X_train.dtype
dtype('uint8')

Transforming the pixel intensities from integers in the range 0 to 255 to floats in the range 0 to 1. (The test set will need the same scaling before evaluation.)

X_train = X_train / 255.0
X_valid = X_valid / 255.0

What are these images anyway!

plt.figure(figsize=(2, 2))
plt.imshow(X_train[0], cmap="binary")
plt.axis('off')
plt.show()

y_train
array([9, 0, 0, ..., 9, 0, 2], shape=(55000,), dtype=uint8)

Since the labels are integers from 0 to 9, class names will come in handy.

class_names = ["T-shirt/top", "Trouser", "Pullover", "Dress", "Coat",
               "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot"]

First 40 images

n_rows = 4
n_cols = 10
plt.figure(figsize=(n_cols * 1.2, n_rows * 1.2))
for row in range(n_rows):
    for col in range(n_cols):
        index = n_cols * row + col
        plt.subplot(n_rows, n_cols, index + 1)
        plt.imshow(X_train[index], cmap="binary", interpolation="nearest")
        plt.axis('off')
        plt.title(class_names[y_train[index]])
plt.subplots_adjust(wspace=0.2, hspace=0.5)
plt.show()

First 40 images

Creating a model

tf.random.set_seed(42)

model = tf.keras.Sequential()

model.add(tf.keras.layers.InputLayer(shape=[28, 28]))
model.add(tf.keras.layers.Flatten())
model.add(tf.keras.layers.Dense(300, activation="relu"))
model.add(tf.keras.layers.Dense(100, activation="relu"))
model.add(tf.keras.layers.Dense(10, activation="softmax"))

model.summary()

Code
model.summary()
Model: "sequential_1"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ flatten_1 (Flatten)             │ (None, 784)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_3 (Dense)                 │ (None, 300)            │       235,500 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_4 (Dense)                 │ (None, 100)            │        30,100 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_5 (Dense)                 │ (None, 10)             │         1,010 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 266,610 (1.02 MB)
 Trainable params: 266,610 (1.02 MB)
 Non-trainable params: 0 (0.00 B)

Creating a model (alternative)

Code
# extra code – clear the session to reset the name counters
tf.keras.backend.clear_session()
tf.random.set_seed(42)
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(300, activation="relu"),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax")
])

model.summary()

Code
model.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ flatten (Flatten)               │ (None, 784)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 300)            │       235,500 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 100)            │        30,100 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_2 (Dense)                 │ (None, 10)             │         1,010 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 266,610 (1.02 MB)
 Trainable params: 266,610 (1.02 MB)
 Non-trainable params: 0 (0.00 B)

Compiling the model

model.compile(loss="sparse_categorical_crossentropy",
              optimizer="sgd",
              metrics=["accuracy"])
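Here, `sparse_categorical_crossentropy` is appropriate because the labels are integers (0 to 9) rather than one-hot vectors. The quantity it computes is the mean negative log-probability that the model assigns to the true class, which can be sketched in NumPy (values illustrative):

```python
import numpy as np

def sparse_categorical_crossentropy(y_true, probs):
    # Mean negative log-probability of the true class; y_true holds integer labels
    return -np.mean(np.log(probs[np.arange(len(y_true)), y_true]))

# Two examples, three classes; each row is a softmax output, summing to 1.
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])
y_true = np.array([0, 1])

loss = sparse_categorical_crossentropy(y_true, probs)
print(round(loss, 4))  # 0.2899
```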

Training the model

history = model.fit(X_train, y_train, epochs=30,
                    validation_data=(X_valid, y_valid))
Epoch 1/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 2s 934us/step - accuracy: 0.7627 - loss: 0.7300 - val_accuracy: 0.8274 - val_loss: 0.5047
Epoch 2/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 821us/step - accuracy: 0.8302 - loss: 0.4896 - val_accuracy: 0.8390 - val_loss: 0.4520
Epoch 3/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 799us/step - accuracy: 0.8444 - loss: 0.4438 - val_accuracy: 0.8466 - val_loss: 0.4266
Epoch 4/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 857us/step - accuracy: 0.8536 - loss: 0.4164 - val_accuracy: 0.8492 - val_loss: 0.4120
Epoch 5/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 859us/step - accuracy: 0.8603 - loss: 0.3960 - val_accuracy: 0.8526 - val_loss: 0.4009
Epoch 6/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 776us/step - accuracy: 0.8657 - loss: 0.3799 - val_accuracy: 0.8544 - val_loss: 0.3923
Epoch 7/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 805us/step - accuracy: 0.8702 - loss: 0.3662 - val_accuracy: 0.8562 - val_loss: 0.3862
Epoch 8/30

 557/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 817us/step - accuracy: 0.8708 - loss: 0.3624

 619/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 817us/step - accuracy: 0.8710 - loss: 0.3623

 681/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 817us/step - accuracy: 0.8712 - loss: 0.3621

 743/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 816us/step - accuracy: 0.8714 - loss: 0.3620

 805/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 816us/step - accuracy: 0.8715 - loss: 0.3619

 867/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.8717 - loss: 0.3618

 928/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 816us/step - accuracy: 0.8718 - loss: 0.3617

 989/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 817us/step - accuracy: 0.8719 - loss: 0.3616

1052/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 816us/step - accuracy: 0.8721 - loss: 0.3613

1114/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 816us/step - accuracy: 0.8722 - loss: 0.3610

1176/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.8724 - loss: 0.3607

1239/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.8725 - loss: 0.3605

1302/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 814us/step - accuracy: 0.8726 - loss: 0.3603

1364/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 814us/step - accuracy: 0.8727 - loss: 0.3601

1424/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.8728 - loss: 0.3599

1486/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 814us/step - accuracy: 0.8729 - loss: 0.3596

1549/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 814us/step - accuracy: 0.8730 - loss: 0.3594

1613/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 813us/step - accuracy: 0.8731 - loss: 0.3592

1675/1719 ━━━━━━━━━━━━━━━━━━━ 0s 813us/step - accuracy: 0.8732 - loss: 0.3590

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 867us/step - accuracy: 0.8745 - loss: 0.3544 - val_accuracy: 0.8616 - val_loss: 0.3791

Epoch 9/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 16s 9ms/step - accuracy: 0.8438 - loss: 0.3963

  63/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 812us/step - accuracy: 0.8802 - loss: 0.3262

 126/1719 ━━━━━━━━━━━━━━━━━━━ 1s 806us/step - accuracy: 0.8776 - loss: 0.3373

 186/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 818us/step - accuracy: 0.8755 - loss: 0.3446

 247/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 819us/step - accuracy: 0.8749 - loss: 0.3468

 308/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 821us/step - accuracy: 0.8747 - loss: 0.3481

 369/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 821us/step - accuracy: 0.8747 - loss: 0.3490

 432/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 819us/step - accuracy: 0.8747 - loss: 0.3499

 493/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 819us/step - accuracy: 0.8748 - loss: 0.3504

 555/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 819us/step - accuracy: 0.8749 - loss: 0.3506

 618/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 817us/step - accuracy: 0.8750 - loss: 0.3505

 680/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 817us/step - accuracy: 0.8752 - loss: 0.3504

 741/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 818us/step - accuracy: 0.8753 - loss: 0.3503

 803/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 817us/step - accuracy: 0.8754 - loss: 0.3502

 866/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 816us/step - accuracy: 0.8755 - loss: 0.3502

 928/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.8756 - loss: 0.3501

 991/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 814us/step - accuracy: 0.8757 - loss: 0.3500

1052/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.8758 - loss: 0.3498

1114/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.8760 - loss: 0.3495

1177/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 814us/step - accuracy: 0.8761 - loss: 0.3493

1238/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.8762 - loss: 0.3491

1307/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 810us/step - accuracy: 0.8763 - loss: 0.3489

1377/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 806us/step - accuracy: 0.8764 - loss: 0.3487

1448/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 802us/step - accuracy: 0.8765 - loss: 0.3484

1517/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 798us/step - accuracy: 0.8766 - loss: 0.3482

1587/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 795us/step - accuracy: 0.8767 - loss: 0.3480

1656/1719 ━━━━━━━━━━━━━━━━━━━ 0s 792us/step - accuracy: 0.8768 - loss: 0.3478

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 843us/step - accuracy: 0.8781 - loss: 0.3438 - val_accuracy: 0.8634 - val_loss: 0.3744

Epoch 10/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 16s 9ms/step - accuracy: 0.8438 - loss: 0.3717

  69/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 737us/step - accuracy: 0.8881 - loss: 0.3154

 139/1719 ━━━━━━━━━━━━━━━━━━━ 1s 727us/step - accuracy: 0.8842 - loss: 0.3281

 208/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 727us/step - accuracy: 0.8817 - loss: 0.3345

 278/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 725us/step - accuracy: 0.8809 - loss: 0.3366

 347/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 726us/step - accuracy: 0.8805 - loss: 0.3379

 419/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 722us/step - accuracy: 0.8802 - loss: 0.3392

 490/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 721us/step - accuracy: 0.8801 - loss: 0.3398

 560/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 721us/step - accuracy: 0.8801 - loss: 0.3401

 631/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 720us/step - accuracy: 0.8802 - loss: 0.3400

 702/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 719us/step - accuracy: 0.8802 - loss: 0.3399

 773/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 719us/step - accuracy: 0.8802 - loss: 0.3399

 844/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 718us/step - accuracy: 0.8803 - loss: 0.3398

 914/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 718us/step - accuracy: 0.8803 - loss: 0.3399

 982/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 720us/step - accuracy: 0.8803 - loss: 0.3397

1052/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 720us/step - accuracy: 0.8804 - loss: 0.3395

1123/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 720us/step - accuracy: 0.8805 - loss: 0.3392

1193/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 719us/step - accuracy: 0.8805 - loss: 0.3390

1264/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 719us/step - accuracy: 0.8806 - loss: 0.3389

1336/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 718us/step - accuracy: 0.8806 - loss: 0.3387

1406/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 718us/step - accuracy: 0.8807 - loss: 0.3384

1476/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 718us/step - accuracy: 0.8808 - loss: 0.3382

1546/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 718us/step - accuracy: 0.8808 - loss: 0.3380

1616/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 718us/step - accuracy: 0.8809 - loss: 0.3379

1686/1719 ━━━━━━━━━━━━━━━━━━━ 0s 718us/step - accuracy: 0.8809 - loss: 0.3377

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 768us/step - accuracy: 0.8811 - loss: 0.3345 - val_accuracy: 0.8658 - val_loss: 0.3700

Epoch 11/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 16s 10ms/step - accuracy: 0.9062 - loss: 0.3459

  69/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 735us/step - accuracy: 0.8963 - loss: 0.3052

 138/1719 ━━━━━━━━━━━━━━━━━━━ 1s 730us/step - accuracy: 0.8898 - loss: 0.3179

 208/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 728us/step - accuracy: 0.8862 - loss: 0.3246

 277/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 727us/step - accuracy: 0.8850 - loss: 0.3268

 345/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 729us/step - accuracy: 0.8844 - loss: 0.3282

 415/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 728us/step - accuracy: 0.8839 - loss: 0.3295

 485/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.8837 - loss: 0.3302

 555/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.8836 - loss: 0.3306

 624/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.8836 - loss: 0.3306

 693/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.8835 - loss: 0.3305

 760/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 730us/step - accuracy: 0.8835 - loss: 0.3305

 820/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 738us/step - accuracy: 0.8835 - loss: 0.3305

 882/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 743us/step - accuracy: 0.8835 - loss: 0.3305

 943/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 748us/step - accuracy: 0.8835 - loss: 0.3305

1007/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 751us/step - accuracy: 0.8835 - loss: 0.3304

1069/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 754us/step - accuracy: 0.8836 - loss: 0.3302

1132/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 757us/step - accuracy: 0.8837 - loss: 0.3300

1193/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 761us/step - accuracy: 0.8837 - loss: 0.3298

1256/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 763us/step - accuracy: 0.8837 - loss: 0.3297

1320/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 764us/step - accuracy: 0.8838 - loss: 0.3295

1382/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 767us/step - accuracy: 0.8838 - loss: 0.3293

1443/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 769us/step - accuracy: 0.8839 - loss: 0.3292

1505/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 771us/step - accuracy: 0.8839 - loss: 0.3290

1566/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 773us/step - accuracy: 0.8840 - loss: 0.3289

1628/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 775us/step - accuracy: 0.8840 - loss: 0.3287

1688/1719 ━━━━━━━━━━━━━━━━━━━ 0s 777us/step - accuracy: 0.8840 - loss: 0.3286

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 833us/step - accuracy: 0.8840 - loss: 0.3258 - val_accuracy: 0.8676 - val_loss: 0.3670

Epoch 12/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 16s 9ms/step - accuracy: 0.8750 - loss: 0.3339

  62/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 820us/step - accuracy: 0.8966 - loss: 0.2965

 125/1719 ━━━━━━━━━━━━━━━━━━━ 1s 808us/step - accuracy: 0.8912 - loss: 0.3075

 187/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 810us/step - accuracy: 0.8882 - loss: 0.3152

 249/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 809us/step - accuracy: 0.8872 - loss: 0.3178

 312/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 808us/step - accuracy: 0.8867 - loss: 0.3194

 376/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 806us/step - accuracy: 0.8864 - loss: 0.3206

 440/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 803us/step - accuracy: 0.8862 - loss: 0.3216

 503/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 802us/step - accuracy: 0.8861 - loss: 0.3222

 566/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 802us/step - accuracy: 0.8861 - loss: 0.3224

 629/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 802us/step - accuracy: 0.8862 - loss: 0.3224

 692/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 802us/step - accuracy: 0.8863 - loss: 0.3223

 753/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.8863 - loss: 0.3223

 815/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 804us/step - accuracy: 0.8864 - loss: 0.3223

 879/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 804us/step - accuracy: 0.8864 - loss: 0.3224

 942/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.8864 - loss: 0.3224

1005/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.8864 - loss: 0.3222

1068/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.8865 - loss: 0.3221

1130/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.8866 - loss: 0.3219

1193/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.8866 - loss: 0.3217

1255/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 804us/step - accuracy: 0.8867 - loss: 0.3216

1318/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 804us/step - accuracy: 0.8867 - loss: 0.3214

1383/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 802us/step - accuracy: 0.8868 - loss: 0.3213

1445/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.8868 - loss: 0.3211

1507/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.8868 - loss: 0.3209

1571/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 802us/step - accuracy: 0.8869 - loss: 0.3208

1633/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 802us/step - accuracy: 0.8869 - loss: 0.3207

1696/1719 ━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.8869 - loss: 0.3206

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 855us/step - accuracy: 0.8869 - loss: 0.3180 - val_accuracy: 0.8706 - val_loss: 0.3623

Epoch 13/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 16s 9ms/step - accuracy: 0.9375 - loss: 0.3142

  62/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 833us/step - accuracy: 0.9036 - loss: 0.2884

 124/1719 ━━━━━━━━━━━━━━━━━━━ 1s 822us/step - accuracy: 0.8964 - loss: 0.2992

 184/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 827us/step - accuracy: 0.8926 - loss: 0.3070

 247/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 821us/step - accuracy: 0.8910 - loss: 0.3099

 308/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 823us/step - accuracy: 0.8902 - loss: 0.3114

 369/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 824us/step - accuracy: 0.8898 - loss: 0.3126

 430/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 824us/step - accuracy: 0.8893 - loss: 0.3137

 491/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 824us/step - accuracy: 0.8890 - loss: 0.3143

 553/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 824us/step - accuracy: 0.8888 - loss: 0.3146

 616/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 821us/step - accuracy: 0.8887 - loss: 0.3146

 686/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 811us/step - accuracy: 0.8887 - loss: 0.3146

 756/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 802us/step - accuracy: 0.8887 - loss: 0.3146

 824/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 797us/step - accuracy: 0.8887 - loss: 0.3146

 893/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 793us/step - accuracy: 0.8886 - loss: 0.3147

 961/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 789us/step - accuracy: 0.8887 - loss: 0.3146

1032/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 783us/step - accuracy: 0.8887 - loss: 0.3145

1094/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 785us/step - accuracy: 0.8887 - loss: 0.3143

1161/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 783us/step - accuracy: 0.8888 - loss: 0.3141

1230/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 780us/step - accuracy: 0.8888 - loss: 0.3140

1298/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 778us/step - accuracy: 0.8889 - loss: 0.3139

1367/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 775us/step - accuracy: 0.8889 - loss: 0.3137

1437/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 773us/step - accuracy: 0.8890 - loss: 0.3135

1506/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 771us/step - accuracy: 0.8890 - loss: 0.3133

1573/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 770us/step - accuracy: 0.8891 - loss: 0.3132

1642/1719 ━━━━━━━━━━━━━━━━━━━ 0s 768us/step - accuracy: 0.8891 - loss: 0.3131

1713/1719 ━━━━━━━━━━━━━━━━━━━ 0s 766us/step - accuracy: 0.8891 - loss: 0.3130

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 819us/step - accuracy: 0.8893 - loss: 0.3107 - val_accuracy: 0.8726 - val_loss: 0.3587

Epoch 14/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 15s 9ms/step - accuracy: 0.9375 - loss: 0.3011

  69/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 746us/step - accuracy: 0.9052 - loss: 0.2822

 138/1719 ━━━━━━━━━━━━━━━━━━━ 1s 738us/step - accuracy: 0.8974 - loss: 0.2943

 209/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 729us/step - accuracy: 0.8933 - loss: 0.3010

 278/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 730us/step - accuracy: 0.8920 - loss: 0.3033

 348/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.8912 - loss: 0.3047

 417/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.8905 - loss: 0.3061

 487/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 726us/step - accuracy: 0.8902 - loss: 0.3068

 556/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 726us/step - accuracy: 0.8899 - loss: 0.3072

 627/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 725us/step - accuracy: 0.8899 - loss: 0.3072

 696/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 725us/step - accuracy: 0.8899 - loss: 0.3072

 765/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 726us/step - accuracy: 0.8899 - loss: 0.3072

 836/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 725us/step - accuracy: 0.8899 - loss: 0.3072

 905/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 725us/step - accuracy: 0.8899 - loss: 0.3073

 975/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 725us/step - accuracy: 0.8900 - loss: 0.3073

1044/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 725us/step - accuracy: 0.8901 - loss: 0.3071

1116/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 723us/step - accuracy: 0.8902 - loss: 0.3069

1184/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.8903 - loss: 0.3068

1254/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.8903 - loss: 0.3067

1324/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.8904 - loss: 0.3065

1394/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 723us/step - accuracy: 0.8905 - loss: 0.3063

1462/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.8906 - loss: 0.3062

1532/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.8907 - loss: 0.3060

1601/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.8908 - loss: 0.3059

1671/1719 ━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.8908 - loss: 0.3058

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 776us/step - accuracy: 0.8917 - loss: 0.3037 - val_accuracy: 0.8734 - val_loss: 0.3561

Epoch 15/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 15s 9ms/step - accuracy: 0.9375 - loss: 0.2859

  67/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 762us/step - accuracy: 0.9050 - loss: 0.2754

 128/1719 ━━━━━━━━━━━━━━━━━━━ 1s 793us/step - accuracy: 0.8991 - loss: 0.2859

 189/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 802us/step - accuracy: 0.8955 - loss: 0.2930

 250/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 807us/step - accuracy: 0.8941 - loss: 0.2957

 313/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 807us/step - accuracy: 0.8933 - loss: 0.2973

 376/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 806us/step - accuracy: 0.8928 - loss: 0.2985

 436/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 810us/step - accuracy: 0.8924 - loss: 0.2996

 495/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.8922 - loss: 0.3002

 558/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 813us/step - accuracy: 0.8920 - loss: 0.3005

 619/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 814us/step - accuracy: 0.8920 - loss: 0.3005

 681/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 813us/step - accuracy: 0.8921 - loss: 0.3005

 743/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 813us/step - accuracy: 0.8921 - loss: 0.3005

 804/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 814us/step - accuracy: 0.8921 - loss: 0.3005

 867/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 813us/step - accuracy: 0.8921 - loss: 0.3006

 928/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 814us/step - accuracy: 0.8921 - loss: 0.3007

 989/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.8922 - loss: 0.3006

1052/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 814us/step - accuracy: 0.8922 - loss: 0.3005

1112/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.8923 - loss: 0.3003

1172/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 816us/step - accuracy: 0.8924 - loss: 0.3002

1232/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 817us/step - accuracy: 0.8924 - loss: 0.3001

1295/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 817us/step - accuracy: 0.8925 - loss: 0.3000

1355/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 818us/step - accuracy: 0.8925 - loss: 0.2998

1417/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 818us/step - accuracy: 0.8926 - loss: 0.2997

1480/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 817us/step - accuracy: 0.8927 - loss: 0.2995

1542/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 817us/step - accuracy: 0.8927 - loss: 0.2994

1605/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 816us/step - accuracy: 0.8928 - loss: 0.2993

1669/1719 ━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.8928 - loss: 0.2992

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 2s 872us/step - accuracy: 0.8933 - loss: 0.2973 - val_accuracy: 0.8726 - val_loss: 0.3538

Epoch 16/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 17s 10ms/step - accuracy: 0.9375 - loss: 0.2780

  62/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 823us/step - accuracy: 0.9077 - loss: 0.2697

 123/1719 ━━━━━━━━━━━━━━━━━━━ 1s 827us/step - accuracy: 0.9021 - loss: 0.2795

 188/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 811us/step - accuracy: 0.8982 - loss: 0.2873

 251/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 807us/step - accuracy: 0.8968 - loss: 0.2900

 315/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 803us/step - accuracy: 0.8961 - loss: 0.2915

 379/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 801us/step - accuracy: 0.8956 - loss: 0.2927

 443/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 798us/step - accuracy: 0.8952 - loss: 0.2937

 507/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 797us/step - accuracy: 0.8951 - loss: 0.2942

 569/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 798us/step - accuracy: 0.8950 - loss: 0.2945

 632/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 799us/step - accuracy: 0.8950 - loss: 0.2945

 696/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 798us/step - accuracy: 0.8950 - loss: 0.2944

 760/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 797us/step - accuracy: 0.8950 - loss: 0.2944

 823/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 798us/step - accuracy: 0.8950 - loss: 0.2944

 888/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 796us/step - accuracy: 0.8949 - loss: 0.2945

 952/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 795us/step - accuracy: 0.8950 - loss: 0.2945

1014/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 796us/step - accuracy: 0.8950 - loss: 0.2944

1074/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 798us/step - accuracy: 0.8951 - loss: 0.2942

1137/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 799us/step - accuracy: 0.8951 - loss: 0.2941

1201/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 798us/step - accuracy: 0.8952 - loss: 0.2939

1263/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 798us/step - accuracy: 0.8952 - loss: 0.2938

1326/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 799us/step - accuracy: 0.8953 - loss: 0.2937

1387/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 800us/step - accuracy: 0.8954 - loss: 0.2936

1450/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 800us/step - accuracy: 0.8954 - loss: 0.2934

1514/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 799us/step - accuracy: 0.8955 - loss: 0.2933

1579/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 798us/step - accuracy: 0.8955 - loss: 0.2932

1644/1719 ━━━━━━━━━━━━━━━━━━━ 0s 797us/step - accuracy: 0.8956 - loss: 0.2931

1704/1719 ━━━━━━━━━━━━━━━━━━━ 0s 799us/step - accuracy: 0.8956 - loss: 0.2930

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 854us/step - accuracy: 0.8957 - loss: 0.2911 - val_accuracy: 0.8734 - val_loss: 0.3518

Epoch 17/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 15s 9ms/step - accuracy: 0.9062 - loss: 0.2775

  69/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 743us/step - accuracy: 0.9006 - loss: 0.2654

 138/1719 ━━━━━━━━━━━━━━━━━━━ 1s 738us/step - accuracy: 0.8980 - loss: 0.2762

 206/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 739us/step - accuracy: 0.8963 - loss: 0.2823

 276/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 735us/step - accuracy: 0.8957 - loss: 0.2847

 346/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 732us/step - accuracy: 0.8956 - loss: 0.2861

 414/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 733us/step - accuracy: 0.8954 - loss: 0.2874

 486/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 728us/step - accuracy: 0.8953 - loss: 0.2881

 555/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 729us/step - accuracy: 0.8954 - loss: 0.2885

 626/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.8955 - loss: 0.2885

 697/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 725us/step - accuracy: 0.8956 - loss: 0.2884

 769/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 723us/step - accuracy: 0.8957 - loss: 0.2884

 838/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 723us/step - accuracy: 0.8958 - loss: 0.2884

 906/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 725us/step - accuracy: 0.8958 - loss: 0.2886

 974/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 726us/step - accuracy: 0.8959 - loss: 0.2885

1044/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 725us/step - accuracy: 0.8960 - loss: 0.2884

1114/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 725us/step - accuracy: 0.8962 - loss: 0.2882

1179/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.8963 - loss: 0.2880

1243/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 730us/step - accuracy: 0.8963 - loss: 0.2879

1313/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 730us/step - accuracy: 0.8964 - loss: 0.2878

1382/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 730us/step - accuracy: 0.8965 - loss: 0.2876

1451/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 730us/step - accuracy: 0.8967 - loss: 0.2875

1520/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 730us/step - accuracy: 0.8967 - loss: 0.2873

1591/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 729us/step - accuracy: 0.8968 - loss: 0.2872

1663/1719 ━━━━━━━━━━━━━━━━━━━ 0s 728us/step - accuracy: 0.8968 - loss: 0.2871

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 777us/step - accuracy: 0.8976 - loss: 0.2853 - val_accuracy: 0.8712 - val_loss: 0.3509

Epoch 18/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 16s 10ms/step - accuracy: 0.9375 - loss: 0.2674

  65/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 792us/step - accuracy: 0.9042 - loss: 0.2592

 125/1719 ━━━━━━━━━━━━━━━━━━━ 1s 814us/step - accuracy: 0.9004 - loss: 0.2684

 186/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 819us/step - accuracy: 0.8983 - loss: 0.2754

 246/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 824us/step - accuracy: 0.8976 - loss: 0.2780

 306/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 826us/step - accuracy: 0.8973 - loss: 0.2795

 369/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 821us/step - accuracy: 0.8971 - loss: 0.2806

 431/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 820us/step - accuracy: 0.8969 - loss: 0.2817

 494/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 818us/step - accuracy: 0.8969 - loss: 0.2823

 555/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 819us/step - accuracy: 0.8969 - loss: 0.2826

 617/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 819us/step - accuracy: 0.8970 - loss: 0.2826

 681/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.8971 - loss: 0.2826

 744/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 814us/step - accuracy: 0.8972 - loss: 0.2826

 805/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 814us/step - accuracy: 0.8973 - loss: 0.2826

 867/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 814us/step - accuracy: 0.8973 - loss: 0.2827

 931/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 812us/step - accuracy: 0.8974 - loss: 0.2827

 992/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 813us/step - accuracy: 0.8975 - loss: 0.2826

1052/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.8976 - loss: 0.2825

1114/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 814us/step - accuracy: 0.8977 - loss: 0.2823

1179/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 812us/step - accuracy: 0.8978 - loss: 0.2822

1242/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 812us/step - accuracy: 0.8978 - loss: 0.2821

1304/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 812us/step - accuracy: 0.8979 - loss: 0.2820

1368/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 810us/step - accuracy: 0.8980 - loss: 0.2818

1434/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 808us/step - accuracy: 0.8981 - loss: 0.2817

1496/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 809us/step - accuracy: 0.8982 - loss: 0.2816

1559/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 808us/step - accuracy: 0.8983 - loss: 0.2815

1623/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 807us/step - accuracy: 0.8984 - loss: 0.2814

1686/1719 ━━━━━━━━━━━━━━━━━━━ 0s 807us/step - accuracy: 0.8984 - loss: 0.2813

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 2s 868us/step - accuracy: 0.8994 - loss: 0.2797 - val_accuracy: 0.8724 - val_loss: 0.3504

Epoch 19/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 17s 10ms/step - accuracy: 0.9375 - loss: 0.2601

  59/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 870us/step - accuracy: 0.9086 - loss: 0.2525

 115/1719 ━━━━━━━━━━━━━━━━━━━ 1s 882us/step - accuracy: 0.9047 - loss: 0.2607

 174/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 873us/step - accuracy: 0.9020 - loss: 0.2687

 233/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 870us/step - accuracy: 0.9008 - loss: 0.2718

 293/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 865us/step - accuracy: 0.9003 - loss: 0.2735

 352/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 863us/step - accuracy: 0.9001 - loss: 0.2746

 411/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 862us/step - accuracy: 0.8998 - loss: 0.2757

 471/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 859us/step - accuracy: 0.8997 - loss: 0.2764

 530/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 860us/step - accuracy: 0.8997 - loss: 0.2768

 590/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 859us/step - accuracy: 0.8997 - loss: 0.2770

 650/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 857us/step - accuracy: 0.8998 - loss: 0.2770

 710/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 856us/step - accuracy: 0.8999 - loss: 0.2769

 768/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 857us/step - accuracy: 0.8999 - loss: 0.2769

 826/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 857us/step - accuracy: 0.9000 - loss: 0.2770

 884/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 859us/step - accuracy: 0.9000 - loss: 0.2771

 944/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 857us/step - accuracy: 0.9001 - loss: 0.2771

1003/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 857us/step - accuracy: 0.9002 - loss: 0.2770

1061/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 858us/step - accuracy: 0.9002 - loss: 0.2769

1119/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 858us/step - accuracy: 0.9003 - loss: 0.2767

1179/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 857us/step - accuracy: 0.9004 - loss: 0.2766

1242/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 854us/step - accuracy: 0.9004 - loss: 0.2765

1305/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 851us/step - accuracy: 0.9005 - loss: 0.2764

1369/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 848us/step - accuracy: 0.9006 - loss: 0.2763

1431/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 847us/step - accuracy: 0.9007 - loss: 0.2761

1493/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 846us/step - accuracy: 0.9008 - loss: 0.2760

1556/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 844us/step - accuracy: 0.9008 - loss: 0.2759

1618/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 842us/step - accuracy: 0.9009 - loss: 0.2758

1682/1719 ━━━━━━━━━━━━━━━━━━━ 0s 840us/step - accuracy: 0.9009 - loss: 0.2758

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 2s 899us/step - accuracy: 0.9017 - loss: 0.2743 - val_accuracy: 0.8738 - val_loss: 0.3489

Epoch 20/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 17s 10ms/step - accuracy: 0.9375 - loss: 0.2529

  60/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 856us/step - accuracy: 0.9113 - loss: 0.2469

 119/1719 ━━━━━━━━━━━━━━━━━━━ 1s 852us/step - accuracy: 0.9074 - loss: 0.2556

 179/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 846us/step - accuracy: 0.9045 - loss: 0.2633

 249/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 811us/step - accuracy: 0.9031 - loss: 0.2666

 316/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 798us/step - accuracy: 0.9024 - loss: 0.2682

 387/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 782us/step - accuracy: 0.9020 - loss: 0.2697

 457/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 772us/step - accuracy: 0.9018 - loss: 0.2707

 527/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 765us/step - accuracy: 0.9017 - loss: 0.2713

 597/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 759us/step - accuracy: 0.9017 - loss: 0.2715

 666/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 756us/step - accuracy: 0.9018 - loss: 0.2714

 736/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 752us/step - accuracy: 0.9019 - loss: 0.2714

 806/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 749us/step - accuracy: 0.9020 - loss: 0.2714

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 787us/step - accuracy: 0.9033 - loss: 0.2691 - val_accuracy: 0.8750 - val_loss: 0.3470

Epoch 21/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 792us/step - accuracy: 0.9053 - loss: 0.2642 - val_accuracy: 0.8756 - val_loss: 0.3472

Epoch 22/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 853us/step - accuracy: 0.9069 - loss: 0.2595 - val_accuracy: 0.8770 - val_loss: 0.3467

Epoch 23/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 835us/step - accuracy: 0.9090 - loss: 0.2548 - val_accuracy: 0.8778 - val_loss: 0.3457

Epoch 24/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 842us/step - accuracy: 0.9110 - loss: 0.2502 - val_accuracy: 0.8784 - val_loss: 0.3456

Epoch 25/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 837us/step - accuracy: 0.9127 - loss: 0.2459 - val_accuracy: 0.8798 - val_loss: 0.3461

Epoch 26/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 853us/step - accuracy: 0.9140 - loss: 0.2415 - val_accuracy: 0.8802 - val_loss: 0.3485

Epoch 27/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 863us/step - accuracy: 0.9157 - loss: 0.2373 - val_accuracy: 0.8798 - val_loss: 0.3483

Epoch 28/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 858us/step - accuracy: 0.9170 - loss: 0.2333 - val_accuracy: 0.8796 - val_loss: 0.3469

Epoch 29/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 2s 872us/step - accuracy: 0.9188 - loss: 0.2292 - val_accuracy: 0.8786 - val_loss: 0.3494

Epoch 30/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 859us/step - accuracy: 0.9206 - loss: 0.2254 - val_accuracy: 0.8788 - val_loss: 0.3500

Visualization

import pandas as pd
import matplotlib.pyplot as plt

pd.DataFrame(history.history).plot(
    figsize=(8, 5), xlim=[0, 29], ylim=[0, 1], grid=True, xlabel="Epoch",
    style=["r--", "r--.", "b-", "b-*"])
plt.legend(loc="lower left")
plt.show()
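The `fit` method returns a `History` object whose `history` attribute is a plain dictionary mapping each metric name to a list of per-epoch values; the DataFrame above is built directly from it. A minimal sketch with a mock history dictionary (the numbers are placeholders, not results from the run above):

```python
import pandas as pd

# Mock history dict with the same keys Keras produces for this model;
# the values are placeholders, not results from the training run.
mock_history = {
    "accuracy":     [0.82, 0.86, 0.88],
    "loss":         [0.55, 0.41, 0.35],
    "val_accuracy": [0.84, 0.85, 0.86],
    "val_loss":     [0.48, 0.43, 0.40],
}

# One row per epoch, one column per metric.
df = pd.DataFrame(mock_history)
print(df.shape)  # (3, 4)
```

Because each column is a metric, calling `.plot()` on this DataFrame draws one curve per metric, which is exactly what the slide above shows.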

Visualization

Evaluating the model on the test set

model.evaluate(X_test, y_test)
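`evaluate` returns the loss followed by each metric the model was compiled with; here that is a `[loss, accuracy]` pair. The accuracy it reports amounts to comparing argmax predictions with the true labels, which can be sketched in NumPy (the arrays below are toy placeholders, not Fashion-MNIST outputs):

```python
import numpy as np

# Toy predicted probabilities (4 samples, 3 classes) and true labels.
y_proba = np.array([[0.1, 0.8, 0.1],
                    [0.7, 0.2, 0.1],
                    [0.2, 0.3, 0.5],
                    [0.6, 0.3, 0.1]])
y_true = np.array([1, 0, 2, 2])

# Sparse categorical accuracy: fraction of samples whose
# highest-probability class matches the true label.
accuracy = np.mean(y_proba.argmax(axis=-1) == y_true)
print(accuracy)  # 0.75
```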

Making predictions

X_new = X_test[:3]
y_proba = model.predict(X_new)
y_proba.round(2)
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step
array([[0., 0., 0., 0., 0., 0., 0., 0., 0., 1.],
       [0., 0., 1., 0., 0., 0., 0., 0., 0., 0.],
       [0., 1., 0., 0., 0., 0., 0., 0., 0., 0.]], dtype=float32)
y_pred = y_proba.argmax(axis=-1).astype(int)
y_pred
y_new = y_test[:3]
y_new
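Each row of `y_proba` sums to 1 (the softmax output), and `argmax` recovers the predicted class index. Applying it to the probability rows printed above, and mapping the indices through the standard Fashion-MNIST class names:

```python
import numpy as np

# The probability rows printed above: each is essentially one-hot,
# so argmax recovers the predicted class index.
y_proba = np.array([[0., 0., 0., 0., 0., 0., 0., 0., 0., 1.],
                    [0., 0., 1., 0., 0., 0., 0., 0., 0., 0.],
                    [0., 1., 0., 0., 0., 0., 0., 0., 0., 0.]])
y_pred = y_proba.argmax(axis=-1).astype(int)
print(y_pred)  # [9 2 1]

# Standard Fashion-MNIST class names (index = label).
class_names = ["T-shirt/top", "Trouser", "Pullover", "Dress", "Coat",
               "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot"]
print([class_names[i] for i in y_pred])  # ['Ankle boot', 'Pullover', 'Trouser']
```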

Predicted vs Observed

plt.figure(figsize=(7.2, 2.4))
for index, image in enumerate(X_new):
    plt.subplot(1, 3, index + 1)
    plt.imshow(image, cmap="binary", interpolation="nearest")
    plt.axis('off')
    plt.title(class_names[y_test[index]])
plt.subplots_adjust(wspace=0.2, hspace=0.5)
plt.show()

Test Set Performance

from sklearn.metrics import classification_report

y_proba = model.predict(X_test)
y_pred = y_proba.argmax(axis=-1).astype(int)

Test Set Performance

print(classification_report(y_test, y_pred))
              precision    recall  f1-score   support

           0       0.82      0.86      0.84      1000
           1       0.96      0.98      0.97      1000
           2       0.80      0.72      0.76      1000
           3       0.89      0.82      0.85      1000
           4       0.61      0.93      0.73      1000
           5       0.90      0.97      0.93      1000
           6       0.83      0.42      0.56      1000
           7       0.98      0.79      0.88      1000
           8       0.92      0.98      0.95      1000
           9       0.88      0.98      0.92      1000

    accuracy                           0.84     10000
   macro avg       0.86      0.84      0.84     10000
weighted avg       0.86      0.84      0.84     10000
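Each row of the report follows the usual per-class definitions: precision = TP/(TP+FP) and recall = TP/(TP+FN). The low recall for class 6 (0.42), for instance, means most true examples of that class were assigned elsewhere. A NumPy sketch on a toy 3-class problem (the labels are placeholders) shows where such numbers come from:

```python
import numpy as np

# Toy labels for a 3-class problem; placeholders, not test-set data.
y_true = np.array([0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 1, 1, 1, 2, 0])

def precision_recall(y_true, y_pred, cls):
    """Per-class precision and recall from true/false positive counts."""
    tp = np.sum((y_pred == cls) & (y_true == cls))
    fp = np.sum((y_pred == cls) & (y_true != cls))
    fn = np.sum((y_pred != cls) & (y_true == cls))
    return tp / (tp + fp), tp / (tp + fn)

p, r = precision_recall(y_true, y_pred, cls=1)
print(round(p, 2), r)  # 0.67 1.0
```

`classification_report` computes exactly these quantities for every class, then averages them (unweighted for "macro avg", by support for "weighted avg").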

Prologue

Summary

  • Introduction to Neural Networks and Connectionism
    • Shift from symbolic AI to connectionist approaches in artificial intelligence.
    • Inspiration from biological neural networks and the human brain’s structure.
  • Computations with Neurodes and Threshold Logic Units
    • Early models of neurons (neurodes) capable of performing logical operations (AND, OR, NOT).
    • Limitations of simple perceptrons in solving non-linearly separable problems like XOR.
  • Multilayer Perceptrons (MLPs) and Feedforward Neural Networks (FNNs)
    • Overcoming perceptron limitations by introducing hidden layers.
    • Structure and information flow in feedforward neural networks.
    • Explanation of forward pass computations in neural networks.
  • Activation Functions in Neural Networks
    • Importance of nonlinear activation functions (sigmoid, tanh, ReLU) for enabling learning of complex patterns.
    • Role of activation functions in backpropagation and gradient descent optimization.
    • Universal Approximation Theorem and its implications for neural networks.
  • Deep Learning Frameworks
    • Overview of PyTorch and TensorFlow as leading platforms for deep learning.
    • Introduction to Keras as a high-level API for building and training neural networks.
    • Discussion on the suitability of different frameworks for research and industry applications.
  • Hands-On Implementation with Keras
    • Loading and exploring the Fashion-MNIST dataset.
    • Building a neural network model using Keras’ Sequential API.
    • Compiling the model with appropriate loss functions and optimizers for multiclass classification.
    • Training the model and visualizing training and validation metrics over epochs.
    • Evaluating model performance on test data and interpreting results.
  • Making Predictions and Interpreting Results
    • Using the trained model to make predictions on new data.
    • Visualizing predictions alongside actual images and labels.
    • Understanding the output probabilities and class assignments in the context of the dataset.

3Blue1Brown on Deep Learning

Next lecture

  • We will discuss the training algorithm for artificial neural networks.

References

Cybenko, George V. 1989. “Approximation by Superpositions of a Sigmoidal Function.” Mathematics of Control, Signals and Systems 2: 303–14. https://api.semanticscholar.org/CorpusID:3958369.
Géron, Aurélien. 2022. Hands-on Machine Learning with Scikit-Learn, Keras, and TensorFlow. 3rd ed. O’Reilly Media, Inc.
Goodfellow, Ian, Yoshua Bengio, and Aaron Courville. 2016. Deep Learning. Adaptive Computation and Machine Learning. MIT Press. https://dblp.org/rec/books/daglib/0040158.
Hornik, Kurt, Maxwell Stinchcombe, and Halbert White. 1989. “Multilayer Feedforward Networks Are Universal Approximators.” Neural Networks 2 (5): 359–66. https://doi.org/10.1016/0893-6080(89)90020-8.
LeCun, Yann, Yoshua Bengio, and Geoffrey Hinton. 2015. “Deep Learning.” Nature 521 (7553): 436–44. https://doi.org/10.1038/nature14539.
LeNail, Alexander. 2019. “NN-SVG: Publication-Ready Neural Network Architecture Schematics.” Journal of Open Source Software 4 (33): 747. https://doi.org/10.21105/joss.00747.
McCulloch, Warren S., and Walter Pitts. 1943. “A logical calculus of the ideas immanent in nervous activity.” The Bulletin of Mathematical Biophysics 5 (4): 115–33. https://doi.org/10.1007/bf02478259.
Minsky, Marvin, and Seymour Papert. 1969. Perceptrons: An Introduction to Computational Geometry. Cambridge, MA, USA: MIT Press.
Rosenblatt, F. 1958. “The perceptron: A probabilistic model for information storage and organization in the brain.” Psychological Review 65 (6): 386–408. https://doi.org/10.1037/h0042519.
Russell, Stuart, and Peter Norvig. 2020. Artificial Intelligence: A Modern Approach. 4th ed. Pearson. http://aima.cs.berkeley.edu/.

Marcel Turcotte

Marcel.Turcotte@uOttawa.ca

School of Electrical Engineering and Computer Science (EECS)

University of Ottawa