Introduction to Artificial Neural Networks

CSI 4106 - Fall 2024

Author

Marcel Turcotte

Published

Version: Nov 14, 2024 09:02

Preamble

Quote of the Day

The Nobel Prize in Physics 2024 was awarded to John J. Hopfield and Geoffrey E. Hinton “for foundational discoveries and inventions that enable machine learning with artificial neural networks”

Learning objectives

  • Explain perceptrons and MLPs: structure, function, history, and limitations.
  • Describe activation functions: their role in enabling complex pattern learning.
  • Implement a feedforward neural network with Keras on Fashion-MNIST.
  • Interpret neural network training and results: visualization and evaluation metrics.
  • Familiarize with deep learning frameworks: PyTorch, TensorFlow, and Keras for model building and deployment.

As stated at the beginning of this course, there are two primary schools of thought in artificial intelligence: symbolic AI and connectionism. While the symbolic approach initially dominated the field, the connectionist approach is now more prevalent. We will now focus on connectionism.

Introduction

Neural Networks (NN)

We now shift our focus to a family of machine learning models that draw inspiration from the structure and function of biological neural networks found in animals.

AKA artificial neural networks or neural nets, abbreviated as ANN or NN.

Machine Learning Problems

  • Supervised Learning: Classification, Regression

  • Unsupervised Learning: Autoencoders, Self-Supervised

  • Reinforcement Learning: Now an Integral Component

We will begin our exploration within the framework of supervised learning.

A neuron

Attribution: Jennifer Walinga, CC BY-SA 4.0

In the study of artificial intelligence, it is logical to derive inspiration from the most well-understood form of intelligence: the human brain. The brain is composed of a complex network of neurons, which together form biological neural networks. Although each neuron exhibits relatively simple behavior, it is connected to thousands of other neurons, contributing to the intricate functionality of these networks.

A neuron can be conceptualized as a basic computational unit, and the complexity of brain function arises from the interconnectedness of these units.

Yann LeCun and other researchers have frequently noted that artificial neural networks used in machine learning resemble biological neural networks in much the same way that an airplane’s wings resemble those of a bird.

Interconnected neurons

Attribution: Molecular Mechanism of Synaptic Function from the Howard Hughes Medical Institute (HHMI). Published on YouTube on 2018-11-15.

From biology, we essentially adopt the concept of simple computational units that are interconnected to form a network, which collectively performs complex computations.

While research into understanding biological neural networks is undeniably important, the field of artificial neural networks has incorporated only a limited number of key concepts from this research.

Connectionist

Attribution: LeNail, (2019). NN-SVG: Publication-Ready Neural Network Architecture Schematics. Journal of Open Source Software, 4(33), 747, https://doi.org/10.21105/joss.00747 (GitHub)

Another characteristic of biological neural networks that we adopt is the organization of neurons into layers, particularly evident in the cerebral cortex.

The term “connectionists” comes from the idea that nodes in these models are interconnected. Instead of being explicitly programmed, these models learn their behavior through training. Deep learning is a connectionist approach.

Neural networks (NNs) consist of layers of interconnected nodes (neurons), each connection having an associated weight.

Neural networks process input data through these weighted connections, and learning occurs by adjusting the weights based on errors in the training data.

Hierarchy of concepts

Attribution: LeCun, Bengio, and Hinton (2015)

In the book “Deep Learning” (Goodfellow, Bengio, and Courville 2016), authors Goodfellow, Bengio, and Courville define deep learning as a subset of machine learning that enables computers to “understand the world in terms of a hierarchy of concepts.”

This hierarchical approach is one of deep learning’s most significant contributions. It reduces the need for manual feature engineering and redirects the focus toward the engineering of neural network architectures.

Basics

Computations with neurodes

A neurode computes \(y = f(z)\) with \(z = x_1 + x_2\), where \(x_1, x_2 \in \{0,1\}\) and \(f(z)\) is an indicator function: \[ f(z)= \begin{cases}0, & z<\theta \\ 1, & z \geq \theta\end{cases} \]

McCulloch and Pitts (1943) termed their artificial neurons “neurodes,” a blend of “neuron” and “node.”

In mathematics, \(f(z)\), as defined above, is known as an indicator function or a characteristic function.

These neurodes have one or more binary inputs, taking a value of 0 or 1, and one binary output.

They showed that such units could implement Boolean functions such as AND, OR, and NOT.

But also that networks of such units can compute any logical proposition.

Computations with neurodes

\[ y = f(x_1 + x_2)= \begin{cases}0, & x_1 + x_2 <\theta \\ 1, & x_1 + x_2 \geq \theta\end{cases} \]

  • With \(\theta = 2\), the neurode implements an AND logic gate.

  • With \(\theta = 1\), the neurode implements an OR logic gate.

More complex logic can be constructed by multiplying an input by \(-1\), which is interpreted as inhibitory. In particular, this allows building a logical NOT.

With \(\theta = 1\), \(x_1\) held at 1, and \(x_2\) multiplied by \(-1\), we obtain \(y = 0\) when \(x_2 = 1\) and \(y = 1\) when \(x_2 = 0\); that is, the neurode computes NOT \(x_2\).

\[ y = f(x_1 + (-1) x_2)= \begin{cases}0, & x_1 + (-1) x_2 <\theta \\ 1, & x_1 + (-1) x_2 \geq \theta\end{cases} \]

Neurons can be broadly categorized into two primary types: excitatory and inhibitory.
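The AND, OR, and NOT constructions above can be checked in a few lines of Python; the function names are mine, for illustration only:

```python
# Indicator (threshold) function of a McCulloch-Pitts neurode
def f(z, theta):
    return 1 if z >= theta else 0

def AND(x1, x2):
    return f(x1 + x2, theta=2)        # fires only when both inputs are 1

def OR(x1, x2):
    return f(x1 + x2, theta=1)        # fires when at least one input is 1

def NOT(x2):
    # x1 is held at 1; x2 enters with an inhibitory weight of -1
    return f(1 + (-1) * x2, theta=1)

# Truth tables
for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b))
print(NOT(0), NOT(1))  # 1 0
```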

Computations with neurodes

  • Digital computations can be broken down into a sequence of logical operations, enabling neurode networks to execute any computation.

  • McCulloch and Pitts (1943) did not focus on learning parameter \(\theta\).

  • They introduced a machine that computes any function but cannot learn.

From this work, we take the idea that networks of such units perform computations. Signal propagates from one end of the network to compute a result.

Threshold logic unit

Rosenblatt (1958)

In 1957, Frank Rosenblatt developed a conceptually distinct model of a neuron known as the threshold logic unit, which he published in 1958.

In this model, both the inputs and the output of the neuron are represented as real values. Notably, each input connection has an associated weight.

The left section of the neuron, denoted by the sigma symbol, represents the computation of a weighted sum of its inputs, expressed as \(\theta_1 x_1 + \theta_2 x_2 + \ldots + \theta_D x_D + b\).

This sum is then processed through a step function, shown in the right section of the neuron, to generate the output.

Here, \(x^T \theta\) represents the dot product of two vectors: \(x\) and \(\theta\). \(x^T\) denotes the transpose of the vector \(x\), converting it from a column vector to a row vector, allowing the dot product operation to be performed with the vector \(\theta\).

The dot product \(x^T \theta\) is then a scalar given by:

\[ x^T \theta = x^{(1)} \theta_1 + x^{(2)} \theta_2 + \cdots + x^{(D)} \theta_D \]

where \(x^{(j)}\) and \(\theta_j\) are the components of the vectors \(x\) and \(\theta\), respectively.
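A quick NumPy check of this notation, with arbitrary values:

```python
import numpy as np

x = np.array([0.5, 0.9, 1.0])       # components x^(1), x^(2), x^(3)
theta = np.array([0.2, -0.4, 0.1])  # weights theta_1, theta_2, theta_3

# x^T theta written out as a sum of componentwise products
explicit = x[0] * theta[0] + x[1] * theta[1] + x[2] * theta[2]

assert np.isclose(x @ theta, explicit)
```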

Simple Step Functions

\(\text{heaviside}(t)\) =

  • 1, if \(t \geq 0\)

  • 0, if \(t < 0\)

\(\text{sign}(t)\) =

  • 1, if \(t > 0\)

  • 0, if \(t = 0\)

  • -1, if \(t < 0\)

Common step functions include the Heaviside function (0 if the input is negative, 1 otherwise) and the sign function (-1 if the input is negative, 0 if the input is zero, 1 otherwise).
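Both step functions are easy to write with NumPy; note that this `heaviside` fixes the value 1 at \(t = 0\), matching the definition above:

```python
import numpy as np

def heaviside(t):
    # 1 if t >= 0, else 0
    return np.where(t >= 0, 1, 0)

def sign(t):
    # NumPy's sign already returns -1, 0, or 1
    return np.sign(t)

t = np.array([-2.0, 0.0, 3.5])
print(heaviside(t))  # [0 1 1]
print(sign(t))       # [-1.  0.  1.]
```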

Notation

Add an extra feature with a fixed value of 1 to the input. Associate it with weight \(b = \theta_{0}\), where \(b\) is the bias/intercept term.

Notation

\(\theta_{0} = b\) is the bias/intercept term.

The threshold logic unit is analogous to logistic regression, with the primary distinction being the substitution of the logistic (sigmoid) function with a step function. Similar to logistic regression, the perceptron is employed for classification tasks.

Perceptron

A perceptron consists of one or more threshold logic units arranged in a single layer, with each unit connected to all inputs. This configuration is referred to as fully connected or dense.

Since the threshold logic units in this single layer also generate the output, it is referred to as the output layer.

Perceptron

As this perceptron generates multiple outputs simultaneously, it performs multiple binary predictions, making it a multilabel classifier (it can also be used as a multiclass classifier).

Classification tasks can be further divided into multilabel and multiclass classification.

  1. Multiclass Classification:

    • In multiclass classification, each instance is assigned to one and only one class out of a set of three or more possible classes. The classes are mutually exclusive, meaning that an instance cannot belong to more than one class at the same time.

    • Example: Classifying an image as either a cat, dog, or bird. Each image can only belong to one of these categories.

  2. Multilabel Classification:

    • In multilabel classification, each instance can be associated with multiple classes simultaneously. The classes are not mutually exclusive, allowing for the possibility that an instance can belong to several classes at once.

    • Example: Tagging an image with multiple attributes such as “outdoor,” “sunset,” and “beach.” The image can simultaneously belong to all these labels.

The key difference lies in the relationship between classes: multiclass classification deals with a single label per instance, while multilabel classification handles multiple labels for each instance.
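The difference is visible in the shape of the targets; a small sketch with made-up labels:

```python
import numpy as np

# Multiclass: one integer label per instance (0 = cat, 1 = dog, 2 = bird)
y_multiclass = np.array([0, 2, 1, 0])

# Multilabel: one binary vector per instance
# (columns: outdoor, sunset, beach)
y_multilabel = np.array([
    [1, 1, 1],   # an outdoor sunset beach photo
    [1, 0, 0],   # outdoor only
    [0, 0, 1],   # beach only
    [0, 0, 0],   # none of the tags
])

assert y_multiclass.shape == (4,)    # exactly one class per instance
assert y_multilabel.shape == (4, 3)  # any subset of labels per instance
```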

Notation

As before, introduce an additional feature with a value of 1 to the input. Assign a bias \(b\) to each neuron. Each incoming connection implicitly has an associated weight.

Notation

  • \(X\) is the input data matrix where each row corresponds to an example and each column represents one of the \(D\) features.

  • \(W\) is the weight matrix, structured with one row per input (feature) and one column per neuron.

  • Bias terms can be represented separately; both approaches appear in the literature. Here, \(b\) is a vector with a length equal to the number of neurons.

With neural networks, the parameters of the model are often referred to as \(w\) (vector) or \(W\) (matrix), rather than \(\theta\).
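With this notation, the whole layer reduces to a single matrix product followed by the step function. A minimal sketch, where the shapes and random values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

N, D, K = 4, 3, 2                    # examples, features, neurons
X = rng.uniform(-1, 1, size=(N, D))  # one row per example
W = rng.uniform(-1, 1, size=(D, K))  # one column per neuron
b = rng.uniform(-1, 1, size=K)       # one bias term per neuron

# Heaviside step applied elementwise to the weighted sums
outputs = np.where(X @ W + b >= 0, 1, 0)

print(outputs.shape)  # (4, 2): one binary prediction per neuron, per example
```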

Discussion

  • The algorithm to train the perceptron closely resembles stochastic gradient descent.

    • In the interest of time and to avoid confusion, we will skip this algorithm and focus on the multilayer perceptron (MLP) and its training algorithm, backpropagation.

Historical Note and Justification

Minsky and Papert (1969) demonstrated the limitations of perceptrons, notably their inability to solve the exclusive OR (XOR) classification problem: \(\{([0,1],\mathrm{true}), ([1,0],\mathrm{true}), ([0,0],\mathrm{false}), ([1,1],\mathrm{false})\}\).

This limitation also applies to other linear classifiers, such as logistic regression.

Consequently, due to these limitations and a lack of practical applications, some researchers abandoned the perceptron.

Multilayer Perceptron

A multilayer perceptron (MLP) includes an input layer and one or more layers of threshold logic units. Layers that are neither input nor output are termed hidden layers.

XOR Classification problem

| \(x^{(1)}\) | \(x^{(2)}\) | \(y\) | \(o_1\) | \(o_2\) | \(o_3\) |
|---|---|---|---|---|---|
| 1 | 0 | 1 | 0 | 1 | 1 |
| 0 | 1 | 1 | 0 | 1 | 1 |
| 0 | 0 | 0 | 0 | 0 | 0 |
| 1 | 1 | 0 | 1 | 1 | 0 |

\(x^{(1)}\) and \(x^{(2)}\) are two attributes, \(y\) is the target, and \(o_1\), \(o_2\), and \(o_3 = h_\theta(x)\) are the outputs of the top-left, bottom-left, and right threshold units, respectively. Clearly \(h_\theta(x) = y, \forall x \in X\). The challenge during Rosenblatt’s time was the lack of algorithms to train multilayer networks.

I developed an Excel spreadsheet to verify that the proposed multilayer perceptron effectively solves the XOR classification problem.

The step function used in the above model is the Heaviside function.
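The same verification can be done in a few lines of Python. The thresholds below are one choice that reproduces the table: the top-left unit computes AND, the bottom-left unit computes OR, and the right unit fires when OR is on but AND is off:

```python
def step(t):
    # Heaviside step function
    return 1 if t >= 0 else 0

def xor_mlp(x1, x2):
    o1 = step(x1 + x2 - 1.5)   # AND: fires only for (1, 1)
    o2 = step(x1 + x2 - 0.5)   # OR: fires unless (0, 0)
    o3 = step(o2 - o1 - 0.5)   # OR and not AND, i.e. XOR
    return o3

# The four cases of the XOR truth table
for x1, x2, y in [(1, 0, 1), (0, 1, 1), (0, 0, 0), (1, 1, 0)]:
    assert xor_mlp(x1, x2) == y
```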

Feedforward Neural Network (FNN)

Information in this architecture flows unidirectionally—from left to right, moving from input to output. Consequently, it is termed a feedforward neural network.

The network consists of three layers: input, hidden, and output. The input layer contains two nodes, the hidden layer comprises three nodes, and the output layer has two nodes. Additional hidden layers and nodes per layer can be added, which will be discussed later.

It is often useful to include explicit input nodes that do not perform calculations, known as input units or input neurons. These nodes act as placeholders to introduce input features into the network, passing data directly to the next layer without transformation. In the network diagram, these are the light blue nodes on the left, labeled 1 and 2. Typically, the number of input units corresponds to the number of features.

For clarity, nodes are labeled to facilitate discussion of the weights between them, such as \(w_{1,5}\) between nodes 1 and 5. Similarly, the output of a node is denoted by \(o_k\), where \(k\) represents the node’s label. For example, for \(k=3\), the output would be \(o_3\).

Forward Pass (Computation)

\(o_3 = \sigma(w_{13} x^{(1)}+ w_{23} x^{(2)} + b_3)\)

\(o_4 = \sigma(w_{14} x^{(1)}+ w_{24} x^{(2)} + b_4)\)

\(o_5 = \sigma(w_{15} x^{(1)}+ w_{25} x^{(2)} + b_5)\)

\(o_6 = \sigma(w_{36} o_3 + w_{46} o_4 + w_{56} o_5 + b_6)\)

\(o_7 = \sigma(w_{37} o_3 + w_{47} o_4 + w_{57} o_5 + b_7)\)

First, it’s important to understand the information flow: this network computes two outputs from its inputs.

To simplify the figure, I have opted not to display the bias terms, though they remain crucial components. Specifically, \(b_3\) represents the bias term associated with node 3.

If bias terms were not significant, the training process would naturally reduce them to zero. Bias terms are essential as they enable the adjustment of the decision boundary, allowing the model to learn more complex patterns that weights alone cannot capture. By offering additional degrees of freedom, they also contribute to faster convergence during training.

Forward Pass (Computation)

import numpy as np

# Sigmoid function

def sigma(x):
    return 1 / (1 + np.exp(-x))

# Input (two attributes) vector, one example of our training set

x1, x2 = (0.5, 0.9)

# Initializing the weights of layers 2 and 3 to random values

w13, w14, w15, w23, w24, w25 = np.random.uniform(low=-1, high=1, size=6)
w36, w46, w56, w37, w47, w57 = np.random.uniform(low=-1, high=1, size=6)

# Initializing all 5 bias terms to random values

b3, b4, b5, b6, b7 = np.random.uniform(low=-1, high=1, size=5)

o3 = sigma(w13 * x1 + w23 * x2 + b3)
o4 = sigma(w14 * x1 + w24 * x2 + b4)
o5 = sigma(w15 * x1 + w25 * x2 + b5)
o6 = sigma(w36 * o3 + w46 * o4 + w56 * o5 + b6)
o7 = sigma(w37 * o3 + w47 * o4 + w57 * o5 + b7)

(o6, o7)
(0.3340444496589228, 0.7758645468280726)

The example above illustrates the computation process with specific values. Before training a neural network, it is standard practice to initialize the weights and biases with random values. Gradient descent is then employed to iteratively adjust these parameters, aiming to minimize the loss function.
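The same forward pass can be written more compactly with weight matrices, which is how deep learning frameworks implement it. This sketch uses its own random initialization, so the exact outputs differ from those shown above:

```python
import numpy as np

def sigma(z):
    return 1 / (1 + np.exp(-z))

rng = np.random.default_rng(42)

x = np.array([0.5, 0.9])              # the same input example
W1 = rng.uniform(-1, 1, size=(2, 3))  # weights, input -> hidden
b1 = rng.uniform(-1, 1, size=3)       # hidden biases (b3, b4, b5)
W2 = rng.uniform(-1, 1, size=(3, 2))  # weights, hidden -> output
b2 = rng.uniform(-1, 1, size=2)       # output biases (b6, b7)

h = sigma(x @ W1 + b1)  # hidden activations (o3, o4, o5)
o = sigma(h @ W2 + b2)  # network outputs (o6, o7)

print(o)  # two values in (0, 1)
```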

Forward Pass (Computation)

The information flow remains consistent even in more complex networks. Networks with many layers are called deep neural networks (DNN).

Produced using NN-SVG, LeNail (2019).

Forward Pass (Computation)

Same network with bias terms shown.

Produced using NN-SVG, LeNail (2019).

Activation Function

  • As will be discussed later, the training algorithm, known as backpropagation, employs gradient descent, necessitating the calculation of the partial derivatives of the loss function.

  • The step function in the multilayer perceptron had to be replaced, as it consists only of flat surfaces. Gradient descent cannot progress on flat surfaces due to their zero derivative.

Activation Function

  • Nonlinear activation functions are paramount because, without them, multiple layers in the network would only compute a linear function of the inputs.

  • According to the Universal Approximation Theorem, sufficiently large deep networks with nonlinear activation functions can approximate any continuous function. See Universal Approximation Theorem.

Sigmoid

\[ \sigma(t) = \frac{1}{1 + e^{-t}} \]

Hyperbolic Tangent Function

\[ \tanh(t) = 2 \sigma(2t) - 1 \]

This S-shaped curve, similar to the sigmoid function, produces output values ranging from -1 to 1. According to Géron (2022), this range helps each layer’s output to be approximately centered around 0 at the start of training, thereby accelerating convergence.

Rectified linear unit function (ReLU)

\[ \mathrm{ReLU}(t) = \max(0, t) \]

Although the ReLU function is not differentiable at \(t=0\) and has a derivative of 0 for \(t<0\), it performs quite well in practice and is computationally efficient. Consequently, it has become the default activation function.
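The three activation functions side by side, including a check of the tanh identity stated above:

```python
import numpy as np

def sigmoid(t):
    return 1 / (1 + np.exp(-t))

def relu(t):
    return np.maximum(0, t)

t = np.linspace(-3, 3, 7)

# tanh(t) = 2 * sigmoid(2t) - 1
assert np.allclose(np.tanh(t), 2 * sigmoid(2 * t) - 1)

print(sigmoid(0.0))                 # 0.5
print(relu(np.array([-1.0, 2.0])))  # [0. 2.]
```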

Common Activation Functions

Universal Approximation

Definition

The universal approximation theorem (UAT) states that a feedforward neural network with a single hidden layer containing a finite number of neurons can approximate any continuous function on a compact subset of \(\mathbb{R}^n\), given appropriate weights and activation functions.

Cybenko (1989); Hornik, Stinchcombe, and White (1989)

In mathematical terms, a subset of \(\mathbb{R}^n\) is considered compact if it is both closed and bounded.

  • Closed: A set is closed if it contains all its boundary points. In other words, it includes its limit points or accumulation points.

  • Bounded: A set is bounded if there exists a real number \(M\) such that the distance between any two points in the set is less than \(M\).

In the context of the universal approximation theorem, compactness ensures that the function being approximated is defined on a finite and well-behaved region, which is crucial for the theoretical guarantees provided by the theorem.

Demonstration with code

import numpy as np

# Defining the function to be approximated

def f(x):
  return 2 * x**3 + 4 * x**2 - 5 * x + 1

# Generating a dataset, x in [-4,2), f(x) as above

X = 6 * np.random.rand(1000, 1) - 4

y = f(X.flatten())

Increasing the number of neurons

from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.1, random_state=42)

models = []

sizes = [1, 2, 5, 10, 100]

for i, n in enumerate(sizes):
  models.append(MLPRegressor(hidden_layer_sizes=[n], max_iter=5000, random_state=42))
  models[i].fit(X_train, y_train)

MLPRegressor is a multi-layer perceptron regressor from sklearn. Its default activation function is relu.

Increasing the number of neurons

In the example above, I retained only 10% of the data as the validation set because the function being approximated is straightforward and noise-free. This decision was made to ensure that the true curve does not overshadow the other results.

Increasing the number of neurons

As expected, increasing neuron count reduces loss.

Universal Approximation

This video effectively conveys the underlying intuition of the universal approximation theorem. (18m 53s)

The video effectively elucidates key concepts (terminology) in neural networks, including nodes, layers, weights, and activation functions. It demonstrates the process of summing activation outputs from a preceding layer, akin to the aggregation of curves. Additionally, the video illustrates how scaling an output by a weight not only alters the amplitude of a curve but also inverts its orientation when the weight is negative. Moreover, it clearly depicts the function of bias terms in vertically shifting the curve, contingent on the sign of the bias.

Let’s code

Frameworks

PyTorch and TensorFlow are the leading platforms for deep learning.

  • PyTorch has gained considerable traction in the research community. Initially developed by Meta AI, it is now part of the Linux Foundation.

  • TensorFlow, created by Google, is widely adopted in industry for deploying models in production environments.

Keras

Keras is a high-level API designed to build, train, evaluate, and execute models across various backends, including PyTorch, TensorFlow, and JAX, Google’s high-performance platform.

Keras is powerful enough for most projects.

As highlighted in previous Quotes of the Day, François Chollet, a Google engineer, is the originator and one of the primary developers of the Keras project.

Fashion-MNIST dataset

Fashion-MNIST is a dataset of Zalando’s article images, consisting of a training set of 60,000 examples and a test set of 10,000 examples. Each example is a 28x28 grayscale image, associated with a label from 10 classes.

Attribution: Géron (2022), 10_neural_nets_with_keras.ipynb

Loading

import tensorflow as tf

fashion_mnist = tf.keras.datasets.fashion_mnist.load_data()

(X_train_full, y_train_full), (X_test, y_test) = fashion_mnist

X_train, y_train = X_train_full[:-5000], y_train_full[:-5000]
X_valid, y_valid = X_train_full[-5000:], y_train_full[-5000:]

Setting aside 5000 examples as a validation set.

Exploration

X_train.shape
(55000, 28, 28)


X_train.dtype
dtype('uint8')


Transforming the pixel intensities from integers in the range 0 to 255 to floats in the range 0 to 1.

X_train, X_valid, X_test = X_train / 255., X_valid / 255., X_test / 255.

What are these images anyway!

import matplotlib.pyplot as plt

plt.figure(figsize=(2, 2))
plt.imshow(X_train[0], cmap="binary")
plt.axis('off')
plt.show()


y_train
array([9, 0, 0, ..., 9, 0, 2], dtype=uint8)


Since the labels are integers from 0 to 9, class names will come in handy.

class_names = ["T-shirt/top", "Trouser", "Pullover", "Dress", "Coat",
               "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot"]

First 40 images

n_rows = 4
n_cols = 10
plt.figure(figsize=(n_cols * 1.2, n_rows * 1.2))
for row in range(n_rows):
    for col in range(n_cols):
        index = n_cols * row + col
        plt.subplot(n_rows, n_cols, index + 1)
        plt.imshow(X_train[index], cmap="binary", interpolation="nearest")
        plt.axis('off')
        plt.title(class_names[y_train[index]])
plt.subplots_adjust(wspace=0.2, hspace=0.5)
plt.show()

Creating a model

tf.random.set_seed(42)

model = tf.keras.Sequential()

model.add(tf.keras.layers.InputLayer(shape=[28, 28]))
model.add(tf.keras.layers.Flatten())
model.add(tf.keras.layers.Dense(300, activation="relu"))
model.add(tf.keras.layers.Dense(100, activation="relu"))
model.add(tf.keras.layers.Dense(10, activation="softmax"))

model.summary()

Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                     Output Shape                  Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ flatten (Flatten)               │ (None, 784)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 300)            │       235,500 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 100)            │        30,100 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_2 (Dense)                 │ (None, 10)             │         1,010 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 266,610 (1.02 MB)
 Trainable params: 266,610 (1.02 MB)
 Non-trainable params: 0 (0.00 B)

As observed, dense has \(235,500\) parameters, while \(784 \times 300 = 235,200\).

Could you explain the origin of the additional parameters?

Similarly, dense_1 has \(30,100\) parameters, while \(300 \times 100 = 30,000\).

Can you explain why?
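As a hint, both counts can be reproduced once one extra parameter per neuron is accounted for; a quick check:

```python
# Parameters of a dense layer: one weight per input per neuron,
# plus one bias term per neuron
def dense_params(n_inputs, n_neurons):
    return n_inputs * n_neurons + n_neurons

assert dense_params(784, 300) == 235_500
assert dense_params(300, 100) == 30_100
assert dense_params(100, 10) == 1_010
```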

Creating a model (alternative)

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=[28, 28]),
    tf.keras.layers.Dense(300, activation="relu"),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax")
])

model.summary()

Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                     Output Shape                  Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ flatten (Flatten)               │ (None, 784)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 300)            │       235,500 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 100)            │        30,100 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_2 (Dense)                 │ (None, 10)             │         1,010 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 266,610 (1.02 MB)
 Trainable params: 266,610 (1.02 MB)
 Non-trainable params: 0 (0.00 B)

Compiling the model

model.compile(loss="sparse_categorical_crossentropy",
              optimizer="sgd",
              metrics=["accuracy"])

sparse_categorical_crossentropy is the appropriate function for a multiclass classification problem (more later).

The compile method sets the loss function, along with other parameters; Keras then prepares the model for training.
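To see what this loss computes, here is a NumPy sketch of sparse categorical cross-entropy: the mean negative log-probability assigned to the true class, with labels given as integers rather than one-hot vectors:

```python
import numpy as np

def sparse_categorical_crossentropy(y_true, y_proba):
    # y_true: integer class labels, shape (N,)
    # y_proba: predicted class probabilities, shape (N, K)
    n = len(y_true)
    return -np.mean(np.log(y_proba[np.arange(n), y_true]))

y_true = np.array([2, 0])
y_proba = np.array([[0.1, 0.2, 0.7],
                    [0.8, 0.1, 0.1]])

loss = sparse_categorical_crossentropy(y_true, y_proba)
print(round(loss, 4))  # 0.2899
```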

Training the model

history = model.fit(X_train, y_train, epochs=30,
                    validation_data=(X_valid, y_valid))
Epoch 1/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 2s 787us/step - accuracy: 0.6827 - loss: 1.0008 - val_accuracy: 0.8284 - val_loss: 0.5093
Epoch 2/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 733us/step - accuracy: 0.8245 - loss: 0.5096 - val_accuracy: 0.8424 - val_loss: 0.4561
Epoch 3/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ ...
accuracy: 0.8404 - loss: 0.4611 999/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 717us/step - accuracy: 0.8404 - loss: 0.46111072/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 715us/step - accuracy: 0.8406 - loss: 0.46051146/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 713us/step - accuracy: 0.8408 - loss: 0.45991147/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 713us/step - accuracy: 0.8408 - loss: 0.45991212/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 716us/step - accuracy: 0.8410 - loss: 0.45941213/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 717us/step - accuracy: 0.8410 - loss: 0.45941276/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 721us/step - accuracy: 0.8411 - loss: 0.45901277/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 721us/step - accuracy: 0.8411 - loss: 0.45901278/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 721us/step - accuracy: 0.8411 - loss: 0.45901340/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 726us/step - accuracy: 0.8412 - loss: 0.45851384/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 739us/step - accuracy: 0.8413 - loss: 0.45821385/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 739us/step - accuracy: 0.8413 - loss: 0.45821450/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 741us/step - accuracy: 0.8415 - loss: 0.45771451/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 741us/step - accuracy: 0.8415 - loss: 0.45771516/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 743us/step - accuracy: 0.8417 - loss: 0.45721517/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 743us/step - accuracy: 0.8417 - loss: 0.45721583/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - accuracy: 0.8418 - loss: 0.45671584/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - accuracy: 0.8418 - loss: 0.45671651/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - accuracy: 0.8420 - loss: 0.45621652/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - accuracy: 0.8420 - loss: 0.45621719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 786us/step - accuracy: 0.8421 - loss: 0.4558 - val_accuracy: 0.8474 - val_loss: 0.4326
Epoch 4/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 21s 13ms/step - accuracy: 0.8750 - loss: 0.4613   2/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 756us/step - accuracy: 0.8750 - loss: 0.4297  64/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 810us/step - accuracy: 0.8673 - loss: 0.4081  65/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 810us/step - accuracy: 0.8671 - loss: 0.4082 132/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 782us/step - accuracy: 0.8600 - loss: 0.4209 133/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 784us/step - accuracy: 0.8599 - loss: 0.4212 197/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 783us/step - accuracy: 0.8566 - loss: 0.4273 198/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 785us/step - accuracy: 0.8566 - loss: 0.4273 264/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 779us/step - accuracy: 0.8551 - loss: 0.4289 265/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 780us/step - accuracy: 0.8551 - loss: 0.4289 333/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 771us/step - accuracy: 0.8538 - loss: 0.4299 334/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 772us/step - accuracy: 0.8538 - loss: 0.4300 398/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 774us/step - accuracy: 0.8529 - loss: 0.4308 399/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 775us/step - accuracy: 0.8529 - loss: 0.4308 462/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 779us/step - accuracy: 0.8523 - loss: 0.4312 463/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 779us/step - accuracy: 0.8523 - loss: 0.4312 533/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 772us/step - accuracy: 0.8518 - loss: 0.4315 597/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 773us/step - accuracy: 0.8515 - loss: 0.4315 598/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 774us/step - accuracy: 0.8515 - loss: 0.4315 658/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 779us/step - accuracy: 0.8515 - loss: 0.4313 659/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 780us/step - accuracy: 0.8515 - loss: 0.4313 726/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 777us/step - accuracy: 0.8515 - loss: 0.4311 727/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 777us/step - accuracy: 0.8515 - loss: 0.4311 796/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 774us/step - accuracy: 0.8515 - loss: 0.4309 860/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 774us/step - accuracy: 0.8516 - loss: 0.4307 861/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 775us/step - 
accuracy: 0.8516 - loss: 0.4307 924/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 776us/step - accuracy: 0.8516 - loss: 0.4305 991/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 775us/step - accuracy: 0.8517 - loss: 0.4301 992/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 775us/step - accuracy: 0.8517 - loss: 0.43011060/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 773us/step - accuracy: 0.8518 - loss: 0.42961061/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 773us/step - accuracy: 0.8518 - loss: 0.42961130/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 770us/step - accuracy: 0.8518 - loss: 0.42921131/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 770us/step - accuracy: 0.8518 - loss: 0.42921199/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 769us/step - accuracy: 0.8519 - loss: 0.42871200/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 769us/step - accuracy: 0.8519 - loss: 0.42871270/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 766us/step - accuracy: 0.8519 - loss: 0.42841271/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 766us/step - accuracy: 0.8519 - loss: 0.42841340/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 764us/step - accuracy: 0.8520 - loss: 0.42801341/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 764us/step - accuracy: 0.8520 - loss: 0.42801410/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 763us/step - accuracy: 0.8521 - loss: 0.42761411/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 763us/step - accuracy: 0.8521 - loss: 0.42761479/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 762us/step - accuracy: 0.8522 - loss: 0.42721480/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 762us/step - accuracy: 0.8522 - loss: 0.42721552/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 759us/step - accuracy: 0.8523 - loss: 0.42681553/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 759us/step - accuracy: 0.8523 - loss: 0.42671624/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 757us/step - accuracy: 0.8523 - loss: 0.42631625/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 757us/step - accuracy: 0.8523 - loss: 0.42631696/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 754us/step - accuracy: 0.8524 - loss: 0.42601719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 795us/step - accuracy: 0.8524 - loss: 0.4259 - val_accuracy: 0.8472 - val_loss: 0.4167
Epoch 5/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 20s 12ms/step - accuracy: 0.8750 - loss: 0.4396  72/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 707us/step - accuracy: 0.8683 - loss: 0.3843  73/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 710us/step - accuracy: 0.8682 - loss: 0.3844 145/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 703us/step - accuracy: 0.8626 - loss: 0.3989 220/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 693us/step - accuracy: 0.8598 - loss: 0.4045 221/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 694us/step - accuracy: 0.8598 - loss: 0.4046 295/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 692us/step - accuracy: 0.8584 - loss: 0.4064 296/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 693us/step - accuracy: 0.8584 - loss: 0.4064 366/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 698us/step - accuracy: 0.8577 - loss: 0.4073 367/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 698us/step - accuracy: 0.8577 - loss: 0.4073 440/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.8571 - loss: 0.4082 441/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 697us/step - accuracy: 0.8571 - loss: 0.4082 510/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 701us/step - accuracy: 0.8568 - loss: 0.4086 511/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 701us/step - accuracy: 0.8568 - loss: 0.4086 581/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 704us/step - accuracy: 0.8566 - loss: 0.4087 654/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 702us/step - accuracy: 0.8567 - loss: 0.4086 655/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 702us/step - accuracy: 0.8567 - loss: 0.4086 729/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 700us/step - accuracy: 0.8568 - loss: 0.4084 730/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 701us/step - accuracy: 0.8568 - loss: 0.4084 803/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 699us/step - accuracy: 0.8570 - loss: 0.4083 876/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 698us/step - accuracy: 0.8571 - loss: 0.4082 877/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 699us/step - accuracy: 0.8571 - loss: 0.4082 948/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 700us/step - accuracy: 0.8572 - loss: 0.4079 949/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 700us/step - accuracy: 0.8572 - loss: 0.40791021/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 700us/step - accuracy: 0.8574 - loss: 0.40761022/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 700us/step - 
accuracy: 0.8574 - loss: 0.40761090/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 702us/step - accuracy: 0.8575 - loss: 0.40721091/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 702us/step - accuracy: 0.8575 - loss: 0.40721164/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 702us/step - accuracy: 0.8576 - loss: 0.40681165/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 702us/step - accuracy: 0.8576 - loss: 0.40681239/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 701us/step - accuracy: 0.8577 - loss: 0.40641240/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 701us/step - accuracy: 0.8577 - loss: 0.40641313/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 700us/step - accuracy: 0.8578 - loss: 0.40611385/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 700us/step - accuracy: 0.8579 - loss: 0.40571459/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 699us/step - accuracy: 0.8580 - loss: 0.40531531/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 699us/step - accuracy: 0.8581 - loss: 0.40491604/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 699us/step - accuracy: 0.8582 - loss: 0.40461605/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 699us/step - accuracy: 0.8582 - loss: 0.40461677/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 699us/step - accuracy: 0.8583 - loss: 0.40421719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 741us/step - accuracy: 0.8583 - loss: 0.4040 - val_accuracy: 0.8508 - val_loss: 0.4064
Epoch 6/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 19s 11ms/step - accuracy: 0.8438 - loss: 0.4217  71/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 717us/step - accuracy: 0.8675 - loss: 0.3656 144/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 705us/step - accuracy: 0.8644 - loss: 0.3800 217/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 698us/step - accuracy: 0.8627 - loss: 0.3859 218/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 699us/step - accuracy: 0.8627 - loss: 0.3860 291/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 697us/step - accuracy: 0.8620 - loss: 0.3880 292/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 697us/step - accuracy: 0.8620 - loss: 0.3880 366/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8617 - loss: 0.3891 367/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8617 - loss: 0.3891 439/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.8615 - loss: 0.3900 440/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.8615 - loss: 0.3900 514/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8614 - loss: 0.3904 515/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8614 - loss: 0.3904 586/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.8614 - loss: 0.3907 656/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 698us/step - accuracy: 0.8616 - loss: 0.3905 657/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 698us/step - accuracy: 0.8616 - loss: 0.3905 729/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 699us/step - accuracy: 0.8618 - loss: 0.3904 804/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.8620 - loss: 0.3903 805/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 697us/step - accuracy: 0.8620 - loss: 0.3903 878/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.8621 - loss: 0.3903 954/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 693us/step - accuracy: 0.8623 - loss: 0.3900 955/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8623 - loss: 0.39001027/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8625 - loss: 0.38971028/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8625 - loss: 0.38971104/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 692us/step - accuracy: 0.8627 - loss: 0.38931105/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 692us/step - 
accuracy: 0.8627 - loss: 0.38931178/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 692us/step - accuracy: 0.8629 - loss: 0.38901179/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 692us/step - accuracy: 0.8629 - loss: 0.38901254/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 691us/step - accuracy: 0.8630 - loss: 0.38871255/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 691us/step - accuracy: 0.8630 - loss: 0.38871329/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 690us/step - accuracy: 0.8631 - loss: 0.38841330/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 690us/step - accuracy: 0.8631 - loss: 0.38831403/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 690us/step - accuracy: 0.8633 - loss: 0.38801404/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 690us/step - accuracy: 0.8633 - loss: 0.38801478/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 690us/step - accuracy: 0.8634 - loss: 0.38761552/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 690us/step - accuracy: 0.8635 - loss: 0.38731553/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 690us/step - accuracy: 0.8635 - loss: 0.38731626/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 690us/step - accuracy: 0.8637 - loss: 0.38701701/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 689us/step - accuracy: 0.8637 - loss: 0.38671702/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 689us/step - accuracy: 0.8637 - loss: 0.38671719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 728us/step - accuracy: 0.8638 - loss: 0.3866 - val_accuracy: 0.8526 - val_loss: 0.3974
Epoch 7/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 19s 11ms/step - accuracy: 0.8438 - loss: 0.4127  70/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 725us/step - accuracy: 0.8717 - loss: 0.3496 138/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 730us/step - accuracy: 0.8687 - loss: 0.3631 208/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 727us/step - accuracy: 0.8669 - loss: 0.3698 276/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 730us/step - accuracy: 0.8665 - loss: 0.3720 348/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.8662 - loss: 0.3733 349/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 725us/step - accuracy: 0.8662 - loss: 0.3733 420/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 722us/step - accuracy: 0.8660 - loss: 0.3744 492/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 719us/step - accuracy: 0.8660 - loss: 0.3749 493/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 719us/step - accuracy: 0.8660 - loss: 0.3749 565/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 716us/step - accuracy: 0.8660 - loss: 0.3753 566/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 717us/step - accuracy: 0.8660 - loss: 0.3753 638/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 714us/step - accuracy: 0.8661 - loss: 0.3753 639/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 715us/step - accuracy: 0.8662 - loss: 0.3753 709/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 715us/step - accuracy: 0.8664 - loss: 0.3752 710/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 715us/step - accuracy: 0.8664 - loss: 0.3752 780/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 715us/step - accuracy: 0.8666 - loss: 0.3751 851/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 714us/step - accuracy: 0.8668 - loss: 0.3750 922/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 714us/step - accuracy: 0.8670 - loss: 0.3750 923/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 714us/step - accuracy: 0.8670 - loss: 0.3750 996/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 712us/step - accuracy: 0.8672 - loss: 0.37471069/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 710us/step - accuracy: 0.8673 - loss: 0.37441137/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 712us/step - accuracy: 0.8675 - loss: 0.37411138/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 712us/step - accuracy: 0.8675 - loss: 0.37411209/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 712us/step - accuracy: 0.8677 - loss: 0.37381210/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 712us/step - 
accuracy: 0.8677 - loss: 0.37381281/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 712us/step - accuracy: 0.8678 - loss: 0.37361353/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 712us/step - accuracy: 0.8680 - loss: 0.37331354/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 712us/step - accuracy: 0.8680 - loss: 0.37331427/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 711us/step - accuracy: 0.8681 - loss: 0.37291498/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 711us/step - accuracy: 0.8683 - loss: 0.37261568/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 711us/step - accuracy: 0.8684 - loss: 0.37241636/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 712us/step - accuracy: 0.8685 - loss: 0.37211637/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 713us/step - accuracy: 0.8685 - loss: 0.37211705/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 714us/step - accuracy: 0.8685 - loss: 0.37191706/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 714us/step - accuracy: 0.8685 - loss: 0.37191719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 754us/step - accuracy: 0.8685 - loss: 0.3718 - val_accuracy: 0.8554 - val_loss: 0.3886
Epoch 8/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 19s 11ms/step - accuracy: 0.8438 - loss: 0.3971  73/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 705us/step - accuracy: 0.8740 - loss: 0.3356  74/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 707us/step - accuracy: 0.8740 - loss: 0.3357 148/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 691us/step - accuracy: 0.8713 - loss: 0.3504 149/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 692us/step - accuracy: 0.8713 - loss: 0.3506 223/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 688us/step - accuracy: 0.8702 - loss: 0.3565 224/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 689us/step - accuracy: 0.8702 - loss: 0.3566 297/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 689us/step - accuracy: 0.8699 - loss: 0.3587 298/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 689us/step - accuracy: 0.8699 - loss: 0.3588 371/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 690us/step - accuracy: 0.8699 - loss: 0.3599 372/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 690us/step - accuracy: 0.8699 - loss: 0.3599 444/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 691us/step - accuracy: 0.8698 - loss: 0.3610 445/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 692us/step - accuracy: 0.8698 - loss: 0.3610 518/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 691us/step - accuracy: 0.8698 - loss: 0.3616 519/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 691us/step - accuracy: 0.8698 - loss: 0.3617 589/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8698 - loss: 0.3620 665/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 691us/step - accuracy: 0.8700 - loss: 0.3619 666/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 691us/step - accuracy: 0.8701 - loss: 0.3619 739/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 691us/step - accuracy: 0.8703 - loss: 0.3619 809/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8705 - loss: 0.3618 810/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8705 - loss: 0.3618 883/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8707 - loss: 0.3619 884/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8707 - loss: 0.3619 956/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8709 - loss: 0.3618 957/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8709 - loss: 0.36181031/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - 
accuracy: 0.8711 - loss: 0.36151104/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8713 - loss: 0.36121105/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8713 - loss: 0.36121177/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8715 - loss: 0.36091178/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.8715 - loss: 0.36091248/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.8716 - loss: 0.36071249/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.8716 - loss: 0.36071321/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.8718 - loss: 0.36051322/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.8718 - loss: 0.36051392/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 697us/step - accuracy: 0.8719 - loss: 0.36021393/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 698us/step - accuracy: 0.8719 - loss: 0.36021465/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 697us/step - accuracy: 0.8721 - loss: 0.35991466/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 698us/step - accuracy: 0.8721 - loss: 0.35991539/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 697us/step - accuracy: 0.8722 - loss: 0.35971540/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 697us/step - accuracy: 0.8722 - loss: 0.35971614/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.8723 - loss: 0.35941615/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.8723 - loss: 0.35941690/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 695us/step - accuracy: 0.8724 - loss: 0.35921691/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 695us/step - accuracy: 0.8724 - loss: 0.35921719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 735us/step - accuracy: 0.8724 - loss: 0.3591 - val_accuracy: 0.8590 - val_loss: 0.3806
Epoch 9/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 19s 11ms/step - accuracy: 0.8750 - loss: 0.3837  75/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 684us/step - accuracy: 0.8821 - loss: 0.3236  76/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 687us/step - accuracy: 0.8821 - loss: 0.3237 150/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 685us/step - accuracy: 0.8774 - loss: 0.3386 151/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 686us/step - accuracy: 0.8774 - loss: 0.3387 226/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 681us/step - accuracy: 0.8754 - loss: 0.3447 227/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 682us/step - accuracy: 0.8754 - loss: 0.3447 301/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 681us/step - accuracy: 0.8748 - loss: 0.3470 302/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 682us/step - accuracy: 0.8748 - loss: 0.3470 374/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 686us/step - accuracy: 0.8745 - loss: 0.3483 375/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 687us/step - accuracy: 0.8745 - loss: 0.3483 449/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 685us/step - accuracy: 0.8742 - loss: 0.3495 450/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 686us/step - accuracy: 0.8742 - loss: 0.3495 525/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 684us/step - accuracy: 0.8741 - loss: 0.3502 526/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 684us/step - accuracy: 0.8741 - loss: 0.3502 600/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 684us/step - accuracy: 0.8741 - loss: 0.3504 675/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 683us/step - accuracy: 0.8743 - loss: 0.3504 746/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 686us/step - accuracy: 0.8745 - loss: 0.3504 747/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 686us/step - accuracy: 0.8745 - loss: 0.3504 819/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 688us/step - accuracy: 0.8746 - loss: 0.3504 820/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 688us/step - accuracy: 0.8746 - loss: 0.3504 821/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 688us/step - accuracy: 0.8746 - loss: 0.3504 892/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 689us/step - accuracy: 0.8747 - loss: 0.3505 893/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 689us/step - accuracy: 0.8747 - loss: 0.3505 964/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 691us/step - accuracy: 0.8749 - loss: 0.3504 965/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 691us/step - 
accuracy: 0.8749 - loss: 0.35041039/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 691us/step - accuracy: 0.8750 - loss: 0.35021040/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 691us/step - accuracy: 0.8750 - loss: 0.35021115/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 689us/step - accuracy: 0.8752 - loss: 0.34991116/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 689us/step - accuracy: 0.8752 - loss: 0.34991190/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 689us/step - accuracy: 0.8754 - loss: 0.34961191/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 689us/step - accuracy: 0.8754 - loss: 0.34961266/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 688us/step - accuracy: 0.8755 - loss: 0.34951267/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 688us/step - accuracy: 0.8755 - loss: 0.34951342/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 687us/step - accuracy: 0.8757 - loss: 0.34921343/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 687us/step - accuracy: 0.8757 - loss: 0.34921417/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 687us/step - accuracy: 0.8758 - loss: 0.34891418/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 687us/step - accuracy: 0.8758 - loss: 0.34891491/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 687us/step - accuracy: 0.8759 - loss: 0.34871492/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 687us/step - accuracy: 0.8759 - loss: 0.34871566/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 687us/step - accuracy: 0.8761 - loss: 0.34841567/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 687us/step - accuracy: 0.8761 - loss: 0.34841641/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 687us/step - accuracy: 0.8761 - loss: 0.34821642/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 687us/step - accuracy: 0.8761 - loss: 0.34821716/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 686us/step - accuracy: 0.8762 - loss: 0.34801717/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 686us/step - accuracy: 0.8762 - loss: 0.34801719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 729us/step - accuracy: 0.8762 - loss: 0.3480 - val_accuracy: 0.8604 - val_loss: 0.3748
Epoch 10/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 19s 11ms/step - accuracy: 0.8750 - loss: 0.3707  74/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 689us/step - accuracy: 0.8840 - loss: 0.3129  75/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 690us/step - accuracy: 0.8839 - loss: 0.3131 151/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 675us/step - accuracy: 0.8794 - loss: 0.3282 152/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 676us/step - accuracy: 0.8793 - loss: 0.3283 225/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 682us/step - accuracy: 0.8776 - loss: 0.3342 298/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 683us/step - accuracy: 0.8772 - loss: 0.3365 369/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 687us/step - accuracy: 0.8770 - loss: 0.3378 370/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 688us/step - accuracy: 0.8770 - loss: 0.3378 441/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 691us/step - accuracy: 0.8768 - loss: 0.3390 442/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 692us/step - accuracy: 0.8768 - loss: 0.3391 508/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 701us/step - accuracy: 0.8767 - loss: 0.3397 509/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 701us/step - accuracy: 0.8767 - loss: 0.3397 581/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 701us/step - accuracy: 0.8767 - loss: 0.3401 582/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 701us/step - accuracy: 0.8767 - loss: 0.3401 653/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 702us/step - accuracy: 0.8768 - loss: 0.3401 724/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 703us/step - accuracy: 0.8770 - loss: 0.3401 725/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 703us/step - accuracy: 0.8770 - loss: 0.3401 795/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 704us/step - accuracy: 0.8772 - loss: 0.3402 796/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 704us/step - accuracy: 0.8772 - loss: 0.3402 867/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 705us/step - accuracy: 0.8773 - loss: 0.3403 868/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 705us/step - accuracy: 0.8773 - loss: 0.3403 940/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 705us/step - accuracy: 0.8775 - loss: 0.3402 941/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 705us/step - accuracy: 0.8775 - loss: 0.34021011/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 706us/step - accuracy: 0.8776 - loss: 0.34011012/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 706us/step - 
accuracy: 0.8777 - loss: 0.34011083/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 706us/step - accuracy: 0.8778 - loss: 0.33991084/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 706us/step - accuracy: 0.8778 - loss: 0.33991157/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 705us/step - accuracy: 0.8780 - loss: 0.33961228/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 705us/step - accuracy: 0.8782 - loss: 0.33941229/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 705us/step - accuracy: 0.8782 - loss: 0.33941302/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 704us/step - accuracy: 0.8783 - loss: 0.33931303/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 704us/step - accuracy: 0.8783 - loss: 0.33931373/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 705us/step - accuracy: 0.8784 - loss: 0.33901374/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 705us/step - accuracy: 0.8784 - loss: 0.33901445/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 706us/step - accuracy: 0.8786 - loss: 0.33881446/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 706us/step - accuracy: 0.8786 - loss: 0.33881518/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 705us/step - accuracy: 0.8787 - loss: 0.33851519/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 706us/step - accuracy: 0.8787 - loss: 0.33851591/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 705us/step - accuracy: 0.8788 - loss: 0.33831660/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 706us/step - accuracy: 0.8789 - loss: 0.33811661/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 706us/step - accuracy: 0.8789 - loss: 0.33811719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 752us/step - accuracy: 0.8790 - loss: 0.3380 - val_accuracy: 0.8616 - val_loss: 0.3692
Epoch 11/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 21s 13ms/step - accuracy: 0.8750 - loss: 0.3554   2/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 955us/step - accuracy: 0.8828 - loss: 0.3270  67/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 774us/step - accuracy: 0.8898 - loss: 0.3017  68/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 777us/step - accuracy: 0.8898 - loss: 0.3019 138/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 746us/step - accuracy: 0.8858 - loss: 0.3157 139/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 747us/step - accuracy: 0.8858 - loss: 0.3158 207/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 747us/step - accuracy: 0.8834 - loss: 0.3231 273/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 749us/step - accuracy: 0.8828 - loss: 0.3258 274/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 750us/step - accuracy: 0.8828 - loss: 0.3258 342/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 747us/step - accuracy: 0.8824 - loss: 0.3274 343/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 748us/step - accuracy: 0.8824 - loss: 0.3274 411/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 746us/step - accuracy: 0.8821 - loss: 0.3289 412/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 747us/step - accuracy: 0.8821 - loss: 0.3289 481/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 745us/step - accuracy: 0.8819 - loss: 0.3298 482/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 745us/step - accuracy: 0.8819 - loss: 0.3298 549/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 746us/step - accuracy: 0.8817 - loss: 0.3304 550/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 747us/step - accuracy: 0.8817 - loss: 0.3304 619/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 745us/step - accuracy: 0.8817 - loss: 0.3306 688/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - accuracy: 0.8818 - loss: 0.3306 754/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 745us/step - accuracy: 0.8819 - loss: 0.3306 822/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 745us/step - accuracy: 0.8819 - loss: 0.3307 823/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 745us/step - accuracy: 0.8819 - loss: 0.3307 891/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 745us/step - accuracy: 0.8820 - loss: 0.3308 959/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - accuracy: 0.8820 - loss: 0.3308 960/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - accuracy: 0.8820 - loss: 0.33081028/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - 
accuracy: 0.8822 - loss: 0.33061029/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - accuracy: 0.8822 - loss: 0.33061097/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - accuracy: 0.8823 - loss: 0.33041165/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - accuracy: 0.8824 - loss: 0.33021166/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - accuracy: 0.8824 - loss: 0.33021233/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - accuracy: 0.8825 - loss: 0.33011234/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - accuracy: 0.8825 - loss: 0.33011304/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 743us/step - accuracy: 0.8826 - loss: 0.32991305/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 743us/step - accuracy: 0.8826 - loss: 0.32991374/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 742us/step - accuracy: 0.8827 - loss: 0.32971375/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 742us/step - accuracy: 0.8827 - loss: 0.32971446/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 741us/step - accuracy: 0.8828 - loss: 0.32951518/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 739us/step - accuracy: 0.8828 - loss: 0.32931519/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 739us/step - accuracy: 0.8828 - loss: 0.32931589/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 738us/step - accuracy: 0.8829 - loss: 0.32911590/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 738us/step - accuracy: 0.8829 - loss: 0.32911660/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 737us/step - accuracy: 0.8829 - loss: 0.32891719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 774us/step - accuracy: 0.8830 - loss: 0.3288 - val_accuracy: 0.8656 - val_loss: 0.3652
Epoch 12/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 726us/step - accuracy: 0.8859 - loss: 0.3204 - val_accuracy: 0.8672 - val_loss: 0.3611
Epoch 13/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 731us/step - accuracy: 0.8882 - loss: 0.3126 - val_accuracy: 0.8680 - val_loss: 0.3567
Epoch 14/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 752us/step - accuracy: 0.8906 - loss: 0.3052 - val_accuracy: 0.8690 - val_loss: 0.3545
Epoch 15/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 733us/step - accuracy: 0.8922 - loss: 0.2985 - val_accuracy: 0.8690 - val_loss: 0.3518
Epoch 16/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 728us/step - accuracy: 0.8941 - loss: 0.2920 - val_accuracy: 0.8708 - val_loss: 0.3489
Epoch 17/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 739us/step - accuracy: 0.8966 - loss: 0.2858 - val_accuracy: 0.8710 - val_loss: 0.3472
Epoch 18/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 827us/step - accuracy: 0.8988 - loss: 0.2800 - val_accuracy: 0.8730 - val_loss: 0.3468
Epoch 19/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 802us/step - accuracy: 0.9006 - loss: 0.2744 - val_accuracy: 0.8722 - val_loss: 0.3454
Epoch 20/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 816us/step - accuracy: 0.9035 - loss: 0.2690 - val_accuracy: 0.8722 - val_loss: 0.3455
Epoch 21/30
1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 812us/step - accuracy: 0.9057 - loss: 0.2640 - val_accuracy: 0.8740 - val_loss: 0.3439
Epoch 22/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 20s 12ms/step - accuracy: 0.9375 - loss: 0.2291  70/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 725us/step - accuracy: 0.9197 - loss: 0.2283  71/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 728us/step - accuracy: 0.9196 - loss: 0.2284 141/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 725us/step - accuracy: 0.9133 - loss: 0.2417 142/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 726us/step - accuracy: 0.9132 - loss: 0.2419 212/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 724us/step - accuracy: 0.9105 - loss: 0.2493 213/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 725us/step - accuracy: 0.9105 - loss: 0.2493 285/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 718us/step - accuracy: 0.9093 - loss: 0.2529 352/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 725us/step - accuracy: 0.9086 - loss: 0.2549 353/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 726us/step - accuracy: 0.9086 - loss: 0.2549 419/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 732us/step - accuracy: 0.9080 - loss: 0.2566 420/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 733us/step - accuracy: 0.9080 - loss: 0.2566 485/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 739us/step - accuracy: 0.9077 - loss: 0.2577 486/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 740us/step - accuracy: 0.9077 - loss: 0.2578 545/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 752us/step - accuracy: 0.9076 - loss: 0.2585 610/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 754us/step - accuracy: 0.9075 - loss: 0.2588 611/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 755us/step - accuracy: 0.9075 - loss: 0.2588 678/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 755us/step - accuracy: 0.9075 - loss: 0.2590 679/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 755us/step - accuracy: 0.9076 - loss: 0.2590 748/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 754us/step - accuracy: 0.9076 - loss: 0.2592 816/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 752us/step - accuracy: 0.9075 - loss: 0.2594 817/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 752us/step - accuracy: 0.9075 - loss: 0.2594 887/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 749us/step - accuracy: 0.9075 - loss: 0.2597 888/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 750us/step - accuracy: 0.9075 - loss: 0.2597 959/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 747us/step - accuracy: 0.9074 - loss: 0.2598 960/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 747us/step - 
accuracy: 0.9074 - loss: 0.25981033/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 743us/step - accuracy: 0.9074 - loss: 0.25981107/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 739us/step - accuracy: 0.9074 - loss: 0.25971181/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 735us/step - accuracy: 0.9075 - loss: 0.25961252/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 734us/step - accuracy: 0.9075 - loss: 0.25961326/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 731us/step - accuracy: 0.9075 - loss: 0.25951396/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 730us/step - accuracy: 0.9075 - loss: 0.25941397/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 731us/step - accuracy: 0.9075 - loss: 0.25941468/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 730us/step - accuracy: 0.9075 - loss: 0.25931469/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 730us/step - accuracy: 0.9075 - loss: 0.25931538/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 730us/step - accuracy: 0.9075 - loss: 0.25921539/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 730us/step - accuracy: 0.9075 - loss: 0.25921606/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 731us/step - accuracy: 0.9075 - loss: 0.25921671/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 733us/step - accuracy: 0.9075 - loss: 0.25911672/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 733us/step - accuracy: 0.9075 - loss: 0.25911719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 786us/step - accuracy: 0.9075 - loss: 0.2591 - val_accuracy: 0.8766 - val_loss: 0.3426
Epoch 23/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 21s 13ms/step - accuracy: 0.9688 - loss: 0.2144  60/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 878us/step - accuracy: 0.9257 - loss: 0.2221  61/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 916us/step - accuracy: 0.9256 - loss: 0.2223 119/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 895us/step - accuracy: 0.9187 - loss: 0.2331 191/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 819us/step - accuracy: 0.9143 - loss: 0.2431 192/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 820us/step - accuracy: 0.9142 - loss: 0.2431 264/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 786us/step - accuracy: 0.9124 - loss: 0.2474 333/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 774us/step - accuracy: 0.9114 - loss: 0.2496 334/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 775us/step - accuracy: 0.9114 - loss: 0.2496 405/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 763us/step - accuracy: 0.9106 - loss: 0.2516 406/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 763us/step - accuracy: 0.9106 - loss: 0.2516 469/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 768us/step - accuracy: 0.9101 - loss: 0.2528 470/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 769us/step - accuracy: 0.9101 - loss: 0.2528 532/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 774us/step - accuracy: 0.9099 - loss: 0.2536 533/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 775us/step - accuracy: 0.9099 - loss: 0.2537 592/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 782us/step - accuracy: 0.9097 - loss: 0.2541 593/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 783us/step - accuracy: 0.9097 - loss: 0.2541 652/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 789us/step - accuracy: 0.9097 - loss: 0.2543 653/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 790us/step - accuracy: 0.9097 - loss: 0.2543 711/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 796us/step - accuracy: 0.9097 - loss: 0.2544 712/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 797us/step - accuracy: 0.9097 - loss: 0.2544 771/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 801us/step - accuracy: 0.9097 - loss: 0.2546 772/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 801us/step - accuracy: 0.9097 - loss: 0.2546 832/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 804us/step - accuracy: 0.9096 - loss: 0.2548 833/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 804us/step - accuracy: 0.9096 - loss: 0.2548 893/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 807us/step - 
accuracy: 0.9095 - loss: 0.2550 894/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 807us/step - accuracy: 0.9095 - loss: 0.2550 956/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 807us/step - accuracy: 0.9095 - loss: 0.2551 957/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 807us/step - accuracy: 0.9095 - loss: 0.25511018/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 809us/step - accuracy: 0.9095 - loss: 0.25511087/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 804us/step - accuracy: 0.9094 - loss: 0.25511151/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 802us/step - accuracy: 0.9094 - loss: 0.25501152/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.9094 - loss: 0.25501221/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 799us/step - accuracy: 0.9094 - loss: 0.25501290/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 795us/step - accuracy: 0.9094 - loss: 0.25501291/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 795us/step - accuracy: 0.9094 - loss: 0.25501361/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 791us/step - accuracy: 0.9094 - loss: 0.25491362/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 791us/step - accuracy: 0.9094 - loss: 0.25491434/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 787us/step - accuracy: 0.9094 - loss: 0.25481435/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 787us/step - accuracy: 0.9094 - loss: 0.25481508/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 782us/step - accuracy: 0.9094 - loss: 0.25471582/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 777us/step - accuracy: 0.9094 - loss: 0.25461583/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 777us/step - accuracy: 0.9094 - loss: 0.25461657/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 773us/step - accuracy: 0.9094 - loss: 0.25451658/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 773us/step - accuracy: 0.9094 - loss: 0.25451719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 810us/step - accuracy: 0.9094 - loss: 0.2545 - val_accuracy: 0.8760 - val_loss: 0.3416
Epoch 24/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 20s 12ms/step - accuracy: 0.9375 - loss: 0.2057  73/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 695us/step - accuracy: 0.9238 - loss: 0.2187  74/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 697us/step - accuracy: 0.9237 - loss: 0.2189 147/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 691us/step - accuracy: 0.9173 - loss: 0.2326 148/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 693us/step - accuracy: 0.9172 - loss: 0.2328 223/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 684us/step - accuracy: 0.9144 - loss: 0.2400 224/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 685us/step - accuracy: 0.9144 - loss: 0.2401 301/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 678us/step - accuracy: 0.9130 - loss: 0.2434 302/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 680us/step - accuracy: 0.9130 - loss: 0.2435 373/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 685us/step - accuracy: 0.9123 - loss: 0.2454 374/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 685us/step - accuracy: 0.9123 - loss: 0.2455 448/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 684us/step - accuracy: 0.9117 - loss: 0.2473 523/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 682us/step - accuracy: 0.9114 - loss: 0.2484 524/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 683us/step - accuracy: 0.9114 - loss: 0.2484 591/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 690us/step - accuracy: 0.9112 - loss: 0.2490 656/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 698us/step - accuracy: 0.9112 - loss: 0.2492 657/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 699us/step - accuracy: 0.9112 - loss: 0.2492 724/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 704us/step - accuracy: 0.9112 - loss: 0.2494 725/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 705us/step - accuracy: 0.9112 - loss: 0.2494 790/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 711us/step - accuracy: 0.9111 - loss: 0.2496 791/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 711us/step - accuracy: 0.9111 - loss: 0.2496 858/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 715us/step - accuracy: 0.9111 - loss: 0.2499 859/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 715us/step - accuracy: 0.9111 - loss: 0.2499 925/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 719us/step - accuracy: 0.9110 - loss: 0.2501 926/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 719us/step - accuracy: 0.9110 - loss: 0.2501 993/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 722us/step - 
accuracy: 0.9110 - loss: 0.2501 994/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 722us/step - accuracy: 0.9110 - loss: 0.25011061/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.9110 - loss: 0.25011062/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.9110 - loss: 0.25011127/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.9110 - loss: 0.25011128/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.9110 - loss: 0.25011197/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.9110 - loss: 0.25001198/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 728us/step - accuracy: 0.9110 - loss: 0.25001267/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.9109 - loss: 0.25001268/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 728us/step - accuracy: 0.9109 - loss: 0.25001338/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.9109 - loss: 0.25001339/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.9109 - loss: 0.25001408/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.9110 - loss: 0.24991409/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.9110 - loss: 0.24991476/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 729us/step - accuracy: 0.9110 - loss: 0.24981477/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 729us/step - accuracy: 0.9110 - loss: 0.24981546/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 729us/step - accuracy: 0.9110 - loss: 0.24981547/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 729us/step - accuracy: 0.9110 - loss: 0.24981615/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 729us/step - accuracy: 0.9110 - loss: 0.24971616/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 729us/step - accuracy: 0.9110 - loss: 0.24971684/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 729us/step - accuracy: 0.9110 - loss: 0.24971719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 771us/step - accuracy: 0.9110 - loss: 0.2497 - val_accuracy: 0.8758 - val_loss: 0.3414
Epoch 25/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 21s 12ms/step - accuracy: 0.9688 - loss: 0.1982   2/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 821us/step - accuracy: 0.9609 - loss: 0.1979  68/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 763us/step - accuracy: 0.9289 - loss: 0.2134  69/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 765us/step - accuracy: 0.9287 - loss: 0.2135 135/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 766us/step - accuracy: 0.9216 - loss: 0.2258 136/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 768us/step - accuracy: 0.9215 - loss: 0.2259 202/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 766us/step - accuracy: 0.9181 - loss: 0.2336 203/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 768us/step - accuracy: 0.9181 - loss: 0.2336 270/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 764us/step - accuracy: 0.9162 - loss: 0.2374 271/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 765us/step - accuracy: 0.9161 - loss: 0.2375 338/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 762us/step - accuracy: 0.9149 - loss: 0.2396 339/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 762us/step - accuracy: 0.9149 - loss: 0.2396 407/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 759us/step - accuracy: 0.9140 - loss: 0.2416 408/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 759us/step - accuracy: 0.9140 - loss: 0.2416 473/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 760us/step - accuracy: 0.9135 - loss: 0.2429 474/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 761us/step - accuracy: 0.9135 - loss: 0.2429 543/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 757us/step - accuracy: 0.9132 - loss: 0.2439 586/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 787us/step - accuracy: 0.9131 - loss: 0.2442 587/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 788us/step - accuracy: 0.9131 - loss: 0.2442 655/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 783us/step - accuracy: 0.9131 - loss: 0.2445 723/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 779us/step - accuracy: 0.9131 - loss: 0.2447 724/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 780us/step - accuracy: 0.9131 - loss: 0.2447 793/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 776us/step - accuracy: 0.9130 - loss: 0.2450 794/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 776us/step - accuracy: 0.9130 - loss: 0.2450 864/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 771us/step - accuracy: 0.9129 - loss: 0.2453 932/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 769us/step - 
accuracy: 0.9129 - loss: 0.2455 933/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 769us/step - accuracy: 0.9129 - loss: 0.24551003/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 765us/step - accuracy: 0.9128 - loss: 0.24561004/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 765us/step - accuracy: 0.9128 - loss: 0.24561076/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 761us/step - accuracy: 0.9128 - loss: 0.24561077/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 761us/step - accuracy: 0.9128 - loss: 0.24561147/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 758us/step - accuracy: 0.9128 - loss: 0.24551148/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 758us/step - accuracy: 0.9128 - loss: 0.24551217/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 757us/step - accuracy: 0.9128 - loss: 0.24551218/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 757us/step - accuracy: 0.9128 - loss: 0.24551289/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 754us/step - accuracy: 0.9128 - loss: 0.24551290/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 754us/step - accuracy: 0.9128 - loss: 0.24551362/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 751us/step - accuracy: 0.9128 - loss: 0.24551363/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 752us/step - accuracy: 0.9128 - loss: 0.24551431/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 751us/step - accuracy: 0.9128 - loss: 0.24541432/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 751us/step - accuracy: 0.9128 - loss: 0.24541503/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 749us/step - accuracy: 0.9128 - loss: 0.24531504/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 749us/step - accuracy: 0.9128 - loss: 0.24531575/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 747us/step - accuracy: 0.9128 - loss: 0.24531576/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 747us/step - accuracy: 0.9128 - loss: 0.24531647/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 746us/step - accuracy: 0.9128 - loss: 0.24521648/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 746us/step - accuracy: 0.9128 - loss: 0.24521719/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - accuracy: 0.9128 - loss: 0.24521719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 786us/step - accuracy: 0.9128 - loss: 0.2452 - val_accuracy: 0.8748 - val_loss: 0.3426
Epoch 26/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 20s 12ms/step - accuracy: 0.9375 - loss: 0.1971  74/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 693us/step - accuracy: 0.9279 - loss: 0.2104  75/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 696us/step - accuracy: 0.9278 - loss: 0.2106 148/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 694us/step - accuracy: 0.9216 - loss: 0.2239 220/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 694us/step - accuracy: 0.9190 - loss: 0.2307 291/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 698us/step - accuracy: 0.9176 - loss: 0.2341 355/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 716us/step - accuracy: 0.9168 - loss: 0.2359 356/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 716us/step - accuracy: 0.9167 - loss: 0.2359 416/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 734us/step - accuracy: 0.9161 - loss: 0.2376 476/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 747us/step - accuracy: 0.9157 - loss: 0.2387 537/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 756us/step - accuracy: 0.9154 - loss: 0.2396 538/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 757us/step - accuracy: 0.9154 - loss: 0.2396 600/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 762us/step - accuracy: 0.9152 - loss: 0.2401 601/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 763us/step - accuracy: 0.9152 - loss: 0.2401 668/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 761us/step - accuracy: 0.9152 - loss: 0.2403 669/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 761us/step - accuracy: 0.9152 - loss: 0.2403 738/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 758us/step - accuracy: 0.9151 - loss: 0.2405 739/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 758us/step - accuracy: 0.9151 - loss: 0.2405 813/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 751us/step - accuracy: 0.9150 - loss: 0.2408 814/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 751us/step - accuracy: 0.9150 - loss: 0.2408 886/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 747us/step - accuracy: 0.9149 - loss: 0.2411 887/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 747us/step - accuracy: 0.9149 - loss: 0.2411 958/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - accuracy: 0.9148 - loss: 0.2413 959/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 744us/step - accuracy: 0.9148 - loss: 0.24131034/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 738us/step - accuracy: 0.9148 - loss: 0.24131035/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 739us/step - 
accuracy: 0.9148 - loss: 0.24131108/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 735us/step - accuracy: 0.9147 - loss: 0.24131109/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 735us/step - accuracy: 0.9147 - loss: 0.24131180/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 734us/step - accuracy: 0.9147 - loss: 0.24131181/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 734us/step - accuracy: 0.9147 - loss: 0.24131249/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 734us/step - accuracy: 0.9146 - loss: 0.24131250/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 734us/step - accuracy: 0.9146 - loss: 0.24131319/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 734us/step - accuracy: 0.9146 - loss: 0.24131320/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 734us/step - accuracy: 0.9146 - loss: 0.24131392/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 732us/step - accuracy: 0.9146 - loss: 0.24121393/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 732us/step - accuracy: 0.9146 - loss: 0.24121466/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 730us/step - accuracy: 0.9146 - loss: 0.24111467/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 730us/step - accuracy: 0.9146 - loss: 0.24111540/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 728us/step - accuracy: 0.9146 - loss: 0.24111541/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 728us/step - accuracy: 0.9146 - loss: 0.24111615/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 726us/step - accuracy: 0.9146 - loss: 0.24101616/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 726us/step - accuracy: 0.9146 - loss: 0.24101689/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.9146 - loss: 0.24101690/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 725us/step - accuracy: 0.9146 - loss: 0.24101719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 767us/step - accuracy: 0.9145 - loss: 0.2410 - val_accuracy: 0.8760 - val_loss: 0.3419
Epoch 27/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 20s 12ms/step - accuracy: 0.9688 - loss: 0.1938   2/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 793us/step - accuracy: 0.9609 - loss: 0.1931  74/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 697us/step - accuracy: 0.9314 - loss: 0.2061  75/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 698us/step - accuracy: 0.9312 - loss: 0.2062 146/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 702us/step - accuracy: 0.9242 - loss: 0.2191 218/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 701us/step - accuracy: 0.9210 - loss: 0.2261 290/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 700us/step - accuracy: 0.9193 - loss: 0.2296 291/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 701us/step - accuracy: 0.9193 - loss: 0.2296 364/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 699us/step - accuracy: 0.9182 - loss: 0.2317 365/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 700us/step - accuracy: 0.9182 - loss: 0.2317 437/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 699us/step - accuracy: 0.9175 - loss: 0.2336 438/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 700us/step - accuracy: 0.9175 - loss: 0.2336 514/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 694us/step - accuracy: 0.9170 - loss: 0.2349 589/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 691us/step - accuracy: 0.9168 - loss: 0.2356 590/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 691us/step - accuracy: 0.9168 - loss: 0.2356 662/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 692us/step - accuracy: 0.9167 - loss: 0.2359 663/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 692us/step - accuracy: 0.9167 - loss: 0.2359 735/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 693us/step - accuracy: 0.9166 - loss: 0.2362 808/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 693us/step - accuracy: 0.9165 - loss: 0.2365 809/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 693us/step - accuracy: 0.9165 - loss: 0.2365 879/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.9164 - loss: 0.2368 880/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.9164 - loss: 0.2368 952/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.9163 - loss: 0.2369 953/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.9163 - loss: 0.23691025/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - accuracy: 0.9163 - loss: 0.23701026/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 696us/step - 
accuracy: 0.9163 - loss: 0.23701101/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 695us/step - accuracy: 0.9162 - loss: 0.23701102/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 695us/step - accuracy: 0.9162 - loss: 0.23701174/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 695us/step - accuracy: 0.9162 - loss: 0.23701175/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 695us/step - accuracy: 0.9162 - loss: 0.23701239/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 700us/step - accuracy: 0.9162 - loss: 0.23701240/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 700us/step - accuracy: 0.9162 - loss: 0.23701304/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 704us/step - accuracy: 0.9161 - loss: 0.23701305/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 705us/step - accuracy: 0.9161 - loss: 0.23701369/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 708us/step - accuracy: 0.9161 - loss: 0.23701370/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 708us/step - accuracy: 0.9161 - loss: 0.23701435/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 711us/step - accuracy: 0.9161 - loss: 0.23691436/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 712us/step - accuracy: 0.9161 - loss: 0.23691500/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 715us/step - accuracy: 0.9161 - loss: 0.23681501/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 715us/step - accuracy: 0.9161 - loss: 0.23681566/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 718us/step - accuracy: 0.9161 - loss: 0.23681567/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 718us/step - accuracy: 0.9161 - loss: 0.23681631/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 721us/step - accuracy: 0.9161 - loss: 0.23681632/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 721us/step - accuracy: 0.9161 - loss: 0.23681695/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.9161 - loss: 0.23681719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 766us/step - accuracy: 0.9161 - loss: 0.2368 - val_accuracy: 0.8764 - val_loss: 0.3424
Epoch 28/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 20s 12ms/step - accuracy: 0.9688 - loss: 0.1885   2/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 817us/step - accuracy: 0.9609 - loss: 0.1880  70/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 739us/step - accuracy: 0.9347 - loss: 0.2009  71/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 743us/step - accuracy: 0.9346 - loss: 0.2010 144/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 717us/step - accuracy: 0.9274 - loss: 0.2143 145/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 718us/step - accuracy: 0.9273 - loss: 0.2145 214/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 723us/step - accuracy: 0.9244 - loss: 0.2214 215/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 724us/step - accuracy: 0.9244 - loss: 0.2215 285/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 724us/step - accuracy: 0.9227 - loss: 0.2250 286/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 725us/step - accuracy: 0.9227 - loss: 0.2250 355/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 726us/step - accuracy: 0.9216 - loss: 0.2271 356/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 726us/step - accuracy: 0.9216 - loss: 0.2272 428/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 722us/step - accuracy: 0.9208 - loss: 0.2291 429/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 723us/step - accuracy: 0.9208 - loss: 0.2291 501/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 719us/step - accuracy: 0.9204 - loss: 0.2304 574/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 716us/step - accuracy: 0.9200 - loss: 0.2312 575/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 716us/step - accuracy: 0.9200 - loss: 0.2312 640/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 722us/step - accuracy: 0.9199 - loss: 0.2315 708/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 723us/step - accuracy: 0.9198 - loss: 0.2318 709/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.9198 - loss: 0.2318 779/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 723us/step - accuracy: 0.9196 - loss: 0.2321 780/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 723us/step - accuracy: 0.9196 - loss: 0.2321 851/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 722us/step - accuracy: 0.9195 - loss: 0.2324 852/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 722us/step - accuracy: 0.9195 - loss: 0.2324 927/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 718us/step - accuracy: 0.9193 - loss: 0.2326 928/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 718us/step - 
accuracy: 0.9193 - loss: 0.23261001/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 716us/step - accuracy: 0.9192 - loss: 0.23271072/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 716us/step - accuracy: 0.9191 - loss: 0.23271073/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 716us/step - accuracy: 0.9191 - loss: 0.23271144/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 716us/step - accuracy: 0.9191 - loss: 0.23271145/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 716us/step - accuracy: 0.9191 - loss: 0.23271217/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 715us/step - accuracy: 0.9190 - loss: 0.23271218/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 715us/step - accuracy: 0.9190 - loss: 0.23271291/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 713us/step - accuracy: 0.9190 - loss: 0.23281292/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 714us/step - accuracy: 0.9190 - loss: 0.23281365/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 712us/step - accuracy: 0.9189 - loss: 0.23271366/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 713us/step - accuracy: 0.9189 - loss: 0.23271440/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 711us/step - accuracy: 0.9189 - loss: 0.23261441/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 711us/step - accuracy: 0.9189 - loss: 0.23261515/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 709us/step - accuracy: 0.9188 - loss: 0.23261581/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 712us/step - accuracy: 0.9188 - loss: 0.23261582/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 712us/step - accuracy: 0.9188 - loss: 0.23261655/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 711us/step - accuracy: 0.9188 - loss: 0.23261719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 751us/step - accuracy: 0.9187 - loss: 0.2326 - val_accuracy: 0.8762 - val_loss: 0.3426
Epoch 29/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 21s 12ms/step - accuracy: 0.9688 - loss: 0.1815   2/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 841us/step - accuracy: 0.9609 - loss: 0.1815  74/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 698us/step - accuracy: 0.9354 - loss: 0.1978  75/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 700us/step - accuracy: 0.9352 - loss: 0.1979 151/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 683us/step - accuracy: 0.9279 - loss: 0.2114 152/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 684us/step - accuracy: 0.9278 - loss: 0.2115 229/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 675us/step - accuracy: 0.9250 - loss: 0.2184 230/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 675us/step - accuracy: 0.9249 - loss: 0.2185 304/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 676us/step - accuracy: 0.9234 - loss: 0.2217 378/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 677us/step - accuracy: 0.9224 - loss: 0.2238 379/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 677us/step - accuracy: 0.9224 - loss: 0.2239 452/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 679us/step - accuracy: 0.9217 - loss: 0.2257 527/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 678us/step - accuracy: 0.9213 - loss: 0.2269 528/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 678us/step - accuracy: 0.9213 - loss: 0.2269 604/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 676us/step - accuracy: 0.9210 - loss: 0.2275 679/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 676us/step - accuracy: 0.9208 - loss: 0.2278 680/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 676us/step - accuracy: 0.9208 - loss: 0.2278 754/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 677us/step - accuracy: 0.9207 - loss: 0.2281 755/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 677us/step - accuracy: 0.9207 - loss: 0.2281 826/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 680us/step - accuracy: 0.9206 - loss: 0.2284 827/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 680us/step - accuracy: 0.9206 - loss: 0.2284 890/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 688us/step - accuracy: 0.9205 - loss: 0.2287 891/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 689us/step - accuracy: 0.9205 - loss: 0.2287 951/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 699us/step - accuracy: 0.9204 - loss: 0.22891011/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 707us/step - accuracy: 0.9203 - loss: 0.22891071/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 715us/step - 
accuracy: 0.9203 - loss: 0.22901072/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 715us/step - accuracy: 0.9203 - loss: 0.22901135/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 720us/step - accuracy: 0.9202 - loss: 0.22901136/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 720us/step - accuracy: 0.9202 - loss: 0.22891204/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 721us/step - accuracy: 0.9202 - loss: 0.22891205/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 721us/step - accuracy: 0.9202 - loss: 0.22891274/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 722us/step - accuracy: 0.9201 - loss: 0.22901275/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 722us/step - accuracy: 0.9201 - loss: 0.22901343/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 723us/step - accuracy: 0.9201 - loss: 0.22891344/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 723us/step - accuracy: 0.9201 - loss: 0.22891411/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.9201 - loss: 0.22891412/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.9201 - loss: 0.22891480/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 725us/step - accuracy: 0.9201 - loss: 0.22881481/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 725us/step - accuracy: 0.9201 - loss: 0.22881553/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.9200 - loss: 0.22881554/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.9200 - loss: 0.22881625/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.9200 - loss: 0.22881626/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.9200 - loss: 0.22881696/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.9199 - loss: 0.22881697/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 724us/step - accuracy: 0.9199 - loss: 0.22881719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 765us/step - accuracy: 0.9199 - loss: 0.2288 - val_accuracy: 0.8768 - val_loss: 0.3433
Epoch 30/30
   1/1719 ━━━━━━━━━━━━━━━━━━━━ 21s 12ms/step - accuracy: 0.9688 - loss: 0.1730  69/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 736us/step - accuracy: 0.9381 - loss: 0.1932  70/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 739us/step - accuracy: 0.9379 - loss: 0.1933 141/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 723us/step - accuracy: 0.9306 - loss: 0.2060 142/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 725us/step - accuracy: 0.9306 - loss: 0.2061 211/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 726us/step - accuracy: 0.9275 - loss: 0.2133 281/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 723us/step - accuracy: 0.9259 - loss: 0.2170 282/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 724us/step - accuracy: 0.9259 - loss: 0.2171 353/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 722us/step - accuracy: 0.9248 - loss: 0.2193 354/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 722us/step - accuracy: 0.9248 - loss: 0.2193 426/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 718us/step - accuracy: 0.9240 - loss: 0.2213 427/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 718us/step - accuracy: 0.9240 - loss: 0.2213 497/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 719us/step - accuracy: 0.9235 - loss: 0.2226 567/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 719us/step - accuracy: 0.9232 - loss: 0.2234 568/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 720us/step - accuracy: 0.9232 - loss: 0.2234 638/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 720us/step - accuracy: 0.9230 - loss: 0.2238 639/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 720us/step - accuracy: 0.9230 - loss: 0.2238 708/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 721us/step - accuracy: 0.9228 - loss: 0.2240 709/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 721us/step - accuracy: 0.9228 - loss: 0.2240 779/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 721us/step - accuracy: 0.9227 - loss: 0.2243 780/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 721us/step - accuracy: 0.9227 - loss: 0.2243 852/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 719us/step - accuracy: 0.9225 - loss: 0.2246 853/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 720us/step - accuracy: 0.9225 - loss: 0.2247 924/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 719us/step - accuracy: 0.9223 - loss: 0.2249 925/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 720us/step - accuracy: 0.9223 - loss: 0.2249 997/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 718us/step - 
accuracy: 0.9222 - loss: 0.2250 998/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 718us/step - accuracy: 0.9222 - loss: 0.22501071/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 716us/step - accuracy: 0.9221 - loss: 0.22501146/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 713us/step - accuracy: 0.9220 - loss: 0.22501147/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 713us/step - accuracy: 0.9220 - loss: 0.22501222/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 711us/step - accuracy: 0.9219 - loss: 0.22501223/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 711us/step - accuracy: 0.9219 - loss: 0.22501297/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 709us/step - accuracy: 0.9218 - loss: 0.22511298/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 709us/step - accuracy: 0.9218 - loss: 0.22511373/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 707us/step - accuracy: 0.9218 - loss: 0.22501374/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 707us/step - accuracy: 0.9218 - loss: 0.22501450/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 704us/step - accuracy: 0.9218 - loss: 0.22501526/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 702us/step - accuracy: 0.9217 - loss: 0.22491599/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 702us/step - accuracy: 0.9217 - loss: 0.22491600/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 702us/step - accuracy: 0.9217 - loss: 0.22491675/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 701us/step - accuracy: 0.9216 - loss: 0.22491719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 742us/step - accuracy: 0.9216 - loss: 0.2249 - val_accuracy: 0.8774 - val_loss: 0.3433

The model is provided with both a training set and a validation set. During training, Keras reports the model's performance on both sets, which allows us to visualize the accuracy and loss curves for both (more on this below).

When calling the fit method in Keras (or similar frameworks), each step corresponds to the processing of one mini-batch. A mini-batch is a subset of the training data, and during each step, the model updates its weights based on the error computed on this mini-batch.

An epoch is defined as one complete pass through the entire training dataset. During an epoch, the model processes multiple mini-batches until it has seen all the training data once. This process is repeated for a specified number of epochs to optimize the model’s performance.
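As a sanity check, the 1719 steps per epoch shown in the training log can be recovered from the dataset size and the batch size. This sketch assumes a training set of 55,000 images (60,000 Fashion-MNIST training images minus a 5,000-image validation split) and Keras's default batch size of 32:

```python
import math

n_train = 55_000   # assumed: 60,000 training images minus a 5,000-image validation split
batch_size = 32    # Keras's default batch size for fit()

# One step per mini-batch; the last batch may be smaller, hence the ceiling.
steps_per_epoch = math.ceil(n_train / batch_size)
print(steps_per_epoch)  # 1719, matching the "1719/1719" in the log above
```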

Visualization

import pandas as pd
import matplotlib.pyplot as plt

pd.DataFrame(history.history).plot(
    figsize=(8, 5), xlim=[0, 29], ylim=[0, 1], grid=True, xlabel="Epoch",
    style=["r--", "r--.", "b-", "b-*"])
plt.legend(loc="lower left")
plt.show()

Evaluating the model on our test set

model.evaluate(X_test, y_test)
313/313 ━━━━━━━━━━━━━━━━━━━━ 0s 412us/step - accuracy: 0.8739 - loss: 0.3689
[0.36475926637649536, 0.8741999864578247]

Making predictions

X_new = X_test[:3]
y_proba = model.predict(X_new)
y_proba.round(2)
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step
array([[0.  , 0.  , 0.  , 0.  , 0.  , 0.22, 0.  , 0.01, 0.  , 0.77],
       [0.  , 0.  , 1.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ],
       [0.  , 1.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ]],
      dtype=float32)

. . .

y_pred = y_proba.argmax(axis=-1)
y_pred
array([9, 2, 1])

. . .

y_new = y_test[:3]
y_new
array([9, 2, 1], dtype=uint8)

As can be seen, a single class dominates each prediction. The first example is the least clear-cut: the model assigns 0.77 to class 9 (ankle boot) but still 0.22 to class 5 (sandal).
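The loss values reported during training and evaluation are sparse categorical cross-entropy: for each example, the negative log of the probability the model assigns to the true class. A minimal sketch using the first prediction above (probability 0.77 for class 9, the true label):

```python
import math

p_true = 0.77             # probability the model assigns to the true class (9, "Ankle boot")
loss = -math.log(p_true)  # per-example cross-entropy
print(round(loss, 3))     # 0.261; a perfect prediction (p = 1.0) would give 0.0
```

The overall loss is the average of this quantity over all examples, which is why confident correct predictions drive the loss toward zero.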

Predicted vs Observed

np.array(class_names)[y_pred]
array(['Ankle boot', 'Pullover', 'Trouser'], dtype='<U11')

Test Set Performance

from sklearn.metrics import classification_report

y_proba = model.predict(X_test)
y_pred = y_proba.argmax(axis=-1)

Test Set Performance

print(classification_report(y_test, y_pred))
              precision    recall  f1-score   support

           0       0.85      0.81      0.83      1000
           1       0.99      0.97      0.98      1000
           2       0.76      0.82      0.79      1000
           3       0.82      0.92      0.87      1000
           4       0.80      0.80      0.80      1000
           5       0.89      0.98      0.93      1000
           6       0.76      0.64      0.69      1000
           7       0.95      0.89      0.92      1000
           8       0.96      0.97      0.96      1000
           9       0.97      0.94      0.96      1000

    accuracy                           0.87     10000
   macro avg       0.87      0.87      0.87     10000
weighted avg       0.87      0.87      0.87     10000
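Each row of the report can be reproduced from raw counts in the confusion matrix. A minimal sketch with hypothetical counts for a single class (chosen to be consistent with the class 0 row above; the actual confusion matrix is not shown here):

```python
# Hypothetical counts for one class (illustrative, not taken from the model above):
tp, fp, fn = 810, 143, 190   # true positives, false positives, false negatives

precision = tp / (tp + fp)   # of the items predicted as this class, how many were right
recall = tp / (tp + fn)      # of the items truly in this class, how many were found
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two

print(round(precision, 2), round(recall, 2), round(f1, 2))  # 0.85 0.81 0.83
```

The support column is simply tp + fn, the number of test examples in each class (1,000 per class in Fashion-MNIST).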

Prologue

Summary

  • Introduction to Neural Networks and Connectionism
    • Shift from symbolic AI to connectionist approaches in artificial intelligence.
    • Inspiration from biological neural networks and the human brain’s structure.
  • Computations with Neurodes and Threshold Logic Units
    • Early models of neurons (neurodes) capable of performing logical operations (AND, OR, NOT).
    • Limitations of simple perceptrons in solving non-linearly separable problems like XOR.
  • Multilayer Perceptrons (MLPs) and Feedforward Neural Networks (FNNs)
    • Overcoming perceptron limitations by introducing hidden layers.
    • Structure and information flow in feedforward neural networks.
    • Explanation of forward pass computations in neural networks.
  • Activation Functions in Neural Networks
    • Importance of nonlinear activation functions (sigmoid, tanh, ReLU) for enabling learning of complex patterns.
    • Role of activation functions in backpropagation and gradient descent optimization.
    • Universal Approximation Theorem and its implications for neural networks.
  • Deep Learning Frameworks
    • Overview of PyTorch and TensorFlow as leading platforms for deep learning.
    • Introduction to Keras as a high-level API for building and training neural networks.
    • Discussion on the suitability of different frameworks for research and industry applications.
  • Hands-On Implementation with Keras
    • Loading and exploring the Fashion-MNIST dataset.
    • Building a neural network model using Keras’ Sequential API.
    • Compiling the model with appropriate loss functions and optimizers for multiclass classification.
    • Training the model and visualizing training and validation metrics over epochs.
    • Evaluating model performance on test data and interpreting results.
  • Making Predictions and Interpreting Results
    • Using the trained model to make predictions on new data.
    • Visualizing predictions alongside actual images and labels.
    • Understanding the output probabilities and class assignments in the context of the dataset.

Next lecture

  • We will discuss the training algorithm for artificial neural networks.

References

Cybenko, George V. 1989. “Approximation by Superpositions of a Sigmoidal Function.” Mathematics of Control, Signals and Systems 2: 303–14. https://api.semanticscholar.org/CorpusID:3958369.
Géron, Aurélien. 2022. Hands-on Machine Learning with Scikit-Learn, Keras, and TensorFlow. 3rd ed. O’Reilly Media, Inc.
Goodfellow, Ian, Yoshua Bengio, and Aaron Courville. 2016. Deep Learning. Adaptive Computation and Machine Learning. MIT Press. https://dblp.org/rec/books/daglib/0040158.
Hornik, Kurt, Maxwell Stinchcombe, and Halbert White. 1989. “Multilayer Feedforward Networks Are Universal Approximators.” Neural Networks 2 (5): 359–66. https://doi.org/10.1016/0893-6080(89)90020-8.
LeCun, Yann, Yoshua Bengio, and Geoffrey Hinton. 2015. “Deep Learning.” Nature 521 (7553): 436–44. https://doi.org/10.1038/nature14539.
LeNail, Alexander. 2019. “NN-SVG: Publication-Ready Neural Network Architecture Schematics.” Journal of Open Source Software 4 (33): 747. https://doi.org/10.21105/joss.00747.
McCulloch, Warren S, and Walter Pitts. 1943. “A Logical Calculus of the Ideas Immanent in Nervous Activity.” The Bulletin of Mathematical Biophysics 5 (4): 115–33. https://doi.org/10.1007/bf02478259.
Minsky, Marvin, and Seymour Papert. 1969. Perceptrons: An Introduction to Computational Geometry. Cambridge, MA, USA: MIT Press.
Rosenblatt, F. 1958. “The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain.” Psychological Review 65 (6): 386–408. https://doi.org/10.1037/h0042519.
Russell, Stuart, and Peter Norvig. 2020. Artificial Intelligence: A Modern Approach. 4th ed. Pearson. http://aima.cs.berkeley.edu/.

Marcel Turcotte

Marcel.Turcotte@uOttawa.ca

School of Electrical Engineering and Computer Science (EECS)

University of Ottawa