Introduction to artificial neural networks

CSI 4106 - Fall 2025

Marcel Turcotte

Version: Oct. 8, 2025, 07:39

Preamble

Message of the day

Quote of the Day (2024)

Learning objectives

  • Explain perceptrons and MLPs: structure, function, history, and limitations.
  • Describe activation functions: their role in learning complex models.
  • Implement a feedforward neural network with Keras on Fashion-MNIST.
  • Interpret the training and results of neural networks: visualization and evaluation metrics.
  • Become familiar with deep learning frameworks: PyTorch, TensorFlow, and Keras for building and deploying models.

Introduction

TensorFlow Playground

Neural networks (NN)

We now turn our attention to a family of machine learning models inspired by the structure and operation of the biological neural networks found in animals.

Machine learning

  • Supervised: classification, regression

  • Unsupervised: autoencoders, self-supervised learning

  • Reinforcement: NNs are now an integral component

A neuron

Interconnected neurons

Connectionist

Hierarchy of concepts

Basic notions

Computing with neurodes

\(x_1, x_2 \in \{0,1\}\) and \(f(z)\) is an indicator function: \[ f(z)= \begin{cases}0, & z<\theta \\ 1, & z \geq \theta\end{cases} \]

Computing with neurodes

\[ y = f(x_1 + x_2)= \begin{cases}0, & x_1 + x_2 <\theta \\ 1, & x_1 + x_2 \geq \theta\end{cases} \]

  • With \(\theta = 2\), the neurode implements a logical AND gate.

  • With \(\theta = 1\), the neurode implements a logical OR gate.
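Both threshold choices are easy to verify directly. The sketch below uses a hypothetical helper `neurode` (not part of the lecture code) that applies the indicator function from the previous slide:

```python
# A McCulloch-Pitts neurode: fires (outputs 1) when x1 + x2 >= theta.
def neurode(x1, x2, theta):
    return int(x1 + x2 >= theta)

# theta = 2 implements AND; theta = 1 implements OR.
for x1 in (0, 1):
    for x2 in (0, 1):
        assert neurode(x1, x2, theta=2) == (x1 and x2)
        assert neurode(x1, x2, theta=1) == (x1 or x2)
```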

Computing with neurodes

  • Numerical computations can be decomposed into a sequence of logical operations, allowing networks of neurodes to carry out any computation.

  • McCulloch and Pitts (1943) did not focus on learning the parameter \(\theta\).

  • They introduced a machine that can compute any function, but cannot learn.

Perceptron

Perceptron

Threshold logic unit

Simple threshold functions

\(\text{heaviside}(t)\) =

  • 1, if \(t \geq 0\)

  • 0, if \(t < 0\)

\(\text{sign}(t)\) =

  • 1, if \(t > 0\)

  • 0, if \(t = 0\)

  • -1, if \(t < 0\)
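NumPy provides both functions directly; a minimal sketch (note that `np.heaviside` takes a second argument giving the value returned at 0, which we set to 1 to match the definition above):

```python
import numpy as np

t = np.array([-2.0, 0.0, 3.0])

# Second argument of np.heaviside is the value at t = 0.
h = np.heaviside(t, 1.0)   # [0., 1., 1.]
s = np.sign(t)             # [-1., 0., 1.]

assert h.tolist() == [0.0, 1.0, 1.0]
assert s.tolist() == [-1.0, 0.0, 1.0]
```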

Notation

Notation

Perceptron

Perceptron

Notation

Notation

  • \(X\) is the input data matrix: each row corresponds to an example and each column represents one of the \(D\) attributes.

  • \(W\) is the weight matrix, structured with one row per input (attribute) and one column per neuron.

  • Bias terms may be represented separately; both approaches appear in the literature. Here, \(b\) is a vector whose length equals the number of neurons.
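With this notation, the outputs of an entire layer reduce to one matrix expression; a small sketch with made-up shapes (the dimensions \(N\), \(D\), \(K\) are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

N, D, K = 4, 3, 2           # examples, attributes, neurons
X = rng.random((N, D))      # one row per example
W = rng.random((D, K))      # one row per input (attribute), one column per neuron
b = rng.random(K)           # one bias term per neuron

Z = X @ W + b               # broadcasting adds b to every row
assert Z.shape == (N, K)    # one output per example, per neuron
```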

Discussion

  • The algorithm for training the perceptron closely resembles stochastic gradient descent.

    • In the interest of time and to avoid confusion, we will skip this algorithm and focus instead on the multilayer perceptron (MLP) and its training algorithm, backpropagation.
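For reference, the perceptron learning rule we are skipping fits in a few lines; `train_perceptron` below is an illustrative helper (names and hyperparameters are our own), trained on the linearly separable AND problem:

```python
import numpy as np

def heaviside(t):
    return (t >= 0).astype(int)

# Perceptron rule: for each example, nudge the weights by
# eta * (y - y_hat) * x -- the same update shape as one SGD step.
def train_perceptron(X, y, eta=0.1, epochs=20):
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            err = yi - heaviside(xi @ w + b)
            w += eta * err * xi
            b += eta * err
    return w, b

# Linearly separable data: the logical AND.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print(heaviside(X @ w + b))  # learns AND -> [0 0 0 1]
```

On data that is not linearly separable (such as XOR, discussed next), this loop never converges, which is precisely the motivation for the multilayer perceptron.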

Historical note and rationale

Multilayer perceptron (MLP)

XOR classification problem

\(x^{(1)}\) \(x^{(2)}\) \(y\) \(o_1\) \(o_2\) \(o_3\)
1 0 1 0 1 1
0 1 1 0 1 1
0 0 0 0 0 0
1 1 0 1 1 0

Forward propagation (FNN)

Forward propagation (computation)

\(o_3 = \sigma(w_{13} x^{(1)}+ w_{23} x^{(2)} + b_3)\)

\(o_4 = \sigma(w_{14} x^{(1)}+ w_{24} x^{(2)} + b_4)\)

\(o_5 = \sigma(w_{15} x^{(1)}+ w_{25} x^{(2)} + b_5)\)

\(o_6 = \sigma(w_{36} o_3 + w_{46} o_4 + w_{56} o_5 + b_6)\)

\(o_7 = \sigma(w_{37} o_3 + w_{47} o_4 + w_{57} o_5 + b_7)\)

Forward propagation (computation)

import numpy as np

# Sigmoid function

def sigma(x):
    return 1 / (1 + np.exp(-x))

# Input vector (two attributes), one example from our training set

x1, x2 = (0.5, 0.9)

# Initialize the weights of layers 2 and 3 to random values

w13, w14, w15, w23, w24, w25 = np.random.uniform(low=-1, high=1, size=6)
w36, w46, w56, w37, w47, w57 = np.random.uniform(low=-1, high=1, size=6)

# Initialize the 5 bias terms to random values

b3, b4, b5, b6, b7 = np.random.uniform(low=-1, high=1, size=5)

o3 = sigma(w13 * x1 + w23 * x2 + b3)
o4 = sigma(w14 * x1 + w24 * x2 + b4)
o5 = sigma(w15 * x1 + w25 * x2 + b5)
o6 = sigma(w36 * o3 + w46 * o4 + w56 * o5 + b6)
o7 = sigma(w37 * o3 + w47 * o4 + w57 * o5 + b7)

(o6, o7)
(np.float64(0.5583287342187054), np.float64(0.7551766490311281))

Forward propagation (computation)

Forward propagation (computation)

Activation function

  • As discussed later, the training algorithm, called backpropagation, uses gradient descent, which requires computing the partial derivatives of the loss function.

  • The threshold function of the perceptron had to be replaced in the multilayer perceptron because it consists only of flat surfaces. Gradient descent cannot make progress on flat surfaces, since their derivative is zero.

Activation function

  • Nonlinear activation functions are essential: without them, several layers of the network would compute only a linear function of the inputs.

  • According to the universal approximation theorem, sufficiently large deep networks with nonlinear activation functions can approximate any continuous function. See Universal Approximation Theorem.
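The first point is easy to verify numerically: without an activation function between them, two layers compose into a single linear map.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two layers WITHOUT a nonlinearity collapse into one linear map:
# (X @ W1) @ W2 == X @ (W1 @ W2), so stacking adds no expressive power.
X = rng.random((5, 4))
W1 = rng.random((4, 3))
W2 = rng.random((3, 2))

assert np.allclose((X @ W1) @ W2, X @ (W1 @ W2))
```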

Sigmoid

Code
import numpy as np
import matplotlib.pyplot as plt

# Sigmoid function
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Generate x values
x = np.linspace(-10, 10, 400)

# Compute y values for the sigmoid function
y = sigmoid(x)

# Create a figure and plot the curve
fig, ax = plt.subplots()
ax.plot(x, y, color='black', linewidth=2)  # Keep the curve opaque

plt.grid(True)

# Make the figure background transparent
fig.patch.set_alpha(0)  # Transparent background for the figure

# Save or display the plot with a transparent background
# plt.savefig('sigmoid_plot.png', transparent=True, bbox_inches='tight', pad_inches=0)
plt.show()

\[ \sigma(t) = \frac{1}{1 + e^{-t}} \]

Hyperbolic tangent function

Code
# Generate x values
x = np.linspace(-10, 10, 400)

# Compute y values for the hyperbolic tangent function
y = np.tanh(x)

# Create a figure and plot the curve
fig, ax = plt.subplots()
ax.plot(x, y, color='black', linewidth=2)  # Keep the curve opaque

plt.grid(True)

# Make the figure background transparent
fig.patch.set_alpha(0)  # Transparent background for the figure

# Save or display the plot with a transparent background
# plt.savefig('tanh_plot.png', transparent=True, bbox_inches='tight', pad_inches=0)
plt.show()

\[ \tanh(t) = 2 \sigma(2t) - 1 \]
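The identity above relating tanh to the sigmoid can be checked numerically:

```python
import numpy as np

def sigma(t):
    return 1 / (1 + np.exp(-t))

t = np.linspace(-5, 5, 101)

# tanh(t) = 2*sigma(2t) - 1, to floating-point precision.
assert np.allclose(np.tanh(t), 2 * sigma(2 * t) - 1)
```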

Rectified linear unit (ReLU)

Code
# Generate x values
x = np.linspace(-10, 10, 400)

# Compute y values for the ReLU function
y = np.maximum(0, x)

# Create a figure and plot the curve
fig, ax = plt.subplots()
ax.plot(x, y, color='black', linewidth=2)  # Keep the curve opaque

plt.grid(True)

# Make the figure background transparent
fig.patch.set_alpha(0)  # Transparent background for the figure

# Save or display the plot with a transparent background
# plt.savefig('relu_plot.png', transparent=True, bbox_inches='tight', pad_inches=0)
plt.show()

\[ \mathrm{ReLU}(t) = \max(0, t) \]

Common activation functions

Code
import numpy as np
import matplotlib.pyplot as plt

from scipy.special import expit as sigmoid

def relu(z):
    return np.maximum(0, z)

def derivative(f, z, eps=0.000001):
    return (f(z + eps) - f(z - eps))/(2 * eps)

max_z = 4.5
z = np.linspace(-max_z, max_z, 200)

plt.figure(figsize=(11, 3.1))

plt.subplot(121)
plt.plot([-max_z, 0], [0, 0], "r-", linewidth=2, label="Heaviside")
plt.plot(z, relu(z), "m-.", linewidth=2, label="ReLU")
plt.plot([0, 0], [0, 1], "r-", linewidth=0.5)
plt.plot([0, max_z], [1, 1], "r-", linewidth=2)
plt.plot(z, sigmoid(z), "g--", linewidth=2, label="Sigmoid")
plt.plot(z, np.tanh(z), "b-", linewidth=1, label="Tanh")
plt.grid(True)
plt.title("Activation functions")
plt.axis([-max_z, max_z, -1.65, 2.4])
plt.gca().set_yticks([-1, 0, 1, 2])
plt.legend(loc="lower right", fontsize=13)

plt.subplot(122)
plt.plot(z, derivative(np.sign, z), "r-", linewidth=2, label="Heaviside")
plt.plot(0, 0, "ro", markersize=5)
plt.plot(0, 0, "rx", markersize=10)
plt.plot(z, derivative(sigmoid, z), "g--", linewidth=2, label="Sigmoid")
plt.plot(z, derivative(np.tanh, z), "b-", linewidth=1, label="Tanh")
plt.plot([-max_z, 0], [0, 0], "m-.", linewidth=2)
plt.plot([0, max_z], [1, 1], "m-.", linewidth=2)
plt.plot([0, 0], [0, 1], "m-.", linewidth=1.2)
plt.plot(0, 1, "mo", markersize=5)
plt.plot(0, 1, "mx", markersize=10)
plt.grid(True)
plt.title("Derivatives")
plt.axis([-max_z, max_z, -0.2, 1.2])

plt.show()

Universal approximation

Definition

The universal approximation theorem states that a feedforward neural network with a single hidden layer containing a finite number of neurons can approximate any continuous function on a compact subset of \(\mathbb{R}^n\), given appropriate weights and activation functions.

Single hidden layer

\[ y = \sum_{i=1}^N \alpha_i \sigma(w_{1,i} x + b_i) \]

Effect of varying w

Code
def logistic(x, w, b):
    """Compute the logistic function with parameters w and b."""
    return 1 / (1 + np.exp(-(w * x + b)))

# Define a range of x values.
x = np.linspace(-10, 10, 400)

# Plot 1: varying w (slope) with b fixed at 0.
plt.figure(figsize=(6,4))
w_values = [0.5, 1, 2, 5]  # different slope values
b = 0  # fixed bias

for w in w_values:
    plt.plot(x, logistic(x, w, b), label=f'w = {w}, b = {b}')
plt.title('Effect of Varying w (with b = 0)')
plt.xlabel('x')
plt.ylabel(r'$\sigma(wx+b)$')
plt.legend()
plt.grid(True)

plt.show()

Effect of varying b

Code
# Plot 2: varying b (horizontal shift) with w fixed at 1.
plt.figure(figsize=(6,4))
w = 1  # fixed slope
b_values = [-5, -2, 0, 2, 5]  # different bias values

for b in b_values:
    plt.plot(x, logistic(x, w, b), label=f'w = {w}, b = {b}')
plt.title('Effect of Varying b (with w = 1)')
plt.xlabel('x')
plt.ylabel(r'$\sigma(wx+b)$')
plt.legend()
plt.grid(True)

plt.show()

Effect of varying w

Code
def relu(x, w, b):
    """Compute the ReLU activation with parameters w and b."""
    return np.maximum(0, w * x + b)

# Define a range of x values.
x = np.linspace(-10, 10, 400)

# Plot 1: varying w (scaling) with b fixed at 0.
plt.figure(figsize=(6,4))
w_values = [0.5, 1, 2, 5]  # different scaling values
b = 0  # fixed bias

for w in w_values:
    plt.plot(x, relu(x, w, b), label=f'w = {w}, b = {b}')
plt.title('Effect of Varying w (with b = 0) on the ReLU activation')
plt.xlabel('x')
plt.ylabel('ReLU(wx+b)')
plt.legend()
plt.grid(True)

plt.show()

Effect of varying b

Code
# Plot 2: varying b (horizontal shift) with w fixed at 1.
plt.figure(figsize=(6,4))
w = 1  # fixed scaling
b_values = [-5, -2, 0, 2, 5]  # different bias values

for b in b_values:
    plt.plot(x, relu(x, w, b), label=f'w = {w}, b = {b}')
plt.title('Effect of Varying b (with w = 1) on the ReLU activation')
plt.xlabel('x')
plt.ylabel('ReLU(wx+b)')
plt.legend()
plt.grid(True)

plt.show()

Single hidden layer

\[ y = \sum_{i=1}^N \alpha_i \sigma(w_{1,i} x + b_i) \]

Demonstration in code

import numpy as np

# Define the function to approximate

def f(x):
    return 2 * x**3 + 4 * x**2 - 5 * x + 1

# Generate a dataset: x in [-4, 2), f(x) as above

X = 6 * np.random.rand(1000, 1) - 4

y = f(X.flatten())

Increasing the number of neurons

from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.1, random_state=42)

models = []

sizes = [1, 2, 5, 10, 100]

for i, n in enumerate(sizes):

    models.append(MLPRegressor(hidden_layer_sizes=[n], max_iter=5000, random_state=42))

    models[i].fit(X_train, y_train)

Increasing the number of neurons

Code
import matplotlib.pyplot as plt

# Create a color map
colors = plt.colormaps['cool'].resampled(len(sizes))

X_valid = np.sort(X_valid, axis=0)

for i, n in enumerate(sizes):
    y_pred = models[i].predict(X_valid)
    plt.plot(X_valid, y_pred, "-", color=colors(i), label="Number of neurons = {}".format(n))

y_true = f(X_valid)
plt.plot(X_valid, y_true, "r.", label='Actual')

plt.legend()
plt.show()

Increasing the number of neurons

Code
for i, n in enumerate(sizes):
    plt.plot(models[i].loss_curve_, "-", color=colors(i), label="Number of neurons = {}".format(n))

plt.title('MLPRegressor Loss Curves')
plt.xlabel('Iterations')
plt.ylabel('Loss')

plt.legend()
plt.show()

Universal approximation

Let's code

Libraries

PyTorch and TensorFlow are the dominant platforms for deep learning.

  • PyTorch has gained considerable traction in the research community. Initially developed by Meta AI, it is now part of the Linux Foundation.

  • TensorFlow, created by Google, is widely adopted in industry for deploying models to production.

Keras

Keras is a high-level API designed to build, train, evaluate, and run models on various platforms, including PyTorch, TensorFlow, and JAX, Google's high-performance platform.

The Fashion-MNIST dataset

Fashion-MNIST is a dataset of images of Zalando articles, comprising a training set of 60,000 examples and a test set of 10,000 examples. Each example is a 28x28 grayscale image, associated with a label from 10 classes.

Loading

import tensorflow as tf

fashion_mnist = tf.keras.datasets.fashion_mnist.load_data()

(X_train_full, y_train_full), (X_test, y_test) = fashion_mnist

X_train, y_train = X_train_full[:-5000], y_train_full[:-5000]
X_valid, y_valid = X_train_full[-5000:], y_train_full[-5000:]

Exploration

X_train.shape
(55000, 28, 28)

Convert the pixel intensities from integers in the range 0 to 255 to floats in the range 0 to 1.

X_train, X_valid, X_test = X_train / 255., X_valid / 255., X_test / 255.

What do these images look like?

plt.figure(figsize=(2, 2))
plt.imshow(X_train[0], cmap="binary")
plt.axis('off')
plt.show()

y_train
array([9, 0, 0, ..., 9, 0, 2], shape=(55000,), dtype=uint8)

Since the labels are integers from 0 to 9, class names will be useful.

class_names = ["T-shirt/top", "Trouser", "Pullover", "Dress", "Coat",
               "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot"]

The first 40 images

n_rows = 4
n_cols = 10
plt.figure(figsize=(n_cols * 1.2, n_rows * 1.2))
for row in range(n_rows):
    for col in range(n_cols):
        index = n_cols * row + col
        plt.subplot(n_rows, n_cols, index + 1)
        plt.imshow(X_train[index], cmap="binary", interpolation="nearest")
        plt.axis('off')
        plt.title(class_names[y_train[index]])
plt.subplots_adjust(wspace=0.2, hspace=0.5)
plt.show()

The first 40 images

Creating a model

tf.random.set_seed(42)

model = tf.keras.Sequential()

model.add(tf.keras.layers.InputLayer(shape=[28, 28]))
model.add(tf.keras.layers.Flatten())
model.add(tf.keras.layers.Dense(300, activation="relu"))
model.add(tf.keras.layers.Dense(100, activation="relu"))
model.add(tf.keras.layers.Dense(10, activation="softmax"))

model.summary()

model.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ flatten (Flatten)               │ (None, 784)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 300)            │       235,500 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 100)            │        30,100 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_2 (Dense)                 │ (None, 10)             │         1,010 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 266,610 (1.02 MB)
 Trainable params: 266,610 (1.02 MB)
 Non-trainable params: 0 (0.00 B)

Creating a model (alternative)

Code
# extra code – clear the session to reset the name counters
tf.keras.backend.clear_session()
tf.random.set_seed(42)
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(300, activation="relu"),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax")
])

model.summary()

model.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ flatten (Flatten)               │ (None, 784)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 300)            │       235,500 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 100)            │        30,100 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_2 (Dense)                 │ (None, 10)             │         1,010 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 266,610 (1.02 MB)
 Trainable params: 266,610 (1.02 MB)
 Non-trainable params: 0 (0.00 B)

Compiling the model

model.compile(loss="sparse_categorical_crossentropy",
              optimizer="sgd",
              metrics=["accuracy"])

Training the model

history = model.fit(X_train, y_train, epochs=30,
                    validation_data=(X_valid, y_valid))
Epoch 1/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 5:41 199ms/step - accuracy: 0.0625 - loss: 2.5132

  59/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 866us/step - accuracy: 0.2828 - loss: 2.0755  

 120/1719 ━━━━━━━━━━━━━━━━━━━ 1s 845us/step - accuracy: 0.3918 - loss: 1.8829

 180/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 844us/step - accuracy: 0.4485 - loss: 1.7490

 243/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 832us/step - accuracy: 0.4885 - loss: 1.6408

 303/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 834us/step - accuracy: 0.5157 - loss: 1.5593

 363/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 836us/step - accuracy: 0.5373 - loss: 1.4918

 425/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 832us/step - accuracy: 0.5555 - loss: 1.4336

 488/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 827us/step - accuracy: 0.5710 - loss: 1.3830

 552/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 823us/step - accuracy: 0.5844 - loss: 1.3388

 616/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 820us/step - accuracy: 0.5959 - loss: 1.3001

 668/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 834us/step - accuracy: 0.6043 - loss: 1.2720

 719/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 844us/step - accuracy: 0.6118 - loss: 1.2467

 775/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 848us/step - accuracy: 0.6192 - loss: 1.2214

 830/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 853us/step - accuracy: 0.6259 - loss: 1.1985

 888/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 854us/step - accuracy: 0.6324 - loss: 1.1764

 947/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 854us/step - accuracy: 0.6385 - loss: 1.1555

1006/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 854us/step - accuracy: 0.6442 - loss: 1.1360

1068/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 851us/step - accuracy: 0.6497 - loss: 1.1171

1130/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 849us/step - accuracy: 0.6548 - loss: 1.0996

1192/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 847us/step - accuracy: 0.6595 - loss: 1.0832

1256/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 845us/step - accuracy: 0.6641 - loss: 1.0675

1319/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 843us/step - accuracy: 0.6683 - loss: 1.0530

1385/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 839us/step - accuracy: 0.6725 - loss: 1.0387

1452/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 835us/step - accuracy: 0.6765 - loss: 1.0250

1518/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 832us/step - accuracy: 0.6802 - loss: 1.0123

1582/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 830us/step - accuracy: 0.6836 - loss: 1.0007

1647/1719 ━━━━━━━━━━━━━━━━━━━ 0s 828us/step - accuracy: 0.6869 - loss: 0.9895

1713/1719 ━━━━━━━━━━━━━━━━━━━ 0s 826us/step - accuracy: 0.6900 - loss: 0.9787

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 2s 932us/step - accuracy: 0.7688 - loss: 0.7060 - val_accuracy: 0.8286 - val_loss: 0.5039

Epoch 2/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 15s 9ms/step - accuracy: 0.8438 - loss: 0.5279

  66/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 782us/step - accuracy: 0.8334 - loss: 0.4988

 133/1719 ━━━━━━━━━━━━━━━━━━━ 1s 767us/step - accuracy: 0.8246 - loss: 0.5143

 197/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 772us/step - accuracy: 0.8211 - loss: 0.5203

 265/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 766us/step - accuracy: 0.8200 - loss: 0.5214

 330/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 768us/step - accuracy: 0.8194 - loss: 0.5220

 397/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 766us/step - accuracy: 0.8192 - loss: 0.5220

 462/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 768us/step - accuracy: 0.8191 - loss: 0.5217

 527/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 769us/step - accuracy: 0.8192 - loss: 0.5212

 596/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 765us/step - accuracy: 0.8194 - loss: 0.5205

 662/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 764us/step - accuracy: 0.8198 - loss: 0.5197

 731/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 761us/step - accuracy: 0.8202 - loss: 0.5189

 797/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 761us/step - accuracy: 0.8206 - loss: 0.5181

 864/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 760us/step - accuracy: 0.8210 - loss: 0.5172

 927/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 763us/step - accuracy: 0.8214 - loss: 0.5164

 992/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 763us/step - accuracy: 0.8218 - loss: 0.5154

1055/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 766us/step - accuracy: 0.8221 - loss: 0.5144

1116/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 769us/step - accuracy: 0.8225 - loss: 0.5135

1177/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 772us/step - accuracy: 0.8229 - loss: 0.5126

1241/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 773us/step - accuracy: 0.8232 - loss: 0.5118

1304/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 774us/step - accuracy: 0.8235 - loss: 0.5110

1365/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 777us/step - accuracy: 0.8238 - loss: 0.5102

1426/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 779us/step - accuracy: 0.8241 - loss: 0.5094

1492/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 778us/step - accuracy: 0.8244 - loss: 0.5086

1554/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 780us/step - accuracy: 0.8247 - loss: 0.5078

1618/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 780us/step - accuracy: 0.8250 - loss: 0.5070

1681/1719 ━━━━━━━━━━━━━━━━━━━ 0s 781us/step - accuracy: 0.8253 - loss: 0.5063

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 840us/step - accuracy: 0.8318 - loss: 0.4866 - val_accuracy: 0.8394 - val_loss: 0.4554

Epoch 3/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 18s 11ms/step - accuracy: 0.7812 - loss: 0.4815

  61/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 835us/step - accuracy: 0.8499 - loss: 0.4312

 126/1719 ━━━━━━━━━━━━━━━━━━━ 1s 805us/step - accuracy: 0.8450 - loss: 0.4471

 187/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 811us/step - accuracy: 0.8423 - loss: 0.4556

 250/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 807us/step - accuracy: 0.8416 - loss: 0.4577

 313/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 806us/step - accuracy: 0.8408 - loss: 0.4592

 377/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 804us/step - accuracy: 0.8405 - loss: 0.4597

 442/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 799us/step - accuracy: 0.8402 - loss: 0.4602

 504/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 801us/step - accuracy: 0.8400 - loss: 0.4604

 570/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 797us/step - accuracy: 0.8401 - loss: 0.4604

 634/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 795us/step - accuracy: 0.8403 - loss: 0.4601

 698/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 795us/step - accuracy: 0.8406 - loss: 0.4597

 761/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 795us/step - accuracy: 0.8408 - loss: 0.4594

 821/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 798us/step - accuracy: 0.8411 - loss: 0.4591

 885/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 798us/step - accuracy: 0.8413 - loss: 0.4588

 947/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 799us/step - accuracy: 0.8415 - loss: 0.4583

1011/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 798us/step - accuracy: 0.8418 - loss: 0.4578

1077/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 796us/step - accuracy: 0.8421 - loss: 0.4572

1139/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 797us/step - accuracy: 0.8423 - loss: 0.4566

1203/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 796us/step - accuracy: 0.8425 - loss: 0.4561

1269/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 794us/step - accuracy: 0.8427 - loss: 0.4557

1338/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 791us/step - accuracy: 0.8429 - loss: 0.4552

1402/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 791us/step - accuracy: 0.8431 - loss: 0.4547

1471/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 788us/step - accuracy: 0.8433 - loss: 0.4541

1539/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 786us/step - accuracy: 0.8435 - loss: 0.4536

1610/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 783us/step - accuracy: 0.8437 - loss: 0.4531

1679/1719 ━━━━━━━━━━━━━━━━━━━ 0s 781us/step - accuracy: 0.8439 - loss: 0.4527

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 828us/step - accuracy: 0.8478 - loss: 0.4412 - val_accuracy: 0.8464 - val_loss: 0.4332

Epoch 4/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 15s 9ms/step - accuracy: 0.8125 - loss: 0.4550

  68/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 747us/step - accuracy: 0.8629 - loss: 0.3968

 138/1719 ━━━━━━━━━━━━━━━━━━━ 1s 731us/step - accuracy: 0.8567 - loss: 0.4140

 207/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 731us/step - accuracy: 0.8539 - loss: 0.4217

 278/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 727us/step - accuracy: 0.8529 - loss: 0.4241

 347/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.8521 - loss: 0.4255

 417/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 725us/step - accuracy: 0.8515 - loss: 0.4265

 488/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 723us/step - accuracy: 0.8512 - loss: 0.4270

 555/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 727us/step - accuracy: 0.8510 - loss: 0.4272

 625/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 726us/step - accuracy: 0.8511 - loss: 0.4271

 690/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 731us/step - accuracy: 0.8513 - loss: 0.4269

 751/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 739us/step - accuracy: 0.8515 - loss: 0.4267

 814/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 743us/step - accuracy: 0.8517 - loss: 0.4266

 878/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 747us/step - accuracy: 0.8519 - loss: 0.4264

 940/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 751us/step - accuracy: 0.8521 - loss: 0.4262

1004/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 753us/step - accuracy: 0.8523 - loss: 0.4258

1069/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 754us/step - accuracy: 0.8525 - loss: 0.4253

1134/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 756us/step - accuracy: 0.8527 - loss: 0.4249

1197/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 758us/step - accuracy: 0.8528 - loss: 0.4245

1259/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 760us/step - accuracy: 0.8530 - loss: 0.4242

1323/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 761us/step - accuracy: 0.8531 - loss: 0.4238

1387/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 762us/step - accuracy: 0.8533 - loss: 0.4234

1451/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 764us/step - accuracy: 0.8534 - loss: 0.4231

1513/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 766us/step - accuracy: 0.8536 - loss: 0.4227

1575/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 768us/step - accuracy: 0.8537 - loss: 0.4223

1638/1719 ━━━━━━━━━━━━━━━━━━━ 0s 769us/step - accuracy: 0.8538 - loss: 0.4220

1703/1719 ━━━━━━━━━━━━━━━━━━━ 0s 769us/step - accuracy: 0.8539 - loss: 0.4217

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 826us/step - accuracy: 0.8564 - loss: 0.4133 - val_accuracy: 0.8506 - val_loss: 0.4176

Epoch 5/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 16s 9ms/step - accuracy: 0.8125 - loss: 0.4357

  65/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 788us/step - accuracy: 0.8690 - loss: 0.3722

 131/1719 ━━━━━━━━━━━━━━━━━━━ 1s 778us/step - accuracy: 0.8631 - loss: 0.3883

 194/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 784us/step - accuracy: 0.8602 - loss: 0.3971

 257/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 787us/step - accuracy: 0.8595 - loss: 0.3998

 322/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 784us/step - accuracy: 0.8588 - loss: 0.4016

 388/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 780us/step - accuracy: 0.8585 - loss: 0.4028

 454/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 778us/step - accuracy: 0.8582 - loss: 0.4036

 518/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 780us/step - accuracy: 0.8580 - loss: 0.4040

 582/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 781us/step - accuracy: 0.8580 - loss: 0.4042

 650/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 777us/step - accuracy: 0.8581 - loss: 0.4041

 715/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 776us/step - accuracy: 0.8582 - loss: 0.4040

 781/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 775us/step - accuracy: 0.8584 - loss: 0.4038

 847/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 774us/step - accuracy: 0.8585 - loss: 0.4037

 911/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 775us/step - accuracy: 0.8586 - loss: 0.4036

 975/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 776us/step - accuracy: 0.8588 - loss: 0.4034

1043/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 773us/step - accuracy: 0.8590 - loss: 0.4030

1109/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 773us/step - accuracy: 0.8591 - loss: 0.4026

1175/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 772us/step - accuracy: 0.8593 - loss: 0.4022

1239/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 773us/step - accuracy: 0.8595 - loss: 0.4019

1302/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 775us/step - accuracy: 0.8596 - loss: 0.4017

1367/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 775us/step - accuracy: 0.8597 - loss: 0.4014

1430/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 776us/step - accuracy: 0.8599 - loss: 0.4010

1490/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 779us/step - accuracy: 0.8600 - loss: 0.4007

1551/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 780us/step - accuracy: 0.8602 - loss: 0.4005

1613/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 782us/step - accuracy: 0.8603 - loss: 0.4002

1673/1719 ━━━━━━━━━━━━━━━━━━━ 0s 784us/step - accuracy: 0.8604 - loss: 0.3999

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 840us/step - accuracy: 0.8628 - loss: 0.3930 - val_accuracy: 0.8528 - val_loss: 0.4054

Epoch 6/30


1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 850us/step - accuracy: 0.8681 - loss: 0.3768 - val_accuracy: 0.8572 - val_loss: 0.3949

Epoch 7/30


1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 845us/step - accuracy: 0.8717 - loss: 0.3634 - val_accuracy: 0.8602 - val_loss: 0.3871

Epoch 8/30


1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 840us/step - accuracy: 0.8756 - loss: 0.3519 - val_accuracy: 0.8634 - val_loss: 0.3798

Epoch 9/30


1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 855us/step - accuracy: 0.8790 - loss: 0.3415 - val_accuracy: 0.8638 - val_loss: 0.3742

Epoch 10/30


1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 829us/step - accuracy: 0.8815 - loss: 0.3323 - val_accuracy: 0.8658 - val_loss: 0.3689

Epoch 11/30


1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 854us/step - accuracy: 0.8844 - loss: 0.3239 - val_accuracy: 0.8678 - val_loss: 0.3639

Epoch 12/30


1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 780us/step - accuracy: 0.8867 - loss: 0.3162 - val_accuracy: 0.8704 - val_loss: 0.3605

Epoch 13/30


1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 867us/step - accuracy: 0.8889 - loss: 0.3091 - val_accuracy: 0.8706 - val_loss: 0.3587

Epoch 14/30


1719/1719 ━━━━━━━━━━━━━━━━━━━━ 2s 881us/step - accuracy: 0.8914 - loss: 0.3025 - val_accuracy: 0.8722 - val_loss: 0.3561

Epoch 15/30


1719/1719 ━━━━━━━━━━━━━━━━━━━━ 2s 877us/step - accuracy: 0.8938 - loss: 0.2963 - val_accuracy: 0.8718 - val_loss: 0.3548

Epoch 16/30


1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 857us/step - accuracy: 0.8957 - loss: 0.2904 - val_accuracy: 0.8724 - val_loss: 0.3524

Epoch 17/30


1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 851us/step - accuracy: 0.8981 - loss: 0.2849 - val_accuracy: 0.8724 - val_loss: 0.3508

Epoch 18/30


1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 847us/step - accuracy: 0.9003 - loss: 0.2795 - val_accuracy: 0.8730 - val_loss: 0.3506

Epoch 19/30


1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 826us/step - accuracy: 0.9021 - loss: 0.2744 - val_accuracy: 0.8734 - val_loss: 0.3496

Epoch 20/30


1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 853us/step - accuracy: 0.9034 - loss: 0.2697 - val_accuracy: 0.8724 - val_loss: 0.3506

Epoch 21/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 16s 10ms/step - accuracy: 0.8750 - loss: 0.2771

  64/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 804us/step - accuracy: 0.9087 - loss: 0.2353

 130/1719 ━━━━━━━━━━━━━━━━━━━ 1s 785us/step - accuracy: 0.9051 - loss: 0.2471

 195/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 781us/step - accuracy: 0.9033 - loss: 0.2556

 262/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 773us/step - accuracy: 0.9027 - loss: 0.2589

 329/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 770us/step - accuracy: 0.9024 - loss: 0.2608

 394/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 771us/step - accuracy: 0.9023 - loss: 0.2625

 460/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 769us/step - accuracy: 0.9023 - loss: 0.2637

 528/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 767us/step - accuracy: 0.9023 - loss: 0.2646

 594/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 767us/step - accuracy: 0.9025 - loss: 0.2650

 659/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 767us/step - accuracy: 0.9027 - loss: 0.2652

 726/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 766us/step - accuracy: 0.9029 - loss: 0.2653

 794/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 764us/step - accuracy: 0.9030 - loss: 0.2656

 859/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 765us/step - accuracy: 0.9031 - loss: 0.2659

 925/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 765us/step - accuracy: 0.9032 - loss: 0.2661

 992/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 764us/step - accuracy: 0.9034 - loss: 0.2661

1058/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 765us/step - accuracy: 0.9035 - loss: 0.2661

1118/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 769us/step - accuracy: 0.9036 - loss: 0.2660

1180/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 771us/step - accuracy: 0.9037 - loss: 0.2660

1243/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 773us/step - accuracy: 0.9038 - loss: 0.2659

1306/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 774us/step - accuracy: 0.9039 - loss: 0.2659

1368/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 776us/step - accuracy: 0.9040 - loss: 0.2658

1432/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 776us/step - accuracy: 0.9041 - loss: 0.2657

1493/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 778us/step - accuracy: 0.9041 - loss: 0.2657

1552/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 781us/step - accuracy: 0.9042 - loss: 0.2656

1614/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 782us/step - accuracy: 0.9043 - loss: 0.2656

1676/1719 ━━━━━━━━━━━━━━━━━━━ 0s 784us/step - accuracy: 0.9043 - loss: 0.2656

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 840us/step - accuracy: 0.9054 - loss: 0.2651 - val_accuracy: 0.8736 - val_loss: 0.3502

Epoch 22/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 16s 10ms/step - accuracy: 0.8750 - loss: 0.2654

  62/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 824us/step - accuracy: 0.9105 - loss: 0.2299

 124/1719 ━━━━━━━━━━━━━━━━━━━ 1s 818us/step - accuracy: 0.9080 - loss: 0.2410

 187/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 811us/step - accuracy: 0.9060 - loss: 0.2502

 248/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 816us/step - accuracy: 0.9053 - loss: 0.2536

 312/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 809us/step - accuracy: 0.9048 - loss: 0.2556

 374/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 809us/step - accuracy: 0.9046 - loss: 0.2572

 439/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 803us/step - accuracy: 0.9044 - loss: 0.2587

 499/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 808us/step - accuracy: 0.9044 - loss: 0.2596

 560/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 809us/step - accuracy: 0.9045 - loss: 0.2602

 623/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 809us/step - accuracy: 0.9046 - loss: 0.2604

 686/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 808us/step - accuracy: 0.9048 - loss: 0.2606

 748/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 808us/step - accuracy: 0.9049 - loss: 0.2608

 808/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 810us/step - accuracy: 0.9050 - loss: 0.2610

 872/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 809us/step - accuracy: 0.9050 - loss: 0.2613

 936/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 807us/step - accuracy: 0.9051 - loss: 0.2614

1000/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 806us/step - accuracy: 0.9052 - loss: 0.2615

1066/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 804us/step - accuracy: 0.9053 - loss: 0.2615

1126/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 805us/step - accuracy: 0.9054 - loss: 0.2614

1189/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 805us/step - accuracy: 0.9055 - loss: 0.2613

1252/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 805us/step - accuracy: 0.9055 - loss: 0.2613

1316/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 804us/step - accuracy: 0.9056 - loss: 0.2613

1379/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 804us/step - accuracy: 0.9057 - loss: 0.2612

1442/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.9058 - loss: 0.2611

1506/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.9059 - loss: 0.2611

1569/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.9059 - loss: 0.2610

1633/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 802us/step - accuracy: 0.9060 - loss: 0.2610

1695/1719 ━━━━━━━━━━━━━━━━━━━ 0s 802us/step - accuracy: 0.9060 - loss: 0.2610

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 858us/step - accuracy: 0.9068 - loss: 0.2605 - val_accuracy: 0.8742 - val_loss: 0.3502

Epoch 23/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 15s 9ms/step - accuracy: 0.8750 - loss: 0.2590

  69/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 746us/step - accuracy: 0.9109 - loss: 0.2265

 140/1719 ━━━━━━━━━━━━━━━━━━━ 1s 728us/step - accuracy: 0.9086 - loss: 0.2394

 205/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 742us/step - accuracy: 0.9071 - loss: 0.2467

 265/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 763us/step - accuracy: 0.9067 - loss: 0.2496

 329/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 767us/step - accuracy: 0.9064 - loss: 0.2514

 392/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 773us/step - accuracy: 0.9061 - loss: 0.2531

 456/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 776us/step - accuracy: 0.9060 - loss: 0.2544

 519/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 778us/step - accuracy: 0.9060 - loss: 0.2552

 584/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 778us/step - accuracy: 0.9061 - loss: 0.2557

 645/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 782us/step - accuracy: 0.9062 - loss: 0.2559

 709/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 783us/step - accuracy: 0.9064 - loss: 0.2561

 770/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 786us/step - accuracy: 0.9065 - loss: 0.2563

 832/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 789us/step - accuracy: 0.9065 - loss: 0.2565

 896/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 789us/step - accuracy: 0.9066 - loss: 0.2568

 958/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 790us/step - accuracy: 0.9066 - loss: 0.2569

1017/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 794us/step - accuracy: 0.9067 - loss: 0.2569

1079/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 795us/step - accuracy: 0.9068 - loss: 0.2569

1145/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 793us/step - accuracy: 0.9069 - loss: 0.2568

1207/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 794us/step - accuracy: 0.9070 - loss: 0.2568

1271/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 794us/step - accuracy: 0.9071 - loss: 0.2568

1336/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 793us/step - accuracy: 0.9071 - loss: 0.2567

1399/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 794us/step - accuracy: 0.9072 - loss: 0.2566

1458/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 796us/step - accuracy: 0.9073 - loss: 0.2566

1521/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 796us/step - accuracy: 0.9074 - loss: 0.2565

1579/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 799us/step - accuracy: 0.9074 - loss: 0.2565

1643/1719 ━━━━━━━━━━━━━━━━━━━ 0s 798us/step - accuracy: 0.9074 - loss: 0.2564

1706/1719 ━━━━━━━━━━━━━━━━━━━ 0s 798us/step - accuracy: 0.9075 - loss: 0.2564

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 855us/step - accuracy: 0.9082 - loss: 0.2561 - val_accuracy: 0.8732 - val_loss: 0.3488

Epoch 24/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 16s 10ms/step - accuracy: 0.9062 - loss: 0.2529

  65/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 789us/step - accuracy: 0.9154 - loss: 0.2212

 127/1719 ━━━━━━━━━━━━━━━━━━━ 1s 801us/step - accuracy: 0.9118 - loss: 0.2324

 192/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 792us/step - accuracy: 0.9096 - loss: 0.2413

 256/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 791us/step - accuracy: 0.9088 - loss: 0.2448

 317/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 798us/step - accuracy: 0.9083 - loss: 0.2466

 380/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 798us/step - accuracy: 0.9081 - loss: 0.2483

 443/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 798us/step - accuracy: 0.9079 - loss: 0.2498

 507/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 796us/step - accuracy: 0.9078 - loss: 0.2507

 571/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 795us/step - accuracy: 0.9078 - loss: 0.2513

 634/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 797us/step - accuracy: 0.9079 - loss: 0.2515

 699/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 795us/step - accuracy: 0.9081 - loss: 0.2517

 763/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 794us/step - accuracy: 0.9082 - loss: 0.2519

 830/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 791us/step - accuracy: 0.9082 - loss: 0.2521

 893/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 791us/step - accuracy: 0.9082 - loss: 0.2525

 957/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 791us/step - accuracy: 0.9083 - loss: 0.2526

1021/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 791us/step - accuracy: 0.9084 - loss: 0.2526

1089/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 788us/step - accuracy: 0.9085 - loss: 0.2526

1158/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 785us/step - accuracy: 0.9085 - loss: 0.2525

1228/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 781us/step - accuracy: 0.9086 - loss: 0.2524

1282/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 787us/step - accuracy: 0.9087 - loss: 0.2524

1334/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 794us/step - accuracy: 0.9087 - loss: 0.2524

1391/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 798us/step - accuracy: 0.9088 - loss: 0.2523

1450/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 800us/step - accuracy: 0.9089 - loss: 0.2523

1511/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 801us/step - accuracy: 0.9089 - loss: 0.2522

1570/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.9090 - loss: 0.2522

1632/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 804us/step - accuracy: 0.9090 - loss: 0.2521

1692/1719 ━━━━━━━━━━━━━━━━━━━ 0s 805us/step - accuracy: 0.9090 - loss: 0.2521

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 862us/step - accuracy: 0.9097 - loss: 0.2518 - val_accuracy: 0.8742 - val_loss: 0.3500

Epoch 25/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 17s 10ms/step - accuracy: 0.9062 - loss: 0.2469

  62/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 825us/step - accuracy: 0.9174 - loss: 0.2164

 124/1719 ━━━━━━━━━━━━━━━━━━━ 1s 817us/step - accuracy: 0.9138 - loss: 0.2273

 188/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 807us/step - accuracy: 0.9115 - loss: 0.2364

 250/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 807us/step - accuracy: 0.9106 - loss: 0.2399

 311/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 811us/step - accuracy: 0.9101 - loss: 0.2419

 372/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 813us/step - accuracy: 0.9098 - loss: 0.2435

 431/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 819us/step - accuracy: 0.9096 - loss: 0.2449

 491/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 821us/step - accuracy: 0.9094 - loss: 0.2459

 552/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 821us/step - accuracy: 0.9094 - loss: 0.2466

 612/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 823us/step - accuracy: 0.9095 - loss: 0.2469

 673/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 824us/step - accuracy: 0.9096 - loss: 0.2471

 737/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 820us/step - accuracy: 0.9097 - loss: 0.2473

 802/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 817us/step - accuracy: 0.9098 - loss: 0.2476

 864/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 816us/step - accuracy: 0.9098 - loss: 0.2478

 924/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 818us/step - accuracy: 0.9098 - loss: 0.2481

 985/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 818us/step - accuracy: 0.9099 - loss: 0.2481

1047/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 817us/step - accuracy: 0.9100 - loss: 0.2482

1109/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 817us/step - accuracy: 0.9100 - loss: 0.2481

1169/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 818us/step - accuracy: 0.9101 - loss: 0.2481

1232/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 817us/step - accuracy: 0.9102 - loss: 0.2480

1296/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 816us/step - accuracy: 0.9102 - loss: 0.2480

1358/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 816us/step - accuracy: 0.9103 - loss: 0.2480

1419/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 816us/step - accuracy: 0.9104 - loss: 0.2479

1482/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.9104 - loss: 0.2479

1545/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.9105 - loss: 0.2478

1606/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.9105 - loss: 0.2478

1670/1719 ━━━━━━━━━━━━━━━━━━━ 0s 814us/step - accuracy: 0.9105 - loss: 0.2478

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 2s 868us/step - accuracy: 0.9110 - loss: 0.2477 - val_accuracy: 0.8746 - val_loss: 0.3500

Epoch 26/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 15s 9ms/step - accuracy: 0.9062 - loss: 0.2413

  62/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 829us/step - accuracy: 0.9205 - loss: 0.2119

 124/1719 ━━━━━━━━━━━━━━━━━━━ 1s 819us/step - accuracy: 0.9170 - loss: 0.2230

 189/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 804us/step - accuracy: 0.9146 - loss: 0.2322

 250/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 811us/step - accuracy: 0.9135 - loss: 0.2357

 312/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 811us/step - accuracy: 0.9128 - loss: 0.2377

 374/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 811us/step - accuracy: 0.9123 - loss: 0.2394

 435/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 813us/step - accuracy: 0.9120 - loss: 0.2409

 499/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 811us/step - accuracy: 0.9119 - loss: 0.2419

 563/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 808us/step - accuracy: 0.9118 - loss: 0.2426

 628/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 805us/step - accuracy: 0.9119 - loss: 0.2429

 691/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 805us/step - accuracy: 0.9120 - loss: 0.2431

 754/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 804us/step - accuracy: 0.9120 - loss: 0.2433

 814/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 807us/step - accuracy: 0.9120 - loss: 0.2436

 873/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 809us/step - accuracy: 0.9120 - loss: 0.2439

 936/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 808us/step - accuracy: 0.9120 - loss: 0.2441

1000/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 807us/step - accuracy: 0.9121 - loss: 0.2441

1064/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 806us/step - accuracy: 0.9121 - loss: 0.2441

1129/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 804us/step - accuracy: 0.9122 - loss: 0.2441

1192/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 804us/step - accuracy: 0.9123 - loss: 0.2440

1256/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.9123 - loss: 0.2440

1320/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 802us/step - accuracy: 0.9124 - loss: 0.2440

1383/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 802us/step - accuracy: 0.9124 - loss: 0.2439

1446/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 802us/step - accuracy: 0.9125 - loss: 0.2439

1508/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 802us/step - accuracy: 0.9125 - loss: 0.2438

1570/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.9125 - loss: 0.2438

1632/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 803us/step - accuracy: 0.9125 - loss: 0.2438

1694/1719 ━━━━━━━━━━━━━━━━━━━ 0s 804us/step - accuracy: 0.9125 - loss: 0.2438

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 859us/step - accuracy: 0.9126 - loss: 0.2437 - val_accuracy: 0.8732 - val_loss: 0.3503

Epoch 27/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 17s 10ms/step - accuracy: 0.9062 - loss: 0.2410

  59/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 874us/step - accuracy: 0.9210 - loss: 0.2081

 121/1719 ━━━━━━━━━━━━━━━━━━━ 1s 839us/step - accuracy: 0.9178 - loss: 0.2190

 181/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 838us/step - accuracy: 0.9153 - loss: 0.2280

 241/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 838us/step - accuracy: 0.9144 - loss: 0.2317

 306/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 825us/step - accuracy: 0.9139 - loss: 0.2340

 369/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 820us/step - accuracy: 0.9135 - loss: 0.2356

 431/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 819us/step - accuracy: 0.9132 - loss: 0.2371

 492/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 820us/step - accuracy: 0.9131 - loss: 0.2381

 553/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 820us/step - accuracy: 0.9131 - loss: 0.2388

 614/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 821us/step - accuracy: 0.9132 - loss: 0.2391

 676/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 820us/step - accuracy: 0.9132 - loss: 0.2393

 737/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 821us/step - accuracy: 0.9133 - loss: 0.2395

 797/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 822us/step - accuracy: 0.9133 - loss: 0.2397

 859/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 821us/step - accuracy: 0.9133 - loss: 0.2400

 924/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 818us/step - accuracy: 0.9133 - loss: 0.2402

 985/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 819us/step - accuracy: 0.9133 - loss: 0.2403

1044/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 821us/step - accuracy: 0.9134 - loss: 0.2403

1107/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 819us/step - accuracy: 0.9135 - loss: 0.2403

1169/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 819us/step - accuracy: 0.9135 - loss: 0.2402

1236/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.9136 - loss: 0.2402

1301/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 813us/step - accuracy: 0.9136 - loss: 0.2402

1366/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 811us/step - accuracy: 0.9137 - loss: 0.2401

1432/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 808us/step - accuracy: 0.9138 - loss: 0.2400

1494/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 808us/step - accuracy: 0.9138 - loss: 0.2400

1559/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 807us/step - accuracy: 0.9139 - loss: 0.2400

1624/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 806us/step - accuracy: 0.9139 - loss: 0.2399

1689/1719 ━━━━━━━━━━━━━━━━━━━ 0s 805us/step - accuracy: 0.9139 - loss: 0.2399

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 855us/step - accuracy: 0.9142 - loss: 0.2398 - val_accuracy: 0.8732 - val_loss: 0.3511

Epoch 28/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 15s 9ms/step - accuracy: 0.9062 - loss: 0.2342

  63/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 821us/step - accuracy: 0.9249 - loss: 0.2049

 128/1719 ━━━━━━━━━━━━━━━━━━━ 1s 799us/step - accuracy: 0.9213 - loss: 0.2162

 187/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 818us/step - accuracy: 0.9188 - loss: 0.2243

 246/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 826us/step - accuracy: 0.9177 - loss: 0.2277

 303/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 837us/step - accuracy: 0.9170 - loss: 0.2296

 358/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 849us/step - accuracy: 0.9165 - loss: 0.2311

 418/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 848us/step - accuracy: 0.9160 - loss: 0.2326

 481/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 842us/step - accuracy: 0.9158 - loss: 0.2337

 542/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 840us/step - accuracy: 0.9156 - loss: 0.2345

 597/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 846us/step - accuracy: 0.9156 - loss: 0.2348

 654/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 849us/step - accuracy: 0.9156 - loss: 0.2350

 709/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 855us/step - accuracy: 0.9156 - loss: 0.2352

 760/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 863us/step - accuracy: 0.9156 - loss: 0.2354

 817/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 865us/step - accuracy: 0.9156 - loss: 0.2356

 875/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 865us/step - accuracy: 0.9155 - loss: 0.2359

 933/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 865us/step - accuracy: 0.9155 - loss: 0.2361

 993/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 864us/step - accuracy: 0.9155 - loss: 0.2362

1053/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 863us/step - accuracy: 0.9156 - loss: 0.2362

1110/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 864us/step - accuracy: 0.9156 - loss: 0.2362

1161/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 869us/step - accuracy: 0.9156 - loss: 0.2361

1223/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 867us/step - accuracy: 0.9157 - loss: 0.2361

1284/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 865us/step - accuracy: 0.9157 - loss: 0.2361

1343/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 864us/step - accuracy: 0.9157 - loss: 0.2361

1404/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 862us/step - accuracy: 0.9158 - loss: 0.2360

1465/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 860us/step - accuracy: 0.9158 - loss: 0.2360

1526/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 859us/step - accuracy: 0.9159 - loss: 0.2360

1587/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 858us/step - accuracy: 0.9159 - loss: 0.2359

1650/1719 ━━━━━━━━━━━━━━━━━━━ 0s 856us/step - accuracy: 0.9159 - loss: 0.2359

1710/1719 ━━━━━━━━━━━━━━━━━━━ 0s 856us/step - accuracy: 0.9159 - loss: 0.2359

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 2s 915us/step - accuracy: 0.9156 - loss: 0.2359 - val_accuracy: 0.8752 - val_loss: 0.3513

Epoch 29/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 17s 10ms/step - accuracy: 0.9375 - loss: 0.2317

  58/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 887us/step - accuracy: 0.9283 - loss: 0.1999

 108/1719 ━━━━━━━━━━━━━━━━━━━ 1s 943us/step - accuracy: 0.9250 - loss: 0.2087

 167/1719 ━━━━━━━━━━━━━━━━━━━ 1s 913us/step - accuracy: 0.9218 - loss: 0.2186

 229/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 886us/step - accuracy: 0.9202 - loss: 0.2231

 294/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 861us/step - accuracy: 0.9192 - loss: 0.2255

 359/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 846us/step - accuracy: 0.9184 - loss: 0.2272

 419/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 845us/step - accuracy: 0.9179 - loss: 0.2288

 478/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 846us/step - accuracy: 0.9176 - loss: 0.2298

 537/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 847us/step - accuracy: 0.9174 - loss: 0.2305

 596/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 848us/step - accuracy: 0.9173 - loss: 0.2309

 657/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 846us/step - accuracy: 0.9173 - loss: 0.2311

 717/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 845us/step - accuracy: 0.9172 - loss: 0.2313

 776/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 846us/step - accuracy: 0.9172 - loss: 0.2316

 833/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 849us/step - accuracy: 0.9171 - loss: 0.2318

 893/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 848us/step - accuracy: 0.9170 - loss: 0.2321

 952/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 848us/step - accuracy: 0.9170 - loss: 0.2323

1012/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 847us/step - accuracy: 0.9170 - loss: 0.2323

1071/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 848us/step - accuracy: 0.9170 - loss: 0.2323

1130/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 848us/step - accuracy: 0.9170 - loss: 0.2323

1188/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 849us/step - accuracy: 0.9170 - loss: 0.2323

1242/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 853us/step - accuracy: 0.9170 - loss: 0.2323

1298/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 855us/step - accuracy: 0.9170 - loss: 0.2322

1357/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 855us/step - accuracy: 0.9171 - loss: 0.2322

1417/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 855us/step - accuracy: 0.9171 - loss: 0.2322

1477/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 854us/step - accuracy: 0.9171 - loss: 0.2321

1536/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 855us/step - accuracy: 0.9172 - loss: 0.2321

1596/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 854us/step - accuracy: 0.9172 - loss: 0.2321

1654/1719 ━━━━━━━━━━━━━━━━━━━ 0s 855us/step - accuracy: 0.9172 - loss: 0.2321

1714/1719 ━━━━━━━━━━━━━━━━━━━ 0s 854us/step - accuracy: 0.9172 - loss: 0.2321

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 2s 913us/step - accuracy: 0.9169 - loss: 0.2320 - val_accuracy: 0.8758 - val_loss: 0.3530

Epoch 30/30


   1/1719 ━━━━━━━━━━━━━━━━━━━━ 17s 10ms/step - accuracy: 0.9375 - loss: 0.2264

  58/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 883us/step - accuracy: 0.9341 - loss: 0.1969

 119/1719 ━━━━━━━━━━━━━━━━━━━ 1s 854us/step - accuracy: 0.9289 - loss: 0.2072

 181/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 838us/step - accuracy: 0.9251 - loss: 0.2164

 243/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 834us/step - accuracy: 0.9234 - loss: 0.2201

 301/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 840us/step - accuracy: 0.9223 - loss: 0.2220

 361/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 841us/step - accuracy: 0.9213 - loss: 0.2236

 424/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 835us/step - accuracy: 0.9206 - loss: 0.2252

 485/1719 ━━━━━━━━━━━━━━━━━━━━ 1s 834us/step - accuracy: 0.9201 - loss: 0.2262

 547/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 832us/step - accuracy: 0.9198 - loss: 0.2270

 609/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 830us/step - accuracy: 0.9197 - loss: 0.2273

 666/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 835us/step - accuracy: 0.9196 - loss: 0.2275

 727/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 835us/step - accuracy: 0.9195 - loss: 0.2277

 788/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 834us/step - accuracy: 0.9194 - loss: 0.2280

 851/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 832us/step - accuracy: 0.9193 - loss: 0.2283

 914/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 830us/step - accuracy: 0.9193 - loss: 0.2286

 978/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 827us/step - accuracy: 0.9192 - loss: 0.2287

1040/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 826us/step - accuracy: 0.9192 - loss: 0.2287

1104/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 823us/step - accuracy: 0.9192 - loss: 0.2287

1165/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 824us/step - accuracy: 0.9192 - loss: 0.2287

1228/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 822us/step - accuracy: 0.9192 - loss: 0.2286

1292/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 821us/step - accuracy: 0.9192 - loss: 0.2286

1357/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 819us/step - accuracy: 0.9192 - loss: 0.2286

1420/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 818us/step - accuracy: 0.9192 - loss: 0.2285

1483/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 817us/step - accuracy: 0.9192 - loss: 0.2285

1547/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 815us/step - accuracy: 0.9192 - loss: 0.2285

1589/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 829us/step - accuracy: 0.9192 - loss: 0.2285

1628/1719 ━━━━━━━━━━━━━━━━━━━━ 0s 840us/step - accuracy: 0.9192 - loss: 0.2285

1690/1719 ━━━━━━━━━━━━━━━━━━━ 0s 839us/step - accuracy: 0.9191 - loss: 0.2285

1719/1719 ━━━━━━━━━━━━━━━━━━━━ 2s 902us/step - accuracy: 0.9183 - loss: 0.2285 - val_accuracy: 0.8742 - val_loss: 0.3546

Visualization

import pandas as pd
import matplotlib.pyplot as plt

pd.DataFrame(history.history).plot(
    figsize=(8, 5), xlim=[0, 29], ylim=[0, 1], grid=True, xlabel="Epoch",
    style=["r--", "r--.", "b-", "b-*"])
plt.legend(loc="lower left")  # extra code
plt.show()

Visualization

Evaluating the model on the test set

model.evaluate(X_test, y_test)
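Here, `evaluate` returns the loss followed by the metrics passed to `compile`; the reported accuracy is simply the fraction of correctly classified test examples. A minimal pure-Python sketch of that computation (the labels below are made up for illustration):

```python
# Accuracy as the fraction of correct predictions (illustrative labels).
y_true = [9, 2, 1, 1, 6, 1, 4, 6]
y_pred = [9, 2, 1, 1, 6, 1, 4, 0]  # the last prediction is wrong

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
print(accuracy)  # 0.875
```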

Making predictions

X_new = X_test[:3]
y_proba = model.predict(X_new)
y_proba.round(2)
y_pred = y_proba.argmax(axis=-1)
y_pred
array([9, 2, 1])
y_new = y_test[:3]
y_new
array([9, 2, 1], dtype=uint8)
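The `argmax(axis=-1)` call above selects, for each example, the class with the highest predicted probability. A minimal pure-Python sketch of that step (the probability values are made up for illustration):

```python
# Convert class probabilities into class predictions (made-up values).
y_proba = [
    [0.00, 0.00, 0.01, 0.00, 0.00, 0.02, 0.00, 0.01, 0.00, 0.96],  # class 9
    [0.01, 0.00, 0.88, 0.02, 0.05, 0.00, 0.04, 0.00, 0.00, 0.00],  # class 2
    [0.00, 0.97, 0.01, 0.01, 0.00, 0.00, 0.01, 0.00, 0.00, 0.00],  # class 1
]

def argmax(row):
    """Index of the largest value in a list."""
    return max(range(len(row)), key=row.__getitem__)

y_pred = [argmax(row) for row in y_proba]
print(y_pred)  # [9, 2, 1]
```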

Predictions vs. Observations

plt.figure(figsize=(7.2, 2.4))
for index, image in enumerate(X_new):
    plt.subplot(1, 3, index + 1)
    plt.imshow(image, cmap="binary", interpolation="nearest")
    plt.axis('off')
    plt.title(class_names[y_test[index]])
plt.subplots_adjust(wspace=0.2, hspace=0.5)
plt.show()

np.array(class_names)[y_pred]
array(['Botte', 'Pull', 'Pantalon'], dtype='<U11')

Performance on the test set

from sklearn.metrics import classification_report

y_proba = model.predict(X_test)
y_pred = y_proba.argmax(axis=-1)

Performance on the test set

print(classification_report(y_test, y_pred))
              precision    recall  f1-score   support

           0       0.85      0.80      0.82      1000
           1       0.99      0.97      0.98      1000
           2       0.77      0.82      0.79      1000
           3       0.80      0.93      0.86      1000
           4       0.79      0.81      0.80      1000
           5       0.85      0.99      0.91      1000
           6       0.76      0.62      0.68      1000
           7       0.95      0.88      0.91      1000
           8       0.96      0.97      0.96      1000
           9       0.99      0.91      0.95      1000

    accuracy                           0.87     10000
   macro avg       0.87      0.87      0.87     10000
weighted avg       0.87      0.87      0.87     10000
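Each row of the report above follows from per-class counts of true positives (TP), false positives (FP), and false negatives (FN). A minimal sketch of the formulas, with made-up counts:

```python
# Precision, recall and F1 for one class, from made-up counts.
tp, fp, fn = 80, 20, 20

precision = tp / (tp + fp)  # fraction of predicted positives that are correct
recall = tp / (tp + fn)     # fraction of actual positives that are found
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean

print(round(precision, 2), round(recall, 2), round(f1, 2))
```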

Prologue

Summary

  • Introduction to neural networks and connectionism
    • The shift from symbolic AI to connectionist approaches in artificial intelligence.
    • Inspiration drawn from biological neural networks and the structure of the human brain.
  • Computations with neurodes and threshold logic units
    • Early neuron models (neurodes) capable of performing logical operations (AND, OR, NOT).
    • Limitations of simple perceptrons on problems that are not linearly separable, such as XOR.
  • Multilayer perceptrons (MLPs) and feedforward neural networks (FNNs)
    • Overcoming the limitations of perceptrons by introducing hidden layers.
    • Structure and flow of information in feedforward neural networks.
    • Explanation of forward-pass computations in neural networks.
  • Activation functions in neural networks
    • Importance of nonlinear activation functions (sigmoid, tanh, ReLU) for learning complex patterns.
    • Role of activation functions in backpropagation and gradient-descent optimization.
    • The universal approximation theorem and its implications for neural networks.
  • Deep learning frameworks
    • Overview of PyTorch and TensorFlow as leading deep learning platforms.
    • Introduction to Keras as a high-level API for building and training neural networks.
    • Discussion of how well different frameworks suit research and industrial applications.
  • Hands-on implementation with Keras
    • Loading and exploring the Fashion-MNIST dataset.
    • Building a neural network model with the Keras Sequential API.
    • Compiling the model with loss functions and optimizers suited to multiclass classification.
    • Training the model and visualizing training and validation metrics across epochs.
  • Evaluating model performance on the test set
    • Assessing the model's performance on the Fashion-MNIST test set.
    • Interpreting the results obtained after training.
  • Making predictions and interpreting results
    • Using the trained model to make predictions on new data.
    • Visualizing predictions alongside the actual images and labels.
    • Understanding output probabilities and class assignments in the context of the dataset.
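The forward-pass computations mentioned above can be sketched for a single dense layer with a ReLU activation; a minimal pure-Python illustration with made-up weights and inputs:

```python
def dense(x, W, b, activation):
    """One dense layer: z = W x + b, then elementwise activation."""
    z = [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i
         for row, b_i in zip(W, b)]
    return [activation(z_i) for z_i in z]

def relu(z):
    """Rectified linear unit: max(0, z)."""
    return max(0.0, z)

# Tiny layer: 3 inputs -> 2 units (made-up parameters).
x = [1.0, -2.0, 0.5]
W = [[0.2, -0.1, 0.4],   # weights of unit 1
     [-0.3, 0.8, 0.1]]   # weights of unit 2
b = [0.1, -0.2]

print(dense(x, W, b, relu))  # approximately [0.7, 0.0]
```

A full feedforward network simply chains such layers, with a softmax on the final layer for multiclass outputs.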

3Blue1Brown

Next lecture

  • We will discuss the algorithm used to train artificial neural networks.

References

Cybenko, George V. 1989. "Approximation by superpositions of a sigmoidal function". Mathematics of Control, Signals and Systems 2: 303-14. https://api.semanticscholar.org/CorpusID:3958369.
Géron, Aurélien. 2022. Hands-on Machine Learning with Scikit-Learn, Keras, and TensorFlow. 3rd ed. O'Reilly Media, Inc.
Goodfellow, Ian, Yoshua Bengio, and Aaron Courville. 2016. Deep Learning. Adaptive Computation and Machine Learning. MIT Press. https://dblp.org/rec/books/daglib/0040158.
Hornik, Kurt, Maxwell Stinchcombe, and Halbert White. 1989. "Multilayer feedforward networks are universal approximators". Neural Networks 2 (5): 359-66. https://doi.org/10.1016/0893-6080(89)90020-8.
LeCun, Yann, Yoshua Bengio, and Geoffrey Hinton. 2015. "Deep learning". Nature 521 (7553): 436-44. https://doi.org/10.1038/nature14539.
LeNail, Alexander. 2019. "NN-SVG: Publication-Ready Neural Network Architecture Schematics". Journal of Open Source Software 4 (33): 747. https://doi.org/10.21105/joss.00747.
McCulloch, Warren S, and Walter Pitts. 1943. "A logical calculus of the ideas immanent in nervous activity". The Bulletin of Mathematical Biophysics 5 (4): 115-33. https://doi.org/10.1007/bf02478259.
Minsky, Marvin, and Seymour Papert. 1969. Perceptrons: An Introduction to Computational Geometry. Cambridge, MA, USA: MIT Press.
Rosenblatt, F. 1958. "The perceptron: A probabilistic model for information storage and organization in the brain." Psychological Review 65 (6): 386-408. https://doi.org/10.1037/h0042519.
Russell, Stuart, and Peter Norvig. 2020. Artificial Intelligence: A Modern Approach. 4th ed. Pearson. http://aima.cs.berkeley.edu/.

Marcel Turcotte

Marcel.Turcotte@uOttawa.ca

École de science informatique et de génie électrique (SIGE)

Université d’Ottawa