{ "cells": [ { "cell_type": "markdown", "id": "bc21e62b", "metadata": {}, "source": [ "## Глубокое обучение" ] }, { "cell_type": "markdown", "id": "07e007fa", "metadata": {}, "source": [ "#### Инициализация Keras\n", "\n", "В качестве бэкенда используется jax\n", "\n", "Бэкенд должен указываться до импорта keras" ] }, { "cell_type": "code", "execution_count": 1, "id": "d88eddc1", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "3.9.2\n" ] } ], "source": [ "import os\n", "\n", "os.environ[\"KERAS_BACKEND\"] = \"jax\"\n", "import keras\n", "\n", "print(keras.__version__)" ] }, { "cell_type": "markdown", "id": "e725c3e2", "metadata": {}, "source": [ "### Классификация" ] }, { "cell_type": "markdown", "id": "60b2018b", "metadata": {}, "source": [ "#### Загрузка набора данных для задачи классификации\n", "\n", "База данных MNIST (сокращение от \"Modified National Institute of Standards and Technology\") — объёмная база данных образцов рукописного написания цифр. База данных является стандартом, предложенным Национальным институтом стандартов и технологий США с целью обучения и сопоставления методов распознавания изображений с помощью машинного обучения в первую очередь на основе нейронных сетей. Данные состоят из заранее подготовленных примеров изображений, на основе которых проводится обучение и тестирование систем.\n", "\n", "База данных MNIST содержит 60000 изображений для обучения и 10000 изображений для тестирования." 
] }, { "cell_type": "code", "execution_count": 2, "id": "e7a770c7", "metadata": {}, "outputs": [], "source": [ "from keras.api.datasets import mnist\n", "\n", "(X_train, y_train), (X_valid, y_valid) = mnist.load_data()" ] }, { "cell_type": "markdown", "id": "a2231ce2", "metadata": {}, "source": [ "#### Отображение данных\n", "\n", "Образцы из набора прошли сглаживание и приведены к серому полутоновому изображению размером 28x28 пикселей.\n", "\n", "Под каждым изображением представлено соответствующее ему значение целевого признака (класс)." ] }, { "cell_type": "code", "execution_count": 3, "id": "3cfd2770", "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAeoAAAHLCAYAAAAHndupAAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjEsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvc2/+5QAAAAlwSFlzAAAPYQAAD2EBqD+naQAALblJREFUeJzt3QucTfX+//HvjEtyG40UwlC5NiGXRJIySIjqSHNc5ygdRdThSCn6ISr1OxhKN35F5FHhVEdxcsspDsl5NKEfupgxNE5uQxqXmf9j7fP/7bM/32Ov2XvP3rO/a63X8/HwaL2t2Xt/8501n1nru77flVBUVFSkAACAkRLj3QAAABAchRoAAINRqAEAMBiFGgAAg1GoAQAwGIUaAACDUagBADAYhRoAAIOVDeWLCgsLVW5urqpSpYpKSEiIfasQEWvtmvz8fFW7dm2VmBje72D0sfv72EI/OwPHsvsVhdHHIRVqq9Pr1q0brfYhxrKzs1WdOnXCeg197P4+ttDPzsKx7H6h9HFIv6pZv5nBOSLpL/rYWSLtL/rZWTiW3S+U/gqpUHP5xFki6S/62Fki7S/62Vk4lt0vlP7iZjIAAAxGoQYAwGAUagAADEahBgDAYBRqAAAMRqEGAMBgFGoAAAxGoQYAwGAUagAADEahBgDAYBRqAAAMFtLTswCE79NPP7Vd0/fWW28t5RY526xZs0R++OGH/dtZWVliX69evUT+8ccfY9w6IHY4owYAwGAUagAADEahBgDAYK4foy5TpozISUlJIb925MiRIlesWFHkxo0bi/zQQw/5t2fOnCn2paeni/zrr7+KPGPGDJGffvrpkNsJM/z3f/+3yB06dBD5zTffLOUWOVv9+vVFHjhwoMiFhYX+7aZNm4p9TZo0EZkxakycODHoz9jERHnO2rlzZ5E3bNig4okzagAADEahBgDAYBRqAAAM5ogx6nr16vm3y5cvbzsO2LFjR5GrVasm8t133x21duXk5Ig8e/Zs//add94p9uXn54v8j3/8w6gxEIRPv6/g97//vchnz561nVcNe4cPHxZ548aNIt9xxx2l3CI4ydChQ0UeP3580HscdEVFRcoknFEDAGAwCjUAAAajUAMAYDAjx6hbtmwp8tq1ayOaBx1t+piGPi/v5MmT/u3FixeLfQcPHhT56NGjIn/77bdRbClKww033CByuXLlRN60aZPIy5YtK5V2ucWpU6dEZi40wpGSkiJyhQoVlFNxRg0AgMEo1AAAGIxCDQCAwYwco96/f7/IP//8c0zGqLds2SLysWPHRL7llltEPnPmjMhvvfVW1NqCyHXq1E
nkJ554Iuga60eOHCnRZwW+X2pqqti3b98+kceOHVuiz/I6fQ2EFi1axK0tMF9aWprIo0aNsv363bt3B31++U8//aRMwhk1AAAGo1ADAGAwIy9965cnx40bF/QSxVdffRV0Gc8L2bFjh3+7a9euttNBrrnmGpFHjx5dbNtR+l555RWRGzZs6N9u1qyZ7ZSpcD3++OP+7erVq4t9999/v+0ysQiP/ljZwKWEi9O2bduglzktTPVyvo7actELFiwQubhh0ueff94x3w+cUQMAYDAKNQAABqNQAwBgMCPHqHUrVqy44HKiF3p8pD6FY9iwYSLPnDkz6Ji07ptvvhF5+PDhYbQapeWXX34J+oi6ki4bqC9nG7gsob6krJOXKDRRbm6uyAsXLhR58uTJQV+r79OnXmZmZkaljYifIUOGiFy7dm3br1+/fr3Ib775pnIKzqgBADAYhRoAAINRqAEAMJgjxqgDnThxwnb/8ePHbfcHznV95513xD59zBFmmjJlisjXXnutyLt27Yp4LnOlSpVEHj9+fNC5vZs3bxb73n333bA+CyXrd7sxarjPpZdeKvLvfvc725/f+n0JU6dOVU7FGTUAAAajUAMAYDAKNQAABnPcGHVx9HGr1q1bi3zzzTcHfSza6tWrY9w6RKJu3bq2a2qfO3dO5JEjR/q3Dx8+HNZnvfjiiyL369cv6NzeG2+8Maz3RnQlJv77PIP7S9ypfv36/u333nsvrNfOmTNH5HXr1imn4owaAACDUagBADAYhRoAAIO5boxaX79bH8/cvn27f/vVV1+1HcPYtm2byHPnzg26pjSiJzU1VeTly5fbzqfUx6I2bNgQ8meNHTtW5KFDh9p+/bRp00J+b8RW4Lg0x6I73Xbbbf7t5s2b237tp59+KvKsWbOUW3BGDQCAwSjUAAAYzHWXvnX79u0LemlzwYIFYt+gQYNss768pP6YtIMHD5a4vV5Rtqz81hs4cKB/+/XXXw86DedCU3Hat28v8oQJE4JOt0pOTradfpWQkGDbx/Pnz7/A/w2AaOjbt6/IM2bMCPq1mzZtsn3sZXHLSTsJZ9QAABiMQg0AgMEo1AAAGMz1Y9S6wKk+e/bsEfv08cwuXbqI/Mwzz4ickpISdOrOgQMHotJet7r33ntFfu2114JOtdHHpPfu3StymzZtguY+ffqIfVdccYXItWrVEllfclR/lB6A2CwRGu4yod99953IP/30k3IrzqgBADAYhRoAAINRqAEAMJjnxqgDZWVliXzPPfeI3Lt3b5H1edcPPPCAyA0bNvRvd+3aNYotdb7+/fvb/luePXvWv33s2DGx77e//a3IR48eFfmFF14I+ihTffxanyetj4fry5NmZ2eL3Llz56Bz9GHuYy47deokcmZmZszahdCNHz9e5HAeVzrDZo6123BGDQCAwSjUAAAYjEINAIDBEopCeD7ciRMnVFJSkvK6goIC2/Wqz50759/u3r272Ld+/XpVWqw1bqtWrRrWa2Ldx2vXrrWdgz516tSg49fFadasWdD1uPV1wIsbo9a9/fbbIg8ePFiZIJI+dtuxfP78+Ygfc6k/MnHnzp3KRCYeyyXRsmVL23nT9erVC/ralStXivyb3/xGuUEofcwZNQAABqNQAwBgMAo1AAAG8/Q8an2cSh/zaNu2re2YtC5wnGvjxo1RaaNb6ONL77//vu185XDoc59TU1ODfm16errtXHpdTk5OxO1CbL388stB1zQozvDhw0UeM2ZM1NqF4FavXi3yJZdcYvv1mzdv9m8PHTpUeRVn1AAAGIxCDQCAwSjUAAAYzPVj1I0bNxZ55MiR/u277rpL7KtZs2bE8zgtBw8ejGjNWi+YNWtW1N5LnyPar18/kQPnJOrrcS9btixq7UB87d69O95NQJiqV68ucnE/J+fNm+ffPnnypPIqzqgBADAYhRoAAIM5/tK3frlan34TeKnbUr9+/Yg/a9u2bSJPmzZN5D//+c8RvzdC9+
CDD4o8YsQIkfPy8vzbt956a6m1C6Vrzpw5/u1Ro0aJfVdddZXta0ePHh30vSw8wjQ69OWAAx9NGorPP/88yi1yJs6oAQAwGIUaAACDUagBADCYI8aoL7/88qCPNMzMzBS5SZMmEX/Oli1bRH7++edtl8FkClbp0B+Jed9994msP+LwlVde8W+zBKg3fPPNNyJfeeWVtl/PsVs6j7JMS0uz/Xc/c+aMyHPnzhX5p59+ikkbnYYzagAADEahBgDAYBRqAAAMZsQYdXJyssjz588POuZR3NhTOPPyXnjhBbHvk08+Efn06dMl+ixEx5o1a2zHrBctWiTypEmTSqVdMEfgfQmW3r17x60tXletWrWQl2U+cOCAyGPHjo1Zu5yMM2oAAAxGoQYAwGAUagAAvD5G3a5dO5HHjRsn8vXXXy/yFVdcEfFn/fLLLyLPnj1b5Geeeca/ferUqYg/B/FbL3jKlCm289vhPTt37hR5165dIjdt2rSUWwRED2fUAAAYjEINAIDBKNQAAHh9jPrOO++0zeGMP3344Ydi37lz50TW50YfO3YsrM+CeaZPn26bgR9//FHka6+9Nm5t8brdu3cHfZ50x44d49Ai5+OMGgAAg1GoAQAwGIUaAACDJRTpD/O9gBMnTqikpKTSaRFK7Pjx46pq1aphvYY+dn8fW+hnZ+FYdr9Q+pgzagAADEahBgDAYBRqAAAMRqEGAMBgFGoAAAxGoQYAwGAUagAADEahBgDAYBRqAACcXqhDWLwMBomkv+hjZ4m0v+hnZ+FYdr9Q+iukQp2fnx+N9qCURNJf9LGzRNpf9LOzcCy7Xyj9FdJa34WFhSo3N1dVqVJFJSQkRKt9iDKrK61Or127tkpMDG9Ugz52fx9b6Gdn4Fh2v6Iw+jikQg0AAOKDm8kAADAYhRoAAINRqAEAMBiFGgAAg3muUE+ePNl3J2TgnyZNmsS7WYihGTNm+Pp5zJgx8W4Komjjxo2qd+/evrtmrf5dsWJFvJuEKKOPPVqoLddcc406ePCg/8+mTZvi3STEyNatW9X8+fNV8+bN490URNmpU6dUixYt1Ny5c+PdFMQIffwvZZUHlS1bVtWsWTPezUCMnTx5Ug0YMEC9+uqraurUqfFuDqKsR48evj9wL/rYw2fUe/bs8V1KufLKK30/yPfv3x/vJiEGHnroIdWzZ0+VlpYW76YAQMQ8d0bdrl07tXDhQtW4cWPfZe+nn35a3XTTTSorK8u3kg/cYenSpWr79u2+S98A4GSeK9SBl1GscUurcKekpKhly5apYcOGxbVtiI7s7Gw1evRotWbNGlWhQoV4NwcASsRzhVpXrVo11ahRI7V37954NwVR8uWXX6q8vDzVqlUr/9+dP3/edwdpZmamKigoUGXKlIlrGwEgVJ4v1NYNR/v27VODBg2Kd1MQJV26dFFff/21+LuMjAzfNLzx48dTpAE4iucK9dixY33z8qzL3dYTZiZNmuT7wZ2enh7vpiFKrHsNUlNTxd9VqlRJVa9e/T/+Hs7+JTvwStj333+vduzYoZKTk1W9evXi2jZEB33s0UKdk5PjK8o///yzqlGjhurYsaPavHmzbxuAc2zbtk3dcsst/vzoo4/6/jtkyBDfDaNwPvr4X3jMJQAABvPkPGoAAJyCQg0AgMEo1AAAGIxCDQCAwSjUAAAYjEINAIDBKNQAABiMQg0AgMEo1AAAGIxCDQCAwSjUAAAYjEINAIDBKNQAABiMQg0AgMEo1AAAGIxCDQCAwSjUAAAYjEINAIDBKNQAABiMQg0AgMEo1AAAGIxCDQCAwSjUAAAYjEINAIDBKNQAABiMQg0AgMEo1AAAGIxCDQCAwSjUAAAYjEINAIDBKNQAABisbChfVFhYqHJzc1WVKlVUQkJC7FuFiBQVFan8/HxVu3ZtlZgY3u9g9LH7+9hCPzsDx7L7FYXRxyEVaqvT69atG632Icays7NVnTp1wnoNfez+PrbQz87Csex+of
RxSL+qWb+ZwTki6S/62Fki7S/62Vk4lt0vlP4KqVBz+cRZIukv+thZIu0v+tlZOJbdL5T+4mYyAAAMRqEGAMBgFGoAAAxGoQYAwGAUagAADEahBgDAYBRqAAAMRqEGAMBgFGoAAAxGoQYAwGAUagAADEahBgDAYCE95hJwi9atW4s8cuRIkQcPHizym2++KfKcOXP829u3b49JG/EvzZo1E7lXr14iDx8+3L+9detWse+rr76yfe8//elPIp85c6YELQViizNqAAAMRqEGAMBgFGoAAAyWUFRUVFTcF504cUIlJSWVTotQYsePH1dVq1YN6zVu7eOWLVuKvHbtWpHD/Xey/m3/T/Xq1ZWT+tj0fn7ggQdEnjlzpsiVK1eO2mfdeuutIq9bt06ZyO3HcqNGjUQuV66cyJ06dfJvz5s3T+wrLCyMWjtWrlwp8r333ltq9zCE0secUQMAYDAKNQAABqNQAwBgMMaoXcjt41rFuf766/3b7733nthXu3ZtkfVv//z8fNuxqcBx6Y4dO4p9+rzqeI9rOa2fk5OTRd61a5fIl112WdQ+69ixYyL3799f5NWrVysTOP1Yvuaaa0QeOnSoyP369RM5MTEx6PGakJAg9oVQuiKmr58wZsyY//g3jhbGqAEAcDgKNQAABqNQAwBgMNev9d2uXTuRBw4c6N+++eabbcdTdGPHjhU5NzdX5MAxy0WLFol9W7ZsCaPVsFOxYkWRW7VqJXLgv32tWrXCeu89e/aI/Nxzz4m8dOlS//bf/vY3sW/ixIkiT58+PazP9rojR46IPGnSJJFfeOGFoN8H+/fvF/vq1atn+1nVqlUT+bbbbjNyjNrp9GPg9ttvV04wWFvz//XXXxdZP/ZjjTNqAAAMRqEGAMBgFGoAAAzmujFqfT7krFmzRL700kuDzstbv369yDVq1BD5+eeft/3swPfTX6uvHYvIzZ8/X+T09PSovbc+3q2vL71hwwb/dufOncW+5s2bR60dUOrll18W+fe//73ILVq0iNq81szMzBK9Hhe2Zs2asMao8/Lygo4NJ2pzrItb67tDhw4i6/ckOQln1AAAGIxCDQCAwRx36btsWdnkNm3aiPzqq6/aTuXZuHGjf3vKlCli36ZNm0S+6KKLRF62bJnI3bp1C9rObdu2Bd2H8LRu3Vrknj17iqwPYQS7VG354IMPbB+lqE+5++qrr0Q+evRo0Ecl2rUDJTd16lSRn3jiiaCPMw1X+fLlS/R6XNhLL70k8ooVK2y//uzZsyIfOnQo4s/Wl+XMysqyXU7Yrp3x/nnOGTUAAAajUAMAYDAKNQAABnPcGHXgEqCW1157LazpAYHTt4qb0qFP9bIbk7bk5OT4t//nf/7H9msRnD7eqPehPvakP+5u1apVQadu6VM09GU/9e+nw4cPi/yPf/wj6PQQfexcn+qlPwYT4Xn33XeD3lOiL/l57bXXlmj8+ze/+U1EbYR07tw5kbOzs0vts7t37y7yJZdcEvJrA3+WWwoKClQ8cUYNAIDBKNQAABiMQg0AgMEcMUYdON/58ccftx2fnDdvnu0YZDhLDQbO0wzFww8/HHRsE/YaNWrk3x43bpzYl5SUJPI///lPkQ8ePChy4P0BJ0+eFPs++ugj21wSF198sch/+MMfRB4wYEDUPsuL9H+/wCVEU1NTS/Te+hoKcJ57tWWa77//ftvj085TTz2lTMIZNQAABqNQAwBgMAo1AAAGM3KMWh8fCByXPnPmjNj3ySefiDx+/HiRT58+HfRzKlSoYDtPul69erZrOetzL1euXBn0s6Bs11EPXHNbfxRefn6+yIMHD7ZdhzecsahY0r9/YK9JkyYiL1++XOSrr77adt3/kvjzn/8ctfdC6dyj8Nhjj9l+f5QrVy7k996xY4ftmuPxxhk1AAAGo1ADAGAwCjUAAAYzYoy6WrVqIj/44INB50rrY9J9+/YN67MCxzEWL15s+9zj4tYafu6558L6bPzbddddJ7
I+Lh2oT58+ts+Yhjs0bdpU5AYNGsRsTFr3yCOPiDxq1KiYfZaX1K9fX+RBgwaJnJaWFvJ7dezY0XYNjeLoa2gEjnH/5S9/CfnepnjgjBoAAINRqAEAMJgRl77Lly8v8qWXXhrSMp2Wyy67TOSMjAyR77jjDpEDlxqsXLmy7aUUPS9atEjkU6dOBW0n7L344otBp77pl7ZNvdSdmCh/z9Ufe4nw6NOx/vjHP4r87LPP2k6vLIlatWpF7b28LvBnrD7tLZ5TFj/77DORX3nlFeUUnFEDAGAwCjUAAAajUAMAYDAjxqj1ZUH1R0TWqFHDv/3999+X6Bb93NzcoLfr6+NU+uMUP/jgg7A+C//Wq1cvkVu2bBm0H52ynKM+Jq1/L+rLEiI8s2fPFnnPnj220zrtpnJlZmaKXLVq1ai0Efb0ZZf1XJr3hPTSfgb16NHDv71q1SplMs6oAQAwGIUaAACDUagBADCYEWPUx44ds10W9MMPP/RvJycni3379u2zfdTkwoULRT5y5Ih/e+nSpbZj1Pp+RE5/9KQ+dz4vL8+//c477yhTH8c5efLkoF+7du1akSdMmBCzdnlROOOI+lio/ghE/VG6+j0TKSkp/u0ff/wxzJZ6W1ZWln+7c+fOYt/AgQNF1peE/vXXXyP+3GHDhrl2GVjOqAEAMBiFGgAAg1GoAQAwmBFj1LotW7YEnUddUp06dfJv33zzzbbz8r777ruofS7sFRQU+LcPHjxozJj0xIkTRR43bpx/OycnR+x74YUXRD558mRM2oji6fdA6GPSurNnz4p8/vz5mLTLa/Tx/WnTpsXssyZr948wRg0AAEoFhRoAAINRqAEAMJiRY9SlNZ+3uLWamUddeuK1vrc+fzZwDNrSv3//oPP077777hi3DpGaOnVqWF//+uuvi6zffwDzde/eXbkVZ9QAABiMQg0AgMEo1AAAGMxzY9T62rIw47m0geu7jx49OmbteOSRR0R+8sknRU5KShJ58eLFIg8ePDhmbXOD6tWri7xgwQKRlyxZcsHtktLX6R8+fHhYr3///fej1ha3K1eunMjdunULuub96dOnY9aOjIwMkWfNmqXcijNqAAAMRqEGAMBgnrv07eZb+E2mT33Tc82aNf3bs2fPFvveeOMNkX/++WeRb7jhBpEHDRrk327RooXYV6dOHZH3799vOzQyb968C/zfIBi973r37i1yo0aN/Nu5ubli34EDB0Teu3evyK1btw76Xn/84x/FvqpVq9q2U1/uVW8L/q1jx44iP/HEEyJ37dpV5AYNGvi3s7OzS/TZgY81vv3228W+F198UeSKFSvavpd+Gb4kj9QsbZxRAwBgMAo1AAAGo1ADAGAwz41RX3nllfFuAi6gTJky/u0HH3xQ7NOX6jxx4oTIDRs2DPlzPv/8c5HXrVsX1uMQYW/OnDlBxyst7du392+vX79e7Pvhhx9E3rlzp8g33XSTyFWqVAnaDv0eiN27d4s8adIkx45XlrbMzEyRU1NTbb8+8H6B/Pz8En124Ph3q1atbPtYp39/vfTSS7bHvsk4owYAwGAUagAADEahBgDAYJ4bo/7ss8/824mJ8vcU/bGXiJ4vvvhC5K1bt4rctm3boK8NnGNtufzyy20/K3Cetf6o0lguTwqlNm/ebNvvb731VtA56vXr17fN4Th69KjIzZo1i/i9EJ4RI0aUyufk5eWJ/MEHH9ge606+D4EzagAADEahBgDAYBRqAAAM5rkx6qysLP/2nj17bOdYX3XVVSIfPnw4xq1zr5ycHJHvuusukR944AH/9sSJE8N6b/3xdoHzJfX1olG6/vCHP4h80UUX+bcrV65s+9rrrrtO5PT09KBfe/z4cdv1pxG5oUOHijxq1CiRhwwZErXP2rdvn8i//PLLBe8vsrzyyitBf7a7DWfUAAAYjEINAIDBKNQAABgsoai4BVP//9rKSUlJyu1jL6+99prIGzZssB2b0dciNoU1Xlfc83i90s
duFUkfW+hnZzHxWA68z+BCP0enTp3q377kkkvEvhUrVoi8Zs0akVeuXCnyoUOHlNuF0secUQMAYDAKNQAABqNQAwBgME+PUevjAsuWLRM5LS1N5Pfff1/kjIwMkU+dOqVMYOK4FqKLMWpv4Fh2P8aoAQBwOAo1AAAG89wSovolokD33HOPyNOmTbN9fNvkyZMdMV0LAOBcnFEDAGAwCjUAAAajUAMAYDBPj1EXN2atLxmqZwAAYo0zagAADEahBgDA6YU6hMXLYJBI+os+dpZI+4t+dhaOZfcLpb9CKtT5+fnRaA9KSST9RR87S6T9RT87C8ey+4XSXyGt9V1YWKhyc3NVlSpVVEJCQrTahyizutLq9Nq1a6vExPBGNehj9/exhX52Bo5l9ysKo49DKtQAACA+uJkMAACDUagBADAYhRoAAINRqAEAMJgnC7V1p92YMWNUSkqKuvjii1WHDh3U1q1b490sRNncuXNV/fr1VYUKFVS7du3U3//+93g3CVHy0ksvqebNm6uqVav6/rRv316tWrUq3s1ClNHPHi7U9913n1qzZo1666231Ndff626deum0tLS1IEDB+LdNETJO++8ox599FE1adIktX37dtWiRQvVvXt3lZeXF++mIQrq1KmjZsyYob788ku1bds2deutt6o+ffqob775Jt5NQxTRzx6dnnX69Gnf/MKVK1eqnj17+v++devWqkePHmrq1KlxbR+iwzqDbtu2rcrMzPTPLa1bt67vwSqPPfZYvJuHGEhOTlbPP/+8GjZsWLybghhK9mA/e+6M+ty5c+r8+fO+y6GBrEvgmzZtilu7ED1nzpzx/QZuXSX5P9aCAlb+4osv4to2RJ91PC9dulSdOnXKd2kU7nTew/3sucdcWmfTVidPmTJFNW3aVF1++eVqyZIlvh/gV199dbybhyj45z//6Tuorb4NZOXdu3fHrV2ILmvYyjqWf/31V1W5cmW1fPly1axZs3g3C1H2Nf3svTNqizU2bV3xv+KKK9RFF12kZs+erdLT0yNakhFAfDRu3Fjt2LFDbdmyRY0YMUINGTJE7dy5M97NQpQ1pp+9N0YdyLqEcuLECVWrVi3Vv39/dfLkSfXRRx/Fu1mIwqXvihUrqnfffVf17dvX//fWAX7s2DHf/QlwH2to46qrrlLz58+Pd1MQQ2ke7GdPn0JWqlTJV6SPHj2qPvnkE9/dhHC+8uXL+24O/PTTT/1/Z91MZmWvjW15idXHBQUF8W4GYqzQg/3suTFqi1WUrQsJ1iWVvXv3qnHjxqkmTZqojIyMeDcNUWJNzbLOoNu0aaOuv/569ac//cl3BYU+docJEyb4ZmnUq1fPty7C22+/rdavX+87tuEe9LOHC/Xx48d93wA5OTm+W/3vvvtuNW3aNFWuXLl4Nw1RYg1lHD58WD311FPq0KFDqmXLlurjjz/+jxvM4EzWfPjBgwergwcPqqSkJN+iGNYP765du8a7aYgi+vlfPD1GDQCA6Tw9Rg0AgOko1AAAGIxCDQCAwSjUAAAYjEINAIDBKNQAABiMQg0AgMEo1AAAGIxCDQCAwSjUAAAYjEINAIDBKNQAABiMQg0AgMEo1AAAGIxCDQCAwSjUAAAYjEINAIDBKNQAABiMQg0AgMEo1AAAGIxCDQCAwSjUAAAYjEINAIDBKNQAABiMQg0AgMEo1AAAGIxCDQCAwSjUAAAYjEINAIDBKNQAABiMQg0AgMHKhvJFhYWFKjc3V1WpUkUlJCTEvlWISFFRkcrPz1e1a9dWiYnh/Q5GH7u/jy30szNwLLtfURh9HFKhtjq9bt260WofYiw7O1vVqVMnrNfQx+7vYwv97Cwcy+4XSh+H9Kua9ZsZnCOS/qKPnSXS/qKfnYVj2f1C6a+QCjWXT5wlkv6ij50l0v6in52FY9n9QukvbiYDAMBgFGoAAAxGoQYAwGAUagAADEahBgDAYBRqAAAMRqEGAMBgFGoAAAxGoQYAwGAUag
AADEahBgDAYBRqAAAMRqEGAMBgFGoAAAxGoQYAwGBllcuVK1dO5A4dOvi3n3nmGbHvxhtvLLV2wYyHtFeuXFnknj17ilyjRg2RX3zxRf92QUFBTNoIs3Tp0kXkxYsXi3zzzTeL/O2335ZKu+AdnFEDAGAwCjUAAAajUAMAYDDXj1EnJSWJvG7dOv/2oUOHxL6aNWuKrO+HM9SvX9+/PX78eLGvffv2Iqempob13rVq1fJvP/zww8prOnXqJHL16tVFXr58uXKbtm3birx169a4tQXexBk1AAAGo1ADAGAwCjUAAAZz/Ri1HX1MmjFqZ2jSpInIY8aMEXnAgAH+7YsvvljsS0hIEDk7O1vk/Px8kZs2bSryPffc49+eN2+e2Ld7927ldp07dxa5YcOGrhyjTkz89zlMgwYNxL6UlBTb7yk4T4rWp/rPjfT0dJFHjBhh+34fffSRfzsjI6PE7eOMGgAAg1GoAQAwGIUaAACDeXqMmrElZ8x9f/bZZ0Xu37+/7frddvbs2SNy9+7dbdeG18edL7300gtue8XgwYNF/uKLL5QbBc6Xv//++8W+RYsWee7eBDdIS0sT+a677go6Bq3/DCoqKgrrs2644QYVTZxRAwBgMAo1AAAG8/Slb/1yRoUKFeLWFvzbnXfeKfJ9990X8Xvt27dP5K5du9pOz7r66qsj/iwvCJy25GavvfZayMMnMLPPrr32WtulYO3o0zT1R5vqy8guWbJE5F9//VVFkzeOOgAAHIpCDQCAwSjUAAAYzNNj1Lo2bdqIvHnz5ri1xcv69esX1tf/8MMPQceP9Mdc6mPSOn3JUK9r3ry5yJdffrnyAn16TqA1a9aUaltw4UeqTp8+XeTf/e53Ih85ckTkL7/8UuQZM2b4t7OyssS+06dPi7x//34VT5xRAwBgMAo1AAAGo1ADAGAw149Rnzt3TuTjx48HHYe66qqrSq1dCE5fsnH48OEir169WuS9e/eKnJeXF/Fne2UMNlS333677eP/3ELvd/3RloEOHDhQCi2C7sknnxR52LBhIs+ZM0fkJ554QuSTJ08qp+KMGgAAg1GoAQAwGIUaAACDuX6M+tixYyJ/9tln/u1evXrFoUUoTm5ursiTJ08utc9u3759qX2WEzRu3Nh2/zfffKPcYObMmUHHrP/3f//Xdh1oRK5ixYoi6+seDBo0yL89ZswYsW/dunUif/LJJzFdbzueOKMGAMBgFGoAAAxGoQYAwGCuH6OG9zz88MP+7UqVKoX1Wv0ZtrrPP//cv/3FF18or9Ofy2uKqlWrinzbbbeJPHDgQJG7desW9L2mTJlie98LIjdx4kTbMeply5YFXT/BTWPQxeGMGgAAg1GoAQAwGJe+bR6jBjOncDRr1kzkSZMm2S57GSgxUf5uWlhYGNZUsYyMDP/2+fPnldclJydH/NoWLVqInJCQIHJaWprIderUEbl8+fL+7QEDBtj2s/7Ywi1btohcUFAgctmyZYM+HhHRM2HCBJGLiopEXrJkiScvdes4owYAwGAUagAADEahBgDAYIxRB7jjjjvi3QTPKleunH/7uuuuE/vee+89kWvVqmU7/hg4rqxPodKn6ejj37rAsUrLXXfd5d+eNWuW2HfmzBnlNvq/rT6G+PLLL4v8+OOPh/zezZs3tx2j1h9R+8svv4i8c+dO//Ybb7wh9m3btk3kDRs2iPzTTz+JnJOTE/Rxnrt377b9/0Dk/v73v4vcpk0bkTMzM4N+L65Zs0Z5BWfUAAAYjEINAIDBKNQAABjMc2PUgY9G4zGX8RM4B1YfO37//fdtX/v000+LvHbtWpH/9re/BZ3nq39tamqq7WfVqFFD5OnTp/u39+/fL/atWLHCdm6uEz344IMi//jjjyJ36NAh4vcu7t9v165dIm/evFlFy/Dhw237+bvvvovaZ3lNu3bt/NtfffWV7X0cPXr0CLr8r+XJJ5/0b7/77rtBP8ft9xJwRg0AgMEo1AAAGIxCDQ
CAwTw3Rq2PiwWby2tJSUmxHZ9D6PR/W32cedy4cUFfu2rVKpHnzJlj+9jBwPHGv/zlL7aPsdTHzJ577jnbMew+ffr4txcvXiz2/fWvfxX52WefFfno0aPKzo4dO5Tp9P8np+rSpYvtfn3uPoKvY/Dhhx+KXK9ePf/2I488IvYtWrRI5CNHjgSdN62PUVeuXDlq68w7DWfUAAAYjEINAIDBKNQAABjMc2PU+vrBdmsNX3TRRaXQIncqU6aMyFOmTBF57NixIp86dcq//dhjj4l9S5cutR2TtlsfWF83fM+ePSKPGDEi6Dx7S9WqVYPOG9afgayvFV/cWsTZ2dkiN2jQwPbrUXqWL18e7yYYa/v27bbHyPjx44OOSRdn9OjRQff9VbsHJCsrS3kFZ9QAABiMQg0AgMEo1AAAGMxzY9QrV64MujZskyZNRB4zZoztuscIfS1lfUxaf7bwAw884N9evXq12HfDDTeInJGRYbtecOCzhP/rv/5L7FuwYIHtOLHuxIkTIn/88ccX3Lakp6eL/Nvf/tb2vfU5poATzJ49W+SJEycG3a9/rU6/Z6Rhw4ZB166YMGGC7bHpZpxRAwBgMAo1AAAG89yl70D6JdYrrrhC5EcffbSUW+QeTz31VFjTtwKXEJ08ebLYd/XVV4f12YGvD3wspeX8+fMqVpYsWWKbYS59amajRo1i8nhNN9CPqbNnz4ocOCUyLS3N9r0uueQSkT/66KOgQ2Z79+5VXsUZNQAABqNQAwBgMAo1AAAG8/QYta6oqMj2EYgI3aFDh4I+evJCy7O2aNEi6Hvpj6rcuHGjyCtWrBD5hx9+KJUxabj32E9M5BwmVDNnzox3E1yP70YAAAxGoQYAwGAUagAADMYYtc3j2vr06SMyj74LXadOnUTu27evyK1atRI5Ly/Pv/3GG2+IfUePHhWZewcQa+3bt/dvL1y4MK5tATijBgDAYBRqAAAMRqEGAMBgnh6jvueee0QuKCgQedeuXaXcIvfIz88X+a233rLNgElrfQMm4YwaAACDUagBADAYhRoAAIN5eoxaXzO6adOmIp8+fbqUWwSgNKxatUrkfv36xa0tQHE4owYAwGAUagAADEahBgDAYAlF+oNYL+DEiRMqKSmpdFqEEjt+/Ph/rFteHPrY/X1soZ+dhWPZ/ULpY86oAQAwGIUaAACDUagBADAYhRoAAINRqAEAMBiFGgAAg1GoAQAwGIUaAACDUagBAHB6oQ5h8TIYJJL+oo+dJdL+op+dhWPZ/ULpr5AKdX5+fjTag1ISSX/Rx84SaX/Rz87Csex+ofRXSGt9FxYWqtzcXFWlShWVkJAQrfYhyqyutDq9du3aKjExvFEN+tj9fWyhn52BY9n9isLo45AKNQAAiA9uJgMAwGAUagAADEahBgDAYBRqAAAM5rlCvXHjRtW7d2/fnXbWHZErVqyId5MQZdOnT1dt27b13fV62WWXqb59+6pvv/023s1CDM2YMcN3PI8ZMybeTUEUTZ482devgX+aNGmivMZzhfrUqVOqRYsWau7cufFuCmJkw4YN6qGHHlKbN29Wa9asUWfPnlXdunXz9T3cZ+vWrWr+/PmqefPm8W4KYuCaa65RBw8e9P/ZtGmT8pqyymN69Ojh+wP3+vjjj0VeuHCh78z6yy+/VJ06dYpbuxB9J0+eVAMGDFCvvvqqmjp1arybgxgoW7asqlmzpvIyz51Rw3uOHz/u+29ycnK8m4Ios66c9OzZU6WlpcW7KYiRPXv2+IYqr7zySt8vZfv371de47kzaniLtUqTNW554403qtTU1Hg3B1G0dOlStX37dt+lb7hTu3btfFfEGjdu7Lvs/fTTT6ubbrpJZWVl+e5B8QoKNVx/xmUd1F4c13Kz7OxsNXr0aN89CBUqVIh3cxAjgcOUzZs39xXulJQUtWzZMjVs2DDlFRRquNbIkSPVhx9+6LvTv06dOvFuDqLIut8gLy9PtWrVyv9358+f9/V1Zm
amKigoUGXKlIlrGxF91apVU40aNVJ79+5VXkKhhutYy9ePGjVKLV++XK1fv141aNAg3k1ClHXp0kV9/fXX4u8yMjJ8U3fGjx9PkXbxzYP79u1TgwYNUl5S1osdHfjb2Pfff6927Njhu9GoXr16cW0bone5++2331YrV670jWMdOnTI9/dJSUnq4osvjnfzEAVWv+r3HFSqVElVr16dexFcZOzYsb51L6zL3dYTwSZNmuT7JSw9PV15iecK9bZt29Qtt9ziz48++qjvv0OGDPHdtADne+mll3z/7dy5s/j7BQsWqKFDh8apVQDClZOT4yvKP//8s6pRo4bq2LGjb30Ea9tLeMwlAAAGYx41AAAGo1ADAGAwCjUAAAajUAMAYDAKNQAABqNQAwBgMAo1AAAGo1ADAGAwCjUAAAajUAMAYDAKNQAABqNQAwCgzPX/AOdmcBaIiWqoAAAAAElFTkSuQmCC", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "from turtle import width\n", "from matplotlib import pyplot as plt\n", "\n", "fig, axes = plt.subplots(3, 4, figsize=(5, 5))\n", "for k in range(12):\n", " current_axes = axes[k % 3][k % 4]\n", " current_axes.imshow(X_train[k], cmap='grey')\n", " current_axes.get_xaxis().set_ticks([])\n", " current_axes.get_yaxis().set_ticks([])\n", " current_axes.set_xlabel(y_train[k])\n", "plt.tight_layout()\n", "plt.show()" ] }, { "cell_type": "markdown", "id": "d3639caf", "metadata": {}, "source": [ "#### Предобработка данных\n", "\n", "Количество классов - 10 (от 0 до 9).\n", "\n", "Все изображения из X трансформируются в векторы длиной 784 (28*28) признака и нормализуются.\n", "\n", "Для целевых признаков применяется унитарное кодирование в бинарные векторы длиной 10 (нормализация)." ] }, { "cell_type": "code", "execution_count": 4, "id": "4a696e24", "metadata": {}, "outputs": [ { "data": { "text/plain": [ "array([0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. 
, 0.01176471, 0.07058824, 0.07058824,\n", " 0.07058824, 0.49411765, 0.53333336, 0.6862745 , 0.10196079,\n", " 0.6509804 , 1. , 0.96862745, 0.49803922, 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0.11764706, 0.14117648, 0.36862746, 0.6039216 ,\n", " 0.6666667 , 0.99215686, 0.99215686, 0.99215686, 0.99215686,\n", " 0.99215686, 0.88235295, 0.6745098 , 0.99215686, 0.9490196 ,\n", " 0.7647059 , 0.2509804 , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0.19215687, 0.93333334,\n", " 0.99215686, 0.99215686, 0.99215686, 0.99215686, 0.99215686,\n", " 0.99215686, 0.99215686, 0.99215686, 0.9843137 , 0.3647059 ,\n", " 0.32156864, 0.32156864, 0.21960784, 0.15294118, 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0.07058824, 0.85882354, 0.99215686, 0.99215686,\n", " 0.99215686, 0.99215686, 0.99215686, 0.7764706 , 0.7137255 ,\n", " 0.96862745, 0.94509804, 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0.3137255 , 0.6117647 , 0.41960785, 0.99215686, 0.99215686,\n", " 0.8039216 , 0.04313726, 0. , 0.16862746, 0.6039216 ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0.05490196,\n", " 0.00392157, 0.6039216 , 0.99215686, 0.3529412 , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0.54509807,\n", " 0.99215686, 0.74509805, 0.00784314, 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0.04313726, 0.74509805, 0.99215686,\n", " 0.27450982, 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. 
, 0.13725491, 0.94509804, 0.88235295, 0.627451 ,\n", " 0.42352942, 0.00392157, 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0.31764707, 0.9411765 , 0.99215686, 0.99215686, 0.46666667,\n", " 0.09803922, 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0.1764706 ,\n", " 0.7294118 , 0.99215686, 0.99215686, 0.5882353 , 0.10588235,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0.0627451 , 0.3647059 ,\n", " 0.9882353 , 0.99215686, 0.73333335, 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0.9764706 , 0.99215686,\n", " 0.9764706 , 0.2509804 , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0.18039216, 0.50980395,\n", " 0.7176471 , 0.99215686, 0.99215686, 0.8117647 , 0.00784314,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0.15294118,\n", " 0.5803922 , 0.8980392 , 0.99215686, 0.99215686, 0.99215686,\n", " 0.98039216, 0.7137255 , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0.09411765, 0.44705883, 0.8666667 , 0.99215686, 0.99215686,\n", " 0.99215686, 0.99215686, 0.7882353 , 0.30588236, 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0.09019608, 0.25882354, 0.8352941 , 0.99215686,\n", " 0.99215686, 0.99215686, 0.99215686, 0.7764706 , 0.31764707,\n", " 0.00784314, 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. 
, 0.07058824, 0.67058825, 0.85882354,\n", " 0.99215686, 0.99215686, 0.99215686, 0.99215686, 0.7647059 ,\n", " 0.3137255 , 0.03529412, 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0.21568628, 0.6745098 ,\n", " 0.8862745 , 0.99215686, 0.99215686, 0.99215686, 0.99215686,\n", " 0.95686275, 0.52156866, 0.04313726, 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0.53333336, 0.99215686, 0.99215686, 0.99215686,\n", " 0.83137256, 0.5294118 , 0.5176471 , 0.0627451 , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. , 0. ,\n", " 0. , 0. , 0. , 0. 
], dtype=float32)" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/plain": [ "array([0., 0., 0., 0., 0., 1., 0., 0., 0., 0.])" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "n_classes = 10\n", "\n", "X_train = X_train.reshape(60000, 784).astype(\"float32\") / 255\n", "X_valid = X_valid.reshape(10000, 784).astype(\"float32\") / 255\n", "y_train = keras.utils.to_categorical(y_train, n_classes)\n", "y_valid = keras.utils.to_categorical(y_valid, n_classes)\n", "\n", "display(X_train[0])\n", "display(y_train[0])" ] }, { "cell_type": "markdown", "id": "40173015", "metadata": {}, "source": [ "#### Проектирование архитектуры простой ИНС\n", "\n", "Сеть состоит из:\n", "- входного слоя с 784 входами (InputLayer);\n", "- скрытого полносвязного слоя с 64 sigmoid-нейронами (dense_2);\n", "- выходного слоя с 10 softmax-нейронами (многоклассовая классификация) (dense_3).\n", "\n", "Количество параметров в слоях:\n", "- dense_2: 784 * 64 + 64 = 50 176 + 64 = 50 240. У каждого из 64 нейронов 784 входа с 784 параметрами (w * x) + 64 смещения (b).\n", "- dense_3: 64 * 10 + 10 = 640 + 10 = 650.\n", "\n", "Всего параметров: 50 240 + 650 = 50 890.\n", "\n", "Все параметры настраиваются в процессе обучения." ] }, { "cell_type": "code", "execution_count": 5, "id": "91b073d9", "metadata": {}, "outputs": [ { "data": { "text/html": [ "
Model: \"sequential\"\n",
       "
\n" ], "text/plain": [ "\u001b[1mModel: \"sequential\"\u001b[0m\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n",
       "┃ Layer (type)                     Output Shape                  Param # ┃\n",
       "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n",
       "│ dense (Dense)                   │ (None, 64)             │        50,240 │\n",
       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
       "│ dense_1 (Dense)                 │ (None, 10)             │           650 │\n",
       "└─────────────────────────────────┴────────────────────────┴───────────────┘\n",
       "
\n" ], "text/plain": [ "┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n", "┃\u001b[1m \u001b[0m\u001b[1mLayer (type) \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1mOutput Shape \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1m Param #\u001b[0m\u001b[1m \u001b[0m┃\n", "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n", "│ dense (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m64\u001b[0m) │ \u001b[38;5;34m50,240\u001b[0m │\n", "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", "│ dense_1 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m10\u001b[0m) │ \u001b[38;5;34m650\u001b[0m │\n", "└─────────────────────────────────┴────────────────────────┴───────────────┘\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
 Total params: 50,890 (198.79 KB)\n",
       "
\n" ], "text/plain": [ "\u001b[1m Total params: \u001b[0m\u001b[38;5;34m50,890\u001b[0m (198.79 KB)\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
 Trainable params: 50,890 (198.79 KB)\n",
       "
\n" ], "text/plain": [ "\u001b[1m Trainable params: \u001b[0m\u001b[38;5;34m50,890\u001b[0m (198.79 KB)\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
 Non-trainable params: 0 (0.00 B)\n",
       "
\n" ], "text/plain": [ "\u001b[1m Non-trainable params: \u001b[0m\u001b[38;5;34m0\u001b[0m (0.00 B)\n" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "from keras.api.models import Sequential\n", "from keras.api.layers import Dense, InputLayer\n", "\n", "simple_model = Sequential()\n", "simple_model.add(InputLayer(shape=(28*28,)))\n", "simple_model.add(Dense(64, activation=\"sigmoid\"))\n", "simple_model.add(Dense(n_classes, activation=\"softmax\"))\n", "simple_model.summary()" ] }, { "cell_type": "markdown", "id": "6a21eb9e", "metadata": {}, "source": [ "#### Обучение простой модели\n", "\n", "Функция стоимости: MSE (квадратичная функция)\n", "\n", "Оптимизатор: стохастический градиентный спуск (SGD)\n", "\n", "Скорость обучения: 0.01\n", "\n", "Количество эпох: 200\n", "\n", "Размер пакета: 128\n", "\n", "Всего пакетов: 60 000 / 128 = 468.75 (468 пакетов по 128 изображений и 1 пакет с 96 изображениями)\n", "\n", "Метрика оценки качества: accuracy\n", "\n", "Оценка качества и стоимость на обучающей выборке:\\\n", "accuracy: 0.4650 - loss: 0.0849\n", "\n", "Оценка качества и стоимость на тестовой выборке:\\\n", "val_accuracy: 0.4703 - val_loss: 0.0845" ] }, { "cell_type": "code", "execution_count": 6, "id": "b65eca38", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch 1/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 2ms/step - accuracy: 0.1217 - loss: 0.0930 - val_accuracy: 0.1302 - val_loss: 0.0919\n", "Epoch 2/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 797us/step - accuracy: 0.1303 - loss: 0.0916 - val_accuracy: 0.1377 - val_loss: 0.0909\n", "Epoch 3/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 749us/step - accuracy: 0.1397 - loss: 0.0906 - val_accuracy: 0.1498 - val_loss: 0.0902\n", "Epoch 
4/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 832us/step - accuracy: 0.1451 - loss: 0.0901 - val_accuracy: 0.1616 - val_loss: 0.0897\n", "Epoch 5/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 763us/step - accuracy: 0.1590 - loss: 0.0896 - val_accuracy: 0.1792 - val_loss: 0.0892\n", "Epoch 6/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 695us/step - accuracy: 0.1774 - loss: 0.0891 - val_accuracy: 0.2015 - val_loss: 0.0888\n", "Epoch 7/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 771us/step - accuracy: 0.2010 - loss: 0.0888 - val_accuracy: 0.2305 - val_loss: 0.0885\n", "Epoch 8/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 862us/step - accuracy: 0.2317 - loss: 0.0885 - val_accuracy: 0.2665 - val_loss: 0.0882\n", "Epoch 9/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 931us/step - accuracy: 0.2713 - loss: 0.0881 - val_accuracy: 0.3034 - val_loss: 0.0878\n", "Epoch 10/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 735us/step - accuracy: 0.3060 - loss: 0.0878 - val_accuracy: 0.3347 - val_loss: 0.0875\n", "Epoch 11/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 711us/step - accuracy: 0.3334 - loss: 0.0875 - val_accuracy: 0.3617 - val_loss: 0.0872\n", "Epoch 12/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 841us/step - accuracy: 0.3607 - loss: 0.0872 - val_accuracy: 0.3854 - val_loss: 0.0869\n", "Epoch 13/200\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 782us/step - accuracy: 0.3807 - loss: 0.0869 - val_accuracy: 0.4040 - val_loss: 0.0866\n", "Epoch 14/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 800us/step - accuracy: 0.3964 - loss: 0.0867 - val_accuracy: 0.4214 - val_loss: 0.0863\n", "Epoch 15/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 747us/step - accuracy: 0.4183 - loss: 0.0863 - val_accuracy: 0.4375 - val_loss: 0.0860\n", "Epoch 16/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 828us/step - accuracy: 0.4257 - loss: 0.0860 - val_accuracy: 0.4483 - val_loss: 0.0857\n", "Epoch 17/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 831us/step - accuracy: 0.4421 - loss: 0.0857 - val_accuracy: 0.4594 - val_loss: 0.0853\n", "Epoch 18/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 809us/step - accuracy: 0.4493 - loss: 0.0854 - val_accuracy: 0.4693 - val_loss: 0.0850\n", "Epoch 19/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 903us/step - accuracy: 0.4597 - loss: 0.0850 - val_accuracy: 0.4793 - val_loss: 0.0846\n", "Epoch 20/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 996us/step - accuracy: 0.4715 - loss: 0.0846 - val_accuracy: 0.4873 - val_loss: 0.0843\n", "Epoch 21/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 915us/step - accuracy: 0.4783 - loss: 0.0843 - val_accuracy: 0.4939 - val_loss: 0.0839\n", "Epoch 22/200\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 1ms/step - accuracy: 0.4842 - loss: 0.0839 - val_accuracy: 0.5003 - val_loss: 0.0835\n", "Epoch 23/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 938us/step - accuracy: 0.4904 - loss: 0.0836 - val_accuracy: 0.5059 - val_loss: 0.0832\n", "Epoch 24/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 867us/step - accuracy: 0.4936 - loss: 0.0832 - val_accuracy: 0.5099 - val_loss: 0.0828\n", "Epoch 25/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 941us/step - accuracy: 0.4990 - loss: 0.0828 - val_accuracy: 0.5142 - val_loss: 0.0823\n", "Epoch 26/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.5010 - loss: 0.0824 - val_accuracy: 0.5187 - val_loss: 0.0819\n", "Epoch 27/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.5052 - loss: 0.0820 - val_accuracy: 0.5223 - val_loss: 0.0815\n", "Epoch 28/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 1ms/step - accuracy: 0.5082 - loss: 0.0816 - val_accuracy: 0.5268 - val_loss: 0.0811\n", "Epoch 29/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.5127 - loss: 0.0811 - val_accuracy: 0.5304 - val_loss: 0.0806\n", "Epoch 30/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 923us/step - accuracy: 0.5189 - loss: 0.0807 - val_accuracy: 0.5343 - val_loss: 0.0802\n", "Epoch 31/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m 
\u001b[1m0s\u001b[0m 844us/step - accuracy: 0.5180 - loss: 0.0803 - val_accuracy: 0.5379 - val_loss: 0.0797\n", "Epoch 32/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 732us/step - accuracy: 0.5233 - loss: 0.0798 - val_accuracy: 0.5414 - val_loss: 0.0792\n", "Epoch 33/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 806us/step - accuracy: 0.5250 - loss: 0.0793 - val_accuracy: 0.5446 - val_loss: 0.0787\n", "Epoch 34/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 861us/step - accuracy: 0.5325 - loss: 0.0788 - val_accuracy: 0.5480 - val_loss: 0.0782\n", "Epoch 35/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 873us/step - accuracy: 0.5349 - loss: 0.0783 - val_accuracy: 0.5511 - val_loss: 0.0777\n", "Epoch 36/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 927us/step - accuracy: 0.5351 - loss: 0.0778 - val_accuracy: 0.5545 - val_loss: 0.0772\n", "Epoch 37/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 928us/step - accuracy: 0.5426 - loss: 0.0773 - val_accuracy: 0.5591 - val_loss: 0.0767\n", "Epoch 38/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 899us/step - accuracy: 0.5458 - loss: 0.0768 - val_accuracy: 0.5618 - val_loss: 0.0762\n", "Epoch 39/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 877us/step - accuracy: 0.5514 - loss: 0.0764 - val_accuracy: 0.5650 - val_loss: 0.0757\n", "Epoch 40/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 885us/step - accuracy: 0.5581 - 
loss: 0.0757 - val_accuracy: 0.5666 - val_loss: 0.0752\n", "Epoch 41/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 845us/step - accuracy: 0.5623 - loss: 0.0751 - val_accuracy: 0.5709 - val_loss: 0.0746\n", "Epoch 42/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 826us/step - accuracy: 0.5606 - loss: 0.0748 - val_accuracy: 0.5743 - val_loss: 0.0741\n", "Epoch 43/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 793us/step - accuracy: 0.5669 - loss: 0.0742 - val_accuracy: 0.5767 - val_loss: 0.0736\n", "Epoch 44/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 804us/step - accuracy: 0.5684 - loss: 0.0736 - val_accuracy: 0.5807 - val_loss: 0.0730\n", "Epoch 45/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 865us/step - accuracy: 0.5733 - loss: 0.0731 - val_accuracy: 0.5843 - val_loss: 0.0725\n", "Epoch 46/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 868us/step - accuracy: 0.5780 - loss: 0.0725 - val_accuracy: 0.5900 - val_loss: 0.0720\n", "Epoch 47/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 834us/step - accuracy: 0.5784 - loss: 0.0722 - val_accuracy: 0.5937 - val_loss: 0.0714\n", "Epoch 48/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 897us/step - accuracy: 0.5810 - loss: 0.0716 - val_accuracy: 0.5957 - val_loss: 0.0709\n", "Epoch 49/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 908us/step - accuracy: 0.5852 - loss: 0.0710 - val_accuracy: 0.5989 - val_loss: 
0.0704\n", "Epoch 50/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 867us/step - accuracy: 0.5891 - loss: 0.0705 - val_accuracy: 0.6031 - val_loss: 0.0698\n", "Epoch 51/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 877us/step - accuracy: 0.5936 - loss: 0.0699 - val_accuracy: 0.6062 - val_loss: 0.0693\n", "Epoch 52/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 925us/step - accuracy: 0.5938 - loss: 0.0696 - val_accuracy: 0.6092 - val_loss: 0.0688\n", "Epoch 53/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 973us/step - accuracy: 0.5974 - loss: 0.0690 - val_accuracy: 0.6109 - val_loss: 0.0682\n", "Epoch 54/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 873us/step - accuracy: 0.6022 - loss: 0.0683 - val_accuracy: 0.6139 - val_loss: 0.0677\n", "Epoch 55/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 953us/step - accuracy: 0.6052 - loss: 0.0679 - val_accuracy: 0.6179 - val_loss: 0.0672\n", "Epoch 56/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 911us/step - accuracy: 0.6093 - loss: 0.0674 - val_accuracy: 0.6207 - val_loss: 0.0667\n", "Epoch 57/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 874us/step - accuracy: 0.6134 - loss: 0.0669 - val_accuracy: 0.6244 - val_loss: 0.0662\n", "Epoch 58/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 878us/step - accuracy: 0.6126 - loss: 0.0665 - val_accuracy: 0.6261 - val_loss: 0.0656\n", "Epoch 59/200\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 837us/step - accuracy: 0.6177 - loss: 0.0659 - val_accuracy: 0.6289 - val_loss: 0.0651\n", "Epoch 60/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 910us/step - accuracy: 0.6224 - loss: 0.0654 - val_accuracy: 0.6308 - val_loss: 0.0646\n", "Epoch 61/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 842us/step - accuracy: 0.6295 - loss: 0.0647 - val_accuracy: 0.6336 - val_loss: 0.0641\n", "Epoch 62/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 1ms/step - accuracy: 0.6281 - loss: 0.0644 - val_accuracy: 0.6363 - val_loss: 0.0636\n", "Epoch 63/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.6307 - loss: 0.0638 - val_accuracy: 0.6413 - val_loss: 0.0631\n", "Epoch 64/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 908us/step - accuracy: 0.6346 - loss: 0.0634 - val_accuracy: 0.6452 - val_loss: 0.0627\n", "Epoch 65/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 796us/step - accuracy: 0.6368 - loss: 0.0630 - val_accuracy: 0.6493 - val_loss: 0.0622\n", "Epoch 66/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 817us/step - accuracy: 0.6375 - loss: 0.0626 - val_accuracy: 0.6525 - val_loss: 0.0617\n", "Epoch 67/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 895us/step - accuracy: 0.6433 - loss: 0.0622 - val_accuracy: 0.6561 - val_loss: 0.0612\n", "Epoch 68/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m 
\u001b[1m0s\u001b[0m 791us/step - accuracy: 0.6477 - loss: 0.0614 - val_accuracy: 0.6598 - val_loss: 0.0607\n", "Epoch 69/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 910us/step - accuracy: 0.6497 - loss: 0.0612 - val_accuracy: 0.6647 - val_loss: 0.0603\n", "Epoch 70/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 904us/step - accuracy: 0.6534 - loss: 0.0606 - val_accuracy: 0.6674 - val_loss: 0.0598\n", "Epoch 71/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 727us/step - accuracy: 0.6596 - loss: 0.0601 - val_accuracy: 0.6695 - val_loss: 0.0594\n", "Epoch 72/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 659us/step - accuracy: 0.6605 - loss: 0.0598 - val_accuracy: 0.6731 - val_loss: 0.0589\n", "Epoch 73/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 732us/step - accuracy: 0.6656 - loss: 0.0591 - val_accuracy: 0.6770 - val_loss: 0.0584\n", "Epoch 74/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 712us/step - accuracy: 0.6699 - loss: 0.0588 - val_accuracy: 0.6797 - val_loss: 0.0580\n", "Epoch 75/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 769us/step - accuracy: 0.6747 - loss: 0.0583 - val_accuracy: 0.6829 - val_loss: 0.0576\n", "Epoch 76/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 742us/step - accuracy: 0.6751 - loss: 0.0579 - val_accuracy: 0.6855 - val_loss: 0.0571\n", "Epoch 77/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 740us/step - accuracy: 0.6758 - 
loss: 0.0576 - val_accuracy: 0.6883 - val_loss: 0.0567\n", "Epoch 78/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 738us/step - accuracy: 0.6848 - loss: 0.0570 - val_accuracy: 0.6914 - val_loss: 0.0563\n", "Epoch 79/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 804us/step - accuracy: 0.6872 - loss: 0.0564 - val_accuracy: 0.6949 - val_loss: 0.0558\n", "Epoch 80/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 768us/step - accuracy: 0.6886 - loss: 0.0563 - val_accuracy: 0.6977 - val_loss: 0.0554\n", "Epoch 81/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 765us/step - accuracy: 0.6933 - loss: 0.0557 - val_accuracy: 0.7008 - val_loss: 0.0550\n", "Epoch 82/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 795us/step - accuracy: 0.6968 - loss: 0.0551 - val_accuracy: 0.7040 - val_loss: 0.0546\n", "Epoch 83/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 798us/step - accuracy: 0.6965 - loss: 0.0551 - val_accuracy: 0.7077 - val_loss: 0.0542\n", "Epoch 84/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 755us/step - accuracy: 0.7000 - loss: 0.0545 - val_accuracy: 0.7113 - val_loss: 0.0538\n", "Epoch 85/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 782us/step - accuracy: 0.7028 - loss: 0.0542 - val_accuracy: 0.7132 - val_loss: 0.0534\n", "Epoch 86/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 723us/step - accuracy: 0.7048 - loss: 0.0539 - val_accuracy: 0.7149 - val_loss: 
0.0530\n", "Epoch 87/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 788us/step - accuracy: 0.7078 - loss: 0.0534 - val_accuracy: 0.7172 - val_loss: 0.0526\n", "Epoch 88/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 738us/step - accuracy: 0.7100 - loss: 0.0529 - val_accuracy: 0.7191 - val_loss: 0.0522\n", "Epoch 89/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 780us/step - accuracy: 0.7132 - loss: 0.0526 - val_accuracy: 0.7200 - val_loss: 0.0518\n", "Epoch 90/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 839us/step - accuracy: 0.7164 - loss: 0.0521 - val_accuracy: 0.7233 - val_loss: 0.0514\n", "Epoch 91/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 709us/step - accuracy: 0.7173 - loss: 0.0517 - val_accuracy: 0.7254 - val_loss: 0.0510\n", "Epoch 92/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 697us/step - accuracy: 0.7213 - loss: 0.0515 - val_accuracy: 0.7273 - val_loss: 0.0507\n", "Epoch 93/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 675us/step - accuracy: 0.7231 - loss: 0.0510 - val_accuracy: 0.7297 - val_loss: 0.0503\n", "Epoch 94/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 682us/step - accuracy: 0.7227 - loss: 0.0508 - val_accuracy: 0.7321 - val_loss: 0.0499\n", "Epoch 95/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 753us/step - accuracy: 0.7235 - loss: 0.0506 - val_accuracy: 0.7350 - val_loss: 0.0496\n", "Epoch 96/200\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 675us/step - accuracy: 0.7283 - loss: 0.0500 - val_accuracy: 0.7371 - val_loss: 0.0492\n", "Epoch 97/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 670us/step - accuracy: 0.7300 - loss: 0.0496 - val_accuracy: 0.7382 - val_loss: 0.0488\n", "Epoch 98/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 680us/step - accuracy: 0.7328 - loss: 0.0493 - val_accuracy: 0.7402 - val_loss: 0.0485\n", "Epoch 99/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 702us/step - accuracy: 0.7338 - loss: 0.0491 - val_accuracy: 0.7416 - val_loss: 0.0481\n", "Epoch 100/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 696us/step - accuracy: 0.7358 - loss: 0.0487 - val_accuracy: 0.7435 - val_loss: 0.0478\n", "Epoch 101/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 703us/step - accuracy: 0.7375 - loss: 0.0483 - val_accuracy: 0.7453 - val_loss: 0.0475\n", "Epoch 102/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 700us/step - accuracy: 0.7374 - loss: 0.0481 - val_accuracy: 0.7468 - val_loss: 0.0471\n", "Epoch 103/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 759us/step - accuracy: 0.7400 - loss: 0.0477 - val_accuracy: 0.7483 - val_loss: 0.0468\n", "Epoch 104/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 707us/step - accuracy: 0.7412 - loss: 0.0475 - val_accuracy: 0.7502 - val_loss: 0.0465\n", "Epoch 105/200\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 711us/step - accuracy: 0.7424 - loss: 0.0472 - val_accuracy: 0.7516 - val_loss: 0.0461\n", "Epoch 106/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 761us/step - accuracy: 0.7455 - loss: 0.0467 - val_accuracy: 0.7539 - val_loss: 0.0458\n", "Epoch 107/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 718us/step - accuracy: 0.7485 - loss: 0.0463 - val_accuracy: 0.7560 - val_loss: 0.0455\n", "Epoch 108/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 745us/step - accuracy: 0.7490 - loss: 0.0460 - val_accuracy: 0.7588 - val_loss: 0.0452\n", "Epoch 109/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 782us/step - accuracy: 0.7521 - loss: 0.0456 - val_accuracy: 0.7607 - val_loss: 0.0449\n", "Epoch 110/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 796us/step - accuracy: 0.7542 - loss: 0.0451 - val_accuracy: 0.7621 - val_loss: 0.0446\n", "Epoch 111/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 751us/step - accuracy: 0.7558 - loss: 0.0449 - val_accuracy: 0.7635 - val_loss: 0.0443\n", "Epoch 112/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 768us/step - accuracy: 0.7568 - loss: 0.0446 - val_accuracy: 0.7658 - val_loss: 0.0439\n", "Epoch 113/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 848us/step - accuracy: 0.7598 - loss: 0.0443 - val_accuracy: 0.7676 - val_loss: 0.0436\n", "Epoch 114/200\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 789us/step - accuracy: 0.7588 - loss: 0.0443 - val_accuracy: 0.7697 - val_loss: 0.0433\n", "Epoch 115/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 811us/step - accuracy: 0.7617 - loss: 0.0440 - val_accuracy: 0.7727 - val_loss: 0.0431\n", "Epoch 116/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 867us/step - accuracy: 0.7634 - loss: 0.0437 - val_accuracy: 0.7746 - val_loss: 0.0428\n", "Epoch 117/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 834us/step - accuracy: 0.7650 - loss: 0.0434 - val_accuracy: 0.7770 - val_loss: 0.0425\n", "Epoch 118/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 729us/step - accuracy: 0.7706 - loss: 0.0429 - val_accuracy: 0.7792 - val_loss: 0.0422\n", "Epoch 119/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 676us/step - accuracy: 0.7693 - loss: 0.0428 - val_accuracy: 0.7817 - val_loss: 0.0419\n", "Epoch 120/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 742us/step - accuracy: 0.7732 - loss: 0.0424 - val_accuracy: 0.7841 - val_loss: 0.0416\n", "Epoch 121/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 700us/step - accuracy: 0.7724 - loss: 0.0422 - val_accuracy: 0.7864 - val_loss: 0.0414\n", "Epoch 122/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 705us/step - accuracy: 0.7760 - loss: 0.0420 - val_accuracy: 0.7890 - val_loss: 0.0411\n", "Epoch 123/200\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 685us/step - accuracy: 0.7797 - loss: 0.0416 - val_accuracy: 0.7911 - val_loss: 0.0408\n", "Epoch 124/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 791us/step - accuracy: 0.7798 - loss: 0.0414 - val_accuracy: 0.7927 - val_loss: 0.0405\n", "Epoch 125/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 707us/step - accuracy: 0.7831 - loss: 0.0411 - val_accuracy: 0.7951 - val_loss: 0.0403\n", "Epoch 126/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 659us/step - accuracy: 0.7845 - loss: 0.0410 - val_accuracy: 0.7968 - val_loss: 0.0400\n", "Epoch 127/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 768us/step - accuracy: 0.7861 - loss: 0.0407 - val_accuracy: 0.7983 - val_loss: 0.0398\n", "Epoch 128/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 754us/step - accuracy: 0.7894 - loss: 0.0404 - val_accuracy: 0.8005 - val_loss: 0.0395\n", "Epoch 129/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 742us/step - accuracy: 0.7918 - loss: 0.0402 - val_accuracy: 0.8016 - val_loss: 0.0392\n", "Epoch 130/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 775us/step - accuracy: 0.7946 - loss: 0.0399 - val_accuracy: 0.8046 - val_loss: 0.0390\n", "Epoch 131/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 832us/step - accuracy: 0.7928 - loss: 0.0398 - val_accuracy: 0.8067 - val_loss: 0.0387\n", "Epoch 132/200\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 775us/step - accuracy: 0.7974 - loss: 0.0394 - val_accuracy: 0.8080 - val_loss: 0.0385\n", "Epoch 133/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 669us/step - accuracy: 0.7985 - loss: 0.0392 - val_accuracy: 0.8103 - val_loss: 0.0383\n", "Epoch 134/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 829us/step - accuracy: 0.8031 - loss: 0.0387 - val_accuracy: 0.8119 - val_loss: 0.0380\n", "Epoch 135/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 696us/step - accuracy: 0.8024 - loss: 0.0386 - val_accuracy: 0.8138 - val_loss: 0.0378\n", "Epoch 136/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 796us/step - accuracy: 0.8035 - loss: 0.0385 - val_accuracy: 0.8158 - val_loss: 0.0375\n", "Epoch 137/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 857us/step - accuracy: 0.8061 - loss: 0.0382 - val_accuracy: 0.8171 - val_loss: 0.0373\n", "Epoch 138/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 811us/step - accuracy: 0.8057 - loss: 0.0381 - val_accuracy: 0.8193 - val_loss: 0.0371\n", "Epoch 139/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 755us/step - accuracy: 0.8090 - loss: 0.0377 - val_accuracy: 0.8201 - val_loss: 0.0369\n", "Epoch 140/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 863us/step - accuracy: 0.8108 - loss: 0.0377 - val_accuracy: 0.8212 - val_loss: 0.0366\n", "Epoch 141/200\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 735us/step - accuracy: 0.8104 - loss: 0.0373 - val_accuracy: 0.8225 - val_loss: 0.0364\n", "Epoch 142/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 723us/step - accuracy: 0.8155 - loss: 0.0369 - val_accuracy: 0.8236 - val_loss: 0.0362\n", "Epoch 143/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 711us/step - accuracy: 0.8163 - loss: 0.0367 - val_accuracy: 0.8250 - val_loss: 0.0360\n", "Epoch 144/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 872us/step - accuracy: 0.8165 - loss: 0.0368 - val_accuracy: 0.8258 - val_loss: 0.0358\n", "Epoch 145/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 831us/step - accuracy: 0.8207 - loss: 0.0362 - val_accuracy: 0.8275 - val_loss: 0.0355\n", "Epoch 146/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 717us/step - accuracy: 0.8186 - loss: 0.0363 - val_accuracy: 0.8287 - val_loss: 0.0353\n", "Epoch 147/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 787us/step - accuracy: 0.8209 - loss: 0.0361 - val_accuracy: 0.8297 - val_loss: 0.0351\n", "Epoch 148/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 728us/step - accuracy: 0.8207 - loss: 0.0360 - val_accuracy: 0.8306 - val_loss: 0.0349\n", "Epoch 149/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 749us/step - accuracy: 0.8244 - loss: 0.0356 - val_accuracy: 0.8316 - val_loss: 0.0347\n", "Epoch 150/200\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 805us/step - accuracy: 0.8228 - loss: 0.0356 - val_accuracy: 0.8320 - val_loss: 0.0345\n", "Epoch 151/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 741us/step - accuracy: 0.8252 - loss: 0.0350 - val_accuracy: 0.8331 - val_loss: 0.0343\n", "Epoch 152/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 767us/step - accuracy: 0.8275 - loss: 0.0349 - val_accuracy: 0.8344 - val_loss: 0.0341\n", "Epoch 153/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 874us/step - accuracy: 0.8277 - loss: 0.0348 - val_accuracy: 0.8353 - val_loss: 0.0339\n", "Epoch 154/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 815us/step - accuracy: 0.8269 - loss: 0.0349 - val_accuracy: 0.8365 - val_loss: 0.0337\n", "Epoch 155/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 795us/step - accuracy: 0.8280 - loss: 0.0346 - val_accuracy: 0.8376 - val_loss: 0.0335\n", "Epoch 156/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 859us/step - accuracy: 0.8330 - loss: 0.0344 - val_accuracy: 0.8392 - val_loss: 0.0334\n", "Epoch 157/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 818us/step - accuracy: 0.8307 - loss: 0.0341 - val_accuracy: 0.8405 - val_loss: 0.0332\n", "Epoch 158/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 859us/step - accuracy: 0.8341 - loss: 0.0337 - val_accuracy: 0.8414 - val_loss: 0.0330\n", "Epoch 159/200\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 817us/step - accuracy: 0.8358 - loss: 0.0336 - val_accuracy: 0.8420 - val_loss: 0.0328\n", "Epoch 160/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 882us/step - accuracy: 0.8338 - loss: 0.0337 - val_accuracy: 0.8421 - val_loss: 0.0326\n", "Epoch 161/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 827us/step - accuracy: 0.8344 - loss: 0.0334 - val_accuracy: 0.8429 - val_loss: 0.0325\n", "Epoch 162/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 817us/step - accuracy: 0.8372 - loss: 0.0330 - val_accuracy: 0.8439 - val_loss: 0.0323\n", "Epoch 163/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 703us/step - accuracy: 0.8363 - loss: 0.0329 - val_accuracy: 0.8444 - val_loss: 0.0321\n", "Epoch 164/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 762us/step - accuracy: 0.8368 - loss: 0.0329 - val_accuracy: 0.8450 - val_loss: 0.0319\n", "Epoch 165/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 714us/step - accuracy: 0.8373 - loss: 0.0327 - val_accuracy: 0.8461 - val_loss: 0.0318\n", "Epoch 166/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 710us/step - accuracy: 0.8380 - loss: 0.0326 - val_accuracy: 0.8468 - val_loss: 0.0316\n", "Epoch 167/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 782us/step - accuracy: 0.8398 - loss: 0.0325 - val_accuracy: 0.8476 - val_loss: 0.0315\n", "Epoch 168/200\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 762us/step - accuracy: 0.8409 - loss: 0.0322 - val_accuracy: 0.8486 - val_loss: 0.0313\n", "Epoch 169/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 696us/step - accuracy: 0.8406 - loss: 0.0321 - val_accuracy: 0.8493 - val_loss: 0.0311\n", "Epoch 170/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 750us/step - accuracy: 0.8407 - loss: 0.0320 - val_accuracy: 0.8499 - val_loss: 0.0310\n", "Epoch 171/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 799us/step - accuracy: 0.8438 - loss: 0.0317 - val_accuracy: 0.8507 - val_loss: 0.0308\n", "Epoch 172/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 756us/step - accuracy: 0.8417 - loss: 0.0317 - val_accuracy: 0.8514 - val_loss: 0.0307\n", "Epoch 173/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 740us/step - accuracy: 0.8441 - loss: 0.0314 - val_accuracy: 0.8520 - val_loss: 0.0305\n", "Epoch 174/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 760us/step - accuracy: 0.8424 - loss: 0.0315 - val_accuracy: 0.8529 - val_loss: 0.0304\n", "Epoch 175/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 750us/step - accuracy: 0.8462 - loss: 0.0312 - val_accuracy: 0.8542 - val_loss: 0.0302\n", "Epoch 176/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 732us/step - accuracy: 0.8447 - loss: 0.0312 - val_accuracy: 0.8546 - val_loss: 0.0301\n", "Epoch 177/200\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 759us/step - accuracy: 0.8468 - loss: 0.0308 - val_accuracy: 0.8550 - val_loss: 0.0299\n", "Epoch 178/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 705us/step - accuracy: 0.8481 - loss: 0.0306 - val_accuracy: 0.8553 - val_loss: 0.0298\n", "Epoch 179/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 678us/step - accuracy: 0.8475 - loss: 0.0306 - val_accuracy: 0.8557 - val_loss: 0.0296\n", "Epoch 180/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 724us/step - accuracy: 0.8465 - loss: 0.0307 - val_accuracy: 0.8565 - val_loss: 0.0295\n", "Epoch 181/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 722us/step - accuracy: 0.8505 - loss: 0.0302 - val_accuracy: 0.8571 - val_loss: 0.0294\n", "Epoch 182/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 729us/step - accuracy: 0.8495 - loss: 0.0302 - val_accuracy: 0.8575 - val_loss: 0.0292\n", "Epoch 183/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 750us/step - accuracy: 0.8505 - loss: 0.0300 - val_accuracy: 0.8579 - val_loss: 0.0291\n", "Epoch 184/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 766us/step - accuracy: 0.8497 - loss: 0.0300 - val_accuracy: 0.8583 - val_loss: 0.0290\n", "Epoch 185/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 753us/step - accuracy: 0.8524 - loss: 0.0297 - val_accuracy: 0.8589 - val_loss: 0.0288\n", "Epoch 186/200\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 772us/step - accuracy: 0.8520 - loss: 0.0297 - val_accuracy: 0.8594 - val_loss: 0.0287\n", "Epoch 187/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 740us/step - accuracy: 0.8503 - loss: 0.0297 - val_accuracy: 0.8596 - val_loss: 0.0286\n", "Epoch 188/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 697us/step - accuracy: 0.8539 - loss: 0.0293 - val_accuracy: 0.8606 - val_loss: 0.0284\n", "Epoch 189/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 818us/step - accuracy: 0.8537 - loss: 0.0291 - val_accuracy: 0.8613 - val_loss: 0.0283\n", "Epoch 190/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 661us/step - accuracy: 0.8506 - loss: 0.0295 - val_accuracy: 0.8617 - val_loss: 0.0282\n", "Epoch 191/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 709us/step - accuracy: 0.8535 - loss: 0.0292 - val_accuracy: 0.8625 - val_loss: 0.0281\n", "Epoch 192/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 779us/step - accuracy: 0.8552 - loss: 0.0289 - val_accuracy: 0.8627 - val_loss: 0.0279\n", "Epoch 193/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 784us/step - accuracy: 0.8553 - loss: 0.0287 - val_accuracy: 0.8634 - val_loss: 0.0278\n", "Epoch 194/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 761us/step - accuracy: 0.8543 - loss: 0.0287 - val_accuracy: 0.8642 - val_loss: 0.0277\n", "Epoch 195/200\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 896us/step - accuracy: 0.8547 - loss: 0.0287 - val_accuracy: 0.8647 - val_loss: 0.0276\n", "Epoch 196/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 810us/step - accuracy: 0.8526 - loss: 0.0287 - val_accuracy: 0.8647 - val_loss: 0.0275\n", "Epoch 197/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 882us/step - accuracy: 0.8550 - loss: 0.0283 - val_accuracy: 0.8646 - val_loss: 0.0274\n", "Epoch 198/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 844us/step - accuracy: 0.8567 - loss: 0.0284 - val_accuracy: 0.8653 - val_loss: 0.0273\n", "Epoch 199/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 726us/step - accuracy: 0.8578 - loss: 0.0280 - val_accuracy: 0.8657 - val_loss: 0.0271\n", "Epoch 200/200\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 904us/step - accuracy: 0.8595 - loss: 0.0278 - val_accuracy: 0.8662 - val_loss: 0.0270\n" ] }, { "data": { "text/plain": [ "" ] }, "execution_count": 6, "metadata": {}, "output_type": "execute_result" } ], "source": [ "from keras.api.optimizers import SGD\n", "\n", "simple_model.compile(\n", " loss=\"mean_squared_error\",\n", " optimizer=SGD(learning_rate=0.01),\n", " metrics=[\"accuracy\"],\n", ")\n", "\n", "simple_model.fit(\n", " X_train,\n", " y_train,\n", " batch_size=128,\n", " epochs=200,\n", " validation_data=(X_valid, y_valid),\n", ")" ] }, { "cell_type": "markdown", "id": "20501a84", "metadata": {}, "source": [ "#### Оценка качества простой модели\n", "\n", "Лучшее качество модели: 86.6 %" ] }, { "cell_type": "code", "execution_count": 7, "id": "30db9b2c", "metadata": {}, "outputs": [ { "name": "stdout", 
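Правило обновления, которое применяет SGD(learning_rate=0.01), можно проиллюстрировать схематичным наброском на чистом Python (одномерная функция и её градиент выбраны условно для примера):

```python
# Минимизируем f(w) = (w - 3)^2; аналитический градиент: df/dw = 2 * (w - 3)
def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0                # начальное значение параметра
learning_rate = 0.01   # та же скорость обучения, что и в SGD(learning_rate=0.01)

for _ in range(1000):
    w -= learning_rate * grad(w)  # правило обновления: w <- w - lr * grad(w)

print(round(w, 3))  # сходится к минимуму w = 3
```

В настоящем SGD градиент считается по случайному мини-пакету (здесь batch_size=128), но само правило обновления то же.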
"output_type": "stream", "text": [ "\u001b[1m313/313\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 850us/step - accuracy: 0.8494 - loss: 0.0299\n" ] }, { "data": { "text/plain": [ "[0.027035443112254143, 0.8661999702453613]" ] }, "execution_count": 7, "metadata": {}, "output_type": "execute_result" } ], "source": [ "simple_model.evaluate(X_valid, y_valid)" ] }, { "cell_type": "markdown", "id": "5db127c8", "metadata": {}, "source": [ "#### Проектирование архитектуры более сложной ИНС\n", "\n", "Добавлен дополнительный скрытый полносвязный слой\n", "\n", "Все скрытые слои используют ReLU-нейроны" ] }, { "cell_type": "code", "execution_count": 8, "id": "d2f2c585", "metadata": {}, "outputs": [ { "data": { "text/html": [ "
Model: \"sequential_1\"\n",
       "
\n" ], "text/plain": [ "\u001b[1mModel: \"sequential_1\"\u001b[0m\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n",
       "┃ Layer (type)                     Output Shape                  Param # ┃\n",
       "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n",
       "│ dense_2 (Dense)                 │ (None, 64)             │        50,240 │\n",
       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
       "│ dense_3 (Dense)                 │ (None, 64)             │         4,160 │\n",
       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
       "│ dense_4 (Dense)                 │ (None, 10)             │           650 │\n",
       "└─────────────────────────────────┴────────────────────────┴───────────────┘\n",
       "
\n" ], "text/plain": [ "┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n", "┃\u001b[1m \u001b[0m\u001b[1mLayer (type) \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1mOutput Shape \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1m Param #\u001b[0m\u001b[1m \u001b[0m┃\n", "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n", "│ dense_2 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m64\u001b[0m) │ \u001b[38;5;34m50,240\u001b[0m │\n", "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", "│ dense_3 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m64\u001b[0m) │ \u001b[38;5;34m4,160\u001b[0m │\n", "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", "│ dense_4 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m10\u001b[0m) │ \u001b[38;5;34m650\u001b[0m │\n", "└─────────────────────────────────┴────────────────────────┴───────────────┘\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
 Total params: 55,050 (215.04 KB)\n",
       "
\n" ], "text/plain": [ "\u001b[1m Total params: \u001b[0m\u001b[38;5;34m55,050\u001b[0m (215.04 KB)\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
 Trainable params: 55,050 (215.04 KB)\n",
       "
\n" ], "text/plain": [ "\u001b[1m Trainable params: \u001b[0m\u001b[38;5;34m55,050\u001b[0m (215.04 KB)\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
 Non-trainable params: 0 (0.00 B)\n",
       "
\n" ], "text/plain": [ "\u001b[1m Non-trainable params: \u001b[0m\u001b[38;5;34m0\u001b[0m (0.00 B)\n" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "difficult_model = Sequential()\n", "difficult_model.add(InputLayer(shape=(28 * 28,)))\n", "difficult_model.add(Dense(64, activation=\"relu\"))\n", "difficult_model.add(Dense(64, activation=\"relu\"))\n", "difficult_model.add(Dense(10, activation=\"softmax\"))\n", "difficult_model.summary()" ] }, { "cell_type": "markdown", "id": "5650eddc", "metadata": {}, "source": [ "#### Обучение более сложной модели\n", "\n", "Функция стоимости изменена на перекрестную энтропию (лучше подходит для классификации)\n", "\n", "Количество эпох уменьшено с 200 до 20" ] }, { "cell_type": "code", "execution_count": 9, "id": "10fc0413", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch 1/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 2ms/step - accuracy: 0.7680 - loss: 0.8001 - val_accuracy: 0.9225 - val_loss: 0.2617\n", "Epoch 2/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 922us/step - accuracy: 0.9279 - loss: 0.2443 - val_accuracy: 0.9413 - val_loss: 0.2026\n", "Epoch 3/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9458 - loss: 0.1841 - val_accuracy: 0.9488 - val_loss: 0.1680\n", "Epoch 4/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 897us/step - accuracy: 0.9555 - loss: 0.1511 - val_accuracy: 0.9585 - val_loss: 0.1303\n", "Epoch 5/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 853us/step - accuracy: 0.9614 - loss: 0.1277 - val_accuracy: 0.9583 - val_loss: 0.1377\n", "Epoch 6/20\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 765us/step - accuracy: 0.9669 - loss: 0.1110 - val_accuracy: 0.9658 - val_loss: 0.1079\n", "Epoch 7/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 759us/step - accuracy: 0.9722 - loss: 0.0937 - val_accuracy: 0.9691 - val_loss: 0.0995\n", "Epoch 8/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 879us/step - accuracy: 0.9745 - loss: 0.0850 - val_accuracy: 0.9654 - val_loss: 0.1110\n", "Epoch 9/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 840us/step - accuracy: 0.9774 - loss: 0.0783 - val_accuracy: 0.9661 - val_loss: 0.1048\n", "Epoch 10/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 668us/step - accuracy: 0.9790 - loss: 0.0701 - val_accuracy: 0.9700 - val_loss: 0.0961\n", "Epoch 11/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 832us/step - accuracy: 0.9817 - loss: 0.0636 - val_accuracy: 0.9716 - val_loss: 0.0902\n", "Epoch 12/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 774us/step - accuracy: 0.9839 - loss: 0.0562 - val_accuracy: 0.9738 - val_loss: 0.0833\n", "Epoch 13/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 759us/step - accuracy: 0.9846 - loss: 0.0544 - val_accuracy: 0.9748 - val_loss: 0.0822\n", "Epoch 14/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 662us/step - accuracy: 0.9856 - loss: 0.0487 - val_accuracy: 0.9733 - val_loss: 0.0842\n", "Epoch 15/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m 
\u001b[1m0s\u001b[0m 701us/step - accuracy: 0.9879 - loss: 0.0454 - val_accuracy: 0.9741 - val_loss: 0.0840\n", "Epoch 16/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 678us/step - accuracy: 0.9873 - loss: 0.0414 - val_accuracy: 0.9738 - val_loss: 0.0861\n", "Epoch 17/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 700us/step - accuracy: 0.9886 - loss: 0.0384 - val_accuracy: 0.9742 - val_loss: 0.0827\n", "Epoch 18/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 668us/step - accuracy: 0.9895 - loss: 0.0363 - val_accuracy: 0.9745 - val_loss: 0.0813\n", "Epoch 19/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 651us/step - accuracy: 0.9901 - loss: 0.0335 - val_accuracy: 0.9761 - val_loss: 0.0777\n", "Epoch 20/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 649us/step - accuracy: 0.9922 - loss: 0.0304 - val_accuracy: 0.9737 - val_loss: 0.0894\n" ] }, { "data": { "text/plain": [ "" ] }, "execution_count": 9, "metadata": {}, "output_type": "execute_result" } ], "source": [ "difficult_model.compile(\n", " loss=\"categorical_crossentropy\",\n", " optimizer=SGD(learning_rate=0.1),\n", " metrics=[\"accuracy\"],\n", ")\n", "\n", "difficult_model.fit(\n", " X_train,\n", " y_train,\n", " batch_size=128,\n", " epochs=20,\n", " validation_data=(X_valid, y_valid),\n", ")" ] }, { "cell_type": "markdown", "id": "cef2db19", "metadata": {}, "source": [ "#### Оценка качества более сложной модели\n", "\n", "Лучшее качество модели: 97.4 %\n", "\n", "При этом количество эпох обучения значительно сократилось (с 200 до 20)." 
] }, { "cell_type": "code", "execution_count": 10, "id": "df92dc50", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\u001b[1m313/313\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9689 - loss: 0.1073\n" ] }, { "data": { "text/plain": [ "[0.08943505585193634, 0.9736999869346619]" ] }, "execution_count": 10, "metadata": {}, "output_type": "execute_result" } ], "source": [ "difficult_model.evaluate(X_valid, y_valid)" ] }, { "cell_type": "markdown", "id": "768404ed", "metadata": {}, "source": [ "#### Проектирование архитектуры глубокой ИНС\n", "\n", "В ИНС теперь три скрытых полносвязных слоя с ReLU-нейронами\n", "\n", "Для выходов каждого скрытого слоя используется пакетная нормализация\n", "\n", "Для последнего скрытого слоя применяется прореживание, при котором отключается 20 % случайных нейронов\n", "\n", "Keras автоматически корректирует значения (умножает входы на 0.8)" ] }, { "cell_type": "code", "execution_count": 11, "id": "ddbbb225", "metadata": {}, "outputs": [ { "data": { "text/html": [ "
Model: \"sequential_2\"\n",
       "
\n" ], "text/plain": [ "\u001b[1mModel: \"sequential_2\"\u001b[0m\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n",
       "┃ Layer (type)                     Output Shape                  Param # ┃\n",
       "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n",
       "│ dense_5 (Dense)                 │ (None, 64)             │        50,240 │\n",
       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
       "│ batch_normalization             │ (None, 64)             │           256 │\n",
       "│ (BatchNormalization)            │                        │               │\n",
       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
       "│ dense_6 (Dense)                 │ (None, 64)             │         4,160 │\n",
       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
       "│ batch_normalization_1           │ (None, 64)             │           256 │\n",
       "│ (BatchNormalization)            │                        │               │\n",
       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
       "│ dense_7 (Dense)                 │ (None, 64)             │         4,160 │\n",
       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
       "│ batch_normalization_2           │ (None, 64)             │           256 │\n",
       "│ (BatchNormalization)            │                        │               │\n",
       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
       "│ dropout (Dropout)               │ (None, 64)             │             0 │\n",
       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
       "│ dense_8 (Dense)                 │ (None, 10)             │           650 │\n",
       "└─────────────────────────────────┴────────────────────────┴───────────────┘\n",
       "
\n" ], "text/plain": [ "┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n", "┃\u001b[1m \u001b[0m\u001b[1mLayer (type) \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1mOutput Shape \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1m Param #\u001b[0m\u001b[1m \u001b[0m┃\n", "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n", "│ dense_5 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m64\u001b[0m) │ \u001b[38;5;34m50,240\u001b[0m │\n", "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", "│ batch_normalization │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m64\u001b[0m) │ \u001b[38;5;34m256\u001b[0m │\n", "│ (\u001b[38;5;33mBatchNormalization\u001b[0m) │ │ │\n", "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", "│ dense_6 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m64\u001b[0m) │ \u001b[38;5;34m4,160\u001b[0m │\n", "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", "│ batch_normalization_1 │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m64\u001b[0m) │ \u001b[38;5;34m256\u001b[0m │\n", "│ (\u001b[38;5;33mBatchNormalization\u001b[0m) │ │ │\n", "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", "│ dense_7 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m64\u001b[0m) │ \u001b[38;5;34m4,160\u001b[0m │\n", "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", "│ batch_normalization_2 │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m64\u001b[0m) │ \u001b[38;5;34m256\u001b[0m │\n", "│ (\u001b[38;5;33mBatchNormalization\u001b[0m) │ │ │\n", "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", "│ dropout (\u001b[38;5;33mDropout\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m64\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │\n", 
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n", "│ dense_8 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m10\u001b[0m) │ \u001b[38;5;34m650\u001b[0m │\n", "└─────────────────────────────────┴────────────────────────┴───────────────┘\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
 Total params: 59,978 (234.29 KB)\n",
       "
\n" ], "text/plain": [ "\u001b[1m Total params: \u001b[0m\u001b[38;5;34m59,978\u001b[0m (234.29 KB)\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
 Trainable params: 59,594 (232.79 KB)\n",
       "
\n" ], "text/plain": [ "\u001b[1m Trainable params: \u001b[0m\u001b[38;5;34m59,594\u001b[0m (232.79 KB)\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
 Non-trainable params: 384 (1.50 KB)\n",
       "
\n" ], "text/plain": [ "\u001b[1m Non-trainable params: \u001b[0m\u001b[38;5;34m384\u001b[0m (1.50 KB)\n" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "from keras.api.layers import Dropout\n", "from keras.api.layers import BatchNormalization\n", "\n", "deep_model = Sequential()\n", "deep_model.add(InputLayer(shape=(28 * 28,)))\n", "deep_model.add(Dense(64, activation=\"relu\"))\n", "deep_model.add(BatchNormalization())\n", "deep_model.add(Dense(64, activation=\"relu\"))\n", "deep_model.add(BatchNormalization())\n", "deep_model.add(Dense(64, activation=\"relu\"))\n", "deep_model.add(BatchNormalization())\n", "deep_model.add(Dropout(0.2))\n", "deep_model.add(Dense(10, activation=\"softmax\"))\n", "deep_model.summary()" ] }, { "cell_type": "markdown", "id": "985823c3", "metadata": {}, "source": [ "#### Обучение глубокой модели\n", "\n", "Вместо SGD используется оптимизатор Adam" ] }, { "cell_type": "code", "execution_count": 12, "id": "02f0f967", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch 1/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2s\u001b[0m 4ms/step - accuracy: 0.7737 - loss: 0.7278 - val_accuracy: 0.9490 - val_loss: 0.1610\n", "Epoch 2/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9508 - loss: 0.1621 - val_accuracy: 0.9622 - val_loss: 0.1260\n", "Epoch 3/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9643 - loss: 0.1164 - val_accuracy: 0.9659 - val_loss: 0.1077\n", "Epoch 4/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9716 - loss: 0.0913 - val_accuracy: 0.9672 - val_loss: 0.1038\n", "Epoch 5/20\n", "\u001b[1m469/469\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9770 - loss: 0.0773 - val_accuracy: 0.9717 - val_loss: 0.0893\n", "Epoch 6/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9801 - loss: 0.0647 - val_accuracy: 0.9714 - val_loss: 0.0937\n", "Epoch 7/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 2ms/step - accuracy: 0.9806 - loss: 0.0597 - val_accuracy: 0.9731 - val_loss: 0.0951\n", "Epoch 8/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9836 - loss: 0.0527 - val_accuracy: 0.9709 - val_loss: 0.0981\n", "Epoch 9/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9841 - loss: 0.0472 - val_accuracy: 0.9706 - val_loss: 0.0941\n", "Epoch 10/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9869 - loss: 0.0405 - val_accuracy: 0.9733 - val_loss: 0.0916\n", "Epoch 11/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9874 - loss: 0.0385 - val_accuracy: 0.9733 - val_loss: 0.0943\n", "Epoch 12/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9879 - loss: 0.0363 - val_accuracy: 0.9738 - val_loss: 0.0953\n", "Epoch 13/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9885 - loss: 0.0328 - val_accuracy: 0.9749 - val_loss: 0.0926\n", "Epoch 14/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 
1ms/step - accuracy: 0.9898 - loss: 0.0313 - val_accuracy: 0.9741 - val_loss: 0.0953\n", "Epoch 15/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9903 - loss: 0.0296 - val_accuracy: 0.9729 - val_loss: 0.0987\n", "Epoch 16/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9914 - loss: 0.0251 - val_accuracy: 0.9744 - val_loss: 0.0936\n", "Epoch 17/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9911 - loss: 0.0256 - val_accuracy: 0.9737 - val_loss: 0.1036\n", "Epoch 18/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9916 - loss: 0.0254 - val_accuracy: 0.9742 - val_loss: 0.0942\n", "Epoch 19/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9910 - loss: 0.0247 - val_accuracy: 0.9713 - val_loss: 0.1125\n", "Epoch 20/20\n", "\u001b[1m469/469\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9926 - loss: 0.0240 - val_accuracy: 0.9731 - val_loss: 0.1057\n" ] }, { "data": { "text/plain": [ "" ] }, "execution_count": 12, "metadata": {}, "output_type": "execute_result" } ], "source": [ "deep_model.compile(\n", " loss=\"categorical_crossentropy\",\n", " optimizer=\"adam\",\n", " metrics=[\"accuracy\"],\n", ")\n", "\n", "deep_model.fit(\n", " X_train,\n", " y_train,\n", " batch_size=128,\n", " epochs=20,\n", " validation_data=(X_valid, y_valid),\n", ")" ] }, { "cell_type": "markdown", "id": "fdc87962", "metadata": {}, "source": [ "#### Оценка качества глубокой модели\n", "\n", "Лучшее качество модели: 97.3 %\n", "\n", "Качество модели практически не изменилось даже после усложнения
архитектуры сети и смены оптимизатора" ] }, { "cell_type": "code", "execution_count": 13, "id": "70d626e2", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\u001b[1m313/313\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 1ms/step - accuracy: 0.9674 - loss: 0.1266\n" ] }, { "data": { "text/plain": [ "[0.10574997216463089, 0.9731000065803528]" ] }, "execution_count": 13, "metadata": {}, "output_type": "execute_result" } ], "source": [ "deep_model.evaluate(X_valid, y_valid)" ] }, { "cell_type": "markdown", "id": "fc09e059", "metadata": {}, "source": [ "### Регрессия" ] }, { "cell_type": "markdown", "id": "36cfb72f", "metadata": {}, "source": [ "#### Загрузка данных для задачи регрессии\n", "\n", "Набор данных о жилье в Бостоне собран Службой переписи населения США.\n", "\n", "Входные признаки:\n", "- CRIM — уровень преступности на душу населения по районам;\n", "- ZN — доля жилых земель, отведенных под участки площадью более 25 000 кв. 
футов;\n", "- INDUS — доля неторговых акров в городе;\n", "- CHAS — 1, если участок граничит с рекой; 0 в противном случае;\n", "- NOX — концентрация оксидов азота;\n", "- RM — среднее количество комнат в помещении;\n", "- AGE — доля домов, построенных до 1940 года;\n", "- DIS — взвешенные расстояния до пяти центров занятости Бостона;\n", "- RAD — индекс доступности радиальных автомагистралей;\n", "- TAX — ставка налога на имущество на полную стоимость;\n", "- PTRATIO — соотношение учеников и учителей по районам;\n", "- B — доля чернокожих по районам;\n", "- LSTAT — % населения с более низким статусом.\n", "\n", "Целевой признак:\n", "- MEDV — медианная стоимость домов в тысячах долларов США.\n", "\n", "Данные уже предобработаны" ] }, { "cell_type": "code", "execution_count": 14, "id": "23146b3b", "metadata": {}, "outputs": [ { "data": { "text/plain": [ "array([[1.23247e+00, 0.00000e+00, 8.14000e+00, ..., 2.10000e+01,\n", " 3.96900e+02, 1.87200e+01],\n", " [2.17700e-02, 8.25000e+01, 2.03000e+00, ..., 1.47000e+01,\n", " 3.95380e+02, 3.11000e+00],\n", " [4.89822e+00, 0.00000e+00, 1.81000e+01, ..., 2.02000e+01,\n", " 3.75520e+02, 3.26000e+00],\n", " ...,\n", " [3.46600e-02, 3.50000e+01, 6.06000e+00, ..., 1.69000e+01,\n", " 3.62250e+02, 7.83000e+00],\n", " [2.14918e+00, 0.00000e+00, 1.95800e+01, ..., 1.47000e+01,\n", " 2.61950e+02, 1.57900e+01],\n", " [1.43900e-02, 6.00000e+01, 2.93000e+00, ..., 1.56000e+01,\n", " 3.76700e+02, 4.38000e+00]])" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/plain": [ "array([15.2, 42.3, 50. , 21.1, 17.7, 18.5, 11.3, 15.6, 15.6, 14.4, 12.1,\n", " 17.9, 23.1, 19.9, 15.7, 8.8, 50. , 22.5, 24.1, 27.5, 10.9, 30.8,\n", " 32.9, 24. , 18.5, 13.3, 22.9, 34.7, 16.6, 17.5, 22.3, 16.1, 14.9,\n", " 23.1, 34.9, 25. , 13.9, 13.1, 20.4, 20. , 15.2, 24.7, 22.2, 16.7,\n", " 12.7, 15.6, 18.4, 21. , 30.1, 15.1, 18.7, 9.6, 31.5, 24.8, 19.1,\n", " 22. , 14.5, 11. , 32. 
, 29.4, 20.3, 24.4, 14.6, 19.5, 14.1, 14.3,\n", " 15.6, 10.5, 6.3, 19.3, 19.3, 13.4, 36.4, 17.8, 13.5, 16.5, 8.3,\n", " 14.3, 16. , 13.4, 28.6, 43.5, 20.2, 22. , 23. , 20.7, 12.5, 48.5,\n", " 14.6, 13.4, 23.7, 50. , 21.7, 39.8, 38.7, 22.2, 34.9, 22.5, 31.1,\n", " 28.7, 46. , 41.7, 21. , 26.6, 15. , 24.4, 13.3, 21.2, 11.7, 21.7,\n", " 19.4, 50. , 22.8, 19.7, 24.7, 36.2, 14.2, 18.9, 18.3, 20.6, 24.6,\n", " 18.2, 8.7, 44. , 10.4, 13.2, 21.2, 37. , 30.7, 22.9, 20. , 19.3,\n", " 31.7, 32. , 23.1, 18.8, 10.9, 50. , 19.6, 5. , 14.4, 19.8, 13.8,\n", " 19.6, 23.9, 24.5, 25. , 19.9, 17.2, 24.6, 13.5, 26.6, 21.4, 11.9,\n", " 22.6, 19.6, 8.5, 23.7, 23.1, 22.4, 20.5, 23.6, 18.4, 35.2, 23.1,\n", " 27.9, 20.6, 23.7, 28. , 13.6, 27.1, 23.6, 20.6, 18.2, 21.7, 17.1,\n", " 8.4, 25.3, 13.8, 22.2, 18.4, 20.7, 31.6, 30.5, 20.3, 8.8, 19.2,\n", " 19.4, 23.1, 23. , 14.8, 48.8, 22.6, 33.4, 21.1, 13.6, 32.2, 13.1,\n", " 23.4, 18.9, 23.9, 11.8, 23.3, 22.8, 19.6, 16.7, 13.4, 22.2, 20.4,\n", " 21.8, 26.4, 14.9, 24.1, 23.8, 12.3, 29.1, 21. , 19.5, 23.3, 23.8,\n", " 17.8, 11.5, 21.7, 19.9, 25. , 33.4, 28.5, 21.4, 24.3, 27.5, 33.1,\n", " 16.2, 23.3, 48.3, 22.9, 22.8, 13.1, 12.7, 22.6, 15. , 15.3, 10.5,\n", " 24. , 18.5, 21.7, 19.5, 33.2, 23.2, 5. , 19.1, 12.7, 22.3, 10.2,\n", " 13.9, 16.3, 17. , 20.1, 29.9, 17.2, 37.3, 45.4, 17.8, 23.2, 29. ,\n", " 22. , 18. , 17.4, 34.6, 20.1, 25. , 15.6, 24.8, 28.2, 21.2, 21.4,\n", " 23.8, 31. , 26.2, 17.4, 37.9, 17.5, 20. , 8.3, 23.9, 8.4, 13.8,\n", " 7.2, 11.7, 17.1, 21.6, 50. , 16.1, 20.4, 20.6, 21.4, 20.6, 36.5,\n", " 8.5, 24.8, 10.8, 21.9, 17.3, 18.9, 36.2, 14.9, 18.2, 33.3, 21.8,\n", " 19.7, 31.6, 24.8, 19.4, 22.8, 7.5, 44.8, 16.8, 18.7, 50. , 50. ,\n", " 19.5, 20.1, 50. , 17.2, 20.8, 19.3, 41.3, 20.4, 20.5, 13.8, 16.5,\n", " 23.9, 20.6, 31.5, 23.3, 16.8, 14. , 33.8, 36.1, 12.8, 18.3, 18.7,\n", " 19.1, 29. , 30.1, 50. , 50. , 22. , 11.9, 37.6, 50. , 22.7, 20.8,\n", " 23.5, 27.9, 50. 
, 19.3, 23.9, 22.6, 15.2, 21.7, 19.2, 43.8, 20.3,\n", " 33.2, 19.9, 22.5, 32.7, 22. , 17.1, 19. , 15. , 16.1, 25.1, 23.7,\n", " 28.7, 37.2, 22.6, 16.4, 25. , 29.8, 22.1, 17.4, 18.1, 30.3, 17.5,\n", " 24.7, 12.6, 26.5, 28.7, 13.3, 10.4, 24.4, 23. , 20. , 17.8, 7. ,\n", " 11.8, 24.4, 13.8, 19.4, 25.2, 19.4, 19.4, 29.1])" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "from keras.api.datasets import boston_housing\n", "\n", "(X_train, y_train), (X_valid, y_valid) = boston_housing.load_data()\n", "\n", "display(X_train)\n", "display(y_train)" ] }, { "cell_type": "markdown", "id": "cf59b0ae", "metadata": {}, "source": [ "#### Designing an ANN for the regression task\n", "\n", "To solve a regression task, the output layer uses neurons with a linear activation function, so the predictions are unbounded real values.\n", "\n", "A more complex architecture is not worthwhile here, since the dataset is small and has few features." ] }, { "cell_type": "code", "execution_count": 15, "id": "1ebaf129", "metadata": {}, "outputs": [ { "data": { "text/html": [ "
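A side note on scales: the raw Boston features differ by several orders of magnitude (TAX is in the hundreds, NOX is below 1). The BatchNormalization layers used below partly compensate for this, but if explicit input scaling is wanted, it can be sketched as follows. The `standardize` helper is illustrative, not part of the notebook; note that the statistics come from the training split only, so no information leaks from the validation set:

```python
import numpy as np

def standardize(X_train, X_valid):
    # Compute per-feature statistics on the training split only,
    # then apply the same affine transform to the validation split.
    mu = X_train.mean(axis=0)
    sigma = X_train.std(axis=0)
    sigma[sigma == 0] = 1.0  # guard against constant columns
    return (X_train - mu) / sigma, (X_valid - mu) / sigma
```

After this transform every training column has zero mean and unit variance, and the validation columns are scaled consistently with it.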
Model: \"sequential_3\"\n",
       "
\n" ], "text/plain": [ "\u001b[1mModel: \"sequential_3\"\u001b[0m\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n",
       "┃ Layer (type)                     Output Shape                  Param # ┃\n",
       "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n",
       "│ dense_9 (Dense)                 │ (None, 32)             │           448 │\n",
       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
       "│ batch_normalization_3           │ (None, 32)             │           128 │\n",
       "│ (BatchNormalization)            │                        │               │\n",
       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
       "│ dense_10 (Dense)                │ (None, 16)             │           528 │\n",
       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
       "│ batch_normalization_4           │ (None, 16)             │            64 │\n",
       "│ (BatchNormalization)            │                        │               │\n",
       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
       "│ dropout_1 (Dropout)             │ (None, 16)             │             0 │\n",
       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
       "│ dense_11 (Dense)                │ (None, 1)              │            17 │\n",
       "└─────────────────────────────────┴────────────────────────┴───────────────┘\n",
       "
\n" ], "text/plain": [ "┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n", "┃\u001b[1m \u001b[0m\u001b[1mLayer (type) \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1mOutput Shape \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1m Param #\u001b[0m\u001b[1m \u001b[0m┃\n", "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n", "│ dense_9 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m32\u001b[0m) │ \u001b[38;5;34m448\u001b[0m │\n", "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", "│ batch_normalization_3 │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m32\u001b[0m) │ \u001b[38;5;34m128\u001b[0m │\n", "│ (\u001b[38;5;33mBatchNormalization\u001b[0m) │ │ │\n", "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", "│ dense_10 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m16\u001b[0m) │ \u001b[38;5;34m528\u001b[0m │\n", "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", "│ batch_normalization_4 │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m16\u001b[0m) │ \u001b[38;5;34m64\u001b[0m │\n", "│ (\u001b[38;5;33mBatchNormalization\u001b[0m) │ │ │\n", "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", "│ dropout_1 (\u001b[38;5;33mDropout\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m16\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │\n", "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", "│ dense_11 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m1\u001b[0m) │ \u001b[38;5;34m17\u001b[0m │\n", "└─────────────────────────────────┴────────────────────────┴───────────────┘\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
 Total params: 1,185 (4.63 KB)\n",
       "
\n" ], "text/plain": [ "\u001b[1m Total params: \u001b[0m\u001b[38;5;34m1,185\u001b[0m (4.63 KB)\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
 Trainable params: 1,089 (4.25 KB)\n",
       "
\n" ], "text/plain": [ "\u001b[1m Trainable params: \u001b[0m\u001b[38;5;34m1,089\u001b[0m (4.25 KB)\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
 Non-trainable params: 96 (384.00 B)\n",
       "
\n" ], "text/plain": [ "\u001b[1m Non-trainable params: \u001b[0m\u001b[38;5;34m96\u001b[0m (384.00 B)\n" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "reg_model = Sequential()\n", "reg_model.add(InputLayer(shape=(13,)))\n", "reg_model.add(Dense(32, activation=\"relu\"))\n", "reg_model.add(BatchNormalization())\n", "reg_model.add(Dense(16, activation=\"relu\"))\n", "reg_model.add(BatchNormalization())\n", "reg_model.add(Dropout(0.2))\n", "reg_model.add(Dense(1, activation=\"linear\"))\n", "reg_model.summary()" ] }, { "cell_type": "markdown", "id": "474627ee", "metadata": {}, "source": [ "#### Обучение модели для регрессии\n", "\n", "Функция стоимости: MSE (лучше подходит для задачи регрессии)" ] }, { "cell_type": "code", "execution_count": 16, "id": "3b1a0038", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch 1/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m1s\u001b[0m 16ms/step - loss: 601.7983 - val_loss: 533.4313\n", "Epoch 2/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 406us/step - loss: 565.4746 - val_loss: 511.8965\n", "Epoch 3/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 387us/step - loss: 546.8860 - val_loss: 498.2231\n", "Epoch 4/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 589us/step - loss: 508.6799 - val_loss: 477.7539\n", "Epoch 5/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 499us/step - loss: 459.7863 - val_loss: 449.6629\n", "Epoch 6/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 620us/step - loss: 443.5108 - val_loss: 440.0035\n", "Epoch 7/32\n", "\u001b[1m51/51\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 533us/step - loss: 411.8610 - val_loss: 433.9403\n", "Epoch 8/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 483us/step - loss: 401.8837 - val_loss: 403.7916\n", "Epoch 9/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 442us/step - loss: 327.4671 - val_loss: 336.9683\n", "Epoch 10/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 456us/step - loss: 294.0855 - val_loss: 307.6475\n", "Epoch 11/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 366us/step - loss: 263.6659 - val_loss: 284.4071\n", "Epoch 12/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 347us/step - loss: 271.5737 - val_loss: 212.2185\n", "Epoch 13/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 819us/step - loss: 191.6893 - val_loss: 262.1397\n", "Epoch 14/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 351us/step - loss: 146.1058 - val_loss: 190.2726\n", "Epoch 15/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 351us/step - loss: 140.3470 - val_loss: 107.0253\n", "Epoch 16/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 344us/step - loss: 112.8817 - val_loss: 44.3060\n", "Epoch 17/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 318us/step - loss: 97.8928 - val_loss: 94.4347\n", "Epoch 18/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 
325us/step - loss: 69.6717 - val_loss: 47.5913\n", "Epoch 19/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 336us/step - loss: 77.6726 - val_loss: 31.8441\n", "Epoch 20/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 342us/step - loss: 59.1488 - val_loss: 63.5918\n", "Epoch 21/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 321us/step - loss: 58.7572 - val_loss: 48.5345\n", "Epoch 22/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 342us/step - loss: 52.5399 - val_loss: 38.4719\n", "Epoch 23/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 328us/step - loss: 50.7571 - val_loss: 34.1380\n", "Epoch 24/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 366us/step - loss: 48.0917 - val_loss: 48.2067\n", "Epoch 25/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 372us/step - loss: 40.9145 - val_loss: 26.2381\n", "Epoch 26/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 352us/step - loss: 57.9949 - val_loss: 23.5171\n", "Epoch 27/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 376us/step - loss: 45.4554 - val_loss: 39.4339\n", "Epoch 28/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 361us/step - loss: 43.5602 - val_loss: 61.3742\n", "Epoch 29/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 372us/step - loss: 46.8815 - val_loss: 42.7337\n", "Epoch 30/32\n", "\u001b[1m51/51\u001b[0m 
\u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 373us/step - loss: 44.2500 - val_loss: 33.6492\n", "Epoch 31/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 381us/step - loss: 36.4661 - val_loss: 29.0794\n", "Epoch 32/32\n", "\u001b[1m51/51\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 386us/step - loss: 44.0898 - val_loss: 19.0642\n" ] }, { "data": { "text/plain": [ "" ] }, "execution_count": 16, "metadata": {}, "output_type": "execute_result" } ], "source": [ "reg_model.compile(\n", " loss=\"mean_squared_error\",\n", " optimizer=\"adam\",\n", ")\n", "\n", "reg_model.fit(\n", " X_train,\n", " y_train,\n", " batch_size=8,\n", " epochs=32,\n", " validation_data=(X_valid, y_valid),\n", ")" ] }, { "cell_type": "markdown", "id": "bb962af9", "metadata": {}, "source": [ "#### Evaluating the regression model\n", "\n", "The final validation loss (MSE) is about 19.1 in squared thousands of dollars, which corresponds to a typical prediction error (RMSE) of roughly 4.4 thousand dollars" ] }, { "cell_type": "code", "execution_count": 17, "id": "727e9df3", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\u001b[1m1/1\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 84ms/step\n" ] }, { "data": { "text/plain": [ "17.616888" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/plain": [ "14.1" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "import numpy as np\n", "\n", "y_hat = reg_model.predict(np.reshape(X_valid[42], [1, 13]))\n", "display(y_hat[0][0])\n", "display(y_valid[42])" ] } ], "metadata": { "kernelspec": { "display_name": ".venv (3.13.3)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.12.10" } }, 
"nbformat": 4, "nbformat_minor": 5 }