Hidden Layers
Connecting extra perceptrons between the existing input and output parts is what makes deep learning "deep"; those added perceptrons form a hidden layer.
In this example, the input data feeds a hidden layer made up of five perceptrons, and the hidden layer in turn feeds a single perceptron that produces the output layer.
The code to build this hidden-layer structure looks like the following.
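A minimal sketch of that structure using the Keras functional API (the 13-feature input and the hidden-layer size of 5 here are illustrative; the Boston example below uses the same pattern with 10 hidden units):

```python
import tensorflow as tf

# 13 input features feed a hidden layer of 5 perceptrons,
# which a single output perceptron then reduces to one value.
X = tf.keras.layers.Input(shape=[13])
H = tf.keras.layers.Dense(5, activation='swish')(X)
Y = tf.keras.layers.Dense(1)(H)
model = tf.keras.models.Model(X, Y)
model.compile(loss='mse')
```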
If you want three hidden layers, you can write it like this.
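Stacking three hidden layers is just a matter of chaining Dense calls, reusing the variable H at each step (the unit counts are illustrative):

```python
import tensorflow as tf

# Each Dense call takes the previous layer's output as its input.
X = tf.keras.layers.Input(shape=[13])
H = tf.keras.layers.Dense(5, activation='swish')(X)
H = tf.keras.layers.Dense(5, activation='swish')(H)
H = tf.keras.layers.Dense(5, activation='swish')(H)
Y = tf.keras.layers.Dense(1)(H)
model = tf.keras.models.Model(X, Y)
model.compile(loss='mse')
```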
Done this way, the model can learn more complex patterns than the earlier single-layer model.
Boston Housing Price Prediction
##########################
# Load libraries
import tensorflow as tf
import pandas as pd
# 1. Prepare the historical data.
파일경로 = 'https://raw.githubusercontent.com/blackdew/tensorflow1/master/csv/boston.csv'
보스턴 = pd.read_csv(파일경로)
# Dependent and independent variables
독립 = 보스턴[['crim', 'zn', 'indus', 'chas', 'nox',
             'rm', 'age', 'dis', 'rad', 'tax',
             'ptratio', 'b', 'lstat']]
종속 = 보스턴[['medv']]
print(독립.shape, 종속.shape)
(506, 13) (506, 1)
# 2. Build the model structure
X = tf.keras.layers.Input(shape=[13])
H = tf.keras.layers.Dense(10, activation='swish')(X)
Y = tf.keras.layers.Dense(1)(H)
model = tf.keras.models.Model(X,Y)
model.compile(loss='mse')
After building the model, call summary() to confirm that it was constructed as intended.
# Inspect the multi-layer model
model.summary()
Model: "functional_1"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_2 (InputLayer) [(None, 13)] 0
_________________________________________________________________
dense_2 (Dense) (None, 10) 140
_________________________________________________________________
dense_3 (Dense) (None, 1) 11
=================================================================
Total params: 151
Trainable params: 151
Non-trainable params: 0
_________________________________________________________________
model.fit(독립, 종속, epochs = 10)
Epoch 1/10
16/16 [==============================] - 0s 875us/step - loss: 29.3493
Epoch 2/10
16/16 [==============================] - 0s 937us/step - loss: 28.2938
Epoch 3/10
16/16 [==============================] - 0s 1ms/step - loss: 26.5373
Epoch 4/10
16/16 [==============================] - 0s 1ms/step - loss: 27.3969
Epoch 5/10
16/16 [==============================] - 0s 937us/step - loss: 30.8988
Epoch 6/10
16/16 [==============================] - 0s 932us/step - loss: 26.2695
Epoch 7/10
16/16 [==============================] - 0s 812us/step - loss: 28.3679
Epoch 8/10
16/16 [==============================] - 0s 999us/step - loss: 27.8773
Epoch 9/10
16/16 [==============================] - 0s 937us/step - loss: 28.5545
Epoch 10/10
16/16 [==============================] - 0s 874us/step - loss: 26.7624
<tensorflow.python.keras.callbacks.History at 0x19a38b0aec8>
# 4. Use the model.
print(model.predict(독립[:5]))
print(종속[:5])
[[28.39877 ]
[23.232418]
[29.714046]
[28.539837]
[28.08813 ]]
medv
0 24.0
1 21.6
2 34.7
3 33.4
4 36.2
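As a quick sanity check (not in the original post), the five predictions above can be compared with the actual medv values via mean absolute error:

```python
import numpy as np

# Predictions and targets copied from the output above.
pred = np.array([28.39877, 23.232418, 29.714046, 28.539837, 28.08813])
actual = np.array([24.0, 21.6, 34.7, 33.4, 36.2])
mae = np.abs(pred - actual).mean()
print(round(float(mae), 2))  # → 4.8
```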
Iris Species Classification
###########################
# Load libraries
import tensorflow as tf
import pandas as pd
# 1. Prepare the historical data.
파일경로 = 'https://raw.githubusercontent.com/blackdew/tensorflow1/master/csv/iris.csv'
아이리스 = pd.read_csv(파일경로)
아이리스 = pd.get_dummies(아이리스)
독립 = 아이리스[['꽃잎길이', '꽃잎폭', '꽃받침길이', '꽃받침폭']]
종속 = 아이리스[['품종_setosa', '품종_versicolor', '품종_virginica']]
print(독립.shape, 종속.shape)
(150, 4) (150, 3)
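pd.get_dummies one-hot encodes the string 품종 column into the three 품종_* columns used as 종속 above; a toy illustration:

```python
import pandas as pd

# A small frame with one categorical column, as in the iris CSV.
df = pd.DataFrame({'품종': ['setosa', 'versicolor', 'virginica', 'setosa']})
encoded = pd.get_dummies(df)
print(list(encoded.columns))
# → ['품종_setosa', '품종_versicolor', '품종_virginica']
```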
# 2. Build the model structure
X = tf.keras.layers.Input(shape=[4])
H = tf.keras.layers.Dense(8, activation="swish")(X)
H = tf.keras.layers.Dense(8, activation="swish")(H)
H = tf.keras.layers.Dense(8, activation="swish")(H)
Y = tf.keras.layers.Dense(3, activation='softmax')(H)
model = tf.keras.models.Model(X, Y)
model.compile(loss='categorical_crossentropy',
              metrics='accuracy')
model.summary()
# 3. Train the model.
model.fit(독립, 종속, epochs=100)
Epoch 1/100
5/5 [==============================] - 0s 799us/step - loss: 1.3173 - accuracy: 0.3133
Epoch 2/100
5/5 [==============================] - 0s 1000us/step - loss: 1.2031 - accuracy: 0.2333
Epoch 3/100
5/5 [==============================] - 0s 999us/step - loss: 1.1438 - accuracy: 0.0800
...(epochs 4-97 omitted: loss falls steadily toward 0.11 while accuracy climbs to around 0.97)...
Epoch 98/100
5/5 [==============================] - 0s 999us/step - loss: 0.1104 - accuracy: 0.9733
Epoch 99/100
5/5 [==============================] - 0s 800us/step - loss: 0.1060 - accuracy: 0.9733
Epoch 100/100
5/5 [==============================] - 0s 999us/step - loss: 0.1110 - accuracy: 0.9733
<tensorflow.python.keras.callbacks.History at 0x19a38c97b48>
# 4. Use the model
print(model.predict(독립[:5]))
print(종속[:5])
[[9.9975282e-01 2.4686794e-04 3.8594033e-07]
[9.9908948e-01 9.0842636e-04 2.1982848e-06]
[9.9953449e-01 4.6451803e-04 1.1026074e-06]
[9.9888235e-01 1.1146414e-03 3.0053830e-06]
[9.9979466e-01 2.0501974e-04 3.1939598e-07]]
품종_setosa 품종_versicolor 품종_virginica
0 1 0 0
1 1 0 0
2 1 0 0
3 1 0 0
4 1 0 0
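Each softmax row above sums to approximately 1, and the predicted species is the column with the highest probability; np.argmax recovers the class index (here 0, i.e. 품종_setosa, for the rows shown):

```python
import numpy as np

# First two prediction rows copied from the output above.
probs = np.array([[9.9975282e-01, 2.4686794e-04, 3.8594033e-07],
                  [9.9908948e-01, 9.0842636e-04, 2.1982848e-06]])
print(probs.argmax(axis=1))  # → [0 0]
print(probs.sum(axis=1))     # each row sums to ~1
```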