Simple Data Preprocessing with Pandas
- Check variable (column) types: 데이터.dtypes
- Convert a variable to categorical:
    - 데이터['칼럼명'].astype('category')
- Convert a variable to numeric:
    - 데이터['칼럼명'].astype('int')
    - 데이터['칼럼명'].astype('float')
- Handling NA values:
    - Count NAs: 데이터.isna().sum()
    - Fill NAs: 데이터['칼럼명'].fillna(특정숫자)
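A minimal end-to-end sketch of these operations on a toy DataFrame (the DataFrame and column names here are made up for illustration):

# Toy DataFrame: an integer code column and a numeric column with one NA
import pandas as pd

df = pd.DataFrame({'등급': [1, 2, 1], '점수': [3.5, None, 4.0]})
print(df.dtypes)                                   # check each column's type

df['등급'] = df['등급'].astype('category')         # integer -> categorical
df['점수'] = df['점수'].astype('float')            # ensure a numeric type
print(df.isna().sum())                             # count NAs per column

df['점수'] = df['점수'].fillna(df['점수'].mean())  # fill the NA with the mean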
# Import libraries
import pandas as pd
# Read the file
파일경로 = 'https://raw.githubusercontent.com/blackdew/tensorflow1/master/csv/iris2.csv'
아이리스 = pd.read_csv(파일경로)
아이리스.head()
|   | 꽃잎길이 | 꽃잎폭 | 꽃받침길이 | 꽃받침폭 | 품종 |
|---|---|---|---|---|---|
| 0 | 5.1 | 3.5 | 1.4 | 0.2 | 0 |
| 1 | 4.9 | 3.0 | 1.4 | 0.2 | 0 |
| 2 | 4.7 | 3.2 | 1.3 | 0.2 | 0 |
| 3 | 4.6 | 3.1 | 1.5 | 0.2 | 0 |
| 4 | 5.0 | 3.6 | 1.4 | 0.2 | 0 |
Changing Variable Types
# One-hot encoding
인코딩 = pd.get_dummies(아이리스)
인코딩.head()
|   | 꽃잎길이 | 꽃잎폭 | 꽃받침길이 | 꽃받침폭 | 품종 |
|---|---|---|---|---|---|
| 0 | 5.1 | 3.5 | 1.4 | 0.2 | 0 |
| 1 | 4.9 | 3.0 | 1.4 | 0.2 | 0 |
| 2 | 4.7 | 3.2 | 1.3 | 0.2 | 0 |
| 3 | 4.6 | 3.1 | 1.5 | 0.2 | 0 |
| 4 | 5.0 | 3.6 | 1.4 | 0.2 | 0 |
Because the 품종 column is stored as int64, get_dummies does not recognize it as categorical, so no one-hot encoding is performed.
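If you prefer not to change the dtype, get_dummies also accepts a columns argument that forces encoding of the listed columns; a minimal sketch, as an alternative to the astype approach used below:

# Force one-hot encoding of the integer 품종 column via the columns argument
인코딩 = pd.get_dummies(아이리스, columns=['품종'])
print(인코딩.columns.tolist())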
# First, check each column's type
print(아이리스.dtypes)
꽃잎길이 float64
꽃잎폭 float64
꽃받침길이 float64
꽃받침폭 float64
품종 int64
dtype: object
# Convert the 품종 column to categorical.
아이리스['품종'] = 아이리스['품종'].astype('category')
print(아이리스.dtypes)
꽃잎길이 float64
꽃잎폭 float64
꽃받침길이 float64
꽃받침폭 float64
품종 category
dtype: object
We can see that the dtype of 품종 has changed to category.
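As a side note, once a column is categorical, pandas exposes its distinct categories through the .cat accessor; a quick check:

# Inspect the distinct categories of the converted column (0, 1, 2 here)
print(아이리스['품종'].cat.categories)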
# One-hot encoding
인코딩 = pd.get_dummies(아이리스)
인코딩.head()
|   | 꽃잎길이 | 꽃잎폭 | 꽃받침길이 | 꽃받침폭 | 품종_0 | 품종_1 | 품종_2 |
|---|---|---|---|---|---|---|---|
| 0 | 5.1 | 3.5 | 1.4 | 0.2 | 1 | 0 | 0 |
| 1 | 4.9 | 3.0 | 1.4 | 0.2 | 1 | 0 | 0 |
| 2 | 4.7 | 3.2 | 1.3 | 0.2 | 1 | 0 | 0 |
| 3 | 4.6 | 3.1 | 1.5 | 0.2 | 1 | 0 | 0 |
| 4 | 5.0 | 3.6 | 1.4 | 0.2 | 1 | 0 | 0 |
Handling Missing Values
# Let's check for NA values
아이리스.isnull().sum()
꽃잎길이 0
꽃잎폭 1
꽃받침길이 0
꽃받침폭 0
품종 0
dtype: int64
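To locate the row containing the missing value, boolean indexing on isna() works; a quick sketch:

# Show the row(s) where 꽃잎폭 is NA
print(아이리스[아이리스['꽃잎폭'].isna()])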
아이리스.tail()
|     | 꽃잎길이 | 꽃잎폭 | 꽃받침길이 | 꽃받침폭 | 품종 |
|---|---|---|---|---|---|
| 145 | 6.7 | 3.0 | 5.2 | 2.3 | 2 |
| 146 | 6.3 | 2.5 | 5.0 | 1.9 | 2 |
| 147 | 6.5 | 3.0 | 5.2 | 2.0 | 2 |
| 148 | 6.2 | 3.4 | 5.4 | 2.3 | 2 |
| 149 | 5.9 | NaN | 5.1 | 1.8 | 2 |
# Fill the NA with the mean of the 꽃잎폭 column
mean = 아이리스['꽃잎폭'].mean()
아이리스['꽃잎폭'] = 아이리스['꽃잎폭'].fillna(mean)
아이리스.tail()
|     | 꽃잎길이 | 꽃잎폭 | 꽃받침길이 | 꽃받침폭 | 품종 |
|---|---|---|---|---|---|
| 145 | 6.7 | 3.000000 | 5.2 | 2.3 | 2 |
| 146 | 6.3 | 2.500000 | 5.0 | 1.9 | 2 |
| 147 | 6.5 | 3.000000 | 5.2 | 2.0 | 2 |
| 148 | 6.2 | 3.400000 | 5.4 | 2.3 | 2 |
| 149 | 5.9 | 3.054362 | 5.1 | 1.8 | 2 |
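The mean is just one imputation choice. A couple of hedged alternatives, depending on the data:

# Alternative 1: the median is more robust to outliers than the mean
아이리스['꽃잎폭'] = 아이리스['꽃잎폭'].fillna(아이리스['꽃잎폭'].median())
# Alternative 2: drop any rows that still contain NA values
아이리스 = 아이리스.dropna()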
How to Design a Model That Trains Well
- Layers to use:
    - tf.keras.layers.BatchNormalization()
    - tf.keras.layers.Activation('swish')
- Data:
    - Boston house price prediction
    - Iris species classification
# Import libraries
import tensorflow as tf
import pandas as pd
###########################
# 1. Prepare the past data.
파일경로 = 'https://raw.githubusercontent.com/blackdew/tensorflow1/master/csv/boston.csv'
보스턴 = pd.read_csv(파일경로)
# Dependent and independent variables
독립 = 보스턴[['crim', 'zn', 'indus', 'chas', 'nox',
'rm', 'age', 'dis', 'rad', 'tax',
'ptratio', 'b', 'lstat']]
종속 = 보스턴[['medv']]
print(독립.shape, 종속.shape)
(506, 13) (506, 1)
# 2. Build the model structure.
X = tf.keras.layers.Input(shape=[13])
H = tf.keras.layers.Dense(8, activation='swish')(X)
H = tf.keras.layers.Dense(8, activation='swish')(H)
H = tf.keras.layers.Dense(8, activation='swish')(H)
Y = tf.keras.layers.Dense(1)(H)
model = tf.keras.models.Model(X, Y)
model.compile(loss='mse')
# 3. Fit the model to the data.
model.fit(독립, 종속, epochs=1000, verbose=0)
model.fit(독립, 종속, epochs=10)
Epoch 1/10
16/16 [==============================] - 0s 749us/step - loss: 16.4464
Epoch 2/10
16/16 [==============================] - 0s 750us/step - loss: 15.2519
Epoch 3/10
16/16 [==============================] - 0s 812us/step - loss: 15.1065
Epoch 4/10
16/16 [==============================] - 0s 750us/step - loss: 15.0645
Epoch 5/10
16/16 [==============================] - 0s 688us/step - loss: 15.2179
Epoch 6/10
16/16 [==============================] - 0s 812us/step - loss: 15.0632
Epoch 7/10
16/16 [==============================] - 0s 750us/step - loss: 15.4234
Epoch 8/10
16/16 [==============================] - 0s 812us/step - loss: 15.3210
Epoch 9/10
16/16 [==============================] - 0s 875us/step - loss: 16.2321
Epoch 10/10
16/16 [==============================] - 0s 875us/step - loss: 15.8727
<tensorflow.python.keras.callbacks.History at 0x19a3c360248>
The loss does not drop below a certain level. BatchNormalization normalizes each layer's inputs during training, which tends to stabilize and speed up learning, so let's try it.
# 2. Build the model structure.
X = tf.keras.layers.Input(shape=[13])
H = tf.keras.layers.Dense(8)(X)
H = tf.keras.layers.BatchNormalization()(H)
H = tf.keras.layers.Activation('swish')(H)
H = tf.keras.layers.Dense(8)(H)
H = tf.keras.layers.BatchNormalization()(H)
H = tf.keras.layers.Activation('swish')(H)
H = tf.keras.layers.Dense(8)(H)
H = tf.keras.layers.BatchNormalization()(H)
H = tf.keras.layers.Activation('swish')(H)
Y = tf.keras.layers.Dense(1)(H)
model = tf.keras.models.Model(X, Y)
model.compile(loss='mse')
# 3. Fit the model to the data.
model.fit(독립, 종속, epochs=1000, verbose=0)
model.fit(독립, 종속, epochs=10)
Epoch 1/10
16/16 [==============================] - 0s 1ms/step - loss: 11.3594
Epoch 2/10
16/16 [==============================] - 0s 1ms/step - loss: 11.6020
Epoch 3/10
16/16 [==============================] - 0s 1ms/step - loss: 12.6939
Epoch 4/10
16/16 [==============================] - 0s 1ms/step - loss: 12.4105
Epoch 5/10
16/16 [==============================] - 0s 1ms/step - loss: 11.5803
Epoch 6/10
16/16 [==============================] - 0s 1ms/step - loss: 15.0238
Epoch 7/10
16/16 [==============================] - 0s 1ms/step - loss: 13.9119
Epoch 8/10
16/16 [==============================] - 0s 1ms/step - loss: 10.9356
Epoch 9/10
16/16 [==============================] - 0s 1ms/step - loss: 12.5090
Epoch 10/10
16/16 [==============================] - 0s 1ms/step - loss: 12.6031
<tensorflow.python.keras.callbacks.History at 0x19a3c65f908>
We can see that the loss has now dropped to around 12.
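Since the Dense → BatchNormalization → Activation pattern repeats three times, a small helper function keeps the wiring consistent and avoids accidentally reconnecting a block to the input X. A sketch (the helper name is my own, not from the original lesson):

# Hypothetical helper: one Dense -> BatchNormalization -> swish block
def dense_bn_swish(units, tensor):
    tensor = tf.keras.layers.Dense(units)(tensor)
    tensor = tf.keras.layers.BatchNormalization()(tensor)
    return tf.keras.layers.Activation('swish')(tensor)

X = tf.keras.layers.Input(shape=[13])
H = dense_bn_swish(8, X)
H = dense_bn_swish(8, H)
H = dense_bn_swish(8, H)
Y = tf.keras.layers.Dense(1)(H)
model = tf.keras.models.Model(X, Y)
model.compile(loss='mse')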
Applying It to the Iris Data
###########################
# Import libraries
import tensorflow as tf
import pandas as pd
# 1. Prepare the past data.
파일경로 = 'https://raw.githubusercontent.com/blackdew/tensorflow1/master/csv/iris.csv'
아이리스 = pd.read_csv(파일경로)
# One-hot encoding
아이리스 = pd.get_dummies(아이리스)
# Dependent and independent variables
독립 = 아이리스[['꽃잎길이', '꽃잎폭', '꽃받침길이', '꽃받침폭']]
종속 = 아이리스[['품종_setosa', '품종_versicolor', '품종_virginica']]
print(독립.shape, 종속.shape)
(150, 4) (150, 3)
# 2. Build the model structure using BatchNormalization layers.
X = tf.keras.layers.Input(shape=[4])
H = tf.keras.layers.Dense(8)(X)
H = tf.keras.layers.BatchNormalization()(H)
H = tf.keras.layers.Activation('swish')(H)
H = tf.keras.layers.Dense(8)(H)
H = tf.keras.layers.BatchNormalization()(H)
H = tf.keras.layers.Activation('swish')(H)
H = tf.keras.layers.Dense(8)(H)
H = tf.keras.layers.BatchNormalization()(H)
H = tf.keras.layers.Activation('swish')(H)
Y = tf.keras.layers.Dense(3, activation='softmax')(H)
model = tf.keras.models.Model(X, Y)
model.compile(loss='categorical_crossentropy',
metrics='accuracy')
# 3. Fit the model to the data.
model.fit(독립, 종속, epochs=1000, batch_size=150)
Epoch 1/1000
1/1 [==============================] - 0s 999us/step - loss: 0.0215 - accuracy: 0.9933
Epoch 2/1000
1/1 [==============================] - 0s 2ms/step - loss: 0.0214 - accuracy: 0.9933
...
Epoch 57/1000
1/1 [==============================] - 0s 2ms/step - loss: 0.0126 - accuracy: 0.9933
Epoch 58/1000
1/1 [==============================] - 0s 2ms/step - loss: 0.0123 - accuracy: 1.0000
...
Epoch 999/1000
1/1 [==============================] - 0s 2ms/step - loss: 7.1684e-07 - accuracy: 1.0000
Epoch 1000/1000
1/1 [==============================] - 0s 2ms/step - loss: 7.1366e-07 - accuracy: 1.0000
<tensorflow.python.keras.callbacks.History at 0x19a3e9ec588>
Normally, an accuracy of 1 is almost never achieved, but looking at the results we can see the accuracy coming out at essentially 1.
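As a sanity check, we can compare the model's predictions against the actual labels (a minimal sketch; note this evaluates on the training data, so it says nothing about generalization):

# Compare predictions with the actual one-hot labels for the first five rows
print(model.predict(독립[:5]))
print(종속[:5])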