Keras basics

We will look at the basics of Keras; this is just a first taste.
Rather than theory, the functions used are explained briefly.

ÃÊ, ÁßÇб³¶§ Á÷¼±ÀÇ °ø½ÄÀ» ¹è¿ü´Ù.

y = ax + b

The linear regression formula is expressed as follows.

H(x) = Wx + b

H : Hypothesis (the model's guess)
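As a quick illustration (the weight and bias values here are chosen purely for this example), the hypothesis is just a weighted input plus a bias; with W = 1 and b = 0 it reproduces the toy data used below exactly.

import numpy as np

W, b = 1.0, 0.0                  # illustrative weight and bias
x = np.array([1, 2, 3, 4, 5])
H = W * x + b                    # hypothesis H(x) = Wx + b
print(H)                         # [1. 2. 3. 4. 5.] -- identical to y for this data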


We proceed in the following order.

1. Data preprocessing
2. Model creation
3. Model compilation
4. Model training
5. Model evaluation
6. Prediction

Àüü Äڵ带 º¸ÀÚ.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
import numpy as np

# 1. Data setup
x = np.array([1, 2, 3, 4, 5])
y = np.array([1, 2, 3, 4, 5])

# 2. Build the model: one Dense layer with a single input and a single output
model = Sequential()
model.add(Dense(1, input_dim=1, activation='relu'))

# 3. Compile: SGD optimizer, mean squared error loss
model.compile('SGD', 'mse', metrics=['accuracy'])

# 4. Train for 100 epochs without printing progress
model.fit(x, y, epochs=100, verbose=0)

# 5. Evaluate (here, on the training data itself)
loss_and_metrics = model.evaluate(x, y)
print(loss_and_metrics)

# 6. Predict with the trained model
predict = model.predict(x).flatten()
print('y', y, ' predict: ', predict)

1. Data preprocessing

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
import numpy as np

# Data setup
x = np.array([1, 2, 3, 4, 5])
y = np.array([1, 2, 3, 4, 5])

Import the Sequential and Dense classes from the models and layers modules, and import numpy as well.

Set up the x and y data as numpy arrays.
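The 1-D arrays above are enough for this toy example, but Keras layers generally expect 2-D input of shape (samples, features). An explicit reshape, sketched below with the same data, makes that shape visible.

import numpy as np

x = np.array([1, 2, 3, 4, 5])
x_2d = x.reshape(-1, 1)    # shape (5, 1): 5 samples with 1 feature each
print(x_2d.shape)          # (5, 1)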

2. Model creation

model = Sequential()
model.add(Dense(1, input_dim=1, activation='relu'))

model = Sequential()

Create a Sequential object and assign it to the variable model.
In Keras the core data structure is the model, and a model is built from layers.
A layer is a collection of neurons.

model.add(Dense(1, input_dim=1, activation='relu'))

The flow through the network is: input neurons (nodes) ---> hidden neurons (nodes) ---> output neurons (nodes).
A Dense layer fully connects its inputs to its outputs and holds a weight for each input-output connection.

Dense ·¹À̾î´Â ÀÔ·Â, Ãâ·ÂÀ» ¸ðµÎ ¿¬°áÇØÁÖ¸ç ÀԷ°ú Ãâ·ÂÀ» °¢°¢ ¿¬°á ÇØÁÖ´Â °¡ÁßÄ¡¸¦ Æ÷ÇÔÇÑ´Ù.

First argument of Dense : the number of output nodes; when the layer feeds a hidden layer, this is the number of nodes in that hidden layer.
input_dim=1 : the layer takes a single input value.
activation='relu' : decides which activation function is applied to the values passed on to the next layer.
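To make the first argument concrete, here is a small sketch (not part of the original example) with one hidden layer: the first Dense creates 8 hidden nodes and the second Dense produces the single output. The node count 8 is an arbitrary choice for illustration.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model2 = Sequential()
model2.add(Dense(8, input_dim=1, activation='relu'))   # hidden layer with 8 nodes
model2.add(Dense(1))                                    # output layer with 1 node
model2.summary()                                        # prints layer shapes and parameter counts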

3. Model compilation

Before training, several settings are configured so the model is ready to learn.

model.compile('SGD', 'mse', metrics=['accuracy'])

First argument : optimizer='SGD'
Stochastic Gradient Descent
The optimizer determines how the model is updated from the loss value produced by the loss function.
The choice of optimizer affects how well the loss decreases.

Second argument : loss='mse'
Mean squared error
The loss is a function that compares the model's predictions with the actual values.
Training the model minimizes this error.
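For intuition, mean squared error can be computed by hand with numpy; the prediction values below are made up just to show the formula.

import numpy as np

y_true = np.array([1, 2, 3, 4, 5])
y_pred = np.array([0.9, 2.0, 3.1, 4.0, 5.2])   # made-up predictions
mse = np.mean((y_true - y_pred) ** 2)          # mean squared error
print(mse)                                     # 0.012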

If metrics is omitted, accuracy is not reported when the model is evaluated with evaluate().
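The same compile call can also be written with explicit keyword arguments, and an optimizer object can be passed instead of the string when you want to control the learning rate. A minimal sketch reusing the model defined above (the learning rate value is only illustrative):

from tensorflow.keras.optimizers import SGD

model.compile(optimizer=SGD(learning_rate=0.01),   # illustrative learning rate
              loss='mse',
              metrics=['accuracy'])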

4. Model training

model.fit(x, y, epochs=100, verbose=0)

epochs=100 : the number of times training is repeated over the data.
verbose=0 : decides whether training progress is displayed.
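fit() returns a History object, so the loss per epoch can still be inspected even with verbose=0. A small sketch reusing the model, x and y from above:

history = model.fit(x, y, epochs=100, verbose=0)
print(history.history['loss'][0])    # loss after the first epoch
print(history.history['loss'][-1])   # loss after the last epoch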

5. Model evaluation

loss_and_metrics = model.evaluate(x, y)
print(loss_and_metrics)

Result)
1/1 [==============================] - 0s 29ms/step - loss: 0.0020 - accuracy: 0.2000
[0.0019654773641377687, 0.20000000298023224]

Training data and evaluation data should normally be kept separate, but to keep things simple the training data is reused here (a minimal split sketch is shown after the notes below).
evaluate() checks how well the model learned from the data; the results are reported as loss and acc.

acc (accuracy) : the closer to 1 (the higher), the better.
loss : the difference between the actual and predicted values; the closer to 0, the better.
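As mentioned above, a real project would hold some data out for evaluation. One simple way to do it, sketched with made-up data and a plain numpy slice:

import numpy as np

data_x = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
data_y = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])

x_train, x_test = data_x[:8], data_x[8:]   # first 8 samples for training
y_train, y_test = data_y[:8], data_y[8:]   # last 2 samples held out for evaluation

# model.fit(x_train, y_train, ...) and then model.evaluate(x_test, y_test)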

6. Model prediction

predict = model.predict(x).flatten()
print('y', y, ' predict: ', predict)

1/1 [==============================] - 0s 48ms/step
y [1 2 3 4 5] predict: [0.924868 1.9536507 2.9824333 4.011216 5.039999 ]


predict() uses the trained model to make predictions on data.
flatten() : converts the multi-dimensional output into a 1-D array so it is easier to read.
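predict() also works on inputs the model has never seen. A short sketch reusing the trained model from above (6 and 7 are just illustrative values):

import numpy as np

new_x = np.array([6, 7])
print(model.predict(new_x).flatten())   # the trained model's estimates for the new inputs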

The accuracy is only 0.2, which looks odd, even though the predicted values are close to y. This is because 'accuracy' is a classification metric and is not very meaningful for a regression problem like this one.
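For a regression model, a metric such as mean absolute error is easier to interpret than accuracy. A sketch (not in the original code) that recompiles and retrains the same model with 'mae' as the metric:

model.compile('SGD', 'mse', metrics=['mae'])    # mean absolute error instead of accuracy
model.fit(x, y, epochs=100, verbose=0)
print(model.evaluate(x, y, verbose=0))          # [loss, mae] -- both should be close to 0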

References)
Understanding training and solving a simple equation
https://needjarvis.tistory.com/426

Search results for Keras model.add:

Tutorial 1 - Implementing a Sequential Model
https://ebbnflow.tistory.com/120

Dense ·¹À̾î ÀÚ¼¼ÇÏ°Ô ¼³¸í
https://ebbnflow.tistory.com/124

https://www.kaggle.com/code/prashant111/keras-basics-for-beginners