Table of Contents
Model Deployment
[Objectives]
Load an already-trained model from disk
Reformat images for a model that was trained on images of a different format
Run inference on new images the trained model has never seen and evaluate its performance
Background to read before implementing
- 📌Required📌 Data augmentation implementation: we are going to deploy this model, so make sure to read it!
Loading the Model
Load the saved model
- Where did we save the model, and which model was it?
keras.models.load_model
: loads a model in Keras
from tensorflow import keras
model = keras.models.load_model('asl_model')
If you don't remember it, check the model summary once more
model.summary()
👀 View the model summary
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 28, 28, 75) 750
_________________________________________________________________
batch_normalization (BatchNo (None, 28, 28, 75) 300
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 14, 14, 75) 0
_________________________________________________________________
conv2d_1 (Conv2D) (None, 14, 14, 50) 33800
_________________________________________________________________
dropout (Dropout) (None, 14, 14, 50) 0
_________________________________________________________________
batch_normalization_1 (Batch (None, 14, 14, 50) 200
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 7, 7, 50) 0
_________________________________________________________________
conv2d_2 (Conv2D) (None, 7, 7, 25) 11275
_________________________________________________________________
batch_normalization_2 (Batch (None, 7, 7, 25) 100
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 4, 4, 25) 0
_________________________________________________________________
flatten (Flatten) (None, 400) 0
_________________________________________________________________
dense (Dense) (None, 512) 205312
_________________________________________________________________
dropout_1 (Dropout) (None, 512) 0
_________________________________________________________________
dense_1 (Dense) (None, 24) 12312
=================================================================
Total params: 264,049
Trainable params: 263,749
Non-trainable params: 300
_________________________________________________________________
Preparing the Image for the Model
Dataset images: 28x28-pixel grayscale
Image to predict on: color
➡ The number of channels differs
➡ For prediction, the input (the image to predict on) must match the shape of the images the model was trained on (the dataset images)
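Before reshaping anything, it can help to confirm the input shape the model actually expects. A minimal sketch, assuming the model was loaded as shown above:
# Sketch: confirm the input shape the loaded model expects.
# For this model the result should be (None, 28, 28, 1),
# i.e. a batch of 28x28 single-channel (grayscale) images.
print(model.input_shape)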
Displaying the Image
First, visualize the input image to get a sense of its shape
- using the matplotlib library
import matplotlib.pyplot as plt
import matplotlib.image as mpimg
def show_image(image_path):
    image = mpimg.imread(image_path)
    plt.imshow(image)
show_image('data/asl_images/b.png')
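To see the mismatch in numbers, you can also print the raw image's shape. A quick sketch; the exact channel count depends on how the PNG was saved (3 for RGB, 4 if there is an alpha channel):
# Sketch: inspect the raw sample image before any conversion.
raw = mpimg.imread('data/asl_images/b.png')
print(raw.shape)   # roughly (180, 180, 3) or (180, 180, 4) for this color image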
Scaling the Image
[Problem 1]
The image above is a color image, so its number of channels differs from the grayscale training images.
Therefore, the input image must be converted to grayscale.
[Problem 2]
The image above is 180x180 pixels, but the dataset images are 28x28 pixels.
Therefore, the input image's size (number of pixels) must be adjusted.
from tensorflow.keras.preprocessing import image as image_utils
def load_and_scale_image(image_path):
    image = image_utils.load_img(image_path, color_mode="grayscale", target_size=(28,28))
    return image
image = load_and_scale_image('data/asl_images/b.png')
plt.imshow(image, cmap='gray')
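As a quick sanity check (load_img returns a PIL image, so .size gives its width and height):
# Sketch: verify the scaled image is now 28x28.
print(image.size)   # expected: (28, 28)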
Preparing the Image for Prediction
With both problems above solved, we can now feed the input image to the model and run a prediction.
Before that, reshape the image so it matches the shape of the dataset the model was trained on.
In other words, the image that was originally visualized like this
(grayscale image shown above)
is passed through this step
image = image_utils.img_to_array(image)
so that it matches the shape of the dataset
👀 View the conversion result
array([[[158.],
[161.],
[162.],
[167.],
[169.],
[169.],
[171.],
[170.],
[170.],
[170.],
[171.],
[171.],
[171.],
[170.],
[187.],
[165.],
[168.],
[166.],
[166.],
[163.],
[162.],
[159.],
[157.],
[155.],
[154.],
[151.],
[149.],
[179.]],
[[161.],
[164.],
[166.],
[169.],
[170.],
[172.],
[172.],
[173.],
[173.],
[173.],
[174.],
[173.],
[173.],
[175.],
[220.],
[175.],
[148.],
[165.],
[167.],
[166.],
[164.],
[161.],
[160.],
[159.],
[156.],
[154.],
[151.],
[169.]],
[[162.],
[166.],
[169.],
[171.],
[173.],
[175.],
[175.],
[175.],
[176.],
[177.],
[177.],
[176.],
[218.],
[180.],
[180.],
[184.],
[ 97.],
[166.],
[123.],
[169.],
[169.],
[165.],
[164.],
[160.],
[159.],
[157.],
[156.],
[170.]],
[[167.],
[170.],
[172.],
[174.],
[175.],
[177.],
[178.],
[180.],
[180.],
[180.],
[179.],
[180.],
[239.],
[177.],
[150.],
[202.],
[109.],
[198.],
[122.],
[171.],
[169.],
[170.],
[168.],
[164.],
[160.],
[161.],
[157.],
[171.]],
[[169.],
[171.],
[174.],
[175.],
[177.],
[178.],
[179.],
[181.],
[181.],
[182.],
[182.],
[183.],
[244.],
[183.],
[143.],
[211.],
[127.],
[203.],
[133.],
[179.],
[173.],
[171.],
[170.],
[168.],
[164.],
[162.],
[159.],
[171.]],
[[172.],
[174.],
[177.],
[179.],
[180.],
[182.],
[183.],
[184.],
[184.],
[186.],
[200.],
[168.],
[244.],
[199.],
[117.],
[199.],
[129.],
[215.],
[162.],
[165.],
[175.],
[174.],
[172.],
[171.],
[168.],
[167.],
[163.],
[172.]],
[[173.],
[177.],
[180.],
[181.],
[183.],
[184.],
[185.],
[186.],
[186.],
[200.],
[202.],
[152.],
[236.],
[184.],
[109.],
[179.],
[126.],
[218.],
[160.],
[147.],
[180.],
[176.],
[175.],
[173.],
[171.],
[169.],
[166.],
[173.]],
[[176.],
[181.],
[183.],
[185.],
[186.],
[187.],
[188.],
[190.],
[191.],
[216.],
[206.],
[156.],
[239.],
[194.],
[111.],
[197.],
[127.],
[217.],
[159.],
[142.],
[184.],
[179.],
[179.],
[176.],
[173.],
[172.],
[170.],
[174.]],
[[177.],
[180.],
[184.],
[186.],
[188.],
[191.],
[192.],
[191.],
[193.],
[223.],
[217.],
[164.],
[240.],
[206.],
[108.],
[193.],
[124.],
[219.],
[168.],
[147.],
[185.],
[183.],
[182.],
[179.],
[176.],
[175.],
[173.],
[175.]],
[[181.],
[184.],
[187.],
[189.],
[191.],
[193.],
[194.],
[194.],
[193.],
[235.],
[206.],
[158.],
[238.],
[199.],
[ 84.],
[171.],
[103.],
[213.],
[166.],
[153.],
[187.],
[184.],
[183.],
[180.],
[179.],
[177.],
[174.],
[175.]],
[[183.],
[187.],
[189.],
[192.],
[192.],
[194.],
[196.],
[195.],
[196.],
[233.],
[214.],
[151.],
[230.],
[157.],
[119.],
[137.],
[ 81.],
[202.],
[153.],
[160.],
[190.],
[185.],
[186.],
[185.],
[181.],
[178.],
[175.],
[176.]],
[[184.],
[187.],
[192.],
[194.],
[195.],
[196.],
[198.],
[199.],
[197.],
[234.],
[219.],
[127.],
[214.],
[218.],
[168.],
[160.],
[202.],
[161.],
[124.],
[162.],
[192.],
[189.],
[189.],
[186.],
[184.],
[181.],
[179.],
[177.]],
[[186.],
[190.],
[193.],
[195.],
[196.],
[197.],
[201.],
[201.],
[202.],
[251.],
[212.],
[196.],
[183.],
[165.],
[188.],
[181.],
[193.],
[166.],
[114.],
[147.],
[194.],
[194.],
[191.],
[188.],
[185.],
[183.],
[182.],
[177.]],
[[187.],
[191.],
[195.],
[197.],
[199.],
[201.],
[203.],
[202.],
[204.],
[252.],
[225.],
[185.],
[154.],
[141.],
[165.],
[167.],
[172.],
[188.],
[144.],
[133.],
[194.],
[193.],
[191.],
[189.],
[187.],
[186.],
[185.],
[178.]],
[[193.],
[194.],
[196.],
[199.],
[201.],
[203.],
[204.],
[204.],
[194.],
[253.],
[225.],
[167.],
[138.],
[137.],
[146.],
[175.],
[227.],
[220.],
[181.],
[130.],
[197.],
[195.],
[193.],
[193.],
[192.],
[187.],
[184.],
[178.]],
[[192.],
[195.],
[198.],
[202.],
[202.],
[205.],
[206.],
[206.],
[198.],
[253.],
[218.],
[162.],
[131.],
[137.],
[153.],
[184.],
[224.],
[231.],
[199.],
[130.],
[198.],
[197.],
[195.],
[194.],
[192.],
[188.],
[155.],
[162.]],
[[192.],
[196.],
[199.],
[202.],
[203.],
[204.],
[206.],
[207.],
[201.],
[253.],
[226.],
[163.],
[138.],
[138.],
[172.],
[207.],
[236.],
[210.],
[193.],
[142.],
[192.],
[203.],
[137.],
[110.],
[103.],
[ 93.],
[ 88.],
[154.]],
[[192.],
[194.],
[198.],
[201.],
[203.],
[205.],
[205.],
[207.],
[197.],
[253.],
[231.],
[168.],
[140.],
[154.],
[185.],
[227.],
[235.],
[207.],
[184.],
[120.],
[194.],
[186.],
[106.],
[103.],
[ 97.],
[ 86.],
[ 87.],
[154.]],
[[193.],
[198.],
[200.],
[201.],
[204.],
[205.],
[208.],
[207.],
[199.],
[254.],
[234.],
[183.],
[145.],
[170.],
[190.],
[240.],
[235.],
[212.],
[171.],
[117.],
[197.],
[122.],
[ 91.],
[ 84.],
[ 82.],
[ 83.],
[ 84.],
[153.]],
[[194.],
[200.],
[203.],
[204.],
[206.],
[208.],
[209.],
[209.],
[207.],
[253.],
[235.],
[192.],
[153.],
[158.],
[180.],
[245.],
[227.],
[190.],
[161.],
[112.],
[111.],
[ 90.],
[ 76.],
[ 63.],
[ 54.],
[ 42.],
[ 56.],
[151.]],
[[195.],
[200.],
[204.],
[205.],
[207.],
[209.],
[210.],
[210.],
[212.],
[252.],
[235.],
[202.],
[169.],
[151.],
[173.],
[243.],
[218.],
[179.],
[153.],
[105.],
[ 90.],
[ 79.],
[ 66.],
[ 55.],
[ 54.],
[ 50.],
[ 62.],
[152.]],
[[196.],
[200.],
[204.],
[206.],
[207.],
[211.],
[212.],
[213.],
[213.],
[246.],
[241.],
[210.],
[185.],
[169.],
[177.],
[233.],
[201.],
[155.],
[137.],
[ 94.],
[ 88.],
[ 89.],
[ 73.],
[ 48.],
[ 48.],
[ 52.],
[ 54.],
[147.]],
[[195.],
[199.],
[204.],
[203.],
[204.],
[205.],
[209.],
[212.],
[210.],
[224.],
[243.],
[215.],
[190.],
[176.],
[164.],
[215.],
[182.],
[134.],
[119.],
[ 83.],
[ 77.],
[ 85.],
[ 89.],
[ 55.],
[ 43.],
[ 50.],
[ 52.],
[145.]],
[[105.],
[109.],
[110.],
[106.],
[105.],
[108.],
[111.],
[113.],
[130.],
[112.],
[247.],
[220.],
[190.],
[165.],
[143.],
[161.],
[145.],
[107.],
[ 98.],
[ 60.],
[ 68.],
[ 71.],
[ 95.],
[ 74.],
[ 48.],
[ 42.],
[ 51.],
[144.]],
[[107.],
[110.],
[108.],
[108.],
[108.],
[109.],
[106.],
[108.],
[126.],
[ 83.],
[236.],
[205.],
[167.],
[140.],
[123.],
[127.],
[119.],
[ 94.],
[ 66.],
[ 46.],
[ 56.],
[ 60.],
[ 84.],
[ 86.],
[ 60.],
[ 40.],
[ 46.],
[144.]],
[[112.],
[107.],
[112.],
[108.],
[110.],
[111.],
[111.],
[111.],
[120.],
[ 40.],
[166.],
[158.],
[133.],
[118.],
[118.],
[110.],
[107.],
[ 72.],
[ 21.],
[ 22.],
[ 49.],
[ 49.],
[ 71.],
[ 90.],
[ 74.],
[ 47.],
[ 37.],
[143.]],
[[113.],
[112.],
[113.],
[107.],
[113.],
[115.],
[115.],
[114.],
[127.],
[ 80.],
[ 82.],
[144.],
[126.],
[104.],
[107.],
[ 91.],
[ 76.],
[ 30.],
[ 16.],
[ 31.],
[ 67.],
[ 55.],
[ 60.],
[ 82.],
[ 84.],
[ 58.],
[ 39.],
[141.]],
[[223.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[185.],
[204.]]], dtype=float32)
Now the image can be reshaped so it is ready for prediction
# This reshape corresponds to 1 image of 28x28 pixels with one color channel
image = image.reshape(1,28,28,1)
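An equivalent way to add the batch dimension, if you prefer NumPy, is np.expand_dims (use it instead of the reshape above, not in addition to it):
import numpy as np
# Sketch: add a leading batch axis to the (28, 28, 1) array,
# giving the same (1, 28, 28, 1) shape as the reshape above.
image = np.expand_dims(image, axis=0)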
Normalization
Finally, normalize the data (scale all values to lie between 0 and 1), just as we did for the training dataset
image = image / 255
👀 View the normalization result
array([[[[0.61960787],
[0.6313726 ],
[0.63529414],
[0.654902 ],
[0.6627451 ],
[0.6627451 ],
[0.67058825],
[0.6666667 ],
[0.6666667 ],
[0.6666667 ],
[0.67058825],
[0.67058825],
[0.67058825],
[0.6666667 ],
[0.73333335],
[0.64705884],
[0.65882355],
[0.6509804 ],
[0.6509804 ],
[0.6392157 ],
[0.63529414],
[0.62352943],
[0.6156863 ],
[0.60784316],
[0.6039216 ],
[0.5921569 ],
[0.58431375],
[0.7019608 ]],
[[0.6313726 ],
[0.6431373 ],
[0.6509804 ],
[0.6627451 ],
[0.6666667 ],
[0.6745098 ],
[0.6745098 ],
[0.6784314 ],
[0.6784314 ],
[0.6784314 ],
[0.68235296],
[0.6784314 ],
[0.6784314 ],
[0.6862745 ],
[0.8627451 ],
[0.6862745 ],
[0.5803922 ],
[0.64705884],
[0.654902 ],
[0.6509804 ],
[0.6431373 ],
[0.6313726 ],
[0.627451 ],
[0.62352943],
[0.6117647 ],
[0.6039216 ],
[0.5921569 ],
[0.6627451 ]],
[[0.63529414],
[0.6509804 ],
[0.6627451 ],
[0.67058825],
[0.6784314 ],
[0.6862745 ],
[0.6862745 ],
[0.6862745 ],
[0.6901961 ],
[0.69411767],
[0.69411767],
[0.6901961 ],
[0.85490197],
[0.7058824 ],
[0.7058824 ],
[0.72156864],
[0.38039216],
[0.6509804 ],
[0.48235294],
[0.6627451 ],
[0.6627451 ],
[0.64705884],
[0.6431373 ],
[0.627451 ],
[0.62352943],
[0.6156863 ],
[0.6117647 ],
[0.6666667 ]],
[[0.654902 ],
[0.6666667 ],
[0.6745098 ],
[0.68235296],
[0.6862745 ],
[0.69411767],
[0.69803923],
[0.7058824 ],
[0.7058824 ],
[0.7058824 ],
[0.7019608 ],
[0.7058824 ],
[0.9372549 ],
[0.69411767],
[0.5882353 ],
[0.7921569 ],
[0.42745098],
[0.7764706 ],
[0.47843137],
[0.67058825],
[0.6627451 ],
[0.6666667 ],
[0.65882355],
[0.6431373 ],
[0.627451 ],
[0.6313726 ],
[0.6156863 ],
[0.67058825]],
[[0.6627451 ],
[0.67058825],
[0.68235296],
[0.6862745 ],
[0.69411767],
[0.69803923],
[0.7019608 ],
[0.70980394],
[0.70980394],
[0.7137255 ],
[0.7137255 ],
[0.7176471 ],
[0.95686275],
[0.7176471 ],
[0.56078434],
[0.827451 ],
[0.49803922],
[0.79607844],
[0.52156866],
[0.7019608 ],
[0.6784314 ],
[0.67058825],
[0.6666667 ],
[0.65882355],
[0.6431373 ],
[0.63529414],
[0.62352943],
[0.67058825]],
[[0.6745098 ],
[0.68235296],
[0.69411767],
[0.7019608 ],
[0.7058824 ],
[0.7137255 ],
[0.7176471 ],
[0.72156864],
[0.72156864],
[0.7294118 ],
[0.78431374],
[0.65882355],
[0.95686275],
[0.78039217],
[0.45882353],
[0.78039217],
[0.5058824 ],
[0.84313726],
[0.63529414],
[0.64705884],
[0.6862745 ],
[0.68235296],
[0.6745098 ],
[0.67058825],
[0.65882355],
[0.654902 ],
[0.6392157 ],
[0.6745098 ]],
[[0.6784314 ],
[0.69411767],
[0.7058824 ],
[0.70980394],
[0.7176471 ],
[0.72156864],
[0.7254902 ],
[0.7294118 ],
[0.7294118 ],
[0.78431374],
[0.7921569 ],
[0.59607846],
[0.9254902 ],
[0.72156864],
[0.42745098],
[0.7019608 ],
[0.49411765],
[0.85490197],
[0.627451 ],
[0.5764706 ],
[0.7058824 ],
[0.6901961 ],
[0.6862745 ],
[0.6784314 ],
[0.67058825],
[0.6627451 ],
[0.6509804 ],
[0.6784314 ]],
[[0.6901961 ],
[0.70980394],
[0.7176471 ],
[0.7254902 ],
[0.7294118 ],
[0.73333335],
[0.7372549 ],
[0.74509805],
[0.7490196 ],
[0.84705883],
[0.80784315],
[0.6117647 ],
[0.9372549 ],
[0.7607843 ],
[0.43529412],
[0.77254903],
[0.49803922],
[0.8509804 ],
[0.62352943],
[0.5568628 ],
[0.72156864],
[0.7019608 ],
[0.7019608 ],
[0.6901961 ],
[0.6784314 ],
[0.6745098 ],
[0.6666667 ],
[0.68235296]],
[[0.69411767],
[0.7058824 ],
[0.72156864],
[0.7294118 ],
[0.7372549 ],
[0.7490196 ],
[0.7529412 ],
[0.7490196 ],
[0.75686276],
[0.8745098 ],
[0.8509804 ],
[0.6431373 ],
[0.9411765 ],
[0.80784315],
[0.42352942],
[0.75686276],
[0.4862745 ],
[0.85882354],
[0.65882355],
[0.5764706 ],
[0.7254902 ],
[0.7176471 ],
[0.7137255 ],
[0.7019608 ],
[0.6901961 ],
[0.6862745 ],
[0.6784314 ],
[0.6862745 ]],
[[0.70980394],
[0.72156864],
[0.73333335],
[0.7411765 ],
[0.7490196 ],
[0.75686276],
[0.7607843 ],
[0.7607843 ],
[0.75686276],
[0.92156863],
[0.80784315],
[0.61960787],
[0.93333334],
[0.78039217],
[0.32941177],
[0.67058825],
[0.40392157],
[0.8352941 ],
[0.6509804 ],
[0.6 ],
[0.73333335],
[0.72156864],
[0.7176471 ],
[0.7058824 ],
[0.7019608 ],
[0.69411767],
[0.68235296],
[0.6862745 ]],
[[0.7176471 ],
[0.73333335],
[0.7411765 ],
[0.7529412 ],
[0.7529412 ],
[0.7607843 ],
[0.76862746],
[0.7647059 ],
[0.76862746],
[0.9137255 ],
[0.8392157 ],
[0.5921569 ],
[0.9019608 ],
[0.6156863 ],
[0.46666667],
[0.5372549 ],
[0.31764707],
[0.7921569 ],
[0.6 ],
[0.627451 ],
[0.74509805],
[0.7254902 ],
[0.7294118 ],
[0.7254902 ],
[0.70980394],
[0.69803923],
[0.6862745 ],
[0.6901961 ]],
[[0.72156864],
[0.73333335],
[0.7529412 ],
[0.7607843 ],
[0.7647059 ],
[0.76862746],
[0.7764706 ],
[0.78039217],
[0.77254903],
[0.91764706],
[0.85882354],
[0.49803922],
[0.8392157 ],
[0.85490197],
[0.65882355],
[0.627451 ],
[0.7921569 ],
[0.6313726 ],
[0.4862745 ],
[0.63529414],
[0.7529412 ],
[0.7411765 ],
[0.7411765 ],
[0.7294118 ],
[0.72156864],
[0.70980394],
[0.7019608 ],
[0.69411767]],
[[0.7294118 ],
[0.74509805],
[0.75686276],
[0.7647059 ],
[0.76862746],
[0.77254903],
[0.7882353 ],
[0.7882353 ],
[0.7921569 ],
[0.9843137 ],
[0.83137256],
[0.76862746],
[0.7176471 ],
[0.64705884],
[0.7372549 ],
[0.70980394],
[0.75686276],
[0.6509804 ],
[0.44705883],
[0.5764706 ],
[0.7607843 ],
[0.7607843 ],
[0.7490196 ],
[0.7372549 ],
[0.7254902 ],
[0.7176471 ],
[0.7137255 ],
[0.69411767]],
[[0.73333335],
[0.7490196 ],
[0.7647059 ],
[0.77254903],
[0.78039217],
[0.7882353 ],
[0.79607844],
[0.7921569 ],
[0.8 ],
[0.9882353 ],
[0.88235295],
[0.7254902 ],
[0.6039216 ],
[0.5529412 ],
[0.64705884],
[0.654902 ],
[0.6745098 ],
[0.7372549 ],
[0.5647059 ],
[0.52156866],
[0.7607843 ],
[0.75686276],
[0.7490196 ],
[0.7411765 ],
[0.73333335],
[0.7294118 ],
[0.7254902 ],
[0.69803923]],
[[0.75686276],
[0.7607843 ],
[0.76862746],
[0.78039217],
[0.7882353 ],
[0.79607844],
[0.8 ],
[0.8 ],
[0.7607843 ],
[0.99215686],
[0.88235295],
[0.654902 ],
[0.5411765 ],
[0.5372549 ],
[0.57254905],
[0.6862745 ],
[0.8901961 ],
[0.8627451 ],
[0.70980394],
[0.50980395],
[0.77254903],
[0.7647059 ],
[0.75686276],
[0.75686276],
[0.7529412 ],
[0.73333335],
[0.72156864],
[0.69803923]],
[[0.7529412 ],
[0.7647059 ],
[0.7764706 ],
[0.7921569 ],
[0.7921569 ],
[0.8039216 ],
[0.80784315],
[0.80784315],
[0.7764706 ],
[0.99215686],
[0.85490197],
[0.63529414],
[0.5137255 ],
[0.5372549 ],
[0.6 ],
[0.72156864],
[0.8784314 ],
[0.90588236],
[0.78039217],
[0.50980395],
[0.7764706 ],
[0.77254903],
[0.7647059 ],
[0.7607843 ],
[0.7529412 ],
[0.7372549 ],
[0.60784316],
[0.63529414]],
[[0.7529412 ],
[0.76862746],
[0.78039217],
[0.7921569 ],
[0.79607844],
[0.8 ],
[0.80784315],
[0.8117647 ],
[0.7882353 ],
[0.99215686],
[0.8862745 ],
[0.6392157 ],
[0.5411765 ],
[0.5411765 ],
[0.6745098 ],
[0.8117647 ],
[0.9254902 ],
[0.8235294 ],
[0.75686276],
[0.5568628 ],
[0.7529412 ],
[0.79607844],
[0.5372549 ],
[0.43137255],
[0.40392157],
[0.3647059 ],
[0.34509805],
[0.6039216 ]],
[[0.7529412 ],
[0.7607843 ],
[0.7764706 ],
[0.7882353 ],
[0.79607844],
[0.8039216 ],
[0.8039216 ],
[0.8117647 ],
[0.77254903],
[0.99215686],
[0.90588236],
[0.65882355],
[0.54901963],
[0.6039216 ],
[0.7254902 ],
[0.8901961 ],
[0.92156863],
[0.8117647 ],
[0.72156864],
[0.47058824],
[0.7607843 ],
[0.7294118 ],
[0.41568628],
[0.40392157],
[0.38039216],
[0.3372549 ],
[0.34117648],
[0.6039216 ]],
[[0.75686276],
[0.7764706 ],
[0.78431374],
[0.7882353 ],
[0.8 ],
[0.8039216 ],
[0.8156863 ],
[0.8117647 ],
[0.78039217],
[0.99607843],
[0.91764706],
[0.7176471 ],
[0.5686275 ],
[0.6666667 ],
[0.74509805],
[0.9411765 ],
[0.92156863],
[0.83137256],
[0.67058825],
[0.45882353],
[0.77254903],
[0.47843137],
[0.35686275],
[0.32941177],
[0.32156864],
[0.3254902 ],
[0.32941177],
[0.6 ]],
[[0.7607843 ],
[0.78431374],
[0.79607844],
[0.8 ],
[0.80784315],
[0.8156863 ],
[0.81960785],
[0.81960785],
[0.8117647 ],
[0.99215686],
[0.92156863],
[0.7529412 ],
[0.6 ],
[0.61960787],
[0.7058824 ],
[0.9607843 ],
[0.8901961 ],
[0.74509805],
[0.6313726 ],
[0.4392157 ],
[0.43529412],
[0.3529412 ],
[0.29803923],
[0.24705882],
[0.21176471],
[0.16470589],
[0.21960784],
[0.5921569 ]],
[[0.7647059 ],
[0.78431374],
[0.8 ],
[0.8039216 ],
[0.8117647 ],
[0.81960785],
[0.8235294 ],
[0.8235294 ],
[0.83137256],
[0.9882353 ],
[0.92156863],
[0.7921569 ],
[0.6627451 ],
[0.5921569 ],
[0.6784314 ],
[0.9529412 ],
[0.85490197],
[0.7019608 ],
[0.6 ],
[0.4117647 ],
[0.3529412 ],
[0.30980393],
[0.25882354],
[0.21568628],
[0.21176471],
[0.19607843],
[0.24313726],
[0.59607846]],
[[0.76862746],
[0.78431374],
[0.8 ],
[0.80784315],
[0.8117647 ],
[0.827451 ],
[0.83137256],
[0.8352941 ],
[0.8352941 ],
[0.9647059 ],
[0.94509804],
[0.8235294 ],
[0.7254902 ],
[0.6627451 ],
[0.69411767],
[0.9137255 ],
[0.7882353 ],
[0.60784316],
[0.5372549 ],
[0.36862746],
[0.34509805],
[0.34901962],
[0.28627452],
[0.1882353 ],
[0.1882353 ],
[0.20392157],
[0.21176471],
[0.5764706 ]],
[[0.7647059 ],
[0.78039217],
[0.8 ],
[0.79607844],
[0.8 ],
[0.8039216 ],
[0.81960785],
[0.83137256],
[0.8235294 ],
[0.8784314 ],
[0.9529412 ],
[0.84313726],
[0.74509805],
[0.6901961 ],
[0.6431373 ],
[0.84313726],
[0.7137255 ],
[0.5254902 ],
[0.46666667],
[0.3254902 ],
[0.3019608 ],
[0.33333334],
[0.34901962],
[0.21568628],
[0.16862746],
[0.19607843],
[0.20392157],
[0.5686275 ]],
[[0.4117647 ],
[0.42745098],
[0.43137255],
[0.41568628],
[0.4117647 ],
[0.42352942],
[0.43529412],
[0.44313726],
[0.50980395],
[0.4392157 ],
[0.96862745],
[0.8627451 ],
[0.74509805],
[0.64705884],
[0.56078434],
[0.6313726 ],
[0.5686275 ],
[0.41960785],
[0.38431373],
[0.23529412],
[0.26666668],
[0.2784314 ],
[0.37254903],
[0.2901961 ],
[0.1882353 ],
[0.16470589],
[0.2 ],
[0.5647059 ]],
[[0.41960785],
[0.43137255],
[0.42352942],
[0.42352942],
[0.42352942],
[0.42745098],
[0.41568628],
[0.42352942],
[0.49411765],
[0.3254902 ],
[0.9254902 ],
[0.8039216 ],
[0.654902 ],
[0.54901963],
[0.48235294],
[0.49803922],
[0.46666667],
[0.36862746],
[0.25882354],
[0.18039216],
[0.21960784],
[0.23529412],
[0.32941177],
[0.3372549 ],
[0.23529412],
[0.15686275],
[0.18039216],
[0.5647059 ]],
[[0.4392157 ],
[0.41960785],
[0.4392157 ],
[0.42352942],
[0.43137255],
[0.43529412],
[0.43529412],
[0.43529412],
[0.47058824],
[0.15686275],
[0.6509804 ],
[0.61960787],
[0.52156866],
[0.4627451 ],
[0.4627451 ],
[0.43137255],
[0.41960785],
[0.28235295],
[0.08235294],
[0.08627451],
[0.19215687],
[0.19215687],
[0.2784314 ],
[0.3529412 ],
[0.2901961 ],
[0.18431373],
[0.14509805],
[0.56078434]],
[[0.44313726],
[0.4392157 ],
[0.44313726],
[0.41960785],
[0.44313726],
[0.4509804 ],
[0.4509804 ],
[0.44705883],
[0.49803922],
[0.3137255 ],
[0.32156864],
[0.5647059 ],
[0.49411765],
[0.40784314],
[0.41960785],
[0.35686275],
[0.29803923],
[0.11764706],
[0.0627451 ],
[0.12156863],
[0.2627451 ],
[0.21568628],
[0.23529412],
[0.32156864],
[0.32941177],
[0.22745098],
[0.15294118],
[0.5529412 ]],
[[0.8745098 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.7254902 ],
[0.8 ]]]], dtype=float32)
Making Predictions
Now that everything is ready, let's pass the image to the prediction method!
prediction = model.predict(image)
print(prediction)
## Prediction result
[[2.1393621e-26 1.0000000e+00 0.0000000e+00 0.0000000e+00 2.7215095e-22
1.1024388e-11 4.1655554e-37 0.0000000e+00 3.7572172e-12 0.0000000e+00
0.0000000e+00 2.3718250e-36 2.7730918e-29 0.0000000e+00 1.8111450e-31
0.0000000e+00 9.8561096e-34 2.2478763e-38 0.0000000e+00 1.1418552e-26
0.0000000e+00 8.0510884e-16 3.5616203e-27 3.2698013e-25]]
Understanding the Prediction
The prediction is an array of length 24
- This is the same format as the "binarized" categorical arrays of y_train and y_test
- Each element of the array is a probability between 0 and 1 expressing the confidence for that category
To make this easier to understand, let's first find which element of the array has the highest probability, using the NumPy library and its argmax function
import numpy as np
np.argmax(prediction)
# 1
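If you also want to see the runner-up candidates rather than only the single best index, a small sketch like this works (assuming prediction holds the model output from above):
# Sketch: indices of the three most confident classes, highest first.
top3 = np.argsort(prediction[0])[::-1][:3]
for idx in top3:
    print(idx, prediction[0][idx])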
Each element of the prediction array corresponds to a possible letter of the sign-language alphabet
So let's create a mapping between the indices of the prediction array and their corresponding letters
- excluding j and z, which involve motion
# Alphabet does not contain j or z because they require movement
alphabet = "abcdefghiklmnopqrstuvwxy"
dictionary = {}
for i in range(24):
    dictionary[i] = alphabet[i]
👀 View the result of the above
{0: 'a',
1: 'b',
2: 'c',
3: 'd',
4: 'e',
5: 'f',
6: 'g',
7: 'h',
8: 'i',
9: 'k',
10: 'l',
11: 'm',
12: 'n',
13: 'o',
14: 'p',
15: 'q',
16: 'r',
17: 's',
18: 't',
19: 'u',
20: 'v',
21: 'w',
22: 'x',
23: 'y'}
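The same index-to-letter mapping can also be built in a single line, if you prefer:
# Equivalent to the loop above.
dictionary = dict(enumerate(alphabet))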
Now we can look up the letter corresponding to the index 1 found above
dictionary[np.argmax(prediction)]
# 'b'
TEST
Let's test the whole process above!
Try writing your own answer first, then check the answer code.
def predict_letter(file_path):
    # Show image
    FIXME
    # Load and scale image
    image = FIXME
    # Convert to array
    image = FIXME
    # Reshape image
    image = FIXME
    # Normalize image
    image = FIXME
    # Make prediction
    prediction = FIXME
    # Convert prediction to letter
    predicted_letter = FIXME
    # Return prediction
    return predicted_letter
👀 View the answer code
def predict_letter(file_path):
    show_image(file_path)
    image = load_and_scale_image(file_path)
    image = image_utils.img_to_array(image)
    image = image.reshape(1,28,28,1)
    image = image/255
    prediction = model.predict(image)
    # convert prediction to letter
    predicted_letter = dictionary[np.argmax(prediction)]
    return predicted_letter
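For example, calling the finished function on the sample image used throughout this section should reproduce the earlier result:
predict_letter('data/asl_images/b.png')
# 'b'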
Memory
Before moving on, run the following cell to clear the GPU memory.
This is required before moving on to the next notebook.
import IPython
app = IPython.Application.instance()
app.kernel.do_shutdown(True)