Error when running train.py in Colab
Hello. When I additionally run `!python train.py` in Google Colab, the error below occurs. The script will not run in Colab. Is there a way to resolve this?

2023-03-11 14:02:16.774499: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer.so.7'; dlerror: libnvinfer.so.7: cannot open shared object file: No such file or directory; LD_LIBRARY_PATH: /usr/local/lib/python3.9/dist-packages/cv2/../../lib64:/usr/lib64-nvidia
2023-03-11 14:02:16.774586: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer_plugin.so.7'; dlerror: libnvinfer_plugin.so.7: cannot open shared object file: No such file or directory; LD_LIBRARY_PATH: /usr/local/lib/python3.9/dist-packages/cv2/../../lib64:/usr/lib64-nvidia
2023-03-11 14:02:16.774603: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Cannot dlopen some TensorRT libraries. If you would like to use Nvidia GPU with TensorRT, please make sure the missing libraries mentioned above are installed properly.
WARNING:absl:`TensorInfo.dtype` is deprecated. Please change your code to use NumPy with the field `TensorInfo.np_dtype` or use TensorFlow with the field `TensorInfo.tf_dtype`. (this warning is repeated many times throughout the log)
WARNING:absl:You use TensorFlow DType in tfds.features This will soon be deprecated in favor of NumPy DTypes. In the meantime it was converted to float32. (also repeated for bool and int64)
2023-03-11 14:02:19.478223: W tensorflow/core/common_runtime/gpu/gpu_bfc_allocator.cc:42] Overriding orig_value setting because the TF_FORCE_GPU_ALLOW_GROWTH environment variable is set. Original config value was 0.
Model: "model"
__________________________________________________________________________________________________
[Full model.summary() layer listing omitted: an InceptionV3-style backbone (input_1 of shape 224x224x3, Conv2D/BatchNormalization/Activation stacks through the mixed0 ... mixed10 concatenation blocks), followed by global_average_pooling2d and a dense output layer with 539 units.]
==================================================================================================
Total params: 22,907,195
Trainable params: 22,872,763
Non-trainable params: 34,432
__________________________________________________________________________________________________
WARNING:tensorflow:5 out of the last 5 calls to triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has reduce_retracing=True option that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for more details.
(The same retracing warning appears again as "6 out of the last 6 calls ..." and is duplicated through the absl logger as W0311 14:02:53 lines.)
Epoch: 1, Iter: 1/59, Loss: 461035.750000
Traceback (most recent call last):
  File "/content/drive/MyDrive/Colab Notebooks/YOLO 구현으로 배우는 딥러닝 논문 구현 with TensorFlow 2.0/YOLOv1/train.py", line 326, in <module>
    app.run(main)
  File "/usr/local/lib/python3.9/dist-packages/absl/app.py", line 308, in run
    _run_main(main, args)
  File "/usr/local/lib/python3.9/dist-packages/absl/app.py", line 254, in _run_main
    sys.exit(main(argv))
  File "/content/drive/MyDrive/Colab Notebooks/YOLO 구현으로 배우는 딥러닝 논문 구현 with TensorFlow 2.0/YOLOv1/train.py", line 306, in main
    tf.summary.scalar('learning_rate ', optimizer.lr(ckpt.step).numpy(), step=int(ckpt.step))
TypeError: 'ResourceVariable' object is not callable
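The TypeError at the bottom is the actual failure: in this Colab environment, optimizer.lr resolves to a ResourceVariable rather than a callable learning-rate schedule, so optimizer.lr(ckpt.step) cannot be invoked like a function. Below is a minimal, hedged sketch of the failing pattern and one possible workaround; it assumes train.py builds its optimizer from an ExponentialDecay-style schedule, and the name lr_schedule, the schedule parameters, and the log directory are placeholders rather than the course code's actual values.

# Minimal sketch (not the course's train.py): log the learning rate without
# calling optimizer.lr, which is a ResourceVariable in this TF/Keras version.
import tensorflow as tf

# Placeholder schedule and optimizer; the real parameters live in train.py.
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-4, decay_steps=2000, decay_rate=0.96)
optimizer = tf.keras.optimizers.Adam(learning_rate=lr_schedule)

ckpt = tf.train.Checkpoint(step=tf.Variable(1, dtype=tf.int64), optimizer=optimizer)
writer = tf.summary.create_file_writer("/tmp/tensorboard")  # placeholder log dir

with writer.as_default():
    # Failing pattern from the traceback: a ResourceVariable is not callable.
    #   tf.summary.scalar('learning_rate', optimizer.lr(ckpt.step).numpy(), step=int(ckpt.step))
    # Workaround: call the schedule object itself with the global step.
    tf.summary.scalar('learning_rate', lr_schedule(ckpt.step), step=int(ckpt.step))

If only the optimizer object is available at the logging site, reading the variable's current value (for example with float(optimizer.lr)) instead of calling it should also avoid the TypeError, though the exact attribute depends on the TF/Keras version; switching to tf.keras.optimizers.legacy.Adam is another commonly used escape hatch on TF 2.11 and later.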