AttributeError: 'Sequential' object has no attribute '_ckpt_saved_epoch' (Keras CNN)


I am trying to run a relatively simple CNN with Keras (TensorFlow backend, version 1.14.0). Here is the code for context.

from keras.layers import Dense, Conv2D, MaxPooling2D, BatchNormalization, GlobalAveragePooling2D
from tensorflow.python.keras.callbacks import EarlyStopping, ModelCheckpoint
from keras import models
from keras.applications.vgg16 import preprocess_input
from keras.preprocessing.image import ImageDataGenerator
import matplotlib.pyplot as plt
from keras.models import load_model
import numpy as np


# starting point
my_model = models.Sequential()

# Add first convolutional block
my_model.add(Conv2D(16, (3, 3), activation='relu', padding='same',
                    input_shape=(224, 224, 3)))
my_model.add(MaxPooling2D((2, 2), padding='same'))

# second block
my_model.add(Conv2D(32, (3, 3), activation='relu', padding='same'))
my_model.add(MaxPooling2D((2, 2), padding='same'))
# third block
my_model.add(Conv2D(64, (3, 3), activation='relu', padding='same'))
my_model.add(MaxPooling2D((2, 2), padding='same'))
# fourth block
my_model.add(Conv2D(128, (3, 3), activation='relu', padding='same'))
my_model.add(MaxPooling2D((2, 2), padding='same'))

# global average pooling
my_model.add(GlobalAveragePooling2D())
# fully connected layer
my_model.add(Dense(64, activation='relu'))
my_model.add(BatchNormalization())
# make predictions
my_model.add(Dense(2, activation='sigmoid'))

# Show a summary of the model. Check the number of trainable parameters
my_model.summary()

# use early stopping to optimally terminate training through callbacks
es = EarlyStopping(monitor='val_loss', mode='min', verbose=1, patience=5)

# save best model automatically
mc = ModelCheckpoint("C:/Users/ab123/Desktop/vgg16_1.h5", monitor='val_acc', verbose=1, save_best_only=True,
                     save_weights_only=False, mode='auto', period=1)

cb_list = [mc, es]


# compile model
my_model.compile(optimizer='adam', loss='binary_crossentropy',
                 metrics=['accuracy'])

# set up data generator
data_generator = ImageDataGenerator(preprocessing_function=preprocess_input)

# get batches of training images from the directory
train_generator = data_generator.flow_from_directory(
        'D:/Project2020/Step2a/train',
        target_size=(224, 224),
        batch_size=10,
        class_mode='categorical')

# get batches of validation images from the directory
validation_generator = data_generator.flow_from_directory(
        'D:/Project2020/Step2a/val',
        target_size=(224, 224),
        batch_size=10,
        class_mode='categorical')


history = my_model.fit_generator(
        train_generator,
        epochs=1,
        steps_per_epoch=2000,
        validation_data=validation_generator,
        validation_steps=1000, callbacks=cb_list)



plt.plot(history.history['acc'])
plt.plot(history.history['val_acc'])
plt.ylim([.5,1.1])
plt.ylabel('Accuracy')
plt.xlabel('Epoch')
plt.legend(['Train', 'Validation'], loc='upper left')
plt.savefig("C:/Users/ab123/Desktop/11-16-19model.png", dpi=300)

# load a saved model
import os

saved_model = load_model('C:/Users/ab123/Desktop/11-16-19model.h5')

# generate data for test set of images
test_generator = data_generator.flow_from_directory(
        'C:/Users/aeshon/Downloads/birds',
        target_size=(224, 224),
        batch_size=1,
        class_mode='categorical',
        shuffle=False)

# obtain predicted activation values for the last dense layer
test_generator.reset()
pred = saved_model.predict_generator(test_generator, verbose=1, steps=100)
# determine the maximum activation value for each sample
predicted_class_indices=np.argmax(pred,axis=1)

# label each predicted value to correct gender
labels = (test_generator.class_indices)
labels = dict((v,k) for k,v in labels.items())
predictions = [labels[k] for k in predicted_class_indices]

# format file names to simply male or female
filenames=test_generator.filenames
filenz=[0]
for i in range(0,len(filenames)):
    filenz.append(filenames[i].split('\\')[0])
filenz=filenz[1:]

# determine the test set accuracy
match=[]
for i in range(0,len(filenames)):
    match.append(filenz[i]==predictions[i])
match.count(True)/100

The model works like a charm until early stopping is triggered or training finishes. After that, it throws this error (traceback included):

Traceback (most recent call last):
  File "D:/Invasive Species Detector/11-16-19 model.py", line 77, in <module>
    validation_steps=1000, callbacks=cb_list)
  File "C:\ProgramData\Anaconda3\envs\tensorflow_test\lib\site-packages\keras\legacy\interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)
  File "C:\ProgramData\Anaconda3\envs\tensorflow_test\lib\site-packages\keras\engine\training.py", line 1418, in fit_generator
    initial_epoch=initial_epoch)
  File "C:\ProgramData\Anaconda3\envs\tensorflow_test\lib\site-packages\keras\engine\training_generator.py", line 264, in fit_generator
    callbacks.on_train_end()
  File "C:\ProgramData\Anaconda3\envs\tensorflow_test\lib\site-packages\keras\callbacks.py", line 142, in on_train_end
    callback.on_train_end(logs)
  File "C:\ProgramData\Anaconda3\envs\tensorflow_test\lib\site-packages\tensorflow\python\keras\callbacks.py", line 940, in on_train_end
    if self.model._ckpt_saved_epoch is not None:
AttributeError: 'Sequential' object has no attribute '_ckpt_saved_epoch'

Of course, I first looked around to see whether anyone had run into the same error before. Apart from one person working with a Keras TPU model, whose question went unanswered, I found nothing, so I have no idea how to fix this. I hope someone can shed some light on the problem; please ask in the comments for any other information you need, such as versions, etc.


1 answer

In your imports you are mixing the keras and tf.keras packages, which is not supported and produces strange errors like the one you are getting. The fix is simple: pick one package and do all of the related imports from it.
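For example, here is a minimal sketch of a corrected import block that stays entirely on the standalone keras package (which the rest of the script already uses), with a commented-out tf.keras-only alternative; the remainder of the script can stay unchanged:

# All imports taken from the standalone keras package; the callbacks now
# come from keras.callbacks instead of tensorflow.python.keras.callbacks.
from keras.layers import Dense, Conv2D, MaxPooling2D, BatchNormalization, GlobalAveragePooling2D
from keras.callbacks import EarlyStopping, ModelCheckpoint
from keras import models
from keras.applications.vgg16 import preprocess_input
from keras.preprocessing.image import ImageDataGenerator
from keras.models import load_model
import matplotlib.pyplot as plt
import numpy as np

# Alternatively, use tf.keras everywhere instead (TensorFlow 1.14):
# from tensorflow.keras.layers import Dense, Conv2D, MaxPooling2D, BatchNormalization, GlobalAveragePooling2D
# from tensorflow.keras.callbacks import EarlyStopping, ModelCheckpoint
# from tensorflow.keras import models
# from tensorflow.keras.applications.vgg16 import preprocess_input
# from tensorflow.keras.preprocessing.image import ImageDataGenerator
# from tensorflow.keras.models import load_model

With the callbacks coming from the same package as the Sequential model, the tf.keras ModelCheckpoint's on_train_end hook (the one that looks for the tf.keras-internal _ckpt_saved_epoch attribute, as shown in the traceback) is no longer involved, so the AttributeError goes away.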
