How do I fix the error "module 'tensorflow' has no attribute 'get_default_graph'" raised when running my code?
Here is my code:
# Model Definition
input_shape = X_train[0].shape
num_genres = 10

def cnn_vgg16(input_shape, num_genres, freezed_layers):
    input_tensor = Input(shape=input_shape)
    vgg16 = VGG16(include_top=False, weights='imagenet', input_tensor=input_tensor)
    top = Sequential()
    top.add(Flatten(input_shape=vgg16.output_shape[1:]))
    top.add(Dense(256, activation='relu'))
    top.add(Dropout(0.5))
    top.add(Dense(num_genres, activation='softmax'))
    model = Model(inputs=vgg16.input, outputs=top(vgg16.output))
    for layer in model.layers[:freezed_layers]:
        layer.trainable = False
    return model

model = cnn_vgg16(input_shape, num_genres, 5)
print("Creating EarlyStopping Callback ...")
early_stopping_callback = EarlyStopping(monitor='val_acc', patience=5)
model.summary()
Here is the error:
AttributeError: module 'tensorflow' has no attribute 'get_default_graph'
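In TensorFlow 2.x, `get_default_graph` was removed from the top-level `tensorflow` module; the legacy graph API survives only under the `tf.compat.v1` namespace. So any library that still calls `tf.get_default_graph()` (the standalone `keras` package is the usual culprit) fails with exactly this AttributeError. A quick check, assuming a TF 2.x install:

```python
import tensorflow as tf

# Under TensorFlow 2.x the attribute is gone from the top-level module.
print(hasattr(tf, 'get_default_graph'))  # False on TF 2.x

# The legacy graph API is still reachable through the v1 compat layer.
graph = tf.compat.v1.get_default_graph()
print(type(graph).__name__)
```

This is why the fix below is simply to import everything from `tensorflow.keras` instead of the standalone `keras` package.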
Solution

This error typically means the standalone `keras` package, which internally calls `tf.get_default_graph()`, is being mixed with TensorFlow 2.x. Importing everything from `tensorflow.keras` instead resolves it. I was able to execute the code shown below with TensorFlow 2.5:
import tensorflow as tf
print(tf.__version__)
from tensorflow.keras.applications.vgg16 import VGG16
from tensorflow.keras.layers import Flatten, Dense, Dropout
from tensorflow.keras import Sequential, Model, Input

input_shape = (224, 224, 3)
num_genres = 10

def cnn_vgg16(input_shape, num_genres, freezed_layers):
    input_tensor = Input(shape=input_shape)
    vgg16 = VGG16(include_top=False, weights='imagenet', input_tensor=input_tensor)
    top = Sequential()
    top.add(Flatten(input_shape=vgg16.output_shape[1:]))
    top.add(Dense(256, activation='relu'))
    top.add(Dropout(0.5))
    top.add(Dense(num_genres, activation='softmax'))
    model = Model(inputs=vgg16.input, outputs=top(vgg16.output))
    for layer in model.layers[:freezed_layers]:
        layer.trainable = False
    return model

model = cnn_vgg16(input_shape, num_genres, 5)
model.summary()
Output:
2.5.0
Model: "model"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_1 (InputLayer)         [(None, 224, 224, 3)]     0
_________________________________________________________________
block1_conv1 (Conv2D)        (None, 224, 224, 64)      1792
_________________________________________________________________
block1_conv2 (Conv2D)        (None, 224, 224, 64)      36928
_________________________________________________________________
block1_pool (MaxPooling2D)   (None, 112, 112, 64)      0
_________________________________________________________________
block2_conv1 (Conv2D)        (None, 112, 112, 128)     73856
_________________________________________________________________
block2_conv2 (Conv2D)        (None, 112, 112, 128)     147584
_________________________________________________________________
block2_pool (MaxPooling2D)   (None, 56, 56, 128)       0
_________________________________________________________________
block3_conv1 (Conv2D)        (None, 56, 56, 256)       295168
_________________________________________________________________
block3_conv2 (Conv2D)        (None, 56, 56, 256)       590080
_________________________________________________________________
block3_conv3 (Conv2D)        (None, 56, 56, 256)       590080
_________________________________________________________________
block3_pool (MaxPooling2D)   (None, 28, 28, 256)       0
_________________________________________________________________
block4_conv1 (Conv2D)        (None, 28, 28, 512)       1180160
_________________________________________________________________
block4_conv2 (Conv2D)        (None, 28, 28, 512)       2359808
_________________________________________________________________
block4_conv3 (Conv2D)        (None, 28, 28, 512)       2359808
_________________________________________________________________
block4_pool (MaxPooling2D)   (None, 14, 14, 512)       0
_________________________________________________________________
block5_conv1 (Conv2D)        (None, 14, 14, 512)       2359808
_________________________________________________________________
block5_conv2 (Conv2D)        (None, 14, 14, 512)       2359808
_________________________________________________________________
block5_conv3 (Conv2D)        (None, 14, 14, 512)       2359808
_________________________________________________________________
block5_pool (MaxPooling2D)   (None, 7, 7, 512)         0
_________________________________________________________________
sequential (Sequential)      (None, 10)                6425354
=================================================================
Total params: 21,140,042
Trainable params: 21,027,466
Non-trainable params: 112,576
_________________________________________________________________
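One more detail from the question: the EarlyStopping callback monitors 'val_acc', but in TF 2.x the accuracy metric is logged under the name 'val_accuracy', so that monitor name should be updated or the callback will warn and never trigger. A minimal sketch of compiling and training with the corrected callback; the tiny stand-in model and random data below are hypothetical, used only to avoid downloading VGG16 weights, and in real use you would pass the `cnn_vgg16(...)` model from above instead:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.callbacks import EarlyStopping

# Hypothetical stand-in model; replace with model = cnn_vgg16(input_shape, num_genres, 5).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='softmax', input_shape=(8,)),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# TF 2.x logs the metric as 'val_accuracy', not 'val_acc'.
early_stopping_callback = EarlyStopping(monitor='val_accuracy', patience=5)

# Dummy data, only to demonstrate the callback wiring.
X = np.random.rand(32, 8).astype('float32')
y = np.random.randint(0, 10, size=(32,))
history = model.fit(X, y, validation_split=0.25, epochs=2, verbose=0,
                    callbacks=[early_stopping_callback])
print('val_accuracy' in history.history)
```

With `monitor='val_acc'` the same code would emit a "Early stopping conditioned on metric `val_acc` which is not available" warning instead of stopping.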