Fixing the pitfalls of mixing TensorFlow and Keras



When mixing TensorFlow and Keras, model.save worked fine, but load_model raised an error. Noting it down here.

The error was: TypeError: tuple indices must be integers or slices, not list

A round of searching on Baidu turned up nothing; on Google I found a similar question, but it was written in a language I could not read (which turned out to be Russian). After running it through Google Translate I found the fix, so I am posting the original question here as a warning.

Original training code


from tensorflow.python.keras.preprocessing.image import ImageDataGenerator
from tensorflow.python.keras.models import Sequential
from tensorflow.python.keras.layers import Conv2D, MaxPooling2D, BatchNormalization
from tensorflow.python.keras.layers import Activation, Dropout, Flatten, Dense
 
# Directory with the training data
train_dir = 'train'
# Directory with the validation data
val_dir = 'val'
# Directory with the test data
test_dir = 'val'
 
# Image dimensions
img_width, img_height = 800, 800
# Shape of the image tensor fed to the network as input
# (TensorFlow backend, channels_last)
input_shape = (img_width, img_height, 3)
# Number of epochs
epochs = 1
# Mini-batch size
batch_size = 4
# Number of training images
nb_train_samples = 300
# Number of validation images
nb_validation_samples = 25
# Number of test images
nb_test_samples = 25
 
model = Sequential()
 
model.add(Conv2D(32, (7, 7), padding="same", input_shape=input_shape))
model.add(BatchNormalization())
model.add(Activation('tanh'))
model.add(MaxPooling2D(pool_size=(10, 10)))
 
model.add(Conv2D(64, (5, 5), padding="same"))
model.add(BatchNormalization())
model.add(Activation('tanh'))
model.add(MaxPooling2D(pool_size=(10, 10)))
 
model.add(Flatten())
model.add(Dense(512))
model.add(Activation('relu'))
model.add(Dropout(0.5))
model.add(Dense(10, activation='softmax'))
 
model.compile(loss='categorical_crossentropy',
              optimizer="Nadam",
              metrics=['accuracy'])
print(model.summary())
datagen = ImageDataGenerator(rescale=1. / 255)
 
train_generator = datagen.flow_from_directory(
    train_dir,
    target_size=(img_width, img_height),
    batch_size=batch_size,
    class_mode='categorical')
 
val_generator = datagen.flow_from_directory(
    val_dir,
    target_size=(img_width, img_height),
    batch_size=batch_size,
    class_mode='categorical')
 
test_generator = datagen.flow_from_directory(
    test_dir,
    target_size=(img_width, img_height),
    batch_size=batch_size,
    class_mode='categorical')
 
model.fit_generator(
    train_generator,
    steps_per_epoch=nb_train_samples // batch_size,
    epochs=epochs,
    validation_data=val_generator,
    validation_steps=nb_validation_samples // batch_size)
 
print('Сохраняем сеть')
 
model.save("grib.h5")
print("Сохранение завершено!")

Loading the model


from tensorflow.python.keras.preprocessing.image import ImageDataGenerator
from tensorflow.python.keras.models import Sequential
from tensorflow.python.keras.layers import Conv2D, MaxPooling2D, BatchNormalization
from tensorflow.python.keras.layers import Activation, Dropout, Flatten, Dense
from keras.models import load_model
 
print("Загрузка сети")
model = load_model("grib.h5")
print("Загрузка завершена!")

The error

/usr/bin/python3.5 /home/disk2/py/neroset/do.py
/home/mama/.local/lib/python3.5/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
from ._conv import register_converters as _register_converters
Using TensorFlow backend.
Загрузка сети
Traceback (most recent call last):
File "/home/disk2/py/neroset/do.py", line 13, in <module>
model = load_model("grib.h5")
File "/usr/local/lib/python3.5/dist-packages/keras/models.py", line 243, in load_model
model = model_from_config(model_config, custom_objects=custom_objects)
File "/usr/local/lib/python3.5/dist-packages/keras/models.py", line 317, in model_from_config
return layer_module.deserialize(config, custom_objects=custom_objects)
File "/usr/local/lib/python3.5/dist-packages/keras/layers/__init__.py", line 55, in deserialize
printable_module_name='layer')
File "/usr/local/lib/python3.5/dist-packages/keras/utils/generic_utils.py", line 144, in deserialize_keras_object
list(custom_objects.items())))
File "/usr/local/lib/python3.5/dist-packages/keras/models.py", line 1350, in from_config
model.add(layer)
File "/usr/local/lib/python3.5/dist-packages/keras/models.py", line 492, in add
output_tensor = layer(self.outputs[0])
File "/usr/local/lib/python3.5/dist-packages/keras/engine/topology.py", line 590, in __call__
self.build(input_shapes[0])
File "/usr/local/lib/python3.5/dist-packages/keras/layers/normalization.py", line 92, in build
dim = input_shape[self.axis]
TypeError: tuple indices must be integers or slices, not list

Process finished with exit code 1

The Russian answerer's explanation

Translated from the Russian answer: "If I remove BatchNormalization, everything works fine. Can anyone tell me what the mistake is? I found out that Keras's model saving and TensorFlow's normalization do not work together; you just need to change the import lines."
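
To make that concrete, here is a minimal sketch of what appears to go wrong (my own illustration, not from the original post): the BatchNormalization layer saved by tensorflow.python.keras carries its axis in the model config as a list such as [3], and the standalone keras BatchNormalization.build then indexes the input-shape tuple with that value, which is exactly the failing line in the traceback above.


# Sketch of the failing line from keras/layers/normalization.py build():
#   dim = input_shape[self.axis]
# Assumption: the model saved by tensorflow.python.keras stores axis as a list, e.g. [3].
input_shape = (None, 800, 800, 3)   # example Keras input shape tuple
axis = [3]                          # axis as deserialized from the tf.keras-saved config

try:
    dim = input_shape[axis]         # a tuple cannot be indexed with a list
except TypeError as err:
    print(err)                      # tuple indices must be integers or slices, not list

dim = input_shape[axis[0]]          # with a plain integer index it works
print(dim)                          # 3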

The corrected imports, using the standalone Keras packages throughout:


from keras.preprocessing.image import ImageDataGenerator
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, BatchNormalization
from keras.layers import Activation, Dropout, Flatten, Dense

This solved the problem completely.

Link to the original question:

https://qa-help.ru/questions/keras-batchnormalization
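
An equivalent fix (my own sketch, not from the original post) is to go the other way and load the model with the same tf.keras implementation that saved it:


from tensorflow.python.keras.models import load_model

print("Загрузка сети")
model = load_model("grib.h5")   # loaded by tensorflow.python.keras, matching the code that saved it
print("Загрузка завершена!")

Either way, the point is the same: the model must be saved and loaded by one and the same Keras implementation.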

Addendum: be careful when loading Keras and TensorFlow models in the same process

In one project, a Keras model was loaded first to obtain the model's input size, and then the .pb model (converted from Keras to TensorFlow) was loaded for prediction.

The error:

Attempting to use uninitialized value batch_normalization_14/moving_mean

Browsing the forums, some people suggested adding an initialization step:


sess.run(tf.global_variables_initializer())

But doing so re-initializes all of the model's parameters, so the trained weights can no longer be used for prediction.

In the end, simply removing the Keras model loading fixed it.

Suspected cause: loading a Keras model and a TensorFlow model in the same session is a pitfall.
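
If both models really are needed in the same process, one workaround (my own sketch under that assumption, reusing the paths from the code below) is to keep the Keras model and the frozen TensorFlow graph in separate graphs and sessions so that their variables never share state:


import tensorflow as tf
from keras.models import load_model
from tensorflow.python.platform import gfile

# 1) Read the input size from the Keras model; it lives in the default graph/session.
emotion_classifier = load_model('./models/emotion_model.hdf5')
emotion_target_size = emotion_classifier.input_shape[1:3]

# 2) Import the frozen .pb graph into its own graph with its own session,
#    so no uninitialized Keras variables leak into the prediction session.
pb_graph = tf.Graph()
with pb_graph.as_default():
    graph_def = tf.GraphDef()
    with gfile.FastGFile("./trans_model/emotion_mode.pb", 'rb') as f:
        graph_def.ParseFromString(f.read())
    tf.import_graph_def(graph_def, name='')

pb_sess = tf.Session(graph=pb_graph)
pred = pb_graph.get_tensor_by_name("predictions/Softmax:0")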


import cv2
import numpy as np
from keras.models import load_model
from utils.datasets import get_labels
from utils.preprocessor import preprocess_input
import time
import os
import tensorflow as tf
from tensorflow.python.platform import gfile
 
os.environ["CUDA_VISIBLE_DEVICES"] = "-1"
 
emotion_labels = get_labels('fer2013')
emotion_target_size = (64,64)
#emotion_model_path = './models/emotion_model.hdf5'
#emotion_classifier = load_model(emotion_model_path)
#emotion_target_size = emotion_classifier.input_shape[1:3]
 
path = '/mnt/nas/cv_data/emotion/test'
filelist = os.listdir(path)
total_num = len(filelist)
timeall = 0
n = 0
 
sess = tf.Session()
#sess.run(tf.global_variables_initializer())  # re-initializing wipes the trained weights -- keep this commented out
with gfile.FastGFile("./trans_model/emotion_mode.pb", 'rb') as f:
    # Load the frozen TensorFlow graph (converted from the Keras model) into the session's graph
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())
    sess.graph.as_default()
    tf.import_graph_def(graph_def, name='')
 
    # Output tensor of the emotion classifier
    pred = sess.graph.get_tensor_by_name("predictions/Softmax:0")
 
    ######################img##########################
    for item in filelist:
        if (item == '.DS_Store') | (item == 'Thumbs.db'):
            continue
        src = os.path.join(os.path.abspath(path), item)
        bgr_image = cv2.imread(src)
        gray_image = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
        gray_face = gray_image
        try:
            gray_face = cv2.resize(gray_face, (emotion_target_size))
        except:
            continue
 
        gray_face = preprocess_input(gray_face, True)
        gray_face = np.expand_dims(gray_face, 0)
        gray_face = np.expand_dims(gray_face, -1)
 
        input = sess.graph.get_tensor_by_name('input_1:0')
        res = sess.run(pred, {input: gray_face})
        print("src:", src)
 
        emotion_probability = np.max(res[0])
        emotion_label_arg = np.argmax(res[0])
        emotion_text = emotion_labels[emotion_label_arg]
        print("predict:", res[0], ",prob:", emotion_probability, ",label:", emotion_label_arg, ",text:",emotion_text)

The above is my personal experience; I hope it can serve as a useful reference.
