Deploying a Model to Production with TensorFlow Serving

Writing, training, and testing models with TensorFlow is familiar territory by now, but deploying a model to a production environment is not so easy. Out of the box, TensorFlow offers two ways to save and load a model. The Saver API only saves the variables: at inference time you have to rebuild the Graph, import the graph definition with tf.train.import_meta_graph, create a Saver, and then restore the variables with it. Even then, to locate the input and output tensors you must call graph.get_tensor_by_name(), which means you have to remember the names assigned to those tensors when the model was defined.
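As a minimal, self-contained sketch of that Saver-based workflow (written against the tf.compat.v1 API so it also runs under TensorFlow 2; the checkpoint path and tensor names here are made up for illustration):

```python
import os
import tempfile
import tensorflow.compat.v1 as tf  # tf.compat.v1 keeps the graph-mode API

tf.disable_eager_execution()

ckpt = os.path.join(tempfile.mkdtemp(), 'toy_model')

# --- Training side: define a toy graph, name the interface tensors, save it ---
with tf.Graph().as_default():
    x = tf.placeholder(tf.float32, [None, 1], name='inputs')
    w = tf.Variable([[2.0]], name='w')
    y = tf.matmul(x, w, name='outputs')
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        tf.train.Saver().save(sess, ckpt)  # writes toy_model.meta + variables

# --- Inference side: rebuild the graph, restore variables, look up tensors ---
with tf.Graph().as_default() as graph:
    with tf.Session() as sess:
        saver = tf.train.import_meta_graph(ckpt + '.meta')  # recreate the graph
        saver.restore(sess, ckpt)                           # restore variables
        # the names given at definition time must be known here
        x_t = graph.get_tensor_by_name('inputs:0')
        y_t = graph.get_tensor_by_name('outputs:0')
        result = sess.run(y_t, feed_dict={x_t: [[3.0]]})
        print(result)  # [[6.]]
```

The pain point is the last block: nothing in the checkpoint itself tells a client what the interface tensors are called, which is exactly what the SavedModel signatures below fix.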

Google's model-deployment component, TensorFlow Serving, solves the problems above. Note, however, that the official binary packages currently do not support GPUs; for GPU support you must build from source. See the links below.

This article walks through the complete process of deploying a prediction model with TensorFlow Serving, recording the code for each interface along the way.

The saved_model module

Introductory Tutorial to TensorFlow Serving

TensorFlow Serving | compiling the GPU version of model server

Improve TensorFlow Serving Performance With GPU Support

Why TensorFlow Serving is not using the GPU

How to Deploy a Tensorflow Model in Production

1) Installing TensorFlow Serving

There are two ways to install TensorFlow Serving: from the binary packages, which is quick and simple, or from source, where compilation often goes wrong; building from source is required if you need GPU support. See tensorflow serving install for details.

2) Writing a model compatible with TensorFlow Serving

That is, the interface tensors must be saved through the tf.saved_model module.

In [1]:
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
import os
tf.reset_default_graph()
In [2]:
dataset = np.loadtxt('/home/zzx/2.txt')
dataset[:10]
Out[2]:
array([  618.,   454.,   391.,   390.,   378.,   395.,   441.,   624.,
        1131.,  1565.])
In [3]:
dataset = (dataset - dataset.mean())/dataset.std()
dataset[:10]
Out[3]:
array([-0.23577144, -0.41557363, -0.48464398, -0.48574034, -0.49889659,
       -0.48025856, -0.42982624, -0.22919331,  0.32665857,  0.80247656])
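The z-score normalization applied above can be checked on a toy array (a small sketch independent of the notebook's data):

```python
import numpy as np

data = np.array([1.0, 2.0, 3.0, 4.0])
normalized = (data - data.mean()) / data.std()

# after z-scoring, the series has zero mean and unit standard deviation
print(normalized.mean())  # ~0.0
print(normalized.std())   # ~1.0
```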
In [4]:
# Path where the exported model will be written
MODELS_OUPUT_DIR='/home/zzx/scenic_prediction/models2'
VERSION='01'

# Model hyperparameters
window_size=20
rnn_size=20
batch_size=100
LEARNING_RATE=0.001
NUM_EPOCHS=15

Data preprocessing

In [5]:
# Slice the series into fixed-size windows with next-step targets
def window_transform_series(series,window_size):
    X = [series[i:(i+window_size)] for i in range(len(series)-window_size)]
    y = [series[i+window_size] for i in range(len(series)-window_size)]
    # reshape each
    X = np.asarray(X)
    X.shape = (np.shape(X)[0:2])
    y = np.asarray(y)
    y.shape = (len(y),1)
   
    return X,y

# Build the input and target arrays
X,y = window_transform_series(series = dataset,window_size = window_size)

# Split into training and test sets
train_test_split = int(np.ceil(9*len(y)/float(10)))
X_train = X[:train_test_split,:]
y_train = y[:train_test_split]
X_test = X[train_test_split:,:]
y_test = y[train_test_split:]

X_train = np.asarray(np.reshape(X_train, (X_train.shape[0], window_size, 1)))
X_test = np.asarray(np.reshape(X_test, (X_test.shape[0], window_size, 1)))

print(X_train.shape)
print(y_train.shape)



def get_batches(batch_size,x_inputs,y_outputs):
    batches_len=int(np.floor(len(x_inputs)/batch_size))
    print(batches_len)
    x_batches=np.asarray([x_inputs[i*batch_size:(i+1)*batch_size,:] for i in range(0,batches_len)]).astype(np.float64)
    y_batches=np.asarray([y_outputs[i*batch_size:(i+1)*batch_size,:] for i in range(0,batches_len)]).astype(np.float64)
    return x_batches,y_batches
    
(6637, 20, 1)
(6637, 1)
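The windowing logic can be verified on a toy series (window_transform_series is restated here so the sketch is self-contained):

```python
import numpy as np

def window_transform_series(series, window_size):
    # each input is a window of window_size consecutive values,
    # each target is the value immediately after that window
    X = [series[i:(i + window_size)] for i in range(len(series) - window_size)]
    y = [series[i + window_size] for i in range(len(series) - window_size)]
    X = np.asarray(X)
    y = np.asarray(y).reshape(len(y), 1)
    return X, y

series = np.arange(6, dtype=np.float64)        # [0, 1, 2, 3, 4, 5]
X, y = window_transform_series(series, window_size=2)
print(X.shape)  # (4, 2): windows [0,1], [1,2], [2,3], [3,4]
print(y.shape)  # (4, 1): targets 2, 3, 4, 5
```

A series of length N yields N - window_size samples, which is why 6657 raw points become 6637 training windows above.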

Model definition

In [6]:
inputs = tf.placeholder(tf.float32, [None, None,1],name='inputs')
targets = tf.placeholder(tf.float32, [None, None],name='targets')
lr = tf.placeholder(tf.float32, name='lr')

def model(inputs,rnn_size,batch_size):
    num_layers=1
    lstm = tf.contrib.rnn.BasicLSTMCell(rnn_size,state_is_tuple=False,activation=tf.nn.relu)
    cell = tf.contrib.rnn.MultiRNNCell([lstm] * num_layers)
    inital_state=tf.identity(cell.zero_state(batch_size,tf.float32),name='initial_state')
    outputs,final_state=tf.nn.dynamic_rnn(cell=cell,inputs=inputs,dtype=tf.float32)
    print(final_state)
    #The final fully connected layer needs no activation function
    logits=tf.contrib.layers.fully_connected(inputs=outputs[:, -1],num_outputs=1,activation_fn=None)
    return logits,inital_state


logits,inital_state = model(inputs,rnn_size,batch_size)
cost = tf.losses.mean_squared_error(labels=targets,predictions=logits)
optimizer = tf.train.AdamOptimizer(lr)
gradients = optimizer.compute_gradients(cost)
capped_gradients = [(tf.clip_by_value(grad, -1., 1.), var) for grad, var in gradients if grad is not None]
train_op = optimizer.apply_gradients(capped_gradients)
init=tf.global_variables_initializer()
   

x_batches,y_batches=get_batches(batch_size,X_train,y_train)
print(x_batches.shape)
print(y_batches.shape)
losses=[]
with tf.Session() as sess:
    sess.run(init)
    state=sess.run(inital_state)
    for epoch_i in range(NUM_EPOCHS):
        for i in range(len(x_batches)):
            feed={inputs:x_batches[i],targets:y_batches[i],lr:LEARNING_RATE}
            train_loss, _ = sess.run([cost, train_op],feed_dict=feed)
            print('Epoch {:>3} Batch {:>4}/{}  train_loss={:.3f}'.format(epoch_i,i,len(x_batches),train_loss))
        losses.append(train_loss)
           
    
    #test data
    targets_predict=sess.run(logits,feed_dict={inputs:X_test})
    print(targets_predict.shape)
    
    
    OUTPUT_PATH=os.path.join(tf.compat.as_bytes(MODELS_OUPUT_DIR),tf.compat.as_bytes(VERSION))
    print('Exporting trained model to ' + str(OUTPUT_PATH))
    
    #Create the builder that saves the model in the format TensorFlow Serving expects
    builder=tf.saved_model.builder.SavedModelBuilder(OUTPUT_PATH)
    #Build tensor info for the input and output tensors
    inputs_tensor_info=tf.saved_model.utils.build_tensor_info(inputs)
    outputs_tensor_info=tf.saved_model.utils.build_tensor_info(logits)
    #Build the signature over the input and output tensors
    predict_signature=tf.saved_model.signature_def_utils.build_signature_def(
            inputs={"inputs":inputs_tensor_info},
            outputs={"outputs":outputs_tensor_info},
            method_name=tf.saved_model.signature_constants.REGRESS_METHOD_NAME
            )
    #Register the signature under the name clients will call
    builder.add_meta_graph_and_variables(sess,[tf.saved_model.tag_constants.SERVING],{"scenic_prediction_2hour":predict_signature})
    builder.save()
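Before pointing the model server at the export directory, the export/load round trip can be sanity-checked by loading a SavedModel back in a fresh session, exactly as TensorFlow Serving does on startup. A self-contained sketch with a toy graph (tf.compat.v1 API; the signature name toy_signature and the tensor values are made up for illustration):

```python
import os
import tempfile
import tensorflow.compat.v1 as tf  # tf.compat.v1 keeps the graph-mode API

tf.disable_eager_execution()

export_dir = os.path.join(tempfile.mkdtemp(), '01')  # version subdirectory

# --- Export a toy model with a named signature, as in the cell above ---
with tf.Graph().as_default():
    x = tf.placeholder(tf.float32, [None, 1], name='inputs')
    w = tf.Variable([[2.0]], name='w')
    y = tf.matmul(x, w, name='outputs')
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
        sig = tf.saved_model.signature_def_utils.build_signature_def(
            inputs={'inputs': tf.saved_model.utils.build_tensor_info(x)},
            outputs={'outputs': tf.saved_model.utils.build_tensor_info(y)},
            method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME)
        builder.add_meta_graph_and_variables(
            sess, [tf.saved_model.tag_constants.SERVING], {'toy_signature': sig})
        builder.save()

# --- Load it back: the signature carries the tensor names for us ---
with tf.Graph().as_default() as graph:
    with tf.Session() as sess:
        meta = tf.saved_model.loader.load(
            sess, [tf.saved_model.tag_constants.SERVING], export_dir)
        sig = meta.signature_def['toy_signature']
        x_t = graph.get_tensor_by_name(sig.inputs['inputs'].name)
        y_t = graph.get_tensor_by_name(sig.outputs['outputs'].name)
        result = sess.run(y_t, feed_dict={x_t: [[3.0]]})
        print(result)  # [[6.]]
```

Unlike the Saver workflow, the client only needs to know the signature name and its input/output keys; the actual tensor names travel inside the SavedModel.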
    
    
WARNING:tensorflow:<tensorflow.contrib.rnn.python.ops.core_rnn_cell_impl.BasicLSTMCell object at 0x7fb8d837a470>: Using a concatenated state is slower and will soon be deprecated.  Use state_is_tuple=True.
(<tf.Tensor 'rnn/while/Exit_2:0' shape=(?, 40) dtype=float32>,)
66
(66, 100, 20, 1)
(66, 100, 1)
Epoch   0 Batch    0/66  train_loss=0.845
Epoch   0 Batch    1/66  train_loss=0.806
Epoch   0 Batch    2/66  train_loss=1.056
Epoch   0 Batch    3/66  train_loss=1.165
Epoch   0 Batch    4/66  train_loss=1.139
Epoch   0 Batch    5/66  train_loss=2.513
Epoch   0 Batch    6/66  train_loss=2.722
Epoch   0 Batch    7/66  train_loss=2.334
Epoch   0 Batch    8/66  train_loss=2.505
Epoch   0 Batch    9/66  train_loss=1.853
Epoch   0 Batch   10/66  train_loss=1.876
Epoch   0 Batch   11/66  train_loss=0.934
Epoch   0 Batch   12/66  train_loss=1.306
Epoch   0 Batch   13/66  train_loss=1.968
Epoch   0 Batch   14/66  train_loss=1.833
Epoch   0 Batch   15/66  train_loss=1.415
Epoch   0 Batch   16/66  train_loss=1.249
Epoch   0 Batch   17/66  train_loss=0.925
Epoch   0 Batch   18/66  train_loss=1.044
Epoch   0 Batch   19/66  train_loss=0.777
Epoch   0 Batch   20/66  train_loss=0.845
Epoch   0 Batch   21/66  train_loss=0.786
Epoch   0 Batch   22/66  train_loss=0.492
Epoch   0 Batch   23/66  train_loss=0.690
Epoch   0 Batch   24/66  train_loss=0.848
Epoch   0 Batch   25/66  train_loss=0.539
Epoch   0 Batch   26/66  train_loss=0.493
Epoch   0 Batch   27/66  train_loss=0.443
Epoch   0 Batch   28/66  train_loss=0.499
Epoch   0 Batch   29/66  train_loss=0.384
Epoch   0 Batch   30/66  train_loss=0.504
Epoch   0 Batch   31/66  train_loss=0.406
Epoch   0 Batch   32/66  train_loss=0.346
Epoch   0 Batch   33/66  train_loss=0.370
Epoch   0 Batch   34/66  train_loss=0.364
Epoch   0 Batch   35/66  train_loss=0.430
Epoch   0 Batch   36/66  train_loss=0.408
Epoch   0 Batch   37/66  train_loss=0.346
Epoch   0 Batch   38/66  train_loss=0.446
Epoch   0 Batch   39/66  train_loss=0.397
Epoch   0 Batch   40/66  train_loss=0.579
Epoch   0 Batch   41/66  train_loss=0.505
Epoch   0 Batch   42/66  train_loss=0.580
Epoch   0 Batch   43/66  train_loss=0.662
Epoch   0 Batch   44/66  train_loss=0.636
Epoch   0 Batch   45/66  train_loss=0.610
Epoch   0 Batch   46/66  train_loss=0.573
Epoch   0 Batch   47/66  train_loss=0.515
Epoch   0 Batch   48/66  train_loss=0.492
Epoch   0 Batch   49/66  train_loss=0.464
Epoch   0 Batch   50/66  train_loss=0.463
Epoch   0 Batch   51/66  train_loss=0.431
Epoch   0 Batch   52/66  train_loss=0.263
Epoch   0 Batch   53/66  train_loss=0.304
Epoch   0 Batch   54/66  train_loss=0.453
Epoch   0 Batch   55/66  train_loss=0.387
Epoch   0 Batch   56/66  train_loss=0.438
Epoch   0 Batch   57/66  train_loss=0.666
Epoch   0 Batch   58/66  train_loss=0.464
Epoch   0 Batch   59/66  train_loss=0.411
Epoch   0 Batch   60/66  train_loss=2.113
Epoch   0 Batch   61/66  train_loss=0.836
Epoch   0 Batch   62/66  train_loss=0.426
Epoch   0 Batch   63/66  train_loss=0.446
Epoch   0 Batch   64/66  train_loss=0.725
Epoch   0 Batch   65/66  train_loss=0.536
Epoch   1 Batch    0/66  train_loss=0.529
Epoch   1 Batch    1/66  train_loss=0.493
Epoch   1 Batch    2/66  train_loss=0.467
Epoch   1 Batch    3/66  train_loss=0.107
Epoch   1 Batch    4/66  train_loss=0.081
Epoch   1 Batch    5/66  train_loss=1.222
Epoch   1 Batch    6/66  train_loss=1.659
Epoch   1 Batch    7/66  train_loss=1.345
Epoch   1 Batch    8/66  train_loss=1.430
Epoch   1 Batch    9/66  train_loss=1.072
Epoch   1 Batch   10/66  train_loss=0.928
Epoch   1 Batch   11/66  train_loss=0.078
Epoch   1 Batch   12/66  train_loss=0.337
Epoch   1 Batch   13/66  train_loss=0.829
Epoch   1 Batch   14/66  train_loss=0.762
Epoch   1 Batch   15/66  train_loss=0.468
Epoch   1 Batch   16/66  train_loss=0.457
Epoch   1 Batch   17/66  train_loss=0.389
Epoch   1 Batch   18/66  train_loss=0.367
Epoch   1 Batch   19/66  train_loss=0.386
Epoch   1 Batch   20/66  train_loss=0.288
Epoch   1 Batch   21/66  train_loss=0.273
Epoch   1 Batch   22/66  train_loss=0.206
Epoch   1 Batch   23/66  train_loss=0.171
Epoch   1 Batch   24/66  train_loss=0.215
Epoch   1 Batch   25/66  train_loss=0.144
Epoch   1 Batch   26/66  train_loss=0.148
Epoch   1 Batch   27/66  train_loss=0.141
Epoch   1 Batch   28/66  train_loss=0.168
Epoch   1 Batch   29/66  train_loss=0.077
Epoch   1 Batch   30/66  train_loss=0.104
Epoch   1 Batch   31/66  train_loss=0.103
Epoch   1 Batch   32/66  train_loss=0.071
Epoch   1 Batch   33/66  train_loss=0.076
Epoch   1 Batch   34/66  train_loss=0.094
Epoch   1 Batch   35/66  train_loss=0.102
Epoch   1 Batch   36/66  train_loss=0.136
Epoch   1 Batch   37/66  train_loss=0.050
Epoch   1 Batch   38/66  train_loss=0.066
Epoch   1 Batch   39/66  train_loss=0.071
Epoch   1 Batch   40/66  train_loss=0.264
Epoch   1 Batch   41/66  train_loss=0.071
Epoch   1 Batch   42/66  train_loss=0.240
Epoch   1 Batch   43/66  train_loss=0.814
Epoch   1 Batch   44/66  train_loss=0.115
Epoch   1 Batch   45/66  train_loss=0.279
Epoch   1 Batch   46/66  train_loss=0.196
Epoch   1 Batch   47/66  train_loss=0.043
Epoch   1 Batch   48/66  train_loss=0.097
Epoch   1 Batch   49/66  train_loss=0.151
Epoch   1 Batch   50/66  train_loss=0.136
Epoch   1 Batch   51/66  train_loss=0.127
Epoch   1 Batch   52/66  train_loss=0.069
Epoch   1 Batch   53/66  train_loss=0.082
Epoch   1 Batch   54/66  train_loss=0.132
Epoch   1 Batch   55/66  train_loss=0.107
Epoch   1 Batch   56/66  train_loss=0.140
Epoch   1 Batch   57/66  train_loss=0.230
Epoch   1 Batch   58/66  train_loss=0.139
Epoch   1 Batch   59/66  train_loss=0.106
Epoch   1 Batch   60/66  train_loss=0.975
Epoch   1 Batch   61/66  train_loss=0.218
Epoch   1 Batch   62/66  train_loss=0.103
Epoch   1 Batch   63/66  train_loss=0.122
Epoch   1 Batch   64/66  train_loss=0.130
Epoch   1 Batch   65/66  train_loss=0.093
Epoch   2 Batch    0/66  train_loss=0.143
Epoch   2 Batch    1/66  train_loss=0.160
Epoch   2 Batch    2/66  train_loss=1.403
Epoch   2 Batch    3/66  train_loss=4.428
Epoch   2 Batch    4/66  train_loss=2.419
Epoch   2 Batch    5/66  train_loss=1.129
Epoch   2 Batch    6/66  train_loss=0.674
Epoch   2 Batch    7/66  train_loss=0.591
Epoch   2 Batch    8/66  train_loss=0.771
Epoch   2 Batch    9/66  train_loss=0.558
Epoch   2 Batch   10/66  train_loss=0.507
Epoch   2 Batch   11/66  train_loss=0.110
Epoch   2 Batch   12/66  train_loss=0.295
Epoch   2 Batch   13/66  train_loss=0.602
Epoch   2 Batch   14/66  train_loss=0.670
Epoch   2 Batch   15/66  train_loss=0.395
Epoch   2 Batch   16/66  train_loss=0.335
Epoch   2 Batch   17/66  train_loss=0.231
Epoch   2 Batch   18/66  train_loss=0.265
Epoch   2 Batch   19/66  train_loss=0.244
Epoch   2 Batch   20/66  train_loss=0.191
Epoch   2 Batch   21/66  train_loss=0.195
Epoch   2 Batch   22/66  train_loss=0.151
Epoch   2 Batch   23/66  train_loss=0.149
Epoch   2 Batch   24/66  train_loss=0.180
Epoch   2 Batch   25/66  train_loss=0.121
Epoch   2 Batch   26/66  train_loss=0.120
Epoch   2 Batch   27/66  train_loss=0.136
Epoch   2 Batch   28/66  train_loss=0.140
Epoch   2 Batch   29/66  train_loss=0.100
Epoch   2 Batch   30/66  train_loss=0.100
Epoch   2 Batch   31/66  train_loss=0.106
Epoch   2 Batch   32/66  train_loss=0.077
Epoch   2 Batch   33/66  train_loss=0.077
Epoch   2 Batch   34/66  train_loss=0.073
Epoch   2 Batch   35/66  train_loss=0.043
Epoch   2 Batch   36/66  train_loss=0.099
Epoch   2 Batch   37/66  train_loss=0.049
Epoch   2 Batch   38/66  train_loss=0.073
Epoch   2 Batch   39/66  train_loss=0.079
Epoch   2 Batch   40/66  train_loss=0.198
Epoch   2 Batch   41/66  train_loss=0.090
Epoch   2 Batch   42/66  train_loss=0.141
Epoch   2 Batch   43/66  train_loss=0.443
Epoch   2 Batch   44/66  train_loss=0.192
Epoch   2 Batch   45/66  train_loss=0.024
Epoch   2 Batch   46/66  train_loss=0.043
Epoch   2 Batch   47/66  train_loss=0.143
Epoch   2 Batch   48/66  train_loss=0.190
Epoch   2 Batch   49/66  train_loss=0.160
Epoch   2 Batch   50/66  train_loss=0.093
Epoch   2 Batch   51/66  train_loss=0.070
Epoch   2 Batch   52/66  train_loss=0.057
Epoch   2 Batch   53/66  train_loss=0.054
Epoch   2 Batch   54/66  train_loss=0.083
Epoch   2 Batch   55/66  train_loss=0.072
Epoch   2 Batch   56/66  train_loss=0.099
Epoch   2 Batch   57/66  train_loss=0.150
Epoch   2 Batch   58/66  train_loss=0.112
Epoch   2 Batch   59/66  train_loss=0.086
Epoch   2 Batch   60/66  train_loss=0.827
Epoch   2 Batch   61/66  train_loss=0.252
Epoch   2 Batch   62/66  train_loss=0.098
Epoch   2 Batch   63/66  train_loss=0.078
Epoch   2 Batch   64/66  train_loss=0.144
Epoch   2 Batch   65/66  train_loss=0.096
Epoch   3 Batch    0/66  train_loss=0.106
Epoch   3 Batch    1/66  train_loss=0.108
Epoch   3 Batch    2/66  train_loss=0.099
Epoch   3 Batch    3/66  train_loss=0.025
Epoch   3 Batch    4/66  train_loss=0.023
Epoch   3 Batch    5/66  train_loss=0.421
Epoch   3 Batch    6/66  train_loss=0.393
Epoch   3 Batch    7/66  train_loss=0.278
Epoch   3 Batch    8/66  train_loss=0.290
Epoch   3 Batch    9/66  train_loss=0.292
Epoch   3 Batch   10/66  train_loss=0.182
Epoch   3 Batch   11/66  train_loss=0.122
Epoch   3 Batch   12/66  train_loss=0.324
Epoch   3 Batch   13/66  train_loss=0.170
Epoch   3 Batch   14/66  train_loss=0.315
Epoch   3 Batch   15/66  train_loss=0.116
Epoch   3 Batch   16/66  train_loss=0.135
Epoch   3 Batch   17/66  train_loss=0.101
Epoch   3 Batch   18/66  train_loss=0.101
Epoch   3 Batch   19/66  train_loss=0.149
Epoch   3 Batch   20/66  train_loss=0.087
Epoch   3 Batch   21/66  train_loss=0.101
Epoch   3 Batch   22/66  train_loss=0.113
Epoch   3 Batch   23/66  train_loss=0.078
Epoch   3 Batch   24/66  train_loss=0.093
Epoch   3 Batch   25/66  train_loss=0.063
Epoch   3 Batch   26/66  train_loss=0.064
Epoch   3 Batch   27/66  train_loss=0.073
Epoch   3 Batch   28/66  train_loss=0.084
Epoch   3 Batch   29/66  train_loss=0.041
Epoch   3 Batch   30/66  train_loss=0.055
Epoch   3 Batch   31/66  train_loss=0.067
Epoch   3 Batch   32/66  train_loss=0.042
Epoch   3 Batch   33/66  train_loss=0.041
Epoch   3 Batch   34/66  train_loss=0.051
Epoch   3 Batch   35/66  train_loss=0.054
Epoch   3 Batch   36/66  train_loss=0.091
Epoch   3 Batch   37/66  train_loss=0.036
Epoch   3 Batch   38/66  train_loss=0.049
Epoch   3 Batch   39/66  train_loss=0.059
Epoch   3 Batch   40/66  train_loss=0.160
Epoch   3 Batch   41/66  train_loss=0.051
Epoch   3 Batch   42/66  train_loss=0.050
Epoch   3 Batch   43/66  train_loss=0.021
Epoch   3 Batch   44/66  train_loss=0.006
Epoch   3 Batch   45/66  train_loss=0.011
Epoch   3 Batch   46/66  train_loss=0.023
Epoch   3 Batch   47/66  train_loss=0.022
Epoch   3 Batch   48/66  train_loss=0.009
Epoch   3 Batch   49/66  train_loss=0.006
Epoch   3 Batch   50/66  train_loss=0.032
Epoch   3 Batch   51/66  train_loss=0.042
Epoch   3 Batch   52/66  train_loss=0.035
Epoch   3 Batch   53/66  train_loss=0.033
Epoch   3 Batch   54/66  train_loss=0.049
Epoch   3 Batch   55/66  train_loss=0.042
Epoch   3 Batch   56/66  train_loss=0.054
Epoch   3 Batch   57/66  train_loss=0.072
Epoch   3 Batch   58/66  train_loss=0.057
Epoch   3 Batch   59/66  train_loss=0.044
Epoch   3 Batch   60/66  train_loss=0.315
Epoch   3 Batch   61/66  train_loss=0.092
Epoch   3 Batch   62/66  train_loss=0.048
Epoch   3 Batch   63/66  train_loss=0.048
Epoch   3 Batch   64/66  train_loss=0.063
Epoch   3 Batch   65/66  train_loss=0.047
Epoch   4 Batch    0/66  train_loss=0.077
Epoch   4 Batch    1/66  train_loss=0.090
Epoch   4 Batch    2/66  train_loss=0.105
Epoch   4 Batch    3/66  train_loss=0.096
Epoch   4 Batch    4/66  train_loss=0.019
Epoch   4 Batch    5/66  train_loss=0.424
Epoch   4 Batch    6/66  train_loss=0.312
Epoch   4 Batch    7/66  train_loss=0.260
Epoch   4 Batch    8/66  train_loss=0.285
Epoch   4 Batch    9/66  train_loss=0.292
Epoch   4 Batch   10/66  train_loss=0.139
Epoch   4 Batch   11/66  train_loss=0.029
Epoch   4 Batch   12/66  train_loss=0.191
Epoch   4 Batch   13/66  train_loss=0.122
Epoch   4 Batch   14/66  train_loss=0.254
Epoch   4 Batch   15/66  train_loss=0.072
Epoch   4 Batch   16/66  train_loss=0.086
Epoch   4 Batch   17/66  train_loss=0.075
Epoch   4 Batch   18/66  train_loss=0.071
Epoch   4 Batch   19/66  train_loss=0.141
Epoch   4 Batch   20/66  train_loss=0.074
Epoch   4 Batch   21/66  train_loss=0.086
Epoch   4 Batch   22/66  train_loss=0.134
Epoch   4 Batch   23/66  train_loss=0.066
Epoch   4 Batch   24/66  train_loss=0.064
Epoch   4 Batch   25/66  train_loss=0.050
Epoch   4 Batch   26/66  train_loss=0.054
Epoch   4 Batch   27/66  train_loss=0.061
Epoch   4 Batch   28/66  train_loss=0.072
Epoch   4 Batch   29/66  train_loss=0.031
Epoch   4 Batch   30/66  train_loss=0.046
Epoch   4 Batch   31/66  train_loss=0.066
Epoch   4 Batch   32/66  train_loss=0.039
Epoch   4 Batch   33/66  train_loss=0.036
Epoch   4 Batch   34/66  train_loss=0.045
Epoch   4 Batch   35/66  train_loss=0.030
Epoch   4 Batch   36/66  train_loss=0.075
Epoch   4 Batch   37/66  train_loss=0.026
Epoch   4 Batch   38/66  train_loss=0.037
Epoch   4 Batch   39/66  train_loss=0.050
Epoch   4 Batch   40/66  train_loss=0.143
Epoch   4 Batch   41/66  train_loss=0.040
Epoch   4 Batch   42/66  train_loss=0.041
Epoch   4 Batch   43/66  train_loss=0.013
Epoch   4 Batch   44/66  train_loss=0.010
Epoch   4 Batch   45/66  train_loss=0.004
Epoch   4 Batch   46/66  train_loss=0.006
Epoch   4 Batch   47/66  train_loss=0.010
Epoch   4 Batch   48/66  train_loss=0.010
Epoch   4 Batch   49/66  train_loss=0.005
Epoch   4 Batch   50/66  train_loss=0.020
Epoch   4 Batch   51/66  train_loss=0.032
Epoch   4 Batch   52/66  train_loss=0.027
Epoch   4 Batch   53/66  train_loss=0.026
Epoch   4 Batch   54/66  train_loss=0.036
Epoch   4 Batch   55/66  train_loss=0.032
Epoch   4 Batch   56/66  train_loss=0.041
Epoch   4 Batch   57/66  train_loss=0.048
Epoch   4 Batch   58/66  train_loss=0.039
Epoch   4 Batch   59/66  train_loss=0.033
Epoch   4 Batch   60/66  train_loss=0.155
Epoch   4 Batch   61/66  train_loss=0.041
Epoch   4 Batch   62/66  train_loss=0.036
Epoch   4 Batch   63/66  train_loss=0.041
Epoch   4 Batch   64/66  train_loss=0.043
Epoch   4 Batch   65/66  train_loss=0.032
Epoch   5 Batch    0/66  train_loss=0.051
Epoch   5 Batch    1/66  train_loss=0.058
Epoch   5 Batch    2/66  train_loss=0.082
Epoch   5 Batch    3/66  train_loss=0.085
Epoch   5 Batch    4/66  train_loss=0.015
Epoch   5 Batch    5/66  train_loss=0.351
Epoch   5 Batch    6/66  train_loss=0.272
Epoch   5 Batch    7/66  train_loss=0.198
Epoch   5 Batch    8/66  train_loss=0.196
Epoch   5 Batch    9/66  train_loss=0.261
Epoch   5 Batch   10/66  train_loss=0.095
Epoch   5 Batch   11/66  train_loss=0.024
Epoch   5 Batch   12/66  train_loss=0.193
Epoch   5 Batch   13/66  train_loss=0.067
Epoch   5 Batch   14/66  train_loss=0.238
Epoch   5 Batch   15/66  train_loss=0.038
Epoch   5 Batch   16/66  train_loss=0.048
Epoch   5 Batch   17/66  train_loss=0.040
Epoch   5 Batch   18/66  train_loss=0.037
Epoch   5 Batch   19/66  train_loss=0.110
Epoch   5 Batch   20/66  train_loss=0.040
Epoch   5 Batch   21/66  train_loss=0.060
Epoch   5 Batch   22/66  train_loss=0.095
Epoch   5 Batch   23/66  train_loss=0.041
Epoch   5 Batch   24/66  train_loss=0.041
Epoch   5 Batch   25/66  train_loss=0.036
Epoch   5 Batch   26/66  train_loss=0.044
Epoch   5 Batch   27/66  train_loss=0.058
Epoch   5 Batch   28/66  train_loss=0.065
Epoch   5 Batch   29/66  train_loss=0.025
Epoch   5 Batch   30/66  train_loss=0.032
Epoch   5 Batch   31/66  train_loss=0.052
Epoch   5 Batch   32/66  train_loss=0.026
Epoch   5 Batch   33/66  train_loss=0.025
Epoch   5 Batch   34/66  train_loss=0.035
Epoch   5 Batch   35/66  train_loss=0.018
Epoch   5 Batch   36/66  train_loss=0.065
Epoch   5 Batch   37/66  train_loss=0.021
Epoch   5 Batch   38/66  train_loss=0.034
Epoch   5 Batch   39/66  train_loss=0.043
Epoch   5 Batch   40/66  train_loss=0.128
Epoch   5 Batch   41/66  train_loss=0.032
Epoch   5 Batch   42/66  train_loss=0.031
Epoch   5 Batch   43/66  train_loss=0.040
Epoch   5 Batch   44/66  train_loss=0.030
Epoch   5 Batch   45/66  train_loss=0.010
Epoch   5 Batch   46/66  train_loss=0.004
Epoch   5 Batch   47/66  train_loss=0.015
Epoch   5 Batch   48/66  train_loss=0.025
Epoch   5 Batch   49/66  train_loss=0.018
Epoch   5 Batch   50/66  train_loss=0.027
Epoch   5 Batch   51/66  train_loss=0.032
Epoch   5 Batch   52/66  train_loss=0.017
Epoch   5 Batch   53/66  train_loss=0.017
Epoch   5 Batch   54/66  train_loss=0.025
Epoch   5 Batch   55/66  train_loss=0.025
Epoch   5 Batch   56/66  train_loss=0.034
Epoch   5 Batch   57/66  train_loss=0.027
Epoch   5 Batch   58/66  train_loss=0.028
Epoch   5 Batch   59/66  train_loss=0.033
Epoch   5 Batch   60/66  train_loss=0.142
Epoch   5 Batch   61/66  train_loss=0.035
Epoch   5 Batch   62/66  train_loss=0.028
Epoch   5 Batch   63/66  train_loss=0.032
Epoch   5 Batch   64/66  train_loss=0.039
Epoch   5 Batch   65/66  train_loss=0.028
Epoch   6 Batch    0/66  train_loss=0.032
Epoch   6 Batch    1/66  train_loss=0.041
Epoch   6 Batch    2/66  train_loss=0.061
Epoch   6 Batch    3/66  train_loss=0.072
Epoch   6 Batch    4/66  train_loss=0.031
Epoch   6 Batch    5/66  train_loss=0.291
Epoch   6 Batch    6/66  train_loss=0.214
Epoch   6 Batch    7/66  train_loss=0.133
Epoch   6 Batch    8/66  train_loss=0.116
Epoch   6 Batch    9/66  train_loss=0.209
Epoch   6 Batch   10/66  train_loss=0.080
Epoch   6 Batch   11/66  train_loss=0.020
Epoch   6 Batch   12/66  train_loss=0.203
Epoch   6 Batch   13/66  train_loss=0.084
Epoch   6 Batch   14/66  train_loss=0.244
Epoch   6 Batch   15/66  train_loss=0.038
Epoch   6 Batch   16/66  train_loss=0.045
Epoch   6 Batch   17/66  train_loss=0.040
Epoch   6 Batch   18/66  train_loss=0.035
Epoch   6 Batch   19/66  train_loss=0.105
Epoch   6 Batch   20/66  train_loss=0.031
Epoch   6 Batch   21/66  train_loss=0.049
Epoch   6 Batch   22/66  train_loss=0.078
Epoch   6 Batch   23/66  train_loss=0.029
Epoch   6 Batch   24/66  train_loss=0.031
Epoch   6 Batch   25/66  train_loss=0.030
Epoch   6 Batch   26/66  train_loss=0.043
Epoch   6 Batch   27/66  train_loss=0.063
Epoch   6 Batch   28/66  train_loss=0.063
Epoch   6 Batch   29/66  train_loss=0.023
Epoch   6 Batch   30/66  train_loss=0.022
Epoch   6 Batch   31/66  train_loss=0.044
Epoch   6 Batch   32/66  train_loss=0.019
Epoch   6 Batch   33/66  train_loss=0.017
Epoch   6 Batch   34/66  train_loss=0.030
Epoch   6 Batch   35/66  train_loss=0.018
Epoch   6 Batch   36/66  train_loss=0.060
Epoch   6 Batch   37/66  train_loss=0.020
Epoch   6 Batch   38/66  train_loss=0.036
Epoch   6 Batch   39/66  train_loss=0.041
Epoch   6 Batch   40/66  train_loss=0.132
Epoch   6 Batch   41/66  train_loss=0.031
Epoch   6 Batch   42/66  train_loss=0.031
Epoch   6 Batch   43/66  train_loss=0.069
Epoch   6 Batch   44/66  train_loss=0.058
Epoch   6 Batch   45/66  train_loss=0.029
Epoch   6 Batch   46/66  train_loss=0.006
Epoch   6 Batch   47/66  train_loss=0.007
Epoch   6 Batch   48/66  train_loss=0.028
Epoch   6 Batch   49/66  train_loss=0.037
Epoch   6 Batch   50/66  train_loss=0.056
Epoch   6 Batch   51/66  train_loss=0.061
Epoch   6 Batch   52/66  train_loss=0.026
Epoch   6 Batch   53/66  train_loss=0.020
Epoch   6 Batch   54/66  train_loss=0.023
Epoch   6 Batch   55/66  train_loss=0.017
Epoch   6 Batch   56/66  train_loss=0.028
Epoch   6 Batch   57/66  train_loss=0.022
Epoch   6 Batch   58/66  train_loss=0.029
Epoch   6 Batch   59/66  train_loss=0.038
Epoch   6 Batch   60/66  train_loss=0.110
Epoch   6 Batch   61/66  train_loss=0.053
Epoch   6 Batch   62/66  train_loss=0.034
Epoch   6 Batch   63/66  train_loss=0.042
Epoch   6 Batch   64/66  train_loss=0.034
Epoch   6 Batch   65/66  train_loss=0.024
Epoch   7 Batch    0/66  train_loss=0.030
Epoch   7 Batch    1/66  train_loss=0.037
Epoch   7 Batch    2/66  train_loss=0.069
Epoch   7 Batch    3/66  train_loss=0.093
Epoch   7 Batch    4/66  train_loss=0.047
Epoch   7 Batch    5/66  train_loss=0.303
Epoch   7 Batch    6/66  train_loss=0.217
Epoch   7 Batch    7/66  train_loss=0.146
Epoch   7 Batch    8/66  train_loss=0.128
Epoch   7 Batch    9/66  train_loss=0.203
Epoch   7 Batch   10/66  train_loss=0.086
Epoch   7 Batch   11/66  train_loss=0.017
Epoch   7 Batch   12/66  train_loss=0.164
Epoch   7 Batch   13/66  train_loss=0.102
Epoch   7 Batch   14/66  train_loss=0.284
Epoch   7 Batch   15/66  train_loss=0.030
Epoch   7 Batch   16/66  train_loss=0.030
Epoch   7 Batch   17/66  train_loss=0.029
Epoch   7 Batch   18/66  train_loss=0.024
Epoch   7 Batch   19/66  train_loss=0.104
Epoch   7 Batch   20/66  train_loss=0.030
Epoch   7 Batch   21/66  train_loss=0.050
Epoch   7 Batch   22/66  train_loss=0.071
Epoch   7 Batch   23/66  train_loss=0.025
Epoch   7 Batch   24/66  train_loss=0.026
Epoch   7 Batch   25/66  train_loss=0.024
Epoch   7 Batch   26/66  train_loss=0.034
Epoch   7 Batch   27/66  train_loss=0.057
Epoch   7 Batch   28/66  train_loss=0.060
Epoch   7 Batch   29/66  train_loss=0.021
Epoch   7 Batch   30/66  train_loss=0.019
Epoch   7 Batch   31/66  train_loss=0.042
Epoch   7 Batch   32/66  train_loss=0.019
Epoch   7 Batch   33/66  train_loss=0.016
Epoch   7 Batch   34/66  train_loss=0.029
Epoch   7 Batch   35/66  train_loss=0.022
Epoch   7 Batch   36/66  train_loss=0.053
Epoch   7 Batch   37/66  train_loss=0.016
Epoch   7 Batch   38/66  train_loss=0.029
Epoch   7 Batch   39/66  train_loss=0.038
Epoch   7 Batch   40/66  train_loss=0.119
Epoch   7 Batch   41/66  train_loss=0.028
Epoch   7 Batch   42/66  train_loss=0.028
Epoch   7 Batch   43/66  train_loss=0.057
Epoch   7 Batch   44/66  train_loss=0.048
Epoch   7 Batch   45/66  train_loss=0.027
Epoch   7 Batch   46/66  train_loss=0.009
Epoch   7 Batch   47/66  train_loss=0.004
Epoch   7 Batch   48/66  train_loss=0.013
Epoch   7 Batch   49/66  train_loss=0.025
Epoch   7 Batch   50/66  train_loss=0.061
Epoch   7 Batch   51/66  train_loss=0.084
Epoch   7 Batch   52/66  train_loss=0.049
Epoch   7 Batch   53/66  train_loss=0.038
Epoch   7 Batch   54/66  train_loss=0.034
Epoch   7 Batch   55/66  train_loss=0.018
Epoch   7 Batch   56/66  train_loss=0.024
Epoch   7 Batch   57/66  train_loss=0.020
Epoch   7 Batch   58/66  train_loss=0.025
Epoch   7 Batch   59/66  train_loss=0.032
Epoch   7 Batch   60/66  train_loss=0.103
Epoch   7 Batch   61/66  train_loss=0.058
Epoch   7 Batch   62/66  train_loss=0.038
Epoch   7 Batch   63/66  train_loss=0.047
Epoch   7 Batch   64/66  train_loss=0.032
Epoch   7 Batch   65/66  train_loss=0.022
Epoch   8 Batch    0/66  train_loss=0.030
Epoch   8 Batch    1/66  train_loss=0.034
Epoch   8 Batch    2/66  train_loss=0.067
Epoch   8 Batch    3/66  train_loss=0.083
Epoch   8 Batch    4/66  train_loss=0.045
Epoch   8 Batch    5/66  train_loss=0.303
Epoch   8 Batch    6/66  train_loss=0.210
Epoch   8 Batch    7/66  train_loss=0.143
Epoch   8 Batch    8/66  train_loss=0.129
Epoch   8 Batch    9/66  train_loss=0.192
Epoch   8 Batch   10/66  train_loss=0.087
Epoch   8 Batch   11/66  train_loss=0.014
Epoch   8 Batch   12/66  train_loss=0.141
Epoch   8 Batch   13/66  train_loss=0.079
Epoch   8 Batch   14/66  train_loss=0.247
Epoch   8 Batch   15/66  train_loss=0.031
Epoch   8 Batch   16/66  train_loss=0.026
Epoch   8 Batch   17/66  train_loss=0.021
Epoch   8 Batch   18/66  train_loss=0.015
Epoch   8 Batch   19/66  train_loss=0.098
Epoch   8 Batch   20/66  train_loss=0.023
Epoch   8 Batch   21/66  train_loss=0.048
Epoch   8 Batch   22/66  train_loss=0.071
Epoch   8 Batch   23/66  train_loss=0.022
Epoch   8 Batch   24/66  train_loss=0.021
Epoch   8 Batch   25/66  train_loss=0.021
Epoch   8 Batch   26/66  train_loss=0.030
Epoch   8 Batch   27/66  train_loss=0.053
Epoch   8 Batch   28/66  train_loss=0.057
Epoch   8 Batch   29/66  train_loss=0.018
Epoch   8 Batch   30/66  train_loss=0.018
Epoch   8 Batch   31/66  train_loss=0.040
Epoch   8 Batch   32/66  train_loss=0.018
Epoch   8 Batch   33/66  train_loss=0.016
Epoch   8 Batch   34/66  train_loss=0.029
Epoch   8 Batch   35/66  train_loss=0.021
Epoch   8 Batch   36/66  train_loss=0.050
Epoch   8 Batch   37/66  train_loss=0.014
Epoch   8 Batch   38/66  train_loss=0.023
Epoch   8 Batch   39/66  train_loss=0.035
Epoch   8 Batch   40/66  train_loss=0.105
Epoch   8 Batch   41/66  train_loss=0.024
Epoch   8 Batch   42/66  train_loss=0.023
Epoch   8 Batch   43/66  train_loss=0.045
Epoch   8 Batch   44/66  train_loss=0.038
Epoch   8 Batch   45/66  train_loss=0.022
Epoch   8 Batch   46/66  train_loss=0.008
Epoch   8 Batch   47/66  train_loss=0.003
...
(per-batch training log for Epochs 8-14 truncated)
...
Epoch  14 Batch   64/66  train_loss=0.018
Epoch  14 Batch   65/66  train_loss=0.012
(737, 1)
Exporting trained model to b'/home/zzx/scenic_prediction/models2/01'
INFO:tensorflow:No assets to save.
INFO:tensorflow:No assets to write.
INFO:tensorflow:SavedModel written to: b'/home/zzx/scenic_prediction/models2/01/saved_model.pb'
In [7]:
plt.plot(losses, color='g')  # training-loss curve over all batches
plt.show()

plt.plot(y_test, color='k')           # ground truth on the test set
plt.plot(targets_predict, color='r')  # model predictions
plt.show()
In [9]:
!ls -rtl /home/zzx/scenic_prediction/models2/01
total 200
drwxr-xr-x 2 zzx zzx   4096 Jul 12 17:49 variables
-rw-rw-r-- 1 zzx zzx 199730 Jul 12 17:49 saved_model.pb

3) Deploying the models

TensorFlow Serving can serve multiple models, each with multiple versions, at the same time. This makes A/B testing possible and also lets clients request a specific model version.

The model configuration file is as follows:

In [10]:
!cat /home/zzx/scenic_prediction/tfserv.conf
model_config_list: {
  config: {
    name: "nanshanchi_1hours",
    base_path: "/home/zzx/scenic_prediction/nanshanchi_1hours",
    model_platform: "tensorflow"
  },
  config: {
     name: "nanshanchi_2hours",
     base_path: "/home/zzx/scenic_prediction/nanshanchi_2hours",
     model_platform: "tensorflow"
  },
  config: {
     name: "nanshanchi_3hours",
     base_path: "/home/zzx/scenic_prediction/nanshanchi_3hours",
     model_platform: "tensorflow"
  },
  config: {
    name: "daxiaodongtian_1hours",
    base_path: "/home/zzx/scenic_prediction/daxiaodongtian_1hours",
    model_platform: "tensorflow"
  },
  config: {
     name: "daxiaodongtian_2hours",
     base_path: "/home/zzx/scenic_prediction/daxiaodongtian_2hours",
     model_platform: "tensorflow"
  },
  config: {
     name: "daxiaodongtian_3hours",
     base_path: "/home/zzx/scenic_prediction/daxiaodongtian_3hours",
     model_platform: "tensorflow"
  },
  config: {
    name: "tianyahaijiao_1hours",
    base_path: "/home/zzx/scenic_prediction/tianyahaijiao_1hours",
    model_platform: "tensorflow"
  },
  config: {
     name: "tianyahaijiao_2hours",
     base_path: "/home/zzx/scenic_prediction/tianyahaijiao_2hours",
     model_platform: "tensorflow"
  },
  config: {
     name: "tianyahaijiao_3hours",
     base_path: "/home/zzx/scenic_prediction/tianyahaijiao_3hours",
     model_platform: "tensorflow"
  },
  config: {
    name: "sanyaxidao_1hours",
    base_path: "/home/zzx/scenic_prediction/sanyaxidao_1hours",
    model_platform: "tensorflow"
  },
  config: {
     name: "sanyaxidao_2hours",
     base_path: "/home/zzx/scenic_prediction/sanyaxidao_2hours",
     model_platform: "tensorflow"
  },
  config: {
     name: "sanyaxidao_3hours",
     base_path: "/home/zzx/scenic_prediction/sanyaxidao_3hours",
     model_platform: "tensorflow"
  },
  config: {
    name: "wuzhizhoudao_1hours",
    base_path: "/home/zzx/scenic_prediction/wuzhizhoudao_1hours",
    model_platform: "tensorflow"
  },
  config: {
     name: "wuzhizhoudao_2hours",
     base_path: "/home/zzx/scenic_prediction/wuzhizhoudao_2hours",
     model_platform: "tensorflow"
  },
  config: {
     name: "wuzhizhoudao_3hours",
     base_path: "/home/zzx/scenic_prediction/wuzhizhoudao_3hours",
     model_platform: "tensorflow"
  },
}
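If a specific version needs to be pinned (e.g. for the A/B testing mentioned above), each `config` entry can additionally carry a `model_version_policy`. A sketch for one entry follows; by default Serving loads only the latest version, and the exact field layout should be checked against the ModelServer config proto of your Serving release:

```
config: {
  name: "nanshanchi_1hours",
  base_path: "/home/zzx/scenic_prediction/nanshanchi_1hours",
  model_platform: "tensorflow",
  model_version_policy: { specific: { versions: 1 } }
}
```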

Running the following command deploys all of the models:

tensorflow_model_server --port=8500 --model_config_file=./tfserv.conf
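Note that each `base_path` must contain numeric version subdirectories. The export cell above wrote version `01`, so the expected on-disk layout for one model looks like:

```
/home/zzx/scenic_prediction/nanshanchi_1hours/
└── 01/
    ├── saved_model.pb
    └── variables/
```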

4) A client for calling the service

Because TensorFlow Serving uses protobuf for client-server communication, the protobuf files can be compiled into clients for many languages (see tobegit3hub). Using such clients directly is inconvenient, though: front-end developers would have to study the various TensorFlow Serving interfaces, the coupling is high, and the learning cost is significant. This article instead uses the officially supported tensorflow-serving-api together with the Python socketio library to implement a relay service exposing a RESTful-style interface that the front end can call directly. While writing this article I noticed on the official site that Serving now supports a RESTful API out of the box, so implementing one yourself is no longer necessary; since it did not exist when this code was written, I am keeping the code here anyway.
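For completeness, the official RESTful API takes a JSON body POSTed to `http://host:8501/v1/models/<name>:predict`. A sketch of the request shape for the models in this article (truncated to three of the 20 timesteps; the model and signature names are assumptions to be checked against your export):

```json
{
  "signature_name": "scenic_prediction_2hour",
  "instances": [ [[0.1], [0.2], [0.3]] ]
}
```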

The interface is exposed as a socketio service; socketio has client libraries for many languages, so the front end can talk to this service from whichever language it prefers.

In [12]:
import numpy as np
import socketio
import eventlet
import eventlet.wsgi
from flask import Flask
from grpc.beta import implementations
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2
from tensorflow.python.framework import dtypes
import json

sio = socketio.Server()
app = Flask(__name__)


HOST = "127.0.0.1"
PORT = 8500


# open an insecure gRPC channel to tensorflow_model_server and build a PredictionService stub
channel = implementations.insecure_channel(HOST, PORT)
stub = prediction_service_pb2.beta_create_PredictionService_stub(channel=channel)

@sio.on('scenic_prediction_2hour', namespace='/')
def scenic_prediction_2hour(sid, data):
    if data:
        print(data)
        # the client sends 20 comma-separated normalized values
        inputs = np.array(data.split(',')).astype(np.float64)
        request = predict_pb2.PredictRequest()
        request.model_spec.name = 'scenic_prediction_2hour'
        request.model_spec.signature_name = 'scenic_prediction_2hour'
        request.inputs['inputs'].CopyFrom(
            tf.contrib.util.make_tensor_proto(inputs, dtype=dtypes.float64, shape=[1, 20, 1]))

        response = stub.Predict(request, 10.0)  # 10-second timeout
        print(response)
        results = {}
        # convert each output tensor back to a numpy array and keep the scalar prediction
        for key in response.outputs:
            tensor_proto = response.outputs[key]
            nd_array = tf.contrib.util.make_ndarray(tensor_proto)
            results[key] = str(nd_array[0][0])
        print(results)
        return json.dumps(results)

    else:
        print("no data")
        # NOTE: DON'T EDIT THIS.
        sio.emit('manual', data={}, skip_sid=True)
 

@sio.on('connect',namespace='/')
def connect(sid, environ):
    print("connect ", sid)
    
@sio.on('disconnect',namespace='/')
def disconnect(sid):
    print(sid)
    
# wrap Flask application with engineio's middleware
app = socketio.Middleware(sio, app)

# deploy as an eventlet WSGI server
eventlet.wsgi.server(eventlet.listen(('', 4567)), app)
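The handler above expects `data` to be a single comma-separated string of 20 normalized values. A minimal sketch of preparing such a payload on the caller's side, mirroring the z-score normalization from the preprocessing cell at the top (the series values here are made up for illustration):

```python
import numpy as np

window_size = 20

# made-up raw counts, standing in for the real visitor-count series
series = np.linspace(300.0, 1600.0, num=50)
# same z-score normalization used during training
series = (series - series.mean()) / series.std()

# the socketio handler splits this string on ',' and reshapes it to [1, 20, 1]
payload = ','.join(str(v) for v in series[-window_size:])
print(payload.count(',') + 1)  # → 20
```

The payload string can then be emitted on the 'scenic_prediction_2hour' event from any socketio client.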
In [ ]: