9 use of tensorboard
2022-06-26 15:50:00 【X1996_】
Using TensorBoard with TensorFlow 2.0 (Win10)
1. Using TensorBoard with the Keras API
A callback needs to be defined, with its parameters set. The meaning of each parameter:
log_dir: the directory where the TensorBoard log files to be parsed are saved.
histogram_freq: defaults to 0. How often (in epochs) to compute activation and weight histograms for each layer of the model. If set to 0, histograms are not computed. For histogram visualization, validation data must be specified (or a validation split set).
write_graph: defaults to True. Whether to visualize the graph in TensorBoard. When set to True, the log file can become very large.
write_images: defaults to False. Whether to write model weights so they can be visualized as images in TensorBoard.
update_freq: defaults to "epoch". Can be "epoch", "batch", or an integer. With "batch", the loss and metrics are written to TensorBoard after every batch; "epoch" behaves analogously. With an integer, say 1000, the loss and metrics are written to TensorBoard every 1000 batches. Writing too often slows down training.
profile_batch: defaults to 2, i.e. which batch to profile. Must be a non-negative integer or a tuple of integers; a pair of positive integers specifies the range of batches to profile. Setting profile_batch=0 disables profiling.
embeddings_freq: defaults to 0. How often (in epochs) embeddings are visualized. If set to 0, embeddings are not visualized.
embeddings_metadata: defaults to None. A dictionary which maps a layer name to a file name in which the metadata for that embedding layer is saved (see the docs for the metadata file format). If the same metadata file is used for all embedding layers, a single string can be passed.
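To make the parameters above concrete, here is a minimal sketch of constructing the callback with each of them set explicitly. The log directory name is a hypothetical choice for illustration; adjust the values to your own run.

```python
import tensorflow as tf

# Hypothetical settings chosen only to illustrate each parameter.
tensorboard_callback = tf.keras.callbacks.TensorBoard(
    log_dir="logs/fit",    # where the event files are written
    histogram_freq=1,      # weight/activation histograms every epoch
    write_graph=True,      # log the model graph (enlarges the log file)
    write_images=False,    # do not log weights as images
    update_freq="epoch",   # write loss/metrics once per epoch
    profile_batch=0,       # 0 disables profiling
    embeddings_freq=0,     # do not visualize embeddings
)
```

The callback is then passed to `model.fit(..., callbacks=[tensorboard_callback])`, as in the full example below.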
For the Keras experiment, the MNIST dataset (mnist.npz) is needed.
```python
from __future__ import absolute_import, division, print_function, unicode_literals
import tensorflow as tf
from tensorflow.keras.layers import Dense, Flatten, Conv2D
from tensorflow.keras import Model
import numpy as np
import datetime

# Allocate GPU memory on demand, to avoid OOM
from tensorflow.compat.v1 import ConfigProto
from tensorflow.compat.v1 import InteractiveSession
config = ConfigProto()
config.gpu_options.allow_growth = True
session = InteractiveSession(config=config)

mnist = np.load("mnist.npz")
x_train, y_train, x_test, y_test = mnist['x_train'], mnist['y_train'], mnist['x_test'], mnist['y_test']
x_train, x_test = x_train / 255.0, x_test / 255.0
# Add a channels dimension
x_train = x_train[..., tf.newaxis]
x_test = x_test[..., tf.newaxis]

class MyModel(Model):
    def __init__(self):
        super(MyModel, self).__init__()
        self.conv1 = Conv2D(32, 3, activation='relu')
        self.flatten = Flatten()
        self.d1 = Dense(128, activation='relu')
        self.d2 = Dense(10, activation='softmax')

    @tf.function
    def call(self, x):
        x = self.conv1(x)
        x = self.flatten(x)
        x = self.d1(x)
        return self.d2(x)

model = MyModel()
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Set the callback
tensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir="keras_logv2",
                                                      histogram_freq=1,
                                                      profile_batch=100000000)
model.fit(x=x_train,
          y=y_train,
          epochs=20,
          validation_data=(x_test, y_test),
          callbacks=[tensorboard_callback])
```
The logs are saved in the keras_logv2 folder.

In Jupyter Notebook on Win10 I ran into many problems opening TensorBoard directly; it did not work. The notebook magics are:

```
%load_ext tensorboard
%tensorboard --logdir keras_logv2
```

Opening it from the Anaconda command line works directly. In the folder where the logs are located, enter:

```
tensorboard --logdir keras_logv2
```

Here keras_logv2 is the folder name; note that the required environment must be activated first. Press Enter and a web address appears; copy it into a browser and open it: http://localhost:6006/

2. Using TensorBoard with a custom training loop

This mainly relies on the tf.summary module. After creating a log folder, it can save images, scalars, text, model distributions (histograms), and audio; the first is the most important.

The workflow: create the log folder --> turn on tracing --> write summaries --> export the trace
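The four steps above can be sketched in isolation before looking at the full training loop. The folder name and the traced function below are hypothetical stand-ins chosen only to make the snippet self-contained:

```python
import tensorflow as tf

logdir = "logs/custom_demo"                       # hypothetical folder name
writer = tf.summary.create_file_writer(logdir)    # step 1: create the folder/writer
tf.summary.trace_on(graph=True, profiler=False)   # step 2: turn on tracing

@tf.function
def square(x):          # any traced computation stands in for the model here
    return x * x

_ = square(tf.constant(2.0))   # run once so there is a graph to record

with writer.as_default():      # step 3: write summaries with this writer
    tf.summary.scalar("demo_loss", 0.5, step=0)
    # step 4: export the collected trace
    tf.summary.trace_export(name="demo_trace", step=0)
writer.flush()
```

Running this produces an event file under logs/custom_demo that TensorBoard can display, including the traced graph of the function.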
```python
from __future__ import absolute_import, division, print_function, unicode_literals
import tensorflow as tf
from tensorflow.keras.layers import Dense, Flatten, Conv2D
from tensorflow.keras import Model
import numpy as np
import datetime
import os

mnist = np.load("mnist.npz")
x_train, y_train, x_test, y_test = mnist['x_train'], mnist['y_train'], mnist['x_test'], mnist['y_test']
x_train, x_test = x_train / 255.0, x_test / 255.0
# Add a channels dimension
x_train = x_train[..., tf.newaxis]
x_test = x_test[..., tf.newaxis]

train_ds = tf.data.Dataset.from_tensor_slices(
    (x_train, y_train)).shuffle(10000).batch(32)
test_ds = tf.data.Dataset.from_tensor_slices((x_test, y_test)).batch(32)

class MyModel(Model):
    def __init__(self, **kwargs):
        super(MyModel, self).__init__(**kwargs)
        self.conv1 = Conv2D(32, 3, activation='relu')
        self.flatten = Flatten()
        self.d1 = Dense(128, activation='relu')
        self.d2 = Dense(10, activation='softmax')

    @tf.function
    def call(self, x):
        x = self.conv1(x)
        x = self.flatten(x)
        x = self.d1(x)
        return self.d2(x)

loss_object = tf.keras.losses.SparseCategoricalCrossentropy()
optimizer = tf.keras.optimizers.Adam()

train_loss = tf.keras.metrics.Mean(name='train_loss')
train_accuracy = tf.keras.metrics.SparseCategoricalAccuracy(name='train_accuracy')
test_loss = tf.keras.metrics.Mean(name='test_loss')
test_accuracy = tf.keras.metrics.SparseCategoricalAccuracy(name='test_accuracy')

# @tf.function
def train_step(images, labels):
    with tf.GradientTape() as tape:
        predictions = model(images)
        loss = loss_object(labels, predictions)
    gradients = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))
    train_loss(loss)
    train_accuracy(labels, predictions)

# @tf.function
def test_step(images, labels):
    predictions = model(images)
    t_loss = loss_object(labels, predictions)
    test_loss(t_loss)
    test_accuracy(labels, predictions)

model = MyModel()

stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
logdir = os.path.join("logs", stamp)
# Create the log folder/writer
summary_writer = tf.summary.create_file_writer(logdir)
# Turn on tracing, so the graph structure and profile information can be recorded
tf.summary.trace_on(graph=True, profiler=False)

EPOCHS = 5
for epoch in range(EPOCHS):
    for (x_batch, y_batch) in train_ds:
        train_step(x_batch, y_batch)
    with summary_writer.as_default():  # the writer to use
        tf.summary.scalar('train_loss', train_loss.result(), step=epoch)
        tf.summary.scalar('train_accuracy', train_accuracy.result(), step=epoch)  # other custom variables can be added too

    # for (x_batch, y_batch) in test_ds:
    #     test_step(x_batch, y_batch)
    # # Trace the test set
    # with summary_writer.as_default():
    #     tf.summary.scalar('test_loss', test_loss.result(), step=epoch)
    #     tf.summary.scalar('test_accuracy', test_accuracy.result(), step=epoch)

    template = 'Epoch {}, Loss: {}, Accuracy: {}, Test Loss: {}, Test Accuracy: {}'
    print(template.format(epoch + 1,
                          train_loss.result(),
                          train_accuracy.result() * 100,
                          test_loss.result(),
                          test_accuracy.result() * 100))

    # Reset the metrics every epoch
    train_loss.reset_states()
    test_loss.reset_states()
    train_accuracy.reset_states()
    test_accuracy.reset_states()

# Export the trace information to file
with summary_writer.as_default():
    tf.summary.trace_export(name="model_trace", step=5, profiler_outdir=None)
```
Because a static graph is defined inside the model (call is decorated with @tf.function), only the training trace can be saved; the test trace cannot.

If you need to save everything, do not define the forward propagation as a static graph; define the training step as a static graph instead, and comment out the tracing at the test step.

In this way, the graphs of the training set and the test set are kept separate.
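A related, common pattern for keeping train and test results separate in TensorBoard, not the author's exact code but a minimal sketch, is to use two file writers pointed at sibling folders, so the two curves appear as separate runs. The folder names and placeholder loss values below are illustrative assumptions:

```python
import tensorflow as tf

# Two writers, one per subfolder; TensorBoard shows them as separate runs.
train_writer = tf.summary.create_file_writer("logs/run1/train")
test_writer = tf.summary.create_file_writer("logs/run1/test")

for epoch in range(2):                 # shortened loop for illustration
    train_loss = 1.0 / (epoch + 1)     # placeholder values standing in for
    test_loss = 1.2 / (epoch + 1)      # the real metric results
    with train_writer.as_default():
        tf.summary.scalar("loss", train_loss, step=epoch)
    with test_writer.as_default():
        tf.summary.scalar("loss", test_loss, step=epoch)

train_writer.flush()
test_writer.flush()
```

Launching `tensorboard --logdir logs/run1` then overlays the train and test "loss" curves, each labelled by its subfolder.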