Pytorch save and load model
2022-07-23 09:40:00 【Mick..】
Model saving and loading
1 Save and load only the model parameters

torch.save(model.state_dict(), PATH)     # save the model's parameters to PATH; the file is usually suffixed .pt or .pth
model = Model(*args, **kwargs)           # the model class must be instantiated first
model.load_state_dict(torch.load(PATH))  # load the saved parameters into it
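A minimal end-to-end sketch of this first approach (the `nn.Linear` model and the file name `params.pt` are placeholders, not from the original post):

```python
import torch
import torch.nn as nn

PATH = "params.pt"  # illustrative file name

model = nn.Linear(2, 1)               # any nn.Module works here
torch.save(model.state_dict(), PATH)  # writes only the parameter tensors

restored = nn.Linear(2, 1)            # the same architecture must be built first
restored.load_state_dict(torch.load(PATH))
restored.eval()                       # switch to evaluation mode before inference
```

Because only the parameters are stored, the file stays small and portable, but the model class has to be re-instantiated before loading.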
2 Save and load the entire model

torch.save(model, PATH)
model = torch.load(PATH)

This method saves the whole model object directly, so there is no need to redefine the model when loading.
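A sketch of this second approach (the `nn.Sequential` model and the file name are placeholders). One caveat worth knowing: the whole module is stored via pickle, so the class definition must still be importable at load time, and recent PyTorch versions require `weights_only=False` to unpickle full modules:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(1, 3), nn.ReLU(), nn.Linear(3, 1))
torch.save(model, "model.pt")  # pickles the entire module, not just its parameters

# weights_only=False is needed on recent PyTorch versions, where torch.load
# defaults to loading tensors only and refuses pickled modules
restored = torch.load("model.pt", weights_only=False)
restored.eval()
```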
Defining the network structure
The simplest possible network is defined here: two fully connected (linear) layers.
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.layer1 = nn.Linear(1, 3)  # first linear (fully connected) layer: 1 -> 3
        self.layer2 = nn.Linear(3, 1)  # second linear layer: 3 -> 1

    def forward(self, x):
        x = self.layer1(x)
        x = torch.relu(x)  # ReLU activation
        x = self.layer2(x)
        return x
Training the neural network
import torch
import torch.nn as nn
import torch.optim as optim
import numpy as np
import matplotlib.pyplot as plt

num_epochs = 2000
learning_rate = 0.01  # the learning rate is set to 0.01

model = Net()  # create the model
# use the Adam optimizer to update the network weights; lr is the learning rate
optimizer = optim.Adam(params=model.parameters(), lr=learning_rate, weight_decay=1e-5)
criterion = nn.MSELoss()  # define the loss function (mean squared error)

for epoch in range(num_epochs):  # train for num_epochs epochs
    model.train()  # put the model in training mode
    for sample, target in dataset:  # dataset is assumed to yield (input, target) pairs
        optimizer.zero_grad()  # clear the accumulated gradients before this step
        output = model(sample)
        loss = criterion(output, target)  # compute the loss
        loss.backward()  # backpropagate to compute the gradients
        optimizer.step()  # update the network weights
    if (epoch + 1) % 10 == 0:  # print the current status every ten epochs
        print("Epoch {} / {}, loss {:.4f}".format(epoch + 1, num_epochs, loss.item()))
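The loop above leaves the data undefined; a self-contained sketch with synthetic data for the regression y = 2x + 1 (the data, model shape, and epoch count here are illustrative, not from the original):

```python
import torch
import torch.nn as nn
import torch.optim as optim

torch.manual_seed(0)

# synthetic regression data: y = 2x + 1 plus a little noise
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 2 * x + 1 + 0.05 * torch.randn_like(x)

model = nn.Sequential(nn.Linear(1, 3), nn.ReLU(), nn.Linear(3, 1))
optimizer = optim.Adam(model.parameters(), lr=0.01, weight_decay=1e-5)
criterion = nn.MSELoss()

for epoch in range(200):
    model.train()
    optimizer.zero_grad()          # clear old gradients
    loss = criterion(model(x), y)  # full-batch forward pass and loss
    loss.backward()                # compute gradients
    optimizer.step()               # update the weights
    if (epoch + 1) % 50 == 0:
        print("Epoch {} / 200, loss {:.4f}".format(epoch + 1, loss.item()))
```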
Saving the model in PyTorch
import os

# torch.save() can serialize dictionary-typed data, so a checkpoint can bundle
# extra state (here: the loss) together with the model weights.
def save_checkpoint(state, directory):  # state: the model's weights and state; directory: where to save it
    if not os.path.exists(directory):
        os.makedirs(directory)
    file_name = os.path.join(directory, 'last.pth')
    torch.save(state, file_name)  # save the trained state directly with torch.save

save_checkpoint({'loss': loss.item(), 'state_dict': model.state_dict()}, 'checkpoints/')
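Loading such a checkpoint back is symmetric: `torch.load` returns the dictionary, and `load_state_dict` restores the weights. A sketch (the model and the saved loss value are illustrative):

```python
import torch
import torch.nn as nn

model = nn.Linear(1, 1)  # stand-in for the trained model

# save a checkpoint dictionary, then restore it
torch.save({'loss': 0.123, 'state_dict': model.state_dict()}, 'last.pth')

checkpoint = torch.load('last.pth')
model.load_state_dict(checkpoint['state_dict'])  # restore the weights
print(checkpoint['loss'])  # extra entries (here: the recorded loss) come back too
```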