Turbot-DL Beginner Tutorial - PyTorch Applications - the torch.autograd Package
Description:
- Introduces PyTorch's torch.autograd package, which records the operations performed on wrapped tensors and computes gradients automatically via backward() (a minimal example follows)
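- Before the full training script, here is a minimal sketch of what autograd does (the variable names here are illustrative, not part of the tutorial file): differentiating y = x^2 at x = 3 should give dy/dx = 6.
import torch
from torch.autograd import Variable

x = Variable(torch.Tensor([3.0]), requires_grad=True)
y = x.pow(2)        # y = x^2
y.backward()        # autograd computes dy/dx
print(x.grad)       # prints 6.0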
Environment:
- Python 3.5.2
Steps:
- Create a new file
$ vim torch_autograd.py
- With the following content
import torch
from torch.autograd import Variable

batch_n = 100        # samples per batch
hidden_layer = 100   # hidden-layer width
input_data = 1000    # input feature dimension
output_data = 10     # output dimension

# Inputs and targets never need gradients
x = Variable(torch.randn(batch_n, input_data), requires_grad=False)
y = Variable(torch.randn(batch_n, output_data), requires_grad=False)
# Weights do: autograd accumulates their gradients into .grad
w1 = Variable(torch.randn(input_data, hidden_layer), requires_grad=True)
w2 = Variable(torch.randn(hidden_layer, output_data), requires_grad=True)

epoch_n = 20
learning_rate = 1e-6

for epoch in range(epoch_n):
    # Forward pass: clamp(min=0) acts as a ReLU activation
    y_pred = x.mm(w1).clamp(min=0).mm(w2)
    loss = (y_pred - y).pow(2).sum()
    print("Epoch:{} , Loss:{:.4f}".format(epoch, loss.data))
    # Backward pass: populate w1.grad and w2.grad
    loss.backward()
    # Gradient-descent step on .data, outside the computation graph
    w1.data -= learning_rate * w1.grad.data
    w2.data -= learning_rate * w2.grad.data
    # Gradients accumulate by default, so reset them each epoch
    w1.grad.data.zero_()
    w2.grad.data.zero_()
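- Note the two zero_() calls at the end of the loop: gradients accumulate across backward() calls by default, so each epoch must clear them before the next update. A minimal demonstration of this behavior (illustrative, not part of the tutorial file):
import torch
from torch.autograd import Variable

w = Variable(torch.ones(1), requires_grad=True)
for _ in range(3):
    loss = (w * 2).sum()
    loss.backward()
    print(w.grad)   # 2, then 4, then 6: grads accumulate without zero_()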
- Run it
$ python3 torch_autograd.py
- The output is as follows
Epoch:0 , Loss:51193236.0000
Epoch:1 , Loss:118550784.0000
Epoch:2 , Loss:451814400.0000
Epoch:3 , Loss:715576704.0000
Epoch:4 , Loss:21757992.0000
Epoch:5 , Loss:11608872.0000
Epoch:6 , Loss:7414747.5000
Epoch:7 , Loss:5172238.5000
Epoch:8 , Loss:3814624.2500
Epoch:9 , Loss:2930500.2500
Epoch:10 , Loss:2325424.0000
Epoch:11 , Loss:1895581.7500
Epoch:12 , Loss:1581226.2500
Epoch:13 , Loss:1345434.7500
Epoch:14 , Loss:1164679.7500
Epoch:15 , Loss:1023319.6875
Epoch:16 , Loss:910640.1875
Epoch:17 , Loss:819365.0625
Epoch:18 , Loss:743999.4375
Epoch:19 , Loss:680776.3750
- As the output shows, optimization of the parameters is proceeding as expected: the loss spikes during the first few epochs, then decreases steadily from epoch 4 onward. (On newer PyTorch versions the same loop can be written without Variable; see the sketch below.)
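- Variable was merged into Tensor in PyTorch 0.4, so on a newer install (an assumption; the environment above predates it) the same training loop can be written directly on tensors, with torch.no_grad() guarding the weight updates. A minimal sketch:
import torch

batch_n, hidden_layer, input_data, output_data = 100, 100, 1000, 10
x = torch.randn(batch_n, input_data)
y = torch.randn(batch_n, output_data)
w1 = torch.randn(input_data, hidden_layer, requires_grad=True)
w2 = torch.randn(hidden_layer, output_data, requires_grad=True)

learning_rate = 1e-6
for epoch in range(20):
    y_pred = x.mm(w1).clamp(min=0).mm(w2)
    loss = (y_pred - y).pow(2).sum()
    print("Epoch:{} , Loss:{:.4f}".format(epoch, loss.item()))
    loss.backward()
    with torch.no_grad():   # updates must not be recorded in the graph
        w1 -= learning_rate * w1.grad
        w2 -= learning_rate * w2.grad
        w1.grad.zero_()
        w2.grad.zero_()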