Analysis of gaze360 Code Run Results

Updated: 2024-10-09 07:23:57



Computing and visualizing the FLOPs of the GazeLSTM model

from ptflops import get_model_complexity_info
from model import GazeLSTM  # in the gaze360 repo, GazeLSTM is defined in model.py

model = GazeLSTM()
# Input shape (7, 3, 224, 224): a sequence of 7 RGB head crops of size 224x224
macs, params = get_model_complexity_info(model, (7, 3, 224, 224),
                                         as_strings=True,
                                         print_per_layer_stat=True)
print('macs: ', macs, 'params: ', params)

Adding these few lines to the model script yields the model's computational cost (MACs) and parameter count.

The output is as follows:

GazeLSTM(14.58 M, 100.000% Params, 12.78 GMac, 100.000% MACs, (base_model): ResNet(11.95 M, 81.950% Params, 12.76 GMac, 99.855% MACs, (conv1): Conv2d(9.41 k, 0.065% Params, 826.1 MMac, 6.466% MACs, 3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False)(bn1): BatchNorm2d(128, 0.001% Params, 11.24 MMac, 0.088% MACs, 64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)(relu): ReLU(0, 0.000% Params, 5.62 MMac, 0.044% MACs, inplace=True)(maxpool): MaxPool2d(0, 0.000% Params, 5.62 MMac, 0.044% MACs, kernel_size=3, stride=2, padding=1, dilation=1, ceil_mode=False)(layer1): Sequential(147.97 k, 1.015% Params, 3.25 GMac, 25.469% MACs, (0): BasicBlock(73.98 k, 0.508% Params, 1.63 GMac, 12.735% MACs, (conv1): Conv2d(36.86 k, 0.253% Params, 809.24 MMac, 6.334% MACs, 64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)(bn1): BatchNorm2d(128, 0.001% Params, 2.81 MMac, 0.022% MACs, 64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)(relu): ReLU(0, 0.000% Params, 2.81 MMac, 0.022% MACs, inplace=True)(conv2): Conv2d(36.86 k, 0.253% Params, 809.24 MMac, 6.334% MACs, 64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)(bn2): BatchNorm2d(128, 0.001% Params, 2.81 MMac, 0.022% MACs, 64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True))(1): BasicBlock(73.98 k, 0.508% Params, 1.63 GMac, 12.735% MACs, (conv1): Conv2d(36.86 k, 0.253% Params, 809.24 MMac, 6.334% MACs, 64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)(bn1): BatchNorm2d(128, 0.001% Params, 2.81 MMac, 0.022% MACs, 64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)(relu): ReLU(0, 0.000% Params, 2.81 MMac, 0.022% MACs, inplace=True)(conv2): Conv2d(36.86 k, 0.253% Params, 809.24 MMac, 6.334% MACs, 64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)(bn2): BatchNorm2d(128, 0.001% Params, 2.81 MMac, 0.022% MACs, 64, eps=1e-05, momentum=0.1, affine=True, 
track_running_stats=True)))(layer2): Sequential(525.57 k, 3.605% Params, 2.89 GMac, 22.599% MACs, (0): BasicBlock(230.14 k, 1.579% Params, 1.26 GMac, 9.897% MACs, (conv1): Conv2d(73.73 k, 0.506% Params, 404.62 MMac, 3.167% MACs, 64, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)(bn1): BatchNorm2d(256, 0.002% Params, 1.4 MMac, 0.011% MACs, 128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)(relu): ReLU(0, 0.000% Params, 1.4 MMac, 0.011% MACs, inplace=True)(conv2): Conv2d(147.46 k, 1.012% Params, 809.24 MMac, 6.334% MACs, 128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)(bn2): BatchNorm2d(256, 0.002% Params, 1.4 MMac, 0.011% MACs, 128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)(downsample): Sequential(8.45 k, 0.058% Params, 46.36 MMac, 0.363% MACs, (0): Conv2d(8.19 k, 0.056% Params, 44.96 MMac, 0.352% MACs, 64, 128, kernel_size=(1, 1), stride=(2, 2), bias=False)(1): BatchNorm2d(256, 0.002% Params, 1.4 MMac, 0.011% MACs, 128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)))(1): BasicBlock(295.42 k, 2.027% Params, 1.62 GMac, 12.702% MACs, (conv1): Conv2d(147.46 k, 1.012% Params, 809.24 MMac, 6.334% MACs, 128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)(bn1): BatchNorm2d(256, 0.002% Params, 1.4 MMac, 0.011% MACs, 128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)(relu): ReLU(0, 0.000% Params, 1.4 MMac, 0.011% MACs, inplace=True)(conv2): Conv2d(147.46 k, 1.012% Params, 809.24 MMac, 6.334% MACs, 128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)(bn2): BatchNorm2d(256, 0.002% Params, 1.4 MMac, 0.011% MACs, 128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)))(layer3): Sequential(2.1 M, 14.404% Params, 2.88 GMac, 22.560% MACs, (0): BasicBlock(919.04 k, 6.305% Params, 1.26 GMac, 9.875% MACs, (conv1): Conv2d(294.91 k, 2.023% Params, 404.62 MMac, 3.167% MACs, 128, 256, kernel_size=(3, 3), stride=(2, 
2), padding=(1, 1), bias=False)(bn1): BatchNorm2d(512, 0.004% Params, 702.46 KMac, 0.005% MACs, 256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)(relu): ReLU(0, 0.000% Params, 702.46 KMac, 0.005% MACs, inplace=True)(conv2): Conv2d(589.82 k, 4.046% Params, 809.24 MMac, 6.334% MACs, 256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)(bn2): BatchNorm2d(512, 0.004% Params, 702.46 KMac, 0.005% MACs, 256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)(downsample): Sequential(33.28 k, 0.228% Params, 45.66 MMac, 0.357% MACs, (0): Conv2d(32.77 k, 0.225% Params, 44.96 MMac, 0.352% MACs, 128, 256, kernel_size=(1, 1), stride=(2, 2), bias=False)(1): BatchNorm2d(512, 0.004% Params, 702.46 KMac, 0.005% MACs, 256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)))(1): BasicBlock(1.18 M, 8.100% Params, 1.62 GMac, 12.685% MACs, (conv1): Conv2d(589.82 k, 4.046% Params, 809.24 MMac, 6.334% MACs, 256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)(bn1): BatchNorm2d(512, 0.004% Params, 702.46 KMac, 0.005% MACs, 256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)(relu): ReLU(0, 0.000% Params, 702.46 KMac, 0.005% MACs, inplace=True)(conv2): Conv2d(589.82 k, 4.046% Params, 809.24 MMac, 6.334% MACs, 256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)(bn2): BatchNorm2d(512, 0.004% Params, 702.46 KMac, 0.005% MACs, 256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)))(layer4): Sequential(8.39 M, 57.582% Params, 2.88 GMac, 22.541% MACs, (0): BasicBlock(3.67 M, 25.198% Params, 1.26 GMac, 9.864% MACs, (conv1): Conv2d(1.18 M, 8.093% Params, 404.62 MMac, 3.167% MACs, 256, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)(bn1): BatchNorm2d(1.02 k, 0.007% Params, 351.23 KMac, 0.003% MACs, 512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)(relu): ReLU(0, 0.000% Params, 351.23 KMac, 0.003% MACs, inplace=True)(conv2): 
Conv2d(2.36 M, 16.185% Params, 809.24 MMac, 6.334% MACs, 512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)(bn2): BatchNorm2d(1.02 k, 0.007% Params, 351.23 KMac, 0.003% MACs, 512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)(downsample): Sequential(132.1 k, 0.906% Params, 45.31 MMac, 0.355% MACs, (0): Conv2d(131.07 k, 0.899% Params, 44.96 MMac, 0.352% MACs, 256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)(1): BatchNorm2d(1.02 k, 0.007% Params, 351.23 KMac, 0.003% MACs, 512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)))(1): BasicBlock(4.72 M, 32.384% Params, 1.62 GMac, 12.677% MACs, (conv1): Conv2d(2.36 M, 16.185% Params, 809.24 MMac, 6.334% MACs, 512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)(bn1): BatchNorm2d(1.02 k, 0.007% Params, 351.23 KMac, 0.003% MACs, 512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)(relu): ReLU(0, 0.000% Params, 351.23 KMac, 0.003% MACs, inplace=True)(conv2): Conv2d(2.36 M, 16.185% Params, 809.24 MMac, 6.334% MACs, 512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)(bn2): BatchNorm2d(1.02 k, 0.007% Params, 351.23 KMac, 0.003% MACs, 512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)))(avgpool): AdaptiveAvgPool2d(0, 0.000% Params, 175.62 KMac, 0.001% MACs, output_size=(1, 1))(fc1): Linear(513.0 k, 3.519% Params, 3.58 MMac, 0.028% MACs, in_features=512, out_features=1000, bias=True)(fc2): Linear(256.26 k, 1.758% Params, 1.79 MMac, 0.014% MACs, in_features=1000, out_features=256, bias=True))(lstm): LSTM(2.63 M, 18.040% Params, 18.48 MMac, 0.145% MACs, 256, 256, num_layers=2, batch_first=True, bidirectional=True)(last_layer): Linear(1.54 k, 0.011% Params, 1.54 KMac, 0.000% MACs, in_features=512, out_features=3, bias=True)
)
macs:  12.78 GMac params:  14.58 M
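As a sanity check, a few of the per-layer parameter counts in the printout can be reproduced by hand with the standard formulas. This is a pure-Python sketch, independent of the repo:

```python
# conv1: Conv2d(3 -> 64, kernel 7x7, no bias): in_ch * out_ch * kh * kw
conv1 = 3 * 64 * 7 * 7          # 9408, printed above as "9.41 k"

# fc1: Linear(512 -> 1000) with bias: in * out + out
fc1 = 512 * 1000 + 1000         # 513000, printed above as "513.0 k"

# 2-layer bidirectional LSTM, input size 256, hidden size 256.
# Per direction and layer: 4*h*(in + h) weights + 8*h biases (two bias vectors);
# from layer 2 on, the input is 2*h because both directions are concatenated.
h = 256
lstm = 2 * (4 * h * (256 + h) + 8 * h) \
     + 2 * (4 * h * (2 * h + h) + 8 * h)  # 2629632, printed above as "2.63 M"

print(conv1, fc1, lstm)
```

All three match the figures ptflops reports for the corresponding modules.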

From the output above we can clearly see each layer's parameter count (params) and MACs. Since one MAC is one multiply plus one add, FLOPs = 2 × MACs, so the total comes to FLOPs = 25.56 GFLOPs with params = 14.58 M.
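The conversion is simple arithmetic, and it also lets us check the per-frame cost of the backbone (a quick back-of-the-envelope check, not part of the repo):

```python
total_macs_g = 12.78                # GMac reported by ptflops
total_flops_g = 2 * total_macs_g    # 25.56 GFLOPs: 1 MAC = 1 multiply + 1 add

# The input is a 7-frame sequence, so the ResNet backbone's 12.76 GMac split
# over 7 frames gives the per-frame cost:
per_frame_g = 12.76 / 7             # ~1.82 GMac, consistent with ResNet-18 at 224x224
print(total_flops_g, round(per_frame_g, 2))
```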

Training results on the Gaze360 dataset

The mean angular error in the final training epoch is around 13°, which is essentially consistent with the results the authors report in the paper.
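For reference, the angular error between a predicted and a ground-truth 3D gaze vector is simply the angle between the two vectors. A minimal pure-Python sketch (not the repo's exact evaluation code, which operates on batched tensors) looks like:

```python
import math

def angular_error_deg(pred, gt):
    """Angle in degrees between two 3D gaze vectors."""
    dot = sum(p * g for p, g in zip(pred, gt))
    norm = math.sqrt(sum(p * p for p in pred)) * math.sqrt(sum(g * g for g in gt))
    # Clamp the cosine for numerical safety before acos
    cos = max(-1.0, min(1.0, dot / norm))
    return math.degrees(math.acos(cos))

print(angular_error_deg((0.0, 0.0, -1.0), (0.0, 0.0, -1.0)))  # identical gaze: 0 degrees
print(angular_error_deg((1.0, 0.0, 0.0), (0.0, 1.0, 0.0)))    # orthogonal: 90 degrees
```

The "around 13" figure above is this quantity averaged over the evaluation set.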

 

Published: 2024-02-14 07:23:59