A PyTorch Implementation of AlexNet

This post shares an example of implementing AlexNet in PyTorch. It is intended as a practical reference; follow along below.

Reference code: github/shanglianlm0525/PyTorch-Networks

import torch
import torch.nn as nn
import torchvision


class AlexNet(nn.Module):
    def __init__(self, num_classes=1000):
        super(AlexNet, self).__init__()
        # Convolutional feature extractor: five conv layers with ReLU,
        # interleaved with three max-pooling layers.
        self.feature_extraction = nn.Sequential(
            nn.Conv2d(in_channels=3, out_channels=96, kernel_size=11, stride=4, padding=2, bias=False),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2, padding=0),
            nn.Conv2d(in_channels=96, out_channels=192, kernel_size=5, stride=1, padding=2, bias=False),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2, padding=0),
            nn.Conv2d(in_channels=192, out_channels=384, kernel_size=3, stride=1, padding=1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=384, out_channels=256, kernel_size=3, stride=1, padding=1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=256, out_channels=256, kernel_size=3, stride=1, padding=1, bias=False),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2, padding=0),
        )
        # Fully connected classifier head with dropout.
        self.classifier = nn.Sequential(
            nn.Dropout(p=0.5),
            nn.Linear(in_features=256 * 6 * 6, out_features=4096),
            nn.ReLU(inplace=True),
            nn.Dropout(p=0.5),
            nn.Linear(in_features=4096, out_features=4096),
            nn.ReLU(inplace=True),
            nn.Linear(in_features=4096, out_features=num_classes),
        )

    def forward(self, x):
        x = self.feature_extraction(x)
        # Flatten the 256 x 6 x 6 feature maps before the classifier.
        x = x.view(x.size(0), 256 * 6 * 6)
        x = self.classifier(x)
        return x


if __name__ == '__main__':
    # model = torchvision.models.AlexNet()
    model = AlexNet()
    print(model)
    input = torch.randn(8, 3, 224, 224)
    out = model(input)
    print(out.shape)  # torch.Size([8, 1000])
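The hard-coded 256*6*6 in the classifier and in forward() follows from how the convolution and pooling layers shrink a 224x224 input. As a quick check, here is a minimal sketch of that arithmetic using the standard output-size formula floor((size + 2*padding - kernel)/stride) + 1; the conv_out helper is only for illustration and is not part of the example code above.

# Sketch: trace the spatial size of a 224x224 input through the
# feature_extraction stack to show where 256*6*6 comes from.
def conv_out(size, kernel, stride, padding):
    # Output-size formula shared by Conv2d and MaxPool2d.
    return (size + 2 * padding - kernel) // stride + 1

s = 224
s = conv_out(s, kernel=11, stride=4, padding=2)  # conv1 -> 55
s = conv_out(s, kernel=3, stride=2, padding=0)   # pool1 -> 27
s = conv_out(s, kernel=5, stride=1, padding=2)   # conv2 -> 27
s = conv_out(s, kernel=3, stride=2, padding=0)   # pool2 -> 13
s = conv_out(s, kernel=3, stride=1, padding=1)   # conv3 -> 13
s = conv_out(s, kernel=3, stride=1, padding=1)   # conv4 -> 13
s = conv_out(s, kernel=3, stride=1, padding=1)   # conv5 -> 13
s = conv_out(s, kernel=3, stride=2, padding=0)   # pool3 -> 6
print(s)  # 6

So a 224x224 image ends up as 256 feature maps of size 6x6, which is exactly the in_features of the first Linear layer in the classifier.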

That is the complete AlexNet example in PyTorch; hopefully it serves as a useful reference.

