Image Super-Resolution Reconstruction Based on a Parallel Residual Convolution Network
DOI:
Authors: 杨伟铭, 张钰
Affiliation:
Corresponding author:
CLC number: TP311
Fund project: National Natural Science Foundation of China (41671409)
Image SuperResolution in Combination with Convolution Neural Network

Abstract:

Aimed at the problems that the convolution kernels of the VDSR model are of a single type and that the DRRN model fails to exploit features globally, a combined-convolution image super-resolution reconstruction model based on a parallel residual convolution neural network is proposed. The model first fuses an ordinary convolution layer with a dilated convolution layer to build a combined convolution layer, then uses skip connections to fuse features from multiple abstraction levels, and finally assembles the complete super-resolution network. The proposed model has two advantages: ① fusing dilated convolution layers with ordinary convolution layers captures features at more scales without increasing computational complexity, giving the network stronger representational capacity; ② the skip connections merge lower-level and higher-level information, so the model can draw on more information and learn more effectively. Experiments on multiple data sets show that, in most tasks, the model improves the IFC value by more than 0.1 compared with state-of-the-art models such as VDSR, DRRN and SRCNN.
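The combined convolution layer described in the abstract (an ordinary convolution fused with a parallel dilated convolution) and the multi-level skip connections can be sketched in a few lines of PyTorch. The sketch below is illustrative only: the channel width, the number of blocks, the fusion by concatenation plus a 1×1 convolution, the single-channel (luminance) input, and the global residual connection are assumptions made for the example, not the authors' exact configuration.

import torch
import torch.nn as nn

class CombinedConv(nn.Module):
    """Fuse an ordinary 3x3 convolution with a parallel dilated 3x3 convolution
    (dilation=2) so the block covers two receptive-field scales at once."""
    def __init__(self, channels):
        super().__init__()
        self.standard = nn.Conv2d(channels, channels, 3, padding=1)             # ordinary conv branch
        self.dilated = nn.Conv2d(channels, channels, 3, padding=2, dilation=2)  # dilated conv branch
        self.fuse = nn.Conv2d(2 * channels, channels, 1)                        # 1x1 conv merges both branches
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        multi_scale = torch.cat([self.standard(x), self.dilated(x)], dim=1)
        return self.relu(self.fuse(multi_scale))

class ParallelResidualSR(nn.Module):
    """Stack combined-convolution blocks, fuse the features of every abstraction
    level through skip connections, and predict the residual between the
    interpolated input and the high-resolution image (as in VDSR)."""
    def __init__(self, channels=64, num_blocks=8):
        super().__init__()
        self.head = nn.Conv2d(1, channels, 3, padding=1)    # single-channel (luminance) input assumed
        self.blocks = nn.ModuleList(CombinedConv(channels) for _ in range(num_blocks))
        self.tail = nn.Conv2d(channels * (num_blocks + 1), 1, 3, padding=1)

    def forward(self, x):
        feat = self.head(x)
        levels = [feat]                  # lowest-level features
        for block in self.blocks:
            feat = block(feat)
            levels.append(feat)          # skip connection from each abstraction level
        residual = self.tail(torch.cat(levels, dim=1))
        return x + residual              # global residual learning

# Usage: reconstruct a bicubic-upsampled low-resolution patch.
if __name__ == "__main__":
    model = ParallelResidualSR()
    lr_upsampled = torch.randn(1, 1, 48, 48)   # bicubic-interpolated LR input
    print(model(lr_upsampled).shape)           # torch.Size([1, 1, 48, 48])

Concatenating the two branches and reducing with a 1×1 convolution is one common way to fuse parallel convolutions at constant depth; element-wise summation would be an equally valid design choice under the same assumptions.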

Cite this article

杨伟铭, 张钰. Image Super-Resolution Reconstruction Based on a Parallel Residual Convolution Network [J]. 空军工程大学学报, 2019, 20(4): 84-89.

History
  • Online publication date: 2019-10-23