Video Frame-Level CU Rapid Partitioning Algorithm Based on DenseNet FPN Network
DOI:
Author:
Affiliation:

1. Shenyang Ligong University; 2. Shenyang Institute of Automation, Chinese Academy of Sciences

Author biography:

Corresponding author:

CLC number: TN919.8

Fund project:




    Abstract:

    To address the high computational complexity of CU (coding unit) partitioning in H.266/VVC (Versatile Video Coding) intra-frame coding, this paper proposes a fast CU partitioning algorithm based on a DenseNet FPN (feature pyramid network). The algorithm substantially reduces VVC encoding complexity and thus encoding time. First, a CU classification algorithm based on texture complexity is proposed to evaluate the texture complexity of CU blocks. Second, a network model based on DenseNet FPN is introduced, which exploits multi-scale information to optimize CU partitioning and adapt to encoding requirements at different scales. Finally, a novel adaptive quality-complexity balanced loss function is designed to trade off coding quality against computational complexity. Extensive experiments show that, compared with VTM 10.0 (VVC test model), the proposed algorithm reduces average intra-frame encoding time by 44.268%, while the BDBR (Bjøntegaard delta bit rate) increases by only 0.94%.
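    Two of the components the abstract describes — texture-complexity-based CU classification and a weighted quality-complexity loss — can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's actual implementation: the gradient-based complexity measure, the threshold value, the function names, and the fixed weighting factor `lam` are all hypothetical choices standing in for details the abstract does not give.

    ```python
    # Hypothetical sketch of texture-complexity CU classification and a
    # quality-complexity balanced loss; names and thresholds are illustrative.
    import numpy as np

    def texture_complexity(cu: np.ndarray) -> float:
        """Mean gradient magnitude of a luma CU block (a simple texture proxy)."""
        gy, gx = np.gradient(cu.astype(np.float64))
        return float(np.mean(np.sqrt(gx ** 2 + gy ** 2)))

    def classify_cu(cu: np.ndarray, threshold: float = 8.0) -> str:
        """Label a CU 'smooth' (candidate to skip split checks) or 'complex'."""
        return "complex" if texture_complexity(cu) > threshold else "smooth"

    def quality_complexity_loss(rd_cost: float, time_cost: float,
                                lam: float = 0.5) -> float:
        """Weighted sum balancing coding quality (rate-distortion term) against
        computational complexity (encoding-time term); in the paper the weight
        is adapted during training rather than fixed as here."""
        return lam * rd_cost + (1.0 - lam) * time_cost

    flat = np.full((16, 16), 128.0)  # uniform block -> zero gradient -> smooth
    noisy = np.random.default_rng(0).integers(0, 256, (16, 16)).astype(float)
    print(classify_cu(flat), classify_cu(noisy))  # prints: smooth complex
    ```

    In a fast-partitioning pipeline of this kind, blocks classified as smooth would terminate the quad-tree/multi-type-tree split search early, which is where the encoding-time saving comes from.
    
    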

History
  • Received: 2023-06-15
  • Revised: 2023-06-15
  • Accepted: 2023-09-12
  • Published online:
  • Publication date: