Decoupled prediction-differential neural networks for solving high-order differential equations

Affiliation: College of Aerospace Science and Engineering, National University of Defense Technology

CLC number: O241.8

Fund: National Natural Science Foundation of China (Grant No. 12302197)



    Abstract:

    Physics-informed operator learning methods such as PI-DeepONet can greatly accelerate the solution of partial differential equations (PDEs), but the cost of nested automatic differentiation makes them expensive to train on high-order problems. To address this, a prediction-differential decoupled neural network architecture, UNet-RBF, is proposed. It employs a U-Net as the prediction network to extract spatial features of the PDE parameters, and a lightweight radial basis function (RBF) network as the differential network to impose the physical constraints. By freezing the RBF network's parameters during training, the prediction task is decoupled from the differential computation, which sharply reduces the overhead of automatic differentiation. Numerical experiments show that UNet-RBF substantially improves training efficiency and stability on high-order PDEs while maintaining high prediction accuracy (relative error below 1%). Compared with the conventional PI-DeepONet, training efficiency on a fourth-order problem increases by more than 1500%, and the model is more robust, offering an effective route to the fast and accurate solution of complex physical problems.
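
    A note on the mechanism: below is a minimal one-dimensional sketch of the decoupling idea in PyTorch, assuming Gaussian RBFs and a fourth-order residual u'''' = f. It is one plausible reading of the abstract, not the paper's actual implementation; the names (gaussian_rbf_matrices, FrozenRBFHead, the toy predictor standing in for the U-Net branch, the forcing term f) are all hypothetical. Because the RBF centers and shape parameter are frozen, the basis derivatives are analytic, so the fourth derivative reduces to a fixed matrix product that never enters automatic differentiation; gradients flow only into the prediction network.

        import torch

        def gaussian_rbf_matrices(x, centers, eps):
            """Gaussian basis phi_j(x) = exp(-eps^2 (x - c_j)^2) on frozen centers.
            Returns phi and its analytic 2nd and 4th x-derivatives at points x."""
            d = x[:, None] - centers[None, :]        # (N, M) pairwise offsets
            a = eps ** 2
            phi = torch.exp(-a * d ** 2)
            d2 = (4 * a**2 * d**2 - 2 * a) * phi
            d4 = (16 * a**4 * d**4 - 48 * a**3 * d**2 + 12 * a**2) * phi
            return phi, d2, d4

        class FrozenRBFHead(torch.nn.Module):
            """Differential network: frozen, maps predicted weights w to u and u''''."""
            def __init__(self, x, centers, eps):
                super().__init__()
                phi, _, d4 = gaussian_rbf_matrices(x, centers, eps)
                self.register_buffer("phi", phi)     # buffers receive no gradients
                self.register_buffer("d4", d4)

            def forward(self, w):                    # w: (batch, M) RBF weights
                u = w @ self.phi.T                   # u = Phi w, shape (batch, N)
                u_xxxx = w @ self.d4.T               # 4th derivative as matrix product
                return u, u_xxxx

        N, M = 64, 32
        x = torch.linspace(0.0, 1.0, N)
        centers = torch.linspace(0.0, 1.0, M)
        head = FrozenRBFHead(x, centers, eps=8.0)

        # Toy stand-in for the U-Net prediction branch: any net emitting M weights.
        predictor = torch.nn.Sequential(
            torch.nn.Linear(N, 64), torch.nn.Tanh(), torch.nn.Linear(64, M))

        f = torch.sin(torch.pi * x).expand(1, -1)    # hypothetical forcing term f(x)
        w = predictor(f)                             # predict RBF weights from f
        u, u_xxxx = head(w)
        loss = torch.mean((u_xxxx - f) ** 2)         # residual of the PDE u'''' = f
        loss.backward()                              # gradients reach the predictor only

    Under this reading, evaluating a fourth-order residual costs no more than evaluating the solution itself at training time, which is consistent with the large speed-up the abstract reports over autodiff-based PI-DeepONet.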

History
  • Received: 2025-03-18
  • Revised: 2025-06-15
  • Accepted: 2025-06-16