Parallel computation of compressible turbulence using multi-GPU clusters

Author:
Affiliation:

Author biography:

Corresponding author:

CLC number:

Fund project: National Natural Science Foundation of China (91016010, 91216117)


Abstract:

A computational fluid dynamics solver for compressible turbulence on the GPU (Graphics Processing Unit) was developed with CUDA Fortran. The solver is based on a structured-grid finite volume method, using the AUSMPW+ scheme for spatial discretization, the k-ω SST two-equation model for turbulence, and MPI for parallel computation. For the latest GPU architecture, optimization strategies for the flux computation were discussed, together with a multi-GPU parallel algorithm that overlaps GPU computation with PCIe data transfer and MPI communication. Test cases including a supersonic inlet and a space shuttle were simulated to demonstrate the acceleration performance of the GPU on large-scale grids. The results show that a single NVIDIA GTX Titan Black GPU achieves a speedup of 107 to 125 relative to a single core of an Intel Xeon E5-2670 CPU. Fast computation of a complex configuration with 134 million grid cells was achieved with four GPUs at a parallel efficiency of 91.6%.
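
The overlap of GPU computation with PCIe data transfer and MPI communication mentioned in the abstract is not detailed on this page. The CUDA Fortran fragment below is a minimal sketch, under stated assumptions, of one common way such an overlap is organized: a 1D domain decomposition and two CUDA streams, so that an interior kernel runs while a boundary plane is downloaded, exchanged over MPI, and uploaded again. All names (scale_interior, sendbuf, and so on), the block size, and the decomposition are illustrative and are not taken from the paper.

    ! Hypothetical sketch only: names, sizes and the 1D decomposition are illustrative.
    module interior_kernel_m
      use cudafor
      implicit none
    contains
      ! Stand-in for the interior flux kernel: it touches only planes 2..nk-1,
      ! so it can run while the k = 1 and k = nk boundary planes are exchanged.
      attributes(global) subroutine scale_interior(q, ni, nj, nk)
        integer, value :: ni, nj, nk
        real :: q(ni, nj, nk)
        integer :: i, j, k
        i = (blockIdx%x - 1)*blockDim%x + threadIdx%x
        j = (blockIdx%y - 1)*blockDim%y + threadIdx%y
        k = blockIdx%z + 1
        if (i <= ni .and. j <= nj .and. k <= nk-1) q(i,j,k) = 0.99*q(i,j,k)
      end subroutine scale_interior
    end module interior_kernel_m

    program overlap_sketch
      use cudafor
      use mpi
      use interior_kernel_m
      implicit none
      integer, parameter :: ni = 256, nj = 256, nk = 256        ! local block size
      real, device, allocatable :: q_d(:,:,:)                   ! flow variables on the GPU
      real, pinned, allocatable :: sendbuf(:,:), recvbuf(:,:)   ! pinned host buffers, one k-plane each
      integer(kind=cuda_stream_kind) :: s_int, s_bnd
      type(dim3) :: grid, tblock
      integer :: rank, nprocs, left, right, ndev, ierr, istat

      call MPI_Init(ierr)
      call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
      call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)
      istat = cudaGetDeviceCount(ndev)
      istat = cudaSetDevice(mod(rank, ndev))                    ! one GPU per MPI rank
      left  = rank - 1; if (left  <  0)      left  = MPI_PROC_NULL
      right = rank + 1; if (right >= nprocs) right = MPI_PROC_NULL

      allocate(q_d(ni,nj,nk), sendbuf(ni,nj), recvbuf(ni,nj))
      q_d = 1.0
      istat = cudaStreamCreate(s_int)
      istat = cudaStreamCreate(s_bnd)
      tblock = dim3(16, 16, 1)
      grid   = dim3(ni/16, nj/16, nk-2)

      ! 1. Download the outgoing boundary plane on the boundary stream.
      istat = cudaMemcpyAsync(sendbuf, q_d(1,1,nk), ni*nj, s_bnd)

      ! 2. Launch the interior work on the other stream; it overlaps with
      !    the copy above and with the MPI exchange below.
      call scale_interior<<<grid, tblock, 0, s_int>>>(q_d, ni, nj, nk)

      ! 3. Wait only for the boundary copy, then exchange planes over MPI
      !    (one direction of the halo exchange is shown).
      istat = cudaStreamSynchronize(s_bnd)
      call MPI_Sendrecv(sendbuf, ni*nj, MPI_REAL, right, 0, &
                        recvbuf, ni*nj, MPI_REAL, left,  0, &
                        MPI_COMM_WORLD, MPI_STATUS_IGNORE, ierr)

      ! 4. Upload the received plane; the interior kernel may still be running.
      if (left /= MPI_PROC_NULL) istat = cudaMemcpyAsync(q_d(1,1,1), recvbuf, ni*nj, s_bnd)

      istat = cudaDeviceSynchronize()
      if (rank == 0) print *, 'overlap sketch finished'
      call MPI_Finalize(ierr)
    end program overlap_sketch

Pinned host buffers are needed for cudaMemcpyAsync to actually overlap with kernel execution; with a CUDA-aware MPI library the staging through host memory could be skipped. For reference, parallel efficiency is commonly defined as speedup divided by the number of GPUs, so the quoted 91.6% on four GPUs corresponds to a speedup of about 4 × 0.916 ≈ 3.66 over a single GPU.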

Cite this article:

CAO Wenbin, LI Hua, XIE Wenjia, et al. Parallel computation of compressible turbulence using multi-GPU clusters[J]. Journal of National University of Defense Technology, 2015, 37(3): 78-83.

History
  • Received: 2014-10-07
  • Revised:
  • Accepted:
  • Published online: 2015-06-30
  • Publication date: