Helper thread pre-fetching model based on learning gradients of control parameters
Abstract:

For applications with irregular memory access, when the memory access overhead far exceeds the computation overhead, the helper thread falls behind the main thread. To address this, an improved helper-thread prefetching model with added control parameters is proposed. Gradient descent, one of the most widely used machine learning algorithms, is adopted to determine the optimal values of the control parameters. The control parameters effectively regulate the amount of memory access work assigned to the helper thread, so that the helper thread finishes ahead of the main thread. Experimental results show a system speedup of 1.1 to 1.5 times.
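The abstract does not spell out the model's equations, so the following is only a minimal sketch of the tuning idea it describes: a single control parameter (here a hypothetical `fraction` of the memory-access tasks handed to the helper thread) is adjusted by gradient descent to minimize a cost that stands in for the helper thread's lag behind the main thread. The cost function `lag_cost`, its optimum, and all other names are illustrative assumptions, not the paper's actual model.

```c
/*
 * Sketch: gradient-descent tuning of one prefetch control parameter.
 * lag_cost() is a synthetic stand-in; in the paper's setting the cost
 * would be measured from the running program, not a closed form.
 */
#include <stdio.h>

/* Hypothetical cost: helper-thread lag as a function of the fraction of
 * memory-access tasks it is asked to prefetch. Too large a fraction makes
 * the helper thread fall behind; too small wastes the prefetch window.
 * A simple convex stand-in with an assumed sweet spot is used here. */
static double lag_cost(double fraction)
{
    const double target = 0.6;
    return (fraction - target) * (fraction - target);
}

/* Numerical gradient via central differences. */
static double lag_grad(double fraction)
{
    const double h = 1e-4;
    return (lag_cost(fraction + h) - lag_cost(fraction - h)) / (2.0 * h);
}

int main(void)
{
    double fraction = 1.0;   /* start by prefetching every task */
    const double step = 0.1; /* learning rate */

    for (int iter = 0; iter < 100; ++iter) {
        /* gradient descent update, clamped to the valid range [0, 1] */
        fraction -= step * lag_grad(fraction);
        if (fraction < 0.0) fraction = 0.0;
        if (fraction > 1.0) fraction = 1.0;
    }
    printf("tuned prefetch fraction: %.3f (cost %.5f)\n",
           fraction, lag_cost(fraction));
    return 0;
}
```

With a measured cost, the same update loop would be driven by run-time feedback rather than the synthetic quadratic used above.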

History
  • Received: November 16, 2015
  • Online: November 08, 2016