Work Assisting: Linking Task-Parallel Work Stealing with Data-Parallel Self Scheduling
We present "work assisting", a novel scheduling strategy for mixing data parallelism (loop parallelism) with task parallelism, in which threads publish their current data-parallel activity in a shared array so that other threads can assist. In contrast to most existing work in this space, our algorithm aims to preserve the structure of data parallelism instead of implementing all parallelism as task parallelism. This enables the use of self-scheduling for data parallelism, as required by certain data-parallel algorithms, and exploits data parallelism only when task parallelism is not sufficient. It provides full flexibility: neither the number of threads for a data-parallel loop nor the distribution of work over threads needs to be fixed before the loop starts. We present benchmarks demonstrating that, depending on the problem, our scheduling algorithm behaves similarly to, or outperforms, schedulers based purely on task parallelism.
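To make the mechanism in the abstract concrete, here is a minimal sketch in Rust of a self-scheduled data-parallel activity that other threads can assist. This is not the authors' implementation; the `Activity` type, its fields, and the `assist` loop are hypothetical names chosen for illustration, and the per-thread shared array in which the owner would publish the activity is not shown.

```rust
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::Arc;

// Hypothetical sketch: a data-parallel loop whose iterations are
// claimed via an atomic index (self-scheduling). In work assisting,
// a thread would publish such an activity in a shared array so that
// idle threads can join in, instead of converting the loop into tasks.
struct Activity {
    next: AtomicUsize, // next unclaimed iteration index
    len: usize,        // total number of iterations
    work: fn(usize),   // body of the data-parallel loop
}

impl Activity {
    // Owner and assisting threads all run the same claim loop:
    // fetch_add hands out each iteration exactly once.
    fn assist(&self) {
        loop {
            let i = self.next.fetch_add(1, Ordering::Relaxed);
            if i >= self.len {
                break;
            }
            (self.work)(i);
        }
    }
}

fn main() {
    let activity = Arc::new(Activity {
        next: AtomicUsize::new(0),
        len: 16,
        work: |i: usize| println!("iteration {}", i),
    });
    // Threads that find a published activity assist it; here we
    // simply spawn a few helpers directly for demonstration.
    let helpers: Vec<_> = (0..4)
        .map(|_| {
            let a = Arc::clone(&activity);
            std::thread::spawn(move || a.assist())
        })
        .collect();
    for h in helpers {
        h.join().unwrap();
    }
}
```

Because assistants claim iterations through the shared atomic index rather than taking a pre-split range, neither the number of participating threads nor the distribution of iterations over them is fixed before the loop starts, matching the flexibility claimed in the abstract.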
Tue 25 Jun (times shown in the Windhoek time zone)
13:40 - 15:20
13:40 25m | Talk | Apple Array Allocation (ARRAY) | Vanessa McHale (Northern Trust) | File attached
14:05 25m | Talk | Shray: an Owner-Compute Distributed Shared-Memory System (ARRAY) | Stefan Schrijvers (Radboud University), Thomas Koopman (Radboud University), Sven-Bodo Scholz (Radboud University) | DOI
14:30 25m | Talk | Work Assisting: Linking Task-Parallel Work Stealing with Data-Parallel Self Scheduling (ARRAY) | DOI
14:55 25m | Talk | Zero-Overhead Parallel Scans for Multi-Core CPUs (ARRAY) | Ivo Gabe de Wolff (Utrecht University), David van Balen, Gabriele Keller (Utrecht University), Trevor L. McDonell (Utrecht University) | File attached