Publication Type
Journal Article
Version
publishedVersion
Publication Date
11-2016
Abstract
We present a method for determining the ratio in which to break any complex workload into tasks such that, once the outputs from all tasks are joined, full completion takes less time and exhibits smaller variance than running the undivided workload. To do this, we must infer the capabilities of the processing units executing the divided workloads or tasks. We propose a Bayesian inference algorithm that infers the amount of time each task takes without requiring prior knowledge of the processing unit's capability. We demonstrate the effectiveness of this method in two different scenarios: the optimization of a convex function and the transmission of a large computer file over the Internet. We then show that the Bayesian inference algorithm correctly estimates the amount of time each task takes when executed on one of the processing units.
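Illustrative sketch (not the paper's algorithm): the abstract describes the approach only at a high level, so the short Python example below assumes, purely for illustration, that each processing unit's per-task completion times are exponentially distributed with an unknown rate, estimates that rate with a conjugate Gamma prior, and splits a workload between two units in proportion to the posterior-mean rates. The distributional choice, prior parameters, and function names are all hypothetical.

# Hypothetical sketch: Bayesian estimate of each unit's processing rate,
# then a workload split proportional to the estimated rates so both
# parts are expected to finish at roughly the same time.
import random

def posterior_rate(times, a0=1.0, b0=1.0):
    # Gamma(a0, b0) prior on the unit's rate; exponential observations.
    a = a0 + len(times)
    b = b0 + sum(times)
    return a / b  # posterior mean rate (tasks per unit time)

def split_ratio(times_u1, times_u2):
    # Fraction of the workload to assign to unit 1.
    r1 = posterior_rate(times_u1)
    r2 = posterior_rate(times_u2)
    return r1 / (r1 + r2)

if __name__ == "__main__":
    random.seed(0)
    # Simulated observed per-task completion times for two units.
    fast = [random.expovariate(2.0) for _ in range(20)]   # ~2 tasks per time unit
    slow = [random.expovariate(0.5) for _ in range(20)]   # ~0.5 tasks per time unit
    print("fraction of workload for the fast unit:",
          round(split_ratio(fast, slow), 3))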
Keywords
Parallelization, Partitioning, Workflow, Uncertainty, Optimization, Machines
Discipline
Computer Sciences | Theory and Algorithms
Research Areas
Data Science and Engineering
Publication
Netnomics
Volume
17
Issue
3
First Page
233
Last Page
253
ISSN
1385-9587
Identifier
10.1007/s11066-016-9111-5
Publisher
Springer Verlag (Germany)
Citation
CHUA, Freddy and HUBERMAN, Bernardo A.
Partitioning uncertain workloads. (2016). Netnomics, 17 (3), 233-253.
Available at: https://ink.library.smu.edu.sg/sis_research/3974
Copyright Owner and License
Authors
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://doi.org/10.1007/s11066-016-9111-5