Abstract
In distributed machine learning, data shuffling is a crucial data preprocessing technique that significantly impacts the efficiency and performance of model training. As distributed machine learning scales across multiple computing nodes, the ability to shuffle data effectively and efficiently has become essential for achieving high-quality model performance and minimizing communication costs. This paper systematically explores various data shuffling methods, including random shuffling, stratified shuffling, K-fold shuffling, and coded shuffling, each with distinct advantages, limitations, and application scenarios. Random shuffling is simple and fast but may lead to imbalanced class distributions, while stratified shuffling maintains class proportions at the cost of increased complexity. K-fold shuffling provides robust model evaluation through multiple training-validation splits, though it is computationally demanding. Coded shuffling, by contrast, reduces communication costs in distributed settings but requires sophisticated encoding and decoding schemes. The study also highlights the challenges associated with current shuffling techniques, such as handling class imbalance, managing high computational complexity, and adapting to dynamic, real-time data. This paper proposes potential solutions to enhance the efficacy of data shuffling, including hybrid methodologies, automated stratification processes, and optimized coding strategies. This work aims to guide future research on data shuffling in distributed machine learning environments, ultimately advancing model robustness and generalization across complex real-world applications.
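
To make the contrast among these strategies concrete, the following is a minimal Python sketch, not drawn from the paper itself; it assumes scikit-learn and a small synthetic, class-imbalanced dataset. Coded shuffling is omitted because it presupposes a multi-worker communication setup that a single-machine snippet cannot capture.

    import numpy as np
    from sklearn.model_selection import StratifiedKFold, train_test_split

    # Illustrative imbalanced dataset: 90 samples of class 0, 10 of class 1.
    rng = np.random.default_rng(seed=0)
    X = rng.normal(size=(100, 5))
    y = np.array([0] * 90 + [1] * 10)

    # Random shuffling: simple and fast, but the minority class may land
    # unevenly across the train/test splits.
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.2, shuffle=True, random_state=0
    )

    # Stratified shuffling: preserves the 90/10 class ratio in both splits,
    # at the cost of an extra pass over the labels.
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.2, shuffle=True, stratify=y, random_state=0
    )

    # K-fold shuffling: five rotating train/validation splits for more
    # robust evaluation, at roughly five times the training cost.
    for train_idx, val_idx in StratifiedKFold(
        n_splits=5, shuffle=True, random_state=0
    ).split(X, y):
        pass  # train on X[train_idx], validate on X[val_idx]

The choice among these sketches mirrors the trade-offs summarized above: speed (random), class balance (stratified), or evaluation robustness (K-fold).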