“…We have also used data parallelism to implement LambdaML. Other research topics in distributed ML include compression [6,7,52,53,93,96,97,101], decentralized training [28,41,59,65,90,91,100], synchronization [4,19,26,46,66,68,87,94,102], straggler mitigation [8,56,83,89,98,105], and data partitioning [1,3,36,55,77], among others.…”
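The data-parallel pattern referenced in the excerpt, where each worker computes a gradient on its own data shard and the results are averaged before a single model update, can be sketched as follows. This is a minimal, simulated NumPy illustration of the general technique, not LambdaML's actual implementation; the linear model, loss, learning rate, and sharding scheme are all illustrative assumptions.

```python
import numpy as np

def data_parallel_sgd_step(w, X, y, n_workers, lr=0.05):
    """One synchronous data-parallel SGD step, simulated in-process.

    Each "worker" holds a disjoint shard of (X, y) and computes a local
    gradient of the mean-squared error for a linear model; the gradients
    are then averaged (emulating an all-reduce) and applied once.
    Illustrative sketch only, not LambdaML's implementation.
    """
    shards = zip(np.array_split(X, n_workers), np.array_split(y, n_workers))
    local_grads = []
    for X_s, y_s in shards:
        # Local gradient of (1/m) * ||X_s w - y_s||^2 on this shard.
        residual = X_s @ w - y_s
        local_grads.append(2.0 * X_s.T @ residual / len(y_s))
    # Synchronization point: average the per-worker gradients. With
    # equal-sized shards this equals the full-batch gradient.
    g = np.mean(local_grads, axis=0)
    return w - lr * g
```

With equal shard sizes the averaged gradient matches the full-batch gradient exactly, so the simulated multi-worker step reproduces single-machine SGD; the interesting systems questions (compression, synchronization, stragglers) arise precisely at the averaging step, which in a real deployment is a network all-reduce or a parameter-server round trip.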