Elastic Coordination for Scalable Machine Learning

Project Period: 2019-03-01 – 2024-02-29
Externally Funded
Acronym: ScaleML
Principal Investigator: Dan-Adrian Alistarh
Department(s): Alistarh Group
Grant Number: 805223
Funding Organisation: EC/H2020

4 Publications

2019 | Conference Paper | IST-REx-ID: 6673   OA
Efficiency guarantees for parallel incremental algorithms under relaxed schedulers
D.-A. Alistarh, G. Nadiradze, N. Koval, in: 31st ACM Symposium on Parallelism in Algorithms and Architectures, ACM Press, 2019, pp. 145–154.
 
2019 | Conference Paper | IST-REx-ID: 7201   OA
SparCML: High-performance sparse communication for machine learning
C. Renggli, S. Ashkboos, M. Aghagolzadeh, D.-A. Alistarh, T. Hoefler, in: International Conference for High Performance Computing, Networking, Storage and Analysis, SC, ACM, 2019.
 
2019 | Conference Paper | IST-REx-ID: 7542   OA
Powerset convolutional neural networks
C. Wendler, D.-A. Alistarh, M. Püschel, in: Neural Information Processing Systems Foundation, 2019, pp. 927–938.
 
2020 | Conference Paper | IST-REx-ID: 7636
Non-blocking interpolation search trees with doubly-logarithmic running time
T.A. Brown, A. Prokopec, D.-A. Alistarh, in: Proceedings of the ACM SIGPLAN Symposium on Principles and Practice of Parallel Programming, PPOPP, ACM, 2020, pp. 276–291.