Research publications

Sparse communication for distributed gradient descent

We make distributed stochastic gradient descent faster by exchanging sparse updates instead of dense...

Aji, A. & Heafield, K. (2017). Sparse Communication for Distributed Gradient Descent. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing (EMNLP 2017), pp. 440–445. Association for Computational Linguistics (ACL), Copenhagen, Denmark, 7–11 September.
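The abstract describes exchanging sparse updates rather than dense gradients. A minimal sketch of that idea — keeping only the largest-magnitude gradient entries and accumulating the dropped remainder locally for later steps — is below. The function name, the `drop_ratio` parameter, and the use of NumPy are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sparsify_gradient(grad, residual, drop_ratio=0.99):
    """Illustrative top-k gradient sparsification with error accumulation.

    Returns (sparse_update, new_residual): sparse_update is what a worker
    would communicate; new_residual holds the dropped entries, kept locally
    and added back into the gradient on the next call.
    """
    acc = grad + residual                       # fold in leftover from earlier steps
    k = max(1, int(acc.size * (1.0 - drop_ratio)))
    # Threshold at the k-th largest absolute value.
    threshold = np.partition(np.abs(acc).ravel(), -k)[-k]
    mask = np.abs(acc) >= threshold
    sparse_update = np.where(mask, acc, 0.0)    # communicated (mostly zeros)
    new_residual = np.where(mask, 0.0, acc)     # retained locally
    return sparse_update, new_residual
```

Because the dropped mass is carried in the residual rather than discarded, no gradient information is permanently lost; it is merely delayed until it grows large enough to cross the threshold.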
