---
_id: '9571'
abstract:
- lang: eng
text: As the size and complexity of models and datasets grow, so does the need for
communication-efficient variants of stochastic gradient descent that can be deployed
to perform parallel model training. One popular communication-compression method
for data-parallel SGD is QSGD (Alistarh et al., 2017), which quantizes and encodes
gradients to reduce communication costs. The baseline variant of QSGD provides
strong theoretical guarantees; however, for practical purposes, the authors proposed
a heuristic variant which we call QSGDinf, which demonstrated impressive empirical
gains for distributed training of large neural networks. In this paper, we build
on this work to propose a new gradient quantization scheme, and show that it has
both stronger theoretical guarantees than QSGD, and matches and exceeds the empirical
performance of the QSGDinf heuristic and of other compression methods.
article_processing_charge: No
article_type: original
author:
- first_name: Ali
full_name: Ramezani-Kebrya, Ali
last_name: Ramezani-Kebrya
- first_name: Fartash
full_name: Faghri, Fartash
last_name: Faghri
- first_name: Ilya
full_name: Markov, Ilya
last_name: Markov
- first_name: Vitalii
full_name: Aksenov, Vitalii
id: 2980135A-F248-11E8-B48F-1D18A9856A87
last_name: Aksenov
- first_name: Dan-Adrian
full_name: Alistarh, Dan-Adrian
id: 4A899BFC-F248-11E8-B48F-1D18A9856A87
last_name: Alistarh
orcid: 0000-0003-3650-940X
- first_name: Daniel M.
full_name: Roy, Daniel M.
last_name: Roy
citation:
ama: 'Ramezani-Kebrya A, Faghri F, Markov I, Aksenov V, Alistarh D-A, Roy DM. NUQSGD:
Provably communication-efficient data-parallel SGD via nonuniform quantization.
Journal of Machine Learning Research. 2021;22(114):1−43.'
apa: 'Ramezani-Kebrya, A., Faghri, F., Markov, I., Aksenov, V., Alistarh, D.-A.,
& Roy, D. M. (2021). NUQSGD: Provably communication-efficient data-parallel
SGD via nonuniform quantization. Journal of Machine Learning Research.
Journal of Machine Learning Research.'
chicago: 'Ramezani-Kebrya, Ali, Fartash Faghri, Ilya Markov, Vitalii Aksenov, Dan-Adrian
Alistarh, and Daniel M. Roy. “NUQSGD: Provably Communication-Efficient Data-Parallel
SGD via Nonuniform Quantization.” Journal of Machine Learning Research.
Journal of Machine Learning Research, 2021.'
ieee: 'A. Ramezani-Kebrya, F. Faghri, I. Markov, V. Aksenov, D.-A. Alistarh, and
D. M. Roy, “NUQSGD: Provably communication-efficient data-parallel SGD via nonuniform
quantization,” Journal of Machine Learning Research, vol. 22, no. 114.
Journal of Machine Learning Research, p. 1−43, 2021.'
ista: 'Ramezani-Kebrya A, Faghri F, Markov I, Aksenov V, Alistarh D-A, Roy DM. 2021.
NUQSGD: Provably communication-efficient data-parallel SGD via nonuniform quantization.
Journal of Machine Learning Research. 22(114), 1−43.'
mla: 'Ramezani-Kebrya, Ali, et al. “NUQSGD: Provably Communication-Efficient Data-Parallel
SGD via Nonuniform Quantization.” Journal of Machine Learning Research,
vol. 22, no. 114, Journal of Machine Learning Research, 2021, p. 1−43.'
short: A. Ramezani-Kebrya, F. Faghri, I. Markov, V. Aksenov, D.-A. Alistarh, D.M.
Roy, Journal of Machine Learning Research 22 (2021) 1−43.
date_created: 2021-06-20T22:01:33Z
date_published: 2021-04-01T00:00:00Z
date_updated: 2024-03-06T12:22:07Z
day: '01'
ddc:
- '000'
department:
- _id: DaAl
external_id:
arxiv:
- '1908.06077'
file:
- access_level: open_access
checksum: 6428aa8bcb67768b6949c99b55d5281d
content_type: application/pdf
creator: asandaue
date_created: 2021-06-23T07:09:41Z
date_updated: 2021-06-23T07:09:41Z
file_id: '9595'
file_name: 2021_JournalOfMachineLearningResearch_Ramezani-Kebrya.pdf
file_size: 11237154
relation: main_file
success: 1
file_date_updated: 2021-06-23T07:09:41Z
has_accepted_license: '1'
intvolume: ' 22'
issue: '114'
language:
- iso: eng
license: https://creativecommons.org/licenses/by/4.0/
main_file_link:
- open_access: '1'
url: https://www.jmlr.org/papers/v22/20-255.html
month: '04'
oa: 1
oa_version: Published Version
page: 1-43
publication: Journal of Machine Learning Research
publication_identifier:
eissn:
- '15337928'
issn:
- '15324435'
publication_status: published
publisher: Journal of Machine Learning Research
quality_controlled: '1'
scopus_import: '1'
status: public
title: 'NUQSGD: Provably communication-efficient data-parallel SGD via nonuniform
quantization'
tmp:
image: /images/cc_by.png
legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
short: CC BY (4.0)
type: journal_article
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 22
year: '2021'
...
---
_id: '8544'
abstract:
- lang: eng
text: The synaptotrophic hypothesis posits that synapse formation stabilizes dendritic
branches, yet this hypothesis has not been causally tested in vivo in the mammalian
brain. Presynaptic ligand cerebellin-1 (Cbln1) and postsynaptic receptor GluD2
mediate synaptogenesis between granule cells and Purkinje cells in the molecular
layer of the cerebellar cortex. Here we show that sparse but not global knockout
of GluD2 causes under-elaboration of Purkinje cell dendrites in the deep molecular
layer and overelaboration in the superficial molecular layer. Developmental, overexpression,
structure-function, and genetic epistasis analyses indicate that dendrite morphogenesis
defects result from competitive synaptogenesis in a Cbln1/GluD2-dependent manner.
A generative model of dendritic growth based on competitive synaptogenesis largely
recapitulates GluD2 sparse and global knockout phenotypes. Our results support
the synaptotrophic hypothesis at initial stages of dendrite development, suggest
a second mode in which cumulative synapse formation inhibits further dendrite
growth, and highlight the importance of competition in dendrite morphogenesis.
acknowledgement: We thank M. Mishina for GluD2fl frozen embryos, T.C. Südhof and J.I.
Morgan for Cbln1fl mice, L. Anderson for help in generating the MADM alleles, W.
Joo for a previously unpublished construct, M. Yuzaki, K. Shen, J. Ding, and members
of the Luo lab, including J.M. Kebschull, H. Li, J. Li, T. Li, C.M. McLaughlin,
D. Pederick, J. Ren, D.C. Wang and C. Xu for discussions and critiques of the manuscript,
and M. Yuzaki for supporting Y.H.T. during the final phase of this project. Y.H.T.
was supported by a JSPS fellowship; S.A.S. was supported by a Stanford Graduate
Fellowship and an NSF Predoctoral Fellowship; L.J. is supported by a Stanford Graduate
Fellowship and an NSF Predoctoral Fellowship; M.J.W. is supported by a Burroughs
Wellcome Fund CASI Award. This work was supported by an NIH grant (R01-NS050538)
to L.L.; the European Research Council (ERC) under the European Union's Horizon
2020 research and innovations programme (No. 725780 LinPro) to S.H.; and Simons
and James S. McDonnell Foundations and an NSF CAREER award to S.G.; L.L. is an HHMI
investigator.
article_processing_charge: No
article_type: original
author:
- first_name: Yukari H.
full_name: Takeo, Yukari H.
last_name: Takeo
- first_name: S. Andrew
full_name: Shuster, S. Andrew
last_name: Shuster
- first_name: Linnie
full_name: Jiang, Linnie
last_name: Jiang
- first_name: Miley
full_name: Hu, Miley
last_name: Hu
- first_name: David J.
full_name: Luginbuhl, David J.
last_name: Luginbuhl
- first_name: Thomas
full_name: Rülicke, Thomas
last_name: Rülicke
- first_name: Ximena
full_name: Contreras, Ximena
id: 475990FE-F248-11E8-B48F-1D18A9856A87
last_name: Contreras
- first_name: Simon
full_name: Hippenmeyer, Simon
id: 37B36620-F248-11E8-B48F-1D18A9856A87
last_name: Hippenmeyer
orcid: 0000-0003-2279-1061
- first_name: Mark J.
full_name: Wagner, Mark J.
last_name: Wagner
- first_name: Surya
full_name: Ganguli, Surya
last_name: Ganguli
- first_name: Liqun
full_name: Luo, Liqun
last_name: Luo
citation:
ama: Takeo YH, Shuster SA, Jiang L, et al. GluD2- and Cbln1-mediated competitive
synaptogenesis shapes the dendritic arbors of cerebellar Purkinje cells. Neuron.
2021;109(4):P629-644.E8. doi:10.1016/j.neuron.2020.11.028
apa: Takeo, Y. H., Shuster, S. A., Jiang, L., Hu, M., Luginbuhl, D. J., Rülicke,
T., … Luo, L. (2021). GluD2- and Cbln1-mediated competitive synaptogenesis shapes
the dendritic arbors of cerebellar Purkinje cells. Neuron. Elsevier. https://doi.org/10.1016/j.neuron.2020.11.028
chicago: Takeo, Yukari H., S. Andrew Shuster, Linnie Jiang, Miley Hu, David J. Luginbuhl,
Thomas Rülicke, Ximena Contreras, et al. “GluD2- and Cbln1-Mediated Competitive
Synaptogenesis Shapes the Dendritic Arbors of Cerebellar Purkinje Cells.” Neuron.
Elsevier, 2021. https://doi.org/10.1016/j.neuron.2020.11.028.
ieee: Y. H. Takeo et al., “GluD2- and Cbln1-mediated competitive synaptogenesis
shapes the dendritic arbors of cerebellar Purkinje cells,” Neuron, vol.
109, no. 4. Elsevier, p. P629–644.E8, 2021.
ista: Takeo YH, Shuster SA, Jiang L, Hu M, Luginbuhl DJ, Rülicke T, Contreras X,
Hippenmeyer S, Wagner MJ, Ganguli S, Luo L. 2021. GluD2- and Cbln1-mediated competitive
synaptogenesis shapes the dendritic arbors of cerebellar Purkinje cells. Neuron.
109(4), P629–644.E8.
mla: Takeo, Yukari H., et al. “GluD2- and Cbln1-Mediated Competitive Synaptogenesis
Shapes the Dendritic Arbors of Cerebellar Purkinje Cells.” Neuron, vol.
109, no. 4, Elsevier, 2021, p. P629–644.E8, doi:10.1016/j.neuron.2020.11.028.
short: Y.H. Takeo, S.A. Shuster, L. Jiang, M. Hu, D.J. Luginbuhl, T. Rülicke, X.
Contreras, S. Hippenmeyer, M.J. Wagner, S. Ganguli, L. Luo, Neuron 109 (2021)
P629–644.E8.
date_created: 2020-09-21T11:59:47Z
date_published: 2021-02-17T00:00:00Z
date_updated: 2024-03-06T12:12:48Z
day: '17'
department:
- _id: SiHi
doi: 10.1016/j.neuron.2020.11.028
ec_funded: 1
intvolume: ' 109'
issue: '4'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://doi.org/10.1101/2020.06.14.151258
month: '02'
oa: 1
oa_version: Preprint
page: P629-644.E8
project:
- _id: 260018B0-B435-11E9-9278-68D0E5697425
call_identifier: H2020
grant_number: '725780'
name: Principles of Neural Stem Cell Lineage Progression in Cerebral Cortex Development
publication: Neuron
publication_identifier:
eissn:
- 1097-4199
publication_status: published
publisher: Elsevier
quality_controlled: '1'
scopus_import: '1'
status: public
title: GluD2- and Cbln1-mediated competitive synaptogenesis shapes the dendritic arbors
of cerebellar Purkinje cells
type: journal_article
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 109
year: '2021'
...
---
_id: '9791'
abstract:
- lang: eng
text: We provide a definition of the effective mass for the classical polaron described
by the Landau-Pekar equations. It is based on a novel variational principle, minimizing
the energy functional over states with given (initial) velocity. The resulting
formula for the polaron's effective mass agrees with the prediction by Landau
and Pekar.
acknowledgement: We thank Herbert Spohn for helpful comments. Funding from the European
Union’s Horizon 2020 research and innovation programme under the ERC grant agreement
No. 694227 (D.F. and R.S.) and under the Marie Skłodowska-Curie Grant Agreement
No. 754411 (S.R.) is gratefully acknowledged.
article_number: '2107.03720'
article_processing_charge: No
author:
- first_name: Dario
full_name: Feliciangeli, Dario
id: 41A639AA-F248-11E8-B48F-1D18A9856A87
last_name: Feliciangeli
orcid: 0000-0003-0754-8530
- first_name: Simone Anna Elvira
full_name: Rademacher, Simone Anna Elvira
id: 856966FE-A408-11E9-977E-802DE6697425
last_name: Rademacher
orcid: 0000-0001-5059-4466
- first_name: Robert
full_name: Seiringer, Robert
id: 4AFD0470-F248-11E8-B48F-1D18A9856A87
last_name: Seiringer
orcid: 0000-0002-6781-0521
citation:
ama: Feliciangeli D, Rademacher SAE, Seiringer R. The effective mass problem for
the Landau-Pekar equations. arXiv.
apa: Feliciangeli, D., Rademacher, S. A. E., & Seiringer, R. (n.d.). The effective
mass problem for the Landau-Pekar equations. arXiv.
chicago: Feliciangeli, Dario, Simone Anna Elvira Rademacher, and Robert Seiringer.
“The Effective Mass Problem for the Landau-Pekar Equations.” ArXiv, n.d.
ieee: D. Feliciangeli, S. A. E. Rademacher, and R. Seiringer, “The effective mass
problem for the Landau-Pekar equations,” arXiv.
ista: Feliciangeli D, Rademacher SAE, Seiringer R. The effective mass problem for
the Landau-Pekar equations. arXiv, 2107.03720.
mla: Feliciangeli, Dario, et al. “The Effective Mass Problem for the Landau-Pekar
Equations.” ArXiv, 2107.03720.
short: D. Feliciangeli, S.A.E. Rademacher, R. Seiringer, ArXiv (n.d.).
date_created: 2021-08-06T08:49:45Z
date_published: 2021-07-08T00:00:00Z
date_updated: 2024-03-06T12:30:45Z
day: '08'
ddc:
- '510'
department:
- _id: RoSe
ec_funded: 1
external_id:
arxiv:
- '2107.03720'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://arxiv.org/abs/2107.03720
month: '07'
oa: 1
oa_version: Preprint
project:
- _id: 260C2330-B435-11E9-9278-68D0E5697425
call_identifier: H2020
grant_number: '754411'
name: ISTplus - Postdoctoral Fellowships
- _id: 25C6DC12-B435-11E9-9278-68D0E5697425
call_identifier: H2020
grant_number: '694227'
name: Analysis of quantum many-body systems
publication: arXiv
publication_status: submitted
related_material:
record:
- id: '10755'
relation: later_version
status: public
- id: '9733'
relation: dissertation_contains
status: public
status: public
title: The effective mass problem for the Landau-Pekar equations
type: preprint
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
year: '2021'
...
---
_id: '7553'
abstract:
- lang: eng
text: Normative theories and statistical inference provide complementary approaches
for the study of biological systems. A normative theory postulates that organisms
have adapted to efficiently solve essential tasks, and proceeds to mathematically
work out testable consequences of such optimality; parameters that maximize the
hypothesized organismal function can be derived ab initio, without reference to
experimental data. In contrast, statistical inference focuses on efficient utilization
of data to learn model parameters, without reference to any a priori notion of
biological function, utility, or fitness. Traditionally, these two approaches
were developed independently and applied separately. Here we unify them in a coherent
Bayesian framework that embeds a normative theory into a family of maximum-entropy
“optimization priors.” This family defines a smooth interpolation between a data-rich
inference regime (characteristic of “bottom-up” statistical models), and a data-limited
ab initio prediction regime (characteristic of “top-down” normative theory). We
demonstrate the applicability of our framework using data from the visual cortex,
and argue that the flexibility it affords is essential to address a number of
fundamental challenges relating to inference and prediction in complex, high-dimensional
biological problems.
acknowledgement: The authors thank Dario Ringach for providing the V1 receptive fields
and Olivier Marre for providing the retinal receptive fields. W.M. was funded by
the European Union’s Horizon 2020 research and innovation programme under the Marie
Skłodowska-Curie grant agreement no. 754411. M.H. was funded in part by Human Frontiers
Science grant no. HFSP RGP0032/2018.
article_processing_charge: No
author:
- first_name: Wiktor F
full_name: Mlynarski, Wiktor F
id: 358A453A-F248-11E8-B48F-1D18A9856A87
last_name: Mlynarski
- first_name: Michal
full_name: Hledik, Michal
id: 4171253A-F248-11E8-B48F-1D18A9856A87
last_name: Hledik
- first_name: Thomas R
full_name: Sokolowski, Thomas R
id: 3E999752-F248-11E8-B48F-1D18A9856A87
last_name: Sokolowski
orcid: 0000-0002-1287-3779
- first_name: Gašper
full_name: Tkačik, Gašper
id: 3D494DCA-F248-11E8-B48F-1D18A9856A87
last_name: Tkačik
orcid: 0000-0002-6699-1455
citation:
ama: Mlynarski WF, Hledik M, Sokolowski TR, Tkačik G. Statistical analysis and optimality
of neural systems. Neuron. 2021;109(7):1227-1241.e5. doi:10.1016/j.neuron.2021.01.020
apa: Mlynarski, W. F., Hledik, M., Sokolowski, T. R., & Tkačik, G. (2021). Statistical
analysis and optimality of neural systems. Neuron. Cell Press. https://doi.org/10.1016/j.neuron.2021.01.020
chicago: Mlynarski, Wiktor F, Michal Hledik, Thomas R Sokolowski, and Gašper Tkačik.
“Statistical Analysis and Optimality of Neural Systems.” Neuron. Cell Press,
2021. https://doi.org/10.1016/j.neuron.2021.01.020.
ieee: W. F. Mlynarski, M. Hledik, T. R. Sokolowski, and G. Tkačik, “Statistical
analysis and optimality of neural systems,” Neuron, vol. 109, no. 7. Cell
Press, p. 1227–1241.e5, 2021.
ista: Mlynarski WF, Hledik M, Sokolowski TR, Tkačik G. 2021. Statistical analysis
and optimality of neural systems. Neuron. 109(7), 1227–1241.e5.
mla: Mlynarski, Wiktor F., et al. “Statistical Analysis and Optimality of Neural
Systems.” Neuron, vol. 109, no. 7, Cell Press, 2021, p. 1227–1241.e5, doi:10.1016/j.neuron.2021.01.020.
short: W.F. Mlynarski, M. Hledik, T.R. Sokolowski, G. Tkačik, Neuron 109 (2021)
1227–1241.e5.
date_created: 2020-02-28T11:00:12Z
date_published: 2021-04-07T00:00:00Z
date_updated: 2024-03-06T14:22:51Z
day: '07'
department:
- _id: GaTk
doi: 10.1016/j.neuron.2021.01.020
ec_funded: 1
external_id:
isi:
- '000637809600006'
intvolume: ' 109'
isi: 1
issue: '7'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://doi.org/10.1101/848374
month: '04'
oa: 1
oa_version: Preprint
page: 1227-1241.e5
project:
- _id: 260C2330-B435-11E9-9278-68D0E5697425
call_identifier: H2020
grant_number: '754411'
name: ISTplus - Postdoctoral Fellowships
publication: Neuron
publication_status: published
publisher: Cell Press
quality_controlled: '1'
related_material:
link:
- description: News on IST Homepage
relation: press_release
url: https://ist.ac.at/en/news/can-evolution-be-predicted/
record:
- id: '15020'
relation: dissertation_contains
status: public
scopus_import: '1'
status: public
title: Statistical analysis and optimality of neural systems
type: journal_article
user_id: 4359f0d1-fa6c-11eb-b949-802e58b17ae8
volume: 109
year: '2021'
...
---
_id: '10598'
abstract:
- lang: eng
text: 'We consider the problem of estimating a signal from measurements obtained
via a generalized linear model. We focus on estimators based on approximate message
passing (AMP), a family of iterative algorithms with many appealing features:
the performance of AMP in the high-dimensional limit can be succinctly characterized
under suitable model assumptions; AMP can also be tailored to the empirical distribution
of the signal entries, and for a wide class of estimation problems, AMP is conjectured
to be optimal among all polynomial-time algorithms. However, a major issue of
AMP is that in many models (such as phase retrieval), it requires an initialization
correlated with the ground-truth signal and independent from the measurement matrix.
Assuming that such an initialization is available is typically not realistic.
In this paper, we solve this problem by proposing an AMP algorithm initialized
with a spectral estimator. With such an initialization, the standard AMP analysis
fails since the spectral estimator depends in a complicated way on the design
matrix. Our main contribution is a rigorous characterization of the performance
of AMP with spectral initialization in the high-dimensional limit. The key technical
idea is to define and analyze a two-phase artificial AMP algorithm that first
produces the spectral estimator, and then closely approximates the iterates of
the true AMP. We also provide numerical results that demonstrate the validity
of the proposed approach.'
acknowledgement: The authors would like to thank Andrea Montanari for helpful discussions.
M. Mondelli was partially supported by the 2019 Lopez-Loreta Prize. R. Venkataramanan
was partially supported by the Alan Turing Institute under the EPSRC grant EP/N510129/1.
alternative_title:
- Proceedings of Machine Learning Research
article_processing_charge: Yes (via OA deal)
author:
- first_name: Marco
full_name: Mondelli, Marco
id: 27EB676C-8706-11E9-9510-7717E6697425
last_name: Mondelli
orcid: 0000-0002-3242-7020
- first_name: Ramji
full_name: Venkataramanan, Ramji
last_name: Venkataramanan
citation:
ama: 'Mondelli M, Venkataramanan R. Approximate message passing with spectral initialization
for generalized linear models. In: Banerjee A, Fukumizu K, eds. Proceedings
of The 24th International Conference on Artificial Intelligence and Statistics.
Vol 130. ML Research Press; 2021:397-405.'
apa: 'Mondelli, M., & Venkataramanan, R. (2021). Approximate message passing
with spectral initialization for generalized linear models. In A. Banerjee &
K. Fukumizu (Eds.), Proceedings of The 24th International Conference on Artificial
Intelligence and Statistics (Vol. 130, pp. 397–405). Virtual, San Diego, CA,
United States: ML Research Press.'
chicago: Mondelli, Marco, and Ramji Venkataramanan. “Approximate Message Passing
with Spectral Initialization for Generalized Linear Models.” In Proceedings
of The 24th International Conference on Artificial Intelligence and Statistics,
edited by Arindam Banerjee and Kenji Fukumizu, 130:397–405. ML Research Press,
2021.
ieee: M. Mondelli and R. Venkataramanan, “Approximate message passing with spectral
initialization for generalized linear models,” in Proceedings of The 24th International
Conference on Artificial Intelligence and Statistics, Virtual, San Diego,
CA, United States, 2021, vol. 130, pp. 397–405.
ista: 'Mondelli M, Venkataramanan R. 2021. Approximate message passing with spectral
initialization for generalized linear models. Proceedings of The 24th International
Conference on Artificial Intelligence and Statistics. AISTATS: Artificial Intelligence
and Statistics, Proceedings of Machine Learning Research, vol. 130, 397–405.'
mla: Mondelli, Marco, and Ramji Venkataramanan. “Approximate Message Passing with
Spectral Initialization for Generalized Linear Models.” Proceedings of The
24th International Conference on Artificial Intelligence and Statistics, edited
by Arindam Banerjee and Kenji Fukumizu, vol. 130, ML Research Press, 2021, pp.
397–405.
short: M. Mondelli, R. Venkataramanan, in:, A. Banerjee, K. Fukumizu (Eds.), Proceedings
of The 24th International Conference on Artificial Intelligence and Statistics,
ML Research Press, 2021, pp. 397–405.
conference:
end_date: 2021-04-15
location: Virtual, San Diego, CA, United States
name: 'AISTATS: Artificial Intelligence and Statistics'
start_date: 2021-04-13
date_created: 2022-01-03T11:34:22Z
date_published: 2021-04-01T00:00:00Z
date_updated: 2024-03-07T10:36:53Z
day: '01'
department:
- _id: MaMo
editor:
- first_name: Arindam
full_name: Banerjee, Arindam
last_name: Banerjee
- first_name: Kenji
full_name: Fukumizu, Kenji
last_name: Fukumizu
external_id:
arxiv:
- '2010.03460'
intvolume: ' 130'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://proceedings.mlr.press/v130/mondelli21a.html
month: '04'
oa: 1
oa_version: Preprint
page: 397-405
project:
- _id: 059876FA-7A3F-11EA-A408-12923DDC885E
name: Prix Lopez-Loretta 2019 - Marco Mondelli
publication: Proceedings of The 24th International Conference on Artificial Intelligence
and Statistics
publication_identifier:
issn:
- 2640-3498
publication_status: published
publisher: ML Research Press
quality_controlled: '1'
related_material:
record:
- id: '12480'
relation: later_version
status: public
scopus_import: '1'
status: public
title: Approximate message passing with spectral initialization for generalized linear
models
type: conference
user_id: 3E5EF7F0-F248-11E8-B48F-1D18A9856A87
volume: 130
year: '2021'
...