---
_id: '10762'
abstract:
- lang: eng
text: Methods inspired by machine learning have recently attracted great interest
in the computational study of quantum many-particle systems. So far, however,
it has proven challenging to deal with microscopic models in which the total number
of particles is not conserved. To address this issue, we propose a new variant
of neural network states, which we term neural coherent states. Taking the Fröhlich
impurity model as a case study, we show that neural coherent states can learn
the ground state of non-additive systems very well. In particular, we observe
substantial improvement over the standard coherent state estimates in the most
challenging intermediate coupling regime. Our approach is generic and does not
assume specific details of the system, suggesting wide applications.
acknowledgement: "We acknowledge fruitful discussions with Giacomo Bighin, Giammarco
Fabiani, Areg Ghazaryan, Christoph Lampert, and Artem Volosniev at various stages
of this work. W.R. is a recipient of a DOC Fellowship of the Austrian Academy
of Sciences and has received funding from the EU Horizon 2020 programme under the
Marie Skłodowska-Curie Grant Agreement No. 665385. M. L. acknowledges support
by the European Research Council (ERC) Starting Grant No. 801770 (ANGULON). This
work is part of the Shell-NWO/FOM-initiative “Computational sciences for energy
research” of Shell and Chemical Sciences, Earth and Life Sciences, Physical Sciences,
FOM and STW."
article_processing_charge: No
author:
- first_name: Wojciech
full_name: Rzadkowski, Wojciech
id: 48C55298-F248-11E8-B48F-1D18A9856A87
last_name: Rzadkowski
orcid: 0000-0002-1106-4419
- first_name: Mikhail
full_name: Lemeshko, Mikhail
id: 37CB05FA-F248-11E8-B48F-1D18A9856A87
last_name: Lemeshko
orcid: 0000-0002-6990-7802
- first_name: Johan H.
full_name: Mentink, Johan H.
last_name: Mentink
citation:
ama: Rzadkowski W, Lemeshko M, Mentink JH. Artificial neural network states for
non-additive systems. arXiv. doi:10.48550/arXiv.2105.15193
apa: Rzadkowski, W., Lemeshko, M., & Mentink, J. H. (n.d.). Artificial neural
network states for non-additive systems. arXiv. https://doi.org/10.48550/arXiv.2105.15193
chicago: Rzadkowski, Wojciech, Mikhail Lemeshko, and Johan H. Mentink. “Artificial
Neural Network States for Non-Additive Systems.” ArXiv, n.d. https://doi.org/10.48550/arXiv.2105.15193.
ieee: W. Rzadkowski, M. Lemeshko, and J. H. Mentink, “Artificial neural network
states for non-additive systems,” arXiv.
ista: Rzadkowski W, Lemeshko M, Mentink JH. Artificial neural network states for
non-additive systems. arXiv, 10.48550/arXiv.2105.15193.
mla: Rzadkowski, Wojciech, et al. “Artificial Neural Network States for Non-Additive
Systems.” ArXiv, doi:10.48550/arXiv.2105.15193.
short: W. Rzadkowski, M. Lemeshko, J.H. Mentink, ArXiv (n.d.).
date_created: 2022-02-17T11:18:57Z
date_published: 2021-05-31T00:00:00Z
date_updated: 2023-09-07T13:44:16Z
day: '31'
department:
- _id: MiLe
doi: 10.48550/arXiv.2105.15193
ec_funded: 1
external_id:
arxiv:
- '2105.15193'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://arxiv.org/abs/2105.15193
month: '05'
oa: 1
oa_version: Preprint
page: '2105.15193'
project:
- _id: 2688CF98-B435-11E9-9278-68D0E5697425
call_identifier: H2020
grant_number: '801770'
name: 'Angulon: physics and applications of a new quasiparticle'
- _id: 2564DBCA-B435-11E9-9278-68D0E5697425
call_identifier: H2020
grant_number: '665385'
name: International IST Doctoral Program
publication: arXiv
publication_status: submitted
related_material:
record:
- id: '10759'
relation: dissertation_contains
status: public
status: public
title: Artificial neural network states for non-additive systems
type: preprint
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
year: '2021'
...
---
_id: '9418'
abstract:
- lang: eng
text: "Deep learning is best known for its empirical success across a wide range
of applications spanning computer vision, natural language processing and speech.
Of equal significance, though perhaps less known, are its ramifications for learning
theory: deep networks have been observed to perform surprisingly well in the high-capacity
regime, aka the overfitting or underspecified regime. Classically, this regime
on the far right of the bias-variance curve is associated with poor generalisation;
however, recent experiments with deep networks challenge this view.\n\nThis thesis
is devoted to investigating various aspects of underspecification in deep learning.
First, we argue that deep learning models are underspecified on two levels: a)
any given training dataset can be fit by many different functions, and b) any given
function can be expressed by many different parameter configurations. We refer
to the second kind of underspecification as parameterisation redundancy and we
precisely characterise its extent. Second, we characterise the implicit criteria
(the inductive bias) that guide learning in the underspecified regime. Specifically,
we consider a nonlinear but tractable classification setting, and show that given
the choice, neural networks learn classifiers with a large margin. Third, we consider
learning scenarios where the inductive bias is not by itself sufficient to deal
with underspecification. We then study different ways of ‘tightening the specification’:
i) In the setting of representation learning with variational autoencoders, we
propose a hand-crafted regulariser based on mutual information. ii) In the setting
of binary classification, we consider soft-label (real-valued) supervision. We
derive a generalisation bound for linear networks supervised in this way and verify
that soft labels facilitate fast learning. Finally, we explore an application of
soft-label supervision to the training of multi-exit models."
acknowledged_ssus:
- _id: ScienComp
- _id: CampIT
- _id: E-Lib
alternative_title:
- ISTA Thesis
article_processing_charge: No
author:
- first_name: Phuong
full_name: Bui Thi Mai, Phuong
id: 3EC6EE64-F248-11E8-B48F-1D18A9856A87
last_name: Bui Thi Mai
citation:
ama: Phuong M. Underspecification in deep learning. 2021. doi:10.15479/AT:ISTA:9418
apa: Phuong, M. (2021). Underspecification in deep learning. Institute of
Science and Technology Austria. https://doi.org/10.15479/AT:ISTA:9418
chicago: Phuong, Mary. “Underspecification in Deep Learning.” Institute of Science
and Technology Austria, 2021. https://doi.org/10.15479/AT:ISTA:9418.
ieee: M. Phuong, “Underspecification in deep learning,” Institute of Science and
Technology Austria, 2021.
ista: Phuong M. 2021. Underspecification in deep learning. Institute of Science
and Technology Austria.
mla: Phuong, Mary. Underspecification in Deep Learning. Institute of Science
and Technology Austria, 2021, doi:10.15479/AT:ISTA:9418.
short: M. Phuong, Underspecification in Deep Learning, Institute of Science and
Technology Austria, 2021.
date_created: 2021-05-24T13:06:23Z
date_published: 2021-05-30T00:00:00Z
date_updated: 2023-09-08T11:11:12Z
day: '30'
ddc:
- '000'
degree_awarded: PhD
department:
- _id: GradSch
- _id: ChLa
doi: 10.15479/AT:ISTA:9418
file:
- access_level: open_access
checksum: 4f0abe64114cfed264f9d36e8d1197e3
content_type: application/pdf
creator: bphuong
date_created: 2021-05-24T11:22:29Z
date_updated: 2021-05-24T11:22:29Z
file_id: '9419'
file_name: mph-thesis-v519-pdfimages.pdf
file_size: 2673905
relation: main_file
success: 1
- access_level: closed
checksum: f5699e876bc770a9b0df8345a77720a2
content_type: application/zip
creator: bphuong
date_created: 2021-05-24T11:56:02Z
date_updated: 2021-05-24T11:56:02Z
file_id: '9420'
file_name: thesis.zip
file_size: 92995100
relation: source_file
file_date_updated: 2021-05-24T11:56:02Z
has_accepted_license: '1'
language:
- iso: eng
month: '05'
oa: 1
oa_version: Published Version
page: '125'
publication_identifier:
issn:
- 2663-337X
publication_status: published
publisher: Institute of Science and Technology Austria
related_material:
record:
- id: '7435'
relation: part_of_dissertation
status: deleted
- id: '7481'
relation: part_of_dissertation
status: public
- id: '9416'
relation: part_of_dissertation
status: public
- id: '7479'
relation: part_of_dissertation
status: public
status: public
supervisor:
- first_name: Christoph
full_name: Lampert, Christoph
id: 40C20FD2-F248-11E8-B48F-1D18A9856A87
last_name: Lampert
orcid: 0000-0001-8622-7887
title: Underspecification in deep learning
type: dissertation
user_id: c635000d-4b10-11ee-a964-aac5a93f6ac1
year: '2021'
...
---
_id: '14177'
abstract:
- lang: eng
text: "The focus of disentanglement approaches has been on identifying independent
factors of variation in data. However, the causal variables underlying real-world
observations are often not statistically independent. In this work, we bridge
the gap to real-world scenarios by analyzing the behavior of the most prominent
disentanglement approaches on correlated data in a large-scale empirical study
(including 4260 models). We show and quantify that systematically induced correlations
in the dataset are being learned and reflected in the latent representations,
which has implications for downstream applications of disentanglement such as
fairness. We also demonstrate how to resolve these latent correlations, either
using weak supervision during training or by post-hoc correcting a pre-trained
model with a small number of labels."
alternative_title:
- PMLR
article_processing_charge: No
author:
- first_name: Frederik
full_name: Träuble, Frederik
last_name: Träuble
- first_name: Elliot
full_name: Creager, Elliot
last_name: Creager
- first_name: Niki
full_name: Kilbertus, Niki
last_name: Kilbertus
- first_name: Francesco
full_name: Locatello, Francesco
id: 26cfd52f-2483-11ee-8040-88983bcc06d4
last_name: Locatello
orcid: 0000-0002-4850-0683
- first_name: Andrea
full_name: Dittadi, Andrea
last_name: Dittadi
- first_name: Anirudh
full_name: Goyal, Anirudh
last_name: Goyal
- first_name: Bernhard
full_name: Schölkopf, Bernhard
last_name: Schölkopf
- first_name: Stefan
full_name: Bauer, Stefan
last_name: Bauer
citation:
ama: 'Träuble F, Creager E, Kilbertus N, et al. On disentangled representations
learned from correlated data. In: Proceedings of the 38th International Conference
on Machine Learning. Vol 139. ML Research Press; 2021:10401-10412.'
apa: 'Träuble, F., Creager, E., Kilbertus, N., Locatello, F., Dittadi, A., Goyal,
A., … Bauer, S. (2021). On disentangled representations learned from correlated
data. In Proceedings of the 38th International Conference on Machine Learning
(Vol. 139, pp. 10401–10412). Virtual: ML Research Press.'
chicago: Träuble, Frederik, Elliot Creager, Niki Kilbertus, Francesco Locatello,
Andrea Dittadi, Anirudh Goyal, Bernhard Schölkopf, and Stefan Bauer. “On Disentangled
Representations Learned from Correlated Data.” In Proceedings of the 38th International
Conference on Machine Learning, 139:10401–12. ML Research Press, 2021.
ieee: F. Träuble et al., “On disentangled representations learned from correlated
data,” in Proceedings of the 38th International Conference on Machine Learning,
Virtual, 2021, vol. 139, pp. 10401–10412.
ista: 'Träuble F, Creager E, Kilbertus N, Locatello F, Dittadi A, Goyal A, Schölkopf
B, Bauer S. 2021. On disentangled representations learned from correlated data.
Proceedings of the 38th International Conference on Machine Learning. ICML: International
Conference on Machine Learning, PMLR, vol. 139, 10401–10412.'
mla: Träuble, Frederik, et al. “On Disentangled Representations Learned from Correlated
Data.” Proceedings of the 38th International Conference on Machine Learning,
vol. 139, ML Research Press, 2021, pp. 10401–12.
short: F. Träuble, E. Creager, N. Kilbertus, F. Locatello, A. Dittadi, A. Goyal,
B. Schölkopf, S. Bauer, in:, Proceedings of the 38th International Conference
on Machine Learning, ML Research Press, 2021, pp. 10401–10412.
conference:
end_date: 2021-07-24
location: Virtual
name: 'ICML: International Conference on Machine Learning'
start_date: 2021-07-18
date_created: 2023-08-22T14:03:47Z
date_published: 2021-08-01T00:00:00Z
date_updated: 2023-09-11T10:18:48Z
day: '01'
department:
- _id: FrLo
extern: '1'
external_id:
arxiv:
- '2006.07886'
intvolume: ' 139'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://arxiv.org/abs/2006.07886
month: '08'
oa: 1
oa_version: Published Version
page: 10401-10412
publication: Proceedings of the 38th International Conference on Machine Learning
publication_status: published
publisher: ML Research Press
quality_controlled: '1'
scopus_import: '1'
status: public
title: On disentangled representations learned from correlated data
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 139
year: '2021'
...
---
_id: '14176'
abstract:
- lang: eng
text: "Intensive care units (ICU) are increasingly looking towards machine learning
for methods to provide online monitoring of critically ill patients. In machine
learning, online monitoring is often formulated as a supervised learning problem.
Recently, contrastive learning approaches have demonstrated promising improvements
over competitive supervised benchmarks. These methods rely on well-understood
data augmentation techniques developed for image data which do not apply to online
monitoring. In this work, we overcome this limitation by supplementing time-series
data augmentation techniques with a novel contrastive learning objective which
we call neighborhood contrastive learning (NCL). Our objective explicitly groups
together contiguous time segments from each patient while maintaining state-specific
information. Our experiments demonstrate a marked improvement over existing work
applying contrastive methods to medical time-series."
alternative_title:
- PMLR
article_processing_charge: No
author:
- first_name: Hugo
full_name: Yèche, Hugo
last_name: Yèche
- first_name: Gideon
full_name: Dresdner, Gideon
last_name: Dresdner
- first_name: Francesco
full_name: Locatello, Francesco
id: 26cfd52f-2483-11ee-8040-88983bcc06d4
last_name: Locatello
orcid: 0000-0002-4850-0683
- first_name: Matthias
full_name: Hüser, Matthias
last_name: Hüser
- first_name: Gunnar
full_name: Rätsch, Gunnar
last_name: Rätsch
citation:
ama: 'Yèche H, Dresdner G, Locatello F, Hüser M, Rätsch G. Neighborhood contrastive
learning applied to online patient monitoring. In: Proceedings of 38th International
Conference on Machine Learning. Vol 139. ML Research Press; 2021:11964-11974.'
apa: 'Yèche, H., Dresdner, G., Locatello, F., Hüser, M., & Rätsch, G. (2021).
Neighborhood contrastive learning applied to online patient monitoring. In Proceedings
of 38th International Conference on Machine Learning (Vol. 139, pp. 11964–11974).
Virtual: ML Research Press.'
chicago: Yèche, Hugo, Gideon Dresdner, Francesco Locatello, Matthias Hüser, and
Gunnar Rätsch. “Neighborhood Contrastive Learning Applied to Online Patient Monitoring.”
In Proceedings of 38th International Conference on Machine Learning, 139:11964–74.
ML Research Press, 2021.
ieee: H. Yèche, G. Dresdner, F. Locatello, M. Hüser, and G. Rätsch, “Neighborhood
contrastive learning applied to online patient monitoring,” in Proceedings
of 38th International Conference on Machine Learning, Virtual, 2021, vol.
139, pp. 11964–11974.
ista: Yèche H, Dresdner G, Locatello F, Hüser M, Rätsch G. 2021. Neighborhood contrastive
learning applied to online patient monitoring. Proceedings of 38th International
Conference on Machine Learning. International Conference on Machine Learning,
PMLR, vol. 139, 11964–11974.
mla: Yèche, Hugo, et al. “Neighborhood Contrastive Learning Applied to Online Patient
Monitoring.” Proceedings of 38th International Conference on Machine Learning,
vol. 139, ML Research Press, 2021, pp. 11964–74.
short: H. Yèche, G. Dresdner, F. Locatello, M. Hüser, G. Rätsch, in:, Proceedings
of 38th International Conference on Machine Learning, ML Research Press, 2021,
pp. 11964–11974.
conference:
end_date: 2021-07-24
location: Virtual
name: International Conference on Machine Learning
start_date: 2021-07-18
date_created: 2023-08-22T14:03:04Z
date_published: 2021-08-01T00:00:00Z
date_updated: 2023-09-11T10:16:55Z
day: '01'
department:
- _id: FrLo
extern: '1'
external_id:
arxiv:
- '2106.05142'
intvolume: ' 139'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://arxiv.org/abs/2106.05142
month: '08'
oa: 1
oa_version: Preprint
page: 11964-11974
publication: Proceedings of 38th International Conference on Machine Learning
publication_status: published
publisher: ML Research Press
quality_controlled: '1'
scopus_import: '1'
status: public
title: Neighborhood contrastive learning applied to online patient monitoring
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 139
year: '2021'
...
---
_id: '14182'
abstract:
- lang: eng
text: "When machine learning systems meet real world applications, accuracy is
only one of several requirements. In this paper, we assay a complementary perspective
originating from the increasing availability of pre-trained and regularly improving
state-of-the-art models. While new improved models develop at a fast pace, downstream
tasks vary more slowly or stay constant. Assume that we have a large unlabelled
data set for which we want to maintain accurate predictions. Whenever a new and
presumably better ML model becomes available, we encounter two problems: (i) given
a limited budget, which data points should be re-evaluated using the new model?;
and (ii) if the new predictions differ from the current ones, should we update?
Problem (i) is about compute cost, which matters for very large data sets and models.
Problem (ii) is about maintaining consistency of the predictions, which can be
highly relevant for downstream applications; our demand is to avoid negative flips,
i.e., changing correct to incorrect predictions. In this paper, we formalize the
Prediction Update Problem and present an efficient probabilistic approach as an
answer to the above questions. In extensive experiments on standard classification
benchmark data sets, we show that our method outperforms alternative strategies
along key metrics for backward-compatible prediction updates."
article_processing_charge: No
author:
- first_name: Frederik
full_name: Träuble, Frederik
last_name: Träuble
- first_name: Julius von
full_name: Kügelgen, Julius von
last_name: Kügelgen
- first_name: Matthäus
full_name: Kleindessner, Matthäus
last_name: Kleindessner
- first_name: Francesco
full_name: Locatello, Francesco
id: 26cfd52f-2483-11ee-8040-88983bcc06d4
last_name: Locatello
orcid: 0000-0002-4850-0683
- first_name: Bernhard
full_name: Schölkopf, Bernhard
last_name: Schölkopf
- first_name: Peter
full_name: Gehler, Peter
last_name: Gehler
citation:
ama: 'Träuble F, Kügelgen J von, Kleindessner M, Locatello F, Schölkopf B, Gehler
P. Backward-compatible prediction updates: A probabilistic approach. In: 35th
Conference on Neural Information Processing Systems. Vol 34. ; 2021:116-128.'
apa: 'Träuble, F., Kügelgen, J. von, Kleindessner, M., Locatello, F., Schölkopf,
B., & Gehler, P. (2021). Backward-compatible prediction updates: A probabilistic
approach. In 35th Conference on Neural Information Processing Systems (Vol.
34, pp. 116–128). Virtual.'
chicago: 'Träuble, Frederik, Julius von Kügelgen, Matthäus Kleindessner, Francesco
Locatello, Bernhard Schölkopf, and Peter Gehler. “Backward-Compatible Prediction
Updates: A Probabilistic Approach.” In 35th Conference on Neural Information
Processing Systems, 34:116–28, 2021.'
ieee: 'F. Träuble, J. von Kügelgen, M. Kleindessner, F. Locatello, B. Schölkopf,
and P. Gehler, “Backward-compatible prediction updates: A probabilistic approach,”
in 35th Conference on Neural Information Processing Systems, Virtual, 2021,
vol. 34, pp. 116–128.'
ista: 'Träuble F, Kügelgen J von, Kleindessner M, Locatello F, Schölkopf B, Gehler
P. 2021. Backward-compatible prediction updates: A probabilistic approach. 35th
Conference on Neural Information Processing Systems. NeurIPS: Neural Information
Processing Systems vol. 34, 116–128.'
mla: 'Träuble, Frederik, et al. “Backward-Compatible Prediction Updates: A Probabilistic
Approach.” 35th Conference on Neural Information Processing Systems, vol.
34, 2021, pp. 116–28.'
short: F. Träuble, J. von Kügelgen, M. Kleindessner, F. Locatello, B. Schölkopf,
P. Gehler, in:, 35th Conference on Neural Information Processing Systems, 2021,
pp. 116–128.
conference:
end_date: 2021-12-10
location: Virtual
name: 'NeurIPS: Neural Information Processing Systems'
start_date: 2021-12-07
date_created: 2023-08-22T14:05:41Z
date_published: 2021-07-02T00:00:00Z
date_updated: 2023-09-11T11:31:59Z
day: '02'
department:
- _id: FrLo
extern: '1'
external_id:
arxiv:
- '2107.01057'
intvolume: ' 34'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://arxiv.org/abs/2107.01057
month: '07'
oa: 1
oa_version: Preprint
page: 116-128
publication: 35th Conference on Neural Information Processing Systems
publication_identifier:
isbn:
- '9781713845393'
publication_status: published
quality_controlled: '1'
status: public
title: 'Backward-compatible prediction updates: A probabilistic approach'
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 34
year: '2021'
...
---
_id: '14181'
abstract:
- lang: eng
text: Variational Inference makes a trade-off between the capacity of the variational
family and the tractability of finding an approximate posterior distribution.
Instead, Boosting Variational Inference allows practitioners to obtain increasingly
good posterior approximations by spending more compute. The main obstacle to widespread
adoption of Boosting Variational Inference is the amount of resources necessary
to improve over a strong Variational Inference baseline. In our work, we trace
this limitation back to the global curvature of the KL-divergence. We characterize
how the global curvature impacts time and memory consumption, address the problem
with the notion of local curvature, and provide a novel approximate backtracking
algorithm for estimating local curvature. We give new theoretical convergence
rates for our algorithms and provide experimental validation on synthetic and
real-world datasets.
article_processing_charge: No
author:
- first_name: Gideon
full_name: Dresdner, Gideon
last_name: Dresdner
- first_name: Saurav
full_name: Shekhar, Saurav
last_name: Shekhar
- first_name: Fabian
full_name: Pedregosa, Fabian
last_name: Pedregosa
- first_name: Francesco
full_name: Locatello, Francesco
id: 26cfd52f-2483-11ee-8040-88983bcc06d4
last_name: Locatello
orcid: 0000-0002-4850-0683
- first_name: Gunnar
full_name: Rätsch, Gunnar
last_name: Rätsch
citation:
ama: 'Dresdner G, Shekhar S, Pedregosa F, Locatello F, Rätsch G. Boosting variational
inference with locally adaptive step-sizes. In: Proceedings of the Thirtieth
International Joint Conference on Artificial Intelligence. International Joint
Conferences on Artificial Intelligence; 2021:2337-2343. doi:10.24963/ijcai.2021/322'
apa: 'Dresdner, G., Shekhar, S., Pedregosa, F., Locatello, F., & Rätsch, G.
(2021). Boosting variational inference with locally adaptive step-sizes. In Proceedings
of the Thirtieth International Joint Conference on Artificial Intelligence
(pp. 2337–2343). Montreal, Canada: International Joint Conferences on Artificial
Intelligence. https://doi.org/10.24963/ijcai.2021/322'
chicago: Dresdner, Gideon, Saurav Shekhar, Fabian Pedregosa, Francesco Locatello,
and Gunnar Rätsch. “Boosting Variational Inference with Locally Adaptive Step-Sizes.”
In Proceedings of the Thirtieth International Joint Conference on Artificial
Intelligence, 2337–43. International Joint Conferences on Artificial Intelligence,
2021. https://doi.org/10.24963/ijcai.2021/322.
ieee: G. Dresdner, S. Shekhar, F. Pedregosa, F. Locatello, and G. Rätsch, “Boosting
variational inference with locally adaptive step-sizes,” in Proceedings of
the Thirtieth International Joint Conference on Artificial Intelligence, Montreal,
Canada, 2021, pp. 2337–2343.
ista: 'Dresdner G, Shekhar S, Pedregosa F, Locatello F, Rätsch G. 2021. Boosting
variational inference with locally adaptive step-sizes. Proceedings of the Thirtieth
International Joint Conference on Artificial Intelligence. IJCAI: International
Joint Conference on Artificial Intelligence, 2337–2343.'
mla: Dresdner, Gideon, et al. “Boosting Variational Inference with Locally Adaptive
Step-Sizes.” Proceedings of the Thirtieth International Joint Conference on
Artificial Intelligence, International Joint Conferences on Artificial Intelligence,
2021, pp. 2337–43, doi:10.24963/ijcai.2021/322.
short: G. Dresdner, S. Shekhar, F. Pedregosa, F. Locatello, G. Rätsch, in:, Proceedings
of the Thirtieth International Joint Conference on Artificial Intelligence, International
Joint Conferences on Artificial Intelligence, 2021, pp. 2337–2343.
conference:
end_date: 2021-08-27
location: Montreal, Canada
name: 'IJCAI: International Joint Conference on Artificial Intelligence'
start_date: 2021-08-19
date_created: 2023-08-22T14:05:14Z
date_published: 2021-05-19T00:00:00Z
date_updated: 2023-09-11T11:14:30Z
day: '19'
department:
- _id: FrLo
doi: 10.24963/ijcai.2021/322
extern: '1'
external_id:
arxiv:
- '2105.09240'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://doi.org/10.48550/arXiv.2105.09240
month: '05'
oa: 1
oa_version: Published Version
page: 2337-2343
publication: Proceedings of the Thirtieth International Joint Conference on Artificial
Intelligence
publication_identifier:
eisbn:
- '9780999241196'
publication_status: published
publisher: International Joint Conferences on Artificial Intelligence
quality_controlled: '1'
status: public
title: Boosting variational inference with locally adaptive step-sizes
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
year: '2021'
...
---
_id: '14179'
abstract:
- lang: eng
text: Self-supervised representation learning has shown remarkable success in a
number of domains. A common practice is to perform data augmentation via hand-crafted
transformations intended to leave the semantics of the data invariant. We seek
to understand the empirical success of this approach from a theoretical perspective.
We formulate the augmentation process as a latent variable model by postulating
a partition of the latent representation into a content component, which is assumed
invariant to augmentation, and a style component, which is allowed to change.
Unlike prior work on disentanglement and independent component analysis, we allow
for both nontrivial statistical and causal dependencies in the latent space. We
study the identifiability of the latent representation based on pairs of views
of the observations and prove sufficient conditions that allow us to identify
the invariant content partition up to an invertible mapping in both generative
and discriminative settings. We find numerical simulations with dependent latent
variables are consistent with our theory. Lastly, we introduce Causal3DIdent,
a dataset of high-dimensional, visually complex images with rich causal dependencies,
which we use to study the effect of data augmentations performed in practice.
article_processing_charge: No
author:
- first_name: Julius von
full_name: Kügelgen, Julius von
last_name: Kügelgen
- first_name: Yash
full_name: Sharma, Yash
last_name: Sharma
- first_name: Luigi
full_name: Gresele, Luigi
last_name: Gresele
- first_name: Wieland
full_name: Brendel, Wieland
last_name: Brendel
- first_name: Bernhard
full_name: Schölkopf, Bernhard
last_name: Schölkopf
- first_name: Michel
full_name: Besserve, Michel
last_name: Besserve
- first_name: Francesco
full_name: Locatello, Francesco
id: 26cfd52f-2483-11ee-8040-88983bcc06d4
last_name: Locatello
orcid: 0000-0002-4850-0683
citation:
ama: 'Kügelgen J von, Sharma Y, Gresele L, et al. Self-supervised learning with
data augmentations provably isolates content from style. In: Advances in Neural
Information Processing Systems. Vol 34. ; 2021:16451-16467.'
apa: Kügelgen, J. von, Sharma, Y., Gresele, L., Brendel, W., Schölkopf, B., Besserve,
M., & Locatello, F. (2021). Self-supervised learning with data augmentations
provably isolates content from style. In Advances in Neural Information Processing
Systems (Vol. 34, pp. 16451–16467). Virtual.
chicago: Kügelgen, Julius von, Yash Sharma, Luigi Gresele, Wieland Brendel, Bernhard
Schölkopf, Michel Besserve, and Francesco Locatello. “Self-Supervised Learning
with Data Augmentations Provably Isolates Content from Style.” In Advances
in Neural Information Processing Systems, 34:16451–67, 2021.
ieee: J. von Kügelgen et al., “Self-supervised learning with data augmentations
provably isolates content from style,” in Advances in Neural Information Processing
Systems, Virtual, 2021, vol. 34, pp. 16451–16467.
ista: 'Kügelgen J von, Sharma Y, Gresele L, Brendel W, Schölkopf B, Besserve M,
Locatello F. 2021. Self-supervised learning with data augmentations provably isolates
content from style. Advances in Neural Information Processing Systems. NeurIPS:
Neural Information Processing Systems vol. 34, 16451–16467.'
mla: Kügelgen, Julius von, et al. “Self-Supervised Learning with Data Augmentations
Provably Isolates Content from Style.” Advances in Neural Information Processing
Systems, vol. 34, 2021, pp. 16451–67.
short: J. von Kügelgen, Y. Sharma, L. Gresele, W. Brendel, B. Schölkopf, M. Besserve,
F. Locatello, in:, Advances in Neural Information Processing Systems, 2021, pp.
16451–16467.
conference:
end_date: 2021-12-10
location: Virtual
name: 'NeurIPS: Neural Information Processing Systems'
start_date: 2021-12-07
date_created: 2023-08-22T14:04:36Z
date_published: 2021-06-08T00:00:00Z
date_updated: 2023-09-11T10:33:19Z
day: '08'
department:
- _id: FrLo
extern: '1'
external_id:
arxiv:
- '2106.04619'
intvolume: ' 34'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://arxiv.org/abs/2106.04619
month: '06'
oa: 1
oa_version: Preprint
page: 16451-16467
publication: Advances in Neural Information Processing Systems
publication_identifier:
isbn:
- '9781713845393'
publication_status: published
quality_controlled: '1'
status: public
title: Self-supervised learning with data augmentations provably isolates content
from style
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 34
year: '2021'
...
---
_id: '14180'
abstract:
- lang: eng
text: 'Modern neural network architectures can leverage large amounts of data to
generalize well within the training distribution. However, they are less capable
of systematic generalization to data drawn from unseen but related distributions,
a feat that is hypothesized to require compositional reasoning and reuse of knowledge.
In this work, we present Neural Interpreters, an architecture that factorizes
inference in a self-attention network as a system of modules, which we call “functions”.
Inputs to the model are routed through a sequence of functions in a way that is
end-to-end learned. The proposed architecture can flexibly compose computation
along width and depth, and lends itself well to capacity extension after training.
To demonstrate the versatility of Neural Interpreters, we evaluate it in two distinct
settings: image classification and visual abstract reasoning on Raven Progressive
Matrices. In the former, we show that Neural Interpreters perform on par with
the vision transformer using fewer parameters, while being transferrable to a
new task in a sample efficient manner. In the latter, we find that Neural Interpreters
are competitive with respect to the state-of-the-art in terms of systematic generalization.'
article_processing_charge: No
author:
- first_name: Nasim
full_name: Rahaman, Nasim
last_name: Rahaman
- first_name: Muhammad Waleed
full_name: Gondal, Muhammad Waleed
last_name: Gondal
- first_name: Shruti
full_name: Joshi, Shruti
last_name: Joshi
- first_name: Peter
full_name: Gehler, Peter
last_name: Gehler
- first_name: Yoshua
full_name: Bengio, Yoshua
last_name: Bengio
- first_name: Francesco
full_name: Locatello, Francesco
id: 26cfd52f-2483-11ee-8040-88983bcc06d4
last_name: Locatello
orcid: 0000-0002-4850-0683
- first_name: Bernhard
full_name: Schölkopf, Bernhard
last_name: Schölkopf
citation:
ama: 'Rahaman N, Gondal MW, Joshi S, et al. Dynamic inference with neural interpreters.
In: Advances in Neural Information Processing Systems. Vol 34. ; 2021:10985-10998.'
apa: Rahaman, N., Gondal, M. W., Joshi, S., Gehler, P., Bengio, Y., Locatello, F.,
& Schölkopf, B. (2021). Dynamic inference with neural interpreters. In Advances
in Neural Information Processing Systems (Vol. 34, pp. 10985–10998). Virtual.
chicago: Rahaman, Nasim, Muhammad Waleed Gondal, Shruti Joshi, Peter Gehler, Yoshua
Bengio, Francesco Locatello, and Bernhard Schölkopf. “Dynamic Inference with Neural
Interpreters.” In Advances in Neural Information Processing Systems, 34:10985–98,
2021.
ieee: N. Rahaman et al., “Dynamic inference with neural interpreters,” in
Advances in Neural Information Processing Systems, Virtual, 2021, vol.
34, pp. 10985–10998.
ista: 'Rahaman N, Gondal MW, Joshi S, Gehler P, Bengio Y, Locatello F, Schölkopf
B. 2021. Dynamic inference with neural interpreters. Advances in Neural Information
Processing Systems. NeurIPS: Neural Information Processing Systems vol. 34, 10985–10998.'
mla: Rahaman, Nasim, et al. “Dynamic Inference with Neural Interpreters.” Advances
in Neural Information Processing Systems, vol. 34, 2021, pp. 10985–98.
short: N. Rahaman, M.W. Gondal, S. Joshi, P. Gehler, Y. Bengio, F. Locatello, B.
Schölkopf, in:, Advances in Neural Information Processing Systems, 2021, pp. 10985–10998.
conference:
end_date: 2021-12-10
location: Virtual
name: 'NeurIPS: Neural Information Processing Systems'
start_date: 2021-12-07
date_created: 2023-08-22T14:04:55Z
date_published: 2021-10-12T00:00:00Z
date_updated: 2023-09-11T11:33:46Z
day: '12'
department:
- _id: FrLo
extern: '1'
external_id:
arxiv:
- '2110.06399'
intvolume: ' 34'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://doi.org/10.48550/arXiv.2110.06399
month: '10'
oa: 1
oa_version: Preprint
page: 10985-10998
publication: Advances in Neural Information Processing Systems
publication_identifier:
isbn:
- '9781713845393'
publication_status: published
quality_controlled: '1'
status: public
title: Dynamic inference with neural interpreters
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 34
year: '2021'
...
---
_id: '14117'
abstract:
- lang: eng
text: 'The two fields of machine learning and graphical causality arose and are
developed separately. However, there is, now, cross-pollination and increasing
interest in both fields to benefit from the advances of the other. In this article,
we review fundamental concepts of causal inference and relate them to crucial
open problems of machine learning, including transfer and generalization, thereby
assaying how causality can contribute to modern machine learning research. This
also applies in the opposite direction: we note that most work in causality starts
from the premise that the causal variables are given. A central problem for AI
and causality is, thus, causal representation learning, that is, the discovery
of high-level causal variables from low-level observations. Finally, we delineate
some implications of causality for machine learning and propose key research areas
at the intersection of both communities.'
article_processing_charge: No
article_type: original
author:
- first_name: Bernhard
full_name: Scholkopf, Bernhard
last_name: Scholkopf
- first_name: Francesco
full_name: Locatello, Francesco
id: 26cfd52f-2483-11ee-8040-88983bcc06d4
last_name: Locatello
orcid: 0000-0002-4850-0683
- first_name: Stefan
full_name: Bauer, Stefan
last_name: Bauer
- first_name: Nan Rosemary
full_name: Ke, Nan Rosemary
last_name: Ke
- first_name: Nal
full_name: Kalchbrenner, Nal
last_name: Kalchbrenner
- first_name: Anirudh
full_name: Goyal, Anirudh
last_name: Goyal
- first_name: Yoshua
full_name: Bengio, Yoshua
last_name: Bengio
citation:
ama: Scholkopf B, Locatello F, Bauer S, et al. Toward causal representation learning.
Proceedings of the IEEE. 2021;109(5):612-634. doi:10.1109/jproc.2021.3058954
apa: Scholkopf, B., Locatello, F., Bauer, S., Ke, N. R., Kalchbrenner, N., Goyal,
A., & Bengio, Y. (2021). Toward causal representation learning. Proceedings
of the IEEE. Institute of Electrical and Electronics Engineers. https://doi.org/10.1109/jproc.2021.3058954
chicago: Scholkopf, Bernhard, Francesco Locatello, Stefan Bauer, Nan Rosemary Ke,
Nal Kalchbrenner, Anirudh Goyal, and Yoshua Bengio. “Toward Causal Representation
Learning.” Proceedings of the IEEE. Institute of Electrical and Electronics
Engineers, 2021. https://doi.org/10.1109/jproc.2021.3058954.
ieee: B. Scholkopf et al., “Toward causal representation learning,” Proceedings
of the IEEE, vol. 109, no. 5. Institute of Electrical and Electronics Engineers,
pp. 612–634, 2021.
ista: Scholkopf B, Locatello F, Bauer S, Ke NR, Kalchbrenner N, Goyal A, Bengio
Y. 2021. Toward causal representation learning. Proceedings of the IEEE. 109(5),
612–634.
mla: Scholkopf, Bernhard, et al. “Toward Causal Representation Learning.” Proceedings
of the IEEE, vol. 109, no. 5, Institute of Electrical and Electronics Engineers,
2021, pp. 612–34, doi:10.1109/jproc.2021.3058954.
short: B. Scholkopf, F. Locatello, S. Bauer, N.R. Ke, N. Kalchbrenner, A. Goyal,
Y. Bengio, Proceedings of the IEEE 109 (2021) 612–634.
date_created: 2023-08-21T12:19:30Z
date_published: 2021-05-01T00:00:00Z
date_updated: 2023-09-11T11:43:35Z
day: '01'
department:
- _id: FrLo
doi: 10.1109/jproc.2021.3058954
extern: '1'
external_id:
arxiv:
- '2102.11107'
intvolume: ' 109'
issue: '5'
keyword:
- Electrical and Electronic Engineering
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://doi.org/10.1109/JPROC.2021.3058954
month: '05'
oa: 1
oa_version: Published Version
page: 612-634
publication: Proceedings of the IEEE
publication_identifier:
eissn:
- 1558-2256
issn:
- 0018-9219
publication_status: published
publisher: Institute of Electrical and Electronics Engineers
quality_controlled: '1'
scopus_import: '1'
status: public
title: Toward causal representation learning
type: journal_article
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 109
year: '2021'
...
---
_id: '14178'
abstract:
- lang: eng
text: Learning meaningful representations that disentangle the underlying structure
of the data generating process is considered to be of key importance in machine
learning. While disentangled representations were found to be useful for diverse
tasks such as abstract reasoning and fair classification, their scalability and
real-world impact remain questionable. We introduce a new high-resolution dataset
with 1M simulated images and over 1,800 annotated real-world images of the same
setup. In contrast to previous work, this new dataset exhibits correlations, a
complex underlying structure, and allows evaluating transfer to unseen simulated
and real-world settings where the encoder i) remains in distribution or ii) is
out of distribution. We propose new architectures in order to scale disentangled
representation learning to realistic high-resolution settings and conduct a large-scale
empirical study of disentangled representations on this dataset. We observe that
disentanglement is a good predictor for out-of-distribution (OOD) task performance.
article_processing_charge: No
author:
- first_name: Andrea
full_name: Dittadi, Andrea
last_name: Dittadi
- first_name: Frederik
full_name: Träuble, Frederik
last_name: Träuble
- first_name: Francesco
full_name: Locatello, Francesco
id: 26cfd52f-2483-11ee-8040-88983bcc06d4
last_name: Locatello
orcid: 0000-0002-4850-0683
- first_name: Manuel
full_name: Wüthrich, Manuel
last_name: Wüthrich
- first_name: Vaibhav
full_name: Agrawal, Vaibhav
last_name: Agrawal
- first_name: Ole
full_name: Winther, Ole
last_name: Winther
- first_name: Stefan
full_name: Bauer, Stefan
last_name: Bauer
- first_name: Bernhard
full_name: Schölkopf, Bernhard
last_name: Schölkopf
citation:
ama: 'Dittadi A, Träuble F, Locatello F, et al. On the transfer of disentangled
representations in realistic settings. In: The Ninth International Conference
on Learning Representations. ; 2021.'
apa: Dittadi, A., Träuble, F., Locatello, F., Wüthrich, M., Agrawal, V., Winther,
O., … Schölkopf, B. (2021). On the transfer of disentangled representations in
realistic settings. In The Ninth International Conference on Learning Representations.
Virtual.
chicago: Dittadi, Andrea, Frederik Träuble, Francesco Locatello, Manuel Wüthrich,
Vaibhav Agrawal, Ole Winther, Stefan Bauer, and Bernhard Schölkopf. “On the Transfer
of Disentangled Representations in Realistic Settings.” In The Ninth International
Conference on Learning Representations, 2021.
ieee: A. Dittadi et al., “On the transfer of disentangled representations
in realistic settings,” in The Ninth International Conference on Learning Representations,
Virtual, 2021.
ista: 'Dittadi A, Träuble F, Locatello F, Wüthrich M, Agrawal V, Winther O, Bauer
S, Schölkopf B. 2021. On the transfer of disentangled representations in realistic
settings. The Ninth International Conference on Learning Representations. ICLR:
International Conference on Learning Representations.'
mla: Dittadi, Andrea, et al. “On the Transfer of Disentangled Representations in
Realistic Settings.” The Ninth International Conference on Learning Representations,
2021.
short: A. Dittadi, F. Träuble, F. Locatello, M. Wüthrich, V. Agrawal, O. Winther,
S. Bauer, B. Schölkopf, in:, The Ninth International Conference on Learning Representations,
2021.
conference:
end_date: 2021-05-07
location: Virtual
name: 'ICLR: International Conference on Learning Representations'
start_date: 2021-05-03
date_created: 2023-08-22T14:04:16Z
date_published: 2021-05-04T00:00:00Z
date_updated: 2023-09-11T10:55:30Z
day: '04'
department:
- _id: FrLo
extern: '1'
external_id:
arxiv:
- '2010.14407'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://arxiv.org/abs/2010.14407
month: '05'
oa: 1
oa_version: Preprint
publication: The Ninth International Conference on Learning Representations
publication_status: published
quality_controlled: '1'
status: public
title: On the transfer of disentangled representations in realistic settings
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
year: '2021'
...
---
_id: '14221'
abstract:
- lang: eng
text: 'The world is structured in countless ways. It may be prudent to enforce corresponding
structural properties to a learning algorithm''s solution, such as incorporating
prior beliefs, natural constraints, or causal structures. Doing so may translate
to faster, more accurate, and more flexible models, which may directly relate
to real-world impact. In this dissertation, we consider two different research
areas that concern structuring a learning algorithm''s solution: when the structure
is known and when it has to be discovered.'
article_number: '2111.13693'
article_processing_charge: No
author:
- first_name: Francesco
full_name: Locatello, Francesco
id: 26cfd52f-2483-11ee-8040-88983bcc06d4
last_name: Locatello
orcid: 0000-0002-4850-0683
citation:
ama: Locatello F. Enforcing and discovering structure in machine learning. arXiv.
doi:10.48550/arXiv.2111.13693
apa: Locatello, F. (n.d.). Enforcing and discovering structure in machine learning.
arXiv. https://doi.org/10.48550/arXiv.2111.13693
chicago: Locatello, Francesco. “Enforcing and Discovering Structure in Machine Learning.”
ArXiv, n.d. https://doi.org/10.48550/arXiv.2111.13693.
ieee: F. Locatello, “Enforcing and discovering structure in machine learning,” arXiv.
.
ista: Locatello F. Enforcing and discovering structure in machine learning. arXiv,
2111.13693.
mla: Locatello, Francesco. “Enforcing and Discovering Structure in Machine Learning.”
ArXiv, 2111.13693, doi:10.48550/arXiv.2111.13693.
short: F. Locatello, ArXiv (n.d.).
date_created: 2023-08-22T14:23:35Z
date_published: 2021-11-26T00:00:00Z
date_updated: 2023-09-12T07:04:44Z
day: '26'
department:
- _id: FrLo
doi: 10.48550/arXiv.2111.13693
extern: '1'
external_id:
arxiv:
- '2111.13693'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://doi.org/10.48550/arXiv.2111.13693
month: '11'
oa: 1
oa_version: Preprint
publication: arXiv
publication_status: submitted
status: public
title: Enforcing and discovering structure in machine learning
type: preprint
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
year: '2021'
...
---
_id: '14278'
abstract:
- lang: eng
text: 'The Birkhoff conjecture says that the boundary of a strictly convex integrable
billiard table is necessarily an ellipse. In this article, we consider a stronger
notion of integrability, namely, integrability close to the boundary, and prove
a local version of this conjecture: a small perturbation of almost every ellipse
that preserves integrability near the boundary is itself an ellipse. We apply
this result to study local spectral rigidity of ellipses using the connection
between the wave trace of the Laplacian and the dynamics near the boundary and
establish rigidity for almost all of them.'
article_number: '2111.12171'
article_processing_charge: No
author:
- first_name: Illya
full_name: Koval, Illya
id: 2eed1f3b-896a-11ed-bdf8-93c7c4bf159e
last_name: Koval
citation:
ama: Koval I. Local strong Birkhoff conjecture and local spectral rigidity of almost
every ellipse. arXiv. doi:10.48550/ARXIV.2111.12171
apa: Koval, I. (n.d.). Local strong Birkhoff conjecture and local spectral rigidity
of almost every ellipse. arXiv. https://doi.org/10.48550/ARXIV.2111.12171
chicago: Koval, Illya. “Local Strong Birkhoff Conjecture and Local Spectral Rigidity
of Almost Every Ellipse.” ArXiv, n.d. https://doi.org/10.48550/ARXIV.2111.12171.
ieee: I. Koval, “Local strong Birkhoff conjecture and local spectral rigidity of
almost every ellipse,” arXiv. .
ista: Koval I. Local strong Birkhoff conjecture and local spectral rigidity of almost
every ellipse. arXiv, 2111.12171.
mla: Koval, Illya. “Local Strong Birkhoff Conjecture and Local Spectral Rigidity
of Almost Every Ellipse.” ArXiv, 2111.12171, doi:10.48550/ARXIV.2111.12171.
short: I. Koval, ArXiv (n.d.).
date_created: 2023-09-06T08:35:43Z
date_published: 2021-11-23T00:00:00Z
date_updated: 2023-09-15T06:44:00Z
day: '23'
department:
- _id: GradSch
doi: 10.48550/ARXIV.2111.12171
external_id:
arxiv:
- '2111.12171'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://doi.org/10.48550/arXiv.2111.12171
month: '11'
oa: 1
oa_version: Preprint
publication: arXiv
publication_status: submitted
status: public
title: Local strong Birkhoff conjecture and local spectral rigidity of almost every
ellipse
type: preprint
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
year: '2021'
...
---
_id: '10199'
abstract:
- lang: eng
text: The design and verification of concurrent systems remains an open challenge
due to the non-determinism that arises from the inter-process communication. In
particular, concurrent programs are notoriously difficult both to be written correctly
and to be analyzed formally, as complex thread interaction has to be accounted
for. The difficulties are further exacerbated when concurrent programs get executed
on modern-day hardware, which contains various buffering and caching mechanisms
for efficiency reasons. This causes further subtle non-determinism, which can
often produce very unintuitive behavior of the concurrent programs. Model checking
is at the forefront of tackling the verification problem, where the task is to
decide, given as input a concurrent system and a desired property, whether the
system satisfies the property. The inherent state-space explosion problem in model
checking of concurrent systems causes naïve explicit methods not to scale, thus
more inventive methods are required. One such method is stateless model checking
(SMC), which explores in memory-efficient manner the program executions rather
than the states of the program. State-of-the-art SMC is typically coupled with
partial order reduction (POR) techniques, which argue that certain executions
provably produce identical system behavior, thus limiting the amount of executions
one needs to explore in order to cover all possible behaviors. Another method
to tackle the state-space explosion is symbolic model checking, where the considered
techniques operate on a succinct implicit representation of the input system rather
than explicitly accessing the system. In this thesis we present new techniques
for verification of concurrent systems. We present several novel POR methods for
SMC of concurrent programs under various models of semantics, some of which account
for write-buffering mechanisms. Additionally, we present novel algorithms for
symbolic model checking of finite-state concurrent systems, where the desired
property of the systems is to ensure a formally defined notion of fairness.
acknowledged_ssus:
- _id: SSU
alternative_title:
- ISTA Thesis
article_processing_charge: No
author:
- first_name: Viktor
full_name: Toman, Viktor
id: 3AF3DA7C-F248-11E8-B48F-1D18A9856A87
last_name: Toman
orcid: 0000-0001-9036-063X
citation:
ama: Toman V. Improved verification techniques for concurrent systems. 2021. doi:10.15479/at:ista:10199
apa: Toman, V. (2021). Improved verification techniques for concurrent systems.
Institute of Science and Technology Austria. https://doi.org/10.15479/at:ista:10199
chicago: Toman, Viktor. “Improved Verification Techniques for Concurrent Systems.”
Institute of Science and Technology Austria, 2021. https://doi.org/10.15479/at:ista:10199.
ieee: V. Toman, “Improved verification techniques for concurrent systems,” Institute
of Science and Technology Austria, 2021.
ista: Toman V. 2021. Improved verification techniques for concurrent systems. Institute
of Science and Technology Austria.
mla: Toman, Viktor. Improved Verification Techniques for Concurrent Systems.
Institute of Science and Technology Austria, 2021, doi:10.15479/at:ista:10199.
short: V. Toman, Improved Verification Techniques for Concurrent Systems, Institute
of Science and Technology Austria, 2021.
date_created: 2021-10-29T20:09:01Z
date_published: 2021-10-31T00:00:00Z
date_updated: 2023-09-19T09:59:54Z
day: '31'
ddc:
- '000'
degree_awarded: PhD
department:
- _id: GradSch
- _id: KrCh
doi: 10.15479/at:ista:10199
ec_funded: 1
file:
- access_level: open_access
checksum: 4f412a1ee60952221b499a4b1268df35
content_type: application/pdf
creator: vtoman
date_created: 2021-11-08T14:12:22Z
date_updated: 2021-11-08T14:12:22Z
file_id: '10225'
file_name: toman_th_final.pdf
file_size: 2915234
relation: main_file
- access_level: closed
checksum: 9584943f99127be2dd2963f6784c37d4
content_type: application/zip
creator: vtoman
date_created: 2021-11-08T14:12:46Z
date_updated: 2021-11-09T09:00:50Z
file_id: '10226'
file_name: toman_thesis.zip
file_size: 8616056
relation: source_file
file_date_updated: 2021-11-09T09:00:50Z
has_accepted_license: '1'
keyword:
- concurrency
- verification
- model checking
language:
- iso: eng
month: '10'
oa: 1
oa_version: Published Version
page: '166'
project:
- _id: 2564DBCA-B435-11E9-9278-68D0E5697425
call_identifier: H2020
grant_number: '665385'
name: International IST Doctoral Program
- _id: 25F2ACDE-B435-11E9-9278-68D0E5697425
call_identifier: FWF
grant_number: S11402-N23
name: Rigorous Systems Engineering
- _id: 25892FC0-B435-11E9-9278-68D0E5697425
grant_number: ICT15-003
name: Efficient Algorithms for Computer Aided Verification
- _id: 0599E47C-7A3F-11EA-A408-12923DDC885E
call_identifier: H2020
grant_number: '863818'
name: 'Formal Methods for Stochastic Models: Algorithms and Applications'
publication_identifier:
issn:
- 2663-337X
publication_status: published
publisher: Institute of Science and Technology Austria
related_material:
record:
- id: '10190'
relation: part_of_dissertation
status: public
- id: '10191'
relation: part_of_dissertation
status: public
- id: '9987'
relation: part_of_dissertation
status: public
- id: '141'
relation: part_of_dissertation
status: public
status: public
supervisor:
- first_name: Krishnendu
full_name: Chatterjee, Krishnendu
id: 2E5DCA20-F248-11E8-B48F-1D18A9856A87
last_name: Chatterjee
orcid: 0000-0002-4561-241X
title: Improved verification techniques for concurrent systems
type: dissertation
user_id: c635000d-4b10-11ee-a964-aac5a93f6ac1
year: '2021'
...
---
_id: '8429'
abstract:
- lang: eng
text: We develop a Bayesian model (BayesRR-RC) that provides robust SNP-heritability
estimation, an alternative to marker discovery, and accurate genomic prediction,
taking 22 seconds per iteration to estimate 8.4 million SNP-effects and 78 SNP-heritability
parameters in the UK Biobank. We find that only ≤10% of the genetic variation
captured for height, body mass index, cardiovascular disease, and type 2 diabetes
is attributable to proximal regulatory regions within 10kb upstream of genes,
while 12-25% is attributed to coding regions, 32–44% to introns, and 22-28% to
distal 10-500kb upstream regions. Up to 24% of all cis and coding regions of each
chromosome are associated with each trait, with over 3,100 independent exonic
and intronic regions and over 5,400 independent regulatory regions having ≥95%
probability of contributing ≥0.001% to the genetic variance of these four traits.
Our open-source software (GMRM) provides a scalable alternative to current approaches
for biobank data.
acknowledgement: This project was funded by an SNSF Eccellenza Grant to MRR (PCEGP3-181181),
and by core funding from the Institute of Science and Technology Austria. We would
like to thank the participants of the cohort studies, and the Ecole Polytechnique
Federal Lausanne (EPFL) SCITAS for their excellent compute resources, their generosity
with their time and the kindness of their support. P.M.V. acknowledges funding from
the Australian National Health and Medical Research Council (1113400) and the Australian
Research Council (FL180100072). L.R. acknowledges funding from the Kjell & Märta
Beijer Foundation (Stockholm, Sweden). We also would like to acknowledge Simone
Rubinacci, Oliver Delanau, Alexander Terenin, Eleonora Porcu, and Mike Goddard for
their useful comments and suggestions.
article_number: '6972'
article_processing_charge: No
article_type: original
author:
- first_name: Marion
full_name: Patxot, Marion
last_name: Patxot
- first_name: Daniel
full_name: Trejo Banos, Daniel
last_name: Trejo Banos
- first_name: Athanasios
full_name: Kousathanas, Athanasios
last_name: Kousathanas
- first_name: Etienne J
full_name: Orliac, Etienne J
last_name: Orliac
- first_name: Sven E
full_name: Ojavee, Sven E
last_name: Ojavee
- first_name: Gerhard
full_name: Moser, Gerhard
last_name: Moser
- first_name: Julia
full_name: Sidorenko, Julia
last_name: Sidorenko
- first_name: Zoltan
full_name: Kutalik, Zoltan
last_name: Kutalik
- first_name: Reedik
full_name: Magi, Reedik
last_name: Magi
- first_name: Peter M
full_name: Visscher, Peter M
last_name: Visscher
- first_name: Lars
full_name: Ronnegard, Lars
last_name: Ronnegard
- first_name: Matthew Richard
full_name: Robinson, Matthew Richard
id: E5D42276-F5DA-11E9-8E24-6303E6697425
last_name: Robinson
orcid: 0000-0001-8982-8813
citation:
ama: Patxot M, Trejo Banos D, Kousathanas A, et al. Probabilistic inference of the
genetic architecture underlying functional enrichment of complex traits. Nature
Communications. 2021;12(1). doi:10.1038/s41467-021-27258-9
apa: Patxot, M., Trejo Banos, D., Kousathanas, A., Orliac, E. J., Ojavee, S. E.,
Moser, G., … Robinson, M. R. (2021). Probabilistic inference of the genetic architecture
underlying functional enrichment of complex traits. Nature Communications.
Springer Nature. https://doi.org/10.1038/s41467-021-27258-9
chicago: Patxot, Marion, Daniel Trejo Banos, Athanasios Kousathanas, Etienne J Orliac,
Sven E Ojavee, Gerhard Moser, Julia Sidorenko, et al. “Probabilistic Inference
of the Genetic Architecture Underlying Functional Enrichment of Complex Traits.”
Nature Communications. Springer Nature, 2021. https://doi.org/10.1038/s41467-021-27258-9.
ieee: M. Patxot et al., “Probabilistic inference of the genetic architecture
underlying functional enrichment of complex traits,” Nature Communications,
vol. 12, no. 1. Springer Nature, 2021.
ista: Patxot M, Trejo Banos D, Kousathanas A, Orliac EJ, Ojavee SE, Moser G, Sidorenko
J, Kutalik Z, Magi R, Visscher PM, Ronnegard L, Robinson MR. 2021. Probabilistic
inference of the genetic architecture underlying functional enrichment of complex
traits. Nature Communications. 12(1), 6972.
mla: Patxot, Marion, et al. “Probabilistic Inference of the Genetic Architecture
Underlying Functional Enrichment of Complex Traits.” Nature Communications,
vol. 12, no. 1, 6972, Springer Nature, 2021, doi:10.1038/s41467-021-27258-9.
short: M. Patxot, D. Trejo Banos, A. Kousathanas, E.J. Orliac, S.E. Ojavee, G. Moser,
J. Sidorenko, Z. Kutalik, R. Magi, P.M. Visscher, L. Ronnegard, M.R. Robinson,
Nature Communications 12 (2021).
date_created: 2020-09-17T10:52:38Z
date_published: 2021-11-30T00:00:00Z
date_updated: 2023-09-26T10:36:14Z
day: '30'
ddc:
- '610'
department:
- _id: MaRo
doi: 10.1038/s41467-021-27258-9
external_id:
isi:
- '000724450600023'
file:
- access_level: open_access
checksum: 384681be17aff902c149a48f52d13d4f
content_type: application/pdf
creator: cchlebak
date_created: 2021-12-06T07:47:11Z
date_updated: 2021-12-06T07:47:11Z
file_id: '10419'
file_name: 2021_NatComm_Paxtot.pdf
file_size: 6519771
relation: main_file
success: 1
file_date_updated: 2021-12-06T07:47:11Z
has_accepted_license: '1'
intvolume: ' 12'
isi: 1
issue: '1'
language:
- iso: eng
license: https://creativecommons.org/licenses/by/4.0/
month: '11'
oa: 1
oa_version: Published Version
publication: Nature Communications
publication_identifier:
eissn:
- 2041-1723
publication_status: published
publisher: Springer Nature
quality_controlled: '1'
related_material:
record:
- id: '13063'
relation: research_data
status: public
scopus_import: '1'
status: public
title: Probabilistic inference of the genetic architecture underlying functional enrichment
of complex traits
tmp:
image: /images/cc_by.png
legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
short: CC BY (4.0)
type: journal_article
user_id: 4359f0d1-fa6c-11eb-b949-802e58b17ae8
volume: 12
year: '2021'
...
---
_id: '10854'
abstract:
- lang: eng
text: "Consider a distributed task where the communication network is fixed but
the local inputs given to the nodes of the distributed system may change over
time. In this work, we explore the following question: if some of the local inputs
change, can an existing solution be updated efficiently, in a dynamic and distributed
manner?\r\nTo address this question, we define the batch dynamic CONGEST model
in which we are given a bandwidth-limited communication network and a dynamic
edge labelling defines the problem input. The task is to maintain a solution to
a graph problem on the labelled graph under batch changes. We investigate, when
a batch of alpha edge label changes arrive, - how much time as a function of alpha
we need to update an existing solution, and - how much information the nodes have
to keep in local memory between batches in order to update the solution quickly.\r\nOur
work lays the foundations for the theory of input-dynamic distributed network
algorithms. We give a general picture of the complexity landscape in this model,
design both universal algorithms and algorithms for concrete problems, and present
a general framework for lower bounds. The diverse time complexity of our model
spans from constant time, through time polynomial in alpha, up to alpha time,
which we show to be enough for any task."
acknowledgement: We thank Jukka Suomela for discussions. We also thank our shepherd
Mohammad Hajiesmaili and the reviewers for their time and suggestions on how to
improve the paper. This project has received funding from the European Research
Council (ERC) under the European Union’s Horizon 2020 research and innovation programme
(grant agreement No 805223 ScaleML), from the European Union’s Horizon 2020 research
and innovation programme under the Marie Skłodowska–Curie grant agreement No. 840605,
from the Vienna Science and Technology Fund (WWTF) project WHATIF, ICT19-045, 2020-2024,
and from the Austrian Science Fund (FWF) and netIDEE SCIENCE project P 33775-N.
article_processing_charge: No
author:
- first_name: Klaus-Tycho
full_name: Foerster, Klaus-Tycho
last_name: Foerster
- first_name: Janne
full_name: Korhonen, Janne
id: C5402D42-15BC-11E9-A202-CA2BE6697425
last_name: Korhonen
- first_name: Ami
full_name: Paz, Ami
last_name: Paz
- first_name: Joel
full_name: Rybicki, Joel
id: 334EFD2E-F248-11E8-B48F-1D18A9856A87
last_name: Rybicki
orcid: 0000-0002-6432-6646
- first_name: Stefan
full_name: Schmid, Stefan
last_name: Schmid
citation:
ama: 'Foerster K-T, Korhonen J, Paz A, Rybicki J, Schmid S. Input-dynamic distributed
algorithms for communication networks. In: Abstract Proceedings of the 2021
ACM SIGMETRICS / International Conference on Measurement and Modeling of Computer
Systems. Association for Computing Machinery; 2021:71-72. doi:10.1145/3410220.3453923'
apa: 'Foerster, K.-T., Korhonen, J., Paz, A., Rybicki, J., & Schmid, S. (2021).
Input-dynamic distributed algorithms for communication networks. In Abstract
Proceedings of the 2021 ACM SIGMETRICS / International Conference on Measurement
and Modeling of Computer Systems (pp. 71–72). Virtual, Online: Association
for Computing Machinery. https://doi.org/10.1145/3410220.3453923'
chicago: Foerster, Klaus-Tycho, Janne Korhonen, Ami Paz, Joel Rybicki, and Stefan
Schmid. “Input-Dynamic Distributed Algorithms for Communication Networks.” In
Abstract Proceedings of the 2021 ACM SIGMETRICS / International Conference
on Measurement and Modeling of Computer Systems, 71–72. Association for Computing
Machinery, 2021. https://doi.org/10.1145/3410220.3453923.
ieee: K.-T. Foerster, J. Korhonen, A. Paz, J. Rybicki, and S. Schmid, “Input-dynamic
distributed algorithms for communication networks,” in Abstract Proceedings
of the 2021 ACM SIGMETRICS / International Conference on Measurement and Modeling
of Computer Systems, Virtual, Online, 2021, pp. 71–72.
ista: 'Foerster K-T, Korhonen J, Paz A, Rybicki J, Schmid S. 2021. Input-dynamic
distributed algorithms for communication networks. Abstract Proceedings of the
2021 ACM SIGMETRICS / International Conference on Measurement and Modeling of
Computer Systems. SIGMETRICS: International Conference on Measurement and Modeling
of Computer Systems, 71–72.'
mla: Foerster, Klaus-Tycho, et al. “Input-Dynamic Distributed Algorithms for Communication
Networks.” Abstract Proceedings of the 2021 ACM SIGMETRICS / International
Conference on Measurement and Modeling of Computer Systems, Association for
Computing Machinery, 2021, pp. 71–72, doi:10.1145/3410220.3453923.
short: K.-T. Foerster, J. Korhonen, A. Paz, J. Rybicki, S. Schmid, in:, Abstract
Proceedings of the 2021 ACM SIGMETRICS / International Conference on Measurement
and Modeling of Computer Systems, Association for Computing Machinery, 2021, pp.
71–72.
conference:
end_date: 2021-06-18
location: Virtual, Online
name: 'SIGMETRICS: International Conference on Measurement and Modeling of Computer
Systems'
start_date: 2021-06-14
date_created: 2022-03-18T08:48:41Z
date_published: 2021-05-01T00:00:00Z
date_updated: 2023-09-26T10:40:55Z
day: '01'
department:
- _id: DaAl
doi: 10.1145/3410220.3453923
ec_funded: 1
external_id:
arxiv:
- '2005.07637'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://arxiv.org/abs/2005.07637
month: '05'
oa: 1
oa_version: Preprint
page: 71-72
project:
- _id: 268A44D6-B435-11E9-9278-68D0E5697425
call_identifier: H2020
grant_number: '805223'
name: Elastic Coordination for Scalable Machine Learning
- _id: 26A5D39A-B435-11E9-9278-68D0E5697425
call_identifier: H2020
grant_number: '840605'
name: Coordination in constrained and natural distributed systems
publication: Abstract Proceedings of the 2021 ACM SIGMETRICS / International Conference
on Measurement and Modeling of Computer Systems
publication_identifier:
isbn:
- '9781450380720'
publication_status: published
publisher: Association for Computing Machinery
quality_controlled: '1'
related_material:
record:
- id: '10855'
relation: extended_version
status: public
scopus_import: '1'
status: public
title: Input-dynamic distributed algorithms for communication networks
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
year: '2021'
...
---
_id: '10855'
abstract:
- lang: eng
text: 'Consider a distributed task where the communication network is fixed but
the local inputs given to the nodes of the distributed system may change over
time. In this work, we explore the following question: if some of the local inputs
change, can an existing solution be updated efficiently, in a dynamic and distributed
manner? To address this question, we define the batch dynamic CONGEST model in
which we are given a bandwidth-limited communication network and a dynamic edge
labelling defines the problem input. The task is to maintain a solution to a graph
problem on the labeled graph under batch changes. We investigate, when a batch
of α edge label changes arrives, (i) how much time as a function of α we need
to update an existing solution, and (ii) how much information the nodes have to
keep in local memory between batches in order to update the solution quickly.
Our work lays the foundations for the theory of input-dynamic distributed network
algorithms. We give a general picture of the complexity landscape in this model,
design both universal algorithms and algorithms for concrete problems, and present
a general framework for lower bounds. In particular, we derive non-trivial upper
bounds for two selected, contrasting problems: maintaining a minimum spanning
tree and detecting cliques.'
acknowledgement: "We thank Jukka Suomela for discussions. We also thank our shepherd
Mohammad Hajiesmaili and the reviewers for their time and suggestions on how to
improve the paper. This project has received funding from the European Research
Council (ERC) under the European Union’s Horizon 2020 research and innovation
programme (grant agreement No 805223 ScaleML), from the European Union’s Horizon
2020 research and innovation programme under the Marie Skłodowska-Curie grant
agreement No. 840605, from the Vienna Science and Technology Fund (WWTF) project
WHATIF, ICT19-045, 2020-2024, and from the Austrian Science Fund (FWF) and netIDEE
SCIENCE project P 33775-N."
article_processing_charge: No
article_type: original
author:
- first_name: Klaus-Tycho
full_name: Foerster, Klaus-Tycho
last_name: Foerster
- first_name: Janne
full_name: Korhonen, Janne
id: C5402D42-15BC-11E9-A202-CA2BE6697425
last_name: Korhonen
- first_name: Ami
full_name: Paz, Ami
last_name: Paz
- first_name: Joel
full_name: Rybicki, Joel
id: 334EFD2E-F248-11E8-B48F-1D18A9856A87
last_name: Rybicki
orcid: 0000-0002-6432-6646
- first_name: Stefan
full_name: Schmid, Stefan
last_name: Schmid
citation:
ama: Foerster K-T, Korhonen J, Paz A, Rybicki J, Schmid S. Input-dynamic distributed
algorithms for communication networks. Proceedings of the ACM on Measurement
and Analysis of Computing Systems. 2021;5(1):1-33. doi:10.1145/3447384
apa: Foerster, K.-T., Korhonen, J., Paz, A., Rybicki, J., & Schmid, S. (2021).
Input-dynamic distributed algorithms for communication networks. Proceedings
of the ACM on Measurement and Analysis of Computing Systems. Association for
Computing Machinery. https://doi.org/10.1145/3447384
chicago: Foerster, Klaus-Tycho, Janne Korhonen, Ami Paz, Joel Rybicki, and Stefan
Schmid. “Input-Dynamic Distributed Algorithms for Communication Networks.” Proceedings
of the ACM on Measurement and Analysis of Computing Systems. Association for
Computing Machinery, 2021. https://doi.org/10.1145/3447384.
ieee: K.-T. Foerster, J. Korhonen, A. Paz, J. Rybicki, and S. Schmid, “Input-dynamic
distributed algorithms for communication networks,” Proceedings of the ACM
on Measurement and Analysis of Computing Systems, vol. 5, no. 1. Association
for Computing Machinery, pp. 1–33, 2021.
ista: Foerster K-T, Korhonen J, Paz A, Rybicki J, Schmid S. 2021. Input-dynamic
distributed algorithms for communication networks. Proceedings of the ACM on Measurement
and Analysis of Computing Systems. 5(1), 1–33.
mla: Foerster, Klaus-Tycho, et al. “Input-Dynamic Distributed Algorithms for Communication
Networks.” Proceedings of the ACM on Measurement and Analysis of Computing
Systems, vol. 5, no. 1, Association for Computing Machinery, 2021, pp. 1–33,
doi:10.1145/3447384.
short: K.-T. Foerster, J. Korhonen, A. Paz, J. Rybicki, S. Schmid, Proceedings of
the ACM on Measurement and Analysis of Computing Systems 5 (2021) 1–33.
date_created: 2022-03-18T09:10:27Z
date_published: 2021-03-01T00:00:00Z
date_updated: 2023-09-26T10:40:55Z
day: '01'
department:
- _id: DaAl
doi: 10.1145/3447384
ec_funded: 1
external_id:
arxiv:
- '2005.07637'
intvolume: ' 5'
issue: '1'
keyword:
- Computer Networks and Communications
- Hardware and Architecture
- Safety
- Risk
- Reliability and Quality
- Computer Science (miscellaneous)
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://arxiv.org/abs/2005.07637
month: '03'
oa: 1
oa_version: Preprint
page: 1-33
project:
- _id: 26A5D39A-B435-11E9-9278-68D0E5697425
call_identifier: H2020
grant_number: '840605'
name: Coordination in constrained and natural distributed systems
- _id: 268A44D6-B435-11E9-9278-68D0E5697425
call_identifier: H2020
grant_number: '805223'
name: Elastic Coordination for Scalable Machine Learning
publication: Proceedings of the ACM on Measurement and Analysis of Computing Systems
publication_identifier:
issn:
- 2476-1249
publication_status: published
publisher: Association for Computing Machinery
quality_controlled: '1'
related_material:
record:
- id: '10854'
relation: shorter_version
status: public
scopus_import: '1'
status: public
title: Input-dynamic distributed algorithms for communication networks
type: journal_article
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 5
year: '2021'
...
---
_id: '9293'
abstract:
- lang: eng
text: 'We consider planning problems for graphs, Markov Decision Processes (MDPs),
and games on graphs in an explicit state space. While graphs represent the most
basic planning model, MDPs represent interaction with nature and games on graphs
represent interaction with an adversarial environment. We consider two planning
problems with k different target sets: (a) the coverage problem asks whether there
is a plan for each individual target set; and (b) the sequential target reachability
problem asks whether the targets can be reached in a given sequence. For the coverage
problem, we present a linear-time algorithm for graphs, and quadratic conditional
lower bound for MDPs and games on graphs. For the sequential target problem, we
present a linear-time algorithm for graphs, a sub-quadratic algorithm for MDPs,
and a quadratic conditional lower bound for games on graphs. Our results with
conditional lower bounds, based on the boolean matrix multiplication (BMM) conjecture
and strong exponential time hypothesis (SETH), establish (i) model-separation
results showing that for the coverage problem MDPs and games on graphs are harder
than graphs, and for the sequential reachability problem games on graphs are harder
than MDPs and graphs; and (ii) problem-separation results showing that for MDPs
the coverage problem is harder than the sequential target problem.'
article_number: '103499'
article_processing_charge: No
article_type: original
author:
- first_name: Krishnendu
full_name: Chatterjee, Krishnendu
id: 2E5DCA20-F248-11E8-B48F-1D18A9856A87
last_name: Chatterjee
orcid: 0000-0002-4561-241X
- first_name: Wolfgang
full_name: Dvořák, Wolfgang
last_name: Dvořák
- first_name: Monika H
full_name: Henzinger, Monika H
id: 540c9bbd-f2de-11ec-812d-d04a5be85630
last_name: Henzinger
orcid: 0000-0002-5008-6530
- first_name: Alexander
full_name: Svozil, Alexander
last_name: Svozil
citation:
ama: Chatterjee K, Dvořák W, Henzinger MH, Svozil A. Algorithms and conditional
lower bounds for planning problems. Artificial Intelligence. 2021;297(8).
doi:10.1016/j.artint.2021.103499
apa: Chatterjee, K., Dvořák, W., Henzinger, M. H., & Svozil, A. (2021). Algorithms
and conditional lower bounds for planning problems. Artificial Intelligence.
Elsevier. https://doi.org/10.1016/j.artint.2021.103499
chicago: Chatterjee, Krishnendu, Wolfgang Dvořák, Monika H Henzinger, and Alexander
Svozil. “Algorithms and Conditional Lower Bounds for Planning Problems.” Artificial
Intelligence. Elsevier, 2021. https://doi.org/10.1016/j.artint.2021.103499.
ieee: K. Chatterjee, W. Dvořák, M. H. Henzinger, and A. Svozil, “Algorithms and
conditional lower bounds for planning problems,” Artificial Intelligence,
vol. 297, no. 8. Elsevier, 2021.
ista: Chatterjee K, Dvořák W, Henzinger MH, Svozil A. 2021. Algorithms and conditional
lower bounds for planning problems. Artificial Intelligence. 297(8), 103499.
mla: Chatterjee, Krishnendu, et al. “Algorithms and Conditional Lower Bounds for
Planning Problems.” Artificial Intelligence, vol. 297, no. 8, 103499, Elsevier,
2021, doi:10.1016/j.artint.2021.103499.
short: K. Chatterjee, W. Dvořák, M.H. Henzinger, A. Svozil, Artificial Intelligence
297 (2021).
date_created: 2021-03-28T22:01:40Z
date_published: 2021-03-16T00:00:00Z
date_updated: 2023-09-26T10:41:42Z
day: '16'
department:
- _id: KrCh
doi: 10.1016/j.artint.2021.103499
external_id:
arxiv:
- '1804.07031'
isi:
- '000657537500003'
intvolume: ' 297'
isi: 1
issue: '8'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://arxiv.org/abs/1804.07031
month: '03'
oa: 1
oa_version: Preprint
publication: Artificial Intelligence
publication_identifier:
issn:
- 0004-3702
publication_status: published
publisher: Elsevier
quality_controlled: '1'
related_material:
record:
- id: '35'
relation: earlier_version
status: public
scopus_import: '1'
status: public
title: Algorithms and conditional lower bounds for planning problems
type: journal_article
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 297
year: '2021'
...
---
_id: '13063'
abstract:
- lang: eng
text: We develop a Bayesian model (BayesRR-RC) that provides robust SNP-heritability
estimation, an alternative to marker discovery, and accurate genomic prediction,
taking 22 seconds per iteration to estimate 8.4 million SNP-effects and 78 SNP-heritability
parameters in the UK Biobank. We find that only ≤10% of the genetic variation
captured for height, body mass index, cardiovascular disease, and type 2 diabetes
is attributable to proximal regulatory regions within 10kb upstream of genes,
while 12-25% is attributed to coding regions, 32-44% to introns, and 22-28% to
distal 10-500kb upstream regions. Up to 24% of all cis and coding regions of each
chromosome are associated with each trait, with over 3,100 independent exonic
and intronic regions and over 5,400 independent regulatory regions having >95%
probability of contributing >0.001% to the genetic variance of these four traits.
Our open-source software (GMRM) provides a scalable alternative to current approaches
for biobank data.
article_processing_charge: No
author:
- first_name: Matthew Richard
full_name: Robinson, Matthew Richard
id: E5D42276-F5DA-11E9-8E24-6303E6697425
last_name: Robinson
orcid: 0000-0001-8982-8813
citation:
ama: Robinson MR. Probabilistic inference of the genetic architecture of functional
enrichment of complex traits. 2021. doi:10.5061/dryad.sqv9s4n51
apa: Robinson, M. R. (2021). Probabilistic inference of the genetic architecture
of functional enrichment of complex traits. Dryad. https://doi.org/10.5061/dryad.sqv9s4n51
chicago: Robinson, Matthew Richard. “Probabilistic Inference of the Genetic Architecture
of Functional Enrichment of Complex Traits.” Dryad, 2021. https://doi.org/10.5061/dryad.sqv9s4n51.
ieee: M. R. Robinson, “Probabilistic inference of the genetic architecture of functional
enrichment of complex traits.” Dryad, 2021.
ista: Robinson MR. 2021. Probabilistic inference of the genetic architecture of
functional enrichment of complex traits, Dryad, 10.5061/dryad.sqv9s4n51.
mla: Robinson, Matthew Richard. Probabilistic Inference of the Genetic Architecture
of Functional Enrichment of Complex Traits. Dryad, 2021, doi:10.5061/dryad.sqv9s4n51.
short: M.R. Robinson, (2021).
date_created: 2023-05-23T16:20:16Z
date_published: 2021-11-04T00:00:00Z
date_updated: 2023-09-26T10:36:15Z
day: '04'
ddc:
- '570'
department:
- _id: MaRo
doi: 10.5061/dryad.sqv9s4n51
license: https://creativecommons.org/publicdomain/zero/1.0/
main_file_link:
- open_access: '1'
url: https://doi.org/10.5061/dryad.sqv9s4n51
month: '11'
oa: 1
oa_version: Published Version
publisher: Dryad
related_material:
link:
- relation: software
url: https://github.com/medical-genomics-group/gmrm
record:
- id: '8429'
relation: used_in_publication
status: public
status: public
title: Probabilistic inference of the genetic architecture of functional enrichment
of complex traits
tmp:
image: /images/cc_0.png
legal_code_url: https://creativecommons.org/publicdomain/zero/1.0/legalcode
name: Creative Commons Public Domain Dedication (CC0 1.0)
short: CC0 (1.0)
type: research_data_reference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
year: '2021'
...
---
_id: '9304'
abstract:
- lang: eng
text: The high processing cost, poor mechanical properties and moderate performance
of Bi2Te3–based alloys used in thermoelectric devices limit the cost-effectiveness
of this energy conversion technology. Towards solving these current challenges,
in the present work, we detail a low temperature solution-based approach to produce
Bi2Te3-Cu2-xTe nanocomposites with improved thermoelectric performance. Our approach
consists of combining proper ratios of colloidal nanoparticles and consolidating
the resulting mixture into nanocomposites using a hot press. The transport properties
of the nanocomposites are characterized and compared with those of pure Bi2Te3
nanomaterials obtained following the same procedure. In contrast with most previous
works, the presence of Cu2-xTe nanodomains does not result in a significant reduction
of the lattice thermal conductivity of the reference Bi2Te3 nanomaterial, which
is already very low. However, the introduction of Cu2-xTe yields a nearly threefold
increase of the power factor associated to a simultaneous increase of the Seebeck
coefficient and electrical conductivity at temperatures above 400 K. Taking into
account the band alignment of the two materials, we rationalize this increase
by considering that Cu2-xTe nanostructures, with a relatively low electron affinity,
are able to inject electrons into Bi2Te3, enhancing in this way its electrical
conductivity. The simultaneous increase of the Seebeck coefficient is related
to the energy filtering of charge carriers at energy barriers within Bi2Te3 domains
associated with the accumulation of electrons in regions nearby a Cu2-xTe/Bi2Te3
heterojunction. Overall, with the incorporation of a proper amount of Cu2-xTe
nanoparticles, we demonstrate a 250% improvement of the thermoelectric figure
of merit of Bi2Te3.
acknowledgement: "This work was supported by the European Regional Development Funds
and by the Generalitat de Catalunya through the project 2017SGR1246. Y.Z, C.X, M.L,
K.X and X.H thank the China Scholarship Council for the scholarship support. MI
acknowledges financial support from IST Austria. YL acknowledges funding from the
European Union’s Horizon 2020 research and innovation program under the Marie Sklodowska-Curie
grant agreement No. 754411. ICN2 acknowledges funding from Generalitat de Catalunya
2017 SGR 327 and the Spanish MINECO project ENE2017-85087-C3. ICN2 is supported
by the Severo Ochoa program from the Spanish MINECO (grant no. SEV-2017-0706) and
is funded by the CERCA Program/Generalitat de Catalunya. Part of the present work
has been performed in the framework of Universitat Autònoma de Barcelona Materials
Science PhD program."
article_number: '129374'
article_processing_charge: No
article_type: original
author:
- first_name: Yu
full_name: Zhang, Yu
last_name: Zhang
- first_name: Congcong
full_name: Xing, Congcong
last_name: Xing
- first_name: Yu
full_name: Liu, Yu
id: 2A70014E-F248-11E8-B48F-1D18A9856A87
last_name: Liu
orcid: 0000-0001-7313-6740
- first_name: Mengyao
full_name: Li, Mengyao
last_name: Li
- first_name: Ke
full_name: Xiao, Ke
last_name: Xiao
- first_name: Pablo
full_name: Guardia, Pablo
last_name: Guardia
- first_name: Seungho
full_name: Lee, Seungho
id: BB243B88-D767-11E9-B658-BC13E6697425
last_name: Lee
orcid: 0000-0002-6962-8598
- first_name: Xu
full_name: Han, Xu
last_name: Han
- first_name: Ahmad
full_name: Moghaddam, Ahmad
last_name: Moghaddam
- first_name: Joan J
full_name: Roa, Joan J
last_name: Roa
- first_name: Jordi
full_name: Arbiol, Jordi
last_name: Arbiol
- first_name: Maria
full_name: Ibáñez, Maria
id: 43C61214-F248-11E8-B48F-1D18A9856A87
last_name: Ibáñez
orcid: 0000-0001-5013-2843
- first_name: Kai
full_name: Pan, Kai
last_name: Pan
- first_name: Mirko
full_name: Prato, Mirko
last_name: Prato
- first_name: Ying
full_name: Xie, Ying
last_name: Xie
- first_name: Andreu
full_name: Cabot, Andreu
last_name: Cabot
citation:
ama: Zhang Y, Xing C, Liu Y, et al. Influence of copper telluride nanodomains on
the transport properties of n-type bismuth telluride. Chemical Engineering
Journal. 2021;418(8). doi:10.1016/j.cej.2021.129374
apa: Zhang, Y., Xing, C., Liu, Y., Li, M., Xiao, K., Guardia, P., … Cabot, A. (2021).
Influence of copper telluride nanodomains on the transport properties of n-type
bismuth telluride. Chemical Engineering Journal. Elsevier. https://doi.org/10.1016/j.cej.2021.129374
chicago: Zhang, Yu, Congcong Xing, Yu Liu, Mengyao Li, Ke Xiao, Pablo Guardia, Seungho
Lee, et al. “Influence of Copper Telluride Nanodomains on the Transport Properties
of N-Type Bismuth Telluride.” Chemical Engineering Journal. Elsevier, 2021.
https://doi.org/10.1016/j.cej.2021.129374.
ieee: Y. Zhang et al., “Influence of copper telluride nanodomains on the
transport properties of n-type bismuth telluride,” Chemical Engineering Journal,
vol. 418, no. 8. Elsevier, 2021.
ista: Zhang Y, Xing C, Liu Y, Li M, Xiao K, Guardia P, Lee S, Han X, Moghaddam A,
Roa JJ, Arbiol J, Ibáñez M, Pan K, Prato M, Xie Y, Cabot A. 2021. Influence of
copper telluride nanodomains on the transport properties of n-type bismuth telluride.
Chemical Engineering Journal. 418(8), 129374.
mla: Zhang, Yu, et al. “Influence of Copper Telluride Nanodomains on the Transport
Properties of N-Type Bismuth Telluride.” Chemical Engineering Journal,
vol. 418, no. 8, 129374, Elsevier, 2021, doi:10.1016/j.cej.2021.129374.
short: Y. Zhang, C. Xing, Y. Liu, M. Li, K. Xiao, P. Guardia, S. Lee, X. Han, A.
Moghaddam, J.J. Roa, J. Arbiol, M. Ibáñez, K. Pan, M. Prato, Y. Xie, A. Cabot,
Chemical Engineering Journal 418 (2021).
date_created: 2021-04-04T22:01:20Z
date_published: 2021-08-15T00:00:00Z
date_updated: 2023-09-27T07:36:29Z
day: '15'
department:
- _id: MaIb
doi: 10.1016/j.cej.2021.129374
ec_funded: 1
external_id:
isi:
- '000655672000005'
intvolume: ' 418'
isi: 1
issue: '8'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://ddd.uab.cat/record/271949
month: '08'
oa: 1
oa_version: Submitted Version
project:
- _id: 260C2330-B435-11E9-9278-68D0E5697425
call_identifier: H2020
grant_number: '754411'
name: ISTplus - Postdoctoral Fellowships
publication: Chemical Engineering Journal
publication_identifier:
issn:
- 1385-8947
publication_status: published
publisher: Elsevier
quality_controlled: '1'
scopus_import: '1'
status: public
title: Influence of copper telluride nanodomains on the transport properties of n-type
bismuth telluride
type: journal_article
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 418
year: '2021'
...
---
_id: '9793'
abstract:
- lang: eng
text: Astrocytes extensively infiltrate the neuropil to regulate critical aspects
of synaptic development and function. This process is regulated by transcellular
interactions between astrocytes and neurons via cell adhesion molecules. How astrocytes
coordinate developmental processes among one another to parse out the synaptic
neuropil and form non-overlapping territories is unknown. Here we identify a molecular
mechanism regulating astrocyte-astrocyte interactions during development to coordinate
astrocyte morphogenesis and gap junction coupling. We show that hepaCAM, a disease-linked,
astrocyte-enriched cell adhesion molecule, regulates astrocyte competition for
territory and morphological complexity in the developing mouse cortex. Furthermore,
conditional deletion of Hepacam from developing astrocytes significantly impairs
gap junction coupling between astrocytes and disrupts the balance between synaptic
excitation and inhibition. Mutations in HEPACAM cause megalencephalic leukoencephalopathy
with subcortical cysts in humans. Therefore, our findings suggest that disruption
of astrocyte self-organization mechanisms could be an underlying cause of neural
pathology.
acknowledgement: This work was supported by the National Institutes of Health (R01
DA047258 and R01 NS102237 to C.E., F32 NS100392 to K.T.B.) and the Holland-Trice
Brain Research Award (to C.E.). K.T.B. was supported by postdoctoral fellowships
from the Foerster-Bernstein Family and The Hartwell Foundation. The Hippenmeyer
lab was supported by the European Research Council (ERC) under the European Union’s
Horizon 2020 research and innovations program (725780 LinPro) to S.H. R.E. was supported
by Ministerio de Ciencia y Tecnología (RTI2018-093493-B-I00). We thank the Duke
Light Microscopy Core Facility, the Duke Transgenic Mouse Facility, Dr. U. Schulte
for assistance with proteomic experiments, and Dr. D. Silver for critical review
of the manuscript. Cartoon elements of figure panels were created using BioRender.com.
article_processing_charge: No
article_type: original
author:
- first_name: Katherine T.
full_name: Baldwin, Katherine T.
last_name: Baldwin
- first_name: Christabel X.
full_name: Tan, Christabel X.
last_name: Tan
- first_name: Samuel T.
full_name: Strader, Samuel T.
last_name: Strader
- first_name: Changyu
full_name: Jiang, Changyu
last_name: Jiang
- first_name: Justin T.
full_name: Savage, Justin T.
last_name: Savage
- first_name: Xabier
full_name: Elorza-Vidal, Xabier
last_name: Elorza-Vidal
- first_name: Ximena
full_name: Contreras, Ximena
id: 475990FE-F248-11E8-B48F-1D18A9856A87
last_name: Contreras
- first_name: Thomas
full_name: Rülicke, Thomas
last_name: Rülicke
- first_name: Simon
full_name: Hippenmeyer, Simon
id: 37B36620-F248-11E8-B48F-1D18A9856A87
last_name: Hippenmeyer
orcid: 0000-0003-2279-1061
- first_name: Raúl
full_name: Estévez, Raúl
last_name: Estévez
- first_name: Ru-Rong
full_name: Ji, Ru-Rong
last_name: Ji
- first_name: Cagla
full_name: Eroglu, Cagla
last_name: Eroglu
citation:
ama: Baldwin KT, Tan CX, Strader ST, et al. HepaCAM controls astrocyte self-organization
and coupling. Neuron. 2021;109(15):2427-2442.e10. doi:10.1016/j.neuron.2021.05.025
apa: Baldwin, K. T., Tan, C. X., Strader, S. T., Jiang, C., Savage, J. T., Elorza-Vidal,
X., … Eroglu, C. (2021). HepaCAM controls astrocyte self-organization and coupling.
Neuron. Elsevier. https://doi.org/10.1016/j.neuron.2021.05.025
chicago: Baldwin, Katherine T., Christabel X. Tan, Samuel T. Strader, Changyu Jiang,
Justin T. Savage, Xabier Elorza-Vidal, Ximena Contreras, et al. “HepaCAM Controls
Astrocyte Self-Organization and Coupling.” Neuron. Elsevier, 2021. https://doi.org/10.1016/j.neuron.2021.05.025.
ieee: K. T. Baldwin et al., “HepaCAM controls astrocyte self-organization
and coupling,” Neuron, vol. 109, no. 15. Elsevier, pp. 2427–2442.e10, 2021.
ista: Baldwin KT, Tan CX, Strader ST, Jiang C, Savage JT, Elorza-Vidal X, Contreras
X, Rülicke T, Hippenmeyer S, Estévez R, Ji R-R, Eroglu C. 2021. HepaCAM controls
astrocyte self-organization and coupling. Neuron. 109(15), 2427–2442.e10.
mla: Baldwin, Katherine T., et al. “HepaCAM Controls Astrocyte Self-Organization
and Coupling.” Neuron, vol. 109, no. 15, Elsevier, 2021, pp. 2427–2442.e10,
doi:10.1016/j.neuron.2021.05.025.
short: K.T. Baldwin, C.X. Tan, S.T. Strader, C. Jiang, J.T. Savage, X. Elorza-Vidal,
X. Contreras, T. Rülicke, S. Hippenmeyer, R. Estévez, R.-R. Ji, C. Eroglu, Neuron
109 (2021) 2427–2442.e10.
date_created: 2021-08-06T09:08:25Z
date_published: 2021-08-04T00:00:00Z
date_updated: 2023-09-27T07:46:09Z
day: '04'
department:
- _id: SiHi
doi: 10.1016/j.neuron.2021.05.025
ec_funded: 1
external_id:
isi:
- '000692851900010'
pmid:
- '34171291'
intvolume: ' 109'
isi: 1
issue: '15'
language:
- iso: eng
main_file_link:
- open_access: '1'
url: https://doi.org/10.1016/j.neuron.2021.05.025
month: '08'
oa: 1
oa_version: Published Version
page: 2427-2442.e10
pmid: 1
project:
- _id: 260018B0-B435-11E9-9278-68D0E5697425
call_identifier: H2020
grant_number: '725780'
name: Principles of Neural Stem Cell Lineage Progression in Cerebral Cortex Development
publication: Neuron
publication_identifier:
eissn:
- 1097-4199
issn:
- 0896-6273
publication_status: published
publisher: Elsevier
quality_controlled: '1'
scopus_import: '1'
status: public
title: HepaCAM controls astrocyte self-organization and coupling
type: journal_article
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 109
year: '2021'
...