Noga Zaslavsky
Publications
Optimal compression in human concept learning
Nathaniel Imel, Noga Zaslavsky.
CogSci, 2024.
PDF · Cite · Proceedings

Language use is only sparsely compositional: The case of English adjective-noun phrases in humans and large language models
Aalok Sathe, Evelina Fedorenko, Noga Zaslavsky.
CogSci, 2024.
PDF · Cite · Proceedings

Bridging semantics and pragmatics in information-theoretic emergent communication
Eleonora Gualdoni, Mycal Tucker, Roger P. Levy, Noga Zaslavsky.
SCiL, 2024.
PDF · Cite · DOI

Discreteness and systematicity emerge to facilitate communication in a continuous signal-meaning space
Alicia Chen, Matthias Hofer, Moshe Poliak, Roger P. Levy, Noga Zaslavsky.
EvoLang XV, 2024.
PDF · Cite · DOI

Artificial neural network language models predict human brain responses to language even after a developmentally realistic amount of training
Eghbal A. Hosseini, Martin Schrimpf, Yian Zhang, Samuel Bowman, Noga Zaslavsky, Evelina Fedorenko.
Neurobiology of Language, 2024.
PDF · Cite · DOI
Noisy Population Dynamics Lead to Efficiently Compressed Semantic Systems
Nathaniel Imel, Richard Futrell, Michael Franke, Noga Zaslavsky.
InfoCog @ NeurIPS, 2023.
PDF · Cite

Human-Guided Complexity-Controlled Abstractions
Andi Peng, Mycal Tucker, Eoin M. Kenny, Noga Zaslavsky, Pulkit Agrawal, Julie Shah.
NeurIPS, 2023.
PDF · Cite · Code · Video · Proceedings · Supplemental

Teasing apart the representational spaces of ANN language models to discover key axes of model-to-brain alignment
Eghbal Hosseini, Noga Zaslavsky, Colton Casto, Evelina Fedorenko.
CCN, 2023.
PDF

Evidence for a language-independent conceptual representation of pronominal referents
Mora Maldonado, Noga Zaslavsky, Jennifer Culbertson.
CogSci, 2023.
PDF · Cite · Proceedings

Alignment of ANN Language Models with Humans After a Developmentally Realistic Amount of Training
Eghbal Hosseini, Martin Schrimpf, Yian Zhang, Samuel Bowman, Noga Zaslavsky, Evelina Fedorenko.
COSYNE, 2023.
PDF
Generalization and Translatability in Emergent Communication via Informational Constraints
Mycal Tucker, Roger Levy, Julie Shah, Noga Zaslavsky.
InfoCog @ NeurIPS, 2022.
PDF · Cite

Trading off Utility, Informativeness, and Complexity in Emergent Communication
Mycal Tucker, Julie Shah, Roger Levy, Noga Zaslavsky.
NeurIPS, 2022.
PDF · Cite · Code · Supplementary Material

Beyond linear regression: mapping models in cognitive neuroscience should align with research goals
Anna A. Ivanova, Martin Schrimpf, Stefano Anzellotti, Noga Zaslavsky, Evelina Fedorenko, Leyla Isik.
NBDT, 2022.
PDF · Cite · DOI

Towards Human-Agent Communication via the Information Bottleneck Principle
Mycal Tucker, Julie Shah, Roger Levy, Noga Zaslavsky.
RSS Workshop on Social Intelligence in Humans and Robots, 2022.
PDF · Cite · RSS Workshop

Teasing apart models of pragmatics using optimal reference game design
Irene Zhou, Jennifer Hu, Roger Levy, Noga Zaslavsky.
CogSci, 2022.
PDF · Cite · Code · Proceedings

The emergence of discrete and systematic communication in a continuous signal-meaning space
Alicia Chen, Matthias Hofer, Moshe Poliak, Roger Levy, Noga Zaslavsky.
CogSci, 2022.
PDF · Cite · Proceedings

The evolution of color naming reflects pressure for efficiency: Evidence from the recent past
Noga Zaslavsky*, Karee Garvin*, Charles Kemp, Naftali Tishby, Terry Regier.
Journal of Language Evolution, 2022.
PDF · Cite · Code · Dataset · DOI · Preprint
The forms and meanings of grammatical markers support efficient communication
Francis Mollica, Geoffrey Bacon, Noga Zaslavsky, Yang Xu, Terry Regier, Charles Kemp.
PNAS, 2021.
PDF · Cite · Code · Dataset · DOI · Preprint

Scalable pragmatic communication via self-supervision
Jennifer Hu, Roger Levy, Noga Zaslavsky.
ICML Workshop on Self-Supervised Learning for Reasoning and Perception, 2021.
PDF · Cite · ICML Workshop

Let's talk (efficiently) about us: Person systems achieve near-optimal compression
Noga Zaslavsky*, Mora Maldonado*, Jennifer Culbertson.
CogSci, 2021.
PDF · Cite · Proceedings

Empirical support for a Rate-Distortion account of pragmatic reasoning
Irene Zhou, Jennifer Hu, Roger P. Levy, Noga Zaslavsky.
CogSci, 2021.
PDF · Cite · Proceedings

Competition from novel features drives scalar inferences in reference games
Jennifer Hu, Noga Zaslavsky, Roger P. Levy.
CogSci, 2021.
PDF · Cite · Code · Proceedings

Is it that simple? Linear mapping models in cognitive neuroscience
Anna A. Ivanova, Martin Schrimpf, Stefano Anzellotti, Noga Zaslavsky, Evelina Fedorenko, Leyla Isik.
bioRxiv, 2021.
PDF · Cite · DOI

Probing artificial neural networks: insights from neuroscience
Anna A. Ivanova, John Hewitt, Noga Zaslavsky.
ICLR Brain2AI Workshop, 2021.
PDF · ICLR Workshop

Crosslinguistic patterns in person systems reflect efficient coding
Mora Maldonado*, Noga Zaslavsky*, Jennifer Culbertson.
CUNY, 2021.
PDF

A Rate–Distortion view of human pragmatic reasoning
Noga Zaslavsky, Jennifer Hu, Roger Levy.
SCiL, 2021.
PDF · Cite · DOI

Bayesian Approaches to Color Category Learning
Thomas Griffiths, Noga Zaslavsky.
Encyclopedia of Color Science and Technology, 2021.
PDF · Cite · DOI
Cloze Distillation: Improving Neural Language Models with Human Next-Word Prediction
Tiwalayo Eisape, Noga Zaslavsky, Roger Levy.
CoNLL, 2020.
PDF · Cite · DOI

A Rate–Distortion view of human pragmatic reasoning
Noga Zaslavsky, Jennifer Hu, Roger Levy.
arXiv preprint, 2020.
PDF · Cite · Proceedings

Toward human-like object naming in artificial neural systems
Tiwalayo Eisape, Roger Levy, Joshua Tenenbaum, Noga Zaslavsky.
BAICS @ ICLR, 2020.
PDF · Cite · Video · ICLR Workshop

Emergence of pragmatic reasoning from least-effort optimization
Noga Zaslavsky, Jennifer Hu, Roger Levy.
EvoLang XIII, 2020.
PDF · Cite · DOI

Information-Theoretic Principles in the Evolution of Semantic Systems
Noga Zaslavsky.
PhD Thesis, The Hebrew University, 2020.
PDF · Cite
Deterministic annealing and the evolution of Information Bottleneck representations
Noga Zaslavsky, Naftali Tishby.
Technical report, 2019.
PDF · Cite

Evolution and efficiency in color naming: The case of Nafaanra
Noga Zaslavsky*, Karee Garvin*, Charles Kemp, Naftali Tishby, Terry Regier.
CogSci, 2019.
PDF · Cite

Semantic categories of artifacts and animals reflect efficient coding
Noga Zaslavsky, Terry Regier, Naftali Tishby, Charles Kemp.
CogSci, 2019.
PDF · Cite

Communicative need in color naming
Noga Zaslavsky, Charles Kemp, Naftali Tishby, Terry Regier.
CNP, 2019.
PDF · Cite · DOI

Color naming reflects both perceptual structure and communicative need
Noga Zaslavsky, Charles Kemp, Naftali Tishby, Terry Regier.
topiCS, 2019.
PDF · Cite · DOI
Efficient compression in color naming and its evolution
ELSC Prize for Outstanding Publication
Noga Zaslavsky, Charles Kemp, Terry Regier, Naftali Tishby.
PNAS, 2018.
PDF · Cite · Code · DOI · SI · Movie 1 · Movie 2

Color naming reflects both perceptual structure and communicative need (earlier version)
Best paper award for computational modeling of language
Noga Zaslavsky, Charles Kemp, Naftali Tishby, Terry Regier.
CogSci, 2018.
PDF · Cite
Efficient human-like semantic representations via the Information Bottleneck principle
Noga Zaslavsky, Charles Kemp, Terry Regier, Naftali Tishby.
CIAI @ NeurIPS, 2017.
PDF · Cite · NeurIPS Workshop

Efficient encoding of motion is mediated by gap junctions in the fly visual system
Siwei Wang, Alexander Borst, Noga Zaslavsky, Naftali Tishby, Idan Segev.
PLoS Comput Biol, 2017.
PDF · Cite · DOI

Early motion processing circuit uses gap junctions to achieve efficient stimuli encoding
Siwei Wang, Noga Zaslavsky, Alexander Borst, Naftali Tishby, Idan Segev.
COSYNE, 2017.
PDF · Cite
Deep learning and the Information Bottleneck principle
Naftali Tishby, Noga Zaslavsky.
ITW, 2015.
PDF · Cite · DOI