Maxwell Nye
adept.ai
Verified email at alum.mit.edu - Homepage
Title | Cited by | Year
Program synthesis with large language models
J Austin, A Odena, M Nye, M Bosma, H Michalewski, D Dohan, E Jiang, ...
arXiv preprint arXiv:2108.07732, 2021
1019 | 2021
Show your work: Scratchpads for intermediate computation with language models
M Nye, AJ Andreassen, G Gur-Ari, H Michalewski, J Austin, D Bieber, ...
arXiv preprint arXiv:2112.00114, 2021
487 | 2021
DreamCoder: growing generalizable, interpretable knowledge with wake–sleep Bayesian program learning
K Ellis, L Wong, M Nye, M Sable-Meyer, L Cary, L Anaya Pozo, L Hewitt, ...
Philosophical Transactions of the Royal Society A 381 (2251), 20220050, 2023
219 | 2023
DreamCoder: Bootstrapping inductive program synthesis with wake-sleep library learning
K Ellis, C Wong, M Nye, M Sablé-Meyer, L Morales, L Hewitt, L Cary, ...
Proceedings of the 42nd ACM SIGPLAN International Conference on Programming …, 2021
180 | 2021
Write, execute, assess: Program synthesis with a repl
K Ellis, M Nye, Y Pu, F Sosa, J Tenenbaum, A Solar-Lezama
Advances in Neural Information Processing Systems 32, 2019
161 | 2019
Implicit representations of meaning in neural language models
BZ Li, M Nye, J Andreas
arXiv preprint arXiv:2106.00737, 2021
135 | 2021
Learning to infer program sketches
M Nye, L Hewitt, J Tenenbaum, A Solar-Lezama
International Conference on Machine Learning, 4861-4870, 2019
121 | 2019
Learning compositional rules via neural program synthesis
M Nye, A Solar-Lezama, J Tenenbaum, BM Lake
Advances in Neural Information Processing Systems 33, 10832-10842, 2020
116 | 2020
Improving coherence and consistency in neural sequence models with dual-system, neuro-symbolic reasoning
M Nye, M Tessler, J Tenenbaum, BM Lake
Advances in Neural Information Processing Systems 34, 25192-25204, 2021
105 | 2021
The variational homoencoder: Learning to learn high capacity generative models from few examples
LB Hewitt, MI Nye, A Gane, T Jaakkola, JB Tenenbaum
arXiv preprint arXiv:1807.08919, 2018
75 | 2018
Communicating natural programs to humans and machines
S Acquaviva, Y Pu, M Kryven, T Sechopoulos, C Wong, G Ecanow, M Nye, ...
Advances in Neural Information Processing Systems 35, 3731-3743, 2022
53 | 2022
Program synthesis with large language models. CoRR abs/2108.07732 (2021)
J Austin, A Odena, MI Nye, M Bosma, H Michalewski, D Dohan, E Jiang, ...
arXiv preprint arXiv:2108.07732, 2021
49 | 2021
Introducing our multimodal models, 2023
R Bavishi, E Elsen, C Hawthorne, M Nye, A Odena, A Somani, S Taşırlar
URL https://www.adept.ai/blog/fuyu-8b
42 | 2023
Show your work: Scratchpads for intermediate computation with language models, 2021
M Nye, AJ Andreassen, G Gur-Ari, H Michalewski, J Austin, D Bieber, ...
URL https://arxiv.org/abs/2112.00114, 2021
36 | 2021
Representing partial programs with blended abstract semantics
M Nye, Y Pu, M Bowers, J Andreas, JB Tenenbaum, A Solar-Lezama
arXiv preprint arXiv:2012.12964, 2020
25 | 2020
Are efficient deep representations learnable?
M Nye, A Saxe
arXiv preprint arXiv:1807.06399, 2018
22 | 2018
A large-scale benchmark for few-shot program induction and synthesis
F Alet, J Lopez-Contreras, J Koppel, M Nye, A Solar-Lezama, ...
International Conference on Machine Learning, 175-186, 2021
21 | 2021
Program synthesis with large language models (2021)
J Austin, A Odena, M Nye, M Bosma, H Michalewski, D Dohan, E Jiang, ...
arXiv preprint arXiv:2108.07732, 2021
13 | 2021
Language modeling with latent situations
BZ Li, M Nye, J Andreas
arXiv preprint arXiv:2212.10012, 2022
8 | 2022
LARC: Language annotated abstraction and reasoning corpus
S Acquaviva, Y Pu, M Nye, C Wong, MH Tessler, J Tenenbaum
Proceedings of the Annual Meeting of the Cognitive Science Society 43 (43), 2021
4 | 2021