{"forum": "SyeumQYUUH", "submission_url": "https://openreview.net/forum?id=SyeumQYUUH", "submission_content": {"TL;DR": "connections between predictive coding and VAEs + new frontiers", "authors": ["Joseph Marino"], "abstract": "Predictive coding, within theoretical neuroscience, and variational autoencoders, within machine learning, both involve latent Gaussian models and variational inference. While these areas share a common origin, they have evolved largely independently. We outline connections and contrasts between these areas, using their relationships to identify new parallels between machine learning and neuroscience. We then discuss specific frontiers at this intersection: backpropagation, normalizing flows, and attention, with mutual benefits for both fields.", "keywords": ["predictive coding", "variational autoencoders", "probabilistic models", "variational inference"], "title": "Predictive Coding, Variational Autoencoders, and Biological Connections", "authorids": ["jmarino@caltech.edu"], "pdf": "/pdf/bbd45ab171a9841a5ca613af3afbb2699ff28659.pdf", "paperhash": "marino|predictive_coding_variational_autoencoders_and_biological_connections"}, "submission_cdate": 1568211743600, "submission_tcdate": 1568211743600, "submission_tmdate": 1571848607990, "submission_ddate": null, "review_id": ["HylHjQhvDr", "BJlldx4qDS", "ByejQahcwH"], "review_url": ["https://openreview.net/forum?id=SyeumQYUUH¬eId=HylHjQhvDr", "https://openreview.net/forum?id=SyeumQYUUH¬eId=BJlldx4qDS", "https://openreview.net/forum?id=SyeumQYUUH¬eId=ByejQahcwH"], "review_cdate": [1569338268608, 1569501287604, 1569537315495], "review_tcdate": [1569338268608, 1569501287604, 1569537315495], "review_tmdate": [1570047559070, 1570047548879, 1570047542674], "review_readers": [["everyone"], ["everyone"], ["everyone"]], "review_writers": [["NeurIPS.cc/2019/Workshop/Neuro_AI/Paper7/AnonReviewer2"], ["NeurIPS.cc/2019/Workshop/Neuro_AI/Paper7/AnonReviewer1"], ["NeurIPS.cc/2019/Workshop/Neuro_AI/Paper7/AnonReviewer3"]], "review_reply_count": [{"replyCount": 0}, {"replyCount": 0}, {"replyCount": 0}], "review_replyto": ["SyeumQYUUH", "SyeumQYUUH", "SyeumQYUUH"], "review_content": [{"evaluation": "2: Poor", "intersection": "4: High", "importance_comment": "Predictive coding is a current theory in systems neuroscience with a lot of potential for development by looking at deep generative models. Likewise, deep generative models inspired by thalamocortical architecture and dynamics could result in improvements to online perceptual learning.", "clarity": "2: Can get the general idea", "technical_rigor": "1: Not convincing", "intersection_comment": "The article is addressing open questions relevant to both AI and neuroscience.", "rigor_comment": "Since there are no results in this submission, I have read it like a synthesis of two divergent literatures. The initial descriptions of predictive coding and variational autoencoders are precise and succinct. However, the comparisons and contrasts is very shallow. The discussion on biologically plausible backpropagation is worthwhile, however, the connection to either predictive coding or variational autoencoders is not made. The discussion on normalizing flows is interesting and links with predictive coding are established, so I accept from this article that this could be an interesting frontier for research.", "comment": "Strengths:\n\nThe ideas floated in this article have a lot of potential to launch a research topic. 
The structure is strong and would make for a good first draft of a research grant.\n\nAreas for improvement:\n\nThe thesis needs to be more focused. What is (are) the research question(s) that you want the reader to arrive at by the time they finish reading? Narrowing this down and making it clear is an absolute must. In this vein, I felt the discussion of normalizing flows was particularly promising.", "importance": "3: Important", "title": "A brief review of predictive coding and variational autoencoders, with suggestions for future research", "category": "Common question to both AI & Neuro", "clarity_comment": "While the descriptions of the concepts in this article are clear, the overall synthesis and thesis of the article are not. "}, {"title": "Backpropagation and normalizing flows", "importance": "3: Important", "importance_comment": "Predictive coding remains of great interest in systems neuroscience - with much effort devoted to linking theory to biological function. Thalamocortical architecture has been relatively well characterized biologically, suggesting it may be a good architecture for future efforts.", "rigor_comment": "As the authors' goal seems to have been to present a synthesis of ideas from the field, the technical rigor may be acceptable on these grounds. ", "clarity_comment": "The overall text was well-written and easy to follow. The figures were only somewhat helpful but, as this is a synthesis paper, they added to the overall story.", "clarity": "4: Well-written", "evaluation": "3: Good", "intersection_comment": "Backpropagation is an area of great interest for both AI and neuroscience; in this sense, this paper highlights interesting ways in which this could be a future research direction. Broadly, I wonder at statements that biology relies on local learning rules - of course it does, but the studies referenced are largely in neuronal culture dishes. Perhaps by understanding dynamics at a systems scale (in small model organisms perhaps), it may be both local and global. It is unclear based on the authors' framing if their deep network approach allows for such flexibility.", "intersection": "3: Medium", "comment": "The work is very clear to read and follow - overall, the presentation is strong. There are many potential areas of interest that arise from the ideas outlined here.\n\nOverall, however, the work would benefit from more discussion by the authors of why they chose these topics (i.e. which of these ideas is of particular interest, such that they think these represent an interesting new research direction). Some sort of brief outlook or summary for future consideration would be of added value at the end of the document.", "technical_rigor": "2: Marginally convincing", "category": "Common question to both AI & Neuro"}, {"title": "Very preliminary work connecting two related paradigms, but providing only very speculative new links", "importance": "2: Marginally important", "importance_comment": "The paper provides a high-level overview of predictive coding and VAEs and speculatively connects these two methods to outstanding questions in neuroscience (the function of lateral connections and whether backpropagation occurs in the brain). ", "rigor_comment": "The high-level overview of VAEs and predictive coding appears to be correct. However, the connections made in this paper to neuroscience (in the sections on backpropagation and normalizing flows) are largely speculative. 
No substantive predictions are made, and the biological details are not examined with enough granularity to draw any solid conclusions. For example, it's unclear in what sense normalizing flows may \"help justify design choices in predictive coding,\" as claimed.", "clarity_comment": "The exposition is fairly clear. ", "clarity": "3: Average readability", "evaluation": "1: Very poor", "intersection_comment": "This paper attempts to build a bridge between variational autoencoders (an important framework for generative modeling and unsupervised learning in ML) and predictive coding (a controversial, but potentially powerful explanatory framework in neuroscience).", "intersection": "5: Outstanding", "comment": "The paper provides a high-level overview of predictive coding and VAEs and speculatively connects these two methods to outstanding questions in neuroscience (for example: to the function of lateral connections and the question of whether backpropagation-like computations occur in the brain). This work is very preliminary and presents no technical results.", "technical_rigor": "2: Marginally convincing", "category": "Common question to both AI & Neuro"}], "comment_id": [], "comment_cdate": [], "comment_tcdate": [], "comment_tmdate": [], "comment_readers": [], "comment_writers": [], "comment_reply_content": [], "comment_content": [], "comment_replyto": [], "comment_url": [], "meta_review_cdate": null, "meta_review_tcdate": null, "meta_review_tmdate": null, "meta_review_ddate": null, "meta_review_title": null, "meta_review_metareview": null, "meta_review_confidence": null, "meta_review_readers": null, "meta_review_writers": null, "meta_review_reply_count": null, "meta_review_url": null, "decision": "Accept (Poster)"}