{"forum": "SklZVQtLLr", "submission_url": "https://openreview.net/forum?id=SklZVQtLLr", "submission_content": {"TL;DR": "A theoretical analysis of a new class of RNNs, trained on neuroscience tasks, allows us to identify the role of dynamical dimensionality and cell classes in neural computations.", "keywords": ["RNN", "reverse-engineering", "mean-field theory", "dimensionality", "cell classes"], "authors": ["Alexis M Dubreuil", "Adrian Valente", "Francesca Mastrogiuseppe", "Srdjan Ostojic"], "title": "Disentangling the roles of dimensionality and cell classes in neural computations", "abstract": "The description of neural computations in the field of neuroscience relies on two competing views: (i) a classical single-cell view that relates the activity of individual neurons to sensory or behavioural variables, and focuses on how different cell classes map onto computations; (ii) a more recent population view that instead characterises computations in terms of collective neural trajectories, and focuses on the dimensionality of these trajectories as animals perform tasks. How the two key concepts of cell classes and low-dimensional trajectories interact to shape neural computations is however currently not understood. Here we address this question by combining machine-learning tools for training RNNs with reverse-engineering and theoretical analyses of network dynamics. We introduce a novel class of theoretically tractable recurrent networks: low-rank, mixture of Gaussian RNNs. In these networks, the rank of the connectivity controls the dimensionality of the dynamics, while the number of components in the Gaussian mixture corresponds to the number of cell classes. Using back-propagation, we determine the minimum rank and number of cell classes needed to implement neuroscience tasks of increasing complexity. We then exploit mean-field theory to reverse-engineer the obtained solutions and identify the respective roles of dimensionality and cell classes. 
We show that the rank determines the phase-space available for dynamics that implement input-output mappings, while having multiple cell classes allows networks to flexibly switch between different types of dynamics in the available phase-space. Our results have implications for the analysis of neuroscience experiments and the development of explainable AI.", "authorids": ["alexis.dubreuil@ens.fr", "adrian.valente@ens.fr", "fran.mastrogiuseppe@gmail.com", "srdjan.ostojic@ens.fr"], "pdf": "/pdf/f9e575d223b3118b91d67317579bf66423ceb187.pdf", "paperhash": "dubreuil|disentangling_the_roles_of_dimensionality_and_cell_classes_in_neural_computations"}, "submission_cdate": 1568211753456, "submission_tcdate": 1568211753456, "submission_tmdate": 1572356863476, "submission_ddate": null, "review_id": ["Skeh5C2vDr", "HkxUh_WtvH", "SJlBDDsqDB"], "review_url": ["https://openreview.net/forum?id=SklZVQtLLr&noteId=Skeh5C2vDr", "https://openreview.net/forum?id=SklZVQtLLr&noteId=HkxUh_WtvH", "https://openreview.net/forum?id=SklZVQtLLr&noteId=SJlBDDsqDB"], "review_cdate": [1569341075900, 1569425581743, 1569531741305], "review_tcdate": [1569341075900, 1569425581743, 1569531741305], "review_tmdate": [1570047558621, 1570047555713, 1570047544917], "review_readers": [["everyone"], ["everyone"], ["everyone"]], "review_writers": [["NeurIPS.cc/2019/Workshop/Neuro_AI/Paper31/AnonReviewer2"], ["NeurIPS.cc/2019/Workshop/Neuro_AI/Paper31/AnonReviewer3"], ["NeurIPS.cc/2019/Workshop/Neuro_AI/Paper31/AnonReviewer1"]], "review_reply_count": [{"replyCount": 0}, {"replyCount": 0}, {"replyCount": 0}], "review_replyto": ["SklZVQtLLr", "SklZVQtLLr", "SklZVQtLLr"], "review_content": [{"evaluation": "3: Good", "intersection": "4: High", "importance_comment": "It's nice to be able to relate task complexity to a simple property of connectivity matrices, and to use this to analyse tasks and networks including creating connectivity for multi-task networks. 
Main issues are (1) it's a very special case and the tasks studied are for the moment very simple, but a lot of promise, (2) not much in the way of actual results, more of a method that could be generalised/applied more widely.", "clarity": "4: Well-written", "technical_rigor": "3: Convincing", "intersection_comment": "Good application of techniques from ML to a neuroscience problem.", "rigor_comment": "Seems correct to me, but not enough space to have a lot of detail.", "comment": "Really interesting approach, main limitations are that it's a fairly special case (which I don't find problematic) and that it's a bit preliminary / proof of principle.", "importance": "3: Important", "title": "Clever way to relate a class of recurrent neural networks to tasks in terms of rank of connectivity matrices", "category": "AI->Neuro", "clarity_comment": "Mostly easy to follow, a little heavy in parts (unavoidably perhaps)."}, {"evaluation": "4: Very good", "intersection": "5: Outstanding", "importance_comment": "Lots of recent work, especially in neuroscience, has investigated the relationship between cell class, the dimension of the neural response, and the complexity of the task at hand. As it is difficult to investigate these relationships causally in a real brain, most work on this front has been rather speculative and observational. In probing these properties in artificial systems, the authors make important advances in our understanding of how cell classes and dimensionality underlie computation.", "clarity": "4: Well-written", "technical_rigor": "4: Very convincing", "intersection_comment": "While this paper exclusively focused on simulation of artificial neural networks, the general idea and results speak directly to experiments and analyses performed within the experimental/systems neuroscience sphere. I believe that both AI and neuroscience communities will benefit from this work, thus meriting outstanding intersection. 
", "rigor_comment": "From what is presented in the paper, the work seems to be quite rigorous. I appreciate that the authors included the mean field equations for understanding the dynamics of a neural population with a single cell class, and describe roughly how they extend these equations to include multiple classes. The figures presented are consistent with the text, and provide support for their simulations and general scientific argument. ", "comment": "Overall, I thought this was a really great paper. A few comments for improvement:\n- There might be a slight discrepancy in how systems neuroscientists talk about 'cell classes' and how this paper does. Generally, cell classes are defined by their functional (or genetic/anatomical) properties, which may be related (or not) to their connectivity to other neurons. Nonetheless, I do think the current study (in investigating how many populations are functionally related to each other) is interesting - I just find the nomenclature, and how it relates to other literature using the same nomenclature, a bit confusing. I think it would be really fascinating to further nail down the link between functionally-defined cell classes (ie cell classes defined the systems neuro way), dynamics, and neural computation.\n- The case presented - specifically, the mixture of Gaussians for different cell classes - feels a bit specific. It is nice that there is previous work to build on this, and Gaussians are a great case to start with, but the choice isn't totally motivated and doesn't connect fully with the experimental literature. ", "importance": "4: Very important", "title": "Interesting investigation into the specific utility of cell classes and dimensionality in neural computation", "category": "Common question to both AI & Neuro", "clarity_comment": "This paper was very well-written. It was technical without getting bogged down in detail, and an intuitive description of their simulations and analyses was presented. 
The authors did a great job of providing motivation for each set of simulations/analyses. Each result was also well-summarized, and I finished reading this paper feeling like I learned something interesting. "}, {"title": "Techniques to interpret recurrent neural networks trained on systems neuroscience tasks", "importance": "3: Important", "importance_comment": "The authors identify a critical issue in the population dynamics view of systems neuroscience: namely, the lack of consideration of cell class. They aim to address this issue by introducing cell classes into rank-constrained recurrent neural networks. \n", "rigor_comment": "The authors make effective use of a mix of analytical and computational techniques. Constrained optimization over low-rank weight matrices recovers the low-dimensional structure expected from low-dimensional tasks. The use of well-known tasks in the experimental literature is compelling.\n", "clarity_comment": "The paper is clearly written. The ultimate interpretation of the results with respect to either neurobiology or neural networks is not entirely clear. (See full comments below.)\n", "clarity": "4: Well-written", "evaluation": "4: Very good", "intersection_comment": "The authors address ongoing questions in systems neuroscience and issues of interpretability in conventionally trained neural networks.\n", "intersection": "4: High", "comment": "This paper studies the intersection of several interesting problems in systems neuroscience and neural networks, working in the context of ongoing debates in neuroscience and introducing new techniques in neural networks. Both the rank-restriction and the Gaussian reconstruction could be used in more complex tasks. A sentence or two describing how the rank restriction was imposed, and how the Gaussians were reconstructed, would be welcome. \n\nWhile the technical exposition was clear, the interpretation with respect to biology and neural networks could be clarified. 
Biological cell type diversity spans many more parameters than those described by the covariance approach; while the introduction of covariance classes is itself an interesting step, the authors might include some speculation on how this definition of cell class enriches the neuroscientist\u2019s understanding of population dynamics. Similarly, from a mathematical perspective, the authors could clarify why higher rank wouldn't achieve the same goal. ", "technical_rigor": "4: Very convincing", "category": "Common question to both AI & Neuro"}], "comment_id": [], "comment_cdate": [], "comment_tcdate": [], "comment_tmdate": [], "comment_readers": [], "comment_writers": [], "comment_reply_content": [], "comment_content": [], "comment_replyto": [], "comment_url": [], "meta_review_cdate": null, "meta_review_tcdate": null, "meta_review_tmdate": null, "meta_review_ddate ": null, "meta_review_title": null, "meta_review_metareview": null, "meta_review_confidence": null, "meta_review_readers": null, "meta_review_writers": null, "meta_review_reply_count": null, "meta_review_url": null, "decision": "Accept (Poster)"}