AMSR / conferences_raw /neuroai19 /neuroai19_SyeWE7tU8H.json
{"forum": "SyeWE7tU8H", "submission_url": "https://openreview.net/forum?id=SyeWE7tU8H", "submission_content": {"TL;DR": "We examine the hypothesis that the entropy of solution spaces for constraints on synaptic weights (the \"flexibility\" of the constraint) could serve as a cost function for neural circuit development.", "keywords": [], "authors": ["Gabriel Koch Ocker", "Michael A. Buice"], "title": "Flexible degrees of connectivity under synaptic weight constraints", "abstract": "Biological neural networks face homeostatic and resource constraints that restrict the allowed configurations of connection weights. If a constraint is tight it defines a very small solution space, and the size of these constraint spaces determines their potential overlap with the solutions for computational tasks. We study the geometry of the solution spaces for constraints on neurons' total synaptic weight and on individual synaptic weights, characterizing the connection degrees (numbers of partners) that maximize the size of these solution spaces. We then hypothesize that the size of constraints' solution spaces could serve as a cost function governing neural circuit development. We develop analytical approximations and bounds for the model evidence of the maximum entropy degree distributions under these cost functions. 
We test these on a published electron microscopic connectome of an associative learning center in the fly brain, finding evidence for a developmental progression in circuit structure.", "authorids": ["gabeo@alleninstitute.org", "michaelbu@alleninstitute.org"], "pdf": "/pdf/d9cf1fe9dde790c818c70d848b753de9b9e42c85.pdf", "paperhash": "ocker|flexible_degrees_of_connectivity_under_synaptic_weight_constraints"}, "submission_cdate": 1568211753045, "submission_tcdate": 1568211753045, "submission_tmdate": 1572492683561, "submission_ddate": null, "review_id": ["BJlD1APHPS", "Syl05buKwH", "B1g3TW59Pr"], "review_url": ["https://openreview.net/forum?id=SyeWE7tU8H&noteId=BJlD1APHPS", "https://openreview.net/forum?id=SyeWE7tU8H&noteId=Syl05buKwH", "https://openreview.net/forum?id=SyeWE7tU8H&noteId=B1g3TW59Pr"], "review_cdate": [1569189343164, 1569452437874, 1569526212244], "review_tcdate": [1569189343164, 1569452437874, 1569526212244], "review_tmdate": [1570047564064, 1570047553215, 1570047545349], "review_readers": [["everyone"], ["everyone"], ["everyone"]], "review_writers": [["NeurIPS.cc/2019/Workshop/Neuro_AI/Paper30/AnonReviewer3"], ["NeurIPS.cc/2019/Workshop/Neuro_AI/Paper30/AnonReviewer2"], ["NeurIPS.cc/2019/Workshop/Neuro_AI/Paper30/AnonReviewer1"]], "review_reply_count": [{"replyCount": 0}, {"replyCount": 0}, {"replyCount": 0}], "review_replyto": ["SyeWE7tU8H", "SyeWE7tU8H", "SyeWE7tU8H"], "review_content": [{"evaluation": "4: Very good", "intersection": "3: Medium", "importance_comment": "The authors have studied the problem of how the degree of neurons in a network influences their ability to learn. The idea is that degrees are less flexible than the weights of connections. Therefore, for a neuron with fixed degree, the \"size\" of the space of possible weights should be maximized. The size is computed 3 different ways, and this model is applied to the Drosophila mushroom body connectome. 
This is a fresh approach and has implications for AI; unfortunately they are not emphasized.", "clarity": "4: Well-written", "technical_rigor": "4: Very convincing", "intersection_comment": "I think this is the weakest part of the submission. The motivation for this work is almost entirely from the biological perspective. I think that this work probably does have some implications for AI, but it needs to be discussed by the authors. Places for this are in the introduction & potentially the discussion if added. (I am actually uncertain whether this workshop offers opportunity for revision, but I am writing my review like I would any paper and hope the authors will at least consider making some changes for their next version.)\n\nTo an AI person, familiar with statistical learning theory, it will probably be hard to find a good take-away from this work. I think a natural connection to try and make would be to the complexity of learning with such a network. Constraining the weights in a network to lie within some ball of radius R is a way to bound the generalization error. The idea of constraining the weights in a network is closely connected to classical types of regularization, which bound the norms of the weights. Another possible connection to illuminate would be to weight normalization techniques including batch normalization. Pruning neural networks to reduce their size is another area to look into, since that also reduces the degrees from fully-connected. ", "rigor_comment": "I think the mathematics are correct and well-explained through Section 2. I thought the section on maximum entropy was harder to follow, probably because the inherent mathematics is more complicated. This could be alleviated by adding references & maybe pointing to an appendix. I would not care if the references extend beyond 4 pages; you can also eliminate line spacing there. 
You can also gain space by using the \\paragraph commands instead of \\subsection for the parts of Section 2.\n\nSpecific suggestions:\n* L. 39 strike \"synaptic partners\" and use \"degree\"\n* L. 40, I'd add \"J_i \\geq 0\" when defining J_i.\n* The volume & area of the simplex (ll. 58 & 73) need references.\n* Notation S_K is not defined (l. 92), but I gather it is the volume/area calculations from before. I would suggest using S_K^{\\leq net}, S_K^{= net}, and S_K^{individ} or something along those lines to clarify that these are the different ways of measuring \"size\". In fact, the language of \"size\" throughout the paper is kind of confusing until Sec. 2.1 when things become concrete. I would mention in the intro that you will use volume/area as ways to measure size.\n* Ll. 67 & 79 \"and vice versa\" It is unclear to me what the vice versa case is. Clarify.\n* L. 93 \"K^max\" isn't defined, strike \"for large K^max\"\n* L. 99 what is \\bar S? How you use the weights is muddled.\n* L. 99 \"Laplace approximation\" and \"model evidence\" need references. I gather that \"model evidence\" is something like log-likelihood; be more precise.\n* Ll. 101-102 in the binomial random wiring model, how do weights of the connections enter? \n", "comment": "How to structure a network in terms of in- and out-degrees of neurons is a fundamental question. In neuroscience, this area has been tackled more because neuron degree is easy to measure experimentally. On the AI side, there hasn't been as much focus on this kind of network structure, with the most common structures being fully-connected or convolutional. So I see this work as having potential relevance there. But more work will have to be done to see whether artificial neural networks end up following these same kind of principles.\n\nI like the simplicity of the analysis in this work and the dataset that it is applied to. 
I only wish there were more discussion of the take-aways for both neuroscientists and AI researchers. But I see that interest for more as a positive rather than a negative.", "importance": "4: Very important", "title": "Interesting theory of homeostasis, constraints, and degree distributions; needs more discussion & AI intersection", "category": "Common question to both AI & Neuro", "clarity_comment": "The paper is well-written and for the most part easy to read. As mentioned in the technical review, the language of \"size\" should be made clearer in the introduction by stating that \"size\" will mean volume or area under different assumptions. Similarly, using the precise language of \"degree\" is preferable in my opinion to using \"partners\".\n\nI would also spend a little more time in the intro motivating why \"size\" is something that'd be optimized. Can you learn more with a larger size, and do you think this connects to measures of dimensionality or complexity in ML systems (Rademacher/VC dim)? Also, constraining the degree is kind of what happens with convolutional layers, although they are very non-random.\n\nThere is no discussion section and there should be. What are the take-aways from this analysis? How do we interpret the conclusion that \"other factors come to dominate\" (L. 118) the network as Drosophila develops? I would like more speculation, for the biologists. Similarly, your model predicts an optimal K* dependent on various parameters; whereas this is known for mushroom body to be ~7 (in cerebellum, arguably similar, it is ~4). I'd like some discussion of whether your model is predictive of these properties or what it says about those networks' computational ability.\n\n* L. 3 \"size\" -> \"sizes\" & \"determines\" -> \"determine\"\n* L. 6 \"partners\" -> \"neighbors\" sounds better to me\n* L. 17 \"learning rule\" is \"not known\", but what about Hebb/anti-Hebb STDP rules?\n* L. 
20 \"consider the hypothesis\" -> \"hypothesize\" would be better\n* L. 25 \"regulate\" is used twice, I'd change the second to \"stabilize\" or \"normalize\" or similar\n* L. 34 \"We find that overall, ...\" -> \"We find that, overall, ...\"\n* L. 36 would be nice to have some speculation about this \"developmental progression\"\n* L. 39 Suggest rephrase to \"where a neuron has degree K\" since you've already introduced degree = # neighbors = # partners\n* L. 54 for balanced references, I'd add ref to recent work of Arenas on experimental verification of this scaling\n* L. 60 \"measureable synaptic weight changes\" could mention \"i.e., # of discrete vesicles\"\n* L. 69 \"different types\" I think you mean \"many types\" of neurons <- plural\n* Figure 1: Suggest adding \"bounded net\", \"fixed net\", \"bounded individual\" labels to each row, on the left hand side under (a), (d), and (g)\n* L. 90 \"provides a cost function\" is awkward, maybe simply \"determines\" is better\n* L. 118 \"Other factors come to dominate their wiring\". What would these be? Since binomial is a good fit, would you say the network is random or not?\n* L. 74 \"net excitatory\" strike \"excitatory\", you haven't talked about E/I at all so this is confusing"}, {"title": "Difference in KC connectivity between young and adult flies", "importance": "2: Marginally important", "importance_comment": "This paper addresses questions that are important to fly olfaction research, but the discussion doesn't say much about how it can provide insight for AI research.", "rigor_comment": "The methodology present in the paper appears fine, but I have some questions about the neuroscience aspects:\n- Is there any experimental evidence that KC synapses undergo synaptic scaling?\n- Is there any experimental evidence that young flies have non-random KC connectivity?\n- What was the threshold for determining significance between models? 
I ask because it seems the evidence is very close between the binomial model (which is experimentally suggested) and bounded/fixed net weights for single-claw adults, and a small change in what it means to be significant would lead to the result that synaptic connectivity in adults is also better supported by fixed/bounded weights.", "clarity_comment": "The paper is nicely written, but it reads like it was originally a much longer paper compacted down to four pages. For example, in Section 3 there isn't much detail on the maximum entropy and Laplace approximation methods, or on how the connectomic data was integrated into the analysis. Also, Fig. 2 could use some sort of adjustment to separate out the blue and orange lines.", "clarity": "3: Average readability", "evaluation": "3: Good", "intersection_comment": "Right now this finding falls mostly into neuroscience, and I feel this paper needs some added discussion on how the study of fly olfaction can be used to advance AI.", "intersection": "2: Low", "technical_rigor": "3: Convincing", "category": "Not applicable"}, {"title": "Statistical analysis of the olfactory wiring diagram without much biological insight", "importance": "3: Important", "importance_comment": "In this work, the author(s) first characterized how various homeostatic constraints influence the optimal connection degree, then studied whether the connectivity structure found in the Drosophila olfactory circuit is consistent with those homeostatic constraints. The results suggest that the model with the constraint on the net weight provides a better fit for immature KCs than the binomial wiring model. Although the work has some merit, I\u2019m not convinced of the biological relevance. ", "rigor_comment": "In the manuscript, the parameter space was defined as a K-dimensional space, and the authors optimized K under some homeostatic constraints. 
However, considering the actual neural circuit, the problem should be defined as that of choosing a (linear) K-dimensional subspace from an N-dimensional potential space (see e.g. Litwin-Kumar et al., Neuron, 2017). Thus, the benefit of having a large K is underestimated in this study, which somewhat weakens the biological relevance.\n\nIn addition, the author(s) should discuss the range of p and J for which the optimal degree actually exists, as Eqs. (2) and (3) don\u2019t have any (real) solution over a wide range of p and J.", "clarity_comment": "If you naively apply Stirling\u2019s approximation, you get a slightly different expression for Eqs. (2) and (3). The author(s) should clarify the approximation they used.\n\nI couldn\u2019t get how the results shown in Fig. 2 are calculated either. In particular, the accuracy of the binomial model depends strongly on whether only the connected pairs were fitted or all the potential connections were considered. Moreover, in the former scenario, the distribution needs to be shifted by one to cancel the selection bias. Yet, these points were not discussed in the manuscript.", "clarity": "2: Can get the general idea", "evaluation": "3: Good", "intersection_comment": "The choice of regularization is still an important topic in ML, and I believe it is insightful to study what kind of regularizer is used in the brain. ", "intersection": "3: Medium", "comment": "Because PN-to-KC connections are arguably not plastic, I'm not sure an analysis based on the weight volume is biologically relevant, even if the combinatorial term is added to the model. 
Still I believe this line of work is important for understanding the wiring principle of the brain.", "technical_rigor": "2: Marginally convincing", "category": "Common question to both AI & Neuro"}], "comment_id": [], "comment_cdate": [], "comment_tcdate": [], "comment_tmdate": [], "comment_readers": [], "comment_writers": [], "comment_reply_content": [], "comment_content": [], "comment_replyto": [], "comment_url": [], "meta_review_cdate": null, "meta_review_tcdate": null, "meta_review_tmdate": null, "meta_review_ddate ": null, "meta_review_title": null, "meta_review_metareview": null, "meta_review_confidence": null, "meta_review_readers": null, "meta_review_writers": null, "meta_review_reply_count": null, "meta_review_url": null, "decision": "Accept (Poster)"}