{"forum": "B1eyA3VFwS", "submission_url": "https://openreview.net/forum?id=B1eyA3VFwS", "submission_content": {"abstract": "Recent studies at the intersection of physics and deep learning have illustrated successes in the application of deep neural networks to partially or fully replace costly physics simulations. Enforcing physical constraints to solutions generated\nby neural networks remains a challenge, yet it is essential to the accuracy and trustworthiness of such model predictions. Many systems in the physical sciences are governed by Partial Differential Equations (PDEs). Enforcing these as hard\nconstraints, we show, are inefficient in conventional frameworks due to the high dimensionality of the generated fields. To this end, we propose the use of a novel differentiable spectral projection layer for neural networks that efficiently enforces\nspatial PDE constraints using spectral methods, yet is fully differentiable, allowing for its use as a layer in neural networks that supports end-to-end training. We show that its computational cost is cheaper than a regular convolution layer. We apply it to\nan important class of physical systems \u2013 incompressible turbulent flows, where the divergence-free PDE constraint is required. We train a 3D Conditional Generative Adversarial Network (CGAN) for turbulent flow super-resolution efficiently, whilst\nguaranteeing the spatial PDE constraint of zero divergence. Furthermore, our empirical results show that the model produces realistic flow fields with more accurate flow statistics when trained with hard constraints imposed via the proposed\nnovel differentiable spectral projection layer, as compared to soft constrained and unconstrained counterparts.", "title": "Enforcing Physical Constraints in Neural Neural Networks through Differentiable PDE Layer", "keywords": ["PDE", "Hard Constraints", "Turbulence", "Super-Resolution", "Spectral Methods"], "pdf": "/pdf/0cb6a7e7202debafe704399062922e4d72f3e447.pdf", "authors": ["Chiyu \"Max\" Jiang", "Karthik Kashinath", "Prabhat", "Philip Marcus"], "TL;DR": "A novel way of enforcing hard linear constraints within a convolutional neural network using a differentiable PDE layer.", "authorids": ["chiyu.jiang@berkeley.edu", "kkashinath@lbl.gov", "prabhat@lbl.gov", "pmarcus@me.berkeley.edu"], "paperhash": "jiang|enforcing_physical_constraints_in_neural_neural_networks_through_differentiable_pde_layer", "original_pdf": "/attachment/0cb6a7e7202debafe704399062922e4d72f3e447.pdf", "_bibtex": "@misc{\njiang2020enforcing,\ntitle={Enforcing Physical Constraints in Neural Neural Networks through Differentiable {\\{}PDE{\\}} Layer},\nauthor={Chiyu ''Max'' Jiang and Karthik Kashinath and Prabhat and Philip Marcus},\nyear={2020},\nurl={https://openreview.net/forum?id=B1eyA3VFwS}\n}"}, "submission_cdate": 1569438919199, "submission_tcdate": 1569438919199, "submission_tmdate": 1577168260831, "submission_ddate": null, "review_id": ["H1xdjdvhtH", "S1esiqF4FS", "H1ltqZlCYH"], "review_url": ["https://openreview.net/forum?id=B1eyA3VFwS¬eId=H1xdjdvhtH", "https://openreview.net/forum?id=B1eyA3VFwS¬eId=S1esiqF4FS", "https://openreview.net/forum?id=B1eyA3VFwS¬eId=H1ltqZlCYH"], "review_cdate": [1571743903544, 1571228323509, 1571844496795], "review_tcdate": [1571743903544, 1571228323509, 1571844496795], "review_tmdate": [1572972619578, 1572972619535, 1572972619491], "review_readers": [["everyone"], ["everyone"], ["everyone"]], "review_writers": [["ICLR.cc/2020/Conference/Paper249/AnonReviewer2"], 
["ICLR.cc/2020/Conference/Paper249/AnonReviewer3"], ["ICLR.cc/2020/Conference/Paper249/AnonReviewer1"]], "review_reply_count": [{"replyCount": 0}, {"replyCount": 0}, {"replyCount": 0}], "review_replyto": ["B1eyA3VFwS", "B1eyA3VFwS", "B1eyA3VFwS"], "review_content": [{"rating": "3: Weak Reject", "experience_assessment": "I have published in this field for several years.", "review_assessment:_checking_correctness_of_derivations_and_theory": "I carefully checked the derivations and theory.", "review_assessment:_checking_correctness_of_experiments": "I carefully checked the experiments.", "title": "Official Blind Review #2", "review_assessment:_thoroughness_in_paper_reading": "I read the paper thoroughly.", "review": "This paper proposes to use a differentiable FFT layer to enforce hard constraints for results generated by a CNN. This is demonstrated and evaluated for a 3D turbulence data set (an interesting and challenging problem), and evaluated for a single case.\n\nWhile this goal is good by itself, and the domain of applications is a very interesting one, the paper gives the impression of being preliminary, and the claims for the proposed constraints are a bit too generic, in my opinion.\n\nFirst, the FFT effectively only yields a somewhat specialized method for projection onto the set of admissible solutions, and is demonstrated only for a single constraint, i.e., to make the flow field solenoidal. The same goal can actually be reached in different ways, e.g., by inferring a vector potential as proposed by Kim et al. 2019 in the \"DeepFluids\" paper. The latter employs a curl formulation, and as such is less general, but probably faster than the FFT based method proposed here.\n\nIn addition, the paper unfortunately contains only a single example. Here, several variants (no constraint, soft constraint, and the proposed method) are evaluated in addition to simpler interpolation methods. Visually, I could not really make out differences in figure 4. The metrics in table 1 look interesting, although it, e.g., didn't get clear to me what the \"KS stats\" mean. The graphs in figure 3 also paint a somewhat varied picture. While some regions seem to be well represented, others are clearly there in the references, but missing in one of the inferred versions.\n\nI was wondering in general - what is the intuition for divergence-freeness improving the TKE, for example? It's neat to see the metrics improve, but wouldn't one expect that a projection onto divergence free flows rather removes energy from the solutions, and hence maybe yield values that are too low? \n\nI think the paper could be improved by first evaluating the method for a series of smaller two-dimensional examples, before tackling a full 3D flow. This would simplify comparisons to other methods, and help to illustrate the properties of the method. Ideally, other constraints than enforcing divergence-freeness could be demonstrated to show the generality of employing an FFT projection in the loss function. So currently, I think this paper is not quite ready for a conference such as ICLR. 
It would be important to demonstrate that the result shown here is not an \"outlier\", but that the improvements are a general trend obtained via the proposed method."}, {"experience_assessment": "I have read many papers in this area.", "rating": "6: Weak Accept", "review_assessment:_thoroughness_in_paper_reading": "I read the paper at least twice and used my best judgement in assessing the paper.", "review_assessment:_checking_correctness_of_experiments": "I assessed the sensibility of the experiments.", "title": "Official Blind Review #3", "review_assessment:_checking_correctness_of_derivations_and_theory": "I assessed the sensibility of the derivations and theory.", "review": "The paper describes a way to efficiently enforce physical constraints expressed by linear PDEs on the output of a neural network. The idea is to have, as the last layer of the network, a projection onto the constrained solution space, and to back-propagate through it. That projection layer is made efficient for high-dimensional outputs via the fast Fourier transform (FFT), exploiting a well-known numerical trick. Importantly, the proposed strategy is very general, and can indeed be used with any PDE constraint that is a linear combination of differential operators.\n\nFirst, I would like to mention to the AC that the authors apparently forgot to run bibtex. The submitted paper had no bibliography and all references appear as \"?\". So technically the original submission was incomplete and would probably have to be rejected, on the grounds that it was impossible to check whether the literature references are appropriate. The authors have rectified this through a comment pointing to an (anonymous) version with references. For me, this is ok.\n\nThe research direction of the paper is a hot topic: how to reconcile data-driven deep learning with analytic physical models is, arguably, one of the big research questions holding back the widespread use of deep networks in several natural sciences (e.g., environmental and climate science, hydrology, etc.). Work in this direction could have a significant impact, and the approach developed in the paper is rather general and, as far as I can tell, correct.\n\nThe experiments use the rather challenging task of super-resolving turbulent flow, subject to the Navier-Stokes constraints. Experiments are run on synthetic data from the JHTD simulation dataset. This is hard to avoid, since dense ground truth for flow fields is impossible to obtain; nevertheless, it would have been more convincing to also show at least qualitative results for a real flow dataset. It is also not completely satisfactory that the experiments must use even as OUTput a significantly downscaled version of the dataset, with a resolution that would not be tremendously useful in practice - this implicitly acknowledges that the paper, while reducing the computational cost compared to the direct projection method, did not actually overcome the most critical bottleneck one faces when combining high-dimensional physics simulation with deep learning: namely, that it is not tractable to simply store physical fields as explicit 3D/4D voxel grids if one wants to work with them on the GPU.\n\nAnother slightly bothersome choice in the experiments is to compare only distributions. While I see the point that the prediction is, by its nature, ambiguous, I still think one should look at both the flow statistics and the actual flow field difference.
Ambiguity is not a very convincing justification to not even try to predict correctly - by the authors' definition any super-resolution task is ambiguous; still, one aims, for instance, to recover the correct image, not just a plausible one. In a sense it is the whole point of prior knowledge (which the PDEs are in a learning context) to bring the solution closer to the right answer when the data alone cannot do the job. In practice, a prediction that is very close to the true flow field but ever so slightly violates the physical constraints is often more useful than one that strictly satisfies the constraints but is way off.\n\nOne comment on the presentation: I feel that the discussion of soft constraints could be more extensive and more balanced. I agree that they are less principled, but they do not devalue the present work. We know, both for variational methods and for pre-deep-learning methods, that soft constraints work rather well in practice, especially with an adaptive weight that gradually tightens the constraints. So it would be in order to not just dismiss the alternative in a half-sentence, but to give it proper consideration - especially since, in terms of dissipation, it actually performs better than the hard constraint in the experiments.\n\nOverall, I find the topic important and the presented work a sensible and nicely generic step forward. On the negative side, the paper does not fully deliver on the promise to make physics constraints in deep networks usable in practice. My rating reflects my impression that the bibliography and references are probably correct - this should be checked before reaching a final decision.\n"}, {"experience_assessment": "I have published in this field for several years.", "rating": "3: Weak Reject", "review_assessment:_thoroughness_in_paper_reading": "I read the paper at least twice and used my best judgement in assessing the paper.", "review_assessment:_checking_correctness_of_experiments": "I carefully checked the experiments.", "title": "Official Blind Review #1", "review_assessment:_checking_correctness_of_derivations_and_theory": "I carefully checked the derivations and theory.", "review": "\nThis work develops a differentiable spectral projection layer that enforces spatial PDE constraints using spectral methods, so that physical constraints can be introduced into an end-to-end network without damaging the intrinsic properties of the network. An analysis of computational cost shows the proposed layer is cheaper than a convolutional layer. The experimental comparison demonstrates the superiority of the proposed method. In my view, the contribution of this paper is somewhat novel.\n\nThis paper focuses on designing the PDE layer to constrain the network output without an additional loss term. The constraint sets used for comparison are clearly described, and then the authors present the proposed spectral projection layer. In terms of the solution mechanism, the FFT (and IFFT) operator is an important component of this layer. I am curious about the respective roles and importance of the FFT (IFFT) and the spectral projection. If possible, the authors could provide some analysis of these two components to deepen understanding of the proposed layer.\n\nA series of comparative experiments is conducted to verify the effectiveness of the proposed PDEL. However, Table 1 is a little confusing, e.g., it is unclear why the second column obtains the second-best result among all the mean values. I know its result has a high residue score.
It would be better if the authors could clarify the causes and provide a clear analysis.\n"}], "comment_id": [], "comment_cdate": [], "comment_tcdate": [], "comment_tmdate": [], "comment_readers": [], "comment_writers": [], "comment_reply_content": [], "comment_content": [], "comment_replyto": [], "comment_url": [], "meta_review_cdate": 1576798691376, "meta_review_tcdate": 1576798691376, "meta_review_tmdate": 1576800943882, "meta_review_ddate ": null, "meta_review_title": "Paper Decision", "meta_review_metareview": "This paper introduces an FFT-based loss function to enforce physical constraints in a CNN-based PDE solver. The proposed idea seems sensible, but the reviewers agreed that not enough attention was paid to baseline alternatives, and that a single example problem was not enough to understand the pros and cons of this method.", "meta_review_readers": ["everyone"], "meta_review_writers": ["ICLR.cc/2020/Conference/Program_Chairs"], "meta_review_reply_count": {"replyCount": 0}, "meta_review_url": ["https://openreview.net/forum?id=B1eyA3VFwS&noteId=ZmXK-yH3sC"], "decision": "Reject"}
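
Editor's note: the abstract, the reviews, and the meta-review above all revolve around the submission's FFT-based projection of a velocity field onto the divergence-free constraint set. For context, the following is a minimal sketch of the standard Fourier-space (Helmholtz/Leray) projection that this kind of spectral layer builds on. It is an illustration of the general idea only, not the authors' implementation: the function name, the periodic cubic N^3 grid, and the use of NumPy are assumptions made for this example.

import numpy as np


def project_divergence_free(u):
    """Project a velocity field u of shape (3, N, N, N) on a periodic cubic grid
    onto the divergence-free subspace via the Fourier-space Helmholtz projection.
    (Illustrative sketch, not the submission's code.)"""
    n = u.shape[-1]
    k1d = np.fft.fftfreq(n) * n                      # integer wavenumbers 0, 1, ..., -1
    kx, ky, kz = np.meshgrid(k1d, k1d, k1d, indexing="ij")
    k = np.stack([kx, ky, kz])                       # wavevector field, shape (3, N, N, N)
    k_sq = (k ** 2).sum(axis=0)
    k_sq[0, 0, 0] = 1.0                              # avoid 0/0; the k = 0 mode is unaffected anyway

    u_hat = np.fft.fftn(u, axes=(1, 2, 3))           # component-wise 3D FFT
    k_dot_u = (k * u_hat).sum(axis=0)                # k . u_hat (the i from the spectral derivative drops out)
    u_hat = u_hat - k * k_dot_u / k_sq               # remove the curl-free (gradient) part mode by mode
    return np.real(np.fft.ifftn(u_hat, axes=(1, 2, 3)))


# Quick sanity check: the spectral divergence of the projected field is close to machine precision.
u = np.random.randn(3, 16, 16, 16)
w = project_divergence_free(u)
k1d = np.fft.fftfreq(16) * 16
kx, ky, kz = np.meshgrid(k1d, k1d, k1d, indexing="ij")
div_hat = kx * np.fft.fftn(w[0]) + ky * np.fft.fftn(w[1]) + kz * np.fft.fftn(w[2])
print(np.abs(div_hat).max())

Because the forward FFT, the per-mode projection, and the inverse FFT are all linear operations, a projection of this kind is differentiable and can in principle sit as the final layer of a network, with gradients back-propagated through it using the differentiable FFT routines provided by deep learning frameworks (e.g., torch.fft in PyTorch). This illustrates only the single divergence-free constraint discussed in the reviews, not a general treatment of arbitrary linear PDE constraints.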