wzuidema committed on
Commit 701f333
Parent(s): fb52eb2

removed comma

Files changed (1):
app.py +1 -3
app.py CHANGED
@@ -319,9 +319,7 @@ Two key methods for Transformers are "attention rollout" (Abnar & Zuidema, 2020)
     [
         "Attribution methods are very interesting, but unfortunately do not work reliably out of the box.",
         8,0
-    ],
-
-
+    ]
     ],
 )
 iface.launch()
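
The change only trims a trailing comma and two blank lines inside the examples list handed to the Gradio interface; Python accepts both forms, so the edit is cosmetic. For orientation, a minimal sketch of how such a list typically feeds gr.Interface, with hypothetical fn, inputs, and outputs (only the example entry, the closing brackets, and iface.launch() are actually visible in the diff):

```python
# Minimal sketch, not the actual app.py: explain(), the inputs, and the
# outputs are hypothetical stand-ins; only the examples entry and
# iface.launch() appear in the diff above.
import gradio as gr

def explain(text, layer, head):
    # placeholder for the Space's real attribution function
    return f"layer={layer}, head={head}: {text}"

iface = gr.Interface(
    fn=explain,
    inputs=[gr.Textbox(), gr.Number(), gr.Number()],
    outputs=gr.Textbox(),
    examples=[
        [
            "Attribution methods are very interesting, but unfortunately do not work reliably out of the box.",
            8, 0
        ]
    ],
)
iface.launch()
```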