icefire080 committed
Commit fdb6d86
1 Parent(s): 5a43832
a tiny bug fix missing default_training_args
Hello, there is a tiny bug in the function "get_default_train_args" when num_layers != 6: a default_training_args (which would be at least an empty dict) is missing in that branch.
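As a rough sketch of why the fix matters (this is not the actual classifier_utils.py body; the sketch's function signature, the batch_size/epochs parameters, and the final training_args.update call are assumptions drawn only from the context lines in the diff below): before the patch, default_training_args was only assigned in the num_layers == 6 branch, so any later use of the variable would presumably fail with a NameError for other layer counts.

# Rough sketch, not the actual Geneformer source: shows why
# default_training_args must be defined in every branch.
def get_default_train_args_sketch(num_layers, batch_size, epochs):
    if num_layers == 6:
        default_training_args = {
            "per_device_train_batch_size": batch_size,
            "per_device_eval_batch_size": batch_size,
        }
    else:
        # Branch added by this commit; before the fix nothing was
        # assigned here, so the reference further down would raise
        # NameError: name 'default_training_args' is not defined.
        default_training_args = {
            "per_device_train_batch_size": batch_size,
        }

    training_args = {
        "num_train_epochs": epochs,
    }
    # Assumed downstream use: merge the defaults into the final dict.
    training_args.update(default_training_args)
    return training_args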
geneformer/classifier_utils.py
CHANGED
@@ -387,6 +387,10 @@ def get_default_train_args(model, classifier, data, output_dir):
             "per_device_train_batch_size": batch_size,
             "per_device_eval_batch_size": batch_size,
         }
+    else:
+        default_training_args = {
+            "per_device_train_batch_size": batch_size,
+        }

     training_args = {
         "num_train_epochs": epochs,