CUDA error for MAX_TOTAL_TOKENS = 8192

#5 opened by aastha6

Clear PyTorch's CUDA cache and reload the model.
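
A minimal sketch of that suggestion with PyTorch and Transformers — the model ID and dtype below are placeholders, since the thread doesn't name them:

```python
import gc
import torch
from transformers import AutoModelForCausalLM

MODEL_ID = "your-org/your-model"  # placeholder; the thread does not name the model

# If a model is already loaded in this process, drop the reference first so
# its GPU memory can actually be freed.
model = None
gc.collect()              # let Python reclaim the unreachable weights
torch.cuda.empty_cache()  # release cached CUDA blocks held by PyTorch

# Reload the model onto the GPU.
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,
    device_map="auto",
)
```

Note that `empty_cache()` only returns memory PyTorch has cached but is no longer using; if the old model object is still referenced anywhere, its memory won't be freed, which is why dropping the reference and collecting garbage comes first.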
