What I’m trying to do:
Read and process large CSV files (~1GB). I have a Business account, and the app crashes even when I use a background task. I have checked all the previous forum discussions about uploading large files.
Also, I need to apply a lot of processing steps to the data from the CSV file, so what is the best way to save the file temporarily? For example, say I have two buttons, one to remove duplicates and one to add a new column. Each button press sends a request to the backend, which has to load the file and run that step, and so on until the final stage, where the client saves the result to Google Drive (there is a rough sketch of what I mean below). I have tried saving the file temporarily on the server as a CSV, and that works well for small files but not for large ones. Any suggestions on that?
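To make that concrete, one per-step server call I have in mind would look roughly like this. The function name, the temp path, and the drop-duplicates step are just illustrative, and it assumes the server instance keeps its local disk between calls:

import anvil.server
import pandas as pd

TMP_PATH = "/tmp/current_dataset.csv"  # illustrative temp location, not my real path

@anvil.server.callable
def remove_duplicates():
    # One button press = one step: re-load the temp CSV,
    # apply the step, write the result back for the next step.
    df = pd.read_csv(TMP_PATH)
    df = df.drop_duplicates()
    df.to_csv(TMP_PATH, index=False)
    return list(df.columns), len(df)

This is the pattern that works for small files but breaks down around ~1GB.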
What I’ve tried and what’s not working:
- Uploading the file from my local machine.
- Reading it from Google Drive.
Code Sample:
# Client side
import time
import anvil.server

task = anvil.server.call('read_csv_from_drive', ds_path)
while not task.is_completed():
    time.sleep(15)
flag, msg, ds_cols, ds_json = task.get_return_value()
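Side note: I know the sleep loop blocks the form while it polls. A non-blocking version I could switch to, assuming a Timer component named timer_1 on the form with its tick event wired up, would be roughly:

def button_read_click(self, **event_args):
    # Kick off the server call, then let the Timer do the polling.
    self.task = anvil.server.call('read_csv_from_drive', self.ds_path)  # ds_path stored on the form
    self.timer_1.interval = 5  # poll every 5 seconds

def timer_1_tick(self, **event_args):
    if self.task.is_completed():
        self.timer_1.interval = 0  # stop polling
        flag, msg, ds_cols, ds_json = self.task.get_return_value()

Either way, the background task itself still dies, so I don't think the polling style is the root cause.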
# Server side
import anvil.server
import pandas as pd
from anvil.google.drive import app_files

@anvil.server.callable
def read_csv_from_drive(ds_path):
    # Launch the heavy work as a background task and return the
    # task object so the client can poll it.
    task = anvil.server.launch_background_task('read_csv_from_drive_bg', ds_path)
    return task

@anvil.server.background_task
def read_csv_from_drive_bg(ds_path):
    folder = app_files.app.get("input_datasets")
    for f in folder.list_files():
        if f["title"] == ds_path:
            print("reading ...")
            print(f["title"])
            # Copy the Drive file to local disk, then load it with pandas.
            tmp_name = "temp_f"
            with open(tmp_name, 'wb') as f_:
                print("writing")
                f_.write(f.get_bytes())
            df = pd.read_csv(tmp_name)
Error:
anvil.server.ExecutionTerminatedError: Server code exited unexpectedly: 74dc4c7741
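My guess is the task is being killed for memory: f.get_bytes() materialises the whole ~1GB file as a single bytes object, and pd.read_csv then builds the full DataFrame on top of that. Would chunked reading be the right direction? A sketch of what I mean, where process_chunk is just a placeholder for one of my processing steps:

import pandas as pd

# Read the CSV in bounded pieces instead of one big DataFrame,
# so peak memory stays roughly constant.
for chunk in pd.read_csv(tmp_name, chunksize=100_000):
    process_chunk(chunk)  # placeholder, not real code

I am also wondering whether anvil.media.TempFile(f) would get the Drive file onto disk without holding all of f.get_bytes() in memory at once, but I am not sure whether Drive files stream that way.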