Read a numpy file with Data Files

What I’m trying to do:
I have a numpy file (file.npy) that contains a 2-by-2 matrix. I uploaded it via Data Files (I am on the Hobby plan). I want to use this file server-side.

What I’ve tried and what’s not working:
Following the Data Files docs to read txt or json files works as advertised, but applying the same logic to the numpy file gives this error:
UnicodeDecodeError: 'utf-8' codec can't decode byte 0x93 in position 0: invalid start byte

Code Sample:

def read_mdf():
  global mdf
  with open(data_files['mdf2100.npy']) as ff:
    mdf = np.load(ff)
  return mdf

mdf = read_mdf()

Pretty new to Anvil, so this is most likely a real beginner's question. Any pointers are most welcome.
UPDATE: I converted the numpy file to a text file, once with np.savetxt() and once via json. The original npy file is 350 MB, the json file 600 MB, and the text file 1.1 GB. Both the json file and the text file failed on upload with the simple message "filename BAD REQUEST" after reaching 99%.
Is there a size limit on uploaded files?
I will now try to handle all numpy file interactions on my own server and connect via Uplink.
Any other thoughts?
Thanks for your time
UPDATE 2: Solved the upload problem by restarting the Anvil designer.
But then ran into a memory problem when reading the 600MB json file:
anvil.server.ExecutionTerminatedError: Server code execution process was killed. It may have run out of memory: 14cce47bd9
I will try to run this part of the code in the background.
Is there a solution to this other than buying more resources? That is something I cannot afford, since this is an NGO project.
Thanks and best

I suspect that the high memory usage comes from parsing the file. During parsing, you have two copies of the data: one in the original format, and one in the final format (ndarray).

If that’s so, then it may be best to do that not in Server code, but in Uplink code instead. Since your Uplink code runs on your own PC, you can more easily give it the memory it needs.

Your Uplink code can save the result to a binary (Media) column, in a database table, for access by Server code.
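A minimal sketch of that pattern. The serialization helpers below are real numpy/stdlib calls; the table name `arrays` and its columns `name`/`data`, plus the helper names themselves, are assumptions for illustration (`anvil.BlobMedia`, `app_tables`, and Media's `get_bytes()` are the actual Anvil APIs):

```python
import io

import numpy as np


def array_to_bytes(arr):
    """Serialize an ndarray to raw .npy bytes, suitable for a Media column."""
    buf = io.BytesIO()
    np.save(buf, arr)
    return buf.getvalue()


def bytes_to_array(data):
    """Inverse operation. np.load needs a binary stream, hence BytesIO --
    this is also why the original open() call raised UnicodeDecodeError:
    without 'rb', Python tried to decode the .npy bytes as UTF-8 text."""
    return np.load(io.BytesIO(data))


def upload_array(arr, name):
    """Run from the Uplink script on your own machine (not executed here).
    Assumes a data table 'arrays' with a text column 'name' and a
    Media column 'data' -- hypothetical names."""
    import anvil.server
    from anvil.tables import app_tables

    media = anvil.BlobMedia("application/octet-stream",
                            array_to_bytes(arr), name=name + ".npy")
    app_tables.arrays.add_row(name=name, data=media)
```

Server code could then recover the array with something like `bytes_to_array(app_tables.arrays.get(name="mdf2100")["data"].get_bytes())`.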

If the Server still runs out of memory when loading the binary data as a single ndarray, then you might chop it up. Conveniently, a 2x2 array divides neatly into 4 well-defined chunks, which could be stored in 4 Media rows.
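The row-wise chopping can be done with `np.array_split`, which also tolerates row counts that don't divide evenly. A small stand-in array below (the real one is ~350 MB):

```python
import numpy as np

# Small stand-in for the real array (rows = timesteps, cols = variables).
a = np.arange(12.0).reshape(6, 2)

# Split into 4 row-wise chunks; np.array_split allows uneven sizes
# (here 2 + 2 + 1 + 1 rows).
chunks = np.array_split(a, 4, axis=0)

# Each chunk can be np.save()d into its own Media row, and
# np.vstack(chunks) rebuilds the original array when needed.
restored = np.vstack(chunks)
```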

Thanks. The np array holds the simulation results by simulated timestep (one row per timestep) for all model variables (one column each). So I can break the array up into logical time slots, and with 4 row-wise slices it seems to fit within the RAM one gets on the Hobby tier.
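If the server then loads only the chunk covering the requested timestep, instead of reassembling the whole array, peak memory stays at roughly one chunk. A hypothetical index helper (the function name and the chunk sizes are made up for illustration), assuming the chunks were split row-wise and their sizes recorded:

```python
import numpy as np


def locate_timestep(t, chunk_sizes):
    """Map a global timestep (row index) to (chunk_index, local_row),
    given the number of rows stored in each chunk."""
    starts = np.cumsum([0] + list(chunk_sizes))
    chunk = int(np.searchsorted(starts, t, side="right")) - 1
    return chunk, int(t - starts[chunk])


# e.g. with 4 equal chunks of 525 rows, timestep 1300 falls in
# chunk 2 at local row 250 -- so only that chunk's Media row
# needs to be fetched and np.load()ed.
```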