Is there a max size for internal memory?

What I’m trying to do:

I am trying to create a pandas DataFrame from an internal text/csv result, and I get server terminations without any additional information. I have narrowed the error down to a single line of code.

What I’ve tried and what’s not working:

I have tried various functions to process the database result (text/csv), but they all end in the same place with the same error.

I tried pd.read_csv(StringIO(res)). I am now trying to get an iterator over the CSV instead, but I still end up with the same result.
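For reference, here is a minimal sketch of that earlier pd.read_csv attempt (same self._post_query call as in the sample below; res is the full ~125 MB text/csv string):

    from io import StringIO
    import pandas as pd

    # res holds the entire query result as one in-memory string
    res = self._post_query(query_string, accept='text/csv')
    df = pd.read_csv(StringIO(res))   # this is where the same server termination happens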

Code Sample:
Here is the latest attempt:

    # imports used below (at the top of the module)
    import csv
    from io import StringIO

    res = self._post_query(query_string, accept='text/csv')   # ~125 MB CSV string
    print("slow dataframe sparql query finished")
    print("returning a text/csv of", len(res))
    # stream the rows with csv.DictReader instead of building a DataFrame
    reader = csv.DictReader(StringIO(res))
    for i, item in enumerate(reader):
        print(f"{i} item = {item}")
    print("done with reader")

Here is the output:

going for data:

    prefix xsd: <http://www.w3.org/2001/XMLSchema#>
    prefix caa: <http://caa.org/ns#>
    select ?s ?text ?a ?dt 
     from <http://caa.org/RawDataFromTwitter> {
        ?s a caa:Tweet .
        ?s caa:twitterAccount ?a .
        ?s caa:text ?text .
        ?s caa:createdAt ?dt .
        filter ((?dt >= "2022-11-23T00:00:00"^^xsd:dateTime) && (?dt <= "2023-05-22T23:59:59"^^xsd:dateTime))      
    }
  
slow dataframe sparql query finished
returning a text/csv of 125611259
anvil.server.ExecutionTerminatedError: Server code exited unexpectedly: 177e6e5b49