Speeding up Leaflet MarkerClusters

What I’m trying to do:
Use Leaflet with the Leaflet.markercluster plugin to display ~10,000 location markers on a map, without a significant time lag.

What I’ve tried and what’s not working:
It works, but it's slow: once the lat/longs have been pulled from the database, it takes 7 seconds to plot the points.
At the moment I initialise the map and plot the points in the custom HTML of the LeafletMap object.
Once the data points are retrieved, they are passed, one by one, to the JS function that plots them, which may be what is slowing this down:

from datetime import datetime

locations = anvil.server.call('get_print_locations')
for location in locations:
  self.call_js('setClusters',
               location['name'],
               location['latitude'],
               location['longitude'],
               location['on_map'])
self.call_js('addClusterLayer', datetime.now())

Is it possible to pass all the search results at once to the JS function that sets the marker clusters?
Any other tips on how to speed this up would be much appreciated.
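For context, here's a minimal sketch of the kind of batching I have in mind: build one JSON-serialisable list on the client and make a single JS call with it (the `setAllClusters` function name and the `build_payload` helper are hypothetical, not part of my app yet):

```python
def build_payload(locations):
    # Flatten each location into a plain dict so the whole list can be
    # passed to JS in one call, instead of one call_js per row.
    return [{'name': loc['name'],
             'latitude': loc['latitude'],
             'longitude': loc['longitude'],
             'on_map': loc['on_map']}
            for loc in locations]

# In the form code this would become something like:
# payload = build_payload(anvil.server.call('get_print_locations'))
# self.call_js('setAllClusters', payload)
```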

Clone link:
https://anvil.works/build#clone:4BS3V7HLP7N3WMCK=Z43X22ZQKGGBUWRDDDA4AIAP

You may find this post helpful, as it relates to search iterators.

You can check whether it's the iteration causing the slowness by removing the call to self.call_js and timing the iteration alone. I suspect you'll find the same issue, because iterating over 10,000 data table rows on the client results in several server calls to fetch data (a search iterator is lazy and fetches rows in batches).
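A minimal sketch of that timing test, assuming the rows come from a call like `app_tables.locations.search()` (the `time_iteration` helper is just for illustration):

```python
import time

def time_iteration(rows):
    # Walk the rows without making any JS calls, so the measurement
    # isolates the cost of fetching/iterating the data itself.
    start = time.perf_counter()
    count = 0
    for _ in rows:
        count += 1
    return count, time.perf_counter() - start

# In the form code (hypothetical usage):
# count, seconds = time_iteration(app_tables.locations.search())
# print(f"iterated {count} rows in {seconds:.2f}s")
```

If the bare loop takes most of the 7 seconds, the lazy fetching is the bottleneck rather than the per-row JS calls.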

Thanks @stucork, that's really helpful to know. I've now tried the to_csv workaround mentioned in that post; however, it's not making any noticeable difference to the speed:

    import io

    import anvil.server
    import pandas as pd
    from anvil.tables import app_tables

    @anvil.server.callable
    def get_locations():
      csv = app_tables.locations.search().to_csv()
      df = pd.read_csv(io.BytesIO(csv.get_bytes()))
      locations = df.to_dict('records')
      return locations
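For comparison, here's a pandas-free variant of the same server-side conversion I could try, in case the CSV/DataFrame round trip is itself part of the cost (a sketch; `rows_to_dicts` is a hypothetical helper, and I'm assuming `dict(row)` works on data table rows):

```python
def rows_to_dicts(rows):
    # Convert lazy row objects to plain dicts entirely on the server,
    # so the client receives simple data in a single round trip and
    # does no lazy row fetching afterwards.
    return [dict(row) for row in rows]

# Server-side usage (hypothetical callable name):
# @anvil.server.callable
# def get_locations_plain():
#     return rows_to_dicts(app_tables.locations.search())
```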

Am I missing something in how I've implemented it, or does to_csv also end up making multiple calls?