So I’m currently trying to make the business case for moving our subscription up a tier. As part of this, I’d like to quantify the speed increase we’d get from using the Persistent Server.
Currently, a big part of my code sends data to MongoDB, which I do through their HTTP API. The server function involved takes around 5 seconds from the time of invocation to the request’s result being printed in the console. I was wondering what sort of timeframes I’d be looking at with the Persistent Server? Obviously I’d expect to gain some amount of time, but how much will determine whether it’s worth actually upgrading.
For further context, the app can send up to a request per second, and 5–20 users need this capability. Currently a 5 second wait time is too long, especially if multiple requests are queued up. I’ve tried using anvil_labs.non_blocking, but that didn’t seem to increase speed or lower the time between requests.
I’d also prefer to use pymongo for this use case, but found that the version currently installed by Anvil doesn’t include the load-balancing support required by newer MongoDB server versions. Even if that were resolved, I’d still be running into speed issues without the Persistent Server. Has anyone run this specific setup?
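To work out where the 5 seconds actually goes, a rough timing wrapper can separate the Mongo HTTP request itself from everything around it (cold start, other imports). This is a plain-stdlib sketch; `push_to_mongo` is a hypothetical stand-in for the real server function:

```python
import time

def timed(fn):
    """Print how long fn takes each call; a rough way to tell request
    latency apart from server-side startup overhead."""
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        print(f"{fn.__name__} took {time.perf_counter() - start:.3f}s")
        return result
    return wrapper

@timed
def push_to_mongo(payload):
    # Stand-in for the real anvil.http.request(...) call.
    return payload
```

Wrapping the real request call this way shows whether the bulk of the 5 seconds is the network round trip or the server instance spinning up.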
5 seconds sounds like a very long time. If your app imports a package that is slow to load, you can try lazy loading, so you only load it when you really need it.
I’m in the middle of the USA, and my HTTP endpoints with the Persistent Server respond in less than 0.2 seconds. If you are close to London, you’ll get even faster responses.
When a server function starts, does the time to load the server instance include every package imported in every Server Code file, or only the packages imported in that specific file? The Server Code file being used here only imports anvil.server, collections and anvil.http, but obviously if it has to load every package in every server file, that will slow things down.
I tried the Custom Packages feature, and noticed that the server instance was even slower to spin up.
I’ll take a look at your suggestions this weekend when I get a chance, appreciate the suggestions!
Edit: I should have noted that I’m in Australia, about as bad as it gets geographically
If you have slow imports that are not required by the HTTP endpoints, you can either try lazy imports (search this forum, there are some nice examples) or create a second app for the HTTP endpoints. The second app can share the tables with the main one and handle the endpoints, similar to microservices.
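A minimal lazy-import sketch, using `statistics` as a stand-in for a genuinely slow package:

```python
# Module level: keep only cheap imports here, so the server
# instance starts quickly.
import json

def summarise(values):
    # Deferred import: the package is only loaded the first time this
    # function actually runs, not when the module is imported.
    import statistics
    return statistics.mean(values)
```

Python caches modules after the first import, so only the first call to `summarise` pays the loading cost; later calls hit `sys.modules` and are effectively free.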
I might not have provided enough clarity: I’m using anvil.http to make an anvil.http.request(method='POST') call to MongoDB’s Data API, not the other way around. I’ll implement the lazy imports and see how that goes. The idea is that Anvil pushes and pulls data to/from Mongo instead of using Anvil’s built-in tables.
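For anyone following along, the body of a Data API `insertOne` call looks roughly like this. The URL, cluster name, and database/collection names below are placeholders, and in Anvil the resulting body would be passed as the `data` argument to `anvil.http.request(url, method='POST', ...)` along with your API-key header:

```python
import json

# Placeholder: substitute your own Data API app URL here.
DATA_API_URL = "https://data.mongodb-api.com/app/<app-id>/endpoint/data/v1/action/insertOne"

def build_insert_body(database, collection, document):
    """Build the JSON body the Data API's insertOne action expects."""
    return json.dumps({
        "dataSource": "Cluster0",  # assumption: default Atlas cluster name
        "database": database,
        "collection": collection,
        "document": document,
    })
```

Keeping the body construction in a small helper like this makes it easy to unit-test the payload without making a network call.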
Moving all large packages into their respective server functions has shaved at least a second off the average load time. A quick scan of my server code shows plenty of places where I can make things more efficient. I think the Persistent Server will still be the go, but I’m glad I’ve been able to nail down part of the issue. That was a failing in my understanding: for some reason I assumed a server function call would only load the packages used by that one Server Code file.
Everyone in this forum has had the same assumption!
But I guess when the Anvil infrastructure receives a request, it knows what function of what app to call, but doesn’t know what module to load, so it loads the whole app.