How to deploy an ML model on the Anvil server itself

Hi @mark.breuss
Actually, turicreate is a module developed by Apple which runs only on Windows 10 (with WSL), macOS, and Linux,
hence I have been using Google Colab on my PC to run turicreate and create my machine learning model.

I used Uplink and then created a web app using Anvil, but the problem is that
if I close my Colab notebook, the web app created with Anvil stops working.
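For context, the notebook side of my setup looks roughly like this sketch (the Uplink key, the saved model name "my_model", the predict function, and the feature names are just placeholders, not my exact code):

```python
import anvil.server
import turicreate as tc

# Connect this Colab notebook to the Anvil app via Uplink
anvil.server.connect("YOUR-UPLINK-KEY")

# Load a model previously saved with model.save("my_model")
model = tc.load_model("my_model")

@anvil.server.callable
def predict(row):
    # row: dict of feature name -> value sent from the Anvil client
    sf = tc.SFrame({k: [v] for k, v in row.items()})  # single-row SFrame
    return model.predict(sf)[0]

# Keep serving calls for as long as the notebook stays open
anvil.server.wait_forever()
```

As soon as the notebook is closed, this Uplink connection dies, which is exactly the problem.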

Hence I need my model to live on Anvil's server itself, so that my web app works anytime, anywhere.
But turicreate is not supported on Anvil's free plan, and hence I can't upload my model to Anvil's server.

So, using Anvil, is there any way to keep my model always available even if my notebook is closed?

I'll check out TensorFlow.js, thanks Mark.