Trying to Call Server Function from Uplink Jupyter Notebook

What I’m trying to do:
For my Jupyter-notebook-linked web app, I want the user to press a button that calls a server function registered in my Jupyter notebook, which in turn launches a background task from the notebook.

This works fine on my desktop while the Jupyter notebook is open, but when I publish the app (or close the notebook), I get this error: “NoServerFunctionError: No server function matching 'launch_topic_map_htmls' has been registered”.

Do I need to place launch_topic_map_htmls in a server module? Do I need to place create_topic_map_htmls in a server module as well? When does it make sense for the @anvil.server.callable and @anvil.server.background_task functions to live in a server module instead of in the Jupyter notebook? Basically, I’m trying to understand how Anvil can interact with a closed Jupyter notebook, or what happens when the web app is used from a computer that doesn’t have the notebook running.

What I’ve tried and what’s not working:

Code Sample:
In my Anvil web app I have:

```python
background_task = anvil.server.call('launch_topic_map_htmls', num_topics_low, num_topics_high)
```

In my Jupyter notebook, I have this code:
```python
import anvil
import anvil.server
import pandas as pd
from anvil.tables import app_tables

# anvil.server.connect(...) is called earlier in the notebook with the app's Uplink key


@anvil.server.callable
def launch_topic_map_htmls(num_topics_low, num_topics_high):
    task = anvil.server.launch_background_task('create_topic_map_htmls', num_topics_low, num_topics_high)
    return task


@anvil.server.background_task
def create_topic_map_htmls(num_topics_low, num_topics_high):
    all_records = app_tables.input_table.search()
    print("creating data frame")
    dicts = [{'ID': r['ID'], 'Link': r['Link'], 'Title': r['Title'],
              'Index': r['Index'], 'User': r['User'], 'Date': r['Date'],
              'Post Length': r['Post Length'], 'Article': r['Article'], 'Source': r['Source']}
             for r in all_records]
    RawNLP = pd.DataFrame.from_dict(dicts)
    # add_lemmatized_data, create_dls_id2word_corpus, build_LDA and generate_html
    # are helper functions defined elsewhere in the notebook
    l_df = add_lemmatized_data(RawNLP)
    data_lemmatized_stops, id2word, corpus = create_dls_id2word_corpus(l_df, no_below_num=3, no_above_num=0.4)

    for i in range(num_topics_low, num_topics_high + 1):
        lda_model = build_LDA(corpus, id2word, num_tops=i)
        html = generate_html(lda_model, corpus, id2word)
        html_byte = bytes(html, 'utf-8')
        html_obj = anvil.BlobMedia('text/html', html_byte, name=str(i) + "Topic Map.html")
        app_tables.html.add_row(num_topics=i, htmls=html_obj)
```


What you’re seeing is the expected behavior. Anvil can only call a function registered from your Jupyter notebook while the notebook is running and connected via the Uplink.

If you want the web app to work at any time, without your notebook being open, you need to make the functionality available in one of these places:

  1. In a Python script running continuously on your desktop (or another computer you control) that stays connected to the internet via the Uplink (see the sketch after this list).
  2. In a third-party cloud service, something like this: Deploy a Google Colab Notebook with Docker
  3. In an Anvil server module.
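
For option 1, here is a minimal sketch of what such an always-on Uplink script could look like (the key is a placeholder you’d copy from your app’s Uplink settings, and the functions are the same ones you already have in the notebook):

```python
# always_on_uplink.py -- run this on a machine that stays online.
import anvil.server

anvil.server.connect("YOUR-UPLINK-KEY")  # placeholder; use your app's real Uplink key


@anvil.server.callable
def launch_topic_map_htmls(num_topics_low, num_topics_high):
    # Same function you currently register from the notebook; as long as
    # this script keeps running, the published app can call it.
    return anvil.server.launch_background_task('create_topic_map_htmls',
                                               num_topics_low, num_topics_high)


# The @anvil.server.background_task create_topic_map_htmls function and its
# helpers would live in this script too.

# Block forever so the registered functions stay available.
anvil.server.wait_forever()
```

Run it with `python always_on_uplink.py` and leave it running; if the script stops, you’re back to the NoServerFunctionError you’re seeing now.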

Regarding option #3: on the free plan you don’t get a full Python environment in server modules, so you can’t install packages beyond the standard set Anvil provides. But if you want to test things that need a full Python environment before making the leap to a paid plan, the Uplink lets you do that. Another alternative is to deploy your own Anvil runtime to the cloud.
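
To make option #3 concrete: if the packages your code needs are available in the server environment, those two functions can move out of the notebook and into a server module essentially unchanged. A rough sketch:

```python
# Server module sketch -- same structure as the notebook code, assuming the
# packages it needs (pandas and so on) are available server-side.
import anvil.server
from anvil.tables import app_tables


@anvil.server.callable
def launch_topic_map_htmls(num_topics_low, num_topics_high):
    return anvil.server.launch_background_task('create_topic_map_htmls',
                                               num_topics_low, num_topics_high)


@anvil.server.background_task
def create_topic_map_htmls(num_topics_low, num_topics_high):
    # ...same body as the notebook version, minus the Uplink connection...
    ...
```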

The bottom line is that keeping a full Python server environment available in an always-on way costs money (or requires using your own hardware), which just kind of makes sense.

Thanks for the reply, Tim.

I’m currently on a paid Anvil plan. So should I essentially move all of my Jupyter notebook code into an Anvil server module? My code also depends on nltk.corpus, which requires me to download nltk data. Is that even possible in an Anvil server module?


If any of your imports require pip-installing a package that isn’t already in the list, you’ll need to ask Anvil staff to add it: Anvil Docs | List of Packages
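
For the nltk data itself (as opposed to the nltk package), one thing you could try (I haven’t tested this on Anvil’s servers) is downloading the data at runtime into a writable directory:

```python
# Untested sketch: fetch nltk data at runtime into a writable directory.
# Assumes the nltk package itself is already available server-side.
import nltk

NLTK_DATA_DIR = "/tmp/nltk_data"   # assumed writable; re-downloaded each session
nltk.download("stopwords", download_dir=NLTK_DATA_DIR, quiet=True)
nltk.download("wordnet", download_dir=NLTK_DATA_DIR, quiet=True)
nltk.data.path.append(NLTK_DATA_DIR)

from nltk.corpus import stopwords
stop_words = stopwords.words("english")
```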

But it sounds like you’re talking about something different. Is it something where you can train the model locally, then load the trained parameters into an Anvil Data Table or something (via Uplink, perhaps) so that the function can run from a server module without having access to the full corpus?
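
For example (a hypothetical sketch: the trained_models table, its columns, and the function names are all made up), you could pickle the fitted model locally, push it into a Data Table over the Uplink, and unpickle it in a server module:

```python
# Hypothetical sketch: store a locally-trained model in a Data Table via Uplink,
# then load it in a server module. The 'trained_models' table is made up.

# --- Locally, in the notebook (connected via Uplink) ---
import pickle
import anvil
import anvil.server
from anvil.tables import app_tables

anvil.server.connect("YOUR-UPLINK-KEY")   # placeholder key


def save_trained_model(lda_model, model_name="lda_model"):
    """Pickle a model trained locally (e.g. with build_LDA) and store it."""
    model_bytes = pickle.dumps(lda_model)
    media = anvil.BlobMedia("application/octet-stream", model_bytes,
                            name=model_name + ".pkl")
    app_tables.trained_models.add_row(name=model_name, model=media)


# --- In a server module ---
import pickle
from anvil.tables import app_tables


def load_trained_model(model_name="lda_model"):
    row = app_tables.trained_models.get(name=model_name)
    return pickle.loads(row["model"].get_bytes())
```

That way the heavy corpus processing stays on your machine, and the server module only needs the finished model.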

Otherwise, options 1 or 2 might be the way to go. But I haven’t tried anything like this myself, so someone else may have a better answer.