I am trying to build a web app that allows people to upload videos directly to our AWS S3 bucket.
Since client code has a 30-second timeout and these uploads may take longer than that due to file size, background tasks seem to be the way to go.
I have managed to get it working for smaller files (< 0.2 MB); however, as soon as I try a file over 1 MB, the background task runs forever and nothing gets uploaded to S3.
I need to get these videos into S3 because we have workflows running in AWS that need the data in the S3 buckets. Therefore Google Drive and Anvil Data Tables can't be used.
Two questions:
Are background tasks the correct way to do this? If not, what should I be doing?
If yes to the above, how do I get this working for larger files?
Looking at my old code, I realized it was actually being executed via an Uplink. I tried to get it working on the Anvil server, but due to IP location restrictions I had to execute it on the Uplink. I do believe I tested this code on the server, though:
import boto3
from boto3.s3.transfer import S3Transfer

# Initialize a session (this works with S3 or S3-compatible storage
# such as DigitalOcean Spaces).
session = boto3.session.Session()
client = session.client(
    's3',
    region_name='region_here',
    endpoint_url='endpoint_url_here',
    aws_access_key_id='access_key_here',
    aws_secret_access_key='super_secret_thing_here',
)
transfer = S3Transfer(client)
transfer.upload_file('source_file_path', 'bucket_name', 'destination_path')
I did run into an issue that I blamed on the source of the files having a location-based IP address restriction. But perhaps it wasn't that?
In my case it was a local file on the machine file system. Depends on where your file is coming from.
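Since `upload_file` expects a path on the local file system, an uploaded file that arrives as an in-memory object has to be written to disk first. A small sketch of that step, assuming you can get the raw bytes out of the uploaded object (Anvil Media objects expose `get_bytes()`); the `.mp4` suffix and the follow-up call are illustrative:

```python
import os
import tempfile

def save_to_temp(data: bytes, suffix: str = ".mp4") -> str:
    """Write raw bytes (e.g. from an uploaded Media object's get_bytes())
    to a temporary file and return its path for boto3 to upload."""
    fd, path = tempfile.mkstemp(suffix=suffix)
    with os.fdopen(fd, "wb") as f:
        f.write(data)
    return path

# In a server function you might then do (hypothetical names):
#   path = save_to_temp(file.get_bytes())
#   transfer.upload_file(path, 'bucket_name', 'destination_path')
```

Remember to delete the temp file after the upload finishes, or server disk usage will grow with each video.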
I had a tool to grab parliament recordings and upload them to S3-compatible storage. Anvil servers wouldn't work because the site is only accessible in Mongolia, so I ran the code in an Uplink and the app managed some other stuff.
I ended up giving up on the app, but the boto3 code did work.
Normally when people ask that question, they're talking about uploading large files from the client to the Anvil servers. But you've posted in a topic about S3.
I haven't worked with S3, so I can't help on best practices there. I don't know if going by way of the Anvil server is going to be best (so you can use the boto3 package there), or if you'd be better off using Javascript libraries for working with S3 directly from the browser.
If you end up going by way of the Anvil server, let me know. I do have an example of the technique for uploading large files to it.
Yes, anything! Server is fine. I just want the user to select some files using the file uploader and for that file to go "up" somewhere where I can then move/manipulate it. And of course… I can do all that already, just not with BIG files.