Hello, I am trying to use the uplink because I need to import a CSV file into a table. I am testing my uplink key with this generic code, after installing the uplink on my machine with the command:
pip install anvil-uplink
And I use this code:
import anvil.server

anvil.server.connect("<uplink key>")

@anvil.server.callable
def say_hello(name):
    print("Hello from the uplink, %s!" % name)

anvil.server.wait_forever()
When I run this in Visual Studio I get this output:
Connecting to wss://anvil.works/uplink
Anvil websocket open
Connected to "Default environment (published)" as SERVER
Shouldn't I get the "hello" print output instead? I am not sure how to connect this to my Anvil app so I can use my code to import a CSV into my table…
By publishing your uplink key, you have allowed anyone unrestricted access to your app and its Data Tables. I suggest you reset that key quickly.
As for your question, everything looks like it's working properly. You need to call your say_hello function from within your Anvil app in order to see the message printed.
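For example, if your app has a Form with a button (Form1 and button_1 below are just the default designer names, stand-ins for whatever your form actually uses), the click handler could look roughly like this, and the greeting would then print in the terminal where your uplink script is running:

from ._anvil_designer import Form1Template
import anvil.server

class Form1(Form1Template):
    def __init__(self, **properties):
        self.init_components(**properties)

    def button_1_click(self, **event_args):
        # Runs say_hello on your machine over the uplink;
        # the "Hello from the uplink..." line prints there, not in the app.
        anvil.server.call('say_hello', 'World')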
@owen.campbell
Thank you for the fast answer.
Yes, I changed my uplink key right away.
Since I am new I didn't know about that; I will now get to work on the CSV uploading code.
Thank you!
Run the code first and see whether what prints out is actually malformed data, or just some junk in the CSV file like a header row, a blank line, or missing data, etc.
If whatever it prints does not look important, uncomment the #app_tables… line and you should be able to insert the data while skipping the irrelevant lines (if they really are irrelevant).
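The CSV code being discussed isn't quoted above, but a minimal sketch of that kind of loop over the uplink might look like this (my_table and its name/value columns, plus the data.csv path, are placeholders for whatever your app actually has):

import csv
import anvil.server
from anvil.tables import app_tables

anvil.server.connect("<uplink key>")

with open("data.csv", newline="") as f:
    for row in csv.reader(f):
        # Print first so header rows, blank lines or bad values are easy to spot
        print(row)
        # Once the output looks clean, uncomment the insert:
        # app_tables.my_table.add_row(name=row[0], value=row[1])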
On another note, there is no unique constraint available in Data Tables (that I could tell) for any column, so if you have already run your code once, you may already have partial data in the table. If you check the table and would like to load the data from scratch, but you don't want to go through making the table again, you can use the truncate function to clear the table of all data while keeping its structure.
In the Data Table view it is the icon that looks like a trash can (er, 'rubbish bin').
Loading the same data twice will create duplicate records with that simple CSV method.
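Since there is no unique constraint to lean on, one way to handle this from code rather than the IDE (again using the hypothetical my_table with a name column) is to clear the table before a fresh load, or to check for an existing row before each insert:

from anvil.tables import app_tables

# Option 1: wipe all rows but keep the table's columns, then re-import
app_tables.my_table.delete_all_rows()

# Option 2: treat one column as a pseudo-key and skip rows already present.
# get() returns None when nothing matches (and raises if more than one row does).
def add_if_missing(name, value):
    if app_tables.my_table.get(name=name) is None:
        app_tables.my_table.add_row(name=name, value=value)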