A few times I have read about imports that take a long time to load, making every server call slower than it needs to be.
The answer I have seen most often is to move the slow import inside the function that uses it, which breaks Python convention (PEP 8 puts imports at the top of the file) and requires you to remember exactly what to import for each function.
It also means duplicating the import if more than one function needs it.
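For illustration, the in-function workaround looks like this (a minimal sketch; the handler name and its payload are hypothetical):

```python
def handler(event):
    # Conventional workaround: import inside the function.
    # The cost is paid on the first call only, but this import line has to
    # be repeated in every function that needs json.
    import json
    return json.dumps(event)

print(handler({'x': 2}))  # → {"x": 2}
```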
So, inspired by the two forum threads above, I wrote a wrapper function that uses a global dictionary of import statements, keyed by the name you use for the module:
LAZY_IMPORT_DICT = {
    'json': 'import json',
    'timedelta': 'from datetime import timedelta',
    'datetime': 'from datetime import datetime',
    'date': 'from datetime import date',
    'googlebuild': 'from googleapiclient.discovery import build as googlebuild',
    'pd': 'import pandas as pd',
    'np': 'import numpy as np',
    'plt': 'import matplotlib.pyplot as plt',
    'tf': 'import tensorflow as tf',
    'load_iris': 'from sklearn.datasets import load_iris',
    # '': '',
}
from functools import wraps
from dis import get_instructions

def lazy_load_imports(func):
    @wraps(func)
    def wrapper_func(*args, **kwargs):
        # Every name referenced by the function's bytecode
        argval_set = {str(x.argval) for x in get_instructions(func)}
        for import_key in argval_set.intersection(LAZY_IMPORT_DICT.keys()):
            if import_key not in func.__globals__:
                # Run the import in the function's own global namespace so
                # the name is actually visible when func executes; a bare
                # exec() would bind it in this wrapper's locals and lose it
                exec(LAZY_IMPORT_DICT[import_key], func.__globals__)
        return func(*args, **kwargs)
    return wrapper_func
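The name-detection step works because dis.get_instructions exposes the argval of every bytecode instruction, so any global name a function references (such as a module) shows up in the set. A quick check, with a hypothetical demo function:

```python
from dis import get_instructions

def demo():
    # json is deliberately not imported here; we only inspect the bytecode,
    # we never call demo(), so no NameError is raised
    return json.dumps({'k': 1})

names = {str(i.argval) for i in get_instructions(demo)}
print('json' in names)  # → True
```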
Each value in LAZY_IMPORT_DICT is an import statement, and each key is the name that statement binds, exactly as you use it in your code.
Then you decorate any function you want lazy imports on with @lazy_load_imports, and imports are loaded from the dictionary only if they are used in that function (and exist in the dictionary, obviously), reducing load time for large imports to exactly what the called function needs.
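Putting it together, here is a minimal end-to-end sketch. The summarize function and the trimmed-down dictionary are just examples; note that exec targets func.__globals__ so the imported name is visible to the decorated function:

```python
from functools import wraps
from dis import get_instructions

# Trimmed-down example dictionary: only entries a function actually
# references get imported, so pandas is never loaded below.
LAZY_IMPORT_DICT = {
    'json': 'import json',
    'pd': 'import pandas as pd',
}

def lazy_load_imports(func):
    @wraps(func)
    def wrapper_func(*args, **kwargs):
        # Names referenced by the function's bytecode
        argval_set = {str(x.argval) for x in get_instructions(func)}
        for import_key in argval_set.intersection(LAZY_IMPORT_DICT.keys()):
            if import_key not in func.__globals__:
                # Import into the function's global namespace
                exec(LAZY_IMPORT_DICT[import_key], func.__globals__)
        return func(*args, **kwargs)
    return wrapper_func

@lazy_load_imports
def summarize(payload):
    # json is not imported at module level; the decorator loads it on call
    return json.dumps(payload)

print(summarize({'a': 1}))  # → {"a": 1}
```

Because summarize's bytecode references 'json' but not 'pd', only the json import runs; the subsequent `not in func.__globals__` guard also keeps repeat calls from re-executing it.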