I have one Python script. Let's call it controller.py. I'd like to use controller.py to run another Python script and pass several variables to it. Let's call the second script analyzer.py.
What is the best way to do this without importing analyzer.py as a module? And how do I reference the variables I'm passing to analyzer.py within that script?
Here's my failed attempt using subprocess:
controller.py
import subprocess

var1 = 'mytxt'
var2 = 100
var3 = True
var4 = [['x','y','z'],['x','c','d']]
var5 = r"C:\\Users\\me\\file.txt"

myargs = var1, var2, var3, var4, var5
my_lst_str = ' '.join(map(str, myargs))
my_lst_str = 'python analyzer.py ' + my_lst_str
subprocess.call(my_lst_str, shell=True)

analyzer.py
import sys

print 'Argument List:', str(sys.argv)

I have looked through similar questions on Stack Overflow. One oft-recommended solution I tried was importing analyzer.py as a module, but analyzer.py defines many different functions. Using it as a module creates lots of nested functions, and managing the scope of variables within those nested functions was cumbersome.
I need to use Python 2 for these scripts. I'm on a Windows 10 machine.
Did you try from analyzer import *? That would import everything. If you just want a single function from the module, import only that, e.g. from analyzer import the_function.

Have as few global variables as possible (ideally zero: constants are okay, but they are better expressed as Enums). If a function needs a variable, pass it in as a parameter. If you keep passing the same parameters, group them into a class or a namedtuple (see the sketch below).

As for subprocess: you can't just convert a list of lists into a string and then expect Python to convert it back on its own. You're doing things the hard way, but if you insist, look at ast.literal_eval (a sketch follows the namedtuple example).
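To illustrate the grouping point, here is a minimal sketch; the AnalyzerConfig name and its fields are hypothetical, loosely modeled on the variables in the question:

from collections import namedtuple

# Hypothetical bundle of the values the analyzer functions keep needing,
# instead of five separate globals.
AnalyzerConfig = namedtuple('AnalyzerConfig', ['label', 'count', 'verbose', 'groups', 'path'])

config = AnalyzerConfig(label='mytxt', count=100, verbose=True,
                        groups=[['x', 'y', 'z'], ['x', 'c', 'd']],
                        path=r"C:\Users\me\file.txt")

def analyze(config):
    # One parameter carries everything the function needs.
    print('Analyzing %s with %d rows' % (config.path, config.count))

analyze(config)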
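And for the subprocess route the question actually asks about, a minimal sketch, assuming both scripts sit in the same directory and reusing the question's variable names: each value is serialized with repr() on the sending side and rebuilt with ast.literal_eval on the receiving side, and the arguments are passed to subprocess.call as a list (not one shell string) so Windows quoting is handled for you.

controller.py

import subprocess

var1 = 'mytxt'
var2 = 100
var3 = True
var4 = [['x', 'y', 'z'], ['x', 'c', 'd']]
var5 = r"C:\Users\me\file.txt"

# One argument per value; repr() keeps quotes and brackets intact so the
# receiving script can safely rebuild the original objects.
args = ['python', 'analyzer.py'] + [repr(v) for v in (var1, var2, var3, var4, var5)]
subprocess.call(args)

analyzer.py

import ast
import sys

# sys.argv[0] is the script name; everything after it is a repr() string
# produced by controller.py. literal_eval turns each one back into a real
# Python object (string, int, bool, list of lists, ...).
var1, var2, var3, var4, var5 = [ast.literal_eval(a) for a in sys.argv[1:]]

print 'var4 is a list again:', var4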