
How can I turn off PySpark logging from a Python script? Please note: I do not want to make any changes to the Spark logger properties file.

1 Answer


To remove (or modify) logging from a Python script:

from pyspark import SparkConf
from pyspark.sql import SparkSession

conf = SparkConf()
conf.set('spark.logConf', 'true')  # necessary in order to be able to change log level
...  # other stuff and configuration

# create the session
spark = SparkSession.builder \
    .config(conf=conf) \
    .appName(app_name) \
    .getOrCreate()

# set the log level to one of ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN
spark.sparkContext.setLogLevel("OFF")

Docs: configuration

Docs: setLogLevel

Hope this helps, good luck!

Edit: For earlier versions, e.g. 1.6, you can try something like the following, taken from here

logger = sc._jvm.org.apache.log4j
logger.LogManager.getLogger("org").setLevel(logger.Level.OFF)
# or
logger.LogManager.getRootLogger().setLevel(logger.Level.OFF)

I haven't tested it, unfortunately; please let me know if it works.
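For reference, here is a minimal end-to-end sketch of the 1.6 approach (also untested; the app name is just a placeholder, and it silences the same loggers as the snippet above):

from pyspark import SparkConf, SparkContext

# create the context the 1.6 way (SparkSession does not exist yet)
conf = SparkConf().setAppName("quiet_app")  # "quiet_app" is a placeholder name
sc = SparkContext(conf=conf)

# reach into the JVM and silence log4j, as in the snippet above
log4j = sc._jvm.org.apache.log4j
log4j.LogManager.getLogger("org").setLevel(log4j.Level.OFF)
log4j.LogManager.getRootLogger().setLevel(log4j.Level.OFF)

# ... your job code ...

sc.stop()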


3 Comments

I am working on Spark 1.6, which doesn't support SparkSession :( Could you please suggest some other solution?
@eiram_mahera I've edited the answer; let me know if it works for you :)
This still outputs log messages while the session is being initialized. Any way to remove those?
