When I run the following inside IPython Notebook I don't see any output:
    import logging
    logging.basicConfig(level=logging.DEBUG)
    logging.debug("test")

Does anyone know how to make it so I can see the "test" message inside the notebook?
Try the following:
    import logging
    logger = logging.getLogger()
    logger.setLevel(logging.DEBUG)
    logging.debug("test")

According to the documentation for logging.basicConfig:
Does basic configuration for the logging system by creating a StreamHandler with a default Formatter and adding it to the root logger. The functions debug(), info(), warning(), error() and critical() will call basicConfig() automatically if no handlers are defined for the root logger.
This function does nothing if the root logger already has handlers configured for it.
It seems that IPython Notebook calls basicConfig (or sets a handler) somewhere.
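If you want to verify this in your own environment, one quick check (a minimal sketch, not part of the original answer) is to inspect the root logger directly:

    import logging

    root = logging.getLogger()
    # If this list is non-empty, basicConfig() will silently do nothing.
    print(root.handlers)
    # The default effective level is WARNING (30) unless something changed it.
    print(root.getEffectiveLevel())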
Update: as of ipykernel 4.5 (possibly as early as 4.4), no handler is attached to the root logger when it is created, so logging.basicConfig does take effect (see github.com/jupyter/notebook/issues/1397). If you still want to use basicConfig, reload the logging module like this:
    from importlib import reload  # Not needed in Python 2
    import logging
    reload(logging)
    logging.basicConfig(format='%(asctime)s %(levelname)s:%(message)s',
                        level=logging.DEBUG,
                        datefmt='%I:%M:%S')

(In Python 3, reload is no longer a builtin; use importlib.reload, formerly imp.reload.) The reload trick helped me, since I didn't see log messages in the Jupyter console. Thanks!

My understanding is that the IPython session starts up logging, so basicConfig doesn't work. Here is the setup that works for me (I wish it were not so gross looking, since I want to use it for almost all my notebooks):
    import logging

    logger = logging.getLogger()
    fhandler = logging.FileHandler(filename='mylog.log', mode='a')
    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    fhandler.setFormatter(formatter)
    logger.addHandler(fhandler)
    logger.setLevel(logging.DEBUG)

Now when I run:
    logging.error('hello!')
    logging.debug('This is a debug message')
    logging.info('this is an info message')
    logging.warning('tbllalfhldfhd, warning.')

I get a "mylog.log" file in the same directory as my notebook that contains:
    2015-01-28 09:49:25,026 - root - ERROR - hello!
    2015-01-28 09:49:25,028 - root - DEBUG - This is a debug message
    2015-01-28 09:49:25,029 - root - INFO - this is an info message
    2015-01-28 09:49:25,032 - root - WARNING - tbllalfhldfhd, warning.

Note that if you rerun this without restarting the IPython session, it will write duplicate entries to the file, since there would now be two file handlers defined.
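One way to avoid the duplicate handlers when re-running the cell (a sketch of my own, not from the original answer) is to check the root logger's handlers before adding a new one:

    import logging

    logger = logging.getLogger()
    # Only attach a FileHandler if none is present yet, so re-running the
    # cell does not produce duplicate lines in mylog.log.
    if not any(isinstance(h, logging.FileHandler) for h in logger.handlers):
        fhandler = logging.FileHandler(filename='mylog.log', mode='a')
        fhandler.setFormatter(logging.Formatter(
            '%(asctime)s - %(name)s - %(levelname)s - %(message)s'))
        logger.addHandler(fhandler)
    logger.setLevel(logging.DEBUG)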
Bear in mind that stderr is the default stream for the logging module, so in IPython and Jupyter notebooks you might not see anything unless you configure the stream to stdout:
    import logging
    import sys

    logging.basicConfig(format='%(asctime)s | %(levelname)s : %(message)s',
                        level=logging.INFO,
                        stream=sys.stdout)
    logging.info('Hello world!')

As of Python 3.8, basicConfig accepts a force parameter that removes any existing handlers from the root logger, which allows basicConfig to work in the notebook. This worked on IPython version 7.29.0 and Jupyter Lab version 3.2.1.
    import logging

    logging.basicConfig(level=logging.DEBUG, force=True)
    logging.debug("test")

After redirecting logs to the REPL console, each keystroke may emit its own logging message. For example, the ptpython REPL begins emitting these log messages after each keystroke:
    >>> h
    DEBUG:parso.python.diff:line_lengths old: 1; new: 1
    DEBUG:parso.python.diff:diff parser start
    DEBUG:parso.python.diff:-> code[replace] old[1:1] new[1:1]
    DEBUG:parso.python.diff:line_lengths old: 1; new: 1
    DEBUG:parso.python.diff:parse_part from 1 to 1 (to 0 in part parser)
    DEBUG:parso.python.diff:-> code[replace] old[1:1] new[1:1]
    [F2] Menu - CPython 3.8.10
    DEBUG:parso.python.diff:diff parser end
    DEBUG:parso.python.diff:parse_part from 1 to 1 (to 0 in part parser)
    DEBUG:parso.python.diff:diff parser end
    DEBUG:asyncio:Using proactor: IocpProactor
    >>> h

The REPL becomes so noisy it's unusable.
These global logger instances can be quieted by setting the log levels to something less verbose like logging.WARNING.
    >>> logging.getLogger("parso.python.diff").setLevel(logging.WARNING)
    >>> logging.getLogger("asyncio").setLevel(logging.WARNING)

This is probably applicable to other REPLs. Just substitute the appropriate noisy logger name in the call to getLogger.
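If you are not sure which logger name to pass, you can list the loggers that have been created so far and pick out the noisy one (a sketch that relies on the logging module's internal manager, so treat it as a debugging aid rather than a stable API):

    import logging

    # Names of all loggers instantiated so far in this interpreter.
    for name in sorted(logging.root.manager.loggerDict):
        print(name)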
What worked for me (Jupyter notebook server 5.4.1, IPython 7.0.1):
    import logging

    logging.basicConfig()
    logger = logging.getLogger('Something')
    logger.setLevel(logging.DEBUG)

Now I can use logger to print info; otherwise, I would see only messages at the default level (logging.WARNING) or above.
Note that the call to basicConfig() is still needed to make this work.

You can configure logging by running:

    %config Application.log_level="INFO"
For more information, see IPython kernel options
Running %config to see the supported classes, Application is not one of them (IPython 7.9.0 here).

I wanted a simple and straightforward answer to this, with nicely styled output, so here's my recommendation:
    import sys
    import logging

    logging.basicConfig(
        format='%(asctime)s [%(levelname)s] %(name)s - %(message)s',
        level=logging.INFO,
        datefmt='%Y-%m-%d %H:%M:%S',
        stream=sys.stdout,
    )
    log = logging.getLogger('notebook')

Then you can use log.info() or any of the other logging levels anywhere in your notebook, with output that looks like this:
    2020-10-28 17:07:08 [INFO] notebook - Hello world
    2020-10-28 17:12:22 [INFO] notebook - More info here
    2020-10-28 17:12:22 [INFO] notebook - And some more

stream=sys.stdout worked for me in Jupyter Lab.

I set up a logger to write to a file, and I also wanted its output to show up in the notebook. It turns out that adding a FileHandler clears out the default stream handler.
    logger = logging.getLogger()
    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')

    # Setup file handler
    fhandler = logging.FileHandler('my.log')
    fhandler.setLevel(logging.DEBUG)
    fhandler.setFormatter(formatter)

    # Configure stream handler for the cells
    chandler = logging.StreamHandler()
    chandler.setLevel(logging.DEBUG)
    chandler.setFormatter(formatter)

    # Add both handlers
    logger.addHandler(fhandler)
    logger.addHandler(chandler)
    logger.setLevel(logging.DEBUG)

    # Show the handlers
    logger.handlers

    # Log something
    logger.info("Test info")
    logger.debug("Test debug")
    logger.error("Test error")

Here logger is the root logger. I think it's better practice to create a new logger, e.g. with getLogger(__name__); this is recommended by the docs (second paragraph here: docs.python.org/3/library/logging.html#logger-objects).

    import logging

    # make a handler
    handler = logging.StreamHandler()
    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    handler.setFormatter(formatter)

    # add it to the root logger
    logging.getLogger().addHandler(handler)

    # make a logger for this notebook, set verbosity
    logger = logging.getLogger(__name__)
    logger.setLevel('DEBUG')

    # send messages
    logger.debug("debug message")
    logger.info("so much info")
    logger.warning("you've been warned!")
    logger.error("bad news")
    logger.critical("really bad news")

    2021-09-02 18:18:27,397 - __main__ - DEBUG - debug message
    2021-09-02 18:18:27,397 - __main__ - INFO - so much info
    2021-09-02 18:18:27,398 - __main__ - WARNING - you've been warned!
    2021-09-02 18:18:27,398 - __main__ - ERROR - bad news
    2021-09-02 18:18:27,399 - __main__ - CRITICAL - really bad news

The same approach works for controlling the verbosity of third-party loggers, for example:

    logging.getLogger('google').setLevel('DEBUG')

    from google.cloud import storage
    client = storage.Client()

    2021-09-02 18:18:27,415 - google.auth._default - DEBUG - Checking None for explicit credentials as part of auth process...
    2021-09-02 18:18:27,416 - google.auth._default - DEBUG - Checking Cloud SDK credentials as part of auth process...
    2021-09-02 18:18:27,416 - google.auth._default - DEBUG - Cloud SDK credentials not found on disk; not using them
    ...

When using loggers in .ipynb notebooks, specifically on Google Colab or Kaggle, it is important to clear any existing loggers and add a StreamHandler explicitly, since a cell can be rerun.
Doing something like this always works for me.
    import logging
    from datetime import datetime

    # Configure the logger
    logger = logging.getLogger()  # Get the root logger
    logger.setLevel(logging.DEBUG)  # Set the root logger to DEBUG

    # Remove all existing handlers (to prevent duplicate logging)
    if logger.hasHandlers():
        logger.handlers.clear()

    # Create console handler
    console_handler = logging.StreamHandler()
    console_handler.setLevel(logging.DEBUG)  # Console handler also listens at DEBUG level
    formatter = logging.Formatter('%(asctime)s %(levelname)s:%(message)s',
                                  datefmt='%d-%m-%Y %I:%M:%S %p')
    console_handler.setFormatter(formatter)

    # Add console handler to the logger
    logger.addHandler(console_handler)

    # Add FileHandler
    fhandler = logging.FileHandler(filename=f'{int(datetime.utcnow().timestamp())}.log', mode='a')
    fhandler.setFormatter(formatter)
    logger.addHandler(fhandler)
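With that configuration in place, a quick sanity check (assuming the cell above has already run) is to emit a few messages and confirm they show up both inline in the notebook and in the timestamped .log file:

    # Each message goes to the StreamHandler (notebook cell output)
    # and to the FileHandler (the timestamped .log file).
    logging.debug("debug message")
    logging.info("info message")
    logging.warning("warning message")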
ipython3 notebook --version returns 1.0.0.