220

When I run the following inside IPython Notebook I don't see any output:

import logging
logging.basicConfig(level=logging.DEBUG)
logging.debug("test")

Anyone know how to make it so I can see the "test" message inside the notebook?

4
  • What version of IPython are you using, since this works in 1.0? Commented Sep 13, 2013 at 13:12
  • @ViktorKerkez ipython3 notebook --version returns 1.0.0 Commented Sep 13, 2013 at 13:19
  • imgur.com/1b7nGZz I get this when I try your code. Commented Sep 13, 2013 at 13:43
  • @ViktorKerkez: Ya I don't get that, guess I should file an issue... Commented Sep 13, 2013 at 13:47

11 Answers

214

Try the following:

import logging
logger = logging.getLogger()
logger.setLevel(logging.DEBUG)
logging.debug("test")

According to the logging.basicConfig documentation:

Does basic configuration for the logging system by creating a StreamHandler with a default Formatter and adding it to the root logger. The functions debug(), info(), warning(), error() and critical() will call basicConfig() automatically if no handlers are defined for the root logger.

This function does nothing if the root logger already has handlers configured for it.

It seems that the IPython notebook calls basicConfig (or sets a handler) somewhere.
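You can check this yourself with a quick sketch: inspect the root logger's handlers, and if the list is non-empty, basicConfig() will silently do nothing, so you can set the level on the root logger directly instead.

```python
import logging

# If this list is non-empty, basicConfig() will be a no-op, because
# the root logger already has handlers configured for it.
root = logging.getLogger()
print(root.handlers)

# In that case, bypass basicConfig and lower the level directly:
root.setLevel(logging.DEBUG)
```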


7 Comments

The same occurs in a normal IPython console: it doesn't print anything, unless a root logger is created.
This solution works again in ipykernel 4.5 (possibly as early as 4.4) github.com/jupyter/notebook/issues/1397
This does not work any more, at least not with Jupyter Notebook 5.3.0.
Works for me with the following jupyter installation:
$ jupyter --version
jupyter core     : 4.6.3
jupyter-notebook : not installed
qtconsole        : not installed
ipython          : 7.18.1
ipykernel        : 5.3.4
jupyter client   : 6.1.7
jupyter lab      : not installed
nbconvert        : not installed
ipywidgets       : not installed
nbformat         : not installed
traitlets        : 5.0.5
Note that a restart of the notebook kernel seems to be required before changes in logging.basicConfig take effect.
95

If you still want to use basicConfig, reload the logging module like this:

from importlib import reload  # Not needed in Python 2
import logging
reload(logging)
logging.basicConfig(format='%(asctime)s %(levelname)s:%(message)s',
                    level=logging.DEBUG,
                    datefmt='%I:%M:%S')

4 Comments

For anyone trying to do this in Python 3: reload is now imp.reload
as of Python 3.5, you should use importlib.reload as the imp module is being deprecated.
If anyone is having trouble with Spyder with logging (where all attempts at modifying logger behavior were unsuccessful), this just ended a day-long goose-chase. github.com/spyder-ide/spyder/issues/2572 Thanks a lot !
reload helped me, since I didn't see log messages in the jupyter console. Thanks!
44

My understanding is that the IPython session starts up logging so basicConfig doesn't work. Here is the setup that works for me (I wish this was not so gross looking since I want to use it for almost all my notebooks):

import logging

logger = logging.getLogger()
fhandler = logging.FileHandler(filename='mylog.log', mode='a')
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
fhandler.setFormatter(formatter)
logger.addHandler(fhandler)
logger.setLevel(logging.DEBUG)

Now when I run:

logging.error('hello!')
logging.debug('This is a debug message')
logging.info('this is an info message')
logging.warning('tbllalfhldfhd, warning.')

I get a "mylog.log" file in the same directory as my notebook that contains:

2015-01-28 09:49:25,026 - root - ERROR - hello!
2015-01-28 09:49:25,028 - root - DEBUG - This is a debug message
2015-01-28 09:49:25,029 - root - INFO - this is an info message
2015-01-28 09:49:25,032 - root - WARNING - tbllalfhldfhd, warning.

Note that if you rerun this without restarting the IPython session, it will write duplicate entries to the file, since there would then be two file handlers defined.
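One way around the duplicate-handler problem (a sketch, not part of the original answer) is to guard the addHandler call, so rerunning the cell is harmless:

```python
import logging

logger = logging.getLogger()

# Only attach the file handler if one isn't already present, so
# rerunning this cell doesn't produce duplicate log entries.
if not any(isinstance(h, logging.FileHandler) for h in logger.handlers):
    fhandler = logging.FileHandler(filename='mylog.log', mode='a')
    fhandler.setFormatter(logging.Formatter(
        '%(asctime)s - %(name)s - %(levelname)s - %(message)s'))
    logger.addHandler(fhandler)
```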

2 Comments

To make this less "gross looking", put the code in a module on your python path, and import it. Prettier and easy to upgrade in the future.
Or use logging.config.fileConfig('logging.conf') and put all setup in there.
34

Bear in mind that stderr is the default stream for the logging module, so in IPython and Jupyter notebooks you might not see anything unless you configure the stream to stdout:

import logging
import sys

logging.basicConfig(format='%(asctime)s | %(levelname)s : %(message)s',
                    level=logging.INFO,
                    stream=sys.stdout)
logging.info('Hello world!')

Comments

28

As of Python 3.8, a force parameter has been added to basicConfig that removes any existing handlers, which allows basicConfig to work. This worked on IPython version 7.29.0 and Jupyter Lab version 3.2.1.

import logging

logging.basicConfig(level=logging.DEBUG, force=True)
logging.debug("test")

After redirecting logs to the REPL console, each keystroke may emit its own logging message. For example, the REPL ptpython begins emitting these log messages after each keystroke:

>>> h
DEBUG:parso.python.diff:line_lengths old: 1; new: 1
DEBUG:parso.python.diff:diff parser start
DEBUG:parso.python.diff:-> code[replace] old[1:1] new[1:1]
DEBUG:parso.python.diff:line_lengths old: 1; new: 1
DEBUG:parso.python.diff:parse_part from 1 to 1 (to 0 in part parser)
DEBUG:parso.python.diff:-> code[replace] old[1:1] new[1:1]
[F2] Menu - CPython 3.8.10
DEBUG:parso.python.diff:diff parser end
DEBUG:parso.python.diff:parse_part from 1 to 1 (to 0 in part parser)
DEBUG:parso.python.diff:diff parser end
DEBUG:asyncio:Using proactor: IocpProactor
>>> h

The REPL becomes so noisy it's unusable.

These global logger instances can be quieted by setting the log levels to something less verbose like logging.WARNING.

>>> logging.getLogger("parso.python.diff").setLevel(logging.WARNING)
>>> logging.getLogger("asyncio").setLevel(logging.WARNING)

This is probably applicable to other REPLs. Just substitute the appropriate noisy logger name in the call to getLogger.

1 Comment

Please someone delete all the outdated answers. This is the way! Thanks!
23

What worked for me now (Jupyter notebook server 5.4.1, IPython 7.0.1):

import logging

logging.basicConfig()
logger = logging.getLogger('Something')
logger.setLevel(logging.DEBUG)

Now I can use the logger to print info; otherwise, I would see only messages at the default level (logging.WARNING) or above.
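To see why this works, here is a small sketch (the logger name 'Something' is just this answer's example choice): basicConfig() attaches a handler to the root logger but leaves the root's level at WARNING, while the named logger's own level is lowered to DEBUG, so its debug messages get through:

```python
import logging

logging.basicConfig()
logger = logging.getLogger('Something')
logger.setLevel(logging.DEBUG)

# The named logger is now at DEBUG, while the root logger's own
# level is unchanged, so plain logging.debug() stays suppressed.
logger.debug('shown: this logger allows DEBUG')
logging.debug('suppressed: the root logger is still at its default level')
```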

1 Comment

Yes, that works. One has to run basicConfig() to make it work.
16

You can configure logging by running:
%config Application.log_level="INFO"

For more information, see IPython kernel options

4 Comments

Welcome to StackOverflow and thanks for your help. You might want to make your answer even better by adding some explanation.
This was actually the most useful answer for me!
Can you add a few lines with an example? What is the logger handle to invoke to print log messages?
At least ipython 7.9.0 (or jupyter 6.0.2) ignores the suggested code, since it doesn't support this class from the running console. Run %config to see the supported classes; Application is not one of them. ipython 7.9.0 here.
6

I wanted a simple and straightforward answer to this, with nicely styled output, so here's my recommendation:

import sys
import logging

logging.basicConfig(
    format='%(asctime)s [%(levelname)s] %(name)s - %(message)s',
    level=logging.INFO,
    datefmt='%Y-%m-%d %H:%M:%S',
    stream=sys.stdout,
)
log = logging.getLogger('notebook')

Then you can use log.info() or any of the other logging levels anywhere in your notebook, with output that looks like this:

2020-10-28 17:07:08 [INFO] notebook - Hello world
2020-10-28 17:12:22 [INFO] notebook - More info here
2020-10-28 17:12:22 [INFO] notebook - And some more

1 Comment

stream=sys.stdout worked for me in Jupyter Lab.
5

I set up a logger to write to a file, and I also wanted it to show up in the notebook. It turns out that adding a file handler clears out the default stream handler.

import logging

logger = logging.getLogger()
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')

# Set up the file handler
fhandler = logging.FileHandler('my.log')
fhandler.setLevel(logging.DEBUG)
fhandler.setFormatter(formatter)

# Configure a stream handler for the cells
chandler = logging.StreamHandler()
chandler.setLevel(logging.DEBUG)
chandler.setFormatter(formatter)

# Add both handlers
logger.addHandler(fhandler)
logger.addHandler(chandler)
logger.setLevel(logging.DEBUG)

# Show the handlers
logger.handlers

# Log something
logger.info("Test info")
logger.debug("Test debug")
logger.error("Test error")

1 Comment

fyi - here, logger is the root logger. i think it's better practice to create a new logger eg with getLogger(__name__). this is recommended by the docs, second paragraph here: docs.python.org/3/library/logging.html#logger-objects
2

setup

import logging

# make a handler
handler = logging.StreamHandler()
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)

# add it to the root logger
logging.getLogger().addHandler(handler)

log from your own logger

# make a logger for this notebook, set verbosity
logger = logging.getLogger(__name__)
logger.setLevel('DEBUG')

# send messages
logger.debug("debug message")
logger.info("so much info")
logger.warning("you've been warned!")
logger.error("bad news")
logger.critical("really bad news")

2021-09-02 18:18:27,397 - __main__ - DEBUG - debug message
2021-09-02 18:18:27,397 - __main__ - INFO - so much info
2021-09-02 18:18:27,398 - __main__ - WARNING - you've been warned!
2021-09-02 18:18:27,398 - __main__ - ERROR - bad news
2021-09-02 18:18:27,399 - __main__ - CRITICAL - really bad news

capture logging from other libraries

logging.getLogger('google').setLevel('DEBUG')

from google.cloud import storage
client = storage.Client()

2021-09-02 18:18:27,415 - google.auth._default - DEBUG - Checking None for explicit credentials as part of auth process...
2021-09-02 18:18:27,416 - google.auth._default - DEBUG - Checking Cloud SDK credentials as part of auth process...
2021-09-02 18:18:27,416 - google.auth._default - DEBUG - Cloud SDK credentials not found on disk; not using them
...

Comments

0

When using loggers in IPython notebooks, specifically Google Colab or Kaggle, it is important to clear any existing handlers and add a StreamHandler explicitly, since a cell can be rerun.

Doing something like this always works for me:

import logging
from datetime import datetime

# Configure the logger
logger = logging.getLogger()  # Get the root logger
logger.setLevel(logging.DEBUG)  # Set the root logger to DEBUG

# Remove all existing handlers (to prevent duplicate logging)
if logger.hasHandlers():
    logger.handlers.clear()

# Create a console handler
console_handler = logging.StreamHandler()
console_handler.setLevel(logging.DEBUG)  # Console handler also listens at DEBUG level
formatter = logging.Formatter('%(asctime)s %(levelname)s:%(message)s',
                              datefmt='%d-%m-%Y %I:%M:%S %p')
console_handler.setFormatter(formatter)

# Add the console handler to the logger
logger.addHandler(console_handler)

# Add a FileHandler
fhandler = logging.FileHandler(filename=f'{int(datetime.utcnow().timestamp())}.log', mode='a')
fhandler.setFormatter(formatter)
logger.addHandler(fhandler)

Comments
