I'm creating a SharedMemory block in Python to share between multiple processes, and I've noticed that it always seems to be filled with zeros (which is fine), but I don't understand why, since the documentation doesn't mention a default value the memory is filled with.
This is my test code, run in two separate PowerShell windows. Shell 1:
```python
import numpy as np
from multiprocessing.shared_memory import SharedMemory

def get_array_nbytes(rows, cols, dtype):
    array = np.zeros((rows, cols), dtype=dtype)
    nbytes = array.nbytes
    del array
    return nbytes

rows = 10000000
depths_columns = 18

array_sm = SharedMemory(create=True,
                        size=get_array_nbytes(rows, depths_columns, np.float32),
                        name='array_sm')
```

Shell 2:
```python
import numpy as np
from multiprocessing.shared_memory import SharedMemory

rows = 10000000
array_sm = SharedMemory('array_sm')  # attach to the block created in shell 1
array = np.ndarray((rows, 18), dtype=np.float32, buffer=array_sm.buf)
```

Now in the second shell you can follow this up with:
```python
>>> array[0]
array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0.], dtype=float32)
```

or
```python
>>> np.where(array != 0)
(array([], dtype=int64), array([], dtype=int64))
```

Is this behavior always going to be the case, or is this a fluke? Is there some sort of undocumented initialization to zero happening in the background?
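For what it's worth, I can reproduce the same thing in a single process without NumPy involved at all. A minimal sketch (the size and the unnamed block are arbitrary choices, not anything from my real code):

```python
from multiprocessing.shared_memory import SharedMemory

# Create a fresh shared-memory block and inspect its raw bytes.
shm = SharedMemory(create=True, size=1024)
try:
    data = bytes(shm.buf)  # copy the buffer contents out before closing
    print(all(b == 0 for b in data))  # every fresh block I've tried prints True
finally:
    shm.close()
    shm.unlink()  # release the block so its name can be reused
```

So the zero-fill doesn't look like something NumPy is doing; it seems to come from the shared-memory allocation itself.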