I have a bunch of custom classes for which I've implemented a way of saving instances to HDF5 files using the `h5py` module.

A bit of background: I've accomplished this by first implementing a serialization interface that represents the data in each class as a dictionary containing specific types of data (at the moment, the representations can only contain `numpy.ndarray`, `numpy.int64`, `numpy.float64`, `str`, and other dictionary instances). The advantage of this limitation is that it restricts the dictionaries to data types that `h5py` handles by default. I was surprised to find a dearth of code tutorials on recursively saving dictionaries to HDF5 files, so I would really appreciate feedback on my implementation.
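To make the type restriction concrete, here is a minimal sketch of what such a validity check could look like. The helper name `is_valid_representation` is hypothetical (it is not part of the classes described here); it just recursively checks that a representation dict contains only the supported leaf types or nested dicts of the same shape:

```python
import numpy as np

# Hypothetical helper, not part of the actual ReportInterface class:
# the leaf types the serialization interface allows in a representation dict.
ALLOWED_LEAF_TYPES = (np.ndarray, np.int64, np.float64, str)

def is_valid_representation(dic):
    """Return True if dic is a dict whose values are all allowed leaf
    types, or nested dicts that are themselves valid representations."""
    if not isinstance(dic, dict):
        return False
    for value in dic.values():
        if isinstance(value, dict):
            if not is_valid_representation(value):
                return False
        elif not isinstance(value, ALLOWED_LEAF_TYPES):
            return False
    return True
```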

## EDIT

*I've replaced assertions with* `isinstance` *checks that* `raise ValueError()` *when they fail. I didn't get rid of the* `classmethod` *decorators because they were necessary for running the code within my* `ReportInterface` *class, which is hopefully clearer now that I have included the class declaration.*

*The code has some redundant comments because it will probably be viewed and forked by other physicists with even less Python experience than I have; if the comments interfere with reading, I can edit them out.*

**Imports:**

    import numpy as np
    import h5py
    import os

**Class definition, with save/load functions**

    class ReportInterface(object):

        # ...more class details...

        @classmethod
        def __save_dict_to_hdf5__(cls, dic, filename):
            """
            Save a dictionary whose contents are only strings, np.float64,
            np.int64, np.ndarray, and other dictionaries following this
            structure to an HDF5 file. These are the sorts of dictionaries
            that are meant to be produced by the ReportInterface.__to_dict__()
            method. The saved dictionary can then be loaded using
            __load_dict_from_hdf5__(), and the contents of the loaded
            dictionary will be the same as those of the original.
            """
            if os.path.exists(filename):
                raise ValueError('File %s exists, will not overwrite.' % filename)
            with h5py.File(filename, 'w') as h5file:
                cls.__recursively_save_dict_contents_to_group__(h5file, '/', dic)

        @classmethod
        def __recursively_save_dict_contents_to_group__(cls, h5file, path, dic):
            """
            Take an already open HDF5 file and insert the contents of a
            dictionary at the current path location. Calls itself recursively
            to fill out the HDF5 file with the contents of nested dictionaries.
            """
            # argument type checking
            if not isinstance(dic, dict):
                raise ValueError("must provide a dictionary")
            if not isinstance(path, str):
                raise ValueError("path must be a string")
            # use the public h5py.File class, not the private h5py._hl path
            if not isinstance(h5file, h5py.File):
                raise ValueError("must be an open h5py file")
            # save items to the hdf5 file
            for key, item in dic.items():
                if not isinstance(key, str):
                    raise ValueError("dict keys must be strings to save to hdf5")
                # save strings, numpy.int64, and numpy.float64 types
                if isinstance(item, (np.int64, np.float64, str)):
                    h5file[path + key] = item
                    # Dataset.value was removed in h5py 3; read back with [()].
                    # h5py 3 returns strings as bytes, so decode before comparing.
                    saved = h5file[path + key][()]
                    if isinstance(saved, bytes):
                        saved = saved.decode('utf-8')
                    if saved != item:
                        raise ValueError('The data representation in the HDF5 '
                                         'file does not match the original dict.')
                # save numpy arrays
                elif isinstance(item, np.ndarray):
                    h5file[path + key] = item
                    if not np.array_equal(h5file[path + key][()], item):
                        raise ValueError('The data representation in the HDF5 '
                                         'file does not match the original dict.')
                # save dictionaries
                elif isinstance(item, dict):
                    cls.__recursively_save_dict_contents_to_group__(
                        h5file, path + key + '/', item)
                # other types cannot be saved and will result in an error
                else:
                    raise ValueError('Cannot save %s type.' % type(item))

        @classmethod
        def __load_dict_from_hdf5__(cls, filename):
            """
            Load a dictionary whose contents are only strings, floats, ints,
            numpy arrays, and other dictionaries following this structure
            from an HDF5 file. These dictionaries can then be used to
            reconstruct ReportInterface subclass instances using the
            ReportInterface.__from_dict__() method.
            """
            with h5py.File(filename, 'r') as h5file:
                return cls.__recursively_load_dict_contents_from_group__(h5file, '/')

        @classmethod
        def __recursively_load_dict_contents_from_group__(cls, h5file, path):
            """
            Load the contents of an HDF5 group. If further groups are
            encountered, treat them like dicts and load them recursively.
            """
            ans = {}
            for key, item in h5file[path].items():
                # h5py.Dataset and h5py.Group are the public aliases for the
                # private h5py._hl classes
                if isinstance(item, h5py.Dataset):
                    # Dataset.value was removed in h5py 3; use [()] instead,
                    # and decode bytes back to str so the round trip is lossless
                    value = item[()]
                    if isinstance(value, bytes):
                        value = value.decode('utf-8')
                    ans[key] = value
                elif isinstance(item, h5py.Group):
                    ans[key] = cls.__recursively_load_dict_contents_from_group__(
                        h5file, path + key + '/')
            return ans


**Unit Test**

    print('Testing HDF5 file saving capabilities.')
    ex = {
        'name': 'stefan',
        'age': np.int64(24),
        'fav_numbers': np.array([2, 4, 4.3]),
        'fav_tensors': {
            'levi_civita3d': np.array([
                [[0, 0, 0], [0, 0, 1], [0, -1, 0]],
                [[0, 0, -1], [0, 0, 0], [1, 0, 0]],
                [[0, 1, 0], [-1, 0, 0], [0, 0, 0]]
            ]),
            'kronecker2d': np.identity(3)
        }
    }
    ReportInterface.__save_dict_to_hdf5__(ex, 'foo.hdf5')
    loaded = ReportInterface.__load_dict_from_hdf5__('foo.hdf5')
    np.testing.assert_equal(loaded, ex)

This passes my unit tests for saving and loading dictionaries with the data intact, but I really don't know how Pythonic it is and would appreciate feedback. I tried to add an HDF5 tag, but it doesn't exist; anyone who is more familiar with the format, and perhaps with `h5py`, could maybe tell me whether there is a more elegant or idiomatic way to do this (I don't want to confuse the next student who will maintain this), or whether I am setting myself up for any nasty surprises.