I am trying to deal with a super-massive NetworkX Graph object with hundreds of millions of nodes. I'd like to be able to write it to file so as not to consume all my computer's memory. However, I still need to constantly search across existing nodes, update edges, etc.
Is there a good solution for this? I'm not sure how it would work with any of the file formats provided on http://networkx.lanl.gov/reference/readwrite.html
The only solution I can think of is to store each node as a separate file with references to other nodes in the filesystem - that way, opening one node for examination doesn't overload the memory. Is there an existing on-disk data store for large amounts of data (e.g. PyTables) that would let me do this without writing my own boilerplate code? A rough sketch of what I mean is below.
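To make the idea concrete, here is a minimal sketch of the kind of thing I'm imagining, using Python's built-in shelve module as a stand-in for per-node files on disk. DiskGraph and its methods are just names I made up for illustration, not anything NetworkX provides:

```python
import shelve

# Sketch of the idea: keep each node's adjacency dict on disk and only pull
# the nodes I'm currently looking at into memory. Names are placeholders.
class DiskGraph:
    def __init__(self, path):
        # shelve gives a dict-like mapping of node key -> adjacency dict, stored on disk
        self._db = shelve.open(path)

    def add_node(self, node):
        key = str(node)
        if key not in self._db:
            self._db[key] = {}

    def add_edge(self, u, v, **attrs):
        # update both endpoints' adjacency dicts and write them back to disk
        for a, b in ((u, v), (v, u)):
            self.add_node(a)
            adj = self._db[str(a)]
            adj[b] = attrs
            self._db[str(a)] = adj  # reassign so shelve persists the change

    def neighbors(self, node):
        # only this one node's adjacency dict is loaded into memory
        return list(self._db[str(node)].keys())

    def close(self):
        self._db.close()


g = DiskGraph("graph.db")
g.add_edge(1, 2, weight=0.5)
g.add_edge(1, 3)
print(g.neighbors(1))  # -> [2, 3]
g.close()
```

But this obviously lacks all the graph algorithms and querying that NetworkX gives me, which is why I'm hoping something like this already exists.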