
This question is very similar to How do you use freeze_graph.py in TensorFlow?, but that one has not been answered, and my approach to the problem is different, so I would like some input.

I am also trying to load a .pb binary file and then freeze it. Here is the code I tried.

Let me know if this gives you any ideas. It does not raise any errors; it just crashes my Jupyter notebook.

```python
import sys

import tensorflow as tf
from tensorflow.python.platform import gfile
from tensorflow.core.protobuf import saved_model_pb2
from tensorflow.python.util import compat

with tf.Session() as sess:
    model_filename = 'saved_model.pb'  # binary .pb file
    with gfile.FastGFile(model_filename, 'rb') as f:
        data = compat.as_bytes(f.read())  # reads the binary protobuf
        sm = saved_model_pb2.SavedModel()
        sm.ParseFromString(data)  # parses the file
        print(sm)
        if 1 != len(sm.meta_graphs):
            print('More than one graph found. Not sure which to write')
            sys.exit(1)
        g_in = tf.import_graph_def(sm.meta_graphs[0].graph_def)

    output_graph = "frozen_graph.pb"
    # Getting all output nodes for the frozen graph
    output_nodes = [n.name for n in tf.get_default_graph().as_graph_def().node]
    # This is not working fully
    output_graph_def = tf.graph_util.convert_variables_to_constants(
        sess,  # the session is used to retrieve the weights
        tf.get_default_graph().as_graph_def(),  # the graph_def is used to retrieve the nodes
        output_nodes  # the output node names are used to select the useful nodes
    )
    # Finally we serialize and dump the output graph to the filesystem
    with tf.gfile.GFile(output_graph, "wb") as f:
        f.write(output_graph_def.SerializeToString())
    print("%d ops in the final graph." % len(output_graph_def.node))
    print(g_in)

    LOGDIR = '.'
    train_writer = tf.summary.FileWriter(LOGDIR)
    train_writer.add_graph(sess.graph)
```

This code should generate a frozen file, but I don't completely understand TensorFlow's saving mechanisms. If I take out the graph-freezing part of this code, I get an events.out.* file that can be read by TensorBoard.

2 Answers


So after a lot of stumbling, I realized that I was just loading the meta graph, not the whole graph with its variables. Here is code that does so:

```python
import tensorflow as tf

def frozen_graph_maker(export_dir, output_graph):
    with tf.Session(graph=tf.Graph()) as sess:
        tf.saved_model.loader.load(
            sess, [tf.saved_model.tag_constants.SERVING], export_dir)
        output_nodes = [n.name for n in tf.get_default_graph().as_graph_def().node]
        output_graph_def = tf.graph_util.convert_variables_to_constants(
            sess,            # the session is used to retrieve the weights
            sess.graph_def,  # the graph_def is used to retrieve the nodes
            output_nodes     # the output node names are used to select the useful nodes
        )
        # Finally we serialize and dump the output graph to the filesystem
        with tf.gfile.GFile(output_graph, "wb") as f:
            f.write(output_graph_def.SerializeToString())

def main():
    export_dir = '/dir/of/pb/and/variables'
    output_graph = "frozen_graph.pb"
    frozen_graph_maker(export_dir, output_graph)
```

I realized that I was just loading the meta graph, and I would love it if someone could confirm my understanding of what was failing: with compat.as_bytes I was only loading the meta graph. Is there a way of integrating the variables after that kind of loading, or should I stick with tf.saved_model.loader.load()? My loading attempt was completely wrong, as it never even touched the variables folder.

Another question: with [n.name for n in tf.get_default_graph().as_graph_def().node] I am putting all nodes into output_nodes. Should I just put the last node? It works with just the last node; what is the difference?


2 Comments

There are no words I can use to express how much I appreciate this answer!
tf.saved_model.loader.load() is the standard way of loading a saved_model, so you should stick with it. For your second question: output_nodes should only contain the nodes you want to export, which in your case is the last node, so you can freely list only the last node.
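The difference can be illustrated on a toy graph (the one-variable model, node names, and values below are made up for this example, not taken from the question): freezing with only the real output node keeps just the subgraph that node depends on and prunes everything else (initializer and assign ops included), whereas listing every node also tries to keep ops that stop making sense once variables become constants. A minimal sketch, written against tf.compat.v1 so the TF 1.x-style session code also runs on TF 2.x:

```python
import tensorflow as tf

# TF 1.x-style graph/session code, via the compat.v1 API.
tf1 = tf.compat.v1
tf1.disable_eager_execution()
tf1.disable_resource_variables()  # use classic ref variables for freezing

graph = tf.Graph()
with graph.as_default():
    # Hypothetical tiny model: y = w * x, with a single variable "w".
    x = tf1.placeholder(tf.float32, shape=[1], name="x")
    w = tf1.get_variable("w", initializer=tf.constant([3.0]))
    y = tf.multiply(x, w, name="out")

    with tf1.Session(graph=graph) as sess:
        sess.run(tf1.global_variables_initializer())
        # Freeze with ONLY the final node: everything "out" depends on
        # is kept, and unrelated nodes (init/assign ops) are pruned away.
        frozen = tf1.graph_util.convert_variables_to_constants(
            sess, graph.as_graph_def(), ["out"])

# The variable has been baked into the graph as a Const node.
print(sorted(n.name for n in frozen.node))
```

Passing the full node list instead of ["out"] would keep the initializer and assign nodes too, which is usually not what you want in an inference graph.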

An easier solution would have been as follows:

```python
import tensorflow as tf

pb_saved_model = "/Users/vedanshu/saved_model/"

_graph = tf.Graph()
with _graph.as_default():
    _sess = tf.Session(graph=_graph)
    model = tf.saved_model.loader.load(_sess, ["serve"], pb_saved_model)

with tf.gfile.GFile("/Users/vedanshu/frozen_graph/frozen.pb", "wb") as f:
    f.write(model.SerializeToString())
```

If your saved_model has variables in it, they can be converted to constants as follows:

```python
import tensorflow as tf

pb_saved_model = "/Users/vedanshu/saved_model/"
OUTPUT_NAMES = ["fc2/Relu"]

_graph = tf.Graph()
with _graph.as_default():
    _sess = tf.Session(graph=_graph)
    model = tf.saved_model.loader.load(_sess, ["serve"], pb_saved_model)
    graphdef = tf.get_default_graph().as_graph_def()
    frozen_graph = tf.graph_util.convert_variables_to_constants(
        _sess, graphdef, OUTPUT_NAMES)
    frozen_graph = tf.graph_util.remove_training_nodes(frozen_graph)

with tf.gfile.GFile("/Users/vedanshu/frozen_graph/frozen.pb", "wb") as f:
    # frozen_graph is a GraphDef proto; it must be serialized before writing
    f.write(frozen_graph.SerializeToString())
```
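As a sanity check, a frozen GraphDef can be parsed back from bytes and run without any variables directory, since the weights are baked in. The snippet below is a self-contained sketch: a made-up one-variable model (y = w * x) stands in for the real saved_model, and the serialized bytes stand in for the contents of frozen.pb:

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()
tf1.disable_resource_variables()

# Build and freeze a tiny stand-in graph; in practice the bytes
# would come from reading the frozen.pb file written above.
g = tf.Graph()
with g.as_default():
    x = tf1.placeholder(tf.float32, shape=[1], name="x")
    w = tf1.get_variable("w", initializer=tf.constant([3.0]))
    tf.multiply(x, w, name="out")
    with tf1.Session(graph=g) as sess:
        sess.run(tf1.global_variables_initializer())
        frozen = tf1.graph_util.convert_variables_to_constants(
            sess, g.as_graph_def(), ["out"])
frozen_bytes = frozen.SerializeToString()

# Reload: parse the bytes into a GraphDef and import it into a fresh graph.
graph_def = tf1.GraphDef()
graph_def.ParseFromString(frozen_bytes)
g2 = tf.Graph()
with g2.as_default():
    tf.import_graph_def(graph_def, name="")  # name="" keeps original node names
    with tf1.Session(graph=g2) as sess:
        # No variable initialization needed: "w" is now a constant.
        y = sess.run("out:0", feed_dict={"x:0": [2.0]})
        print(y)
```

If this runs without an initialization error, the graph really is frozen: all weights travel inside the single .pb file.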

