
I'm working through tensorflow's wide_n_deep_tutorial these days, and I'm a little confused by tf.contrib.layers.embedding_column. How does tensorflow actually implement the embedding column?

For example, suppose I have a sparse input with dimension 1000 and I want to embed it into a dense feature with dimension 10. Does it hold a fully connected layer with 1000*10 params and train them with backpropagation? Or does it use some other technique, like FM, to map the 1000-dim vector to a 10-dim vector?

1 Answer


There are 3 combiner options in the embedding_column function:

- "sum": no normalization
- "mean": L1 normalization
- "sqrtn": L2 normalization

See tf.nn.embedding_lookup_sparse for more detail.

It does not use FM to transform the dimensions. The embedding is an ordinary trainable lookup table, updated by backpropagation like any other weight.
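Conceptually, the lookup-and-combine step can be sketched in NumPy. This is a minimal illustration of the mechanics, not TensorFlow's actual implementation: the 1000x10 matrix is a plain trainable weight matrix, the "sparse input" is just the list of active feature ids, and the three combiners reduce the gathered rows to one 10-dim vector.

```python
import numpy as np

# Hypothetical sizes matching the question: vocabulary 1000, embedding dim 10.
rng = np.random.default_rng(0)
vocab_size, embed_dim = 1000, 10
embedding_matrix = rng.normal(size=(vocab_size, embed_dim))  # trainable weights

# A sparse input is effectively a multi-hot vector; only the active ids matter.
active_ids = [7, 42, 512]

# Lookup: gather the embedding rows for the active ids.
rows = embedding_matrix[active_ids]          # shape (3, 10)

# Combine the rows, mirroring the three combiner options (with unit weights):
combined_sum = rows.sum(axis=0)                                # "sum": no normalization
combined_mean = rows.sum(axis=0) / len(active_ids)             # "mean": divide by count
combined_sqrtn = rows.sum(axis=0) / np.sqrt(len(active_ids))   # "sqrtn": divide by sqrt(count)

print(combined_sum.shape)  # (10,)
```

During training, gradients flow back only into the gathered rows of the matrix, so it behaves like a fully connected layer applied to a multi-hot input, just computed without the dense matrix multiply.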
