
  • Thank you for the very detailed answer! There are still a couple of things I am not sure I understand correctly. You say there are 48 weights + 1 bias for each kernel, which means each color channel effectively has its own filter, so shouldn't the sum over i,j (4,4) actually be over i,j,k (4,4,3)? Because if I end up with 6x6x1, the channel dimension is gone by then. Commented Feb 13, 2019 at 17:41
  • The second question is: with a final output of 6x6x5 it is clear that each kernel generates its own set of outputs for the neurons of the fully connected layer, but when I inspect CNNs generated by Keras using the exact same parameters, I see that the first fully connected layer has only 6x6 neurons; this is where the initial confusion started for me. Thanks again! Commented Feb 13, 2019 at 17:42
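
A minimal sketch that may help check the shapes being discussed, assuming TensorFlow/Keras and an illustrative 9x9x3 RGB input (the input size is an assumption chosen so a 4x4 kernel with no padding yields the 6x6 output mentioned above):

```python
import tensorflow as tf

# Illustrative 9x9 RGB input; with a 4x4 kernel and 'valid' padding,
# the spatial output is (9 - 4 + 1) = 6 in each dimension.
inputs = tf.keras.Input(shape=(9, 9, 3))

# 5 kernels, each spanning all 3 channels: 4*4*3 = 48 weights + 1 bias per kernel,
# so the layer has 5 * 49 = 245 parameters and produces a 6x6x5 output.
x = tf.keras.layers.Conv2D(filters=5, kernel_size=4)(inputs)

# Flattening gives 6*6*5 = 180 values feeding the fully connected layer,
# not 6x6: each kernel's 6x6 feature map contributes its own slice.
x = tf.keras.layers.Flatten()(x)
outputs = tf.keras.layers.Dense(10)(x)

model = tf.keras.Model(inputs, outputs)
model.summary()  # shows Conv2D output shape (None, 6, 6, 5) and 245 parameters
```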