I am making a multi-classification neural net for a set of data. I have created the net, but I think I need to specify a loss port for each classification.
Here are the labels for the classification and the encoder & decoders.
labels = {"Dark Colour", "Light Colour", "Mixture"} sublabels = {"Blue", "Yellow", "Mauve"} labeldec = NetDecoder[{"Class", labels}]; sublabdec = NetDecoder[{"Class", sublabels}]; bothdec = NetDecoder[{"Class", Flatten@{labels, sublabels}}] enc = NetEncoder[{"Class", {"Dark Colour", "Light Colour", "Mixture", "Blue", "Yellow", "Mauve"}}] Here is the Net
Here is the net:

    SNNnet[inputno_, outputno_, dropoutrate_, nlayers_, class_: True] :=
     Module[{nhidden, linin, linout, bias},
      nhidden = Flatten[{Table[{(nlayers*100) - i}, {i, 0, (nlayers*100), 100}]}];
      linin = Flatten[{inputno, nhidden[[;; -2]]}];
      linout = Flatten[{nhidden[[1 ;; -2]], outputno}];
      NetChain[
       Join[
        Table[
         NetChain[
          {BatchNormalizationLayer[],
           LinearLayer[linout[[i]], "Input" -> linin[[i]]],
           ElementwiseLayer["SELU"],
           DropoutLayer[dropoutrate]}],
         {i, Length[nhidden] - 1}],
        {LinearLayer[outputno],
         If[class, SoftmaxLayer[], Nothing]}]]]

    net = NetInitialize@SNNnet[4, 6, 0.01, 8, True];
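To make the size bookkeeping concrete, for a small case such as nlayers = 2 the helper lists evaluate as follows (the trailing 0 in nhidden is dropped by the [[;; -2]] parts):

    (* nhidden = {200, 100, 0}; linin = {4, 200, 100}; linout = {200, 100, 6} *)
    SNNnet[4, 6, 0.01, 2, True]
    (* a chain of two hidden blocks (4 -> 200 -> 100),
       followed by LinearLayer[6] and SoftmaxLayer[] *)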
Here are the nodes that are used for the NetGraph function:

    nodes = Association[
       "net" -> net,
       "l1" -> LinearLayer[3], "sm1" -> SoftmaxLayer[],
       "l2" -> LinearLayer[3], "sm2" -> SoftmaxLayer[],
       "myloss1" -> CrossEntropyLossLayer["Index", "Target" -> enc],
       "myloss2" -> CrossEntropyLossLayer["Index", "Target" -> enc]];

Here is what I want the NetGraph to do:
    connectivity = {
       NetPort["Data"] -> "net" -> "l1" -> "sm1" -> NetPort["Label"],
       "sm1" -> NetPort["myloss1", "Input"],
       NetPort[sublabels] -> NetPort["myloss1", "Target"],
       "myloss1" -> NetPort["Loss1"],
       "net" -> "l2" -> "sm2" -> NetPort["Sublabel"],
       "myloss2" -> NetPort["Loss2"],
       "sm2" -> NetPort["myloss2", "Input"],
       NetPort[labels] -> NetPort["myloss2", "Target"]};

The data will diverge at "net" for each classification, pass through the subsequent linear and softmax layers, and arrive at the relevant NetPort. The problem I'm having is with the loss ports, which diverge at each softmax layer.
When I run this code:
    NetGraph[nodes, connectivity, "Label" -> labeldec, "Sublabel" -> sublabdec]

I receive the error message:

    NetGraph::invedgesrc: NetPort[{Blue,Yellow,Mauve}] is not a valid source for NetPort[{myloss1,Target}].
Could anyone tell me why this is occurring?
Thanks for reading.
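A note on the error message: a NetPort used as an edge source must name a port of the graph, but sublabels is a list of class names, so NetPort[sublabels] evaluates to NetPort[{"Blue", "Yellow", "Mauve"}], which refers to no port. If the targets are meant to arrive with the training data, they have to enter through named input ports; for example (the port name "SublabelTarget" here is only illustrative):

    NetPort["SublabelTarget"] -> NetPort["myloss1", "Target"]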
Update 10/05/2020
Thanks to @aooiiii for his help so far.
Here is the code so far:
    fakedata = <|"Data" -> {{1, 2, 3, 4}, {4, 3, 2, 1}},
       "Label" -> {"Mixture", "mauve"},
       "Sublabel" -> {"Light Colour", "Yellow"}|>;

    labels = {"Dark Colour", "Light Colour", "Mixture"}
    sublabels = {"Blue", "Yellow", "Mauve"}
    labeldec = NetDecoder[{"Class", labels}];
    sublabdec = NetDecoder[{"Class", sublabels}];
    bothdec = NetDecoder[{"Class", Flatten@{labels, sublabels}}]

Here's the net:
    SNNnet[inputno_, outputno_, dropoutrate_, nlayers_, class_: True] :=
     Module[{nhidden, linin, linout, bias},
      nhidden = Flatten[{Table[{(nlayers*100) - i}, {i, 0, (nlayers*100), 100}]}];
      linin = Flatten[{inputno, nhidden[[;; -2]]}];
      linout = Flatten[{nhidden[[1 ;; -2]], outputno}];
      NetChain[
       Join[
        Table[
         NetChain[
          {BatchNormalizationLayer[],
           LinearLayer[linout[[i]], "Input" -> linin[[i]]],
           ElementwiseLayer["SELU"],
           DropoutLayer[dropoutrate]}],
         {i, Length[nhidden] - 1}],
        {LinearLayer[outputno],
         If[class, SoftmaxLayer[], Nothing]}]]]

    net = NetInitialize@SNNnet[4, 6, 0.01, 8, True];

    nodes = {"net" -> net,
       "l1" -> LinearLayer[3], "sm1" -> SoftmaxLayer[],
       "l2" -> LinearLayer[3], "sm2" -> SoftmaxLayer[]};

    connectivity = {NetPort["Data"] -> "net" -> {"l1", "l2"},
       "l1" -> "sm1" -> NetPort["Label"],
       "l2" -> "sm2" -> NetPort["Sublabel"]};

    netgraph = NetGraph[
       {net, SoftmaxLayer[]},
       {NetPort["Data"] -> 1 -> 2 -> NetPort["Output"]},
       "Output" -> bothdec]

This is then passed through a total and an elementwise layer to take the average of the nets:
    ensembleNet[n_] := Module[{ens},
      ens = Table[
        NetTrain[netgraph, fakedata,
         LossFunction -> "Output",
         TargetDevice -> "GPU",
         TrainingStoppingCriterion -> <|"Criterion" -> "Loss", "InitialPatience" -> 50|>,
         RandomSeeding -> RandomInteger[10^6] + n],
        {i, 1, n}];
      NetGraph[
       {Sequence @@ ens, TotalLayer[], ElementwiseLayer[#/N@n &],
        LinearLayer[3], SoftmaxLayer[], LinearLayer[3], SoftmaxLayer[]},
       {NetPort["Input"] -> Range[n] -> n + 1 -> n + 2 -> n + 3 -> n + 4 -> NetPort["label"],
        n + 2 -> n + 5 -> n + 6 -> NetPort["sublabel"]},
       "label" -> labeldec, "sublabel" -> sublabdec] // NetInitialize]

    ensTrained = ensembleNet[2]

This does seem to work; however, when observing the trained net, the loss does not decrease over time.
Do I need to include a loss port at "sm1" and "sm2" to solve this?
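If explicit loss ports are wanted at "sm1" and "sm2", a minimal sketch could look like the following, assuming each CrossEntropyLossLayer gets its own three-class target encoder (the shared six-class enc would produce out-of-range indices for a three-way softmax) and that the target ports share names with the training-data keys; the node and port names are illustrative:

    labenc = NetEncoder[{"Class", labels}];
    sublabenc = NetEncoder[{"Class", sublabels}];
    lossnet = NetGraph[
       <|"net" -> net,
         "l1" -> LinearLayer[3], "sm1" -> SoftmaxLayer[],
         "l2" -> LinearLayer[3], "sm2" -> SoftmaxLayer[],
         "loss1" -> CrossEntropyLossLayer["Index", "Target" -> labenc],
         "loss2" -> CrossEntropyLossLayer["Index", "Target" -> sublabenc]|>,
       {NetPort["Data"] -> "net" -> {"l1", "l2"},
        "l1" -> "sm1", "sm1" -> NetPort["loss1", "Input"],
        "l2" -> "sm2", "sm2" -> NetPort["loss2", "Input"],
        NetPort["Label"] -> NetPort["loss1", "Target"],
        NetPort["Sublabel"] -> NetPort["loss2", "Target"],
        "loss1" -> NetPort["Loss1"],
        "loss2" -> NetPort["Loss2"]}];

    (* train on both losses; the "Label"/"Sublabel" values in the data
       must be legal members of labels/sublabels respectively *)
    NetTrain[lossnet, fakedata, LossFunction -> {"Loss1", "Loss2"}]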

bothdec and netgraph. If I'm not mistaken, a color must be one of the three labels and at the same time one of the three sublabels, but not one and only one of the six "both" labels, so bothdec can be safely stripped away, and instead of training netgraph you should be training myfinalnet. On Out[117], the "Output" node clearly doesn't depend on any labels (miraculously, this is what allowed illegal labels in your dataset). When used as a loss, "Output" simply returns one, because a softmax layer's output always sums to one, hence no training.
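For completeness, a minimal sketch of what a net along the lines of the comment's myfinalnet could look like, assuming the two decoded output ports are left for NetTrain to attach a cross-entropy loss to automatically (the name myfinalnet comes from the comment; the layers mirror the code in the update):

    myfinalnet = NetGraph[
       <|"net" -> net,
         "l1" -> LinearLayer[3], "sm1" -> SoftmaxLayer[],
         "l2" -> LinearLayer[3], "sm2" -> SoftmaxLayer[]|>,
       {NetPort["Data"] -> "net" -> {"l1", "l2"},
        "l1" -> "sm1" -> NetPort["Label"],
        "l2" -> "sm2" -> NetPort["Sublabel"]},
       "Label" -> labeldec, "Sublabel" -> sublabdec];

    (* NetTrain matches each output port against the same-named key in the
       training data, so the "Label" values must come from labels and the
       "Sublabel" values from sublabels (fakedata above would need its
       illegal entries fixed first) *)
    NetTrain[myfinalnet, fakedata]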