
I am making a multi-classification neural net for a set of data. I have created the net, but I think I need to specify a loss port for each classification.

Here are the labels for the classifications, along with the encoder and decoders.

labels = {"Dark Colour", "Light Colour", "Mixture"};
sublabels = {"Blue", "Yellow", "Mauve"};
labeldec = NetDecoder[{"Class", labels}];
sublabdec = NetDecoder[{"Class", sublabels}];
bothdec = NetDecoder[{"Class", Flatten@{labels, sublabels}}];
enc = NetEncoder[{"Class", {"Dark Colour", "Light Colour", "Mixture", "Blue", "Yellow", "Mauve"}}];
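For example, the encoder maps a class string to its index, and a decoder maps a probability vector back to the most likely class (a quick illustration of the definitions above; results shown as comments):

enc["Mixture"]  (* 3, the position of "Mixture" in the six-class list *)
labeldec[{0.1, 0.2, 0.7}]  (* "Mixture", the highest-probability class *)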

Here is the net:

SNNnet[inputno_, outputno_, dropoutrate_, nlayers_, class_: True] :=
 Module[{nhidden, linin, linout},
  nhidden = Flatten[{Table[{(nlayers*100) - i}, {i, 0, nlayers*100, 100}]}];
  linin = Flatten[{inputno, nhidden[[;; -2]]}];
  linout = Flatten[{nhidden[[1 ;; -2]], outputno}];
  NetChain[
   Join[
    Table[
     NetChain[{
       BatchNormalizationLayer[],
       LinearLayer[linout[[i]], "Input" -> linin[[i]]],
       ElementwiseLayer["SELU"],
       DropoutLayer[dropoutrate]}],
     {i, Length[nhidden] - 1}],
    {LinearLayer[outputno], If[class, SoftmaxLayer[], Nothing]}]]]

net = NetInitialize@SNNnet[4, 6, 0.01, 8, True];
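As a quick sanity check (a sketch; the exact numbers are arbitrary because the weights are randomly initialized), the net maps a length-4 input to a probability vector over the six classes:

net[{1., 2., 3., 4.}]
(* e.g. {0.17, 0.16, 0.18, 0.15, 0.17, 0.17} — six probabilities summing to 1 *)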

Here are the nodes used for the NetGraph function:

nodes = Association[
   "net" -> net,
   "l1" -> LinearLayer[3], "sm1" -> SoftmaxLayer[],
   "l2" -> LinearLayer[3], "sm2" -> SoftmaxLayer[],
   "myloss1" -> CrossEntropyLossLayer["Index", "Target" -> enc],
   "myloss2" -> CrossEntropyLossLayer["Index", "Target" -> enc]];

Here is what I want the NetGraph to do:

connectivity = {
   NetPort["Data"] -> "net" -> "l1" -> "sm1" -> NetPort["Label"],
   "sm1" -> NetPort["myloss1", "Input"],
   NetPort[sublabels] -> NetPort["myloss1", "Target"],
   "myloss1" -> NetPort["Loss1"],
   "net" -> "l2" -> "sm2" -> NetPort["Sublabel"],
   "myloss2" -> NetPort["Loss2"],
   "sm2" -> NetPort["myloss2", "Input"],
   NetPort[labels] -> NetPort["myloss2", "Target"]};

The data diverges at "net" for each classification, passes through the subsequent linear and softmax layers, and arrives at the relevant NetPort. The problem I'm having is with the loss ports, which branch off at each softmax layer.

When I run this code

NetGraph[nodes, connectivity, "Label" -> labeldec, "Sublabel" -> sublabdec] 

I receive the error message:

NetGraph::invedgesrc: NetPort[{Blue,Yellow,Mauve}] is not a valid source for NetPort[{myloss1,Target}].

Could anyone tell me why this is occurring?

Thanks for reading.

Update 10/05/2020

Thanks to @aooiiii for his help so far.

Here is the code so far

fakedata = <|
   "Data" -> {{1, 2, 3, 4}, {4, 3, 2, 1}},
   "Label" -> {"Mixture", "mauve"},
   "Sublabel" -> {"Light Colour", "Yellow"}|>;
(* note: "mauve" and "Light Colour" are not valid entries for these
   ports; this mismatch is discussed in the comments below *)

labels = {"Dark Colour", "Light Colour", "Mixture"};
sublabels = {"Blue", "Yellow", "Mauve"};
labeldec = NetDecoder[{"Class", labels}];
sublabdec = NetDecoder[{"Class", sublabels}];
bothdec = NetDecoder[{"Class", Flatten@{labels, sublabels}}];

Here's the net:

SNNnet[inputno_, outputno_, dropoutrate_, nlayers_, class_: True] :=
 Module[{nhidden, linin, linout},
  nhidden = Flatten[{Table[{(nlayers*100) - i}, {i, 0, nlayers*100, 100}]}];
  linin = Flatten[{inputno, nhidden[[;; -2]]}];
  linout = Flatten[{nhidden[[1 ;; -2]], outputno}];
  NetChain[
   Join[
    Table[
     NetChain[{
       BatchNormalizationLayer[],
       LinearLayer[linout[[i]], "Input" -> linin[[i]]],
       ElementwiseLayer["SELU"],
       DropoutLayer[dropoutrate]}],
     {i, Length[nhidden] - 1}],
    {LinearLayer[outputno], If[class, SoftmaxLayer[], Nothing]}]]]

net = NetInitialize@SNNnet[4, 6, 0.01, 8, True];

nodes = {"net" -> net, "l1" -> LinearLayer[3], "sm1" -> SoftmaxLayer[],
   "l2" -> LinearLayer[3], "sm2" -> SoftmaxLayer[]};

connectivity = {
   NetPort["Data"] -> "net" -> {"l1", "l2"},
   "l1" -> "sm1" -> NetPort["Label"],
   "l2" -> "sm2" -> NetPort["Sublabel"]};

netgraph = NetGraph[
  {net, SoftmaxLayer[]},
  {NetPort["Data"] -> 1 -> 2 -> NetPort["Output"]},
  "Output" -> bothdec]

This is then passed through a TotalLayer and an ElementwiseLayer to take the average of the ensemble's outputs:

ensembleNet[n_] := Module[{ens},
  ens = Table[
    NetTrain[netgraph, fakedata,
     LossFunction -> "Output",
     TargetDevice -> "GPU",
     TrainingStoppingCriterion -> <|"Criterion" -> "Loss", "InitialPatience" -> 50|>,
     RandomSeeding -> RandomInteger[10^6] + n],
    {i, 1, n}];
  NetGraph[
    {Sequence @@ ens, TotalLayer[], ElementwiseLayer[#/N@n &],
     LinearLayer[3], SoftmaxLayer[], LinearLayer[3], SoftmaxLayer[]},
    {NetPort["Input"] -> Range[n] -> n + 1 -> n + 2 -> n + 3 -> n + 4 -> NetPort["label"],
     n + 2 -> n + 5 -> n + 6 -> NetPort["sublabel"]},
    "label" -> labeldec, "sublabel" -> sublabdec] // NetInitialize]

ensTrained = ensembleNet[2]
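Once built, the ensemble can be queried like any other net; since it has two output ports with decoders attached, it returns both decoded classes at once (illustrative only — the predictions depend on the trained weights):

ensTrained[{1., 2., 3., 4.}]
(* e.g. <|"label" -> "Mixture", "sublabel" -> "Yellow"|> *)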

This does seem to work; however, when observing the training progress, the loss does not decrease over time.

Do I need to include a loss port at "sm1" and "sm2" to solve this?

  • I don't understand the meaning of bothdec and netgraph. If I'm not mistaken, a color must be one of the three labels and at the same time one of the three sublabels, but not one and only one of the six "bothlabels", so bothdec can be safely stripped away, and instead of training netgraph you should be training my finalnet. On Out[117], the "Output" node clearly doesn't depend on any labels (miraculously, this is what allowed illegal labels in your dataset). When used as a loss, "Output" simply returns one, because a softmax layer's output always sums to one, hence no training. – Commented May 10, 2020 at 15:58
  • There's still no need for specifying losses (as far as I understand). After clearing out the questions and fixing the bugs, what you need is just a more elaborate NetGraph that correctly combines the ensemble networks together. – Commented May 10, 2020 at 15:59

1 Answer


In your case there's no need to specify the loss layers, as Mathematica can attach them automatically.

nodes = {"net" -> net, "l1" -> LinearLayer[3], "sm1" -> SoftmaxLayer[],
   "l2" -> LinearLayer[3], "sm2" -> SoftmaxLayer[]};

connectivity = {
   NetPort["Data"] -> "net" -> {"l1", "l2"},
   "l1" -> "sm1" -> NetPort["Label"],
   "l2" -> "sm2" -> NetPort["Sublabel"]};

finalnet = NetGraph[nodes, connectivity,
   "Label" -> labeldec, "Sublabel" -> sublabdec];

fakedata = <|
   "Data" -> {{1, 2, 3, 4}, {4, 3, 2, 1}},
   "Label" -> {"Mixture", "Light Colour"},
   "Sublabel" -> {"Mauve", "Yellow"}|>;

NetTrain[finalnet, fakedata]
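NetTrain returns the trained net, which can then be applied directly to new data; both outputs are decoded through the attached decoders (the predicted classes below are illustrative):

trained = NetTrain[finalnet, fakedata];
trained[{1., 2., 3., 4.}]
(* e.g. <|"Label" -> "Mixture", "Sublabel" -> "Mauve"|> *)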

As for your question, the error message was generated because of a syntax error: in your code, NetPort[sublabels] evaluates to NetPort[{"Blue","Yellow","Mauve"}], which is not a valid port specification. If you do indeed require explicit loss specification, consult the docs for the LossFunction option. That approach is a little clunkier, but I can help you with it if necessary.
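For instance, per-port loss weighting can be expressed through that option without wiring any loss layers into the graph (a minimal sketch; the Scaled weights here are placeholders to adjust):

NetTrain[finalnet, fakedata,
 LossFunction -> {"Label" -> Scaled[1.], "Sublabel" -> Scaled[1.]}]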

  • Thanks for your reply. I may have to explicitly state the losses; this net will branch off to another net. I will update the question appropriately. – Commented May 10, 2020 at 13:25
