
I have the following setup:

  • 2 input neurons (I1, I2)
  • 2 output neurons (O1, O2)
  • 1 hidden layer with 3 neurons (H1, H2, H3)
  • loss function = MSE
  • optimizer = Adam
  • the values of I1 range from 0 to 100
  • the values of I2 range from 0 to 500
  • batch size = 16
  • learning rate = 0.1

The ANN should learn the following rules (regression problem):

  1. If I1 increases, O1 decreases
  2. If I1 increases, O2 increases
  3. If I2 increases, O1 stays constant
  4. If I2 increases, O2 decreases

I am using the following model:

    import torch.nn as nn
    import torch.nn.functional as F

    class DQN(nn.Module):
        def __init__(self):
            super().__init__()
            self.fc1 = nn.Linear(in_features=2, out_features=3)
            self.out = nn.Linear(in_features=3, out_features=2)

        def forward(self, t):
            t = F.relu(self.fc1(t))
            t = self.out(t)
            return t
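
The rest of the training setup wires the settings above together roughly like this (a simplified sketch; the toy data and `num_epochs` are only stand-ins for the real data and training length):

    import torch
    from torch.utils.data import DataLoader, TensorDataset

    # Toy data only to make the sketch self-contained:
    # I1 in [0, 100], I2 in [0, 500], placeholder targets for O1, O2.
    i1 = torch.rand(256, 1) * 100
    i2 = torch.rand(256, 1) * 500
    x = torch.cat([i1, i2], dim=1)
    y = torch.rand(256, 2)
    loader = DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)

    model = DQN()
    optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
    criterion = nn.MSELoss()

    num_epochs = 100
    for epoch in range(num_epochs):
        for inputs, targets in loader:       # mini-batches of size 16
            optimizer.zero_grad()
            preds = model(inputs)            # shape (16, 2)
            loss = criterion(preds, targets)
            loss.backward()
            optimizer.step()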

However, it does not learn. My question is: which part should I focus on? Is a linear, fully connected network perhaps not suitable for these rules (the mapping is not a simple linear regression)? Do I need more hidden layers or more neurons? Is the learning rate a problem? Should the input data be normalized?

I tinkered around a lot, but didn't get any improvement.

  • Although the suggested duplicate is classification using Torch, it is the exact same problem, and a common beginner's problem. Commented Feb 13, 2022 at 12:45
  • As you suggested, I normalized I1 with value / 100 and I2 with value / 500, so all values are between 0 and 1. Unfortunately, the improvement is very limited. Commented Feb 14, 2022 at 19:25
  • Sorry, that looked like a clear problem to me. It seems the code, design, or data have other problems too. Commented Feb 14, 2022 at 20:22

1 Answer


The learning rate is too high. Try something between 0.001 and 0.0001. It is also possible that a hidden layer with only 3 neurons cannot model the relationship well. Try increasing the number of hidden neurons, e.g. 4, 8, 12, 16, and then choose the size that works best.
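
For example, something along these lines (the hidden size of 16, the learning rate of 1e-3, and the dummy inputs below are just illustrative starting points; the scaling of the inputs to [0, 1] follows what was suggested in the comments):

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class Net(nn.Module):
        def __init__(self, hidden=16):            # try 4, 8, 12, 16, ...
            super().__init__()
            self.fc1 = nn.Linear(2, hidden)
            self.out = nn.Linear(hidden, 2)

        def forward(self, t):
            return self.out(F.relu(self.fc1(t)))

    model = Net(hidden=16)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)   # instead of 0.1

    # Scale the inputs to roughly [0, 1] before feeding them in,
    # e.g. I1 / 100 and I2 / 500:
    i1 = torch.rand(16) * 100                     # example I1 values
    i2 = torch.rand(16) * 500                     # example I2 values
    x = torch.stack([i1 / 100, i2 / 500], dim=1)  # shape (16, 2)
    out = model(x)                                # shape (16, 2)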

