$\begingroup$

I want to estimate the parameters of the classical simple linear regression by maximum likelihood estimation. However, I'm having trouble programming it. Here's the data I generated:

    Clear[y, x, e]
    SeedRandom["MV régression simple"]
    e = RandomVariate[NormalDistribution[0, 4], 100];
    x = RandomVariate[NormalDistribution[20, 3], 100];
    y = 4 + x*2.1 + e;

And here's what I did to try and find the likelihood function:

    Clear[L]
    L[b0_, b1_, s_] = LogLikelihood[NormalDistribution[b0 + b1*x, s], y]

I don't know how to take into account the fact that each y[[i]] depends only on the corresponding x[[i]], and not on the whole list x.
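In other words, the function I'm after is the sum of the per-observation log-densities:

$$\ell(b_0, b_1, \sigma) = \sum_{i=1}^{n} \log f\bigl(y_i \mid b_0 + b_1 x_i, \sigma\bigr) = -\frac{n}{2}\log\left(2\pi\sigma^2\right) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}\bigl(y_i - b_0 - b_1 x_i\bigr)^2$$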

Thanks in advance for your help.

$\endgroup$

1 Answer

$\begingroup$

Taking your model (though with more samples, to make confirmation of the result easier):

    Clear[y, x, e]
    SeedRandom["MV régression simple"]
    e = RandomVariate[NormalDistribution[0, 4], 1000];
    x = RandomVariate[NormalDistribution[20, 3], 1000];
    y = 4 + x*2.1 + e;

Define a slightly modified likelihood function, using MapThread to pair each x value with its corresponding y value, and Total to sum the log-likelihood over all observations:

    Clear[L]
    L[b0_, b1_, s_] =
      Total[MapThread[
        LogLikelihood[NormalDistribution[b0 + b1*#1, s], {#2}] &,
        {x, y}]];

A maximum likelihood estimate can then be found, and it is close to the specified parameters:

    Maximize[{L[b0, b1, s], s > 0}, {b0, b1, s}]
    (* {-2777.3, {b0 -> 3.82871, b1 -> 2.10828, s -> 3.8898}} *)
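As a cross-check (not part of the answer above), ordinary least squares should reproduce nearly the same b0 and b1, since the MLE of the coefficients under Gaussian errors coincides with the OLS solution; a sketch using LinearModelFit:

```mathematica
(* Fit the same data by least squares; parameters should match b0 and b1 *)
lm = LinearModelFit[Transpose[{x, y}], t, t];
lm["BestFitParameters"]  (* {intercept, slope} *)

(* The MLE of s divides the residual sum of squares by n, not n - 2,
   so it is slightly smaller than the usual OLS standard error estimate *)
Sqrt[Total[lm["FitResiduals"]^2]/Length[y]]
```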
$\endgroup$
  • $\begingroup$ Many thanks! $\endgroup$ Commented Sep 17, 2016 at 22:22
  • 4
    $\begingroup$ In this case Maximize[{LogLikelihood[NormalDistribution[0, s], y - (b0 + b1 x)], s > 0}, {b0, b1, s}] is a bit simpler. No need for MapThread or pure functions. $\endgroup$ Commented Sep 18, 2016 at 5:35
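Following the vectorized form in the comment above, a numerical optimizer with explicit starting points is typically much faster than Maximize for larger samples; a sketch with FindMaximum (the starting values here are arbitrary choices):

```mathematica
(* Same vectorized log-likelihood: the residuals y - (b0 + b1 x)
   are modeled as draws from NormalDistribution[0, s] *)
FindMaximum[
  {LogLikelihood[NormalDistribution[0, s], y - (b0 + b1 x)], s > 0},
  {{b0, 0}, {b1, 1}, {s, 1}}]
```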
