$\begingroup$

If this is off-topic and likely to be closed, I would appreciate knowing where the right place to ask this question is, as I am somewhat lost right now.


I hold a BSc in software engineering and recently started a deep learning position after completing multiple MOOCs on the topic and doing some side projects.

Some of those courses are:

  • all of Andrew Ng's courses on Coursera
  • David Silver's reinforcement learning course
  • Stanford's CS231n

I keep finding gaps in my knowledge, mostly in the fields of information theory, statistics, and probability. For example:

  • KL divergence
  • information units
  • cross entropy
  • why probabilities are always handled as log probabilities

and more.
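To make the last point concrete, here is a minimal Python sketch (the specific numbers are only for illustration) showing why log probabilities are used in practice, and how cross entropy and KL divergence relate:

```python
import math

# Multiplying many small probabilities underflows to 0.0 in floating point,
# while summing their logs stays well-behaved.
probs = [1e-5] * 100                 # 100 independent events, each with probability 1e-5
product = 1.0
for p in probs:
    product *= p                     # underflows: 1e-500 is far below float's minimum
log_sum = sum(math.log(p) for p in probs)

print(product)                       # 0.0 -> the information is lost
print(log_sum)                       # approx. -1151.29 -> still perfectly usable

# Cross entropy and KL divergence for two small discrete distributions:
# KL(p || q) = H(p, q) - H(p), i.e. the "extra" bits/nats paid for using q instead of p.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
cross_entropy = -sum(pi * math.log(qi) for pi, qi in zip(p, q))
entropy = -sum(pi * math.log(pi) for pi in p)
kl = cross_entropy - entropy         # always >= 0, and 0 only when p == q
```

(Using `math.log` gives the result in nats; `math.log2` would give bits, which connects back to the "information units" point above.)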

I find myself going down Wikipedia rabbit holes, trying to understand probabilistic and statistical theories, and never finding an end (or a start).


I wish to shrink this knowledge gap by taking some of the basic courses I seem to be missing.
Learning from Wikipedia doesn't work for me, as I seem to lack some more fundamental understanding, only I don't know exactly what.

What are some relevant courses or topics I can learn by following a syllabus?

I am hoping one or two theoretical courses would close this gap for me; I just don't know what to look for.

$\endgroup$
  •
    $\begingroup$ I'm not sure what you mean by 'by a syllabus', but if you're willing to consider books, "Information Theory, Inference and Learning Algorithms" by MacKay links ML and information theory nicely, and is written at an undergraduate engineering level. "Elements of Information Theory" by Cover & Thomas is an excellent book on information theory itself at a general masters of engineering level. $\endgroup$ Commented Dec 16, 2020 at 12:51
  •
    $\begingroup$ FYI, I had a similar background to yours. 4 years into an ML PhD, I'm still down that very rabbit hole. If you're looking to do ML long term, I'd really suggest sitting down and working through a couple books back to front. Anything less and all of ML will seem like a bag of disjoint tricks. $\endgroup$ Commented Dec 16, 2020 at 12:54
  • $\begingroup$ @Oxonon Thanks! I will definitely set time aside for those. I do know myself, however, and know that watching lectures works better for me than books. Do you know of courses that follow any of those books, maybe? By "by a syllabus" I mean something structured that wouldn't require me to stop and go learn topics from scratch, but rather would teach those topics in the correct order. The lack of order and structured background is what makes Wikipedia-style learning not work for me. $\endgroup$ Commented Dec 16, 2020 at 12:58
  •
    $\begingroup$ MacKay's book is written specifically with a syllabus approach in mind; at the start he highlights the order in which one can read through the chapters. I'm not familiar with online lectures. I'd however treat those as a nice easy-going overview to use on top of detailed learning from a book, not as a substitute. I don't think (non-mathematical) lectures ever go into enough depth to learn a topic. $\endgroup$ Commented Dec 16, 2020 at 13:02
  •
    $\begingroup$ MacKay's. It applies the ideas to topics more relevant for you, rather than purely channel coding etc. These books should be legally available online (preprints etc.). $\endgroup$ Commented Jan 10, 2021 at 10:52

1 Answer

$\begingroup$

Below are standard textbooks on information theory, each with extensive content and derivations of information-theoretic measures:

  1. MacKay, Information Theory, Inference and Learning Algorithms
  2. Cover & Thomas, Elements of Information Theory
  3. Norwich, Information, Sensation and Perception
$\endgroup$
