What's the best way to model situations with dependencies? What I mean is well-defined arrangements like:
If I choose a 6-sided die with probability $p$ and a 20-sided one otherwise, and then roll it twice, what's the probability distribution on the sum of the two rolls?
Here's my current attempt:
```mathematica
p = 0.3;
sides = TransformedDistribution[6 + (1 - d)*14,
   d \[Distributed] BernoulliDistribution[p]];
rolls[dist_] := TransformedDistribution[x + y,
   {x, y} \[Distributed] ProductDistribution[{dist, 2}]];
sumDist = ParameterMixtureDistribution[
   rolls[DiscreteUniformDistribution[{1, n}]],
   n \[Distributed] sides];
RandomVariate[sumDist, 10]
```

I get this formula back for the random variate; it looks like `n` isn't properly being passed through:
```mathematica
RandomVariate[
 ParameterMixtureDistribution[
  TransformedDistribution[\[FormalX]1 + \[FormalX]2,
   {\[FormalX]1, \[FormalX]2} \[Distributed]
    ProductDistribution[{DiscreteUniformDistribution[{1, n}], 2}]],
  n \[Distributed]
   TransformedDistribution[6 + 14 (1 - \[FormalX]),
    \[FormalX] \[Distributed] BernoulliDistribution[0.3]]], 10]
```

My follow-on question will be to compute the probability that each die was chosen given an observed sum (e.g. 11), so I'm trying to keep that initial choice explicit in the model.
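One possible workaround, if the immediate goal is just a usable distribution for the sum: since there are only two dice, the mixture can be written explicitly with `MixtureDistribution` instead of routing the side count through `ParameterMixtureDistribution`. This is a sketch, not the fix for the `n`-substitution issue; the `rolls` helper here takes the number of sides directly rather than a distribution:

```mathematica
p = 0.3;
(* sum of two independent rolls of an n-sided die *)
rolls[n_] := TransformedDistribution[x + y,
   {x, y} \[Distributed]
    ProductDistribution[{DiscreteUniformDistribution[{1, n}], 2}]];
(* explicit two-component mixture: 6-sided with probability p, 20-sided otherwise *)
sumDist = MixtureDistribution[{p, 1 - p}, {rolls[6], rolls[20]}];
RandomVariate[sumDist, 10]
PDF[sumDist, 11]  (* exact probability of the sum being 11 *)
```

The trade-off is that the die choice is no longer a named random variable inside the model, so the posterior on which die was rolled has to be computed separately.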

`Conditioned[]` will be useful here.
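Following up on that comment, here is one hedged sketch of the posterior computation, writing the whole model inside `Probability` so that the die choice `d` stays an explicit random variable. Matching the original `6 + 14 (1 - d)` encoding, `d == 1` means the 6-sided die was chosen; the later distributions in the list depend on the earlier variable `d`, which `Probability` permits:

```mathematica
p = 0.3;
(* P(6-sided die | sum of two rolls = 11) *)
Probability[
 Conditioned[d == 1, x + y == 11],
 {d \[Distributed] BernoulliDistribution[p],
  x \[Distributed] DiscreteUniformDistribution[{1, 6 + 14 (1 - d)}],
  y \[Distributed] DiscreteUniformDistribution[{1, 6 + 14 (1 - d)}]}]
```

As a hand cross-check by Bayes' rule: a sum of 11 has probability 2/36 on two d6 and 10/400 on two d20, so the posterior should be 0.3·(2/36) / (0.3·(2/36) + 0.7·(10/400)) ≈ 0.488.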