# Deterministically combine more than one source of entropy

What is the proper/canonical way to do this?

For example, take $0 < r_1 < 1$ and $0 < r_2 < 1$. Presuming both are uniformly distributed, combining them by averaging biases the result towards $0.5$ very quickly.
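To see the effect concretely, here is a small Python sketch (the variable names are mine, not from the question): averaging two uniform values concentrates the result near $0.5$, while a wrap-around sum $(r_1 + r_2) \bmod 1$ stays uniform, so it is one standard way to combine without introducing bias.

```python
import random
import statistics

random.seed(42)  # reproducible demo
N = 100_000
r1 = [random.random() for _ in range(N)]
r2 = [random.random() for _ in range(N)]

# Averaging: the result follows a triangular distribution peaked at 0.5,
# with stdev sqrt(1/24) ~= 0.204 instead of the uniform's sqrt(1/12) ~= 0.289.
avg = [(a + b) / 2 for a, b in zip(r1, r2)]

# Wrap-around sum: (r1 + r2) mod 1 is uniform on [0, 1) again,
# so no bias towards 0.5 is introduced.
frac = [(a + b) % 1.0 for a, b in zip(r1, r2)]
```

The modular sum is the continuous analogue of XOR-ing two independent bitstrings: if either input is uniform and independent of the other, the combination is uniform.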

The specific use case is a game/simulation I'm programming. Each entity has a random number specific to it; its individual behaviours also have random numbers associated with them. For most behavioural calculations I use only the behaviour random number, but for a few I want to incorporate the entity's random number as well. Simply averaging them results in less randomness.

**Desired outcome:** deterministically combine more than one source of entropy.

So “Entity A” may have a small random number, and so most of its attributes skew small. Its behaviour random number happens to be big, so most of its behaviour attributes skew large. When combined, the two random numbers deterministically produce some $n$, and so some of its behaviour attributes skew towards $n$.
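One low-tech way to get that behaviour is to mix the two numbers into a single seed and draw every attribute from a PRNG seeded with it, so each (entity, behaviour) pair gets its own reproducible stream. A minimal sketch, assuming integer seeds (the seed values and the splitmix64-style multiplier are my choices, not from the question):

```python
import random

def attribute_stream(entity_seed: int, behaviour_seed: int) -> random.Random:
    """Deterministically mix two integer seeds into one PRNG stream."""
    # Multiply by a large odd constant (as in splitmix64) before XOR-ing,
    # so that swapping the two seeds yields a different mixed value.
    mixed = (entity_seed * 0x9E3779B97F4A7C15 ^ behaviour_seed) & (2**64 - 1)
    return random.Random(mixed)

# Hypothetical entity/behaviour seeds; every run produces the same attributes.
rng = attribute_stream(12345, 678)
size, speed = rng.random(), rng.random()
```

Every attribute drawn from `rng` is still uniform, but fully determined by the pair of seeds, which is exactly the "deterministically result in $n$" behaviour described above.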

### RE: Off-Topic

My question is not specific to game development; the particular use case is. How do cryptographers deterministically combine sources of entropy? I assume they have use cases for this outcome.
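For what it's worth, the usual cryptographic answer is to feed all the inputs into a hash function and use the digest: the output is deterministic, depends on every input, and looks uniform as long as any input is unpredictable. A minimal sketch using SHA-256 (the seed values are hypothetical):

```python
import hashlib

def combine(*seeds: bytes) -> float:
    """Hash the concatenated inputs and map the digest into [0, 1)."""
    h = hashlib.sha256()
    for s in seeds:
        # Length-prefix each input so ("ab", "c") and ("a", "bc")
        # never hash to the same value.
        h.update(len(s).to_bytes(8, "big"))
        h.update(s)
    # First 8 bytes of the digest as an unsigned integer, scaled into [0, 1).
    return int.from_bytes(h.digest()[:8], "big") / 2**64

x = combine(b"entity-A", b"wander")  # same inputs always give the same x
```

This is roughly what key-derivation functions such as HKDF do more carefully; for a game, a plain hash of the concatenated seeds is plenty.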
