nalzok

Distortion-Rate function for standard Gaussian signals quantized with a uniform alphabet

Let's say $X \sim N(0, 1)$ is a standard Gaussian input signal and $Y \sim U\{y_1, y_2, \cdots, y_{16}\}$ follows a discrete uniform distribution (in fact, $\{y_1, y_2, \cdots, y_{16}\}$ is a reconstruction alphabet for $X$). Under the MSE distortion measure $d(x, y) = (x - y)^2$, for a given rate $R^* = 2$ (bits, not nats!), I want to find

$$ \begin{aligned} D(R^*) =& \inf_{p(y|x)} \int_{-\infty}^{\infty} \sum_{i=1}^{16} p(x)\,p(y_i|x)\, d(x, y_i)\, dx \\ & \text{subject to}\; I(X; Y) \le R^* \end{aligned}. $$
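(For concreteness, here is how the objective alone can be evaluated numerically for one hand-picked $p(y|x)$: a hard nearest-neighbor assignment, with an assumed uniform alphabet on $[-3, 3]$. This choice ignores the rate constraint entirely; it only illustrates computing the integral-plus-sum.)

```python
import numpy as np

# Illustration only: evaluate the distortion objective for one particular
# p(y|x) -- a hard nearest-neighbor assignment. The alphabet below is an
# assumed uniform grid on [-3, 3]; the question does not specify the y_i.
ys = np.linspace(-3, 3, 16)

xs = np.linspace(-6, 6, 20001)                 # quadrature grid for x
dx = xs[1] - xs[0]
px = np.exp(-xs**2 / 2) / np.sqrt(2 * np.pi)   # N(0, 1) density

j = np.argmin((xs[:, None] - ys[None, :])**2, axis=1)  # nearest y_i per x
distortion = np.sum(px * (xs - ys[j])**2) * dx
print(f"E[(X - Y)^2] = {distortion:.4f}")
```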

This is a convex optimization problem, which is generally easy in the finite case, but here $p(y|x)$ depends on the continuous variable $x$, so I don't know how to parameterize it with finitely many variables. I am aware of the Blahut–Arimoto algorithm, but it only works when both $X$ and $Y$ are discrete and finite (in which case $p(y|x)$ can be expressed as a table).
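For what it's worth, one common workaround (a numerical sketch, not a closed form) is to discretize $X$ on a fine grid and run Blahut–Arimoto on the resulting finite problem, bisecting on the Lagrange multiplier until the achieved rate hits $R^* = 2$ bits. The grid, the alphabet placement ($y_i$ uniform on $[-3, 3]$), and the iteration counts below are all assumptions; note also that plain Blahut–Arimoto optimizes the marginal $q(y)$ freely, so it does not enforce a uniform marginal on $Y$.

```python
import numpy as np

def blahut_arimoto(px, d, beta, iters=300):
    """Rate-distortion Blahut-Arimoto at a fixed trade-off parameter beta.

    px   : (n,) pmf of the discretized source X
    d    : (n, m) distortion matrix, d[i, j] = (x_i - y_j)**2
    beta : Lagrange multiplier; larger beta -> higher rate, lower distortion
    Returns (rate in bits, expected distortion).
    """
    m = d.shape[1]
    qy = np.full(m, 1.0 / m)                    # output marginal q(y)
    d0 = d - d.min(axis=1, keepdims=True)       # row shift avoids exp underflow
    for _ in range(iters):
        w = qy[None, :] * np.exp(-beta * d0)
        pyx = w / w.sum(axis=1, keepdims=True)  # p(y|x) for the current q(y)
        qy = px @ pyx                           # re-estimate the marginal
    D = np.sum(px[:, None] * pyx * d)
    with np.errstate(divide="ignore"):
        logr = np.log2(pyx / qy[None, :])
    R = np.sum(px[:, None] * pyx * np.where(pyx > 0, logr, 0.0))
    return R, D

# Discretized N(0, 1) source and an assumed uniform alphabet on [-3, 3]
xs = np.linspace(-5, 5, 1001)
px = np.exp(-xs**2 / 2)
px /= px.sum()
ys = np.linspace(-3, 3, 16)
d = (xs[:, None] - ys[None, :])**2

# Bisect on beta so the achieved rate hits R* = 2 bits
lo, hi = 1e-3, 100.0
for _ in range(40):
    beta = 0.5 * (lo + hi)
    R, D = blahut_arimoto(px, d, beta)
    lo, hi = (beta, hi) if R < 2.0 else (lo, beta)
print(f"R = {R:.4f} bits, D = {D:.4f}")
```

Bisecting on $\beta$ traces the $(R, D)$ curve; the reported $D$ approximates $D(2)$ for this alphabet, and the final `pyx` table (one row per grid point) approximates the optimizing $p(y|x)$, from which $p(x|y)$ follows by Bayes.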

Anyway, it sounds like a pretty fundamental problem. Is there a known result about it?

Background:

Ultimately, I only care about $p(x|y)$, i.e. the backward test channel that attains the distortion-rate bound. I posed the question as finding $p(y|x)$ since it follows naturally from the definition of $D(R^*)$ and applying the Bayes rule to $p(y|x)$ gives us $p(x|y)$. However, if you can somehow find $p(x|y)$ directly, feel free to post an answer!

Attempt:

Since allowing a higher rate never increases the achievable distortion, the constraint is active at the optimum, i.e. $I(X; Y) = R^* = 2$. It follows that

$$ \begin{aligned} 2 = I(X; Y) =& H(X) - H(X|Y) \\ =& \frac{1}{2}\log_2(2 \pi e \sigma^2) - H(X|Y) \\ H(X|Y) =& \frac{1}{2}\log_2(2 \pi e \sigma^2) - 2 \end{aligned} $$

and

$$ \begin{aligned} 2 = I(X; Y) =& H(Y) - H(Y|X) \\ =& 4 - H(Y|X) \\ H(Y|X) =& 2 \end{aligned} $$
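As a quick numeric sanity check of the two identities above (with $\sigma = 1$ and everything in bits; note that $H(X|Y)$ here is a conditional differential entropy, so a small positive value is meaningful):

```python
import math

# Sanity check of the entropy bookkeeping above (sigma = 1, units of bits)
H_X = 0.5 * math.log2(2 * math.pi * math.e)  # differential entropy of N(0, 1)
H_X_given_Y = H_X - 2                         # from I(X; Y) = 2 bits
H_Y = math.log2(16)                           # Y uniform over 16 symbols -> 4 bits
H_Y_given_X = H_Y - 2
print(f"h(X)   = {H_X:.4f} bits")
print(f"h(X|Y) = {H_X_given_Y:.4f} bits")
print(f"H(Y|X) = {H_Y_given_X:.4f} bits")
```

So the constraint pins down $H(X|Y) \approx 0.0472$ bits and $H(Y|X) = 2$ bits exactly, but these two scalars alone do not determine $p(y|x)$.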

However, I have no clue how to proceed.
