I want to use the projected gradient descent algorithm to solve the optimization problem below, but it does not converge and I cannot see why. I would appreciate any help in finding the mistake.
Given $n$, $C$, $c_i$, $r_i$, $p_i$ for all $i \in \{1, 2, \dots, n\}$, I want to solve this optimization problem:
$$\text{maximize} \quad f(x_1, x_2, \dots, x_n) = \prod_{i=1}^n \left(\frac{x_i - c_i}{r_i}\right)^{p_i}$$
$$\text{s.t.} \quad x_i \le r_i, \quad c_i < x_i, \quad \sum_{i=1}^n x_i = C,$$
where
$$0 < p_i \le 1, \quad \sum_{i=1}^n p_i = 1, \quad x_i, r_i \in \mathbb{R}, \quad x_i, r_i > 0, \quad C > 0.$$

After a logarithmic transformation and Lagrangian relaxation of the sum constraint, the problem is transformed to:
$$\text{minimize} \quad L(x_1, \dots, x_n, \beta) = -\sum_{i=1}^n p_i \log\frac{x_i - c_i}{r_i} + \beta\left(\sum_{i=1}^n x_i - C\right)$$
$$\text{s.t.} \quad x_i \le r_i, \quad c_i < x_i, \quad \beta > 0.$$

Now I use the projected gradient algorithm, with partial derivatives
\begin{equation} \frac{\partial L}{\partial x_i} = -\frac{p_i}{x_i - c_i} + \beta, \qquad \frac{\partial L}{\partial \beta} = \sum_{i=1}^n x_i - C, \end{equation}
as follows:
$$\alpha = 0.0001; \quad \epsilon = 0.000001; \quad \beta_{(k)} = 0.1;$$
$$x_{i(k)} = c_i + p_i/\beta_{(k)}; \quad \text{// } i = 1, 2, \dots, n$$
repeat:
$$\beta_{(k+1)} = \max\left(\beta_{(k)} - \alpha\left(\sum_{i=1}^n x_{i(k)} - C\right),\ \epsilon\right)$$
$$x_{i(k+1)} = x_{i(k)} - \alpha\left(-\frac{p_i}{x_{i(k)} - c_i} + \beta_{(k+1)}\right); \quad \text{// } i = 1, 2, \dots, n$$
$$x_{i(k+1)} = \max\{c_i + \epsilon,\ \min\{x_{i(k+1)},\ r_i\}\} \quad \text{// projection}$$
if $|x_{i(k+1)} - x_{i(k)}| < \epsilon$ for all $i$ and $|\beta_{(k+1)} - \beta_{(k)}| < \epsilon$, then stop; else set $\beta_{(k)} = \beta_{(k+1)}$ and $x_{i(k)} = x_{i(k+1)}$ for $i = 1, 2, \dots, n$, and repeat.
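To make the question concrete, here is a minimal Python/NumPy sketch of the iteration exactly as described above. The values of `n`, `C`, `c`, `r`, `p` are illustrative assumptions of mine, not data from the problem, and the iteration cap is arbitrary:

```python
import numpy as np

# Illustrative problem data (assumed, not from the question);
# p_i in (0,1] summing to 1, and sum(c) < C <= sum(r).
n = 3
c = np.array([0.1, 0.2, 0.3])
r = np.array([1.0, 1.5, 2.0])
p = np.array([0.2, 0.3, 0.5])
C = 2.0

alpha, eps = 0.0001, 0.000001
beta = 0.1
x = c + p / beta                      # initialization: x_i = c_i + p_i / beta

for k in range(200000):               # arbitrary safety cap on iterations
    # gradient step on the multiplier, kept positive as in the question
    beta_new = max(beta - alpha * (x.sum() - C), eps)
    # gradient step on x, then projection onto c_i + eps <= x_i <= r_i
    x_new = x - alpha * (-p / (x - c) + beta_new)
    x_new = np.clip(x_new, c + eps, r)
    if np.abs(x_new - x).max() < eps and abs(beta_new - beta) < eps:
        break
    x, beta = x_new, beta_new
```

Running this with the data above, the iterates stop changing but the final point need not satisfy $\sum_i x_i = C$, which is the non-convergence I am asking about.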