$\begingroup$

I am trying to understand a proof in this paper (DET = det -- A remark on the distributional determinant, Stefan Müller 1990, used to be available here http://shelf2.library.cmu.edu/Tech/53922174.pdf). We have a standard positive mollifier $\phi_r = r^{-n}\phi\left(\frac{x-x_0}{r}\right)$ and functions $v\in W^{1,p}(\Omega)$ and $\sigma\in L^q(\Omega, \mathbb{R}^n)$ with $\frac{1}{p}-\frac{1}{n}+\frac{1}{q}\leq 1$. Furthermore, $\textrm{div}\sigma = 0$.

Now there is the following step I do not fully understand:

$$ -\int_{B_r(x_0)} \sum_{j=1}^n \partial_j \phi_r (v(x_0) + Dv(x_0)(x-x_0))\sigma^j(x)dx = \int_{B_r(x_0)} \phi_r Dv(x_0)\cdot \sigma(x)dx $$

Does anybody have any tips on how this identity is derived?

Best regards and many thanks in advance!

Here is what is written in the original paper:

[screenshot of the relevant passage from Müller's paper]

$\endgroup$
  • $\begingroup$ Probably a misprint in the RHS: should be $D\phi_r$ instead of $\phi_r D.$ And no $-$ in front of the LHS. $\endgroup$ Commented Jan 3, 2023 at 9:59

1 Answer

$\begingroup$

Use the Gauss–Green theorem (integration by parts): $$ \int_U u_{x_i}v\,dx = -\int_U u\, v_{x_i}\,dx + \int_{\partial U} uv\, \nu^i\, dS. $$ From $\operatorname{div}(\sigma)=0$ we know $$ \sum_i\sigma^{i}_{x_i}=0, $$ and the mollifier $\phi_r$ vanishes on the boundary: $$ \phi_r \big|_{\partial B_r(x_0)} = 0. $$ Using these facts: $$ \require{cancel} \int_{B_r(x_0)} \sum_i (\phi_r)_{x_i} \big(v(x_0)+Dv(x_0)(x-x_0)\big)\sigma^i\, dx = \\ v(x_0)\sum_i \int_{B_r(x_0)} (\phi_r)_{x_i}\sigma^i\,dx +Dv(x_0)\cdot\sum_i\int_{B_r(x_0)}(\phi_r)_{x_i}(x-x_0)\sigma^i\, dx = \\ v(x_0)\Big(-\int_{B_r(x_0)} \phi_r \cancelto{0}{\Big(\sum_i \sigma^i_{x_i}\Big)}\, dx + \sum_i \int_{\partial B_r(x_0)}\cancelto{0}{\phi_r \big|_{\partial B_r(x_0)}}\, \sigma^i \nu^i\, dS \Big) \\ +Dv(x_0)\cdot \Big(-\sum_i \int_{B_r(x_0)} \phi_r \big((x-x_0)\sigma^i\big)_{x_i}\,dx +\sum_i \int_{\partial B_r(x_0)}\cancelto{0}{\phi_r \big|_{\partial B_r(x_0)}}(x-x_0)\sigma^i \nu^i\, dS \Big). $$ The boundary terms are zero, and by the product rule $\big((x-x_0)\sigma^i\big)_{x_i} = e_i\sigma^i + (x-x_0)\sigma^i_{x_i}$, hence $$ = -Dv(x_0)\cdot \sum_i \int_{B_r(x_0)} \phi_r \big(\sigma^i e_i+(x-x_0)\sigma^i_{x_i}\big)\,dx. $$ The second term is again zero because $\sigma$ is divergence free: $$ =- \int_{B_r(x_0)} \sum_i \phi_r\, Dv(x_0)\cdot(\sigma^i e_i)\, dx = -\int_{B_r(x_0)} \phi_r\, Dv(x_0)\cdot\sigma\, dx. \quad \square $$ Multiplying both sides by $-1$ gives exactly the identity in the question. (The computation is formal; since $\sigma \in L^q$ is divergence free only in the distributional sense, the terms involving $\sigma^i_{x_i}$ vanish directly by the definition of $\operatorname{div}\sigma = 0$, with $\phi_r$ and $\phi_r\,(x_j - x_{0,j})$ as test functions, so the same cancellations go through.)
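As a numerical sanity check of the identity (not from the paper; the bump mollifier, the stream-function construction of a divergence-free $\sigma$, and the affine data below are all illustrative choices), one can take $n=2$, $x_0 = 0$, $r = 1$ and compare both sides by midpoint quadrature:

```python
import numpy as np

# Midpoint grid on [-1, 1]^2; the integrands are supported in B_1(0).
N = 400
h = 2.0 / N
c = (np.arange(N) + 0.5) * h - 1.0
X, Y = np.meshgrid(c, c, indexing="ij")
S = X**2 + Y**2
inside = S < 1.0

# Standard bump mollifier phi(x) = exp(-1/(1 - |x|^2)) on B_1, 0 outside
# (normalization is irrelevant: the identity is linear in phi).
phi = np.where(inside, np.exp(-1.0 / np.where(inside, 1.0 - S, 1.0)), 0.0)
# Analytic gradient: d_j phi = -2 x_j phi / (1 - |x|^2)^2.
denom = np.where(inside, (1.0 - S) ** 2, 1.0)
dphi_x = -2.0 * X * phi / denom
dphi_y = -2.0 * Y * phi / denom

# Divergence-free sigma = (psi_y, -psi_x) with stream function psi = sin(x)cos(y).
sig_x = -np.sin(X) * np.sin(Y)
sig_y = -np.cos(X) * np.cos(Y)

v0 = 0.7                           # v(x_0), arbitrary
a = np.array([1.3, -0.4])          # Dv(x_0), arbitrary constant (co)vector
affine = v0 + a[0] * X + a[1] * Y  # v(x_0) + Dv(x_0)(x - x_0), since x_0 = 0

# LHS and RHS of the identity in the question.
lhs = -np.sum((dphi_x * sig_x + dphi_y * sig_y) * affine) * h**2
rhs = np.sum(phi * (a[0] * sig_x + a[1] * sig_y)) * h**2
print(lhs, rhs)  # the two values agree up to quadrature error
```

The two integrals match to within the quadrature error, as the integration-by-parts argument predicts.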

$\endgroup$
  • $\begingroup$ Thank you for your answer! One quick follow-up question: why can I pull $Dv(x_0)$ in front of the sum in the first identity? $\endgroup$ Commented Jan 3, 2023 at 14:13
  • $\begingroup$ $Dv(x_0)$ is a linear map: applying it to a vector is the same as taking the inner product with the constant vector $Dv(x_0)$, and the inner product is bilinear, so it distributes over sums: $Dv(x_0)\cdot\big(\sum_i \vec{v^i}\big) = \sum_i \big(Dv(x_0)\cdot\vec{v^i}\big)$. $\endgroup$ Commented Jan 3, 2023 at 14:38
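The distributivity used in the comment above can be illustrated in two lines (the concrete vectors are arbitrary stand-ins):

```python
import numpy as np

Dv = np.array([1.3, -0.4])  # stand-in for the constant vector Dv(x_0)
vs = [np.array([1.0, 2.0]), np.array([-0.5, 3.0]), np.array([2.0, 0.0])]

lhs = Dv @ sum(vs)                 # Dv(x_0) . (sum_i v^i)
rhs = sum(Dv @ v for v in vs)      # sum_i (Dv(x_0) . v^i)
print(lhs, rhs)  # both equal 1.25
```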
