I wanted to make sure of something: I know that for any function $f$ we have $ff'= \frac{1}{2}(f^2)'$. Does the same principle hold for the gradient and divergence operators? Let $f$ be a function and $\vec{g}$ a vector field; do the equalities below hold in any coordinate system (spherical, cylindrical, ...)? $$ f \vec{\nabla}f \stackrel{?}{=} \frac{1}{2}\vec{\nabla}f^2$$ $$\vec{g}\,(\vec{\nabla}\cdot \vec{g}) \stackrel{?}{=} \frac{1}{2} \vec{\nabla} \cdot \vec{g}^2$$ If so, does someone have a proof, or know where I could find one? I think the first one, with the gradient, is true, but I have big doubts about the one with the divergence. Thanks

  • How are you defining $g^2$? Commented Oct 9, 2022 at 16:55
  • Well, that's mainly the point of my question for the divergence; instead of $g^2$ it could maybe be $\vec{g} \times \vec{g}$. Commented Oct 9, 2022 at 17:19
  • But still, is it correct for the gradient? Commented Oct 9, 2022 at 17:20

1 Answer


$ \newcommand\R{\mathbb R} $

Let $\bullet_s : \R\times\R \to \R$ and $\bullet_v : \R^n\times\R^n \to \R^n$ be commutative products on scalars and on vectors, respectively. A general fact (the Leibniz rule) is that the derivative of a product is the sum of the terms obtained by differentiating each factor in turn. For $\bullet_s$ and a function $f : \R^n \to \R$ this looks like $$ \nabla(f\bullet_s f) = \dot\nabla(\dot f\bullet_s f) + \dot\nabla(f\bullet_s\dot f) = 2\dot\nabla(\dot f\bullet_s f). \tag{1} $$ The notation $\dot\nabla$ means that we are only differentiating $\dot f$, and the undotted $f$ should be thought of as constant while differentiating. A more verbose notation would be $$ \dot\nabla(\dot f\bullet_s f) = \bigl[\nabla_y(f(y)\bullet_s f(x))\bigr]_{y=x}. $$ When $\bullet_s$ is ordinary multiplication, (1) gives $$ \nabla f^2 = 2(\nabla f)f = 2f\nabla f. $$
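If it helps to see this concretely, here is a quick finite-difference sanity check of $\nabla f^2 = 2f\nabla f$ in Cartesian coordinates; the test function and sample point are arbitrary choices for illustration:

```python
import numpy as np

# Numerical check of grad(f^2) = 2 f grad(f) at a sample point,
# using central finite differences. The function f is an
# arbitrary illustrative choice.
def f(p):
    x, y, z = p
    return np.sin(x) * y + z**2

def grad(func, p, h=1e-6):
    """Central-difference gradient of a scalar function at point p."""
    p = np.asarray(p, dtype=float)
    g = np.zeros_like(p)
    for i in range(len(p)):
        e = np.zeros_like(p)
        e[i] = h
        g[i] = (func(p + e) - func(p - e)) / (2 * h)
    return g

p = np.array([0.3, -1.2, 0.7])
lhs = grad(lambda q: f(q)**2, p)   # grad(f^2)
rhs = 2 * f(p) * grad(f, p)        # 2 f grad(f)
print(np.allclose(lhs, rhs, atol=1e-5))  # True
```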

Similarly, when considering $\bullet_v$ and $g : \R^n \to \R^n$ we get $$ \nabla\cdot(g\bullet_v g) = \dot\nabla\cdot(\dot g\bullet_v g) + \dot\nabla\cdot(g\bullet_v\dot g) = 2\dot\nabla\cdot(\dot g\bullet_v g). \tag{2} $$ I stress that this requires $\bullet_v$ to be commutative; if it is anti-commutative, like the cross product when $n=3$, a derivation similar to (2) shows that $\nabla\cdot(g\times g) = 0$, as it should, since $g\times g = 0$.
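For a concrete commutative example of $\bullet_v$, take the componentwise (Hadamard) product; then (2) reduces to $\nabla\cdot(g\bullet_v g) = \sum_i \partial_i(g_i^2) = 2\sum_i g_i\,\partial_i g_i$, which can be sanity-checked numerically (the field and point below are arbitrary illustrative choices):

```python
import numpy as np

# Check div(g ∘ g) = 2 Σ_i g_i ∂_i g_i for the componentwise
# (Hadamard) product, which is commutative. The vector field g
# is an arbitrary illustrative choice.
def g(p):
    x, y, z = p
    return np.array([x * y, np.cos(z), x + z**2])

def partial(func, p, i, h=1e-6):
    """Central-difference partial derivative ∂_i of a vector-valued func."""
    e = np.zeros(len(p))
    e[i] = h
    return (func(p + e) - func(p - e)) / (2 * h)

p = np.array([0.5, 1.1, -0.4])
# div(g ∘ g) = Σ_i ∂_i (g_i^2)
lhs = sum(partial(lambda q: g(q)**2, p, i)[i] for i in range(3))
# 2 Σ_i g_i ∂_i g_i
rhs = 2 * sum(g(p)[i] * partial(g, p, i)[i] for i in range(3))
print(np.isclose(lhs, rhs, atol=1e-5))  # True
```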

The nice thing about the $\dot\nabla$ notation is that it can be treated like a generic vector; any identity that holds for all vectors will also hold when that vector is replaced with $\dot\nabla$. So we could say that the reason $\dot\nabla(\dot f f) = f\nabla f$ works is that for any vector $v$ and scalars $a, b$ $$ v(ab) = b(va), $$ so replacing $v$ with $\dot\nabla$, $a$ with $\dot f$, and $b$ with $f$ gives the desired result.

We can use this idea to try to get what you want with $\bullet_v$. However, your equation $g(\nabla\cdot g) = \tfrac12\nabla\cdot g^2$ doesn't make sense as it stands, since the LHS is a vector but the RHS is a scalar. What we'll consider instead is trying to make true $$ \nabla\cdot(g\bullet_v g) = 2F(g)(\nabla\cdot g) \tag{3} $$ where $F : \R^n \to \R$ and $F(g)$ denotes $F(g(x))$ with the $x$ dependence suppressed. A sufficient condition is for the following identity on vectors $u,v,w$ to hold: $$ u\cdot(v\bullet_v w) = F(w)(u\cdot v). $$ This fully determines $v\bullet_v w$: choose $u = e_i$ where $e_i$ is the $i^\text{th}$ standard basis element, multiply by $e_i$, and sum over $i$. This gives $$ v\bullet_v w = F(w)\sum_{i=1}^n(e_i\cdot v)e_i = F(w)v. $$ We need this to be commutative, so $$ F(w)v - F(v)w = 0. $$ But this is impossible unless $F(v) = 0$ for all $v$, since we can simply choose $v$ and $w$ to be linearly independent. (In the case $n=1$ we can choose $F(v) = v$, and then $\bullet_v$ and the dot product $\cdot$ are both just multiplication.) This suggests (though does not prove) that there is no non-trivial product $\bullet_v$ taking vectors to vectors satisfying (3).
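The parenthetical $n=1$ case is exactly your original scalar identity: with $F(v) = v$, (3) reads $(g^2)' = 2gg'$, which is easy to check numerically (the function $g$ below is an arbitrary illustrative choice):

```python
import numpy as np

# In n = 1, choosing F(v) = v makes (3) the familiar (g^2)' = 2 g g'.
# Finite-difference check with an arbitrary illustrative g.
def g(x):
    return np.exp(x) * np.sin(x)

def d(func, x, h=1e-6):
    """Central-difference derivative of func at x."""
    return (func(x + h) - func(x - h)) / (2 * h)

x = 0.8
lhs = d(lambda t: g(t)**2, x)  # (g^2)'
rhs = 2 * g(x) * d(g, x)       # 2 g g'
print(np.isclose(lhs, rhs, atol=1e-5))  # True
```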

  • Wow, ok, thank you for your detailed answer and your time. So, to put it in a nutshell: I was 'right' for the gradient, but it is more complicated for the divergence operator because we are dealing with vectors? Commented Oct 10, 2022 at 16:27
  • @Michael Yes, I think that is a fair assessment. Commented Oct 10, 2022 at 17:05
  • Amazing, thanks again. Commented Oct 10, 2022 at 20:00
