On Differential Forms



Original post is here: eklausmeier.goip.de

Abstract. This article gives a very simple definition of $k$-forms, or differential forms. It requires only basic knowledge of matrices and determinants. Furthermore, a very simple proof is given that the double outer differentiation of $k$-forms vanishes.

MSC 2010: 58A10

1. Basic definitions #

We denote the submatrix of $A=(a_{ij})\in R^{m\times n}$ consisting of the rows $i_1,\ldots,i_k$ and the columns $j_1,\ldots,j_k$ by $$ [A]_{i_1\ldots i_k}^{j_1\ldots j_k} := \begin{pmatrix} a_{i_1j_1} & \ldots & a_{i_1j_k}\\ \vdots & \ddots & \vdots\\ a_{i_kj_1} & \ldots & a_{i_kj_k} \end{pmatrix} $$ and its determinant by $$ A_{i_1\ldots i_k}^{j_1\ldots j_k} := \det [A]_{i_1\ldots i_k}^{j_1\ldots j_k}. $$ For example $$ A = \begin{pmatrix}a_{11}&a_{12}&a_{13}\\ a_{21}&a_{22}&a_{23}\end{pmatrix}, \qquad A_{1,2}^{1,3} = a_{11}a_{23} - a_{21}a_{13}. $$
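This subdeterminant notation is easy to check numerically. Here is a minimal NumPy sketch; the helper name `subdet` and the sample matrix are mine, not part of the text:

```python
import numpy as np

def subdet(A, rows, cols):
    """Determinant of the submatrix of A given by 1-based row/column lists."""
    idx = np.ix_([r - 1 for r in rows], [c - 1 for c in cols])
    return np.linalg.det(A[idx])

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
# A_{1,2}^{1,3} = a11*a23 - a21*a13 = 1*6 - 4*3 = -6
print(subdet(A, rows=[1, 2], cols=[1, 3]))   # -6.0
```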

Suppose $$ H \in R^{n\times(n+1)} $$ and let $$ f,g\colon U\subseteq R^n\to R, \qquad U \text{ open}, $$ be two functions which are twice continuously differentiable. Then for fixed $k$ we call the expression $$ f\, H_\alpha^{1\ldots k}, \qquad \alpha=\left(i_1,\ldots,i_k\right) \in\left\{1,\ldots,n\right\}^k, $$ a basic $k$-form or basic differential form of order $k$. It's a real function of $n+k^2$ variables. For $k>n$ the expression is defined to be zero. If $f$ also depends on $\alpha$ then $$ \sum_{1\le i_1<\cdots<i_k\le n} f_{i_1\ldots i_k} H_{i_1\ldots i_k}^{1\ldots k} $$ is called a $k$-form. It's a real function of $n+kn$ variables which is $k$-linear in the $k$ column vectors of $H$.

For example, for $f\colon R\to R$ and $H\in R^{1\times1}$ we have $f(x)\,H$. This is a linear function in $H$ and a possibly non-linear function in $x$.
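To make the general definition concrete, here is a sketch of how a $k$-form can be evaluated as a function of $x$ (hidden inside the coefficient callable) and $H$. The helper `k_form` is hypothetical, and code indices are 0-based where the text counts from 1:

```python
from itertools import combinations
import numpy as np

def k_form(coeff, H, k):
    """Sum over i1<...<ik of f_{i1..ik} times the determinant of
    rows (i1,...,ik) and columns 1..k of H.
    coeff maps a 0-based index tuple to the value f_{i1...ik}(x)."""
    n = H.shape[0]
    total = 0.0
    for rows in combinations(range(n), k):
        total += coeff(rows) * np.linalg.det(H[np.ix_(rows, range(k))])
    return total

H = np.arange(1.0, 13.0).reshape(3, 4)   # n = 3, H in R^{3x4}
print(k_form(lambda idx: 1.0, H, 2))     # sum of all 2x2 row subdeterminants
```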

2. Differentiation of $k$-forms #

For the differential form $$ \omega = f\, H_\alpha^{1\ldots k}, \qquad \alpha=\left(i_1,\ldots,i_k\right), $$ we define $$ d\omega := \sum_{\nu=1}^n {\partial f\over\partial x_\nu} H_{\nu,\alpha}^{1\ldots k+1} $$ as the outer differentiation of $\omega$. This is a $(k+1)$-form. It's a function of $n+(k+1)n$ variables.
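The outer differentiation lends itself to a symbolic check. The following SymPy sketch is my own representation, not the article's notation: a form is stored as a dict mapping 0-based row-index tuples $\alpha$ to coefficients, `d_form` implements the sum above, and `canonicalize` collects coefficients onto increasing index tuples using the antisymmetry of the determinant. All helper names are mine:

```python
import sympy as sp

x = sp.symbols('x1:4')   # coordinates x1, x2, x3, i.e. n = 3

def d_form(coeffs):
    """Outer differentiation: alpha -> (nu,)+alpha with coefficient df/dx_nu."""
    out = {}
    for alpha, f in coeffs.items():
        for nu in range(len(x)):
            key = (nu,) + alpha
            out[key] = out.get(key, 0) + sp.diff(f, x[nu])
    return out

def perm_sign(idx):
    """Sign of the permutation that sorts idx (entries assumed distinct)."""
    s, idx = 1, list(idx)
    for a in range(len(idx)):
        for b in range(a + 1, len(idx)):
            if idx[a] > idx[b]:
                s = -s
    return s

def canonicalize(coeffs):
    """Collect coefficients on sorted index tuples, dropping zero determinants."""
    out = {}
    for idx, c in coeffs.items():
        if len(set(idx)) < len(idx):
            continue                 # repeated row index: the determinant is 0
        key = tuple(sorted(idx))
        out[key] = out.get(key, 0) + perm_sign(idx) * c
    return {k: v for k, v in out.items() if sp.simplify(v) != 0}
```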

The $0$-form $$ \omega = f, \qquad \left|\alpha\right|=k=0, $$ yields $$ d\omega = \sum_{\nu=1}^n {\partial f\over\partial x_\nu} H^1_\nu, \tag{1} $$ which corresponds to $\nabla f = \mathop{\rm grad}f$.
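As a first use of the sketch above, applying `d_form` once to a $0$-form reproduces the gradient coefficients, matching (1):

```python
g = sp.Function('g')(*x)
print(canonicalize(d_form({(): g})))
# {(0,): Derivative(g, x1), (1,): Derivative(g, x2), (2,): Derivative(g, x3)}
```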

In the special case $k=\left|\alpha\right|=1$ we get for $$ \omega = \sum_{i=1}^n f_i H^1_i $$ the result $$ d\omega = \sum_{i=1}^n \sum_{j=1}^n {\partial f_i\over\partial x_j} H^{1,2}_{j,i} = \sum_{i<j} \left({\partial f_i\over\partial x_j} - {\partial f_j\over\partial x_i}\right) H^{1,2}_{j,i}. \tag{2} $$

This corresponds to $\mathop{\rm rot} f$.
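Continuing the sketch (with `f1`, `f2`, `f3` arbitrary symbolic functions), the $1$-form case indeed produces the components of the rotation, matching (2) after rewriting onto increasing indices:

```python
f1, f2, f3 = [sp.Function(f'f{i}')(*x) for i in (1, 2, 3)]
print(canonicalize(d_form({(0,): f1, (1,): f2, (2,): f3})))
# {(0, 1): df2/dx1 - df1/dx2,
#  (0, 2): df3/dx1 - df1/dx3,
#  (1, 2): df3/dx2 - df2/dx3}
```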

Let a hat ($\hat{\ }$) denote exclusion from the index list. The case $k=n-1$ for $$ \omega = \sum_{i=1}^n (-1)^{i-1} f_i\, H_{1\ldots\hat\imath\ldots n}^{1\ldots n-1} $$ delivers $$ d\omega = \sum_{i=1}^n \sum_{\nu=1}^n (-1)^{i-1} {\partial f_i\over\partial x_\nu} H_{\nu,1\ldots\hat\imath\ldots n}^{1\ldots n} = \sum_{i=1}^n {\partial f_i\over\partial x_i} H_{1\ldots n}^{1\ldots n} = \left(\sum_{i=1}^n{\partial f_i\over\partial x_i}\right) \det H. $$ In the double sum only the terms with $\nu=i$ survive, since any other $\nu$ repeats a row index and the determinant vanishes; moving the row $\nu=i$ into its natural position absorbs the sign $(-1)^{i-1}$. This corresponds to $\mathop{\rm div}f$.
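The same sketch confirms the divergence case for $n=3$ (`a1`, `a2`, `a3` are arbitrary symbolic functions):

```python
a1, a2, a3 = [sp.Function(f'a{i}')(*x) for i in (1, 2, 3)]
omega = {(1, 2): a1, (0, 2): -a2, (0, 1): a3}  # sum of (-1)^(i-1) a_i H_{1..î..3}
print(canonicalize(d_form(omega)))
# {(0, 1, 2): da1/dx1 + da2/dx2 + da3/dx3}, the coefficient of det H
```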

Theorem. For $\omega = f H_\alpha^{1\ldots k}$ we have $$ dd\omega = 0. $$

Proof: With $$ d\omega = \sum_{\nu=1}^n {\partial f\over\partial x_\nu} H_{\nu,\alpha}^{1\ldots k+1} $$ we get $$ dd\omega = \sum_{\nu=1}^n \sum_{\mu=1}^n {\partial^2 f\over\partial x_\nu\partial x_\mu} H_{\mu,\nu,\alpha}^{1\ldots k+2} $$ and this is zero, because $$ H_{\mu,\mu,\alpha}^{1\ldots k+2} = 0, \qquad H_{\mu,\nu,\alpha}^{1\ldots k+2} = -H_{\nu,\mu,\alpha}^{1\ldots k+2}, $$ and $$ {\partial^2f\over\partial x_\nu\partial x_\mu} = {\partial^2f\over\partial x_\mu\partial x_\nu}. $$

Application of this theorem to a $0$-form with $f\colon U\subseteq R^n\to R$ and to a $1$-form with $a\colon U\to R^n$, using (1) and then (2), yields $$ \mathop{\rm rot}\mathop{\rm grad} f = 0, \qquad \mathop{\rm div}\mathop{\rm rot} a = 0. $$ The second equation is only meaningful for $n=3$ because $$ {n\choose 2} = n \quad (n\in N) \qquad\Leftrightarrow\qquad n = 3. $$
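Both identities can be checked by running the sketched `d_form` twice; everything cancels down to the empty dict:

```python
print(canonicalize(d_form(d_form({(): g}))))              # {}  rot grad g = 0
print(canonicalize(d_form(d_form({(0,): a1, (1,): a2, (2,): a3}))))
                                                          # {}  div rot a = 0, n = 3
```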

Definition. Suppose $$ \phi\colon D\to E\subset R^n, \qquad D\subset\!\subset R^k, $$ is differentiable, its derivative denoted by $\phi'$, and $$ f\colon E\to R. $$ For the differential form $\omega = f\, H_\alpha^{1\ldots k}$ we define the pullback (back-transportation) as $$ \phi^*\omega := (f\circ\phi)\, (\phi')_{\alpha}^{1\ldots k} $$ and the integral over $k$-forms as $$ \int_\phi \omega := \int_D \phi^*\omega. $$

For example, the case $k=1$, $$ \omega = \sum_{i=1}^n f_i H^1_i, $$ gives $$ \phi^*\omega = \sum_{i=1}^n (f_i\circ\phi)\, (\phi')_i^1 . $$
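As an illustration (my own example, not from the text): pulling the $1$-form with coefficients $(-x_2, x_1)$ back along the unit circle $\phi(t)=(\cos t,\sin t)$ and integrating over $[0,2\pi]$ gives $2\pi$, since the pulled-back integrand is $\sin^2 t+\cos^2 t=1$:

```python
t = sp.symbols('t')
phi = sp.Matrix([sp.cos(t), sp.sin(t)])          # phi: [0, 2*pi] -> R^2
fvec = sp.Matrix([-x[1], x[0]])                  # coefficients (-x2, x1)
pull = (fvec.subs({x[0]: phi[0], x[1]: phi[1]}).T * phi.diff(t))[0]
print(sp.integrate(pull, (t, 0, 2 * sp.pi)))     # 2*pi
```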

3. The outer product of differential forms #

Suppose $$ H\in R^{n\times(n+1)}, \qquad k+m\le n. $$ For the two differential forms $$ \omega = \sum_{1\le i_1<\cdots<i_k\le n} f_{i_1\ldots i_k} H_{i_1\ldots i_k}^{1\ldots k} $$ and $$ \lambda = \sum_{1\le j_1<\cdots<j_m\le n} g_{j_1\ldots j_m} H_{j_1\ldots j_m}^{k+1\ldots k+m} $$ the outer product is defined as $$ \omega\land\lambda := \sum_{\scriptstyle 1\le i_1<\cdots<i_k\le n \atop \scriptstyle 1\le j_1<\cdots<j_m\le n} f_{i_1\ldots i_k}\, g_{j_1\ldots j_m}\, H_{i_1\ldots i_k,\,j_1\ldots j_m}^{1\ldots k+m} . $$ This is a differential form of order $k+m$. It's a function of $n+(k+m)n$ variables.
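In the dict representation of the earlier sketches, where the column indices stay implicit, the outer product is just concatenation of the row-index tuples followed by `canonicalize`; the helper `wedge` is again mine:

```python
def wedge(w, l):
    """Outer product of two forms given on increasing index tuples."""
    out = {}
    for a, fa in w.items():
        for b, gb in l.items():
            out[a + b] = out.get(a + b, 0) + fa * gb
    return canonicalize(out)
```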

Theorem. $$ d(\omega\land\lambda) = d\omega\land\lambda + (-1)^k\omega\land d\lambda $$

Proof: With $$ \omega = \sum_\alpha f_\alpha H_\alpha^{1\ldots k}, \qquad \lambda = \sum_\beta g_\beta H_\beta^{1\ldots m}, $$ then $$ \eqalign{ d(\omega\land\lambda) &= \sum_{\alpha,\beta} \sum_{\nu=1}^n \left( {\partial f_\alpha\over\partial x_\nu} g_\beta + f_\alpha {\partial g_\beta\over\partial x_\nu} \right) H_{\nu,\alpha,\beta}^{1\ldots k+m+1} \cr &= \sum_{\alpha,\beta} \sum_{\nu=1}^n {\partial f_\alpha\over\partial x_\nu} g_\beta H_{\nu,\alpha,\beta}^{1\ldots k+m+1} + \sum_{\alpha,\beta} \sum_{\nu=1}^n f_\alpha {\partial g_\beta\over\partial x_\nu} H_{\nu,\alpha,\beta}^{1\ldots k+m+1} \cr &= d\omega\land\lambda + (-1)^k\omega\land d\lambda,\cr } $$ due to $$ H_{\nu,\alpha,\beta}^{1\ldots k+m+1} = (-1)^k H_{\alpha,\nu,\beta}^{1\ldots k+m+1}, $$ moving the row $\nu$ past the $k$ rows of $\alpha$, and $$ d\lambda = \sum_\beta \sum_{\nu=1}^n {\partial g_\beta\over\partial x_\nu} H_{\nu,\beta}^{1\ldots m+1} . $$
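The product rule can be verified symbolically for, say, two $1$-forms ($k=m=1$) with the sketched helpers; `add` is one more small hypothetical helper for summing coefficient dicts:

```python
def add(u, v, sign=1):
    """Sum of two coefficient dicts, the second scaled by sign."""
    out = dict(u)
    for key, c in v.items():
        out[key] = out.get(key, 0) + sign * c
    return {key: c for key, c in out.items() if sp.simplify(c) != 0}

p, q = sp.Function('p')(*x), sp.Function('q')(*x)
w, l = {(0,): p}, {(1,): q}                             # k = m = 1
lhs = canonicalize(d_form(wedge(w, l)))
rhs = add(wedge(canonicalize(d_form(w)), l),
          wedge(w, canonicalize(d_form(l))), sign=-1)   # sign = (-1)^k
print(sp.simplify(lhs[(0, 1, 2)] - rhs[(0, 1, 2)]) == 0)   # True
```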

An alternative definition for the differentiation of $k$-forms could be given.

Theorem. Suppose $$ \omega = f\, H_\alpha^{1\ldots k}, \qquad 0\le\left|\alpha\right|\le k, $$ and $$ H = \left(h_1,\ldots,h_n,h_{n+1}\right) \in R^{n\times(n+1)}. $$ With $\alpha=\left(i_1,\ldots,i_k\right)$ we have $$ d\omega = \det\left( \mathop{\rm col}\left( \nabla f, [{\rm Id}_n]_\alpha^{1\ldots n} \right) [H]_{1\ldots n}^{1\ldots k+1} \right) = \sum_{\nu=1}^n {\partial f\over\partial x_\nu} H_{\nu,\alpha}^{1\ldots k+1}, $$ where $\mathop{\rm col}$ just stacks matrices one above another and ${\rm Id}_n$ is the identity matrix in $R^{n\times n}$.

Proof: $$ d\omega = \left|\begin{matrix} \left\langle\nabla f,h_1\right\rangle & \ldots & \left\langle\nabla f,h_k\right\rangle & \left\langle\nabla f,h_{k+1}\right\rangle \cr \left\langle e_{i_1},h_1\right\rangle & \ldots & \left\langle e_{i_1},h_k\right\rangle & \left\langle e_{i_1},h_{k+1}\right\rangle \cr \vdots & \ddots & \vdots & \vdots \cr \left\langle e_{i_k},h_1\right\rangle & \ldots & \left\langle e_{i_k},h_k\right\rangle & \left\langle e_{i_k},h_{k+1}\right\rangle \cr \end{matrix}\right| = \sum_{\nu=1}^n {\partial f\over\partial x_\nu} \left|\begin{matrix} h_{1,\nu} & h_{1,i_1} & \ldots & h_{1,i_k}\cr \vdots & \vdots & \ddots & \vdots\cr h_{k,\nu} & h_{k,i_1} & \ldots & h_{k,i_k}\cr h_{k+1,\nu}& h_{k+1,i_1} & \ldots & h_{k+1,i_k}\cr \end{matrix}\right| $$ since $$ \left\langle\nabla f,h_j\right\rangle = \sum_{\nu=1}^n {\partial f\over\partial x_\nu} h_{j,\nu}, \qquad j=1,\ldots,k+1, $$ and the determinant is linear in its first row (the resulting determinants are written transposed, which does not change their value).
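A quick numerical spot check of the two expressions for $d\omega$ (my own example: $n=3$, $k=1$, $f(x)=x_1x_2+x_3^2$ with gradient $(x_2,x_1,2x_3)$, and a random $H$):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 3, 1
alpha = [1]                                  # alpha = (2) in the text's 1-based counting
H = rng.standard_normal((n, n + 1))
xv = rng.standard_normal(n)
grad = np.array([xv[1], xv[0], 2 * xv[2]])   # gradient of f(x) = x1*x2 + x3^2

top = np.vstack([grad, np.eye(n)[alpha]])    # col(grad f, [Id_n]_alpha)
lhs = np.linalg.det(top @ H[:, :k + 1])
rhs = sum(grad[nu] * np.linalg.det(H[np.ix_([nu] + alpha, range(k + 1))])
          for nu in range(n))
print(np.isclose(lhs, rhs))                  # True
```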
