Grassmann numbers and fields

Grassmann variables

So-called Grassmann variables are anti-commuting generators \(\theta_i\) of an algebra, satisfying \[\theta_i \theta_j + \theta_j \theta_i = 0.\] An immediate consequence is that \({\theta_j}^2 = 0\).
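The anti-commutation rule is easy to encode on a computer. The following Python sketch (a minimal representation of our own, not a standard library) stores a monomial as a tuple of generator indices together with a sign, and multiplies monomials by sorting the combined index list while counting transpositions:

```python
# A monomial is stored as (sign, indices); e.g. theta_1 theta_2 is (1, (1, 2)).
def normalize(indices):
    """Sort generator indices, tracking the permutation sign.
    Returns (0, ()) if a generator repeats, since theta_j^2 = 0."""
    idx = list(indices)
    sign = 1
    for i in range(len(idx)):                 # bubble sort, counting swaps
        for j in range(len(idx) - 1 - i):
            if idx[j] > idx[j + 1]:
                idx[j], idx[j + 1] = idx[j + 1], idx[j]
                sign = -sign
    if len(set(idx)) != len(idx):
        return 0, ()
    return sign, tuple(idx)

def mul(a, b):
    """Product of two monomials a = (sign_a, idx_a), b = (sign_b, idx_b)."""
    (sa, ia), (sb, ib) = a, b
    s, idx = normalize(ia + ib)
    return sa * sb * s, idx

theta1, theta2 = (1, (1,)), (1, (2,))
print(mul(theta1, theta2))   # (1, (1, 2))
print(mul(theta2, theta1))   # (-1, (1, 2)):  theta_2 theta_1 = -theta_1 theta_2
print(mul(theta1, theta1))   # (0, ()):       theta_1^2 = 0
```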

Basis

If there is a finite set of generators \({\theta_1, \theta_2, \ldots, \theta_n},\) one can write general elements of the Grassmann algebra as a linear superposition (with coefficients that are ordinary complex (or real) numbers) of the following basis elements \[\begin{aligned} & 1, \\ & \theta_1, \theta_2, \ldots, \theta_n, \\ & \theta_1\theta_2, \theta_1\theta_3,\ldots, \theta_{2} \theta_3, \theta_{2} \theta_4,\ldots, \theta_{n-1} \theta_n, \\ & \ldots \\ & \theta_1\theta_2\theta_3 \cdots \theta_n. \end{aligned}\] There are \(2^n\) such basis elements, because each Grassmann variable \(\theta_j\) can be either present or absent.
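The counting of basis elements can be made explicit: the basis monomials are in one-to-one correspondence with the subsets of \(\{\theta_1, \ldots, \theta_n\}\). A short Python check:

```python
# Each basis monomial corresponds to a subset of the generator indices
# {1, ..., n}: every generator is either present or absent.
from itertools import combinations

n = 4
basis = [mono for q in range(n + 1)
         for mono in combinations(range(1, n + 1), q)]
print(len(basis))   # 2**4 = 16
```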

Grade of monomial

To a monomial \(\theta_{j_1}\cdots\theta_{j_q}\) one can associate a grade \(q\) which counts the number of generators in the monomial. For \(A_p\) and \(A_q\) being two such monomials one has \[A_p A_q = (-1)^{p \cdot q} A_q A_p.\] In particular, the monomials of even grade \[\begin{aligned} & 1, \\ & \theta_1\theta_2, \theta_1\theta_3,\ldots, \theta_{2} \theta_3,\ldots, \theta_{n-1} \theta_n,\\ & \ldots \end{aligned}\] commute with all other monomials, whether the latter are of even or odd grade.
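The sign rule \((-1)^{p \cdot q}\) is purely combinatorial: moving the \(q\) generators of \(A_q\) past the \(p\) generators of \(A_p\) requires \(p \cdot q\) transpositions. A small Python check (our own sketch) confirms this by computing permutation signs directly:

```python
# Moving a block of q elements past a block of p elements requires p*q
# transpositions; the permutation sign is therefore (-1)^(p*q).
def inversion_sign(perm):
    """Sign of a permutation, computed by counting inversions."""
    sign = 1
    for i in range(len(perm)):
        for j in range(i + 1, len(perm)):
            if perm[i] > perm[j]:
                sign = -sign
    return sign

for p in range(5):
    for q in range(5):
        # permutation that moves the last q slots in front of the first p slots
        perm = list(range(p, p + q)) + list(range(p))
        assert inversion_sign(perm) == (-1) ** (p * q)
print("sign rule verified")
```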

Grassmann parity

One can define a Grassmann parity transformation \(P\) that acts on all generators according to \[P(\theta_j) = -\theta_j, \quad\quad\quad P^2 = \mathbb{1}.\] Monomials of even grade are even, monomials of odd grade are odd under this transformation. The parity-even part of the algebra, spanned by the monomials of even grade, constitutes a sub-algebra. Because its elements commute with all other elements of the algebra they behave in a “bosonic” way, while elements of the Grassmann algebra that are odd with respect to \(P\) behave in a “fermionic” way.

Functions of Grassmann variables

Because of \(\theta^2 = 0\), functions of a Grassmann variable \(\theta\) are always linear, \[f(\theta) = f_0 + \theta f_1.\] Note that \(f_0\) and \(f_1\) could depend on other Grassmann variables but not \(\theta\).

Differentiation for Grassmann variables

To define differentiation of \(f(\theta)\) with respect to \(\theta\) we first bring it to the form \[f(\theta) = f_0 + \theta f_1\] and then set \[\frac{\partial}{\partial\theta} f(\theta) = f_1.\] Note that similar to \(\theta^2 = 0\) one has also \(\left(\frac{\partial}{\partial\theta}\right)^2 = 0\). One may verify that the chain rule applies. Take \(\sigma(\theta)\) to be an odd element and \(x(\theta)\) an even element of the Grassmann algebra. One has then \[\frac{\partial}{\partial\theta} f(\sigma(\theta), x(\theta)) = \frac{\partial \sigma}{\partial \theta} \frac{\partial f}{\partial \sigma} + \frac{\partial x}{\partial \theta} \frac{\partial f}{\partial x}.\] The derivative we use here is a left derivative.

Consider for example \[f= f_0 + \theta_1 \theta_2.\] One has then \[\begin{split} \frac{\partial}{\partial \theta_1}f = \theta_2, & \quad\quad \frac{\partial}{\partial \theta_2} f = -\theta_1, \\ \frac{\partial}{\partial \theta_2}\frac{\partial}{\partial \theta_1} f = 1, & \quad\quad \frac{\partial}{\partial \theta_1}\frac{\partial}{\partial \theta_2} f = -1. \end{split}\] One could also define a right derivative such that \[f\frac{\overleftarrow{\partial}}{\partial \theta_1} = -\theta_2,\quad\quad f\frac{\overleftarrow{\partial}}{\partial \theta_2} = \theta_1.\]
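This example can be reproduced in a few lines of Python (our own sketch: a function of Grassmann variables is stored as a dict mapping sorted generator-index tuples to coefficients):

```python
# A function of Grassmann variables is stored as {sorted index tuple: coeff},
# e.g. f = f0 + theta1 theta2 becomes {(): f0, (1, 2): 1}.
def dtheta(f, k):
    """Left derivative: move theta_k to the front (one sign flip per
    generator it passes), then strip it."""
    out = {}
    for mono, coeff in f.items():
        if k in mono:
            pos = mono.index(k)               # generators theta_k must pass
            rest = mono[:pos] + mono[pos + 1:]
            out[rest] = out.get(rest, 0) + (-1) ** pos * coeff
    return out

f = {(): 3, (1, 2): 1}           # f = f0 + theta1 theta2, with f0 = 3
print(dtheta(f, 1))              # {(2,): 1}    d/dtheta1 f = theta2
print(dtheta(f, 2))              # {(1,): -1}   d/dtheta2 f = -theta1
print(dtheta(dtheta(f, 1), 2))   # {(): 1}      d/dtheta2 d/dtheta1 f = 1
```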

Integration for Grassmann variables

To define integration for Grassmann variables one takes orientation from two properties of integrals from \(-\infty\) to \(\infty\) for ordinary numbers. One such property is linearity, \[\int_{-\infty}^\infty dx \; c \; f(x) = c \int_{-\infty}^\infty dx \; f(x).\] The other is invariance under shifts of the integration variable, \[\int_{-\infty}^{\infty} dx \; f(x+a) = \int_{-\infty}^\infty dx\;f(x).\] For a function of a Grassmann variable, \[f(\theta) = f_0 + \theta f_1,\] one therefore sets \[\int d\theta \; f(\theta) = f_1.\] In other words, we have defined \[\int d\theta \; 1 = 0, \quad\quad\quad \int d\theta\;\theta = 1.\] This is indeed linear and makes sure that \[\int d\theta\; f(\theta+\sigma) = \int d\theta \left\{(f_0 + \sigma f_1) + \theta f_1\right\} = \int d\theta\;f(\theta) = f_1.\] Note that one has formally \[\int d\theta\;f(\theta) = \frac{\partial}{\partial \theta} f(\theta).\]
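The shift invariance can be checked explicitly in a small Python sketch (our own representation: a function is a dict mapping sorted generator-index tuples to coefficients, the shift parameter \(\sigma\) being a second generator):

```python
# Berezin integration: keep only monomials containing theta_k, with the sign
# from moving theta_k to the front (formally the same as the left derivative).
def integrate(f, k):
    out = {}
    for mono, coeff in f.items():
        if k in mono:
            pos = mono.index(k)
            rest = mono[:pos] + mono[pos + 1:]
            out[rest] = out.get(rest, 0) + (-1) ** pos * coeff
    return out

# f(theta) = f0 + theta f1, with theta = index 1 and sigma = index 2:
f0, f1 = 5, 7
f         = {(): f0, (1,): f1}
f_shifted = {(): f0, (1,): f1, (2,): f1}    # f(theta + sigma)
print(integrate(f, 1))          # {(): 7}
print(integrate(f_shifted, 1))  # {(): 7}  -- shift invariance
```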

Several variables

For functions of several variables one has \[\int d\theta_1 \int d\theta_2 f(\theta_1, \theta_2) = \frac{\partial}{\partial \theta_1} \frac{\partial}{\partial \theta_2} f(\theta_1, \theta_2).\] It is easy to see that derivatives with respect to Grassmann variables anti-commute \[\frac{\partial}{\partial \theta_j} \frac{\partial}{\partial \theta_k} = - \frac{\partial}{\partial \theta_k} \frac{\partial}{\partial \theta_j},\] and accordingly also the differentials anti-commute \[d\theta_j d\theta_k = -d\theta_k d\theta_j.\]

Functions of several Grassmann variables

A function that depends on a set of Grassmann variables \(\theta_1,\ldots,\theta_n\) can be written as \[f(\theta) = f_0 + \theta_j f^j_1 + \frac{1}{2} \theta_{j_1} \theta_{j_2} f_2^{j_1\;j_2}+ \ldots+ \frac{1}{n!} \theta_{j_1}\cdots\theta_{j_n} f_n^{j_1 \cdots j_n}.\] We use here Einstein's summation convention with indices \(j_k\) being summed over. The coefficients \(f_k^{j_1\cdots j_k}\) are completely anti-symmetric with respect to the interchange of any pair of indices. In particular, the last coefficient can only be of the form \[f_n^{j_1 \cdots j_n} = \tilde{f}_n \varepsilon_{j_1\cdots j_n},\] where \(\varepsilon_{j_1\cdots j_n}\) is the completely anti-symmetric Levi-Civita symbol in \(n\) dimensions with \(\varepsilon_{12\ldots n} =1.\)
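As an illustration, for \(n = 2\) the expansion contains only \[f(\theta) = f_0 + \theta_1 f_1^1 + \theta_2 f_1^2 + \tfrac{1}{2}\left(\theta_1\theta_2\, f_2^{12} + \theta_2\theta_1\, f_2^{21}\right) = f_0 + \theta_1 f_1^1 + \theta_2 f_1^2 + \theta_1\theta_2\, f_2^{12},\] where the last step uses \(\theta_2\theta_1 = -\theta_1\theta_2\) together with the anti-symmetry \(f_2^{21} = -f_2^{12}\).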

Differentiation and integration

Let us now discuss what happens if we differentiate or integrate \(f(\theta)\). One has \[\frac{\partial}{\partial \theta_k} f(\theta) = f_1^k + \theta_{j_2} f_2^{k j_2} + \ldots + \frac{1}{(n-1)!} \theta_{j_2}\cdots \theta_{j_n} f_n^{k j_2 \cdots j_n},\] and similarly for higher-order derivatives. In particular \[\frac{\partial}{\partial \theta_n}\cdots \frac{\partial}{\partial \theta_1}f(\theta) = f_n^{12\ldots n}= \tilde{f}_n.\] This also defines the integral with respect to all \(n\) variables, \[\begin{split} & \int d\theta_n\cdots d\theta_1 f(\theta) = f_n^{12\ldots n} = \tilde{f}_n \\ & = \int d^n \theta f(\theta) = \int D\theta f(\theta). \end{split}\]

Linear change of Grassmann variables

Let us consider a linear change of the Grassmann variables in the form (summation over \(k\) is implied) \[\theta_j = J_{jk}\theta^{\prime}_{k},\] where \(J_{jk}\) is a matrix of commuting variables. We can write \[f(\theta) = f_0 + \ldots + \frac{1}{n!}\left(J_{i_1 j_1} \theta^{\prime}_{j_1} \right) \cdots \left(J_{i_n j_n}\theta^{\prime}_{j_n} \right) \, \varepsilon_{i_1\cdots i_n}\tilde{f}_n.\] Now one can use the identity \[\varepsilon_{i_1\ldots i_n} J_{i_1 j_1} \cdots J_{i_n j_n}= \det(J) \, \varepsilon_{j_1\ldots j_n}.\] This can actually be seen as the definition of the determinant. One can therefore write \[f(\theta) = f_0 + \ldots + \frac{1}{n!} \theta^{\prime}_{j_1}\cdots\theta^{\prime}_{j_n}\varepsilon_{j_1 \ldots j_n} \det(J) \tilde{f}_n.\] The integral with respect to \(\theta^{\prime}\) is \[\int d^n \theta^{\prime} f(\theta) = \det(J) \tilde{f}_{n}.\] In summary, one has \[\int d^n \theta f(\theta) = \frac{1}{\det(J)} \int d^n \theta^{\prime} f(\theta).\]
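The \(\varepsilon\)-identity can be verified numerically for small \(n\); the following Python sketch (helper names are our own) checks it for \(n = 3\) against NumPy's determinant:

```python
# Check  eps_{i1..in} J_{i1 j1} ... J_{in jn} = det(J) eps_{j1..jn}  for n = 3.
import itertools
import numpy as np

def levi_civita(idx):
    """Sign of the index sequence idx, or 0 if an index repeats."""
    idx = list(idx)
    sign = 1
    for i in range(len(idx)):
        for j in range(i + 1, len(idx)):
            if idx[i] == idx[j]:
                return 0
            if idx[i] > idx[j]:
                sign = -sign
    return sign

n = 3
rng = np.random.default_rng(0)
J = rng.normal(size=(n, n))

for js in itertools.product(range(n), repeat=n):
    lhs = sum(
        levi_civita(i_s) * np.prod([J[i_s[a], js[a]] for a in range(n)])
        for i_s in itertools.permutations(range(n))
    )
    assert np.isclose(lhs, np.linalg.det(J) * levi_civita(js))
print("epsilon identity verified for n = 3")
```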

Linear change of ordinary variables

One should compare this to the corresponding relation for conventional integrals with \(x_j = J_{jk} x^{\prime}_{k}\). In that case one has \[\int d^n x f(x) = \det(J) \int d^n x^{\prime} f(x^\prime).\] Note that the determinant appears in the denominator for Grassmann variables while it appears in the numerator for conventional integrals.

Gaussian integrals of Grassmann variables

Consider a Gaussian integral of two Grassmann variables \[\int d\theta d\xi \, e^{-\theta \xi b} = \int d\theta d\xi \, (1-\theta\xi b) = \int d\theta d\xi \,(1+\xi\theta b) = b.\] For a Gaussian integral over conventional complex variables one has instead \[\int d(\text{Re}\, x)\; d(\text{Im} \, x) \, e^{-x^* x b} = \frac{\pi}{b}.\] Again, integrals over Grassmann and ordinary variables behave in some sense “inverse”.

Higher dimensional Gaussian integrals

For higher dimensional Gaussian integrals over Grassmann numbers we write \[\int d^n\theta d^n \xi e^{-\theta_j a_{jk}\xi_k} = \int d\theta_n d\xi_n \cdots d\theta_1 d\xi_1 e^{-\theta_j a_{jk} \xi_k}.\] One can now employ two unitary matrices with unit determinant to perform a change of variables \[\theta_j = \theta^{\prime}_{l} U_{l j},\quad\quad\quad \xi_k = V_{km}\xi^{\prime}_{m},\] such that \[U_{l j} a_{j k} V_{km} = \tilde{a}_l \delta_{l m}\] is diagonal. This is always possible. The Gaussian integral becomes \[\int d^n \theta d^n \xi \, e^{-\theta_{j} a_{j k} \xi_{k}} = \det(U)^{-1} \det(V)^{-1} \int d^n \theta^{\prime} d^n \xi^{\prime} e^{-\theta^{\prime}_{l} \xi^{\prime}_{l} \tilde{a}_{l}} = \prod^n_{l=1} \tilde{a}_l = \det(a_{j k}).\] Again this is in contrast to a similar integral over commuting variables, where the determinant would appear in the denominator.
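The determinant formula can also be verified directly by expanding the exponential in a small Grassmann algebra. The following Python sketch (helper functions are our own) does this for \(n = 3\) and compares with NumPy's determinant:

```python
# Verify  int d^n theta d^n xi  e^{-theta_j a_jk xi_k} = det(a)  for n = 3
# by brute-force expansion in the Grassmann algebra with 2n generators.
import math
import numpy as np

def gr_mul(f, g):
    """Product of Grassmann polynomials stored as {index tuple: coefficient}."""
    out = {}
    for ma, ca in f.items():
        for mb, cb in g.items():
            idx = list(ma + mb)
            if len(set(idx)) != len(idx):
                continue                      # a repeated generator gives zero
            sign = 1
            for i in range(len(idx)):         # bubble sort, counting swaps
                for j in range(len(idx) - 1 - i):
                    if idx[j] > idx[j + 1]:
                        idx[j], idx[j + 1] = idx[j + 1], idx[j]
                        sign = -sign
            key = tuple(idx)
            out[key] = out.get(key, 0.0) + sign * ca * cb
    return out

def integrate(f, k):
    """Berezin integral over generator k (left convention)."""
    out = {}
    for mono, coeff in f.items():
        if k in mono:
            pos = mono.index(k)
            rest = mono[:pos] + mono[pos + 1:]
            out[rest] = out.get(rest, 0.0) + (-1) ** pos * coeff
    return out

n = 3
rng = np.random.default_rng(1)
a = rng.normal(size=(n, n))

# Generators: theta_j has index j, xi_k has index n + k (j, k = 0, ..., n-1).
quad = {(j, n + k): -a[j, k] for j in range(n) for k in range(n)}

# exp(x) = sum_q x^q / q!  truncates at q = n for this grade-2 element x.
expo, power = {(): 1.0}, {(): 1.0}
for q in range(1, n + 1):
    power = gr_mul(power, quad)
    for mono, coeff in power.items():
        expo[mono] = expo.get(mono, 0.0) + coeff / math.factorial(q)

# Measure d theta_n d xi_n ... d theta_1 d xi_1: innermost integral acts first.
result = expo
for j in range(n):
    result = integrate(result, n + j)   # d xi_{j+1}
    result = integrate(result, j)       # d theta_{j+1}

print(result.get((), 0.0), np.linalg.det(a))   # the two numbers agree
```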

Gaussian integrals with sources

Finally let us consider a Gaussian integral with source terms, \[\int d^n \bar{\psi} d^n \psi \; \exp \left[-\bar{\psi}M \psi + \bar{\eta} \psi + \bar{\psi} \eta \right]= Z(\bar\eta, \eta).\] We integrate here over independent Grassmann variables \(\psi = (\psi_1, \ldots, \psi_n)\) and \(\bar{\psi} = (\bar{\psi}_1, \ldots, \bar{\psi}_n)\), and we use the abbreviation \[\bar{\psi} M \psi = \bar{\psi}_j M_{jk} \psi_k.\] The sources are also Grassmann variables, \(\eta = (\eta_1, \ldots, \eta_n)\) and \(\bar{\eta} = (\bar{\eta}_1, \ldots , \bar{\eta}_n)\), with \[\bar{\eta} \psi = \bar{\eta}_j \psi_j, \quad\quad\quad \bar{\psi}\eta = \bar{\psi}_j \eta_j .\] As usual, we can complete the square, \[Z(\bar \eta, \eta) = \int d^n \bar{\psi} d^n \psi \; \exp \left[-(\bar\psi - \bar{\eta} M^{-1}) M (\psi - M^{-1} \eta) +\bar{\eta}M^{-1} \eta \right].\] A shift of integration variables does not change the result and thus we find \[Z(\bar{\eta}, \eta) = \det(M) \exp\left[\bar{\eta}M^{-1} \eta\right].\] In this sense, Gaussian integrals over Grassmann variables can be manipulated similarly to Gaussian integrals over commuting variables. Note again that \(\det(M)\) appears in the numerator, while it would appear in the denominator for bosonic variables.
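That the square is indeed completed correctly can be checked by expanding, keeping the order of all Grassmann factors, \[-(\bar\psi - \bar{\eta} M^{-1}) M (\psi - M^{-1}\eta) = -\bar{\psi} M \psi + \bar{\psi}\eta + \bar{\eta}\psi - \bar{\eta} M^{-1} \eta,\] so that adding \(\bar{\eta} M^{-1}\eta\) reproduces the exponent of the original integral.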

Functional integral over Grassmann fields

We can now take the limit \(n \to \infty\) and write \[\int d^n \bar{\psi} d^n \psi \to \int D\bar{\psi} D\psi, \quad\quad\quad Z(\bar{\eta},\eta) \to Z[\bar{\eta}, \eta],\] with \[Z[\bar{\eta}, \eta] = \int D\bar{\psi}D\psi \; \exp [-\bar{\psi} M \psi + \bar{\eta}\psi +\bar{\psi}\eta] = \det(M) \exp\left[\bar{\eta} M^{-1} \eta \right].\] In this way we obtain a formalism that can be used for fermionic or Grassmann fields.

Action for free non-relativistic fermions

We can now write down an action for non-relativistic fermions with spin \(1/2\). It looks similar to what we have conjectured before, \[S_2 = \int dt d^3 x \left\{-\bar{\psi}\left[\left(-i\partial_t - \tfrac{\boldsymbol{\nabla}^2}{2m}+ V_0\right) \mathbb{1} +\mu_B \boldsymbol{\sigma} \cdot \mathbf{B}\right]\psi \right\},\] but the two-component fields \(\psi = (\psi_1, \psi_2)\) and \(\bar{\psi} = (\bar{\psi}_1, \bar{\psi}_2)\) are in fact \(\textit{Grassmann fields}\). Such fields anti-commute, for example \(\psi_1(x) \psi_2 (y) = -\psi_2(y) \psi_1 (x)\). One should regard the fields at different space-time positions \(x\) as independent Grassmann numbers. Also, \(\psi_1\) and \(\bar{\psi}_1\) are independent as Grassmann fields. In particular \(\psi_1(x)^2 = 0\) but \(\bar{\psi}_1(x) \psi_1(x) \neq 0\).
