# A (hopefully) gentle guide to the computer implementation of molecular integrals over Gaussian basis functions.

**Note: You can find the following integral routines implemented in a working Hartree-Fock code here.**

*Special thank you to Joy Alamgir for pointing out several corrections to this post (and PDF)!*

In quantum chemistry, we are often interested in evaluating integrals over Gaussian basis functions. Here I am going to take much of the theory for granted and talk about how one might actually implement the integrals for quantum chemistry by using recursion relationships with Hermite Gaussians. Our goal is to have clean, readable code. I’m writing this for people who are comfortable with the mathematics behind the Gaussian integrals, but want to see a readable computer implementation. I’ll talk a bit about some computational considerations at the end, but my goal is to convert equations to code. In fact, I’ve tried to structure the equations and the code in such a way that the two look very similar.

For a very good overview of integral evaluation, please see:

**Helgaker, Trygve, and Peter R. Taylor. “Gaussian basis sets and
molecular integrals.” Modern Electronic Structure Theory (1995).**

I will try and follow the notation used in the above reference.

# Mathematical preliminaries

Let’s start with some of the basics. First, we have our 3D Gaussian functions

\[G_{ijk}(\mathbf{r}, \alpha, \mathbf{A}) = x_A^i y_A^j z_A^k \hbox{exp}(-\alpha r_A^2)\]with orbital exponent \(\alpha\), electronic coordinates \(\mathbf{r}\), origin \(\mathbf{A}\), and

\[\mathbf{r}_A = \mathbf{r} - \mathbf{A}.\]Here \(i,j,k\) are the angular quantum numbers (e.g. \(i=0\) is \(s\)-type, \(i=1\) is \(p\)-type, etc.). Cartesian Gaussians are separable in 3D along \(x,y,z\), so that

\[G_{ijk}(\mathbf{r}, \alpha, \mathbf{A}) = G_i(x,\alpha,A_x)G_j(y,\alpha,A_y)G_k(z,\alpha,A_z)\]with the 1D Gaussian

\[G_i(x,\alpha,A_x) = (x - A_x)^i \hbox{exp}(-\alpha(x-A_x)^2).\]So far, so good. Let’s consider the overlap integral of two 1D Gaussians, \(a\) and \(b\)

\[\begin{align} S_{ab} &= \int G_i(x,\alpha,A_x) G_j(x,\beta,B_x) dx \\ &= \int K_{AB} x^i_A x^j_B \hbox{exp}(-p x_P^2) dx\end{align}\]where we used the Gaussian product theorem so that

\[\begin{align} K_{AB} &= \hbox{exp}(-q Q_x^2),\\ Q_x &= A_x - B_x, \\ q &= \frac{\alpha\beta}{\alpha + \beta}, \\ p &= \alpha + \beta,\quad \hbox{and}, \\ P_x &= \frac{1}{p}\left(\alpha A_x + \beta B_x\right).\end{align}\]When using Hermite Gaussians, we can express \(S_{ab}\) as

\[\begin{align} S_{ab} &= \int \sum\limits_{t=0}^{i+j} E_{t}^{ij} \Lambda_t dx \\ &= \sum\limits_{t=0}^{i+j} E_{t}^{ij} \int \Lambda_t dx \\ &= \sum\limits_{t=0}^{i+j} E_{t}^{ij} \delta_{t0} \sqrt{\frac{\pi}{p}} \\ &= E_{0}^{ij} \sqrt{\frac{\pi}{p}} \end{align}\]where \(E_{t}^{ij}\)
are expansion coefficients (to be determined recursively) and \(\Lambda_t\) is a Hermite Gaussian, arising from the overlap distribution of the two Gaussians \(a\) and \(b\). Its integral has a simple expression that kills the sum *via* the Kronecker delta \(\delta_{t0}\). It can be shown that the expansion coefficients can be defined using the following recursive definitions

\[\begin{align} E_t^{i,j+1} &= \frac{1}{2p}E_{t-1}^{ij} + \frac{qQ_x}{\beta}E_t^{ij} + (t+1)E_{t+1}^{ij}, \\ E_t^{i+1,j} &= \frac{1}{2p}E_{t-1}^{ij} - \frac{qQ_x}{\alpha}E_t^{ij} + (t+1)E_{t+1}^{ij}, \\ E_0^{00} &= K_{AB}, \\ E_t^{ij} &= 0 \quad \hbox{if} \quad t < 0 \ \hbox{or} \ t > i+j. \end{align}\]

The first equation gives us a way to reduce the index \(j\) and the second gives us a way to reduce the index \(i\), so that we can get to the third equation, which is our base case. The last equation tells us what to do if we go out of index bounds.

# Overlap integrals

The first thing we need to do is implement a function `E` which returns our expansion coefficients \(E_t^{ij}\). Aside from the angular momenta \(i\) and \(j\) of the Gaussian functions, we also need the distance between Gaussians \(Q_x\) and the orbital exponents \(\alpha\) and \(\beta\) as inputs.
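Translating the recursion directly into `Python`, one possible sketch looks like this:

```python
import numpy as np

def E(i, j, t, Qx, a, b):
    """Recursive definition of Hermite Gaussian expansion coefficients E_t^{ij}.
       i, j: angular momentum on Gaussians 'a' and 'b'
       t:    number of nodes in the Hermite Gaussian
       Qx:   distance between the Gaussian origins, A_x - B_x
       a, b: orbital exponents on Gaussians 'a' and 'b'
    """
    p = a + b
    q = a * b / p
    if (t < 0) or (t > (i + j)):
        # out of bounds for t
        return 0.0
    elif i == j == t == 0:
        # base case: E_0^{00} = K_AB
        return np.exp(-q * Qx * Qx)
    elif j == 0:
        # decrement index i
        return ((1 / (2 * p)) * E(i - 1, j, t - 1, Qx, a, b)
                - (q * Qx / a) * E(i - 1, j, t, Qx, a, b)
                + (t + 1) * E(i - 1, j, t + 1, Qx, a, b))
    else:
        # decrement index j
        return ((1 / (2 * p)) * E(i, j - 1, t - 1, Qx, a, b)
                + (q * Qx / b) * E(i, j - 1, t, Qx, a, b)
                + (t + 1) * E(i, j - 1, t + 1, Qx, a, b))
```

The four branches map one-to-one onto the four recursive definitions: out of bounds, base case, reduce \(i\), and reduce \(j\).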

This is simple enough! So for a 1D overlap between two Gaussians we would just need to evaluate \(E_0^{ij}\) and multiply it by \(\sqrt{\frac{\pi}{p}}\). Overlap integrals in 3D are just a product of the \(x,y,z\) 1D overlaps. We could imagine a 3D `overlap` function like so
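One possible sketch (the `E` routine from the previous section is repeated so the snippet is self-contained):

```python
import numpy as np

def E(i, j, t, Qx, a, b):
    # Hermite expansion coefficients E_t^{ij}, repeated from above
    p = a + b
    q = a * b / p
    if (t < 0) or (t > (i + j)):
        return 0.0
    elif i == j == t == 0:
        return np.exp(-q * Qx * Qx)
    elif j == 0:
        return ((1 / (2 * p)) * E(i - 1, j, t - 1, Qx, a, b)
                - (q * Qx / a) * E(i - 1, j, t, Qx, a, b)
                + (t + 1) * E(i - 1, j, t + 1, Qx, a, b))
    else:
        return ((1 / (2 * p)) * E(i, j - 1, t - 1, Qx, a, b)
                + (q * Qx / b) * E(i, j - 1, t, Qx, a, b)
                + (t + 1) * E(i, j - 1, t + 1, Qx, a, b))

def overlap(a, lmn1, A, b, lmn2, B):
    """Overlap between two Gaussian primitives.
       a, b:       orbital exponents
       lmn1, lmn2: angular momentum tuples, e.g. (1,0,0) for p_x
       A, B:       Gaussian origins [x, y, z]
    """
    l1, m1, n1 = lmn1
    l2, m2, n2 = lmn2
    S1 = E(l1, l2, 0, A[0] - B[0], a, b)  # x overlap
    S2 = E(m1, m2, 0, A[1] - B[1], a, b)  # y overlap
    S3 = E(n1, n2, 0, A[2] - B[2], a, b)  # z overlap
    return S1 * S2 * S3 * np.power(np.pi / (a + b), 1.5)
```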

Note that we are using the `NumPy` package in order to take advantage of the definitions of \(\pi\) and the fractional power of \(3/2\). The above two functions `overlap` and `E` are enough to get us the overlap between two Gaussian functions (primitives), but most basis functions are contracted, meaning they are the sum of multiple Gaussian primitives. It is not too difficult to account for this, and we can finally wrap up our evaluation of overlap integrals with a function `S(a,b)` which returns the overlap integral between two *contracted* Gaussian functions.
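A sketch of `S(a,b)`, assuming the basis function objects carry `exps`, `coefs`, per-primitive `norm` constants, an angular momentum tuple `shell`, and an `origin` (the attributes the `BasisFunction` class below provides). `E` and `overlap` are repeated so the snippet runs on its own, and the `SimpleNamespace` object at the end is just a hypothetical one-primitive stand-in:

```python
import numpy as np
from types import SimpleNamespace

def E(i, j, t, Qx, a, b):
    # Hermite expansion coefficients, repeated from the overlap section
    p = a + b
    q = a * b / p
    if (t < 0) or (t > (i + j)):
        return 0.0
    elif i == j == t == 0:
        return np.exp(-q * Qx * Qx)
    elif j == 0:
        return ((1 / (2 * p)) * E(i - 1, j, t - 1, Qx, a, b)
                - (q * Qx / a) * E(i - 1, j, t, Qx, a, b)
                + (t + 1) * E(i - 1, j, t + 1, Qx, a, b))
    else:
        return ((1 / (2 * p)) * E(i, j - 1, t - 1, Qx, a, b)
                + (q * Qx / b) * E(i, j - 1, t, Qx, a, b)
                + (t + 1) * E(i, j - 1, t + 1, Qx, a, b))

def overlap(a, lmn1, A, b, lmn2, B):
    # primitive overlap, repeated from the previous snippet
    l1, m1, n1 = lmn1
    l2, m2, n2 = lmn2
    return (E(l1, l2, 0, A[0] - B[0], a, b) *
            E(m1, m2, 0, A[1] - B[1], a, b) *
            E(n1, n2, 0, A[2] - B[2], a, b) *
            np.power(np.pi / (a + b), 1.5))

def S(a, b):
    """Overlap between two contracted Gaussians."""
    s = 0.0
    for ia, ca in enumerate(a.coefs):
        for ib, cb in enumerate(b.coefs):
            s += (a.norm[ia] * b.norm[ib] * ca * cb *
                  overlap(a.exps[ia], a.shell, a.origin,
                          b.exps[ib], b.shell, b.origin))
    return s

# quick check with a single normalized s primitive, norm = (2a/pi)^(3/4)
prim = SimpleNamespace(shell=(0, 0, 0), origin=[0.0, 0.0, 0.0],
                       exps=[1.0], coefs=[1.0], norm=[(2.0 / np.pi) ** 0.75])
s_aa = S(prim, prim)
```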

Basically, this is just a sum over primitive overlaps, weighted by normalization and contraction coefficient. A word is in order for the arguments, however. In order to keep down the number of arguments we have to pass into our functions, we have created `BasisFunction` objects that contain all the relevant data for the basis function, including exponents, normalization constants, and so on.

So, for example if we had a STO-3G Hydrogen 1s at origin (1.0, 2.0, 3.0), we could create a basis function object for it like so
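One possible sketch of the `BasisFunction` class, together with the hydrogen example. The per-primitive norms and the overall contraction scaling come from the standard closed-form self-overlap of a Cartesian Gaussian; `fact2` is a small double-factorial helper with the \((-1)!!=1\) convention. The exponents and coefficients are the published EMSL STO-3G values for hydrogen:

```python
import numpy as np

def fact2(n):
    # double factorial n!!, with the convention (-1)!! = 1
    return 1 if n <= 0 else n * fact2(n - 2)

class BasisFunction(object):
    """A contracted Gaussian basis function.
       origin: center of the basis function, e.g. [x, y, z]
       shell:  angular momentum tuple, e.g. (0,0,0) for s
       exps:   list of primitive Gaussian exponents
       coefs:  list of contraction coefficients
    """
    def __init__(self, origin=(0.0, 0.0, 0.0), shell=(0, 0, 0),
                 exps=(), coefs=()):
        self.origin = np.asarray(origin, dtype=float)
        self.shell = shell
        self.exps = list(exps)
        self.coefs = list(coefs)
        self.norm = None
        self.normalize()

    def normalize(self):
        l, m, n = self.shell
        L = l + m + n
        # normalization constant for each primitive
        self.norm = [np.sqrt(2 ** (2 * L + 1.5) * a ** (L + 1.5) /
                             (fact2(2 * l - 1) * fact2(2 * m - 1) *
                              fact2(2 * n - 1) * np.pi ** 1.5))
                     for a in self.exps]
        # now rescale the contraction coefficients so <a|a> = 1
        prefactor = (np.pi ** 1.5 * fact2(2 * l - 1) * fact2(2 * m - 1) *
                     fact2(2 * n - 1)) / 2.0 ** L
        N = 0.0
        for ia, ca in enumerate(self.coefs):
            for ib, cb in enumerate(self.coefs):
                N += (self.norm[ia] * self.norm[ib] * ca * cb /
                      (self.exps[ia] + self.exps[ib]) ** (L + 1.5))
        N = (prefactor * N) ** -0.5
        self.coefs = [c * N for c in self.coefs]

# STO-3G hydrogen 1s at origin (1.0, 2.0, 3.0)
a = BasisFunction(origin=[1.0, 2.0, 3.0], shell=(0, 0, 0),
                  exps=[3.42525091, 0.62391373, 0.16885540],
                  coefs=[0.15432897, 0.53532814, 0.44463454])
```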

where we used the EMSL STO-3G definition. So `S(a,a)` evaluates to `1.0`, since the overlap of a basis function with itself (appropriately normalized) is one.

# Kinetic energy integrals

Having finished the overlap integrals, we move on to the kinetic integrals. The kinetic energy integrals can be written in terms of overlap integrals

\[T_{ab} = -\frac{1}{2}\left[D_{ij}^2 S_{kl} S_{mn} + S_{ij}D_{kl}^2 S_{mn} + S_{ij} S_{kl} D_{mn}^2\right]\]where

\[D_{ij}^2 = j(j-1)S_{i,j-2} - 2\beta(2j +1)S_{ij} + 4\beta^2 S_{i,j+2}\]For a 3D primitive, we can form a `kinetic` function analogous to `overlap`,
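for example like this sketch, with `E` and `overlap` repeated from the overlap section so it runs standalone:

```python
import numpy as np

def E(i, j, t, Qx, a, b):
    # Hermite expansion coefficients, repeated from the overlap section
    p = a + b
    q = a * b / p
    if (t < 0) or (t > (i + j)):
        return 0.0
    elif i == j == t == 0:
        return np.exp(-q * Qx * Qx)
    elif j == 0:
        return ((1 / (2 * p)) * E(i - 1, j, t - 1, Qx, a, b)
                - (q * Qx / a) * E(i - 1, j, t, Qx, a, b)
                + (t + 1) * E(i - 1, j, t + 1, Qx, a, b))
    else:
        return ((1 / (2 * p)) * E(i, j - 1, t - 1, Qx, a, b)
                + (q * Qx / b) * E(i, j - 1, t, Qx, a, b)
                + (t + 1) * E(i, j - 1, t + 1, Qx, a, b))

def overlap(a, lmn1, A, b, lmn2, B):
    # primitive overlap, repeated from the overlap section
    l1, m1, n1 = lmn1
    l2, m2, n2 = lmn2
    return (E(l1, l2, 0, A[0] - B[0], a, b) *
            E(m1, m2, 0, A[1] - B[1], a, b) *
            E(n1, n2, 0, A[2] - B[2], a, b) *
            np.power(np.pi / (a + b), 1.5))

def kinetic(a, lmn1, A, b, lmn2, B):
    """Kinetic energy integral between two Gaussian primitives,
       built from overlaps with shifted angular momentum."""
    l2, m2, n2 = lmn2
    term0 = b * (2 * (l2 + m2 + n2) + 3) * overlap(a, lmn1, A, b, lmn2, B)
    term1 = -2 * b ** 2 * (overlap(a, lmn1, A, b, (l2 + 2, m2, n2), B) +
                           overlap(a, lmn1, A, b, (l2, m2 + 2, n2), B) +
                           overlap(a, lmn1, A, b, (l2, m2, n2 + 2), B))
    term2 = -0.5 * (l2 * (l2 - 1) * overlap(a, lmn1, A, b, (l2 - 2, m2, n2), B) +
                    m2 * (m2 - 1) * overlap(a, lmn1, A, b, (l2, m2 - 2, n2), B) +
                    n2 * (n2 - 1) * overlap(a, lmn1, A, b, (l2, m2, n2 - 2), B))
    return term0 + term1 + term2
```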

and for contracted Gaussians we make our final function `T(a,b)`.
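It follows the same primitive-contraction pattern as `S(a,b)`. A sketch (helpers repeated so the snippet runs on its own; the `SimpleNamespace` object at the end is a hypothetical stand-in for a `BasisFunction`):

```python
import numpy as np
from types import SimpleNamespace

def E(i, j, t, Qx, a, b):
    # Hermite expansion coefficients, repeated from the overlap section
    p = a + b
    q = a * b / p
    if (t < 0) or (t > (i + j)):
        return 0.0
    elif i == j == t == 0:
        return np.exp(-q * Qx * Qx)
    elif j == 0:
        return ((1 / (2 * p)) * E(i - 1, j, t - 1, Qx, a, b)
                - (q * Qx / a) * E(i - 1, j, t, Qx, a, b)
                + (t + 1) * E(i - 1, j, t + 1, Qx, a, b))
    else:
        return ((1 / (2 * p)) * E(i, j - 1, t - 1, Qx, a, b)
                + (q * Qx / b) * E(i, j - 1, t, Qx, a, b)
                + (t + 1) * E(i, j - 1, t + 1, Qx, a, b))

def overlap(a, lmn1, A, b, lmn2, B):
    # primitive overlap, repeated from the overlap section
    l1, m1, n1 = lmn1
    l2, m2, n2 = lmn2
    return (E(l1, l2, 0, A[0] - B[0], a, b) *
            E(m1, m2, 0, A[1] - B[1], a, b) *
            E(n1, n2, 0, A[2] - B[2], a, b) *
            np.power(np.pi / (a + b), 1.5))

def kinetic(a, lmn1, A, b, lmn2, B):
    # primitive kinetic energy integral, repeated from the previous snippet
    l2, m2, n2 = lmn2
    term0 = b * (2 * (l2 + m2 + n2) + 3) * overlap(a, lmn1, A, b, lmn2, B)
    term1 = -2 * b ** 2 * (overlap(a, lmn1, A, b, (l2 + 2, m2, n2), B) +
                           overlap(a, lmn1, A, b, (l2, m2 + 2, n2), B) +
                           overlap(a, lmn1, A, b, (l2, m2, n2 + 2), B))
    term2 = -0.5 * (l2 * (l2 - 1) * overlap(a, lmn1, A, b, (l2 - 2, m2, n2), B) +
                    m2 * (m2 - 1) * overlap(a, lmn1, A, b, (l2, m2 - 2, n2), B) +
                    n2 * (n2 - 1) * overlap(a, lmn1, A, b, (l2, m2, n2 - 2), B))
    return term0 + term1 + term2

def T(a, b):
    """Kinetic energy integral between two contracted Gaussians."""
    t = 0.0
    for ia, ca in enumerate(a.coefs):
        for ib, cb in enumerate(b.coefs):
            t += (a.norm[ia] * b.norm[ib] * ca * cb *
                  kinetic(a.exps[ia], a.shell, a.origin,
                          b.exps[ib], b.shell, b.origin))
    return t

# quick check: a normalized s Gaussian with exponent 1.0 has <T> = 3/2
prim = SimpleNamespace(shell=(0, 0, 0), origin=[0.0, 0.0, 0.0],
                       exps=[1.0], coefs=[1.0], norm=[(2.0 / np.pi) ** 0.75])
t_aa = T(prim, prim)
```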

# Nuclear attraction integrals

The last one-body integrals I want to consider here are the nuclear attraction integrals. These differ from the overlap and kinetic energy integrals in that the nuclear attraction operator \(1/r_C\) is Coulombic, meaning we cannot easily factor the integral into Cartesian components \(x,y,z\).

To evaluate these integrals, we need to set up an auxiliary Hermite Coulomb integral \(R^n_{tuv}(p,\mathbf{P},\mathbf{C})\) that handles the Coulomb interaction between a Gaussian charge distribution centered at \(\mathbf{P}\) and a nucleus centered at \(\mathbf{C}\). The Hermite Coulomb integral, like its counterpart \(E_t^{ij}\), is defined recursively:

\[\begin{align} R^{n}_{t+1,u,v} &= t R^{n+1}_{t-1,u,v} + X_{PC}R^{n+1}_{t,u,v} \\ R^{n}_{t,u+1,v} &= u R^{n+1}_{t,u-1,v} + Y_{PC}R^{n+1}_{t,u,v} \\ R^{n}_{t,u,v+1} &= v R^{n+1}_{t,u,v-1} + Z_{PC}R^{n+1}_{t,u,v} \\ R^{n}_{0,0,0} &= (-2p)^n F_n (p R_{PC}^2) \end{align}\]where \(F_n(T)\) is the Boys function

\[F_n(T) = \int_0^1 \hbox{exp}(-Tx^2)x^{2n}dx\]which is a special case of the Kummer confluent hypergeometric function, \(_1F_1(a,b,x)\)

\[F_n(T) = \frac{_1F_1(n+\frac{1}{2}, n+\frac{3}{2}, -T)}{2n+1}\]which is convenient for us, since `SciPy` has an implementation of \(_1F_1\) as a part of `scipy.special`. So for `R` we can code up the recursion like so
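One direct transcription of the recursion (the `boys` function defined just below is included so the snippet stands alone):

```python
import numpy as np
from scipy.special import hyp1f1

def boys(n, T):
    # Boys function F_n(T) via the Kummer confluent hypergeometric function
    return hyp1f1(n + 0.5, n + 1.5, -T) / (2.0 * n + 1.0)

def R(t, u, v, n, p, PCx, PCy, PCz, RPC):
    """Auxiliary Hermite Coulomb integrals R^n_{tuv}.
       t, u, v: orders of the Hermite derivative in x, y, z
       n:       order of the Boys function
       p:       composite exponent (alpha + beta)
       PCx/y/z: Cartesian components of P - C
       RPC:     distance between composite center P and nuclear center C
    """
    T = p * RPC * RPC
    val = 0.0
    if t == u == v == 0:
        val += np.power(-2 * p, n) * boys(n, T)
    elif t == u == 0:
        if v > 1:
            val += (v - 1) * R(t, u, v - 2, n + 1, p, PCx, PCy, PCz, RPC)
        val += PCz * R(t, u, v - 1, n + 1, p, PCx, PCy, PCz, RPC)
    elif t == 0:
        if u > 1:
            val += (u - 1) * R(t, u - 2, v, n + 1, p, PCx, PCy, PCz, RPC)
        val += PCy * R(t, u - 1, v, n + 1, p, PCx, PCy, PCz, RPC)
    else:
        if t > 1:
            val += (t - 1) * R(t - 2, u, v, n + 1, p, PCx, PCy, PCz, RPC)
        val += PCx * R(t - 1, u, v, n + 1, p, PCx, PCy, PCz, RPC)
    return val
```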

and we can define our `boys(n,T)` function as
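a minimal wrapper over `scipy.special.hyp1f1`:

```python
import numpy as np
from scipy.special import hyp1f1

def boys(n, T):
    # F_n(T) in terms of the Kummer confluent hypergeometric function
    return hyp1f1(n + 0.5, n + 1.5, -T) / (2.0 * n + 1.0)
```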

There are other definitions of the Boys function, of course, in case you do not want to use the `SciPy` built-in. Note that `R` requires knowledge of the composite center \(\mathbf{P}\) of the two Gaussians centered at \(\mathbf{A}\) and \(\mathbf{B}\). We can determine \(\mathbf{P}\) using the Gaussian product center rule

\[\mathbf{P} = \frac{\alpha \mathbf{A} + \beta \mathbf{B}}{\alpha + \beta}\]

which is very simply coded up as
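for example:

```python
import numpy as np

def gaussian_product_center(a, A, b, B):
    # P = (alpha*A + beta*B)/(alpha + beta), applied componentwise
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    return (a * A + b * B) / (a + b)
```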

Now that we have the auxiliary Hermite Coulomb integrals \(R^{n}_{tuv}\), we can form the nuclear attraction integrals with respect to a given nucleus centered at \(\mathbf{C}\), \(V_{ab}(C)\), *via* the expression

\[V_{ab}(C) = \frac{2\pi}{p} \sum\limits_{t=0}^{i+j} \sum\limits_{u=0}^{k+l} \sum\limits_{v=0}^{m+n} E_t^{ij} E_u^{kl} E_v^{mn}\, R^0_{tuv}(p,\mathbf{P},\mathbf{C})\]
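A sketch of a primitive `nuclear_attraction` routine (with `E`, `R`, `boys`, and `gaussian_product_center` repeated from earlier snippets so it runs standalone):

```python
import numpy as np
from scipy.special import hyp1f1

def boys(n, T):
    return hyp1f1(n + 0.5, n + 1.5, -T) / (2.0 * n + 1.0)

def gaussian_product_center(a, A, b, B):
    A, B = np.asarray(A, dtype=float), np.asarray(B, dtype=float)
    return (a * A + b * B) / (a + b)

def E(i, j, t, Qx, a, b):
    # Hermite expansion coefficients, repeated from the overlap section
    p = a + b
    q = a * b / p
    if (t < 0) or (t > (i + j)):
        return 0.0
    elif i == j == t == 0:
        return np.exp(-q * Qx * Qx)
    elif j == 0:
        return ((1 / (2 * p)) * E(i - 1, j, t - 1, Qx, a, b)
                - (q * Qx / a) * E(i - 1, j, t, Qx, a, b)
                + (t + 1) * E(i - 1, j, t + 1, Qx, a, b))
    else:
        return ((1 / (2 * p)) * E(i, j - 1, t - 1, Qx, a, b)
                + (q * Qx / b) * E(i, j - 1, t, Qx, a, b)
                + (t + 1) * E(i, j - 1, t + 1, Qx, a, b))

def R(t, u, v, n, p, PCx, PCy, PCz, RPC):
    # auxiliary Hermite Coulomb integrals, repeated from above
    T = p * RPC * RPC
    val = 0.0
    if t == u == v == 0:
        val += np.power(-2 * p, n) * boys(n, T)
    elif t == u == 0:
        if v > 1:
            val += (v - 1) * R(t, u, v - 2, n + 1, p, PCx, PCy, PCz, RPC)
        val += PCz * R(t, u, v - 1, n + 1, p, PCx, PCy, PCz, RPC)
    elif t == 0:
        if u > 1:
            val += (u - 1) * R(t, u - 2, v, n + 1, p, PCx, PCy, PCz, RPC)
        val += PCy * R(t, u - 1, v, n + 1, p, PCx, PCy, PCz, RPC)
    else:
        if t > 1:
            val += (t - 1) * R(t - 2, u, v, n + 1, p, PCx, PCy, PCz, RPC)
        val += PCx * R(t - 1, u, v, n + 1, p, PCx, PCy, PCz, RPC)
    return val

def nuclear_attraction(a, lmn1, A, b, lmn2, B, C):
    """Nuclear attraction integral between two primitives and a nucleus at C."""
    l1, m1, n1 = lmn1
    l2, m2, n2 = lmn2
    p = a + b
    P = gaussian_product_center(a, A, b, B)
    RPC = np.linalg.norm(P - np.asarray(C, dtype=float))
    val = 0.0
    for t in range(l1 + l2 + 1):
        for u in range(m1 + m2 + 1):
            for v in range(n1 + n2 + 1):
                val += (E(l1, l2, t, A[0] - B[0], a, b) *
                        E(m1, m2, u, A[1] - B[1], a, b) *
                        E(n1, n2, v, A[2] - B[2], a, b) *
                        R(t, u, v, 0, p,
                          P[0] - C[0], P[1] - C[1], P[2] - C[2], RPC))
    return val * 2 * np.pi / p
```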

And, just like all the other routines, we can wrap it up to treat contracted Gaussians like so:
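A sketch of `V(a,b,C)`, with the helpers repeated so the snippet runs on its own; the `SimpleNamespace` object at the end is a hypothetical one-primitive stand-in for a `BasisFunction`:

```python
import numpy as np
from types import SimpleNamespace
from scipy.special import hyp1f1

def boys(n, T):
    return hyp1f1(n + 0.5, n + 1.5, -T) / (2.0 * n + 1.0)

def gaussian_product_center(a, A, b, B):
    A, B = np.asarray(A, dtype=float), np.asarray(B, dtype=float)
    return (a * A + b * B) / (a + b)

def E(i, j, t, Qx, a, b):
    # Hermite expansion coefficients, repeated from the overlap section
    p = a + b
    q = a * b / p
    if (t < 0) or (t > (i + j)):
        return 0.0
    elif i == j == t == 0:
        return np.exp(-q * Qx * Qx)
    elif j == 0:
        return ((1 / (2 * p)) * E(i - 1, j, t - 1, Qx, a, b)
                - (q * Qx / a) * E(i - 1, j, t, Qx, a, b)
                + (t + 1) * E(i - 1, j, t + 1, Qx, a, b))
    else:
        return ((1 / (2 * p)) * E(i, j - 1, t - 1, Qx, a, b)
                + (q * Qx / b) * E(i, j - 1, t, Qx, a, b)
                + (t + 1) * E(i, j - 1, t + 1, Qx, a, b))

def R(t, u, v, n, p, PCx, PCy, PCz, RPC):
    # auxiliary Hermite Coulomb integrals, repeated from above
    T = p * RPC * RPC
    val = 0.0
    if t == u == v == 0:
        val += np.power(-2 * p, n) * boys(n, T)
    elif t == u == 0:
        if v > 1:
            val += (v - 1) * R(t, u, v - 2, n + 1, p, PCx, PCy, PCz, RPC)
        val += PCz * R(t, u, v - 1, n + 1, p, PCx, PCy, PCz, RPC)
    elif t == 0:
        if u > 1:
            val += (u - 1) * R(t, u - 2, v, n + 1, p, PCx, PCy, PCz, RPC)
        val += PCy * R(t, u - 1, v, n + 1, p, PCx, PCy, PCz, RPC)
    else:
        if t > 1:
            val += (t - 1) * R(t - 2, u, v, n + 1, p, PCx, PCy, PCz, RPC)
        val += PCx * R(t - 1, u, v, n + 1, p, PCx, PCy, PCz, RPC)
    return val

def nuclear_attraction(a, lmn1, A, b, lmn2, B, C):
    # primitive nuclear attraction integral, repeated from the previous snippet
    l1, m1, n1 = lmn1
    l2, m2, n2 = lmn2
    p = a + b
    P = gaussian_product_center(a, A, b, B)
    RPC = np.linalg.norm(P - np.asarray(C, dtype=float))
    val = 0.0
    for t in range(l1 + l2 + 1):
        for u in range(m1 + m2 + 1):
            for v in range(n1 + n2 + 1):
                val += (E(l1, l2, t, A[0] - B[0], a, b) *
                        E(m1, m2, u, A[1] - B[1], a, b) *
                        E(n1, n2, v, A[2] - B[2], a, b) *
                        R(t, u, v, 0, p,
                          P[0] - C[0], P[1] - C[1], P[2] - C[2], RPC))
    return val * 2 * np.pi / p

def V(a, b, C):
    """Nuclear attraction between two contracted Gaussians and a nucleus at C."""
    v = 0.0
    for ia, ca in enumerate(a.coefs):
        for ib, cb in enumerate(b.coefs):
            v += (a.norm[ia] * b.norm[ib] * ca * cb *
                  nuclear_attraction(a.exps[ia], a.shell, a.origin,
                                     b.exps[ib], b.shell, b.origin, C))
    return v

# quick check: a normalized s Gaussian (exponent 1.0) has <1/r> = 2*sqrt(2/pi)
prim = SimpleNamespace(shell=(0, 0, 0), origin=[0.0, 0.0, 0.0],
                       exps=[1.0], coefs=[1.0], norm=[(2.0 / np.pi) ** 0.75])
v_ss = V(prim, prim, [0.0, 0.0, 0.0])
```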

**Important:** Note that this is the nuclear attraction integral
contribution from a single nucleus centered at \(\mathbf{C}\). To get the full
nuclear attraction contribution, you must sum over all the nuclei, as
well as scale each term by the appropriate nuclear charge!

# Two electron repulsion integrals

We are done with the necessary one-body integrals (for a basic Hartree-Fock energy code, at least) and are ready to move on to the two-body terms: the electron-electron repulsion integrals. Thankfully, much of the work has been done for us on account of the nuclear attraction one-body integrals.

In terms of Hermite integrals, to evaluate the two electron repulsion terms, we must evaluate the summation

\[g_{abcd} = \frac{2\pi^{5/2}}{pq\sqrt{p+q}} \sum\limits_{t,u,v}^{\substack{i+j,\\k+l,\\m+n}} E_t^{ij} E_u^{kl} E_v^{mn} \sum\limits_{\tau,\nu,\phi}^{\substack{i'+j',\\k'+l',\\m'+n'}} E_{\tau}^{i'j'} E_{\nu}^{k'l'} E_{\phi}^{m'n'} (-1)^{\tau+\nu+\phi} R^0_{t+\tau,u + \nu, v+\phi}(p,q,\mathbf{P},\mathbf{Q})\]which looks terrible, and it is. However, recalling that \(p = \alpha + \beta\) and letting \(q = \gamma + \delta\) (that is, \(p\) and \(q\) are built from the Gaussian exponents on \(a,b\) and \(c,d\), respectively; note that this reuses the symbol \(q\), which is not the reduced exponent from the Gaussian product theorem), we can write the equation in a form similar to the nuclear attraction integrals
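A sketch of a primitive `electron_repulsion` routine in that spirit, with the `E`, `R`, `boys`, and `gaussian_product_center` helpers repeated so the snippet stands alone:

```python
import numpy as np
from scipy.special import hyp1f1

def boys(n, T):
    return hyp1f1(n + 0.5, n + 1.5, -T) / (2.0 * n + 1.0)

def gaussian_product_center(a, A, b, B):
    A, B = np.asarray(A, dtype=float), np.asarray(B, dtype=float)
    return (a * A + b * B) / (a + b)

def E(i, j, t, Qx, a, b):
    # Hermite expansion coefficients, repeated from the overlap section
    p = a + b
    q = a * b / p
    if (t < 0) or (t > (i + j)):
        return 0.0
    elif i == j == t == 0:
        return np.exp(-q * Qx * Qx)
    elif j == 0:
        return ((1 / (2 * p)) * E(i - 1, j, t - 1, Qx, a, b)
                - (q * Qx / a) * E(i - 1, j, t, Qx, a, b)
                + (t + 1) * E(i - 1, j, t + 1, Qx, a, b))
    else:
        return ((1 / (2 * p)) * E(i, j - 1, t - 1, Qx, a, b)
                + (q * Qx / b) * E(i, j - 1, t, Qx, a, b)
                + (t + 1) * E(i, j - 1, t + 1, Qx, a, b))

def R(t, u, v, n, p, PCx, PCy, PCz, RPC):
    # auxiliary Hermite Coulomb integrals, repeated from above
    T = p * RPC * RPC
    val = 0.0
    if t == u == v == 0:
        val += np.power(-2 * p, n) * boys(n, T)
    elif t == u == 0:
        if v > 1:
            val += (v - 1) * R(t, u, v - 2, n + 1, p, PCx, PCy, PCz, RPC)
        val += PCz * R(t, u, v - 1, n + 1, p, PCx, PCy, PCz, RPC)
    elif t == 0:
        if u > 1:
            val += (u - 1) * R(t, u - 2, v, n + 1, p, PCx, PCy, PCz, RPC)
        val += PCy * R(t, u - 1, v, n + 1, p, PCx, PCy, PCz, RPC)
    else:
        if t > 1:
            val += (t - 1) * R(t - 2, u, v, n + 1, p, PCx, PCy, PCz, RPC)
        val += PCx * R(t - 1, u, v, n + 1, p, PCx, PCy, PCz, RPC)
    return val

def electron_repulsion(a, lmn1, A, b, lmn2, B, c, lmn3, C, d, lmn4, D):
    """Primitive two-electron repulsion integral (ab|cd)."""
    l1, m1, n1 = lmn1
    l2, m2, n2 = lmn2
    l3, m3, n3 = lmn3
    l4, m4, n4 = lmn4
    p = a + b                    # composite exponent for the bra pair
    q = c + d                    # composite exponent for the ket pair
    alpha = p * q / (p + q)      # reduced exponent in the Boys argument
    P = gaussian_product_center(a, A, b, B)
    Q = gaussian_product_center(c, C, d, D)
    RPQ = np.linalg.norm(P - Q)
    val = 0.0
    for t in range(l1 + l2 + 1):
        for u in range(m1 + m2 + 1):
            for v in range(n1 + n2 + 1):
                for tau in range(l3 + l4 + 1):
                    for nu in range(m3 + m4 + 1):
                        for phi in range(n3 + n4 + 1):
                            val += (E(l1, l2, t, A[0] - B[0], a, b) *
                                    E(m1, m2, u, A[1] - B[1], a, b) *
                                    E(n1, n2, v, A[2] - B[2], a, b) *
                                    E(l3, l4, tau, C[0] - D[0], c, d) *
                                    E(m3, m4, nu, C[1] - D[1], c, d) *
                                    E(n3, n4, phi, C[2] - D[2], c, d) *
                                    np.power(-1, tau + nu + phi) *
                                    R(t + tau, u + nu, v + phi, 0, alpha,
                                      P[0] - Q[0], P[1] - Q[1], P[2] - Q[2], RPQ))
    return val * 2 * np.power(np.pi, 2.5) / (p * q * np.sqrt(p + q))
```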

And, for completeness’ sake, we wrap the above to handle contracted Gaussians
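A sketch of `ERI(a,b,c,d)`; everything it needs is repeated so the snippet runs on its own, and the `SimpleNamespace` object at the end is a hypothetical one-primitive stand-in for a `BasisFunction`:

```python
import numpy as np
from types import SimpleNamespace
from scipy.special import hyp1f1

def boys(n, T):
    return hyp1f1(n + 0.5, n + 1.5, -T) / (2.0 * n + 1.0)

def gaussian_product_center(a, A, b, B):
    A, B = np.asarray(A, dtype=float), np.asarray(B, dtype=float)
    return (a * A + b * B) / (a + b)

def E(i, j, t, Qx, a, b):
    # Hermite expansion coefficients, repeated from the overlap section
    p = a + b
    q = a * b / p
    if (t < 0) or (t > (i + j)):
        return 0.0
    elif i == j == t == 0:
        return np.exp(-q * Qx * Qx)
    elif j == 0:
        return ((1 / (2 * p)) * E(i - 1, j, t - 1, Qx, a, b)
                - (q * Qx / a) * E(i - 1, j, t, Qx, a, b)
                + (t + 1) * E(i - 1, j, t + 1, Qx, a, b))
    else:
        return ((1 / (2 * p)) * E(i, j - 1, t - 1, Qx, a, b)
                + (q * Qx / b) * E(i, j - 1, t, Qx, a, b)
                + (t + 1) * E(i, j - 1, t + 1, Qx, a, b))

def R(t, u, v, n, p, PCx, PCy, PCz, RPC):
    # auxiliary Hermite Coulomb integrals, repeated from above
    T = p * RPC * RPC
    val = 0.0
    if t == u == v == 0:
        val += np.power(-2 * p, n) * boys(n, T)
    elif t == u == 0:
        if v > 1:
            val += (v - 1) * R(t, u, v - 2, n + 1, p, PCx, PCy, PCz, RPC)
        val += PCz * R(t, u, v - 1, n + 1, p, PCx, PCy, PCz, RPC)
    elif t == 0:
        if u > 1:
            val += (u - 1) * R(t, u - 2, v, n + 1, p, PCx, PCy, PCz, RPC)
        val += PCy * R(t, u - 1, v, n + 1, p, PCx, PCy, PCz, RPC)
    else:
        if t > 1:
            val += (t - 1) * R(t - 2, u, v, n + 1, p, PCx, PCy, PCz, RPC)
        val += PCx * R(t - 1, u, v, n + 1, p, PCx, PCy, PCz, RPC)
    return val

def electron_repulsion(a, lmn1, A, b, lmn2, B, c, lmn3, C, d, lmn4, D):
    # primitive (ab|cd), repeated from the previous snippet
    l1, m1, n1 = lmn1
    l2, m2, n2 = lmn2
    l3, m3, n3 = lmn3
    l4, m4, n4 = lmn4
    p, q = a + b, c + d
    alpha = p * q / (p + q)
    P = gaussian_product_center(a, A, b, B)
    Q = gaussian_product_center(c, C, d, D)
    RPQ = np.linalg.norm(P - Q)
    val = 0.0
    for t in range(l1 + l2 + 1):
        for u in range(m1 + m2 + 1):
            for v in range(n1 + n2 + 1):
                for tau in range(l3 + l4 + 1):
                    for nu in range(m3 + m4 + 1):
                        for phi in range(n3 + n4 + 1):
                            val += (E(l1, l2, t, A[0] - B[0], a, b) *
                                    E(m1, m2, u, A[1] - B[1], a, b) *
                                    E(n1, n2, v, A[2] - B[2], a, b) *
                                    E(l3, l4, tau, C[0] - D[0], c, d) *
                                    E(m3, m4, nu, C[1] - D[1], c, d) *
                                    E(n3, n4, phi, C[2] - D[2], c, d) *
                                    np.power(-1, tau + nu + phi) *
                                    R(t + tau, u + nu, v + phi, 0, alpha,
                                      P[0] - Q[0], P[1] - Q[1], P[2] - Q[2], RPQ))
    return val * 2 * np.power(np.pi, 2.5) / (p * q * np.sqrt(p + q))

def ERI(a, b, c, d):
    """Electron repulsion integral between four contracted Gaussians."""
    eri = 0.0
    for ja, ca in enumerate(a.coefs):
        for jb, cb in enumerate(b.coefs):
            for jc, cc in enumerate(c.coefs):
                for jd, cd in enumerate(d.coefs):
                    eri += (a.norm[ja] * b.norm[jb] * c.norm[jc] * d.norm[jd] *
                            ca * cb * cc * cd *
                            electron_repulsion(a.exps[ja], a.shell, a.origin,
                                               b.exps[jb], b.shell, b.origin,
                                               c.exps[jc], c.shell, c.origin,
                                               d.exps[jd], d.shell, d.origin))
    return eri

# quick check with a single normalized s primitive (exponent 1.0)
prim = SimpleNamespace(shell=(0, 0, 0), origin=[0.0, 0.0, 0.0],
                       exps=[1.0], coefs=[1.0], norm=[(2.0 / np.pi) ** 0.75])
eri_ssss = ERI(prim, prim, prim, prim)
```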

And there you have it! All the integrals necessary for a Hartree-Fock SCF code.

# Computational efficiency considerations

Our goal here has been to eliminate some of the confusion when it comes to connecting mathematics to actual computer code. So the code we have shown is hopefully clear and looks nearly identical to the mathematical equations it is supposed to represent. This is one reason we chose `Python` as the language to implement the integrals: it emphasizes readability.

If you use the code as is, you’ll find that you can only really handle small systems. To that end, I’ll give a few ideas on how to improve the integral code to actually be usable.

First, I would recommend not using pure `Python`. The problem is that we have some pretty deep loops in the code, and nested `for` loops will kill your speed if you insist on sticking with `Python`. Now, you could code in another language entirely, but I would suggest rewriting some of the lower-level routines with `Cython` (http://www.cython.org). `Cython` statically compiles your code to eliminate many of the `Python` calls that slow your loops down. In my experience, you will get several orders of magnitude speed-up.

This brings me to my second point. One of the problems with `Python` is the global interpreter lock (GIL), which basically means you cannot run `Python` threads in parallel. Many of the integral evaluation routines would do well if you could split the work up over multiple CPUs. If you rewrite some of the low-level routines such that they no longer make `Python` calls, you can release the GIL and wrap the code in `OpenMP` directives, or even use `Cython`’s `cython.parallel` module. This will take some thought, but can definitely be done. Furthermore, removing the explicit recursion in the `E` and `R` functions and making them iterative would go a long way to speed up the code.

A couple of other thoughts: be sure to exploit the permutational symmetry of the integrals. The two-electron repulsion integrals, for example, can be done in \(1/8\) of the time just by exploiting these symmetries (which are unrelated to point-group symmetry). Also, you can implement integral screening, since many of the two-electron integrals are effectively zero. There are a lot of tricks out there in the literature, so go check them out!