
Spectral bisection

The recursive spectral bisection (RSB) algorithm [68,75,87] is based on the following consideration. Let each vertex of the graph be assigned a value of either 1 or $-1$, and let the value assigned to vertex $i$ be denoted by $x_i$. This creates a bisection of the graph in which the vertices having value 1 form one subdomain and those with value $-1$ form the other. The edge-cut for the bisection can be expressed as

\begin{displaymath}
\vert E_c\vert={1\over4}\sum_{i\leftrightarrow j,\ i,j\in V}(x_i-x_j)^2, \qquad (2)
\end{displaymath}
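As a concrete check of (2): each edge inside a subdomain contributes 0 to the sum and each cut edge contributes 4, so dividing by four counts the cut edges exactly once. A minimal Python sketch on a small hypothetical graph (two triangles joined by a single edge):

```python
# Verify that (1/4) * sum over edges of (x_i - x_j)^2 equals the edge-cut
# of a +/-1 vertex assignment. The six-vertex graph is hypothetical:
# two triangles {0,1,2} and {3,4,5} joined by the single edge (2, 3).
edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 5), (5, 3)]
x = [1, 1, 1, -1, -1, -1]   # bisection: +1 on one triangle, -1 on the other

# edges inside a subdomain contribute 0; each cut edge contributes 4
edge_cut = sum((x[i] - x[j]) ** 2 for i, j in edges) // 4
print(edge_cut)  # -> 1, only the edge (2, 3) crosses the cut
```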

where $i\leftrightarrow j$ means that there is an edge connecting the vertices $i$ and $j$. In order to keep the load balanced, each of the two subdomains is required to have the same number of vertices. Thus the sum of all the values associated with the vertices is zero, that is

\begin{displaymath}
\sum_{i=1}^n x_i=0. \qquad (3)
\end{displaymath}

The graph bisection problem is equivalent to minimizing (2) subject to (3) with $x_i$ taking the value of either 1 or $-1$. This is an integer programming problem, which is difficult to solve when the number of vertices $n$ is large. Ignoring the integer constraints, the quadratic (2) can be minimized subject to the linear constraint (3) alone. This gives a continuous minimization problem, but without an extra constraint the solution is simply the zero vector. Note, however, that when all the variables take the value 1 or $-1$, the sum of their squares is $n$, the number of vertices. This gives the extra constraint

\begin{displaymath}
\sum_{i=1}^n x_i^2=n. \qquad (4)
\end{displaymath}

The quadratic (2) can now be minimized subject to (3) and (4).

The quadratic (2) can be written as

\begin{displaymath}
\vert E_c\vert={1\over4}x^TLx, \qquad (5)
\end{displaymath}

where $x=(x_i)$ is the vector composed of all the values to be assigned to vertices and $L$ is an $n\times n$ matrix known as the Laplacian matrix of the graph. It has the simple form
\begin{displaymath}
(L)_{ij}=\left\{
\begin{array}{ll}
-1, & {\rm if}\ \ i\ne j\ {\rm and}\ i\leftrightarrow j,\\
{\rm deg}(i), & {\rm if}\ \ i=j,\\
0, & {\rm otherwise},
\end{array}\right. \qquad (6)
\end{displaymath}

Here ${\rm deg}(i)$ is the degree of the vertex $i$, defined in Section 2 as the number of edges incident to it. The matrix $L$ satisfies $Le=0$, with $e$ the vector of all ones. Applying the first-order necessary condition (the method of Lagrange multipliers) to the constrained minimization problem gives
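The definition (6) translates directly into code. A minimal sketch in plain Python (the four-vertex example graph is hypothetical) that assembles $L$ from an edge list and checks the property $Le=0$:

```python
# Assemble the graph Laplacian L = D - A from an edge list: -1 off the
# diagonal for each edge, deg(i) on the diagonal, 0 elsewhere.
def laplacian(n, edges):
    L = [[0] * n for _ in range(n)]
    for i, j in edges:
        L[i][j] -= 1          # off-diagonal entries for the edge (i, j)
        L[j][i] -= 1
        L[i][i] += 1          # each edge raises both endpoint degrees
        L[j][j] += 1
    return L

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]   # hypothetical 4-vertex graph
L = laplacian(4, edges)
assert all(sum(row) == 0 for row in L)      # every row sums to 0, i.e. L e = 0
assert L[2][2] == 3                         # vertex 2 has degree 3
```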
\begin{displaymath}
Lx=\mu e+\lambda x, \qquad (7)
\end{displaymath}

with $\mu$ and $\lambda$ two Lagrange multipliers to be determined. Multiplying both sides of the equation by $e^T$, and using $e^TL=0$ together with (3), gives $\mu=0$. Thus $x$ has to be an eigenvector of $L$, and by (4) the edge-cut is $\vert E_c\vert={1\over4}x^TLx={1\over4}n\lambda$. In order to minimize the edge-cut, $x$ needs to be the eigenvector of the smallest possible eigenvalue of $L$ that satisfies the two constraints (3) and (4).
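The relation $\vert E_c\vert={1\over4}n\lambda$ for an eigenvector scaled to satisfy (4) can be checked numerically. A sketch with NumPy, on a hypothetical six-vertex graph:

```python
import numpy as np

# Hypothetical graph: two triangles joined by the edge (2, 3).
edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 5), (5, 3)]
n = 6
L = np.zeros((n, n))
for i, j in edges:
    L[i, j] -= 1; L[j, i] -= 1
    L[i, i] += 1; L[j, j] += 1

vals, vecs = np.linalg.eigh(L)     # eigenvalues in ascending order
lam2 = vals[1]                     # second smallest eigenvalue
x = vecs[:, 1] * np.sqrt(n)        # rescale so that sum(x_i^2) = n, as in (4)

assert np.isclose(x.sum(), 0.0, atol=1e-10)      # constraint (3): orthogonal to e
assert np.isclose(x @ L @ x / 4, n * lam2 / 4)   # relaxed edge-cut = n*lambda/4
```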

The matrix $L$ is positive semi-definite, with smallest eigenvalue $\lambda_1=0$ and corresponding eigenvector $e$. Clearly $e$ is not a solution to the problem, since it does not satisfy the load balancing constraint (3). If the graph is connected, it can be shown that the next smallest eigenvalue $\lambda_2$ is positive [63]. The eigenvector associated with this eigenvalue satisfies (3) because it is orthogonal to the first eigenvector $e$. It also satisfies (4) after proper scaling, and thus is the solution of the constrained minimization problem. This eigenvector (also called the Fiedler vector) gives a value to each vertex of the graph, and the graph can be bisected by separating the vertices with smaller values from those with larger values. The procedure can then be repeated on each of the subdomains; this gives the RSB algorithm. In practice the RSB algorithm almost always gives connected subdomains if the original graph is connected; however, there can be exceptions, as Figure 5 shows.
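The bisection step just described can be sketched as follows. This is a NumPy illustration, not a production implementation: a dense eigensolver stands in for the Lanczos iteration used by real RSB codes, and the example graph is hypothetical.

```python
import numpy as np

def spectral_bisect(n, edges):
    """One RSB step: split the vertices by sorting on the Fiedler vector."""
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, j] -= 1; L[j, i] -= 1
        L[i, i] += 1; L[j, j] += 1
    _, vecs = np.linalg.eigh(L)          # eigenvalues in ascending order
    fiedler = vecs[:, 1]                 # eigenvector of 2nd smallest eigenvalue
    order = np.argsort(fiedler)          # sort vertices by Fiedler value
    low = set(order[: n // 2].tolist())  # smaller-valued half is one subdomain
    return low, set(range(n)) - low

# two triangles joined by one edge: RSB should separate the triangles
edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 5), (5, 3)]
a, b = spectral_bisect(6, edges)
```

On this graph the split recovers the two triangles (in either order, since the sign of an eigenvector is arbitrary), giving the minimal edge-cut of one.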

Figure 6 gives a simple graph and the eigenvector associated with the second smallest eigenvalue of its Laplacian. The values of the eigenvector are marked in brackets beside each vertex. Sorting the vertices by these values bisects the graph into two triangles, with an edge-cut of one.

Figure 5: A graph that cannot be bisected into 2 connected subgraphs with an equal number of vertices

Figure 6: A simple graph and its Fiedler vector

To find the eigenvector corresponding to the second smallest eigenvalue, the Lanczos algorithm can be employed. This is an iterative algorithm which requires $O(n)$ operations per iteration, and it normally converges in $m$ iterations, with $m$ typically at most a few hundred; the algorithm therefore has a complexity of $O(mn)$. However, the Lanczos algorithm requires the storage of $m$ Lanczos vectors of length $n$, unless the entire Lanczos recurrence is run a second time in order to accumulate the eigenvector. An alternative algorithm [56] for finding the eigenvector is to solve the minimization problem (2) using the conjugate gradient algorithm, which requires the storage of only four vectors; this algorithm has, however, only been tested on regular grids.
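For large sparse graphs the eigenvector is in practice computed iteratively. As a sketch, SciPy's `eigsh` wraps ARPACK's implicitly restarted Lanczos code; the path-graph test problem and the shift value below are assumptions for illustration.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Sparse Laplacian of a path graph 0-1-2-...-(n-1), assembled in COO form
# (duplicate diagonal entries are summed on conversion to CSR).
n = 100
rows, cols, vals = [], [], []
for i in range(n - 1):
    rows += [i, i + 1, i, i + 1]
    cols += [i + 1, i, i, i + 1]
    vals += [-1.0, -1.0, 1.0, 1.0]
L = sp.coo_matrix((vals, (rows, cols)), shape=(n, n)).tocsr()

# Shift-invert Lanczos near a small negative shift: this returns the two
# eigenpairs of L closest to sigma, i.e. the 0 eigenvalue and lambda_2.
vals2, vecs2 = eigsh(L, k=2, sigma=-0.01, which='LM')
fiedler = vecs2[:, np.argmax(vals2)]

# The Fiedler vector of a path is monotone along the path, so sorting by
# it splits the path into two contiguous halves.
half = set(np.argsort(fiedler)[: n // 2].tolist())
```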

Figure 7 shows the result of applying the RSB algorithm to the dual graph of the mesh with 788 elements. The partition has 108 shared edges, significantly fewer than the edge-cut of 142 generated by the RCB and RGB algorithms, although the RSB algorithm is far more expensive. In this case all of the subdomains are connected.

Figure 7: Partitioning a mesh into 16 subdomains using RSB

Two recursive spectral bisections do not necessarily generate an optimal quadri-section [76]. Direct quadri-section, or even octa-section, using two or three eigenvectors of the Laplacian matrix has been explored [34]. Another interesting way of utilizing multiple eigenvectors is to treat them as spectral coordinates of the vertices; using these coordinates in the context of inertia bisection can generate partitions of better quality than RSB [77].
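The spectral-coordinates idea can be sketched as follows: eigenvectors 2 and 3 of the Laplacian supply an $(x,y)$ coordinate for each vertex, which an inertia-type bisection can then operate on. This NumPy fragment shows only the embedding step, not the full method of [77], and the example graph is hypothetical.

```python
import numpy as np

def spectral_coordinates(n, edges, k=2):
    """Embed each vertex using eigenvectors 2..k+1 of the graph Laplacian."""
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, j] -= 1; L[j, i] -= 1
        L[i, i] += 1; L[j, j] += 1
    _, vecs = np.linalg.eigh(L)   # eigenvalues ascending
    return vecs[:, 1:1 + k]       # skip the constant eigenvector e

# a hypothetical 4-cycle with a chord
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
coords = spectral_coordinates(4, edges)
assert coords.shape == (4, 2)     # one (x, y) pair per vertex
```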
