where $S_{\min}$ (and $S_{\max}$) are generated by setting all their off-diagonal elements to $S_{ij}^{\min} = \min\{S_{ij}\}$ (and $S_{ij}^{\max} = \max\{S_{ij}\}$) for all $i \neq j$. The bounding probabilities are thereby expressed in terms of simple and regular correlation matrices.
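As an illustration only, and not the authors' code, the following sketch shows how such bounding correlation matrices could be formed in NumPy, assuming the original correlation matrix S is available as a square array (the function name is ours):

```python
import numpy as np

def bounding_correlation_matrices(S):
    """Return (S_min, S_max): copies of the correlation matrix S whose
    off-diagonal entries are all set to the minimum / maximum off-diagonal
    entry of S, while the unit diagonal is kept."""
    S = np.asarray(S, dtype=float)
    off = ~np.eye(S.shape[0], dtype=bool)      # mask of positions with i != j
    s_lo, s_hi = S[off].min(), S[off].max()
    S_min = np.where(off, s_lo, 1.0)
    S_max = np.where(off, s_hi, 1.0)
    return S_min, S_max

# Example: off-diagonal correlations range from 0.3 to 0.6.
S = np.array([[1.0, 0.3, 0.6],
              [0.3, 1.0, 0.5],
              [0.6, 0.5, 1.0]])
S_min, S_max = bounding_correlation_matrices(S)
```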
We now use majorization theory to evaluate the probability content of a standard multi-normal vector with a simple
correlation structure over the non-equicoordinate set. For a
more efficient numerical evaluation, we can resort to expressions in terms of the equicoordinate probability. This is attractive because it is easier to pre-characterize the probability of a multi-normal vector over a semi-infinite cube than over all possible shapes of a multidimensional rectangular cuboid. We enable this transformation
by bounding the cumulative probabilities of a normal random vector using the theory of stochastic majorization.10
We use the property that for some distributions, including
Gaussian, stochastic inequalities can be established based
on partial ordering of their distribution parameter vectors
using ordinary (deterministic) majorization.
We first introduce the notions of strong and weak majorization for real vectors. The components of a real vector $a = (a_1, \ldots, a_N)'$ can be arranged in decreasing order, $a_{[1]} \geq \cdots \geq a_{[N]}$. A majorization relationship is defined between two real vectors, $a$ and $b$. We say that $a$ majorizes $b$, in symbols $a \succ b$, if
$$\sum_{i=1}^{N} a_i = \sum_{i=1}^{N} b_i \quad \text{and} \quad \sum_{i=1}^{r} a_{[i]} \geq \sum_{i=1}^{r} b_{[i]} \quad \text{for } r = 1, \ldots, N-1.$$
If only the second condition is satisfied, then only weak majorization holds. We say that $a$ weakly majorizes $b$, in symbols $a \succ_w b$, if
$$\sum_{i=1}^{r} a_{[i]} \geq \sum_{i=1}^{r} b_{[i]} \quad \text{for } r = 1, \ldots, N.$$
An example of strong majorization is $(3, 2, 1) \succ (2, 2, 2)$, and an example of weak majorization is $(3, 2, 1) \succ_w (1, 1, 1)$.
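For concreteness, here is a small, purely illustrative check of these two orderings in Python, following the decreasing rearrangement and partial-sum conditions above (the helper name is ours):

```python
import numpy as np

def majorizes(a, b, weak=False):
    """Return True if a (weakly) majorizes b, per the partial-sum definitions above."""
    a = np.sort(np.asarray(a, dtype=float))[::-1]   # decreasing rearrangement a_[1] >= ... >= a_[N]
    b = np.sort(np.asarray(b, dtype=float))[::-1]
    partial_ok = np.all(np.cumsum(a) >= np.cumsum(b) - 1e-12)
    if weak:
        return partial_ok                            # weak majorization: partial sums only
    return partial_ok and np.isclose(a.sum(), b.sum())  # strong: totals must also match

majorizes((3, 2, 1), (2, 2, 2))             # True  (strong majorization)
majorizes((3, 2, 1), (1, 1, 1), weak=True)  # True  (weak majorization)
```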
We now relate ordering of parameter vectors to ordering of probabilities. Note that the class of random vectors
that can be ordered via ordering of their parameter vectors
includes Gaussian random vectors. Specifically:
Theorem 3. If … then … and … where …

which establishes an upper bound on the circuit delay probability. We use weak majorization for the lower bound. Specifically:

Theorem 4. If … then …

where $t_{\min} = \min(t_1, \ldots, t_N)$ and …

Thus we have bounded the original probability by cumulative probabilities expressed in terms of the standard multivariate normal vector with an equicoordinate vector and a correlation matrix of identical off-diagonal elements.
This is a well-structured object whose probability content can be computed numerically. The numerical evaluation is done at pre-characterization time by Monte-Carlo integration of the cumulative distribution function of a multivariate normal vector. The result of pre-characterization is a set of probability lookup tables for a range of vector dimensionalities N, coordinate values t, and correlation coefficients.
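As a rough sketch of what one entry of such a lookup table might look like, assuming a standard N-variate normal vector with all off-diagonal correlations equal to a common value rho (the function name and sample count are illustrative, not the authors' implementation):

```python
import numpy as np

def equicoordinate_prob(N, t, rho, n_samples=200_000, seed=0):
    """Monte-Carlo estimate of P(X_1 <= t, ..., X_N <= t) for a standard
    N-variate normal vector whose off-diagonal correlations all equal rho."""
    rng = np.random.default_rng(seed)
    cov = np.full((N, N), rho) + (1.0 - rho) * np.eye(N)   # equicorrelated matrix, unit diagonal
    x = rng.multivariate_normal(np.zeros(N), cov, size=n_samples)
    return np.mean(np.all(x <= t, axis=1))                 # fraction of samples inside the cube

# One cell of a lookup table indexed by (N, t, rho):
p = equicoordinate_prob(N=8, t=1.5, rho=0.4)
```

A grid of such values over (N, t, rho) is what the bounds above would consult at run time.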
We tested the algorithm on multiple benchmark circuits and compared our results to the exact cdf computed via Monte-Carlo runs of the deterministic STA algorithm. Samples were taken from the parameter distributions, and both inter-chip and intra-chip components were present. Figure 4 shows the cumulative probabilities from the Monte-Carlo simulation and the derived bounds for one combinational benchmark circuit (c7552) containing 3874 nodes. The bounds are tight: across the …
figure 4. The derived bounds for the cdf of delay of a benchmark circuit c7552 containing 3874 nodes. (Plot: cumulative probability versus delay in ps, showing the upper bound, the Monte-Carlo cdf, and the lower bound.)
figure 5. Change in the cdf for a benchmark circuit depending on the level of edge correlations. (Plot: cumulative probability versus delay in ps for low, moderate, and high correlation; benchmark c7552.)
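Finally, a rough sketch of the Monte-Carlo reference flow described above (sample the inter-chip and intra-chip parameter components, run deterministic STA on each sample, and accumulate the empirical cdf). The deterministic_sta stand-in and all numeric values here are illustrative only; they are not the actual STA engine or benchmark data.

```python
import numpy as np

rng = np.random.default_rng(0)

def deterministic_sta(inter, intra):
    """Toy stand-in for a deterministic STA run (illustrative only): per-gate
    delays get a shared inter-chip shift and independent intra-chip shifts,
    and the circuit delay is the largest of a set of fictitious two-gate paths."""
    gate_delays = 10.0 + 0.8 * inter + 0.5 * intra    # per-gate delays in ps
    paths = gate_delays.reshape(-1, 2).sum(axis=1)    # fictitious two-gate paths
    return paths.max()

n_gates, n_samples = 3874, 10_000
delays = np.empty(n_samples)
for k in range(n_samples):
    inter = rng.normal()                    # inter-chip component, shared by all gates
    intra = rng.normal(size=n_gates)        # intra-chip components, one per gate
    delays[k] = deterministic_sta(inter, intra)

delays.sort()
empirical_cdf = np.arange(1, n_samples + 1) / n_samples   # cdf value at each sorted delay
```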