xref: /reactos/dll/opengl/glu32/src/libtess/alg-outline (revision c2c66aff)
This is only a very brief overview.  There is quite a bit of
additional documentation in the source code itself.


Goals of robust tessellation
----------------------------

The tessellation algorithm is fundamentally a 2D algorithm.  We
initially project all data into a plane; our goal is to robustly
tessellate the projected data.  The same topological tessellation is
then applied to the input data.

Topologically, the output should always be a tessellation.  If the
input is even slightly non-planar, then some triangles will
necessarily be back-facing when viewed from some angles, but the goal
is to minimize this effect.

The algorithm needs some capability of cleaning up the input data as
well as the numerical errors in its own calculations.  One way to do
this is to specify a tolerance as defined above, and clean up the
input and output during the line sweep process.  At the very least,
the algorithm must handle coincident vertices, vertices incident to an
edge, and coincident edges.
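
As an illustration of the minimal cleanup requirement, nearly coincident
vertices can be merged by snapping each vertex onto an earlier one that
lies within the tolerance.  This is only a hedged sketch -- the names
and the O(n^2) scan are hypothetical, not the library's code, which
merges vertices incrementally during the sweep:

```c
#include <stddef.h>

/* Hypothetical sketch: snap each vertex onto an earlier vertex that
 * lies within the tolerance, so that later stages see exactly
 * coincident points.  O(n^2) for clarity only. */
typedef struct { double x, y; } Vertex2;

static int within_tolerance(Vertex2 a, Vertex2 b, double tol)
{
    double dx = a.x - b.x, dy = a.y - b.y;
    return dx * dx + dy * dy <= tol * tol;
}

void merge_coincident(Vertex2 *v, size_t n, double tol)
{
    for (size_t i = 1; i < n; i++) {
        for (size_t j = 0; j < i; j++) {
            if (within_tolerance(v[i], v[j], tol)) {
                v[i] = v[j];    /* snap onto the earlier vertex */
                break;
            }
        }
    }
}
```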


Phases of the algorithm
-----------------------

1. Find the polygon normal N.
2. Project the vertex data onto a plane.  It does not need to be
   perpendicular to the normal, eg. we can project onto the plane
   perpendicular to the coordinate axis whose dot product with N
   is largest.
3. Using a line-sweep algorithm, partition the plane into x-monotone
   regions.  Any vertical line intersects an x-monotone region in
   at most one interval.
4. Triangulate the x-monotone regions.
5. Group the triangles into strips and fans.


Finding the normal vector
-------------------------

A common way to find a polygon normal is to compute the signed area
when the polygon is projected along the three coordinate axes.  We
can't do this, since contours can have zero area without being
degenerate (eg. a bowtie).

We fit a plane to the vertex data, ignoring how they are connected
into contours.  Ideally this would be a least-squares fit; however for
our purpose the accuracy of the normal is not important.  Instead we
find three vertices which are widely separated, and compute the normal
to the triangle they form.  The vertices are chosen so that the
triangle has an area at least 1/sqrt(3) times the largest area of any
triangle formed using the input vertices.

The contours do affect the orientation of the normal; after computing
the normal, we check that the sum of the signed contour areas is
non-negative, and reverse the normal if necessary.
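
The idea above can be sketched as follows.  This is a simplified,
hypothetical version: it seeds the search with the two vertices most
separated along the x axis, whereas the real ComputeNormal in the
sources chooses the axis of maximum spread and handles degeneracies:

```c
#include <stddef.h>

/* Hedged sketch of "three widely separated vertices": pick the two
 * vertices most separated along x, then the vertex maximizing the
 * area of the triangle they form, and take that triangle's normal.
 * The result is unnormalized; orientation is fixed up separately
 * using the sum of signed contour areas. */
typedef struct { double x, y, z; } Vec3;

static Vec3 sub(Vec3 a, Vec3 b)
{
    Vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z };
    return r;
}

static Vec3 cross(Vec3 a, Vec3 b)
{
    Vec3 r = { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
    return r;
}

static double len2(Vec3 a) { return a.x*a.x + a.y*a.y + a.z*a.z; }

Vec3 approx_normal(const Vec3 *v, size_t n)
{
    /* 1. Two extreme vertices along x (good enough for a sketch;
     *    fails if all x coordinates coincide). */
    size_t lo = 0, hi = 0;
    for (size_t i = 1; i < n; i++) {
        if (v[i].x < v[lo].x) lo = i;
        if (v[i].x > v[hi].x) hi = i;
    }
    Vec3 d1 = sub(v[hi], v[lo]);

    /* 2. Third vertex maximizing the triangle area via |d1 x d2|. */
    Vec3 best = { 0, 0, 0 };
    double bestLen = -1.0;
    for (size_t i = 0; i < n; i++) {
        Vec3 nrm = cross(d1, sub(v[i], v[lo]));
        double l = len2(nrm);
        if (l > bestLen) { bestLen = l; best = nrm; }
    }
    return best;
}
```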


Projecting the vertices
-----------------------

We project the vertices onto a plane perpendicular to one of the three
coordinate axes.  This helps numerical accuracy by removing a
transformation step between the original input data and the data
processed by the algorithm.  The projection also compresses the input
data; the 2D distance between vertices after projection may be smaller
than the original 3D distance.  However by choosing the coordinate
axis whose dot product with the normal is greatest, the compression
factor is at most sqrt(3).
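
A sketch of this projection (names are illustrative, not the library's):

```c
#include <math.h>

/* Choose the coordinate axis whose dot product with the normal N has
 * the largest magnitude, and project by simply dropping that
 * coordinate. */
typedef struct { double x, y, z; } Vec3;
typedef struct { double u, v; } Vec2;

int dominant_axis(Vec3 n)    /* 0 = x, 1 = y, 2 = z */
{
    double ax = fabs(n.x), ay = fabs(n.y), az = fabs(n.z);
    if (ax > ay && ax > az) return 0;
    return (ay > az) ? 1 : 2;
}

Vec2 project(Vec3 p, int axis)
{
    Vec2 r;
    switch (axis) {
    case 0:  r.u = p.y; r.v = p.z; break;   /* drop x */
    case 1:  r.u = p.z; r.v = p.x; break;   /* drop y */
    default: r.u = p.x; r.v = p.y; break;   /* drop z */
    }
    return r;
}
```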

Even though the *accuracy* of the normal is not that important (since
we are projecting perpendicular to a coordinate axis anyway), the
*robustness* of the computation is important.  For example, if there
are many vertices which lie almost along a line, and one vertex V
which is well-separated from the line, then our normal computation
should involve V; otherwise the results will be garbage.

The advantage of projecting perpendicular to the polygon normal is
that computed intersection points will be as close as possible to
their ideal locations.  To get this behavior, define TRUE_PROJECT.


The Line Sweep
--------------

There are three data structures: the mesh, the event queue, and the
edge dictionary.

The mesh is a "quad-edge" data structure which records the topology of
the current decomposition; for details see the include file "mesh.h".

The event queue simply holds all vertices (both original and computed
ones), organized so that we can quickly extract the vertex with the
minimum x-coord (and among those, the one with the minimum y-coord).
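
The queue's ordering is just a lexicographic comparison on (x, y); a
sketch with an illustrative vertex struct:

```c
/* Lexicographic sweep order: smaller x first, ties broken by y.
 * This mirrors the VertLeq comparison used in the sources; the
 * struct itself is illustrative. */
typedef struct { double x, y; } SweepVertex;

int VertLeq(SweepVertex a, SweepVertex b)
{
    return (a.x < b.x) || (a.x == b.x && a.y <= b.y);
}
```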

The edge dictionary describes the current intersection of the sweep
line with the regions of the polygon.  This is just an ordering of the
edges which intersect the sweep line, sorted by their current order of
intersection.  For each pair of edges, we store some information about
the monotone region between them -- these are called "active regions"
(since they are crossed by the current sweep line).

The basic algorithm is to sweep from left to right, processing each
vertex.  The processed portion of the mesh (left of the sweep line) is
a planar decomposition.  As we cross each vertex, we update the mesh
and the edge dictionary, then we check any newly adjacent pairs of
edges to see if they intersect.

A vertex can have any number of edges.  Vertices with many edges can
be created as vertices are merged and intersection points are
computed.  For unprocessed vertices (right of the sweep line), these
edges are in no particular order around the vertex; for processed
vertices, the topological ordering should match the geometric ordering.

The vertex processing happens in two phases: first we process all the
left-going edges (all these edges are currently in the edge
dictionary).  This involves:

 - deleting the left-going edges from the dictionary;
 - relinking the mesh if necessary, so that the order of these edges around
   the event vertex matches the order in the dictionary;
 - marking any terminated regions (regions which lie between two left-going
   edges) as either "inside" or "outside" according to their winding number.
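
Classifying a terminated region by its winding number depends on the
winding rule in effect.  A sketch (the rule names parallel the
GLU_TESS_WINDING_* constants; the helper itself is illustrative):

```c
/* Decide whether a region with winding number n counts as "inside"
 * under each of the standard winding rules. */
enum WindingRule {
    WINDING_ODD,
    WINDING_NONZERO,
    WINDING_POSITIVE,
    WINDING_NEGATIVE,
    WINDING_ABS_GEQ_TWO
};

int IsWindingInside(enum WindingRule rule, int n)
{
    switch (rule) {
    case WINDING_ODD:         return n & 1;
    case WINDING_NONZERO:     return n != 0;
    case WINDING_POSITIVE:    return n > 0;
    case WINDING_NEGATIVE:    return n < 0;
    case WINDING_ABS_GEQ_TWO: return n >= 2 || n <= -2;
    }
    return 0;
}
```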

When there are no left-going edges, and the event vertex is in an
"interior" region, we need to add an edge (to split the region into
monotone pieces).  To do this we simply join the event vertex to the
rightmost left endpoint of the upper or lower edge of the containing
region.

Then we process the right-going edges.  This involves:

 - inserting the edges in the edge dictionary;
 - computing the winding number of any newly created active regions.
   We can compute this incrementally using the winding of each edge
   that we cross as we walk through the dictionary.
 - relinking the mesh if necessary, so that the order of these edges around
   the event vertex matches the order in the dictionary;
 - checking any newly adjacent edges for intersection and/or merging.
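
The incremental winding computation amounts to a running sum: each
dictionary edge contributes its winding (+1 or -1 depending on the
direction of its contour), and the winding number of the region above
an edge is the winding below it plus that contribution.  A hedged
sketch with illustrative names:

```c
#include <stddef.h>

/* Given the winding contributions of the edges currently crossing the
 * sweep line, listed from bottom to top, compute the winding number of
 * each active region as a prefix sum.  region_winding must have room
 * for n_edges + 1 entries. */
void compute_region_windings(const int *edge_winding, size_t n_edges,
                             int *region_winding)
{
    int w = 0;
    region_winding[0] = 0;              /* below the lowest edge */
    for (size_t i = 0; i < n_edges; i++) {
        w += edge_winding[i];
        region_winding[i + 1] = w;      /* region just above edge i */
    }
}
```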

If there are no right-going edges, again we need to add one to split
the containing region into monotone pieces.  In our case it is most
convenient to add an edge to the leftmost right endpoint of either
containing edge; however we may need to change this later (see the
code for details).


Invariants
----------

These are the most important invariants maintained during the sweep.
We define a function VertLeq(v1,v2) which defines the order in which
vertices cross the sweep line, and a function EdgeLeq(e1,e2; loc)
which says whether e1 is below e2 at the sweep event location "loc".
This function is defined only at sweep event locations which lie
between the rightmost left endpoint of {e1,e2}, and the leftmost right
endpoint of {e1,e2}.
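
Geometrically, EdgeLeq asks which edge is lower where the sweep line
currently stands.  A naive sketch is to interpolate each edge's y at
the event's x coordinate; the real EdgeLeq in the sources is written
far more carefully for numerical robustness, so treat this only as a
statement of intent:

```c
/* Naive EdgeLeq sketch.  Following the dictionary invariants, Dst is
 * the processed (left) endpoint and Org the unprocessed (right) one.
 * All names are illustrative. */
typedef struct { double x, y; } Pt;
typedef struct { Pt org, dst; } Edge;

static double eval_y(Edge e, double x)
{
    double x0 = e.dst.x, y0 = e.dst.y;  /* left endpoint */
    double x1 = e.org.x, y1 = e.org.y;  /* right endpoint */
    if (x1 == x0) return y0;            /* degenerate: vertical edge */
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0);
}

/* Is e1 below-or-equal e2 at the sweep event location loc? */
int EdgeLeq(Edge e1, Edge e2, Pt loc)
{
    return eval_y(e1, loc.x) <= eval_y(e2, loc.x);
}
```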

Invariants for the Edge Dictionary.

 - Each pair of adjacent edges e2=Succ(e1) satisfies EdgeLeq(e1,e2)
   at any valid location of the sweep event.
 - If EdgeLeq(e2,e1) as well (at any valid sweep event), then e1 and e2
   share a common endpoint.
 - For each e in the dictionary, e->Dst has been processed but not e->Org.
 - Each edge e satisfies VertLeq(e->Dst,event) && VertLeq(event,e->Org)
   where "event" is the current sweep line event.
 - No edge e has zero length.
 - No two edges have identical left and right endpoints.

Invariants for the Mesh (the processed portion).

 - The portion of the mesh left of the sweep line is a planar graph,
   ie. there is *some* way to embed it in the plane.
 - No processed edge has zero length.
 - No two processed vertices have identical coordinates.
 - Each "inside" region is monotone, ie. can be broken into two chains
   of monotonically increasing vertices according to VertLeq(v1,v2)
   - a non-invariant: these chains may intersect (slightly) due to
     numerical errors, but this does not affect the algorithm's operation.

Invariants for the Sweep.

 - If a vertex has any left-going edges, then these must be in the edge
   dictionary at the time the vertex is processed.
 - If an edge is marked "fixUpperEdge" (it is a temporary edge introduced
   by ConnectRightVertex), then it is the only right-going edge from
   its associated vertex.  (This says that these edges exist only
   when it is necessary.)


Robustness
----------

The key to the robustness of the algorithm is maintaining the
invariants above, especially the correct ordering of the edge
dictionary.  We achieve this by:

  1. Writing the numerical computations for maximum precision rather
     than maximum speed.

  2. Making no assumptions at all about the results of the edge
     intersection calculations -- for sufficiently degenerate inputs,
     the computed location is not much better than a random number.

  3. Restoring the invariants, when numerical errors violate them,
     by making *topological* changes where necessary (ie. relinking
     the mesh structure).


Triangulation and Grouping
--------------------------

We finish the line sweep before doing any triangulation.  This is
because even after a monotone region is complete, there can be further
changes to its vertex data because of further vertex merging.

After triangulating all monotone regions, we want to group the
triangles into fans and strips.  We do this using a greedy approach.
The triangulation itself is not optimized to reduce the number of
primitives; we just try to get a reasonable decomposition of the
computed triangulation.
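
To see what grouping buys, note that a fan (or strip) of n triangles
needs only n+2 vertices instead of 3n.  A hypothetical helper that
expands a fan's vertex indices back into independent triangles makes
the sharing explicit (the tessellator itself emits primitives through
callbacks rather than index arrays):

```c
#include <stddef.h>

/* Expand a triangle fan, given as n_verts vertex indices, into
 * independent triangles.  tris must have room for 3*(n_verts-2)
 * entries; returns the number of triangles written. */
size_t expand_fan(const int *fan, size_t n_verts, int *tris)
{
    size_t t = 0;
    for (size_t i = 1; i + 1 < n_verts; i++) {
        tris[3*t + 0] = fan[0];        /* shared center vertex */
        tris[3*t + 1] = fan[i];
        tris[3*t + 2] = fan[i + 1];
        t++;
    }
    return t;   /* n_verts - 2 triangles */
}
```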
229