EUROGRAPHICS 2001 / A. Chalmers and T.-M. Rhyne
Volume 20 (2001), Number 3
(Guest Editors)
Real-Time Cloud Rendering
Mark J. Harris and Anselmo Lastra
Department of Computer Science, University of North Carolina, Chapel Hill, North Carolina, USA
{harrism, lastra}@cs.unc.edu
Abstract
This paper presents a method for realistic real-time rendering of clouds suitable for flight simulation and games. It
provides a cloud shading algorithm that approximates multiple forward scattering in a preprocess, and first order
anisotropic scattering at runtime. Impostors are used to accelerate cloud rendering by exploiting frame-to-frame
coherence in an interactive flight simulation. Impostors are shown to be particularly well suited to clouds, even in
circumstances under which they cannot be applied to the rendering of polygonal geometry. The method allows
hundreds of clouds and hundreds of thousands of particles to be rendered at high frame rates, and improves
interaction with clouds by reducing artifacts introduced by direct particle rendering techniques.
1. Introduction
“Beautiful day, isn’t it?”
“Yep. Not a cloud in the sky!”
Uncommon occurrences such as cloudless skies often elicit
such common figures of speech. Clouds are such an integral
feature of our skies that their absence from any synthetic
outdoor scene can detract from its realism. Unfortunately,
interactive applications such as flight simulators often suffer
from cloudless skies. Designers of these applications have
relied upon similar techniques to those used by renaissance
painters in ceiling frescos: distant and high-flying clouds are
represented by paintings on an always-distant sky dome. In
addition, clouds in flight simulators and games have been
hinted at with planar textures – both static and animated – or
with semi-transparent textured objects and fogging effects.
There are many desirable effects associated with clouds
that are not achievable with such techniques. In an
interactive flight simulation, we would like to fly in and
around realistic, volumetric clouds, and to see other flying
objects pass within and behind them. Current real-time
techniques have not provided users with such experiences.
This paper describes a system for real-time cloud rendering
that is appropriate for games and flight simulators.
Figure 1: Realistic clouds in the game “Ozzy’s Black Skies”.
In this paper we focus on high-speed, high-quality rendering of constant-shape clouds for games and flight simulators. These systems are already computationally and graphically loaded, so cloud rendering must be very fast. For this reason, we render realistically shaded static clouds, and do not address issues of dynamic cloud simulation. This choice enables us to generate clouds ahead of time, and to assume that cloud particles are static relative to each other. This assumption speeds the rendering of the clouds because we need only shade them once per scene in a preprocess.
The rest of this section presents previous work. Section 2 gives a derivation and description of our shading algorithm. Section 3 discusses dynamically generated impostors and shows how we use them to accelerate cloud rendering. We also discuss how we have dealt with issues in interacting with clouds. Section 4 discusses our results and presents performance measurements. We conclude and discuss ideas for future research in section 5.
© The Eurographics Association and Blackwell Publishers 2001. Published by Blackwell
Publishers, 108 Cowley Road, Oxford OX4 1JF, UK and 350 Main Street, Malden, MA
02148, USA.
Harris and Lastra / Real-Time Cloud Rendering
1.1 Previous Work
We segment previous work related to cloud rendering into two areas: cloud modeling and cloud rendering. Cloud modeling deals with the data used to represent clouds in the computer, and how the data are generated and organized. We build our clouds with particle systems. Reeves introduced particle systems as an approach to modeling clouds and other such “fuzzy” phenomena in [Reeves1983]. Voxels are another common representation for clouds. Voxel models provide a uniform sampling of the volume, and can be rendered with both forward and backward methods. Procedural solid noise techniques are also important to cloud modeling as a way to generate random but continuous density data to fill cloud volumes [Lewis1989, Perlin1985, Ebert1998]. [Kajiya1984] introduced the simulation of cloud formation via approximations of real physical processes, and [Dobashi2000] used a physical approximation simulated with cellular automata. Another important representation for clouds has been metaballs, or “blobs” [Stam1991, Stam1995, Dobashi2000].
Much previous work has been done in non-interactive rendering techniques for clouds. Rendering clouds is difficult because realistic shading requires the integration of the effects of optical properties along paths through the cloud volume, while incorporating the complex scattering within the medium. Previous work has attempted to approximate the physical characteristics of clouds at various levels of accuracy and complexity, and then to use these approximate models to render images of clouds. Blinn introduced the use of density models for image synthesis in [Blinn1982], where he presented a low albedo, single scattering approximation for illumination in a uniform medium. Kajiya and Von Herzen extended this work with methods to ray trace volume data exhibiting both single and multiple scattering [Kajiya1984]. Gardner represented clouds with textured ellipsoids, and recently [Elinas2001] extended Gardner’s method to render clouds made of multiple ellipsoids in real time. Max provided an excellent survey in which he summarized the spectrum of optical models used in volume rendering and derived their integral equations from physical models [Max1995]. Ebert has done much work in modeling “solid spaces”, including offline computation of realistic images of smoke, steam, and clouds [Ebert1990, Ebert1997]. Stam simulated light transport in gases using diffusion processes [Stam1995]. Nishita et al. introduced approximations and rendering techniques for global illumination of clouds accounting for multiple anisotropic scattering and skylight [Nishita1996].
Our rendering approach draws most directly from recent work by Dobashi et al., which presents both an efficient simulation method for clouds and a hardware-accelerated rendering technique [Dobashi2000]. The shading method presented by Dobashi et al. implements an isotropic single scattering approximation. We extend this method with an approximation to multiple forward scattering and anisotropic first order scattering. The animated cloud scenes of Dobashi et al. required 20-30 seconds rendering time per frame. Our system renders static cloudy scenes at tens to hundreds of frames per second, depending on scene complexity.
Figure 2: A view of an interactive flight through clouds.
2. Shading and Rendering
Particle systems are a simple and efficient method for
representing and rendering clouds. Our cloud model assumes
that a particle represents a roughly spherical volume in
which a Gaussian distribution governs the density falloff
from the center of the particle. Each particle is made up of a
center, radius, density, and color. We get good
approximations of real clouds by filling space with particles
of varying size and density. Clouds in our system can be
built by filling a volume with particles, or by using an
editing application that allows a user to place particles and
build clouds interactively. The randomized method is a good
way to get a quick field of clouds, but we intend our clouds
for interactive games with levels designed and built by
artists. Providing an artist with an editor allows the artist to
produce beautiful clouds tailored to the needs of the game.
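The randomized volume-filling approach above can be sketched in a few lines. The following is a minimal illustration only, not the paper's implementation; the ellipsoid bounds, the edge falloff rule, and the particle fields are assumptions:

```python
import random
import math

def fill_cloud_volume(cx, cy, cz, rx, ry, rz, n_particles, seed=0):
    """Fill an ellipsoidal volume with randomly placed cloud particles.

    Each particle carries a center, radius, and density, as in the
    particle model described above. All names here are illustrative.
    """
    rng = random.Random(seed)
    particles = []
    while len(particles) < n_particles:
        # Rejection-sample a point inside the unit sphere...
        x, y, z = (rng.uniform(-1.0, 1.0) for _ in range(3))
        d2 = x * x + y * y + z * z
        if d2 > 1.0:
            continue
        # ...then scale it to the ellipsoid. Density falls off toward
        # the boundary so the cloud edge looks wispy (an assumed rule).
        falloff = 1.0 - math.sqrt(d2)
        particles.append({
            "center": (cx + x * rx, cy + y * ry, cz + z * rz),
            "radius": rng.uniform(0.5, 1.5),
            "density": falloff * rng.uniform(0.5, 1.0),
        })
    return particles

cloud = fill_cloud_volume(0, 0, 0, 40, 10, 40, 500)
```

An interactive editor could start from such a field and let the artist move, add, and delete individual particles.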
We render particles using splatting [Westover1991], by
drawing screen-oriented polygons texture-mapped with a
Gaussian density function. Although we chose a particle
system representation for our clouds, it is important to note
that both our shading algorithm and our fast rendering
system are independent of the cloud representation, and can
be
used
with
any
model
composed
of
discrete
density
samples in space.
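The Gaussian splat texture used for those screen-oriented polygons can be precomputed once on the CPU. A minimal sketch; the texture size and falloff constant are arbitrary choices for illustration, not values from the paper:

```python
import math

def make_splat_texture(size=32, sigma=0.3):
    """Build a size x size alpha texture holding a radial Gaussian
    falloff, for texture-mapping onto screen-oriented particle quads.
    Texel coordinates are mapped to [-1, 1]; sigma is an assumed
    falloff constant."""
    tex = []
    for j in range(size):
        row = []
        for i in range(size):
            # Center of texel (i, j) in [-1, 1] x [-1, 1].
            x = 2.0 * (i + 0.5) / size - 1.0
            y = 2.0 * (j + 0.5) / size - 1.0
            r2 = x * x + y * y
            # Gaussian density falloff from the particle center,
            # clamped to zero outside the unit circle so the quad's
            # corners stay fully transparent.
            a = math.exp(-r2 / (2.0 * sigma * sigma)) if r2 <= 1.0 else 0.0
            row.append(a)
        tex.append(row)
    return tex

tex = make_splat_texture()
```

The same texture is reused for every particle; per-particle color and alpha modulate it at render time.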
2.1 Light Scattering Illumination
Scattering illumination models simulate the emission and
absorption of light by a medium as well as scattering through
the medium.
Single scattering
models simulate scattering
through the medium in a single direction. This direction is
usually the direction leading to the point of view.
Multiple
scattering
models are more physically accurate, but must
account for scattering in all directions (or a sampling of all
directions), and therefore are much more complicated and
expensive to evaluate. The rendering algorithm presented by
Dobashi et al. computes an approximation of illumination of
clouds with single scattering.
This approximation has been
used previously to render clouds and other participating
media [Blinn1982, Kajiya1984].
In a multiple scattering simulation that samples
N
directions on the sphere, each additional order of scattering
that is simulated multiplies the number of simulated paths by
N
. Fortunately, as demonstrated by [Nishita1996], the
contribution of most of these paths is insignificant. Nishita
et al.
found that scattering illumination is dominated by the
first and second orders, and therefore they only simulated up
to the 4th order. They reduced the directions sampled in their
evaluation of scattering to sub-spaces of high contribution,
which are composed mostly of directions near the direction
of forward scattering and those directed at the viewer. We
simplify further, and approximate multiple scattering only in
the light direction – or
multiple forward scattering –
and
anisotropic single scattering in the eye direction.
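The path explosion that motivates this simplification is easy to make concrete: with N sampled directions, k orders of scattering require N^k simulated paths. A toy calculation (the value of N is an arbitrary choice for illustration only):

```python
# Path count for a multiple scattering simulation that samples N
# directions on the sphere: each additional order of scattering
# multiplies the number of simulated paths by N.
N = 64  # assumed sampling resolution, for illustration

def path_count(order, n_directions=N):
    """Number of paths simulated when tracking `order` scattering events."""
    return n_directions ** order

for order in range(1, 5):
    print(order, path_count(order))
```

Even at this modest resolution, four orders of scattering already imply millions of paths, which is why restricting the evaluation to the forward and eye directions pays off.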
Our cloud rendering method is a two-pass algorithm
similar to the one presented in [Dobashi2000]: we
precompute cloud shading in the first pass, and use this
shading to render the clouds in the second pass. The
algorithm of Dobashi et al., however, uses only an isotropic first order scattering approximation. If realistic values are used for the extinction and albedo of clouds shaded with only a first order scattering approximation, the clouds appear very dark [Max1995]. This is because much of the illumination in a cloud is a result of light scattered forward along the light
direction. Figures 8 and 9 show the difference in appearance between clouds shaded with and without our multiple forward scattering approximation.
2.1.1 Multiple Forward Scattering
The first pass of our shading algorithm computes the amount of incident light at each position P from direction l. This light, I(P, l), is composed of all direct light from direction l that is not absorbed by intervening particles, plus light scattered to P from other particles. The multiple scattering model is written

  I(P, ω) = I₀(ω)·exp(−∫₀^{D_P} τ(t) dt) + ∫₀^{D_P} g(s, ω)·exp(−∫_s^{D_P} τ(t) dt) ds,    (1)

where I₀(ω) is the intensity of light in direction ω outside the cloud, τ(t) is the extinction coefficient (in units of 1 / length) of the cloud at depth t, D_P is the depth of P in the cloud along the light direction, and

  g(x, ω) = ∫_{4π} r(x, ω, ω′)·I(x, ω′) dω′    (2)

represents the light from all directions ω′ scattered into direction ω at the point x. Here r(x, ω, ω′) is the bi-directional scattering distribution function (BSDF), and determines the percentage of light incident on x from direction ω′ that is scattered in direction ω. It expands to r(x, ω, ω′) = a(x)·τ(x)·p(ω, ω′), where a(x) is the albedo of the medium at x, and p(ω, ω′) is the phase function.
A full multiple scattering algorithm must compute this quantity for a sampling of all light flow directions. We simplify our approximation to compute only multiple forward scattering in the light direction, so ω = l, and ω′ = −l. To do so, we approximate the integration of (2) over only a small solid angle γ around the forward direction. Because this solid angle is small, we also assume r and I to be constant over γ. Thus, (2) reduces to g(x, l) = r(x, l, −l)·I(x, −l)·γ/4π.
We split the light path from 0 to D_P into discrete segments s_j, for j from 1 to N, where N is the number of cloud particles along the light direction from 0 to D_P. By approximating the integrals with Riemann Sums, we have

  I_P = I₀·exp(−Σ_{k=1}^{N} τ_k) + Σ_{j=1}^{N} g_j·exp(−Σ_{k=j+1}^{N} τ_k).    (3)

I₀ is the intensity of light incident on the edge of the cloud. In discrete form, g(x, l) becomes g_k = a_k·τ_k·p(l, −l)·I_k·γ/4π. We assume that albedo and optical depth are represented at discrete samples (particles) along the path of light. In order to easily transform (3) into an algorithm that can be implemented in graphics hardware, we cast it as a recurrence relation:

  I_k = { g_{k−1} + T_{k−1}·I_{k−1},  2 ≤ k ≤ N;   I₀,  k = 1 }.    (4)

If we let T_k = exp(−τ_k) be the transparency of particle p_k, then (4) expands to (3). This representation can be more intuitively understood. It simply says that, starting outside the cloud, as we trace along the light direction the light incident on any particle p_k is equal to the intensity of light scattered to p_k from p_{k−1} plus the intensity transmitted through p_{k−1} (as determined by its transparency, T_{k−1}). Notice that if g_k is expanded in (4), then I_{k−1} is a factor in both terms. Section 2.2 explains how we combine frame buffer read back with hardware blending to evaluate this recurrence.
2.1.2 Eye Scattering
In addition to approximating multiple forward scattering, we also implement single scattering toward the viewer as in [Dobashi2000]. The recurrence for this is subtly different:

  E_k = S_k + T_k·E_{k−1},  1 ≤ k ≤ N.    (5)

This says that the light, E_k, exiting any particle p_k is equal to the light incident on it that it does not absorb, T_k·E_{k−1}, plus the light that it scatters, S_k. In the first pass, we were computing the light I_k incident on each particle from the light source. Now, we are interested in the portion of this light that is scattered toward the viewer. When S_k is replaced by a_k·τ_k·p(ω, −l)·I_k/4π, where ω is the view direction and T_k is as above, this recurrence approximates single scattering toward the viewer. It is important to mention that (5) computes light emitted from particles using results (I_k) computed in (4). Since illumination is multiplied by the phase function in both recurrences, one might think that the phase function is multiplied twice for the same light. This is not the case, since in (4) I_{k−1} is multiplied by the phase function to determine the amount of light p_{k−1} scatters to p_k in the light direction, and in (5) I_k is multiplied by the phase function to determine the amount of light that p_k scatters in the view direction. Even if the viewpoint is directly opposite the light source, since the light incident on p_k is stored and used in the scattering computation, the phase function is never taken into account twice at the same particle.
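Both recurrences can be checked with a small CPU-side sketch before mapping them to graphics hardware. This is an illustration only, assuming unit incident light and folding the phase-function and solid-angle factors into per-path constants; none of the names or values come from the paper's implementation:

```python
import math

def shade_particles(taus, albedos, forward, view, I0=1.0):
    """Evaluate recurrences (4) and (5) along a single light path.

    taus[k] and albedos[k] are the optical depth and albedo of
    particle k. `forward` stands in for p(l, -l) * gamma / (4*pi) and
    `view` for p(omega, -l) / (4*pi); treating them as constants over
    the path is an assumption of this sketch. Returns (I, E): light
    incident on each particle from the source, and light each
    particle sends toward the viewer. E_0 = 0, i.e. light entering
    the cloud from behind is ignored here.
    """
    n = len(taus)
    I = [0.0] * n
    E = [0.0] * n
    for k in range(n):
        if k == 0:
            I[k] = I0                                    # (4), k = 1 case
        else:
            T_prev = math.exp(-taus[k - 1])              # T_{k-1}
            g_prev = albedos[k - 1] * taus[k - 1] * forward * I[k - 1]
            I[k] = g_prev + T_prev * I[k - 1]            # recurrence (4)
        T_k = math.exp(-taus[k])
        S_k = albedos[k] * taus[k] * view * I[k]         # scattering term of (5)
        E[k] = S_k + T_k * (E[k - 1] if k > 0 else 0.0)  # recurrence (5)
    return I, E

I, E = shade_particles([0.5] * 4, [0.9] * 4, forward=0.8, view=0.06)
```

Evaluating (4) front-to-back from the light and (5) back-to-front from the eye is exactly what the blending described in section 2.2 maps onto the frame buffer.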
2.1.3 Phase Function
The phase function p(ω, ω′) mentioned above is very important to cloud shading. Clouds exhibit anisotropic scattering of light (including the strong forward scattering that we assume in our multiple forward scattering approximation). The phase function determines the distribution of scattering for a given incident light direction. Phase functions are discussed in detail in [Nishita1996], [Max1995], and [Blinn1982], among others. The images shown in this paper were generated using a simple Rayleigh scattering phase function, p(θ) = 3/4·(1 + cos²θ), where θ is the angle between the incident and scattered directions. Rayleigh scattering favors scattering in the forward and backward directions, but occurs in nature only for aerosols of very small particles. We chose it for its simplicity and good results, and plan to try more physically based functions. Figures 10 and 11 demonstrate the differences between clouds shaded with and without anisotropic scattering. Anisotropic scattering gives the clouds their characteristic “silver lining” when viewed looking into the sun.
2.2 Rendering Algorithm
Armed with recurrences (4) and (5) and a standard graphics API such as OpenGL or Direct3D, computation of cloud illumination is straightforward. Our algorithm is similar to the one presented by [Dobashi2000] and has two phases: a shading phase that runs once per scene and a rendering phase that runs in real time. The key to the implementation is the use of hardware blending and pixel read back.
Blending operates by computing a weighted average of the frame buffer contents (the destination) and an incoming fragment (the source), and storing the result back in the frame buffer. This weighted average can be written

  c_result = f_src·c_src + f_dest·c_dest.    (6)

If we let c_result = I_k, f_src = 1, c_src = g_{k−1}, f_dest = T_{k−1}, and c_dest = I_{k−1}, then we see that (4) and (6) are equivalent if the contents of the frame buffer before blending represent I₀. This is not quite enough, though, since as we saw before, I_{k−1} is a factor of both terms in (4). To solve the recurrence for a particle p_k, we must know how much light is incident on particle p_{k−1} beforehand. To do this, we employ pixel read back.
To compute (4) and (5), we use the procedure described by the pseudocode in Figure 3. The pseudocode shows that we use a nearly identical algorithm for preprocess and runtime. The differences are as follows. In the illumination pass, the frame buffer is cleared to white and particles are sorted with respect to the light. As a particle is blended into the frame buffer, the transparency of the particle modulates the color and adds an amount proportional to the forward scattering. Immediately before it is rendered, the light incident on p_k from a small solid angle γ is found by reading back the color of a small square of pixels centered on the projection of p_k. The number of pixels needed to sample γ is computed using the distance of p_k from the projection plane of the camera. I_k is computed by multiplying the average of these pixels by the light color; this average is then multiplied by r(x, l, −l) as in section 2.1.1.

  source_blend_factor = 1;
  dest_blend_factor = 1 - src_alpha;
  texture_mode = modulate;
  l = direction from light;
  γ = solid angle of pixels to read;
  if (preprocess) then
    ω = -l;
    view cloud from light source;
    clear frame buffer to white;
    particles.Sort(<, dist. to light);
  else
    view cloud from eye position;
    particles.Sort(>, dist. from eye);
  endif
  foreach particle p_k
    [p_k has extinction τ_k, albedo a_k, radius r_k, color, and alpha]
    if (preprocess) then
      Compute # pixels n_p needed to cover solid angle γ;
      Read n_p pixels in square around projected center of p_k;
      i_k = average intensity of n_p pixels;
      i_k *= light_color;
      p_k.color = a_k * τ_k * i_k * γ/4π;
      p_k.alpha = 1 - exp(-τ_k);
    else
      ω = p_k.position - view_position;
    endif
    c = p_k.color * phase(ω, l);
    render p_k with color c, side 2*r_k;
  endfor

[Note: Sort(<, distance from x) means sort in ascending order by distance from x, and > means sort in descending order. Pixel values are assumed to lie in [0,1] here.]
Figure 3: Pseudocode for cloud shading and rendering.
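The equivalence between blend equation (6) and recurrence (4) claimed in section 2.2 can be checked numerically by simulating the frame buffer on the CPU. This is a sketch, not the paper's code; `illuminate_via_blending` is a hypothetical helper, and the particle values and scattering factor are arbitrary:

```python
import math

def blend(c_src, f_src, c_dest, f_dest):
    """Frame-buffer blend of equation (6):
    c_result = f_src * c_src + f_dest * c_dest."""
    return f_src * c_src + f_dest * c_dest

def illuminate_via_blending(taus, albedos, g_factor, I0=1.0):
    """Evaluate recurrence (4) by repeated blending, mimicking the
    preprocess pass: the 'frame buffer' starts at I0 (cleared to
    white for unit light), and each particle k blends in g_k with
    f_src = 1 and f_dest = T_k = exp(-tau_k). g_factor stands in for
    p(l, -l) * gamma / (4*pi), an assumed constant."""
    buffer = I0                      # frame buffer after clearing
    incident = []                    # I_k obtained by pixel read back
    for k in range(len(taus)):
        incident.append(buffer)      # read back I_k before rendering p_k
        g_k = albedos[k] * taus[k] * g_factor * buffer
        T_k = math.exp(-taus[k])
        buffer = blend(g_k, 1.0, buffer, T_k)  # (6) with the mapping above
    return incident

I = illuminate_via_blending([0.4, 0.6, 0.8], [0.9, 0.9, 0.9], g_factor=0.1)
```

Unrolling the loop reproduces recurrence (4) term by term, which is why a single blended rendering pass per particle suffices in hardware.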