Raster Driven Digital Cloud Model
Many more raster datasets are publicly available on the Internet; one of them is the NOAA satellite archive, whose images can be downloaded at different wavelengths and time intervals. The Infra-Red (IR) images can be used in the same way as the normalized heights and positions of the trees created in the previous part; the only difference is that the distributed objects are not trees but cloud particles. This raster dataset does not provide extended information about the elevation of a cloud layer, its density, or the cloud types. It is, however, sufficient to display weather conditions that closely represent the real state in the area of interest.
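As an illustration, a cloud-placement mask can be derived from such an IR raster with a few lines of raster processing. The sketch below assumes the IR channel is available as a single-band georeferenced file; the file name, band index, and brightness threshold are illustrative, and rasterio is used here only as one possible reading library, not as part of the original workflow.

    # Minimal sketch: derive a cloud-placement mask from a single-band IR raster.
    # The file name, band index, and brightness threshold are illustrative only.
    import numpy as np
    import rasterio  # assumed available for reading georeferenced rasters

    with rasterio.open("noaa_ir_channel.tif") as src:
        ir = src.read(1).astype(np.float32)   # IR brightness values (8-bit in this workflow)
        transform = src.transform             # pixel -> world coordinates (WGS 1984 assumed)

    # Normalize to 0..1 and keep only pixels bright enough to be treated as cloud.
    ir_norm = (ir - ir.min()) / max(ir.max() - ir.min(), 1e-6)
    cloud_mask = ir_norm > 0.55               # illustrative threshold

    # World coordinates of every "cloudy" pixel; these drive particle placement later.
    rows, cols = np.nonzero(cloud_mask)
    xs, ys = rasterio.transform.xy(transform, rows, cols)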
First, a 3D projected texture map is created. The same texture is then used as a master object that controls the distribution of basic 3D primitives (boxes) serving as proxy shapes, based on the raster value. This extruded structure acts as a volume for the cloud particles. The authors use the built-in Particle Flow module of 3D Studio Max 2012, but any comparable alternative can be used in a similar way. In the Particle Flow module, each particle is positioned inside the chosen object and its normal vector is oriented directly toward the spectator's camera, which allows correct redirection of the Sun rays. The particle shape is set to a simple one-polygon plane with a procedural material composed of a black-and-white mask and a gradient-built smoke structure, which gives a good impression of photorealistic clouds. The test object used to generate the particles is made from a deformed sphere. This system can be implemented alongside the created vegetation system, the 3D terrain, and partial outputs of other branches (architecture, engineering). The final composition, which includes the cloud model, is shown in the following section.
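To make the raster-to-proxy step concrete, the following sketch shows how masked pixel centres and their normalized IR values could be turned into extruded proxy volumes with particle budgets. The class and parameter names are hypothetical; the actual particle emission and the camera-facing billboards are configured inside the 3ds Max Particle Flow module as described above.

    # Sketch of the raster-driven proxy extrusion: each cloudy pixel becomes a box
    # whose height scales with the normalized IR value. Names and defaults are
    # illustrative; the particle emission itself is set up in 3ds Max Particle Flow.
    from dataclasses import dataclass

    @dataclass
    class ProxyBox:
        x: float          # world X of the pixel centre
        y: float          # world Y of the pixel centre
        base_z: float     # assumed base altitude of the cloud layer
        height: float     # extrusion controlled by the raster value
        particles: int    # particle budget handed to the emitter

    def build_proxies(xs, ys, values, base_z=2000.0, max_height=1500.0, max_particles=200):
        """Turn masked pixel centres and their normalized IR values into proxy volumes."""
        proxies = []
        for x, y, v in zip(xs, ys, values):
            proxies.append(ProxyBox(
                x=x, y=y, base_z=base_z,
                height=v * max_height,
                particles=int(v * max_particles),
            ))
        return proxies

    # Example usage with the mask from the previous sketch:
    # proxies = build_proxies(xs, ys, ir_norm[rows, cols])

Each resulting proxy would then be instanced as a box in the scene and filled by the particle emitter with one-polygon planes carrying the smoke material.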
Final Composition and Outputs
The proposed solution describes how to create a photorealistic 3D polygonal vegetation model from raw LIDAR point clouds; the same procedure is then demonstrated for the creation of a digital cloud model. Both processes use raster-encoded values to control the object properties (namely height) and object placement within the given area of interest. The next picture (see Fig. 2) shows the resulting photorealistic outputs. All outputs are completely Computer Generated Imagery (CGI) taken from random cameras placed in the 3D model. The model is set up in a parametric and procedural way, which means it can be changed immediately by adjusting a few values. All objects preserve the WGS 1984 coordinate system, and their properties stay close to the real-world values acquired during the scan period (within 8-bit limits). The leading rasters can be recreated at any time in the future, which automatically affects the redistribution of the objects.
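The parametric nature of the setup can be summarized by the remapping of 8-bit raster values into real-world ranges; changing the range constants regenerates the whole distribution. The function and constants below are illustrative assumptions, not values taken from the original workflow.

    # Sketch of the parametric remapping: 8-bit raster values are rescaled to
    # real-world ranges, so changing the min/max constants regenerates the model.
    def remap_8bit(value, out_min, out_max):
        """Map a 0-255 raster value linearly into a real-world range."""
        return out_min + (value / 255.0) * (out_max - out_min)

    # Example: tree heights and cloud-layer altitudes driven by the same 8-bit encoding.
    tree_height_m = remap_8bit(180, out_min=2.0,    out_max=35.0)    # ~25.3 m
    cloud_base_m  = remap_8bit(200, out_min=1500.0, out_max=4000.0)  # ~3460.8 m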