Objects of this class have a method called from_pydata() that enables you to pass Python-formatted data to the object. This is how you pass the vertex, edge, and face information to your mesh datablock object:
mesh.from_pydata(verts, edges, faces)
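To make the expected format concrete, here is a small hypothetical example of the three arguments, using a triangle rather than a single vertex: verts is a list of (x, y, z) tuples, while edges and faces are lists of tuples of vertex indices. An empty edges list is fine when faces are supplied, because Blender derives the edges from the faces.

```python
# Hypothetical from_pydata() arguments describing one triangle.
verts = [(0.0, 0.0, 0.0),   # vertex 0
         (1.0, 0.0, 0.0),   # vertex 1
         (0.0, 1.0, 0.0)]   # vertex 2
edges = []                  # empty; edges are derived from the face below
faces = [(0, 1, 2)]         # one face joining vertices 0, 1, and 2
```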
5. Now, create a Python object to represent the Blender 3D mesh object. Again, use a new() method.
In this case, the arguments represent the name of the new 3D object and its associated datablock. This is
done like this:
obj = bpy.data.objects.new("Single Vertex", mesh)
6. The 3D object is ready to be placed into the scene (called linking), but you haven't yet done that. Before
you do that, set the object's location to be the same as the 3D cursor's location, like this:
obj.location = bpy.context.scene.cursor_location
7. Next, link the object into the scene, like this:
bpy.context.scene.objects.link(obj)
8. Finally, make the new 3D object the active object in the scene, like this:
bpy.context.scene.objects.active = obj
9. When you've done this, click the Run Script button in the text editor header. A new 3D object will
appear in the 3D viewport at the location of the 3D cursor. You'll see only the object's center, because
there's only a single vertex. However, if you tab into Edit mode, you'll be able to extrude new vertices.
10. If you have any problems, check the error output in the System Console (review Chapter 12, “The
Blender-Python Interpreter,” to see how to view the System Console for your operating system). Check
for any typos or inconsistent spacing and indentation along the left edge of the script.
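Collected together, the steps above form one short script. This is only a sketch assuming the Blender 2.5-series API used in this chapter (cursor_location and scene.objects.link changed in later Blender releases), and the mesh datablock name is illustrative. The bpy import is guarded purely so that the data-definition portion can be read and run outside Blender, where the bpy module is unavailable; inside Blender's text editor you would import bpy directly.

```python
# Sketch of the steps above: create a single-vertex mesh object
# at the 3D cursor (Blender 2.5-series API assumed).
try:
    import bpy  # only available when run inside Blender
except ImportError:
    bpy = None

verts = [(0.0, 0.0, 0.0)]  # a single vertex at the origin
edges = []                 # no edges...
faces = []                 # ...and no faces

if bpy is not None:
    mesh = bpy.data.meshes.new("SingleVertexMesh")  # illustrative name
    mesh.from_pydata(verts, edges, faces)
    obj = bpy.data.objects.new("Single Vertex", mesh)
    obj.location = bpy.context.scene.cursor_location
    bpy.context.scene.objects.link(obj)
    bpy.context.scene.objects.active = obj
```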
State-of-the-Art Virtual Reality at MetaVR
Modeling and animating with Blender isn't all just fun and games. At MetaVR, one of the industry leaders in virtual reality software, integration with Blender has been a central part of the development of their Virtual Reality Scene Generator (VRSG) software, which has been used extensively for real-time, immersive military training by the armed forces of several countries. Blender developer Campbell Barton was contracted to author MetaVR's Blender plug-in, which enables users to create new content for use in the VRSG environment, including fully rigged, animated characters.
This work resulted in the ability to create scenes involving tens of thousands of buildings and trees and in the creation of a character library of more than 100 human characters that could be exported simultaneously to the VRSG format, with automatic level-of-detail generation and infrared texture map creation. Numerous current features of Blender were the direct result of MetaVR's investment in this project, including BVH export functionality, the array modifier, UV projection, custom transform axes, knife snapping, improved quad-to-triangle conversions, and numeric input for transforms. Under-the-hood developments from this project include ID properties, which form the basis for the data interchange system of Blender 2.50 and future versions.
The MetaVR Blender plug-in is a quintessential example of how Python scripting can extend the possibilities of Blender by enabling users to export content to, and import content from, other formats. Blender already includes a wealth of built-in Python-based import and export scripts for most standard 3D formats. With the custom-made MetaVR plug-in, Blender was able to offer functionality that would otherwise have been limited to specialized VR content-creation software costing tens of thousands of dollars per seat.
You can learn more about MetaVR's use of Blender in the VRSG environment at their website:
www.metavr.com/products/vrsg/vrsg-characteranimation.html