3. For convenience, we will create a couple of constructors. The first of these takes a position and a color as input and uses the normalized vertex position as the normal vector, which works well for geometric meshes that are built around the origin (Vector3.Zero). The second takes only a position and defaults the color to Color.White.
public Vertex(Vector3 position, Color color)
    : this(position, Vector3.Normalize(position), color)
{ }

public Vertex(Vector3 position)
    : this(position, Color.White)
{ }
...
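These convenience constructors chain to a primary constructor that takes all three components explicitly. A minimal sketch of that part of the structure is shown below; the field layout and the float-based overload (used by the quad vertices in the next step) are assumptions inferred from how the structure is used in this recipe, and Vector3 and Color are the SharpDX types.

public struct Vertex
{
    public Vector3 Position;
    public Vector3 Normal;
    public Color Color;

    // Primary constructor: the convenience constructors above chain to this.
    public Vertex(Vector3 position, Vector3 normal, Color color)
    {
        Position = position;
        Normal = normal;
        Color = color;
    }

    // Assumed overload taking the position as three floats plus an explicit
    // normal, matching how the quad vertices are created in the next step.
    public Vertex(float x, float y, float z, Vector3 normal, Color color)
        : this(new Vector3(x, y, z), normal, color)
    { }
}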
We can now create our updated renderers.
4. The quad will represent a flat platform with the normal vectors pointing straight up the y axis. Open the QuadRenderer.cs file and make the following changes:
Replace the vertices in CreateDeviceDependentResources, and don't forget to set the stride of the vertex buffer binding to the size of our Vertex structure.
var color = Color.LightGray;
quadVertices = ToDispose(Buffer.Create(device, BindFlags.VertexBuffer,
    new Vertex[] {
        /* Position: float x 3, Normal: Vector3, Color */
        new Vertex(-0.5f, 0f, -0.5f, Vector3.UnitY, color),
        new Vertex(-0.5f, 0f, 0.5f, Vector3.UnitY, color),
        new Vertex(0.5f, 0f, 0.5f, Vector3.UnitY, color),
        new Vertex(0.5f, 0f, -0.5f, Vector3.UnitY, color),
    }));
quadBinding = new VertexBufferBinding(quadVertices,
    Utilities.SizeOf<Vertex>(), 0);
The creation of the index buffer and the rendering commands do not change.
At this point, you should be able to compile and run the project (F5) to see the updated light gray quad.
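For context, the unchanged index buffer simply describes the quad as two triangles. A sketch of what it typically looks like is shown below; the quadIndices variable name and the exact index order are assumptions, so keep whatever winding your existing renderer already uses.

// Two triangles covering the quad (index order assumed; keep the winding
// the existing renderer already uses).
quadIndices = ToDispose(Buffer.Create(device, BindFlags.IndexBuffer,
    new ushort[] {
        0, 1, 2, // first triangle
        2, 3, 0, // second triangle
    }));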
Although the shader used at this point does not support
our Normal input semantic, the SV_Position and
COLOR input semantics will be matched. The IA will simply
ignore the Normal component of our input layout.
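For reference, an input layout that describes the new Vertex structure could be declared as in the sketch below. The vertexLayout and vertexShaderBytecode names are placeholders, and the formats and offsets are assumptions based on two Vector3 fields followed by a 4-byte color; the semantic names must match those used by the vertex shader.

// Input layout describing the Vertex structure (offsets/formats assumed:
// two Vector3 fields followed by a 4-byte color). The NORMAL element is
// declared even though the current shader does not consume it; layout
// elements that are not part of the shader's input signature are ignored.
vertexLayout = ToDispose(new InputLayout(device,
    vertexShaderBytecode,
    new[]
    {
        new InputElement("SV_Position", 0, Format.R32G32B32_Float, 0, 0),
        new InputElement("NORMAL", 0, Format.R32G32B32_Float, 12, 0),
        new InputElement("COLOR", 0, Format.R8G8B8A8_UNorm, 24, 0),
    }));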