Rendering 3D Graphics in .NET Using C# and Igneel.Graphics


5.00/5 (13 votes)

28 Oct 2016

CPOL

8 min read


This article shows how to render 3D graphics in .NET with C# using the Igneel.Graphics API.

Introduction

Igneel.Graphics is an API for rendering 3D graphics in .NET. It provides an abstraction for interacting with the graphics hardware from C# code. The API was developed to combine the expressiveness of C# with the power of C++. It blends concepts from the OpenGL and Direct3D specifications with unique features such as C# interface and dynamic mappings to shader uniform variables. Although Igneel.Graphics shares common definitions with Direct3D10, it is more than a simple wrapper; it is better seen as a platform or middleware, usable from managed code, that can be implemented on top of Direct3D11, OpenGL, or OpenGL ES. In addition, shader management in Igneel.Graphics leans more toward the OpenGL specification than toward Direct3D.

In Igneel.Graphics, each programmable stage of the graphics pipeline is represented by an IShaderStage<TShader> interface. This interface can be used to create shaders and to bind resources such as textures, buffers, and sampler states.
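For example (using the device, texture, and sampler objects introduced later in this article), the pixel shader stage can be retrieved from the device and used to bind resources to GPU registers:

```csharp
//Retrieve the pixel shader stage and bind a texture and a sampler state
//to texture register 0 and sampler register 0 respectively.
var pixelStage = device.GetShaderStage<PixelShader>();
pixelStage.SetResource(0, diffuseTexture);
pixelStage.SetSampler(0, diffuseSampler);
```

The same pattern applies to the other pipeline stages by changing the TShader type parameter.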

This article walks through a sample application hosted in a Windows Forms environment to show how to use the API components to render geometry, apply textures, load and build shader code, and feed application values to shader uniform variables.

Background

Igneel.Graphics was developed as the low-level graphics API of the Igneel Engine. It is an abstraction on .NET that supports the engine's high-level rendering system. The API is designed to support several shader models up to SM5.0, and it is therefore structured so that it can be implemented on different native platforms.

The current Igneel.Graphics implementation uses shader model 4.0 (SM4.0). This shader model was introduced with Direct3D10 and reached OpenGL with version 3.x. It thoroughly redefined the architecture of the previous shader models, allowing new customizable stages of graphics processing. The later shader model 5.0 added stages such as the Hull Shader and Domain Shader, which interact with the fixed-function Tessellation stage. The Vertex Shader stage is the only required stage; the others are optional. For more information, see the DirectX SDK or the OpenGL documentation.

Using the Code

First, we create a Windows Forms application. In the Form constructor, we obtain a reference to the GraphicDevice; after that we can start loading our shaders and creating the GraphicBuffer that holds the model geometry. Once the GraphicDevice is created, we can also load the model textures, which are represented by Texture2D.

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Runtime.InteropServices;
using System.Text;
using System.Threading.Tasks;
using System.Windows.Forms;
using Igneel;
using Igneel.Graphics;
using Igneel.Windows;
using Igneel.Windows.Forms;

namespace BasicEarth
{
    public partial class Form1 : Form
    {
        /// <summary>
        /// The sphere vertex definition. 
        /// The attributes are used to define
        /// the vertex shader input layout.
        /// </summary>
        [StructLayout(LayoutKind.Sequential)]
        public struct SphereVertex
        {
            [VertexElement(IASemantic.Position)]
            public Vector3 Position;

            [VertexElement(IASemantic.Normal)]
            public Vector3 Normal;

            [VertexElement(IASemantic.Tangent)]
            public Vector3 Tangent;

            [VertexElement(IASemantic.TextureCoordinate, 0)]
            public Vector2 TexCoord;
         
            public SphereVertex(Vector3 position = default(Vector3), 
                Vector3 normal = default(Vector3), 
                Vector3 tangent = default(Vector3), 
                Vector2 texCoord = default(Vector2))
            {
                Position = position;
                Normal = normal;
                Tangent = tangent;
                TexCoord = texCoord;               
            }            
        }

        /// <summary>
        /// Data structure that defines a directional light. 
        /// The structure uses 16-byte alignment padding so the data transfers efficiently to GPU memory.
        /// </summary>
        [StructLayout(LayoutKind.Sequential)]
        public struct DirectionalLight
        {
            /// <summary>
            /// Light's direction
            /// </summary>
            public Vector3 Direction;
            private float pad0;
            
            /// <summary>
            /// Light's color
            /// </summary>
            public Color3 Color;
            private float pad1;
        }

        /// <summary>
        /// Contract to access shader uniform variables and textures.      
        /// </summary>
        public interface ProgramMapping
        {
            float Time { get; set; }

            Matrix World { get; set; }

            Matrix View { get; set; }

            Matrix Projection { get; set; }

            DirectionalLight DirectionalLight { get; set; }

            Sampler<Texture2D> DiffuseTexture { get; set; }

            Sampler<Texture2D> NightTexture { get; set; }

            Sampler<Texture2D> NormalMapTexture { get; set; }

            Sampler<Texture2D> ReflectionMask { get; set; }

            Vector3 CameraPosition { get; set; }

            float ReflectionRatio { get; set; }

            float SpecularRatio { get; set; }

            float SpecularStyleLerp { get; set; }

            int SpecularPower { get; set; }
        }


        //The Graphic Device
        private GraphicDevice device;

        //Buffer for storing the mesh vertexes in GPU memory
        private GraphicBuffer vertexBuffer;

        //Buffer for storing the triangles indices in GPU memory
        private GraphicBuffer indexBuffer;

        //The shader program mapping
        ProgramMapping input;

        //The shader program
        ShaderProgram shaderProgram;                

        //Transformation matrices
        Matrix world;
        Matrix view;
        Matrix projection;

        //Texture sampling settings
        SamplerState diffuseSampler;
       
        //Textures
        Texture2D diffuseTexture;
        Texture2D nightTexture;
        Texture2D normalMapTexture;
        Texture2D reflectionMask;

        //Camera position
        private Vector3 cameraPosition = new Vector3(0, 10, -15);


        public Form1()
        {
            SetStyle(ControlStyles.Opaque, true);

            InitializeComponent();

            Init();            

            Application.Idle += (sender, args) =>
            {
                NativeMessage message;
                while (!Native.PeekMessage(out message, IntPtr.Zero, 0, 0, 0))
                {
                    RenderFrame();
                }
            };

          
        }           

        protected override void OnResize(EventArgs e)
        {
            base.OnResize(e);

            if (device != null)
            {
                //resize the device back buffer after the form's size changed
                device.ResizeBackBuffer(Width, Height);

                //set the new render target viewport
                device.ViewPort = new ViewPort(0, 0, Width, Height);

                //create the projection matrix with the new aspect ratio
                projection = Matrix.PerspectiveFovLh((float)Width / (float)Height, Igneel.Numerics.PIover6, 1, 1000);
            }
        }


        private void Init()
        {

            //Setup shader model version and default compiling options, 
            //also set the relative directory where the shaders are located
            ShaderRepository.SetupD3D10_SM40("Shaders");

            //Create an instance of the GraphicDeviceFactory.
            //The GraphicDeviceFactory abstract class is used to creates GraphicDevices without worrying about the native implementation.
            //This sample use a Direc3D10 native implementation, therefore an instance of a GraphicManager10 is created
            GraphicDeviceFactory devFactory = new IgneelD3D10.GraphicManager10();

            //A GraphicDevice is created using a WindowContext containing rendering and display settings.           
            device = devFactory.CreateDevice(new WindowContext(Handle)
            {
                BackBufferWidth = Width,
                BackBufferHeight = Height,
                BackBufferFormat = Format.R8G8B8A8_UNORM,
                DepthStencilFormat = Format.D24_UNORM_S8_UINT,
                FullScreen = false,
                Sampling = new Multisampling(1, 0),
                Presentation = PresentionInterval.Default                 
            });

            //Create a ShaderProgram using the input layout definition provided by the SphereVertex struct
            //and the code for the vertex and pixel shaders located in the VertexShaderVS and PixelShaderPS files.
            //As a convention the last 2 characters in the filename specify the type of shader to load.
            shaderProgram = device.CreateProgram<SphereVertex>("VertexShaderVS", "PixelShaderPS");

            //Get a typed mapping using the ProgramMapping interface for the ShaderProgram uniform variables and textures
            input = shaderProgram.Map<ProgramMapping>();

            //The application blending state allowing transparency blend
            device.Blend = device.CreateBlendState(new BlendDesc(
                blendEnable: true, 
                srcBlend: Blend.SourceAlpha, 
                destBlend: Blend.InverseSourceAlpha));

            //The application depth testing state
            device.DepthTest = device.CreateDepthStencilState(new DepthStencilStateDesc(
                depthFunc: Comparison.Less));

            //The application rasterizer state
            device.Rasterizer = device.CreateRasterizerState(new RasterizerDesc(
                cull: CullMode.Back,
                fill: FillMode.Solid));

            //Default texture sampling settings
            diffuseSampler = device.CreateSamplerState(new SamplerDesc(
                addressU: TextureAddressMode.Wrap,
                addressV: TextureAddressMode.Wrap,
                filter: Filter.MinPointMagMipLinear));

            //Load the textures
            diffuseTexture = device.CreateTexture2DFromFile("Textures/Earth_Diffuse.dds");
            nightTexture = device.CreateTexture2DFromFile("Textures/Earth_Night.dds");
            normalMapTexture = device.CreateTexture2DFromFile("Textures/Earth_NormalMap.dds");
            reflectionMask = device.CreateTexture2DFromFile("Textures/Earth_ReflectionMask.dds");

            //Create transformation matrices
            world = Matrix.Identity;
            view = Matrix.LookAt(cameraPosition, new Vector3(0, 0, 1), Vector3.UnitY);
            projection = Matrix.PerspectiveFovLh((float)Width / (float)Height, Igneel.Numerics.PIover6, 1, 1000);

            CreateSphere();
        }

        // .............
    }
}

Listing 1: Initialization.

In the code above, we hook the Application.Idle event so a frame is rendered whenever there are no pending messages. The SetStyle(ControlStyles.Opaque, true) line prevents the flickering that occurs when the window tries to redraw its background. The code also defines a few structures: SphereVertex, the vertex definition of the geometry, and DirectionalLight, the light definition used in the pixel shader to illuminate the scene. An interface, ProgramMapping, is also declared; it is used to create the mapping between the application code and the shader uniform variables. Shader uniforms are variables, usually defined inside constant buffers, that can receive values from the application code.

In the Init method, the line ShaderRepository.SetupD3D10_SM40("Shaders") tells the API where the shader files are located and sets the default shader compilation settings for shader model 4.0. Next, a GraphicDeviceFactory is created. In this case a Direct3D10 implementation is used, so an instance of GraphicManager10 is required; this is the only component in the code tied to a specific native implementation of the API. Then, using the factory, we create a graphic device, passing a WindowContext parameter that contains presentation settings such as the window width and height, the back buffer format, and the multisampling mode.

Once the device is created, loading and compiling the shaders takes a single line of code.

shaderProgram = device.CreateProgram<SphereVertex>("VertexShaderVS", "PixelShaderPS");

The preceding line creates a shader program containing a vertex shader, whose input layout is specified by the SphereVertex structure, and a pixel shader. To simplify the shader-creation code, the API uses conventions based on the shader filename to identify which kind of shader to create. It relies on filename suffixes: VS (Vertex Shader), PS (Pixel Shader), GS (Geometry Shader), HS (Hull Shader), DS (Domain Shader), and CS (Compute Shader).

Another unique feature of Igneel.Graphics, called shader interface mapping, is shown in the following statement.

input = shaderProgram.Map<ProgramMapping>();

After retrieving the interface instance, we can use it to set shader uniform variables such as the transformation matrices, the light data, and the textures simply by setting C# properties, with the added benefit of Intellisense support.

Textures are very important in computer graphics applications. The graphic device therefore supports several methods for loading textures from a file or a stream, or for just reserving GPU memory to fill in later. You can also load different kinds of textures, such as Texture1D, Texture2D, or Texture3D; cube textures are treated as arrays of Texture2D. The supported file formats are .DDS, .JPG, .PNG, .TGA, and .BMP.

In computer graphics, matrices are used to transform vectors from one space to another. During rendering we need matrices in the vertex shader to transform vertices from local mesh space into projection space, also known as homogeneous clip space. The GPU then converts these projected coordinates into screen coordinates through the perspective divide by w and the viewport transformation.

world = Matrix.Identity;
view = Matrix.LookAt(cameraPosition, new Vector3(0, 0, 1), Vector3.UnitY);
projection = Matrix.PerspectiveFovLh((float)Width / (float)Height, Igneel.Numerics.PIover6, 1, 1000);

Listing 4: Creating the transformations.
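Written out, the transformation chain described above is, using the row-vector convention that matches the `mul(vector, matrix)` order in the shaders below:

```latex
p_{clip} = p_{local}\, W\, V\, P
\qquad
(x_{ndc},\ y_{ndc},\ z_{ndc}) = \left(\frac{x_{clip}}{w_{clip}},\ \frac{y_{clip}}{w_{clip}},\ \frac{z_{clip}}{w_{clip}}\right)
```

The perspective divide by $w$ and the subsequent viewport transform, mapping $x_{ndc} \in [-1, 1]$ to $[0, Width]$ and $y_{ndc} \in [-1, 1]$ to $[0, Height]$, are performed by the GPU.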

The code that generates the sphere mesh is shown here.

        private void CreateSphere()
        {
            var stacks = 128;
            var slices = 128;
            var radius = 10;

            var vertices = new SphereVertex[(stacks - 1) * (slices + 1) + 2];
            var indices = new ushort[(stacks - 2) * slices * 6 + slices * 6];

            float phiStep = Numerics.PI / stacks;
            float thetaStep = Numerics.TwoPI / slices;

            // do not count the poles as rings
            int numRings = stacks - 1;

            // Compute vertices for each stack ring.
            int k = 0;
            var v = new SphereVertex();

            for (int i = 1; i <= numRings; ++i)
            {
                float phi = i * phiStep;

                // vertices of ring
                for (int j = 0; j <= slices; ++j)
                {
                    float theta = j * thetaStep;

                    // spherical to cartesian
                    v.Position = Vector3.SphericalToCartesian(phi, theta, radius);
                    v.Normal = Vector3.Normalize(v.Position);
                    v.TexCoord = new Vector2(theta / (-2.0f * (float)Math.PI), phi / (float)Math.PI);

                    // partial derivative of P with respect to theta
                    v.Tangent = new Vector3(-radius * (float)Math.Sin(phi) * (float)Math.Sin(theta), 0, radius * (float)Math.Sin(phi) * (float)Math.Cos(theta));

                    vertices[k++] = v;
                }
            }
            // poles: note that there will be texture coordinate distortion
            vertices[vertices.Length - 2] = new SphereVertex(new Vector3(0.0f, -radius, 0.0f), new Vector3(0.0f, -1.0f, 0.0f), Vector3.Zero, new Vector2(0.0f, 1.0f));
            vertices[vertices.Length - 1] = new SphereVertex(new Vector3(0.0f, radius, 0.0f), new Vector3(0.0f, 1.0f, 0.0f), Vector3.Zero, new Vector2(0.0f, 0.0f));

            int northPoleIndex = vertices.Length - 1;
            int southPoleIndex = vertices.Length - 2;

            int numRingVertices = slices + 1;

            // Compute indices for inner stacks (not connected to poles).
            k = 0;
            for (int i = 0; i < stacks - 2; ++i)
            {
                for (int j = 0; j < slices; ++j)
                {
                    indices[k++] = (ushort)((i + 1) * numRingVertices + j);                    
                    indices[k++] = (ushort)(i * numRingVertices + j + 1);
                    indices[k++] = (ushort)(i * numRingVertices + j);

                    indices[k++] = (ushort)((i + 1) * numRingVertices + j + 1);                    
                    indices[k++] = (ushort)(i * numRingVertices + j + 1);
                    indices[k++] = (ushort)((i + 1) * numRingVertices + j);
                }
            }

            // Compute indices for top stack.  The top stack was written 
            // first to the vertex buffer.
            for (int i = 0; i < slices; ++i)
            {
                indices[k++] = (ushort)i;
                indices[k++] = (ushort)(i + 1);                
                indices[k++] = (ushort)northPoleIndex;
            }

            // Compute indices for bottom stack.  The bottom stack was written
            // last to the vertex buffer, so we need to offset to the index
            // of first vertex in the last ring.
            int baseIndex = (numRings - 1) * numRingVertices;
            for (int i = 0; i < slices; ++i)
            {
                indices[k++] = (ushort)(baseIndex + i + 1);
                indices[k++] = (ushort)(baseIndex + i);
                indices[k++] = (ushort)southPoleIndex;                
            }

            vertexBuffer = device.CreateVertexBuffer(data: vertices);
            indexBuffer = device.CreateIndexBuffer(data: indices);
        }

Listing 5: Creating the vertex and index buffers.

In the CreateSphere method, the last two statements create the vertex buffer that stores the vertices in GPU memory and the index buffer containing the indices that define the mesh triangles.

 vertexBuffer = device.CreateVertexBuffer(data: vertices);
 indexBuffer = device.CreateIndexBuffer(data: indices);

This reserves memory on the graphic device to store the arrays containing the vertex data and the indices. The memory is reserved with default settings; the methods also accept several parameters that control the resource memory behavior and the type of CPU access, such as read or write.
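As a sketch only: a buffer meant to be updated frequently from the CPU might be created with explicit usage and access arguments. The parameter names and enum values below are illustrative assumptions, not the exact Igneel.Graphics signature; consult the CreateVertexBuffer overloads for the real parameters.

```csharp
//Illustrative sketch: 'usage' and 'cpuAccess' are assumed parameter names.
var dynamicVertexBuffer = device.CreateVertexBuffer(
    data: vertices,
    usage: ResourceUsage.Dynamic,   //hypothetical: GPU read, frequent CPU updates
    cpuAccess: CpuAccess.Write);    //hypothetical: allow CPU write access
```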

The code that renders the scene is located in the RenderFrame method.

        private void RenderFrame()
        {
            //Set the render target and the depth-stencil buffer.
            //To render to the display, just use the device's default
            //BackBuffer and BackDepthBuffer
            device.SetRenderTarget(device.BackBuffer, device.BackDepthBuffer);

            //Set the ViewPort used by the device during the viewport transformation
            device.ViewPort = new ViewPort(0, 0, Width, Height);

            //Clear the render target and depth stencil buffers
            device.Clear(ClearFlags.Target | ClearFlags.ZBuffer, new Color4(0, 0, 0, 0), 1, 0);

            //Set the primitive type
            device.PrimitiveTopology = IAPrimitive.TriangleList;

            //Bind the vertex buffer to slot 0 at offset 0
            device.SetVertexBuffer(0, vertexBuffer, 0);

            //Set the index buffer
            device.SetIndexBuffer(indexBuffer);

          
            //Send the transformation matrices to the vertex shader
            input.World = Matrix.RotationY(-(float)Environment.TickCount / 5000.0f);
            input.View = view;
            input.Projection = projection;          

            //Send the light info and other values to the pixel shader
            input.CameraPosition = cameraPosition;
            input.ReflectionRatio = 0.05f;
            input.SpecularRatio = 0.15f;
            input.SpecularStyleLerp = 0.15f;
            input.SpecularPower = 8;
            input.DirectionalLight = new DirectionalLight
            {
                Color = Color3.White,
                Direction = new Euler(45, 0, 0).ToDirection()
            };

            //Bind a texture together with a sampler state. As a convention, the SamplerState
            //in the shader must have the same name as the texture prefixed with 's'.
            //For example, the sampler state is declared in the shader as:
            //SamplerState sDiffuseTexture;
            input.DiffuseTexture = diffuseTexture.ToSampler(diffuseSampler);

            //Bind textures with the default sampler state (linear filtering and wrap TextureAddressMode).
            //These statements behave the same as calling nightTexture.ToSampler()
            input.NightTexture = nightTexture;
            input.NormalMapTexture = normalMapTexture;
            input.ReflectionMask = reflectionMask;
            

            //Set the shader program
            device.Program = shaderProgram;

            //Draw the geometry using the index count, the start index and the base vertex offset
            device.DrawIndexed((int)indexBuffer.SizeInBytes / indexBuffer.Stride, 0, 0);

            //Present the render target buffer to the display.
            device.Present();
        }

Listing 6: Rendering a frame.

After setting the render and depth-stencil buffers, the ViewPort is set and the render buffers are cleared. The primitive type is set to a triangle list, and the GraphicBuffers storing the vertices and indices are bound to the pipeline. The shader interface mapping is then used to send the shader variable values, such as the matrices and the light information. The shader interface mapping can also be used to bind textures and sampler states, as shown in the following statements.

input.DiffuseTexture = diffuseTexture.ToSampler(diffuseSampler);
.....
input.NightTexture = nightTexture;

Textures and sampler states can also be set through GPU registers using the IShaderStage interface, as shown in the following statements, where a texture and a sampler state are bound to texture register 0 and sampler register 0.

device.GetShaderStage<PixelShader>().SetResource(0, diffuseTexture);
device.GetShaderStage<PixelShader>().SetSampler(0, diffuseSampler);

An IShaderStage<TShader> can be obtained by calling device.GetShaderStage<TShader>(), where TShader is a type inheriting from Shader, such as VertexShader, PixelShader, GeometryShader, HullShader, DomainShader, or ComputeShader. If a particular GraphicDevice implementation does not support the shader stage for a given shader type, it must return null when device.GetShaderStage<[Shader Type]>() is called.
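Because unsupported stages are reported as null, portable rendering code can probe for a stage before using it:

```csharp
//On a native implementation without geometry shader support,
//GetShaderStage<GeometryShader>() must return null, so test before use.
var geometryStage = device.GetShaderStage<GeometryShader>();
if (geometryStage != null)
{
    //Safe to create geometry shaders and bind resources on this stage.
}
```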

On the other hand, instead of the interface shader mapping, another unique feature called dynamic shader mapping can be used, as shown in the following line of code.

 

shaderProgram.Input.World = Matrix.RotationY(-(float)Environment.TickCount/5000.0f);

The type of shaderProgram.Input is dynamic, so shader constants can be mapped without declaring an interface. The drawback of dynamic shader mapping, however, is that it can only map primitive types such as vectors, matrices, or textures. It cannot map user-defined types like the DirectionalLight structure, and it cannot bind a SamplerState, so the IShaderStage must be used for binding SamplerStates.
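A minimal sketch contrasting the two mechanisms, assuming a Time uniform as declared in the ProgramMapping interface above:

```csharp
//Interface mapping: typed properties, Intellisense support,
//and user-defined structs such as DirectionalLight.
input.Time = Environment.TickCount / 1000.0f;

//Dynamic mapping: no interface declaration needed, but limited to
//primitive types such as scalars, vectors, matrices, and textures.
shaderProgram.Input.Time = Environment.TickCount / 1000.0f;
```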

 

Vertex Shader

struct VSInput
{
    float4 Position : POSITION; 
    float3 Normal : NORMAL;
    float3 Tangent : TANGENT;
    float2 TexCoords : TEXCOORD0;
};

struct VSOutput
{
    float4 PositionVS : SV_POSITION;
    float2 TexCoords : TEXCOORD0; 
    float3 Normal : TEXCOORD1;
    float3 Tangent : TEXCOORD2;
    float3 Binormal : TEXCOORD3;
    float3 Position : TEXCOORD4;
};

cbuffer camera
{
	float4x4 View;	
	float4x4 Projection;
};

cbuffer perObject
{	
	float4x4 World;
};


VSOutput main( VSInput input)
{
	 VSOutput output;
   
    // Transform to clip space by multiplying by the basic transform matrices.
    // (The rotation animating the globe is applied to the World matrix by the application.)
    float4 worldPosition = mul(input.Position, World);
    output.PositionVS = mul(worldPosition, mul(View, Projection));
    
    // Move the incoming normal and tangent into world space and compute the binormal.
    // These three axes will be used by the pixel shader to move the normal map from 
    // tangent space to world space. 
    output.Normal = mul(input.Normal, World);
    output.Tangent = mul(input.Tangent, World);
    output.Binormal = cross(output.Normal, output.Tangent);
    output.Position = worldPosition.xyz;
     
    // Pass texture coordinates on to the pixel shader
    output.TexCoords = input.TexCoords;
    return output;    
}

Pixel Shader

struct Light
{
	float3 Direction;
	float3 Color;
};

struct VSOutput
{
    float4 PositionVS : SV_POSITION;
    float2 TexCoords : TEXCOORD0; 
    float3 Normal : TEXCOORD1;
    float3 Tangent : TEXCOORD2;
    float3 Binormal : TEXCOORD3;
    float3 Position : TEXCOORD4;
};

cbuffer cbParams
{
	float ReflectionRatio;
	float SpecularRatio;
	float SpecularStyleLerp;
	int SpecularPower;
};

cbuffer cbLight
{
	Light DirectionalLight;
	float4x4 View;
	float3 CameraPosition;
};

Texture2D DiffuseTexture;
Texture2D NightTexture;
Texture2D NormalMapTexture;
Texture2D ReflectionMask;

SamplerState sDiffuseTexture;

float4 main(VSOutput input) : SV_TARGET
{	
    float3 EyeVector = normalize(input.Position - CameraPosition );

    
    // Look up the normal from the NormalMap texture, and unbias the result
    float3 Normal = NormalMapTexture.Sample(sDiffuseTexture, input.TexCoords).rgb;
    Normal = (Normal * 2) - 1;
    
    // Move the normal from tangent space to world space
    float3x3 tangentFrame = {input.Tangent, input.Binormal, input.Normal};
    Normal = normalize(mul(Normal, tangentFrame));
    
    // Start with N dot L lighting
    float light = saturate( dot( Normal, -DirectionalLight.Direction ) );
    float3 color = DirectionalLight.Color * light;
    
    // Modulate against the diffuse texture color
    float4 diffuse = DiffuseTexture.Sample(sDiffuseTexture, input.TexCoords);
    color *= diffuse.rgb;
    
    // Add ground lights if the area is not in sunlight
    float sunlitRatio = saturate(2*light);
    float4 nightColor =NightTexture.Sample(sDiffuseTexture, input.TexCoords);
    color = lerp( nightColor.xyz, color, float3( sunlitRatio, sunlitRatio, sunlitRatio) );
       
    
    // Add a specular highlight
	float reflectionMask = ReflectionMask.Sample(sDiffuseTexture, input.TexCoords);
    float3 vHalf = normalize( -EyeVector + -DirectionalLight.Direction );
    float PhongSpecular = saturate(dot(vHalf, Normal));
	

    color += DirectionalLight.Color * ( pow(PhongSpecular, SpecularPower) * SpecularRatio * reflectionMask);  
    
	 // Add atmosphere
    float atmosphereRatio = 1 - saturate( dot(-EyeVector, input.Normal) );
    color += 0.30f * float3(.3, .5, 1) * pow(atmosphereRatio, 2);

    // Set alpha to 1.0 and return
    return float4(color, 1.0);	
}

Different constant buffers are declared in both shaders. The interface and dynamic mapping mechanisms take care of managing these constant buffers efficiently, so only the buffers whose variables were actually accessed are opened and closed, once per rendered frame.

[Application screenshot]

Points of Interest

An interesting aspect of Igneel.Graphics is how simple it is to create the device and resources such as shaders, buffers, textures, and pipeline states. The special handling of shaders, with features like interface mapping and dynamic mapping, is also noteworthy. As illustrated, you can write your own basic rendering code without knowing the native implementation of the API, and you can test whether an IShaderStage is implemented for a given Shader type. I also greatly enjoyed developing this API and learned a lot about writing high-performance code and integrating managed code with native unmanaged code in .NET/MSIL.

Also, to run the sample you must first install the DirectX redistributable that ships with the SDK, which you can download at https://www.microsoft.com/en-us/download/details.aspx?id=6812. Note that after installing the SDK, you must locate the SDK installation folder (Program Files by default) and run the redistributable installer at [SDK folder]/Redist/DXSETUP.exe.

In addition, the Igneel Engine is now available on Github, and contributions are welcome.

About the Author

My name is Ansel Castro Cabrera. I hold a bachelor's degree in Computer Science from the University of Havana, where I specialized in computer graphics, compilation, and .NET development. I have also worked in other areas of computer science such as machine learning, computer vision, and web and Android programming. I have developed neural network and convolutional neural network models for image pattern recognition, and I have used OpenCV for feature tracking and extraction. On the web development side, I have worked with Django, PHP, ASP.NET WebForms, ASP.NET MVC, and Javascript.
