Running on Empty

The few things I know, I like to share.

Model-View-ViewModel (MVVM) – Part 1

I have been working with MVVM for about a year now as part of work-related projects. For the larger portion of my career I have focused on Model-View-Controller or Model-View-Presenter patterns for user interfaces. However, after learning as much as possible about MVVM from experts around the web, I have adopted it as my primary method of UI implementation.

So what is MVVM? For the most part it is MVC on steroids: a way to split presentation design from the codified logic specific to the application. In WPF it allows easy access to core features such as databinding, and it lets the developer write the UI separately from the business logic, which separates the visible behavior from the logical behavior of the application. A nice side effect is that unit testing of the application becomes much easier, since most if not all application logic is handled as methods. It is easier to describe in parts (a minimal code sketch follows the lists below):

Model – the entity (or entities) that the application is trying to manage and consume.

  • This is your entity class; everything here should be properties.

View-Model – an intermediary layer where most of the application-specific business logic lives.

  • This layer is not aware of UI controls.
  • Expose public properties to which the UI layer will be bound.
  • Use public methods invoked by events/commands
  • Update the View through databinding
  • Update the Model through bound properties

View – how the View-Model is displayed: the UI.

  • Contains the XAML in a WPF application
  • Command bindings – routed commands
  • Property bindings
  • Code behind should be as clean as possible
  • Should include basic wiring to instantiate the view
  • General Rule of Thumb – if you have x:Name in your XAML, you are doing it wrong.
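
To make the split concrete, here is a minimal sketch of the three parts working together. Customer and CustomerViewModel are illustrative names rather than code from this series, and the view model base class itself is the subject of the next article.

using System.ComponentModel;

// Model: the entity being managed; properties only.
public class Customer
{
    public string Name { get; set; }
}

// View-Model: exposes bindable state and methods, knows nothing about UI controls.
public class CustomerViewModel : INotifyPropertyChanged
{
    private readonly Customer _customer = new Customer();

    public event PropertyChangedEventHandler PropertyChanged;

    // The View binds to this property; setting it updates the Model,
    // and the change notification updates the View through databinding.
    public string Name
    {
        get { return _customer.Name; }
        set
        {
            _customer.Name = value;
            OnPropertyChanged("Name");
        }
    }

    // Invoked from the View by an event or command.
    public void Save()
    {
        // persist _customer here
    }

    private void OnPropertyChanged(string propertyName)
    {
        if (PropertyChanged != null)
        {
            PropertyChanged(this, new PropertyChangedEventArgs(propertyName));
        }
    }
}

The View would then be little more than XAML with a binding such as Text="{Binding Name}" and a command wired to Save, leaving the code behind nearly empty.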

In the next article I will demonstrate the view model base class.

July 22, 2011 Posted by | C#, MVVM, TDD, WPF, XAML | Leave a comment

Still Alive

I am still around; I have been working on some personal projects. I neglected to look at this blog for a long time, and I had no idea that anybody would actually enjoy reading it. I am writing some new articles demonstrating designer support for a WPF-like interface in XNA. For now, however, I have a few Model-View-ViewModel articles to put together as a series. I appreciate the comments and feedback.

July 22, 2011 Posted by | Uncategorized | Leave a comment

WPF ProgressBar and Long Running Processes

Introduction

Often during long running processes it is useful to indicate in the UI that the process is running.  Rather than simply locking the UI away from the user, it is preferable to display a progress bar.  In my current implementation I have a server-side process that I know takes a maximum of 30 seconds to complete.  So when I call this process from the UI, I display a progress bar that is timed to reach its maximum at 30 seconds.

The ProgressBar Window

<Window ...>
  <Border>
    <StackPanel>
      <ProgressBar x:Name="MessageProgessBar"
                   Maximum="100" Minimum="0"
                   Height="25" Width="275">
        <ProgressBar.Triggers>
          <EventTrigger RoutedEvent="ProgressBar.Loaded">
            <BeginStoryboard>
              <Storyboard>
                <DoubleAnimation Storyboard.TargetName="MessageProgessBar"
                                 Storyboard.TargetProperty="Value"
                                 From="0" To="100" Duration="0:0:30"/>
              </Storyboard>
            </BeginStoryboard>
          </EventTrigger>
        </ProgressBar.Triggers>
      </ProgressBar>
    </StackPanel>
  </Border>
</Window>

Calling Long Running Processes

// Open the ProgressBar window, then kick off the work on a background thread.
ThreadStart ts = delegate
{
    // DO LONG RUNNING PROCESS HERE!

    // Marshal the cleanup back onto the UI thread when the work completes.
    Dispatcher.BeginInvoke(DispatcherPriority.Normal, (EventHandler)
        delegate
        {
            // Do Cleanup Process HERE! (e.g. close the progress window)
        }, null, null);
};

// Asynchronous delegate invocation runs the work on a ThreadPool thread.
ts.BeginInvoke(null, null);
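
Putting the pieces together, the call site might look something like the following sketch. ProgressWindow and DoServerCall are hypothetical names standing in for the window defined above and the 30-second server call; the snippet assumes the System.Threading and System.Windows.Threading namespaces.

// Sketch only: show the progress window, run the server call on a background
// thread, then close the window back on the UI thread when the call returns.
private void RunWithProgress()
{
    ProgressWindow progress = new ProgressWindow();
    progress.Show();

    ThreadStart work = delegate
    {
        DoServerCall();   // the long running server-side process

        progress.Dispatcher.BeginInvoke(DispatcherPriority.Normal,
            (EventHandler)delegate { progress.Close(); }, null, null);
    };

    // Asynchronous delegate invocation runs the work on a ThreadPool thread.
    work.BeginInvoke(null, null);
}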

Thank you Linda for your comments and reminder of subjects I have yet to post.

Please feel free to leave comments or suggestions.

January 23, 2009 Posted by | C#, WPF, XAML | 4 Comments

XNA Framework GameEngine Development. (Part 20, Plug-in Manager)

Introduction

Welcome to Part 20 of the XNA Framework GameEngine Development series.  This is the first article of my new game engine series Roe3. 

This article will focus on creating a plug-in architecture for game components and managers.

Creating the plug-in socket

While writing this series I found a few instances when I wanted to be able to turn off a component or manager, either to test or to create a game that didn't use all of the bells and whistles the engine had to offer.  I would normally go into the base engine code, remove the components, rebuild, and move on.  Of course this works, but it is not exactly an elegant solution.

So I created a configuration management system that would allow me to instantiate managers and components using an xml file that looks something like the following.

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <components>
    <component value="RoeEngine3.GameComponents.Cursor"/>
    <component value="RoeEngine3.GameComponents.FpsCounter"/>
    <component value="RoeEngine3.GameComponents.DebugOverlay"/>
  </components>
  <managers>
    <manager value="RoeEngine3.Managers.AudioManager"/>
    <manager value="RoeEngine3.Managers.ScreenManager"/>
  </managers>
</configuration>

Much nicer!  In this example there are 3 components and 2 managers that will be created as the engine starts up… without the need to recompile the engine.  Don’t want to show a cursor or fpscounter?  Simply remove the component from the config.

So all we need to do is create an xml reader for this file, something like the following.

using System.Collections.ObjectModel;
using System.IO;
using System.Xml.Serialization;

namespace RoeEngine3.Managers
{
    [XmlType("configuration")]
    public sealed class ConfigurationManager
    {
        public static ConfigurationManager Instance { get; private set; }

        static ConfigurationManager()
        {
            if(Instance == null)
            {
                string file = @"Managers.config";

                if(!File.Exists(file))
                {
                    throw new FileNotFoundException();
                }

                using(StreamReader reader = new StreamReader(file))
                {
                    XmlSerializer serializer = new XmlSerializer(typeof(ConfigurationManager));
                    Instance = (ConfigurationManager)serializer.Deserialize(reader);
                }
            }
        }

        [XmlArray("managers"), XmlArrayItem("manager")]
        public Managers Managers { get; set; }

        [XmlArray("components"), XmlArrayItem("component")]
        public Components Components { get; set; }
    }

    public sealed class Components : KeyedCollection<string, Component>
    {
        protected override string GetKeyForItem(Component item)
        {
            return item.Value;
        }
    }

    public sealed class Managers : KeyedCollection<string, Manager>
    {
        protected override string GetKeyForItem(Manager item)
        {
            return item.Value;
        }
    }

    [XmlType("component")]
    public sealed class Component
    {
        [XmlAttribute("value")]
        public string Value { get; set; }
    }

    [XmlType("manager")]
    public sealed class Manager
    {
        [XmlAttribute("value")]
        public string Value { get; set; }
    }
}

Now we have a list of Managers and Components that we want to activate in the engine.  So in our base engine class we need to write some prep code for our managers and components.

        private void PrepareComponents()
        {
            foreach (Component component in ConfigurationManager.Instance.Components)
            {
                GameComponent baseComponent = ComponentActivator.CreateInstance(Type.GetType(component.Value)) as GameComponent;
                EngineManager.Game.Components.Add(baseComponent);
            }
        }
       
        private void PrepareManagers()
        {    
            foreach (Manager manager in ConfigurationManager.Instance.Managers)
            {
                GameComponent baseManager = ManagerActivator.CreateInstance(Type.GetType(manager.Value)) as GameComponent;              
                Components.Add(baseManager);
            }
        }

        protected override void Initialize()
        {
            PrepareManagers();
            PrepareComponents();

            // ... OTHER INITIALIZE STUFF GOES HERE
        }

Turn on the Power

All that is left for us to do is activate the managers and components; we just need a little reflection magic to invoke them.

    public sealed class ManagerActivator
    {
        public static object CreateInstance(Type type)
        {
            if (type == null)
            {
                throw new ArgumentNullException("type");
            }

            ConstructorInfo info = type.GetConstructor(new Type[] { typeof(Game) });

            if (info == null)
            {
                return null;
            }

            object instance;

            try
            {
                instance = info.Invoke(new object[] { EngineManager.Game });
            }
            catch (Exception ex)
            {
                if (ex is SystemException)
                {
                    throw;
                }
                instance = null;
            }
            return instance;
        }
    }

    public sealed class ComponentActivator
    {
        public static object CreateInstance(Type type)
        {
            return ManagerActivator.CreateInstance(type);
        }
    }

A few side notes here: notice that GetConstructor is passed a Type array containing typeof(Game).  This is because all of the engine's GameComponents take a Game as their only constructor parameter.  Your own components will obviously need to expose a constructor matching the types you pass to GetConstructor, as in the sketch below.
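
As an illustration, a component intended to be created by this activator could be as simple as the following sketch; MyOverlay is a hypothetical name rather than part of the engine.

using Microsoft.Xna.Framework;

namespace RoeEngine3.GameComponents
{
    // The single Game constructor parameter matches the typeof(Game)
    // signature that the activator requests via GetConstructor.
    public class MyOverlay : DrawableGameComponent
    {
        public MyOverlay(Game game)
            : base(game)
        {
        }
    }
}

It would then be switched on with a line such as <component value="RoeEngine3.GameComponents.MyOverlay"/> in the config file.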

Conclusion

In this article I outlined a simple method of creating a pluggable interface for components and managers.  This interface will allow us to turn on and off components and managers as we need them without writing additional code.

December 17, 2008 Posted by | C#, GameComponent, XBOX360, XNA | 22 Comments

Happy Anniversary (Year 1)

Today is the one year anniversary of this blog.  I really appreciate all the support and kind comments readers have given.  I certainly hope this blog has been helpful.  I do plan to continue regular blog entries. 

As some of you may know, I have taken a long break from posting, mainly to redesign my XNA Game Engine.  I am eager to share with you some of the new areas that many of the new articles will focus on.

  • “Plug-in” architecture for managers and components (Redesign of EngineManager)
  • WPF “like” Window Design (Overhaul of original ScreenManager)
  • Action mapping (Redesigned Input Manager)
  • Sound Manager

After I get the groundwork for the new engine in place, I will then continue some more advanced sections of engine design.  As always, I appreciate input from readers.  Ideas and suggestions are always welcome.

December 11, 2008 Posted by | Uncategorized | Leave a comment

Using FlowDocuments XAML to print XPS Documents. (Part 6)

Introduction

This one is for you Linda and Cow-Killer.

Welcome to Part 6 of the Using FlowDocuments XAML to print XPS Documents series.  Up until now it has not been possible, using this series, to create an XPS document file or to display the XAML text in a document viewer.  It would be nice to be able to create a XAML template and easily insert text into it using databinding.  Here is my solution; I hope to hear your feedback.

Creating a new method to Load Xaml Strings

This method reads a XAML template from disk, merges in the supplied parameters, and loads the result into a FlowDocument; you may have seen parts of this before in previous samples.

        public static IDocumentPaginatorSource RenderFlowDocumentTemplate(string templatePath,
                                                                          Dictionary<string, string> parameters)
        {
            string rawXamlText = "";
            using (StreamReader streamReader = File.OpenText(templatePath))
            {
                rawXamlText = streamReader.ReadToEnd();
            }

            rawXamlText = MergeParameters(parameters, rawXamlText);

            FlowDocument document = XamlReader.Load(new XmlTextReader(new StringReader(rawXamlText))) as FlowDocument;

            return document;
        }

The new parts of this method are the addition of a Dictionary of string key/value pairs and the MergeParameters method.

Merging Parameters

Since XAML is nothing more than a jazzed up XML file, we can use simple string replacements to replace items in our template with strings from our objects.

        private static string MergeParameters(Dictionary<string, string> parameters, string template)
        {
            foreach (KeyValuePair<string, string> kvp in parameters)
            {
                template = template.Replace(string.Format("[{0}]", kvp.Key), kvp.Value);
            }
            return template;
        }

Now we have a nearly complete solution for XAML template rendering; all that is left is to create a template.

The new XAML Template

<FlowDocument xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
     xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml">
    <Paragraph FontFamily="Arial" Margin="20">
        <TextBlock Text="[TestString]" FontSize="12"/>
    </Paragraph>
</FlowDocument>

Notice the inclusion of [TestString]; this is essentially the placeholder that gets "data bound" through the string key/value pair dictionary.

Using the new Key/Value pair databinding

            Dictionary<string, string> strings = new Dictionary<string, string>();
            strings.Add("TestString", "Tester");

            IDocumentPaginatorSource flowDocument =
                XamlTemplatePrinter.RenderFlowDocumentTemplate(
                    Path.Combine(Environment.CurrentDirectory, "TestTemplate.xaml"), strings);

            flowDocument.DocumentPaginator.PageSize = new Size(96 * 8.5, 96 * 11);

            PrintDialog flowPrintDialog = XamlTemplatePrinter.GetPrintDialog();
            if (flowPrintDialog == null)
                return;

            PrintQueue flowPrintQueue = flowPrintDialog.PrintQueue;
            XamlTemplatePrinter.PrintFlowDocument(flowPrintQueue, flowDocument.DocumentPaginator);

Conclusion

Thank you all for your comments and suggestions.  Your questions and comments are fuel for the continued success of this blog.  Please feel free to leave a comment or contact me if you have suggestions or questions.

May 28, 2008 Posted by | C#, WPF, XAML, XPS | 15 Comments

Using FlowDocuments XAML to print XPS Documents. (Part 5)

Introduction

Welcome to Part 5 of the Using FlowDocument XAML to print XPS Documents series.  This article has been a long time coming.  In this article I will focus on creating dynamic XAML content.

Creating a method to Load Xaml Strings

This method simply takes a raw XAML string and loads it into a FlowDocument.

        public static IDocumentPaginatorSource RenderFlowDocumentString(string rawXamlString, object dataContextObject)
        {
            FlowDocument document = XamlReader.Load(new XmlTextReader(new StringReader(rawXamlString))) as FlowDocument;
            if (dataContextObject != null)
            {
                document.DataContext = dataContextObject;
            }
            return document;
        }

Now all we need to do is create some XAML strings.

Using the new Xaml string loader plus some.

        public void PrintGenericList(List<Awards> awardList)
        {
            PrintDialog flowPrintDialog = XamlTemplatePrinter.GetPrintDialog();
            if (flowPrintDialog == null)
                return;

            StringBuilder xamlString = new StringBuilder();

            xamlString.Append("<FlowDocument xmlns=\"http://schemas.microsoft.com/winfx/2006/xaml/presentation\" xmlns:x=\"http://schemas.microsoft.com/winfx/2006/xaml\">");

            foreach (Awards award in awardList)
            {
                xamlString.Append("<Paragraph FontFamily=\"Arial\" Margin=\"20\">");
                xamlString.AppendFormat("<TextBlock Text=\"{0}  {1}\" Margin=\"170,0,0,0\" />", award.SerialNumber, award.ControlNumber);
                xamlString.Append("<TextBlock Text=\"\" Margin=\"170,0,0,0\" />");
                xamlString.Append("</Paragraph>");
            }

            xamlString.Append("</FlowDocument>");

            IDocumentPaginatorSource flowDocument =
                XamlTemplatePrinter.RenderFlowDocumentString(xamlString.ToString(), null);

            flowDocument.DocumentPaginator.PageSize = new Size(96 * 5, 96 * 2);

            PrintQueue flowPrintQueue = flowPrintDialog.PrintQueue;
            XamlTemplatePrinter.PrintFlowDocument(flowPrintQueue, flowDocument.DocumentPaginator);
        }
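
For reference, the Awards type used above is just a simple data holder. The sketch below shows the assumed shape, limited to the two properties the XAML builder reads; it is not the actual class from my application.

// Assumed shape of the Awards item being printed; just enough for the sample.
public class Awards
{
    public string SerialNumber { get; set; }
    public string ControlNumber { get; set; }
}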

Conclusion

In this article I demonstrated building dynamic XAML to print or save as an XPS document.  I am personally using this in an application to format labels that are sent to a printer.

April 30, 2008 Posted by | C#, WPF, XAML, XPS | 4 Comments

XNA Framework GameEngine Development. (Part 19, Hardware Instancing PC Only)

Introduction

Welcome to Part 19 of the XNA Framework GameEngine Development series.  In this article I will discuss the magic that is hardware instancing.  Recent comments have expressed some concerns about poor performance in the engine.  One way to improve performance in the engine is to support instancing.

[Screenshot: part19.jpg]

Hardware instancing

This is by far the easiest method of instancing; I will discuss shader and vertex fetch instancing in later articles.  Essentially, hardware instancing tells the graphics card that you want to draw an object multiple times in the same draw call.  Obviously this will improve performance, since one of the slowest calls you will ever make to a graphics device is a draw call.  We should always look to draw the most geometry in the fewest draw calls possible.

[Diagram: hardwareinstancing.png]

Image courtesy MS XNA Team Site

Since shader model 3.0 we have at our disposal the concept of hardware instancing.  This method of instancing allows us to set up a vertex and index buffer to be drawn multiple times using an array of matrices.  This means pretty much any piece of complex geometry that uses the same effect settings can and should be drawn at the same time.

Starting the Magic

We need a way to distinguish a normal Model from an instanced model.  I do this by adding a few properties to the RoeModel class.  The ModelParts list will contain information about, what else, model parts, and the Instanced property defines whether the model was created for instancing.  Finally, the constructor parameter instanced allows us to load the content and set up the model parts for instanced rendering.

using System;
using System.Collections.Generic;
using System.Text;
using RoeEngine2.Interfaces;
using Microsoft.Xna.Framework.Graphics;
using RoeEngine2.Managers;

namespace RoeEngine2.Models
{
    public class RoeModel : IRoeModel
    {
        public List<RoeModelPart> ModelParts = new List<RoeModelPart>();

        private bool _instanced;
        public bool Instanced
        {
            get { return _instanced; }
        }
 
        private string _fileName;
        /// <summary>
        /// The file name of the asset.
        /// </summary>
        public string FileName
        {
            get { return _fileName; }
            set { _fileName = value; }
        }

        private Model _baseModel;
        public Model BaseModel
        {
            get { return _baseModel; }
        }

        private bool _readyToRender = false;
        ///<summary>
        ///Is the model ready to be rendered.
        ///</summary>
        public bool ReadyToRender
        {
            get { return _readyToRender; }
        }

        public RoeModel(string fileName, bool instanced)
        {
            _instanced = instanced;
            _fileName = fileName;
        }

        /// <summary>
        /// Construct a new RoeModel.
        /// </summary>
        /// <param name="fileName">The asset file name.</param>
        public RoeModel(string fileName)
        {
            _fileName = fileName;
        }

        public void LoadContent()
        {
            if (!String.IsNullOrEmpty(_fileName))
            {
                _baseModel = EngineManager.ContentManager.Load<Model>(_fileName);
                if (_instanced)
                {
                    foreach (ModelMesh mesh in _baseModel.Meshes)
                    {
                        foreach (ModelMeshPart part in mesh.MeshParts)
                        {
                            ModelParts.Add(new RoeModelPart(part, mesh.VertexBuffer, mesh.IndexBuffer));
                        }
                    }
                }
                _readyToRender = true;
            }
        }
    }
}

Most of the work will happen in the model parts class, but here you can see how a model is loaded as normal; then, if it is instanced, we do additional work to load the specific parts.  Please note, we do not have to use any custom processors to create the model parts.  For me this is a huge improvement over the example given on the team development site.

Backstage pass

This is where all the work takes place, in the RoeModelPart class.  Each mesh part will be set up here for prime GPU processing, plus we will extend the vertex declaration to include the instancing magic.

using System;
using System.Collections.Generic;
using System.Text;
using Microsoft.Xna.Framework.Graphics;
using RoeEngine2.Managers;

namespace RoeEngine2.Models
{
    public class RoeModelPart
    {
        private int _primitiveCount;
        public int PrimitiveCount
        {
            get { return _primitiveCount; }
        }

        private int _vertexCount;
        public int VertexCount
        {
            get { return _vertexCount; }
        }

        private int _vertexStride;
        public int VertexStride
        {
            get { return _vertexStride; }
        }

        private VertexDeclaration _vertexDeclaration;
        public VertexDeclaration VertexDeclartion
        {
            get { return _vertexDeclaration; }
        }

        private VertexBuffer _vertexBuffer;
        public VertexBuffer VertexBuffer
        {
            get { return _vertexBuffer; }
        }

        private IndexBuffer _indexBuffer;
        public IndexBuffer IndexBuffer
        {
            get { return _indexBuffer; }
        }

        VertexElement[] originalVertexDeclaration;
 
        internal RoeModelPart(ModelMeshPart part, VertexBuffer vertexBuffer, IndexBuffer indexBuffer)
        {
            _primitiveCount = part.PrimitiveCount;
            _vertexCount = part.NumVertices;
            _vertexStride = part.VertexStride;
            _vertexDeclaration = part.VertexDeclaration;

            _vertexBuffer = vertexBuffer;
            _indexBuffer = indexBuffer;

            originalVertexDeclaration = part.VertexDeclaration.GetVertexElements();

            InitializeHardwareInstancing();
        }

        private void InitializeHardwareInstancing()
        {
            // When using hardware instancing, the instance transform matrix is
            // specified using a second vertex stream that provides 4x4 matrices
            // in texture coordinate channels 1 to 4. We must modify our vertex
            // declaration to include these channels.
            VertexElement[] extraElements = new VertexElement[4];

            short offset = 0;
            byte usageIndex = 1;
            short stream = 1;

            const int sizeOfVector4 = sizeof(float) * 4;

            for (int i = 0; i < extraElements.Length; i++)
            {
                extraElements[i] = new VertexElement(stream, offset,
                                                VertexElementFormat.Vector4,
                                                VertexElementMethod.Default,
                                                VertexElementUsage.TextureCoordinate,
                                                usageIndex);

                offset += sizeOfVector4;
                usageIndex++;
            }

            ExtendVertexDeclaration(extraElements);
        }

        private void ExtendVertexDeclaration(VertexElement[] extraElements)
        {
            // Get rid of the existing vertex declaration.
            _vertexDeclaration.Dispose();

            // Append the new elements to the original format.
            int length = originalVertexDeclaration.Length + extraElements.Length;

            VertexElement[] elements = new VertexElement[length];

            originalVertexDeclaration.CopyTo(elements, 0);

            extraElements.CopyTo(elements, originalVertexDeclaration.Length);

            // Create a new vertex declaration.
            _vertexDeclaration = new VertexDeclaration(EngineManager.Device, elements);
        }
    }
}

The Beautiful Assistant

Now that we have a mesh broken up into its basic parts for fast GPU rendering, we must find a way to store matrices in a simple and easy-to-manage location.  I accomplish this using the SceneGraphManager.  This class will now hold a dictionary keyed by model name, with a list of world matrices per model.  In addition, we also need a way to load up the dictionary and finally draw the instances (a sketch of the AddInstance helper that fills the dictionary appears after the listing below).

        private static Dictionary<string, List<Matrix>> instanceMatrices;

        private static void DrawInstances(GameTime gameTime)
        {
            instancedshaderEffect effect = ShaderManager.GetShader("instance") as instancedshaderEffect;
            if (effect.ReadyToRender)
            {
                effect.View = CameraManager.ActiveCamera.View;
                effect.Projection = CameraManager.ActiveCamera.Projection;

                foreach (string key in instanceMatrices.Keys)
                {
                    RoeModel model = ModelManager.GetModel(key) as RoeModel;
                    if (model.ReadyToRender)
                    {
                        Model meshModel = model.BaseModel;
                        
                        foreach (RoeModelPart part in model.ModelParts)
                        {
                            EngineManager.Device.VertexDeclaration = part.VertexDeclartion;
                            EngineManager.Device.Vertices[0].SetSource(part.VertexBuffer, 0, part.VertexStride);
                            EngineManager.Device.Indices = part.IndexBuffer;
                            effect.BaseEffect.Begin();
                            foreach (EffectPass pass in effect.BaseEffect.CurrentTechnique.Passes)
                            {
                                pass.Begin();
                                DrawHardwareInstancing(instanceMatrices[key].ToArray(), part.VertexCount, part.PrimitiveCount);
                                pass.End();
                            }
                            effect.BaseEffect.End();
                        }
                    }
                }
            }
        }

        private static void DrawHardwareInstancing(Matrix[] matrix, int vertexCount, int primitiveCount)
        {
            const int sizeofMatrix = sizeof(float) * 16;
            int instanceDataSize = sizeofMatrix * matrix.Length;

            DynamicVertexBuffer instanceDataStream = new DynamicVertexBuffer(EngineManager.Device,
                                                                             instanceDataSize,
                                                                             BufferUsage.WriteOnly);

            instanceDataStream.SetData(matrix, 0, matrix.Length, SetDataOptions.Discard);

            VertexStreamCollection vertices = EngineManager.Device.Vertices;

            vertices[0].SetFrequencyOfIndexData(matrix.Length);

            vertices[1].SetSource(instanceDataStream, 0, sizeofMatrix);
            vertices[1].SetFrequencyOfInstanceData(1);

            EngineManager.Device.DrawIndexedPrimitives(PrimitiveType.TriangleList,
                                                       0, 0, vertexCount, 0, primitiveCount);

            // Reset the instancing streams.
            vertices[0].SetSource(null, 0, 0);
            vertices[1].SetSource(null, 0, 0);
        }
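
The AddInstance helper that SceneObjectNode calls in the next section is not part of the listing above, so here is a minimal sketch of how it might be implemented against the instanceMatrices dictionary.

        // Sketch of the AddInstance helper: collect one world matrix per visible
        // instance, keyed by model name, so DrawInstances can render each model's
        // instances in a single batch.
        public static void AddInstance(string modelName, Matrix world)
        {
            if (instanceMatrices == null)
            {
                instanceMatrices = new Dictionary<string, List<Matrix>>();
            }

            if (!instanceMatrices.ContainsKey(modelName))
            {
                instanceMatrices.Add(modelName, new List<Matrix>());
            }

            instanceMatrices[modelName].Add(world);
        }

The per-model lists should be cleared once the instances have been drawn, so stale transforms do not accumulate from frame to frame.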

MisDirection

Now that we have a simple way to store instance objects and render them, we need to load up the dictionary.  This is done in the SceneObjectNode class.  As usual, we do not want to render objects that are culled.  In addition, if we are using instancing we need to bypass all of the occlusion work.  This is a tradeoff that we have to make: either we use occlusion or we use instancing.

        public override void DrawCulling(GameTime gameTime)
        {
            if (SceneObject is IRoeCullable)
            {
                ((IRoeCullable)SceneObject).Culled = false;
                if (CameraManager.ActiveCamera.Frustum.Contains(((IRoeCullable)SceneObject).GetBoundingBoxTransformed()) == ContainmentType.Disjoint)
                {
                    ((IRoeCullable)SceneObject).Culled = true;
                }
                else if (ModelManager.GetModel(SceneObject.ModelName).Instanced)
                {
                    SceneGraphManager.AddInstance(SceneObject.ModelName, SceneObject.World);
                }
                else
                {
                    SceneObject.DrawCulling(gameTime);
                }
            }
        }

        public override void Draw(GameTime gameTime)
        {
            if (SceneObject.ModelName != null && ModelManager.GetModel(SceneObject.ModelName).Instanced)
            {
                return;
            }
            else if (SceneObject is IRoeCullable && ((IRoeCullable)SceneObject).Culled)
            {
                SceneGraphManager.Culled++;
            }
            else if (SceneObject is IRoeOcclusion && ((IRoeOcclusion)SceneObject).Occluded)
            {
                SceneGraphManager.Occluded++;
            }
            else
            {
                SceneObject.Draw(gameTime);
            }
        }

The Grand Finale

Finally, our dictionary is loaded and we are ready to draw; time for some shader work.

#define MAX_SHADER_MATRICES 60

// Array of instance transforms used by the VFetch and ShaderInstancing techniques.
float4x4 instanceTransforms[MAX_SHADER_MATRICES];

// Camera settings.
float4x4 view;
float4x4 projection;

// This sample uses a simple Lambert lighting model.
float3 lightDirection = normalize(float3(-1, -1, -1));
float3 diffuseLight = 1.25;
float3 ambientLight = 0.25;

struct VS_INPUT
{
 float4 Position : POSITION0;
 float3 Normal : NORMAL;
 float2 TexCoord : TEXCOORD0;
};

struct VS_OUTPUT
{
 float4 Position     : POSITION;
 float4 Color  : COLOR0;
 float2 TexCoord : TEXCOORD0;
};

VS_OUTPUT VertexShaderCommon(VS_INPUT input, float4x4 instanceTransform)
{
    VS_OUTPUT output;

    // Apply the world and camera matrices to compute the output position.
    float4 worldPosition = mul(input.Position, instanceTransform);
    float4 viewPosition = mul(worldPosition, view);
    output.Position = mul(viewPosition, projection);

    // Compute lighting, using a simple Lambert model.
    float3 worldNormal = mul(input.Normal, instanceTransform);
    
    float diffuseAmount = max(-dot(worldNormal, lightDirection), 0);
    
    float3 lightingResult = saturate(diffuseAmount * diffuseLight + ambientLight);
    
    output.Color = float4(lightingResult, 1);

    // Copy across the input texture coordinate.
    output.TexCoord = input.TexCoord;

    return output;
};

// On Windows shader 3.0 cards, we can use hardware instancing, reading
// the per-instance world transform directly from a secondary vertex stream.
VS_OUTPUT HardwareInstancingVertexShader(VS_INPUT input,
                                         float4x4 instanceTransform : TEXCOORD1)
{
    return VertexShaderCommon(input, transpose(instanceTransform));
}

// All the different instancing techniques share this same pixel shader.
float4 PixelShaderFunction(VS_OUTPUT input) : COLOR0
{
    return input.Color;
}

// Windows instancing technique for shader 3.0 cards.
technique HardwareInstancing
{
    pass Pass1
    {
        VertexShader = compile vs_3_0 HardwareInstancingVertexShader();
        PixelShader = compile ps_3_0 PixelShaderFunction();
    }
}

 

March 17, 2008 Posted by | C#, XNA | 26 Comments

XNA Framework GameEngine Development. (Part 18, Brute Force Terrain with Physics)

Introduction

Welcome to Part 18 of the XNA Framework GameEngine Development series.  In this article I will introduce terrain rendering concepts.  All good outdoor games need terrain as a primary building block, so it is essential to render terrain as realistically and as efficiently as possible.  There are dozens of algorithmic approaches to terrain rendering; I could literally spend an entire series discussing terrain.  Here I will discuss some of my favorite terrain rendering methods, and maybe a few of your own if you leave suggestions.

[Screenshot: part18.jpg]

Release

Here is the sourcecode.

Content Pipeline

The XNA content pipeline allows us to create custom processors for data.  I use the content pipeline to convert a 2D heightmap into a 3D model.  This is a well known technique for creating simple terrain, but I will describe it here briefly.

  • Create a greyscale image with black being low-lying areas and white being high-altitude areas.
  • Assume the world is a square lattice with vertices at precise intervals.
  • The height at each point in the lattice is taken from the corresponding pixel in the greyscale image.
  • Connect the vertices into triangles.

Create a mesh from any greyscale image whose dimensions are a power of 2 plus 1 and voila, you have a simple brute force heightmap.

BruteForceTerrainProcessor

Create a new Content Pipeline project.  I will be expanding on this project later; for now, add the following content processor code.

using System;
using System.Collections.Generic;
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;
using Microsoft.Xna.Framework.Content.Pipeline;
using Microsoft.Xna.Framework.Content.Pipeline.Graphics;
using Microsoft.Xna.Framework.Content.Pipeline.Processors;

using TInput = Microsoft.Xna.Framework.Content.Pipeline.Graphics.Texture2DContent;
using TOutput = Microsoft.Xna.Framework.Content.Pipeline.Processors.ModelContent;

namespace RoeEngine2ContentPipeline
{
    /// <summary>
    /// This class will be instantiated by the XNA Framework Content Pipeline
    /// to apply custom processing to content data, converting an object of
    /// type TInput to TOutput. The input and output types may be the same if
    /// the processor wishes to alter data without changing its type.
    ///
    /// This should be part of a Content Pipeline Extension Library project.
    ///
    /// TODO: change the ContentProcessor attribute to specify the correct
    /// display name for this processor.
    /// </summary>
    [ContentProcessor(DisplayName = "BruteForceTerrainProcessor")]
    public class BruteForceTerrainProcessor : ContentProcessor<TInput, TOutput>
    {
        public override TOutput Process(TInput input, ContentProcessorContext context)
        {
            MeshBuilder builder = MeshBuilder.StartMesh("Terrain");

            input.ConvertBitmapType(typeof(PixelBitmapContent<float>));

            PixelBitmapContent<float> heightfield;
            heightfield = (PixelBitmapContent<float>)input.Mipmaps[0];

            for (int y = 0; y < heightfield.Height; y++)
            {
                for (int x = 0; x < heightfield.Width; x++)
                {
                    Vector3 position;

                    position.X = (x - heightfield.Width / 2);
                    position.Z = (y - heightfield.Height / 2);
                    position.Y = (heightfield.GetPixel(x, y) - 1);

                    builder.CreatePosition(position);
                }
            }

            int texCoordId = builder.CreateVertexChannel<Vector2>(
                            VertexChannelNames.TextureCoordinate(0));

            for (int y = 0; y < heightfield.Height - 1; y++)
            {
                for (int x = 0; x < heightfield.Width - 1; x++)
                {
                    AddVertex(builder, texCoordId, heightfield.Width, x, y);
                    AddVertex(builder, texCoordId, heightfield.Width, x + 1, y);
                    AddVertex(builder, texCoordId, heightfield.Width, x + 1, y + 1);

                    AddVertex(builder, texCoordId, heightfield.Width, x, y);
                    AddVertex(builder, texCoordId, heightfield.Width, x + 1, y + 1);
                    AddVertex(builder, texCoordId, heightfield.Width, x, y + 1);
                }
            }

            MeshContent terrain = builder.FinishMesh();

            ModelContent model = context.Convert<MeshContent, ModelContent>(terrain, "ModelProcessor");

            model.Tag = new HeightMapContent(heightfield);

            return model;
        }

        /// <summary>
        /// Helper for adding a new triangle vertex to a MeshBuilder,
        /// along with an associated texture coordinate value.
        /// </summary>
        static void AddVertex(MeshBuilder builder, int texCoordId, int w, int x, int y)
        {
            builder.SetVertexChannelData(texCoordId, new Vector2(x, y) / w);

            builder.AddTriangleVertex(x + y * w);
        }
    }
}

Now we need a way to write and read the data that we want to store in the model.Tag field.

using System;
using System.Collections.Generic;
using System.Text;
using Microsoft.Xna.Framework.Content.Pipeline.Graphics;
using Microsoft.Xna.Framework.Content.Pipeline.Serialization.Compiler;
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Content;

namespace RoeEngine2ContentPipeline
{
    public class HeightMapContent
    {
        private float[,] _height;

        public float[,] Height
        {
            get { return _height; }
            set { _height = value; }
        }

        public HeightMapContent(PixelBitmapContent<float> bitmap)
        {
            _height = new float[bitmap.Width, bitmap.Height];
            for (int y = 0; y < bitmap.Height; y++)
            {
                for (int x = 0; x < bitmap.Width; x++)
                {
                    // the pixels will vary from 0 (black) to 1 (white).
                    // by subtracting 1, our heights vary from -1 to 0, which we then
                    // multiply by the "bumpiness" to get our final height.
                    _height[x, y] = (bitmap.GetPixel(x, y) - 1);
                }
            }
        }
    }

    /// <summary>
    /// A TypeWriter for HeightMapContent, which tells the content pipeline how to save
    /// the data in HeightMapContent. This class should match HeightMapReader: whatever
    /// the writer writes, the reader should read.
    /// </summary>

    [ContentTypeWriter]
    public class HeightMapInfoWriter : ContentTypeWriter<HeightMapContent>
    {
        protected override void Write(ContentWriter output, HeightMapContent value)
        {
            output.Write(value.Height.GetLength(0));
            output.Write(value.Height.GetLength(1));
            foreach (float height in value.Height)
            {
                output.Write(height);
            }
        }

        public override string GetRuntimeReader(TargetPlatform targetPlatform)
        {
            return typeof(HeightMapReader).AssemblyQualifiedName;
        }

        public override string GetRuntimeType(TargetPlatform targetPlatform)
        {
            return typeof(HeightMap).AssemblyQualifiedName;
        }
    }

    public class HeightMap
    {
        public float[,] Heights;

        public HeightMap(float[,] heights)
        {
            Heights = heights;
        }
    }

    public class HeightMapReader : ContentTypeReader<HeightMap>
    {
        protected override HeightMap Read(ContentReader input, HeightMap existingInstance)
        {
            int width = input.ReadInt32();
            int height = input.ReadInt32();
            float[,] heights = new float[width, height];

            for (int x = 0; x < width; x++)
            {
                for (int z = 0; z < height; z++)
                {
                    heights[x, z] = input.ReadSingle();
                }
            }

            return new HeightMap(heights);
        }
    }
}

I chose to create the reader and writer in the same namespace; this makes getting the runtime reader and type much easier.

Creating Terrain SceneObjects

Now that we have a content processor, we can import the model and render it in our game.  Include a reference to the content pipeline project both in the content project's references and in the references of the main engine project.

Create a BruteForceTerrain SceneObject

using System;
using System.Collections.Generic;
using System.Text;
using RoeEngine2.SceneObject.BaseObjects;
using RoeEngine2.Interfaces;
using JigLibX.Physics;
using JigLibX.Collision;
using Microsoft.Xna.Framework;
using RoeEngine2.Models;
using RoeEngine2.Managers;
using JigLibX.Utils;
using Microsoft.Xna.Framework.Graphics;
using RoeEngine2ContentPipeline;

namespace RoeEngine2.SceneObject.StandardObjects
{
    public class BruteForceTerrain: OccluderSceneObject, IRoePhysics
    {
        private Body _body;
        public Body Body
        {
            get { return _body; }
        }

        private CollisionSkin _collisionSkin;
        public CollisionSkin CollisionSkin
        {
            get { return _collisionSkin; }
        } 

        public BruteForceTerrain()
        {
            _body = new Body();
            _collisionSkin = new CollisionSkin(null);

            RoeModel terrainModel = new RoeModel("Content/Models/TerrainRoE");
            ModelManager.AddModel(terrainModel, "terrainModel");

            this.ModelName = "terrainModel";
            this.OcclusionModelName = "terrainModel";
        }

        public BruteForceTerrain(Vector3 newPosition)
        {
            Position = newPosition;

            _body = new Body();
            _collisionSkin = new CollisionSkin(null);

            RoeModel boxmodel = new RoeModel("Content/Models/Box");

            ModelManager.AddModel(boxmodel, "boxmodel");

            this.ModelName = "boxmodel";
            this.OcclusionModelName = "boxmodel";
        }

        public Vector3 SetMass(float mass)
        {
            return Vector3.Zero;
        }

        public override void Update(GameTime gameTime)
        {
            base.Update(gameTime);

            IRoeModel model = ModelManager.GetModel(ModelName);
            if (model != null && model.ReadyToRender && !ReadyToRender)
            {
                HeightMap heightMap = model.BaseModel.Tag as HeightMap;

                Array2D field = new Array2D(heightMap.Heights.GetUpperBound(0), heightMap.Heights.GetUpperBound(1));

                for (int x = 0; x < heightMap.Heights.GetUpperBound(0); x++)
                {
                    for (int z = 0; z < heightMap.Heights.GetUpperBound(1); z++)
                    {
                        field.SetAt(x, z, heightMap.Heights[x, z] * Scale.Y + Position.Y);
                    }
                }

                _body.MoveTo(Position, Matrix.Identity);

                _collisionSkin.AddPrimitive(new JigLibX.Geometry.Heightmap(field, Position.X, Position.Z, Scale.X, Scale.Z),
                                            (int)MaterialTable.MaterialID.UserDefined,
                                            new MaterialProperties(0.7f, 0.7f, 0.6f));

                PhysicsSystem.CurrentPhysicsSystem.CollisionSystem.AddCollisionSkin(_collisionSkin);

                ReadyToRender = true;
            }
        }

        public override void UnloadContent()
        {
            PhysicsManager.RemoveObject(this);
        }

        public override string ToString()
        {
            return "Brute Force Terrain";
        }
    }
}

Simple; now we can use this new class to render the terrain object.

            BruteForceTerrain terrain = new BruteForceTerrain();
            terrain.Position = new Vector3(0f, -15f, 0f);
            terrain.Scale = new Vector3(10, 10, 10);
            SceneGraphManager.AddObject(terrain);

Conclusion

In this article I introduced the content pipeline and simple brute-force terrain.  This article will be the kickoff for a terrain rendering sub-series.  I look forward to your comments and suggestions.

March 12, 2008 Posted by | C#, XBOX360, XNA | 9 Comments

XNA Framework GameEngine Development. (Part 17, Hoffman Atmosphere)

Introduction

Welcome to Part 17 of the XNA Framework GameEngine Development series.  In this article I will be introducing non-renderable scene objects to the engine.  Non-renderable scene objects are a unique subset of the SceneObject class that contain no geometry.  This is not necessarily interesting, except that they are loaded into the scene management system like any normal renderable scene object and do take their turn with the renderer.  Mainly, I use this type of scene object to set up special shader commands; enter the Hoffman scattering algorithm.

[Screenshot: part17.jpg]

Release

Here is the code.  I really want to hear your ideas for making the engine better, so please leave a comment.

Hoffman Light Scattering

Hoffman/Preetham light scattering is an algorithmic approach to modeling how light deflects off gas particles in the atmosphere.  The main goal of the algorithm is to define two angles of approach for light particles entering the eye.  In the Hoffman simulation these angles are determined analytically, based on some pretty intense research that is beyond the scope of this series.  What this article will focus on is making it look good and render fast in a gaming situation.
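
As a rough summary, and this matches what the shader at the end of this article computes, the light arriving at the eye along a view ray of length s is split into an extinction factor and an in-scattering term, where theta is the angle between the view direction and the sun:

Fex(s)        = exp(-(BetaR + BetaM) * s)
Lin(s, theta) = ((BetaR(theta) + BetaM(theta)) / (BetaR + BetaM)) * (1 - Fex(s)) * Esun

Here BetaR and BetaM are the Rayleigh and Mie scattering coefficients, the angle-dependent terms are those coefficients weighted by their phase functions, and Esun is the sun color and intensity.  The HoffmanAtmosphere class below precomputes the Beta constants on the CPU and hands them to the shader each frame.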

Non-Geometric SceneObjects

This is a new concept in the engine: essentially, a Non-Geometric SceneObject is exactly that, a scene object that does not have geometry to render.  This type of scene object gives us access to all of the events that a normal scene object would have, but without actually sending polygons to the graphics card.  As a side effect, which I will show in an upcoming article, this class of scene object will allow us to share shader information between geometric scene objects.

Now that we have this new concept available, we can create Atmosphere objects.  An atmosphere object will be used to set up outdoor lighting and fogging; almost every object in an outdoor environment can use the atmosphere to render major light reflections.

Enter the HoffmanAtmosphere.  This Non-Geometric SceneObject sets up all of the math necessary to compute the two attenuation angles required by the Hoffman light scattering algorithm, as well as storing some important variables.

using System;
using System.Collections.Generic;
using System.Text;
using Microsoft.Xna.Framework;
using RoeEngine2.Shaders;
using RoeEngine2.Managers;

#if !XBOX360
using System.ComponentModel;
using RoeEngine2.Helpers;
#endif

namespace RoeEngine2.SceneObject.NonGeometricObjects
{
    public class HoffmanAtmosphere : RoeSceneObject
    {
        private float _sunDirection = 0;
        [PropertyAttribute(typeof(float), 0, 180, 1)]
        public float SunDirection
        {
            get { return _sunDirection; }
            set
            {
                _sunDirection = value;
                if (_sunDirection > 180.0f)
                    _sunDirection = 0;
                else if (_sunDirection < 0.0f)
                    _sunDirection = 0;
                Position = new Vector3(0.0f, (float)Math.Sin(MathHelper.ToRadians(value)), (float)Math.Cos(MathHelper.ToRadians(value)));
            }
        }

        private float _sunIntensity = 1.0f;
        [PropertyAttribute(typeof(float), 0, 10, 1)]
        public float SunIntensity
        {
            get { return _sunIntensity; }
            set { _sunIntensity = value; }
        }

        private float _turbitity = 1.0f;
        [PropertyAttribute(typeof(float), 1, 10, 1)]
        public float Turbitity
        {
            get { return _turbitity; }
            set { _turbitity = value; }
        }

        private Vector3 _hGg = new Vector3(0.9f, 0.9f, 0.9f);
        public Vector3 HGg
        {
            get { return _hGg; }
            set { _hGg = value; }
        }

        private float _inscatteringMultiplier = 1.0f;
        [PropertyAttribute(typeof(float), 0, 10, 1)]
        public float InscatteringMultiplier
        {
            get { return _inscatteringMultiplier; }
            set { _inscatteringMultiplier = value; }
        }

        private float _betaRayMultiplier = 8.0f;
        [PropertyAttribute(typeof(float), 0, 20, 1)]
        public float BetaRayMultiplier
        {
            get { return _betaRayMultiplier; }
            set { _betaRayMultiplier = value; }
        }

        private float _betaMieMultiplier = 0.00005f;
        [PropertyAttribute(typeof(float), 0, 10, 100000)]
        public float BetaMieMultiplier
        {
            get { return _betaMieMultiplier; }
            set { _betaMieMultiplier = value; }
        }

        private Vector3 _betaRPlusBetaM;
        private Vector3 _betaDashR;
        private Vector3 _betaDashM;
        private Vector3 _oneOverBetaRPlusBetaM;
        private Vector4 _multipliers;
        private Vector4 _sunColorAndIntensity;

        private Vector3 _betaRay;
        private Vector3 _betaDashRay;
        private Vector3 _betaMie;
        private Vector3 _betaDashMie;

        public override string ToString()
        {
            return "Atmosphere";
        }

        public HoffmanAtmosphere()
        {
            hoffmanshaderEffect effect = new hoffmanshaderEffect();

            ShaderManager.AddShader(effect, "hoffman");
            Material.Shader = "hoffman";

            ReadyToRender = true;

            SunDirection = 0;

            const float n = 1.0003f;
            const float N = 2.545e25f;
            const float pn = 0.035f;

            float[] lambda = new float[3];
            float[] lambda2 = new float[3];
            float[] lambda4 = new float[3];

            lambda[0] = 1.0f / 650e-9f;   // red
            lambda[1] = 1.0f / 570e-9f;   // green
            lambda[2] = 1.0f / 475e-9f;   // blue

            for (int i = 0; i < 3; ++i)
            {
                lambda2[i] = lambda[i] * lambda[i];
                lambda4[i] = lambda2[i] * lambda2[i];
            }

            Vector3 vLambda2 = new Vector3(lambda2[0], lambda2[1], lambda2[2]);
            Vector3 vLambda4 = new Vector3(lambda4[0], lambda4[1], lambda4[2]);

            // Rayleigh scattering constants

            const float temp = (float)(Math.PI * Math.PI * (n * n - 1.0f) * (n * n - 1.0f) * (6.0f + 3.0f * pn) / (6.0f - 7.0f * pn) / N);
            const float beta = (float)(8.0f * temp * Math.PI / 3.0f);

            _betaRay = beta * vLambda4;

            const float betaDash = temp / 2.0f;

            _betaDashRay = betaDash * vLambda4;

            // Mie scattering constants

            const float T = 2.0f;
            const float c = (6.544f * T - 6.51f) * 1e-17f;
            const float temp2 = (float)(0.434f * c * (2.0f * Math.PI) * (2.0f * Math.PI) * 0.5f);

            _betaDashMie = temp2 * vLambda2;

            float[] K = new float[3] { 0.685f, 0.679f, 0.670f };
            const float temp3 = (float)(0.434f * c * Math.PI * (2.0f * Math.PI) * (2.0f * Math.PI));

            Vector3 vBetaMieTemp = new Vector3(K[0] * lambda2[0], K[1] * lambda2[1], K[2] * lambda2[2]);

            _betaMie = temp3 * vBetaMieTemp;
        }

        public override void Draw(GameTime gameTime)
        {
            if (!ReadyToRender)
            {
                ReadyToRender = ShaderManager.GetShader("hoffman").ReadyToRender;
            }
            else
            {
                Vector3 vZenith = new Vector3(0.0f, 1.0f, 0.0f);

                float thetaS = Vector3.Dot(Position, vZenith);
                thetaS = (float)(Math.Acos(thetaS));

                ComputeAttenuation(thetaS);
                SetMaterialProperties();
            }
        }

        private void ComputeAttenuation(float thetaS)
        {
            float beta = 0.04608365822050f * _turbitity - 0.04586025928522f;
            float tauR, tauA;
            float[] fTau = new float[3];
            float m = (float)(1.0f / (Math.Cos(thetaS) + 0.15f * Math.Pow(93.885f - thetaS / Math.PI * 180.0f, -1.253f)));  // Relative Optical Mass
            float[] lambda = new float[3] { 0.65f, 0.57f, 0.475f };

            for (int i = 0; i < 3; ++i)
            {
                // Rayleigh Scattering
                // lambda in um.
                tauR = (float)(Math.Exp(-m * 0.008735f * Math.Pow(lambda[i], -4.08f)));

                // Aerosal (water + dust) attenuation
                // beta - amount of aerosols present
                // alpha - ratio of small to large particle sizes. (0:4,usually 1.3)
                const float alpha = 1.3f;
                tauA = (float)(Math.Exp(-m * beta * Math.Pow(lambda[i], -alpha)));  // lambda should be in um

                fTau[i] = tauR * tauA;
            }

            _sunColorAndIntensity = new Vector4(fTau[0], fTau[1], fTau[2], _sunIntensity * 100.0f);

        }

        private void SetMaterialProperties()
        {
            float reflectance = 0.1f;

            Vector3 vecBetaR = _betaRay * _betaRayMultiplier;
            _betaDashR = _betaDashRay * _betaRayMultiplier;
            Vector3 vecBetaM = _betaMie * _betaMieMultiplier;
            _betaDashM = _betaDashMie * _betaMieMultiplier;
            _betaRPlusBetaM = vecBetaR + vecBetaM;
            _oneOverBetaRPlusBetaM = new Vector3(1.0f / _betaRPlusBetaM.X, 1.0f / _betaRPlusBetaM.Y, 1.0f / _betaRPlusBetaM.Z);
            Vector3 vecG = new Vector3(1.0f - _hGg.X * _hGg.X, 1.0f + _hGg.X * _hGg.X, 2.0f * _hGg.X);
            _multipliers = new Vector4(_inscatteringMultiplier, 0.138f * reflectance, 0.113f * reflectance, 0.08f * reflectance);

            hoffmanshaderEffect hoffmanShader = ShaderManager.GetShader(Material.Shader) as hoffmanshaderEffect;

            hoffmanShader.SunDirection = Position;
            hoffmanShader.BetaRPlusBetaM = _betaRPlusBetaM;
            hoffmanShader.HGg = HGg;
            hoffmanShader.BetaDashR = _betaDashR;
            hoffmanShader.BetaDashM = _betaDashM;
            hoffmanShader.OneOverBetaRPlusBetaM = _oneOverBetaRPlusBetaM;
            hoffmanShader.Multipliers = _multipliers;
            hoffmanShader.SunColorAndIntensity = _sunColorAndIntensity;
        }
    }
}

Introduction to HLSL - YEAH RIGHT!

I feel pretty bad about using this shader as the first example of High Level Shader Language (HLSL); I will most likely write a sub-series of articles to help introduce shaders once I am finished with the major pieces of the engine.  This shader has been discussed at length on the GameDev.net site, so I will not bore you with the details here.  Ignore the terrain stuff for now; I will be demonstrating that soon.

float4x4 worldViewProject;
float4x4 worldView;

float3 sunDirection;
float3 betaRPlusBetaM;
float3 hGg;
float3 betaDashR;
float3 betaDashM;
float3 oneOverBetaRPlusBetaM;
float4 multipliers;
float4 sunColorAndIntensity;

float3 groundCursorPosition;
bool showGroundCursor;

texture terrainAlpha, terrainBreakup, terrainOne, terrainTwo, terrainThree, terrainFour, groundCursor;

sampler samplergroundCursor = sampler_state
{
   Texture = <groundCursor>;
   ADDRESSU = CLAMP;
   ADDRESSV = CLAMP;
   MAGFILTER = LINEAR;
   MINFILTER = LINEAR;
   MIPFILTER = LINEAR;
};

sampler samplerAlpha = sampler_state
{
 texture = <terrainAlpha>;
 MINFILTER = ANISOTROPIC;
 MAGFILTER = ANISOTROPIC;
 MIPFILTER = ANISOTROPIC;
};

sampler samplerOne = sampler_state
{
 texture = <terrainOne>;
 MINFILTER = ANISOTROPIC;
 MAGFILTER = ANISOTROPIC;
 MIPFILTER = ANISOTROPIC;
 ADDRESSU = WRAP;
 ADDRESSV = WRAP;
};

sampler samplerTwo = sampler_state
{
 texture = <terrainTwo>;
 MINFILTER = ANISOTROPIC;
 MAGFILTER = ANISOTROPIC;
 MIPFILTER = ANISOTROPIC;
 ADDRESSU = WRAP;
 ADDRESSV = WRAP;
};

sampler samplerThree = sampler_state
{
 texture = <terrainThree>;
 MINFILTER = ANISOTROPIC;
 MAGFILTER = ANISOTROPIC;
 MIPFILTER = ANISOTROPIC;
 ADDRESSU = WRAP;
 ADDRESSV = WRAP;
};

sampler samplerFour = sampler_state
{
 texture = <terrainFour>;
 MINFILTER = ANISOTROPIC;
 MAGFILTER = ANISOTROPIC;
 MIPFILTER = ANISOTROPIC;
 ADDRESSU = WRAP;
 ADDRESSV = WRAP;
};

float4 constants = { 0.25f, 1.4426950f, 0.5f, 0.0f };

struct VS_INPUT
{
 float4 Position : POSITION0;
 float3 Normal : NORMAL;
 float2 TexCoord : TEXCOORD0;
};

struct VS_OUTPUT
{
 float4 Position     : POSITION;
 float2 TerrainCoord : TEXCOORD0;
 float3 Normal  : TEXCOORD1;
 float3 Lin          : COLOR0;
 float3 Fex          : COLOR1;
};

VS_OUTPUT HoffmanShader(VS_INPUT Input)
{ 
 float4 worldPos = mul(Input.Position, worldView);
 float3 viewDir = normalize(worldPos.xyz);
 float distance = length(worldPos.xyz);
 
 float3 sunDir= normalize(mul(float4(sunDirection, 0.0), worldView ).xyz);
 
 float theta = dot(sunDir, viewDir);
 
 // 
 // Phase1 and Phase2
 //

 float phase1 = 1.0 + theta * theta;
 float phase2 = pow( rsqrt( hGg.y - hGg.z * theta ), 3 ) * hGg.x;
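 // Note: 'theta' above actually holds cos(theta). phase1 is the (unnormalized) Rayleigh
 // phase function 1 + cos^2(theta); phase2 is the Henyey-Greenstein phase function
 // (1 - g^2) / (1 + g^2 - 2*g*cos(theta))^(3/2), with hGg packed as (1-g^2, 1+g^2, 2g)
 // in SetMaterialProperties() on the C# side.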

 //
 // Extinction term
 //

     float3 extinction      = exp( -betaRPlusBetaM * distance * constants.x );
     float3 totalExtinction = extinction * multipliers.yzw;
    
 //
 // Inscattering term
 //

 float3 betaRay = betaDashR * phase1;
 float3 betaMie = betaDashM * phase2;

 float3 inscatter = (betaRay + betaMie) * oneOverBetaRPlusBetaM * (1.0 - extinction);
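 // Classic single-scattering result: the light added along the view ray is
 // (betaR*phaseR + betaM*phaseM) / (betaR + betaM) * (1 - extinction), i.e. the fraction
 // of sunlight scattered toward the viewer that has not already been extinguished.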

 //
 // Apply inscattering contribution factors
 //

 inscatter *= multipliers.x;
 //
 // Scale with sun color & intensity
 //

 inscatter       *= sunColorAndIntensity.xyz * sunColorAndIntensity.w;
 totalExtinction *= sunColorAndIntensity.xyz * sunColorAndIntensity.w;

 VS_OUTPUT Output;
 Output.Position = mul(Input.Position, worldViewProject);
  Output.TerrainCoord = Input.TexCoord.xy;
  Output.Normal = Input.Normal;
 Output.Lin = inscatter;
 Output.Fex = totalExtinction;

 return Output;
};

struct PS_INPUT
{
 float4 Position : POSITION;
 float2 TerrainCoord : TEXCOORD0;
 float3 Normal : TEXCOORD1;
 float3 Lin : COLOR0;
 float3 Fex : COLOR1;
};

float4 SkyShader(PS_INPUT Input) : COLOR0
{
 return float4(Input.Lin, 1.0f);
};

float4 TerrainShader(PS_INPUT Input) : COLOR0
{
 Input.Normal = normalize(Input.Normal);

 vector alphaSamp = tex2D(samplerAlpha, Input.TerrainCoord);
 vector oneSamp = tex2D(samplerOne, float2(Input.TerrainCoord.x * 2048, Input.TerrainCoord.y * 2048));
 vector twoSamp = tex2D(samplerTwo, float2(Input.TerrainCoord.x * 2048, Input.TerrainCoord.y * 2048));
 vector threeSamp = tex2D(samplerThree, float2(Input.TerrainCoord.x * 2048, Input.TerrainCoord.y * 2048));
 vector fourSamp = tex2D(samplerFour, float2(Input.TerrainCoord.x * 2048, Input.TerrainCoord.y * 2048));
 
 float4 tester1 = 1.0 - alphaSamp.a;
 float4 tester2 = 1.0 - alphaSamp.b;
 float4 tester3 = 1.0 - alphaSamp.g;
 float4 tester4 = 1.0 - alphaSamp.r;
 
 float4 tester = lerp(threeSamp, oneSamp, saturate(dot(float3(0, 1, 0), Input.Normal) * 2));
 
 vector l = alphaSamp.a * oneSamp + tester1 * tester;
 vector m = alphaSamp.b * twoSamp + tester2 * l;
 vector o = alphaSamp.g * threeSamp + tester3 * m;
 vector p = alphaSamp.r * fourSamp + tester4 * o;
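  // Layered texture splatting: each alpha-map channel (a, b, g, r) blends its terrain
  // texture over the accumulated result of the layers beneath it, while 'tester' supplies
  // a slope-based fallback between texture three (steep) and texture one (flat).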
  
 float4 albedo = saturate((dot(normalize(sunDirection), Input.Normal) + .9f)) * p;
 
 albedo *= float4(Input.Fex, 1.0f);
 albedo += float4(Input.Lin, 1.0f);
 
 return albedo;
};

float4 TerrainShaderWithCursor(PS_INPUT Input) : COLOR0
{
 Input.Normal = normalize(Input.Normal);

 vector alphaSamp = tex2D(samplerAlpha, Input.TerrainCoord);
 vector oneSamp = tex2D(samplerOne, float2(Input.TerrainCoord.x * 2048, Input.TerrainCoord.y * 2048));
 vector twoSamp = tex2D(samplerTwo, float2(Input.TerrainCoord.x * 2048, Input.TerrainCoord.y * 2048));
 vector threeSamp = tex2D(samplerThree, float2(Input.TerrainCoord.x * 2048, Input.TerrainCoord.y * 2048));
 vector fourSamp = tex2D(samplerFour, float2(Input.TerrainCoord.x * 2048, Input.TerrainCoord.y * 2048));
 
 float4 tester1 = 1.0 - alphaSamp.a;
 float4 tester2 = 1.0 - alphaSamp.b;
 float4 tester3 = 1.0 - alphaSamp.g;
 float4 tester4 = 1.0 - alphaSamp.r;
 
 float4 tester0 = lerp(threeSamp, oneSamp, saturate(dot(float3(0, 1, 0), Input.Normal) * 2));
 
 vector l = alphaSamp.a * oneSamp + tester1 * tester0;
 vector m = alphaSamp.b * twoSamp + tester2 * l;
 vector o = alphaSamp.g * threeSamp + tester3 * m;
 vector p = alphaSamp.r * fourSamp + tester4 * o;
  
 float4 albedo = saturate((dot(normalize(sunDirection), Input.Normal) + .9f)) * p;
 
 albedo *= float4(Input.Fex, 1.0f);
 albedo += float4(Input.Lin, 1.0f);
 
 if(showGroundCursor)
 {
  float cursorScale = 40.0f;
  albedo += tex2D(samplergroundCursor,
   (Input.TerrainCoord * (cursorScale)) -
   (groundCursorPosition.xz * (cursorScale)) + 0.5f);
 }
 
 return albedo;
};

float4 Wireframe( PS_INPUT Input ) : COLOR0
{
    return float4( 1, 1, 1, 1 );
};

technique Sky
{
 pass P0
 {
  CullMode = CCW;
  FillMode = Solid;
  VertexShader = compile vs_3_0 HoffmanShader();
  PixelShader = compile ps_3_0 SkyShader(); 
 }
};

technique SkyWireframe
{
 pass P0
 {
  CullMode = CCW;
  FillMode = Solid;
  VertexShader = compile vs_3_0 HoffmanShader();
  PixelShader = compile ps_3_0 SkyShader(); 
 }
 pass P1
 {
  CullMode = CCW;
  FillMode = Wireframe;
  VertexShader = compile vs_3_0 HoffmanShader();
  PixelShader = compile ps_1_1 Wireframe();
 }
};

technique Terrain
{
 pass P0
 {
  CullMode = CCW;
  FillMode = Solid;
  VertexShader = compile vs_3_0 HoffmanShader();
  PixelShader = compile ps_3_0 TerrainShader();  
 }
};

technique TerrainWithCursor
{
 pass P0
 {
  CullMode = CCW;
  FillMode = Solid;
  VertexShader = compile vs_3_0 HoffmanShader();
  PixelShader = compile ps_3_0 TerrainShaderWithCursor();  
 }
};

technique TerrainWireframe
{
 pass P0
 {
  CullMode = CCW;
  FillMode = Solid;
  VertexShader = compile vs_3_0 HoffmanShader();
  PixelShader = compile ps_3_0 TerrainShader(); 
  
 }
  pass P1
 {
  CullMode = CCW;
  FillMode = Wireframe;
  VertexShader = compile vs_3_0 HoffmanShader();
  PixelShader = compile ps_1_1 Wireframe();
 }
};

Now that we have a working shader, it is time to create our first shader proxy.  Remember back in Part 4 of this series, we created an effect code generator: an executable that produces a C# proxy class when you drag a .fx file onto it.  You will get something like the following.

using System;
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Content;
using Microsoft.Xna.Framework.Graphics;
using RoeEngine2.Interfaces;

namespace RoeEngine2.Shaders
{
 public class hoffmanshaderEffect : IRoeShader
 {
  public enum Techniques
  {
   Sky,
   SkyWireframe,
   Terrain,
   TerrainWithCursor,
   TerrainWireframe,
  }

  private Effect _baseEffect;
  ///<summary>
  ///Gets the underlying Effect.
  ///</summary>
  public Effect BaseEffect
  {
   get { return _baseEffect; }
  }

  private bool _readyToRender = false;
  ///<summary>
  ///Is the shader ready to be rendered.
  ///</summary>
  public bool ReadyToRender
  {
   get { return _readyToRender; }
  }

  #region Effect Parameters

  private EffectParameter _worldViewProjectParam;
  public Matrix WorldViewProject
  {
   get
   {
    if (_worldViewProjectParam == null)
     throw new Exception("Cannot get value of WorldViewProject; WorldViewProject EffectParameter is null.");
    return _worldViewProjectParam.GetValueMatrix();
   }
   set
   {
    if (_worldViewProjectParam == null)
     throw new Exception("Cannot set value of WorldViewProject; WorldViewProject EffectParameter is null.");
    _worldViewProjectParam.SetValue(value);
   }
  }

  private EffectParameter _worldViewParam;
  public Matrix WorldView
  {
   get
   {
    if (_worldViewParam == null)
     throw new Exception("Cannot get value of WorldView; WorldView EffectParameter is null.");
    return _worldViewParam.GetValueMatrix();
   }
   set
   {
    if (_worldViewParam == null)
     throw new Exception("Cannot set value of WorldView; WorldView EffectParameter is null.");
    _worldViewParam.SetValue(value);
   }
  }

  private EffectParameter _sunDirectionParam;
  public Vector3 SunDirection
  {
   get
   {
    if (_sunDirectionParam == null)
     throw new Exception("Cannot get value of SunDirection; SunDirection EffectParameter is null.");
    return _sunDirectionParam.GetValueVector3();
   }
   set
   {
    if (_sunDirectionParam == null)
     throw new Exception("Cannot set value of SunDirection; SunDirection EffectParameter is null.");
    _sunDirectionParam.SetValue(value);
   }
  }

  private EffectParameter _betaRPlusBetaMParam;
  public Vector3 BetaRPlusBetaM
  {
   get
   {
    if (_betaRPlusBetaMParam == null)
     throw new Exception("Cannot get value of BetaRPlusBetaM; BetaRPlusBetaM EffectParameter is null.");
    return _betaRPlusBetaMParam.GetValueVector3();
   }
   set
   {
    if (_betaRPlusBetaMParam == null)
     throw new Exception("Cannot set value of BetaRPlusBetaM; BetaRPlusBetaM EffectParameter is null.");
    _betaRPlusBetaMParam.SetValue(value);
   }
  }

  private EffectParameter _hGgParam;
  public Vector3 HGg
  {
   get
   {
    if (_hGgParam == null)
     throw new Exception("Cannot get value of HGg; HGg EffectParameter is null.");
    return _hGgParam.GetValueVector3();
   }
   set
   {
    if (_hGgParam == null)
     throw new Exception("Cannot set value of HGg; HGg EffectParameter is null.");
    _hGgParam.SetValue(value);
   }
  }

  private EffectParameter _betaDashRParam;
  public Vector3 BetaDashR
  {
   get
   {
    if (_betaDashRParam == null)
     throw new Exception("Cannot get value of BetaDashR; BetaDashR EffectParameter is null.");
    return _betaDashRParam.GetValueVector3();
   }
   set
   {
    if (_betaDashRParam == null)
     throw new Exception("Cannot set value of BetaDashR; BetaDashR EffectParameter is null.");
    _betaDashRParam.SetValue(value);
   }
  }

  private EffectParameter _betaDashMParam;
  public Vector3 BetaDashM
  {
   get
   {
    if (_betaDashMParam == null)
     throw new Exception("Cannot get value of BetaDashM; BetaDashM EffectParameter is null.");
    return _betaDashMParam.GetValueVector3();
   }
   set
   {
    if (_betaDashMParam == null)
     throw new Exception("Cannot set value of BetaDashM; BetaDashM EffectParameter is null.");
    _betaDashMParam.SetValue(value);
   }
  }

  private EffectParameter _oneOverBetaRPlusBetaMParam;
  public Vector3 OneOverBetaRPlusBetaM
  {
   get
   {
    if (_oneOverBetaRPlusBetaMParam == null)
     throw new Exception("Cannot get value of OneOverBetaRPlusBetaM; OneOverBetaRPlusBetaM EffectParameter is null.");
    return _oneOverBetaRPlusBetaMParam.GetValueVector3();
   }
   set
   {
    if (_oneOverBetaRPlusBetaMParam == null)
     throw new Exception("Cannot set value of OneOverBetaRPlusBetaM; OneOverBetaRPlusBetaM EffectParameter is null.");
    _oneOverBetaRPlusBetaMParam.SetValue(value);
   }
  }

  private EffectParameter _multipliersParam;
  public Vector4 Multipliers
  {
   get
   {
    if (_multipliersParam == null)
     throw new Exception("Cannot get value of Multipliers; Multipliers EffectParameter is null.");
    return _multipliersParam.GetValueVector4();
   }
   set
   {
    if (_multipliersParam == null)
     throw new Exception("Cannot set value of Multipliers; Multipliers EffectParameter is null.");
    _multipliersParam.SetValue(value);
   }
  }

  private EffectParameter _sunColorAndIntensityParam;
  public Vector4 SunColorAndIntensity
  {
   get
   {
    if (_sunColorAndIntensityParam == null)
     throw new Exception("Cannot get value of SunColorAndIntensity; SunColorAndIntensity EffectParameter is null.");
    return _sunColorAndIntensityParam.GetValueVector4();
   }
   set
   {
    if (_sunColorAndIntensityParam == null)
     throw new Exception("Cannot set value of SunColorAndIntensity; SunColorAndIntensity EffectParameter is null.");
    _sunColorAndIntensityParam.SetValue(value);
   }
  }

  private EffectParameter _groundCursorPositionParam;
  public Vector3 GroundCursorPosition
  {
   get
   {
    if (_groundCursorPositionParam == null)
     throw new Exception("Cannot get value of GroundCursorPosition; GroundCursorPosition EffectParameter is null.");
    return _groundCursorPositionParam.GetValueVector3();
   }
   set
   {
    if (_groundCursorPositionParam == null)
     throw new Exception("Cannot set value of GroundCursorPosition; GroundCursorPosition EffectParameter is null.");
    _groundCursorPositionParam.SetValue(value);
   }
  }

  private EffectParameter _showGroundCursorParam;
  public bool ShowGroundCursor
  {
   get
   {
    if (_showGroundCursorParam == null)
     throw new Exception("Cannot get value of ShowGroundCursor; ShowGroundCursor EffectParameter is null.");
    return _showGroundCursorParam.GetValueBoolean();
   }
   set
   {
    if (_showGroundCursorParam == null)
     throw new Exception("Cannot set value of ShowGroundCursor; ShowGroundCursor EffectParameter is null.");
    _showGroundCursorParam.SetValue(value);
   }
  }

  private EffectParameter _terrainAlphaParam;
  public Texture2D TerrainAlpha
  {
   get
   {
    if (_terrainAlphaParam == null)
     throw new Exception("Cannot get value of TerrainAlpha; TerrainAlpha EffectParameter is null.");
    return _terrainAlphaParam.GetValueTexture2D();
   }
   set
   {
    if (_terrainAlphaParam == null)
     throw new Exception("Cannot set value of TerrainAlpha; TerrainAlpha EffectParameter is null.");
    _terrainAlphaParam.SetValue(value);
   }
  }

  private EffectParameter _terrainBreakupParam;
  public Texture2D TerrainBreakup
  {
   get
   {
    if (_terrainBreakupParam == null)
     throw new Exception("Cannot get value of TerrainBreakup; TerrainBreakup EffectParameter is null.");
    return _terrainBreakupParam.GetValueTexture2D();
   }
   set
   {
    if (_terrainBreakupParam == null)
     throw new Exception("Cannot set value of TerrainBreakup; TerrainBreakup EffectParameter is null.");
    _terrainBreakupParam.SetValue(value);
   }
  }

  private EffectParameter _terrainOneParam;
  public Texture2D TerrainOne
  {
   get
   {
    if (_terrainOneParam == null)
     throw new Exception("Cannot get value of TerrainOne; TerrainOne EffectParameter is null.");
    return _terrainOneParam.GetValueTexture2D();
   }
   set
   {
    if (_terrainOneParam == null)
     throw new Exception("Cannot set value of TerrainOne; TerrainOne EffectParameter is null.");
    _terrainOneParam.SetValue(value);
   }
  }

  private EffectParameter _terrainTwoParam;
  public Texture2D TerrainTwo
  {
   get
   {
    if (_terrainTwoParam == null)
     throw new Exception("Cannot get value of TerrainTwo; TerrainTwo EffectParameter is null.");
    return _terrainTwoParam.GetValueTexture2D();
   }
   set
   {
    if (_terrainTwoParam == null)
     throw new Exception("Cannot set value of TerrainTwo; TerrainTwo EffectParameter is null.");
    _terrainTwoParam.SetValue(value);
   }
  }

  private EffectParameter _terrainThreeParam;
  public Texture2D TerrainThree
  {
   get
   {
    if (_terrainThreeParam == null)
     throw new Exception("Cannot get value of TerrainThree; TerrainThree EffectParameter is null.");
    return _terrainThreeParam.GetValueTexture2D();
   }
   set
   {
    if (_terrainThreeParam == null)
     throw new Exception("Cannot set value of TerrainThree; TerrainThree EffectParameter is null.");
    _terrainThreeParam.SetValue(value);
   }
  }

  private EffectParameter _terrainFourParam;
  public Texture2D TerrainFour
  {
   get
   {
    if (_terrainFourParam == null)
     throw new Exception("Cannot get value of TerrainFour; TerrainFour EffectParameter is null.");
    return _terrainFourParam.GetValueTexture2D();
   }
   set
   {
    if (_terrainFourParam == null)
     throw new Exception("Cannot set value of TerrainFour; TerrainFour EffectParameter is null.");
    _terrainFourParam.SetValue(value);
   }
  }

  private EffectParameter _groundCursorParam;
  public Texture2D GroundCursor
  {
   get
   {
    if (_groundCursorParam == null)
     throw new Exception("Cannot get value of GroundCursor; GroundCursor EffectParameter is null.");
    return _groundCursorParam.GetValueTexture2D();
   }
   set
   {
    if (_groundCursorParam == null)
     throw new Exception("Cannot set value of GroundCursor; GroundCursor EffectParameter is null.");
    _groundCursorParam.SetValue(value);
   }
  }

  private EffectParameter _constantsParam;
  public Vector4 Constants
  {
   get
   {
    if (_constantsParam == null)
     throw new Exception("Cannot get value of Constants; Constants EffectParameter is null.");
    return _constantsParam.GetValueVector4();
   }
   set
   {
    if (_constantsParam == null)
     throw new Exception("Cannot set value of Constants; Constants EffectParameter is null.");
    _constantsParam.SetValue(value);
   }
  }

  #endregion

  #region Effect Techniques

  private EffectTechnique _SkyTechnique;

  private EffectTechnique _SkyWireframeTechnique;

  private EffectTechnique _TerrainTechnique;

  private EffectTechnique _TerrainWithCursorTechnique;

  private EffectTechnique _TerrainWireframeTechnique;

  #endregion

  #region Initialize Methods

  ///<summary>
  ///Initializes the Effect from byte code for the given GraphicsDevice.
   ///</summary>
  ///<param name="graphicsDevice">The GraphicsDevice for which the effect is being created.</param>
  public void Initialize(GraphicsDevice graphicsDevice)
  {
   Initialize(graphicsDevice, CompilerOptions.None, null);
  }

  ///<summary>
  ///Initializes the Effect from byte code for the given GraphicsDevice and CompilerOptions.
   ///</summary>
  ///<param name="graphicsDevice">The GraphicsDevice for which the effect is being created.</param>
  ///<param name="compilerOptions">The CompilerOptions to use when creating the effect.</param>
  public void Initialize(GraphicsDevice graphicsDevice, CompilerOptions compilerOptions)
  {
   Initialize(graphicsDevice, compilerOptions, null);
  }

  ///<summary>
  ///Initializes the Effect from byte code for the given GraphicsDevice, CompilerOptions, and EffectPool.
   ///</summary>
  ///<param name="graphicsDevice">The GraphicsDevice for which the effect is being created.</param>
  ///<param name="compilerOptions">The CompilerOptions to use when creating the effect.</param>
  ///<param name="effectPools">The EffectPool to use with the effect.</param>
  public void Initialize(GraphicsDevice graphicsDevice, CompilerOptions compilerOptions, EffectPool effectPool)
  {
   _baseEffect = new Effect(graphicsDevice, byteCode, compilerOptions, effectPool);
   _readyToRender = true;

   _worldViewProjectParam = _baseEffect.Parameters["worldViewProject"];
   _worldViewParam = _baseEffect.Parameters["worldView"];
   _sunDirectionParam = _baseEffect.Parameters["sunDirection"];
   _betaRPlusBetaMParam = _baseEffect.Parameters["betaRPlusBetaM"];
   _hGgParam = _baseEffect.Parameters["hGg"];
   _betaDashRParam = _baseEffect.Parameters["betaDashR"];
   _betaDashMParam = _baseEffect.Parameters["betaDashM"];
   _oneOverBetaRPlusBetaMParam = _baseEffect.Parameters["oneOverBetaRPlusBetaM"];
   _multipliersParam = _baseEffect.Parameters["multipliers"];
   _sunColorAndIntensityParam = _baseEffect.Parameters["sunColorAndIntensity"];
   _groundCursorPositionParam = _baseEffect.Parameters["groundCursorPosition"];
   _showGroundCursorParam = _baseEffect.Parameters["showGroundCursor"];
   _terrainAlphaParam = _baseEffect.Parameters["terrainAlpha"];
   _terrainBreakupParam = _baseEffect.Parameters["terrainBreakup"];
   _terrainOneParam = _baseEffect.Parameters["terrainOne"];
   _terrainTwoParam = _baseEffect.Parameters["terrainTwo"];
   _terrainThreeParam = _baseEffect.Parameters["terrainThree"];
   _terrainFourParam = _baseEffect.Parameters["terrainFour"];
   _groundCursorParam = _baseEffect.Parameters["groundCursor"];
   _constantsParam = _baseEffect.Parameters["constants"];

   _SkyTechnique = _baseEffect.Techniques["Sky"];
   _SkyWireframeTechnique = _baseEffect.Techniques["SkyWireframe"];
   _TerrainTechnique = _baseEffect.Techniques["Terrain"];
   _TerrainWithCursorTechnique = _baseEffect.Techniques["TerrainWithCursor"];
   _TerrainWireframeTechnique = _baseEffect.Techniques["TerrainWireframe"];
  }

  #endregion

  ///<summary>
  ///Sets the current technique for the effect.
  ///</summary>
  ///<param name="technique">The technique to use for the current technique.</param>
  public void SetCurrentTechnique(hoffmanshaderEffect.Techniques technique)
  {
   switch (technique)
   {
    case hoffmanshaderEffect.Techniques.Sky:
     _baseEffect.CurrentTechnique = _SkyTechnique;
     break;

    case hoffmanshaderEffect.Techniques.SkyWireframe:
     _baseEffect.CurrentTechnique = _SkyWireframeTechnique;
     break;

    case hoffmanshaderEffect.Techniques.Terrain:
     _baseEffect.CurrentTechnique = _TerrainTechnique;
     break;

    case hoffmanshaderEffect.Techniques.TerrainWithCursor:
     _baseEffect.CurrentTechnique = _TerrainWithCursorTechnique;
     break;

    case hoffmanshaderEffect.Techniques.TerrainWireframe:
     _baseEffect.CurrentTechnique = _TerrainWireframeTechnique;
     break;

   }
  }
//omitted compiled byte code, takes up too much space!
 }
}
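
To give an idea of how the generated proxy is meant to be consumed, here is a minimal usage sketch.  This is not code from the engine download: the method name, argument names, and the assumption that the sky dome's vertex and index buffers are already bound to the device are all placeholders of mine, and the remaining scattering parameters are assumed to have been set once via SetMaterialProperties() as shown earlier in this article.

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;
using RoeEngine2.Shaders;

public static class SkyDrawExample
{
    // Draws the sky dome with the Hoffman shader proxy. The caller is assumed to have
    // already bound the sky dome's vertex declaration, vertex buffer, and index buffer.
    public static void DrawSky(GraphicsDevice device, hoffmanshaderEffect hoffman,
        Matrix world, Matrix view, Matrix projection, Vector3 sunDirection,
        int vertexCount, int primitiveCount)
    {
        hoffman.SetCurrentTechnique(hoffmanshaderEffect.Techniques.Sky);

        // Per-frame parameters; the per-material scattering constants are set elsewhere.
        hoffman.WorldViewProject = world * view * projection;
        hoffman.WorldView = world * view;
        hoffman.SunDirection = sunDirection;

        // Standard XNA effect rendering loop over the passes of the current technique.
        hoffman.BaseEffect.Begin();
        foreach (EffectPass pass in hoffman.BaseEffect.CurrentTechnique.Passes)
        {
            pass.Begin();
            device.DrawIndexedPrimitives(PrimitiveType.TriangleList,
                0, 0, vertexCount, 0, primitiveCount);
            pass.End();
        }
        hoffman.BaseEffect.End();
    }
}

The same pattern works for the terrain: switch to Techniques.Terrain (or TerrainWithCursor), set the terrain textures through the proxy's properties, and draw the terrain geometry inside the pass loop.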

Conclusion

In this article I demonstrated the Hoffman light scattering algorithm.  In the released code, pressing the up/down arrow keys adjusts the time of day so you can watch the sun go through a 12-hour cycle.

Please leave comments or suggestions; I would very much enjoy hearing ideas on how to improve the shader presented here.

March 11, 2008 Posted by | C#, XBOX360, XNA | 11 Comments