Particle System Development on the DirectX 9 Platform. Part II

This post is the second and final part of the article on developing a particle system with DirectX 9. If you have not read the first part, I recommend that you start there.

This part of the article covers working with sprites, vertex and pixel shaders, effects, and post-effects. In particular, the render-to-texture technique will be used to implement the post-effect.


0. Basic information



Sprites

A sprite is a texture that moves around the screen and depicts an object or a part of one. Since the particles in our system are just points, applying various textures to them lets us visualize almost any object (clouds, for example). And because a sprite is simply a texture, you need a basic understanding of textures.

Instead of the pixels we are used to, a texture consists of texels. Direct3D addresses textures with a coordinate system formed by the horizontal U axis and the vertical V axis, where (0, 0) is the upper-left corner of the texture and (1, 1) is the lower-right corner.
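To make the convention concrete, here is a small sketch (purely illustrative, not part of the particle system) that converts normalized UV coordinates into texel indices; the UVToTexel helper is hypothetical:

// Illustration only: converting normalized (u, v) coordinates into texel indices.
// Direct3D performs the sampling itself; this hypothetical helper merely shows
// the convention (U runs left to right, V runs top to bottom, both in 0..1).
struct Texel { int x; int y; };

Texel UVToTexel(float u, float v, int width, int height)
{
	Texel t;
	t.x = (int)(u * (width - 1));
	t.y = (int)(v * (height - 1));
	return t;
}
// UVToTexel(0.0f, 0.0f, 256, 256) -> upper-left texel (0, 0)
// UVToTexel(1.0f, 1.0f, 256, 256) -> lower-right texel (255, 255)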


Vertex Shaders

A vertex shader is a program written in the special HLSL language (or in assembler) that handles vertex transformation and lighting. In a vertex shader we can take the position of a vertex and move it somewhere completely different. In this article, vertex shaders will also be used to generate texture coordinates.

Pixel Shaders

Pixel shaders are similar to vertex shaders, but they work on the rasterized image: a pixel shader receives texture data, color, and other inputs, and based on them it must return the color of the pixel. We will use them for texturing.

Effects and Post Effects

An effect combines pixel and/or vertex shaders and one or more render passes. Effects can be used to implement, for example, blur or glow.
Post-effects differ from ordinary effects in that they are applied to an already rasterized scene.

1. Texturing the particles


Before applying a texture to the particles, we need to change the type used to represent vertices in the buffer to the following:
struct VertexData
{
	float x,y,z;
	float u,v; // Store the texture coordinates
};

The values of u and v must be initialized to zero at creation.
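For reference, filling the buffer with the extended vertices could look like the sketch below. It assumes the vertex buffer pVertexObject (created with the flags shown next), a particle array named particles with x, y, z fields, and its size count from the first part of the article:

// Sketch: writing the extended vertices into the buffer.
// 'particles', 'count' and 'pVertexObject' are assumed to exist (see part I).
VertexData* vertices = NULL;
pVertexObject->Lock(0, count * sizeof(VertexData), (void**)&vertices, 0);
for (int i = 0; i < count; ++i)
{
	vertices[i].x = particles[i].x;
	vertices[i].y = particles[i].y;
	vertices[i].z = particles[i].z;
	vertices[i].u = 0.0f; // texture coordinates start at zero;
	vertices[i].v = 0.0f; // point sprites get per-point UVs generated by D3D
}
pVertexObject->Unlock();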

It is also necessary to change the flags used when creating the buffer, as well as the vertex declaration:
device->CreateVertexBuffer(count*sizeof(VertexData), D3DUSAGE_WRITEONLY,
		D3DFVF_XYZ | D3DFVF_TEX1, D3DPOOL_DEFAULT, &pVertexObject, NULL);
// ...

D3DVERTEXELEMENT9 decl[] =
	{
		{ 0, 0, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0 },
		{ 0, 12, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0 },
		D3DDECL_END()
	};

We add the D3DFVF_TEX1 flag, indicating that each vertex stores one set of texture coordinates. We also add a corresponding line to the vertex declaration.
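The listing above does not show how the declaration is applied; one possible way of using it when drawing (a sketch only, the sample project may do this differently) is:

// Sketch: create a declaration object from decl[] and use it for drawing.
// 'pVertexObject' and 'count' are the buffer and particle count from above.
IDirect3DVertexDeclaration9* vertexDecl = NULL;
device->CreateVertexDeclaration(decl, &vertexDecl);

device->SetVertexDeclaration(vertexDecl);
device->SetStreamSource(0, pVertexObject, 0, sizeof(VertexData));
device->DrawPrimitive(D3DPT_POINTLIST, 0, count);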

And now it remains to load the texture and change the render states:

float pointSize = 5; // Particle size in view-space units
	device->SetRenderState(D3DRS_POINTSIZE_MAX, *((DWORD*)&pointSize));
	device->SetRenderState(D3DRS_POINTSIZE, *((DWORD*)&pointSize));
	device->SetRenderState(D3DRS_LIGHTING,FALSE);
	device->SetRenderState(D3DRS_POINTSPRITEENABLE, TRUE ); // Enable drawing sprites in place of points
	device->SetTextureStageState(0, D3DTSS_ALPHAARG1, D3DTA_TEXTURE);
	device->SetTextureStageState(0, D3DTSS_ALPHAOP, D3DTOP_SELECTARG1);
	device->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
	device->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
	device->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
	device->SetRenderState(D3DRS_ZENABLE, FALSE);


I will not describe all of these states here; information about them can be found on MSDN. I will only note that some of them are needed for the effects later on.

IDirect3DTexture9 *particleTexture = NULL;
D3DXCreateTextureFromFile(device, L"particle.png", &particleTexture); // Create the texture
device->SetTexture(0, particleTexture); // Set the texture

We load the texture that will represent a particle from a file and set it on the device.

That's it. Now, when you start the application, you will see textured particles instead of plain points. But let's go further and add a simple effect to the resulting image.

Visualization Result:


2. Effects



For developing effects there is a great tool from NVIDIA called FX Composer. It supports shader debugging, Shader Model 4 shaders, Direct3D 9 and 10, and OpenGL. I highly recommend it, but this development environment will not be covered in this article.

To begin, consider the basic structure of effects:
float4x4 WorldViewProj; // Input parameter: a 4x4 matrix

// Input parameter: the texture
texture Base  <
	string UIName =  "Base Texture";
	string ResourceType = "2D";
>;

// Sampler used to fetch texels
sampler2D BaseTexture = sampler_state {
	Texture = <Base>;
	AddressU = Wrap;
	AddressV = Wrap;
};

// Structure describing the input parameters of the vertex shader
struct VS_INPUT 
{
	float4 Position : POSITION0;
	float2 Tex      : TEXCOORD0;

};

// Structure for the output parameters
struct VS_OUTPUT 
{
	float4 Position : POSITION0;
	float2 Tex      : TEXCOORD0;

};

// Vertex shader
VS_OUTPUT mainVS(VS_INPUT Input)
{
	VS_OUTPUT Output;

	Output.Position = mul( Input.Position, WorldViewProj );
	Output.Tex = Input.Tex;

	return( Output );
}

// Pixel shader
float4 mainPS(float2 tex: TEXCOORD0) : COLOR
{
	return tex2D(BaseTexture, tex);
}

// Description of the "technique"
technique technique0 
{		
	// Render pass description
	pass p0 
	{ 
		CullMode = None; // Set a render state
		// Compile
		VertexShader = compile vs_2_0 mainVS(); // the vertex shader
		PixelShader = compile ps_2_0 mainPS(); // the pixel shader
	}
}



As you can see from the code, each shader takes parameters and returns a value. The vertex shader must return the coordinates of the vertex, and the pixel shader the color of the processed pixel.
An effect is divided into techniques. Each technique can represent its own way of applying the effect, or even a different effect altogether.
Each technique contains one or more render passes.
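From the application side a technique can be selected by index or by name. Here is a small sketch, assuming the effect has already been created as an ID3DXEffect* named effect (how it is created is shown later in the article):

// Sketch: selecting a technique of an already created ID3DXEffect.
D3DXHANDLE tech = effect->GetTechniqueByName("technique0");
if (tech == NULL || FAILED(effect->ValidateTechnique(tech)))
{
	// Fall back to the first technique the hardware can actually run
	effect->FindNextValidTechnique(NULL, &tech);
}
effect->SetTechnique(tech);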

It is time to write our own simple effect which will, for example, tint the particles red:
float4x4 WorldViewProj; // Input parameter: a 4x4 matrix

// Input parameter: the texture (the sprite)
texture Base  <
	string UIName =  "Base Texture";
	string ResourceType = "2D";
>;

// Sampler used to fetch texels
sampler2D BaseTexture = sampler_state {
	Texture = <Base>;
	AddressU = Wrap;
	AddressV = Wrap;
};

// Structure describing the input parameters of the vertex shader
struct VS_INPUT 
{
	float4 Position : POSITION0;
	float2 Tex      : TEXCOORD0;

};

// Structure for the output parameters
struct VS_OUTPUT 
{
	float4 Position : POSITION0;
	float2 Tex      : TEXCOORD0;

};

// Vertex shader
VS_OUTPUT mainVS(VS_INPUT Input)
{
	VS_OUTPUT Output;

	Output.Position = mul( Input.Position, WorldViewProj ); // Transform the vertex position by the world-view-projection matrix
	Output.Tex = Input.Tex; // We do not modify the texture coordinates

	return( Output );
}

// Pixel shader
float4 mainPS(float2 tex: TEXCOORD0) : COLOR
{
	return tex2D(BaseTexture, tex) * float4(1.0, 0, 0, 1.0); // Multiply the texture color by red
}

// Description of the "technique"
technique technique0 
{		
	// Render pass description
	pass p0 
	{ 
		CullMode = None; // Set a render state
		// Compile
		VertexShader = compile vs_2_0 mainVS(); // the vertex shader
		PixelShader = compile ps_2_0 mainPS(); // the pixel shader
	}
}



The code of this effect differs little from the basic structure we examined earlier; we only added blending with red by multiplying the colors. Here's what we got:
Not bad, but we can switch to a different blending mode and blend not with a single color but with an entire texture.
To combine the rendered particles with a texture correctly, we need a technique called render target (render-to-texture). The essence of the technique is simple: we render our scene into a texture, and then apply the effect to the already rasterized image.
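On the application side the idea boils down to a few calls. This is only a minimal sketch, using the names that appear later in the article; the complete code is given in the "Using effects in the program" section:

// Sketch of the render-to-texture flow; the full version is shown later.
device->SetRenderTarget(0, renderTarget); // render into the texture's surface
DrawParticles();                          // the particles end up in renderTexture
device->SetRenderTarget(0, orig);         // switch back to the screen
DrawRect();                               // full-screen quad sampling renderTexture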

Here is the full effect code that implements this:
float4x4 WorldViewProj;

texture Base  <
	string UIName =  "Base Texture";
	string ResourceType = "2D";
>;

sampler2D BaseTexture = sampler_state {
	Texture = <Base>;
	AddressU = Wrap;
	AddressV = Wrap;
};

texture Overlay  <
	string UIName =  "Overlay Texture";
	string ResourceType = "2D";
>;

sampler2D OverlayTexture = sampler_state {
	Texture = <Overlay>;
	AddressU = Wrap;
	AddressV = Wrap;
};

// The texture that will be used as the render target
texture PreRender : RENDERCOLORTARGET
	<
	string Format = "X8R8G8B8" ;
	>;

// And a sampler for it
sampler2D PreRenderSampler = sampler_state {
	Texture = <PreRender>;
};

struct VS_INPUT 
{
	float4 Position : POSITION0;
	float2 Tex      : TEXCOORD0;

};

struct VS_OUTPUT 
{
	float4 Position : POSITION0;
	float2 Tex      : TEXCOORD0;

};

VS_OUTPUT cap_mainVS(VS_INPUT Input)
{
	VS_OUTPUT Output;

	Output.Position = mul( Input.Position, WorldViewProj );
	Output.Tex = Input.Tex;

	return( Output );
}

float4 cap_mainPS(float2 tex: TEXCOORD0) : COLOR
{
	return tex2D(BaseTexture, tex);
}

///////////////////////////////////////////////////////

struct Overlay_VS_INPUT 
{
	float4 Position : POSITION0;
	float2 Texture1 : TEXCOORD0;

};

struct Overlay_VS_OUTPUT 
{
	float4 Position : POSITION0;
	float2 Texture1 : TEXCOORD0;
	float2 Texture2 : TEXCOORD1;

};

vector blend(vector bottom, vector top)
{
	//Linear light
	float r = (top.r < 0.5)? (bottom.r + 2*top.r - 1) : (bottom.r + top.r);
	float g = (top.g < 0.5)? (bottom.g + 2*top.g - 1) : (bottom.g + top.g);
	float b = (top.b < 0.5)? (bottom.b + 2*top.b - 1) : (bottom.b + top.b);

	return  vector(r,g,b,bottom.a);
}

Overlay_VS_OUTPUT over_mainVS(Overlay_VS_INPUT Input)
{
	Overlay_VS_OUTPUT Output;

	Output.Position = mul( Input.Position, WorldViewProj );
	Output.Texture1 = Input.Texture1;
	Output.Texture2 = Output.Position.xy*float2(0.5,0.5) + float2(0.5,0.5); // convert the projected vertex position into texture coordinates

	return( Output );
}

float4 over_mainPS(float2 tex :TEXCOORD0, float2 pos :TEXCOORD1) : COLOR 
{
	return blend(tex2D(OverlayTexture, pos), tex2D(PreRenderSampler, tex));
}


technique technique0 
{		
	pass p0 
	{
		CullMode = None;
		VertexShader = compile vs_2_0 cap_mainVS();
		PixelShader = compile ps_2_0 cap_mainPS();
	}

	pass p1 
	{
		CullMode = None;
		VertexShader = compile vs_2_0 over_mainVS();
		PixelShader = compile ps_2_0 over_mainPS();
	}
}


As you noticed, another render pass has appeared. In the first pass we render the particles as they are, but into a texture instead of to the screen. In the second pass we blend another texture over that image using Linear Light blending.

Using effects in the program


We have created the effects; now it is time to change the application code to use them.
We need to create and compile the effect, load an additional texture, and also create the texture that we will render into.
ID3DXBuffer* errorBuffer = 0;
D3DXCreateEffectFromFile( // Create and compile the effect
	device, 
	L"effect.fx", 
	NULL,
	NULL,
	D3DXSHADER_USE_LEGACY_D3DX9_31_DLL, // Use the legacy DirectX 9 shader compiler
	NULL,
	&effect, 
	&errorBuffer );

if( errorBuffer ) // Show compilation errors, if any
{
	MessageBoxA(hMainWnd, (char*)errorBuffer->GetBufferPointer(), 0, 0);
	errorBuffer->Release();
	terminate();
}

// Build the matrix that will be passed in as WorldViewProj
// The vertex shader needs it
D3DXMATRIX W, V, P, Result; 
D3DXMatrixIdentity(&Result);
device->GetTransform(D3DTS_WORLD, &W);
device->GetTransform(D3DTS_VIEW, &V);
device->GetTransform(D3DTS_PROJECTION, &P);
D3DXMatrixMultiply(&Result, &W, &V);
D3DXMatrixMultiply(&Result, &Result, &P);

effect->SetMatrix(effect->GetParameterByName(0, "WorldViewProj"), &Result);

// Select the very first technique
effect->SetTechnique( effect->GetTechnique(0) );

IDirect3DTexture9 *renderTexture = NULL,
	*overlayTexture = NULL;

// These surfaces will be used to set the render target
IDirect3DSurface9 *orig = NULL,
	*renderTarget = NULL;

D3DXCreateTextureFromFile(device, L"overlay.png", &overlayTexture);

// Create the texture that we will render into
D3DXCreateTexture(device, Width, Height, 0, D3DUSAGE_RENDERTARGET, D3DFMT_X8R8G8B8, D3DPOOL_DEFAULT, &renderTexture);
// Keep its surface, used for rendering into the texture
renderTexture->GetSurfaceLevel(0, &renderTarget); 
// Keep the original render target surface
device->GetRenderTarget(0, &orig);

// Set the effect textures
auto hr = effect->SetTexture( effect->GetParameterByName(NULL, "Overlay"), overlayTexture);
hr |= effect->SetTexture( effect->GetParameterByName(NULL, "Base"), particleTexture);
hr |= effect->SetTexture( effect->GetParameterByName(NULL, "PreRender"), renderTexture);

if(hr != 0)
{
	MessageBox(hMainWnd, L"Unable to set effect textures.", L"", MB_ICONHAND);
}



As we can see, before use the effect has to be compiled, a technique has to be selected, and all the data the effect uses has to be set.
To render into a texture, we create the texture itself, with the same dimensions as the original scene, and obtain its surface; it is the surface that we will render to.

Now it only remains to render everything using the effect. It is done like this:

UINT passes = 0; // The number of render passes will be stored here
effect->Begin(&passes, 0);
for(UINT i=0; i<passes; ++i)
{
	effect->BeginPass(i);

	if(i == 0)
	{
		// Clear the back buffer
		device->Clear( 0, NULL, D3DCLEAR_TARGET, D3DCOLOR_XRGB(0,0,0), 1.0f, 0 );
		// Set the texture, or rather its surface, as the render target
		device->SetRenderTarget(0, renderTarget);
		// Clear the texture we are about to render into
		device->Clear(0, NULL, D3DCLEAR_TARGET, D3DCOLOR_XRGB(0,0,0), 1.0f, 0);
		// Draw the particles
		DrawParticles();
	}
	else if(i == 1)
	{
		// Restore the original render target
		device->SetRenderTarget(0, orig);
		// Draw a rectangle with the render texture (renderTexture) applied to it
		DrawRect();
	}

	effect->EndPass();
}
effect->End();

// Present the result on the screen
device->Present(NULL, NULL, NULL, NULL);



In the code we used DrawRect(); this function draws a rectangle with the renderTexture texture applied to it. This is a peculiarity of the technique: after rendering into the texture, we need to display it on the screen somehow for further processing. The rectangle helps us with this, and we draw it so that it covers the entire screen. I will not include the full vertex initialization and drawing code for the rectangle, so as not to inflate the article even further; all the necessary steps are similar to those we carried out when initializing the particles, and a rough sketch is given below. If you run into difficulties, you can see how this function is implemented in the example code.
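For completeness, here is a sketch of what such a function could look like. This is my assumption, not the project's exact code: it draws the quad from user memory with DrawPrimitiveUP and relies on the effect's second pass to transform and texture it; the global device is the one from part I.

// Sketch (assumption): a full-screen rectangle drawn with the same VertexData
// layout as the particles. The vertices must be chosen so that, after the
// effect's vertex shader runs, the rectangle covers the whole screen.
void DrawRect()
{
	VertexData quad[4] =
	{
		//   x      y     z     u     v
		{ -1.0f,  1.0f, 0.0f, 0.0f, 0.0f }, // upper-left
		{  1.0f,  1.0f, 0.0f, 1.0f, 0.0f }, // upper-right
		{ -1.0f, -1.0f, 0.0f, 0.0f, 1.0f }, // lower-left
		{  1.0f, -1.0f, 0.0f, 1.0f, 1.0f }  // lower-right
	};

	device->SetFVF(D3DFVF_XYZ | D3DFVF_TEX1);
	device->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, quad, sizeof(VertexData));
}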

Effects are used like this: first we call the Begin() method, which gives us the number of render passes in the effect. Then, before each pass, we call BeginPass(i), and after it EndPass(). Finally, when rendering is complete, we call the End() method.

Here's what we got:


This article ends here; thank you all for your attention. I will be glad to answer your questions in the comments.
The full source code of the project is available on GitHub. Note that to run the compiled example you need to install the Visual C++ Redistributable 2012.
UPD
For those who think that D3D9 is hopelessly outdated, or who simply want all the calculations to be done on the GPU, there is another example, this time on D3D10. As usual, the example and a compiled demo are available on GitHub. GPU calculations included :)