

New Viewport + RENDER TO TEXTURE

Please use this forum for questions on C++, Lua, XML, and any other languages used in Crysis modding.


 

danilochavez
Intern Dev
 
Posts: 130
Member since: 17.11.2010, 10:35
Location: Guatemala
Likes: 0

Postby danilochavez » 14.12.2010, 06:57

That's great, Blue. Can you share your source code with me? I need to draw a new viewport into a texture. C++ is not my main language, but I am learning - can you post the entire source code of that great thing? I'm trying to do a realistic zoom effect. I tried it with the water shader, but it is very poor, so the solution is to draw a viewport into a texture, like in the James Ryan video or in your code. To make a realistic zoom for weapons I need a viewport rendered into a texture at the pixel resolution that is needed; the water shader keeps the same resolution, which looks bad. So a new viewport with zoom stages is needed. Can you share your source code with us? Thanks.
E=mc^2
This is more general E^2=(pc)^2+(mc^2)^2
and Me=h*fcompton/c^2 (Me is electron's mass)
Blue
Junior Dev
 
Posts: 328
Member since: 18.02.2009, 10:55
Likes: 22

Postby Blue » 14.12.2010, 17:27

the water shader keeps the same resolution, which looks bad


Huh, what? So you mean that with this shader you can make some zoom effect, or that you can draw a second viewport with it? I'm not that familiar with shaders - perhaps I'll give HLSL a try in the future. It would be perfect if one of our shader specialists could make a tutorial for this. ^^

Regarding the entire source code:

The code was only an experiment to get additional viewports working, so I've changed it a lot. The last step was to create a virtual material for objects. So I asked here in the forum if anyone knows how to assign a texture to a material purely in code, but sadly, nobody has answered. ;( As I said before, I fiddled with the whole thing for two weeks, so the code is a mess, and that's the reason I haven't posted it here.

If you already have problems making a second viewport with the code snippets I've posted before, I don't think you'll be happy with my code, because it is like ground zero of the programming world. :happy:

Perhaps I can clean it up and make a runnable version - perhaps even as a plugin for the FPS - but I won't promise it at the moment, because I'm working on a small fun project. What I can promise is that I'll take a look at it; perhaps it isn't much effort to clean it up - then I'll do it in the coming days.

The main problem, after all, is that the texture for the material is an external file on your hard drive. You must use a predefined material with this texture, so it isn't very flexible. Now you know why I gave up on the whole thing. If I had known how to make all of this virtual in code - even though I'm not enthusiastic about that solution - I would have made a FlowGraph plugin.


Greetings, and sorry for my bad English,

Blue
jveer
Just getting started
 
Posts: 13
Member since: 25.10.2010, 22:18
Likes: 0

Postby jveer » 14.12.2010, 22:34

Does anyone know how to read the current BACKBUFFER?

ReadFrameBufferFast() gives you the FRONTBUFFER.

With ReadFrameBuffer() you should be able to get the current Backbuffer, but:

1) I always get an error when I try to read it out (maybe I can fix this)
2) This function is very slow (too slow)
Blue
Junior Dev
 
Posts: 328
Member since: 18.02.2009, 10:55
Likes: 22

Postby Blue » 14.12.2010, 23:00

Yep, the ReadFrameBuffer function is much too slow because it processes the data; that's the reason I used ReadFrameBufferFast. ;) I haven't had any problems with it, so why do you want to read the backbuffer? Is there something wrong with the frontbuffer? (It won't be displayed - so there isn't much of a problem - except that it could be faster to read the backbuffer because it doesn't need to flip the buffers... ^^)

What kind of error do you get? (And how do you know it is too slow - even if that's true - if you get an error when you read the framebuffer? ^^)

To read the framebuffer with ReadFrameBuffer, you must pass it a correctly sized unsigned char array (matching your image size; for example, a 128x128 image needs array[(128*128)*4] - ARGB pixel format).
You must pass it the height and width (128 in my example).
You must pass it the backbuffer enum flag.
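
From memory it would look roughly like this - a sketch only, so treat the exact ReadFrameBuffer signature and the name of the backbuffer enum as assumptions and check IRenderer.h in your SDK:

// Sketch from memory: the exact ReadFrameBuffer parameter order and the
// backbuffer enum name are assumptions - verify them against IRenderer.h.
void GrabFrame(int nWidth, int nHeight)            // e.g. 128 x 128
{
    // 4 bytes per pixel (ARGB), so (width * height) * 4 chars.
    unsigned char* pPixels = new unsigned char[nWidth * nHeight * 4];

    // gEnv->pRenderer is the usual way to reach IRenderer from game code.
    gEnv->pRenderer->ReadFrameBuffer(pPixels, nWidth, nHeight, eRB_BackBuffer);

    // ... copy pPixels into your texture here ...

    delete[] pPixels;
}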

Sorry, I don't have the SDK open at the moment, so I'm writing this from memory. :unsure:


Blue

PS: Please send us some screenshots of your work when you're finished. :happy:
EDIT: Oh, my English... :whistle:
EDIT2: Yeah, never mind - I guess no answer means it works now... :rolleyes:
Last edited by Blue on 15.12.2010, 01:38, edited 2 times in total.
jveer
Just getting started
 
Posts: 13
Member since: 25.10.2010, 22:18
Likes: 0

Postby jveer » 15.12.2010, 18:38

Yeah, we finally found out how it works (more or less ;) ) properly (still some smaller problems).

We didn't really do it that way, but used render targets (-> CreateRenderTarget() and SetRenderTarget()) to render each viewport.

The size, however, doesn't need to be a power of 2 at all (but it throws a warning in the console, which you can disable).
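
Very roughly the flow looks like this - only a sketch from memory, so the exact IRenderer signatures and the texture format value are assumptions; verify them against IRenderer.h in your SDK version:

// Sketch only: CreateRenderTarget/SetRenderTarget signatures and the
// ETEX_Format value are assumptions - check IRenderer.h before using this.
void RenderExtraViewport()
{
    IRenderer* pRenderer = gEnv->pRenderer;

    // Create an off-screen render target. The size does not need to be a
    // power of two; it only prints a warning in the console.
    int nTargetId = pRenderer->CreateRenderTarget(300, 200, eTF_A8R8G8B8);

    // Redirect rendering into the target, draw the extra viewport, restore.
    pRenderer->SetRenderTarget(nTargetId);
    // ... render the second viewport here ...
    pRenderer->SetRenderTarget(0);   // 0 = back to the normal backbuffer (assumption)
}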

Thanks for your help, especially Blue ;)
Last edited by jveer on 15.12.2010, 18:38, edited 1 time in total.
Blue
Junior Dev
 
Posts: 328
Member since: 18.02.2009, 10:55
Likes: 22

Postby Blue » 15.12.2010, 23:49

We didn't really do it that way, but used render targets (-> CreateRenderTarget() and SetRenderTarget()) to render each viewport.


?( Really? I never found out what they can be used for - when I tried them (if I remember right) my screen was always frozen. Perhaps I should take another look at those functions. Are they faster than reading the framebuffer? Or are they for something else? As I said, I never found out what they are good for. ^^


Thanks for your help, especially Blue


:xmas: It was a pleasure :xmas:
z_kaiser
Just getting started
 
Posts: 52
Member since: 02.04.2009, 12:17
Likes: 0

Postby z_kaiser » 16.12.2010, 11:01

Hi all,

I have been following this thread and I can render into a diffuse texture. With ReadFrameBuffer the frame drop is unacceptable, but with ReadFrameBufferFast the drop is small and the level at least remains playable. The problem is that when this data (the buffer with the information) is applied to the diffuse texture, it appears too bright, and I still don't know why - maybe the shader... In your video it seems you don't have this problem. How do you apply the data read from ReadFrameBufferFast? Do you use a special shader?

P.S.: As Blue says, my screen gets frozen too with the CreateRenderTarget and SetRenderTarget functions.
Last edited by z_kaiser on 16.12.2010, 11:48, edited 2 times in total.
Blue
Junior Dev
 
Posts: 328
Member since: 18.02.2009, 10:55
Likes: 22

Postby Blue » 16.12.2010, 11:53

Hm, hi z_kaiser. I don't know what I do differently - perhaps you can tell me first what

render into a diffuse texture


means to you. As I said, I also dumped it out to a texture, but it may be that I understand something different by those words (my English isn't so good, sorry).


Blue
z_kaiser
Just getting started
 
Posts: 52
Member since: 02.04.2009, 12:17
Likes: 0

Postby z_kaiser » 16.12.2010, 12:02

Hi,

To be more specific: I suppose you know the r_ShowRenderTarget CVar. What I get is the same as with r_ShowRenderTarget 2, and what I want is the result of r_ShowRenderTarget 3 :) (more realistic)
Blue
Junior Dev
 
Posts: 328
Member since: 18.02.2009, 10:55
Likes: 22

Postby Blue » 16.12.2010, 12:30

Originally posted by z_kaiser
Hi,

To be more specific: I suppose you know the r_ShowRenderTarget CVar. What I get is the same as with r_ShowRenderTarget 2, and what I want is the result of r_ShowRenderTarget 3 :) (more realistic)


Nope, sorry, you got me wrong. I've already posted the important parts of my code here in the forum, so I think we do the same thing there. So perhaps there is a difference between what I mean by "render to a texture" and what you mean by "render into a diffuse texture".

That's why I asked what you mean - or better, what you do to "render into a diffuse texture".

Ahm, and I have now looked at the code - played around a little with those render targets and came to the same result I remembered: when I set them, my display freezes - even if I destroy them right after setting.

It's kind of funny - you create one and get a texture ID. If you take a pointer to it and save it as a JPG, it has the right size, but it is totally black. Hm, perhaps I'll take another look at it today. ^^ I think there must be a good reason jveer used them - perhaps they really are faster at getting the picture from the buffer.


Blue
z_kaiser
Just getting started
 
Posts: 52
Member since: 02.04.2009, 12:17
Likes: 0

Postby z_kaiser » 16.12.2010, 12:35

Ahm,

Basically I get the texture ID of the material and then update it with the data from ReadFrameBufferFast, so I suppose it's the same as yours.
Blue
Junior Dev
 
Posts: 328
Member since: 18.02.2009, 10:55
Likes: 22

Postby Blue » 16.12.2010, 12:59

OK, got it. So there is a small difference between us: I recreate the dds dummy files in the material first, before I update them (a leftover from the first trial-and-error steps - it isn't necessary).

So, hm, you read the buffer with ReadFrameBufferFast? I assume you flip the lines... ah!

re: "it appears too bright"


I think I know now what's wrong - you don't correct the data! If you read the buffer (if I'm not wrong (could be!)), you get the data in the format A B G R. If you don't dump it directly to the dds files and instead go via ForceRelease and reload, you must prepare this data before you can load it directly into the video buffer. As you know, the video buffer expects A R G B. So after flipping the lines, also swap the 8 bits of red and blue (^^).

For each unsigned int that holds screen data (a 32-bit register): swap the unsigned char RED in bits 0-7 with the unsigned char BLUE in bits 16-23. :cheesy:
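
In code, the fix-up looks roughly like this - plain C++ without any SDK calls; the only assumption is the pixel layout described above (adjust the byte positions if your dump differs):

#include <algorithm>

// Fix up a ReadFrameBufferFast dump before pushing it to the video buffer:
// 1) flip the lines, 2) swap the red and blue channels (A B G R -> A R G B).
// Assumes 4 bytes per pixel, red in bits 0-7 and blue in bits 16-23 of each
// 32-bit pixel, as described above - adjust if your data differs.
void FixupFrameDump(unsigned char* pData, int nWidth, int nHeight)
{
    const int nPitch = nWidth * 4;                       // bytes per line

    // 1) flip the lines
    for (int y = 0; y < nHeight / 2; ++y)
    {
        unsigned char* pTop    = pData + y * nPitch;
        unsigned char* pBottom = pData + (nHeight - 1 - y) * nPitch;
        std::swap_ranges(pTop, pTop + nPitch, pBottom);
    }

    // 2) swap the 8 bits of red and blue in every 32-bit pixel
    unsigned int* pPixels = reinterpret_cast<unsigned int*>(pData);
    for (int i = 0; i < nWidth * nHeight; ++i)
    {
        const unsigned int c    = pPixels[i];
        const unsigned int red  = c & 0xFF;              // bits 0-7
        const unsigned int blue = (c >> 16) & 0xFF;      // bits 16-23
        pPixels[i] = (c & 0xFF00FF00u) | (red << 16) | blue;
    }
}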


greetings, Blue
z_kaiser
Just getting started
 
Posts: 52
Member since: 02.04.2009, 12:17
Likes: 0

Postby z_kaiser » 16.12.2010, 13:29

To prepare the data for video memory I ran some tests, and I could check that the four bytes correspond to A R G B, so the order is correct. Have you taken a look at the r_ShowRenderTarget CVar? There you could see my problem.
danilochavez
Intern Dev
 
Posts: 130
Member since: 17.11.2010, 10:35
Location: Guatemala
Likes: 0

Postby danilochavez » 16.12.2010, 14:45

Congratulations, jveer... but it would be good for us if you could show the exact fragment of code, or the entire code, because we want to render viewports too... we can learn from you...
And Blue, concerning the shaders I spoke about: I tweaked the liquid shader in the shader script and extended the IndexOfRefraction slider to allow negative values in its range. That material shader expands the image according to the equations, like bulges, imitating real lenses. The problem is that the pixel resolution stays the same, because the shader operates on the existing pixels, so expanding parts of the image makes the quality bad. To do a real lens effect I need to render a new viewport into a material, to get proper resolution like the magnification stages in the scopes - like projecting a camera near the object at optimal resolution. The shader tweak gives interesting effects, like a house of mirrors with deformations, but it is not good for the effect I want. So I need working code for creating a new viewport, because I am not so good with C++ yet; I am still learning the language...
Copy the following code into a file liquid.cfx, then go to Materials, in the first option (Shader) pick Liquid, then go to the shader parameters next to the bitmaps section and adjust IndexOfRefraction, either statically or dynamically via Flow Graph or similar... and see what happens...

/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////


#include "Common.cfi"
#include "ShadeLib.cfi"
#include "ModificatorVT.cfi"
#include "ModificatorTC.cfi"


// Shader global descriptions
float Script : STANDARDSGLOBAL
<
string Script =
"Public;"
"NoPreview;"
"ForceZpass;"
//"ForceDrawLast;"
"ForceWaterPass;"
//"Decal;"
"ShaderDrawType = General;"
"ShaderType = FX;"
>;

/// Un-Tweakables //////////////////////
float4x4 mComposite : PI_Composite; // View*Projection
float4 ScrSize : PB_ScreenSize;
float4 CameraFront : PB_CameraFront;

ENVIRONMENTMAP
ENVIRONMENTCUBEMAP

/*
sampler2D envMapSamplerRefr
<
string Script =
"RenderOrder=PreDraw;"
"RenderCamera=Current;"
"RenderType=CurScene;"
"RenderTarget_IDPool = _RT2D_SCREEN_ID;"
"RenderTarget_Width=$ScreenSize;"
"RenderTarget_Height=$ScreenSize;"
"RenderTarget_UpdateType=Allways;"
"RenderDepthStencilTarget=DepthBuffer;";
>
{
Texture = $RT_2D;
MinFilter = LINEAR;
MagFilter = LINEAR;
MipFilter = NONE;
AddressU = Clamp;
AddressV = Clamp;
};
*/

sampler2D envMapSamplerRefr
{
Texture = $SceneTarget;
MinFilter = POINT;
MagFilter = POINT;
MipFilter = POINT;
AddressU = Clamp;
AddressV = Clamp;
};

sampler2D fringeMapSampler = sampler_state
{
Texture = textures/defaults/fringe_map.dds;
MinFilter = LINEAR;
MagFilter = LINEAR;
MipFilter = LINEAR;
AddressU = Wrap;
AddressV = Clamp;
};

sampler2D screenNoiseSampler = sampler_state
{
Texture = textures/defaults/perlinNoiseNormal.dds;
MinFilter = LINEAR;
MagFilter = LINEAR;
MipFilter = POINT;
AddressU = Wrap;
AddressV = Wrap;
};

sampler2D sceneDepthSampler = sampler_state
{
Texture = $ZTarget;
MinFilter = POINT;
MagFilter = POINT;
MipFilter = POINT;
AddressU = Clamp;
AddressV = Clamp;
};

// Tweakables /////////////////

float IndexOfRefraction
<
psregister = PS_REG_PM_3.x;
string UIName = "Index of refraction";

string UIWidget = "slider";
float UIMin = -50.0;
float UIMax = 50.0;
float UIStep = 0.01;
> = 0.85;

float FogDensity
<
psregister = PS_REG_PM_3.y;
string UIName = "Fog density";

string UIWidget = "slider";
float UIMin = 0.0;
float UIMax = 10.0;
float UIStep = 0.01;
> = 3.0;

float HDRDynamic
<
psregister = PS_REG_PM_3.z;
string UIName = "HDRDynamic";

string UIWidget = "slider";
float UIMin = 0.0;
float UIMax = 32.0;
float UIStep = 0.01;
> = 3.0;

float FlowTilling
<
vsregister = VS_REG_PM_4.y;
string UIName = "Flow tilling";

string UIWidget = "slider";
float UIMin = 0.0;
float UIMax = 10.0;
float UIStep = 0.01;
> = 1.0;

float FlowSpeed
<
vsregister = VS_REG_PM_4.z;
string UIName = "Flow speed";

string UIWidget = "slider";
float UIMin = 0.0;
float UIMax = 10.0;
float UIStep = 0.01;
> = 1.0;

float4 LiquidColor
<
psregister = PS_REG_PM_4;
string UIName = "Liquid color";
string UIWidget = "color";
> = {0.45, 0.62, 0.54, 1.0};

////////////////////////////////////////////////////////
// GENERAL PASS
////////////////////////////////////////////////////////

///////////////// vertex input/output //////////////////
struct vtxOUT
{
OUT_P

float4 baseTC : TEXCOORDN;
float4 baseTC2 : TEXCOORDN;

float4 vTangent : TEXCOORDN;
float4 vBinormal : TEXCOORDN;
float4 vNormal : TEXCOORDN;
float4 vView : TEXCOORDN;
float4 vPos : TEXCOORDN;
float4 screenPos : TEXCOORDN;

};

///////////////// vertex shader //////////////////
vtxOUT LiquidVS(app2vertGeneral IN)
{
vtxOUT OUT = (vtxOUT)0;

streamPos vertPassPos = (streamPos)0;
streamPos_FromGeneral(IN, vertPassPos);

OUT.baseTC = IN.baseTC.xyyx * FlowTilling;
OUT.baseTC2 = OUT.baseTC;

float fAnimSpeed = g_VS_AnimGenParams.y * 0.1 * FlowSpeed;

// pre-multiply/compute constants per-vertex (cheaper...) . Store in wz format, to save 1 alu in pixel shader
OUT.baseTC.xy = OUT.baseTC.xy * 0.4 + float2(0, fAnimSpeed);
OUT.baseTC.wz = OUT.baseTC.wz * 0.42 - float2(0, fAnimSpeed);

OUT.baseTC2.xy = OUT.baseTC.yx * 0.2 + float2(0, fAnimSpeed);
OUT.baseTC2.wz = OUT.baseTC.zw * 0.22 - float2(0, fAnimSpeed);

OUT.HPosition = Pos_VS_General(g_VS_ViewProjZeroMatr, vertPassPos);

float3 worldTangentS = normalize( mul((const float3x3)vertPassPos.InstMatrix, vertPassPos.ObjToTangentSpace[0]) );
float3 worldTangentT = normalize( mul((const float3x3)vertPassPos.InstMatrix, vertPassPos.ObjToTangentSpace[1]) );
float3 worldTangentN = normalize(cross(worldTangentS, worldTangentT)) * IN.Tangent.w;

OUT.vTangent = float4(worldTangentS, IN.Tangent.w);
OUT.vBinormal.xyz = worldTangentT;
OUT.vNormal.xyz = worldTangentN;

OUT.vView.xyz = vertPassPos.WorldPos.xyz;

vertPassPos.WorldPos.xyz += g_VS_WorldViewPos.xyz;
OUT.vPos = vertPassPos.WorldPos.xyzz;

// Output the screen-space texture coordinates
OUT.screenPos = HPosToScreenTC(OUT.HPosition);

#if %_RT_FOG
float4 fogColor = GetVolumetricFogColor(vertPassPos.WorldPos.xyz);
OUT.vView.w = fogColor.w;
#endif

return OUT;
}

///////////////// pixel shader //////////////////

pixout LiquidPS(vtxOUT IN)
{
pixout OUT;

// Debug output
#if %_RT_DEBUG0 || %_RT_DEBUG1 || %_RT_DEBUG2 || %_RT_DEBUG3
DebugOutput(OUT.Color, float4(IN.baseTC, 0, 1));
return OUT;
#endif

float3x3 mTangentToWS = float3x3(IN.vTangent.xyz, IN.vBinormal.xyz, IN.vNormal.xyz);

half3 eyeVec = normalize(-IN.vView.xyz); // 3 alu

// Make procedural bump
half4 bump01 = tex2D(screenNoiseSampler, IN.baseTC.xy);
half4 bump02 = tex2D(screenNoiseSampler, IN.baseTC.wz);
half4 bump03 = tex2D(screenNoiseSampler, IN.baseTC.xy);
half4 bump04 = tex2D(screenNoiseSampler, IN.baseTC.wz);
half3 vMergedBump = bump01.xyz + bump02.xyz + bump03.xyz + bump04.xyz; // 3 alu
vMergedBump = vMergedBump * 2.0 - 4.0; // 1 alu


// Put in world space
half3 vNormal = normalize(mul(vMergedBump, mTangentToWS).xyz); // 6 alu

// Compute refraction vector
float3 vRefr = refract(-eyeVec, vNormal, IndexOfRefraction);

// Project refraction into screen space
float4 vRefrPos = mul(mComposite, float4(IN.vPos.xyz + vRefr*0.5 , 1)); // 4 alu
float4 vRefrVS = vRefrPos;

float4 vRefractTC = vRefrVS * 0.5;
vRefrVS.xy = vRefractTC.ww + vRefractTC.xy * float2( 1 , -1 );
vRefrVS.xy += ScrSize.zw * vRefrVS.w;

// Use refraction vector as texture lookup
float sceneDepth = tex2Dproj( sceneDepthSampler, vRefrVS ).x * PS_NearFarClipDist.y;
float fRefractionMask = IN.screenPos.w < sceneDepth;

vRefrVS = fRefractionMask? vRefrVS : IN.screenPos;

sceneDepth = tex2Dproj( sceneDepthSampler, vRefrVS ).x * PS_NearFarClipDist.y;
half4 refrColor = tex2Dproj(envMapSamplerRefr, vRefrVS );

// Aproximate fog volume
float fFrontDepth = IN.screenPos.w;
half3 finalColor = lerp(refrColor, (1 + HDRDynamic) * LiquidColor,
saturate(1 - exp( - FogDensity * saturate(sceneDepth - fFrontDepth) ) ));

HDROutput(OUT, half4(finalColor.xyz, 1), 1);
return OUT;
}

//////////////////////////////// technique ////////////////

technique General
<
string Script =
"TechniqueZ=ZPass;"
>
{
pass p0
{
VertexShader = compile vs_Auto LiquidVS() GeneralVS;
PixelShader = compile ps_Auto LiquidPS() GeneralPS;

ZEnable = true;
ZWriteEnable = true;
CullMode = Back;
}
}
//////////////////////////////// Common techniques ////////////////

///////////////// vertex input/output //////////////////
struct vert2fragZ
{
float4 HPosition : POSITION;
float4 ZInfo : TEXCOORD0_centroid;
};

///////////////// vertex shaders //////////////////
vert2fragZ ZPassVS(app2vertZGeneral IN)
{
vert2fragZ OUT;
#ifndef OPENGL
OUT = (vert2fragZ)0;
#endif

streamPos vertPassPos = (streamPos)0;
streamPos_FromZ(IN, vertPassPos);

OUT.HPosition = Pos_VS_General(g_VS_ViewProjZeroMatr, vertPassPos);

#if %_RT_FSAA
OUT.ZInfo.xyz = mul(vertPassPos.InstMatrix, vertPassPos.Position).xyz + g_VS_WorldViewPos.xyz;
#endif

OUT.ZInfo.w = OUT.HPosition.w * g_VS_NearFarClipDist.w;

return OUT;
}

///////////////// pixel shaders //////////////////
pixout ZPassPS(vert2fragZ IN)
{
pixout OUT = (pixout)0;

float fZ = IN.ZInfo.w;

OUT.Color = EncodeSceneDepthNoAlpha(fZ, 1, IN.ZInfo.xyz);

return OUT;
}

//////////////////////////////// technique ////////////////

technique ZPass
{
pass p0
{
VertexShader = compile vs_Auto ZPassVS() ZVS;
PixelShader = compile ps_Auto ZPassPS() ZPS;

ZEnable = true;
ZWriteEnable = false;
CullMode = Front;
}
}


/////////////////////// eof ///
E=mc^2
This is more general E^2=(pc)^2+(mc^2)^2
and Me=h*fcompton/c^2 (Me is electron's mass)
Blue
Junior Dev
 
Posts: 328
Member since: 18.02.2009, 10:55
Likes: 22

Postby Blue » 16.12.2010, 14:51

Hm, first of all, I don't have a render target with the number 2 - it may be that the numbering is variable.

Second, if you use ReadFrameBufferFast and use the data exactly as you get it, it is normally flipped and the blue and red values are transposed. If that is not the case for you, please tell me your trick, because I have to rework the data before I can display it directly on the screen. :-(

Perhaps you can upload a picture so that I can see what your problem is?

EDIT: Wow, nice, man - too bad I don't know much about HLSL - a tutorial on something like this for Crysis would be very nice. I'll try the shader out; it sounds interesting. So - hm, I've looked at my solution again, after so many of you have asked about things in it.

So, one thing: I use a predefined dds file, which can be included in some material. Into this texture I dump a cam view and keep updating it, which produces the (fake) "cam video texture" effect.

Damn, I've forgotten what I wanted to write... Oo
Last edited by Blue on 16.12.2010, 15:02, edited 2 times in total.