
A detailed tutorial on baking normal and AO maps in 3ds Max
This tutorial covers baking normal and AO maps in 3ds Max. It is aimed mainly at people who are new to game production and explains the baking workflow in Max so they can pick up the skill quickly.
1. Prepare a low-poly mesh and a high-poly mesh, then select the low-poly mesh and line it up with the high-poly mesh.
2. With the low-poly mesh selected, add a Projection modifier.
3. After you Pick the high-poly mesh, it appears in the Reference Geometry list. Make absolutely sure you picked the right object.
4. Click Cage to expand its rollout, then drag the small spinner arrow next to Amount upward until the cage (the wireframe envelope) completely encloses the blue high-poly mesh.
5. Press the 0 key to open Render To Texture, check the options shown in the red box of the screenshot, then click Add and choose NormalsMap to bake a normal map, or LightingMap to bake AO.
6. Set the remaining options as described by the red notes in the screenshot, click Render, wait for the render to finish, and you will find the baked map at the output location you chose.
Notes:
1. Don't mix up the high-poly and low-poly meshes.
2. Remember to set the output image size yourself.
3D Shader: Projective Texturing (Direct3D 9 / DXUT)
I'm sure most of you have played CS or CF. Those games have a spray feature: press the T key and a preset decal gets sprayed onto the wall or the floor.
That is exactly what this shader implements. Since I didn't have the 潜伏者 (attacker-side) decal, I pulled an alpha channel out of this image with Photoshop; I'll describe the exact steps further down.
So come along with me into the dazzling world of 3D.
As usual, a screenshot first:
How about that? Fun, right? Spray these all over the place and the defenders will probably go crazy.
OK, down to business. Here is how it works:
Projective texturing builds a view matrix at the light (the projector) looking at the origin of the world coordinate system; where an object sits inside the light's frustum is still governed by the object's model (world) matrix. The light's projection matrix then takes those positions into normalized device coordinates. Since NDC lie in [-1, 1] while texture coordinates lie in [0, 1], we still have to remap them into [0, 1].
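In Direct3D terms the whole chain collapses into a single texture matrix per object. Here is a minimal C++ sketch of that idea; it mirrors the buildTextureMatrix / buildLightViewMatrix functions in the full source further down, and the light position and the projector's FOV, aspect and near/far values are just placeholder numbers, not anything mandated by the technique:

// Scale-bias matrix: remaps x, y (and z) from NDC [-1,1] into texture space [0,1].
D3DXMATRIXA16 scaleBias(
    0.5f, 0.0f, 0.0f, 0.0f,
    0.0f, 0.5f, 0.0f, 0.0f,
    0.0f, 0.0f, 0.5f, 0.0f,
    0.5f, 0.5f, 0.5f, 1.0f );
// View matrix of the "projector": look from the light toward the world origin.
// (The inverted up vector compensates for texture v growing downward, as in the sample.)
D3DXVECTOR3 lightPos( 0.0f, 2.0f, -6.0f ), lookAt( 0.0f, 0.0f, 0.0f ), up( 0.0f, -1.0f, 0.0f );
D3DXMATRIXA16 lightView, lightProj, world, textureMatrix;
D3DXMatrixLookAtLH( &lightView, &lightPos, &lookAt, &up );
// Projection matrix of the projector's frustum (placeholder FOV/aspect/near/far).
D3DXMatrixPerspectiveFovLH( &lightProj, D3DX_PI / 4, 1.0f, 0.25f, 20.0f );
// Per object: object space -> world -> light view -> light projection -> [0,1] texture space.
D3DXMatrixIdentity( &world );
textureMatrix = world * lightView * lightProj * scaleBias;
// textureMatrix is what the vertex shader multiplies each position by to get texCoordProj.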
The matrix transform in projective texturing
In the rendering pipeline, geometry outside the frustum is clipped. However, when we generate projective texture coordinates by projecting the geometry from the point of view of the light projector, no clipping is done — we simply project vertices. Consequently, geometry outside the projector's frustum receives projective texture coordinates outside the [0, 1] range.
There are two ways to handle points that fall outside the frustum. The first is to give the light a cone-shaped coverage, so points outside the cone simply receive no projection. The second is to test the texture coordinates and skip the mapping whenever they fall outside [0, 1].
Here we take the second approach:
float2 uv = texCoordProj.xy / texCoordProj.w;
if( uv.x < 0.0f || uv.x > 1.0f || uv.y < 0.0f || uv.y > 1.0f )
    return diffuseLighting;
We will also notice that the texture gets projected both in front of and behind the projector:
The second issue is that back-projection artifacts can appear when the q coordinate is negative.
Back-projection refers to the texture being projected on surfaces that are behind the light source (or projector).
There are several ways to avoid artifacts when q is negative:
1. Use culling to draw only geometry that is in front of the light source.
2. Use clip planes to remove geometry that is behind the light source.
3. Fold the back-projection factor into a 3D attenuation texture.
4. Use a fragment program to check when q is negative.
The first two solutions are [...] the fragment program solution is simple and efficient but requires an advanced fragment profile. Just check the value of the q coordinate. If q is negative, you can ignore the projective texture computation and output black.
Here we go with the fourth approach:
if( texCoordProj.w < 0 )
    return diffuseLighting;
Now a word about the Photoshop side. To cut the character out of the original image and remove the influence of the background, create a new alpha channel, trace the character in that channel with the polygonal lasso tool, and save the result as a 32-bit TGA so the alpha channel is stored with the file.
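On the D3D9 side, the plain D3DXCreateTextureFromFile call used in the source below will normally pick up that alpha channel on its own. If it does not on your setup, the Ex variant lets you force a 32-bit format explicitly; a small sketch, assuming the same cf.tga file name and the existing pd3dDevice from the sample:

// Force an A8R8G8B8 texture so the TGA's alpha channel survives the load.
IDirect3DTexture9* pTex = NULL;
HRESULT hr = D3DXCreateTextureFromFileEx(
    pd3dDevice, L"cf.tga",
    D3DX_DEFAULT, D3DX_DEFAULT,   // take width/height from the file
    D3DX_DEFAULT,                 // build a full mip chain
    0, D3DFMT_A8R8G8B8, D3DPOOL_MANAGED,
    D3DX_DEFAULT, D3DX_DEFAULT,   // default filtering for load and mip generation
    0,                            // no color key
    NULL, NULL, &pTex );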
Then, in the fragment shader, we blend it over the lighting:
return textureColor * textureColor.w + (1.0 - textureColor.w) * diffuseLighting;
Finally, here is a Blizzard logo for good measure.
Doesn't it feel a bit like running a film projector? Heh.
OK, here is the source code:
/*------------------------------------------------------------
3D_Shader_ProjectivetTexturing.cpp -- achieve projective texturing
(c) Seamanj.
------------------------------------------------------------*/
#include "DXUT.h"
#include "resource.h"
// phase1 : add camera
// phase2 : add wall
// phase3 : add light and shader
// phase4 : add animation
// phase5 : add objects in space
// phase6 : add projective texture
#define phase1 1
#define phase2 1
#define phase3 1
#define phase4 1
#define phase5 1
#define phase6 1
#if phase1
#include "DXUTcamera.h"
CModelViewerCamera g_Camera;
#endif
#if phase2
// Vertex Buffer
LPDIRECT3DVERTEXBUFFER9 g_pVB = NULL;
// Index Buffer
LPDIRECT3DINDEXBUFFER9 g_pIB = NULL;
#endif
#if phase3
#include "SDKmisc.h"            // needed to locate the effect file on disk
ID3DXEffect* g_pEffect = NULL;  // D3DX effect interface
D3DXHANDLE g_hTech = 0;
D3DXHANDLE g_hWorldViewProj = NULL;
D3DXHANDLE g_hWorld = NULL;
D3DXHANDLE g_hWorldInv = NULL;
D3DXHANDLE g_hLightPosition = NULL;
ID3DXMesh* g_pLightSphereMesh = 0;
static float g_fLightAngle = -0.4f;   /* Angle light rotates around scene. */
static float g_fLightHeight = 2.0f;   /* Vertical height of light. */
#endif
#if phase4
static bool g_bAnimation = true;  // toggled with the space bar
short g_sDirection = 1;
#endif
#if phase5
ID3DXMesh* g_pSphereMesh = 0;
ID3DXMesh* g_pTeapotMesh = 0;
#endif
#if phase6
IDirect3DTexture9* g_pTex = NULL;
D3DXHANDLE g_hTex = NULL;
D3DXHANDLE g_hTextureMatrix = NULL;
#endif
//--------------------------------------------------------------------------------------
// Rejects any D3D9 devices that aren't acceptable to the app by returning false
//--------------------------------------------------------------------------------------
bool CALLBACK IsD3D9DeviceAcceptable( D3DCAPS9* pCaps, D3DFORMAT AdapterFormat, D3DFORMAT BackBufferFormat,
                                      bool bWindowed, void* pUserContext )
{
    // Typically want to skip back buffer formats that don't support alpha blending
    IDirect3D9* pD3D = DXUTGetD3D9Object();
    if( FAILED( pD3D->CheckDeviceFormat( pCaps->AdapterOrdinal, pCaps->DeviceType,
                                         AdapterFormat, D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
                                         D3DRTYPE_TEXTURE, BackBufferFormat ) ) )
        return false;
    return true;
}
//--------------------------------------------------------------------------------------
// Before a device is created, modify the device settings as needed
//--------------------------------------------------------------------------------------
bool CALLBACK ModifyDeviceSettings( DXUTDeviceSettings* pDeviceSettings, void* pUserContext )
{
#if phase1
    // Present immediately (no vsync) so the light animation speed is not tied to the refresh rate
    pDeviceSettings->d3d9.pp.PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE;
#endif
    return true;
}
//--------------------------------------------------------------------------------------
// Create any D3D9 resources that will live through a device reset (D3DPOOL_MANAGED)
// and aren't tied to the back buffer size
//--------------------------------------------------------------------------------------
HRESULT CALLBACK OnD3D9CreateDevice( IDirect3DDevice9* pd3dDevice, const D3DSURFACE_DESC* pBackBufferSurfaceDesc,
                                     void* pUserContext )
{
    HRESULT hr;
#if phase1
    // Setup the camera's view parameters
    D3DXVECTOR3 vecEye( 0.0f, 0.0f, -5.0f );
    D3DXVECTOR3 vecAt ( 0.0f, 0.0f, -0.0f );
    g_Camera.SetViewParams( &vecEye, &vecAt );
    FLOAT fObjectRadius = 1;
    // The camera's three zoom parameters
    g_Camera.SetRadius( fObjectRadius * 3.0f, fObjectRadius * 0.5f, fObjectRadius * 20.0f );
    g_Camera.SetEnablePositionMovement( true );
#endif
#if phase3
    // Read the D3DX effect file
    WCHAR str[MAX_PATH];
    V_RETURN( DXUTFindDXSDKMediaFileCch( str, MAX_PATH, L"ProjectiveTexturing.fx" ) );
    // Create the effect
    LPD3DXBUFFER pErrorBuff = NULL;
    V_RETURN( D3DXCreateEffectFromFile(
        pd3dDevice,        // associated device
        str,               // effect filename
        NULL,              // no preprocessor definitions
        NULL,              // no ID3DXInclude interface
        D3DXSHADER_DEBUG,  // compile flags
        NULL,              // don't share parameters
        &g_pEffect,        // return effect
        &pErrorBuff ) );   // return error messages (inspect on failure)
    // Get handles
    g_hTech = g_pEffect->GetTechniqueByName( "MyTechnique" );
    g_hWorldViewProj = g_pEffect->GetParameterByName( 0, "g_mWorldViewProj" );
    g_hLightPosition = g_pEffect->GetParameterByName( 0, "g_lightPosition" );
    g_hWorld = g_pEffect->GetParameterByName( 0, "g_mWorld" );
    g_hWorldInv = g_pEffect->GetParameterByName( 0, "g_mWorldInv" );
    D3DXCreateSphere( pd3dDevice, 0.15f, 30, 30, &g_pLightSphereMesh, 0 );
#endif
#if phase5
    D3DXCreateSphere( pd3dDevice, 2.0f, 40, 40, &g_pSphereMesh, 0 );
    D3DXCreateTeapot( pd3dDevice, &g_pTeapotMesh, 0 );
#endif
#if phase6
    g_hTextureMatrix = g_pEffect->GetParameterByName( 0, "g_mTextureMatrix" );
    g_hTex = g_pEffect->GetParameterByName( 0, "g_tex" );
    D3DXCreateTextureFromFile( pd3dDevice, L"cf.tga", &g_pTex );
#endif
    return S_OK;
}
#if phase2
struct MyVertexFormat
{
    FLOAT x, y, z;     // position
    FLOAT nx, ny, nz;  // normal
};
#define FVF_VERTEX (D3DFVF_XYZ | D3DFVF_NORMAL)

static HRESULT initVertexIndexBuffer( IDirect3DDevice9* pd3dDevice )
{
    // 24 vertices: the inside of a big box (floor, ceiling and four walls),
    // four vertices per face with the normal pointing into the room.
    static const MyVertexFormat Vertices[] =
    {
        // floor (y = -2, normal +Y)
        {  12, -2, -12,  0,  1,  0 },
        { -12, -2, -12,  0,  1,  0 },
        { -12, -2,  12,  0,  1,  0 },
        {  12, -2,  12,  0,  1,  0 },
        // back wall (z = -12, normal +Z)
        { -12, -2, -12,  0,  0,  1 },
        {  12, -2, -12,  0,  0,  1 },
        {  12, 10, -12,  0,  0,  1 },
        { -12, 10, -12,  0,  0,  1 },
        // front wall (z = 12, normal -Z)
        {  12, -2,  12,  0,  0, -1 },
        { -12, -2,  12,  0,  0, -1 },
        { -12, 10,  12,  0,  0, -1 },
        {  12, 10,  12,  0,  0, -1 },
        // ceiling (y = 10, normal -Y)
        { -12, 10, -12,  0, -1,  0 },
        {  12, 10, -12,  0, -1,  0 },
        {  12, 10,  12,  0, -1,  0 },
        { -12, 10,  12,  0, -1,  0 },
        // left wall (x = -12, normal +X)
        { -12, -2,  12,  1,  0,  0 },
        { -12, -2, -12,  1,  0,  0 },
        { -12, 10, -12,  1,  0,  0 },
        { -12, 10,  12,  1,  0,  0 },
        // right wall (x = 12, normal -X)
        {  12, -2, -12, -1,  0,  0 },
        {  12, -2,  12, -1,  0,  0 },
        {  12, 10,  12, -1,  0,  0 },
        {  12, 10, -12, -1,  0,  0 }
    };
    if( FAILED( pd3dDevice->CreateVertexBuffer( sizeof(Vertices),
                                                0, FVF_VERTEX,
                                                D3DPOOL_DEFAULT,
                                                &g_pVB, NULL ) ) ) {
        return E_FAIL;
    }
    void* pVertices = NULL;
    if( FAILED( g_pVB->Lock( 0, 0, /* map entire buffer */
                             &pVertices, 0 ) ) ) {
        return E_FAIL;
    }
    memcpy( pVertices, Vertices, sizeof(Vertices) );
    g_pVB->Unlock();

    // Create and initialize index buffer.
    // NOTE: the index data was lost from the original listing; it is restored here as the
    // standard two-triangles-per-quad pattern over the 6 faces (24 vertices, 12 triangles).
    static const WORD Indices[] =
    {
         0,  1,  2,   0,  2,  3,
         4,  5,  6,   4,  6,  7,
         8,  9, 10,   8, 10, 11,
        12, 13, 14,  12, 14, 15,
        16, 17, 18,  16, 18, 19,
        20, 21, 22,  20, 22, 23
    };
    if( FAILED( pd3dDevice->CreateIndexBuffer( sizeof(Indices),
                                               D3DUSAGE_WRITEONLY,
                                               D3DFMT_INDEX16,
                                               D3DPOOL_DEFAULT,
                                               &g_pIB, NULL ) ) ) {
        return E_FAIL;
    }
    void* pIndices = NULL;
    if( FAILED( g_pIB->Lock( 0, 0, /* map entire buffer */
                             &pIndices, 0 ) ) ) {
        return E_FAIL;
    }
    memcpy( pIndices, Indices, sizeof(Indices) );
    g_pIB->Unlock();
    return S_OK;
}
#endif
//--------------------------------------------------------------------------------------
// Create any D3D9 resources that won't live through a device reset (D3DPOOL_DEFAULT)
// or that are tied to the back buffer size
//--------------------------------------------------------------------------------------
HRESULT CALLBACK OnD3D9ResetDevice( IDirect3DDevice9* pd3dDevice, const D3DSURFACE_DESC* pBackBufferSurfaceDesc,
                                    void* pUserContext )
{
    HRESULT hr;
#if phase3
    if( g_pEffect )
        V_RETURN( g_pEffect->OnResetDevice() );
#endif
#if phase2
    pd3dDevice->SetRenderState( D3DRS_CULLMODE, D3DCULL_NONE );
#endif
#if phase1
    // Setup the camera's projection parameters
    float fAspectRatio = pBackBufferSurfaceDesc->Width / ( FLOAT )pBackBufferSurfaceDesc->Height;
    g_Camera.SetProjParams( D3DX_PI / 2, fAspectRatio, 0.1f, 5000.0f );
    g_Camera.SetWindow( pBackBufferSurfaceDesc->Width, pBackBufferSurfaceDesc->Height );
    g_Camera.SetButtonMasks( MOUSE_LEFT_BUTTON, MOUSE_WHEEL, MOUSE_RIGHT_BUTTON );
#endif
#if phase2
    return initVertexIndexBuffer( pd3dDevice );
#else
    return S_OK;
#endif
}
#if phase4
static const double my2Pi = 2.0 * 3.14159265358979323846;
#endif
//--------------------------------------------------------------------------------------
// Handle updates to the scene.  This is called regardless of which D3D API is used
//--------------------------------------------------------------------------------------
void CALLBACK OnFrameMove( double fTime, float fElapsedTime, void* pUserContext )
{
#if phase1
    g_Camera.FrameMove( fElapsedTime );
#endif
#if phase4
    if( g_bAnimation )
    {
        // Bounce the light up and down between y = -1.5 and y = 9 while it circles the scene
        if( g_fLightHeight > 9.0f )
            g_sDirection = -1;
        else if( g_fLightHeight < -1.5f )
            g_sDirection = 1;
        g_fLightHeight += 0.0002 * g_sDirection;
        g_fLightAngle += 0.0002;
        if( g_fLightAngle > my2Pi )
            g_fLightAngle -= my2Pi;
    }
#endif
}
#if phase6
// modelMatrix is needed because we must know where the object sits inside the light's frustum
static void buildTextureMatrix( D3DXMATRIXA16& textureMatrix,
                                const D3DXMATRIXA16& modelMatrix,
                                const D3DXMATRIXA16& viewMatrix )
{
    D3DXMATRIXA16 projMatrix;
    // Remaps x, y (and z) from NDC [-1,1] into texture space [0,1].
    // The last (bias) row was lost from the original listing and is restored
    // with the standard values (0.5, 0.5, 0.5, 1).
    D3DXMATRIXA16 RangeMappingMatrix(
        0.5f, 0.0f, 0.0f, 0.0f,
        0.0f, 0.5f, 0.0f, 0.0f,
        0.0f, 0.0f, 0.5f, 0.0f,
        0.5f, 0.5f, 0.5f, 1.0f );
    // The projector's own perspective projection (50-degree FOV)
    D3DXMatrixPerspectiveFovLH( &projMatrix, 50.0f * 3.14f / 180.0f, 1.72f, 0.25f, 20.0f );
    textureMatrix = modelMatrix * viewMatrix * projMatrix * RangeMappingMatrix;
}

static void buildLightViewMatrix( D3DXMATRIXA16& lightViewMatrix, const D3DXVECTOR3* pLightPosition )
{
    // Look from the light position at the origin of the world coordinate system
    D3DXMatrixLookAtLH( &lightViewMatrix, pLightPosition,
                        &D3DXVECTOR3( 0, 0, 0 ), &D3DXVECTOR3( 0, -1, 0 ) );
}
#endif
//--------------------------------------------------------------------------------------
// Render the scene using the D3D9 device
//--------------------------------------------------------------------------------------
void CALLBACK OnD3D9FrameRender( IDirect3DDevice9* pd3dDevice, double fTime, float fElapsedTime, void* pUserContext )
{
    HRESULT hr;
#if phase3
    const float lightPosition[4] = { 6 * sin( g_fLightAngle ),
                                     g_fLightHeight,
                                     6 * cos( g_fLightAngle ), 1 };
#endif
    // Clear the render target and the zbuffer
    V( pd3dDevice->Clear( 0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER, D3DCOLOR_ARGB( 0, 45, 50, 170 ), 1.0f, 0 ) );

    // Render the scene
    if( SUCCEEDED( pd3dDevice->BeginScene() ) )
    {
#if phase3
        UINT iPass, cPasses;
        D3DXMATRIXA16 mWorldViewProjection, mWorld, mWorldInv;
#if phase6
        D3DXMATRIXA16 mLightViewMatrix, mTextureMatrix;
#endif
        V( g_pEffect->SetTechnique( g_hTech ) );
        V( g_pEffect->Begin( &cPasses, 0 ) );
        for( iPass = 0; iPass < cPasses; iPass++ )
        {
            V( g_pEffect->BeginPass( iPass ) );
            // set light position
            V( g_pEffect->SetFloatArray( g_hLightPosition, lightPosition, 4 ) );
#if phase6
            // set texture
            V( g_pEffect->SetTexture( g_hTex, g_pTex ) );
#endif
#if phase2  // build wall
            // Set world matrix
            D3DXMatrixIdentity( &mWorld );
            V( g_pEffect->SetMatrix( g_hWorld, &mWorld ) );
            // Set world inverse matrix
            D3DXMatrixInverse( &mWorldInv, NULL, &mWorld );
            V( g_pEffect->SetMatrix( g_hWorldInv, &mWorldInv ) );
            // set worldviewproj matrix
            mWorldViewProjection = mWorld * *g_Camera.GetViewMatrix() * *g_Camera.GetProjMatrix();
            V( g_pEffect->SetMatrix( g_hWorldViewProj, &mWorldViewProjection ) );
#if phase6
            // set light view matrix
            buildLightViewMatrix( mLightViewMatrix, &D3DXVECTOR3( lightPosition[0], lightPosition[1], lightPosition[2] ) );
            // set texture matrix
            buildTextureMatrix( mTextureMatrix, mWorld, mLightViewMatrix );
            V( g_pEffect->SetMatrix( g_hTextureMatrix, &mTextureMatrix ) );
#endif
            pd3dDevice->SetRenderState( D3DRS_CULLMODE, D3DCULL_CCW );
            V( g_pEffect->CommitChanges() );
            pd3dDevice->SetStreamSource( 0, g_pVB, 0, sizeof( MyVertexFormat ) );
            pd3dDevice->SetIndices( g_pIB );   // set the current index buffer
            pd3dDevice->SetFVF( FVF_VERTEX );  // set the current vertex format
            pd3dDevice->DrawIndexedPrimitive( D3DPT_TRIANGLELIST, 0, 0, 24, 0, 12 );
#endif
#if phase5
            // --- sphere ---
            // Set world matrix
            D3DXMatrixTranslation( &mWorld, 2, 0, 0 );
            V( g_pEffect->SetMatrix( g_hWorld, &mWorld ) );
            // Set world inverse matrix
            D3DXMatrixInverse( &mWorldInv, NULL, &mWorld );
            V( g_pEffect->SetMatrix( g_hWorldInv, &mWorldInv ) );
            // set worldviewproj matrix
            mWorldViewProjection = mWorld * *g_Camera.GetViewMatrix() * *g_Camera.GetProjMatrix();
            V( g_pEffect->SetMatrix( g_hWorldViewProj, &mWorldViewProjection ) );
#if phase6
            // set light view matrix
            buildLightViewMatrix( mLightViewMatrix, &D3DXVECTOR3( lightPosition[0], lightPosition[1], lightPosition[2] ) );
            // set texture matrix
            buildTextureMatrix( mTextureMatrix, mWorld, mLightViewMatrix );
            V( g_pEffect->SetMatrix( g_hTextureMatrix, &mTextureMatrix ) );
#endif
            V( g_pEffect->CommitChanges() );
            g_pSphereMesh->DrawSubset( 0 );

            // --- teapot ---
            // Set world matrix
            D3DXMatrixTranslation( &mWorld, -2, 1, 0 );
            V( g_pEffect->SetMatrix( g_hWorld, &mWorld ) );
            // Set world inverse matrix
            D3DXMatrixInverse( &mWorldInv, NULL, &mWorld );
            V( g_pEffect->SetMatrix( g_hWorldInv, &mWorldInv ) );
            // set worldviewproj matrix
            mWorldViewProjection = mWorld * *g_Camera.GetViewMatrix() * *g_Camera.GetProjMatrix();
            V( g_pEffect->SetMatrix( g_hWorldViewProj, &mWorldViewProjection ) );
#if phase6
            // set light view matrix
            buildLightViewMatrix( mLightViewMatrix, &D3DXVECTOR3( lightPosition[0], lightPosition[1], lightPosition[2] ) );
            // set texture matrix
            buildTextureMatrix( mTextureMatrix, mWorld, mLightViewMatrix );
            V( g_pEffect->SetMatrix( g_hTextureMatrix, &mTextureMatrix ) );
#endif
            V( g_pEffect->CommitChanges() );
            g_pTeapotMesh->DrawSubset( 0 );
#endif
            V( g_pEffect->EndPass() );
        }
        V( g_pEffect->End() );
#endif
#if phase3
        // Draw the light itself as a small sphere with the fixed-function pipeline
        pd3dDevice->SetRenderState( D3DRS_LIGHTING, false );
        // Set world matrix
        D3DXMATRIX M;
        D3DXMatrixIdentity( &M );  // M = identity matrix
        D3DXMatrixTranslation( &M, lightPosition[0], lightPosition[1], lightPosition[2] );
        pd3dDevice->SetTransform( D3DTS_WORLD, &M );
        // Geometry transformed this way is static in world space (like a wall): move the camera with W
        // and it moves relative to the camera, unlike something pinned in front of the camera.
        // Set view matrix
        D3DXMATRIX view = *g_Camera.GetViewMatrix();
        pd3dDevice->SetTransform( D3DTS_VIEW, &view );
        // Set projection matrix
        D3DXMATRIX proj = *g_Camera.GetProjMatrix();
        pd3dDevice->SetTransform( D3DTS_PROJECTION, &proj );
        g_pLightSphereMesh->DrawSubset( 0 );
#endif
        V( pd3dDevice->EndScene() );
    }
}
//--------------------------------------------------------------------------------------
// Handle messages to the application
//--------------------------------------------------------------------------------------
LRESULT CALLBACK MsgProc( HWND hWnd, UINT uMsg, WPARAM wParam, LPARAM lParam,
                          bool* pbNoFurtherProcessing, void* pUserContext )
{
#if phase1
    g_Camera.HandleMessages( hWnd, uMsg, wParam, lParam );
#endif
    return 0;
}
//--------------------------------------------------------------------------------------
// Release D3D9 resources created in the OnD3D9ResetDevice callback
//--------------------------------------------------------------------------------------
void CALLBACK OnD3D9LostDevice( void* pUserContext )
{
#if phase2
    SAFE_RELEASE( g_pVB );
    SAFE_RELEASE( g_pIB );
#endif
#if phase3
    if( g_pEffect )
        g_pEffect->OnLostDevice();
#endif
}
//--------------------------------------------------------------------------------------
// Release D3D9 resources created in the OnD3D9CreateDevice callback
//--------------------------------------------------------------------------------------
void CALLBACK OnD3D9DestroyDevice( void* pUserContext )
{
#if phase3
    SAFE_RELEASE( g_pEffect );
    SAFE_RELEASE( g_pLightSphereMesh );
#endif
#if phase5
    SAFE_RELEASE( g_pSphereMesh );
    SAFE_RELEASE( g_pTeapotMesh );
#endif
#if phase6
    SAFE_RELEASE( g_pTex );
#endif
}
#if phase4
void CALLBACK OnKeyboardProc( UINT character, bool is_key_down, bool is_alt_down, void* user_context )
{
    if( is_key_down )
    {
        switch( character )
        {
        case VK_SPACE:
            g_bAnimation = !g_bAnimation;  // space toggles the light animation
            break;
        }
    }
}
#endif
//--------------------------------------------------------------------------------------
// Initialize everything and go into a render loop
//--------------------------------------------------------------------------------------
INT WINAPI wWinMain( HINSTANCE, HINSTANCE, LPWSTR, int )
{
    // Enable run-time memory check for debug builds.
#if defined(DEBUG) | defined(_DEBUG)
    _CrtSetDbgFlag( _CRTDBG_ALLOC_MEM_DF | _CRTDBG_LEAK_CHECK_DF );
#endif

    // Set the callback functions
    DXUTSetCallbackD3D9DeviceAcceptable( IsD3D9DeviceAcceptable );
    DXUTSetCallbackD3D9DeviceCreated( OnD3D9CreateDevice );
    DXUTSetCallbackD3D9DeviceReset( OnD3D9ResetDevice );
    DXUTSetCallbackD3D9FrameRender( OnD3D9FrameRender );
    DXUTSetCallbackD3D9DeviceLost( OnD3D9LostDevice );
    DXUTSetCallbackD3D9DeviceDestroyed( OnD3D9DestroyDevice );
    DXUTSetCallbackDeviceChanging( ModifyDeviceSettings );
    DXUTSetCallbackMsgProc( MsgProc );
    DXUTSetCallbackFrameMove( OnFrameMove );
#if phase4
    DXUTSetCallbackKeyboard( OnKeyboardProc );
#endif
    // TODO: Perform any application-level initialization here

    // Initialize DXUT and create the desired Win32 window and Direct3D device for the application
    DXUTInit( true, true );                     // Parse the command line and show msgboxes
    DXUTSetHotkeyHandling( true, true, true );  // handle the default hotkeys
    DXUTSetCursorSettings( true, true );        // Show the cursor and clip it when in full screen
    DXUTCreateWindow( L"3D_Shader_ProjectivetTexturing" );
    DXUTCreateDevice( true, 640, 480 );

    // Start the render loop
    DXUTMainLoop();

    // TODO: Perform any application-level cleanup here
    return DXUTGetExitCode();
}
/*--------------------------------------------------------------------------
ProjectiveTexturing.fx -- Projective texturing shader
(c) Seamanj.
--------------------------------------------------------------------------*/
//--------------------------------------------------------------------------------------
// Global variables
//--------------------------------------------------------------------------------------
float4x4 g_mWorldViewProj;
float4x4 g_mWorld;
float4x4 g_mWorldInv;
float4x4 g_mTextureMatrix;
float3 g_lightPosition;
texture g_tex;
//-----------------------------------------------------------------------------
// Sampler
//-----------------------------------------------------------------------------
sampler2D g_sam =
sampler_state
{
    Texture = <g_tex>;
    MinFilter = Linear;
    MagFilter = Linear;
    MipFilter = Linear;
};
//--------------------------------------------------------------------------------------
// Vertex shader output structure
//--------------------------------------------------------------------------------------
struct VS_Output
{
    float4 position : POSITION;
    float4 diffuseLighting : COLOR;
    float4 texCoordProj : TEXCOORD0;
};
//--------------------------------------------------------------------------------------
// Vertex shader
//--------------------------------------------------------------------------------------
VS_Output MyVertexEntry( float4 position : POSITION, float3 normal : NORMAL )
{
    VS_Output OUT;
    OUT.position = mul( position, g_mWorldViewProj );
    OUT.texCoordProj = mul( position, g_mTextureMatrix );
    // Compute diffuse lighting
    normal = mul( normal, transpose( (float3x3)g_mWorldInv ) );
    float3 N = normalize( normal );
    position = mul( position, g_mWorld );  // lighting is computed in world space
    float3 L = normalize( g_lightPosition - position.xyz );
    OUT.diffuseLighting = float4( 1, 1, 1, 1 ) * max( dot( N, L ), 0 );
    return OUT;
}
//--------------------------------------------------------------------------------------
// Pixel shader
//--------------------------------------------------------------------------------------
float4 MyPixelEntry( float4 diffuseLighting : COLOR, float4 texCoordProj : TEXCOORD0 ) : COLOR
{
    float2 uv = texCoordProj.xy / texCoordProj.w;
    // Skip the projection when the texture coordinates fall outside [0,1] or the point is behind the projector
    if( uv.x < 0.0f || uv.x > 1.0f || uv.y < 0.0f || uv.y > 1.0f || texCoordProj.w < 0 )
        return diffuseLighting;
    uv.x = 1 - uv.x;  // the light's view matrix is left-handed, so flip u when converting to texture coordinates
    float4 textureColor = tex2D( g_sam, uv );
    // Blend the decal over the diffuse lighting using the decal's alpha
    return textureColor * textureColor.w + (1.0 - textureColor.w) * diffuseLighting;
}
//--------------------------------------------------------------------------------------
// Renders scene to render target
//--------------------------------------------------------------------------------------
technique MyTechnique
{
    pass P0
    {
        VertexShader = compile vs_2_0 MyVertexEntry();
        PixelShader = compile ps_2_0 MyPixelEntry();
    }
}