
ios - Writing texture data onto depth buffer

I'm trying to implement the technique described in Compositing Images with Depth.

The idea is to use an existing texture (loaded from an image) as a depth mask, to basically fake 3D.

The problem I face is that glDrawPixels is not available in OpenGL ES. Is there a way to accomplish the same thing on the iPhone?


1 Reply


The depth buffer is more hidden in OpenGL ES than you might expect: not only is glDrawPixels absent, but gl_FragDepth has also been removed from GLSL. So you can't write a custom fragment shader to spool values to the depth buffer as you might push colours.

The most obvious solution is to pack your depth information into a texture and to use a custom fragment shader that does a depth comparison between the fragment it generates and one looked up from a texture you supply. Only if the generated fragment is closer is it allowed to proceed. The normal depth buffer will catch other cases of occlusion and — in principle — you could use a framebuffer object to create the depth texture in the first place, giving you a complete on-GPU round trip, though it isn't directly relevant to your problem.

Disadvantages are that drawing will cost you an extra texture unit, and that textures store integer components, which limits depth precision.
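
If you do want that on-GPU round trip, a minimal sketch of creating the depth texture as the colour attachment of a framebuffer object might look like the following (OpenGL ES 2.0 C API; the function name and the minimal error handling are mine):

#include <OpenGLES/ES2/gl.h>

/* Sketch: create an RGBA texture and attach it to a framebuffer object so
   that the depth-writing shader below can render into it. The function
   name is hypothetical; width and height should match your main target. */
GLuint CreateDepthMapFramebuffer(GLsizei width, GLsizei height,
                                 GLuint *depthMapTexture)
{
    GLuint framebuffer;

    glGenTextures(1, depthMapTexture);
    glBindTexture(GL_TEXTURE_2D, *depthMapTexture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    glGenFramebuffers(1, &framebuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, *depthMapTexture, 0);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
        return 0;  /* incomplete; handle as appropriate for your app */

    return framebuffer;
}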

EDIT: to keep the example simple, suppose you were packing all of your depth information into the red channel of a texture. That'd give you a really low-precision depth buffer, but you could write a quick fragment shader like:

precision mediump float;

void main()
{
    // write a value to the depth map; gl_FragCoord.w is 1/w(clip),
    // so it is larger for closer fragments
    gl_FragColor = vec4(gl_FragCoord.w, 0.0, 0.0, 1.0);
}

That stores depth in the red channel. So you've partially recreated the old depth texture extension: you'll have an image that has a brighter red in pixels that are closer and a darker red in pixels that are further away. I think that in your case, you'd actually load this image from disk.
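
As a hedged sketch of that loading step, you might draw the image into a Core Graphics bitmap context and upload the bytes with glTexImage2D; the function name here is mine, and you'd get the CGImageRef from your UIImage:

#include <OpenGLES/ES2/gl.h>
#include <CoreGraphics/CoreGraphics.h>
#include <stdlib.h>

/* Sketch: draw a CGImage (e.g. from [yourUIImage CGImage]) into an RGBA8
   bitmap and upload it as a texture. The function name is hypothetical. */
GLuint CreateTextureFromImage(CGImageRef image)
{
    size_t width  = CGImageGetWidth(image);
    size_t height = CGImageGetHeight(image);

    void *pixels = calloc(width * height, 4);
    CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(
        pixels, width, height, 8, width * 4, colourSpace,
        kCGImageAlphaPremultipliedLast);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);

    GLuint texture;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    /* non-power-of-two textures in ES 2.0 need clamping and no mipmaps */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height,
                 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    CGContextRelease(context);
    CGColorSpaceRelease(colourSpace);
    free(pixels);
    return texture;
}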

To then use the texture in a future fragment shader, you'd do something like the following (note the extra uniform for the viewport size, since gl_FragCoord.xy is in pixels and needs normalising before the texture lookup):

precision mediump float;

uniform sampler2D depthMap;

// assumed to be set by the application: the render-target size in pixels,
// needed to turn gl_FragCoord.xy into texture coordinates
uniform vec2 viewportSize;

void main()
{
    // read a value from the depth map; gl_FragCoord.xy is in window
    // coordinates, so normalise it to [0, 1] first
    lowp vec4 colourFromDepthMap =
        texture2D(depthMap, gl_FragCoord.xy / viewportSize);

    // discard the current fragment if it is less close than the stored value
    if(colourFromDepthMap.r > gl_FragCoord.w) discard;

    // ... set gl_FragColor appropriately otherwise ...
}
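
At draw time you'd then bind the depth map to a texture unit and point the shader's two uniforms at it; a sketch, assuming program is the compiled program containing the shader above:

#include <OpenGLES/ES2/gl.h>

/* Sketch: bind the depth map to texture unit 1 and set the two uniforms
   used by the fragment shader above. program, depthMapTexture, width and
   height are assumed to come from the calling code. */
void BindDepthMap(GLuint program, GLuint depthMapTexture,
                  GLfloat width, GLfloat height)
{
    glUseProgram(program);

    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, depthMapTexture);
    glUniform1i(glGetUniformLocation(program, "depthMap"), 1);

    glUniform2f(glGetUniformLocation(program, "viewportSize"),
                width, height);
}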

EDIT2: you can see a much smarter mapping from depth to an RGBA value here. To tie in directly to that document, OES_depth_texture definitely isn't supported on the iPad or on the third generation iPhone. I've not run a complete test elsewhere.
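
I can't reproduce that document here, but the general idea of such mappings is to spread the depth value across all four 8-bit channels rather than just one; a rough C sketch of one such base-256 packing (not necessarily the linked scheme) follows:

#include <math.h>

/* Sketch: treat depth in [0, 1) as a base-256 fraction and split it across
   four bytes, most significant first, recovering roughly 32 bits of
   precision from an RGBA8 texture. Not necessarily the linked scheme. */
void EncodeDepthRGBA(double depth, unsigned char out[4])
{
    for (int i = 0; i < 4; i++) {
        depth *= 256.0;
        double digit = floor(depth);
        out[i] = (unsigned char)digit;
        depth -= digit;
    }
}

double DecodeDepthRGBA(const unsigned char in[4])
{
    return in[0] / 256.0 + in[1] / 65536.0
         + in[2] / 16777216.0 + in[3] / 4294967296.0;
}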

