Fragment shader behaves unexpectedly (Test grid)


I'm trying to write a simple fragment shader to display a grid (or rather a checker pattern) on a polygon. I want this pattern to "remain in place", i.e. when the polygon itself moves, the squares stay where they are, so the resulting pattern kind of slides across the surface of the polygon.

I'm developing this in Java using LWJGL for an ARM-based embedded system, and I can debug both remotely on the ARM device connected to my PC, and locally on the PC itself. I use IntelliJ IDEA for this.

On PC my program defaults to using OpenGL 3.2. On ARM the context is OpenGL ES 3.0. The graphics card on ARM is Vivante GC 2000.

Here's the problem: locally, on my PC, the shader works flawlessly, just like I want it to. But on the ARM device the pattern jitters, distorts, and goes out of sync between the two triangles that make up my polygon. The interesting part is that the pattern changes and moves based on camera position, even though the shader uses only modelMatrix and the vertex positions of the plane for its calculations, both of which remain exactly the same between frames (I checked). Yet camera position somehow affects the result dramatically, which shouldn't happen.

Here's my vertex shader:

#version 300 es

layout (location=0) in vec3 position;

uniform mat4 projectionMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 modelMatrix;

out highp vec3 vertexPosition;

void main()
{   
    // generate position data for the fragment shader
    // does not take view matrix or projection matrix into account
    vec4 vp = modelMatrix * vec4(position, 1.0);
    vertexPosition = vec3(vp.x, vp.y, vp.z);

    // position data for the OpenGL vertex drawing
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}

fragment shader:

#version 300 es

precision highp float;

in highp vec3 vertexPosition;

out mediump vec4 fragColor;

void main()
{
    highp float c = float((int(round(vertexPosition.x/5.0))+int(round(vertexPosition.z/5.0))) % 2);

    fragColor = vec4(vec3(c/2.0 + 0.3), 1);
}

As you can see, I've tried tinkering with the precision of float operations, alas to no avail. You can also see that only modelMatrix and the vertex positions of the polygon affect fragColor, and I can guarantee that I checked them and they do not change between shader calls, yet somehow camera movement still ends up affecting the resulting fragment colors/pattern.

It's also worth noting that no other textures on objects in the scene seem to be affected by the issue.

Here're a couple of screenshots:

How it looks locally (everything works):

Here's how it looks on the ARM device.

Notice the textures shifted between triangles, and there's a weird line between them that seems to have been filled by a completely different set of rules entirely. The problem doesn't appear at all viewing angles - only some. If I point the camera in other directions, sometimes I can move it rather freely with no artifacts visible.

The other thing I've noticed is that the bigger my polygon is, the more jittering and artifacting occurs, which leads me to believe that it has to do with precision/calculation of either vertex positions (in vertex shader), or the position of fragment in relation to those vertex positions in the fragment shader part.

Edit: I've checked the precision of float using glGetShaderPrecisionFormat(GL_FRAGMENT_SHADER, GL_HIGH_FLOAT, range, precision); and it's the same on both the local PC and the ARM device. So it shouldn't be that, unless there's some flag I'm missing that has to be specifically enabled.

Edit: And yet another thing I noticed: locally, on the PC, the little test grass block appears in the center of one of the squares. On ARM the squares are shifted by half, so it stands directly on an intersection (if I align the camera so the artifact doesn't happen). I can't rightly explain this, because in my mind the calculation should yield the same result.

Either way, I actually need to solve this problem somehow, and I would appreciate the answer.

So in the end it turned out to be a mixture of two problems:

1) The operator % works differently in OpenGL 3.2 and OpenGL ES 3.0, at least on my hardware. On the former, it always returned a positive result for me, even when an operand was negative, i.e. 3 % 2 == 1 and -3 % 2 == 1. On OpenGL ES, however, it maintains the sign, so -3 % 2 == -1. (Strictly speaking, both GLSL specifications say the result of % is undefined when either operand is negative, so each driver is free to behave differently; a sign-safe modulo has to be written by hand.)
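To see the sign issue in isolation, here is a minimal Java sketch (names are mine, not from the original project). Java's % happens to behave like the GLES driver did, keeping the sign, while Math.floorMod always returns a non-negative result, which is what a checker index needs:

```java
public class CheckerIndex {
    // Signed remainder: mirrors what the Vivante GLES driver returned.
    static int signedMod(int a, int b) {
        return a % b; // -3 % 2 == -1 in Java
    }

    // Sign-safe variant: always 0 or 1 for a divisor of 2,
    // regardless of the sign of the coordinates.
    static int checkerCell(int x, int z) {
        return Math.floorMod(x + z, 2);
    }

    public static void main(String[] args) {
        System.out.println(signedMod(-3, 2));   // -1
        System.out.println(checkerCell(-3, 0)); // 1
        System.out.println(checkerCell(-4, 0)); // 0
    }
}
```

The same trick in GLSL is the usual `a - b * floor(a / b)` (or adding a large even offset before taking %), so the cell index never depends on the sign of the world coordinate.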

2) It was a precision problem. I'm working with large positive coordinates, so my vertexPosition values end up in the range of 10k. When I take the sum, it can reach 20k. It seems even a highp float isn't enough to maintain adequate precision with such large numbers, so the pattern breaks down. This felt odd at first, because the highp range is reported as roughly -2^127 to 2^127, which should be plenty for my calculation. But range is not precision: a 32-bit float carries only about 24 bits of mantissa, so the larger the magnitude, the coarser the representable values (around 20 000 the spacing between adjacent floats is already about 0.002). My solution was to normalize the coordinates, reducing the values to below 10.
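The range-vs-precision distinction is easy to demonstrate on the host side. java.lang.Math.ulp reports the spacing between a float and the next representable float, which grows with magnitude:

```java
// The spacing between adjacent 32-bit floats grows with magnitude.
// At 20 000 the next representable float is ~0.002 away, so fine
// detail in large world coordinates is lost before the fragment
// shader ever interpolates them.
public class FloatSpacing {
    public static void main(String[] args) {
        System.out.println(Math.ulp(1.0f));     // ~1.19e-7
        System.out.println(Math.ulp(20000.0f)); // 0.001953125
    }
}
```

That is why normalizing coordinates toward zero (or subtracting a camera-relative origin on the CPU in double precision) restores the pattern: the same 24 mantissa bits then cover a much smaller range.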

If anyone can elaborate further on things I can do - let me know!


The problems with my shader still persist: the "texture" of the polygon gets warped at some viewing angles. I found another question here that describes pretty much exactly what I'm experiencing, and it has a bit of a workaround answer. I will link it here: OpenGL Perspective Projection Clipping Polygon with Vertex Outside Frustum = Wrong texture mapping?


I'll post my final answer to this problem, something that took a lot of struggle, searching and trial and error. There are actually two separate issues depicted on the screenshots, so I'll cover them both.

Regarding the strange texture shift at the polygon intersection: it has been confirmed that this is a Vivante driver issue. Coordinates for points that lie too far outside the frustum are calculated incorrectly for the fragment shader (note that they are completely fine in the vertex shader, which is why the plane itself doesn't appear torn; only the texture suffers).

There doesn't seem to be a driver fix at the moment.

You can, however, implement a workaround: split the mesh. Instead of having one large quad made of 2 triangles, build it from several smaller quads. In my case I made a 6x6 structure: 36 quads, 72 triangles total. That way no vertex ever lies too far outside the frustum, and precision stays good.

Yes, this is FAR from ideal, but it's better than having your fragment shader produce visual artifacts.
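The mesh-split workaround can be sketched as follows (a hypothetical Java helper; names and vertex layout are mine, not from the original project). It emits an n-by-n grid of quads covering the same area as the original single quad:

```java
import java.util.ArrayList;
import java.util.List;

public class GridMesh {
    // Returns interleaved x,z positions of triangle vertices
    // (two triangles per cell), covering [0,size] x [0,size].
    static float[] buildGrid(int n, float size) {
        List<Float> verts = new ArrayList<>();
        float step = size / n;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                float x0 = i * step, z0 = j * step;
                float x1 = x0 + step, z1 = z0 + step;
                // first triangle of the cell
                add(verts, x0, z0); add(verts, x1, z0); add(verts, x1, z1);
                // second triangle of the cell
                add(verts, x0, z0); add(verts, x1, z1); add(verts, x0, z1);
            }
        }
        float[] out = new float[verts.size()];
        for (int k = 0; k < out.length; k++) out[k] = verts.get(k);
        return out;
    }

    static void add(List<Float> v, float x, float z) { v.add(x); v.add(z); }

    public static void main(String[] args) {
        float[] grid = buildGrid(6, 60.0f);
        // 36 cells * 2 triangles * 3 vertices * 2 floats = 432 floats
        System.out.println(grid.length);
    }
}
```

An indexed version with shared vertices would be cheaper on memory; the point is only that each triangle stays small enough that none of its vertices ends up far outside the frustum.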


Concerning the colors. As you may notice, on the screenshots the colors end up grey and black where they should be light grey and dark grey.

The solution wasn't easy to arrive at. It's the system locale.

As some of you may not be aware, in some Slavic (and many other European) locales the decimal separator is a comma, not a dot. This bites you if your shader source passes through locale-sensitive string formatting, as mine did. So basically the line:

fragColor = vec4(vec3(c/2.0 + 0.3), 1);

gets turned into

fragColor = vec4(vec3(c/2 , 0 + 0 , 3), 1);

As you can guess, this is completely and utterly wrong. I'm actually impressed that GLSL seems completely fine with it and doesn't give any compile or runtime errors, probably because vec3 has a three-argument constructor, so the extra commas still parse. In any case, this bug made all floating point constants in my code wrong, and so the resulting calculation was wrong too.

Just in case anyone runs into that problem.
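The pitfall is easy to reproduce on the host side (my own minimal reproduction, not the original code). String.format with a locale-sensitive format can print a comma as the decimal separator; Locale.ROOT forces the dot and is the safe choice whenever you generate machine-readable text such as GLSL source:

```java
import java.util.Locale;

public class LocaleBug {
    public static void main(String[] args) {
        float c = 0.3f;
        // Under a locale such as ru-RU this prints "0,300000":
        String bad = String.format(new Locale("ru", "RU"), "%f", c);
        // Locale-independent formatting always prints "0.300000":
        String good = String.format(Locale.ROOT, "%f", c);
        System.out.println(bad);  // 0,300000
        System.out.println(good); // 0.300000
    }
}
```

The same applies to Float.parseFloat in reverse: it always expects a dot, so round-tripping numbers through locale-formatted strings breaks twice.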





Comments
  • My problem is still not completely fixed. The thing I discovered, though, is that it works just like it should on one ARM device, and then it doesn't on the "newer" one.
  • Ignoring the bugs (which do indeed sound like bugs), remember that the more range you use in a float, the less precision you get. Big floats will start to lose fractional bits, so it's always a good idea to keep things as close to zero as possible, and ideally use a range that is symmetric around zero (i.e. -1 to +1 will preserve more precision than 0 to 2, because you always store the sign bit so you may as well use it for something useful).