Hey guys, I'm working on some OpenGL code and I noticed some peculiarities on one of my machines which don't happen on any of the other ones.

Here is what the code does:

1. I upload vertex data to a VertexBuffer using glMapBufferRange with GL_MAP_WRITE_BIT | GL_MAP_INVALIDATE_BUFFER_BIT | GL_MAP_FLUSH_EXPLICIT_BIT (sketched below).
2. I then flush the range of data which was mapped using glFlushMappedBufferRange.
3. One of the attributes of the vertex data is an int specifying the texture index within the shader.
4. I bind and activate a series of textures, up to the number specified by MAX_TEXTURE_IMAGE_UNITS, which on this driver is 18.
5. In the fragment shader, I index a uniform array of samplers to determine which texture the vertex uses.

When performing this I get the result in the attached video: random flickering, and sometimes the last texture drawn is not displayed.
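For reference, the host-side path looks roughly like the sketch below. This is a minimal sketch rather than my actual code; `uploadAndBind`, `vbo`, `textures`, `vertices`, and `uploadSize` are placeholder names, and the GL functions are assumed to be loaded through the usual loader elsewhere.

```cpp
#include <cstring>
#include <vector>
// GL types/functions assumed to come from the usual loader (GLEW/GLAD) elsewhere.

// Sketch of the upload and texture-binding path described above; all names are placeholders.
void uploadAndBind(GLuint vbo, const std::vector<GLuint>& textures,
                   const void* vertices, GLsizeiptr uploadSize)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    void* ptr = glMapBufferRange(GL_ARRAY_BUFFER, 0, uploadSize,
                                 GL_MAP_WRITE_BIT |
                                 GL_MAP_INVALIDATE_BUFFER_BIT |
                                 GL_MAP_FLUSH_EXPLICIT_BIT);
    std::memcpy(ptr, vertices, static_cast<std::size_t>(uploadSize));
    // GL_MAP_FLUSH_EXPLICIT_BIT means the written range has to be flushed by hand.
    glFlushMappedBufferRange(GL_ARRAY_BUFFER, 0, uploadSize);
    glUnmapBuffer(GL_ARRAY_BUFFER);

    // One texture per unit, up to GL_MAX_TEXTURE_IMAGE_UNITS (18 on this driver).
    GLint maxUnits = 0;
    glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, &maxUnits);
    for (GLint i = 0; i < maxUnits && i < static_cast<GLint>(textures.size()); ++i) {
        glActiveTexture(GL_TEXTURE0 + i);
        glBindTexture(GL_TEXTURE_2D, textures[i]);
    }
}
```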
I'd like to stress that I have checked the parameters that go into the functions listed above and have tested them on a variety of machines and GPUs, including the Mesa3D llvmpipe, and therefore do not believe my ranges are wrong, but rather that I'm making a wrong assumption somewhere.

After messing around with my code I managed to get it working, but I have no idea why the changes I made caused it to work, or how to detect this case other than hardcoding the renderer's name, so I decided to ask here. I got the scene to render properly by using only GL_MAP_WRITE_BIT and by limiting my activated textures (and the uniform array size) to 1 (the working variant is sketched below). The driver specifies that it supports the GL_ARB_gpu_shader5 extension (which I've tried to request), and I'm using GLSL 440, so uniform indexing of opaque types should work. Additionally, I do not see why the invalidate-buffer and flush-explicit bits would cause problems either, considering the stated driver support.

I've also tested using a core, core and forward-compatible, compatibility, and default OpenGL profile. The GPU in this machine is the ATI Radeon HD 5770 (Specs | TechPowerUp GPU Database).

I would greatly appreciate it if someone could shed some insight on this.
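Concretely, the variant that renders correctly maps the buffer like this (again just a sketch, reusing the placeholder names from the sketch above). Note that without GL_MAP_FLUSH_EXPLICIT_BIT there is no explicit flush call, since glFlushMappedBufferRange is only valid for ranges mapped with that bit:

```cpp
// Working variant: GL_MAP_WRITE_BIT only (sketch; names are placeholders).
glBindBuffer(GL_ARRAY_BUFFER, vbo);
void* ptr = glMapBufferRange(GL_ARRAY_BUFFER, 0, uploadSize, GL_MAP_WRITE_BIT);
std::memcpy(ptr, vertices, static_cast<std::size_t>(uploadSize));
// No glFlushMappedBufferRange here: it requires a mapping made with GL_MAP_FLUSH_EXPLICIT_BIT.
glUnmapBuffer(GL_ARRAY_BUFFER);
// Only texture unit 0 is used, and the sampler array in the shader is shrunk to size 1.
```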
Just an update in case anyone stumbles onto this. The "setting the array size to 1" part of the problem is due to indexing sampler arrays with a varying being illegal under the OpenGL spec. Some drivers support it despite that, while others fail with an error along the lines of "sampler arrays indexed with non-constant expressions are forbidden in GLSL 1.30 and later".
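To make that concrete, the shader-side construct in question looks roughly like this. It's a sketch rather than my exact shader; `u_textures`, `v_texIndex`, `v_uv`, and the array size of 18 are placeholders:

```glsl
#version 440

// Sketch of the fragment shader described in the original post; names and sizes are placeholders.
// v_texIndex is fed from the per-vertex int attribute, so it is a flat varying rather than a
// dynamically uniform value -- which is the kind of sampler-array indexing at issue above.
flat in int v_texIndex;
in vec2 v_uv;

uniform sampler2D u_textures[18];

out vec4 fragColor;

void main()
{
    fragColor = texture(u_textures[v_texIndex], v_uv);
}
```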
The working theory is that invalidating the buffer while it is being drawn causes the driver to garbage collect it (despite it still being in use). This behavior is very weird (and makes INVALIDATE_BUFFER_BIT useless).
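If that theory holds, the suspect pattern would be the per-frame sequence below, where the map with the invalidate bit happens while the previous frame's draw may still be reading the buffer. This is only a sketch under that assumption; `renderFrame`, `drawVertexCount`, and the other names are placeholders, with the same setup as the earlier sketch:

```cpp
// Sketch of the per-frame sequence the theory above is about (names are placeholders).
void renderFrame(GLuint vbo, const void* vertices, GLsizeiptr uploadSize, GLsizei drawVertexCount)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);

    // Remap the buffer for this frame's data. If the driver discards the old storage
    // while the previous frame's draw is still reading it, that would match the
    // flickering described above.
    void* ptr = glMapBufferRange(GL_ARRAY_BUFFER, 0, uploadSize,
                                 GL_MAP_WRITE_BIT |
                                 GL_MAP_INVALIDATE_BUFFER_BIT |
                                 GL_MAP_FLUSH_EXPLICIT_BIT);
    std::memcpy(ptr, vertices, static_cast<std::size_t>(uploadSize));
    glFlushMappedBufferRange(GL_ARRAY_BUFFER, 0, uploadSize);
    glUnmapBuffer(GL_ARRAY_BUFFER);

    // Draw from the freshly written buffer; the GPU may still be consuming it
    // when the next frame's map/invalidate call comes around.
    glDrawArrays(GL_TRIANGLES, 0, drawVertexCount);
}
```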