From David Fries, "Fix remote X11 crash querying GL_NUM_EXTENSIONS

In osg::isGLExtensionOrVersionSupported in src/osg/GLExtensions.cpp, when
using indirect X11 rendering,
glGetIntegerv( GL_NUM_EXTENSIONS, &numExt );
leaves numExt uninitialized, causing the following glGetStringi to
return NULL when the extension index isn't present.  Passing NULL to
std::string() then crashes.  This is with the following NVIDIA driver:
OpenGL version string: 3.3.0 NVIDIA 256.35

I went ahead and initialized some of the other variables before
glGetIntegerv in other files as well.  I don't know for sure
which ones can fail, so I don't know which are strictly required.
"
Robert Osfield
2010-11-03 09:28:28 +00:00
parent 079b1c293e
commit 2d28026654
9 changed files with 11 additions and 6 deletions

src/osg/Texture.cpp

@@ -2258,6 +2258,7 @@ Texture::Extensions::Extensions(unsigned int contextID)
         OSG_INFO<<"Disabling _isNonPowerOfTwoTextureMipMappedSupported for GeForce FX hardware."<<std::endl;
     }
+    _maxTextureSize=0;
     glGetIntegerv(GL_MAX_TEXTURE_SIZE,&_maxTextureSize);
     char *ptr;
@@ -2274,6 +2275,7 @@ Texture::Extensions::Extensions(unsigned int contextID)
     if( _isMultiTexturingSupported )
     {
+        _numTextureUnits = 0;
 #if defined(OSG_GLES2_AVAILABLE) || defined(OSG_GL3_AVAILABLE)
         glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS,&_numTextureUnits);
 #else