From David Fries, "Fix remote X11 crash querying GL_NUM_EXTENSIONS
In osg::isGLExtensionOrVersionSupported in src/osg/GLExtensions.cpp, when using indirect X11 rendering, glGetIntegerv( GL_NUM_EXTENSIONS, &numExt ); leaves numExt uninitialized, which causes the subsequent glGetStringi to return NULL when the extension number isn't present. Passing NULL to std::string() then crashes. This is with the following NVIDIA driver: OpenGL version string: 3.3.0 NVIDIA 256.35. I went ahead and initialized some of the other variables before glGetIntegerv in other files as well. I don't know for sure which ones can fail, so I don't know which are strictly required."
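For context, a minimal sketch of the failure mode and the defensive fix described above. This is not the verbatim GLExtensions.cpp code; queryExtensions and extensionSet are illustrative names, and GL headers plus a current context are assumed:

#include <set>
#include <string>
// GL headers and extension-function loading assumed to be in place.

static std::set<std::string> queryExtensions()
{
    std::set<std::string> extensionSet;  // hypothetical local, stands in for OSG's set
    GLint numExt = 0;                    // initialize: under indirect GLX the query may not write it
    glGetIntegerv(GL_NUM_EXTENSIONS, &numExt);
    for (GLint i = 0; i < numExt; ++i)
    {
        const GLubyte* name = glGetStringi(GL_EXTENSIONS, i);
        if (name)                        // guard: passing NULL to std::string() is the crash
        {
            extensionSet.insert(reinterpret_cast<const char*>(name));
        }
    }
    return extensionSet;
}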
@@ -2258,6 +2258,7 @@ Texture::Extensions::Extensions(unsigned int contextID)
         OSG_INFO<<"Disabling _isNonPowerOfTwoTextureMipMappedSupported for GeForce FX hardware."<<std::endl;
     }
 
+    _maxTextureSize=0;
     glGetIntegerv(GL_MAX_TEXTURE_SIZE,&_maxTextureSize);
 
     char *ptr;
@@ -2274,6 +2275,7 @@ Texture::Extensions::Extensions(unsigned int contextID)
 
     if( _isMultiTexturingSupported )
     {
+        _numTextureUnits = 0;
         #if defined(OSG_GLES2_AVAILABLE) || defined(OSG_GL3_AVAILABLE)
             glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS,&_numTextureUnits);
         #else
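The initialize-before-query pattern the diff applies generalizes to any glGetIntegerv call that can silently fail. A hypothetical helper, not part of this commit, that makes the fallback explicit:

// Hypothetical helper, not in the OSG sources: returns a defined
// fallback when glGetIntegerv fails silently (e.g. over indirect GLX).
static GLint getGLIntegerOr(GLenum pname, GLint fallback)
{
    GLint value = fallback;       // defined even if the query writes nothing
    glGetIntegerv(pname, &value);
    return value;
}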