In osg::isGLExtensionOrVersionSupported in src/osg/GLExtensions.cpp, when
using indirect X11 rendering,
glGetIntegerv( GL_NUM_EXTENSIONS, &numExt );
leaves numExt uninitialized, causing the subsequent glGetStringi to
return NULL when the extension index isn't present. Passing that NULL to
std::string() then crashes. This occurs with the following NVIDIA driver:
OpenGL version string: 3.3.0 NVIDIA 256.35
I went ahead and initialized some of the other variables passed to
glGetIntegerv in other files as well. I don't know for sure which of
those queries can fail, so I don't know which initializations are
strictly required.
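A minimal sketch of the defensive pattern described above (not the
verbatim patch; the function name and the std::set<std::string>
destination are illustrative assumptions):

    #include <set>
    #include <string>
    // GL 3.x headers providing glGetIntegerv/glGetStringi are assumed
    // to be included, as they are in src/osg/GLExtensions.cpp.

    static void harvestExtensions(std::set<std::string>& extensions)
    {
        GLint numExt = 0;  // initialize: under indirect X11 rendering the
                           // query can fail and leave the value untouched
        glGetIntegerv(GL_NUM_EXTENSIONS, &numExt);
        for (GLint i = 0; i < numExt; ++i)
        {
            const GLubyte* name = glGetStringi(GL_EXTENSIONS, static_cast<GLuint>(i));
            if (name)  // guard: glGetStringi returns NULL for an invalid index
                extensions.insert(std::string(reinterpret_cast<const char*>(name)));
        }
    }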
"
Used a local function pointer to avoid compiler warnings related to casts from void*.
Moved various OSG classes across to using setGLExtensions instead of getGLExtensions,
and changed them to use typedef declarations in the headers rather than casts in
the .cpp files.
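A hedged sketch of the pattern these two entries describe, assuming
setGLExtensions refers to OSG's setGLExtensionFuncPtr template helper
(the glActiveTexture lookup is just an illustrative example; GL headers
are assumed included):

    // In the header: a typedef gives the extension entry point a real type.
    typedef void (* ActiveTextureProc)(GLenum texture);

    // In the .cpp: assign through a typed local function pointer. The
    // helper performs the void* conversion once internally, so call sites
    // carry no casts and the compiler has nothing to warn about.
    ActiveTextureProc glActiveTexture_ptr = 0;
    osg::setGLExtensionFuncPtr(glActiveTexture_ptr, "glActiveTexture");
    if (glActiveTexture_ptr) glActiveTexture_ptr(GL_TEXTURE0);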
Updated wrappers.
Note, this required adding an unsigned int context ID parameter to the
osg::isGLUExtensionSupported() and osg::isGLExtensionSupported() functions.
This may require reimplementation of end-user code to accommodate the new
calling convention.
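For end-user code the update amounts to passing the context ID through;
a minimal sketch, assuming the ID is the first parameter and is obtained
from the active osg::State:

    unsigned int contextID = state.getContextID();
    // Before: osg::isGLExtensionSupported("GL_ARB_multitexture");
    bool ok = osg::isGLExtensionSupported(contextID, "GL_ARB_multitexture");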
Part of the prep for supporting both Matrixf (float) and Matrixd (double).
Added osg::Matrixf::glLoadMatrix() and osg::Matrixf::glMultiMatrix() methods
and changed the corresponding glLoadMatrixf()/glMultMatrixf() calls across to
use these methods. Again, prep for supporting Matrixd.
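At each call site the conversion looks roughly like this (sketch only;
matrix stands for any osg::Matrixf in scope):

    // Before: glLoadMatrixf(matrix.ptr());
    // After: the method issues the GL call itself, so the same call site
    // can later work unchanged with a double-precision Matrixd.
    matrix.glLoadMatrix();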
Fixes for Visual Studio 6.0 compile.