In osg::isGLExtensionOrVersionSupported in src/osg/GLExtensions.cpp when
using indirect X11 rendering,
glGetIntegerv( GL_NUM_EXTENSIONS, &numExt );
is leaving numExt uninitialized, causing the subsequent glGetStringi to
return NULL when the extension index isn't present. Passing NULL to
std::string() then crashes. This is with the following NVIDIA driver:
OpenGL version string: 3.3.0 NVIDIA 256.35
I went ahead and initialized some of the other variables before their
glGetIntegerv calls in other files as well. I don't know for sure which of
those calls can fail, so I don't know which initializations are strictly
required.
"
Moved creation of the main shader into ShaderComposer and collection of
ShaderComponents into osg::State.
Also added very basic shader set up in the osgshadercomposition example.
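A minimal sketch of the usage pattern, assuming the ShaderComponent::addShader() and StateAttribute::setShaderComponent() methods this work introduces; the function name and shader source below are made up for illustration:

    #include <osg/Shader>
    #include <osg/StateAttribute>

    // Sketch only: attach a ShaderComponent to a StateAttribute so that the
    // ShaderComposer held by osg::State can collect it when composing the
    // main shader.
    static const char* fragmentSnippet =
        "void colorFilter(inout vec4 color) { color.rgb *= 0.5; }\n";

    void attachShaderComponent(osg::StateAttribute* attribute)
    {
        osg::ShaderComponent* sc = new osg::ShaderComponent;
        sc->addShader(new osg::Shader(osg::Shader::FRAGMENT, fragmentSnippet));
        attribute->setShaderComponent(sc);
    }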
Added support for automatic aliasing of vertex, normal, color etc. arrays to Vertex Attribute equivalents.
Added new osg::GLBeginEndAdapter class for runtime conversion of glBegin/glEnd style code to vertex array equivalents.
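A sketch of the intended usage, assuming State::getGLBeginEndAdapter() and the Begin()/Vertex3f()/End() methods that mirror the immediate-mode calls; the drawable class itself is invented for illustration:

    #include <osg/Drawable>
    #include <osg/GLBeginEndAdapter>
    #include <osg/RenderInfo>

    // Sketch only: immediate-mode style drawing routed through the adapter,
    // which records the calls and submits them as vertex arrays on End().
    class QuadDrawable : public osg::Drawable
    {
    public:
        QuadDrawable() {}
        QuadDrawable(const QuadDrawable& rhs, const osg::CopyOp& copyop = osg::CopyOp::SHALLOW_COPY)
            : osg::Drawable(rhs, copyop) {}
        META_Object(example, QuadDrawable)

        virtual void drawImplementation(osg::RenderInfo& renderInfo) const
        {
            osg::GLBeginEndAdapter& gl = renderInfo.getState()->getGLBeginEndAdapter();
            gl.Begin(GL_QUADS);
            gl.Vertex3f(0.0f, 0.0f, 0.0f);
            gl.Vertex3f(1.0f, 0.0f, 0.0f);
            gl.Vertex3f(1.0f, 1.0f, 0.0f);
            gl.Vertex3f(0.0f, 1.0f, 0.0f);
            gl.End();
        }
    };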
Added automatic shader source conversion from gl_ to osg_ builtins.
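As an illustration of the aliasing, shaders can reference osg_ names such as osg_Vertex, osg_Color and osg_ModelViewProjectionMatrix in place of the corresponding gl_ builtins; the snippet below is a generic example, not code from the OSG source:

    // Generic example of a vertex shader written directly against the osg_
    // aliases that replace the gl_ builtins when aliasing is enabled.
    static const char* vertexSource =
        "attribute vec4 osg_Vertex;\n"
        "attribute vec4 osg_Color;\n"
        "uniform mat4 osg_ModelViewProjectionMatrix;\n"
        "varying vec4 color;\n"
        "void main()\n"
        "{\n"
        "    color = osg_Color;\n"
        "    gl_Position = osg_ModelViewProjectionMatrix * osg_Vertex;\n"
        "}\n";

A shader written against gl_Vertex, gl_Color and gl_ModelViewProjectionMatrix would be rewritten to this osg_ form by the automatic source conversion.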
Introduced a new FindFileCallback to Registry to complement the existing ReadFileCallback and WriteFileCallback.
Added support for assigning Find, Read and WriteFileCallbacks to osgDB::Options to enable plugins/applications to override the callbacks just for that
read/write call and any nested file operations.
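A sketch of such a per-call override, assuming the setReadFileCallback() accessor on osgDB::Options and the osgDB::ReadFileCallback base class; the logging callback itself is made up for illustration:

    #include <osg/Notify>
    #include <osgDB/Options>
    #include <osgDB/ReadFile>
    #include <osgDB/Registry>

    // Sketch only: log each read (including nested reads triggered by the
    // plugin) and then fall through to the Registry's default implementation.
    class LoggingReadCallback : public osgDB::ReadFileCallback
    {
    public:
        virtual osgDB::ReaderWriter::ReadResult readNode(const std::string& filename,
                                                         const osgDB::Options* options)
        {
            osg::notify(osg::NOTICE) << "Reading " << filename << std::endl;
            return osgDB::Registry::instance()->readNodeImplementation(filename, options);
        }
    };

    osg::Node* readNodeWithLogging(const std::string& filename)
    {
        // The callback applies only to this call and any nested file
        // operations, rather than being installed globally on the Registry.
        osg::ref_ptr<osgDB::Options> options = new osgDB::Options;
        options->setReadFileCallback(new LoggingReadCallback);
        return osgDB::readNodeFile(filename, options.get());
    }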
"updates for GLSL core integration:
Code compiles and runs on win32.
Basic functionality of Program and Shader in place.
Program derived from StateAttribute.
Uniform value propagation is not yet functional (in development).
Includes some patches by Nathan Cournia.
Includes an example testcase to demonstrate use of the new classes."
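For reference, a minimal generic sketch of how the new classes plug into the StateAttribute/StateSet machinery; the shader source and uniform name are invented for illustration:

    #include <osg/Program>
    #include <osg/Shader>
    #include <osg/StateSet>
    #include <osg/Uniform>

    // Minimal generic sketch: Program derives from StateAttribute, so it is
    // applied to a StateSet like any other attribute, with Uniform values
    // added alongside it.
    static const char* fragSource =
        "uniform vec4 baseColor;\n"
        "void main() { gl_FragColor = baseColor; }\n";

    void applyProgram(osg::StateSet* stateset)
    {
        osg::Program* program = new osg::Program;
        program->addShader(new osg::Shader(osg::Shader::FRAGMENT, fragSource));
        stateset->setAttributeAndModes(program, osg::StateAttribute::ON);

        // Uniform value propagation was still in development at the time of
        // this entry, but the declaration pattern is the same.
        stateset->addUniform(new osg::Uniform("baseColor", osg::Vec4(1.0f, 0.5f, 0.0f, 1.0f)));
    }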