From Johannes Bauerle, "I experienced problems using binary shaders in .osgt files (i.e. via the osg serializer plugins). At runtime, std::bad_alloc errors were thrown when binary shaders were loaded.

The reason is that the .osgt text files do not provide size information for the contained binary shader, which leads to a bad allocation when reading the shader data (probably a size of 0?). The reader method in the responsible serializer class (serializers/osg/BinaryShader) is correct and does not need to be changed, as it queries the size as expected. The writer method supports two paths (binary output .osgb, and text output .osgt/.osgx). Only the text path is affected, as the binary path already writes the size.

I extended the writer's text path to also emit the size information. The results before and after the fix are shown below:

Erroneous output for a binary shader in an .osgt file before the fix:

Data {
    0a
    0d
    0
    ...
}

Corrected output for a binary shader in an .osgt file after the fix:

Data 524 {
    0a
    0d
    0
    ...
}

After my fix, the thrown error disappeared."
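To illustrate the failure mode described above, here is a small standalone C++ sketch (illustrative only, not OSG code): a text reader that expects a size token before the opening bracket gets a failed extraction when that token is missing, so the subsequent buffer allocation is based on an unreliable value.

#include <iostream>
#include <sstream>
#include <vector>

int main()
{
    // Pre-fix form: "Data { ... }" -- no size token before the bracket.
    std::istringstream broken( "{ 0a 0d 0 }" );
    unsigned int size = 0;
    broken >> size;    // extraction fails: "{" is not a number
    std::cout << "broken: fail=" << broken.fail() << " size=" << size << "\n";

    // Post-fix form: "Data 524 { ... }" -- the size precedes the bracket.
    std::istringstream fixed( "524 { 0a 0d 0 }" );
    fixed >> size;     // reads 524 as expected
    std::vector<unsigned char> buffer( size );   // a sane allocation now
    std::cout << "fixed: size=" << size << " bytes=" << buffer.size() << "\n";
    return 0;
}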
commit b5a14f8d68 (parent 54afbb9ff8)
Robert Osfield, 2011-02-14 13:54:15 +00:00

@@ -39,6 +39,7 @@ static bool writeData( osgDB::OutputStream& os, const osg::ShaderBinary& sb )
     else
     {
         const unsigned char* data = sb.getData();
+        os << (unsigned int)sb.getSize();
         os << osgDB::BEGIN_BRACKET << std::endl;
         for ( unsigned int i=0; i<sb.getSize(); ++i )
         {
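For context, the matching reader in the text path might look like the following sketch. This is a hypothetical reconstruction, not the actual serializers/osg/BinaryShader source: the exact osgDB::InputStream calls and the per-byte token format (elided from the truncated hunk above) are assumptions, mirroring the style of the writer. The point is simply that the reader consumes a size token before the bracketed data, which is why the writer must emit one.

#include <vector>
#include <osg/Shader>
#include <osgDB/InputStream>

static bool readData( osgDB::InputStream& is, osg::ShaderBinary& sb )
{
    // Text path: consume the size token first -- the very token the
    // patched writer now emits -- then the bracketed byte values.
    unsigned int size = 0;
    is >> size;
    is >> osgDB::BEGIN_BRACKET;
    std::vector<unsigned char> data( size );
    for ( unsigned int i=0; i<size; ++i )
    {
        unsigned int value = 0;   // assumes bytes are stored as plain ints
        is >> value;
        data[i] = (unsigned char)value;
    }
    is >> osgDB::END_BRACKET;
    if ( size>0 ) sb.assign( size, &data[0] );
    return true;
}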