From b5a14f8d6831fc906394a41aa9e05fc2c343ab46 Mon Sep 17 00:00:00 2001
From: Robert Osfield
Date: Mon, 14 Feb 2011 13:54:15 +0000
Subject: [PATCH] From Johannes Bauerle, "I experienced problems using binary
 shaders in osgt files (= the osg-serializer plugins)."

From Johannes Bauerle:

"I experienced problems using binary shaders in osgt files (= the
osg-serializer plugins). At runtime, std::bad_alloc errors were thrown when
using binary shaders. The reason is that the .osgt text files do not provide
size information about the contained binary shader, leading to a bad
allocation when reading the shader data (probably size 0?).

The reader method in the responsible serializer class
(serializers/osg/ShaderBinary) is correct and does not need to be changed, as
it queries the size as expected. The writer method supports two paths (binary
output .osgb and text output .osgt/.osgx). Only the text path is affected, as
the binary path already writes the size. I extended the writer in the text
path with the size information. The results before and after the fix are
shown below.

Erroneous code for a binary shader in an osgt file before the fix:

    Data {
        0a 0d 0 ...
    }

Corrected code for a binary shader in an osgt file after the fix:

    Data 524 {
        0a 0d 0 ...
    }

After my fix the thrown error disappeared."
---
 src/osgWrappers/serializers/osg/ShaderBinary.cpp | 1 +
 1 file changed, 1 insertion(+)

diff --git a/src/osgWrappers/serializers/osg/ShaderBinary.cpp b/src/osgWrappers/serializers/osg/ShaderBinary.cpp
index d00adee0f..f23a18536 100644
--- a/src/osgWrappers/serializers/osg/ShaderBinary.cpp
+++ b/src/osgWrappers/serializers/osg/ShaderBinary.cpp
@@ -39,6 +39,7 @@ static bool writeData( osgDB::OutputStream& os, const osg::ShaderBinary& sb )
     else
     {
         const unsigned char* data = sb.getData();
+        os << (unsigned int)sb.getSize();
         os << osgDB::BEGIN_BRACKET << std::endl;
         for ( unsigned int i=0; i<sb.getSize(); ++i )