I've found that *v8::String::Utf8Value(args[0]->ToString()) returns the proper string on Node 0.8.2 (32-bit) but does not return the proper string on Node 0.8.8 (64-bit).
Does anybody understand why?
My node.js addon looks like this:
#define BUILDING_NODE_EXTENSION
#include <node.h>

#define MAX_OUTPUT_BUF 80

extern "C" char *do_sqlsig(char *in);

using namespace v8;

Handle<Value> Sqlsig(const Arguments& args) {
  HandleScope scope;
  char *c_arg, *ret;

  if (args.Length() < 1) {
    ThrowException(Exception::TypeError(String::New("Wrong number of arguments")));
    return scope.Close(Undefined());
  }

  c_arg = *v8::String::Utf8Value(args[0]->ToString());
  ret = c_arg; // do_sqlsig(c_arg);
  return scope.Close(String::New(ret));
}

void Init(Handle<Object> exports) {
  exports->Set(String::NewSymbol("sqlsig"),
               FunctionTemplate::New(Sqlsig)->GetFunction());
}

NODE_MODULE(sqlsig, Init)
As you can see, I'm writing a wrapper for the C function do_sqlsig. I know C very well and know very little about C++.
The string that the pointer returned by *v8::String::Utf8Value(args[0]->ToString()) points at is destroyed at the end of that line, when the temporary Utf8Value is destroyed. You create and destroy the Utf8Value object in a single statement, so c_arg is left dangling. Dereferencing a dangling pointer is undefined behavior, which is why you see different results on different Node versions.
Break it up into two lines and the string will remain valid for as long as your Utf8Value object is in scope:
v8::String::Utf8Value str(args[0]->ToString());
c_arg = *str;
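Applied to the handler from the question, a minimal sketch might look like this (the do_sqlsig call stays commented out, as in the original):

Handle<Value> Sqlsig(const Arguments& args) {
  HandleScope scope;

  if (args.Length() < 1) {
    ThrowException(Exception::TypeError(String::New("Wrong number of arguments")));
    return scope.Close(Undefined());
  }

  // The Utf8Value object owns the UTF-8 copy of the string; keep it
  // alive for as long as the C string is used.
  v8::String::Utf8Value str(args[0]->ToString());
  char *c_arg = *str;

  char *ret = c_arg; // do_sqlsig(c_arg);
  return scope.Close(String::New(ret));
}

String::New(ret) copies the bytes into a new V8 string before str goes out of scope, so nothing here outlives the Utf8Value.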