**This is still unsolved**
I'm trying to call an ObjC/C++ function from C#. I've done my best to follow various example code, most recently from:
http://msdn.microsoft.com/en-us/library/ms146631(v=VS.80).aspx
This is for an iPhone/MonoTouch environment, so I'm not sure I've done everything I should. The bytes appear to be OK in the ObjC/C++ function, but the byte array I get back in C# ends up containing 0 0 0 0 0 0, etc.
**Update**
Corrected the for loop initializer; now it gives an EXC_BAD_ACCESS signal on the *returnbytes[i] = bytes[i]; line.
C# code:
[DllImport ("__Internal")]
private static extern int _getjpeg(string url, ref IntPtr thebytes);

void somefunction(string image_id) {
    int maxsize = 50000;
    byte[] thebytes = new byte[maxsize];
    IntPtr byteptr = Marshal.AllocHGlobal(maxsize);
    int imagesize = _getjpeg(image_id, ref byteptr);
    Debug.Log("Getting _picturesize()... " + image_id);
    int picsize = _picturesize();
    Marshal.Copy(byteptr, thebytes, 0, picsize);
    var texture = new Texture2D(1, 1);
    string bytedebug = "";
    for (int i = 5000; i < 5020; i++)
        bytedebug += thebytes[i] + " ";
    Debug.Log("Bytes length is " + imagesize);
    Debug.Log("Bytes content is " + bytedebug);
}
C++/ObjC code:
int _getjpeg(const char* url, unsigned char** returnbytes) {
    ALAsset* asset = [_pictures objectForKey:[NSString stringWithUTF8String:url]];
    if (asset != NULL)
        NSLog(@"_getjpeg() found URL: %@", [NSString stringWithUTF8String:url]);
    else {
        NSLog(@"_getjpeg() could not find URL: %@", [NSString stringWithUTF8String:url]);
        return NULL;
    }
    UIImage* image = [UIImage imageWithCGImage:[asset thumbnail]];
    NSData* pictureData = UIImageJPEGRepresentation(image, 1.0);
    picturesize = (int)[pictureData length];
    unsigned char* bytes = (unsigned char*)[pictureData bytes];
    // This test does not give EXC_BAD_ACCESS
    *returnbytes[5] = (unsigned int)3;
    // updated the initializer in the for loop below, per Eiko's suggestion
    for (int i = 0; i < picturesize; i++) {
        // the line below gives EXC_BAD_ACCESS
        *returnbytes[i] = bytes[i];
    }
    NSString* debugstr = [NSString string];
    for (int i = 5000; i < 5020; i++) {
        unsigned char byteint = bytes[i];
        debugstr = [debugstr stringByAppendingString:[NSString stringWithFormat:@"%i ", byteint]];
    }
    NSLog(@"bytes %s", [debugstr UTF8String]);
    return picturesize;
}
Thanks
Keep in mind that the JPEG representation is probably not exactly the same as what you put in, so the length may differ.
In

for (int i; i < picturesize; i++) {
    // *** Not sure I'm doing this correctly ***
    *returnbytes[i] = bytes[i];
}

you forgot to initialize i, so it may start with a garbage value larger than picturesize, in which case the loop won't run at all.
You want unsigned char*, not **. You are passing in a pointer that is already allocated. A ** is for when you pass in a pointer to a variable that is itself a pointer to data, i.e. when the callee will allocate the memory and the caller wants to know about it.
Just pass in unsigned char* and then use
returnbytes[i] = bytes[i];
Alternatively, allocate in the callee and use an out, not a ref.
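To make the suggested fix concrete, here is a minimal sketch of the native side (a sketch, not the poster's final code), assuming the C# declaration is changed to private static extern int _getjpeg(string url, IntPtr thebytes); with the AllocHGlobal buffer passed by value:

int _getjpeg(const char* url, unsigned char* returnbytes)
{
    ALAsset* asset = [_pictures objectForKey:[NSString stringWithUTF8String:url]];
    if (asset == NULL)
        return 0;
    UIImage* image = [UIImage imageWithCGImage:[asset thumbnail]];
    NSData* pictureData = UIImageJPEGRepresentation(image, 1.0);
    // Copy straight into the caller-allocated buffer; with a single level
    // of indirection, plain memcpy (or returnbytes[i] = bytes[i]) is safe.
    memcpy(returnbytes, [pictureData bytes], [pictureData length]);
    return (int)[pictureData length];
}

Since the C# side allocates only maxsize bytes, a real implementation should also guard against the JPEG being larger than that.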
I have some constraints where the addon is built with nan.h and v8 (not the new node-addon-api).
The target function is part of a library. It accepts a std::vector<char> that represents the bytes of an image.
I tried creating an image buffer from Node.js:
const img = fs.readFileSync('./myImage.png');
myAddonFunction(Buffer.from(img));
I am not really sure how to continue from here. I tried creating a new vector from the buffer, like so:
std::vector<char> buffer(data);
But it seems like I need to give it a size, which I am unsure how to get. Regardless, even when I use the initial buffer size (from Node.js), the image fails to go through:
terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
[1] 16021 abort (core dumped)
However, when I read the image directly from C++, it all works fine:
std::ifstream ifs ("./myImage.png", std::ios::binary|std::ios::ate);
std::ifstream::pos_type pos = ifs.tellg();
std::vector<char> buffer(pos);
ifs.seekg(0, std::ios::beg);
ifs.read(&buffer[0], pos);
// further below, I pass "buffer" to the function and it works just fine.
But of course, I need the image to come from Node.js. Maybe Buffer is not what I am looking for?
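Since the constraint is nan.h and v8, a minimal sketch of the Buffer-to-vector conversion under those constraints might look like the following (the entry-point name and the library call are placeholders):

#include <nan.h>
#include <vector>

NAN_METHOD(MyAddonFunction) {
    if (info.Length() < 1 || !node::Buffer::HasInstance(info[0])) {
        return Nan::ThrowTypeError("Expected a Buffer");
    }
    // node::Buffer exposes both the byte pointer and the missing size.
    char*  data   = node::Buffer::Data(info[0]);
    size_t length = node::Buffer::Length(info[0]);
    // The two-iterator constructor copies the bytes into the vector.
    std::vector<char> image(data, data + length);
    // imageLibraryFunction(image);  // hypothetical external library call
}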
Here is an example based on N-API; I would also encourage you to take a look at a similar implementation based on node-addon-api (an easy-to-use C++ wrapper on top of N-API):
https://github.com/nodejs/node-addon-examples/tree/master/array_buffer_to_native/node-addon-api
#include <assert.h>
#include <node_api.h>
#include <stdio.h>

napi_value CArrayBuffSum(napi_env env, napi_callback_info info)
{
    napi_status status;
    const size_t MaxArgExpected = 1;
    napi_value args[MaxArgExpected];
    size_t argc = sizeof(args) / sizeof(napi_value);

    status = napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    assert(status == napi_ok);
    if (argc < 1)
        napi_throw_error(env, "EINVAL", "Too few arguments");

    napi_value buff = args[0];
    napi_valuetype valuetype;
    status = napi_typeof(env, buff, &valuetype);
    assert(status == napi_ok);
    if (valuetype == napi_object)
    {
        bool isArrayBuff = 0;
        status = napi_is_arraybuffer(env, buff, &isArrayBuff);
        assert(status == napi_ok);
        if (isArrayBuff != true)
            napi_throw_error(env, "EINVAL", "Expected an ArrayBuffer");
    }

    int32_t *buff_data = NULL;
    size_t byte_length = 0;
    int32_t sum = 0;
    status = napi_get_arraybuffer_info(env, buff, (void **)&buff_data, &byte_length);
    assert(status == napi_ok);
    printf("\nC: Int32Array size = %d, (ie: bytes=%d)",
           (int)(byte_length / sizeof(int32_t)), (int)byte_length);

    for (size_t i = 0; i < byte_length / sizeof(int32_t); ++i)
    {
        sum += *(buff_data + i);
        printf("\nC: Int32ArrayBuff[%d] = %d", (int)i, *(buff_data + i));
    }

    napi_value rcValue;
    napi_create_int32(env, sum, &rcValue);
    return (rcValue);
}
The JavaScript code to call the addon
'use strict'
const myaddon = require('bindings')('mync1');

function test1() {
    const array = new Int32Array(10);
    for (let i = 0; i < 10; ++i)
        array[i] = i * 5;
    const sum = myaddon.ArrayBuffSum(array.buffer);
    console.log();
    console.log(`js: Sum of the array = ${sum}`);
}
test1();
The output of the code execution:
C: Int32Array size = 10, (ie: bytes=40)
C: Int32ArrayBuff[0] = 0
C: Int32ArrayBuff[1] = 5
C: Int32ArrayBuff[2] = 10
C: Int32ArrayBuff[3] = 15
C: Int32ArrayBuff[4] = 20
C: Int32ArrayBuff[5] = 25
C: Int32ArrayBuff[6] = 30
C: Int32ArrayBuff[7] = 35
C: Int32ArrayBuff[8] = 40
C: Int32ArrayBuff[9] = 45
js: Sum of the array = 225
I am trying to initialize the Vulkan API.
The problem I am having is that I get an access violation error after I call vkCreateInstance, and I think the problem comes from the extension and layer lists.
I am using a char buff[20][256] to transfer them from strings to the structure for the API call, and the layer and extension names I see in the debugger (3 extensions and 15 layers) are all much shorter than 256 characters and are all null-terminated.
There is no buffer overflow with the extension or layer names, yet it crashes.
I obtained the layer and extension string lists beforehand through vkEnumerateInstanceExtensionProperties and vkEnumerateInstanceLayerProperties, and they are all valid null-terminated strings like "VK_KHR_surface", etc.
Is it possible that even though it says I support some extensions, I don't really support them, and the API crashes when it tries to initialize an extension I don't support?
void InitializeInstance(void** instance, const vector<string>& layers, const vector<string>& extensions)
{
    VkApplicationInfo applicationInfo;
    VkInstanceCreateInfo instanceInfo;
    VkInstance* instanceOut = (VkInstance*)instance;

    applicationInfo.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    applicationInfo.pNext = nullptr;
    applicationInfo.pApplicationName = "MyApp";
    applicationInfo.pEngineName = "MyEngine";
    applicationInfo.engineVersion = 1;
    applicationInfo.apiVersion = VK_API_VERSION_1_0;

    instanceInfo.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    instanceInfo.pNext = nullptr;
    instanceInfo.flags = 0;
    instanceInfo.pApplicationInfo = &applicationInfo;

    char buffLayer[20][256];
    char buffExt[20][256];

    if (!layers.empty())
    {
        instanceInfo.enabledLayerCount = layers.size();
        for (int i = 0; i < layers.size(); i++)
        {
            strcpy(buffLayer[i], layers[i].c_str());
        }
        instanceInfo.ppEnabledLayerNames = (char**)buffLayer;
    }
    else
    {
        instanceInfo.enabledLayerCount = 0;
        instanceInfo.ppEnabledLayerNames = nullptr;
    }

    if (!extensions.empty())
    {
        instanceInfo.enabledExtensionCount = extensions.size();
        for (int i = 0; i < extensions.size(); i++)
        {
            strcpy(buffExt[i], extensions[i].c_str());
        }
        instanceInfo.ppEnabledExtensionNames = (char**)buffExt;
    }
    else
    {
        instanceInfo.enabledExtensionCount = 0;
        instanceInfo.ppEnabledExtensionNames = nullptr;
    }

    vkCreateInstance(&instanceInfo, nullptr, instanceOut);
}
When I have 0 extensions AND 0 layers, the instance is created successfully. If either count is non-zero, it crashes.
char buffLayer[20][256];
instanceInfo.ppEnabledLayerNames = (char**)buffLayer;
ppEnabledLayerNames is supposed to be an array of pointers to character arrays. But you're passing it a 2D array of characters, which is effectively just a flat array of 20*256 characters.
If you're on a machine with 32-bit pointers, the driver is going to take the first four bytes of buffLayer and treat them as a pointer to a character array. But you've just stored the first four characters of a layer name there, and 'VK_K' is probably not going to be a valid pointer value :). So the loader will crash when it tries to dereference that invalid pointer.
Probably the simplest change would be to add:
char* layerNames[20];
for (int i = 0; i < 20; i++)
    layerNames[i] = &buffLayer[i][0];
and pass layerNames as ppEnabledLayerNames.
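Alternatively, a minimal sketch that avoids the fixed-size copies entirely (assuming the layers vector outlives the vkCreateInstance call, which it does here since it is a parameter):

// Inside InitializeInstance(): build the pointer array straight from the strings.
std::vector<const char*> layerNames;
for (const std::string& layer : layers)
    layerNames.push_back(layer.c_str());
instanceInfo.enabledLayerCount   = (uint32_t)layerNames.size();
instanceInfo.ppEnabledLayerNames = layerNames.empty() ? nullptr : layerNames.data();

The same pattern applies to the extension list.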
const int bookBoatNum = 10;
Wt::WPushButton *buttonBookBoat[bookBoatNum];
Wt::WDialog *dialogBookBoat[bookBoatNum];

for (int i = 1; i < bookBoatNum; i++) {
    dialogBookBoat[i] = new Wt::WDialog("Book Boat");
    buttonBookBoat[i] = new Wt::WPushButton();
    buttonBookBoat[i]->clicked().connect(std::bind([&dialogBookBoat, i]() {
        dialogBookBoat[i]->show();
    }));
}
The program compiles and runs, but when I click on a WPushButton object it crashes on the third-to-last line with a memory error. This code works perfectly if buttonBookBoat and dialogBookBoat are single objects rather than arrays of objects. show() is a method that displays the dialog object.
Any help is appreciated; this error has been driving me crazy and my life is on the line with this code (not really).
const int bookBoatNum = 10;
Wt::WPushButton *buttonBookBoat[bookBoatNum];
Wt::WDialog *dialogBookBoat[bookBoatNum];

for (int i = 1; i < bookBoatNum; i++) {
    Wt::WDialog *tempDialog = new Wt::WDialog("Book Boat");
    dialogBookBoat[i] = tempDialog;
    buttonBookBoat[i] = new Wt::WPushButton();
    // Capture the dialog pointer by value, not the local array by reference.
    buttonBookBoat[i]->clicked().connect(std::bind([tempDialog]() {
        tempDialog->show();
    }));
}
dialogBookBoat is a local array and the original lambda captured it by reference, so it no longer exists by the time the click handler runs after the function has returned. Capturing the pointer itself by value avoids the dangling reference.
They seem to all get autoreleased the moment I create them =s
void SceneView::createAnimation(KillerRabbit* killerRabbit, std::string animation) {
    CCArray* animFrames = CCArray::createWithCapacity(15);
    int first = std::stoi(killerRabbit->spriteSheetMap[animation]["FIRST"]);
    int last = std::stoi(killerRabbit->spriteSheetMap[animation]["LAST"]);
    char str[100] = {0};

    for (int i = first; i <= last; i++) {
        // Obtain frames by alias name
        sprintf(str, (killerRabbit->spriteSheetMap[animation]["KEY"] + "[%d].png").c_str(), i);
        CCSpriteFrame* frame = sharedSpriteFrameCache->spriteFrameByName(str);
        animFrames->addObject(frame);
    }

    spriteAnimationsMap[killerRabbit->spriteName][animation] = CCAnimation::createWithSpriteFrames(animFrames, 0.1f);

    // 14 frames * 0.1 sec per frame
    rabbitSprites[killerRabbit->spriteName][animation]->
        runAction(CCRepeatForever::create(CCAnimate::create(spriteAnimationsMap[killerRabbit->spriteName][animation])));
}
If I omit this part of the code:
rabbitSprites[killerRabbit->spriteName][animation]->
runAction(CCRepeatForever::create(CCAnimate::create(spriteAnimationsMap[killerRabbit->spriteName][animation])));
and later try to access the object in:
spriteAnimationsMap[killerRabbit->spriteName][animation]
from another method, the object inside that map has already been autoreleased. How can I retain it so I can use the different animations stored in it at a later time?
Oh, silly me, I had to do this:
spriteAnimationsMap[killerRabbit->spriteName][animation]->retain();
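One follow-up note (not from the original post): under cocos2d-x reference counting, a manual retain() must eventually be balanced by a release(), for example when the entry is removed from the map:

// Balance the earlier retain() before discarding the animation, or it leaks.
spriteAnimationsMap[killerRabbit->spriteName][animation]->release();
spriteAnimationsMap[killerRabbit->spriteName].erase(animation);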
I have a sample project here on github where I created a C++ wrapper for an external C++ library that I want to use in Objective-C.
I don't understand why my returned pointers are sometimes correct and sometimes wrong. Here's sample output:
Test Data = 43343008
In Compress 43343008
Returned Value = 43343008
Casted Value = 43343008
Test Data = 2239023
In Compress 2239023
Returned Value = 2239023
Casted Value = 2239023
Test Data = 29459973
In Compress 29459973
Returned Value = 29459973
Casted Value = l.remote
Test Data = 64019670
In Compress 64019670
Returned Value =
Casted Value = stem.syslog.master
In the above output you can see that the 1st and 2nd clicks of the button produce the results I was expecting. On each of the other clicks, either the returned value or the casted value is invalid. I'm assuming this is because my pointer points to an address I wasn't expecting. When running the app multiple times, any button click could be right or wrong.
I also tried with a single thread but experienced similar results.
The complete code is on GitHub, but here are the important bits.
ViewController.m
#import "ViewController.h"
extern const char * CompressCodeData(const char * strToCompress);
#implementation ViewController
...
// IBAction on the button
- (IBAction)testNow:(id)sender
{
[self performSelectorInBackground:#selector(analyze) withObject:nil];
}
- (void)analyze
{
#synchronized(self) {
const char *testData = [[NSString stringWithFormat:#"%d",
(int)(arc4random() % 100000000)] UTF8String];
NSLog(#"Test Data = %s", testData);
const char *compressed = CompressCodeData(testData);
NSLog(#"Returned Value = %s", compressed);
NSString *casted = [NSString stringWithCString:compressed
encoding:NSASCIIStringEncoding];
NSLog(#"Casted Value = %#\n\n", casted);
}
}
#end
SampleWrapper.cpp
#include <iostream>
#include <string.h>
#include <CoreFoundation/CoreFoundation.h>

using namespace std;

extern "C"
{
    extern void NSLog(CFStringRef format, ...);

    /**
     * This function simply wraps a library function so that
     * it can be used in Objective-C.
     */
    const char * CompressCodeData(const char * strToCompress)
    {
        const string s(strToCompress);
        // Omitted call to static method in C++ library
        // to simplify this test case.
        //const char *result = SomeStaticLibraryFunction(s);
        const char *result = s.c_str();
        NSLog(CFSTR("In Compress %s"), result);
        return result;
    }
}
You are returning a pointer to an object that has been deallocated.
const string s(strToCompress);
…
const char *result = s.c_str();
NSLog(CFSTR("In Compress %s"), result);
return result;
s does not exist after the CompressCodeData() function returns, so the pointer to its internal memory is invalid.
You could allocate a chunk of memory to hold the response, but it would be up to the caller to release it.
char *compressed = CompressCodeData(testData);
NSLog(@"Returned Value = %s", compressed);
NSString *casted = [NSString stringWithCString:compressed
                                       encoding:NSASCIIStringEncoding];
free(compressed);
NSLog(@"Casted Value = %@\n\n", casted);
…
char * CompressCodeData(const char * strToCompress)
…
char *result = strdup(s.c_str());
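Pulling those fragments together, a minimal sketch of the strdup variant (the std::string here stands in for the omitted library call):

#include <string>
#include <cstring>

extern "C" char * CompressCodeData(const char * strToCompress)
{
    const std::string s(strToCompress);  // stand-in for the library result
    // strdup makes a heap copy that survives the call; the caller must free() it.
    return strdup(s.c_str());
}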
Another solution is to pass in the memory to store the data into.
char compressed[2048]; // Or whatever!
CompressCodeData(testData, compressed, sizeof(compressed));
NSLog(@"Returned Value = %s", compressed);
NSString *casted = [NSString stringWithCString:compressed
                                       encoding:NSASCIIStringEncoding];
NSLog(@"Casted Value = %@\n\n", casted);
…
void CompressCodeData(const char * strToCompress, char *result, size_t size)
…
s.copy(result, size - 1);
result[s.length() < size ? s.length() : size-1] = '\0';
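And a consolidated sketch of the caller-allocated variant (again with the std::string standing in for the omitted library call):

#include <string>
#include <cstddef>

extern "C" void CompressCodeData(const char * strToCompress, char *result, size_t size)
{
    const std::string s(strToCompress);   // stand-in for the library result
    // string::copy returns the number of characters copied (at most size - 1),
    // so the terminator always lands right after the copied data.
    size_t copied = s.copy(result, size - 1);
    result[copied] = '\0';
}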