I'm supplying a C++ .dll to a user who is writing an installer via an NSIS script. Using System.dll, the user can call my .dll like this:
System::Call 'my.dll::GetJson(v) t .r0'
DetailPrint $0
The return value of GetJson() gets stored in $0. This all works correctly, but GetJson() may return a JSON blob whose length is > 8192, in which case the value stored in $0 gets truncated.
I looked at trying to increase NSIS_MAX_STRLEN by building NSIS myself using scons, as mentioned here: https://nsis.sourceforge.io/Special_Builds
scons NSIS_MAX_STRLEN=16384 PREFIX=C:\somewhere install-compiler install-stubs
However, after doing this, the NSIS-compiled .exes crashed upon running. It seems like 8192 may be some kind of memory limitation.
Is there any way around this for me? For example, would it be possible to call
System::Call 'my.dll::GetJson(v) t .r0'
But instead of the return value being stored in $0, have it be split into chunks? Perhaps it's possible to write the contents of GetJson() to a file first, and then NSIS can read that and split it?
Any help is appreciated. Thank you.
If the user needs to edit a very long string, you basically have two options:
Use the System plug-in to fill a text field on an nsDialogs custom page. You can't use the registers to store the string; you need to use ...func()p.r0 to get the raw address of the string from your plug-in and use SendMessage to fill the text field. To save, you need to allocate memory, get the text with SendMessage, write it to a file, and finally free the memory. (A sketch of the DLL side of this is below.)
The other option is to create the custom page with your custom plug-in.
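For the first option, the DLL side could look roughly like the sketch below: GetJson() returns a raw pointer to a buffer that stays valid after the call, so the NSIS script receives an address via p .r0 instead of a truncated copy. The static buffer, the calling convention, and the BuildJsonSomehow() helper are all assumptions for illustration, not something stated in the question.

```cpp
#include <string>

std::string BuildJsonSomehow();   // hypothetical helper that produces the JSON blob

static std::string g_json;        // keeps the blob alive after GetJson() returns

extern "C" __declspec(dllexport) const char* __stdcall GetJson()
{
    g_json = BuildJsonSomehow();
    return g_json.c_str();        // NSIS side: System::Call 'my.dll::GetJson(v) p .r0'
}
```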
I have generated a GeoTiff dataset in-memory using GDALTranslate() with a /vsimem/ filepath. I need access to the buffer for the actual GeoTiff file to put it in a stream for an external API. My understanding is that this should be possible with VSIGetMemFileBuffer(), however I can't seem to get this to return anything other than nullptr.
My code is essentially as follows:
//^^ GDALDataset* srcDataset created somewhere up here ^^
//psOptions struct has "-b 4" and "-of GTiff" settings.
const char* filep = "/vsimem/foo.tif";
GDALDataset* gtiffData = GDALTranslate(filep, srcDataset, psOptions, nullptr);
vsi_l_offset size = 0;
GByte* buf = VSIGetMemFileBuffer(filep, &size, true); //<-- returns nullptr
gtiffData seems to be a real dataset on inspection; it has all the appropriate properties (number of bands, raster size, etc.). When I provide a real filesystem location to GDALTranslate() rather than the /vsimem/ path and load the result in QGIS, it renders correctly too.
Looking at the source for VSIGetMemFileBuffer(), it should really only return nullptr if the file can't be found. This suggests I'm using it incorrectly. Does anyone know what the correct usage is?
Bonus points: Is there a better way to do this (stream the file out)?
Thanks!
I don't know anything about the C++ API, but in Python the snippet below is what I sometimes use to get the contents of an in-memory file. In my case mainly VRTs, but it shouldn't be any different for other formats.
That said, I don't know whether the VSI API translates one-to-one to C++.
from osgeo import gdal
filep = "/vsimem/foo.tif"
# get the file size
stat = gdal.VSIStatL(filep, gdal.VSI_STAT_SIZE_FLAG)
# open file
vsifile = gdal.VSIFOpenL(filep, 'r')
# read entire contents
vsimem_content = gdal.VSIFReadL(1, stat.size, vsifile)
# close the in-memory file handle
gdal.VSIFCloseL(vsifile)
In the case of a VRT the content would be text, shown with something like print(vsimem_content.decode()). For a TIFF it would of course be binary data.
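For completeness, here is a rough C++ equivalent of the Python snippet above, assuming the /vsimem/ file has already been fully written. It is not taken from the thread; it just uses the VSI*L calls from cpl_vsi.h in the same way.

```cpp
#include "cpl_vsi.h"
#include <vector>

std::vector<GByte> ReadVsimemFile(const char* filep)
{
    VSIStatBufL stat;
    if (VSIStatL(filep, &stat) != 0)
        return {};                          // file not found in /vsimem/

    VSILFILE* fp = VSIFOpenL(filep, "rb");
    if (fp == nullptr)
        return {};

    std::vector<GByte> data(static_cast<size_t>(stat.st_size));
    const size_t nRead = VSIFReadL(data.data(), 1, data.size(), fp);
    VSIFCloseL(fp);

    data.resize(nRead);                     // trim if the read came up short
    return data;
}
```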
I came back to this after putting in a workaround, and upon swapping things back over it seems to work fine. @mmomtchev suggested looking at the CPL_DEBUG output, which showed nothing unusual (and was silent during the actual VSIGetMemFileBuffer call).
In particular, for other reasons I had to put a GDALWarp call in between calling GDALTranslate and accessing the buffer, and it seems that this is what makes the difference. My guess is that GDALWarp is calling VSIFOpenL internally - although I can't find this in the source - and this does some kind of initialisation for VSIGetMemFileBuffer. Something to try for anyone else who encounters this.
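If anyone wants something concrete to try, the sketch below is a guess along those lines and is not confirmed by this thread: closing the dataset returned by GDALTranslate() forces the GTiff driver to finish writing the /vsimem/ file before the buffer is taken, and passing TRUE as the third argument means the caller then owns the buffer and frees it with CPLFree(). srcDataset and psOptions are as in the question.

```cpp
#include "cpl_vsi.h"
#include "gdal.h"
#include "gdal_utils.h"

// srcDataset (GDALDataset*) and psOptions (GDALTranslateOptions*) as in the question
const char* filep = "/vsimem/foo.tif";
GDALDatasetH hOut = GDALTranslate(filep, srcDataset, psOptions, nullptr);
GDALClose(hOut);                                       // flush everything to /vsimem/foo.tif

vsi_l_offset size = 0;
GByte* buf = VSIGetMemFileBuffer(filep, &size, TRUE);  // TRUE: take ownership, unlink the file
// ... stream buf/size to the external API ...
CPLFree(buf);                                          // we own the buffer after seizing it
```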
I have code in this format:
srcSAXController control(input_filename.c_str());
std::string output_filename = input_filename;
output_filename = "c-" + output_filename.erase(input_filename.rfind(XML_STR));
std::ofstream myfile(output_filename.c_str());
coverage_handler handler(i == MAIN_POS ? true : false, output_filename);
control.parse(&handler);
myfile.write((char *)&control, sizeof(control));
myfile.close();
I want the content of the 'control' object to be written to my file. How do I fix the code above so that the content of the control object is written to the file?
In general you need much more than just writing the object's bytes to be able to save and reload it.
The problem is called "serialization", and depending on a lot of factors there are several strategies.
For example, it's important to know whether you need to save and reload the object on the same system or whether you may need to reload it on a different system. It's also fundamental to know whether the object contains links to other objects, whether the link graph is a simple tree or may contain loops, whether you need to support versioning, etc.
Writing the bytes to disk like the code is doing is not going to work even for something as simple as an object containing an std::string.
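As a minimal illustration of field-by-field serialization, here is a sketch for a hypothetical object with an int and a std::string member; it is not the coverage_handler/srcSAXController from the question, just the pattern of writing each field explicitly with a length prefix for the string.

```cpp
#include <cstdint>
#include <fstream>
#include <string>

struct Record {
    std::uint32_t count = 0;
    std::string name;
};

void save(const Record& r, std::ostream& out) {
    out.write(reinterpret_cast<const char*>(&r.count), sizeof(r.count));
    const std::uint32_t len = static_cast<std::uint32_t>(r.name.size());
    out.write(reinterpret_cast<const char*>(&len), sizeof(len));   // length prefix
    out.write(r.name.data(), len);                                 // then the bytes
}

void load(Record& r, std::istream& in) {
    in.read(reinterpret_cast<char*>(&r.count), sizeof(r.count));
    std::uint32_t len = 0;
    in.read(reinterpret_cast<char*>(&len), sizeof(len));
    r.name.resize(len);
    in.read(&r.name[0], len);
}
```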
I would like to know how to use memory buffers as the I/O streams to a system command using Qt.
Normally you would do something pseudocode like:
Exec Command(" command < inputfile > outputfile");
but I would like to do the entire operation in memory.
I would prefer something pseudocode-like:
ByteArray input;
ByteArray output;
Exec Command("command name", &input, &output);
A specific reference, example or link to the answer would be awesome. I just need a starting spot, I think.
Thanks in advance.
One way to do that would be to create memory-mapped input and output files and specify the full paths to the input and output files in a regular shell command - this way it will effectively be in memory.
You can create/access those programmatically; take a look at
Streaming from memory mapped files in Qt
I found out you can use a "shared memory" space and use it like a file reference. Once you have the file reference, you can use redirection with it. This is one solution anyway.
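If a file (even a memory-mapped or shared-memory one) is not wanted at all, QProcess can keep the whole round trip in memory, which is close to the pseudocode in the question. This is a different approach from the two answers above; the sketch assumes the child program reads stdin and writes stdout, and "command" is a placeholder name.

```cpp
#include <QByteArray>
#include <QProcess>
#include <QStringList>

// Feed `input` to the command's stdin and capture its stdout, all in memory.
QByteArray runFilter(const QByteArray& input)
{
    QProcess proc;
    proc.start("command", QStringList());   // in-memory equivalent of: command < in > out
    proc.waitForStarted();

    proc.write(input);            // becomes the child's stdin
    proc.closeWriteChannel();     // signal EOF so the child can finish

    proc.waitForFinished();
    return proc.readAllStandardOutput();    // the child's stdout
}
```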
This is my first post here, so please bear with me.
I have searched high and low on the internet for an answer, but I've not been able to resolve my issue, so I have decided to write a post here.
I am trying to write (append) to a JSON array on file using C++ and Jzon, at intervals of one write each second. The JSON file is initially written by a “Prepare” function. Another function is then called each second to add an array to the JSON file and append a new object to the array every second.
I have tried many things, most of which resulted in all sorts of issues. My latest attempt gave me the best results, and that is the code I have included below. However, the approach I took is very inefficient, as I am writing an entire array every second. This is having a massive hit on CPU utilisation as the array grows, but not so much on memory as I had first anticipated.
What I would really like to be able to do is append to an existing array contained in a JSON file on disk, line by line, rather than having to clear the entire array from the JSON object and rewrite the entire file every second.
I am hoping that some of the geniuses on this website will be able to point me in the right direction.
Thank you very much in advance.
Here is my code:
//Create some object somewhere at the top of the cpp file
Jzon::Object jsonFlight;
Jzon::Array jsonFlightPath;
Jzon::Object jsonCoordinates;
int PrepareFlight(const char* jsonfilename) {
//...SOME PREPARE FUNCTION STUFF GOES HERE...
//Add the Flight Information to the jsonFlight root JSON Object
jsonFlight.Add("Flight Number", flightnum);
jsonFlight.Add("Origin", originicao);
jsonFlight.Add("Destination", desticao);
jsonFlight.Add("Pilot in Command", pic);
//Write the jsonFlight object to a .json file on disk. Filename is passed in as a param of the function.
Jzon::FileWriter::WriteFile(jsonfilename, jsonFlight, Jzon::NoFormat);
return 0;
}
int UpdateJSON_FlightPath(ACFT_PARAM* pS, const char* jsonfilename) {
//Add the current returned coordinates to the jsonCoordinates jzon object
jsonCoordinates.Add("altitude", pS-> altitude);
jsonCoordinates.Add("latitude", pS-> latitude);
jsonCoordinates.Add("longitude", pS-> longitude);
//Add the Coordinates to the FlightPath then clear the coordinates.
jsonFlightPath.Add(jsonCoordinates);
jsonCoordinates.Clear();
//Now add the entire flightpath array to the jsonFlight object.
jsonFlight.Add("Flightpath", jsonFlightPath);
//write the jsonFlight object to a JSON file on disk.
Jzon::FileWriter::WriteFile(jsonfilename, jsonFlight, Jzon::NoFormat);
//Remove the entire jsonFlightPath array from the jsonFlight object to avoid duplication next time the function executes.
jsonFlight.Remove("Flightpath");
return 0;
}
For sure you can do "flat file" storage yourself... but this is a symptom of needing a database. Something very light like SQLite, or mid-weight & open-source like MySQL, Firebird, or PostgreSQL.
But as to your question:
1) Leave the closing ] bracket off, and just keep the file open & appending (a sketch of this appears after this list) -- but if you don't close the file correctly, it will be damaged & need repair to be readable.
2) Your current option -- writing a complete file each time -- isn't safe from data loss either, as the moment you "open to overwrite" you lose all data previously stored in the file. The workaround here is to rename the old file as a backup before you start writing.
With the first option you should also make backup copies of your file (say at daily intervals). Otherwise data loss is likely to occur eventually -- on Ctrl-C, power loss, program error or system crash.
Of course if you use any of SQLite, MySQL, Firebird or PostgreSQL, all the data-integrity problems will be handled for you.
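Here is a rough sketch of option 1, using a plain std::ofstream and hand-built JSON for the append path rather than Jzon, just to show the pattern. The field names are taken from the question; the opening bracket is written once, each coordinate object is appended as it arrives, and the closing ] is only written when the flight ends.

```cpp
#include <fstream>

std::ofstream pathFile;   // kept open for the whole flight

void PreparePath(const char* filename) {
    pathFile.open(filename, std::ios::out | std::ios::trunc);
    pathFile << "[\n";                      // opening bracket only; no closing ] yet
}

void AppendCoordinate(double altitude, double latitude, double longitude,
                      bool firstEntry) {
    if (!firstEntry)
        pathFile << ",\n";                  // separator before every entry but the first
    pathFile << "{\"altitude\":" << altitude
             << ",\"latitude\":" << latitude
             << ",\"longitude\":" << longitude << "}";
    pathFile.flush();                       // limit loss if the program dies mid-flight
}

void FinishPath() {
    pathFile << "\n]\n";                    // closing bracket written once, at the end
    pathFile.close();
}
```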
I created a .properties file that contains a few simple key = value pairs.
I tried it out from a sample C++ console application, using imported Java classes, and I was able to access it, no problem.
Now, I am trying to use it in the same way from a C++ DLL, which is being called by another (unmanaged) C++ project.
For some reason, the file is not being accessed.
Maybe my file location is wrong. Where should I be storing it?
What else might be the issue?
TIA
As you mention "DLL", I guess you are using MS Windows. Finding a file there from a DLL, independently of the logged-on user, is a restricted matter. The best way is to store the file in a path assembled from the environment variable ALLUSERSPROFILE. This is the only location that is the same for all users and where all users usually have write access. Your application's data should reside in a private subdirectory named something like <MyCompany> or <MyApplicationsName>. Type
echo %ALLUSERSPROFILE%
on a windows command line prompt to find out the actual location on a machine.
Store your data in i.e.:
%ALLUSERSPROFILE%\MyApp\
Your DLL can then query the location of ALLUSERSPROFILE using getenv:
char *allUsersData = getenv("ALLUSERSPROFILE");
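Building on that, here is a small sketch of the lookup from the DLL side; "MyApp" and "settings.properties" are placeholder names, not taken from the question.

```cpp
#include <cstdlib>
#include <fstream>
#include <string>

std::ifstream openProperties()
{
    const char* allUsersData = std::getenv("ALLUSERSPROFILE");
    std::string path = allUsersData ? allUsersData : ".";
    path += "\\MyApp\\settings.properties";   // e.g. C:\ProgramData\MyApp\settings.properties
    return std::ifstream(path);               // caller checks is_open()
}
```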