I am trying to include a huge string in my C++ program. Its size is 20598617 characters, and I am using #define to achieve it. I have a header file which contains this statement:
#define "<huge string containing 20598617 characters>"
When I try to compile the program, I get the error fatal error C1060: compiler is out of heap space
I tried the following command-line options with no success:
/Zm200
/Zm1000
/Zm2000
How can I compile this program successfully?
Platform: Windows 7
You can't, not reliably. Even if it does compile, it's liable to break the runtime library, OS assumptions, and so forth.
If you tell us why you're trying to do it, we can offer lots of alternatives. Deciding how to handle arbitrarily large data is a major part of programming.
Edited to add:
Rather than guess, I looked into MSDN:
Prior to adjacent strings being concatenated, a string cannot be longer than 16380 single-byte characters.
A Unicode string of about one half this length would also generate this error.
The page concludes:
You may want to store exceptionally large string literals (32K or more) in a custom resource or an external file.
What do other compilers say?
Further edited to add:
I created a file like this:
char s[] = {'x','x','x','x'};
I kept doubling the occurrences of 'x', testing each one as an #include file.
An 8388608 byte string succeeded; 16777216 bytes failed, with the "out of heap space" error.
I suspect you are running into a design limit on the size of a character string.
Most people really think that a million characters is long enough :-}
To avoid such design limits, I'd try not to put the whole thing into a single literal string. On the suspicion that #define macro bodies have similar limits, I'd try not to put the entire thing into a single #define, either.
Most C compilers will accept pretty big lists of individual characters as initializers. If you write
char c[]={ c1, c2, ... c20598617 };
with the c_i being your individual characters, you may succeed. I've seen GCC2 applications where there were 2 million elements like this (apparently they were loading some type of ROM image). You might even be able to group the c_i into blocks of K characters for K=100, 1000, 10000 as suits your tastes, and that might actually help the compiler.
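Purely for illustration, the generated header might look roughly like this (the values below are made-up placeholders, not your data):
static const char big_data[] = {
    'T','h','i','s',' ','i','s',' ',
    'o','n','l','y',' ','a','n',' ',
    'e','x','a','m','p','l','e','.',
    /* ... roughly 20598617 entries in total, grouped a few per line ... */
};
static const unsigned long big_data_len = sizeof(big_data);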
You might also consider running your string through a compression algorithm, putting the compressed result into your C++ file by any of the above methods, and decompressing it after the program has loaded. I suspect you can get a decompression algorithm into a few thousand bytes.
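A minimal sketch of that idea, assuming zlib is available: compress the string offline with a small tool that calls compress(), embed the resulting bytes with any of the methods above, and inflate them at startup, along these lines:
#include <vector>
#include <zlib.h>

// Decompress a blob that was produced offline by zlib's compress().
std::vector<char> decompress_blob(const unsigned char *compressed,
                                  unsigned long compressed_len,
                                  unsigned long original_len)
{
    std::vector<char> out(original_len);
    uLongf out_len = original_len;
    if (uncompress(reinterpret_cast<Bytef*>(out.data()), &out_len,
                   compressed, compressed_len) != Z_OK)
        out.clear();                     // decompression failed; handle as you see fit
    return out;
}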
Store the string in a file and just open and read it...
It's much cleaner/more organized that way [I'm assuming that right now you have a file named blargh.h which contains that one #define...]
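For example, a minimal sketch of reading the whole file into a std::string at run time (blargh.txt is just a placeholder name):
#include <fstream>
#include <sstream>
#include <string>

std::string load_text(const char *path = "blargh.txt")
{
    std::ifstream in(path, std::ios::binary);   // binary: keep the bytes exactly as stored
    std::ostringstream buf;
    buf << in.rdbuf();                          // pull the entire file into the buffer
    return buf.str();
}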
Um, store the string in a separate resource of some sort and load it in? Seriously, in embedded land, you would have this as a separate resource and not hold it in RAM. On Windows, I believe you can use .dlls or other external resources to handle this for you. Compilers aren't designed to hold resources of this size for you, and they will fail.
Increase the compiler heap space.
If your string comes from a large text or binary file, you may have luck with either the xxd -i command (to get everything in an array, per Ira Baxter's answer) or a variant of the bin2obj command (to get everything into a .o file you can link into the program).
Note that the string may not be null terminated in this case.
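For reference, the output of xxd -i data.txt looks roughly like the following (the array name is derived from the input file name, and the byte values here are placeholders):
unsigned char data_txt[] = {
  0x54, 0x68, 0x69, 0x73, 0x20, 0x69, 0x73, 0x20, /* ... one entry per byte ... */
};
unsigned int data_txt_len = 20598617;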
See answers to the earlier question, "How can I get the contents of a file at build time into my C++ string?"
(Also, as an aside: note the existence of the .xbm format.)
This is a very old question, but since there's no definitive answer yet: C++11's raw string literals seem to do the job.
This compiles nicely on GCC 4.8:
#include <string>
std::string data = R"(
... <1.4 MB of base85-encoded string> ...
)";
As said in other posts in this thread, this is definitely not the preferred way of handling large amounts of data.
Related
In my code the following line gives me data that performs the task it's meant for:
const char *key = "\xf1`\xf8\a\\\x9cT\x82z\x18\x5\xb9\xbc\x80\xca\x15";
The problem is that it gets converted at compile time according to rules that I don't fully understand. How does "\x" work in a string?
What I'd like to do is to get the same result but from a string exactly like that fed in at run time. I have tried a lot of things and looked for answers but none that match closely enough for me to be able to apply.
I understand that \x denotes a hex number. But I don't know in which form that gets 'baked out' by the compiler (gcc).
What does that ` translate into?
Does the "\a" do something similar to "\x"?
This conversion is indeed performed by the compiler, but it is not part of the standard library. That means you are left with three ways:
dynamically write a C++ source file containing the string and code that prints it to standard output; compile it and (provided popen is available) execute it from your main program and read its output. Pretty ugly, isn't it...
use the source of an existing compiler, or directly its internal libraries. Clang is probably a good starting point because it has been designed to be modular. But it could require a good amount of work to find where that damned specific point is coded and how to use it...
just mimic what the compiler does, and write your own parser by hand. It is not that hard, and it will teach you why tests are useful...
If it was not clear until here, I strongly urge you to use the third way ;-)
If you want to translate "escape" codes in strings that you get as input at run-time then you need to do it yourself, explicitly.
One way is to read the input into one string. Then copy the characters from that source string into a new destination string, one by one. If you see a backslash then you discard it, fetch the next character, and if it's an x you can use e.g. std::stoi to convert the next few characters into its corresponding integer value, and append that number to the destination string (either adding it with std::to_string, or using output string streams and the normal "output" operator <<).
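A minimal sketch of that copy-and-decode loop (it only handles \x, \a and \\, appends the decoded byte itself, and omits error handling for malformed input; extend it for the remaining escape sequences as needed):
#include <iostream>
#include <string>

std::string decode_escapes(const std::string &src)
{
    std::string out;
    for (std::size_t i = 0; i < src.size(); ++i) {
        if (src[i] != '\\') { out += src[i]; continue; }
        ++i;                                      // look at the character after '\'
        if (i >= src.size()) break;               // trailing backslash: ignore it
        if (src[i] == 'x') {
            std::size_t used = 0;
            int value = std::stoi(src.substr(i + 1), &used, 16);  // hex digits after \x
            out += static_cast<char>(value);      // append the decoded byte
            i += used;                            // skip the digits we consumed
        } else if (src[i] == 'a') {
            out += '\a';                          // bell, same byte the compiler emits
        } else {
            out += src[i];                        // e.g. "\\" becomes a single backslash
        }
    }
    return out;
}

int main()
{
    std::string raw = R"(\xf1`\xf8\a)";           // escapes as plain text, not compiled
    std::cout << decode_escapes(raw).size() << " bytes decoded\n";
}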
I am working on a very basic REPL for Arduino. To get parameters, I need to split a String into parts, separated by spaces. I do not know how I would store the result. For example, pinmode 1 input would result in a list: "pinmode", 1, "input". The 1 would have to be an int. I have looked at other Stack Overflow answers, but they require a char input.
Don't use String. That's the reason all the other answers use char, commonly called C strings (lower case "s"). String uses "dynamic memory" (i.e., the heap), which is bad on this small microcontroller. It also adds 1.6k to your program size.
The simplest thing to do is save each received character into a char array, until you get the newline character, '\n'. Be sure to add the NUL character at the end, and be sure your array is sized appropriately.
Then process the array using the C string library: strcmp, strtoul, strtok, isdigit, etc. Learning about these routines will really pay off, as it keeps your program small and fast. C strings are easy to print out, as well.
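A minimal sketch of that approach, assuming the usual Arduino Serial API; the pinmode handling is just an example of what you might do with the parsed pieces:
char line[64];                              // sized for the longest expected command
uint8_t len = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  while (Serial.available()) {
    char c = Serial.read();
    if (c == '\n') {
      line[len] = '\0';                     // NUL-terminate the received line
      char *cmd  = strtok(line, " ");       // e.g. "pinmode"
      char *arg1 = strtok(NULL, " ");       // e.g. "1"
      char *arg2 = strtok(NULL, " ");       // e.g. "input"
      if (cmd && arg1 && arg2 && strcmp(cmd, "pinmode") == 0) {
        int pin = strtoul(arg1, NULL, 10);  // the numeric parameter as an int
        pinMode(pin, strcmp(arg2, "input") == 0 ? INPUT : OUTPUT);
      }
      len = 0;                              // ready for the next line
    } else if (len < sizeof(line) - 1) {
      line[len++] = c;                      // accumulate until newline
    }
  }
}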
Again, stay away from String. It is tempting to beginners, because it is easy to understand. However, it has many subtle, complicated and unpredictable ways to make your embedded program fail. This is not a PC with lots of RAM and a swap file on a hard drive.
I have a dll (ANSI C) that has some string literals defined.
__declspec(dllexport) char* GetSomeString()
{
return "This is a test string from TestLib.dll";
}
When compiled, this string is still visible in Notepad, for example. I'm fairly new to C, so I was wondering: is there a way to safely store string literals?
Should I do it with a resx file (for example), that has some encrypted values, or what would be the best way?
Thanks
EDIT 1:
The scenario is basically the following in pseudo code:
if (hostname)
    return hostname;
else
    return "Literal String";
It's this "literal string" that I would like to see "secured" in some way..
Don't put your secrets on anyone else's computer if you want them to stay secret.
See my related answer, The #1 Law of Software Licensing
And Eric Lippert's similar answer
First of all, since your executable¹ needs to decode that literal in memory, any attacker determined enough will be able to do the same; often it's as easy as freezing the process after startup (or after it has used the string we want), creating a memory dump and running utilities like strings over it. There are methods to mitigate the issue (e.g. zeroing the memory used by a sensitive string immediately after using it), but since your code is on a machine where the potential attacker has all the privileges, you can only put up roadblocks: in the end your executable is completely in the attacker's hands.
That being said, if your concern is just "not leaving important strings en plein air" you may just run an executable packer/encrypter over your whole dll. This is as easy as adding a post-build step in your solution, the packer will compress/encrypt the whole executable image and build an executable that when launched will decrypt and run it in memory.
This method has the great advantage of not requiring any change to your code: you just run upx over the compiled dll and you get your compressed dll, no XORs or weird literals spread across your code are needed.
Of course, this is quite weak security (basically it will just protect from snooping around in the executable with notepad or a hex editor), but again, storing critical "secrets" in an executable that is going to be distributed is a bad idea in first place.
¹ Throughout this answer, "executable" is to be taken in the wide sense, i.e. DLLs are included as well.
You probably want to store hardcoded passwords in the library, right? You can XOR the string with some value and store it, then read it and XOR it again. It's the simplest way, but it doesn't protect your string from any kind of disassembling/reverse engineering.
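A minimal sketch of that XOR idea (0x5A is just an arbitrary example key; since (c ^ key) ^ key == c, the same routine both obfuscates and restores the text):
#include <stddef.h>

static void xor_buffer(char *buf, size_t len, unsigned char key)
{
    for (size_t i = 0; i < len; ++i)
        buf[i] ^= key;                  /* flips the same bits in both directions */
}

/* Offline, in a tiny helper tool, run xor_buffer over the literal and paste
   the resulting bytes into the source as a byte array.  At run time, copy
   that array into a writable buffer and call xor_buffer with the same key
   to recover the original string.  Remember this only stops casual
   inspection with a text editor, not a determined attacker. */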
I have code which runs lots of loops updating a single string.
Finally I want that string to be stored in a file.
Currently I am printing that string to the console.
I can use an ofstream and write that to a file instead of the console.
Instead of using a string that gets updated, write directly to the file stream.
Use a string stream instead, and finally copy that string stream to the file stream and write it to a file.
After the update of the string is complete, write it to a file stream all at once.
The std::string::max_size in my compiler is 4294967257, and the maximum size of the string that I could generate is approximately half of that max_size.
Note: I am using Solaris Unix.
What is the most performant way to write this string to a file?
There's only one way to know the answer: you have to profile it for your case. You can easily do this by measuring how long it takes to generate the file.
Consider all the scenarios and benchmark the timings.
NOTE: The fastest will generally be the one that stays closest to memory.
Try to reuse the same std::string object as much as possible, using reserve and clear. The string will cache its memory allocation. Strings inside {} will make a new allocation each time you enter the block.
Be careful of hidden temporary string objects; for example, a + b when a is a std::string will create a temporary std::string object with a new allocation. Prefer += to concatenate strings.
Use C code to perform conversions. Create a local char buffer and use sprintf etc. They are faster than stringstreams but it's easier to make a mistake so be careful.
Use "\n" instead of std::endl when writing to a file, as the latter causes a flush.
Avoid stringstreams like the plague. They are slow, at least in the Visual Studio implementation. I've tried them for hardcore text processing and I know.
Of course this assumes performance IS an issue for you. If you are doing light work, stringstreams could be the easiest solution. I prefer them when I am not doing hardcore work as there is much less chance of a bug.
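A minimal sketch pulling those tips together: one reused std::string with reserve(), += for concatenation, a local char buffer with snprintf for number formatting, and "\n" instead of std::endl (output.txt and the loop body are placeholders):
#include <cstdio>
#include <fstream>
#include <string>

int main()
{
    std::ofstream out("output.txt");
    std::string line;
    line.reserve(256);                 // allocate once, reuse every iteration

    char buf[32];
    for (int i = 0; i < 1000000; ++i) {
        line.clear();                  // keeps the reserved capacity
        std::snprintf(buf, sizeof buf, "%d", i);
        line += "value=";
        line += buf;
        line += "\n";                  // unlike std::endl, this does not flush
        out << line;
    }
}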
I have some values I want to find in a large (> 500 MB) text file using C++ or C. I know that a possible matching value can only exist at the very beginning of each line and that its length is exactly ten characters. Okay, I can read the whole file line by line, searching for the value with substr() or a regexp, but that is a little bit ugly and very slow. I have considered using an embedded database (e.g. Berkeley DB), but the file I want to search is very dynamic and I see a problem with importing it into the database every time. Due to a memory limit it is not possible to load the whole file into memory at once. Many thanks in advance.
This doesn't seem well suited to C/C++. Since the problem is defined by the need to parse whole lines of text and perform pattern matching on the first 10 chars, something interpreted, such as Python or Perl, would seem to be simpler.
How about:
pattern = '0123456789'  # <-- replace with pattern
with open('myfile.txt') as f:
    for line in f:
        if line.startswith(pattern):
            print("Eureka!")
I don't see how you're going to do this faster than using the stdio library, reading each line in turn into a buffer, and using strchr, strcmp, strncmp or some such. Given the description of your problem, that's already fairly optimal. There's no magic that will avoid the need to go through the file line by line looking for your pattern.
That said, regular expressions are almost certainly not needed here if you're dealing with a fixed pattern of exactly ten characters at the start of a line -- that would be needlessly slow and I wouldn't use the regex library.
If you really, really need to beat the last few microseconds out of this, and the pattern is literally constant and at the start of a line, you might be able to do a memchr on read-in buffers looking for "\npattern" or some such (that is, including the newline character in your search), but you make it sound like the pattern is not precisely constant. Assuming it is not precisely constant, the most obvious method (see the first paragraph) is the most obvious thing to do.
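A minimal sketch of that straightforward line-by-line approach with the stdio library (bigfile.txt and the 10-character pattern are placeholders):
#include <cstdio>
#include <cstring>

int main()
{
    const char *pattern = "0123456789";        // the 10-character value sought
    char line[4096];                           // longer than any expected line
    std::FILE *f = std::fopen("bigfile.txt", "r");
    if (!f) return 1;

    long lineno = 0;
    while (std::fgets(line, sizeof line, f)) {
        ++lineno;
        if (std::strncmp(line, pattern, 10) == 0)
            std::printf("match at line %ld\n", lineno);
    }
    std::fclose(f);
    return 0;
}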
If you have a large number of values that you are looking for then you want to use Aho-Corasick. This algorithm allows you to create a single finite state machine that can search for all occurrences of any string in a set simultaneously. This means that you can search through your file a single time and find all matches of every value you are looking for. The wikipedia link above has a link to a C implementation of Aho-Corasick. If you want to look at a Go implementation that I've written you can look here.
If you are looking for a single or a very small number of values then you'd be better off using Boyer-Moore. Although in this case you might want to just use grep, which will probably be just as fast as anything you write for this application.
How about using memory-mapped files for the search?
http://beej.us/guide/bgipc/output/html/multipage/mmap.html
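A rough sketch of that idea on POSIX (the OS pages the file in and out on demand, so the whole 500 MB never has to be resident at once; path and pattern are placeholders):
#include <cstring>
#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

bool found_in_file(const char *path, const char *pattern)   // pattern is 10 chars long
{
    int fd = open(path, O_RDONLY);
    if (fd < 0) return false;
    struct stat st;
    if (fstat(fd, &st) != 0) { close(fd); return false; }

    void *map = mmap(nullptr, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    close(fd);                                  // the mapping stays valid after close
    if (map == MAP_FAILED) return false;

    const char *data = static_cast<const char *>(map);
    const char *p = data, *end = data + st.st_size;
    bool hit = false;
    while (p < end) {                           // p always points at a line start
        if (end - p >= 10 && std::memcmp(p, pattern, 10) == 0) { hit = true; break; }
        const char *nl = static_cast<const char *>(std::memchr(p, '\n', end - p));
        if (!nl) break;
        p = nl + 1;                             // start of the next line
    }
    munmap(map, st.st_size);
    return hit;
}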
One way may be loading and searching, say, the first 64 MB in memory, unloading it, then loading the next 64 MB, and so on (in multiples of 4 KB so that you are not overlooking any text which might be split at the block boundary).
Also see Boyer-Moore string search:
http://en.wikipedia.org/wiki/Boyer%E2%80%93Moore_string_search_algorithm
Yes this can be done fast. Been there. Done that. It is easy to introduce bugs, however.
The trick is in managing end of buffer, since you will read a buffer full of data, search that buffer, and then go on to the next. Since the pattern could span the boundary between two buffers, you wind up writing most of your code to cover that case.
At any rate, outside of the boundary case, you have a loop that looks like the following:
unsigned short *p = buffer;
while ((p < EOB) && !patterns[*p]) ++p;   /* skip 2-byte words that cannot start a match */
This assumes that EOB has been appropriately initialized, and that patterns[] is an array of 65536 values which are 0 if you can't be at the start of your pattern and 1 if you can.
Depending on your CR/LF and byte order conventions, patterns to set to 1 might include \nx or \rx where x is the first character in your 10 character pattern. Or x\n or x\r for the other byte order. And if you don't know the byte order or convention you can include all four.
Once you have a candidate location (EOL followed by the first byte) you do the work of checking the remaining 9 bytes. Building the patterns array is done offline, ahead of time. Two byte patterns fit in a small enough array that you don't have too much memory thrashing when doing the indexing, but you get to zip through the data twice as fast as if you did single byte.
There is one crazy optimization you can add to this, and that is to write a sentinel at the end of the buffer and put it in your patterns array. But that sentinel must be something that couldn't otherwise appear in the file. It gets the loop down to one test, one lookup and one increment, though.