I have a situation where I have a std::string and I only need characters x to x + y. I think it would speed things up quite a bit if I could instead somehow do (char*)&string[x], but the problem is that all my functions expect a null-terminated string.
What can I do?
Nothing nice can be done. The only trick I can think of is temporarily setting s[x+y+1] to 0, passing &s[x], then restoring the character. But you should resort to this ONLY if you are sure it will give a meaningful performance boost and that the boost is actually necessary.
Nothing (if the string you need is in the middle). The speed difference will be utterly trivial unless it's being done a LOT (several million times).
Use:
string.c_str() + x;
This assumes your function takes a const char *.
If you need actual 0-termination, you'll have to copy.
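For illustration, a rough sketch of both cases; legacy_length here is a made-up stand-in for one of your functions:

#include <cstring>
#include <string>

// Hypothetical stand-in for a function that expects a null-terminated string.
std::size_t legacy_length(const char* s) { return std::strlen(s); }

void demo(const std::string& str, std::size_t x, std::size_t y)
{
    // If the range you need runs to the end of the string, no copy is needed:
    legacy_length(str.c_str() + x);

    // Otherwise, copying is the straightforward way to get a terminator:
    std::string part = str.substr(x, y + 1);   // characters x .. x+y
    legacy_length(part.c_str());
}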
You have no choice here. You can't create a null-terminated substring without copying or modifying the original string.
You say you "think it would speed it up". Have you measured?
You could overwrite string[x+y+1] with a NUL character and pass &string[x] to your functions. If you're going to need the whole string again afterward, save the overwritten character for the duration and restore it when you're done.
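A minimal sketch of that overwrite-and-restore trick, assuming the called function (consume here is a made-up placeholder) does not hold on to the pointer:

#include <cstdio>
#include <string>

// Hypothetical C-style consumer that expects a null-terminated string.
void consume(const char* s) { std::printf("%s\n", s); }

// Pass characters x .. x+y of s to consume() without copying, by temporarily
// terminating the substring in place and restoring the original character.
void pass_substring(std::string& s, std::size_t x, std::size_t y)
{
    std::size_t end = x + y + 1;          // index just past the wanted range
    if (end >= s.size()) {                // range runs to the end: already terminated
        consume(&s[x]);
        return;
    }
    char saved = s[end];                  // remember the character we clobber
    s[end] = '\0';
    consume(&s[x]);
    s[end] = saved;                       // restore the string afterwards
}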
I was wondering if in code such as this:
std::cout << "Hello!" << '\n';
Would it be more efficient to have the \n character in the same string literal as "Hello!"? The code would then look like this:
std::cout << "Hello!\n";
I am aware that differences like this are so minuscule (especially in this case) that it just doesn't matter, but it has been boggling my mind.
My reasoning:
If I am not mistaken, having the \n character in the same string literal would be more efficient: with the extra insertion operator (operator<<) you have to call that function once again, and whatever is in the implementation of that function (in this case, in the standard library) happens just for one single character (\n). Compare that to only having to do it once, if I append that character to the end of the string literal and use a single call to operator<<.
I guess this is a very simple question, but I am not 100% sure about the answer, and I love spending time learning the details of a language and why some little things work better than others.
Update:
I am working on finding the answer myself, since for questions such as this it is better to try to find the answer for yourself. I haven't worked with any assembly code before, so this will be my first time trying to.
Yes; but all of the ostreams are so inefficient in practice that if you care much about efficiency you shouldn't be using them.
Write code to be clear and maintainable. Making code faster takes work, and the simpler and clearer your code is, the easier it is to make it faster.
Identify what your bottleneck is, then work on optimizing it by actually working out what is taking the time. (This only fails when global behaviour causes global slowdowns, like fragmentation or messing with caches.)
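To spell out the reasoning in the question: both inserters involved are free functions, so the two forms can be written out explicitly as one call versus two (a sketch for illustration only):

#include <iostream>

int main()
{
    // std::cout << "Hello!" << '\n'; resolves to two inserter calls:
    std::operator<<(std::operator<<(std::cout, "Hello!"), '\n');

    // std::cout << "Hello!\n"; is a single inserter call:
    std::operator<<(std::cout, "Hello!\n");
}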
I'm currently building/concatenating pretty big strings in a program I'm developing. To give some context, the full string has the size of a terminal screen. This happens quite a lot, since I'm developing a terminal application. I found the Buffer data structure, which seems to be the most performant way to concatenate strings with the standard library.
Is this the right choice if I also frequently need to update some part of the buffer at a specific position, say characters 20 to 50?
Is there a better way in this case?
@coredump and @kne have given good answers. I might just add that in today's world a byte is a poor representation of a character, so you might consider using an array or a bigarray.
AFAICS, there is no way to alter the contents of a Buffer.t except by adding to the end. Maybe you should take a look at the Bytes module. A Bytes.t is mutable everywhere; only the length cannot change. But it seems the length you need is fixed anyway: the size of the terminal (and if the terminal window is resized you can replace the Bytes.t with a new one).
I have code which runs lots of loops updating a single string.
Finally I want that string to be stored in a file.
Currently I am printing that string to the console.
The options I can think of are:
1. Use an ofstream and write to a file instead of the console.
2. Instead of updating a string, write directly to the file stream.
3. Use a stringstream instead, and finally copy that stringstream to a file stream and write it to a file.
4. After the update of the string is complete, write the whole string to a file stream at once.
std::string::max_size() on my compiler is 4294967257, and the maximum size of string that I can actually generate is approximately half of that.
Note: I am using Solaris Unix.
What is the most performant way to write this string to a file?
There's only one way to know the answer: you have to profile it for your case. You can easily do this by measuring how long it takes to generate the file.
Consider all the scenarios and benchmark the timings.
Note: the fastest approach will generally be the one that stays closest to memory.
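For example, a rough way to time one scenario end to end (the payload size and file name are arbitrary placeholders):

#include <chrono>
#include <fstream>
#include <iostream>
#include <string>

int main()
{
    std::string data(100000000, 'x');    // placeholder payload, ~100 MB

    auto start = std::chrono::steady_clock::now();
    std::ofstream out("out.txt", std::ios::binary);
    out.write(data.data(), static_cast<std::streamsize>(data.size()));
    out.close();
    auto stop = std::chrono::steady_clock::now();

    std::cout << std::chrono::duration_cast<std::chrono::milliseconds>(stop - start).count()
              << " ms\n";
}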
Try to reuse the same std::string object as much as possible, using reserve and clear. The string will keep its memory allocation; a string declared inside {} will make a new allocation each time you enter the block.
Be careful of hidden temporary string objects: for example, a + b where a is a std::string creates a temporary std::string with a new allocation. Prefer += to concatenate strings.
Use C code to perform conversions: create a local char buffer and use sprintf etc. These are faster than stringstreams, but it's easier to make a mistake, so be careful.
Use "\n" instead of std::endl when writing to a file, as the latter causes a flush.
Avoid stringstreams like the plague. They are slow, at least in the Visual Studio implementation. I've tried them for hardcore text processing and I know.
Of course this assumes performance IS an issue for you. If you are doing light work, stringstreams could be the easiest solution. I prefer them when I am not doing hardcore work as there is much less chance of a bug.
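Pulling a few of those points together, here is what that might look like; write_report, the record format and the path are made up for illustration:

#include <cstdio>
#include <fstream>
#include <string>

// Build one line per record into a reused std::string, then write it out once.
void write_report(const double* values, std::size_t count, const char* path)
{
    std::string out;
    out.reserve(count * 32);             // one allocation up front
    char line[64];                       // local buffer for C-style conversion
    for (std::size_t i = 0; i < count; ++i) {
        int n = std::snprintf(line, sizeof line, "%zu %.6f\n", i, values[i]);
        out.append(line, static_cast<std::size_t>(n));   // append, no temporaries
    }
    std::ofstream file(path, std::ios::binary);
    file.write(out.data(), static_cast<std::streamsize>(out.size()));  // "\n" only, no std::endl
}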
I am trying to convert a std::string buffer - containing data from a bitmap file - to a std::wstring.
I am using MultiByteToWideChar, but that does not work because the function stops after it encounters the first '\0' character. It seems to interpret it as the end of the string.
When I don't pass -1 as the length parameter, but the real length of the data in the std::string buffer, it messes up the Unicode string with characters that definitely did not appear at that position in the original string...
Do I have to write my own conversion function?
Or should I keep the data as a plain char array, because the special symbols will be converted incorrectly?
There are many, many things that will fail with this approach. Among other things, extra bytes may be added to your data without your realizing it.
It's odd that your only option takes a std::wstring. If this is a home-grown library, you should take the trouble to write a new function. If it's not, make sure there's nothing more suitable before writing your own.
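If the bytes really were text, the usual pattern is to call MultiByteToWideChar twice with an explicit length, once to size the output and once to convert; a sketch is below (the code page is whatever your data actually uses). For raw bitmap bytes this is still the wrong tool, as noted above.

#include <string>
#include <windows.h>

// Convert text bytes (which may contain embedded '\0') to a std::wstring by
// passing explicit lengths instead of -1. This only makes sense for genuine
// text in the given code page; raw bitmap data is not text and will not
// survive a character-set conversion intact.
std::wstring to_wide(const std::string& src, UINT codePage)
{
    if (src.empty()) return std::wstring();
    int needed = MultiByteToWideChar(codePage, 0, src.data(),
                                     static_cast<int>(src.size()), nullptr, 0);
    std::wstring dst(static_cast<std::size_t>(needed), L'\0');
    MultiByteToWideChar(codePage, 0, src.data(),
                        static_cast<int>(src.size()), &dst[0], needed);
    return dst;
}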
I've been working on this for about two days now. I'm stuck on a rather simple annoyance, but I'm not capable of solving it.
My program basically receives a TCP connection from a PHP script, and the message which is sent is stored in char buffer[1024];.
This buffer variable contains a unique key, which is compared to char key[1024] = "supersecretkey123";.
The problem is that these two never compare equal - no matter what I do.
I've printed the buffer and key variables out right above each other, and by the look of it they are 100% identical. However, my equality test still fails.
if(key == buffer) { // do something here }
So I started searching the internet for information on what could be wrong. I later realized that some escape characters might be getting in the way, but I'm not able to print them, remove them, or even confirm they are there. That's why I'm stuck - out of ideas on how to make these compare equal when the buffer contents match the key.
The key does not change unless its declaration is modified manually. The program itself is receiving and sending back information "correctly".
If you're using null-terminated strings, use the proper API - strcmp and its variants.
Additionally, the size in the declaration char key[1024] = "supersecretkey123"; is not needed - either the compiler will reduce it or stack/heap memory will be wasted.
If you are using C++, use std::string instead of char []. You cannot compare two char [] arrays the way you are trying to (== compares the pointers, not the contents), but it is possible with std::string.
If it's somehow mandatory to use char[] in your case, use strcmp.
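For illustration, a minimal sketch of both suggestions, assuming the buffer is null-terminated and that the stray characters are a trailing CR/LF appended on the PHP side:

#include <cstring>
#include <string>

char key[] = "supersecretkey123";
char buffer[1024];                       // filled by recv(), assumed null-terminated

bool key_matches()
{
    // C-style: strcmp compares contents; == would only compare pointers.
    if (std::strcmp(buffer, key) == 0)
        return true;

    // C++-style: std::string compares contents with ==, and makes it easy to
    // strip a trailing newline that the sender may have appended.
    std::string received(buffer);
    while (!received.empty() && (received.back() == '\n' || received.back() == '\r'))
        received.pop_back();
    return received == key;
}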
Try with if(!strncmp(key,buffer,1024)). See this reference on strncmp.