"Proper C++ solution"-- is c-style logic 'bad' when using C++? - c++

So recently I got into a debate about how to solve a problem; the problem specifically was: how do I find all the palindromes between 1 and 1 million? I said, "Use atoi to make a string, use a for loop to reverse the string, then use strcmp to compare the string(s) in question."
A few minutes later someone asked, "Why would you use a C-style solution in C++?" I found myself unable to think of a simple, more "C++" way of solving this with code as direct and easy to understand. Anyone care to illuminate me on this one?
edit: itoa not atoi

Quite simply, C++ streams are guaranteed to be memory safe and exception safe, failure is distinct from any return value, and C++ strings are memory-safe and exception-safe. C-strings and atoi are hideously unsafe in pretty much every way known to man. Code written in that way is much more error-prone.

Example C++ solution:
#include <algorithm>
#include <iterator>
#include <iostream>
#include <boost/iterator/counting_iterator.hpp>
#include <boost/lexical_cast.hpp>

namespace {
    bool is_palindrome(unsigned int i) {
        const std::string& s = boost::lexical_cast<std::string>(i);
        return std::equal(s.begin(), s.end(), s.rbegin());
    }

    const unsigned int stop = 1000000;
}

int main() {
    std::remove_copy_if(boost::counting_iterator<unsigned int>(1),
                        boost::counting_iterator<unsigned int>(stop),
                        std::ostream_iterator<unsigned int>(std::cout, "\n"),
                        std::not1(std::ptr_fun(is_palindrome)));
}
(I used std::remove_copy_if to make up for the lack of std::copy_if which is in C++0x)
For completeness' sake, I implemented a version that generates the palindromes rather than testing candidates against a predicate:
#include <boost/lexical_cast.hpp>
#include <boost/iterator/counting_iterator.hpp>
#include <boost/iterator/transform_iterator.hpp>
#include <algorithm>
#include <iterator>
#include <iostream>
#include <cassert>

namespace {
    template <int ver>
    unsigned int make_palindrome(unsigned int i) {
        std::string s = boost::lexical_cast<std::string>(i);
        assert(s.size());
        s.reserve(s.size() * 2);
        std::reverse_copy(s.begin(), s.end() - ver, std::back_inserter(s));
        return boost::lexical_cast<unsigned int>(s);
    }
}

int main() {
    typedef boost::counting_iterator<unsigned int> counter;
    std::merge(boost::make_transform_iterator(counter(1), make_palindrome<1>),
               boost::make_transform_iterator(counter(999), make_palindrome<1>),
               boost::make_transform_iterator(counter(1), make_palindrome<0>),
               boost::make_transform_iterator(counter(999), make_palindrome<0>),
               std::ostream_iterator<unsigned int>(std::cout, "\n"));
}
(I think I could have used Boost.Range instead of std::merge for this.)
The discussion point from this, I guess, is "is this a better* way to write it?" The thing I like about writing problems like the palindrome in this style is that you get the "if it compiles, it's probably correct" heuristic on your side. Even if there is a bug, it'll still get handled sensibly at run time (e.g. an exception from lexical_cast).
It's a markedly different way of thinking from C programming (but strangely similar to Haskell in some ways). It brings benefits in the form of lots of extra safety, but the compiler error messages can be terrible and shifting the way you think about problems is hard.
At the end of it all, though, what matters is "is it less work for fewer bugs?" I can't answer that without some metrics to help.
* For some definition of better.

The C++ equivalent to your solution would be to:
1. Use stringstream to turn the number into a std::string.
2. Use std::reverse_copy to reverse the string.
3. Use == to compare the strings.
1 is better than using itoa (which you probably meant) because you don't have to allocate the memory for the created string yourself and there's no chance of a buffer overrun.
2 is better because again you don't have to worry about allocating memory for the reversed string and you don't duplicate existing functionality.
3 is better because string1 == string2 reads better than using strcmp.
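Those three steps sketched out (the function name is mine):

```cpp
#include <algorithm>
#include <iterator>
#include <sstream>
#include <string>

bool is_palindrome(unsigned int n) {
    std::ostringstream oss;
    oss << n;                          // 1. number -> std::string
    const std::string s = oss.str();
    std::string reversed;
    std::reverse_copy(s.begin(), s.end(),
                      std::back_inserter(reversed)); // 2. reverse the string
    return s == reversed;              // 3. compare with ==
}
```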

atoi makes it impossible to detect input errors, while a stringstream can do the same job and errors can be easily detected.
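A small illustration of the difference (the helper name is mine): atoi("abc") and atoi("0") both return 0, but stream extraction reports the failure explicitly.

```cpp
#include <sstream>
#include <string>

// Returns fallback when the text is not a valid integer.
int parse_or(const std::string& text, int fallback) {
    std::istringstream iss(text);
    int value;
    if (iss >> value)
        return value;   // extraction succeeded
    return fallback;    // extraction failed: not a number
}
```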

I think a more appropriate question would be: why shouldn't you use C-based solutions in C++? If the C-style solution is simpler and more readable than a "pure" C++ one, I know I would opt for it. The solution you came up with was clever and simple; a corresponding C++ solution may be overkill, and given the simplicity of the problem I'm not sure what C++ was supposed to bring to it.

Related

How do I print to stdout or a file using C++20 format library

As is well noted, the one thing that didn't make it into C++20's format library was a function that prints to standard out, or to a generic file stream. We've been promised a std::print() to fill this need in C++23, but that doesn't solve the problem in the interim.
What options exist to solve this?
Answering my own question, the solution is actually very simple. std::format_to can write to an output iterator, so all that's needed is the construction of a suitable iterator.
#include <iostream>
#include <iterator>
#include <format>

int main()
{
    // Create an output iterator that writes to std::cout.
    // Replace std::cout with any ostream to write to a file instead.
    std::ostream_iterator<char> out(std::cout);
    std::format_to(out, "Hello {}!\n", "world");
    return 0;
}
Note that this will, of course, become completely unnecessary if we get the promised std::print() in C++23. But it is a valid interim solution.

equivalent of atoi

Is there a function that could replace atoi in C++?
I did some research and didn't find anything to replace it; the only solutions seem to be using cstdlib or implementing it myself.
If you don't want to use Boost, C++11 added std::stoi for strings. Similar methods exist for all types.
std::string s = "123";
int num = std::stoi(s);
Unlike atoi, if no conversion can be made, an invalid_argument exception is thrown. Also, if the value is out of range for an int, an out_of_range exception is thrown.
boost::lexical_cast is your friend
#include <string>
#include <boost/lexical_cast.hpp>

int main()
{
    std::string s = "123";
    try
    {
        int i = boost::lexical_cast<int>(s); // i == 123
    }
    catch (const boost::bad_lexical_cast&)
    {
        // incorrect format
    }
}
You can use the Boost function boost::lexical_cast<> as follows:
const char* numericString = "911";
int num = boost::lexical_cast<int>(numericString);
More information can be found here (latest Boost version 1.47). Remember to handle exceptions appropriately.
Without boost:
stringstream ss(my_string_with_a_number);
int my_res;
ss >> my_res;
About as annoying as the boost version, but without the added dependency. Could possibly waste more RAM.
You don't say why atoi is unsuitable so I am going to guess it has something to do with performance. Anyway, clarification would be helpful.
Using Boost Spirit.Qi is about an order of magnitude faster than atoi, at least in tests done by Alex Ott.
I don't have a reference but the last time I tested it, Boost lexical_cast was about an order of magnitude slower than atoi. I think the reason is that it constructs a stringstream, which is quite expensive.
Update: Some more recent tests

is there a way to use cin.getline() without having to define a char array size before hand?

Basically my task is having to sort a bunch of strings of variable length ignoring case. I understand there is a function strcasecmp() that compares cstrings, but doesn't work on strings. Right now I'm using getline() for strings so I can just read in the strings one line at a time. I add these to a vector of strings, then convert to cstrings for each call of strcasecmp(). Instead of having to convert each string to a cstring before comparing with strcasecmp(), I was wondering if there was a way I could use cin.getline() for cstrings without having a predefined char array size. Or, would the best solution be to just read in string, convert to cstring, store in vector, then sort?
I assume by "convert to cstring" you mean using the c_str() member of string. If that is the case, in most implementations that isn't really a conversion, it's just an accessor. The difference is only important if you are worried about performance (which it sounds like you are). Internally, std::strings are (pretty much always, but technically do not have to be) represented as a "cstring". The class takes care of managing its size for you, but it's just a dynamically allocated cstring underneath.
So, to directly answer: You have to specify the size of the array when using cin.getline. If you don't want to specify a size, then use getline and std::string. There's nothing wrong with that approach.
C++ is pretty efficient on its own. Unless you have a truly proven need to do otherwise, let it do its thing.
#include <algorithm>
#include <iostream>
#include <iterator>
#include <string>
#include <vector>
#include <strings.h> // strcasecmp lives here on POSIX, not in <cstring>
using namespace std;

bool cmp(const string& a, const string& b) // const refs avoid copying on every comparison
{
    return strcasecmp(a.c_str(), b.c_str()) < 0;
}

int main(int argc, char *argv[])
{
    vector<string> strArr;
    //too lazy to test with getline(cin, str);
    strArr.push_back("aaaaa");
    strArr.push_back("AAAAA");
    strArr.push_back("ababab");
    strArr.push_back("bababa");
    strArr.push_back("abcabc");
    strArr.push_back("cbacba");
    strArr.push_back("AbCdEf");
    strArr.push_back("aBcDeF");
    strArr.push_back(" whatever");
    sort(strArr.begin(), strArr.end(), cmp);
    copy(strArr.begin(), strArr.end(), ostream_iterator<string>(cout, " \n"));
    return 0;
}
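And the getline loop the sample above skips could look like this, reading whole lines into std::string with no preset buffer size (the helper takes any istream, so std::cin works for the question's case):

```cpp
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// Reads whole lines into a vector; no char-array size to guess.
// Pass std::cin for the question's use case, or any other istream.
std::vector<std::string> read_lines(std::istream& in) {
    std::vector<std::string> lines;
    std::string line;
    while (std::getline(in, line))
        lines.push_back(line);
    return lines;
}
```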

Mixing c++ standard strings and windows API

Many Windows APIs take a pointer to a buffer and a size element, but the result needs to go into a C++ string. (I'm using Windows Unicode here, so they are wstrings.)
Here is an example:
#include <iostream>
#include <string>
#include <vector>
#include <windows.h>
using namespace std;

// This is the method I'm interested in improving ...
wstring getComputerName()
{
    vector<wchar_t> buffer;
    buffer.resize(MAX_COMPUTERNAME_LENGTH + 1);
    DWORD size = MAX_COMPUTERNAME_LENGTH;
    GetComputerNameW(&buffer[0], &size);
    return wstring(&buffer[0], size);
}

int main()
{
    wcout << getComputerName() << "\n";
}
My question really is, is this the best way to write the getComputerName function so that it fits into C++ better, or is there a better way? I don't see any way to use a string directly without going via a vector unless I missed something? It works fine, but somehow seems a little ugly. The question isn't about that particular API, it's just a convenient example.
In this case, I don't see what std::vector brings to the party. MAX_COMPUTERNAME_LENGTH is not likely to be very large, so I would simply use a C-style array as the temporary buffer.
See this answer to another question. It provides the source to a StringBuffer class which handles this situation very cleanly.
I would say, since you are already at the task of abstracting the Windows API behind a more generic C++ interface, do away with the vector altogether and don't bother with the wstring constructor:
wstring getComputerName()
{
    wchar_t name[MAX_COMPUTERNAME_LENGTH + 1];
    DWORD size = MAX_COMPUTERNAME_LENGTH;
    GetComputerNameW(name, &size);
    return name;
}
This function will return a valid wstring object.
I'd use the vector. In response to you saying you picked a bad example, pretend for a moment that we don't have a reasonable constant upper bound on the string length. Then it's not quite as easy:
#include <string>
#include <vector>
#include <windows.h>
using std::wstring;
using std::vector;

wstring getComputerName()
{
    DWORD size = 1; // or a bigger number if you like
    vector<wchar_t> buffer(size);
    while (GetComputerNameW(&buffer[0], &size) == 0)
    {
        if (GetLastError() != ERROR_BUFFER_OVERFLOW) aargh(); // handle error
        buffer.resize(++size);
    }
    return wstring(&buffer[0], size);
}
In practice, you can probably get away with writing into a string, but I'm not entirely sure. You certainly need additional guarantees made by your implementation of std::wstring, beyond what's in the standard, but I expect MSVC's strings are probably OK.
I think that if wstring::reference is wchar_t& then you're sorted. 21.3.4 defines that non-const operator[] returns a reference, and that it returns data()[pos]. So if reference is just a plain wchar_t& then there's no scope for exciting copy-on-write behaviour through the reference, and the string must in fact be modifiable through the pointer &buffer[0]. I think. The basic problem here is that the standard allowed implementations more flexibility than turned out to be needed.
That's a lot of effort and commenting though, just to avoid copying a string, so I've never felt the need to avoid an intermediate array/vector.
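For the record, since C++11 the standard guarantees string storage is contiguous, so you can size the wstring itself up front, hand &s[0] to the API, and trim afterwards. A portable sketch, where fake_get_name and the fixed name are invented stand-ins for GetComputerNameW so the example compiles without windows.h:

```cpp
#include <cstring>
#include <string>

// Stand-in for a Windows-style API: fills the buffer and reports the
// number of characters written through *size. Invented for illustration.
bool fake_get_name(wchar_t* buf, unsigned long* size) {
    const wchar_t name[] = L"MYHOST";
    const unsigned long len = 6;
    if (*size < len)
        return false;
    std::memcpy(buf, name, len * sizeof(wchar_t));
    *size = len;
    return true;
}

std::wstring getName() {
    std::wstring s(32, L'\0');        // pre-size the string itself
    unsigned long size = static_cast<unsigned long>(s.size());
    if (!fake_get_name(&s[0], &size)) // &s[0] is valid: storage is contiguous
        return std::wstring();
    s.resize(size);                   // trim to the length the API reported
    return s;
}
```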

C++: what is the optimal way to convert a double to a string?

What is the most optimal way to achieve the same as this?
void foo(double floatValue, char* stringResult)
{
    sprintf(stringResult, "%f", floatValue);
}
I'm sure someone will say boost::lexical_cast, so go for that if you're using boost, but it's basically the same as this anyway:
#include <sstream>
#include <string>

std::string doubleToString(double d)
{
    std::ostringstream ss;
    ss << d;
    return ss.str();
}
Note that you could easily make this into a template that works on anything that can be stream-inserted (not just doubles).
http://www.cplusplus.com/reference/iostream/stringstream/
double d=123.456;
stringstream s;
s << d; // insert d into s
Boost::lexical_cast<>
On dinkumware STL, the stringstream is filled out by the C library snprintf.
Thus using snprintf formatting directly will be comparable with the STL formatting part.
But someone once told me that the whole is greater than or equal to the sum of its known parts.
As it will be platform dependent whether stringstream does an allocation (and I am quite sure that Dinkumware does not yet include a small buffer in stringstream for conversions of single items like yours), it is truly doubtful that anything requiring an allocation (especially if multithreaded) can compete with snprintf.
In fact (formatting + allocation) has a chance of being really terrible, as an allocation and a release might well require 2 full read-modify-write cycles in a multithreaded environment, unless the allocation implementation has a thread-local small heap.
That being said, if I was truly concerned about performance, I would take the advice from some of the other comments above, change the interface to include a size, and use snprintf - i.e.
bool
foo(const double d, char* const p, const size_t n)
{
    const int written = snprintf(p, n, "%f", d);
    return written >= 0 && static_cast<size_t>(written) < n; // true if it fit
}
If you want a std::string, you are still better off using the above and constructing the string from the resulting char*, as there will be 2 allocations + 2 releases involved in the std::stringstream/std::string solution.
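A sketch of that snprintf-then-string approach (the %.17g format and buffer size are my choices: %f can need 300+ characters for large doubles, while %.17g stays short and round-trips):

```cpp
#include <cstdio>
#include <string>

std::string doubleToString(double d) {
    char buf[32]; // %.17g needs at most ~25 characters
    const int n = std::snprintf(buf, sizeof buf, "%.17g", d);
    if (n < 0 || n >= static_cast<int>(sizeof buf))
        return std::string();   // formatting failed or truncated
    return std::string(buf, n); // a single allocation at most
}
```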
BTW, I can't tell whether the "string" in the question is std::string or just a generic ASCII-characters usage of "string".
The best thing to do would be to build a simple templatized function to convert any streamable type into a string. Here's the way I do it:
#include <sstream>
#include <string>

template <typename T>
const std::string to_string(const T& data)
{
    std::ostringstream conv;
    conv << data;
    return conv.str();
}
If you want a const char* representation, don't just substitute conv.str().c_str(): that pointer refers into a temporary string which is destroyed at the end of the full expression. Store the result in a named std::string first and call c_str() on that.
I'd probably go with what you suggested in your question, since there's no built-in ftoa() function and sprintf gives you control over the format. A google search for "ftoa asm" yields some possibly useful results, but I'm not sure you want to go that far.
I'd say sprintf is pretty much the optimal way. You may prefer snprintf over it, but it doesn't have much to do with performance.
Herb Sutter has done an extensive study on the alternatives for converting an int to a string, but I would think his arguments hold for a double as well.
He looks at the balances between safety, efficiency, code clarity and usability in templates.
Read it here: http://www.gotw.ca/publications/mill19.htm
_gcvt or _gcvt_s.
If you use the Qt4 frame work you could go :
double d = 5.5;
QString num = QString::number(d);
This is a very useful thread. I use sprintf_s for this, but I started to doubt whether it is really faster than the other ways. I came across the following document on the Boost website, which shows a performance comparison between printf/scanf, stringstream, and Boost.
Double-to-string is one of the most common conversions we do in our code, so I'll stick with what I've been using. But using Boost in other scenarios could be your deciding factor.
http://www.boost.org/doc/libs/1_58_0/doc/html/boost_lexical_cast/performance.html
In the future, you can use std::to_chars to write code like https://godbolt.org/z/cEO4Sd . Unfortunately, only VS2017 and VS2019 support part of this functionality...
#include <iostream>
#include <charconv>
#include <system_error>
#include <string_view>
#include <array>

int main()
{
    std::array<char, 10> chars;
    auto [parsed, error] = std::to_chars(
        chars.data(),
        chars.data() + chars.size(),
        static_cast<double>(12345.234)
    );
    std::cout << std::string_view(chars.data(), parsed - chars.data());
}
For a lengthy discussion on MSVC details, see
https://www.reddit.com/r/cpp/comments/a2mpaj/how_to_use_the_newest_c_string_conversion/eazo82q/