How can I check which value I should pass to std::locale? I wrote a simple program:
#include <cstdio>
#include <locale>
#include <stdexcept>

int main()
{
    try
    {
        std::locale("plk_pol");
    }
    catch (const std::runtime_error&) // catch by const reference, not by value
    {
        printf("Can't load locale.\n");
        return 1;
    }
    return 0;
}
But it prints the error message. That's strange, because std::setlocale(LC_ALL, "plk_pol"); works. I tried other values, like pl_PL and polish_poland - the behaviour is the same. Even en_US.utf8 gives an error. The only locale I was able to set is C.
I have Windows 7 with Polish localisation. The question is: where can I check which locales are acceptable on my system?
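(For reference, one way to enumerate the locale names Windows itself knows is the Win32 EnumSystemLocalesEx API. The sketch below prints the system's BCP-47 names; whether std::locale accepts them still depends on your C++ runtime.)

#include <windows.h>
#include <cstdio>

// Callback invoked once per installed locale name.
BOOL CALLBACK PrintLocaleName(LPWSTR name, DWORD flags, LPARAM param)
{
    wprintf(L"%ls\n", name);
    return TRUE; // keep enumerating
}

int main()
{
    EnumSystemLocalesEx(PrintLocaleName, LOCALE_ALL, 0, nullptr);
    return 0;
}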
I'm trying to learn Unicode programming in Windows.
I have this simple program:
#include <iostream>
#include <string>

int main()
{
    std::wstring greekWord = L"Ελληνικά";
    std::wcout << greekWord << std::endl;
    return 0;
}
However, it outputs nothing. Any ideas how to make it output Greek?
I tried adding non-Greek letters, and that didn't work quite right either.
The first thing to try is to make the program independent of the encoding of the source file, so use Unicode escapes rather than literal Unicode letters:
std::wstring greekWord = L"\u0395\u03BB\u03BB\u03B7\u03BD\u03B9\u03BA\u03AC";
An incorrect encoding in the source file is only one of many things that could be preventing you from printing Greek. The other obvious issue is the ability of your terminal to print Greek letters. If it can't do that, or needs to be set up correctly so that it can, then nothing you do in your program is going to work.
You probably also want to fix the source encoding issue so that you can use unescaped literals in your code, but how to do that depends on the compiler/IDE you are using.
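For example (an illustration; check your own toolchain's documentation), MSVC has a /utf-8 switch that treats both the source and execution character sets as UTF-8, and GCC has -finput-charset:

cl /utf-8 main.cpp
g++ -finput-charset=UTF-8 main.cpp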
If you are sending your output to a normal console, be aware that the console often doesn't support Unicode text such as Greek out of the box. Try setting it up for Unicode text, or find another way to output your data, such as text files or a GUI.
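For instance, on Windows one way to set the console up is to switch its output code page to UTF-8 and write UTF-8 bytes (a sketch; it assumes a console host and font that can actually display Greek glyphs):

#include <windows.h>
#include <cstdio>

int main()
{
    SetConsoleOutputCP(CP_UTF8); // ask the console to interpret output as UTF-8
    // UTF-8 byte sequence for "Ελληνικά"
    std::printf("\xCE\x95\xCE\xBB\xCE\xBB\xCE\xB7\xCE\xBD\xCE\xB9\xCE\xBA\xCE\xAC\n");
    return 0;
}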
There are two ways to do this.
The old, non-standard Microsoft way is as follows:
#include <fcntl.h>
#include <io.h>

int main()
{
    _setmode(_fileno(stdout), _O_U16TEXT); // stdout now expects UTF-16 wide output
    _setmode(_fileno(stdin), _O_WTEXT);    // stdin now delivers wide input
    // your code here
}
You will find this everywhere, but it is not necessarily a good way to solve this problem.
The more standards-compliant way is as follows:
#include <locale>

int main()
{
    std::locale l("");      // or std::locale l("en_US.utf-8");
    std::locale::global(l); // or std::wcout.imbue(l); std::wcin.imbue(l);
    // your code here
}
This should work with other modern compilers and operating systems too.
Try this; it works for me:
#include <iostream>
#include <clocale> // for setlocale
#include <io.h>
#include <fcntl.h>
using namespace std;

int main() {
    setlocale(LC_ALL, "");                  // moved before the output so it can take effect
    _setmode(_fileno(stdout), _O_U16TEXT);  // switch stdout to UTF-16 mode
    wcout << L"Ελληνικά";
    return 0;
}
I need to print a CSV file with numbers.
When the file is written, the numbers have dots, but I need them with commas.
Here is an example.
If I print this number in the terminal using the locale method, I get a number with a comma, but in the file the same number has a dot. I do not understand why.
What can I do?
#include <iostream>
#include <locale>
#include <string>  // std::string, std::to_string
#include <fstream>
#include <clocale> // std::setlocale
using namespace std;

int main()
{
    double x = 2.87;
    std::setlocale(LC_NUMERIC, "de_DE");
    std::cout.imbue(std::locale(""));
    std::cout << x << std::endl;

    ofstream outputfile("out.csv");
    if (outputfile.is_open())
    {
        outputfile << to_string(x) << "\n\n";
    }
    return 0;
}
Thanks in advance.
Locales are system-specific. You probably just made a typo; try "de-DE", which will probably work (at least it does on my Windows).
However, if your program is not inherently German-centric, then abusing the German locale just for the side effect of getting a specific decimal point character is bad programming style, I think.
Here is an alternative solution using std::numpunct::do_decimal_point:
#include <string>
#include <fstream>
#include <locale>

struct Comma final : std::numpunct<char>
{
    char do_decimal_point() const override { return ','; }
};

int main()
{
    std::ofstream os("out.csv");
    os.imbue(std::locale(std::locale::classic(), new Comma));
    double d = 2.87;
    os << d << '\n'; // prints 2,87 into the file
}
This code specifically states that it just wants the standard C++ formatting with only the decimal point character replaced with ','. It makes no reference to specific countries or languages, or system-dependent properties.
Your issue is that std::to_string() uses the C locale library. It appears that "de_DE" is not a valid locale name on your machine (or on Coliru, for that matter), so the default C locale is used, with . as the decimal separator. The solution is to use "de_DE.UTF-8". As an aside, using "" for std::locale will not always produce commas; the result depends on the locale configured on your machine.
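If you need a string rather than direct stream output, a hedged workaround is to format through an ostringstream imbued with the same kind of facet, since std::to_string() ignores stream locales (to_string_comma below is an illustrative name, not a standard function):

#include <sstream>
#include <string>
#include <locale>

struct Comma final : std::numpunct<char> // same facet as in the answer above
{
    char do_decimal_point() const override { return ','; }
};

// Illustrative replacement for std::to_string that formats via a stream,
// so the imbued facet (and thus the comma) is honoured.
std::string to_string_comma(double value)
{
    std::ostringstream oss;
    oss.imbue(std::locale(oss.getloc(), new Comma));
    oss << value;
    return oss.str(); // e.g. 2.87 -> "2,87"
}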
I'm writing a batch emulator as a personal project. I'm trying to implement the cd command using chdir() from unistd.h. However, using this causes a segfault.
main.cpp:
#include <cstdlib>
#include <iostream>
#include <string>
#include <vector>
#include <stdio.h>

//Custom headers
#include "splitting_algorithm.hpp"
#include "lowercase.hpp"
#include "chdir.hpp"

//Used to get and print the current working directory
#define GetCurrentDir getcwd

using namespace std;

int main(int argc, char* argv[])
{
    string command;
    //Begin REPL code
    while (true)
    {
        //Prints current working directory
        cout << cCurrentPath << ": ";
        std::getline(std::cin, command);

        vector<string> tempCommand = strSplitter(command, " ");
        string lowerCommand = makeLowercase(string(strSplitter(command, " ")[0]));

        //Help text
        if (tempCommand.size() == 2 && string(tempCommand[1]) == "/?")
        {
            cout << helpText(lowerCommand);
        }
        //Exit command
        else if (lowerCommand == "exit")
        {
            return 0;
        }
        else if (lowerCommand == "chdir")
        {
            cout << string(tempCommand[1]) << endl;
            chdir(tempCommand[1]);
        }
        else
            cout << "Can't recognize \'" << string(tempCommand[0]) << "\' as an internal or external command, or batch script." << endl;
    }
    return 0;
}
chdir.cpp:
#include <cstdlib>
#include <string>
#include <unistd.h>

void chdir(std::string path)
{
    //Changes the current working directory to path
    chdir(path);
}
Strangely enough, using cout to get the path for chdir works perfectly fine. How do I fix this?
You have unterminated recursion in your code, which overflows the stack.
Try inserting a breakpoint in void chdir(std::string path) and watch what happens: the function chdir calls itself, which in turn calls itself again, and again, and... well, segmentation fault.
Also, look at the call stack in the debugger; this issue is very visible there.
You should invoke the underlying chdir function using
::chdir(path.c_str());
or you will just call your own method again.
In unistd.h, chdir is defined as:
int chdir(const char *);
So you must call it with a const char* argument, or overload resolution will pick the other function called chdir (your own, which takes a std::string argument) and call that instead.
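Putting the two answers together, a fixed chdir.cpp might look like this (a sketch; the return value of ::chdir is still ignored, as in the original):

#include <string>
#include <unistd.h>

void chdir(std::string path)
{
    // The :: qualifier forces the call to resolve to the global C function
    // from unistd.h instead of recursing into this wrapper.
    ::chdir(path.c_str());
}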
#include <iostream>
#include <string.h>
using namespace std;

int main()
{
    string st = "Hello world";
    return 0;
}
and
#include <string>

int main()
{
    std::string st = "Hello world";
    return 0;
}
I tried compiling this code using the MinGW compiler in NetBeans. It brings up the following error after a successful build.
RUN FAILED (exit value -1,073,741,511, total time: 93ms)
But it runs cleanly when strings are not used. I would like to know what I am doing wrong here. Thanks in advance.
Use C++ strings and don't use using namespace std:
#include <string> // C++ string header

int main()
{
    std::string st = "Hello world";
    return 0;
}
#include <string.h> is the old C-style string header and most likely isn't what you want to use here. See this question for more details: Difference between <string> and <string.h>?
Note: if you really want the old C-style string functions, you should be using #include <cstring>, which puts those functions into the std namespace and avoids the namespace pollution that can lead to other undesirable outcomes.
Likely what happened was that you used the old style string header and didn't properly initialize those strings. The old C-style strings don't have a constructor and operator= defined like the std::string class.
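To make the difference concrete, here is a minimal sketch contrasting the two (the headers are the point of interest):

#include <string>  // C++ std::string class
#include <cstring> // C string functions such as std::strlen

int main()
{
    std::string cpp = "Hello world"; // value type with constructor and operator=
    const char* c = "Hello world";   // raw C-style string: just a pointer
    return cpp.size() == std::strlen(c) ? 0 : 1;
}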
Edit: after looking at the NetBeans forum, this is a problem with NetBeans and not a C++ issue. Try changing the output to an external terminal in NetBeans, or run the program directly from the command line. If these approaches don't fix the problem or are undesirable, make a post on the NetBeans forum. Also have a look at this question: Program won't run in NetBeans, but runs on the command line!
Use #include <string> instead of string.h.
I have some very basic semaphore code that works great on Linux, but cannot for the life of me get it to run properly on OS X... It returns the oddest of results...
#include <iostream>
#include <fcntl.h>
#include <stdio.h>
#include <semaphore.h>

int main()
{
    sem_t* test;
    test = sem_open("test", O_CREAT, 0, 1);
    int value;
    sem_getvalue(test, &value);
    printf("Semaphore initialized to %d\n", value);
}
Compiling this on OS X with g++ returns the following output:
iQudsi:Desktop mqudsi$ g++ test.cpp
iQudsi:Desktop mqudsi$ ./a.out
Semaphore initialized to -1881139893
Whereas on Ubuntu, I get the decidedly more-sane result:
iQudsi: Desktop mqudsi$ g++ test.cpp -lrt
iQudsi:Desktop mqudsi$ ./a.out
Semaphore initialized to 1
I've been at this for 3 hours straight, and cannot figure out why OS X is returning such bizarre results...
I've tried using file paths as the semaphore name, it didn't make a difference.
I'd appreciate any help I could get.
Are you testing for errors? Try:
#include <iostream>
#include <fcntl.h>
#include <stdio.h>
#include <semaphore.h>

int main()
{
    sem_t* test;
    test = sem_open("test", O_CREAT, 0, 1);
    if (test == SEM_FAILED) {
        perror("sem_open");
        return 1;
    }
    int value;
    if (sem_getvalue(test, &value)) {
        perror("sem_getvalue");
        return 1;
    }
    printf("Semaphore initialized to %d\n", value);
}
$ g++ sem-testing.cc -Wall
$ ./a.out
sem_getvalue: Function not implemented
$ man sem_getvalue
No manual entry for sem_getvalue
You are using a function that is not implemented on Mac OS X. The integer you are printing contains whatever data value happened to hold, which was probably random data still in memory. Had you zeroed it out by writing int value = 0; you might have caught this mistake sooner.
This is the code I used (thanks to bdonlan): the error-checked version shown in the answer above.
Well, perhaps sem_open() is failing - you didn't test.
Or perhaps OS X doesn't support shared POSIX semaphores by default - if /dev/shm isn't mounted, the system typically won't support sem_open().
You may want to use SysV semaphores.
A similar question regarding Slackware was asked here: how-do-i-stop-semopen-failing-with-enosys
However, further searching shows that OS X named semaphores are built on top of Mach semaphores, and you probably need to sem_unlink() them when you're done (not just sem_close(), or maybe instead of it), and you should be careful about permissions - I suggest starting with 0777 or maybe 0700 instead of 0. See POSIX semaphores in Darwin.
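For reference, a sketch of the full named-semaphore lifecycle along those lines (the name and mode are illustrative, and error checks beyond sem_open are omitted):

#include <fcntl.h>
#include <semaphore.h>
#include <stdio.h>

int main()
{
    // 0700 instead of 0: with mode 0, even the creator can be denied access
    // on subsequent runs. The leading slash is the conventional form for names.
    sem_t* sem = sem_open("/test", O_CREAT, 0700, 1);
    if (sem == SEM_FAILED) {
        perror("sem_open");
        return 1;
    }

    sem_wait(sem); // acquire
    // ... critical section ...
    sem_post(sem); // release

    sem_close(sem);      // drop this process's handle
    sem_unlink("/test"); // remove the name so the next run starts fresh
    return 0;
}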