Array values changing by themselves in C++ - c++

In the following program, I read a 6-character string dd and store its ASCII codes in an integer array ipc of size 3x2. The problem is that the values stored in ipc are wrong, and they change by themselves when I print them again later. I am surprised that such a simple piece of code can have such an obvious problem. (I am using Code::Blocks 10.05 on Win7 x64)
#include <iostream>
using namespace std;

int main()
{
    char dd[5];
    int ipc[2][1];
    cin.get(dd, 6);
    for (int i = 0; i < 3; i++)
    {
        for (int j = 0; j < 2; j++)
        {
            ipc[i][j] = int(dd[j + 2 * i]);
            cout << ipc[i][j] << endl;
        }
    }
    cout << "------" << endl;
    for (int i = 0; i < 3; i++)
    {
        for (int j = 0; j < 2; j++)
        {
            cout << ipc[i][j] << endl;
        }
    }
}
If input given is 123456, the output is:
49
50
51
52
53
2
------
49
51
51
53
53
2
Any sort of help will be very much appreciated. Thank you.

The array declaration is incorrect and the code goes out of bounds on the array, causing undefined behaviour. The declaration should be changed from:
int ipc[2][1];
to:
int ipc[3][2];
Additionally, cin.get() will read at most count - 1 characters, so:
cin.get(dd, 6);
will read only 5 characters, not 6. If the user enters 123456, only 12345 will be read. cin.get() will also append a null character (as commented by tinman). To correct this, increase the size of dd and the number of characters to be read:
char buf[7];
cin.get(buf, 7);


Reading and writing to files C++ [duplicate]

This question already has an answer here:
feof() and fscanf() stop working after scanning byte 1b as a char. Is it because it is 'ESC' in ASCII? What can I do? (1 answer)
Closed 5 years ago.
I've got problem regarding output/input from files.
Here is my program:
#include <bits/stdc++.h>
using namespace std;

int main()
{
    FILE* out;
    out = fopen("tmp.txt", "w");
    for (int i = 0; i < 256; i++)
    {
        fprintf(out, "%c", char(i));
    }
    fclose(out);

    FILE* in;
    in = fopen("tmp.txt", "r");
    while (!feof(in))
    {
        char a = fgetc(in);
        cout << int(a) << endl;
    }
    fclose(in);
}
and here is the output:
0
1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
-1
Why is it stopping so quickly?
Does that mean char(26) is EOF?
How could i write to file (of any type) to overcome this problem?
What I'm looking for is a way to freely write values (of any range; can be char, int or something else) to a file and then read them back.
Works for me *), however, a few remarks:
1. You should not use #include <bits/stdc++.h>; that is an internal header intended for compiler use, not for inclusion by client applications.
2. Because some characters are translated (e.g. end-of-line) or interpreted specially in the default text mode, you should open the files in binary mode.
3. Reading as (signed) char and converting to int will produce negative values past 127.
4. Since fgetc already returns int, you do not need the conversion to signed char and back at all.
See here the code with the corrections.
*) Apparently, as mentioned in other comments, it might not work on Windows in text mode (see point 2).
What I'm looking for is a way to freely write values (of any range; can be char, int or something else) to a file and then read them back.
In this case you must:
Separate the individual values with a delimiter, such as space or new-line symbol.
Read back integers rather than individual separate characters / bytes.
The easiest is to use C++ std::fstream for that. E.g.:
#include <fstream>
#include <iostream>

int main() {
    {
        std::ofstream out("tmp.txt");
        for (int i = 0; i < 256; i++)
            out << i << '\n';
        // out's destructor flushes and closes the stream.
    }
    {
        std::ifstream in("tmp.txt");
        for (int c; in >> c;)
            std::cout << c << '\n';
    }
}

C++ std::cin is stuck at 80 lines of input

here is the code that I have
int main()
{
    int total = 0;
    int count = 0;
    std::cin >> total;
    int arr[4] = {0, 0, 0, 0};
    while (count < total)
    {
        std::cin >> arr[0] >> arr[1] >> arr[2] >> arr[3];
        count++;
        std::cout << count << std::endl;
    }
    return 0;
}
The first line of input tells how many lines need to be read after it, and each of those lines has 4 numbers separated by spaces. Whenever the number of lines exceeds 80 (e.g. 100), the while loop gets stuck. I have no idea what causes the problem, and I have tried a couple of things like cin.clear(), but they just didn't work.
Edit: std::cin stops reading after 80 lines of input with a format like 10 20 210 10. Xcode with LLVM didn't work; however, g++ from the terminal did: http://melpon.org/wandbox/permlink/UXAMgM4ldn2K2NgU
Here is the code that works in my terminal with g++ but not in my Xcode.
It's the output that's getting stuck. Unless the count output is being read and consumed by something, the output buffer will eventually fill up and the cout line will block.

C++: Linux code does not work on Windows - piping data to program

I'm looking for some help piping a file (raw data of 16-bit signed little-endian integers) from the command line to my program:
cat rawdata.dat | myprogram
The code works well on Linux: 512 bytes are transformed into 256 ints per loop iteration.
If I compile it with MinGW on Windows, only the first 76 values are transformed correctly, and the program stops after the first while-loop iteration.
Does anybody know what I am doing wrong? I am using Windows 7 64-bit with the MinGW compiler.
Code:
#include <cstdint>
#include <iostream>
using namespace std;

int main()
{
    const int BUF_LEN = 512;
    char buf[BUF_LEN];
    while (!cin.eof())
    {
        cin.read(buf, BUF_LEN);
        int16_t* data = (int16_t*) buf;  // reinterpret the bytes as ints
        for (int i = 70; i < 85; i++)
        {
            cout << i << " " << data[i] << endl;
        }
    }
    return 0;
}
Testfile: http://www.filedropper.com/rawdata
Correct values would be:
70 -11584
71 13452
72 -13210
73 -13331
74 13893
75 10870
76 9738
77 6689
78 -253
79 -1009
80 -16036
81 14253
82 -13872
83 10020
84 -5971
TL;DR:
Fix? There isn't one. You have to force cin into binary mode. I think to do that you'd have to close and reopen cin, and I can only see that ending badly.
Real solution is don't do this. Open the file normally with
std::ifstream in("rawdata.dat", std::fstream::binary);
Rest of the story:
I suspected that this would be some sort of gooned binary translation, so I put together a quick bit of code to see what's going on in the file.
#include <cstdint>
#include <iostream>
#include <fstream>
using namespace std;

#define BUF_LEN 512

int main()
{
    ifstream in("rawdata.dat");
    char buf[BUF_LEN];
    int16_t test;
    int count = 0;
    while (in.read((char *)&test, sizeof(test)))
    {
        cout << count++ << ":" << in.tellg() << ":" << test << endl;
        if (count == 85)
        {
            break;
        }
    }
    return 0;
}
Important output (count : position in file : value read):
0:358:0
The first value returned is actually at position 358. Not sure why.
75:508:10870
76:510:9738
77:909:8225
78:911:11948
Wooo-eee! Look at that position jump from 510 to 909. Nasty. 510 would be right around the end of the buffer, but it doesn't look like the buffer is being respected.
My understanding is that istream::read should be totally unformatted, just a dumb copy of the input stream into the provided buffer, so I have no idea why this happens. Maybe Windows is just weird.
Addendum
Thomas Matthews probably has the right idea, secret Windows control characters, but 510 is a rather innocuous comma. Why go bonkers over a comma?
This one finally solved my problem:
Read binary data from std::cin
Just add the following lines to your code if you are using MinGW (the _setmode call goes at the top of main, before any reading from stdin):
#include <io.h>
#include <fcntl.h>

_setmode(_fileno(stdin), _O_BINARY);

Read from text file and store each value in separate arrays in c++

I am trying to read the following contents:
rima doha 44881304 20 30 10 10 20 10102 10102
andrew ny 123456 12 12 13 14 15 01020 03040
and store them in separate arrays, edit them, then store again into the same file.
Here is my attempted code:
ifstream infile;
infile.open("D:\\customers.txt");
string names[100];
string addresses[100];
int tn[100];
int numCalls[100];
int trunkCalls[100];
int numMobCalls[100];
int isdnNum[100];
int dueD[100];
int paymentD[100];
int numOfPpl = 0;
int numOfPpl = 0;
for (int i = 0; i < 100; i++)
{
    infile >> names[i] >> addresses[i] >> tn[i] >> numCalls[i] >> trunkCalls[i]
           >> numMobCalls[i] >> isdnNum[i] >> dueD[i] >> paymentD[i];
    numOfPpl++;
}
//Code where some values were edited
ofstream outfile("D:\\customers.txt");
for (int i = 0; i < numOfPpl; i++)
{
    outfile << names[i] << "\t" << addresses[i] << "\t" << tn[i] << "\t" << numCalls[i] << "\t"
            << trunkCalls[i] << "\t" << numMobCalls[i] << "\t" << numMobCalls[i] << "\t"
            << isdnNum[i] << "\t" << dueD[i] << "\t" << paymentD[i] << endl;
}
outfile.close();
infile.close();
The issue is that the first two lines are stored correctly, but then there are random values in the file. How do I fix this?
There are a couple of things wrong with your code.
First, you declare numOfPpl twice. Get rid of the second declaration.
Secondly, you read 9 fields per record (names, addresses, etc.), but your text file has 10 values per line. This will throw the entire program off.
The third issue: will customers.txt always have exactly 100 lines? If not, you should check the stream before each record to make sure there is another line below. A while or do/while loop driven by the stream state would be better than a fixed count of 100. Something like:
while (infile) {
    // retrieve your data
    infile.peek();
}
or, using a for loop:
for (int i = 0; infile; i++) {
    // retrieve your data
    infile.peek();
}
would probably be a better loop. Again, if there aren't 100 lines, the fixed-count loop reads nothing into the remaining array elements, so you end up printing uninitialized values. I believe that should correct your issues.
Another thing you should watch out for is having an extra space at the end of a line. That'll throw your program output off as well.

cannot read the text file properly anymore

Alright, so basically this is my program: it reads a text file, places the values in an array, and at the end prints out everything in that array. The program ran fine and yielded the correct result a few days ago. However, it just stopped working today. For instance, the text file is
88
687
472
671
but upon completion of the program the output is 0 1073741824 0 1073741824. I don't know what is going on, and the only change I have made to my bash setup was ulimit -s unlimited. Any ideas?
#include <cmath>
#include <cstdlib>
#include <fstream>
#include <iostream>
using namespace std;

int main(int argc, char *argv[])
{
    ifstream file(argv[1]);
    int placeholder;
    int size = pow(2, atoi(argv[2]));
    int array[size];
    int index = 0;
    while (file >> placeholder)
    {
        array[index] = placeholder;
        index++;
    }
    for (int i = 0; i < size; i++)
    {
        cout << array[i] << endl;
    }
    return 0;
}
Are you sure your text file is readable by the program? If the input file does not exist, the program will still try to print 2^argv[2] entries from array, which was never filled with any elements! The program ends up dumping garbage values.
I am also not sure why you do the pow call; why not take the number of elements directly from argv[2]?
Also, you use some C functions (atoi) where you could use a C++ stringstream to do the conversion.
When I run your code with the input you provide, like this: ./a.out file.txt 2, it prints the 4 numbers as expected. When I do this instead: ./a.out does_not_exist.txt 2, it prints 4 garbage values and dumps them to the screen.