Redirect ostream to output file - c++

I am still fairly inexperienced with C++.
I have a function I'd like to call from a .cpp file; below is its header:
int wsq_encode(unsigned char* bufferRAW, int width, int height, char compressRate, std::ostream &streamWSQ);
I need to write code that opens lots of RAW images (bufferRAW) and compresses them to .wsq with this company's algorithm, taking the width, height and compression rate parameters from argv[]. The output file is supposed to go to streamWSQ.
wsq_encode is closed source, so I won't go into it. I am having trouble passing the output file to wsq_encode. The code I need to write is very simple:
#include "../inc/libcorewsq.h"
#include <fstream>
#include <iostream>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
int main (int argc, char **argv) {
    unsigned char raw[20];
    strcpy ((char*)raw, argv[1]);
    int width = atoi(argv[2]);
    int height = atoi(argv[3]);
    ostream arq;
    arq.open ("out.wsq");
    wsq_encode (raw, width, height, 5, arq);
    return 0;
}
I'm still battling with how to do this. I need to compile and run it with GCC 4.4.7 in a CentOS SSH shell.

Try using std::ofstream, where the f stands for file. std::ostream is the generic output stream base class: it cannot be default-constructed and has no open() member, which is why your code does not compile. std::ofstream derives from std::ostream, so an open std::ofstream can be passed straight to the std::ostream& parameter of wsq_encode.

The #include is irrelevant here: a header only declares the function. You failed to link in the definition of wsq_encode, so you also have to hand the company's library to GCC at link time (something like -L../lib -lcorewsq, assuming that is how the library is shipped).

Related

Write Binary file from char buffer on linux

So I am new to Linux programming in C++, and I am trying to write the contents of a binary file (.dll, .exe, etc.) to a .txt file to test and see the results of the operation. The code works and writes the .txt file with some of the binary in it, but when I open the .txt file the full binary is not written inside; as far as I know the problem is due to invalid Unicode.
Text output when opening the .txt file:
MZ\90\00\00\00\00\00
And here is the code i am using (reproducible example):
#include <algorithm>
#include <array>
#include <chrono>
#include <cstring>
#include <fstream>
#include <functional>
#include <iostream>
#include <memory>
#include <sstream>
#include <string>
#include <vector>
#include <unordered_map>
#include <unordered_set>
std::vector<char> buffer;

bool read_file(std::string name, std::vector<char>& out)
{
    std::ifstream file(name.c_str(), std::ios::binary);
    if (!file.good())
    {
        return false;
    }
    file.unsetf(std::ios::skipws);
    file.seekg(0, std::ios::end);
    const size_t size = file.tellg();
    file.seekg(0, std::ios::beg);
    out.resize(size);
    file.read(out.data(), size);
    file.close();
    return true;
}

void write_text_to_log_file(char* text)
{
    std::ofstream log_file("log_file.txt", std::ios_base::out | std::ios_base::app);
    log_file.write(text, sizeof(text));
}

int main(int argc, char* argv[])
{
    read_file("bin.dll", buffer);
    printf("Image Array: %s\r\n", buffer.data());
    printf("Image Size: %zu\r\n", buffer.size());
    write_text_to_log_file(buffer.data());
}
Any help is appreciated. I am trying to do exactly what PHP's file_get_contents does, and then write the file back out from the raw binary buffer, for example writing the raw binary back out as .dll, .exe, .png, etc.
log_file.write(text, sizeof(text));
sizeof is a compile-time constant that gives you the size of the type. text is a char *, so this gives you a grand total of 4 or 8, depending on whether you compiled a 32-bit or a 64-bit binary. It doesn't matter whether text points to just a few bytes or to the entire contents of "Harry Potter And The Deathly Hallows": this sizeof will always produce either 4 or 8, no matter what text points at.
You need to pass an additional parameter here that carries the buffer.size() of the std::vector where the data is stored, and use that as the byte count. sizeof() is not the same thing as std::vector's size() member function.

C++ - Is it possible to edit the data inside an ifstream?

I'm trying to use Boost.GIL to load an image. The read_image function takes an ifstream as a parameter, but I will already have the image as a base64-encoded string. Is it possible for me to put the decoded image into an ifstream manually, so that I won't have to write to and read from disk to get the image loaded into GIL? Another possibility could be to somehow use a string stream to add the data and cast that to an ifstream, though I haven't had luck trying that.
Boost.GIL's read_image function you mentioned accepts any std::istream, not just std::ifstream. If you have the decoded data in an array, you can use Boost.Iostreams to present the array as a stream.
Here is a made-up example, since you do not provide a code snippet:
#include <boost/iostreams/device/array.hpp>
#include <boost/iostreams/stream.hpp>
#include <boost/gil.hpp>
#include <boost/gil/io/read_image.hpp>
#include <boost/gil/extension/io/tiff.hpp>
#include <cstring>  // std::strlen

int main(int argc, char* argv[]) {
    const char* data = "decoded-data";
    boost::iostreams::stream<boost::iostreams::array_source> in{data, std::strlen(data)};
    boost::gil::rgb8_image_t img;
    read_image(in, img, boost::gil::tiff_tag());
    return 0;
}
Alternatively, you could use std::stringstream to store your decoded image and give that to the read_image function. Something along the lines of:
#include <boost/archive/iterators/binary_from_base64.hpp>
#include <boost/archive/iterators/transform_width.hpp>
#include <boost/gil.hpp>
#include <boost/gil/io/read_image.hpp>
#include <boost/gil/extension/io/tiff.hpp>
#include <algorithm>  // std::copy
#include <cstring>    // std::strlen
#include <iterator>   // std::ostream_iterator
#include <sstream>

using base64_decode_iterator = boost::archive::iterators::transform_width<
    boost::archive::iterators::binary_from_base64<const char*>, 8, 6>;

int main(int argc, char* argv[]) {
    const char* data = "base64-data";
    std::stringstream ss;
    std::copy(
        base64_decode_iterator{data},
        base64_decode_iterator{data + std::strlen(data)},
        std::ostream_iterator<char>{ss}
    );
    boost::gil::rgb8_image_t img;
    read_image(ss, img, boost::gil::tiff_tag());
    return 0;
}

Ifstream is failing to load a file and it won't open

Some of this code may seem foreign to you, since I make 3DS homebrew programs for fun, but it's essentially the same C++ with some extra calls available. I'm trying to read a file called about.txt that lives in a separate folder. I had it working when the file was in the same folder, but I lost that file, and then my partner said he wanted it in Scratch3ds-master\assets\english\text and not in Scratch3ds-master\source. Now I keep getting the error I coded in. I'm new to Stack Overflow, so this might be too much code, but here it is:
#include <fstream>
#include <string>
#include <iostream>
int main()
{
    // Initialize the services
    gfxInitDefault();
    consoleInit(GFX_TOP, NULL);

    int version_major;
    int version_minor;
    int version_patch;
    version_major = 0;
    version_minor = 0;
    version_patch = 2;

    printf("This is the placeholder for Scratch3ds\n\n");

    std::ifstream about_file;
    about_file.open("../assets/english/text/about.txt");
    if (about_file.fail())
    {
        std::cerr << "file has failed to load\n";
        exit(1);
    }
Chances are that you're using devkitPro packages, and chances are that the devkitPro team provides an equivalent of the NDS 'ARGV protocol' for 3DS programming. In which case, if you use
int main(int argc, char* argv[]);
you should have the full path to your executable in argv[0] if argc is non-zero.
https://devkitpro.org/wiki/Homebrew_Menu might help.
Your program has no a priori knowledge of what sort of arguments main() should receive, and in your question you're using a main function that receives no arguments at all.
The established convention for C/C++ is that main() receives an array of C strings (typically named argv, for argument values) and the number of valid entries in that array (typically named argc, for argument count). If you replace your original code with
#include <fstream>
#include <string>
#include <iostream>
int main(int argc, char* argv[])
{
// Initialize the services
// ... more code follows
then you're able to tell whether you received arguments by testing argc > 0, and you'll be able to read their values through argv[i].
With homebrew development it is unlikely that you can pass arguments such as --force or --directory=/boot the way typical command-line tools do, but one thing is still useful: the very first entry in argv is supposed to be the full path of the running program. So you're welcome to try
std::cerr << ((argc > 0) ? argv[0] : "<no arguments>");
and see what you get.

VCOS does not name a type

I'm trying to output video from raspicam to framebuffer 0, and I'm having an issue with BCM_HOST, where I get a ton of errors from the included vcos.h.
All the errors are of the same two types:
'VCHPRE_' does not name a type
'vcos_boot_t' has not been declared
in files connection.h, vc_ispmanx.h, message.h, etc.
I'll link a full pastebin of the errors below.
I don't even know where to begin solving these. I moved /opt/vc from Raspbian to my sysroot folder using VisualGDB's synchronize sysroot feature, and all the include files are there.
Is this a problem with the files themselves? It can't be.
Thanks for any help,
-D
Pastebin link: https://mypastebin.com/xQdN7mZZInHx
Example:
#include <stdio.h>
#include <syslog.h>
#include <fcntl.h>
#include <linux/fb.h>
#include <sys/mman.h>
#include "bcm_host.h"
using namespace std;
int main(int argc, char **argv) {
    DISPMANX_DISPLAY_HANDLE_T display;
    DISPMANX_MODEINFO_T display_info;
    DISPMANX_RESOURCE_HANDLE_T screen_resource;
    VC_IMAGE_TRANSFORM_T transform;
    uint32_t image_prt;
    VC_RECT_T rect1;
    int ret;
    int fbfd = 0;
    char *fbp = 0;
    struct fb_var_screeninfo vinfo;
    struct fb_fix_screeninfo finfo;
    return 0;
}
OK, it turns out that the VisualGDB sysroot synchronization tool causes some files to be copied with zero length. I checked vcos.h and it was empty, although on my Linux system it had data. Fixed by copying all the files over manually.

Strange behavior with c++ io

I am using zlib to compress data for a game I am making. Here is the code I have been using
#include <SFML/Graphics.hpp>
#include <Windows.h>
#include <fstream>
#include <iostream>
#include "zlib.h"
#include "zconf.h"
using namespace std;
void compress(Bytef* toWrite, int bufferSize, char* filename)
{
    uLongf comprLen = compressBound(bufferSize);
    Bytef* data = new Bytef[comprLen];
    compress(data, &comprLen, &toWrite[0], bufferSize);

    ofstream file(filename);
    file.write((char*) data, comprLen);
    file.close();
    cout << comprLen;
}

int main()
{
    const int X_BLOCKS = 1700;
    const int Y_BLOCKS = 19;
    int bufferSize = X_BLOCKS * Y_BLOCKS;
    Bytef world[X_BLOCKS][Y_BLOCKS];
    //fill world with integer values
    compress(&world[0][0], bufferSize, "Level.lvl");
    while(2);
    return EXIT_SUCCESS;
}
Now I would have expected the program to simply compress the array world and save it to a file. However, I noticed weird behavior: when I printed the value of comprLen, it was a different length than that of the created file. I couldn't understand where the extra bytes in the file were coming from.
You need to open the file in binary mode:
std::ofstream file(filename, std::ios_base::binary);
Without the std::ios_base::binary flag, the system may replace each end-of-line character (\n) by an end-of-line sequence (\r\n on Windows). Since compressed data can contain 0x0A bytes anywhere, each one gets expanded, which is where your extra bytes come from. Suppressing this conversion is the only purpose of the std::ios_base::binary flag.
Note that the conversion is made on the bytes written to the stream. That is, the number of actually written bytes will increase compared to the second argument to write(). Also note, that you need to make sure that you are using the "C" locale rather than some locale with a non-trivial code conversion facet (since you don't explicitly set the global std::locale in your code you should get the default which is the "C" locale).