How to convert string to a FILE* opened for read [closed] - c++

I am writing a C++ application which needs to call a library function that expects a FILE* opened for reading. I have no code for that function - it is a black box to me.
The data that I need to pass to the function is in memory, in a char* buffer of known size. Ideally, I would wrap my buffer in a FILE* structure and pass it in, but I need all the stdio functions that normally work on a FILE* to work on whatever I pass in, because I don't know which of them the function calls - definitely fread, possibly fseek too.
The code is rather performance-sensitive, so I would like to avoid writing the buffer to disk just so that I could create a FILE* from it with fopen. Is there a way to make a FILE* that reads from my buffer in memory?
Something like stringstream in C++?
My code needs to be able to run both on Windows and Linux.
I've seen quite a few questions where people try to write via a FILE* to memory. My case is rather the reverse - I need to present an existing buffer as a readable FILE*.
Thanks a lot!
P.S. Yes, "Using a C string like a FILE*" is exactly what I'm asking about; somehow I couldn't find it earlier...
Still, if you could suggest a solution on Windows, that would be very helpful!

POSIX requires the fmemopen() function, which does exactly what you ask.
Glibc certainly has it.
Can't say for sure about Windows, but it appears that MinGW includes newlib, which also implements it, if that's any help.
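A minimal sketch of what the call looks like on a POSIX system with glibc (the buffer contents here are just an illustration):

#include <cstdio>
#include <cstring>

int main() {
    const char* data = "hello from memory\n";      // the existing in-memory buffer
    // Open the buffer for reading; nothing is written to disk.
    FILE* fp = fmemopen(const_cast<char*>(data), std::strlen(data), "r");
    if (!fp) return 1;
    char line[64];
    if (std::fgets(line, sizeof line, fp))          // fread, fseek etc. also work on fp
        std::printf("read back: %s", line);
    std::fclose(fp);
}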

Related

What happens if I use close() on a char array? [closed]

I had an interesting question. Say I have:
char buf[100];
and I decided to try calling close(buf).
The code compiles and the program works. But is there any point in using close() like this?
Thank you.
Assuming "close" is the function from posix, most likely nothing will happen but it's also possible stuff could break badly.
Arrays in c and c++ decay to pointers, close takes an int. Implicitly converting a pointer to an int is not allowed by the c++ spec but some compilers allow it anyway (doing some testing it looks like modern g++ only allows it if -fpermissive is specified).
Most likely the integer that results from said conversion will be large, file descripters are usually small, so most likely close will just return a bad file descriptor error and do nothing but if it does happen to match a file descriptor then things could get interesing.....
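To illustrate the point, a small sketch (the cast is written out explicitly here, since the implicit conversion won't compile on a conforming compiler):

#include <cerrno>
#include <cstdint>
#include <cstdio>
#include <unistd.h>

int main() {
    char buf[100];
    // close() takes an int file descriptor; forcing the pointer through a cast
    // just yields some large integer that almost certainly isn't an open fd.
    int fd = static_cast<int>(reinterpret_cast<std::intptr_t>(buf));
    if (close(fd) == -1 && errno == EBADF)
        std::puts("close() rejected the bogus descriptor (EBADF), as expected");
}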
It should not compile. The compiler should emit warnings.
The behaviour is undefined.
No, there is no point in using close on a char array.
There is also no meaning in doing that. What would you want to achieve?

Are there any issues reading directly into an std::string? [closed]

I've seen in code a direct read into std::string where the contents are intended to be interpreted as a string as follows:
std::string note;
note.resize(n);
read( &note[0], n );
Assume that n is of a fixed size, as in a parsing scenario.
Are there any issues with reading directly into a string? I have seen a lot of uses of ifstream, but it seems excessive in this case.
First, if it's a text file made up of several lines, I don't find std::string a good choice as a "container"; I would prefer a plain std::vector<char>, or, if you want to do some additional parsing and break the file into its individual lines, a std::vector<std::string>.
I'd also pay attention to the encoding used by the file: is it UTF-8? Is it some other char-based encoding?
For example, if the file is UTF-16, reading it as a raw sequence of bytes into a std::string would be very misleading (and bug-prone).
Moreover, it's important also to pay attention to the size of the file. If you have a gigantic text file (e.g. 5GB) and you are building a 32-bit Windows application, your code won't work (as 32-bit processes on Windows are limited to 2GB by default). In such cases, reading the file content in smaller chunks (or using memory-mapped file techniques with smaller "views" on the file) may be a better alternative.
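That said, reading directly into a std::string via resize() and writing through &note[0] is valid, since the string's storage is contiguous. A minimal sketch of the whole-file case, assuming the file fits comfortably in memory (the file name and helper are just illustrative):

#include <fstream>
#include <stdexcept>
#include <string>

std::string read_file(const std::string& path) {
    std::ifstream in(path, std::ios::binary);
    if (!in) throw std::runtime_error("cannot open " + path);
    in.seekg(0, std::ios::end);
    std::string contents;
    contents.resize(static_cast<std::size_t>(in.tellg()));
    in.seekg(0, std::ios::beg);
    // Writing through &contents[0] (or contents.data() since C++17) is fine
    // because the string's storage is contiguous.
    in.read(&contents[0], static_cast<std::streamsize>(contents.size()));
    return contents;
}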
Look at it this way: “What's the worst that could happen?”
Are you obtaining a file from the local user? And if they supply a file that's too big, perhaps their machine will thrash, or even kill your program with an out-of-memory error?
Do you expect the user to do that often enough to worry about?
Alternatively:
Are you obtaining the file from a network source or untrusted user? Would giving that user the ability to potentially thrash your system or kill your application constitute a risk?

Default exit function implementation [closed]

I need to implement the default behavior of the exit call. I don't know what I should do or what the most suitable way to do it is. I have read that it should close file descriptors, among other things.
Should I close the default streams (stdout, stderr and stdin)?
How do I exit from nested function calls? Using goto is bad practice, so what is the best way to break out?
Thanks.
Do all of the things listed in exit(3), then invoke the _exit(2) system call. Alternatively, use longjmp(3) to jump back up to the main() function, then return from it. This invokes the same behavior as calling exit(3), and is just as dependent on the C runtime, so if exit(3) is unavailable for some reason, returning from main() will probably not work correctly either.
Unfortunately, AFAIK there is no portable way to enumerate all of the functions which may have been registered with atexit(3) and on_exit(3), so you'll have to keep track of those manually (i.e. every time you call atexit(3) or on_exit(3), append the function pointer to a list). Flushing stdio(3) is a straightforward matter of a few fflush(3) calls (or fflush(NULL) to flush all open output streams).
You do not need to close any streams or file descriptors; the OS should do that automatically (the OS must not leak streams and fd's, so it is responsible for cleaning them up).
NB: longjmp() is almost always wrong under C++; throw an exception instead. It generally should only be used under straight C.
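A rough sketch of the approach, assuming you maintain the handler list yourself as described above (my_atexit and my_exit are made-up names):

#include <cstdio>
#include <unistd.h>
#include <vector>

static std::vector<void (*)()> g_handlers;           // hypothetical handler registry

void my_atexit(void (*fn)()) { g_handlers.push_back(fn); }

[[noreturn]] void my_exit(int status) {
    // Run handlers in reverse order of registration, as exit() does.
    for (auto it = g_handlers.rbegin(); it != g_handlers.rend(); ++it)
        (*it)();
    std::fflush(nullptr);                             // flush all open output streams
    _exit(status);                                    // bypass the C runtime's own cleanup
}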

Changing behaviour of class in c++ via other class [closed]

We all know that we have the ifstream and ofstream classes with their own functionality: reading, writing, line-by-line reading, etc.
ifstream input_file("test.in") ;
ofstream output_file;
output_file.open("test.out");
So let's now assume we want to extend the reading/writing functionality by creating some class (let's call it BackUp) and somehow make ifstream and ofstream use BackUp's reading/writing instead (only). As far as I understood, this principle is called acquaintance?
The main difference should be that when a file is opened, a copy of it is created somewhere. Then we work with the original file as usual and overwrite it with our results, and only if the procedure was successful is the temporary copy deleted.
Yes, I know it's probably better just to write and use a common function, but that is not my goal.
I have also made some schemes (Before/After diagrams, not reproduced here) to illustrate the logic.
Goal
I want the target file to be overwritten only if the entire operation is successful, so that if the program crashes, someone accidentally turns off the PC, etc., the data from the original file is still preserved somewhere.
Question itself
I need to grasp how this can be done in the general case, but it would be nice if you could use this particular example to explain.
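One common way to get the "overwrite only on success" behavior described above is to write to a temporary file and rename it over the target at the end. A rough sketch, where the class and member names are made up (note that std::rename may refuse to replace an existing file on Windows, so a platform-specific replace call may be needed there):

#include <cstdio>
#include <fstream>
#include <string>

class BackUp {
public:
    explicit BackUp(const std::string& path)
        : target_(path), temp_(path + ".tmp"), out_(temp_, std::ios::binary) {}

    std::ofstream& stream() { return out_; }   // write results through this stream

    // Call commit() only after all writes succeeded; until then the original
    // target file is left untouched.
    bool commit() {
        out_.close();
        if (out_.fail()) return false;
        return std::rename(temp_.c_str(), target_.c_str()) == 0;
    }

private:
    std::string target_, temp_;
    std::ofstream out_;
};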

Rewrite txt file into other encoding [closed]

I have a problem. I want to rewrite a txt file into another txt file, but with a different encoding. I must implement conversion to Unicode, ISO-8859, and Windows-1250.
I must write it in C++.
Can anyone help me with this topic? How do I start coding this?
Best regards!
Windows is perfectly capable of doing string conversions for you. Read data from the source file and pass it to MultiByteToWideChar specifying the source codepage, then pass that output to WideCharToMultiByte specifying the target codepage and write that output to the target file.
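A minimal sketch of that two-step conversion, assuming (for the sake of example) Windows-1250 input and UTF-8 output; the function name is made up:

#include <windows.h>
#include <string>

std::string convert_1250_to_utf8(const std::string& input) {
    // Step 1: source codepage (1250) -> UTF-16.
    int wlen = MultiByteToWideChar(1250, 0, input.data(), (int)input.size(), nullptr, 0);
    std::wstring wide(wlen, L'\0');
    MultiByteToWideChar(1250, 0, input.data(), (int)input.size(), &wide[0], wlen);

    // Step 2: UTF-16 -> target codepage (UTF-8).
    int len = WideCharToMultiByte(CP_UTF8, 0, wide.data(), wlen, nullptr, 0, nullptr, nullptr);
    std::string out(len, '\0');
    WideCharToMultiByte(CP_UTF8, 0, wide.data(), wlen, &out[0], len, nullptr, nullptr);
    return out;
}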
BTW, next time state up front that you're working on Windows only. Don't put useful information like that in a comment.
I would start by getting good in-depth knowledge of these encoding formats, then create some encoding conversion tables and convert byte by byte. Also, it sounds like you're going to be dealing with different operating systems, so keep an eye out for endianness.
Here's a good link to get you started: Encoding for Programmers.
EDIT #1: Here is another link that goes a little more in depth on the subject of character encoding in Windows. There you can find functions and macros that can help you build your application.