C++ ofstream: Can't find output file

I have created a default DirectX 12 application (the spinning 3D cube), and within DX::DeviceResources::Present() I am trying to write the back buffer to a file:
// Present the contents of the swap chain to the screen.
void DX::DeviceResources::Present()
{
    // The first argument instructs DXGI to block until VSync, putting the application
    // to sleep until the next VSync. This ensures we don't waste any cycles rendering
    // frames that will never be displayed to the screen.
    HRESULT hr = m_swapChain->Present(1, 0);

    UINT backBufferIndex = m_swapChain->GetCurrentBackBufferIndex();
    ComPtr<ID3D12Resource> spBackBuffer;
    m_swapChain->GetBuffer(backBufferIndex, IID_PPV_ARGS(&spBackBuffer));

    // Before writing the buffer, I want to check that the file is being created.
    ofstream myfile;
    myfile.open("WHEREISTHEFILE.txt");
    myfile << "Writing this to a file.\n";
    myfile.close();

    // If the device was removed either by a disconnection or a driver upgrade, we
    // must recreate all device resources.
    if (hr == DXGI_ERROR_DEVICE_REMOVED || hr == DXGI_ERROR_DEVICE_RESET)
    {
        m_deviceRemoved = true;
    }
    else
    {
        DX::ThrowIfFailed(hr);
        MoveToNextFrame();
    }
}
The problem occurs here:
ofstream myfile;
myfile.open("WHEREISTHEFILE.txt");
myfile << "Writing this to a file.\n";
myfile.close();
I just want to write a file first (as illustrated here), before trying to write the contents of the back buffer. Now, for some reason, I cannot find the output file... I have searched everywhere: all directories in my project, and even the Microsoft DirectX SDK folder.
There are no exceptions thrown and I can step through each line while debugging without error.
Where could it be?

It should be in the directory of your project. If you are using Visual Studio, you can right-click your solution and click Open folder in File explorer.
Also, with your code as it is now, there is no way to determine whether your program is actually able to open the output file. I suggest you use something like this:
std::ofstream outputFile("./myOutput.txt");
if (outputFile.fail())
{
    std::cout << "Failed to open outputfile.\n";
}
outputFile << "I like trains.";
outputFile.close();
The first line is an initialisation that does the same as .open(). Also note the "./" in front of the path; it should not be mandatory, but it can't hurt (it refers to your current directory).
The important part of this piece of code is the .fail() check: if your program failed to open the output file, you will of course not be able to find it anywhere. Hope this helped!

Where could it be?
Usually the file location is relative to your current working directory, i.e. WHEREISTHEFILE.txt should be located in whatever directory you were in when you started the program.
You can determine that directory inside your program via GetCurrentDirectory(), and change it to something else via SetCurrentDirectory().
But you did not check whether the .open() was successful, so the writing could have failed altogether, for example due to insufficient permissions.
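For example, a minimal sketch along these lines (plain Win32 plus the file name from the question) prints the working directory and reports whether the open succeeded, so you know both where to look and whether there is anything to look for:
#include <windows.h>
#include <fstream>
#include <iostream>

int main()
{
    // Print the current working directory; relative paths such as
    // "WHEREISTHEFILE.txt" are resolved against this directory.
    char cwd[MAX_PATH] = {};
    if (GetCurrentDirectoryA(MAX_PATH, cwd) > 0)
    {
        std::cout << "Working directory: " << cwd << "\n";
    }

    // Open the file and check for failure before writing.
    std::ofstream myfile("WHEREISTHEFILE.txt");
    if (!myfile)
    {
        std::cout << "Could not open WHEREISTHEFILE.txt\n";
        return 1;
    }
    myfile << "Writing this to a file.\n";
    return 0;
}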

Related

What would cause ifstream code to fail on OS X?

I have the following code
string fileName = "assets/maps/main.json";
std::ifstream file(fileName);
std::string temp;

if (!file.good())
{
    LOG(logERROR) << "Failed to open map file: " << fileName;
    //return;
}

LOG(logDEBUG) << "file Char Count: " << file.gcount();

while (std::getline(file, temp))
{
    mapString += temp;
}
file.close();
This code works superbly on Windows 8. When I take this program over to OS X, the file fails to open 100% of the time; or, to be more precise, file.good() never returns true. I intentionally commented out the return there to help with debugging later.
Anyway, this has driven me insane. I cannot figure out why it's failing on OS X. I've tried different directories and re-created the file on OS X to make sure it wasn't an encoding or line-ending issue; nothing works.
What else can I do to debug, or what might I try as an alternative?
I've also checked the file permissions themselves and they are all fine. I have many other types of files in the same directory structure (images, music, fonts) and they all open fine, it's just this JSON file that fails, and any new derivatives of this file also fail.
When you start a program on Linux or Mac OS X, the working directory will be wherever the user happens to be. So, if your game needs to find files, you need to make use of the appropriate platform conventions. Mac has the concept of a 'bundle' that allows a program to ship with data files and find them; you'll have to learn how to make one. You can look inside all the '.app' directories in your /Applications directory for many examples.
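As a quick check (a minimal sketch, independent of the question's logging macros), you can print the working directory the process actually starts in and see whether the relative path from the question resolves against it:
#include <unistd.h>   // getcwd
#include <fstream>
#include <iostream>

int main()
{
    // Show where relative paths are resolved from on OS X/Linux.
    char cwd[1024] = {};
    if (getcwd(cwd, sizeof(cwd)) != nullptr)
    {
        std::cout << "Working directory: " << cwd << "\n";
    }

    // The relative path from the question is resolved against the
    // directory printed above, not against the executable's location.
    std::ifstream file("assets/maps/main.json");
    std::cout << "open " << (file.good() ? "succeeded" : "failed") << "\n";
    return 0;
}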

Multiple use of same file streaming object

I've written the following source code:
ifstream leggiFile;
leggiFile.open("Questions.txt", ios::in);
if (!leggiFile.good())
{
    cerr << "\n\n\n\tError during opening of file Questions.txt\n\n\n" << endl;
}
else
{
    // ...
}
leggiFile.close();
system("pause");
Now I'd like to use the same object for working with a second file.
leggiFile.open("Answers.txt",ios::in);
i=0;
if(!leggiFile.good())
{
cerr << "\n\n\n\tError during opening of file answers.txt\n\n\n" << endl;
}
else
{
// ...
}
Problem: the second time, the file cannot be opened and the error message appears. Why?
Could you please suggest a solution?
It's possible that you've done work on the stream that set one or more of the error flags, such as eofbit.
Closing the stream doesn't clear its error flags; you have to do that manually. Call leggiFile.clear(); after you close it.
Since C++11, this is done automatically by open(), though. If you're already using a C++11 compiler, your problem is elsewhere (I can't say where; you haven't shown enough code).
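A minimal sketch of reusing one ifstream for both files (file names taken from the question, reading elided); the clear() call is only needed with pre-C++11 library implementations:
#include <fstream>
#include <iostream>

int main()
{
    std::ifstream leggiFile("Questions.txt");
    if (!leggiFile.good())
        std::cerr << "Error opening Questions.txt\n";
    // ... read the questions ...
    leggiFile.close();
    leggiFile.clear();   // reset eofbit/failbit before reuse (pre-C++11)

    leggiFile.open("Answers.txt");
    if (!leggiFile.good())
        std::cerr << "Error opening Answers.txt\n";
    // ... read the answers ...
    return 0;
}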
Consider the singleton design pattern for logging, or for any case where multiple parts of the code access the same file. You can also use a mutex lock so that code waits for shared resources such as files. It is not wise to use the same file from several places simultaneously, though. A file can stay open for the whole lifecycle of the program; that is not an issue.

Multiple opening of temporary file only in same process

I have a question about FILE_ATTRIBUTE_TEMPORARY marked files.
First of all, here is what I want to do:
I have a DLL that takes a filename, opens that file internally, and reads from it. I do not know how the file is handled inside.
The file I want to give to that DLL will be created by my process. It must be a temporary file, its data must be held only in RAM, and it must not be accessible to other processes. So I use the Win32 function CreateFile() with FILE_ATTRIBUTE_TEMPORARY and FILE_FLAG_DELETE_ON_CLOSE. This works fine so far.
I have test code where I check whether the file can be accessed a second time while it is still open. Here it is:
HANDLE WINHandle = CreateFile("TempFileWIN.txt",
                              (GENERIC_WRITE | GENERIC_READ),
                              (FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE),
                              0, CREATE_ALWAYS,
                              (FILE_ATTRIBUTE_TEMPORARY | FILE_FLAG_DELETE_ON_CLOSE), 0);

ifstream ifs("TempFileWIN.txt", (ios::in | ios::trunc));
if (ifs.is_open())
{
    cout << "Success!" << endl;
}
else if (ifs.fail())
{
    cout << "Failed!" << endl;
}
I am using the fstream to test whether the file can be opened with a stream. The code above doesn't work; the output is "Failed!".
I know that the file can be opened a second time with CreateFile; I checked that. But I want to know whether it is possible for an external DLL that works with (e.g.) an fstream to open the file.
I hope you can help me with this matter.
Best regards.
Edit:
Maybe a better question is how I can lock a file to my process and ensure that it can never be accessed by another process (even if my process is killed). The file must be openable with a C++ fstream object.
If I were you, I would keep the handle of the open file, and pass it to the DLL code, and not use the filename, since you're likely to run into access restrictions at some point if you try to access a temporary, delete-on-close file using 'normal' file access.
It is possible to use a Windows handle in an fstream object, as described in this answer: https://stackoverflow.com/a/476014/393701
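Roughly, that route converts the HANDLE to a CRT descriptor with _open_osfhandle, then to a FILE* with _fdopen, and hands the FILE* to the stream. The sketch below assumes MSVC's non-standard ifstream(FILE*) constructor and mirrors the CreateFile call from the question, so treat it as an outline rather than portable code:
#include <windows.h>
#include <io.h>      // _open_osfhandle
#include <fcntl.h>   // _O_RDONLY
#include <cstdio>    // _fdopen
#include <fstream>
#include <iostream>

int main()
{
    HANDLE h = CreateFileA("TempFileWIN.txt",
                           GENERIC_WRITE | GENERIC_READ,
                           FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
                           0, CREATE_ALWAYS,
                           FILE_ATTRIBUTE_TEMPORARY | FILE_FLAG_DELETE_ON_CLOSE, 0);
    if (h == INVALID_HANDLE_VALUE)
        return 1;

    // Wrap the Win32 handle in a CRT descriptor and a FILE*.
    int fd = _open_osfhandle(reinterpret_cast<intptr_t>(h), _O_RDONLY);
    FILE* fp = _fdopen(fd, "r");

    std::ifstream ifs(fp);   // non-standard MSVC constructor taking FILE*
    std::cout << (ifs.is_open() ? "Success!" : "Failed!") << std::endl;
    return 0;
}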

Receiving a Sharing Violation Opening a File Code 32

I have been trying the following piece of code, which does not work. What I am trying to do is start my exe (a simple dialog-based application I created with VC6.0) and then, from inside this running application, modify its own contents stored on the hard drive.
So there is a running copy of the exe, and from this running copy it will load the disk copy into a buffer. Once the file is loaded into the buffer, it will search for a string. Once the string is found, it will be replaced with another string, which may not be the same size as the original.
Right now the issue is that I am not able to open the file on disk for reading/writing. GetLastError returns the following error: "ERROR_SHARING_VIOLATION The process cannot access the file because it is being used by another process."
So what I did was rename the file on disk to another name (essentially the same name except for the extension). Same error again about a sharing violation. I am not sure why I am getting this sharing-violation error code of 32. Any suggestions would be appreciated. I'll ask the second part of my question in another thread.
FILE * pFile;
pFile = fopen("Test.exe", "rb");
if (pFile != NULL)
{
    // do something like search for a string
}
else
{
    // fopen failed.
    int value = GetLastError(); // returns 32
    exit(1);
}
Read the Windows part of the File Locking wikipedia entry: you can't modify files that are currently executing.
You can rename and copy them, but you can't change them. So what you are trying to do is simply not possible. (Renaming the file doesn't unlock it at all, it's still the same file after the rename, so still not modifiable.)
You could copy your executable, modify that copy, and then run the copy, though.
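A minimal sketch of that approach (file names are illustrative; the search-and-replace itself is elided): copy the running executable first, then open the copy for read/write, which no longer hits the sharing violation:
#include <windows.h>
#include <cstdio>

int main()
{
    // Copy the on-disk image; the copy is not the running executable,
    // so it is not locked.
    if (!CopyFileA("Test.exe", "Test_patched.exe", FALSE))
    {
        printf("CopyFile failed, error %lu\n", GetLastError());
        return 1;
    }

    // Open the copy for reading and writing in binary mode.
    FILE* pFile = fopen("Test_patched.exe", "r+b");
    if (pFile == NULL)
    {
        printf("fopen failed\n");
        return 1;
    }
    // ... load into a buffer, search for the string, patch it ...
    fclose(pFile);
    return 0;
}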

Copying contents of one file to another in C++

I am using the following program to try to copy the contents of a file, src, to another, dest, in C++. The simplified code is given below:
#include <fstream>
using namespace std;

int main()
{
    fstream src("c:\\tplat\test\\secClassMf19.txt", fstream::binary);
    ofstream dest("c:\\tplat\\test\\mf19b.txt", fstream::trunc | fstream::binary);
    dest << src.rdbuf();
    return 0;
}
When I built and executed the program using the Code::Blocks IDE with the GCC compiler on Windows, a new file named "...mf19b.txt" was created, but no data was copied into it and its size was 0 KB. I am positive I have some data in "...secClassMf19.txt".
I experience the same problem when I compile the same program on Windows with Visual C++ 2008.
Can anyone please help explain why I am getting this unexpected behaviour, and more importantly, how to solve the problem?
You need to check whether opening the files actually succeeds before using those streams. Also, it never hurts to check if everything went right afterwards. Change your code to this and report back:
#include <cstdlib>
#include <fstream>
#include <iostream>

int main()
{
    std::fstream src("c:\\tplat\test\\secClassMf19.txt", std::ios::binary);
    if (!src.good())
    {
        std::cerr << "error opening input file\n";
        std::exit(1);
    }

    std::ofstream dest("c:\\tplat\\test\\mf19b.txt", std::ios::trunc | std::ios::binary);
    if (!dest.good())
    {
        std::cerr << "error opening output file\n";
        std::exit(2);
    }

    dest << src.rdbuf();

    if (!src.eof())
        std::cerr << "reading from file failed\n";
    if (!dest.good())
        std::cerr << "writing to file failed\n";
    return 0;
}
I bet you will report that one of the first two checks hits.
If opening the input file fails, try opening it using std::ios::in|std::ios::binary instead of just std::ios::binary.
Do you have any reason not to use the CopyFile function?
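For illustration, a minimal sketch of that suggestion (paths taken from the question) might look like this:
#include <windows.h>
#include <iostream>

int main()
{
    // Let the OS copy the file in a single call.
    // FALSE = overwrite the destination if it already exists.
    if (!CopyFileA("c:\\tplat\\test\\secClassMf19.txt",
                   "c:\\tplat\\test\\mf19b.txt",
                   FALSE))
    {
        std::cerr << "CopyFile failed, error " << GetLastError() << "\n";
        return 1;
    }
    return 0;
}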
As it is written, your src instance is a regular fstream, and you are not specifying an open mode for input. The simple solution is to make src an instance of ifstream, and your code works. (Just by adding one byte!)
If you had tested the input stream (as sbi suggests), you would have found that it was not opened correctly, which is why your destination file was of zero size. It was opened in write mode (since it was an ofstream) with the truncation option to make it zero, but writing the result of rdbuf() simply failed, with nothing written.
Another thing to note is that while this works fine for small files, it would be very inefficient for large files. As is, you are reading the entire contents of the source file into memory, then writing it out again in one big block. This wastes a lot of memory. You are better off reading in chunks (say 1MB for example, a reasonable size for a disk cache) and writing a chunk at a time, with the last one being the remainder of the size. To determine the source's size, you can seek to the end and query the file offset, then you know how many bytes you are processing.
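A sketch of that chunked approach (the 1 MB chunk size and the paths are illustrative):
#include <fstream>
#include <vector>

int main()
{
    std::ifstream src("c:\\tplat\\test\\secClassMf19.txt", std::ios::binary);
    std::ofstream dest("c:\\tplat\\test\\mf19b.txt", std::ios::trunc | std::ios::binary);
    if (!src.good() || !dest.good())
        return 1;

    std::vector<char> buffer(1024 * 1024);            // 1 MB per chunk
    while (src)
    {
        src.read(buffer.data(), buffer.size());       // read up to one chunk
        std::streamsize got = src.gcount();           // bytes actually read (last chunk may be short)
        if (got > 0)
            dest.write(buffer.data(), got);
    }
    return dest.good() ? 0 : 2;
}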
And you will probably find your OS is even more efficient at copying files if you use the native APIs, but then it becomes less portable. You may want to look at the Boost filesystem module for a portable solution.
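And a sketch of the Boost.Filesystem route mentioned above (assuming Boost.Filesystem is available and linked; the overwrite option name varies slightly between Boost versions):
#include <boost/filesystem.hpp>
#include <iostream>

int main()
{
    boost::system::error_code ec;
    // Portable, library-provided copy; errors are reported via ec instead of exceptions.
    boost::filesystem::copy_file("c:\\tplat\\test\\secClassMf19.txt",
                                 "c:\\tplat\\test\\mf19b.txt",
                                 boost::filesystem::copy_option::overwrite_if_exists,
                                 ec);
    if (ec)
    {
        std::cerr << "copy_file failed: " << ec.message() << "\n";
        return 1;
    }
    return 0;
}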