I'm having trouble writing a short buffer into a wav file on my HD.
I followed a few tutorials, but they all give different ways of doing it.
Anyway, this is the way I implemented it, but for some reason it doesn't work. When I print out outfile, I get 0, and nothing is written to disk.
Also, I tried different paths, sometimes with and sometimes without a file name.
UPDATE: When I change the path to only a file name (e.g. hello.wav and not ~/Documents/hello.wav), printing out the outfile returns a memory location such as 0x2194000.
void gilbertAnalysis::writeWAV(float * buffer, int bufferSize){
SF_INFO sfinfo ;
sfinfo.channels = 1;
sfinfo.samplerate = 44100;
sfinfo.format = SF_FORMAT_WAV | SF_FORMAT_PCM_16;
const char* path = "~/Documents/hello.wav";
SNDFILE * outfile = sf_open(path, SFM_WRITE, &sfinfo);
sf_count_t count = sf_write_float(outfile, &buffer[0], bufferSize) ;
sf_write_sync(outfile);
sf_close(outfile);
}
From the libsndfile docs:
On success, the sf_open function returns a non-NULL pointer which should be passed as the first parameter to all subsequent libsndfile calls dealing with that audio file. On fail, the sf_open function returns a NULL pointer. An explanation of the error can be obtained by passing NULL to sf_strerror.
If outfile is 0x2194000 and not null, then you probably opened the file correctly.
Try using sf_strerror() to see what error you had when you provided a full file path and not just the file name.
The tilde (~) in file names is expanded to the home directory by the shell. Library functions don't do that expansion, so the path you pass them must be spelled out in full, either absolute or relative.
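For illustration, here's a minimal sketch of the same function with the tilde expanded by hand and the sf_open result checked; the getenv("HOME") lookup is my assumption about how you want to locate the Documents folder:

#include <sndfile.h>
#include <cstdio>
#include <cstdlib>
#include <string>

void writeWAV(float* buffer, int bufferSize)
{
    SF_INFO sfinfo = {};
    sfinfo.channels = 1;
    sfinfo.samplerate = 44100;
    sfinfo.format = SF_FORMAT_WAV | SF_FORMAT_PCM_16;

    // Expand "~" ourselves: the C library will not do it for us.
    const char* home = std::getenv("HOME");
    std::string path = std::string(home ? home : ".") + "/Documents/hello.wav";

    SNDFILE* outfile = sf_open(path.c_str(), SFM_WRITE, &sfinfo);
    if (outfile == NULL)
    {
        // Passing NULL reports the error from the failed sf_open call.
        std::fprintf(stderr, "sf_open failed: %s\n", sf_strerror(NULL));
        return;
    }

    sf_write_float(outfile, buffer, bufferSize);
    sf_write_sync(outfile);
    sf_close(outfile);
}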
Related
I am porting C++ code from Linux to Windows, currently using Visual Studio 2013.
I need to read a binary file and am using this portion of C++ code:
// Open the stream
std::ifstream is("myfile.bin");
// Determine the file length
is.seekg(0, std::ios_base::end);
std::size_t size=is.tellg();
is.seekg(0, std::ios_base::begin);
// Create a vector to store the data
int* Data = new int[size/sizeof(int)];
// Load the data
is.read((char*) &Data[0], size);
// Close the file
is.close();
On Linux, the size of my binary file is correctly found to be 744 MB. However, on Windows, the size is incorrectly reported as more than 4 GB. How can I correct this issue?
Change std::ifstream is("myfile.bin"); to std::ifstream is("myfile.bin", std::ios::binary);
With your current default open mode, the stream is opened in text mode. On Linux, text mode and binary mode behave identically, but on Windows text mode translates \r\n line endings to \n and treats a Ctrl-Z byte (0x1A) as end of file, so the positions the stream reports don't correspond to the raw bytes on disk.
I finally had the time to actually run this myself, though I had to fix a couple of things, like ios_base::beg instead of begin (a different name). Also, as mentioned, the array allocation should be int* Data = new int[size / sizeof(int) + 1]; // at most one extra int
I found your problem: you're not in the right directory. Check whether you actually opened the file; if you didn't, you get a huge garbage value (probably -1, but unsigned, so massive) for size.
Try this to find your current directory on Windows (you'll probably need to include Windows.h):
char dirBuf[256];
GetCurrentDirectoryA(256, dirBuf);   // the 'A' version matches the char buffer even when UNICODE is defined
cout << "Current directory is: " << dirBuf << endl;
See if that's where your file is and move it accordingly. Or specify the ENTIRE path in the constructor to ifstream.
Also, it has nothing to do with ios::binary or not. Works fine both ways, or fails if the file isn't there.
std::size_t size=is.tellg();
The standard doesn't require tellg to return the byte offset from the beginning of the file. In general, this may not be a reliable way to get the size of the file, though it probably does what you expect on Linux and Windows.
The return type of the tellg method is std::basic_stream::pos_type, so you're starting with an implicit conversion to std::size_t which may or may not be appropriate. In a 32-bit build, for example, it's conceivable that the size of a file could be larger than a std::size_t can represent.
But the root problem is that you're not checking for errors. If you have exceptions disabled, then tellg reports an error by returning pos_type(-1). When you cast that to an unsigned type (which std::size_t is), then you get a very large value. I suspect you failed to open the file, and since you didn't detect that error, the seekg and the tellg failed. You then coerced pos_type(-1) to a std::size_t, which made it look like the file was huge.
You also have the problems others have noted: failing to open the file in binary mode and computing the wrong size for the buffer when the file isn't a multiple of the size of an int.
The most reliable way to get the file size is to use the OS's API. On Windows, you can do this instead:
// Open the file. [TODO: Get the file name in wide characters and use
// CreateFileW instead. If the file name contains characters not
// representable by the user's ANSI codepage, then CreateFileA will fail.]
HANDLE hfile = CreateFileA("myfile.bin", GENERIC_READ, FILE_SHARE_READ,
nullptr, OPEN_EXISTING,
FILE_ATTRIBUTE_NORMAL | FILE_FLAG_SEQUENTIAL_SCAN,
nullptr);
if (hfile == INVALID_HANDLE_VALUE) { /* error handling here */ }
// Figure out how big it is.
LARGE_INTEGER li_size;
if (!GetFileSizeEx(hfile, &li_size)) { /* error handling here */ }
// TODO: On a 32-bit build, this won't be able to handle huge files,
// so check that here.
std::size_t size = li_size.QuadPart;
// Create a buffer to store the data, being careful to round up to a
// multiple of sizeof(int). [TODO: Use a std::vector instead.]
int* Data = new int[(size + sizeof(int) - 1) / sizeof(int)];
// Load the data.
const DWORD BytesToRead = static_cast<DWORD>(size);
DWORD BytesRead = 0;
if (!ReadFile(hfile, Data, BytesToRead, &BytesRead, nullptr) || BytesRead < BytesToRead) {
    /* error handling here */
}
// Close the file
CloseHandle(hfile);
int* Data = new int[size/sizeof(int)];
Be careful with this line: dividing by sizeof(int) truncates when the file size isn't a multiple of four, so the buffer can come out one int too small. Either round the allocation up, or simply use int* Data = new int[size] if over-allocating doesn't matter.
Also, it should be std::ifstream f("filename.bin", std::ios::binary);
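Putting the advice from these answers together, here's a minimal sketch of the corrected read: binary mode, an open check before trusting tellg, and a buffer rounded up to a whole number of ints (the myfile.bin name is carried over from the question):

#include <fstream>
#include <iostream>
#include <vector>

int main()
{
    // Open the stream in binary mode so Windows doesn't translate line endings.
    std::ifstream is("myfile.bin", std::ios::binary);
    if (!is)
    {
        std::cerr << "could not open myfile.bin\n";
        return 1;
    }

    // Determine the file length.
    is.seekg(0, std::ios_base::end);
    std::size_t size = static_cast<std::size_t>(is.tellg());
    is.seekg(0, std::ios_base::beg);

    // Round up so a trailing partial int still fits.
    std::vector<int> Data((size + sizeof(int) - 1) / sizeof(int));

    // Load the data.
    is.read(reinterpret_cast<char*>(Data.data()), size);
    return 0;
}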
Hey, I'm having trouble with file I/O. I am using some standard FILE-pointer calls, but I keep getting this error: Unhandled exception at 0x58CBC465 (msvcr120_app.dll) in ChemicalWar.exe: An invalid parameter was passed to a function that considers invalid parameters fatal.
From what I gathered, I think it has something to do with not having permission to write to the default location, but I am unsure how to change the location.
Here is the code I have so far that is giving me trouble:
FILE* ofile;
NametoBinary(_filename);
fopen_s(&ofile, (char*)folder->ToString(), "wb");
fwrite(&animhead, sizeof(Afhead), 1, ofile);
fwrite(binbuff.data(), sizeof(unsigned char), binbuff.size(), ofile);
fclose(ofile);
It breaks on the first fwrite call. Any help would be great. Thanks in advance.
I figured out the solution, so I'll post it in case anyone else needs to know.
FILE* ofile = nullptr;
NametoBinary(_filename);
auto folder = Windows::Storage::ApplicationData::Current->RoamingFolder;
std::wstring ws(folder->Path->Data());
std::string full(ws.begin(), ws.end());
full += "\\";
full += _filename;
fopen_s(&ofile, full.c_str(), "wb");
if (nullptr == ofile) return false;
fwrite(&animhead, sizeof(Afhead), 1, ofile);
The new Windows Store apps have their own file I/O model; this solution saves the file in a folder under the app's AppData folder.
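One caveat about the conversion above: std::string full(ws.begin(), ws.end()) only round-trips ASCII characters, so a roaming folder path containing anything else would be mangled. Here's a sketch that keeps the path wide the whole way (treat _wfilename as a hypothetical wide-character copy of _filename):

FILE* ofile = nullptr;
auto folder = Windows::Storage::ApplicationData::Current->RoamingFolder;

std::wstring full(folder->Path->Data());
full += L"\\";
full += _wfilename;   // hypothetical wide-character copy of _filename

// _wfopen_s takes the path and mode as wide strings, so nothing is narrowed.
if (_wfopen_s(&ofile, full.c_str(), L"wb") != 0 || ofile == nullptr)
    return false;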
I have been unable to open the file; fb.is_open() never returns true. It only works when I hard-code the data source in fb.open().
I've tried converting it to a string, char, and wstring with no effect.
What am I missing? The correct code would be fantastic, but so would an explanation.
Trying to open a file with the data source variable:
wchar_t dataSource[2048];
DWORD errNum = GetModuleFileName(NULL, dataSource, sizeof(dataSource)); //get current dir.
ifstream fb;
wcscat_s(dataSource, L".confg"); //adds ".config" to get full data Sournce
fb.open(dataSource, ios::in);
if (fb.is_open())
{
//get information
}
fb.close();
Here are some things I've tried that have not worked:
wstring x = dataSource;
x.c_str()
char* cnvFileLoc = (char*)malloc(2048);
size_t count;
count = wcstombs_s(&count, cnvFileLoc, 2048, dataSource, 2048);
What does work is:
fb.open("X:\\CPP.Lessons\\PluralSight\\PluralSight.Fundamentals\\Debug\\PluralSight.Fundamentals.exe.config", ios::in)
Your call to GetModuleFileName() is wrong. The last parameter is expressed in characters, not in bytes, and the return value tells how many characters were copied:
wchar_t dataSource[2048];
if (GetModuleFileName(NULL, dataSource, 2048) > 0)
{
...
}
Or:
wchar_t dataSource[2048];
if (GetModuleFileName(NULL, dataSource, sizeof(dataSource)/sizeof(dataSource[0])) > 0)
{
...
}
Or:
wchar_t dataSource[2048];
if (GetModuleFileName(NULL, dataSource, _countof(dataSource)) > 0)
{
...
}
Or:
wchar_t dataSource[2048];
if (GetModuleFileName(NULL, dataSource, ARRAYSIZE(dataSource)) > 0)
{
...
}
That being said, you are appending .confg to the end of the full filename. So, if your application is named myapp.exe, you are trying to open myapp.exe.confg. Is that what you really want?
If yes, then make sure the .confg file actually exists, and that your app has permission to access it. CreateFile() would offer much more useful error info than ifstream does.
Otherwise, assuming the .confg file is at least in the same folder as your app, you would have to manually remove the filename portion from the buffer and then substitute in the correct filename. Have a look at PathRemoveFileSpec() and PathCombine() for that. Or, if the file is named myapp.confg, look at PathRenameExtension().
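For reference, here's a rough sketch of that route (it needs <Shlwapi.h> and linking against Shlwapi.lib, and it assumes the file is literally named myapp.config and sits next to the executable):

wchar_t modulePath[MAX_PATH];
wchar_t configPath[MAX_PATH];
if (GetModuleFileNameW(NULL, modulePath, MAX_PATH) > 0)
{
    PathRemoveFileSpecW(modulePath);                        // strip "myapp.exe", keep the folder
    PathCombineW(configPath, modulePath, L"myapp.config");  // hypothetical file name
    std::ifstream fb(configPath);                           // MSVC's ifstream accepts wide paths
    if (fb.is_open())
    {
        //get information
    }
}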
Update: I just noticed that your code is appending .confg, but your comment says .config instead:
//wcscat_s(dataSource, L".confg");
wcscat_s(dataSource, L".config");
You may have mistyped the file extension: L".confg" instead of L".config" as stated by the comment in your code.
RESOLVED
I'm trying to make a simple file loader.
I aim to get the text from a shader file (plain text file) into a char* that I will compile later.
I've tried this function:
char* load_shader(char* pURL)
{
FILE *shaderFile;
char* pShader;
// File opening
fopen_s( &shaderFile, pURL, "r" );
if ( shaderFile == NULL )
return "FILE_ER";
// File size
fseek (shaderFile , 0 , SEEK_END);
int lSize = ftell (shaderFile);
rewind (shaderFile);
// Allocating size to store the content
pShader = (char*) malloc (sizeof(char) * lSize);
if (pShader == NULL)
{
fputs ("Memory error", stderr);
return "MEM_ER";
}
// copy the file into the buffer:
int result = fread (pShader, sizeof(char), lSize, shaderFile);
if (result != lSize)
{
// size of file 106/113
cout << "size of file " << result << "/" << lSize << endl;
fputs ("Reading error", stderr);
return "READ_ER";
}
// Terminate
fclose (shaderFile);
return 0;
}
But as you can see in the code I have a strange size difference at the end of the process which makes my function crash.
I must say I'm quite a beginner in C, so I might have missed some subtleties regarding memory allocation, types, pointers...
How can I solve this size issue?
EDIT 1:
First, I shouldn't return 0 at the end but pShader; that seemed to be what crashed the program.
Then, I changed the type of result to size_t and added an end character to pShader, adding pShader[result] = '\0'; right after its declaration so I can display it correctly.
Finally, as @JamesKanze suggested, I turned fopen_s into fopen, as the former wasn't useful in my case.
First, for this sort of raw access, you're probably better off using the system level functions: CreateFile or open, ReadFile or read and CloseHandle or close, with GetFileSize or stat to get the size. Using FILE* or std::filebuf will only introduce an additional level of buffering and processing, for no gain in your case.
As to what you are seeing: there is no guarantee that an ftell will return anything exploitable as a numeric value; it could very well be just a magic cookie. On most current systems, it is a byte offset into the physical file, but on any non-Unix system, the offset into the physical file will not map directly to the logical file you are reading unless you open the file in binary mode. If you use "rb" to open the file, you'll probably see the same values. (Theoretically, you could get extra 0's at the end of the file, but practically, the OS's where that happened are either extinct, or only used on legacy mainframes.)
EDIT:
Since the answer stating this has been deleted: you should loop on the fread until it returns 0 (setting errno to 0 before each call, and checking it after the return to see whether the function returned because of an error or because it reached the end of file). Having said this: if you're on one of the usual Windows or Unix systems, and the file is local to the machine, and not too big, fread will read it all in one go. The difference in size you are seeing (given the numerical values you posted) is almost certainly due to the fact that the two byte Windows line endings are being mapped to a single '\n' character. To avoid this, you must open in binary mode; alternatively, if you really are dealing with text (and want this mapping), you can just ignore the extra bytes in your buffer, setting the '\0' terminator after the last byte actually read.
I have a program that records data from a serial port. Every so often, I want to split up the files so that the data logs don't become very large. The problem is, after I recreate the FILE* and try to write into it, the program crashes. There are no compiler errors or warnings beforehand, either...
The program does create one log for the first time interval, but once it's time to create a new data log, it crashes at the fwrite.
First off, initializations/declarations.
char * DATA_DIR = "C:\DATA";
sprintf(path,"%s%s%s",DATA_DIR,curtime,".log"); //curtime is just the current time in a string
FILE * DATA_LOG = fopen(path, "wb+");
And later on in a while loop
if(((CURRENT_TIME-PREVIOUS_TIME) > (SEC_IN_MINUTE * MINUTE_CHUNKS) ) && (MINUTE_CHUNKS != 0) && FIRST_TIME == 0) //all this does is just checks if its time to make a new file
{
fclose(DATA_LOG); //end the current fileread
char * path;
char curtime[16];
//gets the current time and saves it to a file name
sprintf(curtime , "%s" , currentDateTime());
sprintf(path,"%s%s%s",DATA_DIR,curtime,".log");
DATA_LOG = fopen(path, "wb+"); //open the new file
//just some logic (not relevant to problem)
PREVIOUS_TIME = CURRENT_TIME;
newDirFlag = 1;
}
fwrite(cdata , sizeof(char) , numChars , DATA_LOG); //crashes here. cdata, sizeof, and numChars don't change values
Any ideas why is this happening? I'm stumped.
A couple of problems: path has no memory allocated, so you're writing to some random memory address, which is bad. You should also check the return values of fopen and fwrite for errors; if there is one, use perror so you know what the problem is. It's likely that the fopen is failing, or that you're corrupting your stack by writing to path.
Also, use snprintf; it's much safer than plain sprintf, which is vulnerable to buffer overflow.
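A minimal sketch of that fix, reusing DATA_DIR and curtime from the question (MAX_PATH comes from <Windows.h>; everything else is unchanged):

char path[MAX_PATH];
snprintf(path, sizeof(path), "%s%s%s", DATA_DIR, curtime, ".log");   // bounded, unlike sprintf

FILE * DATA_LOG = fopen(path, "wb+");
if (DATA_LOG == NULL)
{
    perror("fopen");   // says exactly why the open failed
    return;            // or whatever error handling fits the program
}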
EDIT: I just saw your comment that it's C++. Why not use std::string and fstream instead? They are much safer than what you're currently doing (and probably easier).
Your MAIN problem is that char * path; has no memory assigned to it. This means that you are writing to some RANDOM [1] location in memory.
I would suggest that you use char path[PATH_MAX]; - that way you don't have to worry about allocating and later deallocating the storage for your path.
Alternatively, you could use:
stringstream ss;
ss << DATA_DIR << currentDateTime() << ".log";
string path = ss.str();
fopen(path.c_str(), "wb+")
which is a more C++ style solution.
[1] By random, I don't mean truly a random number, but some unknown value that happens to be in that location on the stack. It is almost always NOT a good place to store a string.