How to load a WAV file into an array in C++?

Hey, I have a dynamic array and I want to load the data of my WAV file into it. I already wrote the beginning, but I can't figure out how to load the file into my dynamic array. Can somebody help me further with this code?
#include <cstdio>    // FILE, fopen, fclose
#include <iostream>
using namespace std;

template <typename T>
class Array {
public:
    int size;
    T *arr;
    Array(int s) {
        size = s;
        arr = new T[size];
    }
    T& operator[](int index) {
        if (index >= size)
            resize(index + 1);
        return arr[index];
    }
    void resize(int newSize) {
        T* newArray = new T[newSize];
        for (int i = 0; i < size; i++) {
            newArray[i] = arr[i];
        }
        delete[] arr;
        arr = newArray;
        size = newSize;
    }
};

int main() {
    Array<char> wavArray(10);
    FILE *inputFile = fopen("song.wav", "rb");
    // how do I read the file's bytes into wavArray?
    fclose(inputFile);
    return 0;
}

If you just want to load the complete file into memory, this may come in handy:
#include <istream>
#include <iterator>
#include <vector>

// a function to load everything from an istream into a std::vector<char>
std::vector<char> load_from_stream(std::istream& is) {
    return {std::istreambuf_iterator<char>(is), std::istreambuf_iterator<char>()};
}
... and use the C++ file streaming classes to open and automatically close files.
{
    // open the file
    std::ifstream is(file, std::ios::binary);
    // check if it's opened
    if (is) {
        // call the function to load all from the stream
        auto content = load_from_stream(is);
        // print what we got (works on text files)
        std::copy(content.begin(), content.end(),
                  std::ostream_iterator<char>(std::cout));
    } else {
        std::cerr << "failed opening " << file << "\n";
    }
}
... but a WAV file contains a lot of different chunks describing the contents of the file so you may want to create individual classes for streaming these chunks to and from files.

char* readFileBytes(const char *name)
{
    FILE *fl = fopen(name, "rb");  // "rb": binary mode, or sizes lie on Windows
    if (!fl) return NULL;
    fseek(fl, 0, SEEK_END);
    long len = ftell(fl);
    char *ret = (char *)malloc(len);  // the cast is required in C++
    fseek(fl, 0, SEEK_SET);
    fread(ret, 1, len, fl);
    fclose(fl);
    return ret;
}

Related

Saving data to an array of pointers when reading from a file

I do not understand why my array of pointers is only saving the last line from the file that I am reading. When I substitute a string literal into the setData() function, the code works just fine. All that the "mann" file contains is a bunch of words ordered alphabetically. Thank you.
#include <iostream>
#include <fstream>
using namespace std;

class orignialData {
    char* data;
public:
    void setData(char* s) { data = s; }
    char* getData() const { return data; }
};

class dataClass {
    orignialData** W_;
public:
    dataClass(char* filename);
    void addData();
    void viewAll();
};

dataClass::dataClass(char* filename) {
    fstream file;
    file.open(filename, ios::in);
    if (file.fail()) {
        cout << "There was an error reading the file...\n";
    }
    W_ = 0;
    W_ = new orignialData*[5];
    for (int i = 0; i < 5; i++)
        W_[i] = new orignialData;
    char buff[30];
    char* temp;
    while (file >> buff) {
        cout << buff << endl;
        static int i = 0;
        W_[i]->setData(buff);
        i++;
    }
    file.close();
}
Instead of data = s, write data = strdup(s) to make a copy of the contents. Otherwise, you will assign the same pointer again and again, and you will overwrite the contents of the memory to which this pointer points again and again. At the end, your temporary buffer will contain the last line of your file, and all the pointers will point to exactly this buffer. That's what you are observing...

Put the file into an array in C++

I'm trying to read a txt file and put it into a char array. But can I read different files which contain different numbers of characters and put them into an array? Can I create a dynamic array to hold an unknown number of characters?
You can read a file of unknown size into a dynamic data structure like std::vector.
Alternatively, you can use new to allocate dynamic memory; however, vectors are more convenient, at least to me :).
#include <vector>
#include <iostream>
#include <fstream>
#include <string>

int main(int argc, char **argv)
{
    std::vector<std::string> content;
    if (argc != 2)
    {
        std::cout << "bad argument" << std::endl;
        return 0;
    }
    std::string file_name(argv[1]);
    std::ifstream file(file_name);
    if (!file)
    {
        std::cout << "can't open file" << std::endl;
        return 0;
    }
    std::string line = "";
    while (std::getline(file, line))
    {
        content.push_back(line);
        line = "";
    }
    for (std::vector<std::string>::iterator it = content.begin(); it != content.end(); ++it)
        std::cout << *it << std::endl;
}
Here is a solution using std::vector and std::string.
The program takes a file name as its first parameter, opens it, and reads it line by line;
each line is written into the vector.
Then you can display your vector as I did at the end of the function.
EDIT: because C++11 is the new standard, the program uses C++11, so you have to compile it with C++11 support (g++ -std=c++11 if you use g++).
I just tested it and it works perfectly.
There may be library routines available which give you the size of a file without reading its contents. In that case you could get the size, allocate a full-sized buffer, and suck in the whole file at once (if your buffer is a simple char array, don't forget to allocate one extra byte and put in the trailing null character).
The best way is to use malloc(), realloc(), and free(), just as in an old C program. If you use a std::vector you may choke when approaching maximum RAM, because realloc() can grow and shrink a block in place (growing in place is contingent on the heap, while shrinking is guaranteed to work), which std::vector cannot do.
In particular:
#include <cstdlib>
#include <iostream>
#include <new>
#include <tuple>

// TODO perhaps you want an auto-management class for this tuple.
// I'm not about to type up the whole auto_ptr, shared_ptr, etc.
// Mostly you don't do this enough to have to worry too hard.
std::tuple<char *, size_t> getarray(std::istream &input)
{
    size_t fsize = 0;      // bytes read so far
    size_t asize = 8192;   // bytes currently allocated
    bool doubling = true;  // double the buffer until that overflows or fails
    char *buf = static_cast<char *>(std::malloc(asize));
    if (!buf) throw std::bad_alloc();
    for (;;) {
        try {
            input.read(buf + fsize, static_cast<std::streamsize>(asize - fsize));
        } catch (...) {
            std::free(buf);
            throw;
        }
        size_t got = static_cast<size_t>(input.gcount());
        if (got == 0) {
            // End of stream: shrink to the exact size and hand it back.
            char *btmp = static_cast<char *>(std::realloc(buf, fsize ? fsize : 1));
            if (btmp) buf = btmp;
            return std::tuple<char *, size_t>(buf, fsize);
        }
        fsize += got;
        if (fsize == asize) {
            // Buffer is full: double while we can, then grow linearly.
            if (doubling && (asize << 1) == 0)
                doubling = false;  // doubling would overflow size_t
            size_t newsize = doubling ? asize << 1 : asize + 8192;
            if (newsize < asize) {
                std::free(buf);
                // Or perhaps something suitable.
                throw "File too big to fit in size_t";
            }
            char *btmp = static_cast<char *>(std::realloc(buf, newsize));
            if (!btmp && doubling) {
                // Doubling failed: fall back to smaller linear increments.
                doubling = false;
                newsize = asize + 8192;
                btmp = static_cast<char *>(std::realloc(buf, newsize));
            }
            if (!btmp) {
                std::free(buf);
                throw std::bad_alloc();
            }
            buf = btmp;
            asize = newsize;
        }
    }
}

Reading a binary file in C++ with stringstream

I want to write/read data from a file. Is it possible to divide the file (inside the code) into multiple strings/sections? Or to read data until a specific line?
Something like: "read the data until line 32, put it inside a string, read the next 32 lines and put them into another string".
I already know how to read and find data with seekp, but I don't really like it because my code always gets too long.
I already found some code, but I don't understand how it works:
dataset_t* DDS::readFile(std::string filename)
{
    dataset_t* dataset = NULL;
    std::stringstream ss;
    std::ifstream fs;
    uint8_t tmp_c;
    try
    {
        fs.open(filename.c_str(), std::ifstream::in);
        if (!fs)
        {
            std::cout << "File not found: " << filename << std::endl;
            return NULL;
        }
        while (fs.good())
        {
            fs.read((char*)&tmp_c, 1);
            if (fs.good()) ss.write((char*)&tmp_c, 1);
        }
        fs.close();
        dataset = new dataset_t();
        const uint32_t bufferSize = 32;
        char* buffer = new char[bufferSize];
        uint32_t count = 1;
        while (ss.good())
        {
            ss.getline(buffer, bufferSize);
            dataitem_t dataitem;
            dataitem.identifier = buffer;
            dataitem.count = count;
            dataset->push_back(dataitem);
            count++;
        }
        return dataset;
    }
    catch (const std::exception &e)
    {
        cdelete(dataset);
        return NULL;
    }
}
The code edits a binary save file.
Also, can someone link me a website where I can learn more about buffers and stringstreams?
You could create some classes to model your requirement: a take<N> for 'grab 32 lines', and a lines_from to iterate over lines.
Your lines_from class would take any std::istream: something encoded, something zipped, ... as long as it gives you a series of characters. The take<N> would convert that into array<string, N> chunks.
Here's a snippet that illustrates it:
int main() {
    auto lines = lines_from{std::cin};
    while (lines.good()) {
        auto chunk = take<3>(lines);
        std::cout << chunk[0][0] << chunk[1][0] << chunk[2][0] << std::endl;
    }
}
And here are the supporting classes and functions:
#include <array>
#include <iostream>
#include <string>

class lines_from {
public:
    std::istream &in;
    using value_type = std::string;
    std::string operator*() {
        std::string line;
        std::getline(in, line);
        return line;
    }
    bool good() const {
        return in.good();
    }
};

template<int N, class T>
auto take(T &range) {
    std::array<typename T::value_type, N> value;
    for (auto &e : value) { e = *range; }
    return value;
}
(demo on cpp.sh)

Extracting data from compressed file returns random data

I'm using libzip to extract the content of each file in a zip into my own data structure, a C++ immutable POD.
The problem is that every time I extract the content of a file, I get some random data tacked onto the end. Here's my code:
void Parser::populateFileMetadata() {
    int error = 0;
    zip *zip = zip_open(this->file_path.c_str(), 0, &error);
    if (zip == nullptr) {
        LOG(DEBUG) << "Could not open zip file.";
        return;
    }
    const zip_int64_t n_entries = zip_get_num_entries(zip, ZIP_FL_UNCHANGED);
    for (zip_int64_t i = 0; i < n_entries; i++) {
        const char *file_name = zip_get_name(zip, i, ZIP_FL_ENC_GUESS);
        struct zip_stat st;
        zip_stat_init(&st);
        zip_stat(zip, file_name, (ZIP_FL_NOCASE | ZIP_FL_UNCHANGED), &st);
        char *content = new char[st.size];
        zip_file *file = zip_fopen(zip, file_name,
                                   (ZIP_FL_NOCASE | ZIP_FL_UNCHANGED));
        const zip_int64_t did_read = zip_fread(file, content, st.size);
        if (did_read <= 0) {
            LOG(WARNING) << "Could not read contents of " << file_name << ".";
            continue;
        }
        const FileMetadata metadata(string(file_name), -1, string(content));
        this->file_metadata.push_back(metadata);
        zip_fclose(file);
        delete[] content;
    }
    zip_close(zip);
}
You're constructing a std::string from content without telling the constructor how long it is, so the constructor is going to read from the start of the buffer until it finds a terminating NUL. But there's no guarantee that the file contains one, and so the constructor reads past the end of your buffer until it happens to find a NUL.
Fix: use the two-argument std::string constructor (string(const char* s, size_t size)) and pass it the data length.
zip_fread seems to increase the size of content, so I just truncate it: content[st.size] = '\0';
@ruipacheco's solution did not work for me. Doing content[st.size] = '\0'; fixed the garbage but caused a "double free or corruption" error when calling zip_fclose() and/or delete[] content, which makes sense: content[st.size] writes one byte past the end of the new char[st.size] allocation. So I did the below and it seems to work:
void ReadZip(std::string &data) {
    ....
    ....
    data.resize(st.size);
    for (uint i = 0; i < st.size; ++i)
        data[i] = content[i];
}

C++: Template fstream.write doesn't work

I'm having a problem working on a RandomAccessFile class in C++, which should allow writing & reading any primitive datatype to/from a file. However, even though the code compiles and executes, nothing is written to the file.
The file is opened in the constructor:
RandomAccessFile::RandomAccessFile(const string& fileName) : m_fileName(fileName) {
    // try to open file for reading and writing
    m_file.open(fileName.c_str(), ios::in | ios::out | ios::binary);
    if (!m_file) {
        // file doesn't exist
        m_file.clear();
        // create new file
        m_file.open(fileName.c_str(), ios::out | ios::binary);
        m_file.close();
        // try to open file for reading and writing
        m_file.open(fileName.c_str(), ios::in | ios::out | ios::binary);
        if (!m_file) {
            // setstate(), not setf(): setf() changes format flags, not the error state
            m_file.setstate(ios::failbit);
        }
    }
}
Test call of function in main:
RandomAccessFile raf("C:\Temp\Vec.txt");
char c = 'c';
raf.write(c);
Write function:
template<class T>
void RandomAccessFile::write(const T& data, streampos pos) {
if (m_file.fail()) {
throw new IOException("Could not open file");
}
if (pos > 0) {
m_file.seekp(pos);
}
else {
m_file.seekp(0);
}
streamsize dataTypeSize = sizeof(T);
char *buffer = new char[dataTypeSize];
for (int i = 0; i < dataTypeSize; i++) {
buffer[dataTypeSize - 1 - i] = (data >> (i * 8));
}
m_file.write(buffer, dataTypeSize);
delete[] buffer;
}
If I debug it, I can clearly see that 'c' is in the buffer when it's written to the file.
Any suggestions?
Thanks
Phil
Solved it myself with a little luck.
The path string was the culprit, not absolute paths as such: in "C:\Temp\Vec.txt" the backslashes start escape sequences, so the file name that reaches open() is mangled. Escape them ("C:\\Temp\\Vec.txt") or use forward slashes ("C:/Temp/Vec.txt").
Replacing the path with just "temp" also solved it, because a relative path like that contains no backslashes.