Problems with destroying an array of libtrace_out_t* (C++)

The task is to read packets from one tracer and write to many.
I use libtrace_out_t** for output tracers.
Initialization:
uint16_t size = 10;
libtrace_out_t** outTracers_ = new libtrace_out_t*[size];
for (uint16_t i = 0; i < size; ++i) {
    outTracers_[i] = trace_create_output(uri); // created OK
    trace_start_output(outTracers_[i]);        // started OK
}
// writing packets
Creating, starting and writing packets using the elements of the tracer array all work fine.
The problem is caused by trace_destroy_output() when I destroy the output tracers in a loop:
for (uint16_t i = 0; i < size; ++i) {
    if (outTracers_[i])
        trace_destroy_output(outTracers_[i]);
}
On the first iteration the output tracer is destroyed fine.
But on the second it fails with a segmentation fault in
pcap_close(pcap_t* p)
because the pointer p has the value 0x0.
Can someone explain why this happens, or how to destroy these properly?

From the code that you have posted, it looks like you are creating 10 output traces all using the same URI. So, essentially, you've created 10 output files all with the same filename, which probably isn't what you intended.
When it comes time to destroy the output traces, the first destroy closes the file matching the name you provided and sets the reference to that file to NULL. Because the reference is now NULL, any subsequent attempt to destroy that file causes a segmentation fault.
Make sure you use a different URI for each new output trace you create and that should fix the problem.
Example:
/* I prefer pcapfile: over pcap: */
const char *base = "pcapfile:output";
uint16_t size = 10;
libtrace_out_t* array[size];
for (uint16_t i = 0; i < size; ++i) {
    char myuri[1024];
    /* First output file will be called output-0.pcap
     * Second output file will be called output-1.pcap
     * And so on...
     */
    snprintf(myuri, sizeof(myuri), "%s-%u.pcap", base, i);
    array[i] = trace_create_output(myuri);
    /* TODO Check for errors here */
    if (trace_start_output(array[i])) {
        /* TODO Handle error case */
    }
}
One other hint: libtrace already includes a tool called tracesplit which takes an input source and splits the packets into multiple output traces based on certain criteria (e.g. number of packets, size of output file, time interval). This tool may already do what you want without having to write code or at least it will act as a good example when writing your own code.

I think you have an out-of-bounds access in your code:
uint16_t size = 5; // number of tracers
for (uint16_t i = 0; i <= size; ++i)
{
    if (outTracers_[i])
        trace_destroy_output(outTracers_[i]);
}
translates to
for (uint16_t i = 0; i <= 5; ++i)
{
    ...
}
and outTracers_[5] is not a valid element of your array.

Related

For loop in C++ stops after a single iteration w/ pointer variable

So first of all, I'll preface this with: I just started using C++.
I have a structure whose pointer I store in an unordered_map, setting the struct's members' values as I get them through my process. Once I no longer need them in a map, I transfer them to a vector and loop through them.
On the second loop iteration, it outputs my index (1), but the next statement, which makes a local pointer variable for the struct at that index, breaks it and the code terminates without any errors. Since there are no errors, a try/catch doesn't give me anything either.
// Wanted to create a structure to handle the objects easier instead
// of multiple vectors for each property
struct appData {
    std::string appid = "";
    std::string name = "";
    std::string vdf_file = "";
    std::string vdf_path = "";
};
// Relevant parts of my main()
int main() {
    // Map that stores all the struct pointers
    std::unordered_map<std::string, appData*> appDatas;
    char memory[sizeof(appData)];
    void* p = memory;
    // New instance of appData
    appData *tempAppData = new(p) appData();
    tempAppData->appid = "86901";
    // Add tempAppData to map with string key
    appDatas["86901"] = tempAppData;
    ...
    std::vector<appData*> unhashed_appDatas;
    for (auto const& pair: appDatas) {
        unhashed_appDatas.push_back(pair.second);
    }
    ...
    for (unsigned int x = 0; x < unhashed_appDatas.size(); x++) {
        // Output index to see where it was messing up
        std::cout << x << std::endl;
        // !! This is where the issue happens on the second loop (see output)
        appData *thisAppData = unhashed_appDatas[x];
        std::string id = thisAppData->appid;
        std::cout << id << std::endl;
        /* ...
           Do more stuff below
        */
    }
    ...
    return 0;
}
Terminal Output:
0 // Initial index of x
86901 // Id of first item
1 // New index of x on second loop before pointer var is created
// Nothing more is printed and execution terminates with no errors
My knowledge of C++ is pretty lacking (I started it a couple of days ago), so I've tried the few things within my knowledge: moving the *thisAppData variable outside of the loop, using a for(var: vector) { ... }, and a while loop. I can only assume that the issue lies with the pointer and the local variable inside the loop.
Any help/input about how I could better approach this, or whether there's an issue with my code, would be appreciated :)
Edit: Changed the code to use .size() instead of sizeof() per @Jarod42's answer, though the main issue persists.
Edit2: Turns out it was my own mess-up, imagine that. My 4 AM brain wasn't working too well. I posted an answer explaining what I did incorrectly. Thanks to everyone who helped me.
sizeof is the wrong tool here:
for (unsigned int x = 0; x < sizeof(unhashed_appDatas); x++) {
// ^^ wrong: gives the **static** size of the vector object itself
// (mainly 3 members: data, capacity, size; so something like `3*sizeof(void*)`)
it should be
for (unsigned int x = 0; x < unhashed_appDatas.size(); x++) {
After many hours of trial and error I have determined the issue (aside from doing things in ways I shouldn't have, which I've since corrected): it was something I messed up that caused this problem.
TLDR:
Items I assumed existed didn't, so I tried to read files with a blank path and parse contents that didn't exist.
Explanation:
In the first loop, the data I was getting was a list of files from a directory, which I then used to parse a json-like file that contained these file names and the properties associated with them. However, the file list contained entries that weren't in this other data file (since I had no check for whether they existed), so it would break there.
Additionally, in the last loop I would get a member from a struct that should be the path of a file to read, but it would be blank (unset) because it didn't exist in the data file, so std::ifstream file(path); would break it.
I've since implemented checks for each key and value to ensure it no longer breaks because of that.
Fixes:
Here are some fixes that were mentioned which I added to the code; they did help it work correctly in the end, even if they weren't the main issue that I caused myself:
// Thanks to @EOF:
// No longer "using placement new on a buffer with automatic storage duration"
// (whatever that means haha); it was changed from:
char memory[sizeof(appData)];
void* p = memory;
appData *tempAppData = new(p) appData();
// To:
appData *tempAppData = new appData();
// Thanks to @Jarod42:
// The last for loop's limit expression was corrected from:
for (unsigned int x = 0; x < sizeof(unhashed_appDatas); x++) {
}
// To:
for (unsigned int x = 0; x < unhashed_appDatas.size(); x++) {
}
// I am still using a map, despite a comment noting to just use vectors
// (which I could have, but I just prefer using maps):
std::unordered_map<std::string, appData*> appDatas;
// Instead of doing something like this (which would arguably have been easier):
std::vector<std::string> dataKeys = { "1234" };
std::vector<appData*> appDatas = { ... };
auto it = std::find(dataKeys.begin(), dataKeys.end(), "1234");
if (it == dataKeys.end()) continue;
auto dataItem = appDatas[std::distance(dataKeys.begin(), it)];
I appreciate everyone's assistance with my code

Why did the variable values change after entering the string?

When a new line is entered for processing via cin.getline, the values of variables completely unrelated to that line change. Why might this be happening?
for (int i = 0; i < n; i++) {
    if (i == 0) {
        std::cin.getline(command, maxLen);
    }
---> std::cin.getline(command, maxLen);
    processCommand(command);
}
Heap.MinHeap.elem[1]->value = 7 -----changes into----> Heap.MinHeap.elem[1]->value = 2067606717
UPD.
A char string is entered, for example "extract_min", which should output the minimum from the heap. But execution has not yet reached the place where that is performed; at that moment the values of the variables in the heap change. This does not depend on the size of the string or the size of the heap (in my case I store pointers to elements); I tried changing them, but the problem is still the same.

Why this error? - Segmentation fault (core dumped)

My code compiles just fine. I am running and compiling it on another server that I connect to. When I run it, I get an error that says: Segmentation fault (core dumped). It runs perfectly when I compile and run it locally on my Mac, just not when I use levi (the virtual machine we use to submit our files). What do I do to stop getting this error message so my code will run? Here is my code:
//
// ChaseGraingerSection6.cpp
//
// Created by Chase Grainger on 3/19/18.
//
// I typed all of this code on my own and did
// not copy any code from any outside sources.
#include <iostream>
#include <fstream>
int main() {
    const int my_dimension = 10;        // dimension of 'my_array'
    std::string my_array[my_dimension]; // array of a fixed number of strings
    int x = 0;               // used to add lines of text from 'word.txt' to 'my_array'
    int y = 0;               // used when reversing array values
    int num_of_lines = 0;    // keeps track of # of lines in text file[s]
    std::string text;        // used when reading lines from 'word.txt'
    std::string my_reversed_array[num_of_lines]; // reversed order array
    std::ofstream outData;   // output stream 'outData'
    std::ifstream inData;    // input stream 'inData'
    inData.open("word.txt");            // opens input stream
    while (getline(inData, text)) {     // runs through each line in 'word.txt'
        my_array[x] = text;             // sets index value of array to line in text file
        num_of_lines += 1;
        x += 1;
    }
    inData.close();                     // closes input stream
    // at this point, my_array has the text needed
    outData.open("chase.txt");
    for (x = num_of_lines - 1; x >= 0; x--) { // assigns values in reverse order from one array to another
        my_reversed_array[x] = my_array[y];
        y += 1;
    }
    for (x = 0; x <= num_of_lines - 1; x++) {
        outData << my_reversed_array[x] << std::endl;
    }
    outData.close();
}
int num_of_lines = 0;
std::string my_reversed_array[num_of_lines];
This is not actually valid C++, but on a compiler supporting variable-length arrays as an extension this creates an array of zero size.
Now, whichever compiler you use, this array does not later magically change size if you change num_of_lines; that ship has sailed.
So, whenever you write to my_reversed_array, you are writing to memory that does not belong to you.
You are going to need dynamic allocation (or a std::vector) so that you can create an array with runtime bounds — and don't do it until you know what those bounds are.
To answer your next question (why this didn't crash on your Mac), you just got [un]lucky. The program is not required to crash; its behaviour is undefined. It could instead have summoned a brilliant genius to answer your question on Stack Overflow. Oh… wait… ;)

Reading/writing binary file returns 0xCCCCCCCC

I have a script that dumps class info into a binary file, then another script that retrieves it.
Since binary files only accept chars, I wrote three functions for reading and writing Short Ints, Ints, and Floats. I've been experimenting with them, so they're not overloaded properly, but they all look like this:
void writeInt(ofstream& file, int val) {
    file.write(reinterpret_cast<char *>(&val), sizeof(val));
}
int readInt(ifstream& file) {
    int val;
    file.read(reinterpret_cast<char *>(&val), sizeof(val));
    return val;
}
I'll put the class load/save script at the end of the post, but I don't think it'll make too much sense without the rest of the class info.
Anyway, it seems that the file gets saved properly. It has the correct size, and all of the data matches when I load it. However, at some point in the load process, the file.read() function starts returning 0xCCCCCCCC every time. This looks to me like a read error, but I'm not sure why, or how to correct it. Since the file is the correct size, and I don't touch the seekg() function, it doesn't seem likely that it's reaching the end of file prematurely. I can only assume it's an issue with my read/write method, since I did kind of hack it together with limited knowledge. However, if this is the case, why does it read all the data up to a certain point without issue?
The error starts occurring at a random point each run. This may or may not be related to the fact that all the class data is randomly generated.
Does anyone have experience with this? I'm not even sure how to continue debugging it at this point.
Anyway, here are the load/save functions:
void saveToFile(string fileName) {
    ofstream dataFile(fileName.c_str());
    writeInt(dataFile, inputSize);
    writeInt(dataFile, fullSize);
    writeInt(dataFile, outputSize);
    // Skips input nodes - no data needs to be saved for them.
    for (int i = inputSize; i < fullSize; i++) { // Saves each node after inputSize
        writeShortInt(dataFile, nodes[i].size);
        writeShortInt(dataFile, nodes[i].skip);
        writeFloat(dataFile, nodes[i].value);
        //vector<int> connects;
        //vector<float> weights;
        for (int j = 0; j < nodes[i].size; j++) {
            writeInt(dataFile, nodes[i].connects[j]);
            writeFloat(dataFile, nodes[i].weights[j]);
        }
    }
    read(500);
}
void loadFromFile(string fileName) {
    ifstream dataFile(fileName.c_str());
    inputSize = readInt(dataFile);
    fullSize = readInt(dataFile);
    outputSize = readInt(dataFile);
    nodes.resize(fullSize);
    for (int i = 0; i < inputSize; i++) {
        nodes[i].setSize(0); // Sets input nodes
    }
    for (int i = inputSize; i < fullSize; i++) { // Loads each node after inputSize
        int s = readShortInt(dataFile);
        nodes[i].setSize(s);
        nodes[i].skip = readShortInt(dataFile);
        nodes[i].value = readFloat(dataFile);
        //vector<int> connects;
        //vector<float> weights;
        for (int j = 0; j < nodes[i].size; j++) {
            nodes[i].connects[j] = readInt(dataFile); // Error occurs in a random instance of this call of readInt().
            nodes[i].weights[j] = readFloat(dataFile);
        }
        read(i); // Outputs class data to console
    }
    read(500);
}
Thanks in advance!
You have to check the result of the open, read, and write operations.
And you need to open the files (for both reading and writing) in binary mode.

Recursive call segmentation fault issue

quick question again.
I'm creating a recursive function that looks for elements in an array of "source" rules and applies those rules to a "target" array of rules whenever the source rule's type matches the target character. The function also checks whether the target character is already in an array of symbols, adding it if not (and setting a few flags on the newly applied rule as well). This is all driven by a recursive call that uses a counter to track how many iterations have passed; the counter determines the spot in the target array where the new rule should be applied, so we don't overwrite anything.
I've put in a little debugging code to show the results too.
Here's the function itself:
//Recursively tack on any non terminal pointed elements
int recursiveTack(rule * inrule[], char target, rule * targetrule[],
                  int counter, char symbols[])
{
    printf("Got into recursiveTack\n");
    printf("target is %c\n", target);
    printf("counter is %d", counter);
    for (int k = 0; k < sizeof(inrule); k++)
    {
        if (inrule[k]->type == target)
        {
            //doublecheck to see if we're trying to overwrite
            if (targetrule[counter]->used = true)
            {
                counter++;
            }
            targetrule[counter]->head = inrule[k]->head;
            targetrule[counter]->type = inrule[k]->type;
            targetrule[counter]->used = true;
            //Check to see if the elements are new to the symbols table and need to be added
            if (!contains(returnGotoChar(targetrule[counter]), symbols))
            {
                //If not then add the new symbol
                addChar(returnGotoChar(targetrule[counter]), symbols);
                //Also set the goto status of the rule
                targetrule[counter]->needsGoto = true;
                //Also set the rule's currentGotoChar
                targetrule[counter]->currentGotoChar = returnGotoChar(targetrule[counter]);
            }
            counter++;
            //recursivly add elements from non terminal nodes
            if (isNonTerm(targetrule[counter]))
            {
                char newTarget = returnGotoChar(targetrule[counter]);
                counter = recursiveTack(inrule, newTarget, targetrule, counter, symbols);
            }
        }
    }
    //return how many elements we've added
    return counter;
}
Here's the call:
if (isNonTerm(I[i+first][second]))
{
    printf("Confirmed non termainal\n");
    printf("Second being passed: %d\n", second);
    //Adds each nonterminal rule to the rules for the I[i+first] array
    second = recursiveTack(I[i], targetSymbol, I[i+first], second, symbols[first]);
}
All the arrays being passed in have been initialized prior to this point.
However, the output I get indicates that the recursion is getting killed somewhere before it gets off the ground.
Output:
Second being passed: 0
Confirmed non termainal
Got into recursiveTack
target is E
Segmentation fault
Any help would be great. I've got the rest of the program available too if need be; it's around 700 lines including comments, though. I'm pretty sure this is just another case of missing something simple, but let me know what you think.
for(int k = 0; k < sizeof(inrule); k++)
sizeof(inrule) is going to return the size of a pointer type (4 or 8). Probably not what you want. You need to pass the size of the arrays as parameters as well, if you are going to use these types of structures.
It would be better to use Standard Library containers like std::vector, though.
if (targetrule[counter]->used = true) {
    counter++;
}
// What is the guarantee that targetrule[counter] is actually valid?
// Could you do a printf debug before and after it?
The biggest thing I see here is:
for(int k = 0; k < sizeof(inrule); k++)
This isn't going to do what you think. inrule is an array of pointers, so sizeof(inrule) is going to be the number of elements * sizeof(rule*). This could very quickly lead to running off the end of your array.
try changing that to:
for (int k = 0; k < sizeof(inrule) / sizeof(rule*); ++k)
Something else you might consider is an fflush(stdout); after your print statements. You're crashing while some output is still buffered so it's likely hiding where your crash is happening.
EDIT:
That won't work. If you had a function that did something like:
int x[10];
for (int i = 0; i < sizeof(x) / sizeof(int); ++i) ...
It would work, but on the other side of the function call, the type degrades to int*, and sizeof(int*) is not the same as sizeof(int[10]). You either need to pass the size, or ... better yet, use vectors instead of arrays.