Can I use jsoncpp with dynamic allocation? - c++

I'm running into jsoncpp issues with memory corruption.
When I assign values to a local Json::Value variable, it sometimes ends up with wrong data and crashes.
So I'm trying to create the Json::Value with dynamic allocation instead and check for memory corruption more carefully.
Anyway, my question is: can I use jsoncpp with dynamic allocation? And do you think it is safer than before?
Sorry for my English, and thank you!

JsonCpp can get tricky when you want to manage references to values in the tree yourself, or when you want to refer to the internals of the values.
The following code shows three ways it can be used safely, as well as two samples of common pitfalls. Hope it helps a bit.
Note that when dealing with dynamic allocation, smart pointers are often very handy and reduce the risk of memory leaks or memory corruption caused by allocating/deleting objects at the wrong point. See, for example, std::shared_ptr.
#include <iostream>
#include <memory>
#include <json/json.h> // jsoncpp header (include path may vary by installation)

Json::Value createJsonValue() {
    Json::Value json("a string value");
    return json; // OK - returns a copy
}

Json::Value *createJsonValueReference() {
    Json::Value *json_dynamic = new Json::Value("a string value");
    return json_dynamic; // OK - no copy; keeps the Json::Value on the heap
}

std::shared_ptr<Json::Value> createJsonValueSmartPointer() {
    std::shared_ptr<Json::Value> result(new Json::Value("a string value"));
    return result; // OK - creates a Json::Value on the heap and wraps it in a shared_ptr
}

Json::Value &referenceToLocalJson() {
    Json::Value json("a string value");
    return json; // Not OK: returns a reference to stack memory of local variable `json`
}

const char *getJsonValueContent() {
    Json::Value json("a string value");
    return json.asCString(); // critical: pointer into an object that is about to be destroyed
}

int main() {
    Json::Value copied = createJsonValue();        // a copy; lives until the end of main
    Json::Value *ptr = createJsonValueReference(); // points to a heap object; lives until you call `delete ptr`
    std::shared_ptr<Json::Value> smartptr = createJsonValueSmartPointer(); // the shared_ptr lives until the end of main; the referenced object lives until the last shared_ptr pointing to it is destroyed
    Json::Value &critical = referenceToLocalJson(); // critical: refers to an object already destroyed when referenceToLocalJson returned
    const char *content = getJsonValueContent();    // critical: points into an object that has been destroyed

    std::cout << "copied: " << copied << std::endl;
    std::cout << "reference: " << *ptr << std::endl;
    std::cout << "smartptr: " << *smartptr << std::endl;
    // std::cout << "critical: " << critical << std::endl; // undefined behaviour
    // std::cout << "content: " << content << std::endl;   // undefined behaviour
    delete ptr; // OK - frees the object ptr refers to
    // smartptr deletes the Json::Value it refers to at the end of this function; no explicit delete needed
    return 0;
}
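For completeness: since C++11 you can also create the shared_ptr with std::make_shared, which allocates the Json::Value and the shared_ptr control block in one step. A minimal sketch (the function name is mine; the jsoncpp include path may vary by installation):
#include <memory>
#include <json/json.h>

std::shared_ptr<Json::Value> createJsonValueMakeShared() {
    // single allocation for both the object and the control block
    return std::make_shared<Json::Value>("a string value");
}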

Related

How to create an instance of a class on heap memory

A class type called "Pair" has already been defined for you. You need to write a function called pairFactory that creates an instance of Pair on the heap. Do not create the object on the stack. Then, your function needs to return a pointer to that created object.
I have written the code for pairFactory. It seems to run, but I get an InfraError.
Please help me find my mistake.
Also, I need to create the object in heap memory.
#include <iostream>

// This class Pair has already been defined for you.
// (You may not change this definition.)
class Pair {
public:
    int first, second;
    void check() {
        first = 5;
        std::cout << "Congratulations! The check() method of the Pair class \n has executed. (But, this isn't enough to guarantee \n that your code is correct.)" << std::endl;
    }
};

Pair *pairFactory() {
    //
    Pair P;
    P.check();
    // (You can use as many lines as you want.)
    return &P;
}

// Your function should be able to satisfy the tests below. You should try
// some other things to convince yourself. If you have a bug in this problem,
// the usual symptom is that after you submit, the grader will crash with a
// system error. :-)
int main() {
    Pair *p;
    p = new pairFactory();
    // This function call should work without crashing:
    p->check();
    // Deallocating the heap memory. (Assuming it was made on the heap!)
    delete p;
    std::cout << "If you can see this text, the system hasn't crashed yet!" << std::endl;
    return 0;
}
You've got it backwards. Your factory needs to allocate on the heap. What you're doing is returning the address of a function-local object that doesn't exist anymore.
Pair *pairFactory() {
    return new Pair;
}
And then in your main function:
Pair *p = pairFactory();
You return a pointer to a local variable here:
Pair *pairFactory() {
    Pair P;
    P.check();
    return &P; // Dangling pointer
}
So you have a dangling pointer once you leave the function.
You have to call new:
Pair *pairFactory()
{
    return new Pair{};
}
main may look like:
int main() {
    Pair* p = pairFactory();
    // This function call should work without crashing:
    p->check();
    // Deallocating the heap memory. (Assuming it was made on the heap!)
    delete p;
    std::cout << "If you can see this text, the system hasn't crashed yet!" << std::endl;
}
Better, use a smart pointer so you don't have to manage the memory yourself (std::make_unique lives in <memory> and requires C++14):
std::unique_ptr<Pair> pairFactory()
{
    return std::make_unique<Pair>();
}
int main() {
    auto p = pairFactory();
    p->check();
    std::cout << "If you can see this text, the system hasn't crashed yet!" << std::endl;
}
You did exactly what you were told not to do:
Pair *pairFactory() {
    Pair P; // <----- creates an instance of Pair on the stack
    …
}
The intention of this exercise was likely to test your knowledge of the new operator. See https://en.cppreference.com/w/cpp/memory/new/operator_new
The function
Pair *pairFactory() {
    return &P;
}
returns a pointer to memory on the local stack, which is destroyed and invalid as soon as the function returns to main().
The issue with your code is that it returns a pointer to a local variable: P only exists until pairFactory returns, because it is stored on the stack, not in heap memory. Using the new keyword allocates the memory on the heap and returns a pointer to the value.
Pair *pairFactory() {
    Pair P;
    P.check();
    return &P; // dangling: P is destroyed when the function returns
}

Unexpected runtime error (segmentation fault)

I am currently optimizing a memory-intensive application to occupy less memory. What I am trying to do in the following code is to allocate the file stream objects ifstream and ofstream dynamically, in order to free them exactly when they are no longer needed. The code works for the allocation/de-allocation of the ofstream, but crashes at runtime, with a possible segmentation fault, when the ifstream is de-allocated. The following is a snippet of the original code:
#include <fstream>
#include <iostream>
#include <cstdlib> // rand(), exit()
using namespace std;

// Dummy class to emulate the issue at hand
class dummy {
private:
    int randINT;
    static bool isSeeded;
public:
    dummy() { randINT = rand(); }
    int getVal() { return randINT; }
};
bool dummy::isSeeded = false;

int main(int argc, const char* argv[]) {
    // Binary file I/O starts here
    dummy * obj;
    ofstream * outputFile;
    ifstream * inputFile;
    outputFile = new ofstream("bFile.bin", ios::binary);
    if (!(*outputFile).fail()) {
        obj = new dummy;
        cout << "Value to be stored: " << (*obj).getVal() << "\n";
        (*outputFile).write((char *) obj, sizeof(*obj)); // Save object to file
        (*outputFile).close();
        delete obj;
        // don't assign NULL to obj; obj MUST retain the address of the previous object it pointed to
    } else {
        cout << "Error in opening bFile.bin for writing data.\n";
        exit(1);
    }
    delete outputFile; // This line throws no errors!
    inputFile = new ifstream("bFile.bin", ios::binary);
    if (!(*inputFile).fail()) {
        (*inputFile).read((char *) obj, sizeof(dummy)); // Read the object of type 'dummy' from the binary file into the address pointed to by 'obj', i.e. the address of the previously de-allocated object
        cout << "Stored Value: " << (*obj).getVal() << "\n";
        (*inputFile).close();
    } else {
        cout << "Error in opening bFile.bin for reading data.\n";
        exit(1);
    }
    delete inputFile; // Runtime error is thrown here for no reason!
    cout << "\n-----END OF PROGRAM-----\n";
}
The above code saves an object to a binary file when the user invokes the save function. As stated in the code above, the de-allocation of the inputFile pointer throws a runtime error.
Please note that I am using clang (more specifically, clang++) to compile the project.
Thanks in advance.
std::basic_istream::read(char_type* address, streamsize count) reads up to count characters and places them at address. It does not create any objects at address, as you seem to believe. The address must point to a valid chunk of memory at least count*sizeof(char_type) bytes in size. By passing the address of a deleted object, you violate that condition: your code is broken, and a segmentation fault is not unexpected.
EDIT
What you're doing is undefined behaviour, UB for short. When you do that, nothing is guaranteed, and any reasoning about what happens in which order is invalid. The program is allowed to do anything, including immediately crashing, running for a while and then crashing, or "making demons fly out of your nose".
In your case, I suspect that std::basic_istream::read() writing to unprotected memory causes some data to be overwritten, for example the address of another object, which later causes the segmentation fault. But this is pure speculation and not really worth pursuing.
In your case no object is created. The binary file just contains some bytes, and read() copies them to the address provided, which was not reserved for that purpose. To avoid the UB, simply add
obj = new dummy;
before the read to create an object.
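Applied to the reading half of the code from the question, the fix looks roughly like this (a sketch; only the relevant lines are shown):
inputFile = new ifstream("bFile.bin", ios::binary);
if (!(*inputFile).fail()) {
    obj = new dummy; // re-allocate a live object before reading into it
    (*inputFile).read((char *) obj, sizeof(dummy)); // now writes into valid memory
    cout << "Stored Value: " << (*obj).getVal() << "\n";
    (*inputFile).close();
    delete obj; // this time, delete only after the last use
}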
If you want to re-use the memory from the previous object, you could use placement new (see points 9 and 10 of that link). For example
char* buffer = nullptr;                       // pointer to potential memory buffer
if (writing) {
    if (!buffer)
        buffer = new char[sizeof(dummy)];     // reserve memory buffer
    auto pobj = new(buffer) dummy{};          // create object in buffer
    write(buffer, sizeof(dummy));             // write bytes of object
    pobj->~dummy();                           // destruct object, but don't free buffer
}
if (reading) {
    if (!buffer)
        buffer = new char[sizeof(dummy)];     // not required if writing earlier
    read(buffer, sizeof(dummy));              // read bytes into buffer
    auto pobj = reinterpret_cast<dummy*>(buffer); // no guarantees here
    use(pobj);                                // do something with the object read
    pobj->~dummy();                           // destruct object
}
delete[] buffer;                              // free reserved memory
Note that if the reading does not produce a valid object, any later use of that object, including the call to its destructor, may crash.
However, all this micro-optimisation is pointless anyway (it's only worthwhile if you can avoid many calls to new and delete). Don't waste your time on it.

C++ temporary objects when inserted into a vector

I'm trying to make this code work, but the objects keep getting destroyed...
I've found that it has to do with the objects being copied into the vector, but I can't find any way to prevent it...
#include <iostream>
#include <string>
#include <vector>
using namespace std;

class Obje
{
private:
    static int instances;
    int id;
public:
    static int getInstances();
    void getId();
    virtual void myClass();
    Obje(int auxId);
    ~Obje();
};

int Obje::instances = 0;

int Obje::getInstances()
{
    return instances;
}

Obje::Obje(int auxId)
{
    this->id = auxId;
    cout << "Obje Created." << endl;
    Obje::instances++;
}

Obje::~Obje()
{
    cout << "Obje Destroyed." << endl;
    Obje::instances--;
}

void Obje::myClass()
{
    cout << "Obje." << endl;
}

void Obje::getId()
{
    cout << this->id << endl;
}

int main()
{
    vector<Obje> list;
    Obje *a = new Obje(59565);
    list.push_back(*a);
    Obje *b = new Obje(15485);
    list.push_back(*b);
    for (vector<Obje>::iterator it = list.begin(); it != list.end(); ++it)
    {
        it->getId();
    }
    return 0;
}
It generates this output:
Obje Created.
Obje Created.
Obje Destroyed.
59565
15485
Obje Destroyed.
Obje Destroyed.
What does the T(const T& other) I've seen suggested as a fix for this mean?
First of all, it is bad practice to allocate an object on the heap without using smart pointers and then forget to delete it, especially when you are creating it just to make a copy of it.
list.push_back(*a); creates a copy of *a in the vector. To create an item in the vector without copying another item, you can use list.emplace_back(/*constructor parameters*/);, which is available from C++11 (see http://en.cppreference.com/w/cpp/container/vector/emplace_back).
So, to make the resulting behavior match your expectations, you should go
vector<Obje> vec;
vec.emplace_back(59565);
vec.emplace_back(15485);
for (auto & item : vec) // note: not const, since getId() is not a const member function
{
    item.getId();
}
By the way, it is also quite bad practice to name a vector list, as std::list is a different container type, and reading such code can be confusing. I may be getting pedantic now, but it would also be better to call the method getId something like showId, since it returns nothing.
Regarding the use of the heap, new and pointers, see my comment on your question.
Regarding the issue that the object was destroyed: the vector maintains an internal buffer in which it stores its objects. When you push_back a new object and the internal buffer is full, the vector will (leaving aside what happens if an exception occurs):
allocate a new internal buffer big enough to store its new data,
move the data from the old internal buffer to the new internal buffer,
destroy the old buffer.
Hence your objects get destroyed and copied to a new location in this case; adding a copy constructor will make this clearer to you.
P.S.: AFAIK, some implementations move the data with memmove (for trivially copyable types) or std::move.
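To make those copies visible, you could give the Obje class from the question a logging copy constructor (a sketch; the "Obje Copied." message is my own addition):
// Add to the class definition:
Obje(const Obje &other)
{
    this->id = other.id;
    cout << "Obje Copied." << endl;
    Obje::instances++; // keep the instance count consistent with the destructor
}
With this in place, the original main() prints one "Obje Copied." per push_back, plus additional ones whenever the vector reallocates its internal buffer.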

Can someone explain exactly what happens if an exception is thrown during the process of allocating an array of objects on the heap?

I defined a class foo as follows:
#include <iostream>
using namespace std;

class foo {
private:
    static int objcnt;
public:
    foo() {
        if (objcnt == 8)
            throw outOfMemory("No more space!");
        else
            objcnt++;
    }
    class outOfMemory {
    public:
        outOfMemory(const char* msg) { cout << msg << endl; }
    };
    ~foo() { cout << "Deleting foo." << endl; objcnt--; }
};
int foo::objcnt = 0;
And here's the main function:
int main() {
    try {
        foo* p = new foo[3];
        cout << "p in try " << p << endl;
        foo* q = new foo[7];
    } catch(foo::outOfMemory& o) {
        cout << "Out-of-memory Exception Caught." << endl;
    }
}
It is obvious that the line "foo* q = new foo[7];" only creates 5 objects successfully (objcnt is already 3), and on the 6th object an out-of-memory exception is thrown. But it turns out that there are only 5 destructor calls, and the destructor is not called for the array of 3 objects that p points to. So I am wondering why. How come the program only calls the destructor for those 5 objects?
The "atomic" C++ allocation and construction functions are correct and exception-safe: If new T; throws, nothing leaks, and if new T[N] throws anywhere along the way, everything that's already been constructed is destroyed. So nothing to worry there.
Now a digression:
What you always must worry about is using more than one new expression in any single unit of responsibility. Basically, you have to consider any new expression as a hot potato that needs to be absorbed by a fully-constructed, responsible guardian object.
Consider new and new[] strictly as library building blocks: You will never use them in high-level user code (perhaps with the exception of a single new in a constructor), and only inside library classes.
To wit:
// BAD:
A * p = new A;
B * q = new B; // Ouch -- *p may leak if this throws!

// Good:
std::unique_ptr<A> p(new A);
std::unique_ptr<B> q(new B);      // who cares if this throws
std::unique_ptr<C[]> r(new C[3]); // ditto (note the array form, unique_ptr<C[]>)
As another aside: The standard library containers implement a similar behaviour: If you say resize(N) (growing), and an exception occurs during any of the constructions, then all of the already-constructed elements are destroyed. That is, resize(N) either grows the container to the specified size or not at all. (E.g. in GCC 4.6, see the implementation of _M_fill_insert() in bits/vector.tcc for a library version of exception-checked range construction.)
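To illustrate that all-or-nothing behaviour, here is a minimal sketch of my own, with a constructor that throws on the fifth construction:
#include <iostream>
#include <stdexcept>
#include <vector>

struct Flaky {
    static int made;
    Flaky() {
        if (++made == 5) { // the 5th construction fails
            --made;
            throw std::runtime_error("construction failed");
        }
    }
};
int Flaky::made = 0;

int main() {
    std::vector<Flaky> v;
    v.resize(3);     // fine: constructs three elements
    try {
        v.resize(7); // throws while constructing the new elements
    } catch (const std::runtime_error&) {
        // the already-constructed new elements were destroyed;
        // the vector still holds its original three
        std::cout << "resize threw; size is still " << v.size() << "\n";
    }
}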
Destructors are only called for fully constructed objects, i.e. objects whose constructors completed normally. That only happens automatically if an exception is thrown while new[] is in progress. So in your example the destructors are run for the five objects fully constructed during q = new foo[7].
Since the new[] for the array that p points to completed successfully, that array has been handed over to your code and the C++ runtime doesn't care about it anymore: no destructors will be run unless you do delete[] p.
You get the behavior you expect when you declare the arrays on the stack:
int main()
{
    try
    {
        foo p[3];
        cout << "p in try " << p << endl;
        foo q[7];
    }
    catch(foo::outOfMemory& o)
    {
        cout << "Out-of-memory Exception Caught." << endl;
    }
}
In your code only the pointers were local automatic variables, and pointers don't have any associated cleanup when the stack is unwound. As others have pointed out, this is why you generally do not keep raw pointers in C++ code; they are usually wrapped inside a class object that uses the constructor/destructor to control their lifespan (smart pointer/container).
As a side note, it is usually better to use std::vector than raw arrays (in C++11, std::array is also useful if you have a fixed-size array). This is because the stack has a limited size and these objects put the bulk of the data on the heap. The extra methods provided by these classes make them much nicer to handle in the rest of your code, and if you absolutely must have an old-style array pointer to pass to a C function, one is easy to obtain.
int main()
{
    try
    {
        std::vector<foo> p(3);
        cout << "p in try, size " << p.size() << endl; // a vector cannot be streamed directly
        std::vector<foo> q(7);
        // Now you can pass p/q to functions much more easily.
    }
    catch(foo::outOfMemory& o)
    {
        cout << "Out-of-memory Exception Caught." << endl;
    }
}

C++ basics, vectors, destructors

I'm a little confused about the best practice for how to do this. Say I have a class that, for example, allocates some memory. I want it to self-destruct like an automatic (stack) variable, but also to put it in a vector, for some reason unknown.
#include <iostream>
#include <vector>

class Test {
public:
    Test();
    Test(int a);
    virtual ~Test();
    int counter;
    Test * otherTest;
};

volatile int count = 0;

Test::Test(int a) {
    count++;
    counter = count;
    std::cout << counter << "Got constructed!\n";
    otherTest = new Test();
    otherTest->counter = 999;
}

Test::Test() {
    count++;
    counter = count;
    std::cout << counter << "Alloced got constructed!\n";
    otherTest = NULL;
}

Test::~Test() {
    if (otherTest != 0) {
        std::cout << otherTest->counter << " 1Got destructed" << counter << "\n";
        otherTest->counter = 888;
        std::cout << otherTest->counter << " 2Got destructed" << counter << "\n";
    }
}

int vectorTest() {
    Test a(5);
    std::vector<Test> vecTest;
    vecTest.push_back(a);
    return 1;
}

int main() {
    std::cout << "HELLO WORLD\n";
    vectorTest();
    std::cout << "Prog finished\n";
}
In this case my destructor gets called twice, both times for counter 1, and by the second call the alloc'd object has already been set to 888 (or, in a real case, freed, leading to bad access on a deleted object). What's the correct way to put a local variable into a vector? Is this some kind of design that would never happen sensibly? The following behaves differently: the destructor is called just once (which makes sense, given it's an alloc).
int vectorTest() {
    //Test a(5);
    std::vector<Test> vecTest;
    vecTest.push_back(*(new Test(5)));
    return 1;
}
How can I make the local variable behave the same way, leading to just one call to the destructor? Would a local simply never be put in a vector? But aren't vectors preferred over arrays? What if there are a load of local objects I want to initialize separately, place into the vector, and pass to another function, without using free/heap memory? I think I'm missing something crucial here. Is this a case for some kind of smart pointer that transfers ownership?
A vector maintains its own storage and copies values into it. Since you did not implement a copy constructor, the default one is used, which just copies the value of the pointer. Both copies therefore point at the same object: the local variable's destructor and the vector element's destructor each run on it, and in a real class that freed the pointer, it would be deleted twice. Don't forget the rule of three. You either need to implement the copy and assignment operators, or just use a class that already does this, such as shared_ptr.
Note that this line causes a memory leak, since the object you allocated with new is never deleted:
vecTest.push_back(*(new Test(5)));
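For reference, a rule-of-three version of Test might look roughly like this (a simplified sketch that drops the counter and logging, and assumes the destructor actually deletes otherTest, which the original code does not yet do):
#include <algorithm> // std::swap
#include <cstddef>   // NULL

class Test {
public:
    Test() : otherTest(NULL) {}
    explicit Test(int) : otherTest(new Test()) {}

    // Copy constructor: deep-copies the owned object instead of sharing the pointer.
    Test(const Test &rhs)
        : otherTest(rhs.otherTest ? new Test(*rhs.otherTest) : NULL) {}

    // Copy assignment via copy-and-swap: rhs arrives as a copy; swapping hands our
    // old pointer to rhs, whose destructor then frees it.
    Test &operator=(Test rhs) {
        std::swap(otherTest, rhs.otherTest);
        return *this;
    }

    ~Test() { delete otherTest; } // each copy owns, and deletes, its own pointer

private:
    Test *otherTest;
};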
In addition to what Dark Falcon wrote: to avoid expensive copying when inserting into a vector, you typically implement a swap function and swap a local element with a default-constructed one in the vector. The swap just exchanges ownership of the pointer, and all is well. The new C++0x (now C++11) also has move semantics, via rvalue references, to help with this problem.
More than likely, you'd be better off having your vector hold pointers to Test objects instead of Test objects themselves. This is especially true for objects (like this Test object) that allocate memory on the heap. If you end up using any algorithm (e.g. std::sort) on the vector, the algorithm would otherwise be constantly allocating and deallocating memory, which would slow it down substantially.
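For instance, with C++11 the vector could hold shared_ptrs to Test objects: copies of a shared_ptr only bump a reference count, so the Test itself is never copied and is destroyed exactly once (a sketch, assuming the Test class from the question):
#include <memory>
#include <vector>

int vectorTest() {
    std::vector<std::shared_ptr<Test>> vecTest;
    vecTest.push_back(std::make_shared<Test>(5)); // one Test object, destroyed exactly once
    return 1; // the shared_ptr releases the Test automatically here
}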