AVL Tree Rebalancing in C++

I'm working on an AVL tree. I think I've got all of the rotate functions working correctly: I have rotateLeft, rotateRight, rotateLeftRight, and rotateRightLeft functions, and they all take a node as a parameter. I don't know what node to pass to those parameters. Can you take a look at my AVL tree rebalance function and tell me if I have it correct, and what I need to pass to each of these functions? So far I have been passing the root, or the top node, but I think I'm wrong. How do I tell what I need to pass to these functions?
Here is the function:
void BinaryTree::rebalance(Node *N)
{
    int count = 1;
    if((N->getLeft()->getHeight()) > (N->getRight()->getHeight() + 1))
    {
        if(N->getLeft()->getLeft()->getHeight() > N->getLeft()->getRight()->getHeight())
        {
            rotateRight(root);
            recalculate(root, count);
        }
        else
        {
            rotateLeftRight(root);
            recalculate(root, count);
        }
    }
    else if(N->getRight()->getHeight() > N->getLeft()->getHeight() + 1)
    {
        if(N->getRight()->getRight()->getHeight() > N->getRight()->getLeft()->getHeight())
        {
            rotateLeft(root);
            recalculate(root, count);
        }
        else
        {
            rotateRightLeft(root);
            recalculate(root, count);
        }
    }
}
Here is my rotateLeftRight:
Node* BinaryTree::rotateLeftRight(Node *N)
{
    Node *newNode = new Node();                  //declares a new Node
    newNode = N->getLeft();                      //sets the node
    N->setLeft(rotateLeft(newNode->getLeft()));  //sets the left subtree
    recalculate(root);                           //recalculates the height
    root->setHeight(NULL);                       //sets the height of the root node
    return rotateRight(N);                       //returns the tree rotated right
}
And here is my rotateLeft function:
Node* BinaryTree::rotateLeft(Node *N)
{
    Node *newNode = new Node();       //declares a new node
    newNode = N->getRight();          //sets the new node to the right child of N
    N->setRight(newNode->getLeft());  //sets the right of N equal to new node's left child
    newNode->setLeft(N);              //sets the left child of the new node to N
    return newNode;                   //returns the newNode
}
If I have the tree 50, 20, 10, and 15, what do I pass to each of these functions to rebalance the tree?

There are some errors in this code that were not present in the one you submitted in another question; namely, you don't check for null pointers:
you don't check whether N is NULL at the beginning of the method;
you don't check, in the line below (and in its symmetrical sibling), whether the left and right children are NULL:
if((N->getLeft()->getHeight()) > (N->getRight()->getHeight() + 1))
Regarding the algorithm itself, it depends on the behaviour of the rotation functions. The algorithm as described in the Wikipedia entry explains that the second case in your nested if (the rotateLeftRight and rotateRightLeft methods) should perform two rotations. If your rotation functions conform to that description, you should be all right.
The case of recalculate has been taken care of in another question, but in this situation you actually don't need to recalculate the height for the whole subtree, as you correctly told me in comments on that question. The only changing nodes are the ones whose children have been changed. You should perform that computation within each specific rotation method, since each case describes exactly which nodes get updated.
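As an illustration of those null checks, here is a minimal sketch (it assumes your Node interface of getLeft/getRight/getHeight; the heightOf helper is mine, and I pass the unbalanced node N itself to the rotations rather than root):

int heightOf(Node *n)
{
    return n ? n->getHeight() : -1;   // a missing child is never dereferenced
}

void BinaryTree::rebalance(Node *N)
{
    if (N == NULL)
        return;                       // nothing to rebalance

    if (heightOf(N->getLeft()) > heightOf(N->getRight()) + 1)
    {
        // left-heavy: compare the grandchildren with the same null-safe helper
        if (heightOf(N->getLeft()->getLeft()) >= heightOf(N->getLeft()->getRight()))
            rotateRight(N);
        else
            rotateLeftRight(N);
    }
    else if (heightOf(N->getRight()) > heightOf(N->getLeft()) + 1)
    {
        if (heightOf(N->getRight()->getRight()) >= heightOf(N->getRight()->getLeft()))
            rotateLeft(N);
        else
            rotateRightLeft(N);
    }
}

Whether an empty subtree counts as height -1 or 0 depends on your convention; the point is only that every height comparison goes through the null-safe helper. Note also that your rotations return the new subtree root, so the caller still has to reattach that return value to N's parent (or to root when N itself was the root); the sketch leaves that plumbing out.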

Related

Is it possible to make efficient pointer-based binary heap implementations?

Is it even possible to implement a binary heap using pointers rather than an array? I have searched around the internet (including SO) and no answer can be found.
The main problem here is: how do you keep track of the last pointer? When you insert X into the heap, you place X at the last pointer and then bubble it up. Now, where does the last pointer point to?
And also, what happens when you want to remove the root? You exchange the root with the last element, and then bubble the new root down. Now, how do you know what's the new "last element" that you need when you remove root again?
Solution 1: Maintain a pointer to the last node
In this approach a pointer to the last node is maintained, and parent pointers are required.
When inserting, starting at the last node navigate to the node below which a new last node will be inserted. Insert the new node and remember it as the last node. Move it up the heap as needed.
When removing, starting at the last node navigate to the second-to-last node. Remove the original last node and remember the new last node just found. Move the original last node into the place of the deleted node and then move it up or down the heap as needed.
It is possible to navigate to the mentioned nodes in O(log(n)) time and O(1) space. Here is a description of the algorithms; the code is available below:
For insert: If the last node is a left child, proceed with inserting the new node as the right child of the parent. Otherwise... Start at the last node. Move up as long as the current node is a right child. If the root was not reached, move to the sibling node at the right (which necessarily exists). Then (whether or not the root was reached), move down to the left as long as possible. Proceed by inserting the new node as the left child of the current node.
For remove: If the last node is the root, proceed by removing the root. Otherwise... Start at the last node. Move up as long as the current node is a left child. If the root was not reached, move to the sibling left node (which necessarily exists). Then (whether or not the root was reached), move down to the right as long as possible. We have arrived at the second-to-last node.
However, there are some things to be careful about:
When removing, there are two special cases: when the last node is being removed (unlink the node and change the last node pointer), and when the second-to-last node is being removed (not really special but the possibility must be considered when replacing the deleted node with the last node).
When moving nodes up or down the heap, if the move affects the last node, the last-node pointer must be corrected.
Long ago I made an implementation of this. In case it helps someone, here is the code. Algorithmically it should be correct (it has also been subjected to stress testing with verification), but there is no warranty of course.
Solution 2: Reach the last node from the root
This solution requires maintaining the node count (but not parent pointers or the last node). The last (or second-to-last) node is found by navigating from the root towards it.
Assume the nodes are numbered starting from 1, as per the typical notation for binary heaps. Pick any valid node number and represent it in binary. Ignore the first (most significant) 1 bit. The remaining bits define the path from the root to that node; zero means left and one means right.
For example, to reach node 11 (=1011b), start at the root then go left (0), right (1), right (1).
This algorithm can be used in insert to find where to place the new node (follow the path for node node_count+1), and in remove to find the second-to-last-node (follow the path for node node_count-1).
This approach is used in libuv for timer management; see their implementation of the binary heap.
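For illustration, a minimal sketch of that bit walk (the node type here is hypothetical, with plain left/right pointers; this is not the libuv code):

struct BinNode {
    BinNode *left;
    BinNode *right;
};

// Returns the node whose 1-based number is n (1 = root), reading the bits of
// n below its most significant 1: 0 means left, 1 means right. Assumes the
// node exists.
BinNode *node_at(BinNode *root, unsigned n)
{
    unsigned mask = 1;
    while ((mask << 1) <= n)          // find the highest set bit of n
        mask <<= 1;

    BinNode *cur = root;
    for (mask >>= 1; mask != 0; mask >>= 1)
        cur = (n & mask) ? cur->right : cur->left;
    return cur;
}

For insert you would follow the path for n = node_count + 1 but stop one bit early: the walk up to the second-to-last bit reaches the parent, and the final bit says whether the new node becomes its left or right child. For remove, following the full path for n = node_count - 1 lands on the second-to-last node, which already exists.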
Usefulness of Pointer-based Binary Heaps
Many answers here and even literature say that an array-based implementation of a binary heap is strictly superior. However I contest that because there are situations where the use of an array is undesirable, typically because the upper size of the array is not known in advance and on-demand reallocations of an array are not deemed acceptable, for example due to latency or possibility of allocation failure.
The fact that libuv (a widely used event loop library) uses a binary heap with pointers only further speaks for this.
It is worth noting that the Linux kernel uses (pointer-based) red-black trees as a priority queue in a few cases, for example for CPU scheduling and timer management (for the same purpose as in libuv). I find it likely that changing these to use a pointer-based binary heap will improve performance.
Hybrid Approach
It is possible to combine Solution 1 and Solution 2 into a hybrid approach which dynamically picks whichever of the two algorithms (for finding the last or second-to-last node) has the lower cost, measured in the number of edges that need to be traversed. Assume we want to navigate to node number N, and highest_bit(x) means the 0-based index of the highest-order set bit of x (0 means the LSB).
The cost of navigating from the root (Solution 2) is highest_bit(N).
The cost of navigating from the previous node which is on the same level (Solution 1) is: 2 * (1 + highest_bit((N-1) xor N)).
Note that in the case of a level change the second equation will yield a wrong (too large) result, but in that case traversal from the root is more efficient anyway (for which the estimate is correct) and will be chosen, so there is no need for special handling.
Some CPUs have an instruction for highest_bit, allowing very efficient implementation of these estimates. An alternative approach is to maintain the highest bit as a bit mask and do these calculations with bit masks instead of bit indices. Consider for example that (1 followed by N zeroes) squared is equal to 1 followed by 2N zeroes.
In my testing it has turned out that Solution 1 is on average faster than Solution 2, and the hybrid approach appeared to have about the same average performance as Solution 2. Therefore the hybrid approach is only useful if one needs to minimize the worst-case time, which is (about twice) better in Solution 2, since Solution 1 will in the worst case traverse the entire height of the tree up and then down again.
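A minimal sketch of that cost comparison (the highest_bit helper here is a plain loop for clarity; a real implementation would use a count-leading-zeros instruction or an equivalent intrinsic, as noted above):

#include <cstdint>

// 0-based index of the most significant set bit; assumes x >= 1.
static int highest_bit(uint32_t x)
{
    int b = 0;
    while (x >>= 1)
        ++b;
    return b;
}

// Estimated edge counts for reaching node number n, per the formulas above.
int cost_from_root(uint32_t n) { return highest_bit(n); }
int cost_from_prev(uint32_t n) { return 2 * (1 + highest_bit((n - 1) ^ n)); }

bool prefer_root_path(uint32_t n)
{
    return cost_from_root(n) <= cost_from_prev(n);
}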
Code for Solution 1
Note that the traversal code in insert is slightly different from the algorithm described above but still correct.
struct Node {
    Node *parent;
    Node *link[2];
};

struct Heap {
    Node *root;
    Node *last;
};

void init (Heap *h)
{
    h->root = NULL;
    h->last = NULL;
}
void insert (Heap *h, Node *node)
{
    // If the heap is empty, insert root node.
    if (h->root == NULL) {
        h->root = node;
        h->last = node;
        node->parent = NULL;
        node->link[0] = NULL;
        node->link[1] = NULL;
        return;
    }

    // We will be finding the node to insert below.
    // Start with the current last node and move up as long as the
    // parent exists and the current node is its right child.
    Node *cur = h->last;
    while (cur->parent != NULL && cur == cur->parent->link[1]) {
        cur = cur->parent;
    }

    if (cur->parent != NULL) {
        if (cur->parent->link[1] != NULL) {
            // The parent has a right child. Attach the new node to
            // the leftmost node of the parent's right subtree.
            cur = cur->parent->link[1];
            while (cur->link[0] != NULL) {
                cur = cur->link[0];
            }
        } else {
            // The parent has no right child. This can only happen when
            // the last node is a right child. The new node can become
            // the right child.
            cur = cur->parent;
        }
    } else {
        // We have reached the root. The new node will be at a new level,
        // the left child of the current leftmost node.
        while (cur->link[0] != NULL) {
            cur = cur->link[0];
        }
    }

    // This is the node below which we will insert. It has either no
    // children or only a left child.
    assert(cur->link[1] == NULL);

    // Insert the new node, which becomes the new last node.
    h->last = node;
    cur->link[cur->link[0] != NULL] = node;
    node->parent = cur;
    node->link[0] = NULL;
    node->link[1] = NULL;

    // Restore the heap property.
    while (node->parent != NULL && value(node->parent) > value(node)) {
        move_one_up(h, node);
    }
}
void remove (Heap *h, Node *node)
{
    // If this is the only node left, remove it.
    if (node->parent == NULL && node->link[0] == NULL && node->link[1] == NULL) {
        h->root = NULL;
        h->last = NULL;
        return;
    }

    // Locate the node before the last node.
    Node *cur = h->last;
    while (cur->parent != NULL && cur == cur->parent->link[0]) {
        cur = cur->parent;
    }
    if (cur->parent != NULL) {
        assert(cur->parent->link[0] != NULL);
        cur = cur->parent->link[0];
    }
    while (cur->link[1] != NULL) {
        cur = cur->link[1];
    }

    // Disconnect the last node.
    assert(h->last->parent != NULL);
    h->last->parent->link[h->last == h->last->parent->link[1]] = NULL;

    if (node == h->last) {
        // Deleting last, set new last.
        h->last = cur;
    } else {
        // Not deleting last, move last to node's place.
        Node *srcnode = h->last;
        replace_node(h, node, srcnode);

        // Set new last unless node=cur; in this case it stays the same.
        if (node != cur) {
            h->last = cur;
        }

        // Restore the heap property.
        if (srcnode->parent != NULL && value(srcnode) < value(srcnode->parent)) {
            do {
                move_one_up(h, srcnode);
            } while (srcnode->parent != NULL && value(srcnode) < value(srcnode->parent));
        } else {
            while (srcnode->link[0] != NULL || srcnode->link[1] != NULL) {
                bool side = srcnode->link[1] != NULL && value(srcnode->link[0]) >= value(srcnode->link[1]);
                if (value(srcnode) > value(srcnode->link[side])) {
                    move_one_up(h, srcnode->link[side]);
                } else {
                    break;
                }
            }
        }
    }
}
Two other functions are used: move_one_up moves a node one step up in the heap, and replace_node moves an existing node (srcnode) into the place held by the node being deleted. Both work only by adjusting the links to and from the other nodes; there is no actual moving of data involved. These functions should not be hard to implement, and the mentioned link includes my implementations.
The pointer-based implementation of a binary heap is considerably more difficult than the array-based implementation, but it is fun to code. The basic idea is that of a binary tree; the biggest challenge is keeping it left-filled, that is, finding the exact location where you must insert a node.
For that, you need the binary representation of the target position. Suppose our heap size is 6. We take that number plus 1 and convert it to bits: the binary representation of 7 is "111". Always omit the first bit, which leaves "11", read from left to right. The first bit is '1', so go to the right child of the root node. The remaining string is "1"; as only one bit is left, this single bit tells you where to insert the new node, and as it is '1' the new node must be the right child of the current node. So the raw working of the process is: convert (heap size + 1) into bits, omit the first bit, and then, for each remaining bit from the left, go to the right child of the current node if it is '1' and to the left child if it is '0'.
After inserting the new node, you bubble it up the heap, which means you will need the parent pointer. So you go once down the tree and once up the tree, and your insertion operation takes O(log N).
As for deletion, the challenge is again to find the last node. I hope you are familiar with deletion in a heap, where we swap the root with the last node and then heapify. To find that last node we use the same technique as for finding the insertion location, but with a little twist: use the binary representation of HeapSize itself, not HeapSize + 1. This takes you to the last node, so deletion also costs O(log N).
I'm having trouble posting the source code here, but you can refer to my blog for it. The code includes Heap Sort too, which is very simple: we just keep deleting the root node. Refer to my blog for an explanation with figures, but I guess this explanation will do.
I hope my answer has helped you. If it did, let me know...! ☺
For those saying this is a useless exercise, there are a couple of (admittedly rare) use cases where a pointer-based solution is better. If the max size of the heap is unknown, then an array implementation will need to stop-and-copy into fresh storage when the array fills. In a system (e.g. embedded) where there are fixed response time constraints and/or where free memory exists, but not a big enough contiguous block, this may not be acceptable. The pointer tree lets you allocate incrementally in small, fixed-size chunks, so it doesn't have these problems.
To answer the OP's question, parent pointers and/or elaborate tracking aren't necessary to determine where to insert the next node or find the current last one. You only need the bits in the binary rep of the heap's size to determine the left and right child pointers to follow.
Edit: Just saw Vamsi Sangam's explanation of this algorithm. Nonetheless, here's a demo in code:
#include <stdio.h>
#include <stdlib.h>

typedef struct node_s {
    struct node_s *lft, *rgt;
    int data;
} NODE;

typedef struct heap_s {
    NODE *root;
    size_t size;
} HEAP;

// Add a new node at the last position of a complete binary tree.
void add(HEAP *heap, NODE *node) {
    size_t mask = 0;
    size_t size = ++heap->size;
    // Initialize the mask to the high-order 1 of the size.
    for (size_t x = size; x; x &= x - 1) mask = x;
    NODE **pp = &heap->root;
    // Advance pp right or left depending on size bits.
    while (mask >>= 1) pp = (size & mask) ? &(*pp)->rgt : &(*pp)->lft;
    *pp = node;
}

void print(NODE *p, int indent) {
    if (!p) return;
    for (int i = 0; i < indent; i++) printf(" ");
    printf("%d\n", p->data);
    print(p->lft, indent + 1);
    print(p->rgt, indent + 1);
}

int main(void) {
    HEAP h[1] = { NULL, 0 };
    for (int i = 0; i < 16; i++) {
        NODE *p = malloc(sizeof *p);
        p->lft = p->rgt = NULL;
        p->data = i;
        add(h, p);
    }
    print(h->root, 0);
}
As you'd hope, it prints:
0
 1
  3
   7
    15
   8
  4
   9
   10
 2
  5
   11
   12
  6
   13
   14
Sift-down can use the same kind of iteration. It's also possible to implement the sift-up without parent pointers using either recursion or an explicit stack to "save" the nodes in the path from root to the node to be sifted.
A binary heap is a complete binary tree obeying the heap property. That's all. The fact that it can be stored using an array is just nice and convenient. But sure, you can implement it using a linked structure. It's a fun exercise! As such, it is mostly useful as an exercise or in more advanced data structures (meldable, addressable priority queues, for example), as it is quite a bit more involved than the array version. For example, think about the siftup/siftdown procedures and all the edge cutting/sewing you'll need to get right. Anyways, it's not too hard, and once again, good fun!
There are a number of comments pointing out that by a strict definition it is possible to implement a binary heap as a tree and still call it a binary heap.
Here is the problem -- there is never a reason to do so since using an array is better in every way.
If you do searches to try to find information on how to work with a heap using pointers you are not going to find any -- no one bothers since there is no reason to implement a binary heap in this way.
If you do searches on trees you will find lots of helpful materials. This was the point of my original answer. There is nothing that stops people from doing it this way but there is never a reason to do so.
You say -- I have to do so: I've got a legacy system and I have pointers to nodes that I need to put in a heap.
Make an array of those pointers and work with them in this array as you would in a standard array-based heap; when you need the contents, dereference them. This will work better than any other way of implementing your system.
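A minimal sketch of that suggestion (the Node payload here is illustrative, not from any code above):

#include <vector>
#include <algorithm>

struct Node { int key; /* ... existing fields of the legacy node ... */ };

// Order the pointers by the keys they point at; '>' yields a min-heap
// with the std::*_heap algorithms.
struct ByKey {
    bool operator()(const Node *a, const Node *b) const { return a->key > b->key; }
};

void push(std::vector<Node *> &heap, Node *n) {
    heap.push_back(n);
    std::push_heap(heap.begin(), heap.end(), ByKey{});   // sift the new pointer up
}

Node *pop_min(std::vector<Node *> &heap) {
    std::pop_heap(heap.begin(), heap.end(), ByKey{});    // move the minimum to the back
    Node *n = heap.back();
    heap.pop_back();
    return n;
}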
I can think of no other reason to implement a heap using pointers.
Original Answer:
If you implement it with pointers then it is a tree. A heap is a heap because of how you can calculate the location of the children as a location in the array (2 * node index +1 and 2 * node index + 2).
So no, you can't implement it with pointers, if you do you've implemented a tree.
Implementing trees is well documented if you search you will find your answers.
I have searched around the internet (including SO) and no answer can be found.
Funny, because I found an answer on SO within moments of googling it. (Same Google search led me here.)
Basically:
The node should have pointers to its parent, left child, and right child.
You need to keep pointers to:
the root of the tree (root) (duh)
the last node inserted (lastNode)
the leftmost node of the lowest level (leftmostNode)
the rightmost node of the next-to-lowest level (rightmostNode)
Now, let the node to be inserted be nodeToInsert. Insertion algorithm in pseudocode:
void insertNode(Data data) {
    Node* parentNode, nodeToInsert = new Node(data);
    if(root == NULL) { // empty tree
        parent = NULL;
        root = nodeToInsert;
        leftmostNode = root;
        rightmostNode = NULL;
    } else if(lastNode.parent == rightmostNode && lastNode.isRightChild()) {
        // level full
        parentNode = leftmostNode;
        leftmostNode = nodeToInsert;
        parentNode->leftChild = nodeToInsert;
        rightmostNode = lastNode;
    } else if (lastNode.isLeftChild()) {
        parentNode = lastNode->parent;
        parentNode->rightChild = nodeToInsert;
    } else if(lastNode.isRightChild()) {
        parentNode = lastNode->parent->parent->rightChild;
        parentNode->leftChild = nodeToInsert;
    }
    nodeToInsert->parent = parentNode;
    lastNode = nodeToInsert;
    heapifyUp(nodeToInsert);
}
Pseudocode for deletion:
Data deleteNode() {
    if(root == NULL) throw new EmptyHeapException();
    Data result = root->data;
    if(lastNode == root) { // the root is the only node
        free(root);
        root = NULL;
    } else {
        Node* newRoot = lastNode;
        if(lastNode == leftmostNode) {
            newRoot->parent->leftChild = NULL;
            lastNode = rightmostNode;
            rightmostNode = rightmostNode->parent;
        } else if(lastNode.isRightChild()) {
            newRoot->parent->rightChild = NULL;
            lastNode = newRoot->parent->leftChild;
        } else if(lastNode.isLeftChild()) {
            newRoot->parent->leftChild = NULL;
            lastNode = newRoot->parent->parent->leftChild->rightChild;
        }
        newRoot->leftChild = root->leftChild;
        newRoot->rightChild = root->rightChild;
        newRoot->parent = NULL;
        free(root);
        root = newRoot;
        heapifyDown(root);
    }
    return result;
}
heapifyUp() and heapifyDown() shouldn’t be too hard, though of course you’ll have to make sure those functions don’t make leftmostNode, rightmostNode, or lastNode point at the wrong place.
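For instance, one way to sketch heapifyUp for this layout (Data simplified to int; swapping payloads rather than relinking nodes means lastNode, leftmostNode, and rightmostNode keep pointing at the same positions):

#include <utility>

struct Node {                         // minimal stand-in for the pseudocode's node
    int data;                         // "Data" reduced to int for the sketch
    Node *parent, *leftChild, *rightChild;
};

void heapifyUp(Node *n) {
    while (n->parent != nullptr && n->data < n->parent->data) {
        std::swap(n->data, n->parent->data);   // bubble the value up one level
        n = n->parent;
    }
}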
TL;DR Just use a goddamn array.

Binary Search tree Array implementation C++

I am in the process of implementing a Binary Search tree that gets represented using the Array implementation. This is my code so far. Take note that I am done with the structure of the tree, and it is being saved as a linked list. I want to convert this linked list into an array.
My thoughts on how to go about this are as follows. Make a return_array function. Have the size of the array set to the max number of nodes (2^(n-1)+1) and go through the linked list. The root node would be at position 0 of the array, then its L-child = (2*[index_of_parent]+1) and its R-child = (2*[index_of_parent]+2). I looked around for a bit and searched for something that could give me an idea of how to keep track of each node and how to go through each one.
Am I overthinking this problem?
Can recursion be used here?
Also, I'm considering creating a visual tree instead of an array but have no idea how to space it out correctly. If anyone has an idea on how to do that it would be awesome to get a better understanding of that.
#include <iostream>
#include <stdio.h>
#include <stdlib.h>
#include <cmath>
using namespace std;

struct node {
    int data;
    struct node* left;
    struct node* right;
};

void inorder(struct node* node){
    if(node){
        inorder(node->left);
        cout << node->data << " ";
        inorder(node->right);
    }
}

void insert(struct node** node, int key){
    if(*node == NULL){
        (*node) = (struct node*)malloc(sizeof(struct node));
        (*node)->data = key;
        (*node)->left = NULL;
        (*node)->right = NULL;
        printf("inserted node with data %d\n", (*node)->data);
    }
    else if ((*node)->data > key){
        insert((&(*node)->left), key);
    }
    else
        insert((&(*node)->right), key);
}

int max_tree(struct node* node){
    int left, right;
    if(node == NULL)
        return 0;
    else
    {
        left = max_tree(node->left);
        right = max_tree(node->right);
        if(left > right)
            return left + 1;
        else
            return right + 1;
    }
}

//This is where I don't know how to keep the parent/children in the array.
void return_array(struct node* node, int height){
    int max;
    height = height - 1;
    max = pow(2, height) - 1;
    int arr[height];
}

int main(){
    int h;
    struct node* root = NULL;
    insert(&root, 10);
    insert(&root, 20);
    insert(&root, 5);
    insert(&root, 2);
    inorder(root);
    cout << endl;
    cout << "Height is: ";
    cout << max_tree(root);
    h = max_tree(root);
    return_array(root, h);
}
Considering that you want to efficiently store a binary search tree, using
l = 2i + 1
r = 2i + 2
will waste space every time your tree contains a leaf node that does not occur at the end of the tree (in breadth-first order). Consider the following simple example:
  2
 / \
1   4
   / \
  3   5
This (when transformed breadth-first into an array) results in
[ 2, 1, 4, -, -, 3, 5 ]
And wastes two slots in the array.
Now if you want to store the same tree in an array without wasting space, just transform it into an array depth-first:
[ 2 1 4 3 5 ]
To recover the original tree from this, follow these steps for each node:
Choose the first node as root
For each node (including root), choose
a) the left child as the next smaller key from the array after the current key
b) the right child as the next bigger key from the array, being no larger than the smallest parent key encountered when last branching left, and smaller than the direct parent's key when you are currently in its left branch
Obviously finding the correct b) is slightly more complex, but not too much. Refer to my code example here.
If I'm not mistaken, transforming to and from an array will take O(n) in either case. And as no space is wasted, space complexity is also O(n).
This works because binary search trees have more structure than ordinary binary trees; here, I'm just using the binary search tree property of the left child being smaller, and the right child being larger than the current node's key.
EDIT:
After doing some further research on the topic, I found that reconstructing the tree in preorder traversal order is much simpler. The recursive function doing that is implemented here and here, respectively.
It basically consists of these steps:
As long as the input array has unseen entries,
If the value to insert is greater than the current branch's minimum value and less than the current branch's maximum allowed value,
Add a node to the tree at the current position and set its value to the current input value
Remove current value from input
If there are items left in the input,
Recurse into the left child
Recurse into the right child
The current minimum and maximum values are defined by the position inside the tree (left child: less than parent, right child: greater than parent).
For more elaborate details, please refer to my source code links.
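A minimal sketch of that reconstruction (illustrative names, with the node type redeclared so the snippet is self-contained; this is not the code behind the links above):

#include <climits>
#include <vector>

struct pnode { int data; pnode *left, *right; };

// Consume preorder values for as long as they fit the (lo, hi) range of the
// current branch; smaller keys go left, larger keys go right.
pnode* build(const std::vector<int>& pre, size_t& i, int lo, int hi) {
    if (i == pre.size() || pre[i] < lo || pre[i] > hi)
        return nullptr;
    pnode* n = new pnode{pre[i++], nullptr, nullptr};
    n->left  = build(pre, i, lo, n->data);
    n->right = build(pre, i, n->data, hi);
    return n;
}

pnode* fromPreorder(const std::vector<int>& pre) {
    size_t i = 0;
    return build(pre, i, INT_MIN, INT_MAX);
}

Called on the depth-first array from the example above, [2, 1, 4, 3, 5], this rebuilds the original tree.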
If you want to store the tree nodes in an array, you had better start from position 1 of your array! Then the relation between a parent and its children is simple:
parent = n;
left = 2n;
right = 2n + 1;
You should BFS the tree and store each node in the array (if a node is null, also store it in the array using a flag, e.g. 0); that way you get exactly the array for the tree!
To do this you have to follow these steps.
Create an empty queue.
Make the first node of the list the root, and enqueue it to the queue.
Until we reach the end of the list, do the following.
a. Dequeue one node from the queue. This is the current parent.
b. Traverse two nodes in the list, add them as children of the current parent.
c. Enqueue the two nodes into the queue.
Time Complexity: Time complexity of the above solution is O(n) where n is the number of nodes.
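For the 1-based layout described a few paragraphs above (parent at n, children at 2n and 2n + 1), a minimal sketch of the BFS fill, reusing the question's node struct and using 0 as the "empty" flag (the function name and vector return type are my choices):

#include <queue>
#include <utility>
#include <vector>

// height is the value returned by max_tree(); slot 0 is left unused.
std::vector<int> to_array(struct node* root, int height) {
    std::vector<int> arr(1u << height, 0);             // indices 1 .. 2^height - 1
    std::queue<std::pair<struct node*, size_t> > q;
    if (root) q.push(std::make_pair(root, (size_t)1));
    while (!q.empty()) {
        struct node* n = q.front().first;
        size_t idx = q.front().second;
        q.pop();
        arr[idx] = n->data;                             // parent at idx
        if (n->left)  q.push(std::make_pair(n->left,  2 * idx));      // left child at 2*idx
        if (n->right) q.push(std::make_pair(n->right, 2 * idx + 1));  // right child at 2*idx + 1
    }
    return arr;
}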

Path of the diameter of a binary tree

I have a binary tree and a method for the size of the longest path (the diameter):
int diameter(struct node * tree)
{
    if (tree == 0)
        return 0;

    int lheight = height(tree->left);
    int rheight = height(tree->right);

    int ldiameter = diameter(tree->left);
    int rdiameter = diameter(tree->right);

    return max(lheight + rheight + 1, max(ldiameter, rdiameter));
}
I want the function to return also the exact path (list of all the nodes of the diameter).
How can I do it?
Thanks
You have two options:
A) Think.
B) Search. Among the first few google hits you can find this: http://login2win.blogspot.hu/2012/07/print-longest-path-in-binary-tree.html
Choose A) if you want to learn, choose B) if you do not care and only want a quick, albeit not necessarily perfect, solution.
There are many possible solutions, some of them:
In a divide-and-conquer approach you will probably end up maintaining the longest paths found so far on both sides, and keeping only the longer one.
The quoted solution does two traversals, one for determining the diameter, and the second for printing. This is a nice trick to overcome the problem of not knowing whether we are at the deepest point in approach 1.
Instead of a depth first search, do a breadth first one. Use a queue. Proceed level by level, for each node storing the parent. When you reach the last level (no children added to queue), you can print the whole path easily, because the last printed node is on (one) longest path, and you have the parent links.
Add a property struct node * next to the node struct. Before the return statement, add a line like tree->next = (ldiameter > rdiameter ? tree->left : tree->right) to record the child on the longer path as the next node. After calling diameter(root), you should be able to iterate through the next nodes from the root to print the longest path.
I think the following may work... compute the diameter as follows in O(N) time.
// this is C++ code
int findDiameter(node *root, int &max_length, node* &max_dia_node, int parent[], node* parent_of_root){
    if(!root) return 0;

    parent[root->val] = parent_of_root ? parent_of_root->val : -1;

    int left  = findDiameter(root->left,  max_length, max_dia_node, parent, root);
    int right = findDiameter(root->right, max_length, max_dia_node, parent, root);

    if(left+right+1 > max_length){
        max_dia_node = root;
        max_length = left+right+1;
    }
    return 1 + max(left,right);
}
So a number of things happen in this function. First, max_length accumulates the maximum diameter of the tree, and along with that I assign max_dia_node to the node where that maximum occurs.
This is the node through which the maximum diameter passes.
Now, using this information, we can find the deepest descendant on the left side and on the right side of this node (max_dia_node). From those we can recover the actual nodes via the "parent" array.
This takes two traversals of the tree.
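A hedged sketch of that two-pass idea (the node shape is assumed from the question's code, and the path-collection helpers are illustrative, not the answer's exact code):

#include <algorithm>
#include <vector>

struct node { int data; node *left, *right; };   // assumed shape of the question's node

int height(node *t) { return t ? 1 + std::max(height(t->left), height(t->right)) : 0; }

// First pass: compute heights and remember the node where lheight + rheight + 1
// is largest; that node is the turning point of a longest path.
int heightAndBest(node *t, int &best, node *&bestNode)
{
    if (!t) return 0;
    int lh = heightAndBest(t->left,  best, bestNode);
    int rh = heightAndBest(t->right, best, bestNode);
    if (lh + rh + 1 > best) { best = lh + rh + 1; bestNode = t; }
    return 1 + std::max(lh, rh);
}

// Second pass: from the turning point, keep descending into the taller child
// to collect the deepest chain on each side.
void deepestChain(node *t, std::vector<node *> &out)
{
    while (t) {
        out.push_back(t);
        t = height(t->left) >= height(t->right) ? t->left : t->right;
    }
}

std::vector<node *> diameterPath(node *root)
{
    int best = 0;
    node *bestNode = nullptr;
    heightAndBest(root, best, bestNode);

    std::vector<node *> path;
    if (!bestNode) return path;
    deepestChain(bestNode->left, path);        // one arm, collected leaf-upwards
    std::reverse(path.begin(), path.end());
    path.push_back(bestNode);                  // the turning point itself
    deepestChain(bestNode->right, path);       // the other arm
    return path;
}

Calling height() repeatedly inside deepestChain() keeps the sketch short but makes the second pass worse than linear; caching subtree heights per node would bring the whole thing back to O(N).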

Binary tree Basics in C++

I have a binary tree data structure of:
//Declare Data Structure
struct CP {
    int id;      //ID of the Node
    int data;    //Data of the Node
    CP * left;   //Pointer to the Left Subtree
    CP * right;  //Pointer to the Right Subtree
};
typedef CP * CPPtr;
Without changing the tree structure, how do I actually calculate the depth of a node, given its node id? (id is a unique identifier for each tree node.)
Your code lacks some base steps and necessary initializations.
BTree_Helper(BTree *Tree){ // this is roughly written like pseudo code
    if(TLeft == NULL && TRight == NULL){
        depth of tree = 0;
    }
    else if (TLeft == NULL){
        depth of tree = 1 + depth of right tree;
    }
    else if(TRight == NULL){
        depth of tree = 1 + depth of left tree;
    }
    else{
        depth of tree = 1 + the maximum between depth of left and depth of right;
    }
}
I just gave some hints for your convenience.
Think carefully and try as many test suites as possible.
Going off of what y26jin suggested, maybe something like this?
int BTree_Helper(CP *TreeNode) {
    CP *TLeft = TreeNode->left;
    CP *TRight = TreeNode->right;
    if(TLeft == NULL && TRight == NULL){
        return 0;
    }
    else if (TLeft == NULL){
        return 1 + BTree_Helper(TRight);
    }
    else if(TRight == NULL){
        return 1 + BTree_Helper(TLeft);
    }
    else{
        return 1 + max(BTree_Helper(TLeft), BTree_Helper(TRight));
    }
}
I can't actually test the code right now, sorry if I'm way off here. But I think something along these lines should work.
I'm going to assume that id is the search key for the tree. In other words, the id of any node on the left subtree is less than the id of this node, and the id of any node on the right subtree is greater than the id of this node. Also, id is assumed to be unique.
To find a node with a given ID, given a pointer to the root node of the tree, you just do:
CP* find(CP* root, int searchID)
{
    // Starting point.
    CP* node = root;

    while(node)
    {
        // Search hit?
        if(node->id == searchID)
            return node;

        // Turn left or right?
        if(searchID < node->id)
            node = node->left;
        else
            node = node->right;
    }

    return 0; // No node with the given ID found.
}
Finding depth is a simple modification of this function: instead of returning a node, you keep count of how many levels you descend. A depth of 0 means the root node is what you want; a depth of 1 means either the left or right nodes; a depth of 2 means any of their direct children, etc. So it's really how many times you have to loop:
int depth(CP* root, int searchID)
{
    // Starting point.
    CP* node = root;
    int depth = 0;

    while(node)
    {
        // Search hit?
        if(node->id == searchID)
            return depth;

        // Descending a level...
        ++depth;

        // Turn left or right?
        if(searchID < node->id)
            node = node->left;
        else
            node = node->right;
    }

    return -1; // No node with the given ID found.
}
Note the special value -1 for "not found".
I recommend storing the depth of a node's subtree in that node. Then you can just update the depth of the tree as you add nodes to it. Whenever you add a node, back out of the tree, updating the depth of each node along the path to the root on the way out. If at any point, the new depth of a node's modified subtree is not greater than the depth of the node's other subtree, you can short-circuit.
The benefits to this approach are:
Its worst-case performance is O(log n) (assuming that the tree is balanced).
It is extremely easy to write non-recursively.
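A rough sketch of that bookkeeping (the struct is a hypothetical variant of the question's CP, with parent and cached-height fields added; it is not the question's actual structure):

struct CPH {
    int id, data;
    int height = 0;                   // height of the subtree rooted here (leaf = 0)
    CPH *left = nullptr, *right = nullptr, *parent = nullptr;
};

// Call this on the parent of a freshly attached leaf; it walks toward the
// root and stops as soon as a node's cached height does not change.
void updateHeightsUpward(CPH *n)
{
    while (n) {
        int l = n->left  ? n->left->height  + 1 : 0;
        int r = n->right ? n->right->height + 1 : 0;
        int h = l > r ? l : r;
        if (h == n->height)
            break;                    // nothing further up can change either: short-circuit
        n->height = h;
        n = n->parent;
    }
}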
Read about basic tree/graph search algorithms: breadth-first search (BFS) and depth-first search (DFS). Try implementing DFS both recursively and with an explicit stack<T>. Implement BFS using a queue<T>.
Pay attention to the efficiency of your approach. If you want to look up the depth of nodes repeatedly, it will probably be much faster to store the depth of every node in the tree in some sort of look-up table. Ideally a hash table, but a map<T1, T2> will do in most cases.
You'll learn a lot from the above exercises. Good luck!
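As an illustration of the look-up-table suggestion, a minimal sketch using the question's CP struct (the map type and function name are my choices):

#include <map>

// One DFS fills a table from node id to depth; afterwards each query is a
// single map lookup instead of a tree walk.
void fillDepths(const CP *node, int depth, std::map<int, int> &table)
{
    if (!node) return;
    table[node->id] = depth;
    fillDepths(node->left,  depth + 1, table);
    fillDepths(node->right, depth + 1, table);
}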
You can calculate the depth below any node using recursion:
int depthBelow(CPPtr node) {
    if (node != NULL)
        return 1 + max(depthBelow(node->left), depthBelow(node->right));
    else
        return 0;
}
You have to pass pointers to lDepth and rDepth, not the values themselves, like so:
nodeDepth_Helper(tree,id, &lDepth, &rDepth);
Furthermore, I think the arguments to nodeDepth_Helper should be declared as pointers to ints:
void nodeDepth_Helper(CPPtr tree, int id, int* lDepth,int* rDepth)
Making these changes throughout should fix your problem.

Calculate height of a tree

I am trying to calculate the height of a tree. I am doing it with the code written below.
#include<iostream.h>

struct tree
{
    int data;
    struct tree * left;
    struct tree * right;
};
typedef struct tree tree;

class Tree
{
private:
    int n;
    int data;
    int l, r;
public:
    tree * Root;

    Tree(int x)
    {
        n = x;
        l = 0;
        r = 0;
        Root = NULL;
    }

    void create();
    int height(tree * Height);
};

void Tree::create()
{
    //Creating the tree structure
}

int Tree::height(tree * Height)
{
    if(Height->left==NULL && Height->right==NULL)
    {
        return 0;
    }
    else
    {
        l = height(Height->left);
        r = height(Height->right);
        if (l > r)
        {
            l = l + 1;
            return l;
        }
        else
        {
            r = r + 1;
            return r;
        }
    }
}

int main()
{
    Tree A(10); //Initializing 10 node Tree object
    A.create(); //Creating a 10 node tree
    cout << "The height of tree" << A.height(A.Root);
}
It gives me the correct result.
But in some posts (found via Google) it was suggested to do a postorder traversal and use this height method to calculate the height. Any specific reason?
But isn't a postorder traversal precisely what you are doing? Assuming left and right are both non-null, you first do height(left), then height(right), and then some processing in the current node. That's postorder traversal according to me.
But I would write it like this:
int Tree::height(tree *node) {
    if (!node) return -1;
    return 1 + max(height(node->left), height(node->right));
}
Edit: depending on how you define tree height, the base case (for an empty tree) should be 0 or -1.
The code will fail in trees where at least one of the nodes has only one child:
// code snippet (space condensed for brevity)
int Tree::height(tree * Height) {
    if(Height->left==NULL && Height->right==NULL) { return 0; }
    else {
        l = height(Height->left);
        r = height(Height->right);
        //...
If the tree has two nodes (the root and either a left or right child), calling the method on the root will not fulfill the first condition (since at least one of the subtrees is non-empty), so it will recurse on both children. One of them is null, but the code will still dereference that null pointer to evaluate the if.
A correct solution is the one posted by Hans here. At any rate you have to choose what your method invariants are: either you allow calls where the argument is null and you handle that gracefully or else you require the argument to be non-null and guarantee that you do not call the method with null pointers.
The first case is safer if you do not control all entry points (the method is public as in your code) since you cannot guarantee that external code will not pass null pointers. The second solution (changing the signature to reference, and making it a member method of the tree class) could be cleaner (or not) if you can control all entry points.
The height of the tree doesn't change with the traversal. It remains constant. It's the sequence of the nodes that change depending on the traversal.
Definitions from wikipedia.
Preorder (depth-first):
Visit the root.
Traverse the left subtree.
Traverse the right subtree.
Inorder (symmetrical):
Traverse the left subtree.
Visit the root.
Traverse the right subtree.
Postorder:
Traverse the left subtree.
Traverse the right subtree.
Visit the root.
"Visit" in the definitions means "calculate height of node". Which in your case is either zero (both left and right are null) or 1 + combined height of children.
In your implementation the traversal order doesn't matter; it would give the same results. Can't really tell you anything more than that without a link to the source stating that postorder is to be preferred.
Here is an answer:
int Help::heightTree(node *nodeptr)
{
    if (!nodeptr)
        return 0;
    else
    {
        return 1 + max(heightTree(nodeptr->left), heightTree(nodeptr->right));
    }
}