Sorting vector elements in descending order - C++

Please tell me what is wrong with my approach.
When I run the code, it takes too long to compute a result.
#include <iostream>
#include <vector>
using namespace std;

vector<int> vec;

vector<int> sort(vector<int> x) {
    vector<int> y;
    int i = 1;
    reset: for (i = 1; i <= x.size(); i++) {
        for (int j = 1; j <= x.size();) {
            if (j == i) {
                j++;
            }
            else {
                if (x[i - 1] > x[j - 1]) {
                    j++;
                }
                else {
                    i++;
                    goto reset;
                }
            }
        }
        y.push_back(x[i - 1]);
        x.erase(x.begin() + i - 1);
    }
    return y;
}
int main() {
    vec.push_back(5);
    vec.push_back(9);
    vec.push_back(3);
    vec.push_back(6);
    vec.push_back(2);
    for (int i = 1; i <= vec.size(); i++) {
        cout << sort(vec)[i - 1] << " ";
    }
}
I am trying to sort this given sequence of 5 integers into descending order. Please help.
My plan was to search for the greatest integer in the whole vector x, move it to the vector y, and repeat the process.

Simple bubble-sort example
I think that your sort function is entering an infinite loop because of the goto reset statement: the goto jumps back to the labelled for loop, which re-runs its initializer and resets i to 1, so the function never makes progress. If you want to implement a simple bubble-sort algorithm, you can do it like this:
#include <iostream>
#include <utility>
#include <vector>

void bubble_sort(std::vector<int>& v) {
    if (v.size() == 0) return;
    for (int max = v.size(); max > 0; max--) {
        for (int i = 1; i < max; i++) {
            int& current = v[i - 1];
            int& next = v[i];
            if (current < next)
                std::swap(current, next);
        }
    }
}
This function takes a vector, and for every consecutive pair of elements in the vector, if they're out of order, it swaps them. Each pass results in the smallest remaining element "bubbling" to the end of the vector, and the process is repeated until all the elements are in descending order.
If we test it, we see that it prints the right answer:
int main() {
    std::vector<int> test = {5, 9, 3, 6, 2};
    bubble_sort(test);
    for (int i : test) {
        std::cout << i << ' ';
    }
    std::cout << '\n';
}
Using std::sort to do this faster
The standard library provides a sort function that'll sort pretty much anything. std::sort is well implemented, more efficient than bubble sort, and easy to use.
By default, std::sort orders things in ascending order, although it's easy to change it so that it works in descending order. There are two ways to do this. The first way sorts the vector using the reverse iterators (which allow you to pretend the vector is in reverse order), and the second way sorts the vector using std::greater, which tells std::sort to sort things in reverse order.
// Way 1:
std::sort(test.rbegin(), test.rend());
// Way 2:
auto compare_func = std::greater<>();
std::sort(test.begin(), test.end(), compare_func);
We can re-write the program using std::sort:
#include <iostream>
#include <vector>
#include <algorithm>
#include <functional>

int main() {
    std::vector<int> test = {5, 9, 3, 6, 2};
    auto compare_function = std::greater<>();
    std::sort(test.begin(), test.end(), compare_function);
    for (int i : test) {
        std::cout << i << ' ';
    }
    std::cout << '\n';
}

Why can't you just use std::sort? You can do this:
sort(vec.begin(), vec.end(), [](const int a, const int b) {return a > b; }); //1
As suggested in the comments, there are two alternatives to the above:
std::sort(vec.begin(), vec.end(), std::greater<>()); //2
and:
std::sort(vec.rbegin(), vec.rend()); //3
(2) and (3) avoid a custom comparison function, and (2) is arguably more explicit about its intent. But I was interested in the performance, so I did a quick benchmark comparison of the three.
With Clang 12.0, (1) was fastest:
[Clang 12.0 benchmark results]
However, with GCC 10.3 all three were near identical:
[GCC 10.3 benchmark results]
Interesting results! With GCC, it's your choice as to which version you prefer; otherwise I would go for (1) or (2).
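If you want to reproduce that kind of comparison yourself, here is a minimal, self-contained timing sketch; the vector size, seed, and single-run timing are arbitrary choices for illustration, not the setup behind the screenshots above.
#include <algorithm>
#include <chrono>
#include <functional>
#include <iostream>
#include <random>
#include <vector>

// Fills a vector with random ints and times one descending sort per variant.
// For anything resembling a real benchmark, repeat the runs and average.
int main() {
    std::mt19937 gen(42);
    std::uniform_int_distribution<int> dist(0, 1000000);
    std::vector<int> base(1000000);
    for (int& x : base) x = dist(gen);

    auto time_sort = [&](auto sort_fn, const char* name) {
        std::vector<int> v = base; // fresh copy so every variant sorts the same data
        auto start = std::chrono::steady_clock::now();
        sort_fn(v);
        auto stop = std::chrono::steady_clock::now();
        std::chrono::duration<double, std::milli> ms = stop - start;
        std::cout << name << ": " << ms.count() << " ms\n";
    };

    time_sort([](std::vector<int>& v) {
        std::sort(v.begin(), v.end(), [](int a, int b) { return a > b; });
    }, "(1) lambda comparator");
    time_sort([](std::vector<int>& v) {
        std::sort(v.begin(), v.end(), std::greater<>());
    }, "(2) std::greater");
    time_sort([](std::vector<int>& v) {
        std::sort(v.rbegin(), v.rend());
    }, "(3) reverse iterators");
}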

Related

Deleting both an element and its duplicates in a Vector in C++

I've searched the Internet and know how to delete an element (with std::erase) and how to find an element's duplicates and then delete them (vec.erase(std::unique(vec.begin(), vec.end()), vec.end());). But all these methods delete either an element or its duplicates, not both.
I want to delete both.
For example, using this vector:
std::vector<int> vec = {2,3,1,5,2,2,5,1};
I want output to be:
{3}
My initial idea was:
void removeDuplicatesandElement(std::vector<int> &vec)
{
    std::sort(vec.begin(), vec.end());
    int passedNumber = 0; // Counts how many numbers were kept (not deleted because not duplicated)
    for (int i = 0; i != vec.size(); i = passedNumber) // This is not best practice, but I tried
    {
        if (vec[i] == vec[i + 1])
        {
            int ctr = 1;
            for (int j = i + 1; j != vec.size(); j++)
            {
                if (vec[j] == vec[i]) ctr++;
                else break;
            }
            vec.erase(vec.begin() + i, vec.begin() + i + ctr);
        }
        else passedNumber++;
    }
}
And it worked. But this code is redundant and runs in O(n^2), so I'm trying to find other ways to solve the problem (maybe an STL function that I've never heard of, or just an improvement to the code).
Something like this, perhaps:
void removeDuplicatesandElement(std::vector<int> &vec) {
    if (vec.size() <= 1) return;
    std::sort(vec.begin(), vec.end());
    int cur_val = vec.front() - 1;
    auto pred = [&](const int& val) {
        if (val == cur_val) return true;
        cur_val = val;
        // Look ahead to the next element to see if it's a duplicate.
        return &val != &vec.back() && (&val)[1] == val;
    };
    vec.erase(std::remove_if(vec.begin(), vec.end(), pred), vec.end());
}
This relies heavily on the fact that std::vector is guaranteed to have contiguous storage. It won't work with any other container.
You can do it using an STL unordered_map as follows:
#include <iostream>
#include <vector>
#include <unordered_map>
using namespace std;

void retainUniqueElements(vector<int> &A) {
    unordered_map<int, int> Cnt;
    for (auto element : A) Cnt[element]++;
    A.clear(); // removes all the elements of A
    for (auto i : Cnt) {
        if (i.second == 1) { // the element occurs exactly once
            A.push_back(i.first); // so add it to our vector
        }
    }
}

int main() {
    vector<int> vec = {2, 3, 1, 5, 2, 2, 5, 1};
    retainUniqueElements(vec);
    for (auto i : vec) {
        cout << i << " ";
    }
    cout << "\n";
    return 0;
}
Output:
3
Time Complexity of the above approach: O(n)
Space Complexity of the above approach: O(n)
From what you have searched, we can look in the vector for duplicated values, then use the Erase–remove idiom to clean up the vector.
#include <vector>
#include <algorithm>
#include <iostream>
void removeDuplicatesandElement(std::vector<int> &vec)
{
std::sort(vec.begin(), vec.end());
if (vec.size() < 2)
return;
for (int i = 0; i < vec.size() - 1;)
{
// This is for the case we emptied our vector
if (vec.size() < 2)
return;
// This heavily relies on the fact that this vector is sorted
if (vec[i] == vec[i + 1])
vec.erase(std::remove(vec.begin(), vec.end(), (int)vec[i]), vec.end());
else
i += 1;
}
// Since all duplicates are removed, the remaining elements in the vector are unique, thus the size of the vector
// But we are not returning anything or any reference, so I'm just gonna leave this here
// return vec.size()
}
int main()
{
std::vector<int> vec = {2, 3, 1, 5, 2, 2, 5, 1};
removeDuplicatesandElement(vec);
for (auto i : vec)
{
std::cout << i << " ";
}
std::cout << "\n";
return 0;
}
Output: 3
Time complexity: O(n log n) for the sort, plus one O(n) removal pass for each duplicated value

Get number of same values in arrays in C++

I need a function int countDifferentNumbers(int v[], int n) which counts how many different values the array v with n entries contains.
Example:
It should return the result 3 for the array v = {1, 5, 5, 8, 1, 1} because the array contains only 3 different values.
This is how the code looks so far:
int countDifferentNumbers(int v[], int n)
{
    int counter = 0;
    for (int i = 0; i < n; ++i)
    {
        for (int j = i; j < n; ++j)
        {
            if (v[i] == v[j + 1])
            {
                cout << "match" << endl;
                counter++;
                cout << v[i] << endl;
            }
        }
    }
    return counter;
}
I would appreciate an explanation of what is wrong in my function and how I need to redesign it.
Note: Unfortunately, I have not found a suitable thread for this either. All the threads I found for my problem were solved in Java or Python.
Recently I see more and more answers here on SO that lead users in the wrong direction by giving bad answers.
Also, for C++, the question has already been answered in the comment by Igor Tandetnik, and that is what should ultimately be used.
But let me answer the question of the OP as asked: what is wrong with my function? OK, there are several aspects. Let us first look at the style.
You have no comments at all, which hurts the code quality. If you wrote comments, you would already find most bugs yourself, because then you would need to explain your own wrong statements.
Then please see your source code with my amendments. I added the problems as comments.
// This is just a dumped function and not a minimal reproducible example
// All header files are missing
// Obviously "using namespace std;" was used, which should NEVER be done
// The function should return an unsigned value, best size_t, because a count can never be negative
// Same for n, the size of the array. It can also never be negative
// C-style arrays should NEVER be used in C++. NEVER. Use std::vector or std::array instead
int countDifferentNumbers(int v[], int n)
{
    int counter = 0; // In C++ we can use braced initialization instead of assignment
    for (int i = 0; i < n; ++i)
    {
        for (int j = i; j < n; ++j)
        {
            if (v[i] == v[j + 1]) // Accesses an out-of-bounds element when j == n - 1
            {
                cout << "match" << endl; // No endl needed here. Could all be done in one cout statement on one line
                counter++; // Always counts up the same counter for every kind of duplicate
                cout << v[i] << endl;
            }
        }
    }
    return counter;
}
That was one point of the answer. But now the second, even more important point: the algorithm, the design, is wrong. Finding the correct solution, this thinking before coding, is something you need to do before you write any line of code.
You obviously want to find the count of unique numbers in an array.
Then you could look at what is already there on Stack Overflow. You would probably find 20 answers already that could give you a hint.
You could use std::unique (please see its documentation for a description). This function sounds like it does what you want, right? Some example implementation:
#include <iostream>
#include <unordered_map>
#include <vector>
#include <algorithm>

// If you want to keep the original data, remove the reference-specifier &
size_t countDifferentNumbers(std::vector<int>& v) {
    std::sort(v.begin(), v.end()); // Sorting is a precondition for std::unique
    v.erase(std::unique(v.begin(), v.end()), v.end()); // Erase the duplicate elements
    return v.size(); // Return the result
}

int main() {
    std::vector test{ 1, 5, 5, 8, 1, 1 }; // Some test data
    std::cout << countDifferentNumbers(test) << '\n'; // Show result to user
    return 0;
}
Then, we could count the occurrence of each number in a std::map or std::unordered_map, and the number of entries will be the result. Example:
#include <iostream>
#include <unordered_map>
#include <vector>
#include <algorithm>

// If you want to keep the original data, remove the reference-specifier &
size_t countDifferentNumbers(std::vector<int>& v) {
    std::unordered_map<int, size_t> counter{}; // Here we will count all occurrences of the different numbers
    for (const int i : v) counter[i]++; // Iterate over the vector and count the different numbers
    return counter.size(); // Count of different numbers
}

int main() {
    std::vector test{ 1, 5, 5, 8, 1, 1 }; // Some test data
    std::cout << countDifferentNumbers(test) << '\n'; // Show result to user
    return 0;
}
But then, thinking further about what containers we could use, we arrive at the answer from Igor Tandetnik. There are two containers that can hold unique values only, no duplicates, and these are std::set and std::unordered_set. So we can simply copy the data into one of those containers, and only unique values will be stored there.
There are many ways to get the data into a set, but the simplest one is to use its range constructor (constructor number 2 in the reference). Then we have unique elements, and the container's size function will give the result.
The result will be a function with one line, like this:
#include <iostream>
#include <unordered_set>
#include <vector>

// If you want to keep the original data, remove the reference-specifier &
size_t countDifferentNumbers(std::vector<int>& v) {
    return std::unordered_set<int>(v.begin(), v.end()).size();
}

int main() {
    std::vector test{ 1, 5, 5, 8, 1, 1 }; // Some test data
    std::cout << countDifferentNumbers(test) << '\n'; // Show result to user
    return 0;
}
And since one-line functions are often not so useful, we can also write the final solution:
#include <iostream>
#include <unordered_set>
#include <vector>

int main() {
    std::vector test{ 1, 5, 5, 8, 1, 1 }; // Some test data
    std::cout << std::unordered_set<int>(test.begin(), test.end()).size() << '\n'; // Show result to user
    return 0;
}
So, by analyzing the problem and choosing the right algorithm and container and using C++, we arrive at the simplest solution.
Please enable C++17 for your compiler.
First sort the array v. If n > 0, then there is at least one unique number, so increment the counter once. Then, in a loop, check whether each pair of consecutive numbers is the same; if they are the same do nothing, otherwise increment the counter.
If you are writing code in C, use qsort(): add #include <stdlib.h> to your headers and call qsort().
Here is the code:
#include <bits/stdc++.h>
using namespace std;

int countDifferentNumbers(int v[], int n)
{
    int counter = 0;
    sort(v, v + n); // if you are writing code in c then just write a decent sort algorithm.
    if (n > 0) {
        printf("%d\n", v[0]);
        counter++;
    }
    for (int i = 0; i < n - 1; ++i)
    {
        if (v[i] == v[i + 1]) {
            continue;
        } else {
            printf("%d\n", v[i + 1]);
            counter++;
        }
    }
    return counter;
}

int main()
{
    int v[] = {1, 5, 5, 8, 1, 1};
    int result = countDifferentNumbers(v, 6);
    printf("unique number %d", result);
    return 0;
}
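For the C variant mentioned above, a minimal sketch using qsort() could look like the following; the comparator name cmp_int is just an illustrative choice. It compiles as C++ too via <cstdlib> and <cstdio>, although std::sort is preferable there.
#include <cstdio>
#include <cstdlib>

// Comparator for qsort: returns negative/zero/positive, like strcmp.
static int cmp_int(const void* a, const void* b) {
    int x = *static_cast<const int*>(a);
    int y = *static_cast<const int*>(b);
    return (x > y) - (x < y); // avoids the overflow that a plain x - y could cause
}

int main() {
    int v[] = {1, 5, 5, 8, 1, 1};
    int n = sizeof v / sizeof v[0];
    std::qsort(v, n, sizeof v[0], cmp_int);

    // Count distinct values in the sorted array, same idea as the answer above.
    int counter = n > 0 ? 1 : 0;
    for (int i = 1; i < n; ++i)
        if (v[i] != v[i - 1]) counter++;
    std::printf("unique numbers: %d\n", counter); // prints 3
}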

Minimum Swaps 2 - minimum number of swaps required to sort a vector in ascending order

I'm doing a fairly easy HackerRank test which asks the user to write a function which returns the minimum number of swaps needed to sort an unordered vector in ascending order, e.g.
Start: 1, 2, 5, 4, 3
End: 1, 2, 3, 4, 5
Minimum number of swaps: 1
I've written a function which works on 13/14 test cases, but is too slow for the final case.
#include <iostream>
#include <vector>
using namespace std;

int mimumumSwaps(vector<int> arr) {
    int p = 0; // Represents the (index + 1) of arr, e.g. 1, 2, ..., arr.size() + 1
    int swaps = 0;
    for (vector<int>::iterator i = arr.begin(); i != arr.end(); ++i) {
        p++;
        if (*i == p) // Element is in the correct place
            continue;
        else { // Iterate through the rest of arr until the correct element is found
            for (vector<int>::iterator j = arr.begin() + p - 1; j != arr.end(); ++j) {
                if (*j == p) {
                    // Swap the elements
                    double temp = *j;
                    *j = *i;
                    *i = temp;
                    swaps++;
                    break;
                }
            }
        }
    }
    return swaps;
}

int main()
{
    vector<int> arr = { 1, 2, 5, 4, 3 };
    cout << mimumumSwaps(arr);
}
How would I speed this up further?
Are there any functions I could import which could speed up processes for me?
Is there a way to do this without actually swapping any elements and simply working out the min. swaps which I imagine would speed up the process time?
All permutations can be broken down into cyclic subsets. Find said subsets.
Rotating a subset of K elements by 1 takes K-1 swaps.
Walk array until you find an element out of place. Walk that cycle until it completes. Advance, skipping elements that you've put into a cycle already. Sum (size-1) for each cycle.
To skip, maintain an ordered or unordered set of unexamined items, and fast remove as you examine them.
I think that gives optimal swap count in O(n lg n) or so.
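Here is a minimal sketch of that cycle idea, assuming, as the HackerRank problem guarantees, that the input is a permutation of 1..n; with that assumption a plain visited array can stand in for the set of unexamined items.
#include <iostream>
#include <vector>

// Counts the minimum number of swaps by walking permutation cycles.
// Assumes arr is a permutation of 1..n.
int minimumSwaps(const std::vector<int>& arr) {
    std::vector<bool> visited(arr.size(), false);
    int swaps = 0;
    for (std::size_t i = 0; i < arr.size(); ++i) {
        if (visited[i] || arr[i] == static_cast<int>(i) + 1)
            continue; // element already in place or already counted in a cycle
        // Walk the cycle starting at i and measure its length.
        std::size_t j = i;
        int cycle_len = 0;
        while (!visited[j]) {
            visited[j] = true;
            j = arr[j] - 1; // jump to the position where this element belongs
            ++cycle_len;
        }
        swaps += cycle_len - 1; // rotating a cycle of K elements takes K-1 swaps
    }
    return swaps;
}

int main() {
    std::vector<int> arr = {1, 2, 5, 4, 3};
    std::cout << minimumSwaps(arr) << '\n'; // prints 1
}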
#include <bits/stdc++.h>
#include <vector>
#include <algorithm>
using namespace std;

int minimumSwaps(vector<int> arr)
{
    int i, c, j, k, l;
    j = c = 0;
    l = k = arr.size();
    while (j < k)
    {
        i = 0;
        while (i < l)
        {
            if (arr[i] != i + 1)
            {
                swap(arr[i], arr[arr[i] - 1]);
                c++;
            }
            i++;
        }
        k = k / 2;
        j++;
    }
    return c;
}

int main()
{
    int n, q;
    cin >> n;
    vector<int> arr;
    for (int i = 0; i < n; i++)
    {
        cin >> q;
        arr.push_back(q);
    }
    int res = minimumSwaps(arr);
    cout << res << "\n";
    return 0;
}

Keep the duplicated values only - Vectors C++

Assume I have a vector with the following elements {1, 1, 2, 3, 3, 4}
I want to write a C++ program to remove the unique values and keep only the duplicated ones, so the end result will be something like {1, 3}.
So far this is what I've done, but it takes a lot of time.
Is there any way this can be more efficient?
vector<int> g1 = {1, 1, 2, 3, 3, 4};
vector<int> g2;
for (int i = 0; i < g1.size(); i++)
{
    if (count(g1.begin(), g1.end(), g1[i]) > 1)
        g2.push_back(g1[i]);
}
g2.erase(std::unique(g2.begin(), g2.end()), g2.end());
for (int i = 0; i < g2.size(); i++)
{
    cout << g2[i];
}
My approach is to create an <algorithm>-style template, and use an unordered_map to do the counting. This means you only iterate over the input list once, and the time complexity is O(n). It does use O(n) extra memory though, and isn't particularly cache-friendly. Also this does assume that the type in the input is hashable.
#include <algorithm>
#include <iostream>
#include <iterator>
#include <unordered_map>

template <typename InputIt, typename OutputIt>
OutputIt copy_duplicates(
    InputIt first,
    InputIt last,
    OutputIt d_first)
{
    std::unordered_map<typename std::iterator_traits<InputIt>::value_type,
                       std::size_t> seen;
    for ( ; first != last; ++first) {
        if (2 == ++seen[*first]) {
            // only output on the second time of seeing a value
            *d_first = *first;
            ++d_first;
        }
    }
    return d_first;
}

int main()
{
    int i[] = {1, 2, 3, 1, 1, 3, 5}; // prints 1, 3,
    copy_duplicates(std::begin(i), std::end(i),
                    std::ostream_iterator<int>(std::cout, ", "));
}
This can output to any kind of iterator. There are special iterators you can use that when written to will insert the value into a container.
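For example, here is a small usage sketch with std::back_inserter, which appends every value written through it to a vector via push_back; it reuses the copy_duplicates template from above and replaces the main shown there.
#include <iostream>
#include <iterator>
#include <vector>

int main() {
    std::vector<int> input = {1, 2, 3, 1, 1, 3, 5};
    std::vector<int> dups;
    // Each duplicate found by copy_duplicates is appended to dups.
    copy_duplicates(input.begin(), input.end(), std::back_inserter(dups));
    for (int x : dups)
        std::cout << x << ", "; // prints 1, 3,
    std::cout << '\n';
}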
Here's a way that's a little more cache-friendly than the unordered_map answer, but is O(n log n) instead of O(n); on the other hand, it uses no extra memory and does no allocations. Additionally, the overall constant factor is probably higher, in spite of its cache friendliness.
#include <vector>
#include <algorithm>

void only_distinct_duplicates(::std::vector<int> &v)
{
    ::std::sort(v.begin(), v.end());
    auto output = v.begin();
    auto run_start = v.begin();
    auto const end = v.end();
    for (auto test = v.begin(); test != end; ++test) {
        if (*test == *run_start) {
            if ((test - run_start) == 1) {
                *output = *run_start;
                ++output;
            }
        } else {
            run_start = test;
        }
    }
    v.erase(output, end);
}
I've tested this, and it works. If you want a generic version that should work on any type that vector can store:
template <typename T>
void only_distinct_duplicates(::std::vector<T> &v)
{
    ::std::sort(v.begin(), v.end());
    auto output = v.begin();
    auto run_start = v.begin();
    auto const end = v.end();
    for (auto test = v.begin(); test != end; ++test) {
        if (*test != *run_start) {
            if ((test - run_start) > 1) {
                ::std::swap(*output, *run_start);
                ++output;
            }
            run_start = test;
        }
    }
    if ((end - run_start) > 1) {
        ::std::swap(*output, *run_start);
        ++output;
    }
    v.erase(output, end);
}
Assuming the input vector is not sorted, the following will work and is generalized to support any vector with element type T. It will be more efficient than the other solutions proposed so far.
#include <algorithm>
#include <iostream>
#include <vector>

template<typename T>
void erase_unique_and_duplicates(std::vector<T>& v)
{
    auto first{v.begin()};
    std::sort(first, v.end());
    while (first != v.end()) {
        auto last{std::find_if(first, v.end(), [&](int i) { return i != *first; })};
        if (last - first > 1) {
            first = v.erase(first + 1, last);
        }
        else {
            first = v.erase(first);
        }
    }
}

int main(int argc, char** argv)
{
    std::vector<int> v{1, 2, 3, 4, 5, 2, 3, 4};
    erase_unique_and_duplicates(v);
    // The following will print '2 3 4'.
    for (int i : v) {
        std::cout << i << ' ';
    }
    std::cout << '\n';
    return 0;
}
I have two improvements for you:
1. You can change your count to start at g1.begin() + i; everything before it was already handled by previous iterations of the loop.
2. You can change the if to == 2 instead of > 1, so it adds each number only once, regardless of how often it occurs. If a number appears 5 times in the vector, the first 3 occurrences will be ignored, the 4th will make it into the new vector, and the 5th will be ignored again. So you can remove the erase of the duplicates.
Example:
#include <iostream>
#include <vector>
#include <algorithm>
using namespace std;
int main() {
vector <int> g1 = {1,1,2,3,3,1,4};
vector <int> g2;
for(int i = 0; i < g1.size(); i++)
{
if(count(g1.begin() + i, g1.end(), g1[i]) == 2)
g2.push_back(g1[i]);
}
for(int i = 0; i < g2.size(); i++)
{
cout << g2[i] << " ";
}
cout << endl;
return 0;
}
I'll borrow a principle from Python, which is excellent for such operations:
You can use a dictionary where the dictionary key is the item in the vector and the dictionary value is the count (start with 1 and increase it by one every time you encounter a value that is already in the dictionary).
Afterward, create a new vector (or clear the original) with only the dictionary keys whose counts are larger than 1.
Look up std::map.
Hope this helps.
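A minimal sketch of that dictionary idea in C++, using std::map as suggested (a std::unordered_map would work the same way); the variable names simply mirror the question.
#include <iostream>
#include <map>
#include <vector>

int main() {
    std::vector<int> g1 = {1, 1, 2, 3, 3, 4};

    // Count how often each value occurs.
    std::map<int, int> counts;
    for (int x : g1) ++counts[x];

    // Keep only the values that occurred more than once.
    std::vector<int> g2;
    for (const auto& kv : counts)
        if (kv.second > 1) g2.push_back(kv.first);

    for (int x : g2) std::cout << x << ' '; // prints 1 3
    std::cout << '\n';
}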
In general, your approach has a complexity of about O(n*n), which is why it appears slow. Does it have to be a vector? Is that a restriction? Must it stay ordered? If not, it is better to store the values in a std::map, which eliminates duplicates as it is populated, or in a std::multimap if the presence of duplicates matters.

how can I find repeated elements in a vector [duplicate]

This question already has answers here:
Checking for duplicates in a vector [duplicate]
(5 answers)
Closed 9 years ago.
I have a vector of int which can contain a minimum of 2 and a maximum of 4 elements, for example:
std::vector<int> vectorDATA(X); // x means unknown here
What I want to do is to erase the elements that are repeated, for example:
vectorDATA{1,2,2} to vectorDATA{1,2}
vectorDATA{1,2,3} stays unchanged
vectorDATA{2,2,2} to vectorDATA{2}
vectorDATA{3,2,1,3} to vectorDATA{3,2,1}
vectorDATA{1,2,1,2} to vector{1,2}
and so on
Here is a code sample:
cv::HoughLines(canny, lineQ, 1, CV_PI/180, 200);
std::cout << " line Size " << lineQ.size() << std::endl;
std::vector<int> linesData(lineQ.size());
std::vector<int>::iterator it;
if (lineQ.size() <= 4 && lineQ.size() != 0) {
    if (lineQ.size() == 1) {
        break;
    } else {
        for (int i = 0; i < lineQ.size(); i++) {
            linesData[i] = lineQ[i][1]; // my comparison parameter is lineQ[i][1]
        }
        // based on the answer I got, I'm trying this, but I really don't know how to continue
        std::sort(lineQ.begin(), lineQ.end(), [](const cv::Vec2f &a, const cv::Vec2f &b)
        {
            return ????
        }
I tried using a for and a do-while loop, but I didn't get it, and std::adjacent_find has the condition that the repeated elements must be consecutive.
Maybe it's easy, but I just don't get it!
Thanks for any help!
The easy way is sort then unique-erase, but this changes order.
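A minimal sketch of that easy way (the values are arbitrary; note that the result comes out sorted rather than in the original order):
#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    std::vector<int> vec = {3, 2, 1, 3, 2, 1, 2};
    std::sort(vec.begin(), vec.end()); // group equal elements together
    vec.erase(std::unique(vec.begin(), vec.end()), vec.end()); // drop adjacent repeats
    for (int x : vec) std::cout << x << ' '; // prints 1 2 3
    std::cout << '\n';
}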
The C++11 order-preserving way is to create an unordered_set<int> s; and do:
unordered_set<int> s;
vec.erase(
    std::remove_if(vec.begin(), vec.end(), // remove from vector
        [&](int x) -> bool {
            return !std::get<1>(s.insert(x)); // true iff the item was already in the set
        }
    ),
    vec.end() // erase from the end of the kept elements to the end of the vec
);
which is the remove-erase idiom using the unordered_set to detect duplicates.
I didn't see sort-less source code in the answers already mentioned, so here it goes: a hash table for checking duplicates, shifting unique elements towards the front of the vector. Note that src is always >= dst, and at the end dst is the number of copied, i.e. unique, elements.
#include <unordered_set>
#include <vector>
#include <iostream>

void uniq(std::vector<int> &a) {
    std::unordered_set<int> s;
    size_t dst = 0;
    for (size_t src = 0; src < a.size(); ++src) {
        if (s.count(a[src]) == 0) {
            s.insert(a[src]);
            a[dst++] = a[src];
        }
    }
    a.resize(dst);
}

int main() {
    std::vector<int> a = { 3, 2, 1, 3, 2, 1, 2, 3, 4, 5, 2, 3, 1, 1 };
    uniq(a);
    for (auto v : a)
        std::cout << v << " ";
    std::cout << std::endl;
}
If you want to really remove repeated elements, you may try something like this:
#include <iostream>
#include <algorithm>
#include <vector>
using namespace std;

int main() {
    int data[] = {1, 2, 3, 2, 1};
    vector<int> vectorDATA(&data[0], &data[0] + 5);
    sort(vectorDATA.begin(), vectorDATA.end());
    for (int i = 0; i < vectorDATA.size() - 1; ++i)
    {
        if (vectorDATA[i] == vectorDATA[i + 1]) {
            vectorDATA.erase(vectorDATA.begin() + i + 1);
            --i; // stay on the same element in case it is repeated more than twice
        }
    }
    for (int i = 0; i < vectorDATA.size(); ++i)
    {
        cout << vectorDATA[i] << " ";
    }
    cout << endl;
    return 0;
}
A drawback of this method is that the elements lose their original order.