I'm having trouble with the DES algorithm. How can I solve this?
(I'm using Dev-C++ v5.11.)
I don't completely understand what DES is. What should I do or try?
// Triple DES (3DES)
void DES::inital_key(const char key[64], char ekey[16][48], bool is_crypt)
{
    union { // Error here
        char pkey[56];
        struct { char l[28], r[28]; };
    };
    permute(key, pkey, _DES::perm1, 56);
    for(uint n = 0; n < 16; n++) {
        lshift(l, _DES::sc[n]);
        lshift(r, _DES::sc[n]);
        permute(pkey, ekey[is_crypt ? n : 15-n], _DES::perm2, 48);
    }
}
/////////////////////////////////////////////////////////////////////////////
void DES::work(const char in[64], char out[64], const char key[64], bool is_crypt)
{
    char ekey[16][48];
    union { // And here
        char pin[64];
        struct { char l[32], r[32]; };
    };
    inital_key(key, ekey, is_crypt);
    permute(in, pin, _DES::perm3, 64);
    for(uint n = 0; n < 16;) round(l, r, ekey[n++]), round(r, l, ekey[n++]);
    permute(pin, out, _DES::perm6, 64);
}
You are declaring anonymous structs inside unnamed unions, just as the compiler errors say. You need to give the unions names (and, for portability, you should name the structs too):
void DES::inital_key(const char key[64], char ekey[16][48], bool is_crypt)
{
    union {
        char pkey[56];
        struct { char l[28], r[28]; } s;
    } u;
    permute(key, u.pkey, _DES::perm1, 56);
    for(uint n = 0; n < 16; n++) {
        lshift(u.s.l, _DES::sc[n]);
        lshift(u.s.r, _DES::sc[n]);
        permute(u.pkey, ekey[is_crypt ? n : 15-n], _DES::perm2, 48);
    }
}

void DES::work(const char in[64], char out[64], const char key[64], bool is_crypt)
{
    char ekey[16][48];
    union {
        char pin[64];
        struct { char l[32], r[32]; } s;
    } u;
    inital_key(key, ekey, is_crypt);
    permute(in, u.pin, _DES::perm3, 64);
    for(uint n = 0; n < 16;) round(u.s.l, u.s.r, ekey[n++]), round(u.s.r, u.s.l, ekey[n++]);
    permute(u.pin, out, _DES::perm6, 64);
}
This code relies on a Microsoft compiler extension (anonymous struct members), which your compiler, GCC, does not accept by default. You cannot use this code unless you make it standard C++ or switch compilers.
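If you would rather not name the structs and unions at all, a standard-C++ alternative (just a sketch of the idea, not the full class) is to keep one flat buffer and let l and r point at its two halves:
// Sketch: one flat 56-cell buffer; l and r are simply pointers into it,
// so the existing permute()/lshift() calls can stay the same.
char pkey[56];
char *l = pkey;        // first 28 cells
char *r = pkey + 28;   // last 28 cells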
#include <iostream>
#include <string.h>
#include <time.h>
using namespace std;

struct MyID
{
    char FirstName[10]; // array for the length of the word
    char LastName[10];  // array for the length of the word
    int IdNumber;
};

void InitializeArray(MyID IDNumber[], int Size);
//void SortTheArray(MyID IDNumber[], int Size);

int main(){
    const int Size = 100;
    MyID IDNumber[Size];
    strcpy_s(IDNumber[Size].FirstName, "Aziz");
    strcpy_s(IDNumber[Size].LastName, "LEGEND");
    // I believe the error is around here.
    InitializeArray(IDNumber, Size);
    //SortTheArray(IDNumber, Size);
}

void InitializeArray(MyID IDNumber[], int Size){
    //srand(time(0));
    for (int i = 0; i < Size; i++){
        //IDNumber[i].IdNumber = rand() %100 ;
        cout << IDNumber[i].FirstName << endl;
        IDNumber[i].LastName;
    }
}
Every time I try to test my function and struct, this error pops up. I want to check that my name prints correctly before I continue writing the rest of the program; the idea is to print the same name every time without asking the user to enter it each time.
I have also uploaded a picture of the result if you want to see it.
You are getting a buffer overrun error because you are indexing past the end of the array:
const int Size = 100;
MyID IDNumber[Size];
strcpy_s(IDNumber[Size].FirstName, "Aziz");
strcpy_s(IDNumber[Size].LastName, "LEGEND");
The expression IDNumber[Size] is equivalent to IDNumber[100].
In C++, array slot indices go from 0 to Size - 1. You are accessing one past the end of the array.
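For example (a minimal sketch of one fix, keeping your MSVC-style strcpy_s calls), writing to a valid index such as 0 stays inside the array:
// Valid indices for IDNumber are 0 .. Size - 1, so use one of those:
strcpy_s(IDNumber[0].FirstName, "Aziz");
strcpy_s(IDNumber[0].LastName, "LEGEND");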
Edit 1: Initializing an array
Based on your comment, you can use a loop to initialize the elements of a container (a std::vector here):
#include <string>
#include <sstream>
#include <vector>

struct Person
{
    std::string first_name;
    std::string last_name;
};

const unsigned int CAPACITY = 100;

int main()
{
    std::vector<Person> database(CAPACITY);
    for (unsigned int i = 0; i < CAPACITY; ++i)
    {
        std::ostringstream name_stream;              // fresh stream each iteration
        name_stream << "Aziz" << i;                  // "Aziz0", "Aziz1", ...
        database[i].first_name = name_stream.str();
        database[i].last_name = "LEGEND";
    }
    return 0;
}
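To check that the names were stored, you could then print a few entries (this assumes <iostream> is included as well):
std::cout << database[0].first_name << " " << database[0].last_name << "\n";
std::cout << database[1].first_name << " " << database[1].last_name << "\n";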
I am creating a table in C++. I have a class that holds the table's functionality. I would like to allocate memory for a square table with a member function, but I don't know how to write the constructor and the memory-allocation function.
I read the size from the keyboard, and I'd like to pass that returned value to another function that allocates the memory. The table must be a two-dimensional array[][] or matrix.
#include <iostream>
using namespace std;

class Table {
    unsigned int size;
public:
    unsigned int GetTableSize();
    unsigned int *GetMemory(unsigned int);
};

unsigned int Table::GetTableSize() {
    cout << "Give size: " << endl;
    cin >> size;
    return size;
}

unsigned int *Table::GetMemory(unsigned int s){
    s = size;
    return new unsigned int[s * s];
}

int main()
{
    Table tab;
    tab.GetTableSize();
    tab.GetMemory();
    return 0;
}
The GetMemory function must return the memory allocated for the table. I have a problem with tab.GetMemory; I tried tab.*GetMemory as well.
With tab*GetMemory, Qt Creator says: GetMemory is not declared.
With tab.GetMemory, Qt Creator says: no matching function for call to 'Table::GetMemory'.
First of all, your question is not entirely clear, but I can explain the error you are getting in Qt.
In your code, you are not passing any value in the line
tab.GetMemory();
even though the function expects an unsigned integer argument.
I suggest modifying the code snippet as follows.
Your code:
unsigned int *Table::GetMemory(unsigned int s){
    s = size;
    return new unsigned int[s * s];
}
Modified code:
unsigned int *Table::GetMemory(){
    //s = size;
    return new unsigned int[size * size];
}
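For this modification to compile, the declaration inside the class must match the new signature, and the returned pointer should be kept (and eventually released with delete[]). A small sketch of how the pieces fit together:
class Table {
    unsigned int size;
public:
    unsigned int GetTableSize();
    unsigned int *GetMemory();   // no parameter any more
};

int main()
{
    Table tab;
    tab.GetTableSize();
    unsigned int *data = tab.GetMemory();
    // treat data as a size x size matrix: data[row * size + col]
    delete[] data;
    return 0;
}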
Hope that helps.
I wanted a heap-allocated buffer common to a class (to use as a scratchpad during computations). At some point I may free and then reallocate the buffer if it is not large enough. I wanted the buffer to exist without having to call a "myclass::initialize();" in main(), and I came up with the following code, which compiles and works well for my purpose.
My questions are: Why does this code compile correctly? Why is malloc() allowed to appear outside of main() or any other function? Is the compiler interpreting this somehow and removing the malloc?
The code was compiled on 64-bit Linux using "g++ example.cpp" and checked with valgrind.
// example.cpp
#include <cstdio>
#include <cstdlib>

class myclass {
public:
    static char* pbuf;                     // buffer
    static unsigned int length;            // buffer length
    const static unsigned int chunk_size;  // allocation chunk size
};

// set constants and allocate buffer
const unsigned int myclass::chunk_size = sizeof(long unsigned int) * 8;
unsigned int myclass::length = chunk_size; // start with smallest chunk
char* myclass::pbuf = (char*)malloc(sizeof(char)*myclass::length);

int main() {
    // write to buffer (0 to 63 on 64bit machine)
    for (int i = 0; i < myclass::length; i++) {
        *(myclass::pbuf+i) = i;
    }
    // read from buffer (print the numbers 0 to 63)
    for (int i = 0; i < myclass::length; i++) {
        printf("%d\n", *(myclass::pbuf+i));
    }
    free(myclass::pbuf); // last line of program
}
Thanks for the answers. Sounds like this is more common than I thought: "Function calls are allowed in static initializers." This leads me to a slightly modified version that catches a possible malloc error:
#include <cstdio>
#include <cstdlib>

class myclass {
public:
    static char* pbuf;                     // buffer
    static unsigned int length;            // buffer length
    const static unsigned int chunk_size;  // allocation chunk size
    static void* malloc_buf(unsigned int);
};

// set constants and allocate buffer
const unsigned int myclass::chunk_size = sizeof(long unsigned int) * 8;
unsigned int myclass::length = chunk_size; // start with smallest chunk
//char* myclass::pbuf = (char*)malloc(sizeof(char)*myclass::length);
char* myclass::pbuf = (char*)myclass::malloc_buf(sizeof(char)*myclass::length);

void* myclass::malloc_buf(unsigned int N) {
    void* buf = malloc(N);
    if (!buf) exit(EXIT_FAILURE);
    return buf;
}

int main() {
    // write to buffer (0 to 63 on 64bit machine)
    for (int i = 0; i < myclass::length; i++) {
        *(myclass::pbuf+i) = i;
    }
    // read from buffer (print the numbers 0 to 63)
    for (int i = 0; i < myclass::length; i++) {
        printf("%d\n", *(myclass::pbuf+i));
    }
    free(myclass::pbuf); // last line of program
}
It's just doing static initialization (initialization that runs before main is called), and static initializers are allowed to call functions.
main() is just another function, which is why such specific requirements are placed on it so that it can be called properly.
Other things can and do happen before it gets called; static initialization is among them.
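As a minimal sketch of the point (not taken from your code), the initializer of a namespace-scope or static-member object can call any function, and that call runs before main():
#include <cstdio>

static int make_value() {
    std::puts("runs before main()");   // observable side effect of static initialization
    return 42;
}

static int value = make_value();       // dynamic initialization of a static object

int main() {
    std::printf("value = %d\n", value);
}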
I came across this code while developing a class for GA/GP, but I failed to understand it and hence am unable to debug the program.
typedef struct {
    void *dataPointer;
    int length;
} binary_data;

typedef struct {
    organism *organisms; // This must be malloc'ed
    int organismsCount;
    int (*fitnessTest)(organism org);
    int orgDnaLength;
    unsigned int desiredFitness;
    void (*progress)(unsigned int fitness);
} evolutionary_algorithm;
The above is straightforward. Then we try to initialize the organisms before testing their fitness, etc.:
int main(int argc, char *argv[])
{
    srand(time(NULL));
    int i;
    evolutionary_algorithm ea;
    ea.progress = progressDisplayer;
    ea.organismsCount = 50;
    ea.orgDnaLength = sizeof(unsigned int);
    organism *orgs = (organism *) malloc(sizeof(organism) * ea.organismsCount);
    for (i = 0; i < 50; i++)
    {
        organism newOrg;
        binary_data newOrgDna;
        newOrgDna.dataPointer = malloc(sizeof(unsigned int));
        memset(newOrgDna.dataPointer, i, 1);
        newOrgDna.length = sizeof(unsigned int);
        newOrg.dna = newOrgDna;
        orgs[i] = newOrg;
    }
As far as I understand, memset() writes a binary value into the memory that the void pointer (newOrgDna.dataPointer) points to, and so on. But I can't figure out how to reassemble those bytes to get the integer value assigned to the "dna" member of newOrg, so that I can check the integer value assigned to an individual organism, and eventually the entire population stored in the memory assigned to "orgs".
As you can guess from the above, I'm not very familiar with memory management at this level of detail, so your help is very much appreciated.
Thank you so much.
This code looks a bit strange. This line:
newOrgDna.dataPointer = malloc(sizeof(unsigned int));
allocates sizeof(unsigned int) bytes (typically 4, even on 64-bit machines). The strange part is that the memset in the line just below sets only the first byte.
To get the actual value, you might do:
char val = *((char*) newOrgDna.dataPointer);
But, as I said, this code looks a bit off. I would rewrite it as:
for (i = 0; i < 50; i++)
{
    organism newOrg;
    binary_data newOrgDna;
    unsigned int *data = (unsigned int*) malloc(sizeof(unsigned int));
    *data = i;
    newOrgDna.length = sizeof(*data);
    newOrgDna.dataPointer = (void*) data; // I think that cast can be dropped
    newOrg.dna = newOrgDna;
    orgs[i] = newOrg;
}
Then, anywhere you want to read the data back from an organism *, you can do (this assumes <assert.h> is included):
void f( organism * o )
{
    assert( sizeof(unsigned int) == o->dna.length );
    unsigned int data = *((unsigned int*) o->dna.dataPointer);
}
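To inspect the whole population (a small sketch reusing i, ea, and orgs from the question's main, with the rewritten loop above and <stdio.h> included), you can read each organism's dna the same way:
for (i = 0; i < ea.organismsCount; i++)
{
    unsigned int value = *((unsigned int*) orgs[i].dna.dataPointer);
    printf("organism %d: %u\n", i, value);
}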
Also, this is rather a C question, not a C++ one.