How to convert a planar mesh to an arrangement in CGAL - C++

I need to convert 2D planar polygonal meshes to 2D Arrangements in CGAL. For example, suppose I have the following mesh in Wavefront OBJ format:
v -5.687006 -4.782805 0.000000
v 4.878987 -4.782805 0.000000
v -5.687006 4.782805 0.000000
v 4.878987 4.782805 0.000000
v -0.404010 -4.782805 0.000000
v -5.687006 0.000000 0.000000
v 4.878987 0.000000 0.000000
v -0.404010 4.782805 0.000000
v -0.404010 0.000000 0.000000
f 5 2 9
f 9 2 7
f 7 4 9
f 9 4 8
f 8 3 9
f 9 3 6
f 6 1 9
f 9 1 5
What is the simplest way to convert it to a 2D Arrangement using the CGAL library?

Which Arrangement_2 insertion function to use depends on how many of a segment's endpoints already exist in the arrangement. For each face, insert its three edges with:
insert_in_face_interior for the first segment (neither endpoint exists yet),
insert_from_left_vertex or insert_from_right_vertex for the middle one (one endpoint already exists), depending on the orientation of your polygon,
insert_at_vertices for the last one (both endpoints already exist).
Shared edges should of course be inserted only once; see the sketch below for extracting them.
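Before any insertions, you need the planar segments themselves. Here is a minimal preprocessing sketch (in Python, with 'mesh.obj' as a placeholder file name; the actual arrangement insertion calls stay in C++) that parses the OBJ above, drops the z coordinate, and collects each shared edge exactly once:

def obj_to_edges(path):
    vertices, edges = [], set()
    with open(path) as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == 'v':
                # planar mesh: keep x and y, drop the z coordinate
                vertices.append((float(parts[1]), float(parts[2])))
            elif parts[0] == 'f':
                # OBJ face indices are 1-based and may carry /vt/vn suffixes
                idx = [int(p.split('/')[0]) - 1 for p in parts[1:]]
                for a, b in zip(idx, idx[1:] + idx[:1]):
                    edges.add((min(a, b), max(a, b)))  # store each undirected edge once
    return vertices, sorted(edges)

vertices, edges = obj_to_edges('mesh.obj')
# each (i, j) pair is one segment vertices[i] -> vertices[j] to insert into the arrangement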

How can I calculate correlation between different values of a python dictionary?

I am learning Python. I want to calculate the correlation between values. Below is my data, which is a dictionary.
My_data = {1: [1450.0, -80.0, 840.0, -220.0, 630.0, 780.0, -1140.0], 2: [1450.0, -80.0, 840.0, -220.0, 630.0, 780.0, -1140.0],3:[ 720.0, -230.0, 460.0, 220.0, 710.0, -460.0, 90.0] }
This is what I expect to have in return.
      1     2     3
1     1  0.69  0.77
2           1  0.54
3                 1
This is the code I tried. I get TypeError: unsupported operand type(s) for /: 'list' and 'long'.
I am not sure what went wrong. I would appreciate it if somebody could explain the error and help me get the desired result.
my_array = np.array(My_data.values())
Correlation = np.corrcoef(my_array, my_array)
Case 1: if you are open to using pandas
Using pandas (which is built on top of numpy), you can proceed as follows:
In [55]: import pandas as pd
In [56]: df = pd.DataFrame.from_dict(My_data, orient='index').T
In [57]: df.corr(method='pearson')
Out[57]:
          1         2         3
1  1.000000  1.000000  0.384781
2  1.000000  1.000000  0.121978
3  0.384781  0.121978  1.000000
In [58]: df.corr(method='kendall')
Out[58]:
          1         2         3
1  1.000000  1.000000  0.333333
2  1.000000  1.000000  0.240385
3  0.333333  0.240385  1.000000
In [59]: df.corr(method='spearman')
Out[59]:
          1        2         3
1  1.000000  1.00000  0.464286
2  1.000000  1.00000  0.327370
3  0.464286  0.32737  1.000000
Explanation:
The following line creates a pandas.DataFrame from the dictionary My_data:
df = pd.DataFrame.from_dict(My_data, orient='index').T
which looks like this:
In [60]: df
Out[60]:
          1       2       3
0    1450.0  1450.0   720.0
1     -80.0   -80.0  -230.0
2     840.0   840.0   460.0
3    -220.0  -220.0   220.0
4     630.0   630.0   710.0
5     780.0   780.0  -460.0
6   -1140.0 -1140.0    90.0
7       NaN   450.0  -640.0
8       NaN   730.0   870.0
9       NaN  -810.0  -290.0
10      NaN   390.0 -2180.0
11      NaN  -220.0  -790.0
12      NaN -1640.0    65.0
13      NaN  -590.0    70.0
14      NaN  -145.0   460.0
15      NaN  -420.0     NaN
16      NaN   620.0     NaN
17      NaN   450.0     NaN
18      NaN   -90.0     NaN
19      NaN   990.0     NaN
20      NaN  -705.0     NaN
Then df.corr() computes the pairwise correlation between the columns.
Case 2: if you want a pure numpy solution
You need to convert your data into a numpy.ndarray first; then you can compute the correlation like this (note that under Python 3 you would need np.asarray(list(My_data.values())), since dict.values() no longer returns a list):
In [91]: np.corrcoef(np.asarray(My_data.values()))
Out[91]:
array([[ 1.        ,  1.        ,  0.38478131],
       [ 1.        ,  1.        ,  0.38478131],
       [ 0.38478131,  0.38478131,  1.        ]])
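For reference, the TypeError in the question typically comes from lists of unequal length: np.array then builds an object array of lists, and corrcoef ends up trying to divide a list by an integer when it computes the means. With the equal-length lists shown in the question, a minimal Python 3 sketch runs cleanly:

import numpy as np

My_data = {1: [1450.0, -80.0, 840.0, -220.0, 630.0, 780.0, -1140.0],
           2: [1450.0, -80.0, 840.0, -220.0, 630.0, 780.0, -1140.0],
           3: [720.0, -230.0, 460.0, 220.0, 710.0, -460.0, 90.0]}

# materialize the dict view before handing it to numpy -> shape (3, 7) float array
my_array = np.array(list(My_data.values()))

# corrcoef treats each row as one variable, so a single argument suffices
Correlation = np.corrcoef(my_array)
print(Correlation.round(2))  # rows 1 and 2 are identical, so their correlation is 1.0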

Pandas Dataframe: Pairwise division of columns without replacement [duplicate]

This question already has answers here: Fastest way to calculate difference in all columns (4 answers). Closed 5 years ago.
I am trying to divide all columns by each other column, but only once (A/B but not B/A).
From Dividing each column by every other column and creating a new dataframe from the results, and thanks to @COLDSPEED, the following code performs the division of all columns by every column (and adds the corresponding new columns).
I cannot figure out how to avoid the pair duplication.
import pandas as pd
import numpy as np

np.random.seed(42)
df = pd.DataFrame(np.random.randint(0, 9, size=(5, 3)), columns=list('ABC'))
ratio_df = pd.concat([df[df.columns.difference([col])].div(df[col], axis=0)
                      for col in df.columns], axis=1)
print(ratio_df)
Which outputs:
Original dataframe
   A  B  C
0  6  3  7
1  4  6  2
2  6  7  4
3  3  7  7
4  2  5  4
Resulting dataframe
          B         C         A         C         A         B
0  0.500000  1.166667  2.000000  2.333333  0.857143  0.428571
1  1.500000  0.500000  0.666667  0.333333  2.000000  3.000000
2  1.166667  0.666667  0.857143  0.571429  1.500000  1.750000
3  2.333333  2.333333  0.428571  1.000000  0.428571  1.000000
4  2.500000  2.000000  0.400000  0.800000  0.500000  1.250000
In row 0, the value in the first column labeled B is B/A, or 3/6 = 0.5, and the value in the first column labeled A is A/B, or 6/3 = 2.
I would like to keep only one result per pair operation (e.g. only left column / right column):
        A/B       A/C       B/C
0  2.000000  0.857143  0.428571
1  0.666667  2.000000  3.000000
2  0.857143  1.500000  1.750000
3  0.428571  0.428571  1.000000
4  0.400000  0.500000  1.250000
I was not able to find clues on this matter.
How could I resolve it?
Thanks!
Here's one approach -
# indices of the upper triangle (offset 1 skips the diagonal): one entry per unordered column pair
idx0, idx1 = np.triu_indices(df.shape[1], 1)
df_out = pd.DataFrame(df.iloc[:, idx0].values / df.iloc[:, idx1])
c = df.columns.values
df_out.columns = c[idx0] + '/' + c[idx1]
Sample run -
In [58]: df
Out[58]:
   A  B  C
0  6  3  7
1  4  6  2
2  6  7  4
3  3  7  7
4  2  5  4
In [59]: df_out
Out[59]:
        A/B       A/C       B/C
0  2.000000  0.857143  0.428571
1  0.666667  2.000000  3.000000
2  0.857143  1.500000  1.750000
3  0.428571  0.428571  1.000000
4  0.400000  0.500000  1.250000
Alternative way to get idx0 and idx1 -
from itertools import combinations
idx0, idx1 = np.array(list(combinations(range(df.shape[1]), 2))).T
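To see how the two index arrays pair the columns up, here is a quick standalone sketch for three columns:

import numpy as np

idx0, idx1 = np.triu_indices(3, 1)
print(idx0)  # [0 0 1] -> numerator columns   A, A, B
print(idx1)  # [1 2 2] -> denominator columns B, C, C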

tu, tv texture coordinates larger than 1.0? (OBJ format)

I want to understand how the OBJ format deals with texture coordinates.
example:
vt 1.000000 1.005200
vt 0.467300 1.709900
vt 0.923800 1.994400
vt 0.500000 1.002600
vt 0.371400 1.000000
vt 0.438100 2.000000
vt 0.000000 1.000000
vt 0.467300 1.709900
vt 0.105000 1.159500
vt 0.434600 1.002300
I understand the values should range from 0 to 1.000000 to cover the texture image from 0% to 100% along each axis (tu, tv).
But I find some values in the array are above 1.000000, and sometimes below 0.000000.
How should I deal with these values so that they stay between 0 and 1?
Texture coordinate values outside the [0, 1] range indicate that the texture is to be repeated (tiled). In the case of the OBJ line
vt 0.438100 2.000000
the v part makes the texture repeat twice.
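If you really do need coordinates inside [0, 1], the usual repeat wrap is just the fractional part. A minimal sketch (plain Python, assuming repeat-mode tiling is the intended behavior):

import math

def wrap_repeat(c):
    # map any texture coordinate to [0, 1) with repeat semantics
    return c - math.floor(c)  # e.g. 2.0 -> 0.0, -0.25 -> 0.75

# a couple of the vt values from the question:
for u, v in [(0.438100, 2.000000), (0.467300, 1.709900)]:
    print(wrap_repeat(u), wrap_repeat(v))

In practice, though, you normally leave the values as they are and let the sampler's wrap mode (e.g. GL_REPEAT) handle the tiling.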

What's different between my lookAt and perspective calls vs. gluPerspective and gluLookAt? (cube stretched)

Side note: hey everyone, if you found my question/answer helpful, please don't forget to upvote. I kind of need it...
So there seems to be something different in my implementation of the two matrices [projection and model-view] (other than the parts I've commented out for debugging purposes). Below is a screenshot of the bug I see when drawing a cube. Keep in mind that I do keep the viewport and matrix up to date with the window size and calculate the screen ratio with float rather than int, so don't bother asking; I've checked the usual suspects.
Screen Shot
Files (linux build, see readme in ./build)
Side note: while debugging, I've changed the cube's distance. To reproduce the screenshot, set mDistance to about 90 on line 76 of workspace.cpp and stretch the window frame to the dimensions noted at the lower right corner of the window.
Please keep in mind that the screenshot and the debug text output are separate events, as I'm constantly debugging this problem and getting new numbers.
The code:
#define _AP_MAA 0
#define _AP_MAB 1
#define _AP_MAC 2
#define _AP_MAD 3
#define _AP_MBA 4
#define _AP_MBB 5
#define _AP_MBC 6
#define _AP_MBD 7
#define _AP_MCA 8
#define _AP_MCB 9
#define _AP_MCC 10
#define _AP_MCD 11
#define _AP_MDA 12
#define _AP_MDB 13
#define _AP_MDC 14
#define _AP_MDD 15
Setting up the camera perspective:
void APCamera::setPerspective(GMFloat_t fov, GMFloat_t aspect, GMFloat_t near, GMFloat_t far)
{
    GMFloat_t difZ = near - far;
    GMFloat_t *data;

    mProjection->clear(); //set to identity matrix
    data = mProjection->getData();

    GMFloat_t v = 1.0f / tan(fov / 2.0f);

    data[_AP_MAA] = v / aspect;
    data[_AP_MBB] = v;
    data[_AP_MCC] = (far + near) / (difZ);
    data[_AP_MCD] = -1.0f;
    data[_AP_MDD] = 0.0f;
    data[_AP_MDC] = (2.0f * far * near) / (difZ);

    mRatio = aspect;
    mInvProjOutdated = true;
    mIsPerspective = true;
}
Setting up the camera direction:
bool APCamera::lookTo(Coordinate &to, Coordinate &from, Coordinate &up)
{
    Coordinate f, unitUp, right;
    GMFloat_t *data;

    CoordinateOp::diff(&to, &from, &f);
    VectorOp::toUnit(&f, &f);
    VectorOp::toUnit(&up, &unitUp);
    VectorOp::cross(&f, &unitUp, &right);

    if((fabs(right.x) < FLOAT_THRESHOLD) && (fabs(right.y) < FLOAT_THRESHOLD) && (fabs(right.z) < FLOAT_THRESHOLD))
    {
        return false;
    }

    mCamPt = from;
    VectorOp::toUnit(&right, &mRight);
    mForward = f;
    VectorOp::cross(&mRight, &mForward, &mUp);

    mModelView->clear();
    data = mModelView->getData();

    data[_AP_MAA] = mRight.x;
    data[_AP_MBA] = mRight.y;
    data[_AP_MCA] = mRight.z;

    data[_AP_MAB] = mUp.x;
    data[_AP_MBB] = mUp.y;
    data[_AP_MCB] = mUp.z;

    data[_AP_MAC] = -mForward.x;
    data[_AP_MBC] = -mForward.y;
    data[_AP_MCC] = -mForward.z;

    //translation part is commented out to narrow bugs down; the "camera" is kept at the center (0,0,0)
    //data[_AP_MDA] = (data[_AP_MAA] * -mCamPt.x) + (data[_AP_MBA] * -mCamPt.y) + (data[_AP_MCA] * -mCamPt.z);
    //data[_AP_MDB] = (data[_AP_MAB] * -mCamPt.x) + (data[_AP_MBB] * -mCamPt.y) + (data[_AP_MCB] * -mCamPt.z);
    //data[_AP_MDC] = (data[_AP_MAC] * -mCamPt.x) + (data[_AP_MBC] * -mCamPt.y) + (data[_AP_MCC] * -mCamPt.z);

    mInvViewOutdated = true;
    return true;
}
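A quick way to sanity-check the basis vectors lookTo() builds is to redo the cross products outside the engine; a small numpy sketch using the From/To values from the debug dump below:

import numpy as np

f = np.array([-1.0, 0.0, 0.0])  # normalize(to - from) for From:<0,0,0> To:<-1,0,0>
up = np.array([0.0, 1.0, 0.0])

right = np.cross(f, up)        # [ 0.  0. -1.]
true_up = np.cross(right, f)   # [ 0.  1.  0.]
# right, true_up, and -f match the first three rows printed in the dump below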
The debug output:
LookTo() From:<0,0,0> To:<-1,0,0>:
 0.000000  0.000000 -1.000000  0.000000
 0.000000  1.000000  0.000000  0.000000
 1.000000 -0.000000 -0.000000  0.000000
 0.000000  0.000000  0.000000  1.000000
setPerspective() fov:0.785398 ratio:1.185185 near:0.500000 far:100.000000:
 2.036993  0.000000  0.000000  0.000000
 0.000000  2.414213  0.000000  0.000000
 0.000000  0.000000 -1.010050 -1.005025
 0.000000  0.000000 -1.000000  0.000000
In the end, it looks like the troublemaker was just the FOV. So the quick answer is no, I didn't do anything different from the documented perspective and lookAt functions. For anyone having a similar problem, computing the field of view as 2.0f * atan(tan(DEFAULT_FOV_RAD / mRatio) * mRatio) did the job for me.
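As a sanity check, the projection numbers in the debug dump can be reproduced directly from the setPerspective() formulas; a small Python sketch that just re-does the arithmetic:

import math

fov, aspect, near, far = 0.785398, 1.185185, 0.5, 100.0
v = 1.0 / math.tan(fov / 2.0)
difZ = near - far

print(v / aspect)                 # ~  2.036993 (_AP_MAA)
print(v)                          # ~  2.414213 (_AP_MBB)
print((far + near) / difZ)        # ~ -1.010050 (_AP_MCC)
print((2.0 * far * near) / difZ)  # ~ -1.005025 (_AP_MDC)

These match the dump, which supports the conclusion that the matrix construction itself was fine and only the FOV handling was off.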

Convert my list output to a dataframe in pandas

How do I convert my list output to a DataFrame? Below is a sample of the code and data.
import pandas as pd
import numpy as np
from datetime import datetime

dat = pd.read_csv()  # (file path omitted)
dat.Date = dat.Date.apply(lambda d: datetime.strptime(d, "%d-%m-%Y"))
dat.index = dat.Date
dat = dat.drop(['Date'], axis=1)

################################################################
# Provide input parameters
Decay = 0.4
Decay_Dur = 15       # (in days)
Return_Avg_Dur = 15  # (in days)
################################################################

Weights = [pow(i, ((2 * Decay) - 1)) for i in range(1, Decay_Dur + 1)]  # Calculate Weights
Weights = Weights[::-1]  # Reverse the order

fin_dat = [0]
for j in range(1, (dat.shape[0] - Decay_Dur)):
    Sum_Weighted_Index = 0
    for i in range(j, Decay_Dur + j):
        temp = Weights[i - j] * dat.iat[i - 1, 2]
        Sum_Weighted_Index += temp
    fin_dat.append(Sum_Weighted_Index)
Date         SPX Index  Surprise Index  S&P 500 Daily Return
19-07-2007     1553.08         -0.0563                0.0045
20-07-2007      1534.1               0               -0.0122
23-07-2007     1541.57               0                0.0049
24-07-2007     1511.04               0               -0.0198
25-07-2007     1518.09               0                0.0047
26-07-2007     1482.66               0               -0.0233
27-07-2007     1458.95               0                -0.016
30-07-2007     1473.91               0                0.0103
31-07-2007     1455.27         -0.0867               -0.0126
01-08-2007     1465.81         -0.1529                0.0072
02-08-2007      1472.2               0                0.0044
03-08-2007     1433.06         -0.0848               -0.0266
06-08-2007     1467.67               0                0.0242
07-08-2007     1476.71               0                0.0062
08-08-2007     1497.49               0                0.0141
09-08-2007     1453.09               0               -0.0296
10-08-2007     1453.64               0                0.0004
13-08-2007     1452.92          0.0138               -0.0005
14-08-2007     1426.54               0               -0.0182
15-08-2007      1406.7               0               -0.0139
16-08-2007     1411.27         -0.1289                0.0032
17-08-2007     1445.94               0                0.0246
20-08-2007     1445.55               0               -0.0003
21-08-2007     1447.12               0                0.0011
22-08-2007     1464.07               0                0.0117
23-08-2007      1462.5               0               -0.0011
24-08-2007     1479.37               0                0.0115
27-08-2007     1466.79               0               -0.0085
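For the literal question in the title, a minimal sketch (reusing dat and fin_dat from the code above; the column name 'Sum_Weighted_Index' is just illustrative): once the list is built, turning it into a DataFrame, or attaching it to the original frame, is one line each:

import pandas as pd

# as a standalone DataFrame
fin_df = pd.DataFrame({'Sum_Weighted_Index': fin_dat})

# or aligned with the first len(fin_dat) rows of the original frame
dat_out = dat.iloc[:len(fin_dat)].copy()
dat_out['Sum_Weighted_Index'] = fin_dat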
I tried to use your code and then created a new version using pandas functions.
These are all my "notes", with the result at the end.
Check whether the results are correct.
import pandas as pd
#--- generate some data ---
#dates = pd.date_range( '01-01-2010', periods=30, freq='D' )
#values = range(0,30)
#dat = pd.DataFrame( {'Date':dates, 'val1':values, 'val2':values} )
#dat.index = dat.Date
#print dat
data = '''Date SPX Surprise S&P-500
19-07-2007 1553.08 -0.0563 0.0045
20-07-2007 1534.1 0 -0.0122
23-07-2007 1541.57 0 0.0049
24-07-2007 1511.04 0 -0.0198
25-07-2007 1518.09 0 0.0047
26-07-2007 1482.66 0 -0.0233
27-07-2007 1458.95 0 -0.016
30-07-2007 1473.91 0 0.0103
31-07-2007 1455.27 -0.0867 -0.0126
01-08-2007 1465.81 -0.1529 0.0072
02-08-2007 1472.2 0 0.0044
03-08-2007 1433.06 -0.0848 -0.0266
06-08-2007 1467.67 0 0.0242
07-08-2007 1476.71 0 0.0062
08-08-2007 1497.49 0 0.0141
09-08-2007 1453.09 0 -0.0296
10-08-2007 1453.64 0 0.0004
13-08-2007 1452.92 0.0138 -0.0005
14-08-2007 1426.54 0 -0.0182
15-08-2007 1406.7 0 -0.0139
16-08-2007 1411.27 -0.1289 0.0032
17-08-2007 1445.94 0 0.0246
20-08-2007 1445.55 0 -0.0003
21-08-2007 1447.12 0 0.0011
22-08-2007 1464.07 0 0.0117
23-08-2007 1462.5 0 -0.0011
24-08-2007 1479.37 0 0.0115
27-08-2007 1466.79 0 -0.0085'''
# StringIO lives in the io module on Python 3, and pd.DataFrame.from_csv has been
# removed from pandas; read_csv with index_col/parse_dates is the modern equivalent
from io import StringIO
dat = pd.read_csv(StringIO(data), sep=r'\s+', index_col=0, parse_dates=True)
#------------------------------------------
decay = 0.4
decay_dur = 15       # (in days)
return_avg_dur = 15  # (in days)

#--- old version ---
weights = [pow(i, (2 * decay) - 1) for i in range(1, decay_dur + 1)]  # Calculate Weights
weights = weights[::-1]  # Reverse the order
#weights = [pow(i, (2 * decay) - 1) for i in range(1, decay_dur + 1)][::-1]

#fin_dat=[0]
dat['old'] = 0.0
for j in range(1, (dat.shape[0] - decay_dur)):
    sum_weighted_index = 0
    for i in range(j, decay_dur + j):
        #sum_weighted_index += weights[i-j] * dat.iat[i-1,2]
        sum_weighted_index += weights[i - j] * dat['S&P-500'].iat[i - 1]
    #fin_dat.append(sum_weighted_index)
    # assign via iloc/get_loc to avoid chained assignment on modern pandas
    dat.iloc[j, dat.columns.get_loc('old')] = sum_weighted_index
    #print(sum_weighted_index)
#--- new version ---
#def sum_weighted_index(data):
#    result = 0
#    for w, d in zip(weights, data):
#        result += w * d
#    return result

def sum_weighted_index(data):
    return sum(w * d for w, d in zip(weights, data))

# pd.rolling_apply was removed from pandas; .rolling().apply(..., raw=True) is the modern equivalent
dat['new'] = dat['S&P-500'].rolling(decay_dur).apply(sum_weighted_index, raw=True).shift(-decay_dur + 2).fillna(0)
print(dat)
result
                SPX  Surprise  S&P-500       old       new
Date
2007-07-19  1553.08   -0.0563   0.0045  0.000000  0.000000
2007-07-20  1534.10    0.0000  -0.0122 -0.010550 -0.010550
2007-07-23  1541.57    0.0000   0.0049 -0.044731 -0.044731
2007-07-24  1511.04    0.0000  -0.0198 -0.034384 -0.034384
2007-07-25  1518.09    0.0000   0.0047 -0.036309 -0.036309
2007-07-26  1482.66    0.0000  -0.0233 -0.042091 -0.042091
2007-07-27  1458.95    0.0000  -0.0160 -0.055676 -0.055676
2007-07-30  1473.91    0.0000   0.0103 -0.035502 -0.035502
2007-07-31  1455.27   -0.0867  -0.0126 -0.000058 -0.000058
2007-01-08  1465.81   -0.1529   0.0072 -0.008301 -0.008301
2007-02-08  1472.20    0.0000   0.0044 -0.000615 -0.000615
2007-03-08  1433.06   -0.0848  -0.0266  0.006442  0.006442
2007-06-08  1467.67    0.0000   0.0242  0.001076  0.001076
2007-07-08  1476.71    0.0000   0.0062  0.000000  0.027115
2007-08-08  1497.49    0.0000   0.0141  0.000000  0.002560
2007-09-08  1453.09    0.0000  -0.0296  0.000000  0.000000
2007-10-08  1453.64    0.0000   0.0004  0.000000  0.000000
2007-08-13  1452.92    0.0138  -0.0005  0.000000  0.000000
2007-08-14  1426.54    0.0000  -0.0182  0.000000  0.000000
2007-08-15  1406.70    0.0000  -0.0139  0.000000  0.000000
2007-08-16  1411.27   -0.1289   0.0032  0.000000  0.000000
2007-08-17  1445.94    0.0000   0.0246  0.000000  0.000000
2007-08-20  1445.55    0.0000  -0.0003  0.000000  0.000000
2007-08-21  1447.12    0.0000   0.0011  0.000000  0.000000
2007-08-22  1464.07    0.0000   0.0117  0.000000  0.000000
2007-08-23  1462.50    0.0000  -0.0011  0.000000  0.000000
2007-08-24  1479.37    0.0000   0.0115  0.000000  0.000000
2007-08-27  1466.79    0.0000  -0.0085  0.000000  0.000000
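As a side note, the inner weighted sum is just a dot product per window, so the whole loop can also be written with numpy alone; a short sketch under the same weights as above:

import numpy as np

decay, decay_dur = 0.4, 15
weights = np.array([i ** (2 * decay - 1) for i in range(1, decay_dur + 1)])[::-1]

returns = dat['S&P-500'].to_numpy()
# one weighted sum per full window; flipping the kernel makes convolve
# compute sum(weights[i] * returns[j + i]) for each window start j
weighted = np.convolve(returns, weights[::-1], mode='valid')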