How to implement PCA dimensionality reduction on a 3D matrix the right way

I have a 3D data matrix of shape (2050, 19, 9) and I want to apply PCA dimensionality reduction to it. What should I do so that I have good training data after the dimensionality reduction? Below is my reduction code.
SIDataPCA = np.array([features_2d.flatten() for features_2d in SmartInsole])
pca = PCA(n_components=89)
pca.fit(SIDataPCA)
data_pca = pca.transform(SIDataPCA)
print("original shape: ", SIDataPCA.shape)
print("transformed shape:", data_pca.shape)
Below is what one sample of SmartInsole looks like:
[[ 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[ 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[ 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[ 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[ 0. 0. 0. 46. 35. 1. 0. 0. 0.]
[ 0. 0. 0. 2. 48. 106. 143. 72. 0.]
[ 0. 0. 0. 82. 75. 135. 147. 152. 19.]
[ 0. 0. 0. 100. 89. 130. 140. 148. 131.]
[ 0. 0. 0. 19. 45. 69. 121. 130. 132.]
[ 0. 0. 0. 0. 0. 0. 94. 113. 0.]
[ 0. 0. 0. 0. 0. 0. 66. 79. 0.]
[ 0. 0. 0. 0. 0. 38. 84. 0. 0.]
[ 0. 0. 0. 0. 0. 26. 59. 0. 0.]
[ 0. 0. 0. 2. 51. 68. 65. 0. 0.]
[ 0. 0. 0. 14. 68. 76. 38. 0. 0.]
[ 0. 0. 0. 65. 80. 101. 46. 0. 0.]
[ 0. 0. 0. 52. 64. 90. 62. 0. 0.]
[ 0. 0. 0. 14. 33. 33. 6. 0. 0.]
[ 0. 0. 0. 0. 0. 0. 0. 0. 0.]]
SmartInsole Shape : (2050,19,9)
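For context, here is a minimal, self-contained sketch of the flatten-then-PCA approach the code above uses, with random stand-in data in place of the real SmartInsole array. Each (19, 9) frame flattens to 19 * 9 = 171 features, and choosing n_components as a variance fraction is one common alternative to hard-coding a value like 89:
import numpy as np
from sklearn.decomposition import PCA

# Random stand-in for the real sensor data: 2050 samples of 19x9 frames.
SmartInsole = np.random.rand(2050, 19, 9)

# Flatten each (19, 9) frame into a 171-dimensional feature vector.
X = SmartInsole.reshape(len(SmartInsole), -1)  # shape (2050, 171)

# Keep enough components to explain, e.g., 95% of the variance
# instead of hard-coding n_components=89.
pca = PCA(n_components=0.95)
X_pca = pca.fit_transform(X)

print("original shape:   ", X.shape)
print("transformed shape:", X_pca.shape)
print("explained variance:", pca.explained_variance_ratio_.sum())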

Related

Mean each row of nonzero values and avoid RuntimeWarning and NaN as some rows are all zero

I already checked Numpy mean of nonzero values and it worked nicely. However, some rows of my matrix are all zero elements. What is a good way to avoid RuntimeWarning: invalid value encountered in true_divide in this case? Also, I don't want those all-zero rows to end up as NaN here.
import numpy as np

eachPSM = np.ones([3,4])
eachPSM[0] = 0
print eachPSM
>> [[ 0.  0.  0.  0.]
    [ 1.  1.  1.  1.]
    [ 1.  1.  1.  1.]]
print np.true_divide(eachPSM.sum(1), (eachPSM!=0).sum(1))
>> RuntimeWarning: invalid value encountered in true_divide
   [ nan   1.   1.]
With a as the input array, you could use masking -
invalid_val = np.nan  # value assigned to the mean of all-zero rows
out = np.full(a.shape[0], invalid_val)
count = (a != 0).sum(1)
valid_mask = count != 0
out[valid_mask] = a[valid_mask].sum(1) / count[valid_mask]
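As a quick sanity check (my addition, not part of the original answer), here is that masking approach applied to the eachPSM example from the question, with invalid_val set to 0 since the question wants all-zero rows to stay 0 rather than become NaN:
import numpy as np

a = np.ones([3, 4])
a[0] = 0                   # first row is all zeros

invalid_val = 0.0          # value assigned to all-zero rows
out = np.full(a.shape[0], invalid_val)
count = (a != 0).sum(1)
valid_mask = count != 0
out[valid_mask] = a[valid_mask].sum(1) / count[valid_mask]

print(out)                 # [0. 1. 1.] -- and no RuntimeWarning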
Another option is to suppress the warning and clean up the NaNs afterwards:
import warnings
...
with warnings.catch_warnings():
    warnings.simplefilter("ignore", category=RuntimeWarning)
    eachPSM[np.isnan(eachPSM)] = 0
Since anything divided by 1 is the same as the numerator, you can replace the zero counts with 1 (the corresponding row sums are 0 anyway, so those rows come out as 0 rather than NaN), i.e.
x = eachPSM.sum(1)
y = (eachPSM != 0).sum(1)
y[y == 0] = 1
np.true_divide(x, y)
# array([ 0.,  1.,  1.])
Masked arrays provide an elegant solution:
eachPSM = np.ones([3,4])
eachPSM[0] = 0
eachPSM[1,1] = 0
# [[ 0.  0.  0.  0.]
#  [ 1.  0.  1.  1.]
#  [ 1.  1.  1.  1.]]

In [39]: np.ma.masked_equal(eachPSM, 0).mean(1)
Out[39]:
masked_array(data = [-- 1.0 1.0],
             mask = [ True False False],
       fill_value = 1e+20)

In [40]: np.ma.masked_equal(eachPSM, 0).mean(1).data
Out[40]: array([ 0.,  1.,  1.])
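One caveat worth adding to the masked-array answer (my addition): .data exposes whatever values happen to sit under the mask, so explicitly filling the masked entries is the safer way to get 0 for all-zero rows:
import numpy as np

eachPSM = np.ones([3, 4])
eachPSM[0] = 0
eachPSM[1, 1] = 0

# .filled(0) replaces masked means (from all-zero rows) with an explicit 0.
print(np.ma.masked_equal(eachPSM, 0).mean(1).filled(0))  # [0. 1. 1.]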

How to access individual values used by ndimage.generic_filter (scipy module)

The following code uses the generic_filter method from ndimage (in the scipy Python module) to calculate means over 3x3 sub-matrices, considering each matrix element as the central element (and excluding it) and avoiding border effects.
import numpy as np
from scipy import ndimage
a = np.reshape(np.arange(25),(5,5))
print a
matrix = np.array(a).astype(np.float)
mask = np.ones((3, 3))
mask[1, 1] = 0
result = ndimage.generic_filter(matrix, np.nanmean, footprint = mask, mode='constant', cval=np.NaN)
print result
Result is printed as follows:
[[ 0 1 2 3 4]
[ 5 6 7 8 9]
[10 11 12 13 14]
[15 16 17 18 19]
[20 21 22 23 24]]
[[ 4. 4. 5. 6. 6.66666667]
[ 5.6 6. 7. 8. 8.4 ]
[ 10.6 11. 12. 13. 13.4 ]
[ 15.6 16. 17. 18. 18.4 ]
[ 17.33333333 18. 19. 20. 20. ]]
and it can be corroborated that it works as expected [the first mean is (5+6+1)/3 = 4, the second mean is (0+6+5+7+2)/5 = 4, and so on].
My question is: how can I access the individual values (sub-matrices) used by ndimage.generic_filter to calculate each np.nanmean?
ndimage.generic_filter calls a function once for each sub-array. You specify the function as the second argument, so if you change np.nanmean to a custom function func (see below) you get access to the sub-arrays from within func.
To accumulate a list of sub-arrays, which could be accessed after the call to ndimage.generic_filter, you could pass a list to func as an extra argument, and append the subarrays to the list inside func. You can then access that list (and its contents) afterwards:
import numpy as np
from scipy import ndimage
def func(x, subarrays):
    print(x)
    subarrays.append(x)
    return np.nanmean(x)

a = np.reshape(np.arange(25), (5,5))
matrix = np.array(a).astype(np.float)
mask = np.ones((3, 3))
mask[1, 1] = 0
subarrays = []
result = ndimage.generic_filter(matrix, func, footprint=mask,
                                mode='constant', cval=np.NaN,
                                extra_arguments=(subarrays,))
print(result)
print(len(subarrays))
prints each subarray:
[ nan nan nan nan 1. nan 5. 6.]
[ nan nan nan 0. 2. 5. 6. 7.]
[ nan nan nan 1. 3. 6. 7. 8.]
[ nan nan nan 2. 4. 7. 8. 9.]
[ nan nan nan 3. nan 8. 9. nan]
[ nan 0. 1. nan 6. nan 10. 11.]
[ 0. 1. 2. 5. 7. 10. 11. 12.]
[ 1. 2. 3. 6. 8. 11. 12. 13.]
[ 2. 3. 4. 7. 9. 12. 13. 14.]
[ 3. 4. nan 8. nan 13. 14. nan]
[ nan 5. 6. nan 11. nan 15. 16.]
[ 5. 6. 7. 10. 12. 15. 16. 17.]
[ 6. 7. 8. 11. 13. 16. 17. 18.]
[ 7. 8. 9. 12. 14. 17. 18. 19.]
[ 8. 9. nan 13. nan 18. 19. nan]
[ nan 10. 11. nan 16. nan 20. 21.]
[ 10. 11. 12. 15. 17. 20. 21. 22.]
[ 11. 12. 13. 16. 18. 21. 22. 23.]
[ 12. 13. 14. 17. 19. 22. 23. 24.]
[ 13. 14. nan 18. nan 23. 24. nan]
[ nan 15. 16. nan 21. nan nan nan]
[ 15. 16. 17. 20. 22. nan nan nan]
[ 16. 17. 18. 21. 23. nan nan nan]
[ 17. 18. 19. 22. 24. nan nan nan]
[ 18. 19. nan 23. nan nan nan nan]
and prints the final result:
[[ 4. 4. 5. 6. 6.66666667]
[ 5.6 6. 7. 8. 8.4 ]
[ 10.6 11. 12. 13. 13.4 ]
[ 15.6 16. 17. 18. 18.4 ]
[ 17.33333333 18. 19. 20. 20. ]]
and the length of subarrays:
25
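Since the 3x3 footprint has 8 active cells (the center is excluded) and the filter visits each of the 25 pixels once, the accumulated list can be stacked into a regular array for inspection. A small follow-up sketch, my addition:
import numpy as np

# `subarrays` was filled by the generic_filter call above.
stacked = np.array(subarrays)
print(stacked.shape)  # (25, 8): one length-8 neighborhood per pixel

# The filtered result can be recomputed from the stored neighborhoods:
print(np.nanmean(stacked, axis=1).reshape(5, 5))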

Convolutional Neural Network input_shape dimension error (Keras, Python)

I have a train dataset of the following shape: (300, 5, 720)
[[[ 6. 11. 389. ..., 0. 0. 0.]
[ 2. 0. 0. ..., 62. 0. 0.]
[ 0. 0. 18. ..., 0. 0. 0.]
[ 38. 201. 47. ..., 0. 108. 0.]
[ 0. 0. 1. ..., 0. 0. 0.]]
[[ 136. 95. 0. ..., 0. 0. 0.]
[ 85. 88. 85. ..., 0. 31. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 13. 19. 0. ..., 0. 0. 0.]]]
I am trying to pass each sample as input to the CNN model; each input is of size (5, 720). I am using the following model in Keras:
cnn = Sequential()
cnn.add(Conv2D(64, (5, 50),
               padding="same",
               activation="relu", data_format="channels_last",
               input_shape=in_shape))
cnn.add(MaxPooling2D(pool_size=(2,2), data_format="channels_last"))
cnn.add(Flatten())
cnn.add(Dropout(0.5))
cnn.add(Dense(number_of_classes, activation="softmax"))
cnn.compile(loss="categorical_crossentropy", optimizer="adam",
            metrics=['accuracy'])
cnn.fit(x_train, y_train,
        batch_size=batch_size,
        epochs=epochs,
        validation_data=(x_test, y_test),
        shuffle=True)
I am using input shape as:
rows,cols=x_train.shape[1:]
in_shape=(rows,cols,1)
but I am getting the following error:
ValueError: Error when checking model input: expected conv2d_1_input to have 4 dimensions, but got array with shape (300, 5, 720)
How can I fix this error?
This is one of the classic errors with convolutions in Keras. It stems from the fact that with channels_last data, the input must have shape (height, width, channels), even when there is only one channel. So basically reshaping:
x_train = x_train.reshape((x_train.shape[0], 5, 720, 1))
should solve your problem.
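For completeness, a minimal sketch of the full fix under the same assumptions; note that the validation data passed to fit needs the same trailing channel axis, and in_shape must include it (variable names follow the question):
# Add the trailing channel axis to both splits.
x_train = x_train.reshape((x_train.shape[0], 5, 720, 1))
x_test = x_test.reshape((x_test.shape[0], 5, 720, 1))

# The model's input shape now includes the single channel.
in_shape = (5, 720, 1)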

Why the FCN32 output is a black image (all pixels have zero values)?

I am trying to get the output from FCN32. I trained FCN32 with the pascalcontext-fcn32-heavy.caffemodel pre-trained model. I could run it for grayscale images with 5 classes. However, during inference, the output is all zero (a black image). This is the inference code:
import numpy as np
from PIL import Image
import sys
import scipy.io as sio
from caffe.proto import caffe_pb2
import caffe
caffe.set_device(0)
caffe.set_mode_gpu()
# load image, subtract mean, and make dims C x H x W for Caffe
img_name='/home/ss/caffe-pascalcontext-fcn32s/dataset/Test/PNG/image-061-023.png' #+
im = Image.open(img_name)
in_ = np.array(im, dtype=np.float32)
in_ = np.expand_dims(in_, axis=0) #+
print in_.shape
#Read mean image
'''####################'''
mean_blob = caffe_pb2.BlobProto()
with open('/home/ss/caffe-pascalcontext-fcn32s/input/FCN32_mean.binaryproto') as f:
    mean_blob.ParseFromString(f.read())
mean_array = np.asarray(mean_blob.data, dtype=np.float32).reshape(
    (mean_blob.channels, mean_blob.height, mean_blob.width))
in_ -= mean_array
net_root = '/home/ss/caffe-pascalcontext-fcn32s'
MODEL_DEF = net_root + '/deploy.prototxt'
PRETRAINED = net_root + '/snapshot/FCN32s_train_iter_40000.caffemodel'
# load net
#net = caffe.Net('deploy.prototxt', 'snapshot/train_iter_640000.caffemodel', caffe.TEST)
net = caffe.Net(MODEL_DEF,PRETRAINED, caffe.TEST)
#net = caffe.Net('deploy.prototxt', 'snapshot_bak1/train_iter_400000.caffemodel', caffe.TEST)
# shape for input (data blob is N x C x H x W), set data
# put img to net
net.blobs['data'].reshape(1, *in_.shape) # 1: batch size, *in_.shape 3 channel ?
net.blobs['data'].data[...] = in_
# run net and take argmax for prediction
output = net.forward()
# print
def print_param(output):
    # the blobs
    print '--------------------------'
    print 'the blobs'
    for k, v in net.blobs.items():
        print k, v.data.shape
    # the parameters
    print '--------------------------'
    print 'the parameters'
    for k, v in net.params.items():
        print k, v[0].data.shape
    # the conv layer weights
    print '--------------------------'
    print 'the conv layer weights'
    print net.params['conv1_1'][0].data
    # the data blob
    print '--------------------------'
    print 'the data blob'
    print net.blobs['data'].data
    # the conv1_1 blob
    print '--------------------------'
    print 'the conv1_1 blob'
    print net.blobs['conv1_1'].data
    # the pool1 blob
    print '--------------------------'
    print 'the pool1 blob'
    print net.blobs['pool1'].data
    weights = net.blobs['fc6'].data[0]
    print 'blobs fc6'
    print np.unique(weights)
    weights = net.blobs['fc7'].data[0]
    print 'blobs fc7'
    print np.unique(weights)
    weights = net.blobs['score_fr_sign'].data[0]
    print 'blobs score_fr_sign'
    print np.unique(weights)
    weights = net.blobs['upscore_sign'].data[0]
    print 'blobs upscore_sign'
    print np.unique(weights)
    weights = net.blobs['score'].data[0]
    print weights.shape  #+
    sio.savemat('scores.mat', {'weights': weights})  #+
    print 'blobs score'
    print np.unique(weights)
print_param(output)
out = net.blobs['score'].data[0].argmax(axis=0)
print out #+
#np.savetxt("vote", out, fmt="%02d")
np.savetxt("vote", out, fmt="%d")
print im.height
print im.width
print out.shape, len(out.shape)
def array2img(out):
    out1 = np.array(out, np.uint8)
    img = Image.fromarray(out1, 'L')
    for x in range(img.size[0]):
        for y in range(img.size[1]):
            if not img.getpixel((x, y)) == 0:
                print 'PLz', str(img.getpixel((x, y)))
    img.show()
def show_pred_img(file_name):
    file = open(file_name, 'r')
    lines = file.read().split('\n')
    #img_name = str(sys.argv[1])
    im = Image.open(img_name)
    im_pixel = im.load()
    img = Image.new('RGB', im.size, "black")
    pixels = img.load()
    w, h = 0, 0
    for l in lines:
        w = 0
        if len(l) > 0:
            word = l.split(' ')
            for x in word:
                if int(x) == 1:
                    pixels[w, h] = im_pixel[w, h]
                w += 1
            h += 1
    print im.size
    #img.show()
    img.save(img_name + '_result.png')
show_pred_img('vote')
This is the log output from inference:
the blobs
data (1, 1, 256, 256)
data_input_0_split_0 (1, 1, 256, 256)
data_input_0_split_1 (1, 1, 256, 256)
conv1_1 (1, 64, 454, 454)
conv1_2 (1, 64, 454, 454)
pool1 (1, 64, 227, 227)
conv2_1 (1, 128, 227, 227)
conv2_2 (1, 128, 227, 227)
pool2 (1, 128, 114, 114)
conv3_1 (1, 256, 114, 114)
conv3_2 (1, 256, 114, 114)
conv3_3 (1, 256, 114, 114)
pool3 (1, 256, 57, 57)
conv4_1 (1, 512, 57, 57)
conv4_2 (1, 512, 57, 57)
conv4_3 (1, 512, 57, 57)
pool4 (1, 512, 29, 29)
conv5_1 (1, 512, 29, 29)
conv5_2 (1, 512, 29, 29)
conv5_3 (1, 512, 29, 29)
pool5 (1, 512, 15, 15)
fc6 (1, 4096, 9, 9)
fc7 (1, 4096, 9, 9)
score_fr_sign (1, 5, 9, 9)
upscore_sign (1, 5, 320, 320)
score (1, 5, 256, 256)
--------------------------
the parameters
conv1_1 (64, 1, 3, 3)
conv1_2 (64, 64, 3, 3)
conv2_1 (128, 64, 3, 3)
conv2_2 (128, 128, 3, 3)
conv3_1 (256, 128, 3, 3)
conv3_2 (256, 256, 3, 3)
conv3_3 (256, 256, 3, 3)
conv4_1 (512, 256, 3, 3)
conv4_2 (512, 512, 3, 3)
conv4_3 (512, 512, 3, 3)
conv5_1 (512, 512, 3, 3)
conv5_2 (512, 512, 3, 3)
conv5_3 (512, 512, 3, 3)
fc6 (4096, 512, 7, 7)
fc7 (4096, 4096, 1, 1)
score_fr_sign (5, 4096, 1, 1)
upscore_sign (5, 1, 64, 64)
--------------------------
the conv layer weights
[[[[ 0. 0. 0.]
[ 0. 0. 0.]
[ 0. 0. 0.]]]
...
.
.
.
[[[ 0. 0. 0.]
[ 0. 0. 0.]
[ 0. 0. 0.]]]]
--------------------------
the data blob
[[[[ 29.32040787 20.31391525 20.30148506 ..., 10.41113186 11.42486095
6.42949915]
[ 33.32374954 21.31280136 22.30037117 ..., 9.40779209 10.42189217
8.43079758]
[ 36.32300568 25.30816269 25.29183578 ..., 10.40148449 11.41818142
10.42838573]
...,
[ 34.64990616 31.65658569 30.65714264 ..., 4. 2.99981451
0.99962896]
[ 39.65788651 33.65769958 29.65974045 ..., 5.99981451 4.99944353
0.99888682]
[ 41.6641922 34.66493607 30.66567802 ..., 5.99962902 2.99907231
3.99833035]]]]
--------------------------
the conv1_1 blob
[[[[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
...,
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]]
[[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
...,
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]]
[[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
...,
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]]
...,
[[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
...,
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]]
[[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
...,
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]]
[[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
...,
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]]]]
--------------------------
the pool1 blob
[[[[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
...,
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]]
[[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
...,
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]]
[[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
...,
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]]
...,
[[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
...,
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]]
[[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
...,
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]]
[[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
...,
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]]]]
blobs fc6
[ 0.]
blobs fc7
[ 0.]
blobs score_fr_sign
[-1.61920226 -1.34294271 0.07809996 0.60521388 2.2788291 ]
blobs upscore_sign
[-1.61920238 -1.61920226 -1.61920214 ..., 2.27882886 2.2788291
2.27882934]
(5, 256, 256)
blobs score
[-1.61920238 -1.61920226 -1.61920214 -1.59390223 -1.59390211 -1.5689975
-1.54330218 -1.54330206 -1.51918805 -1.49270213 -1.49270201 -1.4709599
-1.46937859 -1.44210207 -1.44210196 -1.42273164 -1.41956913 -1.39150202
-1.3915019 -1.37608469 -1.37450349 -1.36975968 -1.34294283 -1.34294271
-1.3429426 -1.34090197 -1.34090185 -1.32943773 -1.32627523 -1.32195926
-1.31995022 -1.30130363 -1.2903018 -1.28437209 -1.2827909 -1.27999234
-1.27999222 -1.27804708 -1.27014089 -1.25999236 -1.23970175 -1.23930645
-1.23802543 -1.23802531 -1.23614395 -1.22981894 -1.22033143 -1.21999264
-1.21868122 -1.19605839 -1.19605827 -1.195822 -1.19424069 -1.18949699
-1.1891017 -1.18910158 -1.18159068 -1.17999291 -1.17736995 -1.17052197
-1.15409136 -1.15233755 -1.14917505 -1.14285004 -1.14130461 -1.13999307
-1.13850164 -1.13850152 -1.13605869 -1.13336253 -1.12071252 -1.11212444
-1.11043441 -1.1088531 -1.10410941 -1.10261631 -1.09999335 -1.09620309
-1.09474754 -1.08790159 -1.08790147 -1.08513427 -1.07090306 -1.07015753
-1.07015741 -1.06853116 -1.06536865 -1.06523943 -1.06392801 -1.05999362
-1.05904365 -1.05343628 -1.04955614 -1.03730154 -1.03730142 -1.03690612
-1.02820921 -1.02819049 -1.02786267 -1.02662802 -1.02523971 -1.0218842
-1.02109361 -1.0199939 -1.013978 -1.01212502 -1.00290918 -0.99179727
-0.99048585 -0.98867792 -0.98788732 -0.98670143 -0.98670137 -0.9865514
-0.98622358 -0.98622352 -0.98472482 -0.97999406 -0.97839981 -0.97128415
-0.97081381 -0.9689123 -0.95626229 -0.95573193 -0.95310903 -0.94914663
-0.94786316 -0.94756538 -0.9442566 -0.94425654 -0.94282162 -0.94044977
-0.93999434 -0.93491536 -0.92950261 -0.9238466 -0.92097807 -0.91966659
-0.9157322 -0.91040593 -0.90961534 -0.90917486 -0.90724343 -0.90228963
-0.90091842 -0.89999455 -0.89143091 -0.88819134 -0.88622415 -0.88360125
-0.8787809 -0.87835538 -0.87324655 -0.8716653 -0.87048656 -0.86692154
-0.86032271 -0.86032265 -0.85999483 -0.85901529 -0.85278171 -0.85147029
-0.84794647 -0.84753585 -0.84688014 -0.8409785 -0.83608711 -0.8329246
-0.83179826 -0.8265996 -0.81999505 -0.81933933 -0.81835574 -0.81835568
-0.81711209 -0.81671637 -0.81147051 -0.80556893 -0.80360168 -0.80050892
-0.79892766 -0.79418391 -0.79310995 -0.78720838 -0.78627765 -0.7858969
-0.78196251 -0.77999532 -0.77540517 -0.76622486 -0.76493073 -0.76176822
-0.75544322 -0.75507742 -0.75442165 -0.75245446 -0.7472086 -0.73933983
-0.73093385 -0.72935259 -0.72884804 -0.72460884 -0.72425795 -0.72294647
-0.71901208 -0.71245474 -0.70327443 -0.69693691 -0.6937744 -0.69343841
-0.69081551 -0.68556964 -0.67770082 -0.66452122 -0.66393042 -0.66293997
-0.66261894 -0.65868455 -0.65212721 -0.63442242 -0.63210559 -0.63179946
-0.6265536 -0.60622585 -0.60491437 -0.60127115 -0.60097998 -0.57802927
-0.57540637 -0.55114424 -0.54983276 -0.52425915 -0.49868551 0.02900147
0.03048873 0.03197598 0.03205225 0.03346324 0.03361578 0.03495049
0.0351793 0.03525557 0.03643775 0.03674283 0.03689536 0.037925
0.03830635 0.03853516 0.03861143 0.03941226 0.03986987 0.04017495
0.04032749 0.04089952 0.0414334 0.04181475 0.04204356 0.04211983
0.04238677 0.04299692 0.04345454 0.04375962 0.04387403 0.04391216
0.04456045 0.04509434 0.04536128 0.04547568 0.04570449 0.04578076
0.04612397 0.04673413 0.04684854 0.04719175 0.04749683 0.04759216
0.04764936 0.0476875 0.04837392 0.04890781 0.04925102 0.04928916
0.04951797 0.04959423 0.05001372 0.05003278 0.05003279 0.05062388
0.05108149 0.05138657 0.05153911 0.05165351 0.05233994 0.05247341
0.05247341 0.05287382 0.05325517 0.05348398 0.05356025 0.054056
0.05466616 0.05491403 0.05491403 0.05512378 0.05542885 0.05558139
0.05645849 0.05699238 0.05735466 0.05735466 0.05737372 0.05760253
0.0576788 0.05886098 0.05931859 0.05962367 0.05977621 0.05979528
0.05979528 0.06126347 0.06164481 0.06187363 0.06194989 0.0622359
0.06223591 0.06366596 0.06397104 0.06412357 0.06467653 0.06606845
0.06629726 0.06637353 0.06711715 0.06847093 0.06862348 0.06955777
0.06955778 0.07087342 0.0709497 0.0719984 0.0719984 0.07327592
0.07443902 0.07443903 0.0756784 0.07687964 0.07687965 0.07809995
0.07809996 0.07809997 0.22473885 0.23626392 0.24778898 0.24838002
0.25931406 0.26049611 0.27083912 0.27261221 0.27320322 0.28236419
0.28472832 0.28591037 0.29388925 0.29684439 0.29861748 0.29920852
0.30541432 0.3089605 0.31132463 0.31250668 0.31693938 0.3210766
0.32403174 0.32580483 0.32639587 0.32846448 0.33319271 0.33673888
0.33910298 0.33998954 0.34028506 0.34530881 0.349446 0.35151461
0.35240114 0.35417423 0.35476527 0.35742489 0.36215314 0.36303967
0.36569929 0.36806342 0.36880219 0.36880222 0.36924547 0.36954099
0.37486026 0.37899747 0.38165709 0.38195261 0.3837257 0.38431671
0.38756737 0.38771513 0.38771516 0.39229563 0.39584181 0.39820591
0.39938796 0.40027452 0.40559378 0.40662807 0.40973097 0.41268614
0.4144592 0.41505024 0.41889194 0.42362016 0.42554098 0.42554101
0.42716634 0.42953047 0.43071252 0.43750936 0.44164655 0.44445392
0.44445395 0.44460171 0.44637477 0.44696581 0.45612678 0.45967296
0.46203706 0.46321911 0.46336687 0.4633669 0.4747442 0.47769934
0.47947243 0.48006344 0.48227981 0.48227984 0.49336162 0.49572572
0.49690777 0.50119275 0.51197904 0.5137521 0.51434314 0.52010566
0.52010572 0.53059644 0.53177851 0.53901857 0.53901863 0.54921389
0.54980487 0.55793154 0.56783128 0.57684445 0.57684451 0.58644873
0.59575737 0.59575742 0.60521382 0.60521388 0.60521394 0.84621561
0.88961124 0.93300694 0.93523234 0.97640258 0.98085344 1.01979828
1.02647448 1.02869999 1.06319392 1.07209563 1.07654643 1.10658967
1.11771667 1.12439299 1.12661839 1.14998531 1.16333783 1.17223942
1.17669034 1.19338095 1.20895886 1.22008598 1.22676229 1.22898769
1.23677659 1.25458002 1.26793253 1.27683413 1.28017235 1.28128505
1.30020106 1.31577897 1.32356799 1.32690609 1.3335824 1.3358078
1.34582222 1.36362553 1.36696362 1.37697804 1.38587976 1.38866138
1.3886615 1.39033055 1.39144325 1.41147208 1.42704999 1.43706429
1.43817711 1.44485331 1.4470787 1.45931852 1.45987487 1.45987499
1.47712183 1.49047434 1.49937606 1.50382698 1.50716507 1.52719378
1.53108823 1.53108835 1.5427717 1.55389881 1.56057513 1.56280053
1.57726574 1.59506905 1.6023016 1.60230172 1.60842156 1.61732328
1.62177408 1.6473664 1.66294444 1.67351508 1.6735152 1.67407143
1.68074775 1.68297315 1.71746719 1.7308197 1.7397213 1.74417222
1.74472845 1.74472857 1.78756785 1.79869497 1.80537117 1.80759656
1.81594181 1.81594193 1.81594205 1.85766852 1.86657023 1.87102103
1.88715529 1.88715541 1.9277693 1.9344455 1.9366709 1.95836878
1.99786997 2.00232077 2.02958202 2.02958226 2.06797075 2.07019615
2.10079551 2.10079575 2.1380713 2.17200899 2.20817208 2.24322224
2.24322248 2.27882886 2.2788291 2.27882934]
256
256
(256, 256) 2
(256, 256)
I have a few major questions:
1. Why is the output black?
2. How can I know when to stop running the algorithm (i.e., at which iteration number)? I really do not know the optimum iteration count and loss value at which I can stop fine-tuning. I stopped training at 40,000 iterations, but I have no idea whether that is right.
3. Does the segmentation result need to be a grayscale image like the input, or does creating an RGB result image make no difference to the output?
I really do not know how much of this I am doing the right way. Quite CONFUSED :(
Does anyone have any suggestions? I really appreciate your help.
Make sure that your labels' data type is uint8! I had the same problem!
Also make sure that you have a weight filler like the one below in your prototxt before training!
layer {
  name: "myupscore2"
  type: "Deconvolution"
  bottom: "myscore_fr"
  top: "myupscore2"
  param {
    lr_mult: 5
  }
  convolution_param {
    group: 2
    num_output: 2
    weight_filler: { type: "bilinear" }
    bias_term: false
    kernel_size: 4
    stride: 16
  }
}
good luck!
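To act on the first point, a quick way to verify that the ground-truth label images really are uint8 (a small sketch, my addition; the path is a placeholder):
import numpy as np
from PIL import Image

# Placeholder path -- point this at one of your ground-truth label images.
label = np.array(Image.open('/path/to/label.png'))

print(label.dtype)       # should be uint8
print(np.unique(label))  # should be the class indices, e.g. 0..4 for 5 classes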
Yes, it usually depends on your image size! Have you checked your data types? Both your images and ground truths should be uint8!
Also, have you added the "group" line to your Deconv layer?
Best

How to overplot concatenated data from different Lon/Lat frames on a single map (python2.7)?

This is my first question here. I'm not very familiar with Python and I'll do my best to be understood. (Native French ;))
I'm currently working on several NetCDF files containing altimetric data.
For each NetCDF file, I'm plotting the satellite along-track data with Basemap, according to lon/lat arrays that differ for each file (i.e., for each track).
Track 70
Track 18
I'm trying to plot all these tracks on a single map, so I've concatenated all the NetCDF files and get these dimensions for all variables: (Ntracks, Npoints, Ncycles).
I'd like to keep the lon/lat arrays of the first track to plot all tracks onto the map. Earlier in the code, I loop over the number of cycles to get lon/lat arrays with only two dimensions (Ntracks, Npoints).
# Define map
m = Basemap(projection='mill', lat_ts=10,
            llcrnrlon=lon[0,:].min(),
            urcrnrlon=lon[0,:].max(),
            llcrnrlat=lat[0,:].min(),
            urcrnrlat=lat[0,:].max(),
            resolution='l')
# Area
lonmin, lonmax = m.lonmin, m.lonmax
latmin, latmax = m.latmin, m.latmax
print lonmin, lonmax
print latmin, latmax
# Dimensions
Ntracks, Npoints, Ncycles = data.shape
for track in range(Ntracks):
    print 'file', track, 'track', Tracks[track]
    x, y = np.meshgrid(lon[track,:], lat[track,:])
    y = y.T
    for c in range(Ncycles):
        Data, tmp = m(*np.meshgrid(data[track,:,c], lon[0,:]))
        # plot Data on the map
        # m.contourf(x, y, Data)
    # plot Track position on the map
    m.plot(x, y, marker='o', markersize=3, color='#1e90ff', markeredgecolor='#1e90ff')
parallels = np.arange(45., latmax, 5., dtype='int')
meridians = np.arange(lonmin, lonmax, 5., dtype='int')
[...]
Here is my problem:
1) Do I have to use an interpolation function from basemap ?
Here I'm just using the meshgrid function to plot the data, but apparently it does not work, and I'm using it with an array I do not need, inside a loop... I'd like to loop over the number of tracks (Ntracks) and the number of cycles (Ncycles) to keep all the information contained in the data array, but nothing is displayed on the map.
2)
I've got a map with an incorrect lon/lat frame, in the sense that I force the parallels in the code to start at 45°N in latitude, but the map shows latitudes starting at 0° (see map below).
[EDIT 1]
Here is the zone for which all tracks should be plotted onto:
lonmin= -17.9882926941
lonmax= 4.60782623291
latmin= 45.
latmax= 60.7448616028
file 0 track 011
Lon(Ntracks, Npoints) <=> Lon(39, 600)
lon[0,:]=
[-17.98829269 -17.96035004 -17.9158268 -17.87090111 -17.82613754
-17.78137589 -17.73649788 -17.69123077 -17.64610481 -17.60084343
-17.55563354 -17.51044273 -17.4653244 -17.41967773 -17.37390709
-17.32869911 -17.28360939 -17.23790359 -17.19250679 -17.14686203
-17.10056877 -17.05456734 -17.00789261 -16.96157265 -16.91489029
-16.86802864 -16.82170677 -16.77542496 -16.72836304 -16.68114853
-16.63401222 -16.58691406 -16.54015732 -16.49285126 -16.44582558
-16.39826965 -16.35064316 -16.3035717 -16.25635719 -16.20870781
-16.16150856 -16.1133728 -16.06542015 -16.01705933 -15.96865177
-15.92026043 -15.87177563 -15.82310295 -15.77482033 -15.72598553
-15.67701912 -15.62811565 -15.57882786 -15.53001785 -15.48101044
-15.43181229 -15.38246822 -15.33287716 -15.28355789 -15.23418236
-15.18494129 -15.13527107 -15.0858717 -15.03561592 -14.98551655
-14.93518257 -14.88439083 -14.83401299 -14.78322983 -14.73276806
-14.68206596 -14.6312542 -14.58057022 -14.52875805 -14.47770214
-14.4268198 -14.37572861 -14.32432938 -14.27313328 -14.22089672
-14.16933823 -14.11782074 -14.06633949 -14.01482773 -13.96279335
-13.91062164 -13.8580122 -13.80557537 -13.75274754 -13.70005322
-13.64694786 -13.59441757 -13.54134083 -13.48770905 -13.43468666
-13.38124275 -13.32789707 -13.27467823 -13.22168827 -13.16738701
-13.11308289 -13.05891991 -13.00494385 -12.95111179 -12.89686108
-12.84276581 -12.78821468 -12.7339344 -12.67910194 -12.62386894
-12.56842422 -12.51327515 -12.45816803 -12.40354824 -12.34745026
-12.29209328 -12.23564339 -12.18034649 -12.12487125 -12.06906986
-12.01329899 -11.95672607 -11.8996048 -11.843297 -11.78651047
-11.73054409 -11.67354107 -11.61672497 -11.56000042 -11.50267315
-11.44525433 -11.38746643 -11.32937717 -11.27204227 -11.21484566
-11.15631771 -11.0982399 -11.0397234 -10.98154163 -10.92320538
-10.86515236 -10.80695057 -10.7475996 -10.68784046 -10.62869835
-10.56943226 -10.51098156 -10.45191383 -10.392416 -10.33296585
-10.27302551 -10.21626282 -10.16596699 -7.04195881 -6.97156954
-6.90749311 -6.84037924 -6.77310753 -6.70632839 -6.64275551
-6.5768404 -2.65448499 -2.58824921 -2.51378822 -2.43802571
-2.36123848 -2.2846365 -2.20791101 -2.13056469 -2.05158925
-1.97271824 -1.89305627 -1.81484437 -1.73559391 -1.65574718
-1.57778573 -1.49864793 -1.41998756 -1.33994663 -1.25909233
-1.17918336 -1.09941947 -1.01876879 -0.93915391 -0.85877413
-0.77846903 -0.6969139 -0.61503422 -0.53209156 -0.45091194
-0.36821485 -0.28601661 -0.20424061 -0.12195258 -0.03996643
0.0433232 0.12745281 0.21068352 0.29375839 0.37731969
0.46031842 0.54389882 0.62750047 0.71277285 0.79772025
0.88338596 0.9679296 1.05336332 1.13906097 1.22359383
1.30949378 1.39461899 1.4820528 1.56847286 1.65480137
1.7418288 1.82818389 1.9145112 2.00085664 2.08809233
2.17680693 2.26604605 2.35390067 2.44224119 2.53127289
2.6189208 2.70771909 2.79608703 2.88613439 2.97592211
3.06579161 3.15590477 3.24528027 3.33474064 3.42422748
3.51482105 3.60685039 3.69885063 3.78991985 3.88147163
3.97312284 4.06450033 4.15607595 4.24770594 4.34134626
4.43407822 4.52639294 4.60782623 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. ]
Lat(Ntracks, Npoints) <=> Lat(39, 600)
lat[0,:]=
[ 46.67876053 46.70789719 46.75389481 46.80019379 46.84618378
46.89203644 46.93796539 46.98420334 47.03012466 47.07610321
47.12192535 47.16764069 47.21319199 47.25907135 47.30500793
47.35031509 47.39530945 47.44083023 47.48605728 47.531353
47.57711792 47.62258148 47.66851807 47.71405792 47.75978851
47.80570221 47.85082626 47.89584732 47.94157028 47.98722076
48.03278732 48.07815552 48.12312698 48.16848373 48.21348572
48.25885773 48.30422974 48.34898376 48.39368439 48.43871307
48.483181 48.52843857 48.57341003 48.61867523 48.66384888
48.70891953 48.75396347 48.79903793 48.84369659 48.88874817
48.93371582 48.97858047 49.02370453 49.06818771 49.11281586
49.1574707 49.20219421 49.2469635 49.29138947 49.33580399
49.37992096 49.42437744 49.46848297 49.51322556 49.55770111
49.60223007 49.64704132 49.69140244 49.73606491 49.78027344
49.82453156 49.86888885 49.91291809 49.95780563 50.00203705
50.04593658 50.08992004 50.13394165 50.17778397 50.22238159
50.26634979 50.31006622 50.35368347 50.39723587 50.44110489
50.4848938 50.52896881 50.57278061 50.61688232 50.66064072
50.70475006 50.74821472 50.79191589 50.8360405 50.87957764
50.92328644 50.96680832 51.0102005 51.05315399 51.09710693
51.14103317 51.18455505 51.2278862 51.27103043 51.31438446
51.35745621 51.40075684 51.44378281 51.48701477 51.53056335
51.57411957 51.61739731 51.6603775 51.70298004 51.74655151
51.78942108 51.8330307 51.87560654 51.91830826 51.96100998
52.003685 52.04676819 52.09022903 52.13286972 52.17581177
52.21800613 52.26076508 52.30334854 52.34573364 52.38841248
52.43110657 52.47396469 52.51687622 52.55908966 52.6010704
52.64397049 52.68639755 52.72896576 52.77127838 52.81345749
52.85536957 52.89732742 52.93992615 52.98266983 53.02484894
53.06702805 53.10857391 53.15029907 53.19232941 53.23413849
53.27612686 53.31597137 53.35136032 55.36987305 55.41225815
55.45026016 55.49019241 55.52997208 55.56932831 55.60702133
55.64598083 57.74236298 57.77462387 57.81058502 57.84719086
57.88407516 57.92087173 57.95750427 57.99436951 58.03178406
58.06901169 58.10655212 58.14322662 58.18028259 58.217453
58.25369263 58.29024124 58.32647324 58.36317825 58.40009689
58.43651962 58.47273254 58.50924683 58.5450325 58.58111572
58.61699295 58.6533699 58.68965149 58.72631073 58.76209259
58.79837036 58.8342247 58.86991501 58.9055748 58.94097137
58.97681808 59.01285934 59.04847336 59.08379364 59.11923218
59.15425873 59.18941879 59.22439575 59.26002884 59.29529953
59.33084106 59.36565399 59.40074921 59.4357872 59.47022247
59.50508499 59.53953552 59.57476807 59.60944366 59.64393997
59.67856216 59.71277618 59.74685669 59.78082275 59.81495667
59.84958267 59.88422775 59.91826248 59.95224762 59.98641968
60.01992798 60.05369949 60.08721161 60.12124252 60.15496445
60.18862534 60.22219467 60.25540161 60.28845596 60.32141876
60.35464859 60.38824463 60.42168808 60.45466614 60.48766327
60.52061844 60.55331039 60.58587646 60.61832047 60.65141296
60.68397141 60.71630478 60.7448616 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. ]
Data(Ntracks, Npoints, Ncycles) <=> Data(39, 600, 841)
data[0,:,0]=
[-- 60.757530212402344 60.614707946777344 60.51324462890625
60.4495735168457 60.375301361083984 60.32353973388672 60.32004165649414
60.300262451171875 60.279476165771484 60.23453903198242 60.14611053466797
60.10492706298828 60.11872100830078 60.010215759277344 60.01679992675781
60.02739334106445 60.066802978515625 59.99708938598633 59.957611083984375
59.90860366821289 59.859092712402344 59.797611236572266 59.82122802734375
59.76555252075195 59.80582809448242 59.78395462036133 59.772483825683594
59.69769287109375 59.66111373901367 59.69377136230469 59.59833526611328
59.56122970581055 59.4196662902832 59.33363342285156 -- 59.148193359375
59.139976501464844 59.07219314575195 59.00056838989258 58.96044158935547
58.793479919433594 58.71685028076172 58.660919189453125 58.6155891418457
58.58014678955078 58.54222106933594 58.41620635986328 58.34292221069336
58.252777099609375 58.19034957885742 58.19147872924805 58.04945755004883
57.993141174316406 57.881595611572266 57.75284194946289 57.673988342285156
57.546085357666016 57.44636917114258 57.33456802368164 57.22673034667969
57.17810821533203 57.060855865478516 56.996437072753906 56.93328094482422
56.91613006591797 56.93352508544922 56.90220260620117 56.9296760559082
56.91029739379883 56.85015106201172 56.861961364746094 56.836082458496094
56.787139892578125 56.74370574951172 56.63985824584961 56.602230072021484
56.52391052246094 56.51660919189453 56.583072662353516 56.68135070800781
56.87864685058594 57.20947265625 57.535404205322266 57.839866638183594
58.04792785644531 58.211509704589844 58.28927230834961 58.30498123168945
58.34526824951172 58.36879348754883 58.385929107666016 58.426536560058594
58.49345779418945 58.57032012939453 58.644989013671875 58.690040588378906
58.773582458496094 58.865699768066406 58.99164581298828 59.110862731933594
59.230987548828125 59.282142639160156 59.38497543334961 59.463314056396484
59.49340057373047 59.626068115234375 59.69114303588867 59.858642578125
59.95542526245117 59.99937057495117 60.070919036865234 60.03804016113281
59.97809600830078 59.87148666381836 59.790035247802734 --
59.65596389770508 59.62194061279297 59.55290603637695 59.620635986328125
59.64723587036133 59.79397201538086 59.94666290283203 60.14747619628906
60.29004669189453 60.43260955810547 60.56108856201172 60.61063003540039
60.656070709228516 60.633262634277344 60.578800201416016 60.46556091308594
60.39137268066406 60.31590270996094 60.23550033569336 60.143409729003906
60.12107467651367 60.056251525878906 59.99232864379883 59.92224884033203
59.85647201538086 59.7867317199707 59.73897933959961 59.61933517456055
59.55458068847656 59.484127044677734 59.402469635009766 59.34931945800781
59.20806121826172 59.122562408447266 58.91756057739258 -- -- -- -- -- --
56.795066833496094 56.77033996582031 56.6777229309082 -- -- -- -- -- --
49.40351486206055 49.361656188964844 49.281063079833984 49.14778137207031
49.10255813598633 49.08881378173828 49.0178337097168 49.016841888427734
49.01139450073242 49.05029296875 49.02135467529297 49.01438522338867
48.99295425415039 48.902462005615234 48.83680725097656 48.740150451660156
48.63616943359375 48.580101013183594 48.51719284057617 48.39471435546875
48.1739616394043 47.9736213684082 47.780086517333984 47.58523941040039
47.404056549072266 47.18346405029297 47.02470397949219 46.925045013427734
46.82598114013672 46.70199203491211 46.732479095458984 46.74734878540039
46.783023834228516 46.78947448730469 46.713375091552734 46.77571105957031
46.73973846435547 46.72625732421875 46.674217224121094 46.656429290771484
46.73429489135742 46.767398834228516 46.6992301940918 46.69961929321289
46.61093521118164 46.63847351074219 46.60043716430664 46.61210632324219
46.60145568847656 46.54708480834961 46.542999267578125 46.46186828613281
46.31348419189453 46.2059326171875 46.15165710449219 46.089412689208984
46.001495361328125 45.91263961791992 45.77867126464844 45.679054260253906
45.52286148071289 45.441471099853516 45.2866325378418 45.193050384521484
45.02134704589844 44.91787338256836 44.865013122558594 44.81865692138672
44.69915771484375 44.725948333740234 44.68387222290039 44.63495635986328
44.725196838378906 44.79462432861328 44.785888671875 44.75141525268555
44.883846282958984 44.94137191772461 45.078712463378906 45.230037689208984
45.407440185546875 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0]
I forgot to pass the coordinates through the map object "m" when using meshgrid...
for track in range(Ntracks):
    x, y = m(*np.meshgrid(lon[track,:], lat[track,:]))
    y = y.T
    m.plot(x, y, marker='.')
This code allows me to plot the lon/lat coordinates on the map, corresponding to the track locations.
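One extra detail worth handling (my suggestion, not part of the original post): the lon/lat arrays are zero-padded at the end of each track, as the dumps above show, so those (0, 0) points get projected and plotted too. Since each track is a one-dimensional sequence of points, the pairs can also be projected directly without meshgrid; a sketch assuming the padding values are exactly 0:
import numpy as np

for track in range(Ntracks):
    # Keep only the real along-track points; trailing zeros are padding.
    valid = (lon[track, :] != 0) & (lat[track, :] != 0)
    x, y = m(lon[track, valid], lat[track, valid])
    m.plot(x, y, marker='.', linestyle='none')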