osmdroid: Tile server which allows bulk download

I am working on an application for which I need to be able to use map data while offline. I have looked up how to use the CacheManager and how to download tiles by referencing SampleCacheDownloader.java.
I am using the following code to download the tiles:
CacheManager cMgr = new CacheManager(map);
BoundingBox bBox = cMgr.extendedBoundsFromGeoPoints(currentGeoPoints, 16);
cMgr.downloadAreaAsync(this, bBox, 16, 21, new CacheManager.CacheManagerCallback() {
    @Override
    public void onTaskComplete() {
    }

    @Override
    public void updateProgress(int progress, int currentZoomLevel, int zoomMin, int zoomMax) {
    }

    @Override
    public void downloadStarted() {
    }

    @Override
    public void setPossibleTilesInArea(int total) {
    }

    @Override
    public void onTaskFailed(int errors) {
    }
});
The code works in the sense that it starts the download, but all of the tile servers included in the TileSourceFactory class either throw the TileSourcePolicyException "This online tile source doesn't support bulk download" (e.g. MAPNIK and WIKIMEDIA) or fail to fetch the tiles because they don't exist at the requested zoom levels (e.g. USGS_SAT and USGS_TOPO).
So my question is: are there other publicly available tile servers, or do I have to set up my own to be able to download tiles in bulk? Or is my approach to caching the tiles inside a bounding box at fixed zoom levels completely wrong?
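For illustration, the kind of setup I have in mind if I end up hosting my own tile server would look roughly like this (a sketch only: https://tiles.example.com/ is a placeholder, and I'm assuming that an XYTileSource created without an explicit TileSourcePolicy is not flagged as no-bulk-download the way MAPNIK and WIKIMEDIA are):
// The URL below is a placeholder for a self-hosted server (or one whose usage
// policy explicitly allows bulk download).
OnlineTileSourceBase myTiles = new XYTileSource("MyTiles", 0, 21, 256, ".png",
        new String[]{"https://tiles.example.com/"});
map.setTileSource(myTiles);

// The CacheManager downloads from whatever tile source the MapView is currently using.
CacheManager cMgr = new CacheManager(map);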

Related

Osmdroid 5.6, offline MBTiles

I'm trying to use MBTiles offline with osmdroid. I took the code sample from here: https://github.com/osmdroid/osmdroid/blob/master/OpenStreetMapViewer/src/main/java/org/osmdroid/samplefragments/tileproviders/SampleOfflineOnly.java
But an empty map is always shown. Is there a problem with my code?
My code is:
public class OSMDroid extends AppCompatActivity {
    private MapView mapView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_osmdroid);
        mapView = (MapView) findViewById(R.id.map);
        String name = "map.mbtiles";
        File f = new File(Environment.getExternalStorageDirectory() + "/osmdroid", name);
        if (f.exists()) {
            try {
                // ok, found a file we support and have a driver for the format; for this demo, we'll just use the first one
                // create the offline tile provider, it will only do offline file archives
                // again using the first file
                OfflineTileProvider tileProvider = new OfflineTileProvider(new SimpleRegisterReceiver(this),
                        new File[]{f});
                // tell osmdroid to use that provider instead of the default chain, which is (assets, cache, files/archives, online)
                mapView.setTileProvider(tileProvider);
                // this bit enables us to find out what tile sources are available. note that this action may take
                // some time and should be run asynchronously. we've put it inline for simplicity
                String source = "";
                IArchiveFile[] archives = tileProvider.getArchives();
                if (archives.length > 0) {
                    // cheating a bit here: get the first archive file and ask for the tile source names it contains
                    Set<String> tileSources = archives[0].getTileSources();
                    // presumably, this would be a great place to tell your users which tile sources are available
                    if (!tileSources.isEmpty()) {
                        // ok good, we found at least one tile source; create a basic file-based tile source using that name
                        // and set it. If we don't set it, osmdroid will attempt to use the default source, which is "MAPNIK",
                        // which probably won't match your offline tile source, unless it's MAPNIK
                        source = tileSources.iterator().next();
                        mapView.setTileSource(FileBasedTileSource.getSource(source));
                    } else {
                        mapView.setTileSource(TileSourceFactory.DEFAULT_TILE_SOURCE);
                    }
                } else {
                    mapView.setTileSource(TileSourceFactory.DEFAULT_TILE_SOURCE);
                }
                mapView.setUseDataConnection(false);
                mapView.setBuiltInZoomControls(true);
                IMapController mapController = mapView.getController();
                mapController.setZoom(10);
                GeoPoint startPt = new GeoPoint(61.5797, 51.5997);
                mapController.setCenter(startPt);
                mapView.invalidate();
                return;
            } catch (Exception ex) {
                ex.printStackTrace();
            }
        }
    }
}
MBTilesFileArchive.getTileSources() always returns an empty set; here is its implementation:
public Set<String> getTileSources() {
    // the MBTiles spec doesn't store source information in it, so we can't return anything
    return Collections.EMPTY_SET;
}
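Since getTileSources() is empty, my code falls through to TileSourceFactory.DEFAULT_TILE_SOURCE. Would setting a source explicitly, with a zoom range that matches what's in the file, help? A sketch of what I mean (the name, zoom range and tile size here are guesses; they would have to match whatever map.mbtiles actually contains):
mapView.setTileProvider(tileProvider);
// Assumption: the archive holds 256 px .png tiles for zoom levels 10-16.
mapView.setTileSource(new XYTileSource("mbtiles", 10, 16, 256, ".png", new String[]{}));
mapView.getController().setZoom(12); // stay within the zoom range the file covers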
To make an offline map you have to add the tiles first. You can use the Maperitive app to generate your map tiles (a zip is easier to manage than an SQL database). Name the zip MapquestOSM. Once that is done, create a folder named "osmdroid" in the phone's memory (directly in internal storage or on the SD card) and put your map tiles in it.
The parameters of the XYTileSource change depending on the map tiles you have created. This code handles everything about the map tiles itself. I hope it helps you:
mapView.setUseDataConnection(false);
mapView.setTileSource(new XYTileSource("MapquestOSM", 2, 15, 256, ".png", new String[]{}));

Google Maps two finger pinch/stretch callbacks

In my map, I am trying to capture zoom in/out using ScaleGestureDetector, but I never receive any callbacks to onScale, onScaleBegin, or onScaleEnd.
In my Fragment's onCreateView, I initialize:
scaleGestureDetector = new ScaleGestureDetector(getContext(), new simpleOnScaleGestureListener());
And I implement the callbacks like so:
public class simpleOnScaleGestureListener extends SimpleOnScaleGestureListener {

    @Override
    public boolean onScale(ScaleGestureDetector detector) {
        startScale = detector.getScaleFactor();
        Log.d(TAG, "::onScale:: " + detector.getScaleFactor());
        return true;
    }

    @Override
    public boolean onScaleBegin(ScaleGestureDetector detector) {
        Log.d(TAG, "::onScaleBegin:: " + detector.getScaleFactor());
        return true;
    }

    @Override
    public void onScaleEnd(ScaleGestureDetector detector) {
        Log.d(TAG, "::onScaleEnd:: " + detector.getScaleFactor());
        endScale = detector.getScaleFactor();
    }
}
Also, is it fair to assume that the callbacks will be called continuously whenever the user zooms in/out?
I was able to get past the issue of not receiving callbacks. Essentially, two things:
1. In your activity/fragment, implement GoogleMap.OnCameraIdleListener.
2. In onMapReady(), call mMap.setOnCameraIdleListener(this);
Then override onCameraIdle():
@Override
public void onCameraIdle() {
    Log.i(TAG, "::onCameraIdle::" + mMap.getCameraPosition().toString());
}
This gives you lat/long, zoom, tilt and bearing, i.e. the CameraPosition.
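Inside onCameraIdle() you can then pull the individual values out of the CameraPosition; a minimal sketch using only its public fields:
CameraPosition pos = mMap.getCameraPosition();
double lat = pos.target.latitude;   // camera target latitude
double lng = pos.target.longitude;  // camera target longitude
float zoom = pos.zoom;              // zoom level
float tilt = pos.tilt;              // tilt, in degrees
float bearing = pos.bearing;        // compass bearing, in degrees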
I also found a way to get the radius in meters by referring to this response:
VisibleRegion vr = map.getProjection().getVisibleRegion();
Location center = new Location("center");
center.setLatitude(vr.latLngBounds.getCenter().latitude);
center.setLongitude(vr.latLngBounds.getCenter().longitude);
//Location("radiusLatLng") as mentioned in google maps sample
Location farVisiblePoint = new Location("radiusLatLng");
farVisiblePoint.setLatitude(vr.farLeft.latitude);
farVisiblePoint.setLongitude(vr.farLeft.longitude);
radius = center.distanceTo(farVisiblePoint);

JavaFX - using toggle isSelected() in conditional statements

I'm building a simple sketch program using JavaFX. I want the user to be able to switch between drawing a rectangle, circle or line and I've put toggle radio buttons in the menu for these options.
Is it possible to write an if/else statement so I can write code for three different functions depending on which is selected? At the moment it will only draw lines. This is part of my code so far (sorry it's messy):
package Sketchbook;

public class Sketchbook extends Application {

    final static int CANVAS_WIDTH = 800;
    final static int CANVAS_HEIGHT = 600;
    ColorPicker colorPicker1;
    ColorPicker colorPicker2;

    @Override
    public void start(final Stage primaryStage) {
        final Canvas canvas = new Canvas(CANVAS_WIDTH, CANVAS_HEIGHT);
        final GraphicsContext graphicsContext = canvas.getGraphicsContext2D();
        initDraw(graphicsContext);

        canvas.addEventHandler(MouseEvent.MOUSE_PRESSED,
                new EventHandler<MouseEvent>() {
                    @Override
                    public void handle(MouseEvent event) {
                        graphicsContext.beginPath();
                        graphicsContext.moveTo(event.getX(), event.getY());
                        graphicsContext.setStroke(colorPicker1.getValue());
                        graphicsContext.stroke();
                    }
                });

        canvas.addEventHandler(MouseEvent.MOUSE_DRAGGED,
                new EventHandler<MouseEvent>() {
                    @Override
                    public void handle(MouseEvent event) {
                        graphicsContext.lineTo(event.getX(), event.getY());
                        graphicsContext.setStroke(colorPicker1.getValue());
                        graphicsContext.stroke();
                    }
                });

        canvas.addEventHandler(MouseEvent.MOUSE_RELEASED,
                new EventHandler<MouseEvent>() {
                    @Override
                    public void handle(MouseEvent event) {
                    }
                });

        Group root = new Group();
        ToggleGroup toggleGroup = new ToggleGroup();
        RadioButton rectangle = new RadioButton("Rectangle");
        RadioButton circle = new RadioButton("Circle");
        RadioButton line = new RadioButton("Line");
        rectangle.setSelected(true);
        rectangle.setToggleGroup(toggleGroup);
        circle.setToggleGroup(toggleGroup);
        line.setToggleGroup(toggleGroup);
You've pretty much already described what you need to do:
canvas.addEventHandler(MouseEvent.MOUSE_DRAGGED,
        new EventHandler<MouseEvent>() {
            @Override
            public void handle(MouseEvent event) {
                if (toggleGroup.getSelectedToggle() == line) {
                    graphicsContext.lineTo(event.getX(), event.getY());
                    graphicsContext.setStroke(colorPicker1.getValue());
                    graphicsContext.stroke();
                } else if (toggleGroup.getSelectedToggle() == rectangle) {
                    // etc...
                } // etc...
            }
        });
Obviously you may need to reorder the code a little to make sure variables are declared and initialized before you use them.
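For the other shapes you would typically draw once, when the mouse is released, from the point where the press started. A rough sketch (it assumes you store the press coordinates in startX/startY fields from the MOUSE_PRESSED handler; those fields don't exist in your code yet):
canvas.addEventHandler(MouseEvent.MOUSE_RELEASED,
        new EventHandler<MouseEvent>() {
            @Override
            public void handle(MouseEvent event) {
                // startX/startY are assumed fields set when the mouse was pressed.
                double x = Math.min(startX, event.getX());
                double y = Math.min(startY, event.getY());
                double w = Math.abs(event.getX() - startX);
                double h = Math.abs(event.getY() - startY);
                graphicsContext.setStroke(colorPicker1.getValue());
                if (toggleGroup.getSelectedToggle() == rectangle) {
                    graphicsContext.strokeRect(x, y, w, h);
                } else if (toggleGroup.getSelectedToggle() == circle) {
                    graphicsContext.strokeOval(x, y, w, h);
                }
            }
        });
strokeRect and strokeOval take a top-left corner plus width and height, hence the Math.min/Math.abs juggling so that dragging in any direction works.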

Glass - Slow camera / FileObserver notification - XE12 - using ACTION_IMAGE_CAPTURE

I have basically implemented the new XE12/GDK2 CameraManager sample code to capture an image on application start. However, the FileObserver callback takes anywhere from 3 to 30 seconds to be notified of the image file creation. Taking a picture using the default 'Take a Picture' app works just fine, so I don't think it is an OS/update issue.
My app's behavior is:
- Take the picture
- Tap to accept
- Wait 3 to 30 seconds
- Get the callback, and the imageview is updated with the captured image
I don't think I have modified a single line of the sample code provided in the GDK 2.0 camera tutorial, so I am wondering what I am missing.
I have attached the relevant section of the code below. Any tips/pointers are highly appreciated.
@Override
protected void onStart() {
    super.onStart();
    Intent takePictureIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
    // String path = Environment.getExternalStorageDirectory().getPath();
    if (takePictureIntent.resolveActivity(getPackageManager()) != null) {
        startActivityForResult(takePictureIntent, REQUEST_IMAGE_CAPTURE);
    }
}

private void processPictureWhenReady(final String picturePath) {
    final File pictureFile = new File(picturePath);
    if (pictureFile.exists()) {
        // The picture is ready; process it. Takes 3-30 seconds to get here!
        try {
            Bitmap imageBitmap = BitmapFactory.decodeFile(picturePath);
            int w = imageBitmap.getWidth();
            int h = imageBitmap.getHeight();
            Bitmap bm2 = Bitmap.createScaledBitmap(imageBitmap, w / 2, h / 2, true);
            imageBitmap = bm2.copy(bm2.getConfig(), true);
            //m_ImageView.setImageBitmap(bm2);
        } catch (Exception e) {
            Log.e("Exc", e.getMessage());
        }
    } else {
        tm = System.currentTimeMillis();
        // The file does not exist yet. Before starting the file observer, you
        // can update your UI to let the user know that the application is
        // waiting for the picture (for example, by displaying the thumbnail
        // image and a progress indicator).
        final File parentDirectory = pictureFile.getParentFile();
        FileObserver observer = new FileObserver(parentDirectory.getPath()) {
            // Protect against additional pending events after CLOSE_WRITE is
            // handled.
            private boolean isFileWritten;

            @Override
            public void onEvent(int event, String path) {
                if (!isFileWritten) {
                    // For safety, make sure that the file that was created in
                    // the directory is actually the one that we're expecting.
                    File affectedFile = new File(parentDirectory, path);
                    isFileWritten = (event == FileObserver.CLOSE_WRITE
                            && affectedFile.equals(pictureFile));
                    if (isFileWritten) {
                        stopWatching();
                        // Now that the file is ready, recursively call
                        // processPictureWhenReady again (on the UI thread).
                        runOnUiThread(new Runnable() {
                            @Override
                            public void run() {
                                processPictureWhenReady(picturePath);
                            }
                        });
                    }
                }
            }
        };
        observer.startWatching();
    }
}
Answering my own question, though I got the clarifications from Jenny Murphy and John Feig :-). Hopefully it helps others.
To the first point, why image capture using the sample code from the GDK guide is so slow:
This is the expected behavior. The Glass camera intent (ACTION_IMAGE_CAPTURE) performs a ton of proprietary post-processing on the captured image (auto-HDR etc.), which takes time. This is cleverly disguised in the 'Take a picture' command by only displaying the preview image (which is available immediately). As proof, try to find the image you just took in your timeline: you will not see it for several seconds (around 8 seconds on average, in my experience).
Frankly, unless you are OK with just grabbing the preview image, the camera intent may not be very useful in most apps.
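If the preview image is enough for you, you can show the thumbnail immediately from the extras the intent returns; a sketch based on the GDK camera tutorial (I'm quoting the CameraManager extra names from memory, and m_ImageView is the ImageView from the code above, so double-check both against your project):
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == REQUEST_IMAGE_CAPTURE && resultCode == RESULT_OK) {
        // Available almost immediately:
        String thumbnailPath = data.getStringExtra(CameraManager.EXTRA_THUMBNAIL_FILE_PATH);
        m_ImageView.setImageBitmap(BitmapFactory.decodeFile(thumbnailPath));

        // The full-size file shows up seconds later; keep waiting for it as before.
        String picturePath = data.getStringExtra(CameraManager.EXTRA_PICTURE_FILE_PATH);
        processPictureWhenReady(picturePath);
    }
    super.onActivityResult(requestCode, resultCode, data);
}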
If you need the full-size image quickly, the solution is to use the Camera directly through the standard Android APIs. For convenience, I have pasted a snippet of this code below; please excuse it if it is a bit basic for many of you. A lot of the code is copied from John Feig's GIFCamera glassware on GitHub.
The activity_main layout contains a SurfaceView called preview:
<SurfaceView
    android:id="@+id/preview"
    android:layout_width="500dp"
    android:layout_height="500dp"
    android:layout_alignParentTop="true"
    android:layout_marginTop="20dp"
    />
MainActivity.java
public class MainActivity extends Activity implements PhotoCallback {

    public byte[] m_jpg = null;
    Camera cam = null;
    SurfaceHolder m_sh;

    private final SurfaceHolder.Callback mSurfaceHolderCallback = new SurfaceHolder.Callback() {
        @Override
        public void surfaceCreated(SurfaceHolder hldr) {
            m_sh = hldr;
        }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
        }

        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            myCapHandler2(); // Start camera preview etc.
        }
    };

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        SurfaceView preview = (SurfaceView) findViewById(R.id.preview);
        preview.getHolder().addCallback(mSurfaceHolderCallback);
    }

    public void myCapHandler2() {
        // open camera
        try {
            cam = Camera.open(0);
            Camera.Parameters params = cam.getParameters();
            List<Camera.Size> sizes = params.getSupportedPreviewSizes();
            params.setJpegQuality(90);
            params.setPreviewFpsRange(30000, 30000);
            params.setPictureSize(sizes.get(1).width, sizes.get(1).height);
            params.setPreviewSize(sizes.get(1).width, sizes.get(1).height);
            cam.setParameters(params);
            try {
                cam.setPreviewDisplay(m_sh);
            } catch (IOException e) {
                e.printStackTrace();
            }
            // Important: Call startPreview() to start updating the preview
            // surface. Preview must be started before you can take a picture.
            cam.startPreview();
            cam.takePicture(null, null, new PhotoHandler(this));
        } catch (Exception e) {
            if (null != cam) {
                cam.stopPreview();
                cam.release();
            }
        }
    }

    @Override
    public void pictureTaken(byte[] jpg) {
        m_jpg = jpg;
        // Picture captured - release the camera for other apps
        cam.stopPreview();
        cam.release();
    }

    @Override
    public void onPause() {
        super.onPause();
        if (null != cam) {
            cam.stopPreview();
            cam.release();
        }
    }

    @Override
    public void onDestroy() {
        super.onDestroy();
        if (null != cam) {
            cam.stopPreview();
            cam.release();
        }
    }
}
PhotoHandler.java
import android.hardware.Camera;
import android.os.AsyncTask;
public class PhotoHandler implements Camera.PictureCallback {
private PhotoCallback photoCallback;
public PhotoHandler(PhotoCallback photoCallback) {
super();
this.photoCallback = photoCallback;
}
#Override
public void onPictureTaken(byte[] data, Camera camera) {
new ProcessCapturedImage().execute(data);
}
private class ProcessCapturedImage extends AsyncTask<byte[], Void, byte[]> {
#Override
protected byte[] doInBackground(byte[]... params) {
if (null == params || null == params[0])
return null;
return params[0];
}
#Override
protected void onPostExecute(byte[] params) {
photoCallback.pictureTaken(params);
}
}
}
PhotoCallback.java
public interface PhotoCallback {
public void pictureTaken(byte[] jpg);
}
All the best with your camera glassware.

LWUIT List works terribly slowly

I've run into the well-known problem in LWUIT: my list component with a checkbox renderer scrolls very slowly. When I test my application on the emulator it runs fairly smoothly (although I see CPU utilization spike up to 60% while scrolling), but when I run it on a mobile phone it takes a couple of seconds between focus movements.
Here is the code of the renderer:
public class CheckBoxMultiselectRenderer extends CheckBox implements ListCellRenderer {

    public CheckBoxMultiselectRenderer() {
        super("");
    }

    // override
    public void repaint() {
    }

    public Component getListCellRendererComponent(List list, Object value,
            int index, boolean isSelected) {
        Location loc = (Location) value;
        setText(loc.getLocationName());
        setFocus(isSelected);
        setSelected(loc.isSelected());
        return this;
    }

    public Component getListFocusComponent(List list) {
        setText("");
        setFocus(true);
        getStyle().setBgTransparency(Consts.BG_TRANSPARENCY);
        return this;
    }
}
And here is the code of the form containing the list:
protected void createMarkup() {
    Form form = getForm();
    form.setLayout(new BorderLayout());
    form.setScrollable(false);

    Label title = new Label("Choose location zone:");
    title.getStyle().setMargin(5, 5, 0, 0);
    title.getStyle().setBgTransparency(Consts.BG_TRANSPARENCY);
    title.setAlignment(Component.CENTER);
    form.addComponent(BorderLayout.NORTH, title);

    list = new List(StateKeeper.getLocationsAsList());
    list.setFixedSelection(List.FIXED_NONE_CYCLIC);
    // list.setSmoothScrolling(true);
    list.getStyle().setBgTransparency(0);
    list.setListCellRenderer(new CheckBoxMultiselectRenderer());
    list.addActionListener(new ActionListener() {
        public void actionPerformed(ActionEvent ae) {
            // List l = (List)ae.getSource();
            // l.requestFocus();
            // l.setHandlesInput(true);
            Location selItem = (Location) list.getSelectedItem();
            selItem.setSelected(!selItem.isSelected());
        }
    });
    form.addComponent(BorderLayout.CENTER, list);
}
I would be very thankful for any help!
You have to be careful when building an LWUIT List; if something is done wrong, it can perform much worse than expected. I recommend taking a look at this:
LWUIT Blog ListRender
You can also rewrite your paint method; your list's speed will be increased.
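Besides the renderer advice in that post, one cheap experiment is the cell background: translucent backgrounds force LWUIT to repaint whatever sits underneath each cell, which tends to be slow on real devices. Making the renderer's background opaque is only a guess at where your time goes, not a guaranteed fix, and it uses nothing beyond the calls already in your code:
CheckBoxMultiselectRenderer renderer = new CheckBoxMultiselectRenderer();
renderer.getStyle().setBgTransparency(255); // 255 = fully opaque cell background
list.setListCellRenderer(renderer);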