I want to import a shapefile into GeoServer.
Its size is 2.2 GB.
I tried to use the QGIS plugin (GeoServer Explorer; QGIS 2.8.6 Wien) to import it, but I get the error "filesize would require ZIP64 extensions".
I also tried copying the shapefile directly into GeoServer's data directory, but that doesn't work (no error message, but the shapefile is not copied).
Any advice?
Load the shapefile into a PostGIS database first, then import it into GeoServer from PostGIS. Here is a guide video on importing from PostGIS: https://youtu.be/NSMM4pp1kmg
Either use the GUI to load the shapefile (see the documentation) as with any other data source, or use the REST API as described here.
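As a hedged sketch of the REST approach: GeoServer exposes a file-upload endpoint that accepts a zipped shapefile via PUT. The workspace/store names, credentials, and URL below are placeholders, and for a 2.2 GB file you would want to stream the body rather than read it into memory.

```python
# Build an authenticated PUT request for GeoServer's shapefile upload endpoint.
# All names and credentials here are placeholders, not values from the question.
import base64
import urllib.request

def shapefile_upload_request(base_url, workspace, store, user, password):
    """Return a PUT request targeting GeoServer's file.shp endpoint."""
    url = "{0}/rest/workspaces/{1}/datastores/{2}/file.shp".format(
        base_url, workspace, store)
    req = urllib.request.Request(url, method="PUT")
    req.add_header("Content-Type", "application/zip")
    token = base64.b64encode("{0}:{1}".format(user, password).encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    return req

# Usage (not executed here): send the zip as the request body.
# with open("states.zip", "rb") as body:
#     urllib.request.urlopen(shapefile_upload_request(
#         "http://localhost:8080/geoserver", "topp", "states",
#         "admin", "geoserver"), data=body)
```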
After updating the Azure blob library, it no longer installs everything as one package; each namespace package has its own version. So I installed the following packages:
azure-common==1.1.25
azure-core==1.8.1
azure-nspkg==3.0.2
azure-storage-blob==12.4.0
I'm not able to upload a blob. All the references on SO and other platforms are for old versions; for the new versions there is no reference. The error I'm getting is:
from azure.storage.blob import BlobPermissions, ContentSettings
ImportError: cannot import name 'BlobPermissions' from 'azure.storage.blob'
If I manually go to that path and remove BlobPermissions from the import, the code compiles, but the upload still fails. At upload time I'm getting this error:
connection_string=self.connection_string)
endpoint_suffix=self.endpoint_suffix)
TypeError: __init__() got an unexpected keyword argument 'token_credential'
Can anyone point me to proper documentation for Django Azure uploads with the new version? The references I found on SO describe the manual upload way.
Some references I found on SO:
Upload and Delete Azure Storage Blob using azure-storage-blob or azure-storage
ImportError: cannot import name 'BlobService' when using Azure Backend
ImportError: cannot import name 'BlobPermissions' from 'azure.storage.blob'
BlobPermissions was used in the older version. It has been replaced by BlobSasPermissions in the new version.
It seems that django.core.files.storage is not supported by the latest version (3.1). So you could use the older version (e.g. 2.1) to upload files using Django, or just use the Azure SDK.
With older version:
from django.core.files.storage import default_storage
f = open('file.csv', 'rb')
default_storage.save(path, f)
With Azure SDK:
from azure.storage.blob import BlobClient
blob = BlobClient.from_connection_string(conn_str="<connection_string>", container_name="my_container", blob_name="my_blob")
with open("file.csv", "rb") as data:
    blob.upload_blob(data)
I created an Odoo Module in Python using the Python library ujson.
I installed this library on my development server manually with pip install ujson.
Now I want to install the Module on my live server. Can I somehow tell the Odoo Module to install the ujson library when it is installed? So I just have to add the Module to my addons path and install it via the Odoo Web Interface?
Another reason to have this automated would be if I like to share my custom module, so others don't have to install the library manually on their server.
Any suggestions how to configure my Module that way? Or should I just include the library's directory in my module?
You should wrap the import in a try/except to handle problems on Odoo server start:
try:
    from external_dependency import ClassA
except ImportError:
    pass
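A slightly fuller variant of the same pattern (using ujson purely as an example) logs the missing dependency and falls back to the stdlib instead of failing silently:

```python
import logging

_logger = logging.getLogger(__name__)

try:
    import ujson
except ImportError:
    _logger.warning("ujson is not installed; falling back to the stdlib json module")
    # json exposes the same basic dumps()/loads() surface as ujson
    import json as ujson
```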
And for other users of your module, add the dependency to external_dependencies in your module manifest (v9 and earlier: __openerp__.py; v10+: __manifest__.py), which will prompt a warning on installation:
"external_dependencies": {
    'python': ['external_dependency']
},
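For context, a minimal manifest carrying this key might look like the following (module name and metadata are placeholders):

```python
# __manifest__.py of a hypothetical module that depends on ujson
{
    "name": "My Custom Module",
    "version": "1.0",
    "depends": ["base"],
    "external_dependencies": {
        "python": ["ujson"],
    },
}
```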
Big thanks go to Ivan and his blog.
Thank you for your help, @Walid Mashal and @CZoellner, you both pointed me in the right direction.
I solved this task now with the following code added to the __init__.py of my module:
import subprocess
import sys

try:
    import ujson
except ImportError:
    print('\n There was no such module named -ujson- installed')
    print('xxxxxxxxxxxxxxxx installing ujson xxxxxxxxxxxxxx')
    # pip.main() was removed in pip 10; invoking pip as a subprocess is the
    # supported way to install a package programmatically.
    subprocess.check_call([sys.executable, '-m', 'pip', 'install', 'ujson'])
    import ujson
You can install it from a Python file using the following approach (this example comes from an Odoo context). E.g., here I am going to install xlsxwriter:
import os

try:
    import xlsxwriter
except ImportError:
    os.system("pip install xlsxwriter")
    import xlsxwriter
The following is the code used in the Odoo base report module (odoo_root_folder/addons/report/models/report.py) to detect wkhtmltopdf:
from openerp.tools.misc import find_in_path
import subprocess

def _get_wkhtmltopdf_bin():
    return find_in_path('wkhtmltopdf')

try:
    process = subprocess.Popen([_get_wkhtmltopdf_bin(), '--version'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
except (OSError, IOError):
    _logger.info('You need Wkhtmltopdf to print a pdf version of the reports.')
Basically, you need to find some Python code that checks for the library and installs it, and include that code in one of your .py files, and that should do it.
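As a hedged, generic sketch of that idea (not Odoo-specific; note that pip.main() no longer exists in recent pip, so the subprocess form is used):

```python
# Import a module, installing its package with pip on first failure.
# The module/package names passed in are up to the caller.
import importlib
import subprocess
import sys

def ensure_package(module_name, pip_name=None):
    """Return the imported module, installing its package if necessary."""
    try:
        return importlib.import_module(module_name)
    except ImportError:
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", pip_name or module_name])
        return importlib.import_module(module_name)

# e.g. xlsxwriter = ensure_package("xlsxwriter")
```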
I have built an App Engine application to load data into a BigQuery table using the Google App Engine Launcher, but when I run it on localhost or in the cloud I get a "No module named cloud" error on the "from google.cloud import bigquery" line in the log file. I have installed the google-cloud client library, but it still gives me the same error. Please see the code I am using below.
main.py contains:
import argparse
import time
import uuid

from google.cloud import bigquery


def load_data_from_gcs(dataset_name, table_name, source):
    bigquery_client = bigquery.Client()
    dataset = bigquery_client.dataset(dataset_name)
    table = dataset.table(table_name)
    job_name = str(uuid.uuid4())
    job = bigquery_client.load_table_from_storage(
        job_name, table, source)
    job.begin()
    wait_for_job(job)
    print('Loaded {} rows into {}:{}.'.format(
        job.output_rows, dataset_name, table_name))


def wait_for_job(job):
    while True:
        job.reload()
        if job.state == 'DONE':
            if job.error_result:
                raise RuntimeError(job.error_result)
            return
        time.sleep(1)


if __name__ == '__main__':
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.RawDescriptionHelpFormatter)
    parser.add_argument('Test')
    parser.add_argument('mytable')
    parser.add_argument('gs://week/geninfo.csv')
    args = parser.parse_args()
    load_data_from_gcs(
        args.dataset_name,
        args.table_name,
        args.source)
app.yaml contains the following:
application: mycloudproject
version: 1
runtime: python27
api_version: 1
threadsafe: yes

handlers:
- url: /favicon\.ico
  static_files: favicon.ico
  upload: favicon\.ico
- url: .*
  script: main.app
Please let me know what is missing or if I am doing something wrong here?
This can be a bit tricky. Google Cloud uses the new Python namespace package format (if you look at the source you'll notice that there's no __init__.py in the directory structure).
This was changed in Python 3.3 with PEP-420
Fortunately, in Python 2.7 you can fix this easily by avoiding implicit relative imports. Just add this to the top of your file:
from __future__ import absolute_import
Hope that helps.
Find the directory containing google/cloud/..., and add that directory to the PYTHONPATH so that python can find it. See this post for details on how to add to PYTHONPATH. It outlines two common ways to do it:
Here's how to do it with a bash command:
export PYTHONPATH=$PYTHONPATH:/<path_to_modules>
Or you could append it to the path in your script:
# if the google/ directory is in the directory /path/to/directory/
import sys

path_to_look_for_module = '/path/to/directory/'
if path_to_look_for_module not in sys.path:
    sys.path.append(path_to_look_for_module)
If that doesn't work, here is some code I found in one of my projects for importing Google Appengine modules:
import sys

def fixup_paths(path):
    """Adds GAE SDK path to system path and appends it to the google path
    if that already exists."""
    # Not all Google packages are inside namespace packages, which means
    # there might be another non-namespace package named `google` already on
    # the path and simply appending the App Engine SDK to the path will not
    # work since the other package will get discovered and used first.
    # This emulates namespace packages by first searching if a `google` package
    # exists by importing it, and if so appending to its module search path.
    try:
        import google
        google.__path__.append("{0}/google".format(path))
    except ImportError:
        pass
    sys.path.insert(0, path)

# and then call later in your code:
fixup_paths(path_to_google_sdk)
from google.cloud import bigquery
It looks like you are trying to use the Cloud Datastore client library in a Google App Engine's standard environment. As documented in Google's documentation you should not be doing this. Instead, either use the NDB Client Library or do not use the standard environment.
Are you sure you've updated to the latest version of the library? The version installed by pip may be out of date. Previously, the module was imported as:
from gcloud import bigquery
If that works, you're running an older version. To install the latest, I'd recommend pulling from the master in the github project.
I'm using PyCharm to develop for App Engine. Now I'm trying to use endpoints, and in app.yaml I've put:
libraries:
- name: pycrypto
  version: latest
- name: endpoints
  version: 1.0
and then in main.py
import endpoints
But it gives me the error:
No module named endpoints
I can see the endpoints folder inside the GAE library. Anyone can help?
EDIT: it is just a matter of the IDE (PyCharm) not being able to locate endpoints. The app runs fine both on the dev server and the cloud server. Here is a picture to make it a bit clearer:
Thanks
You need to add {GAE_SDK}/lib/endpoints-1.0, not just the SDK itself. The reason you can import google is because it is directly under {GAE_SDK}. The libraries you specify in app.yaml are laid out differently due to supporting multiple versions. I believe you also need to add {GAE_SDK}/lib/protorpc-1.0/, it's just not showing because there's already an import error.
I'm using the new version of PyCharm Community and I had to configure it too. You need to mark each folder (like endpoints) as a source via the Source option in File - Settings - Project:
I've run across the following code somewhere which fixes it for me in a client script. I'm not able to say how much of it may be unnecessary. You'd need to edit the google_appengine path for your SDK installation:
import os
import sys

sdk_path = os.path.expanduser('~/work/google-cloud-sdk/platform/google_appengine')

try:
    import google
    google.__path__.append("{0}/google".format(sdk_path))
except ImportError:
    pass

try:
    import protorpc
    protorpc.__path__.append("{0}/lib/protorpc-1.0/protorpc".format(sdk_path))
except ImportError:
    pass

sys.path.append("{0}/lib/endpoints-1.0".format(sdk_path))
Not sure exactly what I broke. I have an Ubuntu Natty Linux server with several virtualenvs on it. Django image upload was working fine in the dev virtualenv, so it was time to get it working in production. PIL was misbehaving there, so I tried uninstalling and reinstalling it several times after fiddling with libjpeg dependencies, and ended up following the steps here: http://littlebrain.org/2011/08/21/installing-pil-in-virtualenv-in-ubuntu/
Now image upload is broken in all virtualenvs.
The PIL setup summary says all should work:
--- TKINTER support available
--- JPEG support available
--- ZLIB (PNG/ZIP) support available
--- FREETYPE2 support available
--- LITTLECMS support available
and when I run the following test within the shell it works fine, with both JPG and PNG:
>>> import PIL
>>> import Image
>>> import _imaging
>>> i = Image.open("someimage.jpg")
>>> i
<JpegImagePlugin.JpegImageFile image mode=RGB size=600x599 at 0x9646C0C>
>>> i.load()
<PixelAccess object at 0x2b86510>
however when I try to upload images in the CMS I get the dreaded:
Upload a valid image. The file you uploaded was either not an image or a corrupted image.
Anyone have an idea what could be going wrong?
Debugging tip: add some print statements (or logging) at the place in your code where you're having problems.
import sys
print sys.path
print PIL.__file__
print your_image_object
print type(your_image_object)
Things like that. Perhaps it will pinpoint your problem.
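To rule out the uploads arriving corrupted before PIL ever sees them, a stdlib-only check of the file's leading magic bytes can help (the signature table below is a small, non-exhaustive example):

```python
# Map leading magic bytes to an image type; non-exhaustive example table.
SIGNATURES = {
    b"\xff\xd8\xff": "jpeg",
    b"\x89PNG\r\n\x1a\n": "png",
    b"GIF87a": "gif",
    b"GIF89a": "gif",
}

def sniff_image_type(data):
    """Return the image type implied by the leading bytes, or None."""
    for magic, kind in SIGNATURES.items():
        if data.startswith(magic):
            return kind
    return None
```

If uploaded files fail this check, the problem is upstream of PIL (e.g. the upload handler), not the imaging library.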
Another thought: you said you installed PIL in a virtualenv. Is that virtualenv active when the code runs via your webserver?
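One way to check from inside the running process (e.g. temporarily in a view) is to log which interpreter is in use; in a virtualenv, sys.prefix differs from the base interpreter's prefix:

```python
import sys

def virtualenv_info():
    """Collect interpreter details that reveal which environment is active."""
    return {
        "executable": sys.executable,
        "prefix": sys.prefix,
        # In a virtualenv/venv, sys.prefix differs from the base prefix.
        "in_virtualenv": sys.prefix != getattr(sys, "base_prefix", sys.prefix),
    }
```

If the webserver process reports a different executable or prefix than your shell does, PIL was installed into an environment the server never sees.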