How to import a folder with BackupExec backups that BackupExec doesn't know about?
In BackupExec you have storage defined. I have a folder with BackupExec backups in a different location; how do I tell BackupExec about them?
It is NOT from a previous version of BackupExec.

Related

How do I adapt my import statements to my folder structure in Django?

I just set up a basic folder structure for a new project, but I am doing it a bit differently than the standard file structure, since my front-end is in React and will live in a separate git repository. The problem is that my virtual environment is installed in the "backend-django" project, and when I try to use my package imports (the ones I installed in my virtual env) in backend/settings.py, they are not recognized.
As an example: I pip installed django-environ into my venv but when I go to settings.py (like I normally do) and import environ, I get a 'no module named environ' error.
This is my first foray into Django (clearly). For reference, I have previously only used Flask for Python projects. Any help is appreciated!
A mockup of my basic folder structure is here: [image: folder structure]
I have already tried from backend-django import environ and from . import environ. I am still getting the same error.
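For reference, a minimal sketch of how django-environ is typically used in settings.py, assuming the interpreter running Django is the same virtual environment that received pip install django-environ (the .env handling and the specific settings shown are illustrative, not from the original question):

# backend/settings.py -- django-environ sketch; the import only resolves if the
# active interpreter is the venv where django-environ was installed
import environ

env = environ.Env(DEBUG=(bool, False))  # cast DEBUG to bool, default False
environ.Env.read_env()                  # load a .env file if one is present

DEBUG = env("DEBUG")
SECRET_KEY = env("SECRET_KEY", default="insecure-dev-only-key")

If the import still fails, it usually means the editor or shell is running a different interpreter than the venv inside backend-django.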

The worker for a Google Cloud Platform task cannot find the logging library

I have created a simple task based on the Google Cloud Platform "update counter" push task example. All I want to do is log to the Stackdriver logs that it has been invoked.
from google.cloud import logging
logging_client = logging.Client()
log_name = 'service-log'
logger = logging_client.logger(log_name)
import webapp2
class UpdateCounterHandler(webapp2.RequestHandler):
    def post(self):
        amount = int(self.request.get('amount'))
        logger.log_text('Service startup task done.')

app = webapp2.WSGIApplication([
    ('/update_counter', UpdateCounterHandler)
], debug=True)
After deploying this and invoking it, there is an error. In the logs online it says:
from google.cloud import logging
ImportError: No module named cloud
This isn't a local version, but one that I've deployed. It's hard for me to believe that I have to actually install python libraries into the production runtime. (I can't even imagine that I can.)
As the root readme states:
Many samples require extra libraries to be installed. If there is a requirements.txt, you will need to install the dependencies with pip.
Try adding the library as explained here.
When using logging from the Python standard library in App Engine, the logs also end up in Stackdriver. So you could use import logging instead of from google.cloud import logging.
When you are specifically interested in using the google.cloud.logging library, it needs to be installed into a project folder ./lib, as referenced by Tudormi: here
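On the first-generation App Engine Python runtime, vendoring roughly follows the pattern sketched below (the file names and requirements.txt contents are assumptions; vendor.add is the App Engine mechanism for adding a local folder to sys.path):

# appengine_config.py -- make the vendored ./lib folder importable at runtime
# (populate it locally with: pip install -t lib -r requirements.txt)
from google.appengine.ext import vendor

vendor.add('lib')  # ./lib should contain google-cloud-logging and its dependencies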

django-admin.py can't find custom settings file

I have several customized django settings, this is basically my project structure:
MainProject/
    manage.py
    my_project/
        settings/
            base.py
            dev.py
            prod.py
I've created the __init__.py files inside the directories to identify them as packages.
I've exported the DJANGO_SETTINGS_MODULE to point to the chosen settings file.
The manage.py command seems to handle it pretty well; I never had a problem with it.
The problem is that, no matter what I do, django-admin.py is not able to find any settings file. I've tried several possible solutions but nothing seems to work so far:
1. I've used --settings=my_project.settings.dev
2. I've edited and hard-coded manage.py to make it point to the dev.py file
3. I've created a settings file inside either the MainProject or the my_project directory that imports the dev file (which in turn imports base.py)
4. I've created a settings file that lets Django know which files it should use as settings
This is regarding point 4:
from django.core.management import setup_environ

try:
    import my_project.settings.dev as settings
except ImportError:
    import sys
    sys.stderr.write("Couldn't find the settings.py module.")
    sys.exit(1)

setup_environ(settings)
Nothing seems to work so far.
====================================
SOLUTION:
I did not find the exact solution, but thanks to a comment on the chosen answer I understood that you can basically use manage.py for everything you could do with django-admin.py, which I didn't know! Since things DO work for me using manage.py, I'm fine with it.
What I recommend doing:
Create a normal settings.py file and import one of the others into it. This avoids duplicating settings shared among the three scenarios, and it is actually the recommended way of doing it according to the DRY principle.
Typically, you will only have to set the DEBUG flag, the database settings and the secret keys in the scenario-specific settings files. All the other settings should be shared among all scenarios, so you don't forget to update one of them and end up with hard-to-debug errors.
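A minimal sketch of that layout, using the file names from the question (the individual settings shown are only illustrative):

# my_project/settings/base.py -- settings shared by every environment
SECRET_KEY = "override-me-per-environment"
INSTALLED_APPS = [
    "django.contrib.contenttypes",
    "django.contrib.auth",
]

# my_project/settings/dev.py -- development-only overrides layered on base
from .base import *

DEBUG = True
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": "dev.sqlite3",
    }
}

With DJANGO_SETTINGS_MODULE=my_project.settings.dev exported, manage.py and django-admin.py both resolve the same module.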
Have you tried importing the dev settings inside the __init__.py of your settings module?
settings/__init__.py
from .dev import *

Make 'collectstatic' find updated files

Is there a way to make python manage.py collectstatic find updated static files? Currently, it properly searches STATICFILES_DIRS and finds where I have my static files, but it only uploads new ones. If I modify a static file, it does not detect the change. Is this intentional on Django's part, so that we have to delete each file first, or is there an easy solution?
UPDATE:
Disclaimer - this issue relates to external hosting on Amazon's S3 storage.
I had simply forgotten to include AWS_PRELOAD_METADATA = True in my settings.py file.
Adding this setting fixed the issue of collectstatic only finding new files. I also saw a major speed increase when syncing between the server and Amazon's S3.
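For context, a sketch of the settings involved; the bucket name is a placeholder and the backend path depends on your django-storages version, but AWS_PRELOAD_METADATA is the setting referred to above:

# settings.py -- S3 static file storage via django-storages (illustrative values)
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'  # boto-based backend
AWS_STORAGE_BUCKET_NAME = 'my-static-bucket'                    # placeholder bucket
AWS_PRELOAD_METADATA = True  # preload the bucket listing so collectstatic's
                             # freshness checks can see which files actually changed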
If you are using S3 for storage, I found this answer to be quite helpful.

Django, boto, S3 and easy_thumbnails not working in production environment

I'm using Django and django-storages with S3 (boto), in combination with easy-thumbnails. On my local machine everything works as expected: if a thumbnail doesn't exist, it gets created, uploaded to S3 and saved in the easy-thumbnails database tables. But when I push the code to my production server it doesn't work; easy-thumbnails outputs an empty image src.
What I already noticed is that when I create the thumbnails on my local machine, the easy-thumbnails path uses backslashes, while my Linux server needs forward slashes. If I change the slashes in the database, the thumbnails are shown on my Linux machine, but it is still not able to generate thumbnails on the Linux (production) machine.
The simple django-storages test fails:
>>> import django
>>> from django.core.files.storage import default_storage
>>> file = default_storage.open('storage_test', 'w')
Output:
django.core.exceptions.ImproperlyConfigured: Requested setting DEFAULT_FILE_STORAGE, but settings are not configured. You must either define the environment variable DJANGO_SETTINGS_MODULE or call settings.configure() before accessing settings.
If I do:
>>> from base.settings import staging
>>> from django.conf import settings
>>> settings.configure(staging)
This works (I have a settings directory with 4 settings files: base.py, staging.py, development.py and production.py)
It seems that on my production server the config file isn't loaded properly (although the rest of the website works fine). Even if I add THUMBNAIL_DEBUG = True to my settings file, easy-thumbnails' debugging still doesn't work (it does work on my local machine).
What can the problem be? I've been debugging for 10+ hours already.
Try refactoring your settings to use a more object-oriented structure. A good example is outlined by David Cramer from Disqus:
http://justcramer.com/2011/01/13/settings-in-django/
You'll put any server-specific settings in a local_settings.py file, and you can store a stripped-down version as example_local_settings.py within your repository.
You can still use separate settings files if you have a lot of settings specific to a staging or review server, but you wouldn't want to store complete database credentials in a code repo, so you'll have to customize local_settings.py anyway. You can define which settings to include by adding imports at the top of local_settings.py:
from project.conf.settings.dev import *
Then, you can set your DJANGO_SETTINGS_MODULE to always point to the same place. This would be instead of calling settings.configure() as outlined in the Django docs:
https://docs.djangoproject.com/en/dev/topics/settings/#either-configure-or-django-settings-module-is-required
And that way, you know that your settings on your production server will definitely be imported, since local_settings.py is always imported.
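A minimal sketch of the tail of such a settings.py, following the pattern described in that article (the try/except fallback is a common convention, not quoted from it):

# settings.py (end of file) -- pull in server-specific overrides when present
try:
    from local_settings import *  # secrets, hosts, DB credentials for this machine
except ImportError:
    pass  # no local_settings.py on this machine: keep the defaults defined above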
First try to use:
python manage.py shell --settings=base.settings.staging
to load the shell with the correct settings file, and then try to debug.
For some reason, S3 and easy-thumbnails in the template language didn't seem to get along with each other ... some path problem which could probably be solved at some point.
My solution (read: workaround) was to move the thumbnail generation into the model, inside the image field itself, for example:
avatar = ThumbnailerImageField(upload_to = avatar_file_name, resize_source=dict(size=(125, 125), crop="smart"), blank = True)
For the sake of completeness:
import os
from django.conf import settings

def avatar_file_name(instance, filename):
    # save the avatar as <username>/avatar.<ext>, replacing any previous one
    path = "%s/avatar.%s" % (str(instance.user.username), filename.split('.')[1])
    if os.path.exists(settings.MEDIA_ROOT + path):
        os.remove(settings.MEDIA_ROOT + path)
    return path
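For context, a sketch of how the field and the upload_to callable fit together in a model; the Profile model and its name are assumptions, while the field definition is the one shown above:

# models.py -- the thumbnail is produced at upload time by the field itself
from django.contrib.auth.models import User
from django.db import models
from easy_thumbnails.fields import ThumbnailerImageField

class Profile(models.Model):
    user = models.OneToOneField(User, on_delete=models.CASCADE)
    avatar = ThumbnailerImageField(
        upload_to=avatar_file_name,  # the callable defined above, in the same module
        resize_source=dict(size=(125, 125), crop="smart"),
        blank=True,
    )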