I am writing a fabfile function to automate the setup of a particular Mac OS X machine, like this:
from fabric.api import local, env, sudo, require

def mac_port_setup():
    PACKAGES = ['vim +python27', 'htop']
    for item in PACKAGES:
        local('sudo port -v install {0}'.format(item))
The PACKAGES list can actually be huge, and I want to avoid installing a package if it is already installed. What are the possible ways to prevent an already-installed package from being re-installed in my Fabric automation?
Pretty sure running the install again won't do anything for already installed packages. If you want to check first, port has the installed command to list what's been installed.
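To turn that check into code, here is a sketch that filters PACKAGES against the output of port installed before looping. It assumes port installed lists one port per line with the name as the first token (e.g. "  vim @9.0.1_0+python27 (active)"); parse_installed and missing_packages are hypothetical helpers, not part of the Fabric API.

```python
def parse_installed(port_output):
    """Extract port names from `port installed` output (name is the first token)."""
    installed = set()
    for line in port_output.splitlines():
        line = line.strip()
        # Skip the header line and blanks.
        if not line or line.startswith('The following ports'):
            continue
        installed.add(line.split()[0])
    return installed

def missing_packages(packages, port_output):
    """Keep only entries whose base port name is not yet installed.

    An entry like 'vim +python27' has the port name as its first token.
    """
    installed = parse_installed(port_output)
    return [p for p in packages if p.split()[0] not in installed]
```

In the fabfile you could then capture the list once with output = local('port installed', capture=True) and loop over missing_packages(PACKAGES, output) instead of PACKAGES. Note this skips a port even if the installed variants differ from the ones you requested.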
I have tried to create an environment on my Linux Pop!_OS system using Miniconda. When I activate it, I can install packages using pip, but when I run my Django instance it doesn't find the modules.
If I type which python it shows the Miniconda path correctly. I can look in the site-packages folder and see the packages installed.
I've tried installing django-anymail and corsheaders and they are both not being found. It does find my locally installed apps.
If I use the command line and open Python and then import, it does not recognize the modules installed in the virtual environment either. I thought it was a problem with conda, so I also created an environment using Python's native method: python3 -m venv
I have the same problem with it finding pip-installed site-packages.
Is there a command I can run to show all available packages?
I hadn't realized I had aliased my python. Now it is working.
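For anyone hitting the same symptom, a quick generic diagnostic (not specific to conda) is to ask the running interpreter where it lives and where it imports third-party modules from; if which python and sys.executable disagree, a shell alias is the likely culprit.

```python
import sys
import sysconfig

# The interpreter that is actually executing this script...
print("executable:", sys.executable)
# ...and the directory it imports third-party modules from.
print("site-packages:", sysconfig.get_paths()["purelib"])
```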
After running a few tests with the AWS SSM document AWS-RunPatchBaseline, the code it uses for Debian-based OSes like Ubuntu doesn't actually install any required packages, even when executed with the "Install" operation parameter. Is this something that needs to be fixed, or why does this SSM document work this way without actually installing packages?
From documentation, please see below:
apt-get update downloads the package lists from the repositories and "updates" them to get information on the newest versions of packages and their dependencies. It will do this for all repositories and PPAs. From http://linux.die.net/man/8/apt-get:
Used to re-synchronize the package index files from their sources. The indexes of available packages are fetched from the location(s) specified in /etc/apt/sources.list(5). An update should always be performed before an upgrade or dist-upgrade.
apt-get upgrade will fetch new versions of packages existing on the machine if APT knows about these new versions by way of apt-get update.
From http://linux.die.net/man/8/apt-get:
Used to install the newest versions of all packages currently installed on the system from the sources enumerated in /etc/apt/sources.list(5). Packages currently installed with new versions available are retrieved and upgraded; under no circumstances are currently installed packages removed, nor are packages that are not already installed retrieved and installed. New versions of currently installed packages that cannot be upgraded without changing the install status of another package will be left at their current version. [Emphasis mine] An update must be performed first so that apt-get knows that new versions of packages are available.
Currently the code content has this:
" apt-get update >/dev/null",
" fi",
"}",
Should I add a custom line, or create a custom SSM document with apt-get upgrade -y after apt-get update? This document is supposed to work by installing packages, but so far on Ubuntu, for example, it doesn't do anything besides updating the package lists from the repos (without installing any).
In my experience, on Ubuntu 20.04, it works. You can verify this by checking apt logs after you run AWS-RunPatchBaseline on such an instance. The logs are located in:
/var/log/apt/history.log
and
/var/log/apt/term.log
Since you haven't provided any details on what your tests were or which Linux distribution you used, nor any log output with possible errors from the SSM agent or apt, it is difficult to speculate why it does not work for you.
The actual upgrade is performed by a python script, not the command you listed. You can inspect its code after you run AWS-RunPatchBaseline:
/var/log/amazon/ssm/patch-baseline-operations
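If you want to script the log check, here is a sketch of a parser for the "Upgrade:" lines of /var/log/apt/history.log. It assumes entries of the form "Upgrade: name:arch (old-version, new-version), ..."; verify against your actual log before relying on it.

```python
def upgraded_packages(history_log_text):
    """Collect package names from the 'Upgrade:' lines of an apt history.log."""
    upgraded = []
    for line in history_log_text.splitlines():
        if line.startswith("Upgrade:"):
            # Each entry looks like "name:arch (old-version, new-version)".
            for entry in line[len("Upgrade:"):].split("), "):
                name = entry.strip().split(" ")[0]
                upgraded.append(name.split(":")[0])  # drop the ":amd64" suffix
    return upgraded
```

An empty result after a patch run with the "Install" operation would support the suspicion that nothing was actually upgraded.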
I've been using Python 3.4 to complete certain tasks, though I still use Python 2.7 as my default.
I think I should be able to download py34 ports by using sudo port install py34-whatever into the same location as my Python 2.7 ports.
However, I am running into significant download errors doing this.
Is it possible to download both py27 and py34 ports into the same location? Will there be problems doing this?
Your problem appears to be a generic MacPorts download issue. Resetting the download process via sudo port clean <portname> should help.
As to the general question of using multiple versions:
MacPorts allows you to install an arbitrary number of different versions in parallel. You switch between them using port select --set <application> <portname>, for example sudo port select --set python python34.
For easier access, you can define your own shell alias (e.g. python3 or python34), pointing to /opt/local/bin/python34.
My personal experience is that Anaconda makes these types of tasks painless, while providing the same functionality. http://docs.continuum.io/anaconda/install
Suppose you want an isolated environment for py27:
http://conda.pydata.org/docs/using/envs.html#create-an-environment
conda create --name py27 python==2.7.10
To use the environment:
source activate py27
To install a package, use conda install or pip install.
If you want a Python 3.4 environment just change the above command a bit. I have no affiliation with Anaconda, and I would guess other Python distros work just as well. This just made things easier for me, hope it does for others as well!
For regulatory reasons my company has deployed an air-gapped Red Hat environment with, among other things, Python Anaconda and R installed. How do I go about updating Anaconda packages in such an environment? I can move files from my own machine to the environment via FTP, but cannot access the internet directly from the air-gapped environment.
I usually update my anaconda packages with something like this:
conda update scipy
The answer appears to be here in the Anaconda FAQ:
http://docs.continuum.io/anaconda/faq.html#install-pkgs-offline
How do I install packages into Anaconda offline?
You can directly install a conda package if you have it available on your local machine (use the full path for the package to ensure conda finds it):
conda install <package-filename>.tar.bz2
If you do not have the package on the target machine, you'll need to move a copy of the .tar.bz2 file to it. Packages installed by conda are found in the anaconda/pkgs directory.
You can also install a tar file (.tar) that itself contains many conda packages, at any path location. You can make it easily using tar and then install it directly as well. No internet connection is needed as long as the tar file is available on the target machine. Use the full path for the tar file to ensure conda finds it:
conda install <tar-filename>.tar
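If many .tar.bz2 files are moved over at once, a small hypothetical helper like the one below could generate the install commands. It only scans a directory and builds command strings with full paths (as the FAQ advises); the directory and filenames are whatever you transferred.

```python
import os

def offline_install_commands(pkg_dir):
    """Build one `conda install` command per conda package archive in pkg_dir,
    using the full path so conda finds the file."""
    commands = []
    for name in sorted(os.listdir(pkg_dir)):
        if name.endswith(".tar.bz2"):
            commands.append("conda install " + os.path.join(pkg_dir, name))
    return commands
```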
I am trying to install django-dash to run one of the dashboard examples and see what it's like.
I am on Windows running Python 2.7 and Django 1.6.5. I know the usual approach is to download pip and then install the package using pip. However, I am on a work computer with no administrative rights, so I can't access my Internet Options settings to find my proxy URL to follow the instructions below:
Proxy problems
If you work in an office, you might be behind an HTTP proxy. If so, set the environment variables http_proxy and https_proxy. Most Python applications (and other free software) respect these. Example syntax:
http://proxy_url:port
http://username:password@proxy_url:port
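For reference, those variables can also be set from Python itself before shelling out to pip, if editing shell settings isn't an option. Every value below is a placeholder, not a real proxy:

```python
import os

# Placeholder proxy URL; most Python tooling (pip included) honours these variables.
proxy = "http://username:password@proxy.example.com:8080"
os.environ["http_proxy"] = proxy
os.environ["https_proxy"] = proxy
```

Subprocesses launched from the same script inherit these, so a subsequent pip invocation would go through the proxy; pip also accepts a --proxy option on the command line.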
I had the same issue when trying to install Django but was able to get it to work by moving the django directory under Python27/Lib/site-packages. Is there something similar I can do with django-dash?
I also tried downloading the sources and running python setup.py install. I received the following error:
File "setup.py", line 3, in <module>
    from setuptools import setup, find_packages
ImportError: No module named setuptools
Link to django-dash: http://django-dash.readthedocs.org/en/latest/
Yes, you can probably get the sources from the Python Package Index.
Once you have them, uncompress the files and install them manually (this will depend on your OS).
On Linux systems:
python setup.py build
python setup.py install
Here's the full reference
EDIT: Note that when manually installing those packages, you must also install any missing dependencies, e.g. setuptools in your case.
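A quick way to see which dependencies are missing before running setup.py is a tiny helper like this (hypothetical, just a wrapper around importlib):

```python
import importlib

def can_import(module_name):
    """Return True if the current interpreter can import module_name."""
    try:
        importlib.import_module(module_name)
        return True
    except ImportError:
        return False

# e.g. check can_import("setuptools") before running `python setup.py install`
```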