I have installed OpenStack using Packstack on CentOS 7. I now need to install the Tacker service on top of this Packstack deployment.
Any help on this would be helpful for me.
Thanks.
Follow the official OpenStack guide to install Tacker: https://docs.openstack.org/tacker/latest/install/manual_installation.html
There are .rpm packages in the RDO repositories called openstack-tacker and openstack-tacker-common. Install them and configure the service following the official guide.
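As a minimal sketch, assuming the RDO repository matching your OpenStack release is already enabled on the Packstack node:

# install the Tacker RPMs from RDO (package names as given above)
yum install -y openstack-tacker openstack-tacker-common

After installing, edit /etc/tacker/tacker.conf and create the service, user and endpoints in Keystone as described in the manual installation guide.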
Tacker also requires some other services (Mistral and Barbican) which need to be installed and configured. There are other deployment options which can deploy Tacker and its dependencies from a single config, for example kolla-ansible.
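For example, with kolla-ansible you could enable Tacker and its dependencies before deploying. A rough sketch (check the option names against your kolla-ansible version, and edit globals.yml properly rather than appending in a real deployment; the inventory path is an assumption):

# enable Tacker and its dependencies in kolla-ansible's main config
cat >> /etc/kolla/globals.yml <<'EOF'
enable_tacker: "yes"
enable_mistral: "yes"
enable_barbican: "yes"
EOF
# then deploy/redeploy with your inventory
kolla-ansible -i /etc/kolla/inventory deploy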
Related
I am trying to deploy a service to AWS using Serverless. I am deploying it using GitLab CI/CD instead of doing it locally. Initially my Serverless version was the latest (I had not pinned a specific version), but when I pushed my code to GitLab I got a few errors in the pipeline because the latest version is not stable, so I had to change to a stable version. Now when I push my code changes to GitLab, my deployment fails and I get:
Serverless Error ----------------------------------------
Cannot run local installation of the Serverless Framework by the outdated global version. Please upgrade via:
npm install -g serverless
Note: Latest release can run any version of the locally installed Serverless Framework.
I don't want to upgrade my Serverless version.
In my gitlab-ci.yml I have changed
- npm install -g serverless
to this
- npm install -g serverless@2.69.1
Is there any way I can fix this?
Any help would be appreciated, thank you.
In your case, the most likely reason is that you have a local installation of Serverless, or that some of your plugins or other dependencies have Serverless v3 in their peer dependencies and install it by default with npm 7 and higher.
To resolve it, either remove the local installation or pin the version of the locally installed Serverless (in the devDependencies of your project's package.json).
./node_modules/.bin/sls deploy does the trick.
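A minimal sketch of the pinning approach, assuming you want to stay on 2.69.1 as in your pipeline:

# pin Serverless v2 as a project dev dependency instead of relying on the global install
npm install --save-dev serverless@2.69.1
# invoke the project-local binary in the pipeline
./node_modules/.bin/sls deploy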
However, the proper answer is in the docs:
There are 2 scenarios:
Using v3 globally, and v2 in specific projects.
This is the simplest. Upgrade the global version to v3, and install v2 in specific projects (via NPM). The serverless command will automatically run the correct version (v3 can run v2).
Using v2 globally, and v3 in specific projects.
To achieve that, install v3 in specific projects (via NPM). Then, use serverless for v2 projects, and npx serverless for v3 projects.
https://www.serverless.com/framework/docs/guides/upgrading-v3#using-v2-and-v3-in-different-projects
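A rough sketch of the second scenario (the version specifiers are illustrative):

# global install stays on v2
npm install -g serverless@2
# inside a v3 project, install v3 locally
npm install --save-dev serverless@3
# v2 projects keep using the global binary
serverless deploy
# v3 projects run the locally installed version through npx
npx serverless deploy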
I am currently running Google Cloud Composer with Composer version 2.0.9 and Airflow version 2.1.4. I am trying to install the most recent version of dbt (1.0.4 for core and 1.0.0 for the BigQuery plugin). Because Cloud Composer images have specific packages pre-installed, I am getting conflicting PyPI dependency issues. When I try to fix one dependency, another issue occurs. Does anyone know the specific set of package versions that would resolve this? I have read the following posts by the community, but I wanted to know if anyone has a solution using just Composer:
How to run DBT in airflow without copying our repo
How to set up dbt with Google Cloud Composer?
I was able to reproduce the behaviour you are seeing. Below are the dependency conflicts I saw in the Cloud Build logs. These conflicts are occurring between the dbt-core requirements and the pre-installed package requirements in Composer.
Pre-installed package requirements:
hologram 0.0.14 has requirement jsonschema<3.2,>=3.0, but you have jsonschema 3.2.0. ##=> can be installed manually
flask 1.1.4 has requirement click<8.0,>=5.1, but you have click 8.1.2.
apache-airflow 2.1.4+composer has requirement markupsafe<2.0,>=1.1.1, but you have markupsafe 2.0.1.
looker-sdk 22.4.0 has requirement typing-extensions>=4.1.1, but you have typing-extensions 3.10.0.2.
dbt-core requirements:
hologram 0.0.14 has requirement jsonschema<3.2,>=3.0, but you have jsonschema 3.2.0. ##=> can be installed manually
dbt-core 1.0.4 has requirement click<9,>=8, but you have click 7.1.2.
dbt-core 1.0.4 has requirement MarkupSafe==2.0.1, but you have markupsafe 1.1.1.
dbt-core 1.0.4 has requirement typing-extensions<3.11,>=3.7.4, but you have typing-extensions 4.1.1.
I tried downgrading the pre-installed packages, but subsequent package installations fail, and downgrading them is not recommended anyway.
Therefore, I would suggest using an external solution, as stated in the threads you have linked. Quoting the workarounds given in @Ryan Yuan's answer here:
Using external services to run dbt jobs, e.g. Cloud Run.
Using Composer's KubernetesPodOperator (updated Composer 2 link). My colleague has put up a nice article on dbt Discourse here going through the setup process.
Ignoring Composer's dependency conflicts by setting Composer's environment variable IGNORE_PYPI_DEPENDENCY_CONFLICTS to True.
However, I don't recommend this as it may cause potential issues.
Creating a Python virtual environment in Composer and installing the dbt packages there (sketched below).
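For the virtual environment approach, a rough sketch of the commands a BashOperator task could run (the paths and versions are illustrative assumptions):

# build an isolated virtualenv so dbt does not touch Composer's pre-installed packages
python3 -m venv /home/airflow/gcs/data/dbt_venv
/home/airflow/gcs/data/dbt_venv/bin/pip install dbt-bigquery==1.0.0
# run dbt from the virtualenv against a project synced to the data folder
/home/airflow/gcs/data/dbt_venv/bin/dbt run --project-dir /home/airflow/gcs/data/dbt_project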
As mentioned by @Kabilan Mohanraj, the current version of dbt (1.0.4) and the more recent Composer versions have dependency issues (Composer version 2.0.9 and Airflow version 2.1.4). Therefore an alternative solution is needed. In my case, I played around and searched for a solution from other people in the community and found one person using a certain combination of Composer and dbt versions that had only minimal dependency issues. However, as mentioned by @Kabilan Mohanraj, Google does not recommend downgrading pre-installed packages, so this would not be a viable solution for something in production.
Create the Composer environment through gcloud in order to use an older image version that is not available via the Composer UI:
gcloud composer environments create my_airflow_dbt_example \
    --location us-central1 \
    --image-version composer-1.17.9-airflow-2.1.4
PyPI requirements for this environment:
dbt-bigquery==0.21.0
jsonschema==3.1.1
packaging==20.9
For this specific Composer version, you are downgrading jsonschema from 3.2.0 to 3.1.1 and packaging from 21.3 to 20.9.
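You can apply those pins from a requirements file when updating the environment; roughly (file name is an assumption, flags per the gcloud docs):

# put the pinned packages into a requirements file
cat > requirements.txt <<'EOF'
dbt-bigquery==0.21.0
jsonschema==3.1.1
packaging==20.9
EOF
# install them into the environment (this triggers a Composer update operation)
gcloud composer environments update my_airflow_dbt_example \
    --location us-central1 \
    --update-pypi-packages-from-file requirements.txt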
I am using Cloud Foundry's nodejs profile and my package.json requires chartjs-node-canvas. That package uses node-canvas, and node-canvas is based on Cairo. The node-canvas site says I have to add the cairo-devel package to Linux (via apt-get) in order for canvas to be installed.
Is it possible to add software to the OS image running on cloud foundry? If so, how?
You can do that by vendoring the dependencies. When you vendor them, you'll build locally in an Ubuntu Bionic Linux container or VM. Node will build everything that's required and you will no longer need the cairo-devel package (it's only needed to build).
The process to vendor dependencies is documented here.
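A rough sketch of the vendoring flow (the package list and app name are assumptions; run this in an Ubuntu Bionic container or VM matching the cflinuxfs3 root filesystem):

# install the native build dependencies node-canvas needs at build time
apt-get update && apt-get install -y build-essential libcairo2-dev libpango1.0-dev libjpeg-dev libgif-dev librsvg2-dev
# build node_modules locally so the compiled canvas binding gets vendored
npm install
# push the app together with the vendored node_modules directory
cf push my-node-app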
The other option is to use the Apt Buildpack, which is described in this SO post. That can be used to install any apt packages.
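With the Apt Buildpack route, a sketch would be to declare the package in an apt.yml at the app root and push with the apt buildpack in front of the Node.js buildpack (the package and app names are assumptions):

# declare the apt package the app needs
cat > apt.yml <<'EOF'
---
packages:
- libcairo2-dev
EOF
# apply the apt buildpack first, then the final Node.js buildpack
cf push my-node-app -b https://github.com/cloudfoundry/apt-buildpack -b nodejs_buildpack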
I have been told that the AWS SDK comes with the default operating system installed on AWS EC2 instances when provisioned via Elastic Beanstalk.
I would like to know if I can access the PHP version of the SDK.
Or does it need to be installed separately?
Thank you.
You have to install it using the phar or Composer. Please refer to the AWS documentation for this:
To use Composer with the AWS SDK for PHP:
Open a terminal window and navigate to the directory where your project is stored. Composer is installed on a per-project basis.
Download and install Composer in your project directory. If you have curl installed, you can use the following command:
curl -sS https://getcomposer.org/installer | php
Create a file at the root level of your project called composer.json and add the following dependency for the AWS PHP SDK:
{
    "require": {
        "aws/aws-sdk-php": "2.*"
    }
}
Install the dependencies by running Composer's install command:
php composer.phar install
This will create a vendor directory in your project with the required libraries and an autoloader script used to load them for your project.
Require Composer's autoloader by adding the following line to your code's bootstrap process (typically in index.php):
require '/path/to/sdk/vendor/autoload.php';
Your code is now ready to use the AWS SDK for PHP!
Is there any well-documented, step-by-step procedure to install Redmine? I tried to install it on my Ubuntu machine, but I am unable to access it from another machine. Please tell me how to do it. Is there any document which shows how to host it as a centralized server?
See this guide: http://www.redmine.org/projects/redmine/wiki/redmineinstall. It's a general installation guide. I used it to install Redmine on Debian Jessie.
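Regarding access from other machines: when testing with the bundled WEBrick server, it is often only listening on localhost. A rough sketch of starting it bound to all interfaces (the port and environment are assumptions):

# from the Redmine install directory, bind the test server to all interfaces
bundle exec rails server webrick -e production -b 0.0.0.0 -p 3000

For a permanent, centralized setup, the guide recommends running Redmine behind a real web server (for example Apache with Passenger) instead of WEBrick.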