I follow the steps below to back up and restore a Drupal 7 site.
1) On the source server, upload the Backup and Migrate module
2) On the source server, take a backup of the database using the Backup and Migrate module
3) Copy all the site files - Drupal core, modules, themes, and your files folder - and migrate them to the target server in the exact same folder structure as the source server.
4) After that, delete settings.php and make a copy of default.settings.php called settings.php
5) Install Drupal as a new installation on the target server
6) Enable the Backup and Migrate module on the target server
7) Use the Backup and Migrate module to import the database backup from step 2
I followed the same steps to back up and restore in Drupal 8, but I am running into issues. Does backing up and restoring a site in Drupal 8 work the same way?
More or less.
It can be simpler if you already have a backup copy.
If you can create the backup beforehand using Drush, that will be the best way.
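A minimal sketch of creating that backup with Drush, run from the Drupal root on the source server (the paths and file names below are placeholders, not from the original post):
# Dump the database to a .sql file
drush sql-dump --result-file=/folder/where/create/backup/filename-backup.sql
# Archive the whole site directory
tar -czvf /folder/where/create/backup/filename-backup.tar.gz /folder/site/drupal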
So, assuming you have a backup copy:
From the Linux command line, change to the directory where you want to restore your Drupal project.
To create the database on the system:
$ mysqladmin -u [username] -h [host] -p create [database]
Then the copy of the database can be restored from the .sql file that you generated when you backed up:
$ mysql -u [username] -h [host] -p [database] < [backup-file].sql
Enter the database password when prompted.
Next, go to the right directory:
cd /folder/site/drupal
and extract the site archive into the right place where you need it:
$ tar -xzvf /folder/where/create/backup/filename-backup.tar.gz -C /folder/site/drupal
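If the target is a Drupal 8 site, you will usually also need to rebuild caches after restoring; a sketch, assuming Drush is installed on the target server:
# Rebuild Drupal 8 caches (run from the Drupal root)
drush cr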
I have a site running on an Elastic Beanstalk single instance server and want to add automated SSL certificate generation from LetsEncrypt using the AcmePHP library.
The library tries to store the certificates in ~/.acmephp, which the server responds to with an error
Failed to create "/home/webapp/.acmephp": mkdir(): Permission denied.
The acmephp library doesn't have an option to change the path built in, and rather than fork and recompile the script, I'd like to be able to store the files in the default directory.
Does anyone know how I can give the app permission to create this directory, outside of the web root, or how I can make the server create it automatically and have it be available to the app?
It looks like since it's being run by the webapp user, when AcmePHP tries to store the certificate under that user's home directory it fails because that directory doesn't exist (afaik the webapp user only runs httpd and it definitely doesn't have a home directory).
A very dirty workaround could be manually creating that file and folder via the .ebextensions folder in your project. The file would be .ebextensions/create_home.config and it would contain something like this:
files:
  "/tmp/create-home.sh":
    mode: "000755"
    content: |
      #!/usr/bin/env bash
      mkdir -p /home/webapp
      chown webapp:webapp -R /home/webapp
commands:
  01_create:
    command: "/tmp/create-home.sh"
That script is run by the root user, and afterwards it changes ownership of the /home/webapp folder to the webapp user and group respectively. Hope it helps.
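To verify it worked, a hypothetical check assuming the EB CLI is configured for this environment:
# SSH into the instance...
eb ssh
# ...then, on the instance, confirm the directory exists and is owned by webapp
ls -ld /home/webapp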
I am running Apache Superset on a GCP instance and it works fine with the SQLite database that is the default in Superset, so I don't need to configure many things. But my requirement is that I need Superset to connect directly to BigQuery instead of SQLite, and I don't have a developer background. So, is there an easy way to do that without heavy coding?
Connecting to BigQuery is very well documented in Preset's Superset user documentation here: https://docs.preset.io/docs/big-query-database
Following the steps mentioned at the official Google Cloud page here, you need to do the following:
Install pybigquery
pip install pybigquery
Download your Google Cloud authorization json key file
From your terminal instance, set GOOGLE_APPLICATION_CREDENTIALS env. var to the path of your json key file
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/[json_file].json"
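As a quick sanity check before touching Superset, you can confirm the key file works from the same terminal; this assumes the google-cloud-bigquery client (a pybigquery dependency) is installed, and the path is a placeholder:
# Point the client at your key file and run a trivial query
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/[json_file].json"
python -c "from google.cloud import bigquery; print(list(bigquery.Client().query('SELECT 1').result()))"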
Elbehery is right. I don't have enough rep to comment, but I wanted to note that Apache has created docs for this.
FWIW, I couldn't use the UI for importing credentials.json so I set it as an env var in my Docker image. Here are the commands and steps I run locally:
# Setup virtual environment (exit by typing "deactivate")
pip3 install virtualenv
python3 -m virtualenv ./.venv
source ./.venv/bin/activate
# Download Superset
git clone https://github.com/apache/superset.git
cd superset/
# Create a copy of your credentials for docker to use
cp ~/.config/gcloud/application_default_credentials.json docker/credentials.json
echo "GOOGLE_APPLICATION_CREDENTIALS=docker/credentials.json" >> docker/.env-non-dev
# Run Superset
docker-compose -f docker-compose-non-dev.yml pull
docker-compose -f docker-compose-non-dev.yml up
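Once the containers are up, a quick check before opening the browser (Superset serves a /health endpoint that should return OK):
# Should print "OK" when Superset is ready
curl -s http://0.0.0.0:8088/health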
Now that Superset is running locally:
Visit http://0.0.0.0:8088/ in your web browser
In the top right of the UI, click the +DATABASE button (or + > Data > Connect Database)
In the popup window, click Supported Databases, then (at the bottom) Other
Set DISPLAY NAME: BigQuery
Set SQLALCHEMY URI: bigquery://my_project_id
Click Test Connection
Click Connect
Now that BigQuery is integrated into Superset:
At the top of the page, click SQL Lab > SQL Editor
Most of the documentation examples are for Linux:
https://docs.bitnami.com/installer/apps/redmine/#how-to-upgrade-redmine
I would like to see one for the Windows Server variety. I am trying to upgrade 3.2.2 to 3.3.1. The client wants to keep it on local Windows only. No cloud.
Bitnami developer here. Thanks for your comment; we will update the Bitnami documentation soon to add more Windows guides.
I have been able to migrate Redmine 3.2.2 to 3.3.1; these are the steps you have to follow:
Go to manager-windows (C:\Bitnami\redmine-3.2.2-0\manager-windows.exe) and stop all the services. Then start MySQL again.
Do a dump of your MySQL database. You can use the use_redmine console to do this (C:\Bitnami\redmine-3.2.2-0\use_redmine.exe) and execute the following:
mysqldump -u root -p --databases bitnami_redmine > backup.sql
Save that backup and download the latest version of the Redmine stack installer (3.3.1-0): Bitnami Redmine installers
Install it on your machine and open manager-windows (C:\Bitnami\redmine-3.3.1-0\manager-windows.exe). Stop all services and start the MySQL service again to restore the backup.
Start the use_redmine console (C:\Bitnami\redmine-3.3.1-0\use_redmine.exe)
Execute the following in the use_redmine console:
mysql -u root -p
Password: ****
mysql> drop database bitnami_redmine;
mysql> create database bitnami_redmine;
mysql> grant all privileges on bitnami_redmine.* to 'bn_redmine'@'localhost' identified by 'DATABASE_PASSWORD';
Restore the new database:
mysql -u root -p bitnami_redmine < /path/to/your/backup.sql
Edit the Redmine configuration file to update the database user password (the same that you set previously) at
C:\Bitnami\redmine-3.3.1-0\apps\redmine\htdocs\config\database.yml:
production:
  adapter: mysql2
  database: bitnami_redmine
  host: localhost
  username: bn_redmine
  password: "DATABASE_PASSWORD"
  encoding: utf8
In the use_redmine console migrate the database to the latest version:
bundle exec rake db:migrate RAILS_ENV=production
After this, you should be able to start all the services again in C:\Bitnami\redmine-3.3.1-0\manager-windows.exe and log in to the application as always.
I have a Django app live on Heroku. I'm migrating it to Azure, taking advantage of the $120K/yr credit they recently offered me. Here's what I've done so far:
i) I created an Azure VM with Ubuntu (Standard_D1).
ii) I installed postgresql on it (my db of choice)
iii) I pulled my Heroku app's files from my github onto the Azure VM.
iv) I created a postgres DB on the Azure VM, and then ran syncdb to create the required tables.
v) I tweaked postgresql.conf and pg_hba.conf to cater to some tuning requirements and such.
vi) I took a backup from my Heroku app's dashboard, and downloaded it. This backup file's name is a random uuid, without a file format (e.g. f0af6457-1a24-47d0-881c-434f9bef7c92).
vii) I'm now gearing up to use pg_restore to load the backup into the newly created and synced app on the Azure VM.
Does all this sound about right so far? I have 3 questions:
1) Will pg_restore work with the backup I got off Heroku? This backup doesn't have a file format at all; whereas I'm under the impression it has to be a .tar archive to be compatible with pg_restore.
2) My database is called mydbname. The data backup is saved at /datadrive/backup/filename. Thus, in my case is the correct pg_restore command something like: pg_restore -d mydbname /datadrive/backup/filename?
3) Once I successfully load the correct data in my Azure app, the final step, in my opinion, is to route traffic going to the Heroku app instead to the Azure app. For that, I'll tweak DNS entries. Am I missing anything else here, in your opinion?
Essentially the extension shouldn't matter; your restore should work, but frankly I haven't tested it myself with a Heroku backup.
However, what I would suggest is to make it a valid .dump file:
curl -o latest.dump "$(heroku pg:backups public-url --app <yourappname>)"
This should give you a valid .dump file, though it's not any different from the backup you already have.
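For questions 1 and 2, a restore sketch; the --clean, --no-acl, and --no-owner flags are commonly needed with Heroku dumps because they reference Heroku-managed roles, though your exact options may vary:
# A pg_dump custom-format archive reports as "PostgreSQL custom database dump"
file /datadrive/backup/filename
# Restore it into your database
pg_restore --verbose --clean --no-acl --no-owner -U [username] -d mydbname /datadrive/backup/filename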
We started using Redmine at work. I know it uses MySQL as the database, and Apache 2 as the web server. How can Redmine be properly backed up so that it can be reloaded quickly when anything goes wrong?
This will do just fine:
mysqldump --single-transaction --user=user_name --password=your_password redmine_database > backup.sql
It will dump the entire contents of the redmine_database to the backup.sql file.
Update:
As far as backing up "apache", as I state in my comment below - you don't need or want to back up your Apache installation. If you ever need to recover your system, Apache would need to be reinstalled as with any other application. If you are referring to the actual files and directories within your Redmine installation, those also don't need to be backed up, except for the files/ directory, which contains user-uploaded files. You can back up your entire Redmine installation (to be safe) with the following command:
tar czvf redmine_backup.tar.gz /path/to/redmine/installation
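The matching restore, a sketch assuming the same database name and credentials:
# Load the SQL dump back into the database
mysql --user=user_name --password=your_password redmine_database < backup.sql
# GNU tar strips the leading / on create, so extract from the root to put the
# files back in place
tar xzvf redmine_backup.tar.gz -C /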
Run it as a VM (JumpBox has a quickstartable one, I believe) then periodically pause or shutdown the VM and backup/copy the entire virtual disk.
I know this doesn't help with an existing installation, but it's what I'd recommend to anyone planning backups before they implement. That's not meant to be snide, just helpful to anyone else reading this thread.
Bitnami apps are self-contained, so another option, if you can afford some downtime, is simply to shut down the server and zip the directory contents ... You may want to do this maybe once a week, in addition to your mysqldump backups. This way you also capture any changes that may have happened in Apache, etc.
Read the Redmine user guide (look at the bottom).
Also, don't forget to backup the attached files.
Redmine backups should include:
Data (stored in your Redmine database)
Attachments (stored in the files directory of your Redmine install)
Here is a simple shell script that can be used for daily backups (assuming you're using a MySQL database):
# Database
/usr/bin/mysqldump -u <username> -p<password> <redmine_database> | gzip > /path/to/backup/db/redmine_`date +%y_%m_%d`.gz
# Attachments
rsync -a /path/to/redmine/files /path/to/backup/files
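To run it daily you could drop it into cron; a hypothetical entry (the script path is a placeholder):
# Run the backup script every day at 02:00
0 2 * * * /path/to/redmine_backup.sh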
Redmine sets table charset as "latin1".
So, if you use a non-latin1 charset (CJK in UTF-8 or something), you should pass the following options to the backup script:
mysqldump -u root -p --default-character-set=latin1 --skip-set-charset bitnami_redmine -r backup.sql
It skips the "set charset blah-blah-blah" statements in the SQL dump, so you get a clean dump (i.e., a dump without charset interpretation).
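When restoring such a dump, pass the same charset option so MySQL does not re-interpret the bytes; a sketch:
# Load the dump back without charset conversion
mysql -u root -p --default-character-set=latin1 bitnami_redmine < backup.sql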
By the way, you have to back up the files directory as well; it holds all uploaded files. I installed the Bitnami Redmine stack on Windows.
For MySQL, I use MySQLAdmin to schedule a database backup every day.
And I use aceBackup to automatically back up database dump files and Redmine uploaded files to a remote FTP server.
When something goes wrong with the server, I can just reinstall the Bitnami Redmine stack, import the previously dumped database file, and then overwrite Redmine's files directory with the backup files.
And that's OK.
This separates the program (Bitnami Redmine stack) and the data (database & uploaded files) perfectly.