Joomla backup to run locally - Joomla 2.5

Can a downloaded backup of a Joomla website be run on localhost? If so, what changes do I need to make locally?
I placed the unzipped backup folder under C:\xampp\htdocs\joomla. Now when I browse to http://localhost/joomla it gives me the following error.
Database Error: Unable to connect to the Database: Could not connect to MySQL.
Please note that I've made no changes to the backup; I just placed the unzipped backup, downloaded via FTP, into htdocs.

You will also need to download a copy of the website's database and install it locally. If this is new to you, check out Akeeba Backup; it will simplify your life when moving sites between different servers. It bundles the database and site files into a nice, neat zipped package for you.

Chances are the database settings are incorrect. Unless you have created the exact same database name and user on localhost, you'll need to edit configuration.php to straighten that out.
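Concretely, restoring the dump under XAMPP might look like the sketch below. The database name joomla_local and the dump file joomla_backup.sql are placeholders; the mysql commands are commented out so you can substitute your own names before running them.

```shell
# Placeholder names -- adjust to your own backup.
DB_NAME="joomla_local"
DUMP="joomla_backup.sql"

# Run from a shell where XAMPP's mysql client is on PATH
# (XAMPP's root user has an empty password by default):
# mysql -u root -e "CREATE DATABASE $DB_NAME CHARACTER SET utf8"
# mysql -u root "$DB_NAME" < "$DUMP"

# Then edit configuration.php so these fields match the local setup, e.g.:
#   public $host = 'localhost';
#   public $user = 'root';
#   public $password = '';
#   public $db = 'joomla_local';
```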

Related

Website deployed on AWS EC2 Using FileZilla isn't showing live changes

My organization is using AWS and I connected to it using FileZilla.
Server Folder Path:
Source
-> Website Folder
---> Client
------> All files visible here which are things from the create-react-app
Specifically:
Build, Node_modules, Public, Src, and other files including Makefile
On my local computer I used npm install to gather the latest dependencies, then I made some changes, ran npm run build when I finished, and copied over all folders, including the new build folder.
I used drag and drop into the filezilla window.
Now when I check the live website for changes, it still shows the old information even though it's updated on my local build. I don't understand.
Is there another way to deploy this properly? I don't see why I can't drag and drop the source files over. It's just static changes.
Things I tried:
Clearing Cache
Using different browser
Incognito
Reconnecting to server via filezilla
I figured it out. Only took forever.
The solution was to connect using an SSH client and transfer the files over directly.
I used PuTTY to do this.
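For anyone on the same path, the SSH/scp route can be sketched like this. The host, key file, and remote path are placeholders for your own EC2 setup, so the transfer commands are commented out until you fill them in:

```shell
# Placeholders -- substitute your EC2 user/host, key file, and remote path.
HOST="ec2-user@your-ec2-public-dns"
KEY="$HOME/.ssh/your-key.pem"
REMOTE="/path/to/Source/Website Folder/Client"

# Uncomment and run once the placeholders above are filled in:
# ssh -i "$KEY" "$HOST" "rm -rf '$REMOTE/build'"   # drop the stale build on the server
# scp -i "$KEY" -r build "$HOST:'$REMOTE/'"        # upload the freshly built one
```

PuTTY users get the same effect with pscp in place of scp.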

Django - share media files during development

I want developers to use the same development database while developing a Django project, so I've created a remote server with Ubuntu 18.04, created a Postgres database, and allowed remote access.
The problem is that pictures are missing - the media directory is in .gitignore because of the production server. How can I make the media shared (and editable) so any developer can see any image and assign images to objects?
I was thinking about sharing it through git, or storing the media on the production server where Postgres is, but I'm not sure how, or whether that's the best way to do this.
Funny setup...
2 thoughts:
Shared DB: I'm sure you have your reasons, but why are you running a centralized dev server instead of having developers run locally (gaining access to terminal logs) with a DATABASE configuration that points to the remote DB?
You can commit an empty media directory (or any other); see How can I add an empty directory to a Git repository? for a description of how.
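The usual trick looks like this (the .gitkeep name is just a convention; any placeholder file works):

```shell
# Keep an empty media/ directory in the repo while still ignoring its contents.
mkdir -p media
touch media/.gitkeep   # placeholder file so git has something to track

# In .gitignore: ignore everything inside media/ except the placeholder.
printf '%s\n' 'media/*' '!media/.gitkeep' >> .gitignore
```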

How to download updated sqlite Database from AWS EB? (Django app)

I have deployed my Django app on AWS Elastic Beanstalk, using the sqlite database. The site has been running for a while.
Now I have made some changes to the website and want to deploy again, while keeping the changes the database went through (from form inputs).
But I am not able to find a way to download the updated database or any files. The only version of the app I am able to download is the one I uploaded when I first deployed.
How can I download the current running app version and deploy with the same database?
You may use scp to connect to the remote server and download all your files.
As commented, you should ideally use dumpdata and transfer that file instead of the db or the full site.
However, if your remote system's Python version is not the same as your Django site's, dumpdata may not work (this was my case), so I downloaded the whole site folder back to local and then used dumpdata there.
On AWS Elastic Beanstalk, a Python app's files live in /opt/python/current/app.
Just use scp -r to download the whole site to local.
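Put together, that flow might look like the sketch below. The host and key are placeholders for your EB instance, so the actual commands are commented out until you fill them in:

```shell
# Placeholders -- substitute your EB instance's SSH user/host and key file.
HOST="ec2-user@your-eb-instance-dns"
KEY="$HOME/.ssh/your-key.pem"

# Uncomment once the placeholders are filled in:
# scp -i "$KEY" -r "$HOST:/opt/python/current/app" ./app_backup  # whole deployed site, incl. the sqlite db
# python app_backup/manage.py dumpdata --indent 2 > data.json    # dump with a matching local Python
# python manage.py loaddata data.json                            # load into the new deployment
```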
I believe you have the sqlite db file being packaged from your local machine during eb deploy. If that is the case, then all you need to do is stop including the db.sqlite file in your deployments.
Move the db.sqlite file to S3 or an EBS persistent volume so that it persists.
Use .ebextensions to run database migrations in AWS on top of that db file, just the way Django recommends.
To be on the safe side, you can ssh into your EB environment (which is an EC2 instance) and download the db.sqlite file, just in case.
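A sketch of such an .ebextensions config, assuming the older EB Python platform layout used in this thread (the filename, venv path, and settings are assumptions to adapt):

```yaml
# .ebextensions/01_django.config  (hypothetical filename)
container_commands:
  01_migrate:
    command: "source /opt/python/run/venv/bin/activate && python manage.py migrate --noinput"
    leader_only: true
```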

Azure web app throwing error when syncing with my github repository

While syncing with code I recently pushed to my GitHub repository, my Django Azure Web App throws me the error:
pip can't proceed with requirement 'anyjson==0.3.3 (from -r requirements.txt (line 2))' due to a pre-existing build directory.
location: D:\home\site\wwwroot\env\build\anyjson
This is likely due to a previous installation that failed.
pip is being responsible and not assuming it can delete this.
Please delete it and try again.
How do I manually delete this myself? Like, is there a way I can connect to the Azure Web App via SSH or something? It's not immediately clear what I need to do here.
Go to yoursite.scm.azurewebsites.net.
Login with your Azure credentials.
This will give you a console with various tools, including a file browser you can use to delete the files in question.
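In the Kudu console's CMD prompt, the delete should be something like rd /s /q D:\home\site\wwwroot\env\build\anyjson. As a local sketch of the same fix (clearing a stale pip build directory so the install can be retried), with the paths simulated:

```shell
# Simulate the stale build directory pip complained about.
mkdir -p env/build/anyjson
touch env/build/anyjson/setup.py

# Delete it so pip can retry the install cleanly.
rm -rf env/build/anyjson

# Then re-run the deployment's: pip install -r requirements.txt
```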

Django Deployment Advice

I have a multi-step deployment system setup, where I develop locally, have a staging app with a copy of the production db, and then the production app. I use SVN for version control.
When deploying my production app I have been just moving the urls.py and settings.py files up a directory, deleting my django app directory with rm -rf command and then doing an svn export from the repository which creates a new django app directory with my updated code. I then move my urls.py and settings.py files back into place and everything works great.
My new problem is that I am now storing user uploads in a folder inside of my Django app, so I can't just remove the whole app dir anymore or I would lose all of my users' files.
What do you think my best approach is now? Would svn export --force work, since it should just overwrite all of my changed files? Should I take an entirely new approach? I am open to advice.
You may want to watch this presentation by Jacob. It can help you improve your deployment process.
I use Bitbucket as my repo, and I can simply push on my dev box and run pull/update on the stage/prod box. Actually I don't run them manually; I use Fabric to do them for me :).
You could use rsync or something similar to back up your uploaded files, and use this backup when you deploy your project.
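A minimal rsync sketch of that backup-then-restore step, using placeholder directory names (myapp/uploads stands in for wherever your app stores user files):

```shell
# Placeholder paths -- adjust to your project layout.
APP_DIR=myapp
BACKUP_DIR=uploads_backup

mkdir -p "$APP_DIR/uploads" "$BACKUP_DIR"
printf 'demo\n' > "$APP_DIR/uploads/example.txt"   # stand-in for a user file

# -a preserves permissions/timestamps; the trailing slash copies the contents.
rsync -a "$APP_DIR/uploads/" "$BACKUP_DIR/"

# ... run svn export / redeploy here ...

# Restore the uploads into the fresh checkout.
rsync -a "$BACKUP_DIR/" "$APP_DIR/uploads/"
```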
For deployment you could try to use buildout:
http://www.buildout.org/
http://pypi.python.org/pypi/djangorecipe
http://jacobian.org/writing/django-apps-with-buildout/
For other deployment methods see this question:
Django deployment tools
You can move your files to S3 (http://aws.amazon.com/s3/), so you will never have to worry about moving them with your project.