I uninstalled and reinstalled Wamp. When I did this, I lost all my databases on localhost. I did not think about it, and did not imagine that it would delete them all.
Is there a way to recover those local databases?
If you have a backup of the DB, that would be the easy way out!
If not, you may have to resort to a data recovery tool (Wikipedia has a list of file recovery software).
MySQL stores each DB in the file system; when you drop a DB, you are deleting that DB's files from your server's file system.
Search the MySQL config file for the data directory to know exactly where your files were.
Read this answer; it explains where to find that config file and your MySQL data directory.
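For example, you can ask the (re)installed server where its data directory lives, or check my.ini yourself (paths vary by WAMP version):
mysql -u root -p -e "SHOW VARIABLES LIKE 'datadir';"
On a typical WAMP install the config file is something like C:\wamp\bin\mysql\mysqlx.y.z\my.ini; look for the datadir line in it.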
Once you know the directory, you can try a recovery program to restore the deleted files.
If you succeed, you may want to read this answer, which explains the steps for recovering a DB from its files in MySQL and describes MySQL's file structure.
Hope your files are still there! And do a daily backup from now on.
One of my clients regularly saves the contents of some HDFS folders to an NFS volume. Sometimes during the copy I receive the following error:
copying /folder/foo
copyToLocal: Input/output error
I thought that some of the files inside the foo folder might be corrupt, but a check with hdfs fsck didn't report anything strange. There are many, many files in the foo folder, so it's not feasible to search manually for something odd. I tried to enable debug mode for the copyToLocal command, but it gives no clue about any kind of error. How can I debug this issue?
I have a gut feeling that there are some networking issues on the NFS side, but I also don't know how to debug that kind of problem.
The HDFS folders are also very large; I don't know if this could cause any additional problems.
We are in a Kerberos environment and we're executing commands after the kinit as the hdfs superuser.
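For context, the commands being run look something like this (the keytab, principal, and paths are placeholders):
kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs@EXAMPLE.COM
hdfs dfs -copyToLocal /folder/foo /mnt/nfs/backup/foo
The debug attempt was along the lines of raising the client log level for the same command:
HADOOP_ROOT_LOGGER=DEBUG,console hdfs dfs -copyToLocal /folder/foo /mnt/nfs/backup/foo 2> copy_debug.log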
P.S. SO probably isn't the right place to ask this question; feel free to redirect me to the right website :)
I have successfully set up and used Django with MySQL on my local machine, but if I now put my project on, say, GitHub, what would the requirements be for another person to be able to run it?
I am asking this because my project uses a database that is stored on my local machine, and I did not upload the database when I uploaded the project to GitHub. With sqlite3 there is a database file inside the project itself, but this does not happen with MySQL, whose database is stored in a different location.
I mean Django accesses the database from a different location (/var/lib/mysql), and when I try to copy the database from there into the project folder and specify its location in settings.py, I get an access-denied error.
So how can I solve this?
You would typically provide a seed file for others to use. They will create a database on their own systems and use your seed file to get started with the project.
It should not be necessary to copy the database files. Also, you should not just copy the MySQL directory like that. If you copy the whole directory then you might replace what somebody already has on their system, but if you copy only some of the files then you might be missing things like the MySQL user accounts. Besides, that is a backup procedure, not a deployment or distribution procedure.
For somebody else to get started with your project the normal process is:
Manually create the appropriate MySQL user and database, or provide a script to automate it (see the sketch after this list)
Run migrations: python manage.py migrate
Import initial data:
This can be with fixtures: python manage.py loaddata my_data.json
Or with a custom management command: python manage.py load_my_data
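For example, a setup script could look like this (the database name, user, and password are placeholders; match them to your settings.py):
mysql -u root -p <<'SQL'
CREATE DATABASE mysite CHARACTER SET utf8;
CREATE USER 'mysite_user'@'localhost' IDENTIFIED BY 'change-me';
GRANT ALL PRIVILEGES ON mysite.* TO 'mysite_user'@'localhost';
FLUSH PRIVILEGES;
SQL
python manage.py migrate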
However, if you really need to provide somebody with an almost-ready database, you can use mysqldump, which will produce an SQL text file; but the other person still needs to create the user account manually.
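For example (mysite is a placeholder database name):
mysqldump -u root -p mysite > mysite_dump.sql
mysql -u mysite_user -p mysite < mysite_dump.sql
The first command runs on your machine; the second runs on the other person's machine after they have created the database and user.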
To add to Himank's answer: if you need to provide additional data for the database, you can put your fixture data in a fixtures folder. The other person will then be able to load it manually with a command, or even run a script to populate the database with initial data.
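For example (the app name blog is hypothetical):
python manage.py dumpdata blog --indent 2 > blog/fixtures/initial_data.json
python manage.py loaddata initial_data.json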
I'm trying to develop a Django website with Heroku. Having no previous experience with databases (except the sqlite3 one from the tutorial), it seems to me a good idea to have the following file structure:
Projects
'-MySite
|-MySite
'-MyDB
I'm finding it hard to figure out how to do it, with psql commands preferring to put the databases in some obscure directory instead. Perhaps it's not such a good idea?
Eventually I want to be able to test and develop my site (it'll be just a blog for a while, I'm still learning) locally (ie. add a post, play with the CSS) and sync with the Heroku repository, but I also want to be able to add posts via the website itself occasionally.
The underlying data files (MyDB) have nothing to do with your project files and should not live under your project.
EDIT
Added two ways to sync your local database with the database on the Heroku server.
1) export-import
This is the simplest way; do the following steps every now and then:
make an export on the Heroku server by using the pg_dump utility
download the dump file
import the dump into your local database by using the psql utility
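A sketch of those steps, assuming the Heroku CLI is installed (the local database name is a placeholder):
pg_dump "$(heroku config:get DATABASE_URL)" > heroku_dump.sql
psql -d mysite_local -f heroku_dump.sql
The first command dumps straight over the wire, combining the export and download steps.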
2) replication
A more sophisticated way of keeping your local DB in sync all the time is replication. It is used in professional environments and is probably overkill for you at the moment. You can read more about it here: http://www.postgresql.org/docs/9.1/static/high-availability.html
I recently deployed a couple of web applications built with Django (on WebFaction).
These are some of the first projects of this scale that I am working on, so I wanted to know what an effective backup strategy is for maintaining backups both on WebFaction and in an alternate location.
EDIT:
What do I want to back up?
The database and user-uploaded media (my code is managed via git).
I'm not sure there is a one-size-fits-all answer, especially since you haven't said what you intend to back up. My usual MO:
Source code: use source control such as svn or git. This means that you will usually have dev, deploy, and repository copies of the code (especially with a DVCS).
Database: this also depends on usage, but usually:
Have a dump_database.py management command that introspects settings and, for each DB, outputs the correct dump command (taking into consideration the DB type and the database name).
Have a cron job on another server that connects to the application server through ssh, executes the dump-database management command, tars the SQL file with the DB name + timestamp as the file name, and uploads it to another server (Amazon's S3 in my case). A sketch follows this list.
Media files (e.g. user uploads): keep a cron job on another server that ssh's into the application server and rsyncs the uploads to yet another server.
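A sketch of that dump-and-ship cron job (the host, paths, and bucket are placeholders; dump_database is the custom command described above, assumed to print the dump command for each DB):
#!/bin/bash
# Runs on the backup server, e.g. from cron: 0 3 * * * /opt/backup/dump_remote_db.sh
STAMP=$(date +%Y%m%d%H%M%S)
# dump_database prints the right dump command; sh executes it and the SQL comes back over ssh
ssh deploy@app.example.com 'cd /srv/mysite && python manage.py dump_database | sh' > "mysite-$STAMP.sql"
tar czf "mysite-$STAMP.sql.tar.gz" "mysite-$STAMP.sql"
aws s3 cp "mysite-$STAMP.sql.tar.gz" s3://my-backup-bucket/db/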
The thing to keep in mind, though, is the intended purpose of the backup.
If it's for accidental data loss (be it disk failure, a bug, or SQL injection) or simply for restoring, you can keep those cron jobs on the same server.
If you also want to be safe in case the server is compromised, you cannot keep the remote backup credentials (ssh keys, Amazon secret, etc.) on the application server! Otherwise an attacker would gain access to the backup server as well.
I maintain a couple of low-traffic sites that have a reasonable amount of user-uploaded media files and semi-big databases. My goal is to back up all the data that is not under version control in a central place.
My current approach
At the moment I use a nightly cronjob that uses dumpdata to dump all the DB content into JSON files in a subdirectory of the project. The media uploads are already in the project directory (in media).
After the DB is dumped, the files are copied with rdiff-backup (which makes an incremental backup) to another location. I then download the rdiff-backup directory on a regular basis with rsync to keep a local copy.
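For reference, the nightly job is along these lines (all paths and host names are placeholders):
#!/bin/bash
# Nightly cron on the application server
cd /srv/mysite
python manage.py dumpdata --indent 2 > backups/db_dump.json
rdiff-backup /srv/mysite backup-host::/var/backups/mysite
And the periodic local copy:
rsync -az backup-host:/var/backups/mysite/ ~/backups/mysite/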
Your Ideas?
What do you use to back up your data? Please post your backup solution, whether you only have a few hits per day on your site or you maintain a high-traffic one with sharded databases and multiple file servers :)
Thanks for your input.
Recently I found this solution called Django-Backup, and it has worked for me. You can even combine the task of backing up the databases or media files with a cronjob.
My backup solution works the following way:
Every night, dump the data to a separate directory. I prefer to keep data dump directory distinct from the project directory (one reason being that project directory changes with every code deployment).
Run a job to upload the data to my Amazon S3 account and to another location using rsync.
Send me an email with the log.
To restore a backup locally, I use a script to download the data from S3 and load it on my machine.
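A sketch of the upload and restore steps (the bucket, host, and paths are placeholders; the S3 part assumes the AWS CLI):
# nightly, after the dump
aws s3 sync /var/backups/mysite/ s3://my-backup-bucket/mysite/
rsync -az /var/backups/mysite/ user@otherhost:/backups/mysite/
# restore locally
aws s3 sync s3://my-backup-bucket/mysite/ ./restore/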