Mutt: download only current folder using offlineimap - mutt

Background:
I use offlineimap to download emails and the sidebar to switch between folders. Previously, I hard-coded the macro to synchronize only the "INBOX" folder. Here is the relevant part of my .muttrc:
macro index o "<sync-mailbox><shell-escape>offlineimap -qf INBOX<enter><sync-mailbox>" "run offlineimap to sync inbox"
Goal:
I'd like to synchronize the currently opened folder. Essentially, I want a variable containing the name of the current folder, so I can replace the hard-coded "INBOX" with it. However, I haven't found a way to get the current folder's name, and the folder-hook approach does not seem to work.

As a workaround, use folder hooks that, on entering a folder, rebind the macro to sync only that folder. For example:
folder-hook . 'macro index o "<shell-escape>offlineimap -qo >/dev/null 2>&1 &<enter><sync-mailbox><refresh>"'
folder-hook =INBOX$ 'macro index o "<shell-escape>offlineimap -qo -f INBOX >/dev/null 2>&1 &<enter><sync-mailbox><refresh>"'
folder-hook =INBOX.Sent$ 'macro index o "<shell-escape>offlineimap -qo -f INBOX.Sent >/dev/null 2>&1 &<enter><sync-mailbox><refresh>"'
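Maintaining one hook per mailbox by hand gets repetitive; the hooks can also be generated from the Maildir tree and sourced from .muttrc. A minimal sketch, assuming a flat Maildir layout where each mailbox directory contains cur/new/tmp and the offlineimap folder name matches the directory name (the gen_mutt_hooks name and the ~/Mail default are made up for illustration):

```shell
#!/bin/sh
# Emit one folder-hook per Maildir mailbox so "o" syncs only the open folder.
# Assumption: each mailbox is a directory under $1 containing cur/new/tmp.
gen_mutt_hooks() {
    maildir=$1
    for d in "$maildir"/*/; do
        box=$(basename "$d")
        [ -d "$d/cur" ] || continue   # skip directories that aren't Maildirs
        printf 'folder-hook =%s$ '\''macro index o "<shell-escape>offlineimap -qo -f %s >/dev/null 2>&1 &<enter><sync-mailbox><refresh>"'\''\n' \
            "$box" "$box"
    done
}
```

Redirect its output to a file and `source` that file from .muttrc.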

Related

cd into directories with a specific pattern, read file content and print their content to a text file in linux

I am trying to automate a report generation process.
I need to enter directories matching a specific pattern and then read files from them. The directory names follow the pattern PYYYYMMDD001, PYYYYMMDD002 and so on. I need to enter each matching directory and read data from each file within it, but I am unable to do so, as I am making a mistake when defining the pattern. Here is the command I am using:
TODAY=$(date +"%m%d%Y")
cd /home/user/allFiles
for d in "P"$TODAY*
do
(cd $d && grep -o '-NEW' *_$TODAY*_GOOD* | uniq -c| sed 's/\|/ /'|awk '{print $1}' > /home/user/new/$TODAY"Report.txt" )
done
When I try to execute it, I get the error P02192017* [No such file or directory].
The list of directories is: P02192017001, P02192017002, P02192017003, P02192017004, P02192017005, P02192017006, P02192017007, P02192017008
Any kind of help towards this would be highly appreciated.
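One likely cause: when the glob matches nothing, the unexpanded pattern P02192017* is handed to cd, producing exactly that error; separately, the > redirection overwrites the report on every pass through the loop. A hedged rework under those assumptions (the report_new name and fixture paths are made up; the original sed 's/\|/ /' step is dropped):

```shell
#!/bin/bash
# Count '-NEW' markers in today's P<MMDDYYYY>NNN directories, one count per line.
# Usage: report_new <base-dir> <report-file> [MMDDYYYY]
report_new() {
    local base=$1 report=$2 today=${3:-$(date +%m%d%Y)}
    shopt -s nullglob                 # unmatched globs expand to nothing
    : > "$report"                     # truncate once, then append per directory
    local d
    for d in "$base"/P"$today"*/; do
        (
            cd "$d" || exit
            # '--' stops option parsing: the pattern '-NEW' starts with a dash
            grep -o -- '-NEW' *_"$today"*_GOOD* | uniq -c | awk '{print $1}'
        ) >> "$report"
    done
}
```

Quoting every expansion also keeps the loop safe if a path ever contains spaces.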

Shell script to create directories and files from a list of file names

I'm (still) not a shell-wizard, but I'm trying to find a way to create directories and files from a list of file names.
Let's take this source file (source.txt) as an example:
README.md
foo/index.html
foo/bar/README.md
foo/bar/index.html
foo/baz/README.md
I'll use this command to remove empty lines and trim useless spaces:
$ more source.txt | sed '/^$/d;s/^ *//;s/ *$//'
It will give me this list:
README.md
foo/index.html
foo/bar/README.md
foo/bar/index.html
foo/baz/README.md
Now I'm trying to loop over every line and create the related file (if it doesn't already exist), along with its parent directories.
How could I do this?
Ideally, I would put this script in an alias to quickly use it.
As always, posting a question brings me to the end of the problem...
I came to a satisfying solution, using dirname and mkdir -p in a for .. in loop:
for i in $(sed '/^$/d;s/^ *//;s/ *$//' source.txt); do
  mkdir -p "$(dirname "$i")"
  touch "$i"
done
This one-line command will:
read the file names list
create directories
create empty files in their own directory
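Since an alias can't easily hold a multi-command loop, a shell function dropped into ~/.bashrc may be more convenient. A sketch along the same lines, using a while read loop so that file names containing spaces also survive (the touchtree name is made up):

```shell
# Create every file listed in the given source file, with parent directories.
# Blank lines are skipped; leading/trailing spaces are trimmed first.
touchtree() {
    sed '/^$/d;s/^ *//;s/ *$//' "$1" |
    while IFS= read -r line; do
        mkdir -p "$(dirname "$line")"   # ensure parent directories exist
        touch "$line"                   # create the file if missing
    done
}
```

Usage: `touchtree source.txt` from the directory where the tree should be created.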

Bash script to change file extension using regex

I have a lot of files I've copied over from my iPhone's file system. To start with they were mp3 files, but an app on the iPhone changed their names to some random stuff that looks like:
1c03e04cc1bbfcb0c1237f57f1d0ae2e.mp3?extra=f7NhT68pNkmEbGA_I1WbVShXQ2E2gJAGBKSEyh3hf0hsbLB1cqnXDuepYA5ubcFm_B3KSsrXDuKVtWVAUh_MAPeFiEHXVdg
I only need to remove the part of the file name after mp3. Please give me a script - there are more than 600 files, so renaming them manually is impossible.
You can use the rename command:
rename "s/mp3\?.*/mp3/" *.mp3*
#!/bin/bash
shopt -s nullglob
for F in *.mp3\?*; do
echo mv -v -- "$F" "${F%%.mp3\?*}.mp3"
done
Save it to a script like script.sh, then run it as bash /path/to/script.sh in the directory where the files exist.
Remove the echo once you've confirmed the output is correct.
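The `${F%%.mp3\?*}` expansion strips the longest suffix matching `.mp3?*` (the `\?` makes the question mark literal), and the script then re-appends a clean `.mp3`. A quick check of what it produces on a shortened version of the example name:

```shell
# ${F%%pattern} removes the longest suffix of $F matching pattern.
F='1c03e04cc1bbfcb0c1237f57f1d0ae2e.mp3?extra=f7NhT68pNkmEbGA'
echo "${F%%.mp3\?*}.mp3"
# → 1c03e04cc1bbfcb0c1237f57f1d0ae2e.mp3
```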

Mirror only files having specific string in file path

I'm trying to mirror only those branches of a directory tree that contain a specific directory name somewhere within the branch. I've spent several hours trying different things to no avail.
A remote FTP site has a directory structure like this:
image_db
    movies
        v2
            20131225
                xyz
                    xyz.jpg
            20131231
                abc
                    abc.jpg
            AllPhotos   <-- this is what I want to mirror
                xyz
                    xyz.jpg
                abc
                    abc.jpg
        v4
            (similar structure to 'v2' above, contains 'AllPhotos')
        ...
    tv_shows
        (similar structure to 'movies', contains 'AllPhotos')
    other
        (different paths, some of which contain 'AllPhotos')
    ...
I am trying to create a local mirror of only the 'AllPhotos' directories, with their parent paths intact.
I've tried variations of this:
lftp -e 'mirror --only-newer --use-pget-n=4 --verbose -X /* -I AllPhotos/ /image_db/ /var/www/html/mir_images' -u username,password ftp.example.com
...where the "-X /*" excludes all directories and "-I AllPhotos/" includes only AllPhotos. This doesn't work, lftp just copies everything.
I also tried variations of this:
lftp -e 'glob -d -- mirror --only-newer --use-pget-n=4 --verbose /image_db/*/*/AllPhotos/ /var/www/html/mir_images' -u username,password ftp.example.com
...and lftp crunches away at the remote directory structure without actually creating anything on my side.
Basically, I want to mirror only those files that have the string 'AllPhotos' somewhere in the full directory path.
Update 1:
If I can do this with wget, rsync, ftpcopy or some other utility besides lftp, I welcome suggestions for alternatives.
Trying wget didn't work for me either:
wget -m -q -I /image_db/*/*/AllPhotos ftp://username:password@ftp.example.com/image_db
...it just gets the whole directory structure, even though the wget documentation says that wildcards are permitted in -I paths.
Update 2:
After further investigation, I am coming to the conclusion that I should probably write my own mirroring utility, although I still suspect I am approaching lftp the wrong way, and that there's a way to make it mirror only files that have a specific string in the absolute path.
One solution:
curl -s 'ftp://domain.tld/path' |
awk '/^d.*regex/{print $NF}' |
xargs wget -m ftp://domain.tld/path/
Or using lftp:
lftp -e 'ls; quit' 'ftp://domain.tld/path' |
awk '/^d.*regex/{print $NF}' |
xargs -I% lftp -e "mirror -e %; quit" ftp://domain.tld/path/
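In both pipelines the awk filter keys off the leading d of a long-format listing line (a directory entry) and prints the last field, the entry name. A quick demonstration on made-up `ls -l`-style listing lines, with AllPhotos standing in for the regex:

```shell
# Keep only directory entries whose listing line matches the pattern,
# and print the final field (the entry name).
printf '%s\n' \
  'drwxr-xr-x 2 ftp ftp 4096 Dec 25 2013 AllPhotos' \
  '-rw-r--r-- 1 ftp ftp 1024 Dec 25 2013 AllPhotos.txt' \
  'drwxr-xr-x 2 ftp ftp 4096 Dec 25 2013 20131225' |
awk '/^d.*AllPhotos/{print $NF}'
# → AllPhotos
```

Note this only looks at names in the top listing; names containing spaces would also defeat the $NF trick.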

How can I move all folders in a directory to subdirectories based on their value?

A legacy web application I've inherited control over seems to have hit the maximum number of subdirectories in a folder on my web server. Whenever an article is created in the system, its static content is placed in a subdirectory of the document root matching the pattern /uploads/story/{STORY_ID}/. But now the system is unable to create any new directories in the /uploads/story/ folder.
I'd like to address this in 2 steps, but I'm not sure how to run the necessary linux commands to achieve this. Would you be able to help?
As a temporary fix to buy me more time to implement a better directory structure, I'd like to archive the static content of all stories with a STORY_ID of less than 1000. These should be moved from /uploads/story/ to /uploads/story_archive/.
I'll change the upload path to be /uploads/story/{THOUSANDS}/{STORY_ID}/ in the code, but will need to be able to move all folders within /uploads/story/ into this format. e.g. /uploads/story/65312/ would become /uploads/story/65/65312/. How can I do this?
Edit
Fixing (1) was as simple as running:
$ cd /path/to/uploads/
$ mkdir story_archive
$ for i in {1..999}; do mv story/$i story_archive/; done
Given that you are sure /uploads/story/* will give you a number in the * part, you can do the following (note: back up the whole thing first, just in case):
# update this based on your actual directory
path_to_fix=/uploads/stories
# move the directories out of the way so they don't get mixed up
mv $path_to_fix $path_to_fix/../temp
mkdir $path_to_fix > /dev/null 2>&1
# get all directories to be moved
dirs=$(ls $path_to_fix/../temp)
# for each of them
for d in $dirs; do
# get the basename, which is store_id
id=$(basename $d)
# divide by 1000
sub=$((id / 1000))
# create the appropriate directory
mkdir $path_to_fix/$sub > /dev/null 2>&1
# move the original directory to that sub-directory
mv $path_to_fix/../temp/$d $path_to_fix/$sub
done
# cleanup
rm -Rf $path_to_fix/../temp
For the temporary fix you can use a for loop and a glob:
for path in /uploads/story/*; do
storyid=${path##*/}
if [[ ${storyid} -lt 1000 ]]; then
mv "${path}" /uploads/story_archive/${storyid}
fi
done
That will iterate over all entries in the /uploads/story/ directory, with the full path in the $path variable.
The bash construct ${variable##pattern} removes the longest substring matching pattern from the left-hand side of variable; in this case all the leading directories, leaving just the story id stored in the variable.
We then check whether the story id is less than 1000 and, if so, move it to the story archive.
The next bit.
for path in /uploads/story/*; do
storyid=${path##*/}
if [[ $storyid -ge 1000 ]]; then
thous=${storyid%%???}
[[ -d /uploads/story/$thous/ ]] || mkdir /uploads/story/$thous/
mv $path /uploads/story/$thous/
fi
done
OK, here again we iterate over all the directories and pluck off the story id. This time, though, we make sure that the story id is greater than or equal to 1000, and use %%??? to remove the last three characters from storyid (the same as the ## trick, but from the right-hand side of the variable).
Then we check whether the thousands directory exists, create it if it doesn't, and move the directory over.
You could even do both tasks in one sweep:
for path in /uploads/story/*; do
storyid=${path##*/}
if [[ $storyid -lt 1000 ]]; then
mv "${path}" /uploads/story_archive/${storyid}
else
thous=${storyid%%???}
[[ -d /uploads/story/$thous/ ]] || mkdir /uploads/story/$thous/
mv $path /uploads/story/$thous/
fi
done
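A quick sanity check that the %%??? trim agrees with integer division by 1000, which is what the thousands bucket is meant to be (this only holds for ids of four or more digits; a three-digit id would be trimmed to an empty string, which is why the sub-1000 ids are archived first):

```shell
# ${storyid%%???} strips the longest suffix matching ??? (exactly 3 chars).
storyid=65312
thous=${storyid%%???}
echo "$thous $((storyid / 1000))"
# → 65 65
```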