Upload a variably named file via the curl command line - regex

I am trying to upload a file to a server using the curl command-line tool. The file being uploaded is generated by a build tool, so its name is variable and depends on the version of the project.
For example:
The file to be uploaded is named: puppet-14.1.6-snapshot.zip // where 14.1.6 is the project version
Here is my (working) curl command:
curl -u admin:admin -F file=@"target/puppet-14.1.6-snapshot.zip" -F name="puppet-core-pkg" -F force=true -F install=true http://myserver.com:4502/service.jsp
The above call works perfectly, but I am trying to find an alternative so that I do not need to change the file parameter every time a new version of the project goes out.
I have already tried these two:
file=@"target/puppet-*-snapshot.zip" // Does not work
file=@"target/puppet-[*]-snapshot.zip" // Does not work
Is it possible to use a regex and upload the file that matches it?

I have found a temporary workaround, though it is still not a convincing solution. Here it goes:
package_name=$(ls target | grep puppet-)
curl -u admin:admin -F file=@"target/$package_name" -F name="puppet-core-pkg" -F force=true -F install=true http://myserver.com:4502/service.jsp
I know there is only one file matching puppet-, so package_name contains the name of the file to be uploaded.
Although this works for me, I am leaving this question open for an elegant solution.

Is this the only file in the directory? If so, you can have a variable capture the name of the file in the directory and then do:
curl -u admin:admin -F file=@"$TheFile" # and so on and so forth
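A minimal sketch of that approach, assuming exactly one matching archive sits in target/ (the glob and the array are my addition; everything else is from the question). Bash expands the wildcard before curl runs, so -F receives the real file name:
# bash expands the glob into the actual file name; assumes exactly one match
files=(target/puppet-*-snapshot.zip)
curl -u admin:admin -F file=@"${files[0]}" -F name="puppet-core-pkg" \
     -F force=true -F install=true http://myserver.com:4502/service.jsp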

Related

I need a bash script to download a tar file from a website; the site has multiple files which need to be filtered

I have a situation where I need to use curl/wget to download a tar file from a website, based on user input. If the user specifies a build, I need to download the tar file for that release version. I already have the logic to switch between builds; the question is how I can filter out a particular tar file from the multiple files.
curl -s https://somewebsite/somerepo-folder/os-based.tar | grep os-based* > sample.txt
curl -s https://somewebsite/somerepo-folder/os-based2.tar
The first curl downloads all the files. A regex would help here - how can I combine one with curl?
If there is a mapping between the user input and the tar file name that you can think of, you can do something like this:
userInput=1
# some logic to map user-input with the tar filename to download
tarFileName="os-based$userInput.tar"
wget "https://somewebsite/somerepo-folder/$tarFileName"

rename command on a Debian server

The following worked on my old server (Ubuntu)
rename -n 's/(.*)\/.*\./$1\/$1./' */*
but not on my new server (Debian).
I'm guessing the new server is using the Perl rename. How would one convert the above to work the same with Perl rename? All it was meant to do is rename files in a folder so that the name starts with the name of the parent folder (removing any name before the last dot in the original filename). Thus, include/anything.h would become include/include.h.
The rename command that is part of the util-linux package won't work.
You need to run:
# apt install rename
If you run the following command (on a GNU/Linux system):
$ file "$(readlink -f "$(type -p rename)")"
and the result contains Perl script, ASCII text executable and does not contain ELF, then this seems to be the right tool =)
If not, to make it the default (usually already the case) on Debian and derivatives like Ubuntu:
# update-alternatives --set rename /usr/bin/file-rename
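Before forcing a selection, it can be worth checking what the alternatives system already knows. A standard update-alternatives query (--display makes no changes):
# show the registered alternatives for "rename" and the current selection
update-alternatives --display rename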

Cannot delete file with special characters under Linux

When trying to run a PHP script under Linux, my command fails and I end up with a new file in the folder.
The file is called ");? ?for ($j=0;$j".
It is impossible to delete with rm and impossible to move.
Any ideas, please?
Just an untested idea:
Maybe you can try to delete the whole directory with rm -R folder_name.
You could also add -f: rm -R -f folder_name
Of course, don't forget to save the other files beforehand, but that should be easy as there are just a few.
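If removing the whole directory is too drastic, a hedged alternative is plain shell quoting (the file name below is copied from the question; single quotes keep every character literal):
# -- marks the end of rm's options, so the odd name cannot be misparsed
rm -- '");? ?for ($j=0;$j'
# or let find match part of the name, in case typing it exactly is error-prone
find . -maxdepth 1 -name '*for (*' -delete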

Bash script to change file extension using regex

I have a lot of files I copied over from my iPhone file system. To start with they were mp3 files, but an app on the iPhone changed their names to some random stuff which looks like:
1c03e04cc1bbfcb0c1237f57f1d0ae2e.mp3?extra=f7NhT68pNkmEbGA_I1WbVShXQ2E2gJAGBKSEyh3hf0hsbLB1cqnXDuepYA5ubcFm_B3KSsrXDuKVtWVAUh_MAPeFiEHXVdg
I only need to remove the part of the file name after mp3. Please give me a script - there are more than 600 files, so doing it manually is impossible.
You can use the rename command:
rename "s/mp3\?.*/mp3/" *.mp3*
Alternatively, a plain bash loop:
#!/bin/bash
shopt -s nullglob   # make the glob expand to nothing when there are no matches
for F in *.mp3\?*; do
    # ${F%%.mp3\?*} strips everything from ".mp3?" onward; ".mp3" is then re-appended
    echo mv -v -- "$F" "${F%%.mp3\?*}.mp3"
done
Save it to a script such as script.sh, then run it as bash /path/to/script.sh in the directory where the files exist.
Remove the echo once the printed commands look correct.

Mirror only files having specific string in file path

I'm trying to mirror only those branches of a directory tree that contain a specific directory name somewhere within the branch. I've spent several hours trying different things to no avail.
A remote FTP site has a directory structure like this:
image_db
    movies
        v2
            20131225
                xyz
                    xyz.jpg
            20131231
                abc
                    abc.jpg
            AllPhotos   <-- this is what I want to mirror
                xyz
                    xyz.jpg
                abc
                    abc.jpg
        v4
            (similar structure to 'v2' above, contains 'AllPhotos')
        ...
    tv_shows
        (similar structure to 'movies', contains 'AllPhotos')
    other
        (different paths, some of which contain 'AllPhotos')
    ...
I am trying to create a local mirror of only the 'AllPhotos' directories, with their parent paths intact.
I've tried variations of this:
lftp -e 'mirror --only-newer --use-pget-n=4 --verbose -X /* -I AllPhotos/ /image_db/ /var/www/html/mir_images' -u username,password ftp.example.com
...where the "-X /*" excludes all directories and "-I AllPhotos/" includes only AllPhotos. This doesn't work; lftp just copies everything.
I also tried variations of this:
lftp -e 'glob -d -- mirror --only-newer --use-pget-n=4 --verbose /image_db/*/*/AllPhotos/ /var/www/html/mir_images' -u username,password ftp.example.com
...and lftp crunches away at the remote directory structure without actually creating anything on my side.
Basically, I want to mirror only those files that have the string 'AllPhotos' somewhere in the full directory path.
Update 1:
If I can do this with wget, rsync, ftpcopy or some other utility besides lftp, I welcome suggestions for alternatives.
Trying wget didn't work for me either:
wget -m -q -I /image_db/*/*/AllPhotos ftp://username:password@ftp.example.com/image_db
...it just gets the whole directory structure, even though the wget documentation says that wildcards are permitted in -I paths.
Update 2:
After further investigation, I am coming to the conclusion that I should probably write my own mirroring utility, although I still suspect I am approaching lftp the wrong way, and that there's a way to make it mirror only files that have a specific string in the absolute path.
One solution:
curl -s 'ftp://domain.tld/path/' |
awk '/^d.*regex/{print $NF}' |
xargs -I% wget -m "ftp://domain.tld/path/%"
Or using lftp:
lftp -e 'ls; quit' 'ftp://domain.tld/path' |
awk '/^d.*regex/{print $NF}' |
xargs -I% lftp -e "mirror -e %; quit" ftp://domain.tld/path/
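Applied to the 'AllPhotos' layout from the question, a hedged, untested sketch (the credentials, host, and paths are the question's placeholders) uses lftp's recursive find to collect the matching directories, then mirrors each one with its parent path intact:
# list every remote path under /image_db; lftp's find marks directories with a trailing /
lftp -u username,password -e 'find /image_db; quit' ftp.example.com |
grep '/AllPhotos/$' |
while read -r dir; do
    # mirror into the same relative location locally so parent paths are preserved
    lftp -u username,password ftp.example.com \
         -e "mirror --only-newer \"$dir\" \"/var/www/html/mir_images$dir\"; quit"
done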