Fix "This file is not commonly downloaded" warning - Firefox only - amazon-web-services

I run a website (www.pixelscrapper.com) that serves file downloads of images, and zipped collections of images (zip files containing multiple images, PSDs, vectors, etc.). These files are hosted on Amazon S3 and served via download URLs generated by the AWS SDK for PHP (v1).
Just recently, users trying to download our zip files using Firefox have started getting "This file is not commonly downloaded" warnings (after the download finishes), which forces them to override the warning before accessing the file via the Firefox download manager. Naturally, this kind of warning causes concern for our users.
This warning shows up IN FIREFOX ONLY--Chrome, Edge, and Internet Explorer show no warnings when downloading and opening our zip files. The warning also only seems to show up for (surprise, surprise) files that have been more recently added to the site, and have relatively few total downloads--but many of our files never receive large numbers of downloads, so this warning has the potential to plague many of our files indefinitely.
My question is: is there anything I can do to prevent this warning, by adjusting headers, signing files in some way, etc.? (From what I understand, Chrome and Edge also have "uncommon file" warnings, but they don't seem to be concerned with our files--why is this warning only firing in Firefox?) I've searched around on Stack Overflow and elsewhere, but most of the questions I've seen about "uncommon download" warnings are targeted at Chrome or Internet Explorer, and I can't find any Firefox-specific information about this warning.
Here's a sample file download URL (generated by the AWS SDK) that is causing warnings:
https://pixelscrapper-user-content.s3.amazonaws.com/template-attachment/user-2/node-13574/paper-037-template-polka-dots.zip?response-content-disposition=attachment%3B%20filename%3D%22ps_marisa-lerin_13574_paper-037-template-polka-dots_cu.zip%22&AWSAccessKeyId=AKIAIWM7MZMHRPA6FHEA&Expires=1495386939&Signature=HDmwRFPX81CIVrQgu1BkEyR9iRQ%3D
Here's an inspection of headers in Firefox:
UPDATE:
The issue here is not the nasty-looking URL generated by the AWS SDK: I tried downloading the same zip file (containing one JPG and one PSD) from the following "clean" URL, and it still gives the warning: http://pixelscrapper-misc-files.s3.amazonaws.com/ps_marisa-lerin_13574_paper-037-template-polka-dots_cu.zip
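For reference, a presigned URL like the one above is produced by overriding the response Content-Disposition header at signing time, which is where the response-content-disposition query parameter comes from. The question uses the AWS SDK for PHP (v1); the following is only a rough sketch of the same idea using the AWS SDK for Java (v1), with the bucket, key, and filename as placeholders:

import com.amazonaws.HttpMethod;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.GeneratePresignedUrlRequest;
import com.amazonaws.services.s3.model.ResponseHeaderOverrides;

import java.net.URL;
import java.util.Date;

public class PresignedDownloadUrl {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Placeholder bucket and key -- substitute your own values.
        String bucket = "my-download-bucket";
        String key = "paper-037-template-polka-dots.zip";

        // Override the Content-Disposition header S3 will send back; this is what
        // becomes the response-content-disposition query parameter in the signed URL.
        ResponseHeaderOverrides overrides = new ResponseHeaderOverrides()
                .withContentDisposition("attachment; filename=\"polka-dots.zip\"");

        GeneratePresignedUrlRequest request =
                new GeneratePresignedUrlRequest(bucket, key, HttpMethod.GET)
                        .withResponseHeaders(overrides)
                        .withExpiration(new Date(System.currentTimeMillis() + 15 * 60 * 1000L));

        URL url = s3.generatePresignedUrl(request);
        System.out.println(url);
    }
}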

Check the settings under menu path Tools->Options->Security. You may need to uncheck the options under the "General" section of that dialogue box. Simply unchecking "Warn me about unwanted and uncommon software" resolves this.
No need to change other settings. I hope it helps.
Also see: https://support.mozilla.org/en-US/kb/how-does-phishing-and-malware-protection-work

Related

AWS S3 Buckets Access Errors & Buckets Showing Old Files

I have S3 buckets that I have been using for years and today when I logged in through the console to manually upload some files, I have noticed that all of my buckets are showing ERROR under the access tab.
While I can still see the files, I'm unable to upload or modify any of them. All files in my buckets are also showing old versions from December, even though I updated some of the text files just this month, and all files are missing their meta tags.
I have not managed or changed any permissions in my account in years, and I'm the only one with access to these files.
Anyone else had this issue? How can I fix this?
It really feels like AWS had some major failure and replaced my current files with some old backup.
I had the same issue (except for the old-files part). In my case it was caused by a browser plugin called "Avira Browserschutz", a plugin similar to Adblock. Other plugins such as uBlock Origin might result in identical behavior.
Test it by disabling such plugins or by visiting AWS in incognito mode.

Set File Type on Directory Upload to Google Storage

When I upload a set of files to a bucket on Google Storage, they are automatically assigned file types ("text/html", "application/json", etc.). But when I do a directory upload via the developer console, the files in the directory all get type "application/octet-stream". How do I get Google Storage to automatically assign file types to the contents of an uploaded directory?
This is likely a bug in the developer console. This problem comes from adding a directory via drag-and-drop. Uploading a directory via the "Upload Folder" button fixes this problem (it associates the correct file types with the files in the directory).
As pointed out in the answer above, this does seem to be a bug in the developer console. All Google Cloud Platform bugs should be reported on the public issue tracker, so you might want to file one there so this can be investigated further.
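Until the console behaviour is fixed, one workaround is to upload programmatically and set the content type on each object explicitly. This is only a minimal sketch using the google-cloud-storage Java client; the bucket name and local directory below are placeholders:

import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class UploadWithContentType {
    public static void main(String[] args) throws Exception {
        Storage storage = StorageOptions.getDefaultInstance().getService();
        String bucket = "my-bucket";  // placeholder bucket name
        Path dir = Paths.get("site"); // placeholder local directory

        try (Stream<Path> paths = Files.walk(dir)) {
            paths.filter(Files::isRegularFile).forEach(path -> {
                try {
                    String objectName = dir.relativize(path).toString().replace('\\', '/');
                    // Guess the type from the file name; fall back to octet-stream.
                    String contentType = Files.probeContentType(path);
                    if (contentType == null) {
                        contentType = "application/octet-stream";
                    }
                    BlobInfo info = BlobInfo.newBuilder(BlobId.of(bucket, objectName))
                            .setContentType(contentType)
                            .build();
                    storage.create(info, Files.readAllBytes(path));
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            });
        }
    }
}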

Google Cloud Storage - files not showing

I have over 30 Leaflet maps hosted on my Google Cloud Platform bucket (for example), and it has always been an easy process to upload my folder (which includes an HTML file with sub-folders containing .js and .css files) and share the map publicly.
I tried uploading another map today, but within the folder there are no files showing and I get the following message "There are no live objects in this folder. If you have object versioning enabled, this folder may contain archived versions of objects, which aren't visible in the console. You can list archived object versions using gsutil or the APIs."
Does anyone know what is going on here?
We have also seen this problem, and it seems that the issue is limited to buckets that have spaces in the name.
It's also not reproducible through the gcloud web console, but if you use gsutil to upload a file to a bucket with a space in the name then it won't be visible on the web UI.
I can see from your screenshot that your bucket also has spaces (%20 in the URL).
If you need a workaround ASAP, you could rename your bucket...
But I hope Google will fix this soon.
There is currently an open issue on GCS/Console integration.
If file names contain any symbols that need URL encoding, they are not visible in the console, but they are accessible via gsutil or the API (which is currently the recommended workaround).
The issue has been resolved as of 8-May-2018 10:00 UTC.
This can happen if the file doesn't have an extension: the UI treats it as a folder and lets you navigate into it, showing a blank folder instead of the file contents.
We had the same symptom (files show up in API but invisible on the web and via CLI).
The issue turned out to be that we were saving files to "./uploads", which Google interprets as "create a directory literally called '.' and then a subdirectory called uploads."
The fix was to upload to "uploads/" instead of "./uploads". We also just ran a mass copy operation via the API for everything under "./uploads". All visible now!
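For anyone hitting the same thing, the mass copy mentioned above can be scripted. A rough sketch using the google-cloud-storage Java client (the bucket name and prefixes are placeholders, and the original poster may have used a different client):

import com.google.cloud.storage.Blob;
import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

public class FlattenDotPrefix {
    public static void main(String[] args) {
        Storage storage = StorageOptions.getDefaultInstance().getService();
        String bucket = "my-bucket"; // placeholder bucket name

        // Copy everything under "./uploads/" to the same path under "uploads/".
        for (Blob blob : storage.list(bucket, Storage.BlobListOption.prefix("./uploads/")).iterateAll()) {
            String newName = blob.getName().substring(2); // drop the leading "./"
            storage.copy(Storage.CopyRequest.of(blob.getBlobId(), BlobId.of(bucket, newName))).getResult();
            // Optionally delete the old "./uploads/..." object afterwards:
            // blob.delete();
        }
    }
}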
I also had spaces in my URL and it was not working properly yesterday. I checked this morning and everything is working as expected. I still have the spaces in my URL, by the way.

Cannot publish web service - Error : Could not open Source file: The given path's format is not supported

We are trying to publish a web service from our Dev box onto the UAT box.
There are no errors when building the web-service, but when trying to publish (using UNC path: \\TEST-SERVER\c$\inetpub\wwwroot\PerformanceReviewWebService) we get the below error message and the process fails:
3>C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v12.0\Web\Microsoft.Web.Publishing.targets(1475,5):
Error : Could not open Source file: The given path's format is not supported.
What can we do to track down this error and resolve it?
On the target box we have checked the security of the folder:
..\c$\inetpub\wwwroot\PerformanceReviewWebService
and we definitely have access to write into that directory.
We have no build errors.
Here are a few things you can try that may help debug or resolve the issue:
Try launching Visual Studio as an Administrator and retry publishing.
Try publishing to a local location. Does this succeed?
If it does, there is likely a problem with either accessing the UNC path or a file/security permission issue.
If local publishing does not succeed, make sure you haven't mistakenly moved any of your bin, content, or web.config files to a different location.
Check for files in Visual Studio that are light grey or have an exclamation mark next to them (they may be missing from the project).
Try reverting the project to a prior state where publishing was successful, then reapply your updates gradually until the project is up to date.
Create a new empty project and attempt to publish to the same location. If this works, something is corrupted or wrong in your existing project. You can try excluding files or folders from the build to narrow down which ones may be causing the problem.
In my case, a referenced web project shared content with the web project I wanted to publish. After removing that content from the referenced project, publishing worked again.

Update remote File using Eclipse's RSE Plugin

I've installed the Remote System Explorer plugin for Eclipse and set up an SSH connection to our development server using public key authentication and a custom port (not sure if any of these customisations relate to the problem).
Browsing the file system works great and I can even create folders and files, e.g. /tmp/foo/bar.txt, but I can't figure out how to "push" the changes I've made to the server.
So my problem is: I open a file, write some text, and save it in Eclipse. The asterisk next to the file name vanishes (indicating it's saved), and if I close and re-open the file in Eclipse the changes are present. But they're not visible on the server: e.g. changes to an .html file don't show up in the web browser, and cat bar.txt (mentioned earlier) produces empty output.
Creating folders or files works as intended, but updates to file contents do not show up on the remote system even though they persist in Eclipse.
Is there some button I'm missing to push my "local" changes to the "remote server"? Can I even get rid of all this caching? As we're working in a team, it's crucial that our files are always up to date, and extra caching would definitely spoil all the fun :(
My IDE configuration is as follows:
Eclipse IDE for C/C++ Developers
Version: Juno Service Release 2
Build id: 20130225-0426
Get a handle to the IFileService that is hosting the file ... this gives you the ability to upload files.
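A rough sketch of what that can look like with the RSE 3.x APIs. The subsystem lookup, the upload signature, and the paths below are assumptions that may differ between RSE versions, so treat this as an outline rather than a drop-in solution:

import java.io.File;

import org.eclipse.core.runtime.NullProgressMonitor;
import org.eclipse.rse.core.model.IHost;
import org.eclipse.rse.services.files.IFileService;
import org.eclipse.rse.subsystems.files.core.model.RemoteFileUtility;
import org.eclipse.rse.subsystems.files.core.servicesubsystem.FileServiceSubSystem;
import org.eclipse.rse.subsystems.files.core.subsystems.IRemoteFileSubSystem;

public class RsePushSketch {
    // Push a locally edited copy of /tmp/foo/bar.txt back to the remote host.
    public static void push(IHost host, File localCopy) throws Exception {
        IRemoteFileSubSystem subsystem = RemoteFileUtility.getFileSubSystem(host);
        if (subsystem instanceof FileServiceSubSystem) {
            IFileService fileService = ((FileServiceSubSystem) subsystem).getFileService();
            // Assumed upload signature: local file, remote parent dir, remote file name,
            // isBinary flag, source encoding, host encoding, progress monitor.
            fileService.upload(localCopy, "/tmp/foo", "bar.txt",
                    false, "UTF-8", "UTF-8", new NullProgressMonitor());
        }
    }
}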