iCloud - detect file upload

I have code that detects iCloud availability, generates the file, and uses FileManager to send it to the cloud. Files then remain local until iOS uploads them to the cloud.
Is there a way to detect when the file actually gets uploaded?

Using NSMetadataQuery you can obtain status information for all your iCloud files. The NSMetadataItem returned for each file consists of key/value pairs and contains useful information such as NSMetadataUbiquitousItemIsUploadedKey, NSMetadataUbiquitousItemIsUploadingKey, NSMetadataUbiquitousItemUploadingErrorKey and NSMetadataUbiquitousItemPercentUploadedKey.
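A minimal sketch of such a query might look like this. The file name "report.pdf" is a hypothetical placeholder, and this assumes the file lives in your app's ubiquity container:

```swift
import Foundation

// Query upload status for one file in the iCloud ubiquity container.
let query = NSMetadataQuery()
query.searchScopes = [NSMetadataQueryUbiquitousDataScope]
// "report.pdf" is a placeholder - substitute your own file name.
query.predicate = NSPredicate(format: "%K == %@",
                              NSMetadataItemFSNameKey, "report.pdf")

// The query posts DidUpdate notifications as upload status changes
// (you may also want to observe .NSMetadataQueryDidFinishGathering).
NotificationCenter.default.addObserver(
    forName: .NSMetadataQueryDidUpdate, object: query, queue: .main) { _ in
    for case let item as NSMetadataItem in query.results {
        let uploaded = item.value(
            forAttribute: NSMetadataUbiquitousItemIsUploadedKey) as? Bool ?? false
        let percent = item.value(
            forAttribute: NSMetadataUbiquitousItemPercentUploadedKey) as? Double ?? 0
        if uploaded {
            print("Upload finished")
            query.stop()
        } else {
            print("Uploading: \(percent)%")
        }
    }
}
query.start()
```

Keep the query alive (e.g. as a property) for as long as you want updates; stopping it ends the notifications.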

Related

Upload data from a webpage to IPFS

How can we upload data from a webpage to an IPFS server? I have already uploaded an empty .txt/JSON file to IPFS.
The file can be either a .txt or a JSON file.
If I understand your question correctly, you are trying to upload something from your webpage to an IPFS cluster (at least that is the basic context I understand).
With that intent, I would say this is possible.
What you need to do is:
a. Upload the file to a local IPFS node using the IPFS API (read the official documentation) and get the content identifier (CID) back from your webpage.
Problem:
What happens when the local node is not available? (You simply shut down your computer!)
The file you upload will be garbage collected after it is not used for some time, and as a result you lose the data completely.
Solution:
Use a pinning service. Pinning is the mechanism that allows you to tell IPFS to always keep a given object somewhere where it is always accessible, saving it from the internal garbage collection that the system performs.
The default pinning location is your local computer node, and you can pin there using the IPFS API.
You can also use other pinning services such as Pinata, where you can host the content on their nodes instead (you scale the number of nodes and you pay!).
In this way you can guarantee that the content is always served and online.
Hope this helps you get started.
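The add-then-pin flow above, sketched for a webpage talking to a local Kubo (go-ipfs) node over its HTTP RPC API. The /api/v0/add and /api/v0/pin/add routes are the standard Kubo endpoints, but the port (5001) and CORS configuration are assumptions about your setup:

```javascript
// Sketch: add a file to a local IPFS (Kubo) node and pin the returned CID.
// Assumes a Kubo daemon is running at 127.0.0.1:5001 with its HTTP API
// reachable (and CORS allowed for your page's origin if run in a browser).
async function addAndPin(file) {
  const form = new FormData();
  form.append("file", file);

  // /api/v0/add returns JSON like {"Name":"...","Hash":"<cid>","Size":"..."}
  const addRes = await fetch("http://127.0.0.1:5001/api/v0/add", {
    method: "POST",
    body: form,
  });
  const { Hash: cid } = await addRes.json();

  // Pin the content so the node's garbage collector keeps it.
  await fetch(`http://127.0.0.1:5001/api/v0/pin/add?arg=${cid}`, {
    method: "POST",
  });
  return cid;
}

// Usage from a page with <input type="file" id="picker">:
// const cid = await addAndPin(document.getElementById("picker").files[0]);
```

For a hosted pinning service like Pinata you would POST to their API with an auth token instead of the local node; the overall shape (upload, get CID, pin) stays the same.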

How to allow google cloud storage to save duplicate files and not overwrite

I am using a Google Cloud Storage bucket to save file uploads from my Django web application. However, if a file with the same name is uploaded, it overwrites the existing file, and I do not want this to happen. I want to allow duplicate file saving at the same location. Before moving to Google Cloud Storage, when I used my computer's hard disk to save files, Django used to smartly update the filename in the database as well as on the hard disk.
I upload files with the name given by the user, and I concatenate a timestamp including seconds and milliseconds. But clients see the name of the file as they added it, since I remove that part of the string when it is displayed in the view.
example
image1-16-03-2022-12-20-32-user-u123.pdf
image1-27-01-2022-8-22-32-usuario-anotheruser.pdf
both users would see the name image1
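The renaming scheme above can be sketched as a pair of helpers. The exact suffix layout here is illustrative, not the poster's actual format:

```python
import re
from datetime import datetime


def storage_name(name, username, when=None):
    """Build a unique object name by appending a timestamp and the user."""
    when = when or datetime.now()
    stem, dot, ext = name.rpartition(".")
    if not dot:  # no extension
        stem, ext = name, ""
    stamp = when.strftime("%d-%m-%Y-%H-%M-%S-%f")
    unique = f"{stem}-{stamp}-user-{username}"
    return f"{unique}.{ext}" if ext else unique


def display_name(stored):
    """Strip the timestamp/user suffix before showing the name to clients."""
    stem, dot, ext = stored.rpartition(".")
    if not dot:
        stem, ext = stored, ""
    stem = re.sub(
        r"-\d{1,2}-\d{1,2}-\d{4}-\d{1,2}-\d{2}-\d{2}(-\d+)?-user-.+$", "", stem
    )
    return f"{stem}.{ext}" if ext else stem
```

Because the stored name is unique per upload, Google Cloud Storage never sees two objects with the same key, so nothing gets overwritten.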

Sitecore - How to load images on CD server from CM server?

I have two separate servers: one is the CD server and one is the CM server. I upload images on the CM server and publish them. On the web database I can see the images under the Media Library item,
but they aren't displayed on the CD server (e.g. on the website); it indicates that the images are not found. Please help me understand how I can solve this problem, or whether I need some configuration for that.
Many thanks.
Sitecore media items can carry the actual media file either as:
Blob in the database - everything works automatically OOB
Files on the file system - one needs to configure either WebDeploy or DFS
Database resources are costly, so you might not want to waste them on something that can be achieved with free tools.
Since WebDeploy by default locates modified files by comparing file hashes between source and target, it will become slower over time.
You might have uploaded the image to the media library as a file, in which case the image is stored as a file on the file system. To verify this, check your image item in the media library: its 'File Path' field will have a value set. Such files have to be moved to the file system of the CD server as well.
If you upload your images in bulk, you can store them as blobs in the DB (the default) rather than as files on the file system using the following setting:
<setting name="Media.UploadAsFiles" value="false"/>

Get metadata info without downloading the complete file

As I read the different posts here and the libtorrent documentation, I understand that (as documented) I have to download the torrent file in order to get the metadata. But how does the uTorrent app work? When I just start downloading, I get the metadata within a second, and after getting the metadata I can pause the download. So it doesn't force me to download the complete file in order to return the metadata.
So, is there a way to get the metadata without downloading the complete file?
libtorrent's metadata_received_alert is what you want. This alert is posted once the metadata finishes downloading. Do make sure that you're receiving status notifications, though.
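A rough sketch with the python-libtorrent bindings (the magnet URI is a placeholder, and this assumes a recent 1.2+/2.x API; it needs live peers to actually deliver the metadata):

```python
import time
import libtorrent as lt  # python-libtorrent bindings

ses = lt.session()
# Enable status notifications so metadata_received_alert is posted.
ses.apply_settings({"alert_mask": lt.alert.category_t.status_notification})

params = lt.parse_magnet_uri("magnet:?xt=urn:btih:...")  # placeholder magnet
params.save_path = "."
handle = ses.add_torrent(params)

# Poll alerts until the metadata arrives, then pause before any payload downloads.
done = False
while not done:
    for a in ses.pop_alerts():
        if isinstance(a, lt.metadata_received_alert):
            info = handle.torrent_file()  # full metadata: names, sizes, pieces
            print(info.name(), info.num_files())
            handle.pause()
            done = True
    time.sleep(0.2)
```

This mirrors what uTorrent does with magnet links: peers send the metadata via the ut_metadata extension, so you never need the payload itself.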

Limit MQFTE file transfer to one file at a time

I have an MQFTE setup where we are receiving files from an external vendor. The files get dumped on a server in the DMZ, and we have an MQFTE agent that picks up the files from that server and drops them on our server.
We receive files in "sets", i.e. each incoming file has an associated XML file that describes and contains metadata about the file, e.g. applicationform.pdf and applicationform.xml. The final application stores the PDF file based on the data/metadata in the XML.
Since the trigger is fired for each incoming file, we check in the trigger whether or not we've received the XML file and the content file (e.g. PDF).
However, I don't think this is the best approach, as it adds a lot of bookkeeping code to check for concurrency issues when both files arrive at the same time. Is there a way to:
Restrict the trigger so that it only fires when both files have arrived? From my research, this is not possible.
Configure the agent on the server so that it only receives one file at a time? Looking at the documentation, it seems this can be achieved, but only on the agent initiating the transfer, not on the agent receiving it. The documentation hints at monitorMaxResourcesInPoll and the -bs parameter, but those would be on the source agent, I guess. Since the agent is shared with multiple systems, this would impact them as well.
Also, I would appreciate any tips and suggestions or even alternative solutions to best meet the requirement.
I don't think there is a way to check for both files existing before the monitor triggers. What some users do is send all of the files they want to transfer, and then finally put a 'marker' file in the directory which the resource monitor looks for. Because the marker file is only written after all other files are ready to be sent, the monitor only transfers the files when they're all there.
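For reference, the "both files have arrived" check described in the question can be sketched as a small helper in the consuming application (the .pdf/.xml pairing convention is taken from the question):

```python
import os


def complete_sets(directory):
    """Return base names for which both the content (.pdf) and its .xml arrived."""
    names = os.listdir(directory)
    pdfs = {os.path.splitext(n)[0] for n in names if n.endswith(".pdf")}
    xmls = {os.path.splitext(n)[0] for n in names if n.endswith(".xml")}
    return sorted(pdfs & xmls)
```

Only base names present with both extensions are processed, so a PDF whose XML has not landed yet is simply picked up on a later pass.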
In answer to 2), you could set maxDestinationTransfers to 1 on the destination agent to limit it to receiving a single transfer at a time. If a transfer contains multiple files, they will be transferred in sequence, so the destination is really only receiving one file at a time. monitorMaxResourcesInPoll simply limits the monitoring agent to the number of files it parses in the source directory per monitor poll. You could set that to 1, but if you want to transfer the PDF and the XML file in the same transfer you'd need to set it to 2. It's probably not the setting you want to use.
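Concretely, assuming a standard MQMFT agent layout, that limit would go in the destination agent's agent.properties file (restart the agent after editing):

```
# Destination agent: accept only one inbound transfer at a time
maxDestinationTransfers=1
```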