Is Team Identifier still necessary for iCloud Container

I have a quick question: I noticed that with Xcode 6 the default iCloud containers are named something like iCloud.com.company.myApp, even though my Team ID is selected under the General > Identity project settings. In the past my container IDs included my Team ID, like ABC12D3EF8.com.company.myApp.
Will I need to change the iCloud container to include the Team ID for any future Mac app, or is the iCloud prefix enough in a post-iCloud Drive era?
Thanks

The iCloud prefix is required for any new iCloud containers. Older apps can continue to use the Team Identifier prefix and will work as usual with iCloud Drive. The new iCloud-prefixed containers can be used with or without iCloud Drive, provided you set the appropriate entitlements.
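For what it's worth, you can sanity-check that a new-style container resolves at runtime by asking the file manager for its URL. A minimal Swift sketch, using the example identifier from the question (it returns nil if iCloud is off or the entitlement and container identifier don't line up):

    import Foundation

    // Resolving the ubiquity container can take a while, so do it off the main thread.
    DispatchQueue.global(qos: .utility).async {
        // nil means iCloud is disabled or the container/entitlement doesn't match.
        if let url = FileManager.default
            .url(forUbiquityContainerIdentifier: "iCloud.com.company.myApp") {
            print("Container available at \(url)")
        } else {
            print("iCloud container unavailable")
        }
    }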

Related

Incorrect entitlements container name

When I analyze my project before distributing it to App Store Connect, I'm shown an incorrect container identifier. The container identifier should be iCloud.rtb.SBWorkbook. I had the container as iCloud.SBWorkbook a while ago but have since changed it. I have scoured my project and can't find any reference to the old container. Any idea where the system is picking up this old name? Both containers appear in the iCloud dashboard, and the one being used is the correct one, iCloud.rtb.SBWorkbook. The app submits to App Store Connect without a problem, but I suspect this issue is tied to the sharing issues I'm working on as well.
ENTITLEMENTS
com.apple.developer.ubiquity-container-identifiers: iCloud.SBWorkbook
com.apple.developer.icloud-container-identifiers: iCloud.SBWorkbook
get-task-allow: false
com.apple.developer.team-identifier: ------
application-identifier: ------.rtb.SBWorkbook
The entitlements file is:

How to implement Auto-Updates for my Application without Administration Rights

I maintain a large Windows C++ application that installs with Nullsoft NSIS. Installation is quick and simple (less than 1 minute).
Some users in large companies do not have administrator privileges and have to order costly third-party services to update their installation. As a result, some of them only update once a year, while we ship every month and sometimes fix important bugs.
So we are thinking about automatic updates that do not require elevated administrator rights. Mozilla and Adobe do this, as do others. As far as I can see on the Mozilla XULRunner site, they install a service which in turn can run an update without forcing the user to enter an administrator password. I also found Google's Omaha, but it is not clear about administrator privileges ("Support for restricted user environments; for example, users without administrator privileges" ... "This requires the user has administrator privileges.").
So far I have not found exact answers to these questions:
What steps do we have to take in order to establish such a mechanism?
Can we keep using NSIS?
What server infrastructure is required?
Your application should check for updates on your server/website and get the download link.
This should be pretty easy if you maintain a text file/page with a fixed hyperlink.
This hyperlink can be hard-coded in your application.
If it detects a version newer than the current version, then download the files.
Along with these files there should be instructions for which files to replace, which files are to be added at what location, etc.
Now, whether or not you need admin privileges depends on where you need to place the updated files. If the target folder has some restrictions, then it would be difficult to update in the same session. So you may have to launch a separate helper exe which asks the user for admin privileges when it starts. Then you can copy the updated files to your desired location without much pain.

Qt cross platform safe way of storing data in an SQLite database?

I'm trying to figure out the safest way of storing chat history for my application on the client's computer. By "safe" I mean that my application is actually allowed to read/write the SQLite database. Clients will range across Windows, OS X, and Linux, so I need to find a way, on each platform, of determining where I'm allowed to create an SQLite database for storing the message history.
Problems I've run into in the past were, for example, when people used terminal clients such as Citrix, where the user is not allowed to write to almost any directory and the drive is often a shared network drive.
Some ideas:
Include an empty database.db with prebuilt tables in my installer, and store the database next to my executable. However, I'm almost certain that not all clients will be allowed to read/write there, for example Windows users who do not have admin rights.
Use QStandardPaths::writableLocation and create the database there at first run
Locate the user's home directory and create the database there at first run
Any ideas if there is a really good solution to this problem?

How to save files when iCloud not enabled, and then becomes enabled

According to Apple's iCloud docs:
Every user with an Apple ID receives a free iCloud account but some users might choose not to enable iCloud for a given device. Before you try to use any other iCloud interfaces, you must call the URLForUbiquityContainerIdentifier: method to determine if iCloud is enabled. This method returns a valid URL when iCloud is enabled (and the specified container directory is available) or nil when iCloud is disabled.
If URLForUbiquityContainerIdentifier: returns nil, then I presume you have to save to the user's Documents directory. What are you supposed to do if the user enables iCloud at a later date? The docs only tell you what to do when everything goes to plan, but make no mention of best practices for handling anything else... What if iCloud gets disabled? Will the local copy of the iCloud document get deleted, or have its permissions removed from the sandbox?
You should save the state of iCloud enablement (i.e. whether you were able to get a non-nil result from URLForUbiquityContainerIdentifier:), register for notifications of changes to the situation, and so on. If you detect that the state has changed from off to on, the solution is simple: call setUbiquitous:itemAtURL:destinationURL:error: to move the file from your world into the ubiquity world (it's just a folder, after all).
This situation is pretty well covered in the docs:
http://developer.apple.com/library/mac/#documentation/General/Conceptual/iCloudDesignGuide/Chapters/iCloudFundametals.html
Unfortunately, if you detect the opposite (the user has turned off iCloud), the file is lost and you can't retrieve it, because, ex hypothesi, URLForUbiquityContainerIdentifier: is now returning nil. But it is still in the cloud. I presume you could provide an interface asking the user to turn iCloud back on, and another letting the user request that the file be moved back into the non-ubiquity world.
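Here's a minimal Swift sketch of that off-to-on transition, assuming you have been keeping the document at a local URL while iCloud was unavailable (the function name and file handling are placeholders, not anything from the docs):

    import Foundation

    // Call when you detect that iCloud has gone from off to on.
    func migrateLocalCopyToICloud(localURL: URL) {
        // nil means iCloud is still disabled or the container is unavailable.
        guard let container = FileManager.default
            .url(forUbiquityContainerIdentifier: nil) else { return }

        let destination = container
            .appendingPathComponent("Documents")
            .appendingPathComponent(localURL.lastPathComponent)
        do {
            // Moves the item into the ubiquity container; run this off the main thread.
            try FileManager.default.setUbiquitous(true,
                                                  itemAt: localURL,
                                                  destinationURL: destination)
        } catch {
            // Keep using the local copy and surface the error to the user.
            print("Could not move \(localURL.lastPathComponent) into iCloud: \(error)")
        }
    }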

Sitecore development and demo servers

I'm attempting to get an understanding of the best practice / recommended setup for moving information between multiple Sitecore installations. I have a copy of Sitecore set up on my machine for development. We need a copy of the system set up for demonstration to the client and for people to enter content prelaunch. How should I set things up so people can enter content and modify the demonstration version of the site, while still allowing me to continue development on my local machine and publish my updates without overwriting changes between the systems? Or is this not the correct approach for me to be taking?
I believe that the 'publishing target' feature is what I need to use, but as this is my first project working with Sitecore, I am looking for practical experience on how to manage this workflow.
Nathan,
You didn't specify what version of Sitecore, but I will assume 6.01+
Leveraging publishing targets will allow you to 'publish' your development Sitecore tree (or sub-trees) from your development environment to a destination such as your QA server. However, there is the potential that you publish /sitecore/content/home/* and wipe out your production content!
Mark mentioned using "Sitecore Packages" to move your content (as well as templates, layout items, etc.) over, which is the traditional way of moving items between environments. Also, the Staging Module is not needed for Sitecore 6.3+; it was generally used to keep file systems in sync and to clear the cache of Content Delivery servers.
However, the one piece of the puzzle that is missing here is that you will still need to update your code and assets (.jpg, .css, .js, .dll, etc.) on the QA box.
The optimal solution would be to have your Sitecore items (templates, layout items, rendering items, and developer-owned content items) in source control right alongside your ASP.NET web application and any class library projects you may have. At a basic level, you can do this using the built-in "Serialization" feature of Sitecore. Lars Nielsen wrote an article touching on this.
To take this to the next level, you would use a tool such as Team Development for Sitecore. This tool will allow you to easily bring your Sitecore items into Visual Studio and treat them as code. At that point you could set up automated builds, or continuous integration, so that your code and Sitecore items are automatically pushed to your QA environment. There are also configuration options to handle the scenario of keeping production content in place while still deploying developer-owned items.
I recommend you look at the Staging Module if you need to publish to multiple targets from the same instance, i.e. publish content from one tree over a firewall to a development site, to a QA site, etc.
If you're just migrating content from one instance to another piecemeal, you can use Sitecore packages, which are the standard tools for moving content. Packages serialize the content to XML, zip it up, and let you install it in other instances.