Incorrect entitlements container name - SwiftUI

When I analyze my project before distributing it to App Store Connect, I'm shown an incorrect container identifier. The container identifier should be iCloud.rtb.SBWorkbook. I had the container as iCloud.SBWorkbook a while ago but have since changed it. I have scoured my project and can't find any reference to the old container. Any idea where the system is picking up this old name? Both containers appear in the iCloud dashboard, and the one being used is the correct one, iCloud.rtb.SBWorkbook. The app submits to App Store Connect without a problem, but I suspect this issue is tied to the sharing issues I'm working on as well.
The entitlements reported are:

com.apple.developer.ubiquity-container-identifiers: iCloud.SBWorkbook
com.apple.developer.icloud-container-identifiers: iCloud.SBWorkbook
get-task-allow: false
com.apple.developer.team-identifier: ------
application-identifier: ------.rtb.SBWorkbook
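Nothing posted above shows where the stale name comes from; common hiding places are the project file, old .entitlements files, and downloaded provisioning profiles (which embed the granted entitlements as readable plist text). A minimal brute-force search sketch in Python, assuming it is run from the project root; the identifier string and the path are the only inputs, and nothing here is specific to the poster's setup:

import pathlib

STALE = b"iCloud.SBWorkbook"   # the old identifier; not a substring of iCloud.rtb.SBWorkbook
ROOT = pathlib.Path(".")       # run from the project root; point elsewhere to scan profiles, archives, etc.

for path in ROOT.rglob("*"):
    if path.is_file():
        try:
            if STALE in path.read_bytes():
                print(path)
        except OSError:
            pass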

Related

Google Cloud Platform adding OAuth Client ID says Requested entity already exists

I created an OAuth 2 client ID in Google Cloud Platform (GCP) for our production application. However, it was only for internal use, so I removed it and tried to add it again in our development GCP project.
However, when I try to add it, it says:
Save failed
Requested entity already exists
Tracking number: xxx
What am I doing wrong? Do I need to take some extra steps to completely remove the OAuth 2 client ID? I removed it around a month ago, so it really should be gone by now.
It seems that after one month the problem resolved itself automatically. I assume the client ID is only soft-deleted when you press delete, and then hard-deleted one month later. Pretty annoying system.
You can also remove the entire project to get rid of unwanted ghost clients, but obviously you then lose all configuration.

Can I reuse the deleted project ID on Google Cloud Platform? By the way, I found something wrong in the Google Cloud Platform docs

To interact with Google Cloud resources, you must provide the identifying project information for every request. A project can be identified in the following ways:
Project name: the customized name you chose when you created the project, or when you activated an API that required you to create a project ID. Note that you can't reuse the project name of a deleted project.
Project ID: a unique identifier for your project, composed of the project name and a randomly assigned number.
Project number: a number that's automatically generated by the server and assigned to your project.
https://cloud.google.com/resource-manager/docs/creating-managing-projects
As far as I know, the project name can be changed whenever I want, and I have tested that I can even use an existing project's name as the name of a new project.
There is a slight error in the documentation:
The behavior is:
1. You can reuse the project name of an existing project, many times over.
2. You can reuse the project name of a deleted project.
3. You can reuse the project name of a deleted and purged project.
It is the reuse of project IDs that is not an available option.
To clarify, as I have previously written: even after the purge from the system (30 days), you will still be unable to reuse the project ID; it is permanently recorded in Google's system.
Maybe I can help a little: there may be a better way to deal with this issue. It would probably be best to create a variable in your system (called something like MY_CURRENT_PROJECT_ID) that is used anywhere the project ID is needed. That way, if the project ID ever changes, you only have to change it in one place. A rough example is sketched below.
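For instance, a minimal Python sketch of that idea, assuming a small config module; the module name, variable name, environment variable, and fallback value are all illustrative and not tied to any Google API:

# config.py -- single source of truth for the project ID (names are illustrative)
import os

# Read from the environment so switching projects never requires a code change;
# the fallback value here is purely a placeholder.
MY_CURRENT_PROJECT_ID = os.environ.get("MY_CURRENT_PROJECT_ID", "my-dev-project-123456")

def bucket_name() -> str:
    # Derive resource names from the one variable instead of hard-coding the ID elsewhere.
    return f"{MY_CURRENT_PROJECT_ID}-assets"

if __name__ == "__main__":
    print(MY_CURRENT_PROJECT_ID, bucket_name())

Everything else in the codebase would import MY_CURRENT_PROJECT_ID from this one module instead of hard-coding the ID.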
I hope this helps.

Deploying Django as a standalone internal app?

I'm developing a tool using Django for internal use at my organization. It's used to search and tag documents (using Haystack and Solr), and will be employed on different projects. My team currently has a working prototype and we want to deploy it 'in the wild.'
Our security environment is strict. Project documents are located in subfolders on a network drive, and access to these folders is restricted based on users' Windows credentials (we also have an MS SQL server that uses the same credentials). A user can only access the projects they are involved in. Since we're an exclusively Microsoft shop, if we want to deploy our app on the company intranet we'll need to use an IIS server to deal with these permissions. No one on the team has the requisite knowledge to work with IIS or Active Directory, and our IT department is already over-extended. In short, we're not web developers and we don't have immediate access to anybody experienced.
My hacky solution is to forgo IIS entirely and have each end user run a lightweight server locally (namely, CherryPy), with each retaining access to a common project-specific database (e.g. a SQLite DB living on the network drive, or a DB on the MS SQL server). To use the tool, they would just launch an all-in-one batch script and point their browser to 127.0.0.1:8000. I recognize how ugly this is, but I feel like it leverages the security measures already in place (note that we never expect more than 10 simultaneous users on a given project). Is this a terrible idea, and if so, what's a better solution?
I've dealt with a similar situation (primary development was geared toward a normal deployment situation, but some users have a requirement to use the application on a standalone workstation). Rather than deploy web and db servers on a standalone workstation, I just run the app with the Django internal development server and a SQLite DB. I didn't use CherryPy, but hopefully this is somewhat useful to you.
My current solution produces a nice executable for users not familiar with the command line (who also have trouble remembering the URL to put in their browser), while still keeping development relatively easy:
Use PyInstaller to package the Django app into a single executable. Once you figure this out, don't continue to do it by hand; add it to your continuous integration system (or at least write a script).
Modify the manage.py to (see the sketch after this list):
Detect whether the app is frozen by PyInstaller and was started with no arguments (i.e. the user launched it by double-clicking), and if so, run execute_from_command_line(..) with arguments that start the Django development server.
Right before running execute_from_command_line(..), spin off a thread that does a time.sleep(2) (to let the development server come up fully) and then calls webbrowser.open_new("http://127.0.0.1:8000").
Modify the app's settings.py to detect whether it is frozen and change things around accordingly, such as the path to the database, settings for the development server, etc.
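A minimal sketch of that manage.py change, assuming the settings module is called myproject.settings and the dev server runs on port 8000 (both names are illustrative, not the poster's actual code):

#!/usr/bin/env python
import os
import sys
import threading
import time
import webbrowser

def main():
    # "myproject.settings" is an assumed name for this sketch.
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
    from django.core.management import execute_from_command_line

    frozen = getattr(sys, "frozen", False)  # PyInstaller sets sys.frozen on the bundled executable
    if frozen and len(sys.argv) == 1:
        # Double-clicked: open the browser once the dev server has had a moment to start,
        # then run the development server in the foreground.
        def open_browser():
            time.sleep(2)
            webbrowser.open_new("http://127.0.0.1:8000")

        threading.Thread(target=open_browser, daemon=True).start()
        # --noreload keeps the autoreloader from respawning the frozen executable.
        execute_from_command_line([sys.argv[0], "runserver", "--noreload", "127.0.0.1:8000"])
    else:
        execute_from_command_line(sys.argv)

if __name__ == "__main__":
    main()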
A couple of additional notes:
If you go with SQLite, Windows file locking on network shares may not be adequate if you have concurrent writes to the DB; concurrent readers should be fine. Additionally, since you'll have different DB files for different projects, you'll have to figure out a way for the user to indicate which file to use. Maybe prompt in the app, or build the same app multiple times with different settings.py files. There are a variety of ways to hit this nail; one is sketched below.
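For example, one way to pick the per-project SQLite file (my own assumption, not something the original setup requires) is an environment variable set by the launch script, combined with the same frozen check in settings.py; the variable name and UNC path are illustrative:

# settings.py (excerpt)
import os
import sys
from pathlib import Path

FROZEN = getattr(sys, "frozen", False)  # True when running the PyInstaller build

# Each project keeps its own SQLite file on the network share; the batch script
# that launches the app sets WORKTOOL_DB_DIR to the right project folder.
PROJECT_DB_DIR = Path(os.environ.get("WORKTOOL_DB_DIR", r"\\fileserver\projects\demo"))

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": str(PROJECT_DB_DIR / "db.sqlite3"),
    }
}

DEBUG = not FROZEN  # frozen builds run with production-style settings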
If you go with MSSQL (or any client/server DB), the app will have to know the DB credentials (which means they could be extracted by a knowledgeable user). This presents a security risk that may not be acceptable. Basically, don't make the app that the user is executing the only layer of security. The DB credentials used by the app should grant only the access that the user is allowed.

How to run my own C++ source files in the installation wizard?

I have created a Windows installer deployment for my C++ application using VS2010. However, my problem is that I don't know how I can squeeze some of my own code into the installation wizard (and whether it is possible at all). During installation I want to:
ask the user to provide his installation key,
grab the hardware fingerprint (I already have an algorithm for that with WMI),
send both keys using my own C++ communication libraries (so NOT the browser),
continue the installation after receiving a confirmation from the server.
Moreover, this would require adding custom items to the installation wizard, like an input field for the installation key, or pop-up boxes with error warnings such as:
Couldn't connect to the server. Please check your internet connection before continuing with the installation.
So it's in fact a two-part question:
How to run my own C++ code during the installation wizard?
How to add custom elements to the installation wizard GUI?
So far it has been hard to find anything helpful in Google. :/
Check Windows Installer, more specifically Custom Actions:
The developer of an installer package may write code to serve their own purpose, delivered in a DLL. This can be executed during the installation sequences, including when the user clicks a button in the user interface, or during the InstallExecuteSequence. Custom Actions typically validate product license keys, or initialise more complex services. Developers should normally provide inverse custom actions for use during uninstallation.
Msiexec provides a way to break after loading a specified custom action DLL but before invoking the action.

Sitecore development and demo servers

I'm attempting to get an understanding of the best practice / recommended setup for moving information between multiple Sitecore installations. I have a copy of Sitecore set up on my machine for development. We need a copy of the system set up for demonstration to the client and for people to enter content before launch. How should I set things up so that people can enter content and modify the demonstration version of the site, while still allowing me to continue development on my local machine and publish my updates without the two systems overwriting each other's changes? Or is this not the correct approach for me to be taking?
I believe that the 'publishing target' feature is what I need to use, but as this is my first project working with Sitecore, I am looking for practical experience on how to manage this workflow.
Nathan,
You didn't specify which version of Sitecore you are on, but I will assume 6.01+.
Leveraging publishing targets will allow you to 'publish' your development Sitecore tree (or sub-trees) from your development environment to a destination such as your QA server. However, there is the potential that you publish /sitecore/content/home/* and wipe out your production content!
Mark mentioned using "Sitecore Packages" to move your content (as well as templates, layout items, etc.) over, which is the traditional way of moving items between environments. Note also that the Staging Module is not needed for Sitecore 6.3+; it was generally used to keep file systems in sync and to clear the cache of Content Delivery servers.
However, the one piece of the puzzle that is missing here is that you will still need to update your code and assets (.jpg, .css, .js, .dll, etc.) on the QA box.
The optimal solution would be to have your Sitecore items (templates, layout items, rendering items, and developer-owned content items) in source control right alongside your ASP.NET Web Application and any class library projects you may have. At a basic level, you can do this using the built-in "Serialization" features of Sitecore. Lars Nielsen wrote an article touching on this.
To take this to the next level, you would use a tool such as Team Development for Sitecore. This tool allows you to bring your Sitecore items into Visual Studio and treat them as code. At that point you could set up automated builds, or continuous integration, so that your code and Sitecore items are automatically pushed to your QA environment. There are also configuration options to handle the scenario of keeping production content in place while still deploying developer-owned items.
I recommend you look at the Staging Module if you need to publish to multiple targets from the same instance, i.e. publish content from one tree over a firewall to a development site, to a QA site, etc.
If you're just migrating content from one instance to another piecemeal, you can use Sitecore packages, which are the standard tools for moving content. The packages serialize the content to XML, zip it up, and allow you to install it in other instances.