I have created a flow in Cloud Dataprep and the job executed. All fine.
However, my colleagues, who also have the Owner role in this GCP project, are not able to see the flow I created, and I can't find sharing options anywhere.
How should it be set up so that a Dataprep flow can be worked on by multiple users?
Thanks.
UPDATE: sharing is available from September 2018 onwards.
See the answer from @justbeez below (https://stackoverflow.com/a/55386000/945789) for details.
The original answer below predates that release and is no longer relevant.
At the time of writing this is not possible. A bunch of new features were released on Jan 23, 2018, but sharing flows and/or datasets still isn't among them. Given the product is still in beta, keep an eye on the release notes, as the releases tend to be big jumps: https://cloud.google.com/dataprep/docs/release-notes.
There are two workarounds I know of at the moment:
Whenever you create flows, do so under a shared/generic Google login. Google doesn't recommend this, as it's not great for security, but it's the only way to have live access for more than one person. Of course, that user will also need access to your project.
Periodically export your flow as a zip (from the three-dots menu that appears when you hover over a flow) and store it on a shared drive. You can import the flow from the Flows menu. I'm not sure whether export/import was new in the Jan 2018 release, but I didn't notice it before then. If you want to script the export, see the sketch below.
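If you'd rather not click through the UI each time, the export can in principle be scripted. This is a minimal sketch, assuming your Dataprep edition exposes the Trifacta-style REST API with a /v4/flows/{id}/package endpoint and that you have generated an access token; the base URL, endpoint, and flow id here are assumptions, so check your release's API docs.

    # Sketch: scripted export of a Dataprep flow as a zip package.
    # ASSUMPTIONS: a Trifacta-style REST API is enabled for your edition,
    # the /v4/flows/{id}/package endpoint exists in your release, and you
    # have an access token generated from your user settings.
    import requests

    API_BASE = "https://api.clouddataprep.com"   # assumed base URL
    ACCESS_TOKEN = "your-dataprep-access-token"  # placeholder
    FLOW_ID = 12345                              # hypothetical flow id

    resp = requests.get(
        f"{API_BASE}/v4/flows/{FLOW_ID}/package",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    )
    resp.raise_for_status()

    # Drop the zip somewhere shared (a drive, a GCS bucket, etc.).
    with open(f"flow-{FLOW_ID}.zip", "wb") as f:
        f.write(resp.content)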
We've been using sharing at the Flow level since it was added in September 2018 (check the changelog). You get to this option from the kebab menu on the individual flow.
From there, you'll get a sharing menu you can use (keep in mind, you can only share with people who are in the project and have suitable permissions).
From there, you can see either all flows (the default) or just those that are shared.
Note that there's currently only one shared permission level, Collaborator. This means the person you share with can work with the flow, add recipes, etc., but they can't rename the flow or manage sharing. The recently added Folders option doesn't support sharing at all.
Full details on sharing flows can be found here:
https://cloud.google.com/dataprep/docs/html/Overview-of-Sharing_118228675
Related
I am trying to wrap my head around Microsoft Cloud for Sustainability. Apparently it's a solution based on Microsoft Dynamics. I need more back-end access to that solution, because as it is right now I'm either lacking permissions (or extra paid access to Microsoft resources) or missing a chunk of documentation, because I'm unable to:
Change the default language across the board - I can switch MS Dynamics to any language I want, but that only works for the shell. Anything that's CfS-specific is in English. Do I remove the demo data and import my own scopes and data? Since the only things available are the database, a cube for BI analytics, and JSON files describing the CfS structure in general (that's in CDM), do I really have to create it from scratch? This brings me to the second question:
Access the entry-level data that's already in the demo version - I need to see what's in the database CfS is using, or be able to modify it. Is there any way to get to it via Business Central, if at all possible?
Since I will be preparing several presentations for potential customers, I need a way to quickly create a dataset based on the initial, very basic information each customer provides. How can I do that with a trial user?
I work for a company that's a Microsoft Certified Partner, so logically the resources I need should be available to me, but the links in the documentation are either dead (and some are, as they just redirect to general info) or require some special access level (or are dead with an error message that's really not helpful at all).
Is there somewhere else I can go? The Documentation page offers little of what I need...
P.S. I think I should tag this question with CfS-specific tags, but I don't have enough rep...
I'm working on a project centered around API Change Management. I'm curious as to how AWS informs developers of changes to its APIs. Is it through the document history (https://docs.aws.amazon.com/apigateway/latest/developerguide/history.html)? Or do they send out emails to developers?
Regarding emails, are they sent to all developers using the API (e.g. API Gateway) or just to developers using a particular endpoint who will be affected by the change? And what is the frequency of notifications for breaking changes, minor changes, etc.?
Thanks so much for your help!
For non-breaking changes, you can learn about them on the Developer Guide as you pointed out. Some of these changes are also announced on their What's New page (RSS feed). You can also follow the SDK releases which are updated often (e.g. by using the RSS feed for aws-sdk-go releases). I believe that most of the SDKs are using code generation to generate a lot of the API functionality. They push updates to these files in the SDK git repositories (ruby example, go example), but it is not clear if there is another place to find these files. It doesn't seem like they want us to consume these directly (see this developer forum thread from 2015). There's also awsapichanges.info, which appears to be built by AWS themselves.
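If you want to watch those feeds programmatically rather than in a reader, here's a minimal sketch using the third-party feedparser package; the feed URL is GitHub's standard releases Atom feed for aws-sdk-go, and the print format is just an example.

    # Sketch: check the aws-sdk-go GitHub releases feed for recent entries.
    # Requires the third-party "feedparser" package (pip install feedparser).
    import feedparser

    FEED_URL = "https://github.com/aws/aws-sdk-go/releases.atom"

    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries[:5]:  # newest releases come first
        print(entry.updated, entry.title, entry.link)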
AWS very rarely makes breaking changes to their API. Even SimpleDB, which is a very old AWS product, still works.
Having said that, they do make breaking changes from time to time, but they try to announce them well ahead of time. The biggest breaking change they are trying to complete is probably their attempt to deprecate S3 path-style access. This was first quietly announced in the AWS Developer Forums, which caused a lot of panic, especially since the timeline was incredibly short. Based on the panic, AWS quickly backtracked and revised the plan, more publicly this time.
They have made other breaking changes to S3 in other ways. For example, S3 buckets must now have DNS-compliant names. This was only recently (March 1, 2018) enforced on new buckets in us-east-1, but for most other regions it was enforced from the start, when those regions were made available. Old S3 buckets in us-east-1 may still have names that are not DNS-compliant.
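As a rough illustration of what "DNS-compliant" means here, the sketch below checks a name against the published S3 naming rules (3-63 characters; lowercase letters, digits, hyphens, and dots; starting and ending with a letter or digit; not shaped like an IP address). It simplifies the full rules, so treat it as illustrative, not an official validator.

    # Sketch: rough check for DNS-compliant S3 bucket names.
    # This simplifies the full published rules (e.g. it ignores the
    # ban on adjacent dots), so it is illustrative only.
    import re

    NAME_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")  # 3-63 chars
    IP_RE = re.compile(r"^\d{1,3}(\.\d{1,3}){3}$")               # dotted quad

    def is_dns_compliant(bucket: str) -> bool:
        return bool(NAME_RE.match(bucket)) and not IP_RE.match(bucket)

    print(is_dns_compliant("my-data-bucket"))    # True
    print(is_dns_compliant("My_Legacy_Bucket"))  # False: uppercase/underscore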
Lambda is removing old runtimes once the version of the programming language stops being maintained (such as Python 2.7). This should be a known expectation for anyone who starts using the service, and there is always a new version you can migrate to. As the deadline nears, AWS sends you email reminders if you still have Lambda functions using the old runtime.
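If you don't want to wait for the reminder emails, you can scan for affected functions yourself. A minimal sketch with boto3, assuming credentials with lambda:ListFunctions permission; the set of runtimes to flag is just an example.

    # Sketch: list Lambda functions still on runtimes you consider deprecated.
    # Assumes boto3 and AWS credentials with lambda:ListFunctions permission.
    import boto3

    DEPRECATED = {"python2.7", "nodejs8.10"}  # example set, adjust as needed

    client = boto3.client("lambda")
    paginator = client.get_paginator("list_functions")
    for page in paginator.paginate():
        for fn in page["Functions"]:
            if fn.get("Runtime") in DEPRECATED:
                print(fn["FunctionName"], fn["Runtime"])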
Here is a GitHub repository where people try to track breaking changes: https://github.com/SummitRoute/aws_breaking_changes. You can see that the list is not that long.
I'm a billing administrator for several GCP projects. Some of them I can't access, so I can't tell if they're still required. They have very low usage, e.g. $0.25/month, so it's probably just storage costs. I'm pretty sure they were created by a sysadmin who used to work here but has left, and he doesn't seem to have given anyone else rights to view the projects.
Is there any way to get myself added as a project owner since we're paying for it?
Only a project owner can change ownership of a project, so since no one left in your organization has that role, you'll need to contact support, as described in the relevant section of the documentation.
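Support aside, as the billing administrator you can at least inventory which projects are attached to the billing account before deciding which ones to chase. A minimal sketch using the google-cloud-billing client library; the billing account ID is a placeholder, and this only lists projects, it won't grant you access to them.

    # Sketch: list the projects attached to a billing account.
    # Assumes the google-cloud-billing package and billing-viewer rights;
    # the billing account ID below is a placeholder.
    from google.cloud import billing_v1

    client = billing_v1.CloudBillingClient()
    parent = "billingAccounts/XXXXXX-XXXXXX-XXXXXX"  # placeholder ID

    for info in client.list_project_billing_info(name=parent):
        print(info.project_id, "billing_enabled:", info.billing_enabled)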
Our team has grown to more than 25, and I believe we have reached the shared-history limits, so Postman is automatically archiving collections. It is OK to archive some old ones, but it is also archiving our Smoke Test, which we run from time to time, and some other developer collections that new team members use as reference. Has anyone found a way to select which collections remain unarchived?
I've searched the net for some official comment from Postman and came across this: Archived Items in Free Teams Account Cannot be Deleted. It seems the Postman team has no plans to add a delete button to clear out the history.
I also tried another workaround I saw, wherein you download the archived data and re-share the specific collection you want shared. I was able to share two collections, but when I went to share the 3rd one, the first 2 got archived and I was left with only the 3rd one shared. This approach also generates multiple environment instances, making it not really ideal, since you have to clean up after each round of trial and error.
I just need to have at least 3 permanently shared collections.
Shared history and collections have separate limits (both 25 requests). Therefore, clearing out history won't affect your collections. Archived collections don't count towards this limit, so deleting them (as suggested in the GitHub thread) won't have any effect either.
Other than moving to a paid plan to avoid archiving collections, you can "unshare" the collections so that they don't get archived. To do so, you have to remove them from all workspaces except your personal workspace. If you go to "Share Collection", you'll be able to see all the workspaces the collection is in. If you still want to share the collections with other people, you can export them and hand the files over. The downside is that you'll have to do this manual export/import every time you make a change to a collection.
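One way to take some pain out of the manual step is to pull the collection JSON through the Postman API instead of the UI exporter. A minimal sketch; the API key and collection UID are placeholders, and teammates would still need to import the resulting file themselves.

    # Sketch: export a collection via the Postman API so the JSON can be
    # handed to teammates. The API key and collection UID are placeholders.
    import json
    import requests

    API_KEY = "PMAK-your-api-key"          # from your Postman account settings
    COLLECTION_UID = "your-collection-uid"

    resp = requests.get(
        f"https://api.getpostman.com/collections/{COLLECTION_UID}",
        headers={"X-Api-Key": API_KEY},
    )
    resp.raise_for_status()

    with open("smoke-test.postman_collection.json", "w") as f:
        json.dump(resp.json()["collection"], f, indent=2)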
I have no other choice but to adopt iCloud right now. In the near future I would like to build my own cloud service. Is there any problem if the app transfers all the data from iCloud to my own cloud?
Only the data related to my app of course.
After user's permission.
Is Apple positive about this?
If you mean, would Apple approve an app for the store that was going to transfer the user's iCloud data to some other online service, then as usual all we can do is try to gauge the odds.
None of Apple's guidelines even hint that apps may not use non-iCloud services.
Neither do they hint that there's any issue with moving data from one service to another, even if one of them is iCloud.
Apple does not look kindly on apps that transfer user data to online storage without the user's knowledge. Assuming you make it clear to users what you're doing, this is probably not an issue, but users should have the chance to opt out of your service.
Based on information available right now, what you suggest is probably OK so long as your app makes clear what's happening. It's unwise to try and predict Apple's app-approval actions too closely. They might change their policies tomorrow, or they might decide to reject your app for reasons that had not previously been stated. At the moment though, switching services like that seems likely to be accepted.