I've recently been added to a new GCP project which has literally tons and tons of pods, workloads, and databases.
I want to visualize all of it in a schema or model.
Is there any tool or plugin that I can use to model the project?
Probably the best mechanism is to use the Cloud Console and view the project's resources through the various pages built into it.
Google provides a great many APIs (services), each of which may expose multiple resource types and, as you've seen, there can be many instances of those resources.
I think anything that enumerates all of a project's resources could be somewhat overwhelming, whereas the Console provides structure.
Choose your project at, or append the query string project=... to, the following URL:
https://console.cloud.google.com
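That said, if you do want a flat enumeration of everything in the project, Cloud Asset Inventory can produce one. A minimal sketch, assuming the Cloud Asset API is enabled and PROJECT_ID stands in for your project's ID:

# List every resource instance Cloud Asset Inventory can see in the project
gcloud asset search-all-resources --scope=projects/PROJECT_ID --format="table(assetType, name)"

The output is one row per resource instance, which at least gives a sense of scale, even if it is not a visual model.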
I have a project created via the console. The project is on hold and might be revived in 6 months. I would like to download it to local storage, delete it, then upload it to the cloud if/when the project is taken off hold. I have not been able to get gsutil working on my machine. (Newbie to the SDK.)
I know how to delete the project ("Shut down"), and I have saved parts of it (e.g. the MySQL database, Cloud Functions, etc.), but I can't figure out how to save the project with its users, permissions, and settings intact.
Is there a way? (I am concerned about costs while it is not used, otherwise I would leave it.)
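(For the bucket data itself, I assume I'll need something like the following once gsutil works, with BUCKET_NAME as a placeholder for one of the project's buckets:

# Recursively copy the bucket's contents to a local folder
gsutil -m cp -r gs://BUCKET_NAME ./local-backup/

but as far as I can tell that only covers data, not the users, permissions, or settings.)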
From Data Studio, I am not able to list any BigQuery projects, not even public projects. I have access to BigQuery tables and projects, but once I log in through Data Studio and try to add a data source, it shows all blank, including custom queries, etc.
I also faced the same situation, and I hope this helps someone else with this confusion. I'm assuming this could be the solution to this question too. (Do let me know.)
When you try to add a BigQuery connection to Data Studio, you need to have a Google Cloud project, because BigQuery is a paid service: querying in BigQuery incurs a cost. However, a project with billing enabled is not a must just to load the datasets.
Follow these steps to get this issue fixed.
Check the email address you have used to connect to Data Studio. You can click on the avatar in the top right corner. (If this is wrong, simply switch to the correct email.)
Check whether the Google Cloud Console is enabled for the above-selected email.
Check whether you have a project in the Cloud Console. Make sure there is a project to be used when creating a BigQuery connection from Data Studio.
With this, your view should show the projects linked with your Google Cloud Console email address.
If the above doesn't work:
Refresh the page and see whether the projects appear.
If they still don't, create a support ticket or ask the Data Studio Community.
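Independently of Data Studio, you can confirm which projects the account can actually see. A quick check, assuming the Cloud SDK is installed and authenticated as the same email address:

# List every project visible to the currently authenticated account
gcloud projects list

If the project you expect is missing here, the problem is with the account or its permissions rather than with Data Studio itself.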
I made a Google Cloud project to test a few things out. The main one being DataPrep. Then I decided I'd do something else for a bit and tried to delete the project.
I opened up the Google Cloud Shell and typed:
gcloud projects delete <project_name>
and then typed Y for Yes, and now it is just hung there doing nothing at all. It doesn't really matter, I guess, but it is annoying having an old project hanging around that I can't get rid of.
I've tried with billing enabled and disabled, but neither seems to make a difference. I was using the shell from within the Console website, by the way.
Does anyone have any idea why it is taking so long to delete this project?
I think you should follow these steps.
To shut down a project using the Cloud Platform Console:
Open the Settings page in the Google Cloud Platform Console.
Click Select a project.
Select the project you wish to delete and click Open.
Click Delete Project.
Enter the Project ID and click Shut down.
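If you want to confirm from the shell that the shutdown was actually registered, a minimal check (PROJECT_ID stands in for your project's ID):

# Prints DELETE_REQUESTED once the project is pending deletion
gcloud projects describe PROJECT_ID --format="value(lifecycleState)"

Note that deleted projects sit in this pending state for a grace period of roughly 30 days before they are fully removed, so the project lingering in listings does not mean the delete failed.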
I am working on a SP2013 project for a customer, and I need to set up a working environment for development, testing and production. Let's assume for the sake of simplicity that the work consists only of various customizations (lists, libraries, apps, themes etc.) and no code.
My setup is as follows:
The production environment is on some servers on the customer site
The test environment is set up in Azure
The development environment is on a virtual machine on my PC
Now, let's assume everything is set up correctly on each environment, and I want to be able to support the following tasks:
I do customizations in my dev environment and want to deploy them to test for others to test, preferably with existing data
After testing and QA, I want to deploy from test to production. This must of course only affect customizations, not existing data
Every now and then I would like to take a snapshot of the production environment and move it to test, so that the deployment of a new feature from development can be tested as realistically as possible
I want to perform these tasks as smoothly and efficiently as possible, especially when deploying from dev to test which is done often. Deploying from test to production will not be done that often, and hence some more manual work will be tolerated.
I know of a few mechanisms that might be relevant:
Content deployment
Cross site publishing
Content database backup/restore
Save site as template, export wsp and import
(Last resort) Manually set up each customization by hand
Could some of you experienced SharePoint devs/admins make some recommendations as to which mechanism to use in which situation, when not to use it, etc.? Are there other methods that should be mentioned? Remember that the three environments reside in separate physical environments, which will probably make a fully automated solution difficult. Would it make it easier if I set up the test environment on the customer site (i.e. as part of the same farm)?
Another option, depending on your specific customisations, might be a third-party tool. There are a number of them out there. ShareGate is one I have personally been using for migration work, and it seems very simple and effective for moving content around quickly between environments. Attunity RepliWeb for SharePoint is another that might be worth looking at for the sort of development-specific release work that you require.
As for native options, I am still finding my way as well, but here are my suggestions:
Where possible I have used Visual Studio to create solution packages containing features to deploy pieces of functionality. A branding solution package for example might include several features that deploy your custom master pages, theme / look files, common JavaScript libraries and images.
Feature deployment makes it easy for you to deploy or remove functionality between environments and to reuse functionality between sites. Additionally you can add your Visual Studio solutions to a source control system such as VS Online or GitHub.
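As an illustration, getting a solution package into another environment is typically a two-cmdlet operation in the SharePoint Management Shell. A minimal sketch, assuming a hypothetical package branding.wsp and a web application at http://sitename:

# Add the package to the farm's solution store
Add-SPSolution -LiteralPath "C:\Deploy\branding.wsp"
# Deploy it to the target web application
Install-SPSolution -Identity "branding.wsp" -WebApplication http://sitename

The same .wsp can then be promoted from test to production unchanged, which is much of the attraction of this approach.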
For one-off sites I have created a dev site, configured it, then used the built-in SharePoint backup and restore to deploy it to prod. Subsequent changes have been created in dev and then manually applied to test and prod. Depending on the customisations, this has been quite time consuming. You might combine this with a tool such as ShareGate to automate the deployment of individual artifacts, such as a customized list, from one environment to another.
For moving content around I have been using a combination of ShareGate for things such as documents, alongside Boost Solutions Excel Import for handling list data. This lets me export large amounts of list data to Excel and easily re-import it into a new list, which might be a copy that I have added new functionality to in preparation for replacing the old prod list, or perhaps dev/test lists that I am populating before doing a full site backup to restore to production.
Good luck, and hopefully some of these suggestions are useful to you! I will be following this question as I am also interested to hear of better methods/habits for managing the SharePoint development cycle.
I finished setting up a development environment for a SharePoint 2013 production environment that I maintain. The last step was to move my production content to my development environment. I had to dig around a bit to find the PowerShell etc. Rather than go through that again next time, I decided to write a blog about it, so that I’d have all the steps in one place.
The first step is to back up the content database that you want to restore to development. To do this, open up SQL Server Management Studio, right-click on the database you want to back up, hover over Tasks, and select Back Up. You will be presented with the Back Up Database window. Make sure that your backup type is set to Full, give the backup a name or stick with the default, and note or change the destination.
You can skip these steps if you have scheduled backups running and are able to access the backup drive. In that case just go grab a copy of the most recent full backup and copy it to your development SQL Server.
The next step is to restore the database to development. To do this open up SQL Server Management Studio in your development environment, right-click on the Databases folder, and select Restore Database. When presented with the Restore Database window, click on the Device Radio Button and click the ellipsis next to the text box. This will bring up the Select backup devices window. From there click Add, locate your backup file and click OK, click OK again to be returned to the Restore Database Window, and finally from there click OK. Now your database has been restored, and you are ready to add it to SharePoint.
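If you would rather script these two steps than click through Management Studio, the SQL Server PowerShell module offers equivalents. A sketch, assuming hypothetical instance names PRODSQL and DEVSQL, a content database named WSS_Content, and a backup path reachable from each server:

# On production: take a full backup of the content database
Backup-SqlDatabase -ServerInstance "PRODSQL" -Database "WSS_Content" -BackupFile "C:\Backups\WSS_Content.bak"
# On development: restore it, overwriting any existing copy
Restore-SqlDatabase -ServerInstance "DEVSQL" -Database "WSS_Content" -BackupFile "C:\Backups\WSS_Content.bak" -ReplaceDatabase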
If you don't already have a web application with content in it that you don't care about, create a new Web Application…
https://sharepointv15.wordpress.com/2012/07/24/create-a-web-application-in-sharepoint-2013/
Don’t worry about creating a site collection.
Now go to Central Admin and click on Manage content databases under Application Management.
Make sure that the correct Web Application is selected. If it is not, click on the drop-down arrow next to the Web Application name, click Change Web Application, and select the correct Web Application in the window that you are presented with.
Next, click on the Content Database name.
On the Manage Content Database Settings screen, scroll down, check the Remove Content Database check box, click OK on the warning pop-up, and click OK at the bottom of the screen.
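If you prefer PowerShell over Central Admin for this step, there is an equivalent cmdlet; a sketch using the same placeholder database name as below:

# Detach the content database from its web application without deleting it from SQL Server
Dismount-SPContentDatabase "MyDatabase"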
Now you'll need to open up the SharePoint 2013 Management Shell as an administrator. To do this, click on your Start menu, click All Programs, click on the SharePoint 2013 folder, right-click the SharePoint 2013 Management Shell, and select Run as Administrator.
From here you will run the Mount-SPContentDatabase cmdlet:
Mount-SPContentDatabase "MyDatabase" -DatabaseServer "MyServer" -WebApplication http://sitename
Click below for details on this cmdlet…
http://technet.microsoft.com/en-us/library/ff607581.aspx
At this point you should be able to navigate to the web application URL and see the Site Collection that lives in the database you just mounted.
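You can also confirm from the same shell that the database attached where you expected; a quick check with the same placeholder URL:

# List the content databases now serving the web application
Get-SPContentDatabase -WebApplication http://sitename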
Note: This will work in SharePoint 2010 or SharePoint 2013. However, the database must be from the same version of SharePoint as the farm you are mounting it to. If it is from a lower version, SharePoint will automatically try to upgrade it, so keep that in mind.
Follow the link below.
https://sharepointv15.wordpress.com/2013/03/21/moving-content-between-environments/