VS2017 Data Lake tools cannot submit U-SQL script to Azure

Using Visual Studio 2017 with Data Lake Tools v2.3.1000.1, I am unable to submit a U-SQL job directly to Azure; I only have the option to submit locally.
This is the case even though I am connected to Azure through the "Server Explorer" tool window, from which I can access my U-SQL databases, view jobs previously submitted to Azure, and so on.
Using Visual Studio 2015, I have no such issue.
Am I forgetting a setting or a property somewhere, or is this perhaps a bug in Data Lake Tools for VS2017?

Do you still have the issue?
I sometimes see this happen if I open an existing solution with the script tabs already open before I have logged in. If I then log in, the drop-down menu of the already-open window is not refreshed.
Closing and reopening the script normally makes the option show up.
Another reason, which I think will be addressed soon, could be that you have filtered which subscriptions are exposed in Cloud Explorer. If you hid the subscription containing your ADLA account there, you will not see it in the pull-down.
In any case, please let us know if you still have the issue and neither of the two suggestions helps.

Related

Not able to list BigQuery projects from a Data Studio connection

From Data Studio, I am not able to list any BigQuery projects, not even public ones. I have access to BigQuery tables and projects, but once I log in through Data Studio and try to add a data source, everything shows up blank, including custom queries.
I also faced the same situation, and I hope this helps someone else with the same confusion. I'm assuming this could be the solution to this question too. (Do let me know.)
When you are trying to add a BigQuery connection to Data Studio, you need to have a Google Cloud project, because BigQuery is a paid service and querying in BigQuery incurs a cost. A project with billing enabled is not a must just to load the datasets, though.
Follow these steps to fix the issue:
Check the email address you have used to connect to Data Studio. You can click on the avatar in the top right corner. (If this is wrong, simply switch to the correct email.)
Check whether Google Cloud Console is enabled for the selected email.
Check whether you have a project in the Cloud Console. Make sure there is a project to be used when creating a BigQuery connection from Data Studio.
With this, your view should show the projects linked with your Google Cloud Console email address.
If the above doesn't work:
Refresh the page and check again.
If not, create a support ticket or ask in the Data Studio Community.
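If you have the Google Cloud SDK installed and authenticated as the same email, a quick sanity check outside Data Studio can confirm which projects that account actually sees (a sketch; it assumes the gcloud and bq command-line tools are on your path):

    # Show which account the Cloud SDK is currently using
    gcloud auth list
    # List the projects visible to that account
    gcloud projects list
    # List the projects BigQuery itself can access
    bq ls -p

If these come back empty, the problem lies with the account or project setup rather than with Data Studio.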

Is it possible to add ODBC service to existing CF 2016 install?

We are migrating some code from CF 10 to CF 2016 virtual machines that needs to connect to a couple of Access databases. I installed the necessary drivers and set up ODBC data sources in Windows, but have found that our CF 2016 VM was set up without the ODBC service, and I have not been able to find a clear way to add it.
When I go to Add/Remove Programs, the only option is to uninstall CF, and when I run a CF 2016 installer it will not let me go through the "Server configuration" process because an installation already exists. Is there a standard approach for adding sub-components to a CF server that were not chosen on first install?
Looks like I found my answer after digging around for a while: the migration wizard needs to be run again to add the ODBC service, though there can be various complications depending on permissions and other factors. I was not able to get it working in the brief time I spent, so I am just going to finish my transition to Python for our limited Access needs, but I do believe this to be the answer.
https://community.adobe.com/t5/ColdFusion/ColdFusion-11-ODBC-service/td-p/6207226
Here are the basic steps; see the link above for various troubleshooting info from Charlie Arehart and others.
Navigate to adminconfig.xml at C:\ColdFusion11\cfusion\lib\ (or the matching path for your version, e.g. C:\ColdFusion2016\cfusion\lib\) and open it with a text editor (say, Notepad).
Change the value of runmigrationwizard from "false" to "true".
Change the value of odbc from "false" to "true".
Save the file and restart the ColdFusion service.
After restarting the service, access the CF Administrator URL and you will get the migration wizard; follow the on-screen steps to continue.
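If you prefer to script the edit, a minimal PowerShell sketch of the same steps might look like this (it assumes adminconfig.xml contains runmigrationwizard and odbc elements as described above; the service name varies by version):

    # Flip the two migration flags in adminconfig.xml (element names assumed from the steps above)
    $path = 'C:\ColdFusion11\cfusion\lib\adminconfig.xml'
    [xml]$config = Get-Content -Path $path -Raw
    $config.SelectSingleNode('//runmigrationwizard').InnerText = 'true'
    $config.SelectSingleNode('//odbc').InnerText = 'true'
    $config.Save($path)
    # Restart the ColdFusion service; check Get-Service for the exact name on your box
    Restart-Service -DisplayName 'ColdFusion* Application Server'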
I came upon the answer by way of a thread about getting Access data sources set up in CF11 and CF2016, to which Charlie Arehart contributed and which linked over to info about the ODBC service setup.

Visual Studio TFS Alerts: How to send out an alert upon successful publish of a web application?

I have taken a look at alerts management for TFS 2012 after installing the Power Tools, and I can see four types of alert templates:
Work Item
Code Review
Check In
Build
I was wondering whether there is a supported way to register alerts under a "Publish" event type, manageable directly via the Alerts Explorer.
If not, I thought of some workarounds:
If this cannot be done and managed via the Alerts Explorer, can I customize an alert to be triggered on a Publish event via a web service? If so, does the TFS API support such customization?
I can also instead go with a continuous delivery approach and set an automated publish upon successful build of a solution, with an email alert on Build-Event Success (which would also mean that a solution has been published).
Which approach would be a supported way for setting "on publish"-event alerts for web solutions via TFS?
My suggestion would be not to use Publish in Visual Studio but instead to use a build to publish your solution (either triggered on check-in or manually). Then you can easily set up an alert on that build. Using a build instead of VS Publish is also considered better practice because it gives you more power and flexibility over the deployment process.
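If you go the build route and want to script the alert instead of clicking through the Alerts Explorer, TFS ships a command-line subscription tool in its Tools folder. A hedged sketch follows; the path and flags are from memory of the TFS 2012 BisSubscribe tool, so run it with /? to verify before relying on it:

    # Subscribe an email address to build-completion events for a collection
    & "C:\Program Files\Microsoft Team Foundation Server 11.0\Tools\BisSubscribe.exe" `
        /eventType BuildCompletionEvent `
        /address devteam@example.com `
        /deliveryType EmailHtml `
        /collection http://tfsserver:8080/tfs/DefaultCollection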

SharePoint 2013 Dev/Test/Production environment - Best practice for moving content

I am working on a SP2013 project for a customer, and I need to set up a working environment for development, testing and production. Let's assume for the sake of simplicity that the work consists only of various customizations (lists, libraries, apps, themes etc.) and no code.
My setup is as follows:
The production environment is on some servers on the customer site
The test environment is set up in Azure
The development environment is on a virtual machine on my PC
Now, let's assume everything is set up correctly on each environment, and I want to be able to support the following tasks:
I make customizations in my dev environment and want to deploy them to test for others to try, preferably with existing data in place.
After testing and QA, I want to deploy from test to production. This must, of course, only affect customizations, not existing data.
Every now and then I would like to take a snapshot of the production environment and move it to test, so that the deployment of a new feature from development can be tested as realistically as possible.
I want to perform these tasks as smoothly and efficiently as possible, especially when deploying from dev to test which is done often. Deploying from test to production will not be done that often, and hence some more manual work will be tolerated.
I know of a few mechanisms that might be relevant:
Content deployment
Cross site publishing
Content database backup/restore
Save site as template, export wsp and import
(Last resort) Manually set up each customization by hand
Could some of you experienced SharePoint devs/admins make recommendations as to which mechanism to use in which situation, when not to use it, etc.? Are there other methods that should be mentioned? Remember that the three environments reside in physically separate locations, which will probably make a fully automated solution difficult. Would it make things easier if I set up the test environment on the customer site (i.e. as part of the same farm)?
Another option, depending on your specific customisations, might be a third-party tool. There are a number of them out there. ShareGate is one I have personally been using for migration work, and it seems very simple and effective for moving content around quickly between environments. Attunity RepliWeb for SharePoint is another that might be worth looking at for the sort of development-specific release work that you require.
As for native options, I am still finding my way as well, but here are my suggestions:
Where possible I have used Visual Studio to create solution packages containing features to deploy pieces of functionality. A branding solution package for example might include several features that deploy your custom master pages, theme / look files, common JavaScript libraries and images.
Feature deployment makes it easy for you to deploy or remove functionality between environments and to reuse functionality between sites. Additionally you can add your Visual Studio solutions to a source control system such as VS Online or GitHub.
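As a concrete illustration, deploying such a package between environments can be scripted from the SharePoint Management Shell (a minimal sketch; the package name, web application URL, and feature name are placeholders):

    # Add the farm solution to the target farm and deploy it to a web application
    Add-SPSolution -LiteralPath 'C:\Deploy\Branding.wsp'
    Install-SPSolution -Identity 'Branding.wsp' -WebApplication 'http://test' -GACDeployment
    # Activate one of the package's features on a specific site
    Enable-SPFeature -Identity 'MyCompany.Branding.MasterPages' -Url 'http://test/sites/intranet'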
For one-off sites I have created a dev site, configured it, then used the built-in SharePoint backup and restore to deploy it to prod. Subsequent changes have been created in dev and then manually applied to test and prod; depending on the customisations, this has been quite time-consuming. You might combine this with a tool such as ShareGate to automate the deployment of individual artifacts, such as a customized list, from one environment to another.
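The built-in backup and restore mentioned above can be driven per site collection from PowerShell as well (a sketch; URLs and paths are placeholders):

    # On the dev farm: back up the site collection
    Backup-SPSite -Identity 'http://dev/sites/project' -Path 'C:\Backups\project.bak'
    # On the target farm: restore it (-Force overwrites an existing site collection at that URL)
    Restore-SPSite -Identity 'http://prod/sites/project' -Path 'C:\Backups\project.bak' -Force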
For moving content around I have been using a combination of ShareGate for things such as documents, alongside Boost Solutions Excel Import for handling list data. This allows me to export large amounts of list data to Excel and easily re-import it into a new list, which might be a copy that I have added new functionality to in preparation for replacing the old prod list, or perhaps dev/test lists that I am populating before doing a full site backup to restore to production.
Good luck, and hopefully some of these suggestions are useful to you! I will be following this question, as I am also interested to hear of better methods/habits for managing the SharePoint development cycle.
I finished setting up a development environment for a SharePoint 2013 production environment that I maintain. The last step was to move my production content to my development environment. I had to dig around a bit to find the PowerShell commands and related steps. Rather than go through that again next time, I decided to write a blog post about it so that I'd have all the steps in one place.
The first step is to back up the content database that you want to restore to development. To do this, open SQL Server Management Studio, right-click the database you want to back up, hover over Tasks, and select Back Up. You will be presented with the Back Up Database window. Make sure that the backup type is set to Full, give the backup a name or stick with the default, and note or change the destination.
You can skip these steps if you have scheduled backups running and are able to access the backup drive. In that case, just grab a copy of the most recent full backup and copy it to your development SQL Server.
The next step is to restore the database to development. To do this, open SQL Server Management Studio in your development environment, right-click the Databases folder, and select Restore Database. When presented with the Restore Database window, click the Device radio button and click the ellipsis next to the text box. This will bring up the Select Backup Devices window. From there click Add, locate your backup file, and click OK; click OK again to return to the Restore Database window, and finally click OK there as well. Now your database has been restored, and you are ready to add it to SharePoint.
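If you would rather script these two steps than click through SSMS, the equivalent T-SQL can be run from PowerShell (a sketch; the instance, database, and file names are placeholders, and Invoke-Sqlcmd requires the SQL Server PowerShell module):

    # Back up the content database on the production SQL Server
    Invoke-Sqlcmd -ServerInstance 'PRODSQL' -Query "BACKUP DATABASE [WSS_Content] TO DISK = N'D:\Backups\WSS_Content.bak' WITH INIT"
    # Copy the .bak file to the development SQL Server, then restore it there
    Invoke-Sqlcmd -ServerInstance 'DEVSQL' -Query "RESTORE DATABASE [WSS_Content] FROM DISK = N'D:\Backups\WSS_Content.bak' WITH RECOVERY"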
If you don’t already have a web application with content in it that you don’t care about, create a new Web Application…
https://sharepointv15.wordpress.com/2012/07/24/create-a-web-application-in-sharepoint-2013/
Don’t worry about creating a site collection.
Now go to Central Admin and click on Manage content databases under Application Management.
Make sure that the correct Web Application is selected. If it is not, click the drop-down arrow next to the Web Application name, click Change Web Application, and select the correct one in the window you are presented with.
Next, click on the Content Database name.
On the Manage Content Database Settings screen, scroll down, click the Remove Content Database check box, click OK on the warning pop-up, and click OK at the bottom of the screen.
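The same removal can be done from PowerShell if you prefer (a sketch; the database name is a placeholder):

    # Detach the content database from SharePoint without touching it in SQL Server
    Dismount-SPContentDatabase 'WSS_Content'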
Now you’ll need to open the SharePoint 2013 Management Shell as an administrator. To do this, click on your Start menu, click All Programs, open the SharePoint 2013 folder, right-click the SharePoint 2013 Management Shell, and select Run as Administrator.
From here you will run the Mount-SPContentDatabase cmdlet:
    Mount-SPContentDatabase "MyDatabase" -DatabaseServer "MyServer" -WebApplication http://sitename
Click below for details on this cmdlet…
http://technet.microsoft.com/en-us/library/ff607581.aspx
At this point you should be able to navigate to the web application URL and see the Site Collection that lives in the database you just mounted.
Note: This will work in SharePoint 2010 or SharePoint 2013. However, the database must be from the same version of SharePoint as the farm you are trying to mount it to. If it is from a lower version, SharePoint will automatically try to upgrade it, so keep that in mind.
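Before mounting, you can also dry-run the attach to surface version mismatches or missing dependencies (a sketch using the same placeholder names as above):

    # Report, without attaching, any issues the farm would have with this database
    Test-SPContentDatabase -Name "MyDatabase" -WebApplication http://sitename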
For the full set of steps, follow the link below.
https://sharepointv15.wordpress.com/2013/03/21/moving-content-between-environments/

Create a SharePoint workflow programmatically

I am working on a copy of a SharePoint 2007 site for a client.
I would like to be able to automate as much of the update process as I can with minimal disruption to the client's system when the updates are ready for production.
To that end, I was wondering if anyone knows how to automate recreating a SharePoint workflow (created using SPD 2007) on another SharePoint server/site.
Perhaps I haven't searched enough yet, but I have not discovered whether there is a way to do this with web services, which I believe would be my preference.
I do not believe I have the ability to use STSadm for this, as the SharePoint site is hosted separately.
I think I can export the workflows in a personal web package. I'll admit I haven't experimented with this on workflows yet, but my experience with other exports, such as lists, is that GUIDs seem to get messed up between sites. Even if this is not an issue, I'm not sure there is a way of automating the import process (without STSadm).
I'm hoping not to have to work through a long list of manual procedures (that could accidentally get missed) when implementing these changes on the target production site.
My preference is to be able to create some sort of update batch or application that will make the changes quickly and that I can test before implementing on the production system.
This entails quite a few things, but for now, I'd like to focus on getting workflows into the target system.
Any suggestions on where to get started would be welcome.
SharePoint Designer workflows are not portable between sites (Reference).[1]
For your situation, I would recommend taking the Visual Studio workflow route. Take a look at this tutorial: How to Create Custom SharePoint Workflows in Visual Studio 2008. The key question for you is how you will associate the workflow with your lists.
The other option is to create a custom workflow activity (2007 has fewer options than 2010). You will still have to create the workflow using SharePoint Designer and add your custom activity to it in each site.
[1] Yes, there is the "hack way" of trying to do it by copying the XML and changing the GUIDs... but it is error-prone and difficult.
SharePoint 2010 gives more flexibility for workflows, and thus Kit Menke's first statement isn't true for readers using SP2010. (I see that this is tagged sharepoint2007, but I'm making this clear for readers on SP2010.)
However, if you publish a workflow template to a SharePoint site collection, you can download that template as a WSP file and then deploy it to other site collections.
Read more about Workflow deployment process (SharePoint Foundation 2010)
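For SP2010 readers, deploying such a downloaded workflow-template WSP to another site collection can be scripted too (a minimal sketch; the file name and site URL are placeholders):

    # Upload the WSP to the target site collection's solution gallery and activate it
    Add-SPUserSolution -LiteralPath 'C:\Deploy\MyWorkflow.wsp' -Site 'http://target/sites/team'
    Install-SPUserSolution -Identity 'MyWorkflow.wsp' -Site 'http://target/sites/team'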