Run a console application (which internally invokes PowerShell scripts) as an Azure WebJob - azure-webjobs

I have a console application that, among other tasks, invokes a PowerShell script. I would like to run this from a WebJob on a schedule.
a) I was able to publish my console application as a WebJob and it ran fine. However, I had to choose the on-demand option and have to trigger it manually.
b) I packaged the output of my project in a zip and uploaded it to a new WebJob (which I was able to schedule!). This time the WebJob ran but failed to load and run the PowerShell script. It fails with the following error (I am not sure how I can control execution policies):
System.Management.Automation.PSSecurityException: File D:\local\Temp\jobs\triggered\ScheduledTenantExpiryMonitor\i2eejhup.fkj\WebJobzip\GetExpirationScript.ps1 cannot be loaded because running scripts is disabled on this system. For more information, see about_Execution_Policies at http://go.microsoft.com/fwlink/?LinkID=135170. ---> System.UnauthorizedAccessException:
Can someone help me figure this out? The end goal is to run my job successfully in a scheduled fashion.

To invoke the PowerShell file directly, you could follow the suggestion from @Thomas:
Running Powershell Web Jobs on Azure websites
If you want to invoke PowerShell commands from a console application, you can use the PowerShell .NET SDK; it can run PowerShell commands from a .NET console application in a WebJob. Here are the detailed steps.
Install the System.Management.Automation assembly from NuGet.
Run PowerShell scripts using the following method.
// Requires the System.Management.Automation package; the method uses
// System.Collections.ObjectModel, System.Management.Automation,
// System.Management.Automation.Runspaces and System.Text.
private static string RunScript(string scripts)
{
    // create a PowerShell runspace
    Runspace runspace = RunspaceFactory.CreateRunspace();

    // open it
    runspace.Open();

    // create a pipeline and feed it the script text
    Pipeline pipeline = runspace.CreatePipeline();
    pipeline.Commands.AddScript(scripts);
    pipeline.Commands.Add("Out-String");

    // execute the script
    Collection<PSObject> results = pipeline.Invoke();

    // close the runspace
    runspace.Close();

    // convert the script result into a single string
    StringBuilder stringBuilder = new StringBuilder();
    foreach (PSObject obj in results)
    {
        stringBuilder.AppendLine(obj.ToString());
    }
    return stringBuilder.ToString();
}
Test the method above using the following code. The command.ps1 file contains a Get-Process command.
string fileName = "command.ps1";
string scriptText = File.ReadAllText(AppDomain.CurrentDomain.BaseDirectory + fileName);
Console.Write(RunScript(scriptText));
Here is the output of the Get-Process command in the WebJob.

Related

How to Upload PBIX file stored in Azure storage container to Power BI Service

My requirement is to upload a PBIX file stored in an Azure storage container to the Power BI Service without downloading it to a local drive, as I have to use the PowerShell script in an Automation runbook.
Normally we can upload a PBIX file by giving a local path like below:
$pbixFilePath = "C:\PBIXFileLocation\Test.pbix"
$import = New-PowerBIReport -Path $pbixFilePath -Workspace $workspace -ConflictAction CreateOrOverwrite
$import | Select-Object *
But which path do I have to use if the PBIX file is stored in an Azure storage container, and how should the PowerShell script be written? Is it possible?
I tried to list the blobs in the container with the Get-AzStorageBlob cmdlet and passed the result as the path in the above script, and ended up with an error.
If possible, please help me with a sample PowerShell script to achieve the above requirement.
Thanks in advance!
The issue can be resolved by following my similar post on the Azure Q&A platform.
AnuragSingh-MSFT is a gem; the explanation below resolved the issue.
A basic understanding of Azure Automation runbook execution should help clarify this doubt. When runbooks are designed to authenticate and run against resources in Azure, they run in an Azure sandbox. Azure Automation assigns a worker to run each job during runbook execution in the sandbox. Please see this link for more details: Runbook execution environment. These sandboxes are isolated environments with access to only some locations/paths/directories.
The following section should help answer the question - ... which path I have to use if the PBIX file is stored in Azure storage container and how the PowerShell script can be created?
The script snippet provided by Manu above would download the blob content into the same directory inside the sandbox from which the script is running. You can access this path inside the script using "." (the current directory); for example, if the blob you are downloading is named testBlob, it will be available at .\testBlob.
Therefore, the pbixFilePath can be initialized as $pbixFilePath = ".\Test.pbix"
Another option is to use $env:temp, as mentioned in the question. It is one of the environment variables available on a local machine (your workstation), where it generally resolves to C:\Users\<username>\AppData\Local\Temp.
In the Azure Automation sandbox environment, this variable resolves to C:\Users\Client\Temp.
Therefore, you could download the blob content using the following line:
Get-AzStorageBlobContent -Blob $blob -Container $ContainerName -Context $Ctx -Destination $env:temp #Destination parameter sets the target folder. By default it is local directory (.)
In this case, you would initialize pbixFilePath as $pbixFilePath = $env:temp+"\Test.pbix"
Either case is fine as long as the Automation limits are not exceeded.
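Putting the pieces together, a minimal sketch of what the runbook could look like end to end is below; the storage account, container, blob and workspace names are placeholders, and it assumes the runbook authenticates to Azure with a managed identity and signs in to Power BI separately.
Connect-AzAccount -Identity                      # Azure authentication (managed identity assumed)
Connect-PowerBIServiceAccount                    # Power BI sign-in (service principal or stored credential in practice)
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -UseConnectedAccount
# download the blob into the sandbox temp folder
Get-AzStorageBlobContent -Blob "Test.pbix" -Container "pbix-container" -Context $ctx -Destination $env:temp -Force
# import the downloaded file into the target workspace
$workspace = Get-PowerBIWorkspace -Name "MyWorkspace"
$pbixFilePath = Join-Path $env:temp "Test.pbix"
$import = New-PowerBIReport -Path $pbixFilePath -Workspace $workspace -ConflictAction CreateOrOverwrite
$import | Select-Object *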

Exception in thread "main" com.google.cloud.bigquery.dms.common.exceptions.AgentException: Unable to start tbuild command

I am trying to migrate a table from Teradata to BigQuery using GCP's Data Transfer functionality. I have followed the steps suggested at https://cloud.google.com/bigquery-transfer/docs/teradata-migration.
A detailed description of the steps is below:
The APIs suggested in the link above were enabled.
A pre-existing GCS bucket, BigQuery dataset and GCP service account were used in this process.
Google Cloud SDK setup was completed on the device.
Google Cloud service account key was set to an environment variable called GOOGLE_APPLICATION_CREDENTIALS.
BigQuery Data Transfer was set up.
Initialized the migration agent
On the command line, a command to run the jar file was issued, with some particular flags e.g.
java -cp C:\migration\tdgssconfig.jar;C:\migration\terajdbc4.jar;C:\migration\mirroring-agent.jar com.google.cloud.bigquery.dms.Agent --initialize
When prompted, the required parameters were entered.
When prompted for the BigQuery Data Transfer Service resource name, the value was entered using the data transfer config from GCP.
After entering all the requested parameters, the migration agent creates a configuration file and puts it into the local path provided in the parameters.
Run the migration agent
The following command was executed, using the classpath to the JDBC drivers and the path to the configuration file created in the previous initialization step,
e.g.
java -cp C:\migration\tdgssconfig.jar;C:\migration\terajdbc4.jar;C:\migration\mirroring-agent.jar com.google.cloud.bigquery.dms.Agent --configuration-file=config.json
At this point, an error was encountered which said "Unable to start tbuild command".
The following steps were taken to try to resolve this error, using the steps given here:
Teradata Parallel Transporter was installed using Teradata Tools and Utilities.
No bin file was found for the installation.
Below is the error message:
Exception in thread "main" com.google.cloud.bigquery.dms.common.exceptions.AgentException: Unable to start tbuild command
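One way to confirm whether the tbuild executable is visible to the shell that launches the agent is shown below; the Teradata install path is only a guess at a default Tools and Utilities location.
# is tbuild on PATH for this shell?
Get-Command tbuild -ErrorAction SilentlyContinue
# if not, look for tbuild.exe under the Teradata client install and add its folder to PATH
Get-ChildItem "C:\Program Files\Teradata\client" -Recurse -Filter tbuild.exe -ErrorAction SilentlyContinue |
    Select-Object -ExpandProperty FullName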

AWS .Net API - The provided token has expired

I am facing this weird scenario. I generate my AWS AccessKeyId, SecretAccessKey and SessionToken by running the assume-role-with-saml command. After copying these values to the .aws\credentials file, I run "aws s3 ls" and can see all the S3 buckets. Similarly, I can run any AWS command to view objects and it works perfectly fine.
However, when I write a .NET Core application to list objects, it doesn't work on my computer. The same .NET application works fine on my colleagues' computers. We all have access to AWS through the same role. There are no users in the IAM console.
Here is the sample code, though I doubt anything is wrong with it, because it works fine on other users' computers.
var _ssmClient = new AmazonSimpleSystemsManagementClient();
var r = _ssmClient.GetParameterAsync(new Amazon.SimpleSystemsManagement.Model.GetParameterRequest
{
    Name = "/KEY1/KEY2",
    WithDecryption = true
}).ConfigureAwait(false).GetAwaiter().GetResult();
Any idea why running commands through the CLI works but API calls don't? Don't they both look at the same %USERPROFILE%\.aws\credentials file?
I found it. Posting here since it can be useful for someone having the same issue.
Go to this folder: %USERPROFILE%\AppData\Local\AWSToolkit
Take a backup of all files and folders there, then delete everything from that location (see the sketch below).
This solution applies only if you can run commands like "aws s3 ls" and get results successfully, but you get the error "The provided token has expired" when running the same operations through the .NET API libraries.
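The backup-and-delete steps above can be scripted roughly like this in PowerShell (the backup destination is arbitrary):
$toolkitPath = Join-Path $env:LOCALAPPDATA "AWSToolkit"     # %USERPROFILE%\AppData\Local\AWSToolkit
$backupPath  = Join-Path $env:USERPROFILE "AWSToolkit-backup"
Copy-Item $toolkitPath $backupPath -Recurse -Force          # back up the current contents
Remove-Item (Join-Path $toolkitPath "*") -Recurse -Force    # then clear the folder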

How can I provision IIS on EC2 Windows with a resource?

I have just started working on a project that is hosted on an AWS EC2 Windows instance with IIS. I want to move this setup to a more reliable place, and one of the first things I wanted to do was to move away from snowflake servers that are set up and configured by hand.
So I started looking at Terraform from HashiCorp. My thought was that I could define the entire setup, including the network etc., in Terraform and that way make sure it was configured correctly.
I thought I would start by defining a server: a simple Windows Server instance with IIS installed. But this is where I ran into my first problem. I thought I could configure IIS from Terraform; I guess you can't. So my next thought was to combine Terraform with PowerShell Desired State Configuration (DSC).
I can set up an IIS server on a box using DSC, but I am stuck on invoking DSC from Terraform. I can provision a vanilla server easily. I have tried looking for a good blog post on how to use DSC in combination with Terraform, but I can't find one that explains how to do it.
Can anyone point me towards a good place to read up on this? Or alternatively if the reason I can't find this is that it is just bad practice and I should do it in another way, then please educate me.
Thanks
How can I provision IIS on EC2 Windows with a resource?
You can run arbitrary PowerShell scripts on startup as follows:
resource "aws_instance" "windows_2016_server" {
//...
user_data = <<-EOF
<powershell>
$file = $env:SystemRoot + "\Temp\${var.some_variable}" + (Get-Date).ToString("MM-dd-yy-hh-mm")
New-Item $file -ItemType file
</powershell>
EOF
//...
}
You'll need a variable like this defined to use that (I'm providing a more complex example so there's a more useful starting point)
variable "some_variable" {
type = string
default = "UserDataTestFile"
}
Instead of creating a timestamp file like the example above, you can invoke DSC to set up IIS as you normally would interactively from PowerShell on a server.
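For instance, a minimal DSC sketch that installs the IIS role could be placed inside the user_data <powershell> block; the configuration name and output path below are arbitrary, and it relies only on the built-in WindowsFeature resource from PSDesiredStateConfiguration.
<powershell>
# declare the IIS role, compile the configuration to a MOF, then apply it locally
Configuration InstallIIS {
    Import-DscResource -ModuleName PSDesiredStateConfiguration
    Node "localhost" {
        WindowsFeature WebServer {
            Name   = "Web-Server"
            Ensure = "Present"
        }
    }
}
InstallIIS -OutputPath "C:\DSC\InstallIIS"
Start-DscConfiguration -Path "C:\DSC\InstallIIS" -Wait -Verbose -Force
</powershell>
Anything more elaborate (sites, app pools, bindings) follows the same pattern with additional DSC resources.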
You can read more about user_data on Windows here:
https://docs.aws.amazon.com/AWSEC2/latest/WindowsGuide/ec2-windows-user-data.html
user_data will include your PowerShell directly.
You can use templatefile("${path.module}/user-data.ps1", {some_variable = var.some_variable}) instead of an inline script as above.
Have user-data.ps1 in the same directory as the TF file that references it:
<powershell>
$file = $env:SystemRoot + "\Temp\${some_variable}" + (Get-Date).ToString("MM-dd-yy-hh-mm")
New-Item $file -ItemType file
</powershell>
You still need the <powershell></powershell> tags around your script source code. That's a requirement of how Windows on EC2 expects PowerShell user-data scripts.
And then update your TF file as follows:
resource "aws_instance" "windows_2016_server" {
//...
user_data = templatefile("${module.path}/user-data.ps1, {
some_variable = var.some_variable
})
//...
}
Note that the file read by templatefile uses variables like some_variable and NOT var.some_variable.
Read more about templatefile here:
https://www.terraform.io/docs/configuration/functions/templatefile.html

Folder Paths in Pivotal Cloud Foundry

My application needs to move files from an input folder to an error folder after validation. I am using Files.move in Java and it works on my machine, but I want to achieve the same in Pivotal Cloud Foundry, since the same approach is not working in the cloud. Do I need to tweak the code below, or are there other alternatives? Thanks in advance!
String inputFolder = "\\home\\**\\**\\***\\input_working";
String errorFolder = "\\home\\**\\**\\***\\input_errors";
for (String inputTextFile : errorfiles) {
    String msg = this.getClass().getSimpleName() + "- Input file has errors, Moving it to Error Directory ..."
            + inputFolder + inputTextFile + " To " + errorFolder + inputTextFile;
    LOGGER.info(REPORT_MARKER, LOG_HVML_TEMPLATE_TWO, msg, 2);
    Files.move(Paths.get(inputFolder + "\\" + inputTextFile), Paths.get(errorFolder + "\\" + inputTextFile));
}
On a Cloud Foundry instance you can use NFS Volume Services. You can create a service and bind it to the application; then you can read and write to the file system path.
You can follow this documentation for the exact steps