I am following this guide to initialise a GCP Pub/Sub publisher.
The code is as follows:
@PubSubClient
public interface PubSubService {

    @Topic("topic-a")
    void send(final A a);

    @Topic("topic-b")
    void send(final B b);
}
I want to set the topic value based on the environment, as I will have a different topic for QA/DEV (say topic-a-qa and topic-b-qa).
Is there any way for me to set this String value in the @Topic annotation from environment properties?
I do not have the option of having a different project under the GCP account, and creating a different class for QA that overrides this one is not very graceful when maintaining environments.
You can use placeholders inside Micronaut's annotations.
@PubSubClient
public interface PubSubService {

    @Topic("${topic.a.name:topic-a}")
    void send(final A a);

    @Topic("${topic.b.name:topic-b}")
    void send(final B b);
}
The expression ${topic.a.name:topic-a} instructs Micronaut to look for the value in the configuration under the topic.a.name key, and to fall back to the value topic-a if the configuration key is not found. You can then configure different topic names using, e.g., an application-qa.yml configuration file:
src/main/resources/application-qa.yml
topic:
  a:
    name: topic-a-qa
  b:
    name: topic-b-qa
Lastly, just make sure that when you run the application in the QA environment you activate the corresponding environment, e.g.:
$ java -Dmicronaut.environments=qa -jar myapp.jar
Related
I am using context to pass values to CDK. Is there currently a way to define a project context file per deployment environment (dev, test), so that as the number of values I have to pass grows, they will be easier to manage than passing the values on the command line:
cdk synth --context bucketName1=my-dev-bucket1 --context bucketName2=my-dev-bucket2 MyStack
It would be possible to use one cdk.json context file and only pass the environment as a context value on the command line, and depending on its value select the correct values:
{
  ...
  "context": {
    "devBucketName1": "my-dev-bucket1",
    "devBucketName2": "my-dev-bucket2",
    "testBucketName1": "my-test-bucket1",
    "testBucketName2": "my-test-bucket2"
  }
}
But preferably, I would like to split it into separate files, e.g. cdk.dev.json and cdk.test.json, which would contain their corresponding values, and use the correct one depending on the environment.
According to the documentation, CDK will look for context in one of several places. However, there's no mention of defining multiple/additional files.
The best solution I've been able to come up with is to make use of JSON to separate context out per environment:
"context": {
"dev": {
"bucketName": "my-dev-bucket"
}
"prod": {
"bucketName": "my-prod-bucket"
}
}
This allows you to access the different values programmatically depending on which environment CDK is deploying to.
let myEnv = "dev"; // This could be passed in as a property of the class instead and accessed via props.myEnv
const myBucket = new s3.Bucket(this, "MyBucket", {
    bucketName: app.node.tryGetContext(myEnv).bucketName
});
You can also do so programmatically in your code.
For instance, I have a context variable of deploy_tag: cdk deploy Stack\* -c deploy_tag=PROD
Then in my code, I retrieve that deploy_tag variable and make the decisions there, such as (using Python, but the idea is the same):
bucket_name = BUCKET_NAME_PROD if deploy_tag == 'PROD' else BUCKET_NAME_DEV
This can give you a lot more control, and if you set up a constants file in your code you can keep that up to date with far less in your cdk.json, which can become very cluttered with larger stacks and multiple environments. If you go this route you can have your Prod and Dev constants files, and your context variable can tell your CDK app which file to load for a given deployment.
I also tend to create a class object with all my deployment properties either assigned or derived, and pass that object into each stack, retrieving what I need out of it there.
I wanted to know the best approach for saving WCF endpoint information in the config file for different environments (DEV, TEST, PRE-PROD, PROD).
I am familiar with one way of doing this: maintain different config files (one per environment) and deploy them accordingly.
Can someone please suggest the best way to do this?
You can configure the endpoint at runtime with endpoint behaviors. In such a behavior you can, for example, read the machine name and, depending on it, set the endpoint address for your endpoint before starting the service.
Here is a link: https://msdn.microsoft.com/en-us/library/vstudio/ms730137%28v=vs.100%29.aspx
EDIT:
So you write:
class CustomEndpointBehavior : IEndpointBehavior
{
    public void Validate(ServiceEndpoint endpoint)
    {
        // Get the address here and rewrite it depending on the machine name.
        // Remember to set the new address back on the endpoint!
    }

    public void AddBindingParameters(ServiceEndpoint endpoint, BindingParameterCollection bindingParameters)
    {
    }

    public void ApplyDispatchBehavior(ServiceEndpoint endpoint, EndpointDispatcher endpointDispatcher)
    {
    }

    public void ApplyClientBehavior(ServiceEndpoint endpoint, ClientRuntime clientRuntime)
    {
    }
}
And in the class where you start the service, you need to add the CustomEndpointBehavior to the endpoint(s) of the serviceHost, like:
serviceHost.Description.Endpoints[0].Behaviors.Add(new CustomEndpointBehavior()); // add it to each endpoint you want to adjust
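For illustration, a minimal sketch of what the Validate body could look like, assuming QA machines can be recognized by a name prefix (the prefix and host names below are made up):
public void Validate(ServiceEndpoint endpoint)
{
    // Hypothetical convention: machines named QA-* talk to the QA host, everything else to PROD.
    string targetHost = Environment.MachineName.StartsWith("QA-")
        ? "qa-services.internal"
        : "prod-services.internal";

    // Rebuild the endpoint address with the environment-specific host and set it back on the endpoint.
    var builder = new UriBuilder(endpoint.Address.Uri) { Host = targetHost };
    endpoint.Address = new EndpointAddress(builder.Uri);
}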
I think this is the link. Create multiple configurations and then use a pre-build event to copy the right one: http://www.hanselman.com/blog/ManagingMultipleConfigurationFileEnvironmentsWithPreBuildEvents.aspx
If you have Visual Studio 2010 or above, it will be able to merge (transform) the config for you: https://msdn.microsoft.com/en-us/library/dd465326(v=vs.110).aspx
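As a rough sketch of the pre-build-event approach (the file names are hypothetical; use whatever per-configuration config files you keep in the project), the build event would be something like:
copy "$(ProjectDir)App.$(ConfigurationName).config" "$(ProjectDir)App.config"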
With the 0.3.0 Beta release of the Azure WebJobs SDK, the following was announced:
http://azure.microsoft.com/blog/2014/06/18/announcing-the-0-3-0-beta-preview-of-microsoft-azure-webjobs-sdk/
Improved function discovery
We added an ITypeLocator and INameResolver to enable customizing how the WebJobs SDK looks for functions. This enables scenarios such as the following:
You can define functions where the QueueName is not explicit. You can read Queue names from a config source and specify this value at runtime.
Restrict function discovery to a particular class or assembly.
Dynamic functions at indexing time: you can define the function signature at runtime.
But there's no sample code on how to do it.
Does anyone know how to define the queue name at runtime (e.g. from app.config)?
If you take advantage of the new INameResolver, you can make your own implementation of the interface and plug it into the JobHostConfiguration. Take a look at this blog post where I made a small POC on the topic.
To use an external runtime service to define the name of the queue:
public class QueueNameResolver : INameResolver
{
    public string Resolve(string name)
    {
        // Define the "queuename" property in appSettings,
        // or pull the value from some other service of your design.
        return CloudConfigurationManager.GetSetting("queuename");
    }
}
In the WebJob Code, Program.cs:
public void init()
{
    // Retrieve storage account from connection string.
    string azureJobStorageConnectionString = ConfigurationManager.ConnectionStrings["AzureWebJobsStorage"].ConnectionString;

    var config = new JobHostConfiguration(azureJobStorageConnectionString)
    {
        NameResolver = new QueueNameResolver()
    };

    host = new JobHost(config);
    host.RunAndBlock();
}
This follows the approach described in the Azure documentation.
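With the resolver registered, the queue name in a trigger attribute can be written as a %token% that the resolver turns into the configured value. A small sketch (the function name and token are made up):
public static void ProcessMessage([QueueTrigger("%queuename%")] string message)
{
    Console.WriteLine(message);
}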
I am developing an Azure WebJobs executable that I would like to use with multiple Azure websites. Each web site would need its own Azure Storage queue.
The problem I see is that the ProcessQueueMessage requires the queue name to be defined statically as an attribute of the first parameter inputText. I would rather have the queue name be a configuration property of the running Azure Website instance, and have the job executable read that at runtime when it starts up.
Is there any way to do this?
This can now be done. Simply create an INameResolver to allow you to resolve any string surrounded in % (percent) signs. For example, if this is your function with a queue name specified:
public static void WriteLog([QueueTrigger("%logqueue%")] string logMessage)
{
    Console.WriteLine(logMessage);
}
Notice how there are % (percent) signs around the string logqueue. This means the job system will try to resolve the name using an INameResolver which you can create and then register with your job.
Here is an example of a resolver that will just take the string specified in the percent signs and look it up in your AppSettings in the config file:
public class QueueNameResolver : INameResolver
{
    public string Resolve(string name)
    {
        return ConfigurationManager.AppSettings[name];
    }
}
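For the %logqueue% example above, the config file would then carry an entry like this (the key and queue name are just placeholders):
<appSettings>
  <add key="logqueue" value="my-log-queue-dev" />
</appSettings>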
And then in your Program.cs file, you just need to wire this up:
var host = new JobHost(new JobHostConfiguration
{
    NameResolver = new QueueNameResolver()
});
host.RunAndBlock();
This is probably an old question, but in case anyone else stumbles across this post. This is now supported by passing a JobHostConfiguration object into the JobHost constructor.
http://azure.microsoft.com/en-gb/documentation/articles/websites-dotnet-webjobs-sdk-storage-queues-how-to/#config
A slightly better implementation of the name resolver avoids fetching from configuration every time. It uses a Dictionary to cache the config values once retrieved.
using Microsoft.Azure.WebJobs;
using System.Collections.Generic;
using System.Configuration;

public class QueueNameResolver : INameResolver
{
    private static Dictionary<string, string> keys = new Dictionary<string, string>();

    public string Resolve(string name)
    {
        if (!keys.ContainsKey(name))
        {
            keys.Add(name, ConfigurationManager.AppSettings[name]);
        }
        return keys[name];
    }
}
Unfortunately, that is not possible. You can use the IBinder interface to bind dynamically to a queue but you will not have the triggering mechanism for it.
Basically, the input queue name has to be hardcoded if you want triggers. For output, you can use the previously mentioned interface.
Here is a sample for IBinder. The sample binds a blob dynamically but you can do something very similar for queues.
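The linked sample isn't reproduced here, but a minimal sketch of the IBinder pattern might look like the following (names and paths are made up; it mirrors the blob case the sample covers, and a QueueAttribute could be bound the same way for queue output):
using System;
using System.IO;
using Microsoft.Azure.WebJobs;

public class Functions
{
    // The input queue name still has to be known at indexing time,
    // but the output target can be chosen at runtime via IBinder.
    public static void CopyMessageToBlob(
        [QueueTrigger("input-queue")] string message, IBinder binder)
    {
        // Blob path computed at runtime; the "output" container name is hypothetical.
        string blobPath = "output/" + Guid.NewGuid() + ".txt";
        using (TextWriter writer = binder.Bind<TextWriter>(new BlobAttribute(blobPath)))
        {
            writer.Write(message);
        }
    }
}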
I am trying to use a web service as part of a dataflow task in SSIS. Right now I have a script task within a data flow with a web service reference.
What the task ultimately has to do is: collect all postal codes from an internal database > use the web service I created to get the lat/long of those postal codes > export the lat/long to a table.
As I said before, I have the script set up and it looks to be working, but when I go to execute the package I get a CannotCreateUserComponentException error.
Any help on this would be awesome, or another approach to it? I am new to web services in SSIS, so this is the best solution I could come up with.
Edit:
Right now the script accepts the postal code from an OLE DB Source and will output a string in the format of "lat,long" (presumably). The web service was created by someone within the company and has one method called FindCoordinates, which takes in a string value (postal code) (I think it calls the Google geocode XML, but I'm not exactly sure what he did). Here is the code that is currently in my script:
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
using SC_6c1b642acd544cd5b01c032b6f8dfd03.csproj.MyService;
using System.Xml;
using System.Web.Services;

[WebService(Namespace = "http://microsoft.com/webservices/")]
[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    public override void PreExecute()
    {
        base.PreExecute();
        /*
        Add your code here for preprocessing or remove if not needed
        */
    }

    public override void PostExecute()
    {
        base.PostExecute();
        /*
        Add your code here for postprocessing or remove if not needed
        You can set read/write variables here, for example:
        Variables.MyIntVar = 100
        */
    }

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        String postal = Row.PostalCode;
        Service service = new Service();
        Row.LatLong = service.FindCoordinates(postal);
    }
}
Hope this helps.