Is it possible to use the Manage User Secrets feature of Visual Studio with Azure Service Fabric?

I created a stateless ASP.NET Core Service Fabric project. I moved some of the configs to secrets.json, but the service does not seem to pick them up from there; it always reads from appsettings.json.
I am assuming this is because I only register appsettings.json at startup:
return new WebHostBuilder()
    .UseKestrel()
    .ConfigureAppConfiguration((builderContext, config) =>
    {
        config.AddJsonFile("appsettings.json", optional: false, reloadOnChange: true);
    })
    .ConfigureServices(
        services => services
            .AddSingleton<StatelessServiceContext>(serviceContext))
    .UseContentRoot(Directory.GetCurrentDirectory())
    .UseStartup<Startup>()
    .UseServiceFabricIntegration(listener, ServiceFabricIntegrationOptions.None)
    .UseUrls(url)
    .Build();

Yes, you should be able to do this. The Service Fabric Mesh documentation on managing secrets is here:
https://learn.microsoft.com/en-us/azure/service-fabric-mesh/service-fabric-mesh-howto-manage-secrets
For guidance on how to use/inject secrets in your application:
https://learn.microsoft.com/azure/service-fabric-mesh/service-fabric-mesh-howto-manage-secrets#modify-mesh-application-to-reference-mesh-secret-values
For how to declare secret resources correctly:
https://learn.microsoft.com/azure/service-fabric-mesh/service-fabric-mesh-howto-manage-secrets#declare-a-mesh-secrets-resource
You can add the following to your app to see where your secret configs are stored:
var secretPath = PathHelper.GetSecretsPathFromSecretsId(
    typeof(Program).Assembly.GetCustomAttribute<UserSecretsIdAttribute>().UserSecretsId);
The path would look like this (note that a Service Fabric service does not usually run as your interactive user, so the path resolves under the system profile):
C:\Windows\System32\config\systemprofile\AppData\Roaming\Microsoft\UserSecrets\
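If you want the WebHostBuilder from the question to actually read secrets.json, one option is to register the user secrets provider after appsettings.json, so its values take precedence for matching keys. A minimal sketch, assuming the Microsoft.Extensions.Configuration.UserSecrets package is referenced and a UserSecretsId is set in the project file:
.ConfigureAppConfiguration((builderContext, config) =>
{
    config.AddJsonFile("appsettings.json", optional: false, reloadOnChange: true);
    // Later providers override earlier ones, so values from secrets.json
    // win over appsettings.json for identical keys during development.
    if (builderContext.HostingEnvironment.IsDevelopment())
    {
        config.AddUserSecrets<Startup>();
    }
})
Keep in mind that user secrets are a development-time convenience; for a deployed cluster you would typically move those values to a proper secret store.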

Related

AWS query can't save to a MutableList

I'm trying out AWS with Android Studio using Kotlin.
For the life of me, I cannot understand why my query is not saving into a local mutableList. When I output my list inside the while loop, everything is saved correctly, but once I leave the query, my mutableList becomes empty. Any help would be appreciated. Thank you.
Code
var userContent: MutableList<MutableList<String>> = arrayListOf()
Amplify.DataStore.query(UserInfo::class.java,
    Where.matches(UserInfo.USER.eq("$myID")),
    { posts ->
        // This callback runs asynchronously, only after the query has completed.
        while (posts.hasNext()) {
            val post = posts.next()
            userContent.add(mutableListOf(post.content, post.username, post.password))
            Log.i("Testing1", "$userContent")
        }
    },
    { Log.e("Auth", "Error", it) }
)
// This line executes before the success callback above has fired,
// which is why the list still appears empty here.
Log.i("Testing2", "$userContent")
I see you are using Amplify, which is fine. However, as an Android developer, you now have another option: the brand-new AWS SDK for Kotlin.
This SDK is built for use with Android Studio. Using it, you can build native Android apps that invoke AWS services like Amazon DynamoDB.
Here is an AWS tutorial that walks you through creating an Android app that invokes Amazon DynamoDB and SNS:
Creating your first Native Android application using the AWS SDK for Kotlin

Don't want to log in to Google Cloud with a service account

I am new to Google Cloud and this is my first experience with this platform (before this I was using Azure).
I am working on a C# project that has a requirement to save images online, and for that I created Cloud Storage.
Now, to use the services, I found out that I have to download a service account credential file and set the path of that file in an environment variable.
Which is good and working fine:
RxStorageClient = StorageClient.Create();
But the problem is that my whole project is a collection of 27 different projects, all in the same solution, there are multiple Cloud Storage accounts involved, and I also want to use them with Docker.
So I was wondering: is there any alternative to this service account system, like an API key or a connection string like Azure provides?
I saw that this initialization function has some other options to authenticate, but I didn't see any example:
RxStorageClient = StorageClient.Create();
Can anyone please provide a proper example of connecting to Cloud Storage services without this service account file system?
You can avoid relying on the environment variable by downloading credential files for each project you need to access.
So for example, if you have three projects whose storage you want to access, you'd need code paths that initialize the StorageClient with the appropriate service account key from each of those projects.
StorageClient.Create() can take an optional GoogleCredential object to authorize it (if you don't specify one, it grabs the Application Default Credentials, which is one thing the GOOGLE_APPLICATION_CREDENTIALS environment variable can set).
So on GoogleCredential, check out the FromFile(String) static method, where the String is the path to the service account JSON file.
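A minimal sketch of that approach (the key-file path and bucket name below are placeholders):
using Google.Apis.Auth.OAuth2;
using Google.Cloud.Storage.V1;

// Load an explicit credential for one project instead of relying on the
// GOOGLE_APPLICATION_CREDENTIALS environment variable.
var credential = GoogleCredential.FromFile("/secrets/project-a-key.json");
var storageClient = StorageClient.Create(credential);

// Hypothetical usage: list the objects in one of that project's buckets.
foreach (var obj in storageClient.ListObjects("my-project-a-bucket"))
{
    Console.WriteLine(obj.Name);
}
Repeating the same pattern with a different key file gives you a separate client per project, which scales to the multi-project setup described in the question.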
There are no examples. Service accounts are absolutely required, even if hidden from view, to deal with Google Cloud products. They're part of the IAM system for authenticating and authorizing various pieces of software for use with various products. I strongly suggest that you become familiar with the mechanisms of providing a service account to a given program. For code running outside of Google Cloud compute and serverless products, the current preferred solution involves using environment variables to point to files that contain credentials. For code running on Google Cloud (like Cloud Run, Compute Engine, or Cloud Functions), it's possible to provide service accounts by configuration so that the code doesn't need to do anything special.
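Since the question mentions Docker: the environment-variable route works there too, with no credential handling in code. A sketch, assuming the key file is mounted into the container (the paths and image name are placeholders):
using Google.Cloud.Storage.V1;

// Application Default Credentials locate the key file through the
// environment variable, e.g. when the container is started with:
//   docker run -v /host/keys:/secrets \
//       -e GOOGLE_APPLICATION_CREDENTIALS=/secrets/key.json my-image
var client = StorageClient.Create();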

How do I pass AWS credentials using a process variable containing an AWS service endpoint ID

I want to use two variables, $(Aws.Endpoint) and $(Aws.Region), in my AWS-related release tasks, and provide values for those as process variables.
Aws.Endpoint is the ID of an AWS service endpoint in VSTS. When I do this, I get
Endpoint auth data not present: ...
Has anyone who ran into this seemingly trivial issue found a solution? Otherwise I need to define the AWS endpoint directly in the task, which feels wrong, because I eventually want the release tasks to be part of a task group shared by all the environments making up the pipeline (dev, stage, prod).
Note: I see there is no Stack Overflow tag for AWS Tools for Visual Studio Team Services, and I don't have the reputation to create a new tag. If someone with enough reputation could create something like aws-tools-for-vsts (homepage), that would be grand.
No, you can't do that with the tasks in the AWS Tools for Microsoft Visual Studio Team Services extension. You can create a custom build/release task to meet your requirement through a VSTS extension.
If you want to get the AWS endpoint in your custom build/release task, you can retrieve it through Get-VstsEndpoint or task.getEndpointAuthorization with the GUID of the service connection (which you can find in the build/release log: ##[debug]awsCredentials=9a3009d2-35f3-4954-a8fa-34c3313c34f6)
For example:
# Look up the endpoint by the GUID of the service connection from the log.
$awsEndpoint = Get-VstsEndpoint -Name [GUID of service] -Require
Write-Host $awsEndpoint
# Inspect the auth parameters carried by the endpoint.
foreach ($p in $awsEndpoint.Auth.Parameters) {
    Write-Host $p
}

How to use Realm with AWS serverless Lambda?

So I'm currently using AWS serverless to make a createObject endpoint, which takes the values from the HTTP body and uses the key-values to store an item in DynamoDB. I am doing this via AWS Lambda. We recently started using Realm (Realm.io), and we see that they have a JavaScript library for Node. Is it possible to invoke a Realm function to store the same object via AWS Lambda?
As I understand it, the Node SDK for Realm is only for the Professional and Enterprise editions of the Realm Object Server (ROS). The ROS can only be deployed on your own Linux instance.
Please see this for more details: https://realm.io/docs/realm-object-server/pe-ee/
Exactly as Clifton Labrum says, the Professional and Enterprise editions of the Realm Mobile Platform have a Node.js SDK to be used on the server to listen for changes in Realms.
For every change in a Realm, you will get an event, which you can then process as you like. For instance, you can store the objects in DynamoDB, or you can just leave them in Realms.

Google Apps Script and the AWS SDK

I want to interact with Amazon Web Services DynamoDB from a Google Sheet via code in a Google Apps Script. However, I cannot figure out how to integrate the AWS SDK. I am hoping to avoid having to write a library that handles the integration via the AWS HTTP API, since there are JavaScript and Java versions of the AWS SDK available. Help?
(I've done some pretty extensive Google and Stack Overflow searches. This is the closest thing I've found to an answer, but that covers Google App Engine, not Google Apps Script.)
Thanks!
I just made a function that does the basic authentication for any API request you want to make. You might still have to get some of the headers in order, but it does most of the hard work for you.
For example:
function myFunction() {
    AWS.init("MY_ACCESS_KEY", "MY_SECRET_KEY");
    var instanceXML = AWS.request('ec2', 'us-east-1', 'DescribeInstances', {"Version": "2015-10-01"});
    ...
}
I put it in a repo with some documentation for use. Here's the link: https://github.com/smithy545/aws-apps-scripts
I recently added my own derivative of the "AWS via Google Apps Script" here: https://github.com/neilobremski/gas/blob/main/GasAWS.js
The JavaScript code here is a bit more modern and specifically uses the Utilities available in Google Apps Script for the SHA-256 stuff rather than the abandoned (but excellent!) Crypto-JS library.
Usage is the same except I moved the region parameter and have a global default. I did this because often resources are in the same region and repeatedly specifying it is a pain.
Responding in case anyone else finds this useful: one of the responses above gave me a good hint to get started, but I am not able to add a comment on it.
The steps in the tutorial below walk through setting up API Gateway, Lambda, and DynamoDB. Once that is set up, you can use UrlFetch in GAS directly against your API Gateway.
http://docs.aws.amazon.com/lambda/latest/dg/with-on-demand-https-example.html
The code in the AWS link is complete and needs no additional setup to work directly using URL Fetch. You may however want to enable security on the API Gateway.
On the GAS side:
var headers = {
    "x-api-key": AWS_KEY
};
var event = {};
event.operation = "create";
event.tableName = AWS_DB_NAME;
// The payload object must exist before assigning to its Item property.
event.payload = { Item: yourDataObjwithKey };
var options = {
    "method": "POST",
    "headers": headers,
    "payload": JSON.stringify(event)
};
var url = AWS_GW;
var response = UrlFetchApp.fetch(url, options);
I am about to implement this, and it looks like the best way is to use a Lambda function to query DynamoDB via API Gateway.
It's really not that hard if you know Node.js.