Where to upload console app for Azure DF v2 custom activity?

I am trying to create a DF v2 custom activity. I have a console application, but I am not sure where to put it.
Should I just build it and upload it to blob storage, or is there some special publishing step?
Thanks

As stated here https://learn.microsoft.com/en-us/azure/data-factory/transform-data-using-dotnet-custom-activity you need to upload the built application (the contents of its build output folder) to Azure Blob Storage; the compute environment that actually runs it is an Azure Batch pool:
The custom activity runs your customized code logic on an Azure Batch
pool of virtual machines.
Then just configure the Azure Batch linked service and the custom activity as shown in the official documentation.
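As a rough illustration of the upload step, a minimal sketch with the azure-storage-blob Python SDK might look like the following. The container name, virtual folder, and build path are placeholders, not anything the documentation mandates:

```python
# Sketch: upload the built console app (e.g. everything in bin/Release) to a
# blob container that the custom activity will reference via its
# folderPath / resourceLinkedService settings. Names below are placeholders.
import os
from azure.storage.blob import BlobServiceClient

conn_str = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client("customactivitycontainer")

build_dir = r"MyConsoleApp/bin/Release"
for name in os.listdir(build_dir):
    path = os.path.join(build_dir, name)
    if os.path.isfile(path):
        # Blobs land under a virtual folder, e.g. customactv2/MyConsoleApp/
        blob_name = f"customactv2/MyConsoleApp/{name}"
        with open(path, "rb") as data:
            container.upload_blob(name=blob_name, data=data, overwrite=True)
        print(f"uploaded {blob_name}")
```

The custom activity's command (for example MyConsoleApp.exe) and folderPath in the pipeline definition then point at that location.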
Hope this helped!

Related

Google Cloud Run service deployment, is it the best direction in my situation?

I have some experience with Google Cloud Functions (CF). I recently tried to deploy a CF function with a Python app, but it uses an NLP model, so the 8 GB memory limit is exceeded when the model is triggered. The function is triggered when a JSON file is uploaded to a bucket.
So, I plan to try Google Cloud Run, but I have no experience with it. Also, I am not completely sure it is the best course of action.
If it is, what is the best way of implementing it, given that the Run service will be triggered by a file uploaded to a bucket? In CF you can select the triggering event; in Cloud Run I didn't see anything like that. I could use some starting points, as I couldn't find my case in the GCP documentation.
Any help will be appreciated.
You can use at least these two approaches:
The legacy one: create a GCS notification to Pub/Sub, then create a push subscription and set the Cloud Run URL as the HTTP push destination.
A more recent way is to use Eventarc to invoke a Cloud Run endpoint directly from an event (it roughly creates the same thing, a Pub/Sub topic and push subscription, but it's fully configured for you).
EDIT 1
When you use a push notification, you receive a standard Pub/Sub message. The format is described in the documentation for the attributes and for the body content; keep in mind that the raw content is base64 encoded and you have to decode it to get the final format.
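As a rough sketch (not an official sample; the route and processing hook are placeholders), a Cloud Run service that receives the push message and decodes the base64 body could look like this in Python with Flask:

```python
# Sketch of a Cloud Run endpoint receiving a Pub/Sub push message.
# The GCS event details arrive in message.attributes and message.data;
# message.data is base64 encoded and must be decoded.
import base64
import json
from flask import Flask, request

app = Flask(__name__)

@app.route("/", methods=["POST"])
def handle_push():
    envelope = request.get_json()
    if not envelope or "message" not in envelope:
        return "Bad Request: no Pub/Sub message received", 400

    message = envelope["message"]
    attributes = message.get("attributes", {})  # e.g. eventType, bucketId, objectId
    payload = {}
    if "data" in message:
        payload = json.loads(base64.b64decode(message["data"]).decode("utf-8"))

    # Log everything so you can inspect the exact format you receive.
    print("attributes:", attributes)
    print("payload:", payload)

    # ... trigger the NLP processing on the uploaded object here ...
    return ("", 204)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```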
I personally have a Cloud Run service that logs the contents of every request, so I can see in the logs all the data I need to develop against. When there is a new message format, I point the push subscription at that Cloud Run endpoint and automatically capture the format.
For Eventarc, the event format will be added to the UI soon (I have seen that feature in preview, but it's not available yet). For now, the best solution is to log the content so you know what you receive and what to do with it!

AWS S3: get the CLI command for each operation done via the GUI

When I perform an operation such as an S3 upload using the AWS GUI in the browser, is it possible to retrieve the corresponding CLI command that would perform the same operation?
Thanks.
There is an extension for Chrome and Firefox (as far as I know) that records the changes made in the AWS console and translates them into CLI commands:
Plugin Link
Although not all services/actions are supported, it does a pretty good job. Here you can check the service coverage of the plugin:
Service Coverage

Use AWS EC2 for Big Data Analytics

Is it possible to send files from a mobile application to an EC2 instance that runs a Python script to process the file, with the final product being saved to S3?
Deploy a simple web app on EC2 to receive the data from your mobile app, run the Python script you mentioned on that data, then use the S3 API to save the result there. As for how to deploy that web app, there are tons of ways/languages/technologies; that's a topic for another question.
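As a rough sketch of that flow (the bucket name, the /upload route, and process_file are placeholders standing in for your own script), such a web app could look like this with Flask and boto3:

```python
# Sketch: a small Flask app on EC2 that accepts a file upload from the mobile
# app, runs some processing on it, and saves the result to S3.
# Bucket name, route, and process_file() are placeholders.
import boto3
from flask import Flask, request

app = Flask(__name__)
s3 = boto3.client("s3")
BUCKET = "my-results-bucket"  # placeholder

def process_file(raw: bytes) -> bytes:
    # Stand-in for the Python script mentioned in the question.
    return raw.upper()

@app.route("/upload", methods=["POST"])
def upload():
    f = request.files["file"]        # file sent by the mobile app
    result = process_file(f.read())  # run the processing logic
    key = f"processed/{f.filename}"
    s3.put_object(Bucket=BUCKET, Key=key, Body=result)  # save the final product to S3
    return {"status": "ok", "s3_key": key}, 200

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=80)
```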

AWS Billing (Usage + Rate Card)

I want to get an AWS usage report in .NET using the SDK or REST API. Is there any service available for this?
To get the rate card (pricing info) I have used this service, from which I could get the JSON object:
https://pricing.us-east-1.amazonaws.com/offers/v1.0/aws/AmazonCloudWatch/current/index.json
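For context, that offer file is a public JSON document that can be fetched with a plain HTTP GET; a minimal sketch in Python (the keys printed are just examples of what the offer file contains):

```python
# Fetch the public AmazonCloudWatch offer (rate card) file mentioned above.
import requests

url = ("https://pricing.us-east-1.amazonaws.com/offers/v1.0/aws/"
       "AmazonCloudWatch/current/index.json")
offer = requests.get(url).json()
print(offer["offerCode"], offer["version"])
```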
Please advise if there is any such service available to get the resources consumed in AWS, so that I can calculate the billing.
Regards,
Aparna
In general you need to cover several items:
Authentication - You'll need to set up your credentials to be able to connect to AWS. Check the Documentation
Enable DBR or CUR reports in the billing console. This will export your monthly billing information to S3. Check the Documentation
Once ready, use the SDK to download the reports and import them into a DB, ES, or whatever you're working with to process large CSV/Excel files (a minimal sketch follows below). Check the Documentation
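As a rough sketch of that last step (shown in Python with boto3 rather than the .NET SDK mentioned in the question, but the flow is the same; the bucket name and report prefix are placeholders defined by your CUR export settings):

```python
# Sketch: download Cost & Usage Report files that the billing console exports to S3.
# Bucket name and report prefix are placeholders defined by your CUR settings.
import boto3

session = boto3.Session(profile_name="billing")  # credentials from step 1
s3 = session.client("s3")

bucket = "my-billing-reports"  # placeholder
prefix = "cur/my-report/"      # placeholder

# List the report files and download them for further processing.
resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
for obj in resp.get("Contents", []):
    key = obj["Key"]
    if key.endswith(".csv.gz"):
        local = key.split("/")[-1]
        s3.download_file(bucket, key, local)
        print(f"downloaded {local}")
```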
Good luck!

Update wowza StreamPublisher schedule via REST API (or alternative)

Just getting started with Wowza Streaming Engine.
Objective:
Set up a streaming server which live streams existing video (from S3) at a pre-defined schedule (think of a tv channel that linearly streams - you're unable to seek through).
Create a separate admin app that manages that schedule and updates the streaming app accordingly.
Accomplish this with as little custom Java as possible.
Questions:
Is it possible to fetch / update streamingschedule.smil with the Wowza Streaming Engine REST API?
There are methods to retrieve and update specific SMIL files via the REST API, but they only seem to apply to those created through the manager. After all, streamingschedule.smil needs to be created manually.
Alternatively, is it possible to reference a streamingschedule.smil that exists on an S3 bucket? (In a similar way footage can be linked from S3 buckets with the use of the MediaCache module)
A comment here (search for '3a') seems to indicate it's possible, but there's a lot of noise in that thread.
What I've done:
Set up Wowza Streaming Engine 4.4.1 on EC2
Enabled REST API documentation
Created a separate S3 bucket and filled it with pre-recorded footage
Enabled MediaCache on the server which points to the above S3 bucket
Created a customised VOD edge application, with AppType set to Live and StreamType set to live in order to be able to point to the above (as suggested here)
Created a StreamPublisher module with a streamingschedule.smil file
All of the above works, and I have a working schedule with linearly streaming content pulled from an S3 bucket. I just need to be able to easily manipulate that schedule without having to manually edit the file via SSH.
So close! TIA
To answer your questions:
No. However, you can update it by creating an HTTP provider and having it handle the modifications to that schedule. Should you want more flexibility here, you can even extend the scheduler module so it doesn't require that file at all.
Yes. You would have to modify the ServerListenerStreamPublisher solution to accomplish it. Currently it solely looks at the local filesystem to read the streamingschedule.smil file.
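For the first point, once you've written such an HTTP provider, your admin app only needs to POST the new schedule to it. A rough sketch of the admin side in Python (the /updateschedule path, port, credentials, and payload shape are all hypothetical, defined entirely by the HTTP provider you implement):

```python
# Sketch: admin app pushing an updated schedule to a custom Wowza HTTP provider.
# Endpoint path, port, credentials and payload format are hypothetical; they are
# whatever your own HTTP provider implements.
import requests

new_schedule = {
    "stream": "Stream1",
    "playlist": [
        {"name": "pl1", "scheduled": "2016-06-01 10:00:00",
         "items": [{"src": "mp4:vod/clip1.mp4", "start": 0, "length": 300}]},
    ],
}

resp = requests.post(
    "http://my-wowza-host:8086/updateschedule",  # hypothetical provider endpoint
    json=new_schedule,
    auth=("admin", "password"),                  # placeholder credentials
    timeout=10,
)
resp.raise_for_status()
print(resp.text)
```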
Thanks,
Matt