Is there a way to change the scheduled time for a WebJob from within an MVC application, or is the only way through the Azure portal?
I do not mean the initial schedule times, but changing the scheduled time after deployment via the UI in the MVC application.
You can always access the WebJobs in your website through the WebJobs API.
For triggered jobs, you can run the WebJob on demand through a manual call.
You can also upload a new zip through the API in case your schedule changes.
In order to change the schedule, you'll have to go through the portal, or redeploy the WebJob with the new schedule. You can obviously do this via Visual Studio or using the API.
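For illustration, here is a minimal sketch of those calls against the Kudu WebJobs API (shown in Python for brevity); the site name, job name, zip file, and deployment credentials are placeholders you would replace with your own:

```python
import requests

# Placeholders: your site name and the deployment credentials
# from the publish profile.
SITE = "mysite"
KUDU = f"https://{SITE}.scm.azurewebsites.net/api"
AUTH = ("$mysite", "deployment-password")

# Run a triggered WebJob on demand.
requests.post(f"{KUDU}/triggeredwebjobs/MyJob/run", auth=AUTH).raise_for_status()

# Redeploy the WebJob as a zip, e.g. one that packages an updated schedule.
with open("MyJob.zip", "rb") as f:
    requests.put(
        f"{KUDU}/triggeredwebjobs/MyJob",
        auth=AUTH,
        headers={
            "Content-Type": "application/zip",
            "Content-Disposition": "attachment; filename=MyJob.zip",
        },
        data=f,
    ).raise_for_status()
```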
There's more information on WebJobs here.
Hope this helps.
Related
I have a list of 10 timestamps which keeps updating dynamically. In total there are 3 such lists for 3 users. I want to build a utility to trigger a function at the next upcoming timestamp (preferably everything over serverless compute).
I am stuck on how to achieve this on AWS or Firebase.
On Firebase/Google Cloud Functions the two most common options are either to store the schedule in a database and then periodically trigger a Cloud Function and run the tasks that are due, or to use Cloud Tasks to dynamically schedule a callback to a separate Cloud Function for each task.
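As an illustration of the second option, here is a minimal sketch using the google-cloud-tasks client from Python; the project, location, queue name, and function URL are placeholders:

```python
import datetime

from google.cloud import tasks_v2
from google.protobuf import timestamp_pb2

client = tasks_v2.CloudTasksClient()
# Placeholders: your project, queue location, and queue name.
parent = client.queue_path("my-project", "us-central1", "my-queue")

def schedule_callback(run_at: datetime.datetime, payload: bytes) -> None:
    """Enqueue an HTTP call to a Cloud Function at the given timestamp."""
    schedule_time = timestamp_pb2.Timestamp()
    schedule_time.FromDatetime(run_at)
    task = {
        "http_request": {
            "http_method": tasks_v2.HttpMethod.POST,
            # Placeholder URL of the Cloud Function that runs the task.
            "url": "https://us-central1-my-project.cloudfunctions.net/onTaskDue",
            "body": payload,
        },
        "schedule_time": schedule_time,
    }
    client.create_task(request={"parent": parent, "task": task})
```

Each time a user's list changes, you would cancel or ignore the stale task and enqueue a new one for the next upcoming timestamp.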
I recommend also reading:
Doug's blog post on How to schedule a Cloud Function to run in the future with Cloud Tasks (to build a Firestore document TTL)
Fireship.io's tutorial on Dynamic Scheduled Background Jobs in Firebase
How can scheduled Firebase Cloud Messaging notifications be made outside of the Firebase Console?
Previous questions on dynamically scheduling functions, as this has been covered quite well before.
Update (late 2022): there is now also a built-in way to schedule Cloud Functions dynamically: enqueue functions with Cloud Tasks.
We are triggering DAGs in Google Cloud Composer via Airflow's REST interface.
https://airflow.apache.org/docs/apache-airflow/stable/deprecated-rest-api-ref.html
We can see that DAGs triggered this way get Run IDs with the manual__ prefix, e.g.:
Run Id: manual__2021-06-25T05:52:31.103391+00:00
When we trigger these DAGs from Cloud Functions, we want to differentiate these executions from the manually executed DAGs (triggered from the UI) and from automatically scheduled executions (triggered by cron/scheduling criteria).
Question: Is it possible to mark the executions triggered via REST API other than manual__?
Sidenote: currently we are using Composer with Airflow 1.x, but we are planning to move to Airflow 2.x as soon as Composer supports it.
It cannot be changed.
Airflow has an enum, DagRunType, with three options: backfill, scheduled, and manual. The value is set by Airflow itself.
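For reference, a paraphrase of the relevant enum from the Airflow 2.x source (airflow/utils/types.py); the run_id prefix is generated from it by Airflow, so REST callers cannot override it:

```python
from enum import Enum

class DagRunType(str, Enum):
    # Paraphrased from airflow/utils/types.py in Airflow 2.x.
    BACKFILL_JOB = "backfill"
    SCHEDULED = "scheduled"
    MANUAL = "manual"

    def generate_run_id(self, execution_date) -> str:
        # Produces e.g. "manual__2021-06-25T05:52:31.103391+00:00".
        return f"{self.value}__{execution_date.isoformat()}"
```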
We have PostgreSQL on AWS. All realtime changes from the Portal UI are captured in this database. However, there is a request to move these changes in realtime or near-realtime to GCP.
Purpose: we want various consumers to ingest data from GCP instead of the master data source in AWS PostgreSQL.
When a customer table (in AWS Postgres) is inserted with a new customer record, I want to immediately publish that record in JSON format to a GCP Pub/Sub topic.
Please let me know any reference for moving table-specific data across clouds whenever a DML event occurs.
Please note that I am new to GCP and still learning and exploring :)
Thanks
Databases use log shipping to update replicas. In your case, you want to update two targets (a database and Cloud Pub/Sub) by having the database do the Pub/Sub update. That might be possible but will require development work.
PostgreSQL does not have a native ability to update Pub/Sub. Instead, change your requirements so that the application/service that updates the database also updates Pub/Sub.
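A minimal sketch of that application-level dual write, assuming psycopg2 and the google-cloud-pubsub client; the project, topic, and table names are placeholders:

```python
import json

import psycopg2
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# Placeholders: your GCP project and topic.
topic_path = publisher.topic_path("my-gcp-project", "customer-changes")

def insert_customer(conn, customer: dict) -> None:
    # Write to the master database first.
    with conn, conn.cursor() as cur:
        cur.execute(
            "INSERT INTO customer (name, email) VALUES (%s, %s) RETURNING id",
            (customer["name"], customer["email"]),
        )
        customer["id"] = cur.fetchone()[0]
    # Then publish the new record as JSON to Pub/Sub.
    publisher.publish(topic_path, json.dumps(customer).encode("utf-8"))
```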
If you really want PostgreSQL to do this task, you will need to use PostgreSQL triggers and write a trigger function in C that calls the Google Cloud Pub/Sub REST API.
PostgreSQL Trigger Example
PostgreSQL Event Trigger Example
Event triggers for PostgreSQL on Amazon RDS
Cloud Pub/Sub API
I am looking for a service or framework in native AWS which, given a CSV file, creates a task, processes that task asynchronously, returns a task id or job id to the client, and notifies the client when the task is completed. Some requirements for this:
Clients should be able to check the progress of the task by job id at any time.
Processing of the entire task can take more than 15 minutes.
There should be a way for clients to see the reasons for failures.
All the business logic would be at the line-item level (this is the only thing the developer should care about).
Is there any built-in service or framework for that in native AWS? I know one can build this kind of service using SQS, Lambda, SNS, and DynamoDB, but I am just asking whether there is an already available AWS offering that can do all of this.
The closest service to this concept is AWS Step Functions.
However, it would just be one component of a solution. You would still need to create the compute component using Amazon EC2 or AWS Lambda, build the interface for users, and add authentication, notifications, etc.
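For the job-id and progress-check requirements specifically, here is a rough sketch with boto3, assuming a state machine (placeholder ARN) that implements your per-line-item logic:

```python
import json

import boto3

sfn = boto3.client("stepfunctions")
# Placeholder ARN of a state machine implementing the CSV processing.
STATE_MACHINE_ARN = (
    "arn:aws:states:us-east-1:123456789012:stateMachine:CsvProcessor"
)

def submit_job(csv_s3_uri: str) -> str:
    """Start one execution per CSV file; the execution ARN is the job id."""
    resp = sfn.start_execution(
        stateMachineArn=STATE_MACHINE_ARN,
        input=json.dumps({"csv": csv_s3_uri}),
    )
    return resp["executionArn"]

def job_status(execution_arn: str) -> str:
    # One of RUNNING, SUCCEEDED, FAILED, TIMED_OUT, ABORTED; on failure,
    # the execution history carries the error cause for the client.
    return sfn.describe_execution(executionArn=execution_arn)["status"]
```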
Bottom line: There is no AWS service that does what you describe. However, there are the building blocks if you wish to create one yourself.
In AWS it was possible to use CloudWatch to trigger callback Lambda functions on events.
Is it possible in GCE to automatically tag servers with the user who created them, based on the activity logs? Google Cloud Functions seem to only be able to run a non-public callback based on GCS events.
How would I do this?
As a matter of fact, there are four types of triggers for Google Cloud Functions, but none of them is useful in this case.
There is a way to automatically do so, though.
You can create an application in App Engine that uses the Stackdriver Logging client library (for example, the Python one).
Then you can schedule a cron job that triggers the application: use the client library to review the logs, search for compute.instance.insert entries (GCE instance creation) to identify the "actor" or "user", and finally add a label to the existing resource.
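A rough sketch of that flow, assuming the google-cloud-logging and google-api-python-client libraries; the project id, log filter, and label key are assumptions to adapt:

```python
from google.cloud import logging as cloud_logging
from googleapiclient import discovery

PROJECT = "my-project"  # placeholder

def label_instances_with_creator():
    """Cron-triggered handler: label each new instance with its creator."""
    log_client = cloud_logging.Client(project=PROJECT)
    compute = discovery.build("compute", "v1")
    # Assumed filter for instance-creation audit log entries.
    flt = (
        'resource.type="gce_instance" AND '
        'protoPayload.methodName:"compute.instances.insert"'
    )
    for entry in log_client.list_entries(filter_=flt):
        payload = entry.payload
        actor = payload["authenticationInfo"]["principalEmail"]
        zone = entry.resource.labels["zone"]
        name = payload["resourceName"].split("/")[-1]
        inst = compute.instances().get(
            project=PROJECT, zone=zone, instance=name).execute()
        labels = dict(inst.get("labels", {}))
        # Label values allow only lowercase letters, digits, "-" and "_".
        labels["created-by"] = actor.replace("@", "_").replace(".", "-").lower()
        compute.instances().setLabels(
            project=PROJECT, zone=zone, instance=name,
            body={"labels": labels,
                  "labelFingerprint": inst["labelFingerprint"]},
        ).execute()
```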