AWS SWF Sample Code issue - amazon-web-services

I am new to Amazon Simple Workflow Service (SWF) and am following the AWS docs to understand it.
As per the documentation, once you execute the GreeterMain class after executing the GreeterWorker class, you should see an active workflow execution on the AWS console. However, that's not the case for me. On executing the GreeterMain class, the application prints out "Hello World", but I do not see any active workflows in the "My Workflow Executions" section of the AWS console. I am not getting any errors either.
On executing the GreeterWorker class, I can see the "Workflow Types" and "Activity Types" sections populated with the appropriate workflows and activities.
Am I doing something wrong? Can someone please help out?
Thanks.

Ahh.. found it. As per the doc, you create a class named "GreeterMain" in two different packages. One package is the basic code path; the second uses AWS SWF. When executing, Eclipse was referring to the basic code path and not invoking AWS SWF.
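For anyone else who trips over this, the situation looks roughly like the sketch below (a Kotlin illustration only; the tutorial itself is in Java, and the package names here are made up). The point is that both packages define a GreeterMain, and the Eclipse run configuration decides which one actually executes:

package helloworld.basic

// The plain, local version from the first half of the tutorial: it prints
// the greeting directly and never contacts SWF, so no workflow execution
// ever shows up in the AWS console.
fun main() {
    println("Hello World!")
}

// The SWF-backed GreeterMain lives in a second package (the tutorial reuses
// the class name); that one must be selected as the run configuration.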

Related

Dataproc custom image: Cannot complete creation

For a project, I have to create a Dataproc cluster that has one of the outdated versions (for example, 1.3.94-debian10) that contain the Apache Log4j 2 vulnerabilities. The goal is to get the related alert (DATAPROC_IMAGE_OUTDATED) in order to check how SCC works (it is just for a test environment).
I tried to run the command gcloud dataproc clusters create dataproc-cluster --region=us-east1 --image-version=1.3.94-debian10, but got the following message: ERROR: (gcloud.dataproc.clusters.create) INVALID_ARGUMENT: Selected software image version 1.3.94-debian10 is vulnerable to remote code execution due to a log4j vulnerability (CVE-2021-44228) and cannot be used to create new clusters. Please upgrade to image versions >=1.3.95, >=1.4.77, >=1.5.53, or >=2.0.27. For more information, see https://cloud.google.com/dataproc/docs/guides/recreate-cluster. That makes sense, in order to protect the cluster.
I did some research and discovered that I will have to create a custom image with said version and generate the cluster from that. The thing is, I have tried reading the documentation and finding a tutorial, but I still can't understand how to start or how to run the file generate_custom_image.py, for example, since I am not comfortable with Cloud Shell (I prefer the console).
Can someone help? Thank you
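In case it helps, the generate_custom_image.py script (from Google's custom-images repository, github.com/GoogleCloudDataproc/custom-images at the time of writing) is driven entirely by command-line flags, so it can be run from any machine with Python and an authenticated gcloud, not only Cloud Shell. A rough sketch with placeholder values — the image name, customization script, zone, and bucket below are all made up, and whether a blocked base version can still be built this way is something to verify:

git clone https://github.com/GoogleCloudDataproc/custom-images.git
cd custom-images
python generate_custom_image.py \
    --image-name log4j-test-image \
    --dataproc-version 1.3.94-debian10 \
    --customization-script my-customization.sh \
    --zone us-east1-b \
    --gcs-bucket gs://my-logs-bucket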

Cloud Functions Permission Issues

I created a GCP Cloud Function on the Go 1.13 runtime. All resources are under the same project.
It reads from a Pub/Sub topic A, does a transformation on the message, and writes to a different topic B.
I had this working fine on the test project, but I can't seem to reproduce it in our production environment.
I bound the function to a service account that has been given the Pub/Sub Publisher and Viewer roles.
But I seem to keep on getting this error:
rpc error: code = PermissionDenied desc = User not authorized to perform this action.
To summarize/clarify: reading from topic A gives no problems, but writing to topic B makes the function crash.
What am I missing?
This turned out to be user error. I'm sorry for wasting everyone's time and appreciate all the feedback. It seems I was pointing at the wrong project and, go figure, I didn't have permissions there.
Thank you all for the help.
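For anyone debugging something similar, two quick checks would likely have caught this (the project, topic, and service-account names below are placeholders): confirm which project your environment actually points at, and confirm the publisher binding exists on the destination topic.

gcloud config get-value project
gcloud pubsub topics get-iam-policy B --project my-prod-project
gcloud pubsub topics add-iam-policy-binding B \
    --project my-prod-project \
    --member serviceAccount:my-fn@my-prod-project.iam.gserviceaccount.com \
    --role roles/pubsub.publisher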

cloud-builds pub/sub topic appears to be unlisted or inaccessible

I'm attempting to create an integration between Bitbucket Repo and Google Cloud Build to automatically build and test upon pushes to certain branches and report status back (for that lovely green tick mark). I've got the first part working, but the second part (reporting back) has thrown up a bit of a stumbling block.
Per https://cloud.google.com/cloud-build/docs/send-build-notifications, Cloud Build is supposed to automatically publish update messages to a Pub/Sub topic entitled "cloud-builds". However, trying to find it (both through the web interface and via gcloud command line tool) has turned up nothing. Copious amounts of web searching has turned up https://github.com/GoogleCloudPlatform/google-cloud-visualstudio/issues/556, which seems to suggest that the topic referenced in that doc is now being filtered out of results; however, that issue seems to be specific to the visual studio tools and not GCP as a whole. Moreover, https://cloud.google.com/cloud-build/docs/configure-third-party-notifications suggests that it's still accessible, but perhaps only to Cloud Functions? And maybe only manually via the command line, since the web interface for Cloud Functions also does not display this phantom "cloud-builds" topic?
Any guidance as to where I can go from here? As near as I can tell, the two possibilities are that something is utterly borked in my GCP project (the Pub/Sub topic is either not visible just for me or has somehow been deleted), or I'm right and this topic just isn't accessible anymore.
I was stuck on the same issue; after a while, I created the cloud-builds topic manually and created a cloud function subscribed to that topic.
Build details are pushed to the topic as expected after that, and my cloud function gets triggered by new events.
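A sketch of roughly what that looks like from the command line (the function name and entry point are placeholders):

gcloud pubsub topics create cloud-builds
gcloud functions deploy build-notifier \
    --runtime go113 \
    --trigger-topic cloud-builds \
    --entry-point HandleBuildEvent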
You can check for the existence of the cloud-builds topic another way than through the UI: download the gcloud command-line tool and, after running gcloud init, run gcloud pubsub topics list to list all topics for the configured project. If the topic projects/{your project}/topics/cloud-builds is not listed, I would suggest filing a bug with the Cloud Build team here.
Creating the cloud-builds topic manually won't work, since it's a special topic that Google manages.
In this case, you have to go to the APIs & Services console and disable the Cloud Build API, then enable it again; the cloud-builds topic will be created for you. (Screenshot: enable and disable the Cloud Build API.)
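If you prefer the command line to the console, the same disable/enable cycle can be done with the following (assuming the standard Cloud Build service name):

gcloud services disable cloudbuild.googleapis.com
gcloud services enable cloudbuild.googleapis.com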

Amazon SWF Lambda Functions error - not available in region

I'm implementing a workflow with Amazon SWF, and one of my activities comes in the form of a Lambda function.
Both SWF and Lambda are running in the London region (eu-west-2), where they both work separately. However, after my decider polls for the task, the decision fails with the cause "LAMBDA_SERVICE_NOT_AVAILABLE_IN_REGION".
I haven't explicitly specified in code which region I'm working from; I assumed it would be the same one that I run the SWF web client in.
Here's the relevant code in my decider:
// Build the attributes for the Lambda task and add a
// ScheduleLambdaFunction decision to the decision list.
val attrs = ScheduleLambdaFunctionDecisionAttributes()
        .withId("S3ControlWorkflowFunction")
        .withName("S3ControlWorkflowFunction")
decisions.add(
        Decision()
                .withDecisionType(DecisionType.ScheduleLambdaFunction)
                .withScheduleLambdaFunctionDecisionAttributes(attrs)
)
My activity worker doesn't do anything at all for the Lambda function, but it shouldn't have to, right?
I've registered the workflow with my IAM role here:
wf.registerWorkflowType(RegisterWorkflowTypeRequest()
        .withDomain(DOMAIN)
        .withName(WORKFLOW)
        .withVersion(WORKFLOW_VERSION)
        .withDefaultChildPolicy(ChildPolicy.TERMINATE)
        .withDefaultTaskList(TaskList().withName(TASKLIST))
        .withDefaultTaskStartToCloseTimeout("30")
        // the IAM role SWF assumes when invoking Lambda functions
        .withDefaultLambdaRole(iamARN.id))
Found the fix.
It turns out that calling Lambda functions from SWF just isn't supported in region eu-west-2, as well as a few others, yet I can't find any reference to this at all in the documentation. I found this forum post, which gave me the hint. Migrating all the work I'd done over to eu-west-1 solved the issue. Poor show from Amazon here.
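One more note: rather than relying on the default region resolution, the client can be pinned explicitly when it is built. A minimal sketch with the AWS SDK for Java v1 builder (the variable name is illustrative):

// Explicitly pin the SWF client to eu-west-1, where the Lambda
// integration is available (unlike eu-west-2 at the time of writing).
val swf = AmazonSimpleWorkflowClientBuilder.standard()
        .withRegion(Regions.EU_WEST_1)
        .build()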

MTurk external website example

I am looking for an example where a fully completed web app can be embedded into Amazon Mechanical Turk. I am working on a "game-like" activity that does not really fit a form structure.
Here is my game/activity:
http://52.91.100.69:3030/
I would like to embed such tasks inside Mechanical Turk. My code accepts URL parameters such as assignmentId, workerId, etc. (which I have found from the AWS MTurk docs).
For example:
http://52.91.100.69:3030/?assignmentId=23423&workerId=34&hitId=455
Basically, I am handling all the data logging etc.; I plan to generate codes for users to enter upon completion of a number of tasks.
I would like to know how I can accomplish this. Preferably in Python (boto)?
I looked at this tutorial: http://kaflurbaleen.blogspot.com/2014/06/in-which-i-battle-mturk-external-hits.html
Using this I made this boto file: https://gist.github.com/arendu/631a416e4cb17decb9dd
When I run it I don't see any errors, but I can't seem to find out whether the HIT is available. I checked my AWS MTurk requester console (looked at "manage HITs individually"), but no HITs are present.
What am I doing wrong?
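Two things worth checking here. First, external HITs are defined by an ExternalQuestion XML payload whose ExternalURL needs to be served over HTTPS — the MTurk worker page itself is HTTPS, so browsers block a plain http:// frame like the one above as mixed content. A minimal sketch with a placeholder URL:

<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://your-task-host.example.com/</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>

Second, if the HIT was created against the sandbox endpoint (mechanicalturk.sandbox.amazonaws.com), it only shows up in the sandbox consoles (requestersandbox.mturk.com / workersandbox.mturk.com), not in the production requester console — easy to miss when no errors are raised.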