I am trying to host my Autodesk Forge model on AWS. In the process of creating the stack, I get an error pointing to this template: https://aws-quickstart.s3.amazonaws.com/quickstart-autodesk-forge/templates/autodesk-forge-master.json
The error indicates that the file does not exist.
How can I solve this problem?
How can I find this file (autodesk-forge-master.json)?
The Learn Forge website has a Deployment section that provides tips for deploying your application to various cloud hosting providers, including AWS, Azure, and Heroku.
However, if you're getting started with Forge (now called Autodesk Platform Services) development, I would suggest using one of the PaaS solutions such as Heroku or Fly.io.
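For the Heroku route, the deployment usually comes down to a few CLI commands. This is a hedged sketch, not the official steps: the app name and credential values are placeholders, and it assumes the Heroku CLI is installed and your project is a git repository with a Node.js entry point.

```shell
# Hedged sketch; all names and values below are placeholders.
heroku create my-forge-app                    # app name is a placeholder

# The Learn Forge samples read the Forge credentials from environment
# variables; set them as Heroku config vars (placeholder values here):
heroku config:set FORGE_CLIENT_ID=your-client-id \
                  FORGE_CLIENT_SECRET=your-client-secret

git push heroku main                          # deploy the current branch
```

Heroku then builds and runs the app from your repository, so no server provisioning is needed, which is why a PaaS is a gentler starting point than a full AWS stack.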
I am trying to set credentials for DynamoDB following the instructions here: https://aws.amazon.com/getting-started/hands-on/real-time-leaderboard-amazon-aurora-serverless-elasticache/?trk=gs_card.
Now I want to set the credentials inside const client = new DynamoDBClient({ credentials here }) by following https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-dynamodb/interfaces/awsauthinputconfig.html#signer. I wasn't sure of the format of the credentials inside the new DynamoDBClient() call, so I tried looking for the credentials code. The documentation says credentials is defined in packages/middleware-signing/dist/types/configurations.d.ts:6, but I cannot find that file at all.
How would I set the configuration, and what do they mean by credentials being defined in 'packages/middleware-signing/dist/types/configurations.d.ts:6'?
All AWS SDKs have their own Developer Guides. The AWS SDK for JavaScript is no different. To learn how to work with the AWS SDK for JavaScript, refer to the Developer Guide:
AWS SDK for JavaScript v3 Developer Guide
This guide contains all the information you need to get up and running with this SDK, including how to work with credentials.
To learn how to work with the JavaScript SDK and DynamoDB, see:
Build an app to submit data to DynamoDB
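As a concrete illustration, here is a minimal sketch of the shape of the client configuration object in SDK v3. The region and key values are placeholders; in real code you would normally omit `credentials` entirely and let the SDK's default provider chain pick them up from environment variables, the shared `~/.aws/credentials` file, or an attached IAM role. The SDK import is shown in comments so the sketch stays self-contained.

```javascript
// Hedged sketch: the `credentials` property of the client config is a
// plain object with accessKeyId / secretAccessKey (and an optional
// sessionToken). All values below are placeholders, not real keys.
const config = {
  region: "us-east-1",
  credentials: {
    accessKeyId: "AKIAEXAMPLE",        // placeholder
    secretAccessKey: "exampleSecret",  // placeholder
  },
};

// With the SDK installed (npm install @aws-sdk/client-dynamodb) you
// would pass this config to the client constructor:
// const { DynamoDBClient } = require("@aws-sdk/client-dynamodb");
// const client = new DynamoDBClient(config);
```

The `configurations.d.ts` path in the docs just refers to the generated TypeScript declaration file inside the published npm package; it is an artifact of how the reference docs are built, not a file you need to locate or edit yourself.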
I am using Antora to generate a static site for our documentation. I have followed their guidance for private repository authentication but have been unsuccessful. It seems that they only support HTTP Basic Auth for Git over HTTPS. I have tried generating and using an application-specific password, a Git cookie, and an OAuth token, all without success. Do you have any guidance on how to provide authentication?
At the moment, Cloud Source Repositories doesn't support this kind of username/password authentication. Here is a similar issue for Eclipse.
The only supported ways to authenticate at the moment are described in the public documentation.
If I understood your requirement correctly, you want to connect to a GCP Source Repository externally to push code. If so, you need to use a service account with Source Repositories access rights. Choose the appropriate roles using the URL below:
https://cloud.google.com/source-repositories/docs/reference/rest
Once you are authenticated, refer to the URL below for connecting to the repository and performing Git operations:
https://cloud.google.com/source-repositories/docs/authentication
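As an illustration of the service-account route, a hedged sketch follows; the key file path, project ID, and repository name are all placeholders, and it assumes the gcloud CLI is installed:

```shell
# Activate the service account (key.json path is a placeholder).
gcloud auth activate-service-account --key-file=key.json

# Let git use gcloud as its credential helper for Cloud Source
# Repositories, so git operations pick up the active credentials.
git config --global credential.helper gcloud.sh

# Clone over HTTPS; PROJECT_ID and REPO_NAME are placeholders.
git clone https://source.developers.google.com/p/PROJECT_ID/r/REPO_NAME
```

With the credential helper in place, subsequent `git push` and `git pull` calls against the clone authenticate automatically, with no username/password pair involved.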
Please let me know if this is what you are looking for.
Hope this helps.
I just started playing with Infrastructure as Code in Google Cloud.
Installed Terraform
Installed Terraformer
Created a new GCP project with a virtual machine in it.
My goal is to duplicate the project, with all its components, into a new project.
To do so, I am using Terraformer to reverse-Terraform my existing project. Command:
$ terraformer import google --connect --projects=[project_id] --resources=autoscalers,backendBuckets,backendServices,bigQuery,cloudFunctions,cloudsql,dataProc,disks,dns,firewalls,forwardingRules,gcs,gke,globalAddresses,globalForwardingRules,healthChecks,httpHealthChecks,httpsHealthChecks,iam,images,instanceGroupManagers,instanceGroups,instanceTemplates,instances,interconnectAttachments,kms,memoryStore,monitoring,networkEndpointGroups,networks,nodeGroups,nodeTemplates,project,pubsub,regionAutoscalers,regionBackendServices,regionDisks,regionInstanceGroupManagers,routers,routes,schedulerJobs,securityPolicies,sslPolicies,subnetworks,targetHttpProxies,targetHttpsProxies,targetInstances,targetPools,targetSslProxies,targetTcpProxies,targetVpnGateways,urlMaps,vpnTunnels
2019/06/20 08:00:08 google importing project [project_id]
2019/06/20 08:00:08 google importing... autoscalers
2019/06/20 08:00:19 googleapi: got HTTP response code 404 with body: Not Found
It seems like I have some kind of permission problem, since the Google API replies with a Not Found error code.
I guess Terraformer is using my gcloud permissions to access my GCP environment; is this true?
If so, my logged-in credentials have the Owner role on this project.
What should I check? How do I fix this issue?
You can use a service account with read access to the project, and set GOOGLE_CLOUD_KEYFILE_JSON to point to that service account's credentials.json.
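A hedged sketch of that setup; the key file path and project ID are placeholders for your own values:

```shell
# Point Terraformer (and the underlying Google provider) at the
# service account's key file; the path is a placeholder.
export GOOGLE_CLOUD_KEYFILE_JSON="$HOME/keys/terraformer-sa.json"

# Then rerun the import with the variable set, e.g.:
# terraformer import google --projects=my-project --resources=instances,disks
```

Using a dedicated service account also makes the failure mode clearer: if a resource API is not enabled or the account lacks a role, the resulting 403/404 points at a specific, auditable identity rather than your interactive gcloud login.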
I configured a Parse Server on AWS Elastic Beanstalk using this guide, and I've tested it and it all works fine.
Now I can't find a way to deploy Parse Dashboard on my server.
I did deploy Parse Dashboard on my localhost and connected it to the application on the server, but this way I cannot manage (add and remove) my apps.
Another problem is that Parse Dashboard is missing cloud code by default. I found this on GitHub, but I can't understand where to add the requested endpoints. Is it something like adding app.use('/scripts', express.static(path.join(__dirname, '/scripts'))); to the index.js file?
In order to deploy parse-dashboard to your EC2 instance, you need to follow the Deploying Parse Dashboard section on the parse-dashboard GitHub page:
parse-dashboard GitHub page
Please make sure that when you deploy parse-dashboard you use HTTPS and also basic authentication (this is also part of the guide).
Now, regarding the cloud code: the ability to deploy cloud code via the Parse CLI and to view the Node.js code in the dashboard are not available in parse-server; those are parse.com features. Cloud code in parse-server is handled by modifying the main.js file under the cloud folder, and deployment must be done manually by you. The big advantage of parse-server cloud code is that you can use any Node.js module you want from there; you are not restricted to the modules that were used by parse.com.
Another point about the dashboard: you can create an Express application, add parse-server and parse-dashboard as middleware to it, and deploy the whole application to AWS. Then you can enjoy both parse-server (available under the /parse path, unless you changed it to something else) and parse-dashboard (available under the /dashboard path).
Enjoy :)