Amplify Push erroring out when updating CustomResources.json - amazon-web-services

I'm building a geospatial search on properties using AWS Amplify and ElasticSearch.
I'm currently following this guide: https://gerard-sans.medium.com/finding-the-nearest-locations-around-you-using-aws-amplify-part-2-ce4603605be6
I set up my model as follows:
type Property @model @searchable @auth(rules: [{allow: public}]) {
id: ID!
...
Loc: Coord!
}
type Coord {
lon: Float!
lat: Float!
}
I also added a custom Query:
type Query {
nearbyProperties(
location: LocationInput!,
m: Int,
limit: Int,
nextToken: String
): ModelPropertyConnection
}
input LocationInput {
lat: Float!
lon: Float!
}
type ModelPropertyConnection {
items: [Property]
total: Int
nextToken: String
}
I added resolvers for request and response:
## Query.nearbyProperties.req.vtl
## Objects of type Property will be stored in the /property index
#set( $indexPath = "/property/doc/_search" )
#set( $distance = $util.defaultIfNull($ctx.args.m, 500) )
#set( $limit = $util.defaultIfNull($ctx.args.limit, 10) )
{
"version": "2017-02-28",
"operation": "GET",
"path": "$indexPath.toLowerCase()",
"params": {
"body": {
"from" : 0,
"size" : ${limit},
"query": {
"bool" : {
"must" : {
"match_all" : {}
},
"filter" : {
"geo_distance" : {
"distance" : "${distance}m",
"Loc" : $util.toJson($ctx.args.location)
}
}
}
},
"sort": [{
"_geo_distance": {
"Loc": $util.toJson($ctx.args.location),
"order": "asc",
"unit": "m",
"distance_type": "arc"
}
}]
}
}
}
and response:
## Query.nearbyProperties.res.vtl
#set( $items = [] )
#foreach( $entry in $context.result.hits.hits )
#if( !$foreach.hasNext )
#set( $nextToken = "$entry.sort.get(0)" )
#end
$util.qr($items.add($entry.get("_source")))
#end
$util.toJson({
"items": $items,
"total": $ctx.result.hits.total,
"nextToken": $nextToken
})
And now the CustomResources.json:
{
"AWSTemplateFormatVersion": "2010-09-09",
"Description": "An auto-generated nested stack.",
"Metadata": {},
"Parameters": {
"AppSyncApiId": {
"Type": "String",
"Description": "The id of the AppSync API associated with this project."
},
"AppSyncApiName": {
"Type": "String",
"Description": "The name of the AppSync API",
"Default": "AppSyncSimpleTransform"
},
"env": {
"Type": "String",
"Description": "The environment name. e.g. Dev, Test, or Production",
"Default": "NONE"
},
"S3DeploymentBucket": {
"Type": "String",
"Description": "The S3 bucket containing all deployment assets for the project."
},
"S3DeploymentRootKey": {
"Type": "String",
"Description": "An S3 key relative to the S3DeploymentBucket that points to the root\nof the deployment directory."
}
},
"Resources": {
"QueryNearbyProperties": {
"Type": "AWS::AppSync::Resolver",
"Properties": {
"ApiId": { "Ref": "AppSyncApiId" },
"DataSourceName": "ElasticSearchDomain",
"TypeName": "Query",
"FieldName": "nearbyProperties",
"RequestMappingTemplateS3Location": {
"Fn::Sub": [
"s3://${S3DeploymentBucket}/${S3DeploymentRootKey}/resolvers/Query.nearbyProperties.req.vtl", {
"S3DeploymentBucket": { "Ref": "S3DeploymentBucket" },
"S3DeploymentRootKey": { "Ref": "S3DeploymentRootKey" }
}]
},
"ResponseMappingTemplateS3Location": {
"Fn::Sub": [ "s3://${S3DeploymentBucket}/${S3DeploymentRootKey}/resolvers/Query.nearbyProperties.res.vtl", {
"S3DeploymentBucket": { "Ref": "S3DeploymentBucket" },
"S3DeploymentRootKey": { "Ref": "S3DeploymentRootKey" }
}]
}
}
}
},
"Conditions": {
"HasEnvironmentParameter": {
"Fn::Not": [
{
"Fn::Equals": [
{
"Ref": "env"
},
"NONE"
]
}
]
},
"AlwaysFalse": {
"Fn::Equals": ["true", "false"]
}
},
"Outputs": {
"EmptyOutput": {
"Description": "An empty output. You may delete this if you have at least one resource above.",
"Value": ""
}
}
}
But when I try to amplify push, it does not work. Something about: Resource is not in the state stackUpdateComplete
Any help?

You could take a look at the resource in CloudFormation. Your stack is probably stuck in an update. Go to CloudFormation, select your stack (or uncheck "View nested" first) and go to the Events tab. There you will probably find the reason why the stack can't update.
If it's stuck, cancel the update from the stack actions.

Related

How to enable API Gateway endpoint Authorization flag to use AWS Cognito user pool - using Terraform

I am trying to enable the Authorization flag and OAuth scope directly from the API JSON definition deployed through Terraform. I am able to attach Cognito to the API Gateway as the authorizer, but I am not able to enable it on the endpoints using Terraform (please see the attached screenshot).
#Screenshot
Here's the attached code for API Gateway:
#Create API Gateway
resource "aws_api_gateway_rest_api" "manidemoapi" {
name = "manidemoapi"
body = <<EOF
{
"openapi": "3.0.1",
"info": {
"title": "Example Pet Store",
"description": "A Pet Store API.",
"version": "1.0"
},
"paths": {
"/pets": {
"get": {
"operationId": "GET HTTP",
"parameters": [
{
"name": "type",
"in": "query",
"schema": {
"type": "string"
}
},
{
"name": "page",
"in": "query",
"schema": {
"type": "string"
}
}
],
"responses": {
"200": {
"description": "200 response",
"headers": {
"Access-Control-Allow-Origin": {
"schema": {
"type": "string"
}
}
},
"content": {
"application/json": {
"schema": {
"$ref": "#/components/schemas/Pets"
}
}
}
}
},
"x-amazon-apigateway-integration": {
"type": "HTTP_PROXY",
"httpMethod": "GET",
"uri": "http://petstore.execute-api.us-west-1.amazonaws.com/petstore/pets",
"payloadFormatVersion": 1.0
}
},
"post": {
"operationId": "Create Pet",
"requestBody": {
"content": {
"application/json": {
"schema": {
"$ref": "#/components/schemas/NewPet"
}
}
},
"required": true
},
"responses": {
"200": {
"description": "200 response",
"headers": {
"Access-Control-Allow-Origin": {
"schema": {
"type": "string"
}
}
},
"content": {
"application/json": {
"schema": {
"$ref": "#/components/schemas/NewPetResponse"
}
}
}
}
},
"x-amazon-apigateway-integration": {
"type": "HTTP_PROXY",
"httpMethod": "POST",
"uri": "http://petstore.execute-api.us-west-1.amazonaws.com/petstore/pets",
"payloadFormatVersion": 1.0
}
}
},
"/pets/{petId}": {
"get": {
"operationId": "Get Pet",
"parameters": [
{
"name": "petId",
"in": "path",
"required": true,
"schema": {
"type": "string"
}
}
],
"responses": {
"200": {
"description": "200 response",
"headers": {
"Access-Control-Allow-Origin": {
"schema": {
"type": "string"
}
}
},
"content": {
"application/json": {
"schema": {
"$ref": "#/components/schemas/Pet"
}
}
}
}
},
"x-amazon-apigateway-integration": {
"type": "HTTP_PROXY",
"httpMethod": "GET",
"uri": "http://petstore.execute-api.us-west-1.amazonaws.com/petstore/pets/{petId}",
"payloadFormatVersion": 1.0
}
}
}
},
"x-amazon-apigateway-cors": {
"allowOrigins": [
"*"
],
"security" : [ {
"manicognito-authorizer" : [ "get_details" ]
} ],
"allowMethods": [
"GET",
"OPTIONS",
"POST"
],
"allowHeaders": [
"x-amzm-header",
"x-apigateway-header",
"x-api-key",
"authorization",
"x-amz-date",
"content-type"
]
},
"components": {
"securitySchemes" : {
"manicognito-authorizer" : {
"type" : "apiKey",
"name" : "Authorization",
"in" : "header",
"x-amazon-apigateway-authtype" : "cognito_user_pools"
}
},
"schemas": {
"Pets": {
"type": "array",
"items": {
"$ref": "#/components/schemas/Pet"
}
},
"Empty": {
"type": "object"
},
"NewPetResponse": {
"type": "object",
"properties": {
"pet": {
"$ref": "#/components/schemas/Pet"
},
"message": {
"type": "string"
}
}
},
"Pet": {
"type": "object",
"properties": {
"id": {
"type": "string"
},
"type": {
"type": "string"
},
"price": {
"type": "number"
}
}
},
"NewPet": {
"type": "object",
"properties": {
"type": {
"$ref": "#/components/schemas/PetType"
},
"price": {
"type": "number"
}
}
},
"PetType": {
"type": "string",
"enum": [
"dog",
"cat",
"fish",
"bird",
"gecko"
]
}
}
}
}
EOF
endpoint_configuration {
types = ["REGIONAL"]
}
}
#Deploy API Gateway
resource "aws_api_gateway_deployment" "manidemoapi" {
rest_api_id = aws_api_gateway_rest_api.manidemoapi.id
triggers = {
redeployment = sha1(jsonencode(aws_api_gateway_rest_api.manidemoapi.body))
}
lifecycle {
create_before_destroy = true
}
}
resource "aws_api_gateway_stage" "manidemoapi" {
deployment_id = aws_api_gateway_deployment.manidemoapi.id
rest_api_id = aws_api_gateway_rest_api.manidemoapi.id
stage_name = "manidemoapi-dev"
}
resource "aws_api_gateway_authorizer" "manidemoapi" {
name = "manicognito-authorizer"
type = "COGNITO_USER_POOLS"
rest_api_id = aws_api_gateway_rest_api.manidemoapi.id
provider_arns = [aws_cognito_user_pool.pool.arn]
}
The root problem is that authorization is "method-scoped", i.e. you have to specify the authorizer for each API method. You should add a Terraform resource "aws_api_gateway_method" like the following:
resource "aws_api_gateway_method" "default" {
http_method = <http-method>
authorization = "COGNITO_USER_POOLS"
authorizer_id = <your-authorizer-id>
resource_id = <resource-id>
rest_api_id = <rest-api-id>
}
However, since you are using the OpenAPI Specification approach rather than the Terraform resource approach to define the API, you may need to consider transforming your template to the latter approach; a sketch of the OpenAPI alternative is below.
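Alternatively, if you prefer to stay with the OpenAPI definition, here is a rough sketch of the method-level wiring (it reuses the manicognito-authorizer name from your template; the x-amazon-apigateway-authorizer extension and the interpolated user pool ARN are assumptions you would need to adapt): declare the authorizer under securitySchemes and attach security to each operation you want protected.
{
  "paths": {
    "/pets": {
      "get": {
        "security": [
          { "manicognito-authorizer": [] }
        ]
      }
    }
  },
  "components": {
    "securitySchemes": {
      "manicognito-authorizer": {
        "type": "apiKey",
        "name": "Authorization",
        "in": "header",
        "x-amazon-apigateway-authtype": "cognito_user_pools",
        "x-amazon-apigateway-authorizer": {
          "type": "cognito_user_pools",
          "providerARNs": ["${aws_cognito_user_pool.pool.arn}"]
        }
      }
    }
  }
}
Since the body is a Terraform heredoc, the ${aws_cognito_user_pool.pool.arn} reference should be interpolated by Terraform before the spec is sent to API Gateway.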

Cannot run AWS Data Pipeline job due to ListObjectsV2 operation: Access Denied

I've written some CDK code to programmatically create a data pipeline that backs up a DynamoDB table into an S3 bucket on a daily basis.
But it keeps running into this error:
amazonaws.datapipeline.taskrunner.TaskExecutionException: Failed to complete EMR transform. at amazonaws.datapipeline.activity.EmrActivity.runActivity(EmrActivity.java:67) at amazonaws.datapipeline.objects.AbstractActivity.run(AbstractActivity.java:16) at amazonaws.datapipeline.taskrunner.TaskPoller.executeRemoteRunner(TaskPoller.java:136) at amazonaws.datapipeline.taskrunner.TaskPoller.executeTask(TaskPoller.java:105) at amazonaws.datapipeline.taskrunner.TaskPoller$1.run(TaskPoller.java:81) at private.com.amazonaws.services.datapipeline.poller.PollWorker.executeWork(PollWorker.java:76) at private.com.amazonaws.services.datapipeline.poller.PollWorker.run(PollWorker.java:53) at java.lang.Thread.run(Thread.java:750) Caused by:
....
fatal error: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied at amazonaws.datapipeline.activity.mapreduce.HadoopJobRunner.executeCommand(HadoopJobRunner.java:175) at amazonaws.datapipeline.activity.mapreduce.HadoopJobRunner.ex
I'm using the DataPipelineDefaultResourceRole and DataPipelineDefaultRole for this data pipeline, which have the s3:* permission, so I'm puzzled why this is happening.
On top of that, I'm not sure why logging is not enabled on the EMR cluster spun up by this data pipeline, although I've specified the logLocation parameter: myLogUri
Any pointers please?
import { CfnPipeline } from "monocdk/aws-datapipeline";
private createDataPipeline(props: InfrastructureStackProps) {
const dataPipelineName = "a-nice-datapipeline8.23";
const pipeline = new CfnPipeline(this, dataPipelineName, {
name: dataPipelineName,
parameterObjects: [
{
id: "myDDBTableName",
attributes: [
{
key: "Description",
stringValue: "Source table"
},
{
key: "Type",
stringValue: "String"
},
{
key: "Default",
stringValue: "Attributes"
}
]
},
{
id: "myOutputS3Location",
attributes: [
{
key: "Description",
stringValue: "Output S3 Location"
},
{
key: "Type",
stringValue: "String"
},
{
key: "Default",
stringValue: "s3://ddb-table-backup/"
}
]
},
{
id: "myDdbReadThroughputRatio",
attributes: [
{
key: "Description",
stringValue: "DynamoDB Read Throughput Ratio"
},
{
key: "Type",
stringValue: "Double"
},
{
key: "Default",
stringValue: "0.15"
}
]
},
{
id: 'myLogUri',
attributes: [
{
key: 'type',
stringValue: 'AWS::S3::ObjectKey',
},
{
key: 'description',
stringValue: 'DataPipeline Log Uri',
},
],
},
{
id: "myDDBRegion",
attributes: [
{
key: "Description",
stringValue: "Region of the DynamoDB Table"
},
{
key: "Type",
stringValue: "String"
},
{
key: "Default",
stringValue: props.region
}
]
}
],
parameterValues: [
{
id: "myDDBTableName",
stringValue: "Attributes"
},
{
id: "myOutputS3Location",
stringValue: "s3://ddb-table-backup/"
},
{
id: "myDdbReadThroughputRatio",
stringValue: "0.15"
},
{
id: 'myLogUri',
stringValue: `s3://data_pipeline_log/`,
},
{
id: "myDDBRegion",
stringValue: props.region
}
],
pipelineObjects: [
{
"id": "EmrClusterForBackup",
"name": "EmrClusterForBackup",
"fields": [
{
"key": "resourceRole",
"stringValue": "DataPipelineDefaultResourceRole"
},
{
"key": "role",
"stringValue": "DataPipelineDefaultRole"
},
{
"key": "coreInstanceCount",
"stringValue": "1"
},
{
"key": "coreInstanceType",
"stringValue": "m4.xlarge"
},
{
"key": "releaseLabel",
"stringValue": "emr-5.29.0"
},
{
"key": "masterInstanceType",
"stringValue": "m4.xlarge"
},
{
"key": "region",
"stringValue": props.region
},
{
"key": "type",
"stringValue": "EmrCluster"
},
{
"key": "terminateAfter",
"stringValue": "2 Hours"
}
]
},
{
"id": "S3BackupLocation",
"name": "S3BackupLocation",
"fields": [
{
"key": "directoryPath",
"stringValue": "s3://ddb-table-backup/"
},
{
"key": "type",
"stringValue": "S3DataNode"
}
]
},
{
"id": "DDBSourceTable",
"name": "DDBSourceTable",
"fields": [
{
"key": "readThroughputPercent",
"stringValue": "0.15"
},
{
"key": "type",
"stringValue": "DynamoDBDataNode"
},
{
"key": "tableName",
"stringValue": "Attributes"
}
]
},
{
"id": "Default",
"name": "Default",
"fields": [
{
"key": "failureAndRerunMode",
"stringValue": "CASCADE"
},
{
"key": "resourceRole",
"stringValue": "DataPipelineDefaultResourceRole"
},
{
"key": "role",
"stringValue": "DataPipelineDefaultRole"
},
{
"key": "scheduleType",
"stringValue": "cron"
},
{
key: 'schedule',
refValue: 'DailySchedule'
},
{
key: 'pipelineLogUri',
stringValue: 's3://data_pipeline_log/',
},
{
"key": "type",
"stringValue": "Default"
}
]
},
{
"name": "Every 1 day",
"id": "DailySchedule",
"fields": [
{
"key": 'type',
"stringValue": 'Schedule'
},
{
"key": 'period',
"stringValue": '1 Day'
},
{
"key": 'startDateTime',
"stringValue": "2021-12-20T00:00:00"
}
]
},
{
"id": "TableBackupActivity",
"name": "TableBackupActivity",
"fields": [
{
"key": "type",
"stringValue": "EmrActivity"
},
{
"key": "output",
"refValue": "S3BackupLocation"
},
{
"key": "input",
"refValue": "DDBSourceTable"
},
{
"key": "maximumRetries",
"stringValue": "2"
},
{
"key": "preStepCommand",
"stringValue": "(sudo yum -y update aws-cli) && (aws s3 rm #{output.directoryPath} --recursive)"
},
{
"key": "step",
"stringValue": "s3://dynamodb-dpl-#{myDDBRegion}/emr-ddb-storage-handler/4.11.0/emr-dynamodb-tools-4.11.0-SNAPSHOT-jar-with-dependencies.jar,org.apache.hadoop.dynamodb.tools.DynamoDBExport,#{output.directoryPath},#{input.tableName},#{input.readThroughputPercent}"
},
{
"key": "runsOn",
"refValue": "EmrClusterForBackup"
},
{
"key": "resizeClusterBeforeRunning",
"stringValue": "false"
}
]
}
],
activate: true
});
return pipeline;
}
I may be avoiding the direct issue at hand here, but I'm curious to know why you are using Data Pipeline for this. You would probably be better served by AWS Backup, which will allow you to take periodic backups in a managed fashion, as well as other features such as expiring backups or sending them to cold storage.
On the particular issue at hand, please check that your S3 bucket does not have a resource-based policy blocking you: https://docs.aws.amazon.com/AmazonS3/latest/userguide/example-bucket-policies.html
Also check the EC2 role used for your Data Pipeline, commonly called AmazonEC2RoleforDataPipelineRole; the IAM roles for Data Pipeline are covered in the AWS documentation.
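To make that check concrete, here is a minimal sketch of the IAM statements the resource role (DataPipelineDefaultResourceRole) needs for the backup bucket named in your template; note that the ListObjectsV2 call maps to the s3:ListBucket action on the bucket ARN itself, not on the objects, and any bucket policy must not deny these actions:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListBackupBucket",
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::ddb-table-backup"
    },
    {
      "Sid": "ReadWriteBackupObjects",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::ddb-table-backup/*"
    }
  ]
}
The s3:DeleteObject permission matters here because your preStepCommand runs aws s3 rm --recursive against the output path.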

ElasticSearch reindexing with selected fields result into addition of non selected empty field

Scenario:
We are using AWS ElasticSearch 6.8. We have an index (index-A) with a mapping structure consisting of multiple nested objects and a JSON hierarchy. We need to create a new index (index-B) and move all documents from index-A to index-B.
We need to create index-B with only specific fields.
We need to rename field names while reindexing.
e.g.
index-A mapping:
{
"userdata": {
"properties": {
"payload": {
"type": "object",
"properties": {
"Alldata": {
"Username": {
"type": "keyword"
},
"Designation": {
"type": "keyword"
},
"Company": {
"type": "keyword"
},
"Region": {
"type": "keyword"
}
}
}
}
}
}}
Expected structure of index-B mapping after reindexing with rename (Company-cnm, Region-rg) :-
{
"userdata": {
"properties": {
"cnm": {
"type": "keyword"
},
"rg": {
"type": "keyword"
}
}
}}
Steps we are following:
First we use the Create Index API to create index-B with the above mapping structure.
Once the index is created, we create an ingest pipeline.
PUT ElasticSearch domain endpoint/_ingest/pipeline/my_rename_pipeline
{
"description": "rename field pipeline",
"processors": [{
"rename": {
"field": "payload.Company",
"target_field": "cnm",
"ignore_missing": true
}
},
{
"rename": {
"field": "payload.Region",
"target_field": "rg",
"ignore_missing": true
}
}
]
}
Then we perform the reindex operation; the payload for it is below:
let reindexParams = {
wait_for_completion: false,
slices: "auto",
body: {
"conflicts": "proceed",
"source": {
"size": 8000,
"index": "index-A",
"_source": ["payload.Company", "payload.Region"]
},
"dest": {
"index": "index-B",
"pipeline": "my_rename_pipeline",
"version_type": "external"
}
}
};
Problem:
Once the reindexing is complete, all documents are transferred to the new index with the renamed fields as expected, but there is one additional field that was not selected. As you can see below, the "payload" object with metadata is also added to the new index after reindexing. This field is empty and contains no data.
index-B looks like this after reindexing:
{
"userdata": {
"properties": {
"cnm": {
"type": "keyword"
},
"rg": {
"type": "keyword"
},
"payload": {
"properties": {
"Alldata": {
"type": "object"
}
}
}
}
}}
We are unable to find a workaround and need help with how to stop this field from being created. Any help will be appreciated.
Great job!! You're almost there, you simply need to remove the payload field within your pipeline using the remove processor and you're good:
{
"description": "rename field pipeline",
"processors": [
{
"rename": {
"field": "payload.Company",
"target_field": "cnm",
"ignore_missing": true
}
},
{
"rename": {
"field": "payload.Region",
"target_field": "rg",
"ignore_missing": true
}
},
{
"remove": { <--- add this processor
"field": "payload"
}
}
]
}
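If you want to sanity-check the pipeline before reindexing again, you could simulate it against a sample document (the sample values below are made up); the result should contain only cnm and rg, with no leftover payload object:
POST ElasticSearch domain endpoint/_ingest/pipeline/my_rename_pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "payload": {
          "Company": "Acme",
          "Region": "EU"
        }
      }
    }
  ]
}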

Data does not match any schemas from 'oneOf'

I am getting this error after upgrading my API from .NET Core 2.2 to 3.1 and trying to generate a client using AutoRest with the --v3 switch:
WARNING: Schema violation: Data does not match any schemas from 'oneOf'
I have tried with and without SerializeAsV2
I see from the AutoRest docs that this warning is because of an unsupported feature:
anyOf, oneOf are not currently supported
In services.AddSwaggerGen I have
c.ParameterFilter<SwaggerEnumParameterFilter>();
c.SchemaFilter<SwaggerEnumFilter>();
where
public void Apply(OpenApiParameter parameter, ParameterFilterContext context)
{
var type = context.ApiParameterDescription.Type;
if (type.IsEnum)
parameter.Extensions.Add("x-ms-enum", new OpenApiObject
{
["name"] = new OpenApiString(type.Name),
["modelAsString"] = new OpenApiBoolean(false)
});
}
public class SwaggerEnumFilter : ISchemaFilter
{
public void Apply(OpenApiSchema model, SchemaFilterContext context)
{
if (model == null)
throw new ArgumentNullException("model");
if (context == null)
throw new ArgumentNullException("context");
if (context.Type.IsEnum)
model.Extensions.Add(
"x-ms-enum",
new OpenApiObject
{
["name"] = new OpenApiString(context.Type.Name),
["modelAsString"] = new OpenApiBoolean(false)
}
);
}
}
[update]
After upgrading to Autorest 3.0.6244 the warnings have changed to errors and the error message ends with
post > parameters > 0)
If I don't use the v3 switch I get the error
FATAL: swagger-document/individual/schema-validator - FAILED
FATAL: Error: [OperationAbortedException] Error occurred. Exiting.
Process() cancelled due to exception : [OperationAbortedException] Error occurred. Exiting.
I can see in the swagger.json that the parameters property "name" is not generating correctly. Here it contains "body" whereas previously it contained "info"
"/api/FrameLookUp": {
"post": {
"tags": [
"Frame"
],
"operationId": "FrameLookup",
"consumes": [
"application/json-patch+json",
"application/json",
"text/json",
"application/*+json"
],
"produces": [
"application/json"
],
"parameters": [
{
"in": "header",
"name": "Authorization",
"description": "access token",
"required": true,
"type": "String"
},
{
"in": "body",
"name": "body",
"schema": {
"$ref": "#/definitions/FrameRequest"
}
}
],
"responses": {
"200": {
"description": "Success",
"schema": {
"$ref": "#/definitions/FrameResponse"
}
}
}
}
},
The controller is
[Produces("application/json")]
[Authorize(AuthenticationSchemes = JwtBearerDefaults.AuthenticationScheme)]
[Route("api")]
public class FrameController : MyController
{
[ProducesResponseType(typeof(FrameResponse), StatusCodes.Status200OK)]
[HttpPost("FrameLookUp")]
public IActionResult FrameLookup([FromBody] FrameRequest info)
{
IMyResponse MyFunc(IMyRequest x) => FrameData.FrameLookUp(info);
return InnerMethod(MyFunc, info);
}
}
Update
I have also tried using the SwaggerParameter from Swashbuckle.AspNetCore.Annotations
[Update]
I am thinking that maybe I just need to try the release for issue 1766
I tried cloning the swashbuckle.aspnetcore repo but ran into this issue
[Update]
I added c.GeneratePolymorphicSchemas(); to the AddSwaggerGen options but it has not helped.
[Update]
Here is the first error message
ERROR: Schema violation: Data does not match any schemas from 'oneOf'
- https://localhost:44348/api-docs/v1/swagger.json:1951:8 ($.paths["/api/synchronise-management/get-product-images-Ids"].post.parameters)
Investigating line 1951 in swagger.json:
In the working swagger (generated from the .NET Core 2.2 project) the JSON looks very similar, however the parameter order is swapped.
The other difference I can see is the generated name of the parameter.
I see from this question that the error occurs in the same place.
[Update]
When I add the --debug switch to the autorest call I get:
/configuration
DEBUG: pipeline-emitter - END
DEBUG: configuration-emitter - END
DEBUG: swagger-document-override/md-override-loader - END
DEBUG: swagger-document/loader - END
DEBUG: swagger-document/individual/transform - START
DEBUG: swagger-document/individual/transform - END
DEBUG: swagger-document/individual/schema-validator - START
ERROR: Schema violation: Data does not match any schemas from 'oneOf'
- https://localhost:44348/api/v1/swagger.json:1951:8 ($.paths["/api/synchronise-management/get-product-images-Ids"].
[Update]
Here is the cut down json
{
"swagger": "2.0",
"info": {
"title": "myapi API31",
"description": "ASP.NET Core Web API",
"version": "v1"
},
"host": "localhost:44348",
"basePath": "/v1",
"schemes": [
"https"
],
"paths": {
"/api/Test": {
"get": {
"tags": [
"Auth"
],
"operationId": "Test",
"responses": {
"200": {
"description": "Success"
}
}
}
},
"/api/RequestToken": {
"post": {
"tags": [
"Auth"
],
"operationId": "RequestToken",
"consumes": [
"application/json-patch+json",
"application/json",
"text/json",
"application/*+json"
],
"produces": [
"application/json"
],
"parameters": [
{
"in": "body",
"name": "body",
"schema": {
"$ref": "#/definitions/TokenRequest"
}
}
],
"responses": {
"200": {
"description": "Success",
"schema": {
"$ref": "#/definitions/TokenResponse"
}
}
}
}
},
"/api/FrameLookUp": {
"post": {
"tags": [
"Frame"
],
"operationId": "FrameLookup",
"consumes": [
"application/json-patch+json",
"application/json",
"text/json",
"application/*+json"
],
"produces": [
"application/json"
],
"parameters": [
{
"in": "header",
"name": "Authorization",
"description": "access token",
"required": true,
"type": "String"
},
{
"in": "body",
"name": "body",
"schema": {
"$ref": "#/definitions/FrameRequest"
}
}
],
"responses": {
"200": {
"description": "Success",
"schema": {
"$ref": "#/definitions/FrameResponse"
}
}
}
}
}
},
"definitions": {
"TokenRequest": {
"required": [
"password",
"username"
],
"type": "object",
"properties": {
"username": {
"type": "string"
},
"password": {
"type": "string"
}
}
},
"TokenResponse": {
"type": "object",
"properties": {
"tokenResult": {
"type": "string"
}
}
},
"FramePackTypeEnum": {
"enum": [
"NotApplicable",
"PipeRack",
"LwBVan",
"VanTray",
"Car",
"CarryBag"
],
"type": "string",
"x-ms-enum": {
"name": "FramePackTypeEnum",
"modelAsString": false
}
},
"FrameRequest": {
"type": "object",
"properties": {
"qCodeJobId": {
"format": "int32",
"type": "integer"
},
"quantity": {
"format": "int32",
"type": "integer"
},
"widthInMm": {
"format": "int32",
"type": "integer"
},
"heightInMm": {
"format": "int32",
"type": "integer"
},
"ePackingType": {
"$ref": "#/definitions/FramePackTypeEnum"
},
"userEmail": {
"type": "string"
}
}
},
"FrameCaseEnum": {
"enum": [
"Case0_NoBraces",
"Case1_1Vertical_0Horizontal",
"Case2_2Vertical_0Horizontal",
"Case3_NVertical_0Horizontal",
"Case4_0Vertical_1Horizontal",
"Case5_1Vertical_1Horizontal",
"Case6_2Vertical_1Horizontal",
"Case7_NVertical_1Horizontal",
"Case8_0Vertical_2Horizontal",
"Case9_1Vertical_2Horizontal",
"Case10_2Vertical_2Horizontal",
"Case11_NVertical_2Horizontal",
"Case12_0Vertical_NHorizontal",
"Case13_1Vertical_NHorizontal",
"Case14_2Vertical_NHorizontal",
"Case15_NVertical_NHorizontal"
],
"type": "string",
"x-ms-enum": {
"name": "FrameCaseEnum",
"modelAsString": false
}
},
"FrameResponse": {
"type": "object",
"properties": {
"description": {
"type": "string"
},
"caseNumber": {
"$ref": "#/definitions/FrameCaseEnum"
},
"memberPriceEachExGst": {
"format": "double",
"type": "number"
},
"retailPriceEachExGst": {
"format": "double",
"type": "number"
}
}
}
}
}
With the .netcore2.2 api the request generates as
"FrameRequest": {
"type": "object",
"properties": {
"qCodeJobId": {
"format": "int32",
"type": "integer"
},
"quantity": {
"format": "int32",
"type": "integer"
},
"widthInMm": {
"format": "int32",
"type": "integer"
},
"heightInMm": {
"format": "int32",
"type": "integer"
},
"ePackingType": {
"enum": [
"NotApplicable",
"PipeRack",
"LwBVan",
"VanTray",
"Car",
"CarryBag"
],
"type": "string",
"x-ms-enum": {
"name": "FramePackTypeEnum",
"modelAsString": false
}
},
"userEmail": {
"type": "string"
}
}
}
Here is the command line I am running
autorest --input-file=.\myswagger.json --output-folder=generated --csharp --namespace=DDD --debug
Some links which the author, Kirsten Greed, put in comments:
https://github.com/domaindrivendev/Swashbuckle.AspNetCore#schema-filters
https://github.com/domaindrivendev/Swashbuckle.AspNetCore/pull/1766
https://stackoverflow.com/questions/63857310/could-not-find-a-part-of-the-path-d-dev-swashbuckle-aspnetcore-src-swashbuckle
From your swagger.json we can see the validation shows:
https://validator.swagger.io/validator/debug?url=https://raw.githubusercontent.com/heldersepu/hs-scripts/master/swagger/63783800_swagger.json
{
"schemaValidationMessages": [
{
"level": "error",
"domain": "validation",
"keyword": "oneOf",
"message": "instance failed to match exactly one schema (matched 0 out of 2)",
"schema": {
"loadingURI": "http://swagger.io/v2/schema.json#",
"pointer": "/definitions/parametersList/items"
},
"instance": {
"pointer": "/paths/~1api~1FrameLookUp/post/parameters/0"
}
}
]
}
That leads us to your code: the type: "String" should be type: "string"; with it all lower case the error goes away.
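With that change the header parameter in your swagger.json becomes the following, which matches the Swagger 2.0 parameter schema and clears the 'oneOf' error:
{
  "in": "header",
  "name": "Authorization",
  "description": "access token",
  "required": true,
  "type": "string"
}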

Cloudformation Property validation failure: Encountered unsupported properties

I'm trying to create a nested stack where the root stack looks like this:
{
"AWSTemplateFormatVersion": "2010-09-09",
"Resources": {
"DynamoDBTable": {
"Type": "AWS::CloudFormation::Stack",
"Properties": {
"Parameters": {
"TableName": {
"Fn::Sub": "${AWS::StackName}"
}
},
"TemplateURL": "https://s3.amazonaws.com/my-templates-bucket/dynamodb.json"
}
},
"S3WebsiteReact": {
"Type": "AWS::CloudFormation::Stack",
"Properties": {
"Parameters": {
"BucketName": {
"Fn::Sub": "${AWS::StackName}-website"
}
},
"TemplateURL": "https://s3.amazonaws.com/my-templates-bucket/s3-static-website-react.json"
}
},
"S3UploadBucket": {
"Type": "AWS::CloudFormation::Stack",
"Properties": {
"Parameters": {
"BucketName": {
"Fn::Sub": "${AWS::StackName}-upload"
}
},
"TemplateURL": "https://s3.amazonaws.com/my-templates-bucket/s3-with-cors.json"
}
},
"Cognito": {
"Type": "AWS::CloudFormation::Stack",
"DependsOn": "DynamoDBTable",
"Properties": {
"Parameters": {
"CognitoUserPoolName": {
"Fn::Join" : ["",
{
"Fn::Split": ["-", {
"Ref": "AWS::StackName"
}]
}
]
}
},
"TemplateURL": "https://s3.amazonaws.com/my-templates-bucket/cognito.json"
}
},
"ApiGateway": {
"Type": "AWS::CloudFormation::Stack",
"DependsOn": ["DynamoDBTable", "Cognito"],
"Properties": {
"Parameters": {
"ApiGatewayName": {
"Fn::Sub": "${AWS::StackName}-api"
},
"CognitoUserPoolArn": {
"Fn::GetAtt": [ "Cognito", "Outputs.UserPoolArn" ]
},
"DynamoDBStack": {
"Fn::GetAtt": [ "DynamoDBTable", "Outputs.DDBStackName" ]
}
},
"TemplateURL": "https://s3.amazonaws.com/my-templates-bucket/api-gateway.json"
}
},
"IdentityPool": {
"Description": "Cognito Identity Pool. Must be created after User Pool and API Gateway.",
"Type": "AWS::Cognito::IdentityPool",
"DependsOn": ["Cognito", "ApiGateway", "S3UploadBucket"],
"Properties": {
"Parameters": {
"AppClientId": {
"Fn::GetAtt": [ "Cognito", "Outputs.AppClientId" ]
},
"UserPoolProviderName": {
"Fn::GetAtt": [ "Cognito", "Outputs.ProviderName" ]
},
"UserPoolName": {
"Fn::GetAtt": [ "Cognito", "Outputs.UserPoolName" ]
},
"UploadBucketName": {
"Fn::GetAtt": [ "S3UploadBucket", "Outputs.UploadBucketName" ]
},
"ApiGatewayId": {
"Fn::GetAtt": [ "ApiGateway", "Outputs.ApiGatewayId" ]
}
},
"TemplateURL": "https://s3.amazonaws.com/my-templates-bucket/identity-pool.json"
}
}
},
"Outputs": {
}
}
And I get this error:
2019-06-19 14:45:14 UTC-0400 IdentityPool CREATE_FAILED Property validation failure: [Encountered unsupported properties in {/}: [TemplateURL, Parameters]]
It looks like my identity pool stack has some issues with the parameters. But the identity pool stack parameters look like this:
"Parameters" : {
"AppClientId": {
"Description": "ID of the App Client of the Cognito User Pool passed into this stack.",
"Type": "String"
},
"UserPoolProviderName": {
"Description": "Cognito User Pool Provider name passed into this stack.",
"Type": "String"
},
"UserPoolName": {
"Description": "Cognito User Pool Name passed into this stack.",
"Type": "String"
},
"UploadBucketName": {
"Description": "Name of the bucket that is used to upload files to.",
"Type": "String"
},
"ApiGatewayId": {
"Description": "ID of the API Gateway created for the stack.",
"Type": "String"
}
},
The funny thing is: I tried creating each stack on its own, then passed the outputs from them as parameters to the stacks that need those parameters and every single stack was created successfully without any problems.
I've tried to look for what is unsupported but was unable to find any answers.
The error:
[Encountered unsupported properties in {/}: [TemplateURL, Parameters]]
It says that those two properties are unsupported. Unlike all the rest of the resources declared in your template, which also use those two properties, this resource is an AWS::Cognito::IdentityPool, while the rest are all of type AWS::CloudFormation::Stack.
Those two properties are only valid on the AWS::CloudFormation::Stack type, hence the validation error.
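If the identity pool is meant to be another nested stack like the rest (assuming identity-pool.json itself declares the AWS::Cognito::IdentityPool and accepts the parameters shown above), changing the resource type is enough, for example:
"IdentityPool": {
  "Type": "AWS::CloudFormation::Stack",
  "DependsOn": ["Cognito", "ApiGateway", "S3UploadBucket"],
  "Properties": {
    "Parameters": {
      "AppClientId": { "Fn::GetAtt": [ "Cognito", "Outputs.AppClientId" ] },
      "UserPoolProviderName": { "Fn::GetAtt": [ "Cognito", "Outputs.ProviderName" ] },
      "UserPoolName": { "Fn::GetAtt": [ "Cognito", "Outputs.UserPoolName" ] },
      "UploadBucketName": { "Fn::GetAtt": [ "S3UploadBucket", "Outputs.UploadBucketName" ] },
      "ApiGatewayId": { "Fn::GetAtt": [ "ApiGateway", "Outputs.ApiGatewayId" ] }
    },
    "TemplateURL": "https://s3.amazonaws.com/my-templates-bucket/identity-pool.json"
  }
}
Otherwise, keep the AWS::Cognito::IdentityPool type and give it the identity pool's own properties (AllowUnauthenticatedIdentities, CognitoIdentityProviders, and so on) instead of Parameters and TemplateURL.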