AWS Elemental MediaConvert: mov file is not supported

I want to overlay a movie (.mov) on top of another movie (.mp4) using AWS Elemental MediaConvert.
I uploaded one .mov file and one .mp4 file to S3; both have similar dimensions.
In AWS Elemental MediaConvert, I created a job:
Input: .mp4 file
Motion image inserter: .mov file
Output: .mp4 file
Result: the job failed with this error message:
MGILoaderMOV [s3://test/overlay.mov] file contains unsupported pixel format.
The .mov file itself seems fine, since I can play it on my laptop, and the IAM role includes "Full access to your Amazon S3 resources".
Any suggestions are appreciated.
***** More detail *****
@Tiziano Coroneo, I got a new overlay.mov (converted with FFmpeg, dimensions 450x450). I set up the output as below:
Preprocessors: Input cropping rectangle X(100), Y(100), Width(450), Height(450)
The job executes successfully, but the output doesn't include the overlay.
{
"Queue": "arn:aws:mediaconvert:yyyyyyy:xxxxxxxx:queues/Default",
"UserMetadata": {},
"Role": "arn:aws:iam::xxxxxxxxxxxxx:role/my_media_role",
"Settings": {
"OutputGroups": [
{
"Name": "File Group",
"Outputs": [
{
"ContainerSettings": {
"Container": "MP4",
"Mp4Settings": {
"CslgAtom": "INCLUDE",
"FreeSpaceBox": "EXCLUDE",
"MoovPlacement": "PROGRESSIVE_DOWNLOAD"
}
},
"VideoDescription": {
"ScalingBehavior": "DEFAULT",
"Crop": {
"Height": 450,
"Width": 450,
"X": 100,
"Y": 100
},
"TimecodeInsertion": "DISABLED",
"AntiAlias": "ENABLED",
"Sharpness": 50,
"CodecSettings": {
"Codec": "H_264",
"H264Settings": {
"InterlaceMode": "PROGRESSIVE",
"NumberReferenceFrames": 3,
"Syntax": "DEFAULT",
"Softness": 0,
"GopClosedCadence": 1,
"GopSize": 90,
"Slices": 1,
"GopBReference": "DISABLED",
"SlowPal": "DISABLED",
"SpatialAdaptiveQuantization": "ENABLED",
"TemporalAdaptiveQuantization": "ENABLED",
"FlickerAdaptiveQuantization": "DISABLED",
"EntropyEncoding": "CABAC",
"Bitrate": 1000000,
"FramerateControl": "INITIALIZE_FROM_SOURCE",
"RateControlMode": "CBR",
"CodecProfile": "MAIN",
"Telecine": "NONE",
"MinIInterval": 0,
"AdaptiveQuantization": "HIGH",
"CodecLevel": "AUTO",
"FieldEncoding": "PAFF",
"SceneChangeDetect": "ENABLED",
"QualityTuningLevel": "SINGLE_PASS",
"FramerateConversionAlgorithm": "DUPLICATE_DROP",
"UnregisteredSeiTimecode": "DISABLED",
"GopSizeUnits": "FRAMES",
"ParControl": "INITIALIZE_FROM_SOURCE",
"NumberBFramesBetweenReferenceFrames": 2,
"RepeatPps": "DISABLED",
"DynamicSubGop": "STATIC"
}
},
"AfdSignaling": "NONE",
"DropFrameTimecode": "ENABLED",
"RespondToAfd": "NONE",
"ColorMetadata": "INSERT"
},
"Extension": ".mp4",
"NameModifier": "overlay_video"
}
],
"OutputGroupSettings": {
"Type": "FILE_GROUP_SETTINGS",
"FileGroupSettings": {
"Destination": "s3://XXXXXXX/files/"
}
}
}
],
"AdAvailOffset": 0,
"MotionImageInserter": {
"InsertionMode": "MOV",
"Input": "s3://XXXXXXX/converted_overlay.mov",
"Offset": {
"ImageX": 0,
"ImageY": 0
},
"Playback": "ONCE"
},
"Inputs": [
{
"FilterEnable": "AUTO",
"PsiControl": "USE_PSI",
"FilterStrength": 0,
"DeblockFilter": "DISABLED",
"DenoiseFilter": "DISABLED",
"TimecodeSource": "EMBEDDED",
"FileInput": "s3://XXXXXXX/sample_video.mp4"
}
]
},
"StatusUpdateInterval": "SECONDS_60"
}
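As a side note, a settings file like the one above can also be submitted from the AWS CLI. This is only a sketch: job.json stands for the JSON above saved to a file, and the endpoint URL is a placeholder that you get from the first command:
aws mediaconvert describe-endpoints
aws mediaconvert create-job --endpoint-url https://abcd1234.mediaconvert.us-east-1.amazonaws.com --cli-input-json file://job.json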

After some trial and error, I found out that the motion graphic overlay file is required to be in the argb pixel format.
The error message here tells you that your mov file has a different pixel format. If you have ffmpeg installed on your machine, you can run the following command to convert your file to the right format:
ffmpeg -i your_input_file.mov -sn -dn -an -vcodec qtrle -pix_fmt argb -f mov your_output_file.mov
-i specifies the input file
-sn removes any subtitle track
-dn removes any data track
-an removes any audio track
-vcodec qtrle sets "QuickTime Animation" as the codec
-pix_fmt argb sets argb as the pixel data format
-f mov sets mov as the output file container type
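If you want to confirm the pixel format before or after converting, ffprobe (installed alongside ffmpeg) can report it; for example (the file name is just a placeholder):
ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,pix_fmt -of default=noprint_wrappers=1 your_output_file.mov
For a file produced by the conversion command above you should see codec_name=qtrle and pix_fmt=argb.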
Good luck!

Related

Cannot add widget to AWS Cloudwatch Dashboard

I am trying to configure an existing AWS dashboard by adding one new widget.
In Amazon Kinesis / Analytics application / Streaming application, I click "View in metrics" on the graph I would like to add to my dashboard.
In the next screen I click Actions / Add to dashboard.
After selecting my dashboard I click Add, and then I can see my dashboard with the chart.
However, if I click on "Save" I get the following error:
There was an error while trying to save your dashboard:
The dashboard body is invalid, there are 6 validation errors: [
{ "dataPath": "/widgets/5/properties/metrics/0", "message": "Should NOT have more than 4 items" },
{ "dataPath": "/widgets/5/properties/metrics/1", "message": "Should NOT have more than 4 items" },
{ "dataPath": "/widgets/5/properties/yAxis/left", "message": "Should be null" },
{ "dataPath": "/widgets/5/properties/yAxis/left", "message": "Should match some schema in anyOf" },
{ "dataPath": "/widgets/5/properties/yAxis/right", "message": "Should be null" },
{ "dataPath": "/widgets/5/properties/yAxis/right", "message": "Should match some schema in anyOf" } ]
I am totally clueless: I did not enter anything manually, all I did was click through the menu items. What is the problem here? I don't even understand the error messages. I already have 4 logs and 1 chart on the screen, so this would be the 6th item, if that matters.
Update: adding the source code of the template (I censored some sensitive information with "......."):
{
"widgets": [
{
"height": 6,
"width": 24,
"y": 12,
"x": 0,
"type": "log",
"properties": {
"query": "SOURCE '/aws/kinesis-analytics/.......' | fields #timestamp, message | filter applicationARN like /arn:aws:kinesisanalytics:eu-west-1:......./| filter messageType = \"ERROR\"| sort #timestamp desc",
"region": "eu-west-1",
"title": "Error log (last 1000 records)",
"view": "table"
}
},
{
"height": 6,
"width": 24,
"y": 6,
"x": 0,
"type": "log",
"properties": {
"query": "SOURCE '/aws/kinesis-analytics/.......' | fields #timestamp, message | filter applicationARN like /arn:aws:kinesisanalytics:eu-west-1:......./| sort #timestamp desc",
"region": "eu-west-1",
"title": "Full log (last 1000 records)",
"view": "table"
}
},
{
"height": 6,
"width": 24,
"y": 18,
"x": 0,
"type": "log",
"properties": {
"query": "SOURCE '/aws/kinesis-analytics/.......' | fields #timestamp, message | filter applicationARN like /arn:aws:kinesisanalytics:eu-west-1:......./| filter message like / OEE Data Streaming app v / | sort #timestamp desc",
"region": "eu-west-1",
"title": "Version - works only right after deployment, othervise look at the name of the jar file :) ",
"view": "table"
}
},
{
"height": 6,
"width": 24,
"y": 0,
"x": 0,
"type": "log",
"properties": {
"query": "SOURCE '/aws/kinesis-analytics/.......' | fields #timestamp, message | filter applicationARN like /arn:aws:kinesisanalytics:eu-west-1:338785721659:.......") | sort #timestamp desc",
"region": "eu-west-1",
"stacked": false,
"title": "OEE app inside logs",
"view": "table"
}
},
{
"height": 6,
"width": 6,
"y": 24,
"x": 0,
"type": "metric",
"properties": {
"region": "eu-west-1",
"yAxis": {
"left": {
"min": 0
}
},
"metrics": [
[ "AWS/Kinesis", "GetRecords.Records", "StreamName", ".......", { "id": "m3", "visible": true } ]
],
"stat": "Sum",
"title": "GetRecords - .......",
"start": "-PT3H",
"end": "P0D",
"view": "timeSeries",
"stacked": false
}
}
]
}
And if I try to add the uptime widget, its code is this:
{
"type": "metric",
"x": 6,
"y": 24,
"width": 6,
"height": 6,
"properties": {
"region": "eu-west-1",
"yAxis": {
"left": {
"min": 0,
"stat": "Maximum",
"showUnits": false
},
"right": {
"min": 0,
"stat": "Maximum",
"showUnits": false
}
},
"metrics": [
[ "AWS/KinesisAnalytics", "uptime", "Application", "...", { "yAxis": "left", "label": "uptime", "stat": "Maximum", "showUnits": false } ],
[ ".", "fullRestarts", ".", ".", { "yAxis": "right", "label": "fullRestarts", "stat": "Maximum", "showUnits": false } ]
],
"stat": "Maximum",
"title": "Uptime (Milliseconds) - Maximum",
"start": "-PT3H",
"end": "P0D",
"view": "timeSeries",
"stacked": false
}
}
But I cannot save it now; I get the error message I described earlier.
It looks like the properties of the axis definition and the metric definition are mixed up.
The axis definition should not have the stat property: https://docs.aws.amazon.com/AmazonCloudWatch/latest/APIReference/CloudWatch-Dashboard-Body-Structure.html#CloudWatch-Dashboard-Properties-YAxis-Properties-Format
The metric definition should not have the showUnits property: https://docs.aws.amazon.com/AmazonCloudWatch/latest/APIReference/CloudWatch-Dashboard-Body-Structure.html#CloudWatch-Dashboard-Properties-Rendering-Object-Format
Try removing the stat property from both the left and right axis definitions. Also remove the showUnits property from the metric definitions (it should only be on the axis definitions).
If this was generated automatically, then it looks like a bug in the console.
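For illustration, the uptime widget from the question with those properties cleaned up might look something like this (just a sketch of the advice above, keeping the question's values):
{
  "type": "metric",
  "x": 6,
  "y": 24,
  "width": 6,
  "height": 6,
  "properties": {
    "region": "eu-west-1",
    "yAxis": {
      "left": { "min": 0, "showUnits": false },
      "right": { "min": 0, "showUnits": false }
    },
    "metrics": [
      [ "AWS/KinesisAnalytics", "uptime", "Application", "...", { "yAxis": "left", "label": "uptime", "stat": "Maximum" } ],
      [ ".", "fullRestarts", ".", ".", { "yAxis": "right", "label": "fullRestarts", "stat": "Maximum" } ]
    ],
    "stat": "Maximum",
    "title": "Uptime (Milliseconds) - Maximum",
    "start": "-PT3H",
    "end": "P0D",
    "view": "timeSeries",
    "stacked": false
  }
}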

Cloudformation sub function for cloudwatch dashboard body

I'm running into issues using the !Sub intrinsic CloudFormation function with the AWS::Region pseudo parameter within the body of my CloudWatch dashboard (to ensure my stack is region agnostic). The CloudFormation I am using is given below:
OrderDashboard:
  Type: AWS::CloudWatch::Dashboard
  Properties:
    DashboardBody: !Sub '{ "widgets": [ { "type": "metric", "x": 6, "y": 0, "width": 6, "height": 6, "properties": { "metrics": [ [ "address", "validateAddressApiLatency" ] ], "view": "timeSeries", "stacked": false, "region": "${AWS::Region}", "title": "ValidateAddressApiSuccessLatencyP99", "period": 300, "stat": "p99" } }, { "type": "metric", "x": 12, "y": 0, "width": 8, "height": 6, "properties": { "metrics": [ [ "address", "validateAddressApiErrorLatency" ] ], "view": "timeSeries", "stacked": false, "region": "${AWS::Region}", "title": "ValidateAddressApiErrorLatencyP99", "period": 300, "stat": "p99" } }, { "type": "text", "x": 0, "y": 0, "width": 6, "height": 6, "properties": { "markdown": "# Heading \nThis dashboard exists to show that our success latency metric and error latency metric are published successfully using a single annotation and aspectj.\n\nThe first row shows the 99th percentile latencies, and the bottom column shows the count of the number of calls" } }, { "type": "metric", "x": 6, "y": 6, "width": 6, "height": 6, "properties": { "metrics": [ [ { "expression": "SELECT COUNT(validateAddressApiLatency) FROM SCHEMA(address)", "label": "NumberOfSuccessfulCalls", "id": "q1", "region": "${AWS::Region}" } ] ], "view": "timeSeries", "stacked": false, "region": "${AWS::Region}", "stat": "Average", "period": 300, "title": "NumberOfSuccessfulValidateCalls" } }, { "type": "metric", "x": 12, "y": 6, "width": 6, "height": 6, "properties": { "metrics": [ [ { "expression": "SELECT COUNT(validateAddressApiErrorLatency) FROM SCHEMA(address)", "label": "NumberOfErroredCalls", "id": "q1", "region": "${AWS::Region}" } ] ], "view": "timeSeries", "stacked": false, "region": "${AWS::Region}", "stat": "Average", "period": 300, "title": "NumberOfErrorValidateCalls" } } ]}'
    DashboardName: order-dashboard
When I deploy the dashboard, the region is not substituted.
The interesting thing is that when I use !Sub with the region pseudo parameter in other places in the template, it works:
Outputs:
  OrderApiUrl:
    Description: "The endpoint you can use to place orders. Make sure to append the order id to the end"
    Value: !Sub "https://${OrderApi}.execute-api.${AWS::Region}.amazonaws.com/v1/orders/"
Any idea on what I can do to get the value substituted? Thanks
I agree with @ErikAsplund, something like:
OrderDashboard:
  Type: AWS::CloudWatch::Dashboard
  Properties:
    DashboardName: order-dashboard
    DashboardBody: !Sub |
      {
        "widgets": [
          {
            "properties": {
              "metrics": [
                "AWS/Lambda",
                "Duration",
                "FunctionName",
                "${MyReference}"
              ]
            }
          }
        ]
      }
The code that you provided works perfectly fine, so the issue you are having must be related to factors other than the CloudFormation code in the question.
Maybe you are actually deploying some other code, not the one in the question.
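One quick way to check what actually got deployed is to pull the dashboard body back and see whether ${AWS::Region} was replaced, e.g.:
aws cloudwatch get-dashboard --dashboard-name order-dashboard --query DashboardBody --output text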

Set segment duration with MediaConvert

With AWS MediaConvert, every time I convert an mp4 (6 second video) to DASH, the segment duration is about 30 seconds. However, I'd much prefer it to be 1 second or less:
<SegmentTemplate timescale="90000" duration="324000" startNumber="1"/>
<Representation id="1" width="1280" height="720" bandwidth="72000000" codecs="avc1.4d4032">
<SegmentTemplate media="5f8283b60a3ac3640191892_$Number%09d$.mp4" initialization="5f8283b60a3ac3640191892init.mp4" duration="324000" startNumber="1"/>
</Representation>
<Representation id="2" width="1920" height="1080" bandwidth="16200000" codecs="avc1.4d4029">
<SegmentTemplate media="5f8283b60a3ac3640191891_$Number%09d$.mp4" initialization="5f8283b60a3ac3640191891init.mp4" duration="324000" startNumber="1"/>
</Representation>
</AdaptationSet>
I've tried messing with the job settings, but nothing I did seemed to work. Sometimes I got it down to 10 seconds, but that still isn't great. Anyway, here are the settings I used:
$jobSetting = [
"OutputGroups"=> [
[
"CustomName"=> "nicenice",
"Name"=> "DASH ISO",
"Outputs"=> [
[
"ContainerSettings"=> [
"Container"=> "MPD"
],
"VideoDescription"=> [
"Width"=> 1920,
"ScalingBehavior"=> "DEFAULT",
"Height"=> 1080,
"TimecodeInsertion"=> "DISABLED",
"AntiAlias"=> "ENABLED",
"Sharpness"=> 50,
"CodecSettings"=> [
"Codec"=> "H_264",
"H264Settings"=> [
"InterlaceMode"=> "PROGRESSIVE",
"NumberReferenceFrames"=> 3,
"Syntax"=> "DEFAULT",
"Softness"=> 0,
"GopClosedCadence"=> 1,
"GopSize"=> 60,
"Slices"=> 1,
"GopBReference"=> "DISABLED",
"SlowPal"=> "DISABLED",
"SpatialAdaptiveQuantization"=> "ENABLED",
"TemporalAdaptiveQuantization"=> "ENABLED",
"FlickerAdaptiveQuantization"=> "DISABLED",
"EntropyEncoding"=> "CABAC",
"Bitrate"=> 16200000,
"FramerateControl"=> "INITIALIZE_FROM_SOURCE",
"RateControlMode"=> "CBR",
"CodecProfile"=> "MAIN",
"Telecine"=> "NONE",
"MinIInterval"=> 0,
"AdaptiveQuantization"=> "HIGH",
"CodecLevel"=> "AUTO",
"FieldEncoding"=> "PAFF",
"SceneChangeDetect"=> "ENABLED",
"QualityTuningLevel"=> "SINGLE_PASS",
"FramerateConversionAlgorithm"=> "DUPLICATE_DROP",
"UnregisteredSeiTimecode"=> "DISABLED",
"GopSizeUnits"=> "FRAMES",
"ParControl"=> "INITIALIZE_FROM_SOURCE",
"NumberBFramesBetweenReferenceFrames"=> 2,
"RepeatPps"=> "DISABLED",
"DynamicSubGop"=> "STATIC"
]
],
"AfdSignaling"=> "NONE",
"DropFrameTimecode"=> "ENABLED",
"RespondToAfd"=> "NONE",
"ColorMetadata"=> "INSERT"
],
"NameModifier"=> "1"
],
[
"ContainerSettings"=> [
"Container"=> "MPD"
],
"VideoDescription"=> [
"Width"=> 1280,
"ScalingBehavior"=> "DEFAULT",
"Height"=> 720,
"TimecodeInsertion"=> "DISABLED",
"AntiAlias"=> "ENABLED",
"Sharpness"=> 50,
"CodecSettings"=> [
"Codec"=> "H_264",
"H264Settings"=> [
"InterlaceMode"=> "PROGRESSIVE",
"NumberReferenceFrames"=> 3,
"Syntax"=> "DEFAULT",
"Softness"=> 0,
"GopClosedCadence"=> 1,
"GopSize"=> 60,
"Slices"=> 1,
"GopBReference"=> "DISABLED",
"SlowPal"=> "DISABLED",
"SpatialAdaptiveQuantization"=> "ENABLED",
"TemporalAdaptiveQuantization"=> "ENABLED",
"FlickerAdaptiveQuantization"=> "DISABLED",
"EntropyEncoding"=> "CABAC",
"Bitrate"=> 7200000,
"FramerateControl"=> "INITIALIZE_FROM_SOURCE",
"RateControlMode"=> "CBR",
"CodecProfile"=> "MAIN",
"Telecine"=> "NONE",
"MinIInterval"=> 0,
"AdaptiveQuantization"=> "HIGH",
"CodecLevel"=> "AUTO",
"FieldEncoding"=> "PAFF",
"SceneChangeDetect"=> "ENABLED",
"QualityTuningLevel"=> "SINGLE_PASS",
"FramerateConversionAlgorithm"=> "DUPLICATE_DROP",
"UnregisteredSeiTimecode"=> "DISABLED",
"GopSizeUnits"=> "FRAMES",
"ParControl"=> "INITIALIZE_FROM_SOURCE",
"NumberBFramesBetweenReferenceFrames"=> 2,
"RepeatPps"=> "DISABLED",
"DynamicSubGop"=> "STATIC"
]
],
"AfdSignaling"=> "NONE",
"DropFrameTimecode"=> "ENABLED",
"RespondToAfd"=> "NONE",
"ColorMetadata"=> "INSERT"
],
"NameModifier"=> "2"
],
[
"ContainerSettings"=> [
"Container"=> "MPD"
],
"AudioDescriptions"=> [
[
"AudioTypeControl"=> "FOLLOW_INPUT",
"AudioSourceName"=> "Audio Selector 1",
"CodecSettings"=> [
"Codec"=> "AAC",
"AacSettings"=> [
"AudioDescriptionBroadcasterMix"=> "NORMAL",
"Bitrate"=> 96000,
"RateControlMode"=> "CBR",
"CodecProfile"=> "LC",
"CodingMode"=> "CODING_MODE_2_0",
"RawFormat"=> "NONE",
"SampleRate"=> 48000,
"Specification"=> "MPEG4"
]
],
"LanguageCodeControl"=> "FOLLOW_INPUT"
]
],
"NameModifier"=> "3"
]
],
"OutputGroupSettings"=> [
"Type"=> "DASH_ISO_GROUP_SETTINGS",
"DashIsoGroupSettings"=> [
"SegmentLength"=> 1,
"Destination"=> "s3://cactustestphp/videouploads/".$link . "/".$link,
"FragmentLength"=> 2,
"SegmentControl"=> "SEGMENTED_FILES",
"MpdProfile"=> "MAIN_PROFILE",
"HbbtvCompliance"=> "NONE"
]
]
]
],
"AdAvailOffset"=> 0,
"Inputs"=> [
[
"AudioSelectors"=> [
"Audio Selector 1"=> [
"Offset"=> 0,
"DefaultSelection"=> "DEFAULT",
"ProgramSelection"=> 1
]
],
"VideoSelector"=> [
"ColorSpace"=> "FOLLOW",
"Rotate"=> "DEGREE_0",
"AlphaBehavior"=> "DISCARD"
],
"FilterEnable"=> "AUTO",
"PsiControl"=> "USE_PSI",
"FilterStrength"=> 0,
"DeblockFilter"=> "DISABLED",
"DenoiseFilter"=> "DISABLED",
"InputScanType"=> "AUTO",
"TimecodeSource"=> "ZEROBASED",
"FileInput"=> "s3://cactustestphp/videouploads/test/". $fileid
]
]
];
JSON:
{
"Queue": "!!",
"UserMetadata": {
"Customer": "Amazon"
},
"Role": "!!",
"Settings": {
"OutputGroups": [
{
"CustomName": "nicenice",
"Name": "DASH ISO",
"Outputs": [
{
"ContainerSettings": {
"Container": "MPD"
},
"VideoDescription": {
"Width": 3840,
"ScalingBehavior": "DEFAULT",
"Height": 2160,
"TimecodeInsertion": "DISABLED",
"AntiAlias": "ENABLED",
"Sharpness": 50,
"CodecSettings": {
"Codec": "H_264",
"H264Settings": {
"InterlaceMode": "PROGRESSIVE",
"NumberReferenceFrames": 3,
"Syntax": "DEFAULT",
"Softness": 0,
"FramerateDenominator": 1,
"GopClosedCadence": 1,
"GopSize": 30,
"Slices": 1,
"GopBReference": "DISABLED",
"SlowPal": "DISABLED",
"SpatialAdaptiveQuantization": "ENABLED",
"TemporalAdaptiveQuantization": "ENABLED",
"FlickerAdaptiveQuantization": "DISABLED",
"EntropyEncoding": "CABAC",
"Bitrate": 66200000,
"FramerateControl": "SPECIFIED",
"RateControlMode": "CBR",
"CodecProfile": "MAIN",
"Telecine": "NONE",
"FramerateNumerator": 30,
"MinIInterval": 0,
"AdaptiveQuantization": "HIGH",
"CodecLevel": "AUTO",
"FieldEncoding": "PAFF",
"SceneChangeDetect": "ENABLED",
"QualityTuningLevel": "SINGLE_PASS",
"FramerateConversionAlgorithm": "DUPLICATE_DROP",
"UnregisteredSeiTimecode": "DISABLED",
"GopSizeUnits": "FRAMES",
"ParControl": "INITIALIZE_FROM_SOURCE",
"NumberBFramesBetweenReferenceFrames": 2,
"RepeatPps": "DISABLED",
"DynamicSubGop": "STATIC"
}
},
"AfdSignaling": "NONE",
"DropFrameTimecode": "ENABLED",
"RespondToAfd": "NONE",
"ColorMetadata": "INSERT"
},
"NameModifier": "1"
},
{
"ContainerSettings": {
"Container": "MPD"
},
"VideoDescription": {
"Width": 1920,
"ScalingBehavior": "DEFAULT",
"Height": 1080,
"TimecodeInsertion": "DISABLED",
"AntiAlias": "ENABLED",
"Sharpness": 50,
"CodecSettings": {
"Codec": "H_264",
"H264Settings": {
"InterlaceMode": "PROGRESSIVE",
"NumberReferenceFrames": 3,
"Syntax": "DEFAULT",
"Softness": 0,
"FramerateDenominator": 1,
"GopClosedCadence": 1,
"GopSize": 30,
"Slices": 1,
"GopBReference": "DISABLED",
"SlowPal": "DISABLED",
"SpatialAdaptiveQuantization": "ENABLED",
"TemporalAdaptiveQuantization": "ENABLED",
"FlickerAdaptiveQuantization": "DISABLED",
"EntropyEncoding": "CABAC",
"Bitrate": 16200000,
"FramerateControl": "SPECIFIED",
"RateControlMode": "CBR",
"CodecProfile": "MAIN",
"Telecine": "NONE",
"FramerateNumerator": 30,
"MinIInterval": 0,
"AdaptiveQuantization": "HIGH",
"CodecLevel": "AUTO",
"FieldEncoding": "PAFF",
"SceneChangeDetect": "ENABLED",
"QualityTuningLevel": "SINGLE_PASS",
"FramerateConversionAlgorithm": "DUPLICATE_DROP",
"UnregisteredSeiTimecode": "DISABLED",
"GopSizeUnits": "FRAMES",
"ParControl": "INITIALIZE_FROM_SOURCE",
"NumberBFramesBetweenReferenceFrames": 2,
"RepeatPps": "DISABLED",
"DynamicSubGop": "STATIC"
}
},
"AfdSignaling": "NONE",
"DropFrameTimecode": "ENABLED",
"RespondToAfd": "NONE",
"ColorMetadata": "INSERT"
},
"NameModifier": "2"
},
{
"ContainerSettings": {
"Container": "MPD"
},
"VideoDescription": {
"Width": 1280,
"ScalingBehavior": "DEFAULT",
"Height": 720,
"TimecodeInsertion": "DISABLED",
"AntiAlias": "ENABLED",
"Sharpness": 50,
"CodecSettings": {
"Codec": "H_264",
"H264Settings": {
"InterlaceMode": "PROGRESSIVE",
"NumberReferenceFrames": 3,
"Syntax": "DEFAULT",
"Softness": 0,
"FramerateDenominator": 1,
"GopClosedCadence": 1,
"GopSize": 30,
"Slices": 1,
"GopBReference": "DISABLED",
"SlowPal": "DISABLED",
"SpatialAdaptiveQuantization": "ENABLED",
"TemporalAdaptiveQuantization": "ENABLED",
"FlickerAdaptiveQuantization": "DISABLED",
"EntropyEncoding": "CABAC",
"Bitrate": 5200000,
"FramerateControl": "SPECIFIED",
"RateControlMode": "CBR",
"CodecProfile": "MAIN",
"Telecine": "NONE",
"FramerateNumerator": 30,
"MinIInterval": 0,
"AdaptiveQuantization": "HIGH",
"CodecLevel": "AUTO",
"FieldEncoding": "PAFF",
"SceneChangeDetect": "ENABLED",
"QualityTuningLevel": "SINGLE_PASS",
"FramerateConversionAlgorithm": "DUPLICATE_DROP",
"UnregisteredSeiTimecode": "DISABLED",
"GopSizeUnits": "FRAMES",
"ParControl": "INITIALIZE_FROM_SOURCE",
"NumberBFramesBetweenReferenceFrames": 2,
"RepeatPps": "DISABLED",
"DynamicSubGop": "STATIC"
}
},
"AfdSignaling": "NONE",
"DropFrameTimecode": "ENABLED",
"RespondToAfd": "NONE",
"ColorMetadata": "INSERT"
},
"NameModifier": "3"
},
{
"ContainerSettings": {
"Container": "MPD"
},
"VideoDescription": {
"Width": 640,
"ScalingBehavior": "DEFAULT",
"Height": 360,
"TimecodeInsertion": "DISABLED",
"AntiAlias": "ENABLED",
"Sharpness": 50,
"CodecSettings": {
"Codec": "H_264",
"H264Settings": {
"InterlaceMode": "PROGRESSIVE",
"NumberReferenceFrames": 3,
"Syntax": "DEFAULT",
"Softness": 0,
"FramerateDenominator": 1,
"GopClosedCadence": 1,
"GopSize": 30,
"Slices": 1,
"GopBReference": "DISABLED",
"SlowPal": "DISABLED",
"SpatialAdaptiveQuantization": "ENABLED",
"TemporalAdaptiveQuantization": "ENABLED",
"FlickerAdaptiveQuantization": "DISABLED",
"EntropyEncoding": "CABAC",
"Bitrate": 1200000,
"FramerateControl": "SPECIFIED",
"RateControlMode": "CBR",
"CodecProfile": "MAIN",
"Telecine": "NONE",
"FramerateNumerator": 30,
"MinIInterval": 0,
"AdaptiveQuantization": "HIGH",
"CodecLevel": "AUTO",
"FieldEncoding": "PAFF",
"SceneChangeDetect": "ENABLED",
"QualityTuningLevel": "SINGLE_PASS",
"FramerateConversionAlgorithm": "DUPLICATE_DROP",
"UnregisteredSeiTimecode": "DISABLED",
"GopSizeUnits": "FRAMES",
"ParControl": "INITIALIZE_FROM_SOURCE",
"NumberBFramesBetweenReferenceFrames": 2,
"RepeatPps": "DISABLED",
"DynamicSubGop": "STATIC"
}
},
"AfdSignaling": "NONE",
"DropFrameTimecode": "ENABLED",
"RespondToAfd": "NONE",
"ColorMetadata": "INSERT"
},
"NameModifier": "4"
},
{
"ContainerSettings": {
"Container": "MPD"
},
"VideoDescription": {
"Width": 256,
"ScalingBehavior": "DEFAULT",
"Height": 144,
"TimecodeInsertion": "DISABLED",
"AntiAlias": "ENABLED",
"Sharpness": 50,
"CodecSettings": {
"Codec": "H_264",
"H264Settings": {
"InterlaceMode": "PROGRESSIVE",
"NumberReferenceFrames": 3,
"Syntax": "DEFAULT",
"Softness": 0,
"FramerateDenominator": 1,
"GopClosedCadence": 1,
"GopSize": 30,
"Slices": 1,
"GopBReference": "DISABLED",
"SlowPal": "DISABLED",
"SpatialAdaptiveQuantization": "ENABLED",
"TemporalAdaptiveQuantization": "ENABLED",
"FlickerAdaptiveQuantization": "DISABLED",
"EntropyEncoding": "CABAC",
"Bitrate": 200000,
"FramerateControl": "SPECIFIED",
"RateControlMode": "CBR",
"CodecProfile": "MAIN",
"Telecine": "NONE",
"FramerateNumerator": 30,
"MinIInterval": 0,
"AdaptiveQuantization": "HIGH",
"CodecLevel": "AUTO",
"FieldEncoding": "PAFF",
"SceneChangeDetect": "ENABLED",
"QualityTuningLevel": "SINGLE_PASS",
"FramerateConversionAlgorithm": "DUPLICATE_DROP",
"UnregisteredSeiTimecode": "DISABLED",
"GopSizeUnits": "FRAMES",
"ParControl": "INITIALIZE_FROM_SOURCE",
"NumberBFramesBetweenReferenceFrames": 2,
"RepeatPps": "DISABLED",
"DynamicSubGop": "STATIC"
}
},
"AfdSignaling": "NONE",
"DropFrameTimecode": "ENABLED",
"RespondToAfd": "NONE",
"ColorMetadata": "INSERT"
},
"NameModifier": "5"
},
{
"ContainerSettings": {
"Container": "MPD"
},
"AudioDescriptions": [
{
"AudioTypeControl": "FOLLOW_INPUT",
"AudioSourceName": "Audio Selector 1",
"CodecSettings": {
"Codec": "AAC",
"AacSettings": {
"AudioDescriptionBroadcasterMix": "NORMAL",
"Bitrate": 96000,
"RateControlMode": "CBR",
"CodecProfile": "LC",
"CodingMode": "CODING_MODE_2_0",
"RawFormat": "NONE",
"SampleRate": 48000,
"Specification": "MPEG4"
}
},
"LanguageCodeControl": "FOLLOW_INPUT"
}
],
"NameModifier": "6"
}
],
"OutputGroupSettings": {
"Type": "DASH_ISO_GROUP_SETTINGS",
"DashIsoGroupSettings": {
"SegmentLength": 1,
"Destination": "!!/videouploads/5fa1ababa7cea975176544/5fa1ababa7cea975176544",
"FragmentLength": 1,
"SegmentControl": "SEGMENTED_FILES",
"MpdProfile": "MAIN_PROFILE",
"HbbtvCompliance": "NONE"
}
}
}
],
"AdAvailOffset": 0,
"Inputs": [
{
"AudioSelectors": {
"Audio Selector 1": {
"Offset": 0,
"DefaultSelection": "DEFAULT",
"ProgramSelection": 1
}
},
"VideoSelector": {
"ColorSpace": "FOLLOW",
"Rotate": "DEGREE_0",
"AlphaBehavior": "DISCARD"
},
"FilterEnable": "AUTO",
"PsiControl": "USE_PSI",
"FilterStrength": 0,
"DeblockFilter": "DISABLED",
"DenoiseFilter": "DISABLED",
"TimecodeSource": "ZEROBASED",
"FileInput": "!!/videouploads/test/5fa1ababa7cea975176544.mp4"
}
]
},
"AccelerationSettings": {
"Mode": "DISABLED"
},
"StatusUpdateInterval": "SECONDS_60",
"Priority": 0
}
So, in other words, what would I have to change to get my segment duration down to 1 second or less?
Thank you for providing the job's JSON settings for further review. Looking at the OutputGroupSettings, I can see that you are specifying a SegmentLength value of 1 second, so the resulting fragmented MP4 files should be about 1 second in length, give or take a few frames.
How are you confirming that the resulting fMP4 files are 30 and 10 seconds long? The most accurate way of determining this is to combine a variant's init MP4 with one of its fragments into a separate file and inspect it with a media inspector such as MediaInfo or ffprobe. I used your job settings on an MP4 file I had available and confirmed that the resulting MP4 segments are 1 second in length:
Concatenate a segment with its init file for the full MP4 asset
$ cat 5fa1ababa7cea9751765441init.mp4 >> 5fa1ababa7cea9751765441_concat.mp4 && cat 5fa1ababa7cea9751765441_000000001.mp4 >> 5fa1ababa7cea9751765441_concat.mp4
Probe the newly concatenated file to review the details
$ ffprobe -hide_banner -i 5fa1ababa7cea9751765441_concat.mp4
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '5fa1ababa7cea9751765441_concat.mp4':
Metadata:
major_brand : isom
minor_version : 1
compatible_brands: isomavc1dash
creation_time : 2020-11-20T20:30:59.000000Z
Duration: 00:00:01.03, start: 0.066667, bitrate: 7658 kb/s
Stream #0:0(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p, 3840x2160 [SAR 1:1 DAR 16:9], 7648 kb/s, 30 fps, 30 tbr, 90k tbn, 60 tbc (default)
Metadata:
creation_time : 2020-11-20T20:30:59.000000Z
handler_name : ETI ISO Video Media Handler
encoder : Elemental H.264
I'm curious how you are confirming the resulting duration of each fMP4 asset; please try the steps above as well if you have not already done so.
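If you only want the duration as a single number rather than the full probe output, something like this should also work on the concatenated file:
$ ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 5fa1ababa7cea9751765441_concat.mp4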

How to set resolution for AWS MediaConvert

I was planning on using AWS MediaConvert to make multiple copies of videos at different resolutions via the AWS SDK, and I noticed that the resolution was not included in the example, so how would I be able to specify it (tell it to do 1920 by 1080, for example)?
"Outputs" => [
[
"VideoDescription" => [
??? "Resolution" => "DEFAULT", ???
"ScalingBehavior" => "DEFAULT",
"TimecodeInsertion" => "DISABLED",
"AntiAlias" => "ENABLED",
"Sharpness" => 50,
"CodecSettings" => [
"Codec" => "H_264",
"H264Settings" => [
...
Not tested, but if you look at the job request JSON, it has "Width" and "Height" parameters. They go in the "VideoDescription" object. You can try it and see if that works:
"VideoDescription": {
"ScalingBehavior": "DEFAULT",
"TimecodeInsertion": "DISABLED",
"AntiAlias": "ENABLED",
"Sharpness": 50,
"CodecSettings": {
"Codec": "H_264",
"H264Settings": {
"InterlaceMode": "PROGRESSIVE",
"NumberReferenceFrames": 3,
"Bitrate": 5000000
...
}
},
"AfdSignaling": "NONE",
"DropFrameTimecode": "ENABLED",
"RespondToAfd": "NONE",
"ColorMetadata": "INSERT",
"Width": 1920,
"Height": 1080
}

Facebook Graph API: access location tags in photos and posts

On photos and posts I often see location tags pointing to a Facebook page representing a place. They are typically preceded by "near", "at" or "in". Does anybody know of a way to access this data via the Graph API or any other way?
Here is an example :
Webpage (I made the photo public so everyone can see it): http://www.facebook.com/photo.php?pid=11671735&id=506482094
API call: https://graph.facebook.com/10150601497547095?access_token=*
{
"id": "10150601497547095",
"from": {
"name": "Edouard Tabet",
"id": "506482094"
},
"name": "Alcohol grows on trees in Isabela, Galapagos",
"picture": "http://photos-a.ak.fbcdn.net/hphotos-ak-snc7/381095_10150601497547095_506482094_11671735_110244218_s.jpg",
"source": "http://a1.sphotos.ak.fbcdn.net/hphotos-ak-snc7/s720x720/381095_10150601497547095_506482094_11671735_110244218_n.jpg",
"height": 720,
"width": 479,
"images": [
{
"height": 2048,
"width": 1365,
"source": "http://a1.sphotos.ak.fbcdn.net/hphotos-ak-snc7/329294_10150601497547095_506482094_11671735_110244218_o.jpg"
},
{
"height": 720,
"width": 479,
"source": "http://a1.sphotos.ak.fbcdn.net/hphotos-ak-snc7/s720x720/381095_10150601497547095_506482094_11671735_110244218_n.jpg"
},
{
"height": 270,
"width": 180,
"source": "http://photos-a.ak.fbcdn.net/hphotos-ak-snc7/381095_10150601497547095_506482094_11671735_110244218_a.jpg"
},
{
"height": 130,
"width": 86,
"source": "http://photos-a.ak.fbcdn.net/hphotos-ak-snc7/381095_10150601497547095_506482094_11671735_110244218_s.jpg"
},
{
"height": 112,
"width": 75,
"source": "http://photos-a.ak.fbcdn.net/hphotos-ak-snc7/381095_10150601497547095_506482094_11671735_110244218_t.jpg"
}
],
"link": "http://www.facebook.com/photo.php?pid=11671735&id=506482094",
"icon": "http://static.ak.fbcdn.net/rsrc.php/v1/yz/r/StEh3RhPvjk.gif",
"created_time": "2012-01-16T23:34:54+0000",
"position": 1,
"updated_time": "2012-01-21T02:16:55+0000",
"comments": {
"data": [
{
"id": "10150601497547095_7207114",
"from": {
"name": "Tom LeNoble",
"id": "218686"
},
"message": "hope you are having fun!",
"can_remove": true,
"created_time": "2012-01-16T23:36:33+0000"
},
{
"id": "10150601497547095_7207963",
"from": {
"name": "Sol McKinney",
"id": "1021642751"
},
"message": "How come Darwin didn't write about that?!",
"can_remove": true,
"created_time": "2012-01-17T01:31:39+0000"
},
{
"id": "10150601497547095_7212820",
"from": {
"name": "Romain BL",
"id": "556337447"
},
"message": "Des bisous mr Tabet! J'esp\u00e8re que tu vas bien depuis tout ce temps!",
"can_remove": true,
"created_time": "2012-01-17T18:19:13+0000"
}
],
"paging": {
"next": "https://graph.facebook.com/10150601497547095/comments?access_token="
}
},
"likes": {
"data": [
{
"id": "1404245",
"name": "Hannah Russin"
},
{
"id": "1278210658",
"name": "Seth Long"
},
{
"id": "218686",
"name": "Tom LeNoble"
}
],
"paging": {
"next": "https://graph.facebook.com/10150601497547095/likes?access_token=&limit=25&offset=25&__after_id=218686"
}
}
}
I figured it out... When accessing a photo by its "photo id", the place attribute is not present in the result; when accessing the same photo through its "post id", the place attribute appears in the JSON result of the Graph API call. Not very convenient...
They should all have place:
https://developers.facebook.com/tools/explorer?method=GET&path=15500414_689594654776
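For example, requesting the place field explicitly on that post object would look roughly like this (the post id is the one from the explorer link above; substitute your own access token):
https://graph.facebook.com/15500414_689594654776?fields=place&access_token=YOUR_ACCESS_TOKEN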