I am using the code below to download the clip. I have successfully received the response, but I am unable to download the video.
const completeUrl = url + "/getClip";
const res = await client.post(completeUrl, params, {
headers: {
"Content-Type": "video/mp4",
"X-Amz-Target": "CodeBuild_20161006.StartBuild",
},
});
console.log("axios signedRquest is ", res);
const binaryData: BlobPart[] = [];
binaryData.push(res.data);
const downloadElement = document.createElement("a");
const href = window.URL.createObjectURL(
new Blob(binaryData, { type: contentType }) // contentType is defined elsewhere, e.g. "video/mp4"
);
downloadElement.href = href;
downloadElement.download = "test.mp4";
document.body.appendChild(downloadElement);
downloadElement.click();
document.body.removeChild(downloadElement);
window.URL.revokeObjectURL(href);
I found the solution: just add responseType: "blob" to the request config.
const res = await client.post(completeUrl, params, {
headers: {
"Content-Type": "video/mp4",
"X-Amz-Target": "CodeBuild_20161006.StartBuild",
},
responseType: "blob", // added this line
});
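Without it, axios decodes the binary response body as text (its default), which corrupts the video bytes before they ever reach the Blob constructor; requesting a blob keeps the payload intact.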
I'm reformulating my question and code examples so they are easier to understand.
When I call this PUT endpoint, cookies are set correctly.
import cookie from "cookie";

export const put: RequestHandler<Locals> = async (event) => {
const userInfo = {
refresh_token: Math.random().toString(),
};
const json = JSON.stringify(userInfo);
const jwt = cookie.serialize("jwt", json, {
httpOnly: true,
path: "/",
});
const headers = {
"Set-Cookie": [jwt],
};
return {
status: 200,
headers,
body: {},
};
};
When I call this GET endpoint, cookies are not set.
import cookie from "cookie";

export const get: RequestHandler<Locals> = async (event) => {
const userInfo = {
refresh_token: Math.random().toString(),
};
const json = JSON.stringify(userInfo);
const jwt = cookie.serialize("jwt", json, {
httpOnly: true,
path: "/",
});
const headers = {
"Set-Cookie": [jwt],
};
return {
status: 200,
headers,
body: {},
};
};
These are the options sent to fetch, where method equals "GET" or "PUT" for each case:
const opts: RequestInit = {
method,
credentials: "include",
headers: {
"Content-Type": "application/json",
},
};
if (data) {
opts.body = JSON.stringify(data);
}
Set the cookie in the handle hook where it can be set in a single place for both endpoints. See https://kit.svelte.dev/docs#hooks-handle.
Pass the JWT to the handle method from your endpoint through the event.locals object.
e.g.,
/** @type {import('@sveltejs/kit').Handle} */
export async function handle({ event, resolve }) {
const response = await resolve(event);
response.headers.append('set-cookie', event.locals.jwt);
return response;
}
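For completeness, the endpoint side could look like the sketch below, reusing the serialization from the question; event.locals.jwt is simply the property name the hook above expects.
export const get: RequestHandler<Locals> = async (event) => {
  const json = JSON.stringify({ refresh_token: Math.random().toString() });
  // Stash the serialized cookie on locals; the handle hook attaches it to the response.
  event.locals.jwt = cookie.serialize("jwt", json, { httpOnly: true, path: "/" });
  return { status: 200, body: {} };
};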
I'm trying to upload files to my S3 bucket via a presigned URL generated by a Lambda function. Everything works fine via Postman, but the browser-based application fails with "SignatureDoesNotMatch".
My Lambda function region is ap-southeast-1.
but a similar function works fine in ap-south-1 (which is in the same time zone as mine). Any idea why this is happening? Could it be anything to do with the time zone difference between the server and client?
Please see my code below:
<script>
$(document).one('submit', '#memberForm', function (e) {
e.preventDefault();
$.get("<FUNCTION URL>", function (data) {
var getUrl = data.uploadURL;
var fileName = data.fileName;
var theFormFile = $('#fileLogo').get()[0].files[0];
if (theFormFile != null) {
console.log(theFormFile);
$.ajax({
type: 'PUT',
url: getUrl,
contentType: 'binary/octet-stream',
processData: false,
crossDomain: true,
data: theFormFile,
success: function () {
alert('Yeehaaaw');
},
error: function (e) {
console.log(e);
alert('File NOT uploaded');
console.log(arguments);
}
});
} else {
$("#memberForm").submit();
}
});
return false;
});
</script>
My code for URL generation is as follows:
'use strict'
const AWS = require('aws-sdk')
AWS.config.update({ region: process.env.AWS_REGION || 'ap-southeast-1' })
const s3 = new AWS.S3()
// Main Lambda entry point
exports.handler = async (event) => {
console.log("execution started")
var contentType=event["queryStringParameters"]['contentType']
var path=event["queryStringParameters"]['path']
const result = await getUploadURL(contentType,path)
console.log('Result: ', result)
return result
}
const getContentType=function(contentType){
switch(contentType) {
case "png":
return "image/png"
case "jpg":
return "image/jpeg"
case "pdf":
return "application/pdf"
default:
return "application/json"
}
}
const getExtension=function(contentType){
switch(contentType) {
case "png":
return "png"
case "jpg":
return "jpg"
case "pdf":
return "pdf"
default:
return `${contentType}`
}
}
const getUploadURL = async function(contentType,path) {
console.log(`Content type is ${contentType}`)
const actionId = Math.floor(Math.random() * 10000000) // random id for the object key
var type=getContentType(contentType);
var ext= getExtension(contentType);
const s3Params = {
Bucket: process.env.UploadBucket,
Key: `${path}/${actionId}.${ext}`,
ContentType: type,// Update to match whichever content type you need to upload
ACL: 'public-read', // Enable this setting to make the object publicly readable - only works if the bucket can support public objects,
Expires: 300
}
console.log('getUploadURL: ', s3Params)
return new Promise((resolve, reject) => {
// Get signed URL
resolve({
"statusCode": 200,
"isBase64Encoded": false,
"headers": {
"Access-Control-Allow-Origin": "*"
},
"body": JSON.stringify({
"uploadURL": s3.getSignedUrl('putObject', s3Params),
"fileName": `${actionId}.${ext}`
})
})
})
}
Again, the same request works when I try it with Postman.
I resolved this by adding the signature version:
const s3=new AWS.S3({
signatureVersion:'v4'
});
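Pinning signatureVersion: 'v4' removes the region-dependent signing default the SDK would otherwise choose, so the URL the browser receives is signed the way S3 in that region expects; nothing here actually involves time zones. One more thing worth checking in the browser code above: a presigned PUT URL is only valid for the exact Content-Type it was signed with, so the upload should send the same ContentType that was passed to getSignedUrl rather than a blanket binary/octet-stream. A sketch of that tweak, assuming a png was signed:
$.ajax({
    type: 'PUT',
    url: getUrl,
    contentType: 'image/png', // must equal the ContentType used when signing
    processData: false,
    data: theFormFile
});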
I am trying to post a file to S3 using createPresignedPost. The file is posted to my bucket, but the file size constraint is not respected. Here is my code; the uploaded file is a base64-encoded string.
function postObjectSignedUrl(req) {
const key = `${req + "/" + uuid.v4()}`;
return new Promise(function (resolve, reject) {
const params = {
Bucket: 'base',
Expires: 60 * 60, // in seconds,
Fields: {
key: key,
},
conditions: [
['content-length-range', 0,1000000]
]
}
s3.createPresignedPost(params, (err, data) => {
if (err) {
reject(err)
} else {
resolve(data);
}
})
})
}
My client side code is the following:
var data = new FormData();
const getUrl = await getSignedUrl();
const keys = getUrl["fields"];
$.each(keys, function(key,value){
data.append(key,value);
});
data.append("file", profilePic);
try {
const result = await fetch(getUrl["url"], {
method: "POST",
mode: "cors",
headers: {
'Access-Control-Allow-Origin': '*',
},
body: data
})
if (result.status === 204){
}
} catch (err) {
console.log(err, " error ")
}
Params attributes in the Node.js SDK are normally upper camel case, so you have to change "conditions" to "Conditions".
By the way, you can tidy up your URL generator code as follows:
function postObjectSignedUrl(req) {
const key = `${req + "/" + uuid.v4()}`;
const params = {
Bucket: 'base',
Expires: 60 * 60, // in seconds,
Fields: {
key: key,
},
Conditions: [
['content-length-range', 0,1000000]
]
}
    // createPresignedPost does not return an AWS.Request, so .promise() is not
    // available here; wrap the callback instead.
    return new Promise((resolve, reject) =>
        s3.createPresignedPost(params, (err, data) => (err ? reject(err) : resolve(data)))
    );
}
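The reason the limit was ignored: with the lowercase key the SDK does not recognize the field, so the signed policy contains no content-length-range condition and S3 accepts any file size. Once it is spelled Conditions, uploads outside 0 to 1000000 bytes are rejected.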
I am working on a Node application that uses AWS. I want to get all Cognito users, but as per the docs listUsers returns only the first 60, and I want all of them. The docs mention passing PaginationToken (string), but I don't know what to pass in it. Can you assist me with this?
Here is what I have done so far:
exports.handler = (event, context, callback) => {
const requestBody = JSON.parse(event.body);
var params = {
"UserPoolId": "****************",
"Limit": 60,
"PaginationToken" : (what to pass here????),
}
const cognitoidentityserviceprovider = new AWS.CognitoIdentityServiceProvider();
cognitoidentityserviceprovider.listUsers(params, (err, data) => {
if (err) {
callback(null, { headers: { "Content-Type": "application/json", "Access-Control-Allow-Origin": "*" }, body: JSON.stringify({ statusCode: 405, data: err }) });
} else {
console.log(data);
let userdata = [];
for(let i=0; i<data.Users.length;i++){
// console.log(data.Users[i].Attributes);
userdata.push(getAttributes(data.Users[i].Attributes));
}
callback(null, { headers: { "Content-Type": "application/json", "Access-Control-Allow-Origin": "*" }, body: JSON.stringify({ statusCode: 200, data: userdata }) });
}
});
};
function getAttributes(attributes){
let jsonObj = {};
attributes.forEach((obj) => {
jsonObj[obj.Name] = obj.Value;
});
return jsonObj;
}
In your response you should see a property called PaginationToken. If you make the same call but include this value in your params you will receive the next 60 users. Here's the concept:
cognitoidentityserviceprovider.listUsers(params, (err, data) => {
// data.Users is the first 60 users
params.PaginationToken = data.PaginationToken;
cognitoidentityserviceprovider.listUsers(params, (err, data) => {
// data.Users is the next 60 users
});
});
You might want to consider switching to promises and async/await if your environment supports it. That would make this code easier to read and write.
const data = await cognitoidentityserviceprovider.listUsers(params).promise();
params.PaginationToken = data.PaginationToken;
const data2 = await cognitoidentityserviceprovider.listUsers(params).promise();
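To collect every user, keep calling listUsers until no token comes back. A minimal sketch (the function name and pool id are placeholders):
async function listAllUsers(userPoolId) {
    const cognitoidentityserviceprovider = new AWS.CognitoIdentityServiceProvider();
    const users = [];
    const params = { UserPoolId: userPoolId, Limit: 60 };
    do {
        const data = await cognitoidentityserviceprovider.listUsers(params).promise();
        users.push(...data.Users);
        // PaginationToken is undefined on the last page, which ends the loop
        params.PaginationToken = data.PaginationToken;
    } while (params.PaginationToken);
    return users;
}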
I have a DynamoDB table with attributes userId, propertyInfo, and propertyId. userId is the primary index. When I use the following Lambda code to update (PUT) the item in the table, I get "The provided key element does not match the schema".
const AWS = require('aws-sdk'); // eslint-disable-line import/no-extraneous-dependencies
const dynamoDb = new AWS.DynamoDB.DocumentClient();
module.exports.update = (event, context, callback) => {
const timestamp = new Date().getTime();
const data = JSON.parse(event.body);
const params = {
TableName: process.env.DYNAMODB_TABLE,
Key: {
propertyId: event.pathParameters.id,
},
ExpressionAttributeNames: {
'#new_propertyInfo': 'propertyInfo',
},
ExpressionAttributeValues: {
':propertyInfo': data.propertyInfo,
},
UpdateExpression: 'SET #new_propertyInfo = :propertyInfo',
ReturnValues: 'ALL_NEW',
};
dynamoDb.update(params, (error, result) => {
// handle potential errors
if (error) {
console.error(error);
callback(null, {
statusCode: error.statusCode || 501,
headers: { 'Content-Type': 'text/plain' },
body: 'Couldn\'t update the item.',
});
return;
}
// create a response
const response = {
statusCode: 200,
body: JSON.stringify(result.Attributes),
};
callback(null, response);
});
};
Body of my update request is:
{
"propertyInfo":
{
"houseNumber": 2000,
"street": "easy st"
}
}
The event.pathParameters.id is obtained from /property/{id}. I need this id to look the item up by propertyId; the userId is needed for authorization purposes. Both search and update need to work by propertyId. Could someone explain what I need to do to set this up correctly?
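For reference, the error means the Key in the update call does not match the table's key schema: if userId is the partition key, the update must pass userId in Key. Querying by propertyId would then need a global secondary index. A minimal sketch, with the index name purely hypothetical:
// Hypothetical GSI 'propertyId-index' on propertyId
const lookup = await dynamoDb.query({
    TableName: process.env.DYNAMODB_TABLE,
    IndexName: 'propertyId-index',
    KeyConditionExpression: 'propertyId = :pid',
    ExpressionAttributeValues: { ':pid': event.pathParameters.id },
}).promise();

// Update using the key attribute the table actually declares
const result = await dynamoDb.update({
    TableName: process.env.DYNAMODB_TABLE,
    Key: { userId: lookup.Items[0].userId },
    UpdateExpression: 'SET #new_propertyInfo = :propertyInfo',
    ExpressionAttributeNames: { '#new_propertyInfo': 'propertyInfo' },
    ExpressionAttributeValues: { ':propertyInfo': data.propertyInfo },
    ReturnValues: 'ALL_NEW',
}).promise();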