I have been searching through the documentation and I think I found a solution, but it isn't working and I keep getting an error. What I am doing for a signed URL is:
const [url] = await googleCloud
  .bucket(bucketName)
  .file(filename)
  .makePublic()
  .getSignedUrl(options);
return { url };
What I added, based on what I found in the docs, was makePublic(), but this throws an error because it is followed by getSignedUrl(). I don't have any problem if I remove makePublic() and upload the file via the signed URL, but then the file isn't public.
Any advice on how I should proceed?
UPDATE
The error I am seeing when trying to get a signed url with makePublic is [Unhandled promise rejection: Error: googleCloud.bucket(...).file(...).makePublic(...).getSignedUrl is not a function].
Additional information regarding the options being sent to the getSignedUrl function:
const options = {
  version: 'v4',
  action: 'write',
  expires: Date.now() + 15 * 60 * 1000, // 15 minutes
  contentType: 'video/quicktime',
};
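For reference, here is a minimal sketch (assuming the @google-cloud/storage client used above) of how the two calls would normally be kept separate, since each of them returns a Promise rather than the File object and therefore cannot be chained:

const file = googleCloud.bucket(bucketName).file(filename);

// getSignedUrl() resolves to an array whose first element is the URL
const [url] = await file.getSignedUrl(options);

// makePublic() also returns a Promise, so it cannot be chained after
// getSignedUrl(); it would be awaited on its own, and only once the object
// actually exists in the bucket (i.e. after the upload has completed):
// await file.makePublic();

return { url };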
UPDATE (how the frontend handles the signed URL)
const submitReview = async () => {
  const url = await getSignedUrl({
    variables: { filename: "google3.mov" }
  }).then(async response => {
    if (response.data && response.data.getSignedUrl) {
      const url = response.data.getSignedUrl.url;
      const pathUrl = url.split('?');
      const videoPath = pathUrl[0];
      const uploadedResponse = await uploadToGoogleCloud(url);
    }
  });
}
const uploadToGoogleCloud = async (url) => {
  const videoFile = await fetch(video.uri);
  const blob = await videoFile.blob();
  const response = await fetch(url, {
    method: 'PUT',
    body: blob
  }).then(res => console.log("thres is ", res)).catch(e => console.log(e));
}
My question is this: how do I call a Firebase Cloud Function from a Cloud Task and pass a payload through?
I tried following the tutorial here. The only difference is that I'm using Cloud Functions for Firebase instead of regular Cloud Functions.
Here is my Cloud Function:
const functions = require("firebase-functions");
exports.myFunction = functions.https.onRequest((req, res) => {
  console.log(req.query);
  res.send('success');
});
When I query the URL in the browser with the parameters ?myparams=data, I can log 'data', so I know the Cloud Function is basically working.
But when I try to call it from my queue (below) I get:
SyntaxError: Unexpected token o in JSON at position 1
at JSON.parse (<anonymous>)
My guess is that req is undefined.
I've been looking at this SO question and I am wondering if it has something to do with needing to use bodyParser for onRequest functions.
HTTP Event Cloud Function: request body value is undefined
I'm also seeing that some people have CORS issues with their cloud functions, which seems like it might be related.
Here is the task queue code that should be sending the payload.
const seconds = 5;
const project = 'xxxxx-xxxxxxx';
const queue = 'xxxxx';
const location = 'us-west2';
const url = 'https://us-central1-xxxxx-xxxxx.cloudfunctions.net/writeDB';
const payload = 'My data';
const parent = client.queuePath(project, location, queue);
const task = {
  httpRequest: {
    httpMethod: "POST",
    url: url,
    body: Buffer.from(JSON.stringify(payload)).toString("base64"),
    headers: {
      "Content-Type": "application/json"
    },
    oidcToken: {
      serviceAccountEmail
    }
  }
};

task.scheduleTime = {
  seconds: seconds + Date.now() / 1000,
};

const request = {parent: parent, task: task};

await client.createTask(request)
  .then(response => {
    const task = response[0].name;
    console.log(`Created task ${task}`);
    return {'Response': String(response)}
  })
  .catch(err => {
    console.error(`Error in createTask: ${err.message || err}`);
    next()
  });
It calls the function, but for some reason it results in the error and the payload isn't logged.
Can anyone help?
As always, I'm happy to clarify the question if anything is unclear. Thanks!
I was able to replicate your error and managed to fix it by changing the Content-Type header from "application/json" to "text/plain". I also removed the JSON.stringify() call on the body value, because your payload variable is a string. Below is my modified version of your code:
const {CloudTasksClient} = require('@google-cloud/tasks');
// Instantiates a client.
const client = new CloudTasksClient();
const seconds = 5;
const serviceAccountEmail = "xxxx-xxxxx-xxxxxx#appspot.gserviceaccount.com";
const project = 'xxxx-xxxxxx';
const queue = "xx-xxxxx";
const location = 'us-central1';
const url = "https://us-central1-xxxxx-xxxxx.cloudfunctions.net/myFunction";
const payload = 'My Data';
const parent = client.queuePath(project, location, queue);
async function quickstart() {
  const task = {
    httpRequest: {
      httpMethod: "POST",
      url: url,
      body: Buffer.from(payload).toString("base64"), // your previous code: body: Buffer.from(JSON.stringify(payload)).toString("base64"),
      headers: {
        "Content-Type": "text/plain"
      },
      oidcToken: {
        serviceAccountEmail
      }
    }
  };

  task.scheduleTime = {
    seconds: seconds + Date.now() / 1000,
  };

  const request = {parent: parent, task: task};

  await client.createTask(request)
    .then(response => {
      const task = response[0].name;
      console.log(`Created task ${task}`);
      return {'Response': String(response)}
    })
    .catch(err => {
      console.error(`Error in createTask: ${err.message || err}`);
    });
}
quickstart();
In the Cloud Function, I changed req.query to req.body to get the result from Cloud Tasks:
const functions = require("firebase-functions");
exports.myFunction = functions.https.onRequest((req, res) => {
  console.log(req.body);
  console.log('success');
  res.send('success');
});
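For completeness, if the goal were to send structured JSON rather than plain text, a hedged variant of the task above (not tested here) would keep the application/json header but make the payload an object; Cloud Functions then parses it into req.body automatically:

const payload = { message: 'My Data' }; // an object this time, not a plain string

const task = {
  httpRequest: {
    httpMethod: "POST",
    url: url,
    // JSON.stringify is appropriate here because the payload is an object
    body: Buffer.from(JSON.stringify(payload)).toString("base64"),
    headers: {
      "Content-Type": "application/json"
    },
    oidcToken: {
      serviceAccountEmail
    }
  }
};

// In the Cloud Function, req.body is then already a parsed object:
// console.log(req.body.message); // 'My Data'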
I am trying to encrypt data using the AWS Encryption SDK for JavaScript (React) in the browser, but I am getting this error: export 'encrypt' (imported as 'encrypt') was not found in '@aws-crypto/client-browser'
import { KmsKeyringBrowser, encrypt } from "@aws-crypto/client-browser";

const generatorKeyId = '*********************'
const keyIds = ['**************************']
const keyring = new KmsKeyringBrowser({ generatorKeyId, keyIds })

const context = {
  stage: 'demo',
  purpose: 'simple demonstration app',
  origin: 'us-west-2'
}

const encryptData = async (data) => {
  const result = await encrypt(keyring, data, { encryptionContext: context })
  console.log(result);
}
I have also tried using buildEncrypt, but it is not working. Below is the code:
import { KmsKeyringBrowser, encrypt } from "@aws-crypto/client-browser";

const generatorKeyId = '*********************'
const keyIds = ['**************************']
const keyring = new KmsKeyringBrowser({ generatorKeyId, keyIds })

const context = {
  stage: 'demo',
  purpose: 'simple demonstration app',
  origin: 'us-west-2'
}

const encryptData = async (data) => {
  const result = await buildEncrypt({encrypt: (keyring, data, { encryptionContext: context })} )
  console.log(result);
}
I am unable to figure out what I am doing wrong. Any help would be appreciated. Thank you.
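For what it's worth, in recent versions of @aws-crypto/client-browser the encrypt function does not appear to be a top-level export; it is obtained from buildClient (or buildEncrypt). Below is a minimal sketch along those lines, keeping the placeholder key IDs from the question; depending on the setup, the keyring may also need a clientProvider carrying browser credentials for KMS:

import {
  KmsKeyringBrowser,
  buildClient,
  CommitmentPolicy,
} from "@aws-crypto/client-browser";

// buildClient returns the encrypt/decrypt functions
const { encrypt } = buildClient(CommitmentPolicy.REQUIRE_ENCRYPT_REQUIRE_DECRYPT);

const generatorKeyId = '*********************'; // placeholder KMS key id
const keyIds = ['**************************']; // placeholder additional key ids
const keyring = new KmsKeyringBrowser({ generatorKeyId, keyIds });

const context = {
  stage: 'demo',
  purpose: 'simple demonstration app',
  origin: 'us-west-2'
};

const encryptData = async (data) => {
  // encrypt resolves to an object whose `result` field holds the ciphertext bytes
  const { result } = await encrypt(keyring, data, { encryptionContext: context });
  console.log(result);
  return result;
};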
Hi, I have a working serverless function on the Serverless Framework that uses an S3 signed URL to put a file in an S3 bucket, and I am trying to migrate it to a Vercel serverless function using Next.js.
The function works via the Serverless Framework deployment and Postman, but on Vercel, although it generates the signed URL fine, when I try to use it with an "x-amz-tagging"="test" header I get a 403 error. Here is the relevant bit of my code:
// serverless function
const allowCors = fn => async (req, res) => {
  res.setHeader('Access-Control-Allow-Credentials', true)
  res.setHeader('Access-Control-Allow-Origin', '*')
  res.setHeader('Access-Control-Allow-Methods', 'GET,OPTIONS,PATCH,DELETE,POST,PUT')
  res.setHeader(
    'Access-Control-Allow-Headers',
    'X-CSRF-Token, X-Requested-With, x-amz-tagging, Accept, Accept-Version, Content-Length, Content-MD5, Content-Type, Date, X-Api-Version'
  )
  return await fn(req, res)
}

function requestUploadURL(req, res) {
  ...
}

module.exports = allowCors(requestUploadURL)
// code in app
try {
  const config = {
    onUploadProgress(progressEvent) {
      const percent = Math.round(
        (progressEvent.loaded * 100) / progressEvent.total)
      adduploadprogress({
        file: fileSelectedArray[i].file,
        fileType: fileSelectedArray[i].fileType,
        myFsize: fileSelectedArray[i].myFsize,
        percent,
      })
    },
    headers: {
      'Content-Type': fileSelectedArray[i].fileType,
      'x-amz-tagging': 'test', // THIS CAUSES 403 ERROR
    },
  }
  const resp1 = await axios.put(
    uploadURL,
    fileSelectedArray[i].file,
    config
  )
Any advice gratefully received.
For some reason I also need to use Tagging: '' as part of the S3 params in the function being wrapped by the allowCors function. I didn't need to do this before:
const { body } = req
const s3Params = {
  Bucket: UNIQUEBUCKET,
  Key: body.name,
  ContentType: body.type,
  ACL: 'public-read',
  Tagging: '',
}
const uploadURL = s3.getSignedUrl('putObject', s3Params)
I am following the solution proposed by Trivikr for adding support for the s3.getSignedUrl API, which is not currently available in the newer v3 SDK. I am trying to make a signed URL for getting an object from a bucket.
Just for convenience, the code is added below:
const { S3, GetObjectCommand } = require("@aws-sdk/client-s3"); // 1.0.0-gamma.2 version
const { S3RequestPresigner } = require("@aws-sdk/s3-request-presigner"); // 0.1.0-preview.2 version
const { createRequest } = require("@aws-sdk/util-create-request"); // 0.1.0-preview.2 version
const { formatUrl } = require("@aws-sdk/util-format-url"); // 0.1.0-preview.1 version
const fetch = require("node-fetch");

(async () => {
  try {
    const region = "us-east-1";
    const Bucket = `SOME_BUCKET_NAME`;
    const Key = `SOME_KEY_VALUE`;
    const credentials = {
      accessKeyId: ACCESS_KEY_HERE,
      secretAccessKey: SECRET_KEY_HERE,
      sessionToken: SESSION_TOKEN_HERE
    };
    const S3Client = new S3({ region, credentials, signatureVersion: 'v4' });
    console.log('1'); // for quick debugging
    const signer = new S3RequestPresigner({ ...S3Client.config });
    console.log('2');
    const request = await createRequest(
      S3Client,
      new GetObjectCommand({ Key, Bucket })
    );
    console.log('3');
    let signedUrl = formatUrl(await signer.presign(request));
    console.log(signedUrl);
    let response = await fetch(signedUrl);
    console.log("Response", response);
  } catch (e) {
    console.error(e);
  }
})();
I successfully create the S3Client and signer, but on creating the request I get the following error:
clientStack.concat(...).filter is not a function
Is there anything I am doing wrong?
Please also note that I am using webpack for bundling.
Just to add my example in TypeScript:
import { S3Client, GetObjectCommand, S3ClientConfig } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

const s3Configuration: S3ClientConfig = {
  credentials: {
    accessKeyId: '<ACCESS_KEY_ID>',
    secretAccessKey: '<SECRET_ACCESS_KEY>'
  },
  region: '<REGION>',
};
const s3 = new S3Client(s3Configuration);
const command = new GetObjectCommand({ Bucket: '<BUCKET>', Key: '<KEY>' });
const url = await getSignedUrl(s3, command, { expiresIn: 15 * 60 }); // expires in seconds
console.log('Presigned URL: ', url);
RESOLVED
I ended up successfully making the signed URLs by installing the beta versions of the packages rather than the preview (default) ones.
I have CloudFront in front of an S3 bucket that serves HLS videos. I'm trying to dynamically modify the manifest files to add an auth token to the segments inside them.
What I would really like to do is modify the body I send back to the client in a viewer response function, but since that isn't possible, I'm attempting to use an origin request function to manually fetch the object from S3, modify it, and return a CloudFront request with the new body. I get a 503 error of "The Lambda function result failed validation: The body is not a string, is not an object, or exceeds the maximum size".
My body is under 8 KB (1 MB is the limit in the docs). As far as I can tell, the CloudFront request object I'm generating looks good, and the base64 data decodes to what I want. I've also tried using text instead of base64. I have "include body" enabled in CloudFront.
const fs = require('fs');
const querystring = require('querystring');
const AWS = require('aws-sdk');
const S3 = new AWS.S3();

exports.handler = async (event) => {
  const cfrequest = event.Records[0].cf.request;
  const queryString = querystring.parse(event.Records[0].cf.request.querystring);
  const jwtToken = queryString.token;

  if (cfrequest.uri.match(/\.m3u8?$/mi)) {
    const s3Response = await (new Promise((resolve, reject) => {
      S3.getObject({
        Bucket: 'bucket',
        Key: cfrequest.uri.substring(1)
      }, (err, data) => {
        if (err) {
          reject(err)
        } else {
          resolve(data);
        }
      });
    }));

    const manifestFile = s3Response.Body.toString('utf8');
    const newManifest = manifestFile.replace(/^((\S+)\.(m3u8|ts|vtt))$/gmi, (_, url) => `${url}?token=${jwtToken}`);
    const base64NewManifest = Buffer.from(newManifest, 'utf8').toString('base64');

    const tokenizedCfRequest = {
      ...cfrequest,
      body: {
        action: 'replace',
        data: base64NewManifest,
        encoding: 'base64'
      }
    };
    return tokenizedCfRequest;
  }

  return cfrequest;
}
If you want to generate your own response you need to use a viewer request or origin request event and return a response like this:
const querystring = require('querystring');

exports.handler = async (event) => {
  const cfRequest = event.Records[0].cf.request;
  const queryString = querystring.parse(event.Records[0].cf.request.querystring);
  const jwtToken = queryString.token;

  if (cfRequest.uri.match(/\.m3u8?$/mi)) {
    // ... your code here ...
    const response = {
      status: 200, // only mandatory field
      body: base64NewManifest,
      bodyEncoding: 'base64',
    };
    return response;
  }

  // Return original request if no uri match
  return cfRequest;
}
See also Generating HTTP Responses in Request Triggers.