Reading file from assets folder with expo go - expo

Trying to read a file from the assets folder with React Native in Expo Go.
import * as FileSystem from "expo-file-system";
...
const getFiles = async () => {
  const [{ localUri }] = await Asset.loadAsync(
    require("./assets/gpx/test.txt")
  );
  console.log("localUri", localUri);
  console.log(
    "readAsStringAsync",
    FileSystem.readAsStringAsync(localUri)
  );
...
This code returns the correct localUri: file:///data/user/0/host.exp.exponent/cache/ExperienceData/%2540anonymous%252FExpoGeolocation-c1ebf94c-e515-4d59-bcb8-5d1f1ce948bf/ExponentAsset-5d41402abc4b2a76b9719d911017c592.txt
but readAsStringAsync() always returns {"_A": null, "_x": 0, "_y": 0, "_z": null}.
How can I read a txt (or csv, or other) file from the assets directory with React Native in Expo Go?

You're so close, you're just missing an await.
console.log(
  "readAsStringAsync",
  await FileSystem.readAsStringAsync(localUri)
);
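Putting it together, the full getFiles looks roughly like this (a minimal sketch; the Asset import from expo-asset is assumed here, since it sits in the question's elided code):

import { Asset } from "expo-asset"; // assumed import, elided in the question's snippet
import * as FileSystem from "expo-file-system";

const getFiles = async () => {
  // Resolve the bundled asset to a local file URI first.
  const [{ localUri }] = await Asset.loadAsync(require("./assets/gpx/test.txt"));
  // readAsStringAsync returns a Promise<string>, so it must be awaited.
  const contents = await FileSystem.readAsStringAsync(localUri);
  console.log("readAsStringAsync", contents);
};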

Related

Automatically call API from Postman and generate a JSON response in a file

I am using Postman to call an API. I have a couple to call, so I made a collection and I am able to run it.
I am trying to get the results of the API saved to a file (JSON is fine) and then, every time I call the API, to get the file updated. Is it possible to automatically save a response for each API call?
Could you please explain how I can do it? I have tried with newman, but I'm not being successful. Thank you.
I managed to get the results into JSON files, although I have to update them manually. I followed this video on YouTube, and this is the .js code (in case it helps anyone in the future):
const newman = require('newman'); // require newman in your project
const fs = require('fs');

// call newman.run to pass `options` object and wait for callback
newman.run({
  collection: require('./name_of_postman_collection.json'),
  reporters: 'cli'
}).on('beforeRequest', (error, data) => {
  if (error) {
    console.log(error);
    return;
  }
  console.log(data);
})
.on('request', (error, data) => {
  if (error) {
    console.log(error);
    return;
  }
  const fileName = `response ${data.item.name}.json`;
  const content = data.response.stream.toString();
  fs.writeFile(fileName, content, function (error) {
    if (error) {
      console.error(error);
    }
  });
});
I am still trying to find out how to automate the process, to get refreshed files every 2-3 hours for example.
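One way to automate this (a sketch, assuming the node-cron package; the schedule and file names are placeholders) would be to wrap the newman.run call in a scheduled job:

const cron = require('node-cron'); // assumption: node-cron is installed
const newman = require('newman');
const fs = require('fs');

// Run the collection every 3 hours and overwrite the response files.
cron.schedule('0 */3 * * *', () => {
  newman
    .run({ collection: require('./name_of_postman_collection.json'), reporters: 'cli' })
    .on('request', (error, data) => {
      if (error) return console.error(error);
      fs.writeFileSync(`response ${data.item.name}.json`, data.response.stream.toString());
    });
});

The same effect can also be had outside Node with a system cron entry or a scheduled CI job that runs the script.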

Uppy Companion doesn't work for > 5GB files with Multipart S3 uploads

Our app allows our clients to upload large files. Files are stored on AWS S3, and we use Uppy for the upload, dockerized to run under a Kubernetes deployment where we can scale up the number of instances.
It works well, but we noticed that all > 5GB uploads fail. I know Uppy has a plugin for AWS multipart uploads, but even when it is installed during the container image creation, the result is the same.
Here's our Dockerfile. Has anyone ever succeeded in uploading > 5GB files to S3 via Uppy? Is there anything we're missing?
FROM node:alpine AS companion
RUN yarn global add @uppy/companion@3.0.1
RUN yarn global add @uppy/aws-s3-multipart
ARG UPPY_COMPANION_DOMAIN=[...redacted..]
ARG UPPY_AWS_BUCKET=[...redacted..]
ENV COMPANION_SECRET=[...redacted..]
ENV COMPANION_PREAUTH_SECRET=[...redacted..]
ENV COMPANION_DOMAIN=${UPPY_COMPANION_DOMAIN}
ENV COMPANION_PROTOCOL="https"
ENV COMPANION_DATADIR="COMPANION_DATA"
# ENV COMPANION_HIDE_WELCOME="true"
# ENV COMPANION_HIDE_METRICS="true"
ENV COMPANION_CLIENT_ORIGINS=[...redacted..]
ENV COMPANION_AWS_KEY=[...redacted..]
ENV COMPANION_AWS_SECRET=[...redacted..]
ENV COMPANION_AWS_BUCKET=${UPPY_AWS_BUCKET}
ENV COMPANION_AWS_REGION="us-east-2"
ENV COMPANION_AWS_USE_ACCELERATE_ENDPOINT="true"
ENV COMPANION_AWS_EXPIRES="3600"
ENV COMPANION_AWS_ACL="public-read"
# We don't need to store data for just S3 uploads, but Uppy throws unless this dir exists.
RUN mkdir COMPANION_DATA
CMD ["companion"]
EXPOSE 3020
EDIT:
I made sure I had:
uppy.use(AwsS3Multipart, {
  limit: 5,
  companionUrl: '<our uppy url>',
})
And it still doesn't work. I see all the chunks of the 9GB file being sent in the network tab, but as soon as it hits 100%, Uppy throws a "cannot post" error (to our S3 URL) and that's it: failure.
Has anyone encountered this? The upload goes fine until 100%, then the last chunk gets an HTTP 413 error, making the entire upload fail.
Thanks!
Here are some code samples from my repository that will help you understand the flow of using the busboy package to stream data to an S3 bucket. I'm also adding reference links for the packages I'm using:
https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-s3/index.html
https://www.npmjs.com/package/busboy
// Imports assumed from the author's project setup (not shown in the original snippet):
import { Request, Response } from 'express';
import Busboy from 'busboy';
import { PutObjectCommand } from '@aws-sdk/client-s3';

export const uploadStreamFile = async (req: Request, res: Response) => {
  const busboy = new Busboy({ headers: req.headers });
  const streamResponse = await busboyStream(busboy, req);
  const uploadResponse = await s3FileUpload(streamResponse.data.buffer);
  return res.send(uploadResponse);
};

const busboyStream = async (busboy: any, req: Request): Promise<any> => {
  return new Promise((resolve, reject) => {
    try {
      const fileData: any[] = [];
      let fileBuffer: Buffer;
      busboy.on('file', async (fieldName: any, file: any, fileName: any, encoding: any, mimetype: any) => {
        // ! File is missing in the request
        if (!fileName)
          reject("File not found!");
        let totalBytes: number = 0;
        file.on('data', (chunk: any) => {
          fileData.push(chunk);
          // ! the following code is only for logging purposes
          // TODO will remove once the project is live
          totalBytes += chunk.length;
          console.log('File [' + fieldName + '] got ' + chunk.length + ' bytes');
        });
        file.on('error', (err: any) => {
          reject(err);
        });
        file.on('end', () => {
          fileBuffer = Buffer.concat(fileData);
        });
      });
      // ? Finally, file parsing went well
      busboy.on('finish', () => {
        const responseData: ResponseDto = {
          status: true, message: "File parsing done", data: {
            buffer: fileBuffer,
            metaData // metaData is assembled elsewhere in the author's module
          }
        };
        resolve(responseData)
        console.log('Done parsing data! -> File uploaded');
      });
      req.pipe(busboy);
    } catch (error) {
      reject(error);
    }
  });
}

const s3FileUpload = async (fileData: any): Promise<ResponseDto> => {
  try {
    const params: any = {
      Bucket: <BUCKET_NAME>,
      Key: <path>,
      Body: fileData,
      ContentType: <content_type>,
      ServerSideEncryption: "AES256",
    };
    const command = new PutObjectCommand(params);
    // this.S3 is an S3Client instance created elsewhere in the author's class
    const uploadResponse: any = await this.S3.send(command);
    return { status: true, message: "File uploaded successfully", data: uploadResponse };
  } catch (error) {
    const responseData = { status: false, message: "Monitor connection failed, please contact tech support!", error: error.message };
    return responseData;
  }
}
In AWS S3, a single PUT operation can upload an object of at most 5 GB.
To upload > 5 GB files to S3 you need to use the S3 multipart upload API, and on the Uppy side the AwsS3Multipart plugin.
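For context, this is roughly what a multipart upload looks like with the plain AWS SDK for JavaScript v3, outside of Uppy (a sketch only; the bucket, key, and file path are placeholders, and @aws-sdk/lib-storage handles the part splitting):

const { S3Client } = require('@aws-sdk/client-s3');
const { Upload } = require('@aws-sdk/lib-storage');
const fs = require('fs');

async function uploadLargeFile() {
  const upload = new Upload({
    client: new S3Client({ region: 'us-east-2' }),
    params: {
      Bucket: 'my-bucket',                         // placeholder
      Key: 'big-file.bin',                         // placeholder
      Body: fs.createReadStream('./big-file.bin')  // placeholder
    },
    queueSize: 5,               // parallel parts, comparable to Uppy's limit option
    partSize: 50 * 1024 * 1024  // 50 MB parts
  });
  // done() performs a multipart upload under the hood for large bodies.
  await upload.done();
}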
Check your upload code to confirm that you are using AwsS3Multipart correctly and setting the limit properly; in this case, a limit between 5 and 15 is recommended.
import AwsS3Multipart from '@uppy/aws-s3-multipart'

uppy.use(AwsS3Multipart, {
  limit: 5,
  companionUrl: 'https://uppy-companion.myapp.net/',
})
Also, check this issue on GitHub: Uploading a large >5GB file to S3 errors out #1945
If you're getting Error: request entity too large in your Companion server logs, I fixed this in my Companion Express server by increasing the body-parser limit:
app.use(bodyparser.json({ limit: '21GB', type: 'application/json' }))
This is a good working example of Uppy S3 MultiPart uploads (without this limit increased): https://github.com/jhanitesh10/uppy
I'm able to upload files up to a (self-imposed) limit of 20GB using this code.

Compression (gzip) does not work in NestJS

I created a standard NestJS project using the command nest new project-name. After that, I installed compression (npm i --save compression) and plugged it into the project.
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';
import * as compression from 'compression';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  app.use(compression());
  await app.listen(3000);
}
bootstrap();
But the compression does not work; the response does not have gzip content encoding.
(Screenshot: response in the browser)
I've already tried clearing the cache and restarting the browser; nothing works.
Thanks for the help
First, install the following package:
npm i --save compression
In the bootstrap function in the main.ts file, apply the compression middleware as follows:
async function bootstrap() {
  const app: INestApplication = await NestFactory.create(AppModule);
  ...
  app.use(compression({
    filter: () => { return true },
    threshold: 0
  }));
  ...
  await app.listen(4100);
}
Then, you will see the result
Note
If your browser is open, first close it and then open it again.
Ctrl+Shift+R or Shift+F5 reloads your current page, ignoring cached content.
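If you want to rule out browser caching entirely, you can also check the response header from a small Node script (a sketch; adjust the port and path to your app):

const http = require('http');

// Node's http client does not decompress automatically, so the header stays visible.
http.get(
  { host: 'localhost', port: 3000, path: '/', headers: { 'Accept-Encoding': 'gzip' } },
  (res) => {
    console.log('content-encoding:', res.headers['content-encoding']); // expect "gzip"
    res.resume();
  }
);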
If nothing else works:
import * as express from 'express';
import * as compression from 'compression';
import { ExpressAdapter } from '@nestjs/platform-express';

// Inside bootstrap():
const expressApp = express();
expressApp.use(compression());
const expressAdapter = new ExpressAdapter(expressApp);
const app = await NestFactory.create(AppModule, expressAdapter);
await app.listen(3000);

Cannot find module '@google-cloud/storage'

I am using the GCP console in my browser. I have created a function as follows:
function listFiles(bucketName) {
  // [START storage_list_files]
  // Imports the Google Cloud client library
  const Storage = require('@google-cloud/storage');

  // Creates a client
  const storage = new Storage();

  storage
    .bucket(bucketName)
    .getFiles()
    .then(results => {
      const files = results[0];
      console.log('Files:');
      files.forEach(file => {
        console.log(file.name);
      });
    })
    .catch(err => {
      console.error('ERROR:', err);
    });
  // [END storage_list_files]
}

exports.helloWorld = function helloWorld (req, res) {
  if (req.body.message === undefined) {
    // This is an error case, as "message" is required
    res.status(400).send('No message defined!');
  }
  else {
    // Everything is ok
    console.log(req.body.lat);
    console.log(req.body.lon);
    listFiles("drive-test-demo");
    res.status(200).end();
  }
}
Literally all I am trying to do right now is list the files inside a bucket when a certain HTTPS trigger comes through.
my package.json file is as follows:
{
  "name": "sample-http",
  "version": "0.0.1",
  "dependencies": {
    "@google-cloud/storage": "1.5.1"
  }
}
and I am getting the error "Cannot find module '@google-cloud/storage'".
Most questions I have seen thus far have been resolved by running npm install, but I don't know how to do that, considering that my index.js and package.json files are stored in a zip file inside a gcloud bucket. Any advice on how to solve this would be much appreciated.
Open a console, change directory to your functions project, and type:
npm install --save #google-cloud/storage
That's all!
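If you are deploying from the GCP console, as in the question, the dependency declared in package.json is installed by Cloud Functions itself during deployment. If you later move to a newer @google-cloud/storage (v2 or above), note that the client becomes a named export (a sketch; the bucket name is a placeholder):

// package.json: "@google-cloud/storage": "^2.0.0" (assumed newer version)
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function listFiles(bucketName) {
  // getFiles resolves to an array whose first element is the list of files.
  const [files] = await storage.bucket(bucketName).getFiles();
  files.forEach(file => console.log(file.name));
}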

Save image picked from ReactNative/Expo ImagePicker to Baqend

I'm having a hard time saving an image that is being picked from Expo (React Native).
https://docs.expo.io/versions/latest/sdk/imagepicker.html
It seems that React Native does not have support for uploading the selected image as a blob, but it does have a base64 option.
The code:
_pickImage = async () => {
  let pickerResult = await ImagePicker.launchImageLibraryAsync({
    allowsEditing: true,
    base64: true,
    aspect: [4, 4],
  });
  this._handleImagePicked(pickerResult);
};

_handleImagePicked(pickerResult) {
  const uri = pickerResult.base64;
  const img = new db.File({ name: 'test.jpg', data: uri, type: 'base64', mimeType: 'image/jpg' });
  db.UserData.load(this.state.UserDataID).then(UserData => {
    img.upload({ force: true }).then((file) => {
      UserData.photo = "https://remarkable-apple-95.app.baqend.com/v1" + file.id;
      alert(file.id);
      return UserData.update();
    },
    (error) => { alert(error); });
  });
}
When I console.log(pickerResult.base64), I get a very long string that looks like base64, but when this runs, img.upload throws an error: "PersistentError: An unexpected persistent error occurred."
You're right: React Native has no support for binary data. Unfortunately, Baqend does not support base64 file uploads yet.
As a workaround, you have two options:
Use the React Native Fetch Blob library, which works around React Native's lack of binary file support by uploading and downloading files directly via native code and handing back a reference to them. Your code could look similar to this:
ImagePicker.showImagePicker(options, async (response) => {
  const upload = new db.message.UploadFile('files', 'uploadFetchBlob.jpg')
  const body = 'RNFetchBlob-' + response.uri;
  RNFetchBlob.fetch('PUT', 'https://{YOUR-APP-NAME}.app.baqend.com/v1' + upload.request.path, upload.request.headers, body).then((res) => {
    db.File({ parent: 'files', name: 'uploadFetchBlob.jpg' }).url
  })
});
Unfortunately, this won't work with the Expo client right now; you'd have to eject your project and use native code.
The second option is not to use the Baqend file endpoint directly, but to upload your base64 string to a Baqend module instead. There you can parse the base64 string and upload it to your files from within your backend module. You can find an example of this in our guide: https://www.baqend.com/guide/topics/baqend-code/#handling-binary-data
Hope this helps