I have 20 JSON files. Every file contains 12 metrics:
{"mean":0.08614979321142166,
"min":-3.1884,
"max":2.1901,
"peak2peak":5.3785000000000007,
"std":1.0903240544512534,
"variance":1.1888065437150195,
"kurtosis":2.165975182739587,
"skewness":-0.041289626007299074,
"rmsOriginalSignal":1.0937041158972693,
"rmsFiltSignal_01":1.0922792414296567,
"rmsFiltSignal_02":0.00036638538212209666,
"rmsFiltSignal_03":0.032341062844535272}
I have to create a graph like this from the mean values:
[example chart: mean]
How can I load these external JSON files, including from a local folder?
How can I take the mean from every single file and create a graph of the means?
As I understand it, your JSON files have the structure below.
var jsonObjectArray = [{"mean":0.08614979321142166,
"min":-3.1884,
"max":2.1901,
"peak2peak":5.3785000000000007,
"std":1.0903240544512534,
"variance":1.1888065437150195,
"kurtosis":2.165975182739587,
"skewness":-0.041289626007299074,
"rmsOriginalSignal":1.0937041158972693,
"rmsFiltSignal_01":1.0922792414296567,
"rmsFiltSignal_02":0.00036638538212209666,
"rmsFiltSignal_03":0.032341062844535272},
{"mean":0.08614979321142166,
"min":-3.1884,
"max":2.1901,
"peak2peak":5.3785000000000007,
"std":1.0903240544512534,
"variance":1.1888065437150195,
"kurtosis":2.165975182739587,
"skewness":-0.041289626007299074,
"rmsOriginalSignal":1.0937041158972693,
"rmsFiltSignal_01":1.0922792414296567,
"rmsFiltSignal_02":0.00036638538212209666,
"rmsFiltSignal_03":0.032341062844535272}, ....]
You can use the lodash.js map function to pick just the mean property from the JSON object array.
_.map(jsonObjectArray, "mean");
Then you will get something like [0.08614979321142166, 0.18614979321142186].
For example, if you are building a web page with all of these data, you can either include the 20 files in your HTML like .js files, or fetch your JSON files as shown below:
<script>
var jsonArray = [];
// Load mean1.json through mean20.json; note the requests are asynchronous.
for (var i = 1; i <= 20; i++) {
  $.getJSON('mean' + i + '.json', function(data) {
    jsonArray.push(data);
  });
}
</script>
I named your 20 JSON files mean1.json, mean2.json, and so on.
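One caveat: $.getJSON is asynchronous, so jsonArray may still be empty when the loop finishes. Below is a minimal sketch of waiting for all 20 requests and then charting the means; it assumes jQuery, lodash, and Chart.js are loaded on the page, and the canvas id meanChart is a hypothetical placeholder.
<script>
var requests = [];
for (var i = 1; i <= 20; i++) {
  requests.push($.getJSON('mean' + i + '.json'));
}
// $.when resolves once every request has finished; each argument passed
// to .done() is a [data, statusText, jqXHR] triple from one request.
$.when.apply($, requests).done(function () {
  var jsonObjectArray = Array.prototype.slice.call(arguments)
    .map(function (args) { return args[0]; });
  var means = _.map(jsonObjectArray, 'mean');
  // Plot the 20 means as a line chart on the (hypothetical) canvas.
  new Chart(document.getElementById('meanChart'), {
    type: 'line',
    data: {
      labels: means.map(function (m, i) { return 'file ' + (i + 1); }),
      datasets: [{ label: 'mean', data: means }]
    }
  });
});
</script>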
I created a Google Form with a linked Google Spreadsheet. I would like that every time someone submits the form, the spreadsheet is copied to an S3 bucket in AWS. To do so, I just got started with Google Apps Script. I managed to get the trigger part working on form submit, but I am struggling to understand the readme of this GitHub project to upload to S3.
function setUpTrigger() {
ScriptApp.newTrigger('copyDataS3')
.forForm('1SK-2Ow63vs_TaoF54UjSgn35FL7F8_ANHDTOOiTabMM')
.onFormSubmit()
.create();
}
function copyDataS3() {
// https://github.com/viuinsight/google-apps-script-for-aws
// I do not understand where I should place aws.js and util.js.
// Should I do File -> New -> Script file and copy-paste the contents? Should the file be .js or .gs?
S3.init("MY_ACCESS_KEY", "MY_SECRET_KEY");
// if I want to copy a spreadsheet with the following id, what should go into "object" below?
var ssID = "SPREADSHEET_ID";
S3.putObject(bucketName, objectName, object, region)
}
I believe your goal is as follows.
You want to send a Google Spreadsheet to an S3 bucket as CSV data using Google Apps Script.
Modification points:
Looking at the google-apps-script-for-aws library you are using, I noticed that the data is sent as a string. In that case, your CSV data can be sent directly; but when you want to send binary data, an error occurs. So in this answer, I would like to propose modified scripts for 2 patterns.
I thought the situation might be similar to this thread, but you are using a different library from that thread, so I post this answer.
Pattern 1:
In this pattern, it is supposed that only text data is sent, like the CSV data you mention in your reply. In this case, the library does not need to be modified.
Modified script:
S3.init("MY_ACCESS_KEY", "MY_SECRET_KEY"); // Please set this.
var spreadsheetId = "###"; // Please set the Spreadsheet ID.
var sheetName = "Sheet1"; // Please set the sheet name.
var region = "###"; // Please set this.
var csv = SpreadsheetApp
.openById(spreadsheetId)
.getSheetByName(sheetName)
.getDataRange()
.getValues() // or .getDisplayValues()
.map(r => r.join(","))
.join("\n");
var blob = Utilities.newBlob(csv, MimeType.CSV, sheetName + ".csv");
S3.putObject("bucketName", "test.csv", blob, region);
Pattern 2:
In this pattern, it is supposed that both text data and binary data are sent. In this case, the library side must also be modified.
For google-apps-script-for-aws
Please modify line 110 in s3.js as follows.
From:
var content = object.getDataAsString();
To:
var content = object.getBytes();
And please modify line 146 in s3.js as follows.
From:
Utilities.DigestAlgorithm.MD5, content, Utilities.Charset.UTF_8));
To:
Utilities.DigestAlgorithm.MD5, content));
For Google Apps Script:
In this case, please give the blob to S3.putObject as follows.
Script:
S3.init("MY_ACCESS_KEY", "MY_SECRET_KEY"); // Please set this.
var fileId = "###"; // Please set the file ID.
var region = "###"; // Please set this.
var blob = DriveApp.getFileById(fileId).getBlob();
S3.putObject("bucketName", blob.getName(), blob, region);
References:
viuinsight/google-apps-script-for-aws
Class UrlFetchApp
computeDigest(algorithm, value)
PutObject
I am using a REST API with a POST request. I have created a CSV file to load in various inputs and using the Collection Runner to submit my requests and run the associated JavaScript Tests iteratively. I am trying to figure out how I can also have an entry in each row of the CSV to reference for my JavaScript Test in order to make the JavaScript dynamic. I've searched the Postman documentation and forums, as well as Google and Stack Overflow, but I haven't found anything that works. Here is a basic example of what I'm trying to accomplish.
Let's say I have a basic adding API. Here is my Request:
{
  "Numbers": {
    "Value_1": {{val1}},
    "Value_2": {{val2}}
  }
}
The CSV file is as follows:
val1,val2,sum
1,1,2
2,2,4
3,3,6
For this example, let's assume that the API returns a response that includes the sum of val1 and val2, something like this:
{
  "Numbers": {{sum}}
}
I am able to load val1 and val2 into my request and iterate through the request for each row, but I am having trouble incorporating the sum values (from the same CSV) into the JavaScript Test.
I am trying to do something like the test below where I can reference the sum value from my spreadsheet, but Postman doesn't like my syntax.
pm.test("Adding machine", function () {
var jsonData = pm.response.json();
pm.expect(jsonData.Numbers === {{sum}});
});
Does anyone have any suggestions? Is this even possible to do?
You could use pm.iterationData.get('var_name') and create a check like this:
pm.test("Sums are correctly calculated", () => {
pm.expect(pm.response.json().Numbers).to.equal(pm.iterationData.get('sum'))
})
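One thing to watch, as an assumption about the data rather than a Postman guarantee: values read from a CSV may arrive as strings, so coercing both sides to numbers keeps the assertion robust:
pm.test("Sums are correctly calculated", () => {
    // Coerce both sides in case the CSV column is read in as a string.
    const expected = Number(pm.iterationData.get('sum'));
    pm.expect(Number(pm.response.json().Numbers)).to.equal(expected);
});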
Using a GET in Postman with the URL posted below, I am able to store the entire response header in question, with all of its data, in a var. The issue for me is how to verify the pieces of data inside that var.
Here is my URL:
http://localhost/v1/accounts?pageNumber=1&pageSize=2
Using Postman, I am able to get the X-Pagination header into a var:
var XPaginationData = postman.getResponseHeader(pm.globals.get("PaginationHeader"));
pm.globals.set("XPaginationData", XPaginationData);
Is there a way to get the individual values inside the X-Pagination response header stored in different vars to assert later?
Using this in Postman:
pm.globals.set("XPaginationData", JSON.stringify(pm.response.headers));
console.log(JSON.parse(pm.globals.get('XPaginationData')));
console.log(JSON.parse(pm.globals.get('XPaginationData'))[4].value);
I get the header contents logged to the console. How would I go about getting "TotalCount", for example?
BIG EDIT:
Thanks to a coworker, the solution is this:
//Filtering Response Headers to get PaginationHeader
var filteredHeaders = pm.response.headers.all()
.filter(headerObj => {
return headerObj.key == pm.globals.get("PaginationHeader");
});
// JSON parse the string of the requested response header
// from var filteredHeaders
var paginationObj = filteredHeaders[0].value;
paginationObj = JSON.parse(paginationObj);
//Stores global variable for nextpageURL
var nextPageURL = paginationObj.NextPageLink;
postman.setGlobalVariable("nextPageURL", nextPageURL);
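From the same parsed object you can pull any other field the header carries; for example, assuming the X-Pagination JSON includes a TotalCount property:
// TotalCount comes off the same parsed object (assuming the header has it)
var totalCount = paginationObj.TotalCount;
pm.globals.set("totalCount", totalCount);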
You could use JSON.stringify() when saving the environment variable and then use JSON.parse() to access the property or properties that you need.
If you set a global variable for the response headers like this:
pm.globals.set('PaginationHeader', JSON.stringify(pm.response.headers))
Then you can get any of the data from the variable like this:
console.log(JSON.parse(pm.globals.get('PaginationHeader'))[1].value)
The ordering of the headers returned in the console is inconsistent, so you will need to find the correct index to extract data from the X-Pagination header.
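A less fragile alternative is to look the header up by name rather than by index; pm.response.headers.get returns the value directly:
// Fetch the header by name instead of relying on its position.
var pagination = JSON.parse(pm.response.headers.get('X-Pagination'));
console.log(pagination.TotalCount);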
Looks like an issue with Postman itself.
The only solution that worked for me was to stringify & parse the JSON again, like this:
var response = JSON.parse(JSON.stringify(res))
After doing this, the headers and all other properties are accessible as expected.
I'm using fabric.js to dynamically create textures in Three.js, and I need to save the textures to AWS. I'm using meteor-slingshot, which normally takes images passed in through a file-selector input. Here's the uploader:
var uploader = new Slingshot.Upload("myFileUploads");
uploader.send(document.getElementById('input').files[0], function (error, downloadUrl) {
if (error) {
console.error('Error uploading', uploader.xhr.response);
alert (error);
}
else {
Meteor.users.update(Meteor.userId(), {$push: {"profile.files":downloadUrl}});
}
});
Uploading works fine from the drive ... but I'm generating my files in the browser, not getting them from the drive. Instead, they are generated from a canvas element with the following method:
generateTex: function(){
var canvTex = document.getElementById('texture-generator');
var canvImg = canvTex.toDataURL('image/jpeg');
var imageNew = document.createElement( 'img' );
imageNew.src = canvImg;
}
This works great as well. If I console.log imageNew, I get my lovely image with base64 encoding:
<img src="data:image/jpeg;base64,/9j/
4AAQSkZJRgABAQAAAQABAAD/2wBDAAMCAgICAgMCAgID
//....carries on to 15k or so characters
If I console.log a file object added from the drive via filepicker ( not generated from a canvas ), I can see what the file object should look like:
file{
lastModified: 1384216556000
lastModifiedDate: Mon Nov 11 2013 16:35:56 GMT-0800 (PST)
name: "filename.png"
size: 3034
type: "image/png"
webkitRelativePath: ""
__proto__: File
}
But I can't create a file from the blob for upload, because there is no place in the file object to add the actual data.
To sum up, I can:
Generate an image blob and display it in a dom element
Upload files from the drive using meteor-slingshot
Inspect the existing file object
But I don't know how to convert the blob into a named file, so I can pass it to the uploader.
I don't want to download the image (there are answers for that); I want to upload it. There is a "Chrome only" way to do this with the FileSystem API, but I need something cross-browser (and eventually cross-platform). If someone could help me with this, I would have uncontainable joy.
Slingshot supports blobs just as well as files: https://github.com/CulturalMe/meteor-slingshot/issues/22
So when you have a canvas object called canvTex and a Slingshot.Upload instance called uploader, then uploading the canvas image is as easy as:
canvTex.toBlob(function (blob) {
uploader.send(blob, function (error, downloadUrl) {
//...
});
});
Because blobs have no names, you must take that into account when defining your directive. Do not attempt to generate a key based on the name of the file.
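For instance, here is a minimal server-side directive sketch that never touches file.name; the bucket name and key scheme are placeholders:
Slingshot.createDirective("myFileUploads", Slingshot.S3Storage, {
  bucket: "my-bucket", // placeholder
  acl: "public-read",
  authorize: function () {
    return this.userId != null; // only allow logged-in users to upload
  },
  key: function () {
    // Blobs carry no name, so build one from the user id and a timestamp.
    return this.userId + "/texture-" + Date.now() + ".jpg";
  }
});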
I am trying to use TideSDK's Ti.Network to set the name and value of my cookie.
But how do I get this cookie's value from my other pages?
var httpcli;
httpcli = Ti.Network.createHTTPCookie();
httpcli.setName(cname); //cname is my cookie name
httpcli.setValue(cvalue); //cvalue is the value that I am going to give my cookie
alert("COOKIE value is: "+httpcli.getValue());
How would I retrieve this cookie value from my next page? Thank you in advance!
OK, there are a lot of ways to store content in TideSDK; cookies could be one of them, but they are not mandatory.
In my personal opinion, cookies are too limited for storing information, so I suggest storing user information in a JSON file; that way you can store anything from single pieces of information to large structures (depending on the project). Supposing you have a project in which the client has to store the app configuration, like a 'preferred path' for storing files, or strings (such as first name and last name), you can use Ti.Filesystem to store and read such information.
In the following example, I read a stored JSON string from a file:
File Contents (conf.json):
{
"fname" : "erick",
"lname" : "rodriguez",
"customFolder" : "c:\\myApp\\userConfig\\"
}
Note: For some reason, TideSDK cannot parse a JSON structure like the one above, because it interprets conf.json as a text file; the parsing will work if you remove all the tabs and spaces:
{"fname":"erick","lname":"rodriguez","customFolder":"c:\\myApp\\userConfig\\"}
Now let's read it (myappfolder is the path of your storage folder):
var readfi = Ti.Filesystem.getFile(myappfolder, "conf.json");
var Stream = Ti.Filesystem.getFileStream(readfi);
Stream.open(Ti.Filesystem.MODE_READ);
var contents = Stream.read();
Stream.close();
// toString() must be called; passing the bare function reference breaks JSON.parse
contents = JSON.parse(contents.toString());
console.log(contents);
Now let's store it:
function writeTextFile(pathToFile, content) {
  // Open the file for writing and persist the given string.
  var fi = Ti.Filesystem.getFile(pathToFile);
  var stream = Ti.Filesystem.getFileStream(fi);
  stream.open(Ti.Filesystem.MODE_WRITE);
  stream.write(content);
  stream.close();
  return true;
};
// if a JSON object is defined in JS, there is no problem
var jsonObject = {
"fname":"joe",
"lname":"doe",
"customFolder" : "c:\\joe\\folder\\"
}
var file = pathToMyAppStorage + "\\" + "conf.json";
var saved = writeTextFile(file,JSON.stringify(jsonObject));
console.log(saved);