Suppose I have the following JSON response:
[
{
id: 1,
name: "John",
password: "JohnsPassword54",
},
{
id: 2,
name: "David",
password: "DavidsPassword24",
}
]
How can I extract the element whose name is David so I can do further validation?
E.g. I want to say: if name == 'David', then save the id.
Well done :) Mastering JsonPath is key to getting the most out of Karate!
Just for the sake of a demo, here is another option, using the get keyword to take the first element of the returned array, since JsonPath wildcard searches always return an array:
* def response =
"""
[
{
id: 1,
name: "John",
password: "JohnsPassword54"
},
{
id: 2,
name: "David",
password: "DavidsPassword24"
}
]
"""
* def userId = get[0] response $[?(@.name == 'David')].id
* match userId == 2
I found the solution using JsonPath expression evaluation:
def user = $..[?(@.name == 'David')]
Then I can use the following:
def userId = user[0].id
I have a MongoDB collection named "users" with a few thousand users. Due to a lack of validation, users were able to create a "username" containing spaces, i.e. a user could create a username such as "I am the best" or " I am the best" or "I am the best " and so on. Since the "username" field was not used anywhere in the system, this was fine until now.
From now on the client finally wants to use the "username" field, that is, to build URLs such as "https://example.com/profile/{username}".
The problem is that the "username" field values have spaces at the beginning, middle, or end, as shown above, at random. So I want to remove them using a query.
I am able to list all users using:
db.users.find({username:{ "$regex" : ".*[^\S].*" , "$options" : "i"}}).pretty();
What is the best approach to remove all spaces in the username field and save the values back? I am not sure how to update them in a single query.
Help is appreciated!
PS: I actually need to write a code block that replaces these usernames while checking for existing usernames so that there are no duplicates, but I would still like to know how to do it with a MongoDB query.
MongoDB 4.4 or Above:
You can use update with an aggregation pipeline starting from MongoDB 4.2, and $replaceAll starting from MongoDB 4.4.
It finds every space and replaces it with an empty string:
db.users.update(
{ username: { $regex: " " } },
[{
$set: {
username: {
$replaceAll: {
input: "$username",
find: " ",
replacement: ""
}
}
}
}],
{ multi: true }
)
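If you prefer to avoid the legacy update() helper with { multi: true }, the same pipeline also works with updateMany; a minimal sketch, again assuming MongoDB 4.4 for $replaceAll:
// Same $replaceAll pipeline via updateMany (applies to all matching documents, no multi flag needed)
db.users.updateMany(
    { username: { $regex: " " } },
    [{
        $set: {
            username: { $replaceAll: { input: "$username", find: " ", replacement: "" } }
        }
    }]
)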
MongoDB 4.2 or Above:
You can use update with an aggregation pipeline starting from MongoDB 4.2:
$trim to remove whitespace from the left and right,
$split to split the username by spaces into an array,
$reduce to iterate over the split result,
$concat to concatenate the pieces back together:
db.users.update(
{ username: { $regex: " " } },
[{
$set: {
username: {
$reduce: {
input: { $split: [{ $trim: { input: "$username" } }, " "] },
initialValue: "",
in: { $concat: ["$$value", "$$this"] }
}
}
}
}],
{ multi: true }
)
MongoDB 3.6 or Above:
Find all affected users and loop through them with forEach,
replace to apply a pattern that removes whitespace (adjust the pattern to your requirements),
updateOne to save the updated username:
db.users.find({ username: { $regex: " " } }, { username: 1 }).forEach(function(user) {
let username = user.username.replace(/\s/g, "");
db.users.updateOne({ _id: user._id }, { $set: { username: username } });
})
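As for the PS in the question (replacing usernames while checking against existing ones so there are no duplicates): none of the single-query updates above can perform that check for you, so a small script is needed. Here is a minimal sketch in the mongo shell; the skip-on-collision behaviour is my assumption, since the question does not say what should happen when the trimmed name already exists:
// Sketch: strip whitespace one user at a time, skipping names that would collide.
db.users.find({ username: /\s/ }).forEach(function(user) {
    var cleaned = user.username.replace(/\s/g, "");  // remove all whitespace
    var clash = db.users.findOne({ username: cleaned, _id: { $ne: user._id } });
    if (clash === null) {
        db.users.updateOne({ _id: user._id }, { $set: { username: cleaned } });
    } else {
        print("skipped " + user._id + ": '" + cleaned + "' already exists");
    }
});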
I am new to Flutter.
How can I retrieve data from Firestore and create a new Profile from every document in my collection? My second question is: how can I use this list in another Dart file?
Thank you.
final List<Profile> demoProfiles = [
Profile (
photos: [
"https://",
],
name: "abc",
),
Profile (
photos: [
"https://",
],
name: "abc",
)
];
Assuming you have a Firestore structure like this:
Profile
photos: []
name: ""
age: ""
distance: ""
education: ""
You can fetch data and build it into your object with this code snippet:
Future<List<Profile>> fetchData() async {
  // Uses the older cloud_firestore API (Firestore.instance / getDocuments()).
  final db = Firestore.instance;
  final List<Profile> demoProfiles = [];
  final snapshot = await db.collection("Profile").getDocuments();
  for (final document in snapshot.documents) {
    demoProfiles.add(Profile(
      photos: List<String>.from(document.data['photos']),
      name: document.data['name'],
      age: document.data['age'],
      distance: document.data['distance'],
      education: document.data['education'],
    ));
  }
  return demoProfiles;
}
Edit:
1) Remove the mocked-up list of profiles from your profiles class; it should not be there.
2) Change your MainController to the following:
class _MainControllerState extends State<MainController> {
  List<Profile> demoProfiles = [];
  MatchEngine matchEngine; // built once the profiles have loaded

  @override
  void initState() {
    super.initState();
    fetchData().then((profiles) {
      setState(() {
        demoProfiles = profiles;
        matchEngine = MatchEngine(
          matches: demoProfiles.map((Profile profile) => Match(profile: profile)).toList(),
        );
      });
    });
  }

  Future<List<Profile>> fetchData() async {
    // Uses the older cloud_firestore API (Firestore.instance / getDocuments()).
    final db = Firestore.instance;
    final List<Profile> list = [];
    final snapshot = await db.collection("Profile").getDocuments();
    for (final document in snapshot.documents) {
      list.add(Profile(
        photos: List<String>.from(document.data['photos']),
        name: document.data['name'],
        age: document.data['age'],
        distance: document.data['distance'],
        education: document.data['education'],
      ));
    }
    return list;
  }

  // build() omitted here – keep your existing build method.
}
I'm playing with the new Data API for Amazon Aurora Serverless.
Is it possible to get the table column names in the response?
If, for example, I run the following query on a user table with the columns id, first_name, last_name, email, and phone:
const sqlStatement = `
SELECT *
FROM user
WHERE id = :id
`;
const params = {
secretArn: <mySecretArn>,
resourceArn: <myResourceArn>,
database: <myDatabase>,
sql: sqlStatement,
parameters: [
{
name: "id",
value: {
"stringValue": 1
}
}
]
};
let res = await this.RDS.executeStatement(params)
console.log(res);
I'm getting a response like the one below, so I have to guess which column corresponds to each value:
{
"numberOfRecordsUpdated": 0,
"records": [
[
{
"longValue": 1
},
{
"stringValue": "Nicolas"
},
{
"stringValue": "Perez"
},
{
"stringValue": "example#example.com"
},
{
"isNull": true
}
]
]
}
I would like to have a response like this one:
{
id: 1,
first_name: "Nicolas",
last_name: "Perez",
email: "example#example.com",
phone: null
}
Update 1
I have found an npm module that wraps the Aurora Serverless Data API and simplifies development.
We decided to take the current approach because we were trying to cut down on the response size and including column information with each record was redundant.
You can explicitly choose to include column metadata in the result. See the parameter: "includeResultMetadata".
https://docs.aws.amazon.com/rdsdataservice/latest/APIReference/API_ExecuteStatement.html#API_ExecuteStatement_RequestSyntax
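For example, the request from the question only needs that one extra flag; a sketch reusing the question's placeholders (note that a numeric id is better sent as longValue than stringValue):
const params = {
    secretArn: <mySecretArn>,        // same placeholders as in the question
    resourceArn: <myResourceArn>,
    database: <myDatabase>,
    sql: 'SELECT * FROM user WHERE id = :id',
    parameters: [{ name: 'id', value: { longValue: 1 } }],
    includeResultMetadata: true      // adds a columnMetadata array to the response
};
let res = await this.RDS.executeStatement(params); // add .promise() if this.RDS is the plain aws-sdk v2 client
// res.columnMetadata[i].name lines up with res.records[rowIndex][i]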
Agree with the consensus here that there should be an out-of-the-box way to do this from the Data Service API. Because there is not, here's a JavaScript function that will parse the response.
const parseDataServiceResponse = res => {
let columns = res.columnMetadata.map(c => c.name);
let data = res.records.map(r => {
let obj = {};
r.forEach((v, i) => {
obj[columns[i]] = Object.values(v)[0]
});
return obj
})
return data
}
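For instance, assuming res is the raw result of executeStatement (called with includeResultMetadata: true so that columnMetadata is present), usage looks like this:
const rows = parseDataServiceResponse(res);
console.log(rows);
// e.g. [ { id: 1, first_name: 'Nicolas', last_name: 'Perez', email: 'example#example.com', phone: true } ]
// note: cells returned as { "isNull": true } come through as the boolean true here;
// see the isNull handling further down if you need null or '' instead.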
I understand the pain, but it looks like this is reasonable, given that a SELECT statement can join multiple tables and duplicate column names may exist.
Similar to the answer above from @C. Slack, but I used a combination of map and reduce to parse the response from Aurora Postgres.
// declarative column names in array
const columns = ['a.id', 'u.id', 'u.username', 'g.id', 'g.name'];
// execute sql statement
const params = {
database: AWS_PROVIDER_STAGE,
resourceArn: AWS_DATABASE_CLUSTER,
secretArn: AWS_SECRET_STORE_ARN,
// includeResultMetadata: true,
sql: `
SELECT ${columns.join()} FROM accounts a
FULL OUTER JOIN users u ON u.id = a.user_id
FULL OUTER JOIN groups g ON g.id = a.group_id
WHERE u.username=:username;
`,
parameters: [
{
name: 'username',
value: {
stringValue: 'rick.cha',
},
},
],
};
const AWS = require('aws-sdk'); // aws-sdk v2, which provides RDSDataService
const rds = new AWS.RDSDataService();
const response = await rds.executeStatement(params).promise();
// parse response into json array
const data = response.records.map((record) => {
return record.reduce((prev, val, index) => {
return { ...prev, [columns[index]]: Object.values(val)[0] };
}, {});
});
Hope this code snippet helps someone.
And here is the response
[
{
'a.id': '8bfc547c-3c42-4203-aa2a-d0ee35996e60',
'u.id': '01129aaf-736a-4e86-93a9-0ab3e08b3d11',
'u.username': 'rick.cha',
'g.id': 'ff6ebd78-a1cf-452c-91e0-ed5d0aaaa624',
'g.name': 'valentree',
},
{
'a.id': '983f2919-1b52-4544-9f58-c3de61925647',
'u.id': '01129aaf-736a-4e86-93a9-0ab3e08b3d11',
'u.username': 'rick.cha',
'g.id': '2f1858b4-1468-447f-ba94-330de76de5d1',
'g.name': 'ensightful',
},
]
Similar to the other answers, but if you are using Python/Boto3:
def parse_data_service_response(res):
columns = [column['name'] for column in res['columnMetadata']]
parsed_records = []
for record in res['records']:
parsed_record = {}
for i, cell in enumerate(record):
key = columns[i]
value = list(cell.values())[0]
parsed_record[key] = value
parsed_records.append(parsed_record)
return parsed_records
I've added to the great answer already provided by C. Slack to deal with AWS returning { "isNull": true } in the JSON for empty nullable character fields.
Here's my function, which handles this by returning an empty string value - that is what I would expect anyway.
const parseRDSdata = (input) => {
let columns = input.columnMetadata.map(c => { return { name: c.name, typeName: c.typeName}; });
let parsedData = input.records.map(row => {
let response = {};
row.map((v, i) => {
//test the typeName in the column metadata, and also the keyName in the values - we need to cater for a return value of { "isNull": true } - pflangan
if ((columns[i].typeName == 'VARCHAR' || columns[i].typeName == 'CHAR') && Object.keys(v)[0] == 'isNull' && Object.values(v)[0] == true)
response[columns[i].name] = '';
else
response[columns[i].name] = Object.values(v)[0];
}
);
return response;
}
);
return parsedData;
}
I am trying to iterate through a map and create a new map value. Below is the input:
def map = [[name: 'hello', email: ['on', 'off'] ], [ name: 'bye', email: ['abc', 'xyz']]]
I want the resulting data to be like:
[hello: ['on', 'off'], bye: ['abc', 'xyz']]
The code I have right now:
result = [:]
map.each { key ->
result[key.name] = key.email.each { random ->
"$random"
}
}
return result
The above code returns
[hello: [on, off], bye: [abc, xyz]]
As you can see, the quotes around on, off and abc, xyz have disappeared, which causes problems when I try to do checks on the list value [on, off].
It should not matter. If you look at the result in the Groovy console, they are still Strings.
Below should be sufficient:
map.collectEntries {
[ it.name, it.email ]
}
If you still need the single quotes, to create a GString instead of a String, then the tweak below would be required:
map.collectEntries {
[ it.name, it.email.collect { "'$it'" } ]
}
I personally do not see any reason for doing it the latter way. BTW, map is not a Map, it is a List; you could rename it to avoid unnecessary confusion.
You could convert it to a JSON object and then everything will have quotes.
This does it. There should/may be a Groovier way, though.
def listOfMaps = [[name: 'hello', email: ['on', 'off'] ], [ name: 'bye', email: ['abc', 'xyz']]]
def result = [:]
listOfMaps.each { map ->
def list = map.collect { k, v ->
v
}
result[list[0]] = ["'${list[1][0]}'", "'${list[1][1]}'"]
}
println result
I am so sorry, but after one day of researching and trying all sorts of combinations and npm packages, I am still not sure how to deal with the following task.
Setup:
MongoDB 2.6
Node.JS with Mongoose 4
I have a schema like so:
var trackingSchema = mongoose.Schema({
tracking_number: String,
zip_code: String,
courier: String,
user_id: Number,
created: { type: Date, default: Date.now },
international_shipment: { type: Boolean, default: false },
delivery_info: {
recipient: String,
street: String,
city: String
}
});
Now the user gives me a search string, or rather an array of strings, which will be substrings of what I want to search for:
var search = ['15323', 'julian', 'administ'];
Now I want to find those documents where any of the fields tracking_number, zip_code, or the fields inside delivery_info contain my search elements.
How should I do that? I get that there are indexes, but I probably need a compound index, or maybe a text index? And for the search, can I then use a regex, or the $text / $search syntax?
The problem is that I have several strings to look for (my search) and several fields to look in, and because of one of those two aspects, every approach failed for me at some point.
Your use case is a good fit for text search.
Define a text index on your schema over the searchable fields:
trackingSchema.index({
tracking_number: 'text',
zip_code: 'text',
'delivery_info.recipient': 'text',
'delivery_info.street': 'text',
'delivery_info.city': 'text'
}, {name: 'search'});
Join your search terms into a single string and execute the search using the $text query operator:
var search = ['15232', 'julian'];
Test.find({$text: {$search: search.join(' ')}}, function(err, docs) {...});
Even though this passes all your search values as a single string, this still performs a logical OR search of the values.
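If you also want the best matches first, the standard textScore projection and sort can be layered on top; this is a sketch using regular MongoDB text-search syntax, not something from the answer above:
Test.find(
    { $text: { $search: search.join(' ') } },
    { score: { $meta: 'textScore' } }          // project the relevance score
)
    .sort({ score: { $meta: 'textScore' } })   // highest-scoring documents first
    .exec(function(err, docs) { /* ... */ });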
Why not just try:
var trackingSchema = mongoose.Schema({
tracking_number: String,
zip_code: String,
courier: String,
user_id: Number,
created: { type: Date, default: Date.now },
international_shipment: { type: Boolean, default: false },
delivery_info: {
recipient: String,
street: String,
city: String
}
});
var Tracking = mongoose.model('Tracking', trackingSchema );
var search = [ "word1", "word2", ...]
var results = []
for (var i = 0; i < search.length; i++) {
    Tracking.find({ $or: [
        { tracking_number: search[i] },
        { zip_code: search[i] },
        { courier: search[i] },
        { 'delivery_info.recipient': search[i] },
        { 'delivery_info.street': search[i] },
        { 'delivery_info.city': search[i] }
    ]}).exec(function(err, trackings) {
        // push every unique result to the results variable
        trackings.forEach(function(tracking) {
            if (results.indexOf(tracking) < 0) results.push(tracking);
        });
    });
}
Okay, I came up with this.
My schema now has an extra field search with an array of all my searchable fields:
var trackingSchema = mongoose.Schema({
...
search: [String]
});
With a pre-save hook, I populate this field:
var validator = require('validator'); // provides the isNull() check used below
trackingSchema.pre('save', function(next) {
this.search = [ this.tracking_number ];
var searchIfAvailable = [
this.zip_code,
this.delivery_info.recipient,
this.delivery_info.street,
this.delivery_info.city
];
for (var i = 0; i < searchIfAvailable.length; i++) {
if (!validator.isNull(searchIfAvailable[i])) {
this.search.push(searchIfAvailable[i].toLowerCase());
}
}
next();
});
In the hope of improving performance, I also index that field (also the user_id as I limit search results by that):
trackingSchema.index({ search: 1 });
trackingSchema.index({ user_id: 1 });
Now, when searching I first list all substrings I want to look for in an array:
var andArray = [];
var searchTerms = searchRequest.split(" ");
searchTerms.forEach(function(searchTerm) {
andArray.push({
    search: { $regex: searchTerm, $options: 'i' }
});
});
I use this array in my find() and chain it with an $and:
Tracking.
find({ $and: andArray }).
where('user_id').equals(userId).
limit(pageSize).
skip(pageSize * page).
exec(function(err, docs) {
// hooray!
});
This works.
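One caveat worth adding: the raw search terms go straight into $regex, so input containing regex metacharacters (., *, (, and so on) would be misinterpreted or could even throw. A small escaping helper applied before building andArray avoids that; the escape pattern below is a common JavaScript idiom, not part of the original code:
function escapeRegex(term) {
    // Backslash-escape every character that has special meaning in a regular expression.
    return term.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}

searchTerms.forEach(function(searchTerm) {
    andArray.push({
        search: { $regex: escapeRegex(searchTerm), $options: 'i' }
    });
});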