Why does this MapReduce mapping function's emit not work?

I have a Cloudant database with documents that use the following format:
{
  "_id": "0ea1ac7d5ef28860abc7030444515c4c",
  "_rev": "1-362058dda0b8680a818b38e9c68c5389",
  "text": "text-data",
  "time-data": 1452988105,
  "time-text": "3:48 PM - 16 Jan 2016",
  "link": "http://url/to/website"
}
I'm trying to create a view to easily count documents between a start and end time-data. However, this mapping function results in a query returning "No Documents Found":
function (doc) {
  emit(doc.time-data, 1);
}
... while this does:
function (doc) {
  emit(doc._id, 1);
}
Why is this the case?

The issue is with the name of your field: it contains a dash (-).
JavaScript interprets doc.time-data as the expression:
return doc.time - data
i.e. doc.time minus an (undefined) variable named data. You can either rename your property (to something like time_data), or use bracket notation and create your view like this:
function (doc) {
  if (doc['time-data']) {
    emit(doc['time-data'], 1);
  }
}
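Since the goal is counting documents between a start and an end time-data, one way (a sketch of my own, not from the original answer; the design document and view names are made up) is to pair that map with the built-in _count reduce and query the view with startkey/endkey:
// map
function (doc) {
  if (doc['time-data']) {
    emit(doc['time-data'], 1);
  }
}
// reduce: _count (built-in)
A range query such as
GET /mydb/_design/times/_view/by_time_data?startkey=1452980000&endkey=1452990000
then returns, with the reduce applied (the default), a single row whose value is the number of documents in that range.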

Related

Apollo - how to use cache.readFragment() on a filtered field?

I am trying to use cache.readFragment() to retrieve posts from the following cache entry:
{
  ...
  "PublisherType:37": {
    "__typename": "PublisherType",
    "id": "37",
    "posts({\"archived\":false,\"collection\":\"6\",\"sort\":\"-ordering\"})": {
      "__typename": "PostConnection",
      "totalCount": 8,
      "edges": [...]
    }
  }
  ...
}
The following does not work and always returns null. I believe this is because there isn't a posts field in the cache:
const posts = cache.readFragment({
  id: "PublisherType:37",
  fragment: gql`
    fragment PublisherPosts on PublisherType {
      posts
    }
  `
});
Is there a way I can retrieve the posts without knowing the filters?
It works fine if I change the fragment to:
fragment PublisherPosts on PublisherType {
  posts(archived: false, collection: 6, sort: "-ordering")
}
Is there a way I can query this data without having to know which filters are used? Something like posts* or even posts(*) (tried both BTW).

AdonisJS exists Validator

I'm following the official documentation (https://legacy.adonisjs.com/docs/4.0/validator) and indicative, but I couldn't find anything to help me.
I want to validate if the given param exists on database.
So I tried:
app/Validators/myValidator:
const { rule } = use('Validator')

get rules () {
  return {
    userId: 'required|integer|exists:MyDatabase.users,userId', // <-- this is what isn't working
    date: [
      rule('date'),
      rule('dateFormat', 'YYYY-MM-DD')
    ]
  }
}

// Getting the data to be validated
get data () {
  const params = this.ctx.params
  const { userId, date } = params
  return Object.assign({}, { userId }, { date })
}
It gives me the following error:
{
  "error": {
    "message": "select * from `MyDatabase`.`users`.`userId` where `undefined` = '2' limit 1 - ER_PARSE_ERROR: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '.`userId` where `undefined` = '2' limit 1' at line 1",
    "name": "ErrorValidator",
    "status": 40
  }
}
How should I properly indicate that I want to compare MyDatabase.users.userId to the given parameter?
After some hard trial and error I stumbled upon the solution.
You just need to follow what is inside hooks.js and pass the values separated by commas, like so:
get rules () {
  return {
    userId: 'required|integer|exists:MyDatabase,MyTable,columntoCompare',
  }
}
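For context, exists is not a built-in Indicative rule; it is a custom rule registered in start/hooks.js. A minimal sketch of what that registration can look like, assuming the three comma-separated arguments are database, table, and column as in the rule above (based on the custom-rule pattern in the AdonisJS 4 docs; adapt the names to your setup):
// start/hooks.js
const { hooks } = require('@adonisjs/ignitor')

hooks.after.providersBooted(() => {
  const Validator = use('Validator')
  const Database = use('Database')

  const existsFn = async (data, field, message, args, get) => {
    const value = get(data, field)
    if (!value) {
      // skip when the value is missing; the "required" rule handles that
      return
    }

    // args come from the rule string: exists:MyDatabase,MyTable,columntoCompare
    const [database, table, column] = args
    const row = await Database
      .table(`${database}.${table}`)
      .where(column, value)
      .first()

    if (!row) {
      throw message
    }
  }

  Validator.extend('exists', existsFn)
})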

Flutter upload list to Google Firestore

I would like to add a list from my flutter test app to my Google Firestore.
This is my method, which adds all the data:
void postToFireStore(
    {String mediaUrl, String location, String description}) {
  var reference = Firestore.instance.collection('insta_posts');
  reference.add({
    "marked_friends": [chipstuete.toString()],
    "username": currentUserModel.username,
    "location": location,
    "likes": {},
    "mediaUrl": mediaUrl,
    "description": description,
    "ownerId": googleSignIn.currentUser.id,
    "timestamp": new DateTime.now().toString(),
  }).then((DocumentReference doc) {
    String docId = doc.documentID;
    reference.document(docId).updateData({"postId": docId});
  });
}
Everything is working fine except the list "marked_friends"...
The list "chipstuete" has multiple strings:
[Martin Seubert, Lena Hessler, Vivien Jones]
But my Firestore looks like this: at the moment the whole list is stored in marked_friends[0]...
What do I need to change so that every entry of my list "chipstuete" is stored in a separate field of my array "marked_friends" in Firestore?
Best regards!
You have to add a method in your AppProfile class that serializes it to a List.
So in your AppProfile class:
class AppProfile {
  ... // Whatever fields/methods you have

  // Add this method
  List<String> getFriendList() {
    // Somehow implement it so it returns a List<String> based on your fields
    return ['name1', 'name2', 'name3'];
  }
}
Then you can do
"marked_friends": chipstuete.getFriendList(),
I have the solution.
Like SwiftingDuster said, I needed a new method which serializes it to a List:
// newtuete is a List<String> field declared next to chipstuete
List<String> newtuete = [];

List<String> toList() {
  chipstuete.forEach((item) {
    newtuete.add(item.toString());
  });
  return newtuete;
}
After that I just call toList() in my postToFireStore() method and add "marked_friends": newtuete. That's it!
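For what it's worth (my suggestion, assuming the items of chipstuete stringify the way you want), the same serialization can be written inline with map(), without the extra field or method:
"marked_friends": chipstuete.map((item) => item.toString()).toList(),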

How do I resolve data for a schema type that implement an interface in GraphQL?

I'm trying to develop a Spring Boot GraphQL service using the graphql-java 8 library. I'm fetching data from a web service; the response I get is somewhat dynamic, which is why I had to introduce a GraphQL interface in my response schema.
extend type Query {
  search(
    name: String,
    category: String
  ): [ResultOne]
}

interface ResultOne {
  name: String
  content: String
}

type Fish implements ResultOne {
  name: String
  content: String
  weight: Float
}

type Fruit implements ResultOne {
  name: String
  content: String
  Color: String
}

type Toy implements ResultOne {
  name: String
  content: String
  description: String
}
To wire my model up with the graphql-java framework:
return RuntimeWiring.newRuntimeWiring()
    .wiringFactory(new WiringFactory() {})
    .type(
        TypeRuntimeWiring.newTypeWiring("ResultOne")
            .typeResolver(env -> {
                if (env.getObject() instanceof Map) {
                    Map object = env.getObject();
                    if (object.containsKey("name") && object.get("name").equals("fish")) {
                        return (GraphQLObjectType) env.getSchema().getType("Fish");
                    } else if (object.containsKey("name") && object.get("name").equals("fruit")) {
                        return (GraphQLObjectType) env.getSchema().getType("Fruit");
                    } else if (object.containsKey("name") && object.get("name").equals("toy")) {
                        return (GraphQLObjectType) env.getSchema().getType("Toy");
                    } else {
                        return null;
                    }
                } else {
                    return null;
                }
            })
    )
    .build();
So the type-resolution issue is fixed, in a way, though I'm not sure it's ideal. For data binding, I'm not sure how to do it the way GraphQL recommends. I should add that I have a single endpoint and a single fetcher for the whole API; the data is fetched in a single request, and I don't want to call the service twice since I already have the whole response. I had to resolve the type at runtime and wire the data to the implementing model. So far the data is fetched fine, and values come through up to the interface fields of my query, but the fields of the implementing types (e.g. Fish, Fruit & Toy in this example) come back null. My question is: how do I bind dynamically resolved type data with this Java library?
Feel free to ask me any question regarding this issue.
Sample query:
{
  search {
    ResultOne {
      name
      content
      ... on Fish {
        weight
      }
    }
  }
}
Corresponding response that I'm currently getting:
{
  "data": {
    "search": [
      {
        "resultOne": [
          {
            "name": "Salmon",
            "content": "Frozen Food",
            "weight": null
          }
        ]
      }
    ]
  },
  "extensions": {
    "Total-ResponseTime": 23020,
    "resultOne-Time": 22683
  }
}
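One thing worth checking (my addition, not from the original post): graphql-java's default PropertyDataFetcher resolves a field on a Map by looking up a key with the exact field name, so when the fetcher returns Maps, the subtype fields (weight, Color, description) stay null unless each map actually contains those keys. A minimal sketch of the shape a Map-backed Fish result would need for weight to resolve (values made up):

import java.util.List;
import java.util.Map;

// Each element backs one ResultOne item; keys must match the schema
// field names exactly, including the subtype-specific ones.
List<Map<String, Object>> results = List.of(
    Map.of(
        "name", "fish",           // also what the TypeResolver above switches on
        "content", "Frozen Food",
        "weight", 1.5             // without this key, Fish.weight resolves to null
    )
);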

Advanced update using mongodb [duplicate]

In MongoDB, is it possible to update the value of a field using the value from another field? The equivalent SQL would be something like:
UPDATE Person SET Name = FirstName + ' ' + LastName
And the MongoDB pseudo-code would be:
db.person.update( {}, { $set : { name : firstName + ' ' + lastName } } );
The best way to do this is with version 4.2+, which allows using an aggregation pipeline in the update document with the updateOne, updateMany, or update (deprecated in most if not all language drivers) collection methods.
MongoDB 4.2+
Version 4.2 also introduced the $set pipeline stage operator, which is an alias for $addFields. I will use $set here as it maps with what we are trying to achieve.
db.collection.<update method>(
  {},
  [
    { "$set": { "name": { "$concat": ["$firstName", " ", "$lastName"] } } }
  ]
)
Note that square brackets in the second argument to the method specify an aggregation pipeline instead of a plain update document because using a simple document will not work correctly.
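To illustrate the difference (my example, not from the original answer): in a plain update document, a dollar-prefixed string is stored literally, while inside a pipeline it is evaluated as a field reference:
// Plain update document: every doc gets the literal string "$firstName"
db.collection.updateMany({}, { "$set": { "name": "$firstName" } })

// Aggregation pipeline (note the brackets): name gets the value of firstName
db.collection.updateMany({}, [ { "$set": { "name": "$firstName" } } ])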
MongoDB 3.4+
In 3.4+, you can use $addFields and the $out aggregation pipeline operators.
db.collection.aggregate(
  [
    { "$addFields": {
      "name": { "$concat": [ "$firstName", " ", "$lastName" ] }
    }},
    { "$out": <output collection name> }
  ]
)
Note that this does not update your collection but instead replaces the existing collection or creates a new one. Also, for update operations that require "typecasting", you will need client-side processing, and depending on the operation, you may need to use the find() method instead of the .aggregate() method.
MongoDB 3.2 and 3.0
The way we do this is by $projecting our documents and using the $concat string aggregation operator to return the concatenated string.
You then iterate the cursor and use the $set update operator to add the new field to your documents using bulk operations for maximum efficiency.
Aggregation query:
var cursor = db.collection.aggregate([
  { "$project": {
    "name": { "$concat": [ "$firstName", " ", "$lastName" ] }
  }}
])
MongoDB 3.2 or newer
You need to use the bulkWrite method.
var requests = [];
cursor.forEach(document => {
  requests.push({
    'updateOne': {
      'filter': { '_id': document._id },
      'update': { '$set': { 'name': document.name } }
    }
  });
  if (requests.length === 500) {
    // Execute per 500 operations and re-init
    db.collection.bulkWrite(requests);
    requests = [];
  }
});

if (requests.length > 0) {
  db.collection.bulkWrite(requests);
}
MongoDB 2.6 and 3.0
From this version, you need to use the now deprecated Bulk API and its associated methods.
var bulk = db.collection.initializeUnorderedBulkOp();
var count = 0;

cursor.forEach(function(document) {
  bulk.find({ '_id': document._id }).updateOne({
    '$set': { 'name': document.name }
  });
  count++;
  if (count % 500 === 0) {
    // Execute per 500 operations and re-init
    bulk.execute();
    bulk = db.collection.initializeUnorderedBulkOp();
  }
})

// clean up queues
if (count > 0) {
  bulk.execute();
}
MongoDB 2.4
cursor["result"].forEach(function(document) {
db.collection.update(
{ "_id": document._id },
{ "$set": { "name": document.name } }
);
})
You should iterate through. For your specific case:
db.person.find().snapshot().forEach(
  function (elem) {
    db.person.update(
      { _id: elem._id },
      {
        $set: {
          name: elem.firstname + ' ' + elem.lastname
        }
      }
    );
  }
);
Apparently there is a way to do this efficiently since MongoDB 3.4, see styvane's answer.
Obsolete answer below
You cannot refer to the document itself in an update (yet). You'll need to iterate through the documents and update each document using a function. See this answer for an example, or this one for server-side eval().
For a database with high activity, you may run into issues where your updates affect actively changing records, and for this reason I recommend using snapshot():
db.person.find().snapshot().forEach(function (hombre) {
  hombre.name = hombre.firstName + ' ' + hombre.lastName;
  db.person.save(hombre);
});
http://docs.mongodb.org/manual/reference/method/cursor.snapshot/
Starting Mongo 4.2, db.collection.update() can accept an aggregation pipeline, finally allowing the update/creation of a field based on another field:
// { firstName: "Hello", lastName: "World" }
db.collection.updateMany(
  {},
  [{ $set: { name: { $concat: [ "$firstName", " ", "$lastName" ] } } }]
)
// { "firstName" : "Hello", "lastName" : "World", "name" : "Hello World" }
The first part {} is the match query, filtering which documents to update (in our case all documents).
The second part, [{ $set: { name: { ... } } }], is the update aggregation pipeline (note the square brackets signifying the use of an aggregation pipeline). $set is a new aggregation operator and an alias of $addFields.
Regarding this answer, the snapshot function is deprecated in version 3.6, according to this update. So, on version 3.6 and above, it is possible to perform the operation this way:
db.person.find().forEach(
  function (elem) {
    db.person.update(
      { _id: elem._id },
      {
        $set: {
          name: elem.firstname + ' ' + elem.lastname
        }
      }
    );
  }
);
I tried the above solution but I found it unsuitable for large amounts of data. I then discovered the stream feature:
MongoClient.connect("...", function(err, db) {
  var c = db.collection('yourCollection');
  var s = c.find({ /* your query */ }).stream();
  s.on('data', function(doc) {
    c.update(
      { _id: doc._id },
      { $set: { name: doc.firstName + ' ' + doc.lastName } },
      function(err, result) { /* result == true? */ }
    );
  });
  s.on('end', function() {
    // stream can end before all your updates do if you have a lot
  });
});
The update() method takes an aggregation pipeline as a parameter, like:
db.collection_name.update(
  {
    // Query
  },
  [
    // Aggregation pipeline
    { "$set": { "id": "$_id" } }
  ],
  {
    // Options
    "multi": true // false when a single doc has to be updated
  }
)
Fields can be set or unset using existing values with the aggregation pipeline.
Note: use $ with the field name to refer to the field that has to be read.
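For example, a pipeline update can set one field and drop another in the same pass; a small sketch (my example; the field being removed is a made-up name):
db.collection_name.update(
  {},
  [
    { "$set": { "id": "$_id" } },    // copy _id into id
    { "$unset": ["legacyId"] }       // drop a no-longer-needed field (hypothetical name)
  ],
  { "multi": true }
)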
Here's what we came up with for copying one field to another for ~150_000 records. It took about 6 minutes, but is still significantly less resource intensive than it would have been to instantiate and iterate over the same number of Ruby objects.
js_query = %({
  $or : [
    {
      'settings.mobile_notifications' : { $exists : false },
      'settings.mobile_admin_notifications' : { $exists : false }
    }
  ]
})

js_for_each = %(function(user) {
  if (!user.settings.hasOwnProperty('mobile_notifications')) {
    user.settings.mobile_notifications = user.settings.email_notifications;
  }
  if (!user.settings.hasOwnProperty('mobile_admin_notifications')) {
    user.settings.mobile_admin_notifications = user.settings.email_admin_notifications;
  }
  db.users.save(user);
})

js = "db.users.find(#{js_query}).forEach(#{js_for_each});"

Mongoid::Sessions.default.command('$eval' => js)
With MongoDB version 4.2+, updates are more flexible, as the update, updateOne, and updateMany methods all accept an aggregation pipeline. You can now transform your documents using the aggregation operators and then update, without the need to explicitly state the $set command (instead we use $replaceRoot: { newRoot: "$$ROOT" }).
Here we use the aggregation query to extract the timestamp from MongoDB's ObjectId "_id" field and update the documents. (I am not an expert in SQL, but I don't think SQL provides an auto-generated ObjectId with an embedded timestamp; you would have to create that date yourself.)
var collection = "person"
agg_query = [
{
"$addFields" : {
"_last_updated" : {
"$toDate" : "$_id"
}
}
},
{
$replaceRoot: {
newRoot: "$$ROOT"
}
}
]
db.getCollection(collection).updateMany({}, agg_query, {upsert: true})
(I would have posted this as a comment, but couldn't)
For anyone who lands here trying to update one field using another in the document with the C# driver...
I could not figure out how to use any of the UpdateXXX methods and their associated overloads, since they take an UpdateDefinition as an argument.
// we want to set Prop1 to Prop2
class Foo { public string Prop1 { get; set; } public string Prop2 { get; set; } }

void Test()
{
    var update = new UpdateDefinitionBuilder<Foo>();
    update.Set(x => x.Prop1, <new value; no way to get a hold of the object that I can find>);
}
As a workaround, I found that you can use the RunCommand method on an IMongoDatabase (https://docs.mongodb.com/manual/reference/command/update/#dbcmd.update).
var command = new BsonDocument
{
    { "update", "CollectionToUpdate" },
    { "updates", new BsonArray
        {
            new BsonDocument
            {
                // Any filter; here the check is if Prop1 does not exist
                { "q", new BsonDocument { ["Prop1"] = new BsonDocument("$exists", false) } },
                // set it to the value of Prop2
                { "u", new BsonArray { new BsonDocument { ["$set"] = new BsonDocument("Prop1", "$Prop2") } } },
                { "multi", true }
            }
        }
    }
};

database.RunCommand<BsonDocument>(command);
MongoDB 4.2+ Golang
result, err := collection.UpdateMany(ctx, bson.M{},
    mongo.Pipeline{
        bson.D{{Key: "$set", Value: bson.M{
            "name": bson.M{"$concat": []string{"$lastName", " ", "$firstName"}},
        }}},
    },
)
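For completeness, a self-contained sketch of the same call with the driver boilerplate (my addition; the connection string, database, and collection names are placeholders):

package main

import (
    "context"
    "fmt"
    "log"

    "go.mongodb.org/mongo-driver/bson"
    "go.mongodb.org/mongo-driver/mongo"
    "go.mongodb.org/mongo-driver/mongo/options"
)

func main() {
    ctx := context.Background()

    // placeholder connection string
    client, err := mongo.Connect(ctx, options.Client().ApplyURI("mongodb://localhost:27017"))
    if err != nil {
        log.Fatal(err)
    }
    defer client.Disconnect(ctx)

    collection := client.Database("test").Collection("person")

    // MongoDB 4.2+ pipeline update: name = lastName + " " + firstName
    result, err := collection.UpdateMany(ctx, bson.M{},
        mongo.Pipeline{
            bson.D{{Key: "$set", Value: bson.M{
                "name": bson.M{"$concat": []string{"$lastName", " ", "$firstName"}},
            }}},
        },
    )
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println("modified:", result.ModifiedCount)
}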