Refetch queries with any combination of parameters - apollo

I have run into a problem when refetching queries after a mutation. If a query has no parameters, that's fine, but my GET_ITEMS query accepts several parameters (userId, companyId, categoryId), and different pages use different combinations of them. How can I tell Apollo to refetch all of these queries, with any combination of parameters?

It seems there is no way to do this with Apollo Client right now. So I had to save the variables of all GET_ITEMS calls from all pages, and then pass the saved variables to the mutation's refetchQueries option. The code turned out like this:
ItemsContext.js
const ItemsContext = React.createContext({
  cachedQueryVars: [],
});
ItemsList.js
...
render() {
  ...
  return <ItemsContext.Consumer>{({cachedQueryVars}) => {
    cachedQueryVars.push(variables);
    return <Query query={GET_ITEMS} variables={variables} >
      ...
ItemEdit.js
...
render() {
  ...
  return <ItemsContext.Consumer>{({cachedQueryVars}) =>
    <Mutation mutation={UPDATE_ITEM_MUTATION}
      refetchQueries={({data}) => this.handleRefetchQueries(data.updateItem, cachedQueryVars)}
      ...
}
handleRefetchQueries(newItem, cachedItemsQueryVars) {
  let result = [];
  let filtered = null;

  if (this.state.oldCategoryId != newItem.category.id) {
    filtered = cachedItemsQueryVars.filter(v => v.categoryId == this.state.oldCategoryId);
    result = this.concatItemQueryVars(result, filtered);
    filtered = cachedItemsQueryVars.filter(v => v.categoryId == newItem.category.id);
    result = this.concatItemQueryVars(result, filtered);
  }

  if (this.state.oldCompanyId != newItem.company.id) {
    filtered = cachedItemsQueryVars.filter(v => v.companyId == this.state.oldCompanyId);
    result = this.concatItemQueryVars(result, filtered);
    filtered = cachedItemsQueryVars.filter(v => v.companyId == newItem.company.id);
    result = this.concatItemQueryVars(result, filtered);
  }
  ...
  return result;
}

concatItemQueryVars(result, filtered) {
  return result.concat(filtered.map(v => ({
    query: GET_ITEMS,
    variables: v
  })));
}
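For completeness, here is a minimal sketch of how the shared cachedQueryVars array could be provided at the app root so that every page writes into and reads from the same list; the App component and the Routes placeholder are assumptions for illustration, not part of the original answer:
App.js
// Hypothetical wiring (not from the original answer): every ItemsList pushes its
// query variables into this shared array, and ItemEdit reads them back when
// building refetchQueries.
const cachedQueryVars = [];

const App = () => (
  <ItemsContext.Provider value={{cachedQueryVars}}>
    <Routes />  {/* your pages rendering ItemsList / ItemEdit */}
  </ItemsContext.Provider>
);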

Related

Is it possible to build dynamic queries for Amplify Datastore?

I am looking to create a query-builder for my Amplify Datastore.
The function should process an array of conditions that need to be applied to the query and return the corresponding predicate.
This is easily done if there is only one filter, but I would like to be able to process any number of filters.
My goal is to be able to write the queries like so:
Datastore.query(Post, queryBuilder(filters))
Where I can pass an array of filters with a filter looking like this:
filter = {
  connector: 'or' | 'and',
  property: 'rating',
  predicate: 'gt',
  value: 4
}
and the query builder returns the Predicate in the below mentioned format.
I have tried to chain and return multiple functions in the query builder, but I was not able to figure out a pattern for how to create the correct predicate function.
For reference, this is how queries are built according to the docs: https://docs.amplify.aws/lib/datastore/data-access/q/platform/js#predicates
const posts = await DataStore.query(Post, c => c.rating("gt", 4));
and for multiple conditions:
const posts = await DataStore.query(Post, c =>
  c.rating("gt", 4).status("eq", PostStatus.PUBLISHED)
);
Let's say we have the model:
type Post @model {
  id: ID!
  category: String
  city: String
  content: String
}
And we want to query & filter by city and category by a dynamic amount of variables. Then we can make a function as such on our script:
const fetchData = async props => {
  /*
    More configurable wrapper for DataStore.query calls.
    @param props: {model: Model, criteria: [{fieldId, predicate, value}]}
  */
  try {
    let criteria;
    if (props.criteria && typeof props.criteria === 'object') {
      criteria = c => {
        props.criteria.forEach(item => {
          const predicate = item.predicate || 'eq';
          c[item.fieldId](predicate, item.value);
        });
        return c;
      };
    } else {
      criteria = props.criteria;
    }
    return await DataStore.query(props.model, criteria);
  } catch (e) {
    throw new Error(e);
  }
}
So now if we want to execute this we can pass the parameters:
// where Post = models.Post
const myResult = fetchData({
  model: Post,
  criteria: [
    {
      fieldId: 'category',
      predicate: 'eq',
      value: 'news'
    },
    {
      fieldId: 'city',
      predicate: 'eq',
      value: 'SomeCityName'
    }
  ]
})
Unfortunately I do not know of a way to also query linked relationships as you would with a direct GraphQL API query while using DataStore, and the method I presented only uses an implicit AND between criteria.
I don't know if this has changed since you asked the question but, based on the docs, it looks like multiple conditions have an implicit and; you can explicitly chain them with or/and/not:
const posts = await DataStore.query(Post, c => c.or(
  c => c.rating("gt", 4).status("eq", PostStatus.PUBLISHED)
));
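Putting the pieces together, a queryBuilder in the shape the question asks for (Datastore.query(Post, queryBuilder(filters))) could be sketched as below. It reuses the same chained-builder trick as the fetchData wrapper above and only supports the implicit AND between filters; the filter values are made up for illustration:
// Sketch only: builds a DataStore predicate callback from an array of
// {property, predicate, value} filters, applying each one to the builder (implicit AND).
const queryBuilder = filters => c => {
  filters.forEach(({ property, predicate = 'eq', value }) => {
    c[property](predicate, value);
  });
  return c;
};

// Hypothetical usage with the Post model from above:
const filters = [
  { property: 'category', predicate: 'eq', value: 'news' },
  { property: 'city', predicate: 'eq', value: 'SomeCityName' }
];
const posts = await DataStore.query(Post, queryBuilder(filters));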

User Logging automation via Cloudwatch

I have this task for my company where I have to do a monthly user access review via CloudWatch.
This is a manual process where I have to go to CloudWatch > CloudWatch Logs > log groups > /var/log/example_access > example-instance and then document the logs for a list of users from a randomly generated date. The example instance is a certificate manager box which is linked to our entire production fleet of nodes. I also have to document which command that user used on specific nodes.
I am wondering if there is any way I can automate this process and dump it into Word docs? It's getting painful as the list of users/employees is increasing. Thanks
Sure there is, though I don't reckon you want Word docs. I'd launch an Elasticsearch instance on AWS and then give the users who want the data Kibana access.
Also, circulating Word docs in an org is bad juju; depending on your Windows/Office version it carries risks.
Add this Lambda function and then go into CloudWatch and add it as a subscription filter to the right log groups (a setup sketch for the subscription filter follows the function code below).
Note you may get missing log entries if they're not logged in JSON format or have funky formatting; if you're using a standard log format it should work.
/* eslint-disable */
// Eslint disabled as this is adapted AWS code.
const zlib = require('zlib')
const elasticsearch = require('elasticsearch')

/**
 * This is an example function to stream CloudWatch Logs to Elasticsearch.
 * @param event
 * @param context
 * @param callback
 */
export default (event, context, callback) => {
  context.callbackWaitsForEmptyEventLoop = true
  const payload = Buffer.from(event.awslogs.data, 'base64')
  const esClient = new elasticsearch.Client({
    httpAuth: process.env.esAuth, // your params here
    host: process.env.esEndpoint, // your params here.
  })
  zlib.gunzip(payload, (err, result) => {
    if (err) {
      return callback(err)
    }
    const logObject = JSON.parse(result.toString('utf8'))
    const elasticsearchBulkData = transform(logObject)
    // Control messages produce no bulk body; nothing to index.
    if (!elasticsearchBulkData) {
      return callback(null, 'success')
    }
    const params = { body: [] }
    params.body.push(elasticsearchBulkData)
    esClient.bulk(params, (bulkErr) => {
      // Report indexing failures instead of silently succeeding.
      if (bulkErr) {
        return callback(bulkErr)
      }
      return callback(null, 'success')
    })
  })
}
function transform(payload) {
  if (payload.messageType === 'CONTROL_MESSAGE') {
    return null
  }
  let bulkRequestBody = ''
  payload.logEvents.forEach((logEvent) => {
    const timestamp = new Date(1 * logEvent.timestamp)
    // index name format: cwl-YYYY.MM.DD
    const indexName = [
      `cwl-${process.env.NODE_ENV}-${timestamp.getUTCFullYear()}`, // year
      (`0${timestamp.getUTCMonth() + 1}`).slice(-2), // month
      (`0${timestamp.getUTCDate()}`).slice(-2), // day
    ].join('.')
    const source = buildSource(logEvent.message, logEvent.extractedFields)
    source['@id'] = logEvent.id
    source['@timestamp'] = new Date(1 * logEvent.timestamp).toISOString()
    source['@message'] = logEvent.message
    source['@owner'] = payload.owner
    source['@log_group'] = payload.logGroup
    source['@log_stream'] = payload.logStream
    const action = { index: {} }
    action.index._index = indexName
    action.index._type = 'lambdaLogs'
    action.index._id = logEvent.id
    bulkRequestBody += `${[
      JSON.stringify(action),
      JSON.stringify(source),
    ].join('\n')}\n`
  })
  return bulkRequestBody
}
function buildSource(message, extractedFields) {
  if (extractedFields) {
    const source = {}
    for (const key in extractedFields) {
      if (extractedFields.hasOwnProperty(key) && extractedFields[key]) {
        const value = extractedFields[key]
        if (isNumeric(value)) {
          source[key] = 1 * value
          continue
        }
        const jsonSubString = extractJson(value)
        if (jsonSubString !== null) {
          source[`$${key}`] = JSON.parse(jsonSubString)
        }
        source[key] = value
      }
    }
    return source
  }
  const jsonSubString = extractJson(message)
  if (jsonSubString !== null) {
    return JSON.parse(jsonSubString)
  }
  return {}
}
function extractJson(message) {
  const jsonStart = message.indexOf('{')
  if (jsonStart < 0) return null
  const jsonSubString = message.substring(jsonStart)
  return isValidJson(jsonSubString) ? jsonSubString : null
}

function isValidJson(message) {
  try {
    JSON.parse(message)
  } catch (e) {
    return false
  }
  return true
}

function isNumeric(n) {
  return !isNaN(parseFloat(n)) && isFinite(n)
}
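As mentioned above, the function still has to be attached to the log group as a subscription filter; you can do that in the console, or script it. Below is a rough sketch using the aws-sdk CloudWatchLogs client; the region, names and Lambda ARN are placeholders, and the Lambda also needs a resource-based permission allowing logs.amazonaws.com to invoke it:
// One-off setup sketch (aws-sdk v2); region, names and ARN are placeholders.
const AWS = require('aws-sdk')
const cwl = new AWS.CloudWatchLogs({ region: 'eu-west-1' })

cwl.putSubscriptionFilter({
  logGroupName: '/var/log/example_access',
  filterName: 'to-elasticsearch',
  filterPattern: '', // an empty pattern forwards every log event
  destinationArn: 'arn:aws:lambda:eu-west-1:123456789012:function:cwl-to-elasticsearch',
}, (err) => {
  if (err) console.error('failed to create subscription filter', err)
  else console.log('subscription filter created')
})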
Now you should have your logs going into Elasticsearch; go into Kibana and you can search by date and even write endpoints to allow people to query their own data!
The easy way is to just give stakeholders Kibana access and let them check it out.
Might not be exactly what you wanted, but I reckon it'll work better.

How can I pass parameters to django api call via redux?

How can I pass the name, age, and height arguments to my Redux action?
I am using a Django REST API as the backend.
export const Search_Results = (name, age, height) => {
  return dispatch => {
    axios.get("http://127.0.0.1:8000/api/")
      .then(res => {
        const info = res.data;
        dispatch(presentResult(info))
      });
  }
};
The presentResult is
export const presentResult = results => {
  return {
    type: actionTypes.PRESENT_RESULTS,
    results: results
  }
};
My reducer is
const presentResult = (state, action) => {
  return updateObject(state, {
    results: action.results
  });
};

switch (action.type) {
  case actionTypes.PRESENT_RESULTS:
    return presentResult(state, action);
}
The updateObject is simply this
export const updateObject = (oldObject, updatedProperties) => {
  return {
    ...oldObject,
    ...updatedProperties
  }
};
Update:
Basically, info is all the data in the database, and the user searches by passing the parameters name, age, height. I am trying to filter the info data and pass on only the entries that match any of the keywords in name, age, height.
You can just make them part of the payload:
dispatch(presentResult({info, name, age, height}));
You'll be able to access the values in the reducer like so:
//...
case actionTypes.PRESENT_RESULTS:
  return {...state, results: action.results}
//...
Edit: If you want to actually filter the data in the reducer, use the filter method for arrays:
//...
case actionTypes.PRESENT_RESULTS:
  return {
    ...state,
    results: state.results.filter(result => {
      return result.name === action.results.name || result.age === action.results.age || result.height === action.results.height
    })
  }
//...
If you want to find an object by exact match on all the fields, use `&&` instead.
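For reference, the action creator from the question would then just forward the search terms alongside the response data; a minimal sketch, keeping the question's endpoint and names:
export const Search_Results = (name, age, height) => {
  return dispatch => {
    axios.get("http://127.0.0.1:8000/api/")
      .then(res => {
        const info = res.data;
        // The reducer then receives the whole payload as action.results.
        dispatch(presentResult({ info, name, age, height }));
      });
  };
};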

Sorting list by name - Angular 5 + Firebase

I have created a service where I get all the elements of my database:
Service
getElements() {
  return (this.eleList = this.firebase.list("elements"));
}
Component
eleList: Element[];

getBets() {
  return this.databaseService
    .getElements()
    .snapshotChanges()
    .subscribe(item => {
      this.eleList = [];
      item.forEach(element => {
        let x = element.payload.toJSON();
        x["$key"] = element.key;
        this.eleList.push(x as Element);
      });
    });
}
With these two methods what I do is store all my elements in this.eleList.
I would like to create a new method, named filterByName(name), which would update this.eleList to an array containing only the elements whose name property contains name, for example this.eleList[1].name.
I do not know if Firebase provides a way to sort or filter it, or whether I need to use JavaScript/TypeScript for it.
Firebase takes full advantage of observables and async pipes.
You should take advantage of that too:
// imports needed: import { Subject } from 'rxjs'; import { map, take } from 'rxjs/operators';
eleList$ = new Subject();

getElements() {
  this.firebase.list("elements")
    .pipe(take(1))
    .subscribe(list => this.eleList$.next(list));
}

getBets() {
  this.databaseService
    .getElements()
    .snapshotChanges()
    .pipe(
      map(items => items.map(element => ({
        ...element.payload.toJSON(),
        '$key': element.key
      })))
    )
    .subscribe(elements => this.eleList$.next(elements));
}
Now for a filtered list (keeping only the elements that have a name):
sortedList$ = this.eleList$.pipe(
  map(elements => elements.filter(element => !!element.name))
);
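Building on that, a filterByName(name) method in the shape the question asks for could be sketched like this (assuming eleList$ emits arrays of objects with a name property, as above):
// Sketch: emits only the elements whose name contains the given string.
filterByName(name) {
  return this.eleList$.pipe(
    map(elements => elements.filter(element =>
      element.name && element.name.includes(name)
    ))
  );
}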

How do you sort results of a _View_ by value in Couchbase?

So from what I understand, in Couchbase one can sort keys by using
descending=true
but in my case I want to sort by values instead. Consider the Twitter data in JSON format; my question is: what is the most popular user mentioned?
Each tweet has the structure of:
{
  "text": "",
  "entities": {
    "hashtags": [ ... ],
    "user_mentions": [ ... ],
    "urls": [ ... ]
  }
}
So having used MongoDB before I reused the Map function and modified it slightly to be usable in Couchbase as follows:
function (doc, meta) {
  if (!doc.entities) { return; }
  doc.entities.user_mentions.forEach(
    function(mention) {
      if (mention.screen_name !== undefined) {
        emit(mention.screen_name, null);
      }
    }
  )
}
And then I used the reduce function _count to count all the screen_name occurrences. Now my problem is: how do I sort by the count values, rather than by the key?
Thanks
The short answer is that you cannot sort the result of your view by value; you can only sort by key.
Some workarounds would be to either:
analyze the data before inserting it into Couchbase and create a counter for the values you are interested in (mentions in your case); a rough sketch of this follows below
use the view you have and sort on the application side, if the size of the view result is acceptable for a client-side sort.
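For the first option, here is a rough sketch of the pre-aggregation idea using the Couchbase Node SDK 2.x; the bucket name and the mention::<screen_name> key scheme are assumptions for illustration:
// Sketch: keep a running counter per mentioned user at insert time, so
// "who is mentioned most" becomes a simple lookup instead of a sorted view.
var couchbase = require('couchbase');
var cluster = new couchbase.Cluster('couchbase://127.0.0.1');
var bucket = cluster.openBucket('social');

// Call this whenever a tweet is stored.
function countMentions(tweet) {
  if (!tweet.entities || !tweet.entities.user_mentions) { return; }
  tweet.entities.user_mentions.forEach(function (mention) {
    // Increment (creating on first use) a counter document per screen_name.
    bucket.counter('mention::' + mention.screen_name, 1, { initial: 1 }, function (err) {
      if (err) { console.error(err); }
    });
  });
}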
The following JS code calls a view, sorts the result, and prints the 10 hottest subjects (hashtags):
var http = require('http');

var options = {
  host: '127.0.0.1',
  port: 8092,
  path: '/social/_design/dev_tags/_view/tags?full_set=true&connection_timeout=60000&group=true',
  method: 'GET'
}

http.request(
  options,
  function(res) {
    var buf = '';
    res.on('data', function(data) {
      buf += data;
    });
    res.on('end', function() {
      var tweets = JSON.parse(buf);
      var rows = tweets.rows;
      rows.sort(function (a, b) { return b.value - a.value });
      for (var i = 0; i < 10; i++) {
        console.log(rows[i]);
      }
    });
  }
).end();
At the same time I am looking at other options to achieve this.
I solved this by using a compound key.
function (doc, meta) {
  emit([doc.constraint, doc.yoursortvalue]);
}
url elements:
&startkey=["jim",5]&endkey=["jim",10]&descending=true