How to get data from {} in GraphQL - Django

I want to get the user's addinfo (a boolean value).
When I do console.log(data.user), I get the data, as shown in the picture below.
But when I do console.log(data.user.user), it says user is undefined, as also shown below.
{
  user(token: "eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1c2VybmFtZSI6ImI3ZTA5YmVhOTAzNzQ3ODQiLCJleHAiOjE1NjM4OTcxNzksIm9yaWdJYXQiOjE1NjM4OTY4Nzl9.QFB58dAvqIC9RBBohN1b3TdR542dBZEcXOG1MSTqAQQ") {
    user {
      id
      addinfo
    }
  }
}
This query returns:
{
  "data": {
    "user": {
      "user": {
        "id": "4",
        "addinfo": false
      }
    }
  }
}

I can't see the rest of your code, but if the code is fetching your users, there is a window before the request comes back during which your user has not been fetched yet. Your screenshot appears to show this: an undefined is logged before the successful object.
You need to ensure that the data has come back first, by checking whether the data prop is truthy or by some other means of verifying that the promise has resolved.
e.g.
if (!data.user) return 'Loading...';
return (
  <Switch>
    ...
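For context, here is a slightly fuller sketch of that guard. The component and route names are made up, and it assumes react-apollo injecting a data prop (via the graphql HOC or similar):

const Routes = ({ data }) => {
  // data.loading is true until the request resolves; data.user is undefined until then
  if (data.loading || !data.user) return 'Loading...';
  return (
    <Switch>
      <Route exact path="/" component={Home} />
    </Switch>
  );
};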

In GraphQL I'm getting user info using code like the one below:
async getUser(id) {
  const result = await this.api.query({
    query: gql(getUser),
    variables: {
      id,
    },
  });
  return result.data.getUser || null;
}
I'm invoking it by:
const user = await userService.getUser(id);
and I do have access to user properties.
Maybe you're trying to access the user data before it has been retrieved and is available?

GCP Video Intelligence - batchPredict error

Following this documentation, when requesting a batchPredict I run into this error via the API:
{
  "error": {
    "code": 13,
    "message": "internal"
  }
}
Additionally, here's a screenshot of the error I see when I try to use the "Test & Use" tab. Neither is descriptive, so I'm not sure where the error lies.
In the request, I include the path to my CSV file in Google Storage, which references videos in the same bucket. Here are the contents of the CSV:
gs://XXXXXXXXXXXX/movie1.mov,0,inf
gs://XXXXXXXXXXXX/movie2.mov,0,inf
I also include the path to a /Results folder (in the same bucket) to save the predictions.
Code making the call:
const client = new PredictionServiceClient();

async function batchPredict() {
  const request = {
    name: client.modelPath('project-id-xxxxxx', 'us-central1', 'VOTxxxxxxxxxx'),
    inputConfig: {
      gcsSource: {
        inputUris: ['gs://XXXXXXXXXXXX/apitest.csv'],
      },
    },
    outputConfig: {
      gcsDestination: {
        outputUriPrefix: 'gs://XXXXXXXXXXXX/results/',
      },
    },
  };
  const [operation] = await client.batchPredict(request);
  // Wait for the long-running operation to finish
  const [response] = await operation.promise();
}

batchPredict();
Please let me know if I need to provide any more detail.
The possible root cause is one of these two:
There is an issue somewhere in your code. So, if your code is not the same as below, I suggest that you try it out (changing the appropriate variables, of course).
There is something wrong with your model, which is the most probable root cause (as per the error message itself).
So, if it is not your code, you should create a private issue report on the issue tracker explaining your issue, giving as many details as possible about it, as well as your use case and impact.
As it is private, only Googlers and you will have access to it, so feel free to share your project and model IDs.
Here is what I did to try to reproduce your issue (be sure to follow the "Before you begin" guide):
I trained a model on gs://YOUR_BUCKET/TRAINING.csv:
TRAIN,gs://automl-video-demo-data/traffic_videos/traffic_videos_train.csv
TEST,gs://automl-video-demo-data/traffic_videos/traffic_videos_test.csv
Then I predicted on a couple of videos listed in gs://YOUR_BUCKET/VIDEOS_TO_ANNOTATE.csv (the inputUri):
gs://automl-video-demo-data/traffic_videos/highway_078.mp4,0,inf
gs://automl-video-demo-data/traffic_videos/highway_079.mp4,10.00000,15.50000
using the Node.js predict example from the tutorial:
/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
const projectId = 'YOUR_PROJECT';
const location = 'us-central1';
const modelId = 'VOTXXXXXXXXXXXXXXXXXX';
const inputUri = 'gs://YOUR_BUCKET/VIDEOS_TO_ANNOTATE.csv';
const outputUri = 'gs://YOUR_BUCKET/outputs/';

// Imports the Google Cloud AutoML library
const {PredictionServiceClient} = require('@google-cloud/automl').v1beta1;

// Instantiates a client
const client = new PredictionServiceClient();

async function batchPredict() {
  // Construct request
  const request = {
    name: client.modelPath(projectId, location, modelId),
    inputConfig: {
      gcsSource: {
        inputUris: [inputUri],
      },
    },
    outputConfig: {
      gcsDestination: {
        outputUriPrefix: outputUri,
      },
    },
  };

  const [operation] = await client.batchPredict(request);
  console.log('Waiting for operation to complete...');
  // Wait for operation to complete.
  const [response] = await operation.promise();
  console.log(
    `Batch Prediction results saved to Cloud Storage bucket. ${response}`
  );
}

batchPredict();
Note that I have also tried the REST & CMD LINE predict example.
And in both cases, it worked well and I received a correct response:
Node.js prediction's response:
Waiting for operation to complete...
Batch Prediction results saved to Cloud Storage bucket. [object Object]
REST & CMD LINE prediction's response:
{
  "name": "projects/XXXXXXXXXX/locations/us-central1/operations/VOTXXXXXXXXXXXXXXX",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.automl.v1beta1.OperationMetadata",
    "createTime": "2021-04-16T08:09:52.102270Z",
    "updateTime": "2021-04-16T08:09:52.102270Z",
    "batchPredictDetails": {
      "inputConfig": {
        "gcsSource": {
          "inputUris": [
            "gs://MY_BUCKET/VIDEOS_TO_ANNOTATE.csv"
          ]
        }
      }
    }
  }
}
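As a side note, before filing the issue report you can rule out bad input URIs by checking that every file referenced by your CSV actually exists in the bucket. A minimal sketch using the @google-cloud/storage package; the bucket and object names below are the placeholders from the question:

const {Storage} = require('@google-cloud/storage');

async function checkInputs() {
  const storage = new Storage();
  // Objects referenced by the question's CSV -- replace with your own
  const objects = ['apitest.csv', 'movie1.mov', 'movie2.mov'];
  for (const name of objects) {
    // exists() resolves to [boolean]
    const [exists] = await storage.bucket('XXXXXXXXXXXX').file(name).exists();
    console.log(`${name}: ${exists ? 'found' : 'MISSING'}`);
  }
}

checkInputs();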

Apollo client mutation with writeQuery not triggering UI update

I have a mutation to create a new card object, and I expect it to be added to the user interface after the update. The cache, the Apollo Chrome tool, and console logging all reflect the change, but the UI does not update without a manual reload.
const [createCard, { loading, error }] = useMutation(CREATE_CARD, {
  update(cache, { data: { createCard } }) {
    let localData = cache.readQuery({
      query: CARDS_QUERY,
      variables: { id: deckId }
    });
    localData.deck.cards = [...localData.deck.cards, createCard];
    client.writeQuery({
      query: CARDS_QUERY,
      variables: { id: parseInt(localData.deck.id, 10) },
      data: { ...localData }
    });
  },
});
I have changed cache.writeQuery to client.writeQuery, but that didn't solve the problem.
For reference, here is the Query I am running...
const CARDS_QUERY = gql`
  query CardsQuery($id: ID!) {
    deck(id: $id) {
      id
      deckName
      user {
        id
      }
      cards {
        id
        front
        back
        pictureName
        pictureUrl
        createdAt
      }
    }
    toggleDeleteSuccess @client
  }
`;
I managed the same result without the cloneDeep method mentioned in the answer below. Just using the spread operator solved my problem.
const update = (cache, {data}) => {
  const queryData = cache.readQuery({query: USER_QUERY})
  const cartItemId = data.cartItem.id
  queryData.me.cart = queryData.me.cart.filter(v => v.id !== cartItemId)
  cache.writeQuery({query: USER_QUERY, data: {...queryData}})
}
Hope this helps someone else.
Ok, I finally ran into a long GitHub thread discussing solutions for the same issue. The solution that ultimately worked for me was deep cloning the data object (I personally used Lodash's cloneDeep); after passing the mutated clone to cache.writeQuery, the UI finally updated. It still seems like there ought to be a more direct way to trigger the UI update, considering the cache already reflects the changes.
Here's the "after"; see my original question above for the "before":
const [createCard, { loading, error }] = useMutation(CREATE_CARD, {
  update(cache, { data: { createCard } }) {
    const localData = cloneDeep( // Lodash cloneDeep to make a fresh object
      cache.readQuery({
        query: CARDS_QUERY,
        variables: { id: deckId }
      })
    );
    localData.deck.cards = [...localData.deck.cards, createCard]; // Push the mutation result onto the cloned object
    cache.writeQuery({
      query: CARDS_QUERY,
      variables: { id: localData.deck.id },
      data: { ...localData } // Cloning ultimately triggers the UI update since writeQuery now sees a new object.
    });
  },
});
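For what it's worth, the cloning matters because readQuery returns the object already held by the cache; if you mutate it in place, writeQuery receives the same reference and has nothing new to broadcast. A sketch of the same update done immutably, with no cloneDeep (same CREATE_CARD, CARDS_QUERY and deckId as above):

const [createCard, { loading, error }] = useMutation(CREATE_CARD, {
  update(cache, { data: { createCard } }) {
    const localData = cache.readQuery({
      query: CARDS_QUERY,
      variables: { id: deckId }
    });
    cache.writeQuery({
      query: CARDS_QUERY,
      variables: { id: deckId },
      data: {
        ...localData,
        // Build new objects along the changed path, so the cache sees new references
        deck: {
          ...localData.deck,
          cards: [...localData.deck.cards, createCard]
        }
      }
    });
  },
});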

Apollo GraphQL client doesn't return cached nested types in a query

I'm performing a query to get PowerMeter details, which contain another type called Project. I write the query this way:
query getPowerMeter($powerMeterId: ID!) {
  powerMeter: powerMeter(powerMeterId: $powerMeterId) {
    id
    name
    registry
    project {
      id
      name
    }
  }
}
When I perform the query for the first time, project is successfully returned. The problem is that when I perform subsequent queries with the same parameters and default fetchPolicy (cache-first), project isn't returned anymore.
How may I solve this problem?
Also, I call readFragment to check how powerMeter is saved in the cache and the response shows that powerMeter has project saved.
const frag = client.readFragment({
  fragment: gql`
    fragment P on PowerMeter {
      id
      name
      registry
      project {
        id
        name
      }
    }
  `,
  id: 'PowerMeter:' + powerMeterId,
});
Power Meter returned first time
{
  "powerMeter": {
    "id": "7168adb4-4198-443e-ab76-db0725be2b18",
    "name": "asd123123",
    "registry": "as23",
    "project": {
      "id": "41d8e71b-d1e9-41af-af96-5b4ae9e492c1",
      "name": "ProjectName",
      "__typename": "Project"
    },
    "__typename": "PowerMeter"
  }
}
Fragment after calling power meter first time
{
  "id": "7168adb4-4198-443e-ab76-db0725be2b18",
  "name": "asd123123",
  "registry": "as23",
  "project": {
    "id": "41d8e71b-d1e9-41af-af96-5b4ae9e492c1",
    "name": "ProjectName",
    "__typename": "Project"
  },
  "__typename": "PowerMeter"
}
Power Meter returned second time
{
  "powerMeter": {
    "id": "7168adb4-4198-443e-ab76-db0725be2b18",
    "name": "asd123123",
    "registry": "as23",
    "__typename": "PowerMeter"
  }
}
Fragment after calling power meter second time
{
  "id": "7168adb4-4198-443e-ab76-db0725be2b18",
  "name": "asd123123",
  "registry": "as23",
  "project": {
    "id": "41d8e71b-d1e9-41af-af96-5b4ae9e492c1",
    "name": "ProjectName",
    "__typename": "Project"
  },
  "__typename": "PowerMeter"
}
Edit 1: Fetching Query
The code below is how I'm fetching the data. I'm using useApolloClient and not a query hook because I'm using AWS AppSync, which doesn't support query hooks yet.
import { useApolloClient } from '@apollo/react-hooks';
import gql from 'graphql-tag';
import { useEffect, useState } from 'react';

export const getPowerMeterQuery = gql`
  query getPowerMeter($powerMeterId: ID!) {
    powerMeter: powerMeter(powerMeterId: $powerMeterId) {
      id
      name
      registry
      project {
        id
        name
      }
    }
  }
`;

export const useGetPowerMeter = (powerMeterId?: string) => {
  const client = useApolloClient();
  const [state, setState] = useState<{
    loading: boolean;
    powerMeter?: PowerMeter;
    error?: string;
  }>({
    loading: true,
  });

  useEffect(() => {
    if (!powerMeterId) {
      return setState({ loading: false });
    }
    client
      .query<GetPowerMeterQueryResponse, GetPowerMeterQueryVariables>({
        query: getPowerMeterQuery,
        variables: {
          powerMeterId,
        },
      })
      .then(({ data, errors }) => {
        if (errors) {
          setState({ loading: false, error: errors[0].message });
        }
        console.log(JSON.stringify(data));
        const frag = client.readFragment({
          fragment: gql`
            fragment P on PowerMeter {
              id
              name
              registry
              project {
                id
                name
              }
            }
          `,
          id: 'PowerMeter:' + powerMeterId,
        });
        console.log(JSON.stringify(frag));
        setState({
          loading: false,
          powerMeter: data.powerMeter,
        });
      })
      .catch(err => setState({ loading: false, error: err.message }));
  }, [powerMeterId]);

  return state;
};
Edit 2: Fetch Policy Details
When I use a fetchPolicy of cache-first or network-only, the error persists. When I use no-cache, I don't get the error.
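For reference, the fetch policy is passed per call when querying the client directly; a minimal sketch against the hook above (policy names as in Apollo's docs):

client.query({
  query: getPowerMeterQuery,
  variables: { powerMeterId },
  fetchPolicy: 'no-cache', // or 'cache-first' (default), 'network-only'
});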
I think this might have been the solution:
https://github.com/apollographql/apollo-client/issues/7050
Probably way too late, but it could help people coming to this issue in the future.
When using Apollo Client's InMemoryCache, it seems you need to provide a list of possible types so that fragment matching can be done correctly.
You can do that manually when you have few union types and a pretty stable API that doesn't change very often.
Or you can automatically generate these types into a JSON file, which you can use directly in the InMemoryCache's possibleTypes config.
Visit this link to the official docs to find out how to do it.
Cheers.
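For illustration, a minimal sketch of that config. The interface name and its concrete types below are invented; in practice you would generate the mapping from your schema as the docs describe:

import { InMemoryCache } from '@apollo/client';

const cache = new InMemoryCache({
  possibleTypes: {
    // interface or union name -> list of concrete object type names
    Meter: ['PowerMeter'],
  },
});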

AWS S3 Bucket Upload using CollectionFS and cfs-s3 meteor package

I am using Meteor.js with an Amazon S3 bucket for uploading and storing photos. I am using the Meteorite packages CollectionFS and aws-s3. I have set up my aws-s3 connection correctly and the images collection is working fine.
Client side event handler:
'click .submit': function(evt, templ) {
  var user = Meteor.user();
  var photoFile = $('#photoInput').get(0).files[0];
  if (photoFile) {
    var readPhoto = new FileReader();
    readPhoto.onload = function(event) {
      photodata = event.target.result;
      console.log("calling method");
      Meteor.call('uploadPhoto', photodata, user);
    };
    readPhoto.readAsDataURL(photoFile); // start the read; onload fires when done
  }
}
And my server side method:
'uploadPhoto': function uploadPhoto(photodata, user) {
  var tag = Random.id([10] + "jpg");
  var photoObj = new FS.File({name: tag});
  photoObj.attachData(photodata);
  console.log("s3 method called");
  Images.insert(photoObj, function (err, fileObj) {
    if (err) {
      console.log(err, err.stack)
    } else {
      console.log(fileObj._id);
    }
  });
}
The file that is selected is a .jpg image file, but upon upload I get this error in the server method:
Exception while invoking method 'uploadPhoto' Error: DataMan constructor received data that it doesn't support
No matter whether I pass the image file directly, attach it as data, or use the FileReader to read it as text/binary/string, I still get that error. Please advise.
Ok, maybe some thoughts. I did things with CollectionFS some months ago, so check against the docs, because my examples may not be 100% correct.
Credentials should be set via environment variables, so your key and secret are available on the server only (see the sketch below). Check this link for further reading.
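For example, a minimal sketch of an S3 store reading its credentials from the environment; the store name matches the collection below, but the bucket and environment variable names are assumptions:

// Server only -- the key and secret never reach the client
var profileImagesStore = new FS.Store.S3('profileImages', {
  bucket: process.env.S3_BUCKET,
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
});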
Ok first, here is some example code which is working for me. Check yours for differences.
Template helper:
'dropped #dropzone': function(event, template) {
  addImage(event);
}
Function addImage:
function addImage(event) {
  // Go through each file
  FS.Utility.eachFile(event, function(file) {
    // Some validation checks
    var reader = new FileReader();
    reader.onload = (function(theFile) {
      return function(e) {
        // e.target.result is the data URL produced by readAsDataURL below
        var fsFile = new FS.File(e.target.result);
        // Set metadata that is validated in the collection;
        // only the owning user can update/remove fsFile
        fsFile.metadata = {owner: Meteor.userId()};
        PostImages.insert(fsFile, function (err, fileObj) {
          if (err) {
            console.log(err);
          }
        });
      };
    })(file);
    // Read in the image file as a data URL.
    reader.readAsDataURL(file);
  });
}
Ok, your next point is the validation. The validation can be done with allow/deny rules and with a filter on the FS.Collection. This way you can do all your validation AND insert via the client.
Example:
PostImages = new FS.Collection('profileImages', {
  stores: [profileImagesStore],
  filter: {
    maxSize: 3145728,
    allow: {
      contentTypes: ['image/*'],
      extensions: ['png', 'PNG', 'jpg', 'JPG', 'jpeg', 'JPEG']
    }
  },
  onInvalid: function(message) {
    console.log(message);
  }
});

PostImages.allow({
  insert: function(userId, doc) {
    return (userId && doc.metadata.owner === userId);
  },
  update: function(userId, doc, fieldNames, modifier) {
    return (userId === doc.metadata.owner);
  },
  remove: function(userId, doc) {
    return false;
  },
  download: function(userId) {
    return true;
  },
  fetch: []
});
Here you will find another example.
Another possible point of error is your AWS configuration. Have you done everything as it is written here?
Based on this post, it seems that this error occurs when FS.File() is not constructed correctly. So maybe this should be your first place to look.
A lot to read, so I hope this helps you :)
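Building on that last point: since the linked post suggests the DataMan error comes from how FS.File is constructed, one thing worth trying is skipping the FileReader entirely and inserting the browser File object directly. A sketch using the names from the question, assuming CollectionFS accepts a File/Blob on the client:

'click .submit': function(evt, templ) {
  var photoFile = $('#photoInput').get(0).files[0];
  if (photoFile) {
    // FS.File can wrap the File object itself -- no manual reading needed
    var photoObj = new FS.File(photoFile);
    photoObj.metadata = {owner: Meteor.userId()};
    Images.insert(photoObj, function(err, fileObj) {
      if (err) console.log(err, err.stack);
    });
  }
}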

Facebook FQL problem with JavaScript SDK

Hey everyone,
I run the following query to get a user's statuses:
FB.api(
  {
    method: 'fql.query',
    query: 'SELECT message FROM statuses WHERE uid = ' + userId
  },
  function(data) {
    // do something with the response
  }
);
It works great when the number of results is more than 0, but when there are no results, the callback function is not called at all.
I need to know if there are 0 rows returned from this query. Is there any way to do that?
Thanks :)
First of all, the statuses table does not exist; you should be using the status table.
The callback is always called, but you should properly check against empty objects. Just paste this into the JavaScript Test Console:
<fb:login-button scope="read_stream">
  Grant access to statuses
</fb:login-button>
<button onclick="getStatuses()">Get Statuses</button>
<script>
window.getStatuses = function() {
  FB.api(
    {
      method: 'fql.query',
      query: 'SELECT message FROM status WHERE uid = me() AND time < 315532800'
    },
    function(data) {
      if (!isEmpty(data)) {
        for (var key in data) {
          var obj = data[key];
          console.log(obj['message'])
        }
      } else {
        console.log("data is empty")
      }
    });
};

function isEmpty(obj) {
  for (var prop in obj) {
    if (obj.hasOwnProperty(prop))
      return false;
  }
  return true;
}
</script>
Here I am checking for statuses before 1/1/1980 to ensure that an empty result is returned. In your console you should see the "data is empty" response.
When there are no results from a query, you should be getting an empty array.
Also, there isn't an FQL table named "statuses"; it's "status".
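If it is indeed an empty array, a plain length check in the callback is enough (a minimal sketch, assuming the result really is an array):

function(data) {
  if (!data || data.length === 0) {
    console.log('no statuses for this user');
    return;
  }
  data.forEach(function(row) {
    console.log(row.message);
  });
}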