I am trying to use Ionic 2 Storage, which uses IndexedDB in the browser. How can I get the total number of keys stored in the DB?
this._storage.length() gives me t {__zone_symbol__state: null, __zone_symbol__value: Array[0]}
I found the answer: this._storage.length() returns a promise:
this._storage.length().then((data) => {
    console.log(data);
});
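If you also need the keys themselves, Ionic Storage exposes a keys() method that resolves to an array, so its length gives the same count (a small sketch using the same promise pattern as above):

this._storage.keys().then((keys) => {
    console.log(keys.length, keys);
});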
I'm trying to import existing users from another application into Google Cloud Identity Platform using passwords that were hashed using PHP's crypt function with an MD5 output.
This is an example of the hashed password being used in PHP:
$hashed_password = '$1$AT$JGYIRSP7xIYmg1XSoJmvB1';
$user_input = 'test123';
if (hash_equals($hashed_password, crypt($user_input, $hashed_password))) {
    echo "Password verified!";
}
I've tried all sorts of combinations for importing the user and their password, but no combination seems to work. This is the NodeJS import script I'm using:
var admin = require('firebase-admin');

var app = admin.initializeApp({
    credential: admin.credential.applicationDefault()
});

admin
    .auth()
    .importUsers(
        [
            {
                uid: '31',
                email: 'user31@test.test',
                // Must be provided in a byte buffer.
                passwordHash: Buffer.from('$1$AT$JGYIRSP7xIYmg1XSoJmvB1'),
                // Must be provided in a byte buffer.
                passwordSalt: Buffer.from('$1$AT$'),
            },
        ],
        {
            hash: {
                algorithm: 'MD5',
                rounds: 0,
            },
        }
    )
    .then((results) => {
        console.log(results);
        results.errors.forEach((indexedError) => {
            console.log(`Error importing user ${indexedError.index}`);
        });
    })
    .catch((error) => {
        console.log('Error importing users :', error);
    });
As mentioned above, I've tried just about every combination of hash and passwordSalt I could think of. I've tried:
passing no passwordSalt
passing a passwordSalt of AT
passing a passwordSalt of AT$
passing a passwordSalt of $AT
passing a passwordSalt of $AT$
all of the above but with a hash algorithm of BCRYPT
I can see the user getting imported. If I change the hash to something like a regular MD5 hash, I'm able to authenticate as that user, so I know the import process is working correctly.
Does GCP Identity Platform simply not support these hashes? Am I passing the salt incorrectly or passing an incorrect number of rounds? Am I passing the wrong hash algorithm? I'm a little surprised, as I would've thought passwords hashed using PHP's crypt function would've been supported.
4 points:
1.- You mention that you are trying to import existing users and their passwords from another application into Google Cloud Identity Platform, using a NodeJS import script focused on MD5. Comparing the official GCP documentation you quoted (Migrating users from an existing app) with the script you posted, it seems you did not use the documented code sample exactly as intended, unless you actually own the domain "@test.test":
// getAuth comes from the modular Admin SDK.
const { getAuth } = require('firebase-admin/auth');

getAuth()
    .importUsers(
        [
            {
                uid: 'some-uid',
                email: 'user@example.com',
                // Must be provided in a byte buffer.
                passwordHash: Buffer.from('password-hash'),
                // Must be provided in a byte buffer.
                passwordSalt: Buffer.from('salt'),
            },
        ],
        {
            hash: {
                algorithm: 'PBKDF2_SHA256',
                rounds: 100000,
            },
        }
    )
    .then((results) => {
        results.errors.forEach((indexedError) => {
            console.log(`Error importing user ${indexedError.index}`);
        });
    })
    .catch((error) => {
        console.log('Error importing users :', error);
    });
2.- Are you sure you have the required admin rights or role in IAM in your GCP organization? See the official GCP documentation on this topic: Understanding roles.
3.- How are you determining that it is not working? Are you seeing any of the documented error codes (Error codes)? Is there any particular behavior after the import finishes, or any log you can share by editing your original post? Or is it only that, once the users are imported, they are unable to authenticate?
4.- For now, I suggest following the reference for Method: users.insert first; that way you can review the hash formats that are supported. Finally, make sure you run this test with domains that you have verified in Cloud Identity.
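One more note on the hash format itself: the $1$ prefix in the posted value marks PHP crypt's MD5-crypt scheme, an iterated, salted construction that is not the same thing as a single MD5 digest, which is what I would expect the import API's MD5 option to verify. A quick sketch of the difference, using only Node's built-in crypto (the password test123 comes from the question):

var crypto = require('crypto');

// A plain, hex-encoded MD5 digest of the password - the kind of
// value a simple 'MD5' verifier can check:
var plainMd5 = crypto.createHash('md5').update('test123').digest('hex');
console.log(plainMd5); // 32 hex chars, no '$1$' prefix

// PHP's crypt with a '$1$' salt instead performs roughly a thousand
// interleaved MD5 rounds and emits '$1$<salt>$<digest>', e.g. the
// question's '$1$AT$JGYIRSP7xIYmg1XSoJmvB1'. No single
// createHash('md5') call reproduces that output.

If MD5-crypt is indeed unsupported, the remaining options would be re-hashing the passwords into a supported scheme, or a lazy migration that verifies against the crypt hash at each user's first sign-in.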
I have a request with a number of test cases: same endpoint, different input values, different expected error messages.
I would like to create a parameterized request that sends a particular value and checks for the corresponding error message/code from a list covering all of the cases.
Request body:
{
"username": "{{username}}",
"password": "{{password}}",
...
}
Response:
{
"error_message": "{{error_message}}",
"error_code": "{{error_code}}"
}
The error message changes depending on the case:
Missing username
Missing password
Incorrect password or username
etc.
Right now, I have a separate request for each case.
Question:
Is there a way to have one request with a set of different values, checking for the particular error messages/codes?
Create a csv:
username,password,error_message,error_code
username1,password1,errormessage1,errorcode1
username2,password2,errormessage2,errorcode2
Now use this as data file in collection runner or newman.
The variable name is the same as the column name, and for each iteration the variable takes the corresponding row's value for that column. E.g. for iteration 1, {{username}} resolves to username1.
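If you go the newman route, the data file is passed with the -d (--iteration-data) flag; the collection file name below is a placeholder:

newman run my-collection.json -d data.csv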
As danny mentioned, Postman has really rich documentation that you can make use of:
https://learning.postman.com/docs/running-collections/working-with-data-files/
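To assert the expected error for each row, here is a minimal sketch for the request's Tests tab, assuming the response shape shown in the question:

// Compare the response body against the expected values from the
// current data-file row (column names become Postman variables).
const body = pm.response.json();

pm.test("error matches expected row values", function () {
    pm.expect(body.error_message).to.eql(pm.variables.get("error_message"));
    pm.expect(body.error_code).to.eql(pm.variables.get("error_code"));
});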
Adding another answer, on how to run data-driven tests from the same request:
Create an environment variable called "csv", then copy the content below and paste it as its value:
username,password,error_message,error_code
username1,password1,errormessage1,errorcode1
username2,password2,errormessage2,errorcode2
Now in the Pre-request Script tab, add:
// On the first pass, parse the CSV from the environment and cache it.
if (!pm.variables.get("index")) {
    const parse = require('csv-parse/lib/sync')
    // Environment variable where we copy-pasted the csv content
    const input = pm.environment.get("csv");
    const records = parse(input, {
        columns: true,
        skip_empty_lines: true
    })
    pm.variables.set("index", 0)
    pm.variables.set("records", records)
}

const records = pm.variables.get("records")
let index = pm.variables.get("index")

if (index !== records.length) {
    // Expose the current row's columns as request variables.
    for (let i of Object.entries(records[index])) {
        pm.variables.set(i[0], i[1])
    }
    pm.variables.set("index", ++index)
    // Keep re-running this same request until every row is consumed.
    if (pm.variables.get("index") !== records.length) {
        postman.setNextRequest(pm.info.requestName)
    }
}
Now you can run data-driven tests for that one particular request.
Example collection:
https://www.getpostman.com/collections/eb144d613b7becb22482
Use the same data as the environment variable content, then run the collection using the collection runner or newman.
I'm new to all the hot GraphQL/Apollo stuff.
I have a subscription which gets a search result:
export const SEARCH_RESULTS_SUBSCRIPTION = gql`
subscription onSearchResultsRetrieved($sid: String!) {
searchResultsRetrieved(sid: $sid) {
status
clusteredOffers {
id
}
}
}
`;
Is it possible to query the "status" field from the client cache if I need it inside another component? Or do I have to use an additional query?
In the Apollo dev-tools I can see that there is a cache entry under "ROOT_SUBSCRIPTION", not "ROOT_QUERY". What does that mean?
...thanks
I found out that subscribeToMore is my friend for solving this.
First I wrote a normal query for the data I want to subscribe to, so there is cached data under ROOT_QUERY; the cache is then updated by the subscription.
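A minimal sketch of that pattern with Apollo Client 3's React hooks; SEARCH_RESULTS_QUERY and its searchResults root field are assumptions standing in for whatever query your schema actually exposes:

import { gql, useQuery } from '@apollo/client';
import { useEffect } from 'react';
// SEARCH_RESULTS_SUBSCRIPTION is the subscription from the question.

// Hypothetical query mirroring the subscription's selection set.
const SEARCH_RESULTS_QUERY = gql`
  query SearchResults($sid: String!) {
    searchResults(sid: $sid) {
      status
      clusteredOffers {
        id
      }
    }
  }
`;

function useSearchResults(sid) {
  // The query result is cached under ROOT_QUERY, so other components
  // can read "status" from the cache via the same query.
  const { data, subscribeToMore } = useQuery(SEARCH_RESULTS_QUERY, {
    variables: { sid },
  });

  useEffect(() => {
    // Patch the cached query result whenever the subscription fires;
    // subscribeToMore returns an unsubscribe function for cleanup.
    return subscribeToMore({
      document: SEARCH_RESULTS_SUBSCRIPTION,
      variables: { sid },
      updateQuery: (prev, { subscriptionData }) => {
        if (!subscriptionData.data) return prev;
        return { searchResults: subscriptionData.data.searchResultsRetrieved };
      },
    });
  }, [subscribeToMore, sid]);

  return data;
}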
<3 apollo
I have set up Parse Server on AWS Elastic Beanstalk by following this guide. I've then written a cloud-code function which fetches a single record from a specific class/collection. The collection contains about 20 columns; however, the object fetched as a result of the query contains only about 8 of them. I've made sure the record does have data in the columns which are missed by the query. Am I missing something here, or is it some limitation in Parse? Is there any way to force Parse to fetch these columns?
Parse.Cloud.define('confirmAppointment', function(request, response) {
    var staffId = request.params.staffId;
    var appointmentId = request.params.appointmentId;
    var appointmentRequest = Parse.Object.extend("AppointmentRequest");
    appointmentRequest.id = appointmentId;
    appointmentRequest.staffId = staffId;
    var query = new Parse.Query(appointmentRequest);
    query.first({
        useMasterKey: true,
        success: function(appointment) {
            if (appointment) {
                // these fields are not found in the fetched appointment object
                // they do exist however in mongodb
                var requesterUserId = appointment.get("requesterUserId");
                var staffUserId = appointment.get("staffUserId");
                var staffName = appointment.get("staffNameEn");
                ...
            }
        }
        ...
    });
});
There might be some typos in your code (the construction of the query part). Try this instead:
Parse.Cloud.define('confirmAppointment', function(req, res) {
    var staffId = req.params.staffId;
    var appointmentId = req.params.appointmentId;
    var query = new Parse.Query("AppointmentRequest");
    query.equalTo('objectId', appointmentId);
    query.equalTo('staffId', staffId);
    query.first({
        useMasterKey: true,
        success: function(appointment) {
            res.success(appointment.get("requesterUserId"));
        },
        error: function(err) {
            res.error(err);
        }
    });
});
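Note that the success/error callback style above only works on older Parse Server releases; since Parse Server 3.x, cloud functions are promise-based and there is no response object, so a rough equivalent would look like this:

Parse.Cloud.define('confirmAppointment', async (request) => {
    const { staffId, appointmentId } = request.params;
    const query = new Parse.Query('AppointmentRequest');
    query.equalTo('objectId', appointmentId);
    query.equalTo('staffId', staffId);
    // Whatever the function returns is sent back to the caller.
    const appointment = await query.first({ useMasterKey: true });
    return appointment ? appointment.get('requesterUserId') : null;
});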
The issue turned out to be that when I migrated my data from Parse to my mongolab-hosted MongoDB instance, I did not click the 'Finalize' button in the Parse migration wizard. That was intentional, as Parse was warning me that clicking Finalize would make the migration permanent and I would no longer be able to get back to the Parse-managed database. On the other hand, I could see that all the data was successfully migrated to mongolab, and technically it should have been enough for my AWS-hosted Parse Server to work against this new database without any issue. But somehow, clicking the 'Finalize' button in Parse did some magic (I still don't understand what it could be) and my queries started returning the expected results.
I was able to reproduce the same issue when migrating to Heroku as well, so I was sure it had nothing to do with AWS.
Hope this helps someone.
I am moving from SQL to CouchDB for my web application, my very first application.
While I cannot say why I do not like SQL queries (I'm not even sure that I don't), the idea of making cURL requests to access my database sounds much better than using PHP's PDO.
I have spent a little over a day and a half trying to acquaint myself with the CouchDB HTTP API. I cannot claim I have thoroughly read the API, but who thoroughly reads an API before beginning to code? So my, possibly silly, question is: how do I pass a variable other than doc to a map function while making an HTTP request to the view? The API clearly says that a map function takes only a single parameter, "doc", in which case the function below is itself wrong, but I can't find any section in the API that lets me query a database using end-user-provided input.
My map function is:
function(doc, pid2) {
    if (doc.pid === pid2) {
        emit(doc._id, doc);
    }
}
pid2 is a number that will be provided by a front-end user.
<?php
// Pseudocode: a Facebook Graph API call that returns a profile ID
$pid2 = file_get_contents(/* Facebook Graph API call, returns a profile ID */);
// Pseudocode: HTTP request to the CouchDB view, returning in JSON
// format the list of documents with pid = $pid2
$user_exists = /* HTTP request to CouchDB view */;
?>
Let your view emit the documents with doc.pid as the key:
function(doc) {
    emit(doc.pid, doc);
}
and use the key parameter to retrieve the right document:
http://localhost:5984/<database>/_design/<designdoc>/_view/<viewname>?key=<pid2>
This should return all documents with doc.pid === pid2.
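One detail that trips people up: keys in view requests are JSON-encoded, so a string pid needs quotes in the URL (numbers can be passed bare). A minimal Node sketch, with placeholder database/design-doc/view names matching the URL pattern above:

// Query the view with a JSON-encoded key (Node 18+ has fetch built in).
const pid2 = 42; // would come from the Facebook Graph API call
const url = 'http://localhost:5984/mydb/_design/mydesign/_view/by_pid'
    + '?key=' + encodeURIComponent(JSON.stringify(pid2));

fetch(url)
    .then((response) => response.json())
    .then(({ rows }) => {
        // Each row is { id, key, value }; value is the doc the view emitted.
        console.log(rows.length > 0 ? 'user exists' : 'no such user');
    });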