How to iterate through all <select> field options in behat / mink

I am testing a product search form. Products may be searched by different parameters (like status, material, weight etc.). When I want to search by status I do the following:
Scenario Outline: search by status
When I select "<status>" from "search_form_status"
And I press "Search"
And I wait for 3 seconds // implemented
And I follow random link from test result table // implemented
Then I should see "<status>" in the "div#status" element
Examples:
|status |
|enabled |
|disabled|
And everything is fine. But if I wanted to test the same search for, say, productMaterial, I'm stuck, as product materials can change at any time (we may need new materials, may edit material names, or delete old unused ones). Add to that the fact that the materials on the testing environment will differ from those on the production site.
I know that I can do something like:
Given I select product material
and implement the step with a foreach loop like this:
$matList = $this->getSession()->getPage()->findAll('css', "select option");
foreach ($matList as $material) {
    // DO SOMETHING
}
But how would I create all the other steps like in the status example?
I imagine that I would want to use a $material variable in the steps that follow that custom step in my search.feature file. But how do I do that?
How would I iterate through all of the options list and do a bunch of steps in each iteration?

You'll need to write the PHP code that runs the individual steps you want inside the method that loops over all of the options.
For example:
$handler = $this->getSession()->getSelectorsHandler();
$optionElements = $this->getSession()->getPage()->findAll('named', array('option', $handler->selectorToXpath('css', 'select')));
foreach ($optionElements as $optionElement) {
    // Select the option, submit the form and check the result page.
    // 'portal', 'show' and the expected text are specific to this example form; adapt them to yours.
    $this->getSession()->getPage()->selectFieldOption('portal', $optionElement->getValue());
    $this->pressButton("show");
    $this->assertPageContainsText(" - Report Report");
}

Related

How can I dynamically create multi-level hierarchical forms in Django?

I'm building an advanced search page for a scientific database using Django. The goal is to be able to allow some dynamically created sophisticated searches, joining groups of search terms with and & or.
I got part-way there by following this example, which allows multiple terms anded together. It's basically a single group of search terms, dynamically created, that I can either and-together or or-together. E.g.
<field1|field2> <is|is not|contains|doesn't contain> <search term> <->
<+>
...where <-> will remove a search term row and <+> will add a new row.
But I would like the user to be able to either add another search term row, or add an and-group and an or-group, so that I'd have something like:
<and-group|or-group> <->
<field1|field2> <is|is not|contains|doesn't contain> <search term> <->
<+term|+and-group|_or-group>
A user could then add terms or groups. The result search might end up like:
and-group
compound is lysine
or-group
tissue is brain
tissue is spleen
feeding status is not fasted
Thus the resulting filter would be like the following.
Data.objects.filter(Q(compound="lysine") & (Q(tissue="brain") | Q(tissue="spleen")) & ~Q(feeding_status="fasted"))
Note - I'm not necessarily asking how to get the filter expression above exactly right - it's just the dynamic hierarchical construction component that I'm trying to figure out. Please excuse me if I got the Q and/or filter syntax wrong. I've made these queries before, but I'm still new to Django, so getting it right off the top of my head is unlikely. I've also left out the model relationships I would be spanning here, so let's assume these are all fields in the same model, for simplicity.
I'm not sure how I would dynamically add parentheses to the filter expression, but my current code can easily join individual Q expressions with either AND or OR.
I'm also not sure how I could dynamically create a hierarchical form to create the sub-groups. I'm guessing any such solution would have to be a hack and that there are no established mechanisms for doing something like this...
Here's a screenshot example of what I've currently got working:
UPDATE:
I got really far following this example I found. I forked that fiddle and got this proof of concept working before incorporating it into my Django project:
http://jsfiddle.net/hepcat72/d42k38j1/18/
The console spits out exactly the object I want. And there are no errors. Clicking the search button works for form validation. Any fields I leave empty causes a prompt to fill in the field. Here's a demo gif:
Now I need to process the POST input to construct the query (which I think I can handle) and restore the form above the results - which I'm not quite sure how to accomplish - perhaps a recursive function in a custom tag?
Although, is there a way to snapshot the form and restore it when the results load below it? Or maybe have the results load in a different frame?
I don't know if I'm teaching a grandmother to suck eggs, but in case not, one of the features of the Python language may be useful.
foo(bar=27, baz=None)
can instead be coded
args = {}
a1, a2 = 'bar', 'baz'
args[a1] = 27
args[a2] = None
foo(**args)
so an arbitrary Q object specified by runtime keys and values can be constructed as q1 = Q(**args).
IIRC q1 & q2 and q1 | q2 are themselves Q objects so you can build up a filter of arbitrary complexity.
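To make that concrete, here is a minimal sketch (assuming the Data model and field names from the question, and a hypothetical myapp module) that rebuilds the example filter purely from runtime data:

from django.db.models import Q
from myapp.models import Data  # hypothetical app path; Data is the question's model

# Each leaf condition arrives as runtime data, e.g. from a submitted form.
cond1 = {"compound": "lysine"}
cond2 = {"tissue": "brain"}
cond3 = {"tissue": "spleen"}
cond4 = {"feeding_status": "fasted"}

# Q(**kwargs) turns a runtime dict into a Q object; &, | and ~ combine them.
query = Q(**cond1) & (Q(**cond2) | Q(**cond3)) & ~Q(**cond4)

results = Data.objects.filter(query)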
I'll also include a mention of Django-filter which is usually my answer to filtering questions like this one, but I suspect in this case you are after greater power than it easily provides. Basically, it will "and" together a list of filter conditions specified by the user. The built-in ones are simple .filter( key=value), but by adding code you can create custom filters with complex Q expressions related to a user-supplied value.
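If django-filter does turn out to be enough, a minimal FilterSet sketch might look like this (again assuming the hypothetical Data model from the question); note that it only ANDs the user's conditions together:

import django_filters
from myapp.models import Data  # hypothetical app path

class DataFilter(django_filters.FilterSet):
    # Every filter the user fills in is simply ANDed with the others.
    compound = django_filters.CharFilter(lookup_expr="iexact")
    tissue = django_filters.CharFilter(lookup_expr="iexact")

    class Meta:
        model = Data
        fields = ["compound", "tissue", "feeding_status"]

# In a view: filtered_qs = DataFilter(request.GET, queryset=Data.objects.all()).qs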
As for the forms, a Django form is a linear construct, and a formset is a list of similar forms. I think I might resort to JavaScript to build some sort of tree representing a complex query in the browser, and have the submit button encode it as JSON and return it through a single text field (or just pick it out of request.POST without using a form). There may be some JavaScript already written to do this, but I'm not aware of it. You'd need to be sure that malicious submission of field names and values you weren't expecting doesn't result in security issues. For a pure filtering operation, this basically amounts to being sure that the user is entitled to see all the data in the database table in any case.
There's a form JSONField in the Django PostgreSQL extensions, which validates that user-supplied (or Javascript-generated) text is indeed JSON, and supplies it to you as Python dicts and lists.
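Tying those two ideas together, here is a hedged sketch of how JSON produced by the browser-side tree might be turned into a nested Q object on the server. The node shapes ("group"/"condition", "op", "negate") are an assumption, not something the linked jsfiddle produces verbatim, and field names are whitelisted to address the security concern above:

from django.db.models import Q

ALLOWED_FIELDS = {"compound", "tissue", "feeding_status"}  # whitelist of searchable fields

def build_q(node):
    """Recursively convert a JSON search tree into a Q object.

    Assumed node shapes:
      {"type": "group", "op": "and" | "or", "children": [...]}
      {"type": "condition", "field": "...", "value": "...", "negate": bool}
    """
    if node["type"] == "group":
        children = [build_q(child) for child in node["children"]]
        combined = children[0]
        for child in children[1:]:
            combined = combined & child if node["op"] == "and" else combined | child
        return combined

    if node["field"] not in ALLOWED_FIELDS:
        raise ValueError("unexpected field: %s" % node["field"])
    q = Q(**{node["field"]: node["value"]})
    return ~q if node.get("negate") else q

# Example: the and-group / or-group query from the question.
tree = {
    "type": "group", "op": "and", "children": [
        {"type": "condition", "field": "compound", "value": "lysine"},
        {"type": "group", "op": "or", "children": [
            {"type": "condition", "field": "tissue", "value": "brain"},
            {"type": "condition", "field": "tissue", "value": "spleen"},
        ]},
        {"type": "condition", "field": "feeding_status", "value": "fasted", "negate": True},
    ],
}
# Data.objects.filter(build_q(tree)) would reproduce the filter from the question.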

Kibana: can I store "Time" as a variable and run a consecutive search?

I want to automate a few searches in one; here are the steps:
Search in Kibana for this ID: "b2c729b5-6440-4829-8562-abd81991e2a0", which will return a bunch of logs. Of these logs I need to take the first and the last timestamp:
I would now like to store these two values, FROM: September 3rd 2019, 21:28:22.155 and TO: September 3rd 2019, 21:28:23.524, in two variables
Run a second search in Kibana for the word "fail" between these two time variables
How to automate the whole process without need of copy/paste and running a second query?
EDIT:
SHORT STORY LONG: I work in a company that produces software for autonomous vehicles.
SCENARIO: A booking is rejected and we need to understand why.
WHERE IS THE PROBLEM: I need to monitor just a few seconds of logs on 3 different machines. Each log is completely separate; there is no relation between the logs, so I cannot write a single query in Discover - I need to run 3 separate queries.
EXAMPLE:
A booking was rejected, so I open Chrome and I search on "elk-prod.myhost.com" for the BookingID "b2c729b5-6440-4829-8562-abd81991e2a0", and I get a dozen logs returned over a range of 2 seconds (FROM: September 3rd 2019, 21:28:22.155, TO: September 3rd 2019, 21:28:23.524).
Now I need to know what was happening on the car, so I open a new Chrome tab and I search on "elk-prod.myhost.com" for the CarID "Tesla-45-OU" on the time range FROM: September 3rd 2019, 21:28:22.155, TO: September 3rd 2019, 21:28:23.524.
Now I need to know why the server which calculates the matching rejected the booking, so I open a new Chrome tab and I search for the word CalculationMatrix, again on the time range FROM: September 3rd 2019, 21:28:22.155, TO: September 3rd 2019, 21:28:23.524.
CONCLUSION: I want to stop opening Chrome tabs by hand and automate the whole thing. I have no idea around what time the booking was made, so I first need to search for the BookingID "b2c729b5-6440-4829-8562-abd81991e2a0", then store the timestamps of the first and last log and run a second and third query based on those timestamps.
There is no relation between the 3 logs I search for, so there is no way to filter from Discover; I need to automate 3 different queries.
Here is how I would do it. First of all, from what I understand, you have three different indexes:
one for "bookings"
one for "cars"
one for "matchings"
First, in Discover, I would create three Saved Searches, one per index pattern. Then in Visualize, I would create a Vertical bar chart on the bookings saved search (Bucket X-Axis by date_histogram on the timestamp field, leave the rest as is). You'll get a nice histogram of all your booking events bucketed by time.
Finally, I would create a dashboard and add the vertical bar chart + those three saved searches inside it.
When done, the way I would search according to the process you've described above is as follows:
Search for the booking ID b2c729b5-6440-4829-8562-abd81991e2a0 in the top filter bar. In the bar chart histogram (bookings), you will see all documents related to the selected booking. On that chart, you can select the exact period from when the very first booking document happened to the very last. This will adapt the main time picker at the top and the start/end time will be "remembered" by Kibana
Remove the booking ID from the top filter (since we now know the time range and Kibana stores it). Search for Tesla-45-OU in the top filter bar. The bar histogram + the booking saved search + the matchings saved search will be empty, but you'll have data inside the second list, the one for cars. Find whatever you need to find in there and go to the next step.
Remove the car ID from the top filter and search for ComputationMatrix. Now the third saved search is going to show you whatever documents you need to see within that time range.
I'm lacking realistic data to try this out, but I definitely think this is possible as I've laid out above, probably with some adaptations.
Kibana does work like this (any order is ok):
Select time filter: https://www.elastic.co/guide/en/kibana/current/set-time-filter.html
Add additional search criteria, for example field s is b2c729b5-6440-4829-8562-abd81991e2a0.
Add additional search criteria, for example field x is Fail.
Additionally, you can view surrounding documents https://www.elastic.co/guide/en/kibana/current/document-context.html#document-context
This is how Kibana works.
You can prepare some filters beforehand, save them, and then use them if you want to somehow automate the discovery process.
You can do that in the Discover tab in Kibana using the New/Save/Open options.
Edit:
I do not think you can achieve what you need in Kibana. As I mentioned earlier, one option is to change the data that is coming into Elasticsearch so you can search for it via Discover in Kibana. Another option could be building, for example, a Java application that uses Elasticsearch - then you can write an algorithm that returns the data that you want. But I think it's a big overhead and I recommend checking the data first.
Edit: To clarify - you can create an external Java (let's say Spring Boot) application that uses Elasticsearch - all the data that you need is inside it.
But with this option you will not use Kibana at all.
You can export the result to CSV or whatever you want in the code.
The Spring Boot application can ask Elasticsearch for whatever it needs, and then it is easy to store these time variables inside the Java code.
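For illustration, here is a rough sketch of that idea using the official elasticsearch Python client (7.x-style body syntax) instead of Spring Boot. The index names are taken from the other answer; the endpoint, field names and the @timestamp field are assumptions that would need to match your actual mappings:

from elasticsearch import Elasticsearch

es = Elasticsearch("http://elk-prod.myhost.com:9200")  # assumed endpoint

BOOKING_ID = "b2c729b5-6440-4829-8562-abd81991e2a0"

# 1) Find all booking logs and take the first/last timestamp.
booking_hits = es.search(
    index="bookings",                                   # assumed index name
    body={
        "query": {"match": {"BookingID": BOOKING_ID}},  # assumed field name
        "sort": [{"@timestamp": "asc"}],
        "size": 1000,
    },
)["hits"]["hits"]
t_from = booking_hits[0]["_source"]["@timestamp"]
t_to = booking_hits[-1]["_source"]["@timestamp"]

# 2) and 3) Re-use that time range for the car and matching-server queries.
def search_in_range(index, query):
    return es.search(
        index=index,
        body={
            "query": {
                "bool": {
                    "must": [query],
                    "filter": [{"range": {"@timestamp": {"gte": t_from, "lte": t_to}}}],
                }
            }
        },
    )["hits"]["hits"]

car_logs = search_in_range("cars", {"match": {"CarID": "Tesla-45-OU"}})        # assumed field name
matching_logs = search_in_range("matchings", {"match": {"message": "CalculationMatrix"}})  # assumed field name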
EDIT: After OP edited question to change it dramatically:
@FrancescoMantovani Well, the edited version is very different from what you first posted here: How to automate the whole process without need of copy/paste and running a second query? and searching for the word fail in a single shot. In the accepted answer you are still using three filters, one at a time, so it is not one search but three.
What's more, if you use one index and send data from multiple hosts via Filebeat, you don't even have to create this dashboard to do that. Then you can select the exact period from when the very first document happened to the very last for a given filter, then remove it and add another filter that you need - it's as simple as that. Before, you were writing about one query,
How to automate the whole process without need of copy/paste and
running a second query?
not three. And you don't need to open a new tab in Chrome each time you want to change the filter; just organize the data, for example by using Filebeat as mentioned before.
There is no relation between the 3 logs
From what you wrote, the relation exists and it is time.
If the data is in, for example, three different indices (because the documents don't have much data in common) you can do it like that:
You can change them easily in Discover, see:
You can go to Discover, select index 1, run your search and select the time range that you need; when you change the index, the time range is still the one you selected, so you only need to change the filter and you will get what you need.

Adding a new real-time shipping system to older X-Cart

Trying to add a new real-time shipping system to an existing, older (4.2.x) version of X-Cart and I cannot figure out how to implement it properly. The plan is to put the lookup into a new shipping/mod_*.php file and, from what I can tell, merge $intershipper_rates with the response I get from the rating API. I just don't know how to reliably integrate it, nor whether I need to manually add anything to the database to make it work properly. There doesn't seem to be any reference material or documentation for this older version that I can easily access to figure it out, either. If anybody can give me a hand wrapping my head around this, I'd appreciate it.
In the code below replace the 'CPC' substring with your new shipper code.
1) Create functions like
func_shipper_CPC
func_get_package_limits_CPC
func_check_limits_CPC
in a new file like
shipping/mod_CPC.php
2) Change the array
$mods = array("USPS", "CPC", "ARB", "FEDEX");
in the shipping/myshipper.php
3) Add a row to the shipping options table
$params = func_query_first ("SELECT * FROM $sql_tbl[shipping_options] WHERE carrier='CPC'");
4) Add possible shipping methods in the xcart_shipping table
INSERT INTO xcart_shipping VALUES (null,'Canada Post Expedited','','L','CPC','81',20,'Y','CEX',0.00,0.00,1020,'','');
INSERT INTO xcart_shipping VALUES (null,'Canada Post Regular','','L','CPC','82',10,'Y','CRE',0.00,0.00,1010,'','');
INSERT INTO xcart_shipping VALUES (null,'Canada Post Xpresspost USA','','I','CPC','89',90,'Y','',0.00,0.00,2030,'','');
.....

Filter Gtest based on two filters

I have multiple GTests. I would like to run only some of them based on two filters. For example, I want to run all UI properties tests. Something like: --gtest_filter=Properties and --gtest_filter=UI. I would like to use both filters in the same run.
I could not find the syntax for passing 2 filters at the same time.
I read this; I would like to do something like the 3rd example (./foo_test --gtest_filter=Null:Constructor runs any test whose full name contains either "Null" or "Constructor"), but instead run any test whose full name contains both "Null" and "Constructor".
Any ideas?
This is how you specify multiple filters:
--gtest_filter=*Properties*:*UI*
EDIT:
In case you want both words to be included in the filter, you can use this:
--gtest_filter=*Properties*UI*:*UI*Properties*

Tableau: Creating Dynamic Filter to exclude names

I have been working on this for the better part of the day and would like to crowdsource it, as I must just be missing something simple.
I would like to use a parameter control to create a dynamic filter that would exclude the names of individuals that have already participated in an event. For example, in the following list of two fields:
Name-Event Name
Carl-Agriculture
Carl-Agriculture
Carl-Agriculture
Jodie-Business
Jodie-Agriculture
Jodie-Agriculture
Pam-Business
Pam-Business
Pam-Business
if the parameter was set to Agriculture, only Pam would show up on the list, and if it was set to Business only Carl would show up. This list will help stakeholders send invitations to potentially interested parties.
I have tried so many calculations including the parameter itself in IF statements, IIF statements, CASE statements, etc. I've also tried creating a second calculation to work off of the first but am still striking out.
Any ideas?
You got most of the way there on your own. To finish the job:
Place Name on the filter shelf
Select "Use all" on the General tab of the filter panel
Select "By field:" on the Condition tab of the filter panel
Choose the "If Exclusion Statement" field, Count aggregation function (NOT Sum in this case), and set the test to "= 0"
The effect of this filter is equivalent to the SQL group by Name having Count(If Exclusion Statement) = 0
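If it helps to see the exclusion logic outside of Tableau, here is a small pandas sketch of the same rule (column names taken from the question; this is only an illustration of the logic, not Tableau code):

import pandas as pd

df = pd.DataFrame(
    {
        "Name": ["Carl", "Carl", "Carl", "Jodie", "Jodie", "Jodie", "Pam", "Pam", "Pam"],
        "Event Name": ["Agriculture", "Agriculture", "Agriculture",
                       "Business", "Agriculture", "Agriculture",
                       "Business", "Business", "Business"],
    }
)

chosen_event = "Agriculture"  # the parameter value

# Equivalent of GROUP BY Name HAVING COUNT(exclusion) = 0:
# keep only names with zero rows matching the chosen event.
matches = df["Event Name"].eq(chosen_event).groupby(df["Name"]).sum()
eligible = matches[matches == 0].index.tolist()
print(eligible)  # ['Pam'] for Agriculture, ['Carl'] for Business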