Blue Prism Decision stage to ignore letter capitalization - case-insensitive

I am new to Blue Prism and trying to develop a bot that will do some searches on a CRM portal. The bot should search for a certain customer in the web-based CRM app database, and should select the correct one, based on the information provided in an Excel file.
For example, my Excel file has the following information:
Customer name: BLABLA LTD
Contact: test.email@example.com
First Name: John
Last Name: Smith
The bot will use the information in the cells above to perform the search in the web portal, but the web portal contains information which is sometimes in capital letters. I have managed to make the bot go through each element in the webpage that contains a search result, but I want it to click on the one that matches the information above. I used a decision stage, so if the Customer name in the table is the same as in the element, then it will click on it.
The problem is that in the table the text is capitalized, but in the web form it is not, so Blue Prism will consider the values different. Is there any way I can make the bot ignore capitalization when performing the comparison logic? What I am doing now is adding a new calculation stage to store all the elements in lowercase and afterwards performing the equality check between the new lowercase variables, but I was hoping there is an easier way.

You can use Lower() or Upper() on both strings; that will make sure the two of them are in the same casing.
You will find both functions available under the Text group in the expression editor if you want to see more details about them.
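For example, a Decision stage expression along these lines compares the two values case-insensitively (the data item names here are illustrative, not taken from your process):
Lower([Customer Name]) = Lower([Search Result Text])
This evaluates to True however either value is capitalized, with no extra calculation stages needed.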

Related

Power BI's Map with Danish post codes

I am trying to add data to a Map with Danish postcodes, e.g. 1000-9000 (we have four digits in Denmark).
When I add them to a Map, they scatter all over the world, as Power BI does not recognize them as Danish locations, even though my Power BI is set up in Danish and the Map has Danish-spelled city names.
I tried to add the regions Jylland, Fyn, Sjælland as a country hierarchy, but doing that moved Jylland (Jutland) to a place in Norway...
I also tried to use city names instead of post codes, but then a city shows up in Sweden...
It makes no difference whether the post code column is in Text or Number format, and I have no option to use a Location format in the query.
Can anyone help me use Danish post codes for Map visualization? : )
Thanks
Ok, I solved it myself!
I found the place in the modelling part where I could force Power BI to accept my city names, region names, etc., and it now works.
In more detail: go to the middle one of the three left-side views, called Data (not Report, not Model), and click on the column whose format you want to change. Then find the Tools section and change the Data Category to, for example, Address or Country. Hope that helps.

Kibana: can I store "Time" as a variable and run a consecutive search?

I want to automate a few searches in one; here are the steps:
Search in Kibana for this ID: "b2c729b5-6440-4829-8562-abd81991e2a0", which will return a bunch of logs. Of these logs I need to take the first and the last timestamp:
I now would like to store these two values, FROM: September 3rd 2019, 21:28:22.155 and TO: September 3rd 2019, 21:28:23.524, in 2 variables
Run a second search in Kibana for the word "fail" between these two time variables
How to automate the whole process without need of copy/paste and running a second query?
EDIT:
SHORT STORY LONG: I work at a company that produces software for autonomous vehicles.
SCENARIO: A booking is rejected and we need to understand why.
WHERE THE PROBLEM IS: I need to monitor just a few seconds of logs on 3 different machines. Each log is completely separate; there is no relation between the logs, so I cannot write a single query in Discover. I need to run 3 separate queries.
EXAMPLE:
A booking was rejected, so I open Chrome and I search on "elk-prod.myhost.com" for the BookingID:"b2c729b5-6440-4829-8562-abd81991e2a0", and I get a dozen logs returned within a range of 2 seconds (FROM: September 3rd 2019, 21:28:22.155, TO: September 3rd 2019, 21:28:23.524).
Now I need to know what was happening on the car so I open a new Chrome tab and I search on "elk-prod.myhost.com" for the CarID: "Tesla-45-OU" on the time range FROM: September 3rd 2019, 21:28:22.155, TO: September 3rd 2019, 21:28:23.524
Now I need to know why the server which calculates the matching rejected the booking, so I open a new Chrome tab and I search for the word CalculationMatrix, again on the time range FROM: September 3rd 2019, 21:28:22.155, TO: September 3rd 2019, 21:28:23.524
CONCLUSION: I want to stop opening Chrome tabs by hand and automate the whole thing. I have no idea around what time the booking was made, so I first need to search for the BookingID "b2c729b5-6440-4829-8562-abd81991e2a0", then store the timestamps of the first and last log and run a second and third query based on those timestamps.
There is no relation between the 3 logs I search for, so there is no way to filter in Discover; I need to automate 3 different queries.
Here is how I would do it. First of all, from what I understand, you have three different indexes:
one for "bookings"
one for "cars"
one for "matchings"
First, in Discover, I would create three Saved Searches, one per index pattern. Then in Visualize, I would create a Vertical bar chart on the bookings saved search (Bucket X-Axis by date_histogram on the timestamp field, leave the rest as is). You'll get a nice histogram of all your booking events bucketed by time.
Finally, I would create a dashboard and add the vertical bar chart + those three saved searches inside it.
When done, the way I would search according to the process you've described above is as follows:
Search for the booking ID b2c729b5-6440-4829-8562-abd81991e2a0 in the top filter bar. In the bar chart histogram (bookings), you will see all documents related to the selected booking. On that chart, you can select the exact period from when the very first booking document happened to the very last. This will adapt the main time picker at the top, and the start/end time will be "remembered" by Kibana.
Remove the booking ID from the top filter (since we now know the time range and Kibana stores it). Search for Tesla-45-OU in the top filter bar. The bar histogram + the booking saved search + the matchings saved search will be empty, but you'll have data inside the second list, the one for cars. Find whatever you need to find in there and go to the next step.
Remove the car ID from the top filter and search for CalculationMatrix. Now the third saved search is going to show you whatever documents you need to see within that time range.
I'm lacking realistic data to try this out, but I definitely think this is possible as I've laid out above, probably with some adaptations.
Kibana does work like this (any order is ok):
Select time filter: https://www.elastic.co/guide/en/kibana/current/set-time-filter.html
Add additional criteria for the search, for example field s is b2c729b5-6440-4829-8562-abd81991e2a0.
Add additional criteria for the search, for example field x is Fail.
Additionally, you can view surrounding documents https://www.elastic.co/guide/en/kibana/current/document-context.html#document-context
This is how Kibana works.
You can prepare some filters beforehand, save them, and then use them if you want to somehow automate the process of discovery.
You can do that in Discover tab in Kibana using New/Save/Open options.
Edit:
I do not think you can achieve what you need in Kibana. As I mentioned earlier, one option is to change the data that is coming into Elasticsearch so you can search for it via Discover in Kibana. Another option could be building, for example, a Java application that uses Elasticsearch - then you can write an algorithm that returns the data that you want. But I think it's a big overhead, and I recommend checking the data first.
Edit: To clarify - you can create an external Java application, let's say a Spring Boot application, that uses Elasticsearch - all the data that you need is inside it.
But in this option you will not use Kibana at all.
You can export the results to CSV or whatever you want in the code.
The Spring Boot application can ask Elasticsearch for whatever it needs, and it would then be easy to store these time variables inside the Java code.
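To make that concrete, here is a minimal sketch of the idea - written with the Node.js Elasticsearch client (v8-style API) rather than Spring Boot, but any client works the same way. The index names (bookings, cars, matchings) and field names (BookingID, CarID, @timestamp, message) are assumptions based on the question, not known mappings:
const { Client } = require('@elastic/elasticsearch');
const client = new Client({ node: 'https://elk-prod.myhost.com:9200' });

async function run() {
  // Step 1: find the first and last timestamp of the booking's logs
  // with min/max aggregations instead of copy/pasting them by hand.
  const booking = await client.search({
    index: 'bookings',
    size: 0,
    query: { match: { BookingID: 'b2c729b5-6440-4829-8562-abd81991e2a0' } },
    aggs: {
      first: { min: { field: '@timestamp' } },
      last: { max: { field: '@timestamp' } }
    }
  });
  const from = booking.aggregations.first.value_as_string;
  const to = booking.aggregations.last.value_as_string;

  // Steps 2 and 3: reuse that exact time window for the other two searches.
  const window = { range: { '@timestamp': { gte: from, lte: to } } };
  const carLogs = await client.search({
    index: 'cars',
    query: { bool: { must: [{ match: { CarID: 'Tesla-45-OU' } }, window] } }
  });
  const matrixLogs = await client.search({
    index: 'matchings',
    query: { bool: { must: [{ match_phrase: { message: 'CalculationMatrix' } }, window] } }
  });
  console.log(carLogs.hits.hits, matrixLogs.hits.hits);
}

run().catch(console.error);
The min/max aggregation is what replaces the manual copy/paste of the two timestamps.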
EDIT: After the OP edited the question to change it dramatically:
@FrancescoMantovani Well, the edited version is very different from what you first posted here: How to automate the whole process without need of copy/paste and running a second query? and searching for the word fail in a single shot. In the accepted answer you are still using three filters, one at a time, so it is not one search, but three.
What's more, if you used one index and sent data from multiple hosts via Filebeat, you wouldn't even have to create this dashboard to do that. You could select the exact period from when the very first document happened to the very last for a given filter, then remove it and add another filter that you need - it's as simple as that. Before, you were writing about one query,
How to automate the whole process without need of copy/paste and
running a second query?
not three. And you don't need to open a new Chrome tab each time you want to change the filter; just organize the data, for example by using Filebeat as mentioned before.
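For example, a minimal Filebeat configuration shipping each machine's logs into the same cluster could look like this (the log path is a placeholder for wherever your logs live):
filebeat.inputs:
- type: log
  paths:
    - /var/log/myapp/*.log
output.elasticsearch:
  hosts: ["elk-prod.myhost.com:9200"]
Run the same config on all three machines and every document ends up searchable from one place, under one shared time picker.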
There is no relation between the 3 logs
From what you wrote, the relation does exist, and it is time.
If the data is, for example, in three different indices (because the documents don't have much data in common), you can do it like this: go to Discover, select index 1, search, and select the time range that you need. When you change the index (you can switch indices easily in Discover), the time range stays the one you selected; you only need to change the filter, and you will get what you need.

Dynamics SL Field Length limit / Invoice Number length

Can anyone point me to the documentation for SL that addresses the field length limits? Right now I need to know the length limit of the invoice number.
I'm not quite sure where the documentation is, but the way I figure out field length limits is with SQL Server Management Studio (SSMS). You can look at the databases and the various tables/fields and see how big each field is, as well as a myriad of other information.
To find the invoice number field length with SSMS, I would connect to the Dynamics SL company database (not the system database). The invoice number is part of the accounts payable (AP) screens, so I would expand the APDOC table. Once you've done that, you will see a few folders, one of which is the Columns folder. Expand the Columns folder and you will be presented with a list of fields. Within the parentheses next to each of those fields you will find the length. In your case you will want to look at the InvcNbr field, which is 15 characters for me; I believe that is the out-of-the-box length.
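If you prefer a query to clicking through Object Explorer, the same information is in the catalog views; a minimal sketch, run against the company database (APDOC and InvcNbr as described above):
SELECT COLUMN_NAME, CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'APDOC' AND COLUMN_NAME = 'InvcNbr';
On an out-of-the-box install this should return 15, matching what Object Explorer shows.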
An alternative method is to use customize mode within Dynamics SL. If you open up any screen that has the invoice number field, like Voucher and Adjustment Entry you can open up customize mode using the menu at the top of the screen. Next, open up the Property Window either by using the customize menu or hitting F4. Next, select the field you want to know the length for. For all character fields there will be a property called Mask that will be filled with several Xs. To figure out the field length, you simply need to count the number of Xs.
These 2 methods should be fairly future-proof and save you from having to search for the documentation for whatever version of Dynamics SL you might be on.

variable date option in opencart

First, I want to remove the text field for the date so the calendar will replace it.
Second, I want to make the order status work by calendar. I want to sell a service, so I need to take booking orders via the calendar. If the date is green, the client can make an order. If red, the client can't book an order. If yellow, only certain items can be ordered.
I hope someone can help..
Thanks.
You should at least try something first and then ask only for advice.
Anyway, a few suggestions:
it cannot be done using that option field of type date, at least not with the default datepicker.
You will need to create your own datepicker component that will look up free/partially/fully booked days in the database and color the table cells accordingly (see the sketch after this answer).
It is not very wise to hide the input - if it stays visible, the user can check at any time which date they picked; if it is not visible, they would always have to open the datepicker to check...
Disallowing orders for a service based on existing reservations sharply decreases your conversion rate - and thus your income. I would definitely let the user buy/order anything at any time and keep a separate reservation system. Once the user buys a service, I would recommend on the thank-you page that they book a concrete date for the service to be delivered. That way you do not need to fight with product options, which are meant for something totally different from what you are trying to do.
Keep that in mind (mainly the 4th point) and re-think your problem.
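To illustrate the second suggestion, here is a rough sketch with the jQuery UI datepicker; the availability endpoint, the day states, and the CSS class names are all made up for the example:
// Availability per day, e.g. { "2019-09-03": "full", "2019-09-04": "partial" };
// the route below is a hypothetical endpoint you would implement yourself.
$.getJSON('index.php?route=product/availability', function (availability) {
  $('#booking-date').datepicker({
    dateFormat: 'yy-mm-dd',
    beforeShowDay: function (date) {
      var key = $.datepicker.formatDate('yy-mm-dd', date);
      var state = availability[key] || 'free'; // 'free', 'partial' or 'full'
      // First element: is the day selectable? Second: a CSS class you would
      // style green ('day-free'), yellow ('day-partial') or red ('day-full').
      return [state !== 'full', 'day-' + state];
    }
  });
});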

webservice for autosuggest on city names / postal codes including long-lat coordinates?

I'm looking for a webservice to be used for an autocomplete field,
where people can fill in either a postal code, a city name, or both.
This service will need all cities in Europe, so we can use it for all our country websites,
and at a later stage we want to keep the door open for Asia and America, so that would be a plus.
Preferably it would also return the long-lat coordinates for the locations.
Right now it is a free text field; after leaving the field, we hit the Google geocoding service
to find coordinates. Preferably I would tie these two together,
so we don't have to query 2 services for one thing.
Does anyone know of the existence of such a service online somewhere?
Or would you suggest building our own database with cities / postal codes / coordinates?
If so, we would need to get the content from somewhere too, and I was trying to avoid that issue :)
I recently searched for a similar service, in vain.
I wanted my users to have auto-complete on entering a city name, and once a city was chosen I needed to pass the name and lat/long on to the Google API. In the end I did this:
downloaded the GeoNames allCountries.zip, the full extract
Imported it into a SQL DB via SSIS (about 7.5 million records!)
Wrote a simple query to extract just the cities (only the PPLC, PPLA and PPLA2 records) - see the sketch below.
This left me with a manageable table of 9112 records (with lat / long and country code) which covers all the cities in the world. I then wrote my own code to query the data.
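For illustration, assuming the SSIS import landed in a table called GeoNames with the dump's standard columns (the table and column names here are placeholders for however you mapped them), the extraction query could look something like:
SELECT Name, CountryCode, Latitude, Longitude
FROM GeoNames
WHERE FeatureCode IN ('PPLC', 'PPLA', 'PPLA2');
In the GeoNames feature codes, PPLC is a capital city and PPLA/PPLA2 are seats of first- and second-order administrative divisions.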
Not ideal, but I needed a solution.
I know this post is very old, but for those who are looking for a simple solution that can be integrated in 5 minutes, here is the link:
Geocomplete jQuery...
In my case I followed these steps:
1 - Download the plugin from here.
2 - Add the jquery.geocomplete.js or jquery.geocomplete.min.js file to the javascript folder of your project.
3 - Reference this file in script tags on the HTML page that has the input field you want to autocomplete with cities:
<script src='/PathToTheFile/jquery.geocomplete.js'></script>
4 - To convert an input into an autocomplete field, simply call the Geocomplete plugin in script tags: <script>
$("#IdOfTheInputField").geocomplete(); // Option 1: Call on element.
$.fn.geocomplete("input"); // Option 2: Pass element as argument.
</script>
5 - You can check the complete list of options at the link provided at the top.
Hope this helped!