OpenStreetMap query for ATMs - web services

I'm trying to find a way to query OpenStreetMap for a list of ATMs in a particular city. Any ideas how to achieve this?
I know we can use MapQuest to query for this information, but it needs a bounding box and it gives me less information than OpenStreetMap.
Thanks.

I guess you meant MapQuest's Nominatim instance. Just for the record: Nominatim was actually created by the OpenStreetMap community; MapQuest just runs its own instance of it.
Querying for specific objects is best done by using the Overpass API. This API also has a nice frontend, overpass turbo. It makes creating queries and running them really easy and also includes a nice visualization of the result.
This query will retrieve all ATMs (which are tagged as amenity=atm) in Berlin:
<osm-script output="json" timeout="25">
<!-- fetch area “berlin” to search in -->
<id-query {{nominatimArea:berlin}} into="area"/>
<!-- gather results -->
<union>
<!-- query part for: “atm” -->
<query type="node">
<has-kv k="amenity" v="atm"/>
<area-query from="area"/>
</query>
</union>
<!-- print results -->
<print mode="body"/>
<recurse type="down"/>
<print mode="skeleton" order="quadtile"/>
</osm-script>
You can view the result on overpass turbo. This query was generated automatically by overpass turbo; I just had to type "atm in Berlin" into its wizard.
Note that this is a special overpass turbo query which cannot be run directly via the Overpass API. overpass turbo adds some extra keywords like {{nominatimArea:berlin}}, which are automatically resolved via Nominatim (here, to the area of Berlin). But if you already know the bbox, or want to retrieve it on your own using Nominatim, you can specify it directly:
<osm-script output="json" timeout="25">
<!-- gather results -->
<union>
<!-- query part for: “atm” -->
<query type="node">
<has-kv k="amenity" v="atm"/>
<bbox-query e="13.92242431640625" n="52.67221863915279" s="52.32778621884898" w="12.992706298828125"/>
</query>
</union>
<!-- print results -->
<print mode="body"/>
<recurse type="down"/>
<print mode="skeleton" order="quadtile"/>
</osm-script>
Both the Overpass API and overpass turbo support various output formats for the result, including XML and JSON.
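If you want to call the Overpass API from your own code rather than through overpass turbo, a plain HTTP POST to the interpreter endpoint is enough. Here is a minimal sketch in Python; the requests library, the public endpoint URL and the Overpass QL rewrite of the XML query are my assumptions, and the bounding box is the Berlin one from the query above.

import requests

# Overpass QL version of the query above: all nodes tagged amenity=atm
# inside the Berlin bounding box (south, west, north, east).
query = """
[out:json][timeout:25];
node["amenity"="atm"](52.32778621884898,12.992706298828125,52.67221863915279,13.92242431640625);
out body;
"""

response = requests.post("https://overpass-api.de/api/interpreter", data={"data": query})
response.raise_for_status()

for element in response.json().get("elements", []):
    tags = element.get("tags", {})
    print(element["id"], element["lat"], element["lon"], tags.get("operator", ""))

Each element in the JSON result is one ATM node with its coordinates and tags, which is usually all you need for a city-wide list.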


How can you delete Contacts that don't have any recent Cases using Bulk Deletion Jobs?

I'm trying to figure out the criteria for a Bulk Deletion Job in Dynamics 365 to enforce GDPR.
Essentially, I need to delete Contact records that haven't been used for more than 13 months.
The main part of the Criteria I've been using is where 'Modified On is Older Than 13 months'.
The problem with this is that the 'Modified On' field only updates when you change the details of the Contact record.
For example, a Contact might've been created in 2019 and the details have remained the same since then - therefore the Modified On date is 2019. However, this Contact only just emailed us last week and so their Contact is still in use but this isn't reflected in the Modified On date.
The criteria I have come up with to try and get around this is as follows:
Unfortunately, this is still returning Contacts that don't match the criteria I need.
Is it possible to get this criteria using the advanced find functionality, or will it require something external?
First of all, the usual recommendation is not to delete the records: you'll lose historic data and end up with records in the system that have broken relationships.
If you have already backed up historic data, or it does not have any value to the business, then unless someone can correct me, I think what you are trying to do won't work with Advanced Find. This is because of the limitation in Advanced Find with related entities and the fact that we cannot add conditions to the "Does not contain data" relationship option.
Your criteria is not giving the results you expect because it is requesting the following (assuming you have selected the Contact table in Advanced Find):
Select Contacts whose modifiedOn date is older than 13 months, that are linked as the customer of a Case whose modifiedOn date is older than 13 months, and where those Cases have a related email whose modifiedOn date is older than 13 months.
So what you are retrieving are old Contacts that had at least one Case some time ago, not Contacts with no recent activity (a Contact can have an old Case but may or may not also have a recent one).
Another thing to consider is that you are using the "Customer" column [Cases(Customer)] from the Case table. Depending on how your organization is handling this column, and since the Customer column can hold Accounts or Contacts, you might want to use "primarycontactid" [Cases(Contact)] or another custom column (I've seen some designs where a custom column is used to track the Contact).
Last year I had a request from an organization to automatically merge thousands of Contacts following some rules. What I ended up building was a Console Application, and one of the steps was to check whether the Contacts had any interactions (Leads, Opportunities, Cases and Activities) and count them; that way, when the merge was performed, I chose the Contact with more related records as the main Contact. A similar approach could be used in your scenario.
You can create an efficient query with QueryExpressions, but if you are not used to them you can use this FetchXML in the Console Application:
<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="true">
<entity name="contact">
<attribute name="fullname" />
<attribute name="telephone1" />
<attribute name="contactid" />
<order attribute="fullname" descending="false" />
<link-entity name="incident" from="primarycontactid" to="contactid" link-type="outer" alias="case">
<attribute name="incidentid" />
<filter type="and">
<condition attribute="modifiedon" operator="last-x-months" value="13" />
</filter>
</link-entity>
</entity>
</fetch>
It will give you a list of Contacts; if a Contact had a Case that was modified in the last 13 months you'll get a GUID in the case.incidentid column, and if case.incidentid is null it means there are no recent Cases and the Contact should be deleted.
Keep in mind that:
You might need to update the FetchXML to your needs.
You'll need to handle paging on the query results if there are more than 5,000 Contacts (see the sketch after this list).
Depending on the number of Contacts in the system you'll want to create different batches to process them because it can take a while to complete.
It would be a good idea to create a report to validate the Contacts before deleting them.
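On the paging point: if you go a scripted route, here is a rough sketch of paging the FetchXML above. It is written in Python against the Dataverse Web API rather than as a C# Console Application; the organization URL, API version and bearer-token handling are placeholders you will need to adapt, the exact JSON key of the aliased case.incidentid column depends on how the Web API surfaces link-entity aliases, and for very large result sets the paging-cookie approach is more efficient than the simple page counter shown here.

import requests
from urllib.parse import quote

# Placeholders: ORG_URL is your environment, TOKEN a valid OAuth bearer
# token for the Dataverse Web API.
ORG_URL = "https://yourorg.crm.dynamics.com"
TOKEN = "<access token>"
HEADERS = {"Authorization": "Bearer " + TOKEN, "Accept": "application/json"}

PAGE_SIZE = 5000

FETCH_TEMPLATE = """
<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="true" page="{page}" count="{count}">
  <entity name="contact">
    <attribute name="fullname" />
    <attribute name="contactid" />
    <link-entity name="incident" from="primarycontactid" to="contactid" link-type="outer" alias="case">
      <attribute name="incidentid" />
      <filter type="and">
        <condition attribute="modifiedon" operator="last-x-months" value="13" />
      </filter>
    </link-entity>
  </entity>
</fetch>
"""

def fetch_contacts():
    # Yields every Contact row; rows where the aliased case id column is
    # missing or empty have no Case modified in the last 13 months and are
    # the deletion candidates.
    page = 1
    while True:
        fetch_xml = FETCH_TEMPLATE.format(page=page, count=PAGE_SIZE)
        url = ORG_URL + "/api/data/v9.2/contacts?fetchXml=" + quote(fetch_xml)
        rows = requests.get(url, headers=HEADERS).json().get("value", [])
        for row in rows:
            yield row
        if len(rows) < PAGE_SIZE:
            break
        page += 1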

OBIEE: how to set 'is prompted' values in saved filter

My task is to automate testing of OBIEE report data. The main step is to get the report's logical SQL.
I have a dashboard with reports. Every report has a named filter (not an inline one) attached. So I'd like to find a way to set filter values, programmatically generate the report SQL (so that the WHERE clause is filled in with my values), run it and retrieve the data. I have tried the following approaches:
OBIEE web services. First I call generateReportSQL, then executeSQLQuery. This approach works fine for inline filters, which I managed to set up via filterExpressions. But I cannot get it working with saved filters. How do I generate the report with values set for the columns of the attached saved filter? I found no information in the documentation or on the internet.
Generate a dashboard URL with all prompts set, run it, and then read the usage tracking tables to retrieve the SQL queries. But this seems a rather roundabout approach; I believe there must be a simpler way to do the task. Moreover, usage tracking does not write report execution information to its DB immediately, there is some delay. Is there a way to avoid that?
runcat.sh + nqcmd - still, I have not found a way to set values for the saved filter.
So my question is: how do I generate the report's logical SQL with prompt values set for the attached saved filter?
Thanks in advance,
Jol
UPDATE
Some examples:
The XML of my usage tracking analysis contains the following:
<saw:filter>
<sawx:expr xsi:type="sawx:logical" op="and">
<sawx:expr xsi:type="sawx:special" op="prompted">
<sawx:expr xsi:type="sawx:sqlExpression">"S_NQ_ACCT"."START_DT"</sawx:expr>
</sawx:expr>
<sawx:expr xsi:type="sawx:special" op="prompted">
<sawx:expr xsi:type="sawx:sqlExpression">"S_NQ_ACCT"."USER_NAME"</sawx:expr>
</sawx:expr>
<sawx:expr xsi:type="sawx:special" op="prompted">
<sawx:expr xsi:type="sawx:sqlExpression">"S_NQ_ACCT"."SAW_DASHBOARD_PG"</sawx:expr>
</sawx:expr>
</sawx:expr>
</saw:filter>
I can use the filterExpressions tag of generateReportSQL to create logical SQL that includes my values in the WHERE clause. Everything is OK if the filter tag is included in the analysis's XML (the case of inline filters, as in the example above):
<v7:generateReportSQL>
<v7:reportRef>
<v7:reportPath>report path</v7:reportPath>
</v7:reportRef>
<v7:reportParams>
<!--Zero or more repetitions:-->
<v7:filterExpressions>
<![CDATA[<sawx:expr xsi:type="sawx:string" op="equal" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:saw="com.siebel.analytics.web/report/v1.1" xmlns:sawx="com.siebel.analytics.web/expression/v1.1" subjectArea="Usage Tracking">
<sawx:expr xsi:type="sawx:sqlExpression">"S_NQ_ACCT"."USER_NAME"</sawx:expr>
<sawx:expr xsi:type="sawx:string">testuser</sawx:expr></sawx:expr>
]]>
</v7:filterExpressions>
.............................
</v7:reportParams>
<v7:sessionID>...</v7:sessionID>
</v7:generateReportSQL>
The XML of my test analysis contains the following:
<saw:filter>
<sawx:expr xsi:type="sawx:savedFilter" path="/shared/myproject/_filters/myroject/my saved filter" name="my saved filter" /></saw:filter>
'my saved filter' has 'is prompted' columns that I'd like to set to my values and then run the analysis to get a dataset. But how do I do it?
If web services are useless here, what could be used instead?
Since those 'is prompted' filters are normally used for completely dynamic population of content (instantiated variables pulled from user profiles, values coming from prompts, etc.), you won't get them in the LSQL.
tl;dr - Robin wrote a nice post about load testing with LSQL: https://www.rittmanmead.com/blog/2014/03/built-in-obiee-load-testing-with-nqcmd/
Edit: Baseline Validation Tool (BVT) was proposed and is the answer.

How do I save the web service response to the same excel sheet I extracted the data from?

For example:
Using the given HP Flights sample SampleAppData.xls and the CreateFlightOrder operation, we can link the data to the test functions and get an OrderNumber and Price response from the web service. And in the SampleAppData.xls Input tab, we can see that there is an empty OrderNumber column.
So here is my question: is there any way I can take the OrderNumber response and fill the empty column in SampleAppData.xls?
The reason I want to do this is that I have many test cases to run, which will take days, and the result of today's test is needed as input for the next day's test.
I know the responses are saved in the results, but it defeats the point of automation if I have to check the response for each and every test case manually.
Yes of course you can. There are a number of ways to do this. The simplest is as follows.
'Datatable.Value("columnName","sheetName")="Value"
DataTable.Value("Result","Action1")="Pass"
Once you have recorded the results in the Datasheet, you can export them using
DataTable.ExportSheet("C:\SavePath\Results.xls")
You can write back the response programmatically if you have already imported the sheet manually.
You can use the GetDataSource class of the UFT API. It works like this: let's say you imported the Excel file FlightSampleData.xls and named it FlightSampleData, and it has an input sheet; accessing the sheet looks like this:
GetDataSource("FlightSampleData!input).Set(ROW,ColumnName,yourValue);
GetDataSource("FlightSampleData!input).Get(ROW,ColumnName);
For exporting you can use the ExportToExcelFile method of the GetDataSource class after your test run. Please let me know if you have any further questions about this.

How can I get stock option chains using YQL?

I am trying to get stock option chains from Yahoo using YQL. I have tried this command in the YQL console:
select * from yahoo.finance.options
However, I get this error XML:
<?xml version="1.0" encoding="UTF-8"?>
<error xmlns:yahoo="http://www.yahooapis.com/v1/base.rng" yahoo:lang="en-US">
<diagnostics>
<publiclyCallable>true</publiclyCallable>
</diagnostics>
<description>No definition found for Table yahoo.finance.options</description>
</error>
It looks like this table doesn't exist anymore. Does anyone know what the correct table is?
You have to provide at least one where clause to make this query work. Like this:
select * from yahoo.finance.options where symbol='MMM'
or,
select * from yahoo.finance.options where symbol='A'
or,
select * from yahoo.finance.options where symbol='YHOO'
All of the above queries work fine. If you want more specific data, you have to provide more conditions to filter your desired data out of the full result.
yahoo.finance.options is a community table. To read about community tables check here. From the link I just posted:
In order to use YQL with the community tables, you must pass in the datatables env file. You can do this on the YQL console as part of a YQL statement, or by passing in a query parameter.
The YQL console to test this can be found here
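To make that concrete, when calling the public YQL REST endpoint directly, the env file is passed as a query parameter roughly like this (the query and env values should be URL-encoded; the endpoint shown is the public one that existed at the time):
https://query.yahooapis.com/v1/public/yql?q=select * from yahoo.finance.options where symbol='MMM'&env=store://datatables.org/alltableswithkeys&format=json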
Something I just figured out today is that you can just use this to get the option data from Yahoo:
https://query2.finance.yahoo.com/v7/finance/options/SPXS?straddle=true
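Reading that endpoint from a script could look like the sketch below, in Python with the requests library. The JSON field names (optionChain, result, expirationDates, options, straddles) are what the endpoint returned at the time and are not guaranteed; Yahoo may also require a browser-like User-Agent or extra cookies.

import requests

# Unofficial Yahoo Finance options endpoint; layout and availability are
# not guaranteed.
url = "https://query2.finance.yahoo.com/v7/finance/options/SPXS"
resp = requests.get(url, params={"straddle": "true"},
                    headers={"User-Agent": "Mozilla/5.0"})
resp.raise_for_status()

result = resp.json()["optionChain"]["result"][0]
print("Expirations:", result["expirationDates"])

# With straddle=true each row pairs the call and the put at one strike.
for straddle in result["options"][0]["straddles"]:
    call = straddle.get("call", {})
    put = straddle.get("put", {})
    print(straddle.get("strike"), call.get("lastPrice"), put.get("lastPrice"))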
Since the middle of 2014 YQL has had this problem, so the option chain has to be scraped from Yahoo Finance instead, which can be done with Python or Matlab.

Sorting results in Advanced System Reporter in Sitecore

In Sitecore's Advanced System Reporter (v1.3) shared source module, is there an out-of-the-box way of sorting the results before the results are displayed to email/screen or will I need to implement something myself?
In a standard ASR install, I can see the Media Viewer viewer configuration item has a sort parameter in its attributes field, but it uses the ASR.Reports.Items.ItemViewer class which, after checking in Reflector, doesn't respect the sort parameter. I take this to mean that the class might have respected the sort parameter previously but doesn't now.
As a side thought, I would have thought that a Scanner class would be a much more logical place to put sorting logic than at the Viewer class level.
OK, I found the answer. The sort parameter I found is actually used when the report is run by the ASR module.
The sort parameter is set up in the attributes and is in the following format:
sort=ColumnName,ASC|DESC,[DateTime]
where ColumnName is the display name of the column, ASC or DESC is the sort direction (required), and DateTime should be specified if the column holds a date/time value.
Example:
Given the column formatting of
<Columns>
<Column name="item name">Item Name</Column>
<Column name="publish date">Publish Date</Column>
</Columns>
to sort by publish date descending, the appropriate sort parameter would be
sort=Publish Date,DESC,DateTime
and to sort by item name, the sort parameter would be
sort=Item Name,ASC
I'm not sure anyone can answer your question immediately, apart from probably the module author. But you have a huge advantage in this case - the module sources. Instead of browsing the assemblies with the Reflector, you can check out the latest sources and just debug it. One debug session can answer more questions than a bunch of SO posts. ;-)
Also, as a side note, you might have noticed the special Sitecore logos on that page - this blog post will tell you what they mean.