I'm writing a PowerShell script and am querying the local DNS resource for both CNAME and A records matching specific criteria (specifically against HP servers). Running with full administrator rights, using WQL against the root\MicrosoftDNS provider, I was presented with 0 records for the following:
select * from MicrosoftDNS_ResourceRecord where TextRepresentation like '%sql%'
However, negating a negation works.
select * from MicrosoftDNS_ResourceRecord where NOT(NOT(TextRepresentation like '%sql%'))
Why? Am I going insane?
The full query is:
select * from MicrosoftDNS_ResourceRecord where NOT (ContainerName like '..%' OR OwnerName like '%ilo%') AND (__CLASS = 'MicrosoftDNS_AType' OR __CLASS = 'MicrosoftDNS_CNAMEType') AND NOT(NOT(TextRepresentation like '%sql%'))
I have an iframe showing a Power BI embedded report with a list of Accounts.
I want to pass in an Account ID so that I only see the sales for my Account.
Inside the report I have a table, let's say it's called Query1, and inside that table I have a field called AccountID. I need to add something to my URL to filter on AccountID = 123.
My URL is something like this....
https://app.powerbi.com/reportEmbed?reportId=xxxxxxxxxxxx&autoAuth=true&ctid=xxxxxxxxxxx-xxxxxxxxx&config=eyJjbHVzdGVyVXJsIjoiaHR0cHM6Ly93YWJpLXdlc3QtZXVyb3BlLXJlZGlyZWN0LmFuYWx5c2lzLndpbmRvd3MubmV0LyJ9
What exactly should I add to filter the report by the AccountID?
You should add a URL parameter called filter. You need to specify the table and field you want to filter on, and add the filter value after eq. So your end result should be something like this:
URL?filter=Table/Accountid eq 123
Here's the Microsoft documentation about it: https://learn.microsoft.com/en-us/power-bi/collaborate-share/service-url-filters#query-string-parameter-syntax-for-filtering
Update: the part above, of course, works for filtering reports in the app or workspace itself. To filter an embedded report, you need to specify the page and the filter in a similar fashion in the embed link (https://learn.microsoft.com/en-us/power-bi/collaborate-share/service-embed-secure). So your link will be something like this:
https://app.powerbi.com/reportEmbed?blabla&pageName=Page1&$filter=Table/Accountid eq 123
I have a GCP-based environment. I use standard SQL scripting in BigQuery and federated queries to Cloud SQL MySQL. The federated query selects data from the Cloud SQL MySQL database, and I need that selection to depend on a condition based on data in BigQuery. I use variables in BigQuery standard SQL scripting to store the value that I select from BigQuery, and I want to use the value of this variable in the WHERE clause of the MySQL query. See the following example, where I select a date from BigQuery and store it in a variable "BQ_LAST_DATETIME".
DECLARE BQ_LAST_DATETIME DATETIME;
SET BQ_LAST_DATETIME = (select max(date_created) from bq_my_dataset.bq_my_table);
I am using a BigQuery federated query to read data out of the Cloud SQL database (https://cloud.google.com/bigquery/docs/cloud-sql-federated-queries) as shown below, and I want to use the value that I stored in the variable "BQ_LAST_DATETIME" in the MySQL query's WHERE clause:
SELECT * FROM EXTERNAL_QUERY("my-gcp-project.my-region.my-connection2-cloudsql", "select * from mysqlschema.mysql_table where date_created = #BQ_LAST_DATETIME;" );
Please note that in the above query I have used "#BQ_LAST_DATETIME" as a placeholder to show what I want to achieve. I am not sure whether I can directly use a BigQuery scripting variable as a query parameter in the "external" query part of the federated query.
Any suggestions on how to parametrize the external query in a federated query, or on how I could otherwise achieve the effect I intend?
I actually tried the following, as shown below: I used the BigQuery scripting variable as a query parameter in the "external" query part of the federated query. The only nuance is that, since I was dealing with dates, I performed a cast, and because the date variable is effectively treated as a string I converted it back to a date using MySQL's STR_TO_DATE, as follows:
DECLARE BQ_LAST_DATETIME DATETIME;
DECLARE BQ_LAST_DATE DATE;
SET BQ_LAST_DATETIME = (select max(date_created) from bq_my_dataset.bq_my_table);
SET BQ_LAST_DATE = CAST(BQ_LAST_DATETIME AS DATE);
SELECT * FROM EXTERNAL_QUERY("my-gcp-project.my-region.my-connection2-cloudsql", "select * from mysqlschema.mysql_table where date_created = STR_TO_DATE(#BQ_LAST_DATE,'%Y-%m-%d');" );
While this query is accepted by the parser, it is NOT giving the expected result.
Basically, the value of the variable #BQ_LAST_DATE does not seem to make it into the MySQL query as expected.
Does anyone know what I am missing?
Thanks a lot for your help
You can try EXECUTE IMMEDIATE:
DECLARE BQ_LAST_DATETIME STRING;
DECLARE DSQL STRING;
SET BQ_LAST_DATETIME = 'SELECT max(date_created) from bq_my_dataset.bq_my_table';
SET DSQL = '"select * from mysqlschema.mysql_table where date_created = (' || BQ_LAST_DATETIME || ')"';
EXECUTE IMMEDIATE 'SELECT * FROM EXTERNAL_QUERY("my-gcp-project.my-region.my-connection2-cloudsql",' || DSQL || ');'
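If the MAX(date_created) lookup needs to run in BigQuery itself rather than being pushed down to MySQL, another option is to evaluate it first and splice the resulting value into the dynamic statement with FORMAT. This is only a minimal sketch, reusing the dataset, connection and column names from the question:
DECLARE BQ_LAST_DATE DATE;
SET BQ_LAST_DATE = (SELECT CAST(MAX(date_created) AS DATE) FROM bq_my_dataset.bq_my_table);
-- %t renders the DATE as YYYY-MM-DD, so MySQL just receives a plain string literal to compare against
EXECUTE IMMEDIATE FORMAT("""
SELECT * FROM EXTERNAL_QUERY(
  "my-gcp-project.my-region.my-connection2-cloudsql",
  "select * from mysqlschema.mysql_table where date_created = '%t'")
""", BQ_LAST_DATE);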
I believe I've done everything right when creating my graphite DB. Grafana can see the data but won't let me select all the fields when I try to "Add Query".
Output from my server shows that the DB is working:
show measurements
name: measurements
name
PORT
select * from "PORT"
name: PORT
time CardNo Counter Nodename PortNo value
---- ------ ------- -------- ------ -----
1511214407000000000 18 bcast_inpackets ALPRGAGQPN2 1 500
However, when I try to "Add Query" in Grafana, I can see PORT in "FROM" (which is what I want), but in the "WHERE" section, when I try to narrow my selection using CardNo, Counter, etc., it appears to behave randomly. If I select CardNo first, it will let me select 18, but then clicking "+" to add another criterion doesn't display the option for, say, "PortNo" (all I get is an empty dialog box). I can enter the field value manually (e.g. PortNo), but other users will be plotting graphs and won't necessarily know the underlying schema. Also, if I select Nodename first, then I can select CardNo (weird). I'd like the end user to be able to specify ALL the fields (in this case CardNo, Counter, Nodename and PortNo).
My graphite template is this:
"[[graphite]]
# Determines whether the graphite endpoint is enabled.
enabled = true
database = "graphite"
# retention-policy = ""
bind-address = ":2003"
protocol = "tcp"
# consistency-level = "one"
templates = [ "ASR.PORT.* .measurement.Nodename.CardNo.PortNo.Counter"
]
and the data I feed to InfluxDB to test my setup is:
echo "ASR.PORT.ALPRGAGQPN2.18.1.bcast_inpackets 500 `date +%s`" | nc localhost 2003
Firstly, the template is better written as:
"ASR.PORT.* .measurement.Nodename.CardNo.PortNo.field"
This makes bcast_inpackets, and any other value after PortNo, a field containing the data. It reduces series cardinality, which improves performance and scalability, by combining all counters into multiple fields on the same series rather than separate series with unique tags, each with its own value field.
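For instance, with the revised template the test point from the question would land on a single series with bcast_inpackets as a field rather than a Counter tag plus a value field; roughly (assuming the same test write as above):
select * from "PORT"
name: PORT
time                CardNo Nodename    PortNo bcast_inpackets
----                ------ --------    ------ ---------------
1511214407000000000 18     ALPRGAGQPN2 1      500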
Grafana's InfluxDB query builder filters tag values by the values of the tags already selected. In other words, if you select PortNo=1 and try to select another tag, only tag keys present where PortNo=1 will be shown.
If you look at the queries Grafana runs in the browser, you will see something like show tag keys from PORT where PortNo='1' when PortNo=1 is already selected, and different queries for other tags.
This is why you may not see other tags, and why which tags you see depends on the tags already selected. This is by design, so if you want something different you will need to adjust the schema by, for example, making PortNo and CardNo into fields instead of tags.
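For example, the queries the builder sends before and after a tag is picked look roughly like this (tag names taken from the schema above):
show tag keys from "PORT"
show tag keys from "PORT" where "PortNo" = '1'
The second only lists tag keys found on series where PortNo is 1, which is why some tags can appear to be missing.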
You might also be interested in InfluxGraph, which can query InfluxDB via the Graphite API and also supports the same template configuration as InfluxDB.
So last week I was able to begin streaming my App Engine logs into BigQuery, and I am now attempting to pull some data out of the log entries into a table.
The data in protoPayload.resource is the page requested, with the query string parameters included.
The contents of protoPayload.resource looks like the following examples:
/service.html?device_ID=123456
/service.html?v=2&device_ID=78ec9b4a56
I am getting close, but when there is another parameter before device_ID, I am not getting it. As you can see I am not great with regex, but it is the only way I think I can parse the data in the query. To get just the device ID from the first example, I was able to use the following query, which works great. My next challenge is getting the data when the second parameter exists. The device IDs can vary in length from about 10 to 26 characters.
SELECT
RIGHT(Regexp_extract(protoPayload.resource,r'[\?&]([^&]+)'),
length(Regexp_extract(protoPayload.resource,r'[\?&]([^&]+)'))-10) as Device_ID
FROM logs
What I would like is just the values of the query string parameter device_ID, such as:
123456
78ec9b4a56
Assuming you have just one query string per record, you can do this:
SELECT REGEXP_EXTRACT(protoPayload.resource, r'device_ID=(.*)$') as device_id FROM mytable
The part within the parentheses will be captured and returned in the result.
If device_ID isn't guaranteed to be the last parameter in the string, then use something like this:
SELECT REGEXP_EXTRACT(protoPayload.resource, r'device_ID=([^\&]*)') as device_id FROM mytable
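If you want to sanity-check the pattern outside your logs table, here is a quick standard SQL sketch over the two sample values from the question (the literals are just test data):
SELECT REGEXP_EXTRACT(resource, r'device_ID=([^&]*)') AS device_id
FROM UNNEST(['/service.html?device_ID=123456',
             '/service.html?v=2&device_ID=78ec9b4a56']) AS resource;
which returns 123456 and 78ec9b4a56.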
One approach is to split protoPayload.resource into multiple service entries and then apply the regexp - this way it will support an arbitrary number of device_IDs, i.e.
select regexp_extract(service_entry, r'device_ID=(.*$)') from
(select split(protoPayload.resource, ' ') service_entry from
(select
'/service.html?device_ID=123456 /service.html?v=2&device_ID=78ec9b4a56'
as protoPayload.resource))
class Log(models.Model):
    project = models.ForeignKey(Project)
    msg = models.CharField(...)
    date = models.DateField(...)
I want to select the four most recent Log entries, where each Log entry must have a unique project foreign key. I've tried the solutions from a Google search but none of them work, and the Django documentation isn't very good for lookups.
I tried stuff like:
Log.objects.all().distinct('project')[:4]
Log.objects.values('project').distinct()[:4]
Log.objects.values_list('project').distinct('project')[:4]
But these either return nothing or return Log entries of the same project.
Any help would be appreciated!
Queries don't work like that - either in Django's ORM or in the underlying SQL. If you want to get unique IDs, you can only query for the ID. So you'll need to do two queries to get the actual Log entries. Something like:
id_list = Log.objects.order_by('-date').values_list('project_id', flat=True).distinct()[:4]
entries = Log.objects.filter(project_id__in=id_list)
Actually, you can get the project_ids in SQL. Assuming that you want the unique project ids for the four projects with the latest log entries, the SQL would look like this:
SELECT project_id, max(logs.date) as max_date
FROM logs
GROUP BY project_id
ORDER BY max_date DESC LIMIT 4;
Now, you actually want all of the log information. In PostgreSQL 8.4 and later you can use windowing functions, but that doesn't work on other versions/databases, so I'll do it the more complex way:
SELECT logs.*
FROM logs JOIN (
SELECT project_id, max(logs.date) as max_date
FROM logs
GROUP BY project_id
ORDER BY max_date DESC LIMIT 4 ) as latest
ON logs.project_id = latest.project_id
AND logs.date = latest.max_date;
Now, if you have access to windowing functions, it's a bit neater (I think anyway), and certainly faster to execute:
SELECT * FROM (
  SELECT logs.field1, logs.field2, logs.field3, logs.date,
    rank() over ( partition by project_id
                  order by "date" DESC ) as dateorder
  FROM logs ) as logsort
WHERE dateorder = 1
ORDER BY logsort.date DESC LIMIT 4;
OK, maybe it's not easier to understand, but take my word for it, it runs worlds faster on a large database.
I'm not entirely sure how that translates to object syntax, though, or even if it does. Also, if you wanted to get other project data, you'd need to join against the projects table.
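For example, joining in project data on top of the window-function version might look roughly like this (the projects table and its name column are assumed here, since the Project model's fields weren't shown):
SELECT logsort.*, projects.name
FROM (
  SELECT logs.*,
    rank() over ( partition by project_id
                  order by "date" DESC ) as dateorder
  FROM logs ) as logsort
JOIN projects ON projects.id = logsort.project_id
WHERE dateorder = 1
ORDER BY logsort.date DESC LIMIT 4;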
I know this is an old post, but in Django 2.0, I think you could just use:
Log.objects.values('project').distinct().order_by('project')[:4]
You need two querysets. The good thing is it still results in a single trip to the database (though there is a subquery involved).
from django.db.models import Max

latest_ids_per_project = Log.objects.values_list(
    'project').annotate(latest=Max('date')).order_by(
    '-latest').values_list('project')

log_objects = Log.objects.filter(
    id__in=latest_ids_per_project[:4]).order_by('-date')
This looks a bit convoluted, but it actually results in a surprisingly compact query:
SELECT "log"."id",
"log"."project_id",
"log"."msg"
"log"."date"
FROM "log"
WHERE "log"."id" IN
(SELECT U0."id"
FROM "log" U0
GROUP BY U0."project_id"
ORDER BY MAX(U0."date") DESC
LIMIT 4)
ORDER BY "log"."date" DESC