I have a column of 12,000 VAT numbers, and I add a custom column with code to get data from a web API. I want it to pause for 1 second after every 100th row. How do I manage this?
Current code:
Function.InvokeAfter(
    () => Json.Document(
        Web.Contents(
            "https://data.brreg.no/enhetsregisteret/api/enheter/" & Number.ToText([Kunde ID]),
            [Headers = [Accept = "application/json"]])),
    #duration(0, 0, 0, 1))
See this
https://community.powerbi.com/t5/Desktop/Query-to-Scrap-URLs-with-a-pause-every-xx-URLs/td-p/352809
which basically involves a wait function that you call on every 100th row before hitting the API
let
    Wait = (seconds as number, action as function) =>
        if List.Count(
            List.Generate(
                () => DateTimeZone.LocalNow() + #duration(0, 0, 0, seconds),
                (x) => DateTimeZone.LocalNow() < x,
                (x) => x)) = 0
        then null
        else action()
in
    Wait
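For readers more comfortable outside M, the throttling idea boils down to "sleep after every Nth call". A minimal Python sketch of the same pattern, under the assumption that some fetch callable stands in for the Web.Contents request (the function and parameter names are mine, not part of the original query):

```python
import time

def fetch_all(items, fetch, batch_size=100, pause_seconds=1.0):
    """Call fetch() on every item, sleeping after each full batch
    so the API gets a pause every batch_size requests."""
    results = []
    for index, item in enumerate(items, start=1):
        results.append(fetch(item))
        if index % batch_size == 0:
            time.sleep(pause_seconds)
    return results
```

The M version inverts this: instead of sleeping after a batch, Function.InvokeAfter delays each individual call, so the Wait helper is invoked conditionally on the row index.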
Related
I'm new to Informatica.
For example: this is a flat-file-to-flat-file load.
I have an expression that has produced the data in the sample below:
Some CUST values have one entry with the N flag, and some have two entries, with N and Y.
I need only the (1, N) or (2, Y) occurrence on the target table, as stated below. Please let me know how to do this in Informatica.
Source
CUST-111|N|1
CUST-222|N|1
CUST-222|Y|2
CUST-333|N|1
CUST-444|N|1
CUST-555|N|1
CUST-555|Y|2
CUST-666|N|1
CUST-666|Y|2
Target:
CUST-111|N|1
CUST-222|Y|2
CUST-333|N|1
CUST-444|N|1
CUST-555|Y|2
CUST-666|Y|2
Thanks a lot guys
You can first calculate the count per customer. Then, if the count = 1 and flag = N, pass the record to the target; if the count > 1, pass only the record with flag = Y to the target.
Steps below -
Sort the data by customer ID (CID).
Use an Aggregator to calculate the count.
Use CUST_ID as the group-by port and create a new output port:
out_FLAG_CNT = COUNT(CUST_ID)
Use a Joiner to join the output of step 2 with step 1. The join condition is the customer ID.
Then use a Filter with the condition below:
IIF(out_FLAG_CNT > 1 AND FLAG = 'Y', TRUE, IIF(out_FLAG_CNT = 1 AND FLAG = 'N', TRUE, FALSE))
Finally, link this data to the target.
                             |--> AGG (count by CID) --|
SQ --> SRT (sort by CID) --> |                         |--> JNR (on CID) --> FIL (condition above) --> Target
                             |------------------------>|
Please note: if you have more than one N or more than one Y record per customer, the above will not work and you will need to attach another Aggregator at the end.
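The filter rule is easy to sanity-check outside Informatica. A small Python sketch of the same logic, run against the sample rows from the question (the function name is mine):

```python
from collections import Counter

def filter_records(rows):
    """Keep the N record for customers with a single row,
    and the Y record for customers that have both N and Y."""
    counts = Counter(cust for cust, flag, seq in rows)
    return [(cust, flag, seq) for cust, flag, seq in rows
            if (counts[cust] == 1 and flag == 'N')
            or (counts[cust] > 1 and flag == 'Y')]

source = [('CUST-111', 'N', 1), ('CUST-222', 'N', 1), ('CUST-222', 'Y', 2),
          ('CUST-333', 'N', 1), ('CUST-444', 'N', 1), ('CUST-555', 'N', 1),
          ('CUST-555', 'Y', 2), ('CUST-666', 'N', 1), ('CUST-666', 'Y', 2)]
```

Applied to `source`, this yields exactly the target rows listed in the question.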
I have one sheet with data on my Facebook ads. I have another sheet with data on the products in my store. I'm having trouble with some COUNTIFS where I'm counting how many times my product ID exists in a row that contains multiple numbers. They are formatted like this: /2032/2034/2040/1/
It's easy on the rows where only one product ID exists, but some rows have multiple IDs separated by a /. I need to see whether the ID exists as an exact match, alone or somewhere between the /'s.
Rows with facebook ads data:
A1: /2032/2034/2040/1/
A2: /1548/84/2154/2001/
A3: /2032/1689/1840/2548/
Row with product data:
B1: 2034
C1: I need a COUNTIFS here that checks how many times B1 exists in column A. Let's say I have thousands of rows with different variations of A1 where B1 could stand alone. How do I count this? I always need exact matches.
You can compare the number you want (56) with the regex #MonkeyZeus commented, with a little change -> "(?:^|/)"&B1&"(?:/|$)" so the end result is:
=IF(REGEXMATCH(A1, "(?:^|/)"&B1&"(?:/|$)"), true, false)
UPDATE
If you need to count the total of 56 across X rows, you can change the "True / False" of the condition to "1 / 0" and then do a =SUM(C1:C5) on the last row:
=IF(REGEXMATCH(A1, "(?:^|/)"&B1&"(?:/|$)"), 1, 0)
UPDATE 2
Thanks for contributing. Unfortunately I'm not able to do it this way,
since I have loads of data to do this on. Is there a way to do it with
a COUNTIF in a single cell, without adding an extra step with "sum"?
In that case you can do:
=COUNTA(FILTER(A:A, REGEXMATCH(A:A, "(?:^|/)"&B2&"(?:/|$)")))
UPDATE 3
With the following condition you check every single possibility just by adding another COUNTIF:
=COUNTIF(A:A,B1) + COUNTIF(A:A, "*/"&B1) + COUNTIF(A:A, B1&"/*") + COUNTIF(A:A, "*/"&B1&"/*")
Hope this helps!
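The regex approach is portable beyond Sheets; here is a quick Python check of the same token pattern against the sample cells from the question (the helper name is mine, and `re.escape` guards against IDs containing regex metacharacters):

```python
import re

def count_id(cells, product_id):
    """Count cells whose /-delimited list contains product_id
    as an exact token, never as a substring of a longer ID."""
    pattern = re.compile(r'(?:^|/)' + re.escape(str(product_id)) + r'(?:/|$)')
    return sum(1 for cell in cells if pattern.search(cell))

cells = ['/2032/2034/2040/1/', '/1548/84/2154/2001/', '/2032/1689/1840/2548/']
```

Note that a plain substring search would wrongly count 203 inside /2032/; anchoring on the slashes is what makes the match exact.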
try:
=COUNTIF(SPLIT(A1, "/"), B1)
UPDATE:
=ARRAYFORMULA(IF(A2<>"", {
SUM(IF((REGEXMATCH(""&DATA!C:C, ""&A2))*(DATA!B:B="carousel"), 1, )),
SUM(IF((REGEXMATCH(""&DATA!C:C, ""&A2))*(DATA!B:B="imagepost"), 1, ))}, ))
I am writing a pattern in Siddhi CEP like -
"from every e1=inputStream[(e1.name == 'A')]<2> -> e2=inputStream[(e2.name == 'B')<2:> within 10 seconds select 'rule1' as ruleId insert into outputStream"
The above query only executes once when I pass AABB into the stream, and then nothing happened after that. Even I pass AABB again.
When I write this -
"from every e1=inputStream[(e1.name == 'A')] -> e2=inputStream[(e2.name == 'B')<2:> within 10 seconds select 'ruel2' as ruleId insert into outputStream"
The above query works well for each ABB pattern.
Is there a way to match AABB and then have the pattern start over on the next incoming AABB once the window time expires?
I have an actor which receives WeatherConditions and pushes it (by using OfferAsync) to a source. Currently it is set up to run for each item it receives (it stores each one to the db).
public class StoreConditionsActor : ReceiveActor
{
    public StoreConditionsActor(ITemperatureDataProvider temperatureDataProvider)
    {
        var materializer = Context.Materializer();
        var source = Source.Queue<WeatherConditions>(10, OverflowStrategy.DropTail);
        var graph = source
            .To(Sink.ForEach<WeatherConditions>(conditions => temperatureDataProvider.Store(conditions)))
            .Run(materializer);

        Receive<WeatherConditions>(i =>
        {
            graph.OfferAsync(i);
        });
    }
}
What I would like to achieve is:
Run it only once every N minutes and store the average value of the WeatherConditions from all items received in that N-minute time window.
If a received item matches a certain condition (i.e. its temperature is 30% higher than the previous item's temperature), process it immediately despite being "hidden" in the time window.
I've been trying ConflateWithSeed, Buffer, and Throttle, but none of them seems to work (I'm a newbie in Akka / Akka Streams, so I may be missing something basic).
This answer uses Akka Streams and Scala, but perhaps it will inspire your Akka.NET solution.
The groupedWithin method could meet your first requirement:
val queue =
Source.queue[Int](10, OverflowStrategy.dropTail)
.groupedWithin(10, 1 second)
.map(group => group.sum / group.size)
.toMat(Sink.foreach(println))(Keep.left)
.run()
Source(1 to 10000)
.throttle(10, 1 second)
.mapAsync(1)(queue.offer(_))
.runWith(Sink.ignore)
In the above example, up to 10 integers per second are offered to the SourceQueue, which groups the incoming elements in one-second bundles and calculates the respective averages of each bundle.
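The per-group averaging that groupedWithin performs can be sketched in plain Python, with fixed-size batches standing in for the one-second time windows (the function name and batching-by-count simplification are mine):

```python
def batch_averages(samples, batch_size):
    """Average each fixed-size batch of samples, mimicking the
    per-group mean that groupedWithin + map compute per window."""
    batches = (samples[i:i + batch_size] for i in range(0, len(samples), batch_size))
    return [sum(batch) / len(batch) for batch in batches]
```

In the real stream, groupedWithin closes a group on whichever comes first, the element count or the time limit, which the sketch does not model.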
As for your second requirement, you could use sliding to compare an element with the previous element. The below example passes an element downstream only if it is at least 30% greater than the previous element:
val source: Source[Int, _] = ???
source
.sliding(2, 1)
.collect {
case Seq(a, b) if b >= 1.3 * a => b
}
.runForeach(println)
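The sliding comparison translates directly as well; a Python sketch of the 30% rule (the function name is mine):

```python
def spikes(values, factor=1.3):
    """Return each value that is at least `factor` times its
    immediate predecessor -- a sliding window of size 2, step 1."""
    return [b for a, b in zip(values, values[1:]) if b >= factor * a]
```

Like sliding(2, 1), this drops the very first element from consideration, since it has no predecessor to compare against.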
I'm trying to see if there is a way to do pagination with Doctrine2 without writing custom DQL. I've found that the findBy() function returns the result set that I want, however I'm missing one piece of information to properly paginate on the UI, namely the total number of records that the result could have returned.
I'm really hoping that this is possible, since it's a "simple" one liner.
$transactions = \Icarus\Entity\ServicePlan\Transaction::getRepository()->findBy(array('user' => $userId, 'device' => $device), null, $transactionsPerPage, $currentPage);
Does anyone know how/if I can get this information from the findBy() function?
Short answer: no. You're essentially running this query:
SELECT * FROM transaction WHERE user = $userId AND device = "$device" LIMIT $currentPage, $transactionPerPage;
By specifying a limit, the query only returns the rows from your offset up to that limit. So if $transactionsPerPage = 10, the query will return at most 10 rows.
Assuming the total count is somewhat static, I would suggest running a count of the total matching rows on the first page request and caching that result (or storing it in the session), so you only need to grab the total count once.
edit: Example of count query, using just normal php sessions:
if ( !isset( $_SESSION['transactionCount'] ) )
{
    $transactionCount = $em->createQuery('SELECT COUNT(t) FROM \Icarus\Entity\ServicePlan\Transaction t WHERE t.user = ?1 AND t.device = ?2')
        ->setParameters( array( 1 => $userId, 2 => $device ) )
        ->getSingleScalarResult();
    $_SESSION['transactionCount'] = $transactionCount;
}
edit2: If you really don't want to use DQL, you can run your findBy() without the offset and limit, and call count() on the results:
$transactions = \Icarus\Entity\ServicePlan\Transaction::getRepository()->findBy( array( 'user' => $userId, 'device' => $device ) );
$totalTransactions = count( $transactions );
But the performance of this won't be as good, because you are actually fetching all the objects.
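Whichever way you obtain the total count, the pagination arithmetic is the same. A small language-neutral sketch in Python of the offset and page-count math (1-based page numbers assumed; the function name is mine):

```python
def page_window(total_rows, per_page, page):
    """Return (offset, limit, total_pages) for a 1-based page number,
    i.e. the values you would feed to setFirstResult/setMaxResults."""
    offset = (page - 1) * per_page
    total_pages = -(-total_rows // per_page)  # ceiling division
    return offset, per_page, total_pages
```

This is also why passing the raw page number as the offset (as in the findBy() call in the question) skips the wrong rows: the offset must be scaled by the page size.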
Did you try this?
$queryBuilder->select('p.id, p.name')
    ->from('\Api\Project\Entity\Project', 'p')
    ->where('p.status = :status')
    ->setParameter('status', 1)
    ->orderBy('p.createdDate', 'asc')
    ->setFirstResult(($page - 1) * $limit) // setFirstResult expects a row offset, not a page number
    ->setMaxResults($limit);