According to Google Analytics, I had 5 visits from zero unique visitors. Is that a bug, did I perhaps implement something incorrectly, or has the data processing simply not finished yet (I created this view 2 days ago)?
The view is based on an Include custom filter that is supposed to include only traffic from any of three IP addresses. The regex I used for this is:
62\.58\.32\.193|77\.172\.143\.12$|213\.125\.166\.98
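As a quick sanity check outside Analytics, the pattern can be exercised with Python's `re` module. One thing worth noticing: only the middle alternative carries a `$` anchor, so the other two will also match as substrings of longer strings (this may or may not matter for an IP-based filter, but it is easy to verify):

```python
import re

# The filter pattern exactly as entered in the Analytics view.
pattern = re.compile(r"62\.58\.32\.193|77\.172\.143\.12$|213\.125\.166\.98")

# All three intended addresses match:
for ip in ["62.58.32.193", "77.172.143.12", "213.125.166.98"]:
    print(ip, bool(pattern.search(ip)))        # True for each

# Because only the middle alternative is anchored with `$`, the other
# two also match inside longer strings:
print(bool(pattern.search("162.58.32.1934")))  # True
# The anchored one does not:
print(bool(pattern.search("77.172.143.120")))  # False
```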
My best guess would be the way Google defines unique visitors. Sometimes I have visited my own website periodically and ended up showing up as a unique visitor (my site isn't very popular, so it's easy for me to track that). So it either has to do with the nature of the visits or with the way unique visitors are actually counted. According to Google, this is how they find unique visitors:
The other Unique Visitors metric calculation (Calculation #2) is based
on the __utma cookie. Calculation #2 is used when segmenting the
Audience Overview report or when viewing Unique Visitors over any
dimension other than date. As such, Calculation #2 is used in custom
reports to allow for the calculation of Unique Visitors over any
dimension, such as browser, city, or traffic source.
source: https://support.google.com/analytics/answer/2992042?hl=en
Occasionally, there are problems with Google Analytics reporting. Check the product forums. For example, here is an issue that happened on Nov 11, 2013:
http://productforums.google.com/forum/#!topic/analytics/fsurDK8AOcY
This issue can also crop up when you are using the page dimension. Unique visitors are only assigned to the first page in a visit, as described here. But it doesn't seem like that is the case for you.
Finally, it's possible, analogous to the page dimension situation, that unique visitors are only assigned to the first IP address that a visitor came from. If that is true, and the people who came to your site had previously come from a different IP address, then they wouldn't show up as unique visitors in your filter.
Related
On May 15th, 2017, three metrics will be removed from the Reporting API of the Google Apps Admin SDK:
num_docs_internally_visible
num_docs_externally_visible
num_docs_shared_outside_domain
I use all of these metrics in a script that performs some audits of our G Suite domain.
The migration docs say to use num_owned_items_with_visibility_shared_externally_delta instead of num_docs_shared_outside_domain for instance, but I don't understand how a delta metric can be used as a replacement unless you keep track of the actual number from day zero on.
How do I get the number of externally shared documents as a total, not a delta value?
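To illustrate the concern: if the metric really were a per-day delta, recovering an absolute total would require a known baseline plus a running sum. A minimal sketch with made-up numbers:

```python
from itertools import accumulate

# Hypothetical per-day net changes in externally shared docs (made up):
daily_deltas = [3, 0, -1, 2, 5]

# Without a known starting count ("day zero"), deltas alone can't give
# an absolute total; with one, it's a cumulative sum:
baseline = 40                                    # audited count on day zero
running_totals = list(accumulate(daily_deltas, initial=baseline))[1:]
print(running_totals)                            # [43, 43, 42, 44, 49]
```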
Based on the documentation, num_owned_items_with_visibility_shared_externally_delta is the number of items in the user's domain account that are not public or visible to anyone with the link, but are shared explicitly with users or groups outside the domain, up to the date of the report. So you can use it to get the number of externally shared documents. The notice also states that there are differences in how these metrics are calculated, so review the notes as well as the Key Issues page for relevant details.
After creating goals and associating them with pages in Sitecore 7.2, I can't see the Goals Conversion report on the Executive Insight Dashboard. The other metrics are being filled in, but not Goals Conversion. Am I missing something?
If I query the Sitecore Analytics database, I can see records in the [Visitors] table with a value other than 0 in the Value field (I believe that is the value filled in by the configured goals), and I can also see the triggered goals in the [PageEvents] table.
One other thing: is it normal that, on each page request by the same user, the same goal is triggered and engagement value points keep accumulating?
One thing you could check is the MinimumVisitFilter setting in this file in your webroot:
\sitecore\shell\Applications\Reports\Dashboard\Configuration.config
By default this is set to 50 visits - you'll only get data in the dashboard if you get 50+ visits triggering the Goal.
As far as I'm aware, the engagement value points should accumulate in the scenario you describe, though I haven't tested this in 7.2.
I am using Amazon's Product Advertising API. When I search for products by keyword with the ItemSearch operation, I get only 10 results. Is there any way to get all results in one single call?
No - Paging Through Results explains some of the details:
It is possible to create a request that returns many thousands of
items in a response. This is problematic for several reasons.
Returning all of the item attributes for those items would
dramatically impact the performance of Product Advertising API in a
negative way. Also, posting a thousand responses on a web page is
impractical.
...
This example shows that 9729 items matched the search criteria. Also,
it shows that those results are on 973 (~9729/10) pages. You might try
putting in an ItemPage value over 10. If you do, Product Advertising
API returns the following error.
...
So, how do you get that 973rd page? You cannot. A better approach is
to submit a new request that is more targeted and yields fewer items
in the response.
I am planning to do a network analysis of the BMTC bus connectivity network, so I need to acquire data regarding bus routes. The best website, as far as I know, is
http://www.narasimhadatta.info/bmtc_query.html
Under the "search by route" option, the whole list of routes is given; one can select any of them and, on clicking "submit", it displays the detailed route. Previously, when acquiring data online, I relied on the fact that each item (in this case, a route number) led to a distinct URL, and I extracted the data from the source page using Python. But here, irrespective of the bus route, the final page always has the URL
http://www.narasimhadatta.info/cgi-bin/find.cgi
and its source page doesn't contain the route details!
I am only comfortable with Python and MATLAB, and I couldn't figure out any way to acquire data from that website. Since the data is visible in the browser, it should technically be possible to download it (at least that's what I believe). So can you please help me with code that crawls through each bus route number automatically and downloads the route details?
I looked at the URL you mentioned. If you have a list of route numbers, you can use the following URL structure to extract data:
http://www.narasimhadatta.info/cgi-bin/find.cgi?route=270S
or
http://www.narasimhadatta.info/cgi-bin/find.cgi?route=[route number from your list]
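Assuming you already have the list of route numbers, a minimal stdlib-only sketch that builds those URLs and downloads each result page (how you then parse the HTML is up to you):

```python
from urllib.parse import urlencode
from urllib.request import urlopen

BASE = "http://www.narasimhadatta.info/cgi-bin/find.cgi"

def route_url(route_number):
    """Build the query URL for one route number."""
    return BASE + "?" + urlencode({"route": route_number})

def fetch_route(route_number):
    """Download the raw HTML of one route's result page."""
    with urlopen(route_url(route_number), timeout=30) as resp:
        return resp.read().decode("utf-8", errors="replace")

# Example usage (performs real HTTP requests, one per route):
#   for route in ["270S", ...]:       # your list of route numbers
#       html = fetch_route(route)
#       ...extract the route details from `html`...
print(route_url("270S"))
```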
I'd like to enter a URL into Google and get all option chain data for a particular stock. Is there a guide that shows how to use it? For example, what if I wanted to grab all options that expire in the next year without knowing the individual expiration dates, or if I just wanted a particular strike price? I found another question that gives the basic outline but doesn't specify the details:
Finance historical options data (with strikes etc) on google finance API
Here is an example of how to do this using the ImportHTML function in a Google spreadsheet, although there is of course a delay in the data:
=importhtml("http://finance.yahoo.com/q/op?s=QQQ&m=2013-12","table",0)
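The same idea works in Python: `pandas.read_html` parses every `<table>` on a page and returns a list of DataFrames, so the trailing `0` in the formula corresponds to taking the first table. Demonstrated here on an inline snippet, since the Yahoo URL above may no longer serve that page:

```python
import io

import pandas as pd

# A tiny stand-in for an options table on the target page:
html = """
<table>
  <tr><th>Strike</th><th>Last</th></tr>
  <tr><td>85.0</td><td>1.23</td></tr>
  <tr><td>86.0</td><td>0.98</td></tr>
</table>
"""

tables = pd.read_html(io.StringIO(html))   # one DataFrame per <table>
first = tables[0]                          # the formula's index 0
print(first.shape)                         # (2, 2)
```

Against the live page you would pass the URL to `pd.read_html` instead of the inline string, with the same indexing.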