I have uploaded a number of jobs to CTS and am facing the issue that, for every job instance, the information listed in job['responsibilities'] is not found during searches.
Is there any way to add this field to the index?
It seems that the indexable fields are actually limited, and they are now also documented in the official docs.
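If you control the job payload, one possible workaround (a sketch only, unverified against the latest docs) is to duplicate the responsibilities text into a custom attribute, since custom attributes can be made keyword-searchable via the parent company's keywordSearchableJobCustomAttributes list in the v3 API; the attribute key below is made up:

    # A rough sketch, not a confirmed fix: copy the responsibilities text
    # into a custom attribute so keyword search can match it. The key
    # "responsibilities_text" is hypothetical, and CTS caps the size of
    # custom attribute string values, so long text may need splitting.
    job_body = {
        "title": "Car Mechanic",
        "description": "...",  # the description field IS indexed by default
        "responsibilities": "Diagnose and repair engines.",
        "customAttributes": {
            "responsibilities_text": {
                "stringValues": ["Diagnose and repair engines."],
                # keyword-searchable attributes reportedly must also be
                # filterable; verify against the current docs
                "filterable": True,
            }
        },
    }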
For further info on implementing Google CTS, feel free to ask me.
During the pandemic, around March 2020, Google started allowing business owners to tag their restaurants with the dining options they offer.
Here is an example of these tags: "Dine-in", "Takeout", "Delivery".
I was wondering if the Places API (or any other Google API) has the ability to return these dining types. I've checked the docs for the Places API, and it seems to only be capable of returning the business's business_status field, which only includes OPERATIONAL, CLOSED_TEMPORARILY, and CLOSED_PERMANENTLY, not the fields I am looking for.
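For reference, here is the sort of call I mean, as a minimal sketch using Python's requests library (the place_id is the sample one from Google's docs, and YOUR_API_KEY is a placeholder):

    import requests

    # Place Details request asking for the business_status field; this is
    # the closest thing to the dining tags that the API exposes today.
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/place/details/json",
        params={
            "place_id": "ChIJN1t_tDeuEmsRUsoyG83frY4",  # sample ID from the docs
            "fields": "name,business_status",
            "key": "YOUR_API_KEY",
        },
    )
    result = resp.json().get("result", {})
    # Prints OPERATIONAL / CLOSED_TEMPORARILY / CLOSED_PERMANENTLY; there is
    # no field for the dine-in/takeout/delivery tags.
    print(result.get("name"), result.get("business_status"))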
Would the only other way to obtain these tags be by web scraping a search result?
I just spoke with Google support and they said no. There is also no plan to add these data points to the JSON in the future.
They gave me some workaround ideas but nothing realistic. Let me know if you find a workaround! I am trying to gather this data as well.
I am building an issue tracker application similar to the one in the Oracle docs:
https://docs.oracle.com/cd/E11882_01/appdev.112/e11945/issue_track_obj.htm#BABIIBBF
In short, a dynamic approval workflow where multiple approvers can be added or deleted using a tree structure.
I am not asking anyone to design this for me, but any help, even a brief overview of how this can be done, or tips from someone who has done it, would be very valuable.
Vini,
From a high level, you would need to define a DEPT_APPROVERS table that lists the individual approvers for each department. Then you could send notifications to all of the approvers listed, based on which department is assigned to the person making the request.
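As a rough illustration of the idea (plain Python with an in-memory SQLite table standing in for your Oracle schema; the table and column names are placeholders):

    import sqlite3

    # Minimal in-memory demo of the DEPT_APPROVERS idea; in APEX this
    # table would live in your Oracle schema instead.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE dept_approvers (
        dept_id      INTEGER NOT NULL,
        approver     TEXT    NOT NULL,
        approval_seq INTEGER NOT NULL  -- position in the approval chain
    );
    INSERT INTO dept_approvers VALUES
        (10, 'alice@example.com', 1),
        (10, 'bob@example.com',   2),
        (20, 'carol@example.com', 1);
    """)

    def approvers_for(dept_id):
        """Return the approval chain for a department, in order."""
        rows = conn.execute(
            "SELECT approver FROM dept_approvers"
            " WHERE dept_id = ? ORDER BY approval_seq",
            (dept_id,),
        )
        return [r[0] for r in rows]

    # A request from someone in department 10 would notify, in order:
    print(approvers_for(10))  # ['alice@example.com', 'bob@example.com']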
Regards,
David
I have just recently started to work with Google Cloud and I am trying to wrap my head around some of its inner workings, mainly the audit logging part.
What I want to do is get the log activity from when my keys are used for anything, and also from when someone actually logs into the Google Cloud Console (it could be the Key Vault or the Key Ring, too).
I have been using PowerShell to extract these logs using gcloud logging read, and this is where I start to doubt whether I am looking in the right place. I will explain:
I have created new keys, I see this action in the Activity panel, and I can already extract it through gcloud logging read 'resource.type="cloudkms_cryptokey"'.
Although I have this information, I am rather curious whether this is the correct course of action here. I saw the CreateCryptoKey and SetIamPolicy methods in my logs, alright, but am I going to see all actions related to these keys? Reading the GCloud docs, I feel as though I am only getting some of the actions.
As I have said, I am trying to work my way through the GCloud documentation, but it is such an overwhelming amount of information that I am not really finding the answer I am looking for, which is why I thought of resorting to this community.
So, to summarize: am I getting all the information related to my keys the way I am doing it right now? And what about the people that have access to the Google Cloud Console page: is there a way to find out who accessed it, and which part (the Crypto Keys page or the Crypto Vault page, for example)? That's something I have not understood from the docs either, sadly. Perhaps someone could show me the proper page to reference for what I am looking for? The Cloud Audit Logging page doesn't feel totally clear to me on this front (and I assume I could be at fault here; these past weeks have been harsh!)
Thanks to anyone who takes some time to answer my question!
Admin activities such as creating a key or setting IAM policy are logged by default.
Data access activities, such as listing Cloud KMS resources (key rings, keys, etc.) or performing cryptographic operations (encryption, decryption, etc.), are not logged by default. You can enable data access logging via the steps at https://cloud.google.com/kms/docs/logging. I'm not sure if that is the topic you are referring to, or https://cloud.google.com/logging/docs/audit/.
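If it helps, here is a minimal sketch of pulling both kinds of audit entries with the google-cloud-logging Python client (the project ID is a placeholder, and data access entries only show up after you have enabled that logging):

    # A minimal sketch, assuming the google-cloud-logging client library is
    # installed and you are authenticated; "my-project" is a placeholder.
    from google.cloud import logging as gcp_logging

    client = gcp_logging.Client(project="my-project")

    # Admin activity (on by default) and data access (opt-in) audit logs
    # live under different log names; this filter catches both for
    # Cloud KMS key resources.
    log_filter = (
        'resource.type="cloudkms_cryptokey" AND '
        '(logName:"cloudaudit.googleapis.com%2Factivity" OR '
        'logName:"cloudaudit.googleapis.com%2Fdata_access")'
    )

    for entry in client.list_entries(filter_=log_filter, max_results=20):
        # For audit entries the payload is typically a dict holding the
        # AuditLog proto; methodName is e.g. CreateCryptoKey.
        print(entry.timestamp, entry.payload.get("methodName"))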
I operate a number of content websites that have several million user sessions and need a reliable way to monitor some real-time metrics on particular pieces of content (key metrics being: pageviews/unique pageviews over time, unique users, referrers).
The use case here is for the stats to be visible to authors/staff on the site, as well as to act as source data for real-time content popularity algorithms.
We already use Google Analytics, but this does not update quickly enough (4-24 hours depending on traffic volume). Google Analytics does offer a real-time reporting API, but this is currently in closed beta (I have requested access several times, but no joy yet).
New Relic appears to offer a few analytics products, but they are quite expensive ($149 per 500k pageviews, and we have several times that volume).
Other answers I found on Stack Overflow suggest building your own, but those are 3-5 years old now. Any ideas?
I've heard some good things about Woopra, and they offer 1.2m page views for the same price as New Relic.
https://www.woopra.com/pricing/
If that's too expensive, then the alternative is live-loading your logs and using an Elasticsearch service to read them to get the data you want, but you will need access to your logs while they are being written to.
A service like Loggly might suit you, which would enable you to "live tail" your logs (view them while they are being written), but again there is a cost to that.
Failing that, you could do something yourself, or get someone on Freelancer to knock something up for you that reads the logs and displays them in a format you recognise.
https://www.portent.com/blog/analytics/how-to-read-a-web-site-log-file.htm
If the metrics that you need to track are limited to the ones that you have listed (page views, unique users, referrers), you may think of collecting your web servers' logs and using a log analyzer.
There are several free tools available on the Internet to get real-time statistics out of those logs.
Take a look at www.elastic.co, for example.
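Before reaching for a full Elasticsearch setup, a do-it-yourself counter over a standard combined-format access log can already produce all three metrics. A toy Python sketch (the log format and file name are assumptions; adjust the regex to your servers' format):

    import re
    from collections import Counter

    # Toy parser for Apache/Nginx "combined" log lines; verify the field
    # layout against your own logs before relying on this.
    LINE = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
        r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
        r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
    )

    pageviews = Counter()
    referrers = Counter()
    unique_ips = set()

    with open("access.log") as f:
        for line in f:
            m = LINE.match(line)
            if not m or m["method"] != "GET":
                continue
            pageviews[m["path"]] += 1
            referrers[m["referrer"]] += 1
            unique_ips.add(m["ip"])

    print(pageviews.most_common(10))
    print(len(unique_ips), "unique visitors (by IP)")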
Hope this helps!
Google Analytics offers real-time data viewing now, if that's what you want:
https://support.google.com/analytics/answer/1638635?hl=en
I believe their API has now been released, as we are now looking at incorporating this!
If you have access to your web server logs, then you can set up Elasticsearch as the search engine, along with Logstash as the log parser and Kibana as the front-end tool for analyzing the data.
For more information, please go through the Elasticsearch site: www.elastic.co
I need to create a form which, on submission, filters search results based on certain keywords.
The keywords I'm working with are year, make, and model (it's a car part site). If I use Amazon Webstore's search function, make and model can be used as parameters only because most of the products have the make and model in the product name, which isn't really the most efficient approach, to say the least.
I need to be able to query the product database directly, but I assume that's just not how Amazon Webstore works.
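To make the goal concrete, the behaviour I want is roughly this (a toy Python sketch; the field names are invented, and as far as I know Webstore does not expose the product data this way):

    # Toy example: filter products on structured year/make/model fields
    # instead of substring-matching the product name.
    products = [
        {"name": "Brake Pad Set", "year": 2012, "make": "Honda", "model": "Civic"},
        {"name": "Oil Filter",    "year": 2012, "make": "Honda", "model": "Accord"},
        {"name": "Brake Pad Set", "year": 2015, "make": "Ford",  "model": "Focus"},
    ]

    def search(items, *, year=None, make=None, model=None):
        """Return items matching every filter that was supplied."""
        def ok(p):
            return ((year is None or p["year"] == year) and
                    (make is None or p["make"] == make) and
                    (model is None or p["model"] == model))
        return [p for p in items if ok(p)]

    print(search(products, year=2012, make="Honda"))  # both Honda parts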
Does anybody have an example of this that I can look at? Does anyone develop custom templates in Amazon Webstore so I can ask questions?
Start at the Amazon Webstore Forum, which has dedicated staff, documentation, helpful folks, and more solutions.