I am trying to implement a GCP FHIR store.
As the GCP documentation says, I should be able to see the data when I click a resource type.
However, the pagination section at the bottom shows that 18 records exist, but the data does not show.
I am able to click the small empty space of a row and it shows the overview, but the Elements and JSON tabs of the data are not available.
Since I cannot embed an image, I will share a link to one.
Thanks in advance.
As suggested in the comments, this seems like a bug on GCP's side.
You can report it on the Public Issue Tracker, including the symptoms and reproduction steps.
The Google Cloud Console team has identified a bug in the FHIR viewer in which certain types of data would not render under certain circumstances pertaining to dates.
A fix has been submitted and should be included in an upcoming release, usually deployed within a week.
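In the meantime, one way to confirm that the resources really are in the store, independent of the console viewer, is to query the FHIR store's REST API directly. Below is only a minimal sketch using Python with google-auth and requests; the project, location, dataset, store, and resource type are placeholders you would swap for your own:

```python
import google.auth
from google.auth.transport.requests import AuthorizedSession

# Placeholders -- substitute your own project, location, dataset, and FHIR store.
project, location, dataset, store = "my-project", "us-central1", "my-dataset", "my-fhir-store"
base = (
    "https://healthcare.googleapis.com/v1/"
    f"projects/{project}/locations/{location}/"
    f"datasets/{dataset}/fhirStores/{store}/fhir"
)

# Uses Application Default Credentials (e.g. `gcloud auth application-default login`).
credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

# Search for resources of one type (Patient here, purely as an example); the
# response is a FHIR Bundle, so the entry count should match what the console reports.
response = session.get(f"{base}/Patient")
response.raise_for_status()
bundle = response.json()
print("total:", bundle.get("total"), "entries returned:", len(bundle.get("entry", [])))
```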
I am trying to wrap my head around Microsoft Cloud for Sustainability. Apparently it's a solution based on Microsoft Dynamics. I need more back-end access to that solution, because as it stands I'm either lacking permissions (or extra paid access to Microsoft resources) or missing a chunk of documentation, because I'm unable to:
Change the default language across the board - I can switch MS Dynamics to any language I want, but that only works for the shell. Anything that's CfS-specific is in English. Do I remove the demo data and import my own scopes and data? Since the only things available are the database, the Cube for BI analytics, and JSON files describing the CfS structure in general (in CDM), do I really have to create it from scratch? This brings me to the second question:
Access entry-level data that's already in the demo version - I need to see what's in the database CfS is using, or be able to modify it. Is there any way to get to it via Business Central, if that's possible at all?
Since I will be preparing several presentations for potential customers, I need a way to quickly create a dataset based on the initial, very basic information provided by each customer. How can I do that with a trial user?
I work for a company that's a Microsoft Certified Partner, so logically the resources I need should be available to me, but the links in the documentation are either dead (and some are, as they redirect to general info) or require some special access level (or are dead, but the error message is really not helpful at all).
Is there somewhere else I can go? The Documentation page offers little towards what I need...
P.S. I think I should tag this question with CfS-specific tags, but I don't have enough rep...
I have a Books API project, and the GCP console shows "No data is available for the selected time frame" for the last 30 days. This message appears on both the "Metrics" and "Quotas" pages. See the screenshots below.
Clearly there is data, which I can see via my app analytics reports.
Any suggestions on how to fix it?
UPDATE 1:
Here are some points that were missing from the original post:
The Google Books API is used by an iOS app, which is available on the App Store and widely used across many iOS devices (iPhones and iPads) in many countries.
There are thousands of iOS devices running my app so the Google Books API calls are invoked from thousands of endpoints with different locations and different IPs. All endpoints are using the same API_KEY.
The Google Books API calls are performed successfully from the iOS devices and there is no API issue (I can clearly see that using an analytics tool).
The only issue I have is with the GCP console not showing the number of API calls (and other metrics) associated with my API_KEY. As you can see in the previous screenshots, I get "No data is available for the selected time frame" everywhere.
This is a regression, since until recently I could successfully view the actual API usage data. I didn't change anything during this period.
When going to GCP > IAM & Admin > Quotas, you can clearly see that the app indeed consumes API calls (see screenshot below).
Any suggestions as to why the GCP console would say that no data is available, while data is indeed available?
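For reference, the raw request counts behind those console pages can also be pulled programmatically from the Cloud Monitoring API. The following is only a sketch under stated assumptions: the project ID is a placeholder, and it filters the standard consumed-API metric serviceruntime.googleapis.com/api/request_count down to books.googleapis.com:

```python
import time

from google.cloud import monitoring_v3

project_id = "my-books-project"  # placeholder -- your GCP project ID

client = monitoring_v3.MetricServiceClient()

# Last 30 days, matching the console's default time frame.
now = int(time.time())
interval = monitoring_v3.TimeInterval(
    {
        "end_time": {"seconds": now},
        "start_time": {"seconds": now - 30 * 24 * 3600},
    }
)

results = client.list_time_series(
    request={
        "name": f"projects/{project_id}",
        "filter": (
            'metric.type = "serviceruntime.googleapis.com/api/request_count" '
            'AND resource.labels.service = "books.googleapis.com"'
        ),
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)

# Each series corresponds to one combination of labels (method, credential, ...).
for series in results:
    total = sum(point.value.int64_value for point in series.points)
    print(dict(series.resource.labels), total)
```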
As per the documentation [1], Google Books respects copyright, contract, and other legal restrictions associated with the end user's location. As a result, some users might not be able to access book content from certain countries. For example, certain books are "previewable" only in the United States; such preview links are omitted for users in other countries. Therefore, the API results are restricted based on your server or client application's IP address.
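If you want to rule out that location restriction, one quick check (just a sketch; the API key and query are placeholders) is to pin the end user's country explicitly via the country parameter from [1] and see whether the same request returns results:

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder -- the key tied to your GCP project

# Pin the end-user location with the `country` parameter so results are not
# silently filtered based on the caller's IP address.
params = {"q": "flowers", "country": "US", "key": API_KEY}
resp = requests.get("https://www.googleapis.com/books/v1/volumes", params=params)
resp.raise_for_status()

data = resp.json()
print("totalItems:", data.get("totalItems"))
for item in data.get("items", [])[:3]:
    print("-", item["volumeInfo"].get("title"))
```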
On the other hand, I hope link [2] is helpful for you, as it seems similar to the issue you are facing. Also, the documentation [3] [4] may give you more information about using the Books API in Google Cloud Platform.
[1] https://developers.google.com/books/docs/v1/using#UserLocation
[2] Google books api always returns nothing
[3] https://developers.google.com/books/docs/v1/using
[4] https://developers.google.com/books/docs/v1/getting_started
I'm working on a project where I am tasked with using Google Cloud services to process and visualize fitness data. For example, I have exported some Apple Health data from my watch, and it is in .xml format.

From a high level, I envision this .xml file starting off in object storage and being converted to .csv by a Cloud Function (triggered by the creation of the .xml object in storage), then stored again in object storage (a different bucket). I then see these .csv files being processed by a Dataflow pipeline, which will reformat the data to the template schema I would like the data to be organized with. This pipeline will output the resulting .csv to BigQuery, which will then be designated as a data source for Data Studio. Finally, I will configure Data Studio to produce some simple reports that compare the health data to recommended values. I would also like this report to be accessible as a .pdf in object storage.

Am I on the right track, or am I missing some key services to accomplish this?
Also, I'm new to posting on Stack Overflow, so if this question is against the rules or not welcome, please let me know.
Any feedback is greatly appreciated, as I have not been able to bounce these ideas off of other experienced cloud architects/developers.
This question is currently off-topic by the rules of Stack Overflow, as it does not contain a specific problem to resolve. See points 4-5.
As high-level advice, I do not see why it would not be possible based on the services you mentioned, but you would need to implement it, try it on your side, and evaluate the features of each service in your workflow.
In terms of solution or architecture advice, that is generally a paid service, and you will most likely find little help here for it unless you have a specific problem to solve with said services. You might find some help on the internet as well, e.g. Cloud Solutions, Build it on GCP, etc.
You might find this interesting to review as well, as it mimics your solution. Hope this helps.
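As an illustration of the first hop in the workflow described above (the .xml object landing in one bucket and a .csv being written to another), a Cloud Function along the following lines could work. This is only a sketch under assumptions: a Python 3 function with a google.storage.object.finalize trigger, a hypothetical output bucket name, and the usual attributes of Apple Health <Record> elements:

```python
# main.py -- sketch: convert an Apple Health export .xml in one bucket to .csv in another.
import csv
import io
import xml.etree.ElementTree as ET

from google.cloud import storage

OUTPUT_BUCKET = "my-health-csv-bucket"  # hypothetical destination bucket


def convert_xml_to_csv(event, context):
    """Triggered by google.storage.object.finalize on the input bucket."""
    client = storage.Client()
    source = client.bucket(event["bucket"]).blob(event["name"])
    xml_bytes = source.download_as_bytes()

    # Apple Health exports store samples as <Record> elements with attributes.
    root = ET.fromstring(xml_bytes)
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["type", "startDate", "endDate", "value", "unit"])
    for record in root.iter("Record"):
        writer.writerow([
            record.get("type"),
            record.get("startDate"),
            record.get("endDate"),
            record.get("value"),
            record.get("unit"),
        ])

    dest_name = event["name"].rsplit(".", 1)[0] + ".csv"
    client.bucket(OUTPUT_BUCKET).blob(dest_name).upload_from_string(
        out.getvalue(), content_type="text/csv"
    )
```

From there, the Dataflow job would pick up the .csv files and load the reshaped rows into BigQuery, as outlined in the question.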
To be totally clear: this question is about SteamVR dashboard overlays specifically, not regular overlays.
I have been playing around with the C++ SteamVR SDK and working on some overlay application prototypes lately. Something I have not managed to do so far is to get a dashboard overlay to show up. The error I get when I call CreateDashboardOverlay is VROverlayError_PermissionDenied. I'm guessing that this is because I need to be authenticated with a SteamVR developer account, which I don't currently have. Can anyone verify that? There doesn't seem to be any (public) documentation on this at all beyond what's in openvr.h and the openvr github docs page, which is somewhat sparse.
I'm also guessing that any dashboard overlay application would need to be distributed through the official Steam store, but again I can't find anything official on that. I suspect that Valve would require this since otherwise any old malware that happens to be running on the system could easily create an official-looking dashboard overlay.
Note again that I am referring specifically to dashboard overlays. I can get regular overlays to show up just fine.
There are a few repos on github with implementations of steamvr overlays (https://github.com/Hotrian/OpenVRDesktopDisplayPortal for example), but I have yet to find one that is actually creating a dashboard overlay.
Any info or links to documentation I'm somehow missing would be greatly appreciated. I'm starting to think I might be missing something obvious.
Thanks
Edit for clarity:
My questions are: Am I getting the permission denied error when calling CreateDashboardOverlay because I need to satisfy some kind of authentication requirement such as having a steam dev account? And do SteamVR dashboard overlay apps need to be distributed via an official channel?
On further review, it appears I was misinterpreting my own debug output and reading a bit too much into it, because the conclusions sort of made sense in my mind.
The CreateDashboardOverlay call was working fine. Later on in my code I was calling ShowOverlay, which of course is not allowed for dashboard overlays (they are shown by opening them via the SteamVR dashboard UI).
My dashboard overlay is working fine after all.
To summarize, the answer to both of my questions is no. No Steam developer status is needed to create a dashboard overlay and SteamVR dashboard overlay apps do not need to be distributed through any kind of official channel.
I have created a flow in Cloud Dataprep and the job executed. All fine.
However, my colleagues, who also have the Owner role in this GCP project, are not able to see the flow I created. I'm not able to find sharing options anywhere.
How should it be setup so that Dataprep flow can be worked on by multiple users?
Thanks.
UPDATE from September 2018 onwards
Sharing is now available. See the answer from @justbeez below (https://stackoverflow.com/a/55386000/945789) for details.
No longer relevant - original answer prior to this release below
At the current time this is not possible. There were a bunch of new features released on Jan 23, 2018, but sharing flows and/or datasets is still not among them. Given it's still in beta, keep an eye on the releases, as they seem to be big jumps: https://cloud.google.com/dataprep/docs/release-notes.
There are two workarounds I know of at the moment:
Whenever you create flows, do so under a shared/generic Google login. Google doesn't recommend this, as it's not great for security, but it's the only way to have live access for more than one person. Of course, that user will also need access to your project.
Periodically export your flow as a zip (the three-dots menu when you hover over a flow) and store it on a shared drive. You can import that flow from the flows menu. I'm not sure whether export and/or import was new in the Jan 2018 release or not, but I didn't notice it prior to that.
We've been using sharing at the Flow level since it was added in September 2018 (check the changelog). You get to this option from the kebab menu on the individual flow:
From there, you'll get a sharing menu you can use (keep in mind, you can only share with people who are in the project and have suitable permissions):
From there, you can see either all flows by default or just those that are shared:
Note that there's currently only one shared permission level, which is Collaborator. This means the person you share with can work with the flow, add recipes, etc.—but they don't have the option of renaming the flow or managing sharing. The recently added Folders option doesn't support sharing at all.
Full details on sharing flows can be found here:
https://cloud.google.com/dataprep/docs/html/Overview-of-Sharing_118228675