AWS Quicksight content gone (databases, analysis, dashboards) - amazon-iam

I just logged into my QuickSight account and all my content is gone (no databases, no analyses, no dashboards, no recently viewed items). My colleague can't see anything either. We are in the correct account and the correct region, we are both admins, and nobody touched the QuickSight account.
Do we need to enable something in the account management or something?

Related

Staying signed in to a shared AWS Cloudwatch dashboard

I have made a dashboard in AWS Cloudwatch. My goal is to display the dashboard on a display in our office. Hence I shared the dashboard using the option "Share your dashboard and require a username and password".
This lets a non-AWS user log in and view the dashboard, but I am not completely satisfied with the behaviour. My challenges are:
When logging in there is no option to "keep me signed in". So every day or so I need to sign in again. This is not desirable for an office dashboard.
Dashboard settings like light vs dark mode and refresh frequency are not stored anywhere, so after each log in (from the non-AWS user) I need to manually adjust these.
After a lot of struggle with CloudWatch dashboards I decided to switch to Grafana for my dashboards instead.
Grafana solved all my issues with CloudWatch, and we gained more advanced dashboard features as well. Currently the Grafana free tier also supports all our needs, so there is no extra cost.
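For the office-display use case specifically, Grafana can sidestep the sign-in problem by allowing anonymous read-only access. A minimal sketch of the relevant grafana.ini section (the org name is the default; verify the exact keys against your Grafana version):

```ini
; grafana.ini -- let the office display view dashboards without logging in
[auth.anonymous]
enabled = true
; organization the anonymous visitor is mapped into
org_name = Main Org.
; read-only access
org_role = Viewer

[users]
; theme is a server/user setting, so it is not re-chosen every session
default_theme = dark
```

Combined with Grafana's kiosk mode and a refresh interval in the dashboard URL (e.g. appending ?kiosk&refresh=30s), this also covers the "settings not stored" problem.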

Have no access to my VM instances, no sufficient permissions

I have lost access to my VM instances. I am a student at the University of Melbourne, so I use GCP with the Google account provided by the university (***#student.unimelb.edu.au).
I created my VM instances and I am still able to SSH into them; however, I have lost access to them via the GCP Dashboard. The error reads:
You are missing at least one of the following required permissions:
Project
resourcemanager.projects.get
And when I checked the API with the troubleshooter, it said:
You do not have the required permissions to make this request. Please
contact your organization administrator.
I don't think I have done anything that would cause administrative problems. Why did I lose control of my instances?
Your permissions have been limited. As the message says, contact your university administrator to find out more. If your project is billed to the university, the university is the organisation admin and can do anything within its own domain.
Did your VM respect your university's rules? What size was it? Were you mining on it? Did you install unauthorized software? At the administration level, Google provides a lot of metrics and alerts about abnormal resource usage. Maybe you broke a rule.
Are you sure you had permission to view the instance before you created it?
Given the error you are seeing, you may not have had that access from the start: you could have been granted access to create VMs but not to list them. I find GCP's access control quite complex; you can reach many things through the CLI, but to do the same through the console you need to be granted additional permissions.
Ask your admin to grant your account one of the viewer, editor, or perhaps browser roles from here:
https://cloud.google.com/resource-manager/docs/access-control-proj
Or maybe, as guillaume suggested, you broke a rule ;)
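The roles in that document are attached to the project through IAM policy bindings. As a rough sketch of what the admin's grant amounts to (the policy shape below mirrors what `gcloud projects get-iam-policy` returns; the member addresses and existing bindings are made up):

```python
# Sketch: add a member to a role binding in a GCP project IAM policy.
# The dict structure mirrors `gcloud projects get-iam-policy` output;
# the member addresses below are hypothetical.

def add_binding(policy, role, member):
    """Add `member` to the binding for `role`, creating the binding if absent."""
    for binding in policy.setdefault("bindings", []):
        if binding["role"] == role:
            if member not in binding["members"]:
                binding["members"].append(member)
            return policy
    policy["bindings"].append({"role": role, "members": [member]})
    return policy

policy = {"bindings": [{"role": "roles/owner",
                        "members": ["user:admin@unimelb.example"]}]}
add_binding(policy, "roles/viewer", "user:student@unimelb.example")
```

In practice only the organization admin can apply such a change (via setIamPolicy or the console), which is exactly why the advice here is to ask them.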

How to securely publish logs to the cloud

My library is a CLI utility, and people get it by running pip install [libname]. I would like to automatically record exceptions that occur when people use it and store these logs in the cloud. I have found services that should do just that: AWS CloudWatch, GCP Stackdriver.
However, while looking at their API it appears that I would have to ship my private key in order for the library to authenticate to my account. This doesn't sound right and I am warned by the cloud providers not to do this.
Example from GCP fails because it requires credentials:
from google.cloud import logging

# Raises DefaultCredentialsError if no application default credentials are set
client = logging.Client()
logger = client.logger('log_name')
logger.log_text('A simple entry')  # API call
Since the Python library ships as source, I understand that any credentials I ship carry the risk of people sending fake logs, but that is OK with me: I would simply cap spending on my account for the (unexpected) case that somebody does just that. Of course, the credentials shipped with the library should be restricted to logging only.
Any example of how to enable logging to a cloud service from user machines?
For Azure Application Insights' "Instrumentation Key" there is a very good article about that subject here: https://devblogs.microsoft.com/premier-developer/alternative-way-to-protect-your-application-insights-instrumentation-key-in-javascript/
While I'm not familiar with the offerings of AWS or GCP, I would assume similar points are valid.
Generally speaking: while the instrumentation key is a method of authentication, it is not considered a very secret key in most scenarios. The worst damage somebody can do is to send unwanted logs; they cannot read any data or overwrite anything with that key. And you already stated above that you are not really worried about the issue of unwanted logs in your case.
So, as long as you are using an App Insights instance only for one specific application / purpose, I would say you are fine. You can still further aggregate that data in the background with data from different sources.
To add a concrete example: this little tool from Microsoft (the specific use case does not matter here) also collects telemetry and sends it to Azure Application Insights, unless the user opts out. I won't point to the exact line of code, but their instrumentation key is checked in to that public GitHub repo for anybody to find.
Alternatively, the most secure way would be to send data from the browser to your custom API on your server, then forward it to the Application Insights resource with the correct instrumentation key (see the diagram in the linked article).
(Source: the link above)
By the way, the App Insights SDK for Python is here: https://github.com/microsoft/ApplicationInsights-Python
To write logs to Stackdriver requires credentials. Anonymous connections to Stackdriver are NOT supported.
Under no circumstances give non-privileged users logging read permissions. Stackdriver records sensitive information in Stackdriver Logs.
Google Cloud IAM provides the role roles/logging.logWriter. This role gives users just enough permissions to write logs. This role does not grant read permissions.
The role roles/logging.logWriter is fairly safe. A user can write logs, but cannot read, overwrite or delete logs. I say fairly safe as there is private information stored in the service account. I would create a separate project only for Stackdriver logging with no other services.
The second issue with providing external users access is cost. Stackdriver logs are $0.50 per GiB. You would not want someone uploading a ton of logfile entries. Make sure that you monitor external usage. Create an alert to monitor costs.
Creating and managing service accounts
Chargeable Stackdriver Products
Alert on Stackdriver usage
Stackdriver Permissions and Roles
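The cost point above can be put in rough numbers at $0.50 per GiB ingested. (The 50 GiB monthly free allotment used below is an assumption; verify it against current pricing.)

```python
# Back-of-envelope monthly Stackdriver Logging cost at $0.50 per GiB ingested.
# The 50 GiB free allotment is an assumption -- check current pricing.

FREE_GIB = 50
PRICE_PER_GIB = 0.50

def monthly_log_cost(gib_ingested: float) -> float:
    """Cost in USD for a month's log ingestion."""
    return max(0.0, gib_ingested - FREE_GIB) * PRICE_PER_GIB

print(monthly_log_cost(40))    # within the free allotment -> 0.0
print(monthly_log_cost(1050))  # 1000 billable GiB -> 500.0
```

This is why the alert on usage matters: a hostile user spamming the logWriter credentials turns directly into ingestion charges.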

Google Analytics Reporting API service account issues

I've been trying to get data from GA using a service account; however, my issue is that it keeps saying:
Error: User does not have sufficient permissions for this profile.
I have enabled GA reporting API and given access to GA account using the email of the service account. In addition, it was granted "read and analyze" permissions on the account.
I tried this method on a personal account and everything worked fine; however, on the client project the issue comes back.
What could I be missing?
This was interesting to figure out.
I used the Account ID against one Google Analytics account and that worked.
For the one I was having an issue with, I needed to use the View ID instead.
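The distinction matters because the Reporting API v4 batchGet call takes a viewId, not an account ID; passing the wrong one surfaces as the "insufficient permissions" error above. A minimal request body looks like this (the view ID and query fields are placeholders):

```python
import json

# Google Analytics Reporting API v4 batchGet request body.
# "viewId" must be the numeric View (profile) ID -- not the account ID.
# The ID, date range, and metrics here are placeholders.
request_body = {
    "reportRequests": [{
        "viewId": "123456789",
        "dateRanges": [{"startDate": "7daysAgo", "endDate": "today"}],
        "metrics": [{"expression": "ga:sessions"}],
        "dimensions": [{"name": "ga:country"}],
    }]
}
print(json.dumps(request_body, indent=2))
```

The View ID is found in GA under Admin → View Settings; the service account email must have at least Read & Analyze access at the level containing that view.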

Provide S3 bucket access to Any Authenticated AWS user in the new console

I am trying to provide full access for "Any Authenticated AWS User" to my S3 bucket. The old S3 console has an option to do this (screenshot omitted).
The new console doesn't have a similar option any more. I am switching to the old console for now to enable this option, but the old console will become inaccessible by Aug 31st, 2017. Any idea how I can do this in the new console? If not a UI option, at least with a bucket policy?
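For reference, the old console option mapped to the predefined S3 ACL group http://acs.amazonaws.com/groups/global/AuthenticatedUsers, and there is no exact bucket-policy equivalent (a policy principal of "*" also matches anonymous requests). A sketch of the ACL structure that boto3's put_bucket_acl expects (owner ID and bucket name are placeholders; note the warnings in the answer that follows before doing this):

```python
# Sketch of the AccessControlPolicy structure for boto3's put_bucket_acl,
# granting the predefined "AuthenticatedUsers" group full control.
# Owner ID and bucket name are placeholders -- and see the answer's
# warnings before actually applying this.

AUTH_USERS = "http://acs.amazonaws.com/groups/global/AuthenticatedUsers"

acl = {
    "Owner": {"ID": "EXAMPLE-CANONICAL-USER-ID"},
    "Grants": [
        {"Grantee": {"Type": "Group", "URI": AUTH_USERS},
         "Permission": "FULL_CONTROL"},
    ],
}

# Actual call (requires credentials; not run here):
# import boto3
# boto3.client("s3").put_bucket_acl(Bucket="my-bucket", AccessControlPolicy=acl)
```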
I am trying to provide full access for "Any Authenticated AWS User" to my S3 Bucket.
I strongly advise against that. You are opening yourself up to an incredibly large number of potential issues.
What if somebody uploads 10PB of data to your bucket? Your bill would be at least $260k for that month.
What if somebody uploads copyrighted material (or any kind of illegal material) to your bucket? You would be responsible for the likely illegal distribution of copyrighted material.
Many, many companies are actively scanning their buckets to find those with open write privileges and remediating them ASAP. Even some of AWS's own security-related services will complain if you do this; for example, Trusted Advisor has a built-in check that detects exactly this configuration and warns you about it.
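The first point can be sanity-checked, assuming the 2017 standard-storage price of roughly $0.023 per GB-month (an assumption; request and transfer charges come on top):

```python
# Back-of-envelope check of the 10 PB storage claim.
# $0.023/GB-month is the assumed 2017 S3 standard price; requests and
# data transfer would add to this.
PRICE_PER_GB = 0.023
pb = 10
gb = pb * 1024 * 1024
monthly = gb * PRICE_PER_GB
print(round(monthly))  # roughly $240k/month for storage alone
```

So the order of magnitude above holds even before counting the PUT requests needed to upload that much data.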
If you describe the problem you are trying to solve, you might get better suggestions.