How to view table data in DynamoDB - amazon-web-services

AWS Console seems to indicate my tables have some data from my test put_item() calls, but I would like to actually see the data. Is there a way to do this in the AWS Console? I've read about AWS Explorer, which can be installed as a plugin for Eclipse or Visual Studio, but I'm a PHP developer who doesn't use Eclipse, so it seems silly to install a whole IDE just to confirm the correct data is being entered.
How can I check the data in my DynamoDB tables?

[UPDATE]
Amazon just launched "Explore Table" for DynamoDB in the AWS Management Console.
If you are using Visual Studio or Eclipse, you can use the AWS Explorer to see all of your tables and data.
In Eclipse it looks like this:
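If you would rather not install an IDE at all, a quick scan from code can also confirm that the items landed. A minimal boto3 sketch (the table name here is hypothetical):

```python
import boto3

# Point boto3 at the table you wrote to with put_item().
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("my-test-table")  # hypothetical table name

# A scan returns up to 1 MB of items per call, plenty for eyeballing test data.
response = table.scan()
for item in response["Items"]:
    print(item)
print("item count:", response["Count"])
```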

Related

How to delete an Amazon SageMaker Studio project

I am unable to delete the project in Amazon SageMaker Studio. I have tried several different methods.
There's no hard delete on SageMaker Projects. When you delete them through the SDK or CLI (API reference), it updates the status of the project to 'Deleted'. You can expand the left pane to see the status, as in the screenshot below.
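For reference, the SDK flow described above looks roughly like this in boto3 (a minimal sketch; the project name is hypothetical):

```python
import boto3

sm = boto3.client("sagemaker")

# Request deletion; SageMaker marks the project as deleted rather than
# removing it from the list.
sm.delete_project(ProjectName="my-project")  # hypothetical project name

# Check the status field afterwards to confirm.
project = sm.describe_project(ProjectName="my-project")
print(project["ProjectStatus"])
```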

AWS S3: get the CLI command line for each operation done in the GUI

When I do an operation such as an S3 upload using the AWS console in the browser, is it possible to retrieve the corresponding CLI command that would reproduce the operation I just performed?
Thanks.
There is an extension for Chrome and Firefox (as far as I know) that records the changes made in the AWS console and translates them into CLI commands.
Plugin Link
Although not all services/actions are supported, it does a pretty good job. You can check the plugin's service coverage here:
Service Coverage
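If the goal is simply to script the same upload you performed in the console, the programmatic equivalent is straightforward. A minimal boto3 sketch (bucket, key, and file names are hypothetical):

```python
import boto3

s3 = boto3.client("s3")

# Roughly the same operation as uploading a file through the S3 console,
# or `aws s3 cp ./report.csv s3://my-bucket/reports/report.csv` on the CLI.
s3.upload_file(
    Filename="report.csv",        # hypothetical local file
    Bucket="my-bucket",           # hypothetical bucket
    Key="reports/report.csv",     # hypothetical object key
)
```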

Where are AWS CLI and SDK credentials actually stored? My computer disagrees with documentation

I effectively have a credential leak in my dev environment and can't plug the hole.
Windows 10
Visual Studio Community 2017.
AWS Toolkit for Visual Studio... 1.16.0.0
aws-cli/1.17.15 Python/3.6.0 Windows/10 botocore/1.14.15
I am aware of single sign on but have never attempted to use it. Access Key/Secret Key only.
I started with the documented storage locations present, perhaps created by earlier versions of the CLI and the Visual Studio extension. I then manually deleted ~/.aws and then stumbled upon ~/appdata/local/awstoolkit, but I can just keep firing off CLI commands and editing in Visual Studio as if nothing changed. Using the AWS CLI --no-sign-request switch, or running the same commands on another computer, shows that my reference commands normally do require credentials.
I've closed and reopened terminals and Visual Studio. I rebooted. Nothing seemed to be cached, and it didn't matter. With those credential folders still not present, I can uninstall the AWS CLI and AWS SDK, reinstall them, and without any additional steps dive back into CLI commands and Visual Studio work without providing keys.
The only working way to delete local credentials is to delete my profile within the AWS Visual Studio extension. With credentials deleted that way, things stop working, as they should. After running the AWS CLI configure command, the CLI and Visual Studio can both do credentialed work again, but somehow without creating ~/.aws or storing encrypted credentials in ~/appdata/local/awstoolkit. Using the --debug switch with AWS CLI commands, I can see that when they do succeed, the tool claims it finds my shared credentials at ~/.aws.
While my credentials are working the expected files and folders do not exist. I cannot find the files or folders in Windows Explorer, PowerShell, or cmd.
What am I missing?
The paths you mentioned are the main relevant ones for files, at least if you are talking about CLI access (there are some other options within an application). The one other place to look is in your environment variables.
I would recommend creating a new set of credentials and disabling (not deleting) the old one. At a minimum you can start working with the new credentials and be aware of where you put them. Then, if something isn't working, you can enable the old credentials for long enough to do what you need to do with them before trying to locate them once again.
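As an extra sanity check from code, you can also ask the SDK itself where it resolved credentials from, without exposing the secret. A minimal boto3 sketch:

```python
import os
import boto3

# Environment variables are one of the first places the CLI and SDK look.
for var in ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_SESSION_TOKEN",
            "AWS_SHARED_CREDENTIALS_FILE", "AWS_CONFIG_FILE", "AWS_PROFILE"):
    print(var, "=", os.environ.get(var))

# Ask the SDK which provider it resolved credentials from
# (e.g. 'env', 'shared-credentials-file', 'iam-role').
session = boto3.Session()
creds = session.get_credentials()
print("credential provider:", creds.method if creds else None)
```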

Where is the visual config view in AWS Lambda?

I have been trying to look at the code that is deployed in an AWS Lambda function.
There is an existing Go function running in the Lambda.
However, I am not able to. The AWS docs say we can look at the code through the visual config view; where is this view? This is the screen that I see, so where is the view to see the code?
Please help.
Or is it that, because we are using a Go server, only the compiled binary is running in the Lambda, and hence we are not able to see the code?
Inline code editing is supported only for interpreted languages (JavaScript, for example), not for compiled languages.
Besides the Lambda limitation below, it seems the console Lambda editor does not support Go.
However, the documentation suggests using AWS CodeStar.
You can also get started with AWS Lambda Go support through AWS CodeStar. AWS CodeStar lets you quickly launch development projects that include a sample application, source control and release automation. With this announcement, AWS CodeStar introduced new project templates for Go running on AWS Lambda. Select one of the CodeStar Go project templates to get started. CodeStar makes it easy to begin editing your Go project code in AWS Cloud9, an online IDE, with just a few clicks.
announcing-go-support-for-aws-lambda
Q: How do I create an AWS Lambda function using the Lambda console?
If you are using Node.js or Python, you can author the code for your function using the code editor in the AWS Lambda console, which lets you author and test your functions, and view the results of function executions in a robust, IDE-like environment.
lambda-faqs
Deployment package size limits:
50 MB (zipped, for direct upload)
250 MB (unzipped, including layers)
3 MB (console editor)
lambda limits
lambda-go-how-to-create-deployment-package
Whether the AWS code editor displays the code depends on the package size. Since your code/package is large, the code editor can't display it.
But you can download the package from AWS Lambda function using export.
Kindly follow the steps below:
Go to the Lambda console
Select the Lambda function
Click on Actions
Select Export function
You will get a few options. Select Download deployment package.
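If you prefer to fetch the package from code rather than the console, the same export is available through the API. A minimal boto3 sketch (the function name is hypothetical):

```python
import urllib.request
import boto3

lam = boto3.client("lambda")

# get_function returns a pre-signed S3 URL for the deployment package.
resp = lam.get_function(FunctionName="my-go-function")  # hypothetical name
url = resp["Code"]["Location"]

# Download the zipped package locally for inspection.
urllib.request.urlretrieve(url, "deployment-package.zip")
print("saved deployment-package.zip")
```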

VS2017 Data Lake tools cannot submit U-SQL script to Azure

Using Visual Studio 2017 with Data Lake Tools v. 2.3.1000.1, I am unable to submit a U-SQL job directly to Azure. I only have the option to submit locally:
This is the case even though I am connected to Azure through the "Server Explorer" tool window, from which I can access my U-SQL databases, view jobs that were previously submitted to Azure, etc.:
Using Visual Studio 2015, I have no such issue:
Am I forgetting a setting or a property somewhere, or is this perhaps a bug in Data Lake Tools for VS2017?
Do you still have the issue?
I sometimes see this happen if I open an existing solution whose tabs are already open and I have not yet logged in. When I then log in, the drop-down menu of the already-open window is not refreshed.
I close and reopen the script and it normally shows up.
Another reason, which I think will be addressed soon, could be that you have a filter in your Cloud Explorer view on which subscriptions are exposed. If you hid the subscription that contains your ADLA account there, you will not see it in the pull-down.
In any case, please let us know if you still have the issue and none of the two suggestions help.