Amazon Web Services IP address and service protocol

In my organisation, installation of the AWS Toolkit in Eclipse is blocked. To get access, I need the destination IP addresses and service protocol so that I can raise a request for access. Does anyone have any information about this?

This may not be exactly what you are looking for, but it may be helpful.
Amazon Web Services (AWS) publishes its current IP address ranges in JSON format. To view the current ranges, download the .json file. To maintain history, save successive versions of the .json file on your system. To determine whether there have been changes since the last time that you saved the file, check the publication time in the current file and compare it to the publication time in the last file that you saved.
https://docs.aws.amazon.com/general/latest/gr/aws-ip-ranges.html
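If you need to turn that file into a concrete list of addresses for a firewall request, a small sketch along these lines can pull out the relevant prefixes (TypeScript, run on Node 18+ where fetch is built in; the AMAZON / us-east-1 filter is only an example and should be adjusted to the service and region you actually need):

// Sketch: download ip-ranges.json and list the CIDR prefixes for one service/region.
const url = "https://ip-ranges.amazonaws.com/ip-ranges.json";

async function listPrefixes(service: string, region: string): Promise<void> {
  const res = await fetch(url);
  const data = await res.json();
  console.log("Published:", data.createDate); // compare with your previously saved copy
  const matches = data.prefixes.filter(
    (p: { service: string; region: string; ip_prefix: string }) =>
      p.service === service && p.region === region,
  );
  for (const p of matches) console.log(p.ip_prefix); // ranges to include in the access request
}

listPrefixes("AMAZON", "us-east-1").catch(console.error); // example filter; adjust as needed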
You can also download the toolkit from source and install it that way:
https://github.com/aws/aws-toolkit-eclipse/releases

Compiling JS-Test-Driver Plugin and Installing it on Eclipse 3.5.1 Galileo?
When installing the Eclipse plugin, if the software list contains an item that says "There are no categorized items", untick the "Group items by category" checkbox below.
This fixed the above errors I was getting after an initial install when I displayed or tried to configure the plugin tab.

Related

How to access the SCORM package from an S3 bucket?

I successfully uploaded the SCORM package zip and unzipped it in an S3 bucket using Drupal 8.
While trying to read the SCORM files in the extracted data folder, we got an error message like:
"ERROR – unable to acquire LMS API, content may not play properly and results may not be recorded. Please contact technical support"
I checked the access settings; everything is public.
Can anyone tell me what I missed?
That content sounds like it's set up to look for API or API_1484_11 (the SCORM APIs for 1.2 and 2004) and pop up an alert when neither is found.
With a runtime API present, that alert would go away. Your next question is probably "How do I expose a runtime API?" The answer is that you normally hand-roll one, or look for an existing runtime API, paid or otherwise.
Something like https://github.com/cybercussion/SCOBot/blob/master/QUnit-Tests/js/scorm/SCOBot_API_1484_11.js might get you started if you're looking for a free option.
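For illustration only, a minimal hand-rolled stub might look like the sketch below (TypeScript for the browser; it keeps values in memory and reports nothing back, so it merely lets the content find window.API_1484_11 and stop raising the alert):

// Minimal sketch of a hand-rolled SCORM 2004 runtime API (illustration only).
// A real LMS would persist these values; this stub keeps them in memory.
const cmiData: Record<string, string> = {};

(window as any).API_1484_11 = {
  Initialize: (_param: string) => "true",
  Terminate: (_param: string) => "true",
  GetValue: (element: string) => cmiData[element] ?? "",
  SetValue: (element: string, value: string) => { cmiData[element] = value; return "true"; },
  Commit: (_param: string) => "true",
  GetLastError: () => "0",
  GetErrorString: (_code: string) => "",
  GetDiagnostic: (_code: string) => "",
};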
If you plan to build an LMS, you may want to look into paid options.

Setting up a cluster on GCP with Cloudera Director

I'm following along with the instructions on Cloudera's website to set up a cluster using Cloudera Director. However, when I get to the step where I'm supposed to "Add an Environment," I'm presented with two issues. First, the region I selected (us-east1-b) when configuring my Google Compute instance is not available for selection on the Cloudera Director software. Second, there is no option for me to upload Client ID JSON Keys, as the documentation says we should be able to do. I've attached a screenshot of what I'm looking at. Any clues?
My Cloudera director software is reporting itself as version 2.1.1, and the docs I'm looking at are for version 2.1.x. Am I somehow working with an older version of the software? Or are the Cloudera docs not in line with the current version? Can anyone else running Cloudera 2.1.1 confirm that they're seeing something similar or different?
There is a field to load the Client ID JSON keys in the "Advanced Options" section under General Information. Click the > to expand the Advanced Options.
You should be able to type in the region you want even if it isn't provided as a value in the drop-down.

Pulling file from the Google Cloud server to local machine

Linux n00b here having trouble pulling a file from the server to my local Windows 7 professional 64 bit machine. I am using Wowza to stream live video and I am recording these live videos to my Google Cloud instance located here:
/usr/local/WowzaStreamingEngine/content/myStream.mp4
When I ssh:
gcutil --project="myprojectname" pull "my instance" "/usr/local/WowzaStreamingEngine/content/myStream.mp4" "/folder1"
I receive a permission denied error. When I try saving another folder deep on my local machine, i.e. "/folder1/folder2", the error returned is file or directory not found. I've checked that I have write permissions set on my local Windows 7 machine, so I do not think it is a permissions error. Again, apologies for the n00b question, I've just been stuck here for hours.
Thx,
~Greg
Comment added 7/18:
I enter the following through ssh:
gcutil --project="Myproject" pull "instance-1" "/usr/local/WowzaStreamingEngine/content/myStream.mp4" "/content"
By entering this I'm expecting the file mystream.mp4 to be copied to my C:/content folder. The following is returned: Warning: Permanently added '107.178.218.8' (ECDSA) to the list of known hosts. Enter passphrase for key '/home/Greg/.ssh/google_compute_engine':
Here I enter the passphrase and the following error is returned: /content: Permission denied. I have write access set up on this folder. Thanks! – Greg
To answer the question about using Cygwin, I'm not familiar with Cygwin and I do not believe it was used in this instance. I ran these commands through the Google Cloud SDK shell which I installed per the directions found here: https://developers.google.com/compute/docs/gcutil/.
What I am doing:
After setting up my Google Cloud instance I open the Google Cloud SDK shell and enter the following:
gcutil --service_version="v1" --project="myproject" ssh --zone="us-central1-a" "instance-1"
I then am prompted for a passphrase which I create and then run the following:
curl http://metadata/computeMetadata/v1/instance/id -H "X-Google-Metadata-Request:True"
This provides the password I use to login to the Wowza live video streaming engine. All of this works beautifully, I can stream video and record the video to the following location: /usr/local/WowzaStreamingEngine/content/myStream.mp4
Next I attempt to save the .mp4 file to my local drive and that is where I'm having issues. I attempt to run:
gcutil --project="myproject" pull "instance-1" "/usr/local/WowzaStreamingEngine/content/myStream.mp4" "C:/content"
I also tried C:/content, C:\content and C:\\content.
These attempts threw the following error:
Could not resolve hostname C: Name or service not known
Thanks again for your time, I know it is valuable, I really appreciate you helping out a novice.
Update: I believe I am close thanks to your help. I switched to the local C drive and entered the command as you displayed in your answer update. Now it returns a new error I have not seen before:
Error: API rate limit exceeded
I did some research on Stack Overflow, and some suggestions were that billing is not enabled or the relevant API is not enabled, and that I could solve the problem by turning on Google Compute Engine. Billing has been enabled on my project for a few weeks now. In terms of Google Compute Engine, below are what I believe to be the relevant items turned on:
User Info: Enabled
Compute: Read Write
Storage: Full
Task Queue: Enabled
BigQuery: Enabled
Cloud SQL: Enabled
Cloud Database: Enabled
The test video I recorded was short and small in size. I also have not done anything else with this instance, so I am at a loss as to why I am getting the API rate limit exceeded error.
I also went to the Google APIs console. I see very limited usage reported, so, again, I am not sure why I am exceeding the API limit. Perhaps I do not have something set appropriately in the APIs console?
I'm guessing you're using Cygwin here (please correct me if I'm wrong).
The root directory for your Cygwin installation is most likely C:\cygwin (see the Cygwin FAQ) and not C:, so when you say /content on the command line, you're referring to C:\cygwin\content and not C:\content.
Secondly, since you're likely running as a regular user (and not root), you cannot write to /content, so that's why you're getting the permission denied error.
Solution: specify the target directory as C:/content (or C:\\content) rather than /content.
Update: from the update to the question, you're using the Google Cloud SDK shell, not Cygwin, so the above answer does not apply. The reason you're seeing the error:
Could not resolve hostname C: Name or service not known
is that gcutil (like scp) parses destinations that include : as having the pattern [hostname]:[path]. Thus, you should avoid : in the destination, which means we need to drop the drive spec.
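Purely as an illustration of that parsing rule (this is not gcutil's actual code), a small destination splitter shows why C:/content ends up being read as host C:

// Illustration only: scp-style splitting of a destination into [hostname]:[path].
function splitDestination(dest: string): { host?: string; path: string } {
  const idx = dest.indexOf(":");
  if (idx === -1) return { path: dest };                          // plain local path
  return { host: dest.slice(0, idx), path: dest.slice(idx + 1) }; // host precedes the colon
}

console.log(splitDestination("C:/content")); // { host: "C", path: "/content" }
console.log(splitDestination("/content"));   // { path: "/content" }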
In this case, the following should suffice, assuming that you're currently at a prompt that looks like C:\...>:
gcutil --project=myproject pull instance-1 /usr/local/WowzaStreamingEngine/content/myStream.mp4 \content
If not, first switch to the C: drive by issuing the command:
C:
and then run the above command.
Note: I removed the quotes from the command line because you don't need them when the parameters don't contain spaces.

Publishing to ProGet and I can't see any packages

I'm trying to set up a NuGet server using ProGet and am hitting a brick wall: when I publish a package, it doesn't appear in the feed. The package is written to disk and works in other NuGet feeds. Other packages also don't appear in the ProGet feed, so I'm pretty certain that the package is fine and that the problem lies with ProGet.
I'm using the community version of ProGet, but I don't see why that would affect anything.
Any ideas are most welcome!
The ProGet service is responsible for indexing packages, so if it's not running, packages could be uploaded but not displayed in any feeds. Here are the common troubleshooting steps for this scenario:
Verify that the ProGet Windows service (INEDOPROGETSVC) is running.
Ensure that the user account hosting the ProGet service has access to the feed storage path. Since it is NETWORK SERVICE by default, it would not see your mapped drives, and may not have access to the UNC path where the packages are stored.
Try running the ProGet service interactively, i.e. stop the ProGet Windows service and run ProGet.Service.exe manually as a console application to see any live output. Remember to restart the ProGet service when you close the console application.
Check for feed indexing errors to see if there was a problem indexing a particular package. I know that in much older versions a single "poisoned" package (bad .nuspec file, invalid directory structure, etc.) could halt the indexing altogether.
Thanks John Rasch, your first point gave me what I needed to look in the right direction.
I could not find the INEDOPROGETSVC service, but I did find one called "ProGet Service". I restarted this, refreshed my feed, and all the missing packages showed up.
My version of ProGet is v3.3.12
Thanks John.

VMware Infrastructure Web Access, hyperlinks between VMware servers

We are using multiple VMware servers, each hosting several VMware images/instances. Each department uses its own VMware server. The VMware instances are always accessed through the "VMware Infrastructure Web Access" web page, from the console tab panel. The VMware servers are plain Windows servers (nothing fancy).
Now it turns out that some of these VMware images are useful for multiple departments.
Of course, we considered copying these images, distributing them to all VMware servers and hosting the same image multiple times.
But we would in fact prefer to host only one copy of each instance, while still keeping all images accessible from one web page. Merging them onto one server is of course impossible (performance-wise).
So this got me wondering: perhaps there is a way to create hyperlinks within the VMware Web Access portal to VMware instances that are actually hosted on a different server. They would appear to all be on the same server, but in fact they would be distributed.
Does such a thing exist, and how should it be configured?
In the meantime I found a reasonable solution to my problem.
vmware-vmrc.exe can be called from the command line with several parameters. For example, with the following parameters it will open the VMware session immediately, without the need to specify any credentials:
vmware-vmrc.exe -X -h hostname:8333 -u "username" -p "password" "[standard] ... .vmx"
Important: the "[standard] ... .vmx" value is not just a file name.
To find this value, you need to visit the web page of your VMware server (e.g. https://hostname:8333/ui/).
Next, click the "Configure VM" button, which opens a tab panel with configuration settings.
There you will find a setting called "Virtual Machine Configuration File". It often starts with the string "[standard]".
Next, it was really easy to write a little batch file that lets me pick the desired VMware instance from a menu.
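For anyone who prefers a script to a batch file, a rough Node/TypeScript equivalent of such a launcher menu might look like the sketch below (the host names, credentials and .vmx values are placeholders; replace them with the values found on the Configure VM page, and note this simply wraps the same vmware-vmrc.exe command shown above):

// Sketch of a small launcher menu around vmware-vmrc.exe (placeholder values only).
import { spawn } from "node:child_process";
import * as readline from "node:readline";

// Replace with your own hosts, credentials and "[standard] ... .vmx" values.
const instances = [
  { name: "Dept A - build VM", host: "hostname-a:8333", vmx: "[standard] vm-a/vm-a.vmx" },
  { name: "Dept B - test VM",  host: "hostname-b:8333", vmx: "[standard] vm-b/vm-b.vmx" },
];

const rl = readline.createInterface({ input: process.stdin, output: process.stdout });
instances.forEach((vm, i) => console.log(`${i + 1}) ${vm.name} (${vm.host})`));

rl.question("Pick a VM: ", (answer) => {
  const vm = instances[Number(answer) - 1];
  if (!vm) {
    console.error("Invalid choice");
  } else {
    // Same flags as the example command above; passing -p on the command line exposes
    // the password, so treat this as a convenience script for a trusted machine only.
    spawn("vmware-vmrc.exe", ["-X", "-h", vm.host, "-u", "username", "-p", "password", vm.vmx],
          { stdio: "inherit" });
  }
  rl.close();
});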