Can ANSYS be installed and run on Amazon Workspace? - amazon-web-services

Seeing the great utility of modern computational resources, it would be great if computationally heavy software such as ANSYS could be installed on a cloud desktop such as Amazon WorkSpaces.
Can ANSYS be installed and run on Amazon WorkSpaces?

Related

What is difference between AI Notebook and Cloud Datalab in GCP?

I have searched for an answer to this question, and it may be a duplicate, but I need clarification because I looked at two different places and the answers are somewhat contradictory.
The following Stack Overflow answer mentions that Google Cloud AI Platform Notebooks is an upgraded version of Google Cloud Datalab. On the following Quora page, one of the architects mentions that Cloud Datalab is built on top of Jupyter Notebook.
Cloud Datalab creates a new network of its own, while AI Notebooks stays within an existing network. With the current setup of my environment, I do not want the overhead of maintaining an extra network and additional security to watch over, so AI Notebooks is the immediate solution. But I would also like to understand the benefits that Cloud Datalab provides.
Between AI Notebook and Cloud Datalab, which should be used, and in which scenario?
Does Cloud Datalab also provide pre-installed packages for Python, TensorFlow, or R environments, like AI Notebooks?
Between AI Notebook and Cloud Datalab, which should be used, and in which scenario?
You should use AI Notebooks for new projects in any case, since Cloud Datalab will be deprecated sooner rather than later.
Does Cloud Datalab also provide pre-installed packages for Python, TensorFlow, or R environments, like AI Notebooks?
Yes, it does.
A summary of the differences between the two products:
Datalab:
Custom UI that is not compatible with the latest JupyterLab extensions.
Uses the old PyDatalab SDK, because no official SDKs were available for many GCP services when Datalab was released.
No major changes on the roadmap.
Requires SSH with port mapping to use.
Notebooks:
Uses the JupyterLab UI.
Uses official SDKs (like the BigQuery Python SDK), and therefore has better integration.
Since the UI (JupyterLab) is community-driven, new changes are released rapidly.
Access to the UI is simple; no SSH or CLI usage is required.
Notebooks API
Terraform support
Client libraries (Python, Java, NodeJS) to manage Notebooks
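As a small illustration of the point about official SDKs above, here is a hedged Python sketch of querying BigQuery the way one might from an AI Platform Notebook, where the google-cloud-bigquery client comes pre-installed. The helper functions and the public Shakespeare table mentioned below are illustrative choices, not part of the original answer.

```python
# Minimal sketch (assumptions noted): querying BigQuery with the official
# google-cloud-bigquery SDK, as one might from an AI Platform Notebook.
# The client is passed in as an argument so the helpers can be inspected
# and tested without GCP credentials.

def row_count_query(table: str) -> str:
    # Pure helper: the SQL string sent to BigQuery.
    return f"SELECT COUNT(*) AS n FROM `{table}`"

def row_count(client, table: str) -> int:
    # Runs the query and returns the single COUNT(*) value.
    # Iterating a QueryJob waits for and yields its result rows.
    job = client.query(row_count_query(table))
    return next(iter(job)).n
```

In a notebook this might be run as `from google.cloud import bigquery` followed by `row_count(bigquery.Client(), "bigquery-public-data.samples.shakespeare")`, using the notebook's default credentials.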

Which build server and code scan tool to use on AWS EC2 Windows instance?

I have to implement a code scan tool in a CI/CD pipeline in AWS. I have an EC2 Windows instance.
I checked a few tutorials and found some plugins for Jenkins, but all of these samples are for Linux.
I want to know how to install Jenkins (or an alternative) on EC2 Windows, and which code scan tool to use in this environment.
You can follow the How to Install Jenkins on Windows tutorial to see how you can use Jenkins on Windows.
An alternative to Jenkins is Atlassian Bamboo, which is also widely used for CI/CD.
Some of the widely used code scan tools are:
Checkmarx
Sonarqube
IBM AppScan - Commercial
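To sketch how these pieces could fit together, here is a hedged declarative Jenkinsfile for a Windows EC2 agent. The agent label "windows", the solution file, the SonarQube server name, and the project key are all placeholders I have introduced; the key Windows-specific detail is using `bat` steps instead of `sh`.

```groovy
// Hedged sketch, not a definitive setup: build and scan stages on a
// Windows agent. Label, solution name, server name, and project key
// are placeholders.
pipeline {
    agent { label 'windows' }
    stages {
        stage('Build') {
            steps {
                // Use bat (not sh) on Windows agents
                bat 'msbuild MySolution.sln /p:Configuration=Release'
            }
        }
        stage('Code Scan') {
            steps {
                // withSonarQubeEnv comes from the SonarQube Scanner plugin;
                // assumes sonar-scanner.bat is on the agent's PATH
                withSonarQubeEnv('MySonarServer') {
                    bat 'sonar-scanner.bat -Dsonar.projectKey=my-app'
                }
            }
        }
    }
}
```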

Pros and Cons of Amazon SageMaker VS. Amazon EMR, for deploying TensorFlow-based deep learning models?

I want to build some neural network models for NLP and recommendation applications. The framework I want to use is TensorFlow. I plan to train these models and make predictions on Amazon Web Services, and the application will most likely involve distributed computing.
I am wondering: what are the pros and cons of SageMaker and EMR for TensorFlow applications?
They both have TensorFlow integrated.
In general terms, they serve different purposes.
EMR is for when you need to process massive amounts of data and rely heavily on Spark, Hadoop, and MapReduce (EMR = Elastic MapReduce). Essentially, if your data volume is large enough to benefit from the efficiencies of the Spark, Hadoop, Hive, HDFS, HBase, and Pig stack, then go with EMR.
EMR Pros:
Generally, low cost compared to EC2 instances
As the name suggests, it is elastic, meaning you can provision what you need when you need it
Hive, Pig, and HBase out of the box
EMR Cons:
You need a very specific use case to truly benefit from all the offerings in EMR; most users don't take advantage of its entire stack
SageMaker is an attempt to make machine learning easier and distributed. Its marketplace provides out-of-the-box algorithms and models for quick use. It's a great service if you conform to the workflows it enforces: creating training jobs and deploying inference endpoints.
SageMaker Pros:
Easy to get up and running with Notebooks
Rich marketplace to quickly try existing models
Many different example notebooks for popular algorithms
Predefined kernels that minimize configuration
Easy to deploy models
Allows you to distribute inference compute by deploying endpoints
SageMaker Cons:
Expensive!
Enforces a certain workflow making it hard to be fully custom
Expensive!
From AWS documentation:
Amazon EMR is a managed cluster platform that simplifies running big data frameworks, such as Apache Hadoop and Apache Spark, on AWS to process and analyze vast amounts of data. By using these frameworks and related open-source projects, such as Apache Hive and Apache Pig, you can process data for analytics purposes and business intelligence workloads. Additionally, you can use Amazon EMR to transform and move large amounts of data into and out of other AWS data stores and databases, such as Amazon Simple Storage Service (Amazon S3) and Amazon DynamoDB.
(...) Amazon SageMaker is a fully-managed platform that enables developers and data scientists to quickly and easily build, train, and deploy machine learning models at any scale. Amazon SageMaker removes all the barriers that typically slow down developers who want to use machine learning.
Conclusion:
If you want to deploy AI models, just use AWS SageMaker
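To make the enforced SageMaker workflow mentioned above concrete (create a model, then an endpoint configuration, then an endpoint), here is a hedged boto3 outline. Every resource name, the container image URI, the S3 path, and the IAM role are placeholders I have introduced, not values from the answer; the client is passed in as an argument so the request-building helper can be inspected without AWS credentials.

```python
# Hedged sketch of SageMaker's deploy workflow:
# create_model -> create_endpoint_config -> create_endpoint.
# `sm` is expected to be a boto3 SageMaker client; all names are placeholders.

def endpoint_config_request(config_name, model_name, instance_type="ml.m5.large"):
    # Pure helper: the request body for create_endpoint_config, kept
    # separate so its shape can be inspected without AWS credentials.
    return {
        "EndpointConfigName": config_name,
        "ProductionVariants": [{
            "VariantName": "AllTraffic",
            "ModelName": model_name,
            "InitialInstanceCount": 1,
            "InstanceType": instance_type,
        }],
    }

def deploy_tensorflow_model(sm, model_data_url, image_uri, role_arn):
    # The three calls SageMaker's workflow enforces, in order.
    sm.create_model(
        ModelName="tf-demo-model",
        PrimaryContainer={"Image": image_uri, "ModelDataUrl": model_data_url},
        ExecutionRoleArn=role_arn,
    )
    sm.create_endpoint_config(
        **endpoint_config_request("tf-demo-config", "tf-demo-model")
    )
    sm.create_endpoint(EndpointName="tf-demo", EndpointConfigName="tf-demo-config")
```

With real resources this might be called as `deploy_tensorflow_model(boto3.client("sagemaker"), "s3://<bucket>/model.tar.gz", "<tf-serving-image-uri>", "arn:aws:iam::<account>:role/<role>")`.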

Ubuntu AWS Workspace

We've been toying with switching to cloud based desktops, specifically AWS Workspace. Is there support for Ubuntu desktops though? To this point I've only been able to generate Windows environments.
UPDATE: Amazon WorkSpaces now supports Amazon Linux 2, an offshoot of CentOS.
Update: WorkSpaces now has a Linux option, in case anyone finds this.
AWS Workspaces only supports Windows at the moment.
From the product description:
Amazon WorkSpaces is a managed, secure cloud desktop service. You can use Amazon WorkSpaces to provision either Windows or Linux desktops
Amazon WorkSpaces now allows the use of Windows 7 and Windows 10, as well as Amazon Linux 2. There are options that are eligible for the Free Tier.
Descriptions of what software you can install are available here.

How do we use clusters in open source Spark and Hortonworks' Hadoop sandbox?

I have a conceptual question. I downloaded Apache Spark and the Hortonworks Hadoop Sandbox. As far as I know, we analyze big data by distributing tasks to multiple machines or clusters. Amazon Web Services provides customers with clusters when they pay for its services. But in the case of Spark or Hadoop, whose clusters am I using when I simply download these environments? They say these environments provide a single-node cluster, which I assume is my computer itself. But then, how can I analyze big data if I am limited to my own computer? In brief, what is the logic of using Spark on my own laptop?
The environments are exactly what they say they are: a sandbox. They can be used to test functionality but not performance because, as you rightly said, they run on your laptop. The VM comes configured with all the software necessary for you to test exactly this.
If you wish to realize the true performance potential of Spark, you will need to install Spark on a cluster of servers using the procedures they describe here, and then you will truly be using the computational power of the servers you installed Spark on.
Hope that helps!
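To make the "single-node cluster" idea concrete, here is a loose, standard-library-only sketch (not Spark itself) of the shape Spark applies to data: split it into partitions, process each partition independently, then merge the partial results. On a laptop the workers are local; on a real cluster they would be machines, and in PySpark moving from `master("local[*]")` to a cluster URL is essentially the only code change.

```python
# Stdlib-only illustration of the partition/map/merge pattern that Spark
# distributes across a cluster. Threads here only demonstrate the shape of
# the computation; real Spark runs each partition on separate executor
# processes or machines.
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def count_words(lines):
    # The per-partition "map" step: count words in one chunk of the data
    return Counter(word for line in lines for word in line.split())

def distributed_word_count(lines, workers=2):
    # Split the data into `workers` partitions, as Spark partitions an RDD
    chunks = [lines[i::workers] for i in range(workers)]
    total = Counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # The "reduce" step: merge the per-partition counts
        for partial in pool.map(count_words, chunks):
            total += partial
    return total
```

In PySpark itself the same shape would be something like `sc.parallelize(lines).flatMap(str.split).map(lambda w: (w, 1)).reduceByKey(operator.add)`, running on however many executors the master provides.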