Is it possible to connect Talend to Google Cloud Dataproc? And are there any connectors available for it?
One source says it does, but I can't find any documentation related to it.
If the above is true, I would also like to know whether it is possible to run a job on Spark/Hadoop in Dataproc using Talend.
Thanks.
Yes, it should be possible to connect Talend with multiple Google Cloud products.
Specifically, Talend should support:
Google BigQuery
Google Cloud Dataproc
Google Cloud Storage
Google Pub/Sub
I don't have a specific link on hand to walk through the install and configuration process; however, Talend should be able to provide such details if you contact them.
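Whatever the connector does under the hood, submitting a Spark job to Dataproc ultimately comes down to a call to the Dataproc `jobs:submit` REST API, so it can help to know what that request looks like. Below is a minimal stdlib-only sketch that builds the JSON body for such a submission; the project, cluster name, bucket, and main class are placeholder values, not real resources.

```python
import json

# Placeholder identifiers -- replace with your own cluster and jar location.
CLUSTER = "my-dataproc-cluster"
JAR_URI = "gs://my-bucket/my-spark-job.jar"


def spark_job_request(cluster: str, jar_uri: str, main_class: str) -> dict:
    """Build the request body for
    POST https://dataproc.googleapis.com/v1/projects/{project}/regions/{region}/jobs:submit
    """
    return {
        "job": {
            # Which cluster runs the job.
            "placement": {"clusterName": cluster},
            # A Spark job: the driver class and the jar(s) it lives in.
            "sparkJob": {
                "mainClass": main_class,
                "jarFileUris": [jar_uri],
            },
        }
    }


body = spark_job_request(CLUSTER, JAR_URI, "com.example.WordCount")
print(json.dumps(body, indent=2))
```

In practice you would send this body with an authenticated POST (or let a tool like Talend, `gcloud dataproc jobs submit spark`, or a client library do it for you), but the payload shape is the same either way.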
Related
How can I orchestrate and call cloud services from different providers: Google Cloud, AWS, and Azure?
We did check workflows, but it seems that workflows are limited to each provider's platform: if I build a workflow in Azure, I can only include services from Azure, not from Google Cloud or AWS.
Cloud Workflows is an orchestrator that calls APIs. It has deeper integration with Google Cloud services (and with Google Cloud security), but you can also call other APIs, including other cloud providers' service APIs.
You can also use Cloud Composer, based on Apache Airflow, which has a large library of operators that help you call different services on different platforms. Because it was (and still is) an open source project, it's portable and has no cloud preference.
If you want an orchestration tool for multi-cloud usage, Airflow is likely the best approach because it offers many operators for each cloud provider.
It's open source and well documented, with plenty of support and examples for each cloud provider.
It's also the best tool if you want to stay cloud-agnostic and be able to migrate easily to another cloud provider if needed.
If you host Airflow on Google Cloud, you can use a managed offering, Cloud Composer 2 (which runs on GKE Autopilot).
Cloud Workflows is a good solution if you stay within the GCP world: it's serverless and based on API calls, but it is less documented than Airflow for usage outside of GCP, and it is strongly coupled to GCP rather than cloud-agnostic.
I recommend using Airflow for your needs.
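To make the cross-provider pattern concrete, an orchestrator like Airflow is essentially a DAG of tasks where each task can target a different provider via its operator. The sketch below is plain stdlib Python, not real Airflow code; the task names are illustrative, and the comments note which provider package would supply each operator in a real deployment.

```python
# Stdlib-only sketch of a cross-provider DAG. In real Airflow, each task
# would be an operator from a provider package, for example:
#   extract_gcs  -> a Google operator   (apache-airflow-providers-google)
#   load_s3      -> an Amazon operator  (apache-airflow-providers-amazon)
#   notify_azure -> a Microsoft operator (apache-airflow-providers-microsoft-azure)

from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks it depends on.
dag = {
    "extract_gcs": set(),          # read source data from Google Cloud Storage
    "transform": {"extract_gcs"},  # transform step (e.g. Spark or Python)
    "load_s3": {"transform"},      # write results to AWS S3
    "notify_azure": {"load_s3"},   # notify an Azure service when done
}

# The scheduler's job, in miniature: run tasks in dependency order.
run_order = list(TopologicalSorter(dag).static_order())
print(run_order)
```

The point is that the dependency graph itself is provider-neutral; only the individual operators know which cloud they talk to, which is what makes Airflow workable for multi-cloud orchestration.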
My service is currently deployed on GCP, and all the monitoring & alerts are aggregated to Google Cloud Logging.
Recently I have been planning to introduce a full-text search feature to my service, and Elastic Cloud is an option because it supports deployment on GCP.
However, as far as I can tell, the documentation describes how to import logs from Google Cloud Logging into Elastic Cloud, but there is no description of how to export logs to Google Cloud Logging.
Is there a good way to export logs from ElasticCloud to Google Cloud Logging?
I need to use Elixir/Phoenix and Google Serverless DBs for a realtime Forex App.
So how to connect from Phoenix/Elixir to Google Cloud Spanner or Google Cloud Bigtable? Any link for Doc or tutorial would be appreciated.
This question was first asked on ElixirForum, and the suggested answer was Elixir on GCP; however, I have gone through all the materials at that link and found nothing.
Disclosure: I am a product manager for Google Cloud Bigtable.
There's an Elixir client for Cloud Spanner you can take a look at.
There isn't a Google-written and maintained Elixir client for Cloud Bigtable yet, but there is Elixir support for a number of other products in that same repo.
However, there's a third-party Elixir client for Cloud Bigtable you can explore; please direct any questions to their GitHub repo.
That said, if you're looking for a truly serverless database backend for your app, consider using Google Cloud Firestore, which also has Elixir support. Cloud Spanner and Cloud Bigtable require provisioning dedicated resources, so there will be a minimum charge regardless of usage, while Cloud Firestore provides a pay-as-you-go model that makes it easy to scale from a small app to a large application.
Take a look at the pricing guides for each of these databases for more information:
Cloud Spanner pricing
Cloud Bigtable pricing
Cloud Firestore pricing
I am new to Google Cloud. I want to create my first Cloud Vision and Video Intelligence project but don't know how to start or what the prerequisites are. I have a Windows 10 laptop and have already created my Google Cloud account. Can you please guide me through the setup and how to start development work?
Thanks in advance.
Prerequisites: a Google Cloud Platform project with both the Video Intelligence API and the Vision API enabled. The APIs are the interface between you (or your applications) and the Cloud services. There are multiple options for accessing the APIs:
One option is to install the Cloud SDK on your machine and initialize it, linking it with your project.
Afterwards you can use the Cloud SDK, which is a set of tools for Cloud Platform. It contains gcloud, gsutil, and bq, which you can use to access Google Compute Engine, Google Cloud Storage, Google BigQuery, and other products and services from the command line, either interactively or in automated scripts. It provides the command-line tools to access the Google Cloud Vision and Google Cloud Video Intelligence products.
Another option is to make direct calls to the APIs via their REST endpoints. You can read about the Cloud Vision API here and the Video Intelligence API here, and test them via Google's API Explorer.
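As a concrete illustration of the REST route, the Vision API's `images:annotate` endpoint takes a JSON body containing base64-encoded image bytes plus the features you want. The sketch below builds such a body with only the standard library; the endpoint and field names come from the public API, while the image bytes here are a stand-in for a real file.

```python
import base64
import json

# Public endpoint for image annotation requests.
VISION_ENDPOINT = "https://vision.googleapis.com/v1/images:annotate"


def annotate_request(image_bytes: bytes, feature: str = "LABEL_DETECTION",
                     max_results: int = 5) -> dict:
    """Build the JSON body for POST https://vision.googleapis.com/v1/images:annotate."""
    return {
        "requests": [
            {
                # The API expects image content as a base64 string.
                "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
                # Which analyses to run, e.g. LABEL_DETECTION, TEXT_DETECTION.
                "features": [{"type": feature, "maxResults": max_results}],
            }
        ]
    }


# Stand-in bytes; in practice you would read an actual image file from disk.
body = annotate_request(b"\x89PNG fake image bytes")
print(json.dumps(body)[:80])
```

You would POST this body with an API key or OAuth token attached. Because the shape is plain JSON, the same request works from any language or tool, which makes the REST route a handy way to experiment before committing to a client library.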
We created an application that might be useful to any developer.
We would like to create a deployment in Google Cloud Launcher so that any Google Cloud user can make use of the application via Google Cloud Launcher.
I can't find in-depth documentation on creating a solution in Google Cloud Launcher, nor the steps required to push our application into it.
Basically, we would like to do something like Bitnami does.
What are the steps required to package our application as Google Cloud Launcher solution?
To offer a solution on Cloud Launcher, you can contact the Partner Program here: https://cloud-partners-program.appspot.com/registration/technology