I was trying to execute the Access Analysis sample from the WSO2 CEP samples. After I created the bucket exactly as given in the sample, I received an error and the bucket was not created. Can anyone help me figure out what the problem is?
Error log:
[2012-12-13 15:09:21,654] INFO {org.apache.axis2.deployment.DeploymentEngine} -
org.apache.axis2.deployment.DeploymentException: wrong configuration provided for adding AccessAnalysisBucket.xml
The error appears when I enter a query in the Add Query Expression field. If I leave it empty, the bucket is created.
I am trying to use Terraform with a Google Cloud Storage backend, but I am running into issues when executing it in my CI pipeline.
I have set GOOGLE_APPLICATION_CREDENTIALS to point to my service account JSON key file, but whenever I run terraform init, I get the following errors:
Error loading state: 2 errors occurred:
* writing "gs://[my bucket name]/state/default.tflock" failed: googleapi: Error 403: Access denied., forbidden
* storage: object doesn't exist
I have tried all documented methods of authentication, but still no luck.
Turns out only the second error was actually relevant and there were no authentication issues after all.
My remote backend only contained my custom workspace state files and no default state.
Since terraform init has to run before you can switch to a workspace, it was looking for a default.tfstate/default.tflock file that did not exist.
From my local workstation I initialized the default workspace, which created the file that Terraform was looking for.
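For reference, what I ran locally was roughly the following (bucket and prefix names elided from the backend configuration; the workspace name is hypothetical):

    terraform init                      # creates the default state the backend was missing
    terraform workspace list            # the default workspace now shows up
    terraform workspace select my-ci    # switching to a custom workspace now works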
I wasted a good few hours trying to debug a service account authentication issue that did not exist. I hope this answer can save someone else from that rabbit hole...
I keep getting this error when I try to open the scheduled queries dashboard.
I have a scheduled query that imports aggregated data from another project through a service account.
The import works for a while, but shortly afterwards I get this error.
Error loading location europe: BigQuery Data Transfer service account does not have sufficient permission. Please ask the project owner to disable the BigQuery Data Transfer service and then re-enable it.
Error loading location asia-northeast3: Unknown Error
It looks like I get this error multiple times, for multiple regions. To make it work (for a while) I disabled the relevant API and re-enabled it, but after a while the same errors come back.
I'm not sure whether this is a permission error (i.e. the service account through which the data transfer is made does not have sufficient permissions) or an API problem.
Does anyone know what the issue could be here and how I could test this?
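For reference, the disable/re-enable cycle I describe above comes down to the following (project ID is hypothetical):

    gcloud services disable bigquerydatatransfer.googleapis.com --project=my-project
    gcloud services enable bigquerydatatransfer.googleapis.com --project=my-project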
My goal is to get a new column in Power BI with key phrases based on a column with text data. I am trying to connect the Azure Text Analytics API to Power BI, using this tutorial:
https://learn.microsoft.com/nl-nl/azure/cognitive-services/text-analytics/tutorials/tutorial-power-bi-key-phrases
After I invoke the custom function and set the authentication and privacy levels to "Anonymous" and "Public", the KeyPhrases column contains only "Error" values, with the following description:
An error occurred in the ‘’ query. DataSource.Error: Web.Contents failed to get contents from 'https://******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases' (404): Resource Not Found
Details:
DataSourceKind=Web
DataSourcePath=https://*******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases
Url=https://******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases
Also, I am not sure whether it is related to my issue, but I see the following warning in the Networking menu of my Azure account:
"VNet setting is not supported for current API type or resource location."
I checked all the steps in the tutorial and re-entered the authentication and privacy settings. I also tried the same with the sentiment analysis function. Finally, I tried everything on a different, very simple dataset.
I am not sure what the cause of my issue is or how to solve it.
Any suggestions would be much appreciated.
Best, Rosanne
Look at your error message:
'https://******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases' (404): Resource Not Found
Details:
DataSourceKind=Web
DataSourcePath=https://*******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases
Url=https://******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases
It throws a 404, so you are pointing at the wrong URL.
As you can see at the beginning of your URL:
https://******.cognitiveservices.azure.com/.cognitiveservices.azure.com << ".cognitiveservices.azure.com/" appears twice here, so your URL setup is wrong.
I don't know exactly how it is set up on your side, but you probably provided a region or endpoint somewhere during setup, and that is where the wrong value went in.
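To illustrate the correct form of the endpoint, here is a minimal Java sketch of the same key-phrases call outside Power BI; the resource name my-resource is hypothetical, and the key is read from an environment variable name I made up (TEXT_ANALYTICS_KEY):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class KeyPhrasesCheck {
        public static void main(String[] args) throws Exception {
            // The host must appear exactly once, followed directly by the API path.
            String endpoint = "https://my-resource.cognitiveservices.azure.com";
            String body = "{\"documents\":[{\"id\":\"1\",\"language\":\"en\","
                    + "\"text\":\"Power BI makes reports easy\"}]}";
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(endpoint + "/text/analytics/v2.1/keyPhrases"))
                    .header("Content-Type", "application/json")
                    .header("Ocp-Apim-Subscription-Key", System.getenv("TEXT_ANALYTICS_KEY"))
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }

If this returns 200 while the doubled-host URL returns 404, the problem is purely in how the base URL was entered in the Power BI function.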
Unable to view table contents in TIBCO Spotfire Analytics Explorer.
I have copied the Hive EMR JAR files to the library folder on the TIBCO server. Once the server was up, I configured the data source in Information Designer. I am able to set up the data source and I can see the schema, but when I try to expand a table, I get no results.
This is the error message I am getting:
Error message: An issue occurred while creating the default model. It may be partially constructed.
The data source reported a failure.
InformationModelException at Spotfire.Dxp.Data:
Error retrieving metadata: java.lang.NullPointerException (HRESULT: 80131500)
The following is the URL template in Information Designer
jdbc:hive2://<host>:<port10000>/<database>
which I changed to
jdbc:hive2://sitename:10000/db_name.
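To check the URL outside Spotfire, this is a minimal JDBC sketch exercising the same metadata call that fails; host, database, and credentials are the hypothetical values from above, and it assumes the Hive JDBC driver and its dependencies are on the classpath:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;

    public class HiveMetadataCheck {
        public static void main(String[] args) throws Exception {
            // Same URL as configured in Information Designer.
            String url = "jdbc:hive2://sitename:10000/db_name";
            try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
                // Spotfire fails while retrieving metadata, so exercise the same call here.
                try (ResultSet tables = conn.getMetaData()
                        .getTables(null, null, "%", new String[] {"TABLE"})) {
                    while (tables.next()) {
                        System.out.println(tables.getString("TABLE_NAME"));
                    }
                }
            }
        }
    }

If this also fails with a NullPointerException, the problem is in the driver or the cluster rather than in Spotfire.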
Please let me know what needs to be changed in the driver configuration, or anywhere else, to see the contents of the table.
I am trying to run Data Transformation from the Java API and get the following error message:
Failure while trying to create engine log /Informatica/9.1.0/DataTransformation/CMReports/Init/Events.cme- for more information see file://internal
Could you please advise on the reason for this exception and how to fix it?
It seems that the user running the Informatica workflow through B2B doesn't have permission to create this log file. Please assign the correct permissions to this directory.
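For example, assuming the workflow runs as a hypothetical account named infa_user, something along these lines on the server would grant it access:

    # adjust the owner to the account that actually runs the workflow
    chown -R infa_user /Informatica/9.1.0/DataTransformation/CMReports
    chmod -R u+rwX /Informatica/9.1.0/DataTransformation/CMReports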
Please note that this log file is created by the Informatica DT or UDT transformation; on failure, you have to import this Events.cme file into Informatica DT to spot the error.