Unable to upload bpmn (.bar) file in BPS Server - wso2

I am trying to upload a BPMN process (designed in Eclipse using Activiti) to the BPS server. I have done all the required steps, but the .bar file present in the deployment folder is still not getting deployed on the server.
The error I'm facing is: Removing the faulty archive : Order_approval.bar

There is no way to remove activities from the assembly. The only solution to your problem is to check the log for the problem related to your Web Dynpro DC and then rectify your Web Dynpro DC, or create an activity which will delete your Web Dynpro DC.

Related

Azure Data Factory HDFS dataset preview error

I'm trying to connect to HDFS from Azure Data Factory (ADF). I created a folder and a sample file (ORC format) and put the file in the newly created folder.
Then in ADF I successfully created a linked service for HDFS using my Windows credentials (the same user that was used to create the sample file):
But when I try to browse the data through a dataset:
I'm getting an error: The response content from the data store is not expected, and cannot be parsed.
Is there something I'm doing wrong, or is it some kind of permissions issue?
Please advise.
This appears to be a generic issue: you need to point the dataset to a file with an appropriate extension rather than to the folder itself. Also make sure you are using the data store with a supported activity.
You can follow this official MS doc on using an HDFS server with Azure Data Factory.

Springboot server in Elastic Beanstalk creates files that I can't see

I have a Spring Boot server deployed to an Elastic Beanstalk environment in AWS. The basic functionality is this:
1. Upload a file to the server
2. The server processes file by doing some data manipulation.
3. Then the file that is created is sent to a user via email.
The strange thing is that the functionality mentioned above is working. The output file is sent to my email inbox successfully. However, the file cannot be seen when I SSH into the instance. The entire directory that gets created for the data manipulation is just not there. I have looked everywhere.
To test this, I even created a simple endpoint in my Spring Boot controller, like this:
@GetMapping("/")
public ResponseEntity<String> dummyMethod() {
    // TODO : remove line below after testing
    new File(directoryToCreate).mkdirs();
    return new ResponseEntity<>("Successful health check. Status: 200 - OK", HttpStatus.OK);
}
If I use Postman to hit this endpoint, the directory CANNOT be seen via the terminal I am SSHed into. The program is working, so I know the code is correct in that sense, but it's like the files and directories are invisible to me.
Furthermore, if I were to run the server locally (using Windows OR Linux) and hit this endpoint, the directory is successfully created.
Update:
I found where the app lives in the environment, at /var/app. But my folders and files are still not there; only the source code files, etc. are there. The files that my server is supposed to be creating are still missing. I can even print out the absolute path to the file after creating it, but that file still doesn't exist. Here is an example:
Files.copy(source, dest);
logger.info("Successfully copied file to: {}", dest.getAbsolutePath());
will print...
Successfully copied file to: /tmp/TESTING/Test-Results/GVA_output_2021-12-13 12.32.58/results_map_GVA.csv
That path DOES NOT exist on my server, yet I CAN email that file to myself from the server code after it is processed. But if I SSH into the instance and go to that path, nothing is there.
If I use the command: find . -name "GVA*" (to search for the file I am looking for) then it prints this:
./var/lib/docker/overlay2/fbf04e23e39d61896a1c935748a63f2d3836487d9b166bae490764c30b8870ae/diff/tmp/TESTING/Test-Results/GVA_output_2021-12-09 18.15.59
./var/lib/docker/overlay2/fbf04e23e39d61896a1c935748a63f2d3836487d9b166bae490764c30b8870ae/diff/tmp/TESTING/Test-Results/GVA_output_2021-12-13 12.26.34
./var/lib/docker/overlay2/fbf04e23e39d61896a1c935748a63f2d3836487d9b166bae490764c30b8870ae/diff/tmp/TESTING/Test-Results/GVA_output_2021-12-13 12.32.58
./var/lib/docker/overlay2/fbf04e23e39d61896a1c935748a63f2d3836487d9b166bae490764c30b8870ae/merged/tmp/TESTING/Test-Results/GVA_output_2021-12-09 18.15.59
./var/lib/docker/overlay2/fbf04e23e39d61896a1c935748a63f2d3836487d9b166bae490764c30b8870ae/merged/tmp/TESTING/Test-Results/GVA_output_2021-12-13 12.26.34
./var/lib/docker/overlay2/fbf04e23e39d61896a1c935748a63f2d3836487d9b166bae490764c30b8870ae/merged/tmp/TESTING/Test-Results/GVA_output_2021-12-13 12.32.58
But this looks like Docker's overlay2 storage driver (hence the diff and merged directories in the paths), which would mean the app is running inside a container and writing to the container's filesystem rather than the host's. I just want to find where that file is actually residing.
If you need to store an uploaded file somewhere from a Spring Boot app, look at using an Amazon S3 bucket rather than writing the file to a folder on the server. For example, assume you are working with a photo app and the photos can be uploaded via the Spring Boot app. Instead of placing them in a directory on the server, use the Amazon S3 Java API to store the files in an Amazon S3 bucket.
Here is an example of a Spring Boot app that handles uploaded files by placing them in a bucket:
Creating a dynamic web application that analyzes photos using the AWS SDK for Java
This example app also shows you how to use the SES API to send data (a report, in this example) to a user via email.
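As a rough sketch of that approach (not taken from the linked example; the bucket name, key prefix, and region below are placeholders, and it assumes the AWS SDK for Java v2 is on the classpath), an upload endpoint can stream the incoming bytes straight to S3 instead of the container's filesystem:
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

@RestController
public class S3UploadController {

    // Credentials come from the default provider chain (e.g. the instance role on Beanstalk).
    private final S3Client s3 = S3Client.builder().region(Region.US_EAST_1).build();

    @PostMapping("/upload")
    public ResponseEntity<String> upload(@RequestParam("file") MultipartFile file) throws Exception {
        PutObjectRequest request = PutObjectRequest.builder()
                .bucket("my-example-bucket")                  // placeholder bucket
                .key("uploads/" + file.getOriginalFilename()) // placeholder key prefix
                .build();
        // Send the uploaded bytes directly to S3; nothing is written to the container's disk.
        s3.putObject(request, RequestBody.fromBytes(file.getBytes()));
        return new ResponseEntity<>("Stored in S3", HttpStatus.OK);
    }
}
Since nothing lands on local disk, the Docker overlay question above becomes moot: the processed output lives in the bucket, where you can inspect it from anywhere.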

Issue with uploading GeoLite2-City.mmdb.missing file in mautic

I have a Mautic marketing automation installation on my server (I am a beginner).
I ran into this issue when configuring the GeoLite2-City IP lookup:
Automatically fetching the IP lookup data failed. Download http://geolite.maxmind.com/download/geoip/database/GeoLite2-City.mmdb.gz, extract if necessary, and upload to /home/ol*****/public_html/mautic4/app/cache/prod/../ip_data/GeoLite2-City.mmdb.
What I attempted:
I FTPed into the /home/ol*****/public_html/mautic4/app/cache/prod/../ip_data/ directory
and uploaded the file (the original GeoLite2-City.mmdb is 0 bytes, while the newly added file is about 6000 KB).
However, once I go back into Mautic to implement the lookup, the newly added file reverts back to 0 bytes and I still can't get the IP lookup configured.
I have also changed the file permissions to 0744, but the issue persists.
Did you disable the cron job that looks for the file? If not, or if you clicked the button again in the dashboard, it will overwrite the file you manually placed there.
As a side note, the 2.16 release addresses this issue; please take a look at https://www.mautic.org/blog/community/announcing-mautic-2-16/.
Please ensure you take a full backup (files and database) and, where possible, run the update at the command line to avoid browser timeouts :)

Tibco 7.14 connectivity issues with Hive EMR (AWS) using JDBC driver

Unable to view the table contents in Tibco Spotfire Analytics Explorer.
I have copied the Hive EMR JAR files to the library folder on the Tibco server. Once the server was up, I tried to configure the data source in Information Designer. I am able to set up the data source and I can see the schema. But when I try to expand a table, I do not get any results.
This is the error message I am getting:
Error message: An issue occurred while creating the default model. It may be partially constructed.
The data source reported a failure.
InformationModelException at Spotfire.Dxp.Data:
Error retrieving metadata: java.lang.NullPointerException (HRESULT: 80131500)
The following is the URL template in Information Designer:
jdbc:hive2://<host>:<port10000>/<database>
which I changed to
jdbc:hive2://sitename:10000/db_name
Please let me know what needs to be changed in the driver config, or anywhere else, to see the contents of the table.
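One way to narrow this down (a suggestion, not from the original thread): run a minimal JDBC program outside Spotfire against the same URL, with the same Hive JDBC driver JAR on the classpath. The credentials below are placeholders:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcCheck {
    public static void main(String[] args) throws Exception {
        // Same URL as configured in Information Designer (placeholders).
        String url = "jdbc:hive2://sitename:10000/db_name";
        try (Connection conn = DriverManager.getConnection(url, "hive_user", "password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));  // list each table the driver can see
            }
        }
    }
}
If this also fails, the problem is in the driver or EMR setup rather than in Spotfire; if it works, the failure is specific to how Spotfire retrieves metadata, and a version mismatch between the driver JARs and the Hive version on EMR would be the first thing to check.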

Unable to upload to S3 with Grails S3 Demo Application

I am trying to run a demo project for uploading to S3 with Grails 3.
The project in question is this; more specifically, the S3 upload applies only to the 'Hotel' example at the end.
When I run the project and go to upload the image, I get an 'updated' message but nothing actually happens; there's no inserted URL in the dbconsole table.
I think the issue lies with how I am running the project. I am using the command:
grails -Daws.accessKeyId=XXXXX -Daws.secretKey=XXXXX run-app
(where I am substituting my keys for the X's, obviously).
This method of running the project appears to be slightly different from the method shown in the example. I run my project from the command line and I do not use GGTS, just Sublime.
I have tried inserting my AWS keys into application.yml, but then I receive an internal server error.
Can anyone help me out here?
Check your bucket policy in S3. You need to grant permissions to the API user to allow uploads.
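As a quick way to rule the credentials and bucket permissions in or out (a sketch, not part of the original answer; the bucket name and region are placeholders, and it assumes the AWS SDK for Java v1, which Grails 3-era examples typically used):
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class S3PermissionCheck {
    public static void main(String[] args) {
        // The same keys passed via -Daws.accessKeyId / -Daws.secretKey (placeholders here).
        BasicAWSCredentials creds = new BasicAWSCredentials("XXXXX", "XXXXX");
        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withRegion("us-east-1")  // placeholder region
                .withCredentials(new AWSStaticCredentialsProvider(creds))
                .build();
        // If this succeeds, the keys and bucket policy allow uploads;
        // an AccessDenied error points at the bucket policy or IAM user permissions.
        s3.putObject("my-example-bucket", "permission-check.txt", "hello");
    }
}
If the standalone put succeeds but the Grails upload still silently does nothing, the problem is more likely in how the app reads the system properties than in S3 itself.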