Write csv file to S3 Bucket without storing it locally - amazon-web-services

Previously, I used to create a .csv file, store it on the local system where the application runs, and then send it to the AWS S3 bucket.
Now I don't want to create any file on the application side. I want to write my data directly to the S3 bucket as a .csv file. Is this possible? If so, please guide me on how to do it.

There is a way to achieve this: we have to send our data as an InputStream.
In my case, I made these changes:
[1] I updated my bean class to override the toString() method as follows:
public class Employee {
    String Id;
    String employeeNo;
    String name;
    // getters and setters
    @Override
    public String toString() {
        return "{" + Id + "," + employeeNo + "," + name + "}";
    }
}
[2] I took the list of data and converted it into a List<String[]>:
List<Employee> employeeList = // get employee list
List<String[]> csvData = toStringArray(employeeList);
My toStringArray() returns a List<String[]>:
private List<String[]> toStringArray(List<Employee> employeeList) {
    List<String[]> records = new ArrayList<String[]>();
    // adding header to csv
    records.add(new String[] { "EmployeeId", "EmployeeNo", "Name" });
    // add data to csv
    Iterator<Employee> emp = employeeList.iterator();
    while (emp.hasNext()) {
        Employee data = emp.next();
        records.add(new String[] { data.getId(), data.getEmployeeNo(), data.getName() });
    }
    return records;
}
[3] Now I converted this data to a String:
String strCsvData = writeCsvAsString(csvData);
My writeCsvAsString() method is:
public String writeCsvAsString(List<String[]> csvData) {
    StringWriter s = new StringWriter();
    CSVWriter writer = new CSVWriter(s);
    writer.writeAll(csvData);
    try {
        writer.close();
    } catch (IOException e) {
        logger.error("Failed to close CSV writer", e);
    }
    String finalString = s.toString();
    logger.debug("Actual data:- {}", finalString);
    return finalString;
}
[4] Finally, I converted this String into an InputStream and sent it to the S3 bucket with the contentType metadata set to "text/csv":
InputStream targetStream = new ByteArrayInputStream(strCsvData.getBytes());
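For completeness, here is a minimal sketch of the upload step itself, assuming the AWS SDK for Java v1, an already-configured AmazonS3 client named s3Client, and placeholder bucket/key names (ObjectMetadata and PutObjectRequest come from com.amazonaws.services.s3.model):
ObjectMetadata metadata = new ObjectMetadata();
metadata.setContentType("text/csv");
// setting the length up front lets the SDK stream the data without buffering it
metadata.setContentLength(strCsvData.getBytes().length);
// "my-bucket" and "reports/employees.csv" are placeholder names
s3Client.putObject(new PutObjectRequest("my-bucket", "reports/employees.csv", targetStream, metadata));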

wso2 identity server custom handler reading from properties file

public class UserRegistrationCustomEventHandler extends AbstractEventHandler {

    JSONObject jsonObject = null;
    private static final Log log = LogFactory.getLog(UserRegistrationCustomEventHandler.class);

    @Override
    public String getName() {
        return "customClaimUpdate";
    }

    @Override
    public void handleEvent(Event event) throws IdentityEventException {
        if (IdentityEventConstants.Event.POST_SET_USER_CLAIMS.equals(event.getEventName())) {
            String tenantDomain = (String) event.getEventProperties()
                    .get(IdentityEventConstants.EventProperty.TENANT_DOMAIN);
            String userName = (String) event.getEventProperties().get(IdentityEventConstants.EventProperty.USER_NAME);
            Map<String, Object> eventProperties = event.getEventProperties();
            String eventName = event.getEventName();
            UserStoreManager userStoreManager = (UserStoreManager) eventProperties.get(IdentityEventConstants.EventProperty.USER_STORE_MANAGER);
            // String userStoreDomain = UserCoreUtil.getDomainName(userStoreManager.getRealmConfiguration());
            @SuppressWarnings("unchecked")
            Map<String, String> claimValues = (Map<String, String>) eventProperties.get(IdentityEventConstants.EventProperty.USER_CLAIMS);
            String emailId = claimValues.get("http://wso2.org/claims/emailaddress");
            userName = "USERS/" + userName;
            JSONObject json = new JSONObject();
            json.put("userName", userName);
            json.put("emailId", emailId);
            log.info("JSON:::::::" + json);
            // Sample API
            // String apiValue = "http://192.168.1.X:8080/SomeService/user/updateUserEmail?email=sujith@gmail.com&userName=USERS/sujith";
            try {
                // cityAppUrl should hold the API endpoint (the value I need to read from configuration)
                URL url = new URL(cityAppUrl);
                HttpURLConnection con = (HttpURLConnection) url.openConnection();
                con.setConnectTimeout(5000);
                con.setRequestProperty("Content-Type", "application/json; charset=UTF-8");
                con.setDoOutput(true);
                con.setDoInput(true);
                con.setRequestMethod("POST");
                log.info("CONN:::::::::::::" + con);
                OutputStream os = con.getOutputStream();
                // send the JSON payload to the API
                os.write(json.toString().getBytes("UTF-8"));
                os.close();
                InputStream in = new BufferedInputStream(con.getInputStream());
                String result = org.apache.commons.io.IOUtils.toString(in, "UTF-8");
                jsonObject = new JSONObject(result);
                log.info("JSON OBJECT:::::::::" + jsonObject);
            } catch (MalformedURLException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

    @Override
    public void init(InitConfig configuration) throws IdentityRuntimeException {
        super.init(configuration);
    }

    @Override
    public int getPriority(MessageContext messageContext) {
        return 250;
    }
}
I'm using WSO2 Identity Server 5.10.0 and have to push the updated claim value to an API, so I'm using a custom handler that subscribes to POST_SET_USER_CLAIMS. I have to read the API value from the deployment.toml file in the Java code of the custom handler.
I can fetch the updated claim value in the logs, but I'm not able to get the API value. Can anyone help me read the value from the deployment file?
Since the API path is required inside your custom event handler, let's define the API path value as one of the properties of the event handler.
Add the deployment.toml config as follows.
[[event_handler]]
name= "UserRegistrationCustomEventHandler"
subscriptions =["POST_SET_USER_CLAIMS"]
properties.apiPath = "http://192.168.1.X:8080/SomeService/user/updateUserEmail"
Once you restart the server, the identity-event.properties file is populated with the given configs.
Your custom event handler's Java code then needs to read the config from the identity-event.properties file. The file is read at server startup and every config is loaded into memory.
By adding this to your Java code, you can load the value configured in the property:
configs.getModuleProperties().getProperty("UserRegistrationCustomEventHandler.apiPath")
NOTE: the property name needs to be defined as <event_handler_name>.<property_name>
Here is a reference to such an event handler's property-loading code: https://github.com/wso2-extensions/identity-governance/blob/68e3f2d5e246b6a75f48e314ee1019230c662b55/components/org.wso2.carbon.identity.password.policy/src/main/java/org/wso2/carbon/identity/password/policy/handler/PasswordPolicyValidationHandler.java#L128-L133
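As a minimal sketch of how this could look inside the handler above (the variable names are illustrative; "configs" is the module configuration inherited from AbstractEventHandler, as used in the one-liner above):
@Override
public void handleEvent(Event event) throws IdentityEventException {
    // key format is <event_handler_name>.<property_name>, matching the deployment.toml entry
    String cityAppUrl = this.configs.getModuleProperties()
            .getProperty("UserRegistrationCustomEventHandler.apiPath");
    if (cityAppUrl == null || cityAppUrl.isEmpty()) {
        throw new IdentityEventException("apiPath is not configured for UserRegistrationCustomEventHandler");
    }
    // ... use cityAppUrl when opening the HttpURLConnection, as in the handler above
}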

List Objects from AWS S3 bucket using ASP.NET

I have an S3 bucket which has multiple folders. I want to list S3 objects from specific subfolders. For example, my folder structure is Folder1/Download and Folder1/Upload. I want the objects only from Folder1/Download and not from Upload. The function I'm using is:
public void GetFTPFilesAWS(long ID)
{
    AmazonS3Client amazonS3Client = new AmazonS3Client("", "", Amazon.Region);
    ListObjectsRequest request = new ListObjectsRequest
    {
        BucketName = "",
        //Prefix = "Download",
        //Delimiter = "/"
    };
    var practiceFolderName = "Practice_" + ID;
    string[] businessClaim = { "Folder1/Download/", "Folder2/Download/", };
    foreach (string s in businessClaim)
    {
        _ = practiceFolderName + s;
    }
    ListObjectsResponse response = amazonS3Client.ListObjects(request);
}
I'm getting every object from my bucket with this, and when I try to use the prefix and delimiter it doesn't work.

How to show progress bar while uploading files to amazon s3?

I am uploading files to an Amazon S3 bucket and that works perfectly fine. Now I want to show the progress while uploading them. I have researched everywhere, but nothing works. Below is the code for uploading a file to the Amazon S3 bucket.
On the Spring Boot controller:
private static final String UPLOAD_FILE_URL = "/uploadImages.htm";

@RequestMapping(value = UPLOAD_FILE_URL, method = RequestMethod.POST)
public ModelAndView uploadImagesRequest(@RequestParam("file") MultipartFile[] files, @RequestParam("filename") String title, HttpServletRequest request) {
    ModelAndView model = new ModelAndView("index");
    if (awsUploadImageInterface.uploadImage(files, title, request)) {
        model.addObject("successMsg", "Banner Image Is Added Successfully");
    }
    return model;
}
Upload Image Functionality-
@Override
public boolean uploadImage(MultipartFile[] file, String title, HttpServletRequest request) {
    boolean result = false;
    S3BucketUtility s3client = new S3BucketUtility();
    InputStream is;
    String key;
    Properties prop = new Properties();
    InputStream propstream = getClass().getClassLoader().getResourceAsStream("s3.properties");
    try {
        prop.load(propstream);
    } catch (IOException e) {
        System.out.println("Properties File Exception in AWS Connection Class: " + e);
    }
    try {
        for (int j = 0; j < file.length; j++) {
            if (file[j].getSize() != 0) {
                is = file[j].getInputStream();
                String fileext = FilenameUtils.getExtension(file[j].getOriginalFilename());
                AWSCredentials credentials = new BasicAWSCredentials(prop.getProperty("acceskey"), prop.getProperty("scretkey"));
                String bucketName = prop.getProperty("bucketName");
                key = title.concat(".").concat(fileext);
                ObjectMetadata metadata = new ObjectMetadata();
                metadata.setContentLength(Long.valueOf(is.available()));
                metadata.setContentType("image".concat("/").concat(fileext));
                s3client.uploadfile(credentials, bucketName, key, is, metadata);
            }
        }
        result = true;
    } catch (AmazonClientException e) {
        return result;
    } catch (IOException ex) {
        ex.printStackTrace();
    }
    return result;
}
I hope someone can help me show the progress, as I am unable to figure it out. Thanks in advance.
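One way to surface upload progress with the AWS SDK for Java v1 is to attach a ProgressListener to the PutObjectRequest. The following is a minimal sketch rather than the exact code above: it assumes direct access to an AmazonS3 client instead of the S3BucketUtility wrapper, and the method and variable names are illustrative.
import com.amazonaws.event.ProgressEvent;
import com.amazonaws.event.ProgressListener;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.PutObjectRequest;
import java.io.InputStream;

public void uploadWithProgress(AmazonS3 s3, String bucketName, String key,
                               InputStream is, ObjectMetadata metadata) {
    final long totalBytes = metadata.getContentLength();
    PutObjectRequest request = new PutObjectRequest(bucketName, key, is, metadata);
    request.setGeneralProgressListener(new ProgressListener() {
        private long transferred = 0;
        @Override
        public void progressChanged(ProgressEvent progressEvent) {
            transferred += progressEvent.getBytesTransferred();
            if (totalBytes > 0) {
                int percent = (int) (transferred * 100 / totalBytes);
                // replace with whatever reporting mechanism the UI uses
                System.out.println("Uploaded " + percent + "%");
            }
        }
    });
    s3.putObject(request);
}
The SDK's TransferManager exposes the same kind of progress callbacks if the uploads are moved to it.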

com.amazonaws.SdkClientException: Unable to calculate MD5 hash: /home/bharath/Documents/demo/demo.txt (No such file or directory)

Taking reference from the question linked below:
How to solve 'Client is immutable when created with the builder'?
I have a similar code structure in which I am getting an error saying:
Unable to calculate MD5 hash
public class LambdaFunctionS3 implements RequestHandler<Object, String> {
    @Override
    public String handleRequest(Object input, Context context) {
        context.getLogger().log("Input: " + input);
        String clientRegion = "ap-south-1";
        String bucketName = "Bucket Name";
        String fileObjKeyName = "demo.txt";
        String fileName = "/home/bharath/Documents/demo/demo.txt";
        try {
            System.out.println("Uploading a new file to s3 bucket...");
            File file = new File(fileName);
            AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
                    .withRegion(clientRegion)
                    .withCredentials(new ProfileCredentialsProvider())
                    .build();
            PutObjectRequest request = new PutObjectRequest(bucketName, fileObjKeyName, file);
            ObjectMetadata metadata = new ObjectMetadata();
            metadata.setContentType("text/plain");
            metadata.addUserMetadata("x-amz-meta-title", "myFunction");
            request.setMetadata(metadata);
            s3Client.putObject(request);
            System.out.println("File uploaded.");
            return "Success";
        } catch (AmazonServiceException e) {
            System.err.println(e.getErrorMessage());
            e.printStackTrace();
            System.exit(1);
        } catch (SdkClientException e) {
            e.printStackTrace();
        }
        return "Hello from Lambda!";
    }
}
Even though the file exists at the given path, I get this error. I am using an AWS Lambda function to upload a file to S3.
Any help?
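A likely cause, stated as an assumption rather than a certainty: the handler runs inside AWS Lambda's own environment, where a workstation path such as /home/bharath/Documents/demo/demo.txt does not exist, so the SDK fails while reading the file to compute its MD5 hash. A minimal sketch that uploads content produced in memory (placeholder bucket and key names; ByteArrayInputStream is from java.io) avoids the local-file dependency:
// build the object content in memory instead of reading a file from disk
String content = "hello from lambda";
byte[] bytes = content.getBytes(java.nio.charset.StandardCharsets.UTF_8);

// credentials come from the Lambda execution role, so no profile file is needed
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withRegion("ap-south-1")
        .build();

ObjectMetadata metadata = new ObjectMetadata();
metadata.setContentType("text/plain");
metadata.setContentLength(bytes.length);

// "my-bucket" and "demo.txt" are placeholder names
s3Client.putObject(new PutObjectRequest("my-bucket", "demo.txt",
        new ByteArrayInputStream(bytes), metadata));
If a real file is required inside the function, it has to be written under /tmp first or bundled with the deployment package.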

Illegal query expression: No hash key condition is found in the query in AWS Query

I have a table in AWS Mobile Hub and I am using the following model for it:
public class UserstopcoreDO {
    private String _userId;
    private String _usertoplevel;
    private String _usertopscore;
    private String _username;

    @DynamoDBHashKey(attributeName = "userId")
    @DynamoDBAttribute(attributeName = "userId")
    public String getUserId() {
        return _userId;
    }
    public void setUserId(final String _userId) {
        this._userId = _userId;
    }
    @DynamoDBAttribute(attributeName = "usertoplevel")
    public String getUsertoplevel() {
        return _usertoplevel;
    }
    @DynamoDBAttribute(attributeName = "username")
    public String getUsername() {
        return _username;
    }
    public void setUsername(final String _username) {
        this._username = _username;
    }
    public void setUsertoplevel(final String _usertoplevel) {
        this._usertoplevel = _usertoplevel;
    }
    @DynamoDBIndexHashKey(attributeName = "usertopscore", globalSecondaryIndexName = "usertopscore")
    public String getUsertopscore() {
        return _usertopscore;
    }
    public void setUsertopscore(final String _usertopscore) {
        this._usertopscore = _usertopscore;
    }
}
In the table I have 1500+ records, and now I want to fetch the top 10 records, so I wrote the query below:
final DynamoDBQueryExpression<UserstopcoreDO> queryExpression = new DynamoDBQueryExpression<>();
queryExpression.withLimit(10);
queryExpression.setScanIndexForward(false);
final PaginatedQueryList<UserstopcoreDO> results = mapper.query(UserstopcoreDO.class, queryExpression);
Iterator<UserstopcoreDO> resultsIterator = results.iterator();
if (resultsIterator.hasNext()) {
    final UserstopcoreDO item = resultsIterator.next();
    try {
        Log.d("Item :", item.getUsertopscore());
    } catch (final AmazonClientException ex) {
        Log.e(LOG_TAG, "Failed deleting item : " + ex.getMessage(), ex);
    }
}
But when I run the code it gives me an error:
Caused by: java.lang.IllegalArgumentException: Illegal query expression: No hash key condition is found in the query
In my case I don't want any hash key condition, because I want to fetch the top 10 records rather than one specific record. So how do I handle that?
If you want to "query" DynamoDB without specifying all hash keys, use a scan instead, i.e. DynamoDBScanExpression. You probably also want to change your "usertopscore" to be a range key instead of a hash key.
Per https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/dynamodbv2/datamodeling/DynamoDBQueryExpression.html, every DynamoDBQueryExpression requires all the hash keys to be set.
Also see boto dynamodb2: Can I query a table using range key only?
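A minimal sketch of such a scan with DynamoDBMapper (assuming the mapper, Log, and model class from the question; note a scan reads the whole table and has no server-side ordering, so "top 10" still requires sorting client-side):
DynamoDBScanExpression scanExpression = new DynamoDBScanExpression();
List<UserstopcoreDO> allItems = new ArrayList<>(mapper.scan(UserstopcoreDO.class, scanExpression));

// usertopscore is a String in this model, so this ordering is lexicographic
allItems.sort((a, b) -> b.getUsertopscore().compareTo(a.getUsertopscore()));
for (UserstopcoreDO item : allItems.subList(0, Math.min(10, allItems.size()))) {
    Log.d("Item :", item.getUsertopscore());
}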
Please set the hash key in the query expression. Below are examples of query expressions for the main table and for a GSI (which needs the index name to be set).
Querying the main table:
Set the hash key value of the table.
UserstopcoreDO hashKeyObject = new UserstopcoreDO();
hashKeyObject.setUserId("1");
DynamoDBQueryExpression<UserstopcoreDO> queryExpressionForMainTable = new DynamoDBQueryExpression<UserstopcoreDO>()
        .withHashKeyValues(hashKeyObject);
Querying the index:
Set the index name and hash key value of the index.
UserstopcoreDO hashIndexKeyObject = new UserstopcoreDO();
hashIndexKeyObject.setUsertoplevel("100");
DynamoDBQueryExpression<UserstopcoreDO> queryExpressionForGsi = new DynamoDBQueryExpression<UserstopcoreDO>()
        .withHashKeyValues(hashIndexKeyObject).withIndexName("usertopscore");
GSI attributes in the mapper:
@DynamoDBIndexHashKey(attributeName = "usertoplevel", globalSecondaryIndexName = "usertopscore")
public String getUsertoplevel() {
    return _usertoplevel;
}
@DynamoDBIndexRangeKey(attributeName = "usertopscore", globalSecondaryIndexName = "usertopscore")
public String getUsertopscore() {
    return _usertopscore;
}
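Putting it together for the original goal (top 10 by usertopscore), a sketch of querying the GSI in descending order; it assumes the GSI layout shown just above and the mapper and Log from the question, and uses queryPage so that withLimit(10) caps the returned page instead of lazily loading further pages:
UserstopcoreDO hashIndexKeyObject = new UserstopcoreDO();
hashIndexKeyObject.setUsertoplevel("100");   // GSI hash key value to query within

DynamoDBQueryExpression<UserstopcoreDO> top10Query = new DynamoDBQueryExpression<UserstopcoreDO>()
        .withIndexName("usertopscore")
        .withHashKeyValues(hashIndexKeyObject)
        .withConsistentRead(false)      // GSIs do not support consistent reads
        .withScanIndexForward(false)    // descending by the range key (usertopscore)
        .withLimit(10);

List<UserstopcoreDO> top10 = mapper.queryPage(UserstopcoreDO.class, top10Query).getResults();
for (UserstopcoreDO item : top10) {
    Log.d("Item :", item.getUsertopscore());
}
Since usertopscore is stored as a String, the descending order is lexicographic; storing it as a number would give a true numeric ranking.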