How to set a timeout for the JsonStreamWriter API - google-cloud-platform

I have a Java Spring Boot application that uses the JsonStreamWriter of the Google BigQuery Storage Write API to write data to BigQuery.
I want to time out the stream write if it takes more than a minute to insert into BigQuery.
Here is my sample code:
public void createWriteStream(String table, JSONArray jsonArr)
        throws IOException, Descriptors.DescriptorValidationException, InterruptedException {
    BigQueryWriteClient bqClient = BigQueryWriteClient.create();
    WriteStream stream = WriteStream.newBuilder().setType(WriteStream.Type.COMMITTED).build();
    TableName tableName = TableName.of("ProjectId", "DataSet", table);
    CreateWriteStreamRequest createWriteStreamRequest =
        CreateWriteStreamRequest.newBuilder()
            .setParent(tableName.toString())
            .setWriteStream(stream)
            .build();
    WriteStream writeStream = bqClient.createWriteStream(createWriteStreamRequest);
    JsonStreamWriter jsonStreamWriter = JsonStreamWriter
        .newBuilder(writeStream.getName(), writeStream.getTableSchema())
        .build();
    jsonStreamWriter.append(jsonArr);
}
Does BigQuery provide any such configuration to time out the insert?

Try this code:
public void createWriteStream(String table, JSONArray jsonArr)
        throws IOException, Descriptors.DescriptorValidationException, InterruptedException {
    // Configure a total timeout of one minute on the CreateWriteStream call.
    BigQueryWriteSettings.Builder bigQueryWriteSettingsBuilder = BigQueryWriteSettings.newBuilder();
    bigQueryWriteSettingsBuilder
        .createWriteStreamSettings()
        .setRetrySettings(
            bigQueryWriteSettingsBuilder
                .createWriteStreamSettings()
                .getRetrySettings()
                .toBuilder()
                .setTotalTimeout(Duration.ofMinutes(1)) // org.threeten.bp.Duration
                .build());
    BigQueryWriteSettings bigQueryWriteSettings = bigQueryWriteSettingsBuilder.build();
    BigQueryWriteClient bqClient = BigQueryWriteClient.create(bigQueryWriteSettings);
    WriteStream stream = WriteStream.newBuilder().setType(WriteStream.Type.COMMITTED).build();
    TableName tableName = TableName.of("ProjectId", "DataSet", table);
    CreateWriteStreamRequest createWriteStreamRequest =
        CreateWriteStreamRequest.newBuilder()
            .setParent(tableName.toString())
            .setWriteStream(stream)
            .build();
    WriteStream writeStream = bqClient.createWriteStream(createWriteStreamRequest);
    JsonStreamWriter jsonStreamWriter = JsonStreamWriter
        .newBuilder(writeStream.getName(), writeStream.getTableSchema())
        .build();
    jsonStreamWriter.append(jsonArr);
}
I referred to this documentation for how to implement the BigQueryWriteSettings.
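As a side note, the total timeout above applies to the CreateWriteStream call itself. If you also want to bound each insert, one option (a minimal sketch that slots into the method above, which already declares InterruptedException) is to wait on the ApiFuture returned by append() with a deadline:
import com.google.api.core.ApiFuture;
import com.google.cloud.bigquery.storage.v1.AppendRowsResponse;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

ApiFuture<AppendRowsResponse> future = jsonStreamWriter.append(jsonArr);
try {
    // Block for at most one minute for this append to be acknowledged.
    AppendRowsResponse response = future.get(1, TimeUnit.MINUTES);
} catch (TimeoutException e) {
    // The write did not complete within a minute; cancel and handle it.
    future.cancel(true);
} catch (ExecutionException e) {
    // The append itself failed; the cause carries the API error.
    throw new RuntimeException(e.getCause());
}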

Related

DynamoDB client with auto-refreshing credentials

I'm creating a DynamoDB client using IAM credentials obtained via an STS assume role.
@Provides
public DynamoDbEnhancedClient ddbClientProvider() {
    final AWSSecurityTokenServiceClientBuilder stsClientBuilder = AWSSecurityTokenServiceClientBuilder.standard()
            .withClientConfiguration(new ClientConfiguration());
    final AssumeRoleRequest assumeRoleRequest = new AssumeRoleRequest().withRoleSessionName("some.session.name");
    assumeRoleRequest.setRoleArn("arnRole");
    final AssumeRoleResult assumeRoleResult = stsClientBuilder.build().assumeRole(assumeRoleRequest);
    final Credentials creds = assumeRoleResult.getCredentials();
    final AwsSessionCredentials sessionCredentials = AwsSessionCredentials.create(creds.getAccessKeyId(),
            creds.getSecretAccessKey(), creds.getSessionToken());
    final AwsCredentialsProviderChain credsProvider = AwsCredentialsProviderChain.builder()
            .credentialsProviders(StaticCredentialsProvider.create(sessionCredentials))
            .build();
    final DynamoDbClient ddbClient = DynamoDbClient.builder().region(Region.US_EAST_1)
            .credentialsProvider(credsProvider).build();
    final DynamoDbEnhancedClient ddbEnhancedClient =
            DynamoDbEnhancedClient.builder().dynamoDbClient(ddbClient).build();
    return ddbEnhancedClient;
}
The main lambda handler looks like below:
public class LambdaMainHandler {
    DynamoDbEnhancedClient ddbClient;

    @Inject
    public LambdaMainHandler(final DynamoDbEnhancedClient client) {
        this.ddbClient = client;
    }

    public LambdaResponse processRequest(final LambdaRequest request) {
        QueryResponse queryResponse = ddbClient.query("...");
        return LambdaResponse.builder().setContent(queryResponse.getByteBuffer()).build();
    }
}
I'm using the DDB client in the LambdaMain constructor.
Since this is running in Lambda behind API Gateway, how do I make sure the credentials are refreshed when they expire while the LambdaMain handler is executing?
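One common approach (a sketch, assuming AWS SDK v2 throughout) is to let the SDK refresh the credentials itself: instead of calling assumeRole once and pinning the resulting session credentials in a StaticCredentialsProvider, hand the client a StsAssumeRoleCredentialsProvider, which re-issues the AssumeRole call before the session credentials expire. The role ARN and session name below are the placeholders from the question:
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbEnhancedClient;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.dynamodb.DynamoDbClient;
import software.amazon.awssdk.services.sts.StsClient;
import software.amazon.awssdk.services.sts.auth.StsAssumeRoleCredentialsProvider;
import software.amazon.awssdk.services.sts.model.AssumeRoleRequest;

@Provides
public DynamoDbEnhancedClient ddbClientProvider() {
    StsClient stsClient = StsClient.builder().region(Region.US_EAST_1).build();
    // The provider caches the session credentials and re-assumes the role
    // automatically before they expire, so long-lived clients keep working.
    StsAssumeRoleCredentialsProvider credsProvider = StsAssumeRoleCredentialsProvider.builder()
            .stsClient(stsClient)
            .refreshRequest(AssumeRoleRequest.builder()
                    .roleArn("arnRole")
                    .roleSessionName("some.session.name")
                    .build())
            .build();
    DynamoDbClient ddbClient = DynamoDbClient.builder()
            .region(Region.US_EAST_1)
            .credentialsProvider(credsProvider)
            .build();
    return DynamoDbEnhancedClient.builder().dynamoDbClient(ddbClient).build();
}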

How can I read the result of a Web API call from Dynamics 365?

I'm trying to retrieve a record from Dynamics 365 Sales. I created an app registration in Azure and I can get tokens based on this app.
I can also call the HTTP client, but I couldn't figure out how to read the result of the HTTP call.
Microsoft has published only a WhoAmIRequest sample; I couldn't find samples for other entities.
Here is my sample code; I'm trying to read the body object.
try
{
    string serviceUrl = "https://****.crm4.dynamics.com/";
    string clientId = "******";
    string clientSecret = "*******";
    string tenantId = "*******";
    A***.Library.Utility.MSCRM mscrm = new Library.Utility.MSCRM(serviceUrl, clientId, clientSecret, tenantId);
    var token = await mscrm.GetTokenAsync();
    Console.WriteLine(token);
    using (HttpClient client = new HttpClient())
    {
        client.BaseAddress = new Uri(serviceUrl);
        client.Timeout = new TimeSpan(0, 2, 0); // 2 minutes
        client.DefaultRequestHeaders.Add("OData-MaxVersion", "4.0");
        client.DefaultRequestHeaders.Add("OData-Version", "4.0");
        client.DefaultRequestHeaders.Accept.Add(
            new MediaTypeWithQualityHeaderValue("application/json"));
        HttpRequestMessage request = new HttpRequestMessage(HttpMethod.Get, "/api/data/v9.0/accounts");
        // Set the access token
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", token);
        HttpResponseMessage response = client.SendAsync(request).Result;
        if (response.IsSuccessStatusCode)
        {
            // Get the response content and parse it.
            string responseStr = response.Content.ReadAsStringAsync().Result;
            JObject body = JObject.Parse(responseStr);
        }
    }
}
catch (Exception e)
{
    Console.WriteLine(e.Message);
}
Here is the result of the body object.
You can use either of these syntaxes to read values. Read more:
JObject body = JObject.Parse(response.Content.ReadAsStringAsync().Result);
// Use either the indexer or the GetValue method (or a mix of the two)
body.GetValue("obs_detailerconfigid");
body["obs_detailerconfigid"];

How to send an attachment via email using SES and Lambda in Java

I am trying to send files stored in S3 via email using AWS SES and Lambda in Java. The mail is sent successfully, but when I try to open the file I get the error: "Excel cannot open the file 'filename.xlsx' because the file format or file extension is not valid. Verify that the file has not been corrupted and that the file extension matches the format of the file."
Code snippet in Java:
public class EmailNotification {
    static LambdaLogger logger = null;
    private static String SENDER = null;
    private static String RECIPIENT = null;
    private static String SUBJECT = null;
    private static String BODY_TEXT = null;
    private static String filekey = null;

    public void verifyEmailNotification(Context context) throws Exception {
        try {
            logger = context.getLogger();
            String bucket_name = "bucket_name";
            String key_name = "path/";
            String file_name = "file_name.xlsx";
            filekey = key_name + file_name;
            AmazonS3 s3client = GetAWSClients.getS3();
            boolean isFileExists = Utility.checkIfFileExists(bucket_name, filekey);
            logger.log("isFileExists " + isFileExists);
            if (isFileExists)
                filekey = key_name + file_name;
            else
                logger.log("file not available");
            InputStream stream = s3client.getObject(bucket_name, filekey).getObjectContent();
            XSSFWorkbook workbook = new XSSFWorkbook(stream);
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            workbook.write(bos);
            ByteArrayInputStream contentsAsStream = new ByteArrayInputStream(bos.toByteArray());
            ObjectMetadata md = new ObjectMetadata();
            md.setContentType("application/vnd.openxmlformats-officedocument.spreadsheetml.sheet");
            md.setSSEAlgorithm(ObjectMetadata.AES_256_SERVER_SIDE_ENCRYPTION);
            md.setContentLength(bos.toByteArray().length);
            SENDER = "sender email";
            RECIPIENT = "receiver email";
            SUBJECT = "subject";
            BODY_TEXT = "Body text";
            Session session = Session.getDefaultInstance(new Properties());
            MimeMessage message = new MimeMessage(session);
            message.setSubject(SUBJECT);
            message.setFrom(new InternetAddress(SENDER));
            message.setRecipients(Message.RecipientType.TO, InternetAddress.parse(RECIPIENT));
            MimeMultipart msg_body = new MimeMultipart("alternative");
            MimeBodyPart textPart = new MimeBodyPart();
            textPart.setContent(BODY_TEXT, "text/plain; charset=UTF-8");
            msg_body.addBodyPart(textPart);
            MimeBodyPart att = new MimeBodyPart();
            DataSource fds = new ByteArrayDataSource(bos.toByteArray(), "application/octet-stream");
            att.setDataHandler(new DataHandler(fds));
            att.setFileName(file_name);
            message.setContent(msg_body);
            msg_body.addBodyPart(att);
            try {
                System.out.println("Attempting to send an email through Amazon SES "
                        + "using the AWS SDK for Java...");
                AmazonSimpleEmailService client = GetAWSClients.getSES();
                ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
                message.writeTo(outputStream);
                RawMessage rawMessage = new RawMessage(ByteBuffer.wrap(outputStream.toByteArray()));
                SendRawEmailRequest rawEmailRequest = new SendRawEmailRequest(rawMessage);
                client.sendRawEmail(rawEmailRequest);
                System.out.println("Email sent!");
                workbook.close();
            } catch (Exception ex) {
                System.out.println("Email Failed");
                System.err.println("Error message: " + ex.getMessage());
                ex.printStackTrace();
            }
        } catch (Exception ex) {
            System.out.println("Email Failed");
            System.err.println("Error message: " + ex.getMessage());
            ex.printStackTrace();
        }
    }
}
I used the POI workbook approach and resolved the issue:
public void verifyEmailNotification(Context context) throws IOException, MessagingException {
    String bucketName = "bucketName";
    String key = "prefix";
    String fileName = "fileName.xlsx";
    String keyName = key + fileName;
    try {
        SENDER = "sender email";
        RECIPIENT = "receiver email";
        SUBJECT = "Subject msg";
        BODY_TEXT = "Hi";
        logger = context.getLogger();
        AmazonS3Client s3Client = (AmazonS3Client) AmazonS3ClientBuilder.defaultClient();
        InputStream stream = s3Client.getObject(new GetObjectRequest(bucketName, keyName)).getObjectContent();
        // Read the input stream as a spreadsheet.
        XSSFWorkbook workbook = new XSSFWorkbook(stream);
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        workbook.write(bos);
        logger.log("Read success\n");
        Session session = Session.getDefaultInstance(new Properties());
        MimeMessage message = new MimeMessage(session);
        message.setSubject(SUBJECT);
        message.setFrom(new InternetAddress(SENDER));
        message.setRecipients(Message.RecipientType.TO, InternetAddress.parse(RECIPIENT));
        MimeMultipart msg_body = new MimeMultipart("alternative");
        MimeBodyPart textPart = new MimeBodyPart();
        textPart.setContent(BODY_TEXT, "text/plain; charset=UTF-8");
        msg_body.addBodyPart(textPart);
        // Prepare the attachment from the spreadsheet bytes.
        MimeBodyPart att = new MimeBodyPart();
        DataSource fds = new ByteArrayDataSource(bos.toByteArray(), "application/octet-stream");
        att.setDataHandler(new DataHandler(fds));
        att.setFileName(fileName);
        msg_body.addBodyPart(att);
        message.setContent(msg_body);
        logger.log("attachment prepared\n");
        AmazonSimpleEmailService client = GetAWSClients.getSES();
        ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
        message.writeTo(outputStream);
        RawMessage rawMessage = new RawMessage(ByteBuffer.wrap(outputStream.toByteArray()));
        SendRawEmailRequest rawEmailRequest = new SendRawEmailRequest(rawMessage);
        client.sendRawEmail(rawEmailRequest);
        System.out.println("Email sent!");
    } catch (AmazonServiceException e) {
        // The call was transmitted successfully, but Amazon S3 couldn't process
        // it, so it returned an error response.
        e.printStackTrace();
    } catch (SdkClientException e) {
        // Amazon S3 couldn't be contacted for a response, or the client
        // couldn't parse the response from Amazon S3.
        e.printStackTrace();
    }
}
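A side note on the MIME structure: "alternative" tells mail clients that the parts are interchangeable renderings of the same content, whereas a text body plus an attachment is conventionally a "mixed" multipart. A minimal sketch of that variant, reusing the parts built above:
// "mixed" is the conventional subtype when combining a body part with attachments.
MimeMultipart msg_body = new MimeMultipart("mixed");
msg_body.addBodyPart(textPart);
msg_body.addBodyPart(att);
message.setContent(msg_body);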

How to show a progress bar while uploading files to Amazon S3?

I am uploading files to an Amazon S3 bucket and that is working perfectly fine. Now I want to show the progress while uploading them; I have researched everywhere but nothing works. Below is the code for uploading a file to the Amazon S3 bucket.
In the Spring Boot controller:
private static final String UPLOAD_FILE_URL = "/uploadImages.htm";

@RequestMapping(value = UPLOAD_FILE_URL, method = RequestMethod.POST)
public ModelAndView uploadImagesRequest(@RequestParam("file") MultipartFile[] files,
        @RequestParam("filename") String title, HttpServletRequest request) {
    ModelAndView model = new ModelAndView("index");
    if (awsUploadImageInterface.uploadImage(files, title, request)) {
        model.addObject("successMsg", "Banner Image Is Added Successfully");
    }
    return model;
}
Upload image functionality:
@Override
public boolean uploadImage(MultipartFile[] file, String title, HttpServletRequest request) {
    boolean result = false;
    S3BucketUtility s3client = new S3BucketUtility();
    InputStream is;
    String key;
    Properties prop = new Properties();
    InputStream propstream = getClass().getClassLoader().getResourceAsStream("s3.properties");
    try {
        prop.load(propstream);
    } catch (IOException e) {
        System.out.println("Properties File Exception in AWS Connection Class: " + e);
    }
    try {
        for (int j = 0; j < file.length; j++) {
            if (file[j].getSize() != 0) {
                is = file[j].getInputStream();
                String fileext = FilenameUtils.getExtension(file[j].getOriginalFilename());
                AWSCredentials credentials = new BasicAWSCredentials(prop.getProperty("acceskey"), prop.getProperty("scretkey"));
                String bucketName = prop.getProperty("bucketName");
                key = title.concat(".").concat(fileext);
                ObjectMetadata metadata = new ObjectMetadata();
                metadata.setContentLength(Long.valueOf(is.available()));
                metadata.setContentType("image".concat("/").concat(fileext));
                s3client.uploadfile(credentials, bucketName, key, is, metadata);
                result = true;
            }
        }
    } catch (AmazonClientException e) {
        return result;
    } catch (IOException ex) {
        ex.printStackTrace();
    }
    return result;
}
I hope someone can help me show the progress, as I have been unable to do so. Thanks in advance.
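For what it's worth, a minimal sketch of one common approach with the AWS SDK for Java v1: upload through TransferManager and attach a ProgressListener. The bucket name, key, and the uploadWithProgress method below are placeholders, not part of the code above:
import com.amazonaws.event.ProgressEvent;
import com.amazonaws.event.ProgressListener;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.PutObjectRequest;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.TransferManagerBuilder;
import com.amazonaws.services.s3.transfer.Upload;
import java.io.InputStream;

public void uploadWithProgress(InputStream is, ObjectMetadata metadata) throws InterruptedException {
    AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
    TransferManager tm = TransferManagerBuilder.standard().withS3Client(s3).build();
    // "my-bucket" and "my-key" are placeholders.
    Upload upload = tm.upload(new PutObjectRequest("my-bucket", "my-key", is, metadata));
    // Fires as bytes are transferred; surface these numbers to the UI
    // (for example via a WebSocket or a polled endpoint) to drive a progress bar.
    upload.addProgressListener(new ProgressListener() {
        @Override
        public void progressChanged(ProgressEvent progressEvent) {
            System.out.printf("%.1f%% transferred%n",
                    upload.getProgress().getPercentTransferred());
        }
    });
    upload.waitForCompletion();
    tm.shutdownNow(false); // keep the underlying S3 client alive
}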

Exception when trying to connect to AWS Athena using the Java API

I'm trying to execute a query in AWS Athena using the Java API:
public class AthenaClientFactory
{
    String accessKey = "access";
    String secretKey = "secret";
    BasicAWSCredentials awsCredentials = new BasicAWSCredentials(accessKey, secretKey);

    private final AmazonAthenaClientBuilder builder = AmazonAthenaClientBuilder.standard()
            .withRegion(Regions.US_WEST_1)
            .withCredentials(new AWSStaticCredentialsProvider(awsCredentials))
            .withClientConfiguration(new ClientConfiguration().withClientExecutionTimeout(10));

    public AmazonAthena createClient()
    {
        return builder.build();
    }
}

private static String submitAthenaQuery(AmazonAthena client) {
    QueryExecutionContext queryExecutionContext = new QueryExecutionContext().withDatabase("my_db");
    ResultConfiguration resultConfiguration = new ResultConfiguration().withOutputLocation("my_bucket");
    StartQueryExecutionRequest startQueryExecutionRequest = new StartQueryExecutionRequest()
            .withQueryString("select * from my_db limit 3;")
            .withQueryExecutionContext(queryExecutionContext)
            .withResultConfiguration(resultConfiguration);
    StartQueryExecutionResult startQueryExecutionResult = client.startQueryExecution(startQueryExecutionRequest);
    return startQueryExecutionResult.getQueryExecutionId();
}

public void run() throws InterruptedException {
    AthenaClientFactory factory = new AthenaClientFactory();
    AmazonAthena client = factory.createClient();
    String queryExecutionId = submitAthenaQuery(client);
}
But I get an exception from the startQueryExecution call.
The exception is:
Client execution did not complete before the specified timeout configuration.
Has anyone encountered something similar?
The problem was in withClientExecutionTimeout(10): this value is in milliseconds, so 10 ms is far too short.
Increasing it to 5000 solved the issue.
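A sketch of the corrected builder line (everything else stays as in the question):
private final AmazonAthenaClientBuilder builder = AmazonAthenaClientBuilder.standard()
        .withRegion(Regions.US_WEST_1)
        .withCredentials(new AWSStaticCredentialsProvider(awsCredentials))
        // withClientExecutionTimeout takes milliseconds
        .withClientConfiguration(new ClientConfiguration().withClientExecutionTimeout(5000));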