I am trying to use a prepared SELECT to get data from MySQL, because I think it is faster than a regular SELECT.
This is the SELECT statement:
char *sql = "select id,d1,d2,d3,d4,d5 from pricelist where d1 > ? limit 1000000";
The columns id, d2, and d3 are unsigned int, and the others (d1, d4, d5) are __int64.
I wrote my code for the prepared statement like this:
stmt = mysql_stmt_init(conn);
mysql_stmt_prepare(stmt, sql, strlen(sql));
// Select
param[0].buffer_type = MYSQL_TYPE_LONG;
param[0].buffer = (void *) &myId;
param[0].is_unsigned = 1;
param[0].is_null = 0;
param[0].length = 0;
// Result
result[0].buffer_type = MYSQL_TYPE_LONG;
result[0].buffer = (void *) &id;
result[0].is_unsigned = 1;
result[0].is_null = &is_null[0];
result[0].length = 0;
result[1].buffer_type = MYSQL_TYPE_LONGLONG;
result[1].buffer = (void *) &d1;
result[1].is_unsigned = 1;
result[1].is_null = &is_null[0];
result[1].length = 0;
result[2].buffer_type = MYSQL_TYPE_LONG;
result[2].buffer = (void *) &d2;
result[2].is_unsigned = 1;
result[2].is_null = &is_null[0];
result[2].length = 0;
result[3].buffer_type = MYSQL_TYPE_LONG;
result[3].buffer = (void *) &d3;
result[3].is_unsigned = 1;
result[3].is_null = &is_null[0];
result[3].length = 0;
result[4].buffer_type = MYSQL_TYPE_LONGLONG;
result[4].buffer = (void *) &d4;
result[4].is_unsigned = 1;
result[4].is_null = &is_null[0];
result[4].length = 0;
result[5].buffer_type = MYSQL_TYPE_LONGLONG;
result[5].buffer = (void *) &d5;
result[5].is_unsigned = 1;
result[5].is_null = &is_null[0];
result[5].length = 0;
mysql_stmt_bind_param(stmt, param);
mysql_stmt_bind_result(stmt, result);
mysql_stmt_execute(stmt);
mysql_stmt_store_result(stmt);
while(mysql_stmt_fetch (stmt) == 0){
}
and my code for the regular select is like this:
mysql_query(conn,"select id ,d1,d2,d3,d4,d5 from pricebook where us > 12 limit 1000000");
result = mysql_use_result(conn);
while (mysql_fetch_row(result)){
}
I run these two functions from a remote PC and measure the time for each one; the duration is the same for both, about 6 seconds.
When I check the pcap files, I see that the volume of data sent for the prepared query is about the same as for the regular query, even though I thought the prepared statement would compress the data.
$ capinfos prepared.pcap regular.pcap
File name: prepared.pcap
File type: Wireshark - pcapng
File encapsulation: Ethernet
Packet size limit: file hdr: (not set)
Number of packets: 40 k
File size: 53 MB
Data size: 52 MB
Capture duration: 6 seconds
Start time: Thu Aug 22 09:41:54 2013
End time: Thu Aug 22 09:42:00 2013
Data byte rate: 8820 kBps
Data bit rate: 70 Mbps
Average packet size: 1278.63 bytes
Average packet rate: 6898 packets/sec
SHA1: 959e589b090e3354d275f122a6fe6fbcac2351df
RIPEMD160: 7db6a437535d78023579cf3426c4d88d8ff3ddc3
MD5: 888729dc4c09baf736df22ef34bffeda
Strict time order: True
File name: regular.pcap
File type: Wireshark - pcapng
File encapsulation: Ethernet
Packet size limit: file hdr: (not set)
Number of packets: 38 k
File size: 50 MB
Data size: 49 MB
Capture duration: 6 seconds
Start time: Thu Aug 22 09:41:05 2013
End time: Thu Aug 22 09:41:11 2013
Data byte rate: 7740 kBps
Data bit rate: 61 Mbps
Average packet size: 1268.65 bytes
Average packet rate: 6101 packets/sec
SHA1: badf2040d826e6b0cca089211ee559a7c8a29181
RIPEMD160: 68f3bb5d4fcfd640f2da9764ff8e9891745d4800
MD5: 4ab73a02889472dfe04ed7901976a48c
Strict time order: True
Is it normal that the duration is the same, or am I not using the prepared select correctly?
How can I improve it?
Thanks.
The database server executes prepared statements and regular statements with the same speed. The performance difference comes when you execute the same query with different parameters: a prepared statement is parsed and prepared for execution once and then can be executed cheaply with different parameters, while a regular statement has to be parsed every time you want to execute it.
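For illustration, here is a minimal sketch (not the poster's code; the variable names and values are made up, and error handling is omitted) of preparing one statement once and executing it several times with different parameter values, which is where the one-time preparation pays off:

/* Minimal sketch: the SQL text is parsed by the server only once, then the
   statement is re-executed with different parameter values. Assumes
   <mysql.h> and <string.h> are included and `conn` is an open MYSQL *. */
void run_prepared_example(MYSQL *conn)
{
    const char *sql = "select id from pricelist where d1 > ?";
    MYSQL_STMT *stmt = mysql_stmt_init(conn);
    mysql_stmt_prepare(stmt, sql, strlen(sql));

    unsigned long long threshold = 0;            /* bound to the single '?' */
    MYSQL_BIND param[1];
    memset(param, 0, sizeof(param));             /* MYSQL_BIND must be zeroed */
    param[0].buffer_type = MYSQL_TYPE_LONGLONG;
    param[0].buffer      = (void *) &threshold;
    param[0].is_unsigned = 1;
    mysql_stmt_bind_param(stmt, param);

    unsigned int id = 0;                         /* receives the id column */
    MYSQL_BIND result[1];
    memset(result, 0, sizeof(result));
    result[0].buffer_type = MYSQL_TYPE_LONG;
    result[0].buffer      = (void *) &id;
    result[0].is_unsigned = 1;
    mysql_stmt_bind_result(stmt, result);

    /* Re-execute with different values: only the parameter value travels to
       the server on each execution, not the SQL text. */
    unsigned long long values[3] = { 10, 100, 1000 };
    for (int i = 0; i < 3; i++) {
        threshold = values[i];
        mysql_stmt_execute(stmt);
        while (mysql_stmt_fetch(stmt) == 0) {
            /* process id here */
        }
        mysql_stmt_free_result(stmt);
    }
    mysql_stmt_close(stmt);
}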
We are programmatically creating PDF using our in house lib (C++) by adding all the required objects so that PDF readers can render them properly.
Currently we are enhancing the lib to support digital signatures in PDF. Our users will use USB token or Windows certificates to sign the PDF.
On studying a raw PDF file with a digital signature, we were able to make sense of all the objects except for the contents of the Sig-type object.
18 0 obj
<<
/Type /Sig
/Filter /Adobe.PPKLite --> signature handler for authenticating the fields contents
/SubFilter /adbe.pkcs7.sha1 --> submethod of the handler
/Contents <....> --> signature token
/ByteRange [ 0 101241 105931 7981
] --> byte range for digest calc
/M (D:20210505094125+05'30') --> timestamp
/Reason ()
/Location ()
/ContactInfo ()
>>
endobj
We have referred to
https://www.adobe.com/devnet-docs/acrobatetk/tools/DigSigDC/Acrobat_DigitalSignatures_in_PDF.pdf
to understand what constitutes the signature token.
We need direction on how to programmatically create the signature token for PDF using windows APIs. Currently we are not looking at 3rd party lib solutions.
Thanks in advance.
Update
We tried the following:
Updated our in-house PDF lib to support incremental updates so that digital-signing-related objects can be added. We added the following, in addition to obj 18 mentioned above:
16 0 obj --> new Acroform obj
<<
/Fields [ 17 0 R ]
/SigFlags 3
>>
endobj
2 0 obj --> Updating root to add AcroForm
<<
/Type /Catalog
/Pages 3 0 R
/AcroForm 16 0 R
>>
endobj
17 0 obj --> new obj for signature field
<<
/T (SignatureField1)
/Type /Annot
/Subtype /Widget
/FT /Sig
/F 4
/Rect [ 270 159 503 201 ] --> field position. this will have image of sign
/P 5 0 R
/V 18 0 R
/AP <<
/N 19 0 R
>>
>>
endobj
5 0 obj --> updating existing page obj with Annots
<<
/Type /Page
/Parent 3 0 R
/MediaBox [ 0 0 595 841 ]
/Resources 4 0 R
/Contents 6 0 R
/Annots [ 17 0 R ]
>>
endobj
18 0 obj
<<
/Type /Sig
/Filter /Adobe.PPKLite
/SubFilter /adbe.pkcs7.sha1 --> we tried with adbe.pkcs7.detached as well
/Contents <> --> updated contents using windows APIs
/ByteRange [ 0 100381 102645 7322
] --> updated ByteRange with right offsets and lengths
/M (D:20210610125837+05'30') --> sign verified time
/Reason ()
/Location ()
/ContactInfo ()
>>
endobj
19 0 obj --> new obj
<<
/Length 7
/BBox [ 0 0 233 42 ]
/Type /XObject
/Subtype /Form
/Resources <<
/XObject <<
/FRM 20 0 R
>>
>>
>>
stream
/FRM Do
endstream
endobj
20 0 obj --> new obj for image manipulation
<<
/Length 29
/Type /XObject
/Subtype /Form
/Resources <<
/XObject <<
/Im1 21 0 R
>>
>>
/BBox [ 0 0 233 42 ]
>>
stream
q 233 0 0 42 0 0 cm /Im1 Do Q
endstream
endobj
21 0 obj --> image obj which contains sign info. Generated by us
<<
/Length 6166
/Type /XObject
/Subtype /Image
/Width 372
/Height 82
/ColorSpace /DeviceRGB
/BitsPerComponent 8
/Filter /DCTDecode
>>
stream
---------------------------------> image stream
endstream
endobj
xref --> updated xref
0 1
0000000000 65535 f
2 1
0000099954 00000 n
5 1
0000100172 00000 n
16 6
0000099901 00000 n
0000100020 00000 n
0000100297 00000 n
0000102944 00000 n
0000103096 00000 n
0000103271 00000 n
trailer --> updated trailer
<<
/Root 2 0 R
/Info 1 0 R
/Size 22
/ID [ <982AAACB948CE1AD9FDD976D177BF316> <982AAACB948CE1AD9FDD976D177BF316> ]
--> ID generated via windows API
/Prev 99491
>>
startxref
109605
%%EOF
For contents data, we used the below API:
bool SignMessageBySubjectName (BytePtr pMessage, ULong pMessageSize, StrPtr pSubjectName, CRYPT_DATA_BLOB * pSignBlob)
{
HCERTSTORE store_handle = NULL;
PCCERT_CONTEXT cert_context = NULL;
BYTE * signed_blob = NULL;
ULong signed_blob_size;
ULong message_size;
CRYPT_SIGN_MESSAGE_PARA signature_params;
BYTE * message;
pSignBlob->cbData = 0;
pSignBlob->pbData = NULL;
message = (BYTE *) pMessage;
message_size = (pMessageSize + 1) * sizeof(Char); //Size in bytes
const BYTE * message_array[] = {message};
DWORD message_array_size[1];
message_array_size[0] = message_size;
store_handle = CertOpenStore(CERT_STORE_PROV_SYSTEM, 0, NULL,
CERT_SYSTEM_STORE_CURRENT_USER, L"MY");
cert_context = CertFindCertificateInStore( store_handle, PKCS_7_ASN_ENCODING | X509_ASN_ENCODING, 0,
CERT_FIND_SUBJECT_STR, pSubjectName, NULL);
signature_params.cbSize = sizeof(CRYPT_SIGN_MESSAGE_PARA);
signature_params.dwMsgEncodingType = PKCS_7_ASN_ENCODING | X509_ASN_ENCODING;
signature_params.pSigningCert = cert_context;
signature_params.HashAlgorithm.pszObjId = szOID_RSA_SHA1RSA;
signature_params.HashAlgorithm.Parameters.cbData = NULL;
signature_params.cMsgCert = 1;
signature_params.rgpMsgCert = &cert_context;
signature_params.cAuthAttr = 0;
signature_params.dwInnerContentType = 0;
signature_params.cMsgCrl = 0;
signature_params.cUnauthAttr = 0;
signature_params.dwFlags = 0;
signature_params.pvHashAuxInfo = NULL;
signature_params.rgAuthAttr = NULL;
//Get size of signed message
CryptSignMessage(&signature_params, TRUE, 1, message_array, message_array_size,NULL, &signed_blob_size);
signed_blob = (BYTE *) Malloc(signed_blob_size);
CryptSignMessage(&signature_params, TRUE, 1, message_array, message_array_size, signed_blob, &signed_blob_size);
pSignBlob->cbData = signed_blob_size;
pSignBlob->pbData = signed_blob;
CertFreeCertificateContext(cert_context);
CertCloseStore(store_handle, CERT_CLOSE_STORE_FORCE_FLAG);
return true;
}
When using CryptSignMessage() with the detached parameter set to TRUE, we get a signature token of around 850 bytes, which we convert to hex and add to the Contents part; it comes to approximately 1700 characters.
For the image used in the newly added field, we generated our own image and added it as a PDF object.
For the ID in the trailer, we generated it using the Bcrypt.lib API (BCryptGenRandom()), converted its output to hex, and updated the ID entry.
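As an aside, a minimal sketch of that /ID step, assuming the goal is 16 random bytes hex-encoded (the function name is made up; error handling and linking against bcrypt.lib are omitted):

#include <windows.h>
#include <bcrypt.h>
#include <string>

// Sketch only: produce a 32-character hex string for one entry of the
// trailer's /ID array from 16 random bytes, using the system-preferred RNG.
std::string GenerateFileIdHex()
{
    unsigned char random_bytes[16] = {0};
    BCryptGenRandom(NULL, random_bytes, sizeof(random_bytes),
                    BCRYPT_USE_SYSTEM_PREFERRED_RNG);

    static const char hex_digits[] = "0123456789ABCDEF";
    std::string hex;
    for (size_t i = 0; i < sizeof(random_bytes); ++i) {
        hex += hex_digits[random_bytes[i] >> 4];
        hex += hex_digits[random_bytes[i] & 0x0F];
    }
    return hex;   // used as <hex> inside /ID [ <...> <...> ]
}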
Listing out the steps we did:
We generated 2 buffers. Both buffers are identical with respect to all the PDF objects required, the ID generated from BCryptGenRandom() and ByteRange array updated with actual values. buffer1 has contents data as 0s for a definite length acting as a placeholder. buffer2 has empty contents data (/Contents <>)
buffer2 will be passed onto CryptSignMessage() to generate the sign token. This will be converted to hex.
The hex sign token will be added to contents part of buffer1 replacing the 0s based on its length.
buffer1 will be written to a PDF file.
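As an illustration of the third step above, here is a rough sketch of hex-encoding the signature blob and splicing it over the zero placeholder (names such as pdfBuffer and contentsValueOffset are made up for the example, and the placeholder is assumed to be large enough; any leftover zeros simply remain as padding):

#include <windows.h>
#include <wincrypt.h>
#include <string>

// Sketch: write the signature blob as uppercase hex over the "000...0"
// placeholder between '<' and '>' in /Contents of buffer1.
void SpliceSignatureHex(std::string & pdfBuffer,          // buffer1 from the steps above
                        size_t contentsValueOffset,       // index just after the '<'
                        const CRYPT_DATA_BLOB & signBlob)
{
    static const char hexDigits[] = "0123456789ABCDEF";
    for (DWORD i = 0; i < signBlob.cbData; ++i) {
        unsigned char b = signBlob.pbData[i];
        pdfBuffer[contentsValueOffset + 2 * i]     = hexDigits[b >> 4];
        pdfBuffer[contentsValueOffset + 2 * i + 1] = hexDigits[b & 0x0F];
    }
}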
When we did all this and opened the PDF in readers, we got errors like:
Signature is invalid
Document has been corrupted or altered since the signature was applied.
But with these errors too, the reader was able to identify the user, certificate, hash algorithm and signature algorithm used.
We think we need to somehow add timestamp data as part of the sign token to avoid this error, or perhaps we have missed something else.
Please find a sample PDF here: https://drive.google.com/file/d/1Udog4AmGoq2ls3Tu3Wq5s2xU9LxaI3fH/view?usp=sharing
Kindly help us solve this issue. Thanks in advance.
We used a different set of APIs to make this work.
Pasting the code here:
bool SignatureHandler::SignMessageTest (BytePtr pMessage, ULong pMessageSize, StrPtr pSubjectName, CRYPT_DATA_BLOB * pSignBlob, LPSTR pOid, DWORD pFlag, DWORD pType)
{
    HCERTSTORE store_handle = NULL;
    PCCERT_CONTEXT cert_context = NULL;
    BYTE * signed_blob = NULL;
    ULong signed_blob_size = 0;
    CRYPT_SIGN_MESSAGE_PARA signature_params;
    BYTE * message;
    BOOL rc;

    pSignBlob->cbData = 0;
    pSignBlob->pbData = NULL;

    store_handle = CertOpenStore (CERT_STORE_PROV_SYSTEM, 0, NULL, CERT_SYSTEM_STORE_CURRENT_USER, L"MY");
    cert_context = CertFindCertificateInStore (store_handle, (PKCS_7_ASN_ENCODING | X509_ASN_ENCODING), 0, CERT_FIND_SUBJECT_STR, pSubjectName, NULL);

    HCRYPTPROV_OR_NCRYPT_KEY_HANDLE a = 0;
    DWORD ks = 0;
    BOOL bfr = false;
    HCRYPTPROV_OR_NCRYPT_KEY_HANDLE PrivateKeys;
    CERT_BLOB CertsIncluded;
    CMSG_SIGNER_ENCODE_INFO Signers;
    HCRYPTMSG hMsg;

    rc = CryptAcquireCertificatePrivateKey (cert_context, 0, 0, &a, &ks, &bfr);

    CMSG_SIGNER_ENCODE_INFO SignerEncodeInfo = {0};
    SignerEncodeInfo.cbSize = sizeof (CMSG_SIGNER_ENCODE_INFO);
    if (a)
        SignerEncodeInfo.hCryptProv = a;
    if (bfr)
        PrivateKeys = a;

    CERT_BLOB SignerCertBlob;
    SignerCertBlob.cbData = cert_context->cbCertEncoded;
    SignerCertBlob.pbData = cert_context->pbCertEncoded;
    CertsIncluded = SignerCertBlob;

    SignerEncodeInfo.cbSize = sizeof (CMSG_SIGNER_ENCODE_INFO);
    SignerEncodeInfo.pCertInfo = cert_context->pCertInfo;
    SignerEncodeInfo.dwKeySpec = ks;
    SignerEncodeInfo.HashAlgorithm.pszObjId = pOid;
    SignerEncodeInfo.HashAlgorithm.Parameters.cbData = NULL;
    SignerEncodeInfo.pvHashAuxInfo = NULL;
    Signers = SignerEncodeInfo;

    CMSG_SIGNED_ENCODE_INFO SignedMsgEncodeInfo = {0};
    SignedMsgEncodeInfo.cbSize = sizeof (CMSG_SIGNED_ENCODE_INFO);
    SignedMsgEncodeInfo.cSigners = 1;
    SignedMsgEncodeInfo.rgSigners = &Signers;
    SignedMsgEncodeInfo.cCertEncoded = 1;
    SignedMsgEncodeInfo.rgCertEncoded = &CertsIncluded;
    SignedMsgEncodeInfo.rgCrlEncoded = NULL;

    signed_blob_size = 0;
    signed_blob_size = CryptMsgCalculateEncodedLength ((PKCS_7_ASN_ENCODING | X509_ASN_ENCODING), pFlag, pType, &SignedMsgEncodeInfo, 0, pMessageSize);
    if (signed_blob_size) {
        signed_blob_size *= 2;
        hMsg = CryptMsgOpenToEncode (CERTIFICATE_ENCODING_TYPE,
                                     pFlag,
                                     pType,
                                     &SignedMsgEncodeInfo,
                                     0,
                                     NULL);
        if (hMsg) {
            signed_blob = (BYTE *)malloc (signed_blob_size);
            BOOL CU = CryptMsgUpdate (hMsg, (BYTE *)pMessage, (DWORD)pMessageSize, true);
            if (CU) {
                if (CryptMsgGetParam (
                        hMsg,                // Handle to the message
                        CMSG_CONTENT_PARAM,  // Parameter type
                        0,                   // Index
                        signed_blob,         // Pointer to the BLOB
                        &signed_blob_size))  // Size of the BLOB
                {
                    signed_blob = (BYTE *)realloc (signed_blob, signed_blob_size);
                    if (hMsg) {
                        CryptMsgClose (hMsg);
                        hMsg = 0;
                    }
                }
            }
            if (hMsg)
                CryptMsgClose (hMsg);
            hMsg = 0;
        }
    }

    CryptReleaseContext (a, 0);
    pSignBlob->cbData = signed_blob_size;
    pSignBlob->pbData = signed_blob;
    CertFreeCertificateContext (cert_context);
    CertCloseStore (store_handle, CERT_CLOSE_STORE_FORCE_FLAG);
    return true;
}
The oid, flag and type we used are szOID_RSA_SHA1RSA, CMSG_DETACHED_FLAG and CMSG_SIGNED respectively.
On converting pSignBlob->pbData to hex and adding it to /Contents, the PDF and signature became valid when opened in PDF readers.
Ok, the signature container is embedded correctly.
But there are issues with the signature container itself:
Both in the SignedData.digestAlgorithms collection and in the SignerInfo.digestAlgorithm value you have used the OID of SHA1withRSA, but that is a full signature algorithm, not the mere digest algorithm SHA1 expected there.
Then the SHA1 hash of the signed bytes is BB78A402F7A537A34D6892B83881266501A691A8 but the hash you signed is 90E28B8A0D8E48691DAFE2BA10A4761FFFDCCD3D. This might be because you hash buffer2 and
buffer2 has empty contents data (/Contents <>)
The hex string delimiters '<' and '>' also belong to the contents value and, therefore, must also be removed in buffer2.
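In CryptoAPI terms, the first point presumably means that the HashAlgorithm in the CMSG_SIGNER_ENCODE_INFO from the update above should carry the OID of the digest algorithm itself; a hedged one-line sketch of that change (not verified against the poster's full source):

// Instead of the combined signature-algorithm OID:
//   SignerEncodeInfo.HashAlgorithm.pszObjId = szOID_RSA_SHA1RSA;   // "1.2.840.113549.1.1.5"
// pass the plain digest OID, for example:
SignerEncodeInfo.HashAlgorithm.pszObjId = szOID_OIWSEC_sha1;         // SHA1, "1.3.14.3.2.26"
// or, to also address the SHA1 weakness mentioned below:
// SignerEncodeInfo.HashAlgorithm.pszObjId = szOID_NIST_sha256;      // SHA256, "2.16.840.1.101.3.4.2.1"

And for the second point, the bytes handed to the signing call should be exactly the two segments described by /ByteRange, which skip the whole Contents placeholder including its '<' and '>' delimiters. A rough illustrative sketch (hypothetical helper, no error handling):

#include <string>
#include <vector>

// Sketch: assemble the to-be-signed bytes from the finished file and the four
// /ByteRange numbers [ off1 len1 off2 len2 ]. The gap between the two segments
// must cover the complete "<000...0>" placeholder, delimiters included.
std::vector<unsigned char> BuildSignedBytes(const std::string & fileBytes,
                                            size_t off1, size_t len1,
                                            size_t off2, size_t len2)
{
    std::vector<unsigned char> toSign;
    toSign.reserve(len1 + len2);
    toSign.insert(toSign.end(), fileBytes.begin() + off1, fileBytes.begin() + off1 + len1);
    toSign.insert(toSign.end(), fileBytes.begin() + off2, fileBytes.begin() + off2 + len2);
    return toSign;   // this buffer, not one with "/Contents <>", is what gets hashed and signed
}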
Furthermore, your signature is very weak:
It uses SHA1 as hash algorithm. SHA1 meanwhile has been recognized as too weak a hash algorithm for document signatures.
It doesn't use signed attributes, neither the ESS signing certificate nor the algorithm identifier protection attribute. Many validation policies require such special attributes.
I am facing a performance issue with Akka remoting. I have 2 actors, Actor1 and Actor2. The message exchange between the actors is a synchronous ask request from Actor1 to Actor2 and the response back from Actor2 to Actor1. Below are sample code snippets and the config for my actors:
Actor1.scala:
object Actor1 extends App {
  val conf = ConfigFactory.load()
  val system = ActorSystem("testSystem1", conf.getConfig("remote1"))
  val actor = system.actorOf(Props[Actor1].withDispatcher("my-dispatcher"), "actor1")
  implicit val timeOut: Timeout = Timeout(10 seconds)

  class Actor1 extends Actor {
    var value = 0
    var actorRef: ActorRef = null

    override def preStart(): Unit = {
      println(self.path)
    }

    override def receive: Receive = {
      case "register" =>
        actorRef = sender()
        println("Registering the actor")
        val time = System.currentTimeMillis()
        (1 to 300000).foreach(value => {
          if (value % 10000 == 0) {
            println("message count -- " + value + " --- time taken - " + (System.currentTimeMillis() - time))
          }
          Await.result(actorRef ? value, 10 seconds)
        })
        val totalTime = System.currentTimeMillis() - time
        println("Total Time - " + totalTime)
    }
  }
}
Actor2.scala:
object Actor2 extends App {
  val conf = ConfigFactory.load()
  val system = ActorSystem("testSystem1", conf.getConfig("remote2"))
  val actor = system.actorOf(Props[Actor2].withDispatcher("my-dispatcher"), "actor2")
  implicit val timeOut: Timeout = Timeout(10 seconds)
  actor ! "send"

  class Actor2 extends Actor {
    var value = 0
    var actorSelection: ActorSelection = context.actorSelection("akka://testSystem1#127.0.0.1:6061/user/actor1")

    override def receive: Receive = {
      case "send" =>
        actorSelection ! "register"
      case int: Int => {
        sender() ! 1
      }
    }
  }
}
application.conf:
remote1 {
  my-dispatcher {
    executor = "thread-pool-executor"
    type = PinnedDispatcher
  }
  akka {
    actor {
      provider = remote
    }
    remote {
      artery {
        transport = tcp # See Selecting a transport below
        canonical.hostname = "127.0.0.1"
        canonical.port = 6061
      }
    }
  }
}

remote2 {
  my-dispatcher {
    executor = "thread-pool-executor"
    type = PinnedDispatcher
  }
  akka {
    actor {
      provider = remote
    }
    remote {
      artery {
        transport = tcp # See Selecting a transport below
        canonical.hostname = "127.0.0.1"
        canonical.port = 6062
      }
    }
  }
}
Output:
message count -- 10000 --- time taken - 5871
message count -- 20000 --- time taken - 9043
message count -- 30000 --- time taken - 12198
message count -- 40000 --- time taken - 15363
message count -- 50000 --- time taken - 18649
message count -- 60000 --- time taken - 22074
message count -- 70000 --- time taken - 25487
message count -- 80000 --- time taken - 28820
message count -- 90000 --- time taken - 32118
message count -- 100000 --- time taken - 35634
message count -- 110000 --- time taken - 39146
message count -- 120000 --- time taken - 42539
message count -- 130000 --- time taken - 45997
message count -- 140000 --- time taken - 50013
message count -- 150000 --- time taken - 53466
message count -- 160000 --- time taken - 57117
message count -- 170000 --- time taken - 61246
message count -- 180000 --- time taken - 65051
message count -- 190000 --- time taken - 68809
message count -- 200000 --- time taken - 72908
message count -- 210000 --- time taken - 77091
message count -- 220000 --- time taken - 80855
message count -- 230000 --- time taken - 84679
message count -- 240000 --- time taken - 89089
message count -- 250000 --- time taken - 93132
message count -- 260000 --- time taken - 97360
message count -- 270000 --- time taken - 101442
message count -- 280000 --- time taken - 105656
message count -- 290000 --- time taken - 109665
message count -- 300000 --- time taken - 113706
Total Time - 113707
Is there anything I am doing wrong here? Any observations or suggestions to improve the performance?
The main issue I see with the code is Await.result(). That is a blocking operation, and will most likely affect performance.
I suggest collecting the results in a fixed array or list, using an integer as the index, and considering the run complete when the expected number of responses has been received.
I am currently stuck on the below issue:
I have two tables to work with; one contains financial information for vessels and the other contains arrival and departure times for vessels. I get my data by combining multiple Excel sheets from different folders:
financialTable
voyageTimeTable
I have to calculate the result for the above voyage and apportion it over June, July, and August for both estimated and updated.
Time in June : 4 hours (20/06/2020 20:00 - 23:59) + 10 days (21/06/2020 00:00 - 30/06/2020 23:59) = 10.1666
Time in July : 31 full days
Time in August: 1 day + 14 hours (02/08/2020 00:00 - 14:00) = 1.5833
Total voyage duration = 10.1666 + 31 + 1.5833 = 42.7499
The result for the "updated" financialItem would be the following:
Result June : 100*(10.1666/42.7499) = 23.7816
Result July : 100*(31/42.7499) = 72.5148
Result August : 100*(1.5833/42.7499) = 3.7036
sum = 100
and then for "estimated" it would be twice of everything above.
This is the format I ideally would like to get:
prorataResultTable
I have to do this for multiple vessels, with multiple timespans and several voyage numbers.
Eagerly awaiting responses, if any. Many thanks in advance.
Brds,
Not sure if you're still looking for an answer, but the code below gives me your expected output:
let
    financialTable = Table.FromRows({{"A", 1, "profit/loss", 200, 100}}, type table [vesselName = text, vesselNumber = Int64.Type, financialItem = text, estimated = number, updated = number]),
    voyageTimeTable = Table.FromRows({{"A", 1, #datetime(2020, 6, 20, 20, 0, 0), #datetime(2020, 8, 2, 14, 0, 0)}}, type table [vesselName = text, vesselNumber = Int64.Type, voyageStartDatetime = datetime, voyageEndDatetime = datetime]),
    joined =
        let
            joined = Table.NestedJoin(financialTable, {"vesselName", "vesselNumber"}, voyageTimeTable, {"vesselName", "vesselNumber"}, "$toExpand", JoinKind.LeftOuter),
            expanded = Table.ExpandTableColumn(joined, "$toExpand", {"voyageStartDatetime", "voyageEndDatetime"})
        in expanded,
    toExpand = Table.AddColumn(joined, "$toExpand", (currentRow as record) =>
        let
            voyageInclusiveStart = DateTime.From(currentRow[voyageStartDatetime]),
            voyageExclusiveEnd = DateTime.From(currentRow[voyageEndDatetime]),
            voyageDurationInDays = Duration.TotalDays(voyageExclusiveEnd - voyageInclusiveStart),
            createRecordForPeriod = (someInclusiveStart as datetime) => [
                inclusiveStart = someInclusiveStart,
                exclusiveEnd = List.Min({
                    DateTime.From(Date.EndOfMonth(DateTime.Date(someInclusiveStart)) + #duration(1, 0, 0, 0)),
                    voyageExclusiveEnd
                }),
                durationInDays = Duration.TotalDays(exclusiveEnd - inclusiveStart),
                prorataDuration = durationInDays / voyageDurationInDays,
                estimated = prorataDuration * currentRow[estimated],
                updated = prorataDuration * currentRow[updated],
                month = Date.MonthName(DateTime.Date(inclusiveStart)),
                year = Date.Year(inclusiveStart)
            ],
            monthlyRecords = List.Generate(
                () => createRecordForPeriod(voyageInclusiveStart),
                each [inclusiveStart] < voyageExclusiveEnd,
                each createRecordForPeriod([exclusiveEnd])
            ),
            toTable = Table.FromRecords(monthlyRecords)
        in toTable
    ),
    expanded =
        let
            dropped = Table.RemoveColumns(toExpand, {"estimated", "updated", "voyageStartDatetime", "voyageEndDatetime"}),
            expanded = Table.ExpandTableColumn(dropped, "$toExpand", {"month", "year", "estimated", "updated"})
        in expanded
in
    expanded
The code tries to:
join financialTable and voyageTimeTable, so that for each vesselName and vesselNumber combination, we know: estimated, updated, voyageStartDatetime and voyageEndDatetime.
generate a list of months for the period between voyageStartDatetime and voyageEndDatetime (which get expanded into new table rows)
for each month (in the list), do all the arithmetic you mention in your question
get rid of some columns (like the old estimated and updated columns)
I recommend testing it with different vesselNames and vesselNumbers from your dataset, just to see if the output is always correct (I think it should be).
You should be able to manually inspect the cells in the $toExpand column (of the toExpand step/expression) to see the nested rows before they get expanded.
UPDATE 2
I turned on tracing and ran my sample query. Here is the trace. I do see the statement Strlen Or Ind = 0x7fff9c84ee88 -> 255. The indicator variable is defined as SQLLEN indicator; Is this not initialized properly?
[ODBC][22407][1379343424.503572][__handles.c][450]
Exit:[SQL_SUCCESS]
Environment = 0x14f8160
[ODBC][22407][1379343424.503627][SQLSetEnvAttr.c][182]
Entry:
Environment = 0x14f8160
Attribute = SQL_ATTR_ODBC_VERSION
Value = 0x3
StrLen = 0
[ODBC][22407][1379343424.503654][SQLSetEnvAttr.c][349]
Exit:[SQL_SUCCESS]
[ODBC][22407][1379343424.503678][SQLAllocHandle.c][364]
Entry:
Handle Type = 2
Input Handle = 0x14f8160
[ODBC][22407][1379343424.503707][SQLAllocHandle.c][482]
Exit:[SQL_SUCCESS]
Output Handle = 0x14f8a90
[ODBC][22407][1379343424.503745][SQLDriverConnect.c][688]
Entry:
Connection = 0x14f8a90
Window Hdl = (nil)
Str In = [DSN=MyDB;UID=MyUID;][length = 15 (SQL_NTS)]
Str Out = 0x7fff9c84cc80
Str Out Max = 2048
Str Out Ptr = (nil)
Completion = 1
UNICODE Using encoding ASCII 'ISO8859-1' and UNICODE 'UCS-2LE'
[ODBC][22407][1379343424.523244][SQLDriverConnect.c][1497]
Exit:[SQL_SUCCESS]
Connection Out [[DSN=MyDB;UID=MyUID;][length = 15 (SQL_NTS)]]
[ODBC][22407][1379343424.523297][SQLAllocHandle.c][529]
Entry:
Handle Type = 3
Input Handle = 0x14f8a90
[ODBC][22407][1379343424.523343][SQLAllocHandle.c][1064]
Exit:[SQL_SUCCESS]
Output Handle = 0x1526b40
[ODBC][22407][1379343424.523377][SQLExecDirect.c][236]
Entry:
Statement = 0x1526b40
SQL = [SELECT '123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890...][length = 309 (SQL_NTS)]
[ODBC][22407][1379343424.523948][SQLExecDirect.c][499]
Exit:[SQL_SUCCESS]
[ODBC][22407][1379343424.523982][SQLNumResultCols.c][152]
Entry:
Statement = 0x1526b40
Column Count = 0x7fff9c84eeae
[ODBC][22407][1379343424.524005][SQLNumResultCols.c][244]
Exit:[SQL_SUCCESS]
Count = 0x7fff9c84eeae -> 1
[ODBC][22407][1379343424.524030][SQLFetch.c][158]
Entry:
Statement = 0x1526b40
[ODBC][22407][1379343424.524056][SQLFetch.c][340]
Exit:[SQL_SUCCESS]
[ODBC][22407][1379343424.524084][SQLGetData.c][233]
Entry:
Statement = 0x1526b40
Column Number = 1
Target Type = 1 SQL_CHAR
Buffer Length = 5000
Target Value = 0x7fff9c84da90
StrLen Or Ind = 0x7fff9c84ee88
[ODBC][22407][1379343424.524115][SQLGetData.c][497]
Exit:[SQL_SUCCESS]
Buffer = [12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678]
Strlen Or Ind = 0x7fff9c84ee88 -> 255
[ODBC][22407][1379343424.524142][SQLColAttribute.c][286]
Entry:
Statement = 0x1526b40
Column Number = 1
Field Identifier = SQL_DESC_NAME
Character Attr = 0x7fff9c84ee20
Buffer Length = 100
String Length = 0x7fff9c84ee86
Numeric Attribute = (nil)
[ODBC][22407][1379343424.524167][SQLColAttribute.c][657]
Exit:[SQL_SUCCESS]
[ODBC][22407][1379343424.524229][SQLFetch.c][158]
Entry:
Statement = 0x1526b40
[ODBC][22407][1379343424.524257][SQLFetch.c][340]
Exit:[SQL_NO_DATA]
[ODBC][22407][1379343424.524321][SQLDisconnect.c][204]
Entry:
Connection = 0x14f8a90
[ODBC][22407][1379343424.524375][SQLDisconnect.c][341]
Exit:[SQL_SUCCESS]
[ODBC][22407][1379343424.524415][SQLFreeHandle.c][279]
Entry:
Handle Type = 2
Input Handle = 0x14f8a90
[ODBC][22407][1379343424.524438][SQLFreeHandle.c][330]
Exit:[SQL_SUCCESS]
[ODBC][22407][1379343424.524463][SQLFreeHandle.c][212]
Entry:
Handle Type = 1
Input Handle = 0x14f8160
UPDATE
I investigated my C++ program further. I now see that the query result is being truncated in the call to
data_ret = SQLGetData(stmt, i, SQL_C_CHAR, buf, sizeof(buf), &indicator);
buf is of sufficient length and data_ret is zero (success). Am I misusing SQLGetData, or does it have some behavior I am unaware of?
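For what it's worth, SQLGetData is documented to hand back long character columns in pieces: when the buffer is too small it returns SQL_SUCCESS_WITH_INFO (SQLSTATE 01004) and can be called again for the remainder. Below is a generic sketch of that retrieval loop, not a claim about where the 255-character limit in my case comes from (the statement handle and column number are assumed to match the call above):

#include <sql.h>
#include <sqlext.h>
#include <string>

/* Generic sketch: fetch column 1 of the current row piecewise. `stmt` is an
   executed statement handle on which SQLFetch() has already succeeded. */
std::string get_long_string(SQLHSTMT stmt)
{
    std::string value;
    char chunk[256];
    SQLLEN indicator = 0;
    SQLRETURN rc;

    while ((rc = SQLGetData(stmt, 1, SQL_C_CHAR, chunk, sizeof(chunk), &indicator)) != SQL_NO_DATA) {
        if (!SQL_SUCCEEDED(rc) || indicator == SQL_NULL_DATA)
            break;                      /* error, or the column is NULL */
        /* SQL_SUCCESS_WITH_INFO means the chunk was filled and more data
           remains; SQL_SUCCESS means this was the last (or only) piece. */
        value.append(chunk);            /* SQL_C_CHAR chunks are null-terminated */
    }
    return value;
}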
I have seen others with a similar problem, but I haven't quite been able to sort it out. If I query for a long string it gets truncated to 255 characters. I am querying a SQL Server DB from a bash script on a Linux machine using unixODBC and FreeTDS.
I set up the driver by putting
[FreeTDS]
Description = v0.91 with protocol v7.2
Driver = /usr/lib64/libtdsodbc.so.0
in a template and running
odbcinst -i -d -f tds.driver.template
I then put
[MyDB]
Driver = FreeTDS
Description = Database Description
Trace = No
Server = <serverIP>
Port = 1433
Database = <myDB>
UID = <myUID>
in a template and run
odbcinst -i -s -f tds.datasource.template
I tried this answer, but I must be doing something wrong. Any suggestions are appreciated.
You don't say what you are using to issue the query from the shell, but if it is isql, then it hides very big columns and truncates smaller ones, probably at 255 characters.
I am trying to figure out how to build a time from values I choose dynamically.
I tried giving a date of 31st March 03:10 PM and going back a month to 03:10 PM, which should be 28th February 03:10 PM, but I am getting 04:10 PM instead.
I set tm_isdst = -1 to handle this, but it still fails.
Any ideas?
Tm->tm_mon = ReportMonth;
Tm->tm_mday = ReportEndDay;
Tm->tm_hour = TimeOfDay/60;
Tm->tm_min = TimeOfDay%60;
Tm->tm_sec = 59;
Tm->tm_isdst = -1;
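For reference, here is a minimal, self-contained sketch of the usual pattern: fill the broken-down fields (tm_mon is 0-based and tm_year counts from 1900), set tm_isdst to -1 so mktime() works out DST itself, then let mktime() normalize the structure. The concrete date and year are made-up example values:

#include <stdio.h>
#include <string.h>
#include <time.h>

int main(void)
{
    /* Sketch: build "28 Feb 2021 15:10" and let mktime() decide whether DST
       applies. tm_mon is 0-based (1 == February); tm_year counts from 1900. */
    struct tm t;
    memset(&t, 0, sizeof(t));
    t.tm_year  = 2021 - 1900;
    t.tm_mon   = 1;                 /* February */
    t.tm_mday  = 28;
    t.tm_hour  = 15;
    t.tm_min   = 10;
    t.tm_sec   = 0;
    t.tm_isdst = -1;                /* let mktime() determine DST */

    time_t when = mktime(&t);       /* normalizes t and fills tm_isdst, tm_wday, ... */
    printf("%s", asctime(&t));      /* wall-clock fields stay 15:10 */
    return (when == (time_t)-1) ? 1 : 0;
}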