I am trying to format the time as hh:mm:ss and then put it in a wide string stream. The code is as follows.
std::chrono::steady_clock::time_point end = std::chrono::steady_clock::now();
std::chrono::steady_clock::duration time_elapsed = end - start;
std::chrono::hh_mm_ss formatted {std::chrono::duration_cast<std::chrono::milliseconds> (time_elapsed)};
start is set in the constructor of the class. Using the << stream operator does not work, and I do not see any way to convert this type to a string.
My question is: how can I convert formatted to a string (C-style, std::wstring, or std::string)?
The following works for me in MSVC.
As of 2021-08-12 I had to use the /std:c++latest switch with Visual Studio version 16.11.2 in order for this solution to work.
const auto start{ std::chrono::steady_clock::now( ) };
std::this_thread::sleep_for( std::chrono::milliseconds{ 1000 } );
const auto end{ std::chrono::steady_clock::now( ) };
const auto elapsed{ end - start };
std::chrono::hh_mm_ss formatted{
std::chrono::duration_cast<std::chrono::milliseconds>( elapsed ) };
std::cout << formatted << '\n';
// Or
std::stringstream ss{ };
ss << formatted;
std::cout << ss.str( ) << '\n';
I want to create a log file whose name includes the date and time, and I am using Visual Studio 2013 (v120). I currently have the code below, but it runs into issues at run time. Does anyone have a method for this?
std::string Logger::GetTimeStampWithMilliSeconds() const
{
const auto now = std::chrono::system_clock::now();
const auto nowMs = std::chrono::duration_cast<std::chrono::milliseconds>(now.time_since_epoch()) % 1000;
const auto nowAsTimeT = std::chrono::system_clock::to_time_t(now);
struct tm tm {};
localtime_s(&tm, &nowAsTimeT);
std::stringstream nowSs;
nowSs << std::put_time(&tm, "%Y-%m-%d %T") << '.' << std::setfill('0') << std::setw(3) << nowMs.count();
return nowSs.str();
}
The issue is replicated below.
I have been looking around to get what I want but I couldn't find anything hence my question (hopefully not a duplicate!)
I am looking to get a microsecond resolution epoch time (to be converted to a Date string) of the clock perhaps using chrono.
Following is what works for me for seconds resolution:
auto secondsEpochTime = std::chrono::duration_cast<std::chrono::seconds>(std::chrono::system_clock::now().time_since_epoch()).count();
std::cout << "Date string = " << ctime(&secondsEpochTime);
However, when I change seconds to microseconds, ctime doesn't seem to reflect the correct date.
auto microSecondsEpochTime = std::chrono::duration_cast<std::chrono::microseconds>(std::chrono::system_clock::now().time_since_epoch()).count();
std::cout << "Date string = " << ctime(&microSecondsEpochTime); // incorrect Date
Unfortunately std::chrono is not complete enough to fully answer your question. You will have to use parts of the C library until at least C++23; otherwise you might end up with a race-prone implementation.
The idea is to get the timestamp and convert it to an integer as microseconds since epoch (1970-01-01).
Then use localtime_r to get the local time broken down in year/month/day/hour/minute/seconds and print it to string.
Finally, append the microseconds as an int padded to 6 digits and return the entire result as a std::string.
constexpr static int64_t ONEMICROSECOND = 1000000;
static std::string nowstr() {
auto now = std::chrono::system_clock::now();
auto onems = std::chrono::microseconds(1);
int64_t epochus = now.time_since_epoch()/onems;
time_t epoch = epochus/ONEMICROSECOND;
struct tm tms{};
localtime_r( &epoch, &tms );
char buf[128];
size_t nb = strftime( buf, sizeof(buf), "%Y-%m-%d %H:%M:%S", &tms );
nb += ::sprintf( &buf[nb], ".%06d", int(epochus%ONEMICROSECOND) );
return std::string( buf, nb );
}
If you run this as-is it will likely return the timestamp in GMT. You will have to set your timezone programmatically if it is not set in the environment (as happens on Compiler Explorer/Godbolt).
int main() {
setenv("TZ", "/usr/share/zoneinfo/America/New_York", 1);
std::cout << nowstr() << std::endl;
}
Results in
Program stdout
2022-10-01 22:51:03.988759
Compiler explorer link: https://godbolt.org/z/h88zhrr73
UPDATE: if you prefer to use boost::format (std::format is still incomplete on most compilers unfortunately) then you can do
static std::string nowstr() {
auto now = std::chrono::system_clock::now();
auto onems = std::chrono::microseconds(1);
int64_t epochus = now.time_since_epoch()/onems;
time_t epoch = epochus/ONEMICROSECOND;
struct tm tms{};
localtime_r( &epoch, &tms );
std::ostringstream ss;
ss << boost::format( "%04d-%02d-%02d %02d:%02d:%02d.%06d" )
% (tms.tm_year+1900) % (tms.tm_mon+1) % tms.tm_mday
% tms.tm_hour % tms.tm_min % tms.tm_sec
% (epochus%ONEMICROSECOND);
return ss.str();
}
You will have to use parts of the C library until C++23 at least
Umm... If your platform supports the full C++20 spec (at least with regards to format and chrono):
#include <chrono>
#include <format>
#include <iostream>
int
main()
{
auto tp = std::chrono::system_clock::now();
std::chrono::zoned_time zt{std::chrono::current_zone(),
std::chrono::time_point_cast<std::chrono::microseconds>(tp)};
std::cout << "Date string = " << std::format("{:%a %b %e %T %Y}", zt) << '\n';
}
Sample output:
Date string = Sat Oct 1 23:32:24.843844 2022
I'm writing a class as follows:
struct TimeIt {
using TimePoint = std::chrono::time_point<std::chrono::high_resolution_clock>;
TimeIt(const std::string& functName) :
t1{std::chrono::high_resolution_clock::now()},
functName{functName} {}
~TimeIt() {
TimePoint t2 = std::chrono::high_resolution_clock::now();
std::cout << "Exiting from " << functName << "...\n Elapsed: ";
std::cout << std::chrono::duration_cast<std::chrono::milliseconds>(t2 - t1).count() << " ms" << "\n";
}
TimePoint t1;
std::string functName;
};
The whole point of it is to measure the time it takes for one function to complete, constructing this object at the start of it. However, the only value I'm getting is 0 ms. This is obviously wrong, because some of the functions take up to a minute, but I can't see why it's wrong.
I did the same at the start and end of the function, creating the TimePoints (with auto) and doing a duration_cast. Any clue what I'm missing here?
Edit:
I'm going to try to make it reproducible. A little bit of context: I'm working with big matrixes (12000 dimensions) and doing a lot of input output operations.
template <typename InputType>
InputMat<InputType>
readInp(const std::string& filepath = "data.inp", const size_t& reserveSize = 15000) {
TimeIt("readInp");
std::ifstream F(filepath);
assert(F.is_open());
InputMat<InputType> res;
res.reserve(reserveSize);
std::string line;
while (F >> line) {
InputType lineBitset{line};
res.push_back(lineBitset);
}
return res;
}
This function reads a matrix, and calling TimeIt here gives really different results compared to when I call it in the wrapper function:
void test1() {
//Testing for 0-1 values
auto t1 = std::chrono::high_resolution_clock::now();
auto inpMat = readInp<std::bitset<32>>();
auto t2 = std::chrono::high_resolution_clock::now();
std::cout << std::chrono::duration_cast<std::chrono::milliseconds>(t2 - t1).count() << "\n";
//More code...
}
This outputs:
Exiting from readInp...
Elapsed: 0 ms
4
and data.inp
NOW you have made the problem clear! By writing this:
TimeIt("readInp");
you are creating a temporary object, which is destroyed immediately. You need to give the object a name so it lives until the end of the block:
TimeIt timer("readInp");
For example, I have two string objects: string str_1, str_2. I want to concatenate them. I can use two methods:
method 1:
std::stringstream ss;
//std::string str_1("hello");
//std::string str_2("world");
ss << "hello"<< "world";
const std::string dst_str = std::move(ss.str());
method 2:
std::string str_1("hello");
std::string str_2("world");
const std::string dst_str = str_1 + str_2;
Because the string's buffer is read-only, when you change the string object its buffer is destroyed and a new one is created to store the new content. So method 1 is better than method 2? Is my understanding correct?
stringstreams are complex objects compared to simple strings. Every time you use method 1, a stringstream must be constructed and later destroyed. If you do this millions of times, the overhead will be far from negligible.
The apparently simple ss << str_1 << str_2 is in fact equivalent to std::operator<<(std::operator<<(ss, str_1), str_2), which is not optimized for in-memory concatenation but common to all streams.
I've done a small benchmark:
In debug mode, method 2 is almost twice as fast as method 1.
In an optimized build (verifying in the assembler file that nothing was optimized away), it's more than 27 times faster.
Thanks, everyone. Maybe I was a little lazy. Now I have put the code to the test.
test 1: 158.751ms, so long, my god!
int main()
{
chrono::high_resolution_clock::time_point begin_time = chrono::high_resolution_clock::now();
for(int i=0;i<100000;i++)
{
//string str_1("hello ");
//string str_2("world");
stringstream ss;
ss << "hello " << "world";
const string dst_str = ss.str();
}
chrono:: high_resolution_clock::time_point stop_time = chrono::high_resolution_clock::now();
chrono::duration<double> elapsed = duration_cast<duration<double>>(stop_time - begin_time);
cout << "time takes " << elapsed.count() * 1000 << "ms" << endl;
return 0;
}
test 2: 31.1946ms, fastest!
int main()
{
chrono::high_resolution_clock::time_point begin_time = chrono::high_resolution_clock::now();
for(int i=0;i<100000;i++)
{
string str_1("hello ");
string str_2("world");
const string dst_str = str_1 + str_2;
}
chrono:: high_resolution_clock::time_point stop_time = chrono::high_resolution_clock::now();
chrono::duration<double> elapsed = duration_cast<duration<double>>(stop_time - begin_time);
cout << "time takes " << elapsed.count() * 1000 << "ms" << endl;
return 0;
}
test 3: using boost::filesystem::path, 35.1769 ms, also a nice choice
int main()
{
chrono::high_resolution_clock::time_point begin_time = chrono::high_resolution_clock::now();
for(int i=0;i<100000;i++)
{
string str_1("hello ");
string str_2("world");
boost::filesystem::path dst_path(str_1);
dst_path += str_2;
const string dst = dst_path.string();
}
chrono::high_resolution_clock::time_point stop_time = chrono::high_resolution_clock::now();
chrono::duration<double> elapsed = duration_cast<duration<double>>(stop_time - begin_time);
cout << "time takes " << elapsed.count() * 1000 << "ms" << endl;
return 0;
}
Method 2 is better in your situation here, since it has better readability and costs less time by avoiding the stringstream object creation.
But I strongly recommend using format (e.g. std::format) for better readability and maintenance, instead of + or other string concatenation.
I am using boost::posix_time::ptime to measure my simulation run-time and for something else.
assuming
boost::posix_time::ptime start, stop;
boost::posix_time::time_duration diff;
start = boost::posix_time::microsec_clock::local_time();
sleep(5);
stop = boost::posix_time::microsec_clock::local_time();
diff = stop - start;
now
std::cout << to_simple_string( diff ) << std::endl;
returns the time in hh:mm:ss.ssssss format, and I would like to have the time in ss.ssssss as well.
To do this, I tried
boost::posix_time::time_duration::sec_type x = diff.total_seconds();
but that gave me the answer in the format ss, and seconds() returns the normalized number of seconds (0..60).
My question is: how can I get my simulation time in seconds in the format ss.ssssss?
EDIT
I was able to do:
std::cout << diff.total_seconds() << "." << diff.fractional_seconds() << std::endl;
Is there something more elegant that could print ss.ssssss?
total_seconds() returns a long value which is not normalized to 0..60s.
So just do this:
namespace bpt = boost::posix_time;
int main(int , char** )
{
bpt::ptime start, stop;
start = bpt::microsec_clock::local_time();
sleep(62);
stop = bpt::microsec_clock::local_time();
bpt::time_duration dur = stop - start;
long milliseconds = dur.total_milliseconds();
std::cout << milliseconds << std::endl; // 62000
// format output with boost::format
boost::format output("%.2f");
output % (milliseconds/1000.0);
std::cout << output << std::endl; // 62.00
}
// whatever time you have (here 1second)
boost::posix_time::ptime pt = boost::posix_time::from_time_t( 1 );
// subtract 0 == cast to duration
boost::posix_time::time_duration dur = pt - boost::posix_time::from_time_t(0);
// result in ms
uint64_t ms = dur.total_milliseconds();
// result in usec
uint64_t us = dur.total_microseconds();
// result in sec
uint64_t s = dur.total_seconds();
std::cout << "s = " << s << ", ms = " << ms << ", us = " << us << std::endl;
s = 1, ms = 1000, us = 1000000
The most straightforward way I see is something like this for the output, with the rest of the time computations along the lines of nabulke's post:
#include <iomanip>
double dseconds = dur.total_milliseconds() / 1000. ;
std::cout << std::setiosflags(std::ios::fixed) << std::setprecision(3);
std::cout << dseconds << std::endl;
You want to express time in terms of a floating point number, so it's probably best to actually use one and apply the standard stream formatting manipulators.