Passing std::string_view to an API expecting const std::string& - C++

I am using the Socket.IO library to create a client. While sending a message I have to pass my message (data) as sio::message::list(msg):
void Socket_IO::send_message(std::string_view msg) // Gives error
//void Socket_IO::send_message(const std::string &msg) // Works
{
    this->client.socket()->emit(Socket_IO::general_message, sio::message::list(msg), [&](sio::message::list const& msg) {
    });
}
The class sio::message::list has a constructor
list(const string& text)
{
    m_vector.push_back(string_message::create(text));
}
but does not have a std::string_view constructor.
Error:
'<function-style-cast>': cannot convert from 'std::string_view' to 'sio::message::list'
I wish to know: is there any way I can pass a std::string_view to an API expecting const std::string&?
I can't use the c_str() function because the string might contain binary data, which may include null characters (0x00).
I was thinking of creating a string object, as in sio::message::list(std::string(msg)), but wish to know whether that would defeat the purpose of using std::string_view.

You could go with:
this->client.socket()->emit(Socket_IO::general_message, sio::message::list(std::string(msg)), ...
This gets the job done. It will construct a temporary std::string and pass it to the list constructor.
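For completeness, here is a minimal sketch of what the full function could look like with that approach, using the names from the question. Note that std::string(msg) copies exactly msg.size() bytes, so embedded 0x00 bytes survive, which addresses the concern about c_str():

void Socket_IO::send_message(std::string_view msg)
{
    // Construct a temporary std::string from the view; the length is taken
    // from the view, so binary data containing null bytes is preserved.
    this->client.socket()->emit(Socket_IO::general_message,
                                sio::message::list(std::string(msg)),
                                [&](sio::message::list const& ack) {});
}

This does not entirely defeat the purpose of std::string_view: callers can still pass substrings or raw buffers without building a std::string themselves, and the one unavoidable copy happens at the library boundary, where sio stores the data anyway.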

gmock save argument string

I hope there is an easier way to do this... I need to capture the string which is passed as an argument to a mock.
The mock
class web_api_mock : public iweb_api
{
public:
    MOCK_METHOD(
        (bool),
        http_post,
        (const etl_normal_string &, const char*),
        (override));
};
I want to capture the char * passed to the mock as the second argument. I need to construct some JSON structure from it, and I want to check whether a certain element has a certain value.
I had to spend a lot of time to get it to work, eventually copying the trick from here. This brilliant mind figured out you can rely on gMock's Invoke.
The EXPECT_CALL
http_post_args args;
EXPECT_CALL(_web_api_mock, http_post(etl_string_equals(url), _))
    .WillOnce(
        DoAll(
            Invoke(&args, &http_post_args::capture),
            Return(true)));
Here I am forwarding all arguments of the mock to a struct, which I defined as follows:
struct http_post_args
{
    void capture(etl_normal_string url, const char * p)
    {
        payload = std::string(p);
    }
    std::string payload;
};
And finally, I get my hands on the char * and do whatever I want afterwards.
It seems awfully complicated to save an argument when it's of the type char *.
My first attempt was the obvious mistake I guess many before (and after) me have made: using SaveArgPointee, which copies only the first element of the string and leaves me with a string where the first character is correct but the rest is filled with random memory.
My second attempt was to define an ACTION_P. This "almost" worked. In the call stack I could see the string I am interested in until the very last stack frame, where the arguments simply did not seem to be passed to the actual implementation of my custom ACTION_P.
ACTION_P2(capture_string, url, payload)
{
    /* If I break in the debugger and go one stack frame up,
       I can see that gmock holds my string in varargs as the second element,
       but I couldn't find a way to access it here. */
}
I also tried ACTION_TEMPLATE, but I am not C++ enough to understand what they are trying to explain to me in the gMock cookbook.
So my final question: is the working trick above with the http_post_args struct really "the only way" to capture a const char * being passed as an argument to a mock?
If it SHOULD be possible using ACTION_P or ACTION_TEMPLATE, would somebody be so kind as to provide an actual working example with a const char *?
You could simply use a lambda, like so (live example):
TEST(SomeTest, Foo)
{
    std::string payload;
    web_api_mock m;
    EXPECT_CALL(m, http_post(Eq("url"), _))
        .WillOnce([&](const std::string &, const char* p){
            payload = p;
            return true;
        });
    m.http_post("url", "foo string");
    EXPECT_THAT(payload, Eq("foo string"));
}
No additional http_post_args or actions etc required.
Of course, you could also change payload to a const char* if you want to "capture" the raw char pointer. But be careful with the lifetime of the pointed-to characters in that case.
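If you do prefer the ACTION_P route the question attempted, something along these lines should also work: inside an ACTION body gMock exposes the mock's arguments as arg0, arg1, and so on, so the second argument can be copied out through a pointer parameter. A sketch reusing the simplified mock from the lambda example above:

ACTION_P(CapturePayload, out) { *out = std::string(arg1); }  // arg1 is the const char*

TEST(SomeTest, CaptureWithActionP)
{
    std::string payload;
    web_api_mock m;
    EXPECT_CALL(m, http_post(Eq("url"), _))
        .WillOnce(DoAll(CapturePayload(&payload), Return(true)));
    m.http_post("url", "foo string");
    EXPECT_THAT(payload, Eq("foo string"));
}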
You hinted that your real code will need to parse the payload string as JSON and check for a certain element. It might lead to more readable tests if you create a dedicated matcher that does this. To show a rough draft (live example):
MATCHER_P(ContainsJsonElement, expectedElement, "")
{
    const char * payload = arg;
    // Parse payload as json, check for element, etc.
    const bool foundElement = std::string(payload) == expectedElement;
    return foundElement;
}

TEST(SomeTest, Foo)
{
    web_api_mock m;
    EXPECT_CALL(m, http_post(Eq("url"), ContainsJsonElement("foo string")));
    m.http_post("url", "foo string");
}

How do I convert a std::string to System.String in C++ with Il2CppInspector?

I am using Il2CppInspector to generate scaffolding for a Unity game. I am able to convert System.String (app::String in Il2CppInspector) to std::string using the functions provided below.
How would I reverse this process; how do I convert a std::string to System.String?
helpers.cpp
// Helper function to convert Il2CppString to std::string
std::string il2cppi_to_string(Il2CppString* str) {
    std::u16string u16(reinterpret_cast<const char16_t*>(str->chars));
    return std::wstring_convert<std::codecvt_utf8_utf16<char16_t>, char16_t>{}.to_bytes(u16);
}

// Helper function to convert System.String to std::string
std::string il2cppi_to_string(app::String* str) {
    return il2cppi_to_string(reinterpret_cast<Il2CppString*>(str));
}
In short, I am looking for a function that takes in a std::string and returns an app::String
// Helper function to convert std::string to System.String
app::String* string_to_il2cppi(std::string str) {
    // Conversion code here
}
The accepted answer is actually wrong: there is no size parameter, and copying stops at the first null byte (0x00), according to the MSDN documentation.
The following code fixes these problems and works correctly:
app::String* string_to_il2cppi(const std::string& string)
{
    const auto encoding = (*app::Encoding__TypeInfo)->static_fields->utf8Encoding;
    const auto managed_string = app::String_CreateStringFromEncoding((uint8_t*)&string.at(0), string.size(), encoding, nullptr);
    return managed_string;
}
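A minimal usage sketch of the function above; passing the explicit string.size() length is what lets embedded null bytes survive:

// "Hello\0world" stored with an explicit length of 11 keeps the embedded null byte.
std::string test("Hello\0world", 11);
app::String* managed = string_to_il2cppi(test);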
A quote from djkaty:
To create a string, you cannot use System.String's constructors –
these are redirected to icalls that throw exceptions. Instead, you
should use the internal Mono function String.CreateString. This
function has many overloads accepting various types of pointer and
array; an easy one to use accepts a uint16_t* to a Unicode string and
can be called as follows [...]
Export Il2CppInspector with all namespaces, which will give you access to Marshal_PtrToStringAnsi.
app::String* string_to_il2cppi(std::string str) {
    // Pass the character data, not the address of the std::string object itself.
    return app::Marshal_PtrToStringAnsi((void*)str.c_str(), NULL);
}
Limitation: do not attempt to convert a string with null bytes inside of it, for example:
std::string test = "Hello\0world";
Use BullyWiiPlaza's solution if this is an issue for you.

Cannot use std::string variable in RapidJSON function call

I am all new to C++ and am running into an issue. I am using rapidJSON to create JSON documents.
void setKeyValue() {
    Value obj(kObjectType);
    Value key("key");
    Value val(42);
    obj.AddMember(key, val, d.GetAllocator());
}
This works as expected. But when I try to replace the call to key to make it use a passed-in parameter, like so:
void setKeyValue(string myKey) {
    Value obj(kObjectType);
    Value key(myKey);
    Value val(42);
    obj.AddMember(key, val, d.GetAllocator());
}
the myKey in Value key(myKey) gets a red squiggly underline in Visual Studio with an error message.
What is causing this and how can I solve it?
You don't get support for std::string by default; RapidJSON requires you to opt in to std::string support:
#define RAPIDJSON_HAS_STDSTRING 1
Only then is the constructor you're using valid:
GenericValue(const std::basic_string<Ch>& s, Allocator& allocator)
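For illustration, with the macro defined before the RapidJSON headers (it can also be passed as a compiler flag), the function from the question should compile using that std::string overload directly; d here is assumed to be the rapidjson::Document already used in the question:

#define RAPIDJSON_HAS_STDSTRING 1   // must be defined before including the rapidjson headers
#include "rapidjson/document.h"

using namespace rapidjson;

void setKeyValue(const std::string& myKey) {
    Value obj(kObjectType);
    Value key(myKey, d.GetAllocator()); // copies the string using the document's allocator
    Value val(42);
    obj.AddMember(key, val, d.GetAllocator());
}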
The JSON library you are using doesn't seem to work with string objects from the standard library out of the box, but it does work with const char*.
So you can convert the string object to a char* with the c_str() method:
void setKeyValue(string myKey) {
    Value obj(kObjectType);
    Value key(myKey.c_str(), d.GetAllocator()); // copy the string so the Value owns its own buffer
    Value val(42);
    obj.AddMember(key, val, d.GetAllocator());
}

Using std::error_code with non-integer values

I'm writing a library and want to return error codes whenever an error is returned by a remote system. The problem is that these errors are identified by strings, e.g. "0A01", and also contain a message, while std::error_code requires an integer as the value.
What is the best way to implement an error code with all the functionality that std::error_code provides, but that uses strings as the value? How do I add an external error string to the std::error_code or std::error_category?
As mentioned in the comments, you must know the error codes that could be received from the remote server.
The std::string which you receive from a remote server contains two parts, as you said:
The problem is that these errors are identified by strings, e.g. "0A01", and also contain a message, while std::error_code requires an integer as the value.
As you haven't shared the format of the error message, I am not adding the code for splitting it; split your string into two parts:
Error Code
Error Message
Now you can convert the Error Code of type std::string to an int by using std::stoi(). So, let's say:
int error_code_int = std::stoi(string_to_hexadecimal(error_code));
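As a side note, if the code is plain hexadecimal such as "0A01", std::stoi can parse it directly when given base 16, which would avoid the helper function:

int error_code_int = std::stoi(error_code, nullptr, 16); // "0A01" -> 2561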
And for std::error_category, which serves as the base class for our custom error messages, do this:
std::string message_received = "This is the message which received from remote server.";

struct OurCustomErrCategory : std::error_category
{
    const char* name() const noexcept override;
    std::string message(int ev) const override;
};

const char* OurCustomErrCategory::name() const noexcept
{
    return "Error Category Name";
}

std::string OurCustomErrCategory::message(int error_code_int) const
{
    switch (error_code_int)
    {
    case 1:
        return message_received;
    default:
        return "(unrecognized error)";
    }
}

const OurCustomErrCategory ourCustomErrCategoryObject;

std::error_code make_error_code(int e)
{
    return {e, ourCustomErrCategoryObject};
}
int main()
{
    int error_code_int = std::stoi(string_to_hexadecimal(error_code)); // error_code = "0A01"
    ourCustomErrCategoryObject.message(error_code_int);
    std::error_code ec(error_code_int, ourCustomErrCategoryObject);
    assert(ec);
    std::cout << ec << std::endl;
    std::cout << ec.message() << std::endl;
}
The output of the above working example is:
Error Category Name : 0A01
This is the message which received from remote server.
You can use the function string_to_hexadecimal() from this post.
I hope that now you can modify the above code according to your needs.
Edit 1:
As you said:
This assumes the dynamic message is a global value. How do I pass it
to an std::error_category object?
You can see that both std::error_code::assign and the std::error_code constructor take an int for the error code number and an error_category. So it is obvious that std::error_code can't take the dynamic message directly.
But wait, I said std::error_code takes an error_category as a constructor argument, so is there any way we can attach the dynamic message there?
std::error_category states that:
std::error_category serves as the base class for specific error
category types.
This means that the struct we derived from std::error_category on the following line
struct OurCustomErrCategory : std::error_category
can have a data member that we set via its constructor, so our struct becomes:
struct OurCustomErrCategory : std::error_category
{
    std::string message_received;
    OurCustomErrCategory(std::string m) : message_received(m) {}
    const char* name() const noexcept override;
    std::string message(int ev) const override;
};
and you can construct it like this wherever you want:
const OurCustomErrCategory ourCustomErrCategoryObject("This is the message which received from remote server.");
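With that member in place, the message() override could, for example, simply return the stored text regardless of the numeric code:

std::string OurCustomErrCategory::message(int /*ev*/) const
{
    return message_received;
}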

rapidjson producing inconsistent results when converting to string

I currently have a JSON serializer class that I'm using for some experimental code that I'm working on. This code uses cpprestsdk. The serialization is set up to use either rapidjson or cpprestsdk's json.
So, for example, the virtual functions for the class look like:
virtual void toJson(rapidjson::Document& json) const =0;
virtual void toJson(web::json::value& json) const =0;
virtual void fromJson(const rapidjson::Value& json) =0;
virtual void fromJson(const web::json::value& json) =0;
I can convert from JSON with no problem. I'm currently converting a class object to JSON and then exporting it as a string to a file. I have found that with rapidjson I get variable results.
On some exports, I see a snippet like this:
"base": {
"name\u0000refere": "base",
On other runs, I see a snippet like this:
"base": {
"name": "base",
This is for successive runs, with no changes to the code.
The fields are actually globally defined const char * like so:
const char *kSymbolKeyName = "name";
const char *kSymbolKeyReferenceName = "referenceName";
The code to generate the JSON object that has the issue looks like:
void Object::toJson(rapidjson::Document& json) const {
    using namespace rapidjson;
    json.SetObject(); // Reset and clear any existing
    auto& allocator = json.GetAllocator();
    json.AddMember(StringRef(kObjectKeyName), Value(name.c_str(), allocator), allocator);
    json.AddMember(StringRef(kObjectKeyPrioritizeTable), Value(prioritizeTable), allocator);
    json.AddMember(StringRef(kObjectKeyPrioritizeGreaterOn), Value(prioritizeGreaterOn), allocator);
}
Note that kObjectKeyName is defined as const char *kObjectKeyName = "name";.
And the caller of this class's toJson looks like:
using namespace rapidjson;
json.SetObject(); // Reset and clear any existing
auto& allocator = json.GetAllocator();
for (const auto& it : tables) {
    Document iJson;
    it.second->toJson(iJson);
    json.AddMember(Value(it.first.c_str(), allocator), iJson, allocator);
}
Part of the problem may stem from the way I am using rapidjson::Documents and allocators. I believe each toJson call will end up with its own allocator once I make the SetObject call.
My plan is to revamp the code to use Value instead of Document in toJson and then pass the allocator in as an argument. I ideally didn't want to do that, mainly because I was being lazy and wanted the signature to stay the same so it was easy to flip between rapidjson and cpprestsdk's json.
Oh yeah, the code to output the file as a string is the following:
std::ofstream out("output.json");
rapidjson::Document outDoc;
dataSet.toJson(outDoc);
rapidjson::StringBuffer buffer;
buffer.Clear();
rapidjson::PrettyWriter<rapidjson::StringBuffer> writer(buffer);
outDoc.Accept(writer);
out << buffer.GetString();
out.close();
There is no doubt something that I am doing odd/dumb, as I just recently started using rapidjson. I'm just trying to narrow down my issues and better understand the error of my ways.
It appears that modifying the process to pass in an allocator works.
I modified my toJson function to be:
rapidjson::Value toJson(rapidjson::Document::AllocatorType& allocator);
In my usage, this means all generated Values will, when needed, use the base Document's allocator.
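For illustration, a rough sketch of what that refactoring could look like for the Object class above; with the top-level Document's allocator threaded through, the member strings are copied into memory that outlives the per-table temporary Documents, which appears to be where the corrupted keys came from:

rapidjson::Value Object::toJson(rapidjson::Document::AllocatorType& allocator) const {
    using namespace rapidjson;
    Value json(kObjectType);
    json.AddMember(StringRef(kObjectKeyName), Value(name.c_str(), allocator), allocator);
    json.AddMember(StringRef(kObjectKeyPrioritizeTable), Value(prioritizeTable), allocator);
    json.AddMember(StringRef(kObjectKeyPrioritizeGreaterOn), Value(prioritizeGreaterOn), allocator);
    return json; // moved out; all strings live in the caller's allocator
}

// Caller side:
auto& allocator = json.GetAllocator();
for (const auto& it : tables) {
    json.AddMember(Value(it.first.c_str(), allocator), it.second->toJson(allocator), allocator);
}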