Can I read a zip archive using wxWidgets? - C++

I want to read an XML file or text file inside a zip archive without extracting it from the archive. Can I do this directly, without extracting the file from the zip archive?

Yes, you can; wxZipInputStream should be what you are looking for.

// needs <wx/zipstrm.h> and <wx/wfstream.h>; strPageName is assumed to hold the current page's path
wxFFileInputStream in(_T("test.zip"));
wxZipInputStream zip(in);
std::unique_ptr<wxZipEntry> entry;
while (entry.reset(zip.GetNextEntry()), entry.get() != NULL) {
    // build the output path next to the current page
    wxString name = entry->GetName();
    name = strPageName.BeforeLast('\\') + wxFileName::GetPathSeparator() + name;
    zip.OpenEntry(*entry.get());
    wxFileOutputStream file(name);
    if (!file) {
        wxLogError(_T("Can not create file '") + name + _T("'."));
        break;
    }
    zip.Read(file);
}
I tried using wxZipInputStream. Yes, I can read files after extracting them from the archive, but I'd like to know whether I can read those files without extracting them from the zip archive.

// requires <wx/filesys.h>, <wx/fs_zip.h> and <wx/wfstream.h>
wxFileSystem::AddHandler(new wxZipFSHandler);
wxFileSystem fs;
wxFSFile *zip = fs.OpenFile("d:\\test.zip#zip:test.txt");
if (zip != NULL)
{
    wxInputStream *in = zip->GetStream();   // decompressed contents of test.txt
    if (in != NULL)
    {
        wxFileOutputStream out("d:\\testout.txt");
        out.Write(*in);
        out.Close();
    }
    delete zip;
}
Yes, you can read a file directly from the archive; the above is sample code. The wxInputStream returned by GetStream() already serves the decompressed contents, so you can read from it in memory instead of writing it out to disk.
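For example, here is a minimal sketch of reading the entry straight into memory (assuming <wx/filesys.h>, <wx/fs_zip.h> and <wx/sstream.h> are included, and the same archive path as above):

wxFileSystem::AddHandler(new wxZipFSHandler);   // register the handler once per program
wxFileSystem fs;
wxFSFile *file = fs.OpenFile("d:\\test.zip#zip:test.txt");
if (file != NULL)
{
    wxStringOutputStream contents;          // collects the bytes in memory
    file->GetStream()->Read(contents);      // decompressed on the fly, nothing written to disk
    wxString text = contents.GetString();   // the text of test.txt
    delete file;
}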

Related

Add files to the existing archive in XZip, C++, Windows

I am creating a Windows service that periodically copies the contents of a folder into an archive. I've searched for archiving libraries and found suggestions to use XZip.
Now I can create a new archive using code like this:
HZIP arch = CreateZip((void*)Archive.c_str(), 0, ZIP_FILENAME);
if (arch == nullptr)
    return addLogMessage("Unable to open archive.");
for (const auto& file : std::filesystem::directory_iterator(Directory)) {
    const std::wstring filePath = file.path().wstring();
    const std::wstring nameInArchive = file.path().filename().wstring();
    if (ZipAdd(arch, (TCHAR*)nameInArchive.c_str(), (TCHAR*)filePath.c_str(), 0, ZIP_FILENAME) != ZR_OK) {
        std::string what = "Error: Adding '" + std::string(filePath.begin(), filePath.end()) + "' to the archive.";
        addLogMessage(what.c_str());
    }
}
CloseZip(arch);
I also want to add files to an existing archive. When I run the code below, I get a ZR_ZMODE error from the ZipAdd function, which means "tried to mix create/open zip archive":
if (std::filesystem::exists(Archive)) {
    HZIP arch = OpenZip((void*)Archive.c_str(), 0, ZIP_FILENAME);
    if (arch == nullptr)
        return addLogMessage("Unable to re-open existing archive.");
    for (const auto& file : std::filesystem::directory_iterator(Directory)) {
        const std::wstring filePath = file.path().wstring();
        const std::wstring nameInArchive = file.path().filename().wstring();
        if (ZipAdd(arch, (TCHAR*)nameInArchive.c_str(), (TCHAR*)filePath.c_str(), 0, ZIP_FILENAME) != ZR_OK) {
            std::string what = "Error: Adding '" + std::string(filePath.begin(), filePath.end()) + "' to the archive.";
            addLogMessage(what.c_str());
        }
    }
    CloseZip(arch);
}
I assume that the XZip library doesn't allow adding files to an existing archive. Is that true?
EDIT: I've solved the problem with a different library, libzip. Now I can add new files to the old archive, and everything works fine. I've posted the source code and compiling/linking instructions here.
I am still interested in whether there is a way to add files to an existing archive with the XZip library, so I will not close the question for now.
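For reference, the libzip route boils down to something like the following minimal sketch (a hypothetical helper, not the code linked above; error handling is abbreviated and the paths are placeholders):

#include <zip.h>

// Append one file to an already existing archive with libzip.
bool appendToArchive(const char* archivePath, const char* filePath, const char* nameInArchive)
{
    int errCode = 0;
    zip_t* za = zip_open(archivePath, 0, &errCode);           // flags = 0: open an existing archive
    if (za == nullptr)
        return false;

    zip_source_t* src = zip_source_file(za, filePath, 0, -1); // file is read when the archive is written
    if (src == nullptr || zip_file_add(za, nameInArchive, src, ZIP_FL_ENC_UTF_8) < 0) {
        zip_source_free(src);                                 // the archive owns src only after a successful add
        zip_close(za);
        return false;
    }
    return zip_close(za) == 0;                                // writes the updated archive to disk
}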

How to modify the filename of the S3 object uploaded using the Kafka Connect S3 Connector?

I've been using the S3 connector for a couple of weeks now, and I want to change the way the connector names each file. I am using the HourlyBasedPartition, so the path to each file is already enough for me to locate it, and I want the filenames to be something generic across all files, like just 'Data.json.gzip' (with the respective path from the partitioner).
For example, I want to go from this:
<prefix>/<topic>/<HourlyBasedPartition>/<topic>+<kafkaPartition>+<startOffset>.<format>
To this:
<prefix>/<topic>/<HourlyBasedPartition>/Data.<format>
The objective is to make only one call to S3 to download the files later, instead of having to look up the filename first and then download each file.
Searching through the files in the 'kafka-connect-s3' folder, I found this file:
https://github.com/confluentinc/kafka-connect-storage-cloud/blob/master/kafka-connect-s3/src/main/java/io/confluent/connect/s3/TopicPartitionWriter.java which, towards the end, contains the following functions:
private RecordWriter getWriter(SinkRecord record, String encodedPartition)
    throws ConnectException {
  if (writers.containsKey(encodedPartition)) {
    return writers.get(encodedPartition);
  }
  String commitFilename = getCommitFilename(encodedPartition);
  log.debug(
      "Creating new writer encodedPartition='{}' filename='{}'",
      encodedPartition,
      commitFilename
  );
  RecordWriter writer = writerProvider.getRecordWriter(connectorConfig, commitFilename);
  writers.put(encodedPartition, writer);
  return writer;
}

private String getCommitFilename(String encodedPartition) {
  String commitFile;
  if (commitFiles.containsKey(encodedPartition)) {
    commitFile = commitFiles.get(encodedPartition);
  } else {
    long startOffset = startOffsets.get(encodedPartition);
    String prefix = getDirectoryPrefix(encodedPartition);
    commitFile = fileKeyToCommit(prefix, startOffset);
    commitFiles.put(encodedPartition, commitFile);
  }
  return commitFile;
}

private String fileKey(String topicsPrefix, String keyPrefix, String name) {
  String suffix = keyPrefix + dirDelim + name;
  return StringUtils.isNotBlank(topicsPrefix)
      ? topicsPrefix + dirDelim + suffix
      : suffix;
}

private String fileKeyToCommit(String dirPrefix, long startOffset) {
  String name = tp.topic()
      + fileDelim
      + tp.partition()
      + fileDelim
      + String.format(zeroPadOffsetFormat, startOffset)
      + extension;
  return fileKey(topicsDir, dirPrefix, name);
}
I don't know if this can be customised to do what I want, but it seems to be related to my intentions. Hope it helps.
(I've submitted an issue on GitHub as well: https://github.com/confluentinc/kafka-connect-storage-cloud/issues/369)

Append files to an existing zip file with Poco::Zip

After successfully compressing the folder, here is my situation:
If append = true and overWrite = false, I have to check whether the target zip file exists. If it does, I need to determine which files from the source folder are not yet in the existing zip file and append those new files to it.
My question is:
How can I open the zip file and pass it to the Compress object? Or which other Poco class should I use to open a zip stream? I'm trying to use std::ifstream, but Poco::Zip::Compress doesn't seem to accept an std::ifstream.
I will surely have to modify the Poco source code itself to meet my requirement. Thanks in advance.
void ZipFile(std::string source, std::string target,
             std::vector<std::string> extensions, bool append, bool overWrite)
{
    Poco::File tempFile(source);
    if (tempFile.exists())
    {
        if (Poco::File(target).exists() && append && !overWrite) {
            fs::path targetPath = fs::path(target);
            std::ifstream targetFileStream(targetPath.string(), std::ios::binary);
            std::ofstream outStream(target, std::ios::binary);
            CompressEx compress(outStream, false, false);
            if (tempFile.isDirectory())
            {
                Poco::Path sourceDir(source);
                sourceDir.makeDirectory();
                compress.addRecursive(sourceDir, Poco::Zip::ZipCommon::CompressionMethod::CM_AUTO,
                                      Poco::Zip::ZipCommon::CL_NORMAL, false);
            }
            else if (tempFile.isFile())
            {
                Poco::Path path(tempFile.path());
                compress.addFile(path, path.getFileName(), Poco::Zip::ZipCommon::CompressionMethod::CM_AUTO,
                                 Poco::Zip::ZipCommon::CL_NORMAL);
            }
            compress.close(); // MUST be done to finalize the Zip file
            outStream.close();
        }
    }
}
No need to modify the Poco source code. Poco allows you to get the contents of an archive and add files to it.
First, open the target archive to check which files are already in there:
Poco::Zip::ZipArchive archive(targetFileStream);   // targetFileStream is the std::ifstream opened on the existing zip
Then collect all files you want to add that are not in the archive yet:
std::vector<fs::path> files;
if (fs::is_directory(source)) {
    for (auto &entry : fs::recursive_directory_iterator(source)) {
        // keep the entry if it is a regular file and not in the zip yet
        if (fs::is_regular_file(entry)
            && archive.findHeader(fs::relative(entry.path(), source).string()) == archive.headerEnd()) {
            files.push_back(entry.path());
        }
    }
} else if (fs::is_regular_file(source)
           && archive.findHeader(source) == archive.headerEnd()) {
    files.push_back(source);
}
Finally, add the files to your zip:
Poco::Zip::ZipManipulator manipulator(target, false);   // false: don't keep a backup of the original
for (auto &file : files)
    manipulator.addFile(fs::relative(file, source).string(), Poco::Path(file.string()),
                        Poco::Zip::ZipCommon::CompressionMethod::CM_AUTO,
                        Poco::Zip::ZipCommon::CL_NORMAL);
manipulator.commit();   // writes the modified archive back to disk
I had no opportunity to test this. So try it out and see what needs to be done to make it work.

Poco::Zip set Extension List

After successfully compressing to a zip file, I want to add an extensions list so that only files whose extensions are in the list get zipped.
std::set<std::string> extensionsSet;
std::ofstream fos(target, std::ios::binary);
Poco::Zip::Compress c(fos, true);
extensionsSet.insert("txt");
c.setStoreExtensions(extensionsSet);              // set the extensions list
std::set<std::string> a = c.getStoreExtensions(); // a contains one string, which is "txt"
Poco::File aFile(source);
if (aFile.exists())
{
    if (aFile.isDirectory())
    {
        Poco::Path sourceDir(source);
        sourceDir.makeDirectory();
        c.addRecursive(sourceDir, Poco::Zip::ZipCommon::CompressionMethod::CM_DEFLATE,
                       Poco::Zip::ZipCommon::CL_NORMAL, false);
    }
    else if (aFile.isFile())
    {
        Poco::Path p(aFile.path());
        c.addFile(p, p.getFileName(), Poco::Zip::ZipCommon::CompressionMethod::CM_AUTO,
                  Poco::Zip::ZipCommon::CL_NORMAL);
    }
}
else {
    _log.EndMethod();
    throw Poco::FileNotFoundException("File Not Found");
}
c.close(); // MUST be done to finalize the Zip file
fos.close();
Can anyone help me find out what's wrong with my code? I don't see any difference when I set the extensions list. I want to zip a file (e.g. test.txt); following my code, shouldn't "test.txt" be included in the zip file?
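If the goal is simply to restrict which files get added, one option is to filter the files yourself before calling addFile; as far as I can tell, setStoreExtensions only tells Compress which extensions should be stored uncompressed when CM_AUTO is used, not which files to include. A rough sketch for the directory case, assuming C++17 std::filesystem, the same source/target variables as above, and the usual <set>, <fstream> and Poco/Zip headers:

namespace fs = std::filesystem;

std::set<std::string> allowed{"txt"};            // extensions to include
std::ofstream fos(target, std::ios::binary);
Poco::Zip::Compress c(fos, true);

for (const auto& entry : fs::recursive_directory_iterator(source))
{
    if (!fs::is_regular_file(entry))
        continue;
    // extension() returns ".txt"; drop the leading dot before the lookup
    std::string ext = entry.path().extension().string();
    if (!ext.empty() && allowed.count(ext.substr(1)) > 0)
    {
        Poco::Path p(entry.path().string());
        c.addFile(p, fs::relative(entry.path(), source).string(),
                  Poco::Zip::ZipCommon::CompressionMethod::CM_DEFLATE,
                  Poco::Zip::ZipCommon::CL_NORMAL);
    }
}
c.close(); // MUST be done to finalize the Zip file
fos.close();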

Write to existing json file

I am using this code to add to my existing JSON file. However, it completely overwrites my JSON file and just puts one JSON object in it, when I would like to add another item to the list of items in my JSON file. How would I fix this?
Json::Value root;
root[h]["userM"] = m;
root[h]["userT"] = t;
root[h]["userF"] = f;
root[h]["userH"] = h;
root[h]["userD"] = d;
Json::StreamWriterBuilder builder;
std::unique_ptr<Json::StreamWriter> writer(builder.newStreamWriter());
std::ofstream outputFileStream("messages.json");
writer->write(root, &outputFileStream);
My recommendation is:
1. Load the file into a Json::Value
2. Add or change whatever fields you want
3. Overwrite the original file with the updated Json::Value
Doing this is going to be the least error-prone method, and it'll work quickly unless you have a very large JSON file.
How to read in the entire file
This is pretty simple! We create the root and then parse the file into it (the code below uses Json::Reader; jsoncpp's >> operator works just as well).
Json::Value readFile(std::istream& file) {
    Json::Value root;
    Json::Reader reader;
    bool parsingSuccessful = reader.parse(file, root);
    if (not parsingSuccessful) {
        // Handle error case
    }
    return root;
}
See this documentation here for more information
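Putting it together, the full read-modify-rewrite cycle from the recommendation above might look like this sketch (reusing the readFile helper and the variable names h, m, t, f, d from the question, which are assumed to be in scope):

std::ifstream inputFileStream("messages.json");
Json::Value root = readFile(inputFileStream);         // 1. load the existing contents
inputFileStream.close();

root[h]["userM"] = m;                                 // 2. add the new item alongside the old ones
root[h]["userT"] = t;
root[h]["userF"] = f;
root[h]["userH"] = h;
root[h]["userD"] = d;

Json::StreamWriterBuilder builder;
std::unique_ptr<Json::StreamWriter> writer(builder.newStreamWriter());
std::ofstream outputFileStream("messages.json");      // 3. overwrite the file with the merged value
writer->write(root, &outputFileStream);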