Trying to use URLRequestInfo::SetStreamToFile and URLResponseInfo::GetBodyAsFileRef - c++

I am working on a NaCl plugin for Chrome and trying to download a URL resource into Chrome's temporary cache, but without success.
Here is how I proceed:
// I indicate that I want the resource to be downloaded to a file
m_URLRequestInfo.SetStreamToFile( true );
// I open the request:
m_URLLoader.Open( m_URLRequestInfo, m_CCFactory.NewCallback( &MyClass::OnOpen ) );
...
// My callback (OnOpen) is eventually called.
// I then check to make sure the status code is 200 using this call:
m_URLLoader.GetResponseInfo().GetStatusCode()
// Then I ask to download the whole file:
m_URLLoader.FinishStreamingToFile( m_CCFactory.NewOptionalCallback( &MyClass::OnFileDownloaded ) );
...
// My other callback (OnFileDownloaded) gets eventually called,
// and again the status code is 200.
// Then I query the FileRef using this call:
pp::FileRef l_FileRef = m_URLLoader.GetResponseInfo().GetBodyAsFileRef();
The returned pp::FileRef seems to be fine, but pp::FileRef::GetFileSystemType() returns PP_FILESYSTEMTYPE_EXTERNAL, and then the call to pp::FileRef::GetPath() fails (it returns an UNDEFINED pp::Var).
So from this point, I am lost. I don't know what else I should do to get a valid pp::FileRef that points to a local file in the browser's cache. My final goal is to open this local file (an image file in my case) using standard system file I/O like fopen().
Thanks for shedding any light on this!

Is there a reason you can't use the nacl_io library instead? With it you can write something like this:
#include <stdio.h>
#include <sys/mount.h>
// Mount an HTTP filesystem that reads files from "http://example.com/path/...".
mount("http://example.com/path/", "/mnt/http", "httpfs", 0, "");
// Performs a URL Request of "http://example.com/path/my_image.png".
FILE* file = fopen("/mnt/http/my_image.png", "r");
...
Take a look at the nacl_io demo in the SDK. It is located at $NACL_SDK_ROOT/examples/demo/nacl_io.
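One thing to keep in mind (treat the exact calls below as an assumption on my part, based on the nacl_io headers): nacl_io has to be initialized with the PPAPI instance before mount() works, and since its calls block, they should run on a worker thread rather than on the main PPAPI thread.
#include "nacl_io/nacl_io.h"
#include "ppapi/cpp/instance.h"
#include "ppapi/cpp/module.h"
// Call this once from your pp::Instance (e.g. in Init) before using mount()/fopen():
nacl_io_init_ppapi( pp_instance(), pp::Module::Get()->get_browser_interface() );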

After reading the documentation more thoroughly and running more tests, I finally figured out what I was doing wrong.
When we call pp::URLLoader::FinishStreamingToFile, the file is downloaded into the browser's cache, but it cannot be opened/read using regular stdio services like fopen, fread etc. We need to use the pp::FileIO class to open the obtained pp::FileRef and read the content of the file.
So here is what I did to successfully load and read a file that was downloaded for me by the browser. Basically, I kept using the C++ PPAPI services.
(1) upon callback from m_URLLoader->FinishStreamingToFile, we then call m_FileIO->Open to open the downloaded file using the obtained FileRef;
(2) upon callback from m_FileIO->Open, we then call m_FileIO->Query to obtain the size of the downloaded file (and some other file attributes);
(3) upon callback from pp::FileIO::Query, we then check the file attribute type (e.g. not a folder), allocate a memory buffer large enough to hold the whole file content, and start to call pp::FileIO::Read to obtain the file's content;
(4) upon callback from pp::FileIO::Read: if the obtained nResult argument is 0, we have reached EOF and have finished reading the file content into our memory buffer; if nResult > 0, it indicates the number of bytes successfully read, and we call m_FileIO->Read again to continue reading, storing the new bytes at the appropriate offset in our memory buffer; if nResult < 0, an error occurred and we must abort the reading process.
Many steps, and many callbacks to manage, but in the end this works smoothly.
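For reference, here is a condensed sketch of that callback chain. The member names (m_FileIO, m_FileInfo, m_Buffer, m_Offset) are illustrative, error handling is reduced to early returns, and the relevant ppapi/cpp headers are assumed to be included.
// Members assumed on MyClass (a pp::Instance): pp::URLLoader m_URLLoader;
// pp::FileIO m_FileIO; PP_FileInfo m_FileInfo; std::vector<char> m_Buffer;
// int64_t m_Offset; pp::CompletionCallbackFactory<MyClass> m_CCFactory;
void MyClass::OnFileDownloaded( int32_t nResult )
{
    pp::FileRef l_FileRef = m_URLLoader.GetResponseInfo().GetBodyAsFileRef();
    m_FileIO = pp::FileIO( this );
    m_FileIO.Open( l_FileRef, PP_FILEOPENFLAG_READ,
                   m_CCFactory.NewCallback( &MyClass::OnFileOpened ) );
}
void MyClass::OnFileOpened( int32_t nResult )
{
    if ( nResult != PP_OK ) return;
    m_FileIO.Query( &m_FileInfo, m_CCFactory.NewCallback( &MyClass::OnFileQueried ) );
}
void MyClass::OnFileQueried( int32_t nResult )
{
    if ( nResult != PP_OK || m_FileInfo.type != PP_FILETYPE_REGULAR ) return;
    m_Buffer.resize( static_cast<size_t>( m_FileInfo.size ) );
    m_Offset = 0;
    if ( !m_Buffer.empty() )
        ReadNextChunk();
}
void MyClass::ReadNextChunk()
{
    m_FileIO.Read( m_Offset, &m_Buffer[ m_Offset ],
                   static_cast<int32_t>( m_Buffer.size() - m_Offset ),
                   m_CCFactory.NewCallback( &MyClass::OnRead ) );
}
void MyClass::OnRead( int32_t nResult )
{
    if ( nResult > 0 )
    {
        m_Offset += nResult;                                       // bytes read so far
        if ( m_Offset < static_cast<int64_t>( m_Buffer.size() ) )
            ReadNextChunk();                                       // keep reading
        // else: the whole file is now in m_Buffer
    }
    else if ( nResult == 0 ) { /* EOF */ }
    else                     { /* nResult < 0: read error, abort */ }
}
Reading straight into a buffer sized from Query keeps the bookkeeping simple; a fixed-size chunk buffer works just as well if you don't want to preallocate the whole file.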

Related

Could DropBox interfere with DeleteFile()/rename()

I had the following code which got executed every two minutes all day long:
int successfully_deleted = DeleteFile(dest_filename);
if (!successfully_deleted)
{
// this never happens
}
rename(source_filename,dest_filename);
Once every several hours the rename() would fail with errno=13 (EACCES). The files involved were all sitting in a DropBox directory and I had a hunch that DropBox could be the cause. I figured it might just be possible that DeleteFile() returns a non-zero successfully_deleted while DropBox is actually still busy doing something related to the deletion that prevents rename() from succeeding. What I did next was to change rename() to my_rename(), which would attempt a rename() and, upon any failure, Sleep() for one second and try a second time. Sure enough, that has worked perfectly ever since. What's more, I get a diagnostic message displaying first-attempt failures every several hours. It has never failed on the second attempt.
So you could say that the problem is entirely solved... but I would like to understand what might be going on so as to better defend myself against any related DropBox issues in the future...
Really I would like to have a new super_delete() function which does not return until the file is properly deleted and finished with in all respects.
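For what it's worth, a retry wrapper along the lines described above might look like this (the retry count and delay are arbitrary; the names are illustrative):
#include <stdio.h>     // rename
#include <windows.h>   // Sleep
// Attempt rename(); on failure, wait a second and try once more.
int my_rename( const char* source_filename, const char* dest_filename )
{
    if ( rename( source_filename, dest_filename ) == 0 )
        return 0;                                           // first attempt succeeded
    // First attempt failed (e.g. EACCES while DropBox still holds the file).
    Sleep( 1000 );
    return rename( source_filename, dest_filename );        // second (and last) attempt
}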
Under Windows, a request to delete a file never really deletes the file right away. It just marks its FCB (File Control Block) with a special flag (FCB_STATE_DELETE_ON_CLOSE). The real deletion happens only when the last handle to the file is closed.
The DeleteFile function marks a file for deletion on close. Therefore,
the file deletion does not occur until the last handle to the file is
closed. Subsequent calls to CreateFile to open the file fail with
ERROR_ACCESS_DENIED.
Also, if a section (memory-mapped file) is open on the file, the file cannot even be marked for deletion; the API call fails with STATUS_CANNOT_DELETE. So in general it is not always possible to delete a file.
If other open handles exist for the file (but no section!), starting with Windows 10 RS1 there is new delete functionality: FileDispositionInformationEx with FILE_DISPOSITION_POSIX_SEMANTICS. In this case:
Normally a file marked for deletion is not actually deleted until all
open handles for the file have been closed and the link count for the
file is zero. When marking a file for deletion using
FILE_DISPOSITION_POSIX_SEMANTICS, the link gets removed from the visible namespace as soon as the POSIX delete handle has been closed,
but the file’s data streams remain accessible by other existing
handles until the last handle has been closed.
ULONG DeletePosix(PCWSTR lpFileName)
{
    HANDLE hFile = CreateFileW(lpFileName, DELETE, FILE_SHARE_VALID_FLAGS, 0, OPEN_EXISTING,
                               FILE_FLAG_BACKUP_SEMANTICS | FILE_FLAG_OPEN_REPARSE_POINT, 0);
    if (hFile == INVALID_HANDLE_VALUE)
    {
        return GetLastError();
    }

    static FILE_DISPOSITION_INFO_EX fdi = { FILE_DISPOSITION_DELETE | FILE_DISPOSITION_POSIX_SEMANTICS };

    ULONG dwError = SetFileInformationByHandle(hFile, FileDispositionInfoEx, &fdi, sizeof(fdi))
                        ? NOERROR : GetLastError();

    // win10 rs1: the file is removed from its parent folder here
    CloseHandle(hFile);

    return dwError;
}
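A hypothetical call site (the path is made up) would then be:
ULONG err = DeletePosix( L"C:\\data\\old_report.db" );
if ( err != NOERROR )
{
    // deletion failed; err is the Win32 error from CreateFileW or SetFileInformationByHandle
}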
Update
Sorry, I didn't read the question correctly the first time. I thought DeleteFile returned error 13.
Now I understand that DeleteFile succeeds but rename fails immediately after.
It could just be a sync issue with the filesystem. After calling DeleteFile, the file will be deleted when the OS commits the changes to the filesystem. That may not happen immediately.
If you need to perform multiple operations to the same path, you should have a look at transactions https://learn.microsoft.com/it-it/windows/desktop/api/winbase/nf-winbase-deletefiletransacteda.
-- OLD ANSWER --
That is correct. If another application holds handles to that file, DeleteFile will fail.
Citing MSDN docs https://learn.microsoft.com/en-us/windows/desktop/api/winbase/nf-winbase-deletefile :
The DeleteFile function fails if an application attempts to delete a file that has other handles open for normal I/O or as a memory-mapped file (FILE_SHARE_DELETE must have been specified when other handles were opened).
This applies to Dropbox, the antivirus or, in general, any other application that may open those files.
Dropbox may open the file to compute its hash (to look for changes) at any moment. The same goes for the antivirus.
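To illustrate the FILE_SHARE_DELETE point, this is roughly how another process would have to open the file so that your DeleteFile/rename can still succeed while that handle is open (a sketch, not Dropbox's actual code):
// Open for reading while still allowing other processes to delete/rename the file.
HANDLE h = CreateFileW( L"C:\\DropBox\\some_file.dat", GENERIC_READ,
                        FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
                        NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL );
if ( h != INVALID_HANDLE_VALUE )
{
    // ... hash the contents ...
    CloseHandle( h );
}
If the sharing mode omits FILE_SHARE_DELETE, your DeleteFile/rename fails with a sharing violation for as long as that handle lives.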

URLDownloadToCacheFile Problems

My first question here, please give me a hint if I do something wrong!
I'm using the URLDownloadToCacheFile function in different places of a software project I work for.
In the main UI I use this function to update an INI file from the Internet. Here I download the INI file directly from a URL. Works well.
In a DLL I use the same function to download a little binary file. This file is exactly 308 bytes. It is an encrypted text file with the extension ".db".
It's this second call that fails.
But this function does not fail on all computers; it only fails on a handful of them.
On my development computer it does not fail.
On some customer computers it does fail.
But just the one in the DLL. The call in the main GUI does not fail. Any idea? Or more information needed?
HRESULT hr = URLDownloadToCacheFile(
    NULL,                              // ActiveX component calling this function
    dbUrl,                             // URL to download
    strFileName.GetBuffer(MAX_PATH),   // pointer to a string that receives the name of the downloaded file
    URLOSTRM_GETNEWESTVERSION,         // size of filename buffer above
    0,                                 // reserved; must be zero
    NULL);                             // optional IBindStatusCallback
if (SUCCEEDED(hr))
{
}
The problem occurs because you are trying to run the function on the same thread that created the DLL. In this case, use a different thread inside the DLL to run the download function.
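A sketch of that suggestion, assuming the same MFC/ATL CString the question already uses and a plain Win32 worker thread started from the DLL (but not from DllMain); the names are illustrative:
struct DownloadParams
{
    CString url;
    TCHAR   szFile[MAX_PATH];
    HRESULT hr;
};

static DWORD WINAPI DownloadThreadProc( LPVOID pv )
{
    DownloadParams* p = static_cast<DownloadParams*>( pv );
    // Depending on how the DLL is hosted, CoInitializeEx on this thread may also be needed.
    p->hr = URLDownloadToCacheFile( NULL, p->url, p->szFile, MAX_PATH, 0, NULL );
    return 0;
}

HRESULT DownloadViaWorkerThread( LPCTSTR dbUrl, CString& strFileName )
{
    DownloadParams params;
    params.url       = dbUrl;
    params.hr        = E_FAIL;
    params.szFile[0] = 0;

    HANDLE hThread = CreateThread( NULL, 0, DownloadThreadProc, &params, 0, NULL );
    if ( hThread == NULL )
        return E_FAIL;
    WaitForSingleObject( hThread, INFINITE );   // or poll/post a message instead of blocking
    CloseHandle( hThread );

    if ( SUCCEEDED( params.hr ) )
        strFileName = params.szFile;
    return params.hr;
}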

C++ WinINet InternetReadFile function refresh

I am trying to get the content of a file using WinINet in C++. The file is an XML file and is generated by an executable on a server.
The code to init, connect and read a file at the specified server address is working:
// Connect to internet.
m_hInternet = InternetOpen(L"HTTPRIP",INTERNET_OPEN_TYPE_PRECONFIG,NULL,NULL,0);
// Check if worked.
if( !m_hInternet )
return;
// Connect to selected URL.
m_hUrl = InternetOpenUrlA(m_hInternet, strUrl.c_str(), NULL, 0, INTERNET_FLAG_PRAGMA_NOCACHE | INTERNET_FLAG_RESYNCHRONIZE, 0);
// Check if worked.
if( !m_hUrl )
return;
if( InternetReadFile(m_hUrl, buf, BUFFER_SIZE, &bytesread) && bytesread != 0 )
{
// Put into std::string.
strData = std::string(buf,buf+bytesread);
}
Now I want to update the file (same address). The server updates the file at 50 Hz, and I want my code to read the file only if it has been updated by the server. Can InternetReadFile do that kind of thing? Maybe with a flag, but I didn't find anything on MSDN.
Thanks for your help.
There is no way in the HTTP protocol to do that directly, hence there is no such function in WinINet. The easiest solution might be to download the file and see if it has changed, if the file is relatively small; or, if the file is large, let the server which writes the file also write a timestamp, checksum or counter-increment file next to it.
Then your code would download the checksum file, see if it has changed, and in that case download the original file.
Or another solution would be to put a timestamp or similar data at the beginning of the XML file, and stop downloading the file if the timestamp (or checksum) has not been updated. (This comes with its own drawbacks of course; you may have to write your own parser.)
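A rough sketch of that companion-file idea, reusing the same WinINet calls shown in the question (the stamp URL, helper name and polling interval are made up):
#include <windows.h>
#include <wininet.h>
#include <string>

// Download a (small) resource into a std::string; returns an empty string on failure.
std::string FetchUrl( HINTERNET hInternet, const std::string& strUrl )
{
    std::string strData;
    HINTERNET hUrl = InternetOpenUrlA( hInternet, strUrl.c_str(), NULL, 0,
                                       INTERNET_FLAG_PRAGMA_NOCACHE | INTERNET_FLAG_RELOAD, 0 );
    if ( !hUrl )
        return strData;
    char buf[4096];
    DWORD bytesread = 0;
    while ( InternetReadFile( hUrl, buf, sizeof(buf), &bytesread ) && bytesread != 0 )
        strData.append( buf, buf + bytesread );
    InternetCloseHandle( hUrl );
    return strData;
}

// Poll the stamp file and only download the XML when the stamp changes.
std::string lastStamp;
for (;;)
{
    std::string stamp = FetchUrl( m_hInternet, strStampUrl );   // e.g. ".../data.stamp"
    if ( !stamp.empty() && stamp != lastStamp )
    {
        lastStamp = stamp;
        strData = FetchUrl( m_hInternet, strUrl );              // the XML itself
        // ... parse strData ...
    }
    Sleep( 20 );   // 50 Hz polling is probably more than you need
}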
If the HTTP server has a page with info (e.g. a timestamp) about this file (it does not matter that the file is generated; the page may be generated too), you can examine that page.
Since you know that the server updates the file at a (nearly) constant rate, your app could just use a timer.
P.S. I doubt there is really much sense in reading a file 50 times every second.

Receiving a Sharing Violation Opening a File Code 32

I have been trying the following piece of code, which does not work. What I am trying to do is start executing my exe (a simple dialog-based application I created with VC6.0) and then, from inside this running application, modify its own contents stored on the hard drive.
So there is a running copy of the exe, and from this running copy it will load the disk copy into a buffer. Once it is loaded into a buffer, a search for a string begins. Once the string is found, it will be replaced with another string which may not be the same size as the original.
Right now I am having an issue of not being able to open the file on disk for reading/writing. GetLastError returns the following error: "ERROR_SHARING_VIOLATION: The process cannot access the file because it is being used by another process."
So what I did was rename the file on disk to another name (essentially the same name except for the extension). Same sharing violation error again. I am not sure why I am getting this sharing violation error code of 32. Any suggestions would be appreciated. I'll ask the second part of my question in another thread.
FILE* pFile = fopen("Test.exe", "rb");
if (pFile != NULL)
{
    // do something like search for a string
}
else
{
    // fopen failed.
    int value = GetLastError(); // returns 32
    exit(1);
}
Read the Windows part of the File Locking Wikipedia entry: you can't modify files that are currently executing.
You can rename and copy them, but you can't change them. So what you are trying to do is simply not possible. (Renaming the file doesn't unlock it at all; it's still the same file after the rename, so it's still not modifiable.)
You could copy your executable, modify that copy, and then run that copy instead.
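A minimal sketch of that copy-patch-run approach (the paths, and the patching itself, are placeholders):
#include <windows.h>
#include <tchar.h>
#include <stdio.h>

TCHAR szSelf[MAX_PATH];
GetModuleFileName( NULL, szSelf, MAX_PATH );          // full path of the running exe
CopyFile( szSelf, _T("Test_patched.exe"), FALSE );    // the copy is not locked for execution

FILE* pFile = fopen( "Test_patched.exe", "r+b" );     // this open now succeeds
if ( pFile != NULL )
{
    // ... load, search for the string, write the replacement back ...
    fclose( pFile );

    STARTUPINFO si = { sizeof(si) };
    PROCESS_INFORMATION pi;
    if ( CreateProcess( _T("Test_patched.exe"), NULL, NULL, NULL, FALSE, 0, NULL, NULL, &si, &pi ) )
    {
        CloseHandle( pi.hThread );
        CloseHandle( pi.hProcess );
    }
}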

FTPClient in MFC: GetFile (download) issue

I am using the CFtpConnection class to create my FTPClient library with MFC.
I am using GetFile to download a file from the server.
My requirement is: if I am downloading a 100 MB video from the server and 50-60 MB of it has been downloaded, and in the meantime I play the file, it should play up to the point that has been downloaded so far.
Is there a way I can do that? Are there any additional parameters I need to pass, or something like that?
My FTP library's download method is as follows:
CFtpConnection* m_pConnect;
bool CFTPClient::Download(LPCTSTR pstrRemoteFile, LPCTSTR pstrLocalFile,
                          DWORD dwFlags)
{
    return m_pConnect->GetFile(pstrRemoteFile, pstrLocalFile, dwFlags) != FALSE;
}
And when calling it in my application I do this:
CFTPClient m_objftpclient ;
m_objftpclient.Download("MVI_2884_1.avi","D:\\MVI_2884_1.avi",FTP_TRANSFER_TYPE_BINARY);
You can't do that easily, or possibly at all. The GetFile method of CFtpConnection is blocking, which means it returns only when the file has been fully downloaded. So even if you run it in a thread, the only way you can monitor the download is to check the size of the file on disk.
If you want to implement video streaming, you should go down a level and work at the socket level. If you really want to use CFtpConnection, you should use the OpenFile method, which returns a CInternetFile that can be read in chunks, allowing you to monitor the download and share the buffer into which the file is downloaded for playback.
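For example, the Download method could be rewritten along these lines with OpenFile (a sketch: the chunk size is arbitrary and MFC exception handling is omitted):
bool CFTPClient::Download( LPCTSTR pstrRemoteFile, LPCTSTR pstrLocalFile, DWORD dwFlags )
{
    CInternetFile* pRemote = m_pConnect->OpenFile( pstrRemoteFile, GENERIC_READ, dwFlags );
    if ( pRemote == NULL )
        return false;

    // shareDenyNone lets a player open the local file while the download is still running.
    CFile local( pstrLocalFile, CFile::modeCreate | CFile::modeWrite | CFile::shareDenyNone );
    BYTE buffer[ 64 * 1024 ];
    UINT nRead = 0;
    while ( ( nRead = pRemote->Read( buffer, sizeof(buffer) ) ) > 0 )
    {
        local.Write( buffer, nRead );
        local.Flush();                // the local file is playable up to this point
        // notify the player / report progress here
    }
    pRemote->Close();
    delete pRemote;
    return true;
}
Whether the video actually plays while incomplete then depends on the player and the container format, but at least the partially downloaded data is on disk and readable as it arrives.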