I posted a question earlier regarding obtaining GPU clock speeds, but I guess the thread appeared as already answered since someone had replied to it.
I'd been advised by one of your members to try extracting GPU clock speeds using SetupDiEnumDeviceInfo.
However, I looked around at some examples, like this one: http://www.codeproject.com/KB/system/DevMgr.aspx
and nothing in them seemed to display anything about the clock speed.
Could someone please elaborate on how to achieve this, if at all possible?
Thanks again
You will want to check out this MSDN article:
http://msdn.microsoft.com/en-us/library/bb742655.aspx
Specifically, follow these steps:
Call SetupDiGetDeviceRegistryProperty to retrieve the size, in bytes, of the property value. Supply the following parameter values:
Set DeviceInfoSet to a handle to a device information set that contains the device instance for which to retrieve the requested property value.
Set DeviceInfoData to a pointer to an SP_DEVINFO_DATA structure that represents the device instance for which to retrieve the requested property value.
Set Property to an SPDRP_Xxx identifier. For a list of these identifiers and a description of the corresponding device properties, see the description of the Property parameter that is included with SetupDiSetDeviceRegistryProperty.
Set PropertyRegDataType to a pointer to a DWORD-typed variable.
Set PropertyBuffer to NULL.
Set PropertyBufferSize to zero.
Set RequiredSize to a pointer to a DWORD-typed variable that receives the size, in bytes, of the property value.
In response to this first call, SetupDiGetDeviceRegistryProperty sets *RequiredSize to the size, in bytes, of the buffer that is required to retrieve the property value, sets the error code to ERROR_INSUFFICIENT_BUFFER, and returns FALSE. A subsequent call to GetLastError will return the most recently logged error code.
Call SetupDiGetDeviceRegistryProperty again and supply the same parameter values that were supplied in the first call, except for the following changes:
Set PropertyBuffer to a pointer to a BYTE-typed buffer that receives the requested property value.
Set PropertyBufferSize to the size, in bytes, of the PropertyBuffer buffer. The first call to SetupDiGetDeviceRegistryProperty retrieved the required size of the PropertyBuffer buffer in *RequiredSize.
This link shows how to get to the point where you've got the required structures to call SetupDiGetDeviceRegistryProperty.
http://www.pinvoke.net/default.aspx/setupapi/SetupDiEnumDeviceInfo.html
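Putting those steps together, here is a minimal sketch in C++. Treat it as an assumption-laden example rather than a definitive implementation: it enumerates every present device and reads SPDRP_DEVICEDESC, because clock speed is not among the SPDRP_Xxx identifiers, so this API may simply not expose what you are after.

#include <windows.h>
#include <setupapi.h>
#include <cstdio>
#include <cstdlib>
#pragma comment(lib, "setupapi.lib")

int main()
{
    // Build a device information set containing all present devices.
    HDEVINFO hDevInfo = SetupDiGetClassDevsA(NULL, NULL, NULL,
                                             DIGCF_ALLCLASSES | DIGCF_PRESENT);
    if (hDevInfo == INVALID_HANDLE_VALUE)
        return 1;

    SP_DEVINFO_DATA devInfo;
    devInfo.cbSize = sizeof(devInfo);
    for (DWORD i = 0; SetupDiEnumDeviceInfo(hDevInfo, i, &devInfo); ++i)
    {
        DWORD type = 0, required = 0;
        // First call: NULL buffer and zero size; it fails with
        // ERROR_INSUFFICIENT_BUFFER and fills in 'required'.
        SetupDiGetDeviceRegistryPropertyA(hDevInfo, &devInfo, SPDRP_DEVICEDESC,
                                          &type, NULL, 0, &required);
        if (GetLastError() != ERROR_INSUFFICIENT_BUFFER)
            continue; // this device has no such property

        BYTE* buffer = (BYTE*)malloc(required);
        // Second call: same parameters, now with a buffer of the required size.
        if (SetupDiGetDeviceRegistryPropertyA(hDevInfo, &devInfo, SPDRP_DEVICEDESC,
                                              &type, buffer, required, NULL))
            printf("%s\n", (char*)buffer);
        free(buffer);
    }

    SetupDiDestroyDeviceInfoList(hDevInfo);
    return 0;
}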
I want to use dialog_fselect for selecting a file in a C++ console application. I wonder how to get the resulting path out of dialog_fselect.
For example, when I run:
dialog_fselect("Path", "", getmaxy(main_window)-10, getmaxx(main_window)-10);
how can I get the selected path?
dialog_fselect copies the result to dialog_vars.input_result:
Certain widgets copy a result to this buffer. If the pointer is NULL, or if the length is insufficient for the result, then the dialog library allocates a buffer which is large enough, and sets DIALOG_VARS.input_length. Callers should check for this case if they have supplied their own buffer.
(The capitalized DIALOG_VARS in the manual page refers to the type name rather than the actual variable of that type; see DATA STRUCTURES.)
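A minimal sketch, assuming the dialog library is initialized and linked (the fixed height/width values here stand in for your getmaxy/getmaxx expressions):

#include <dialog.h>
#include <iostream>
#include <string>

int main()
{
    init_dialog(stdin, stdout);                  // set up curses and the dialog library

    int rc = dialog_fselect("Path", "", 20, 60); // fixed sizes for the sketch

    std::string selected;
    if (rc == DLG_EXIT_OK && dialog_vars.input_result != 0)
        selected = dialog_vars.input_result;     // copy it out before the buffer is reused

    end_dialog();                                // restore the terminal

    if (!selected.empty())
        std::cout << "Selected: " << selected << "\n";
    return 0;
}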
In this code:
std::string filename;
osg::Image* image = osgDB::readImageFile(filename + ".dicom");
the osg::Image* variable image gets a wrong value back from the read. Stepping to the line above in the debugger, the watch window shows the following:
The _fileName (std::string) values shown on the first and second lines of the watch window are both "digest", but on the fourth line the value of _fileName turns out to be "iiiiii\x*6" with a capacity equal to 0.
As I understand it, the _fileName on the fourth line of the watch window refers to the same osg::Image member variable as the _fileName on the first and second lines, so all the _fileName entries in the watch window should show the same value. I am not sure why they differ.
MSVC++ implementation of std::string uses different storage strategies for short strings and for long ones. Short strings (16 bytes or less) are stored in a buffer embedded directly inside the std::string object (you will see it as _Bx._Buf in Raw View). Long strings are stored in an independently-allocated block of memory located elsewhere (pointed by _Bx._Ptr).
If you violate the integrity of a std::string object, you can easily end up in a situation like the one you describe: the object thinks its data is stored in the external buffer while in reality it is stored in the internal one, or vice versa. That can easily confuse the debugger as well.
I suggest you open the Raw View of your std::string object and check what it shows in _Bx._Buf and _Bx._Ptr. If the current _Myres value is less than or equal to the internal buffer size, then the data is [supposed to be] stored in the internal buffer; otherwise it is stored in the external memory block. See if this invariant really holds. If it doesn't, then there's your problem, and you'll just have to find the point at which it got broken.
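To illustrate the two storage modes, a small sketch of the idea (not of MSVC's exact internals):

#include <cstdio>
#include <string>

int main()
{
    std::string s = "short";
    std::string l = "a string that is clearly longer than any internal buffer";

    printf("short: object at %p, data at %p, capacity %u\n",
           (void*)&s, (void*)s.data(), (unsigned)s.capacity());
    printf("long : object at %p, data at %p, capacity %u\n",
           (void*)&l, (void*)l.data(), (unsigned)l.capacity());
    // For the short string, the data pointer falls inside the object
    // itself (the _Bx._Buf case in Raw View); for the long one it
    // points to a separately allocated block (_Bx._Ptr).
    return 0;
}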
For some reason your filename argument isn't getting .dicom appended when it becomes _fileName ("digest" should become "digest.dicom"). OSG decides which plugin to use for loading a file by its extension, so it will have no idea how to load this one, and the second reference to _fileName never gets initialized by any plugin.
By the way, I don't think the dicom plugin is part of the standard OSG prebuilt package - you might have to gather the dependencies yourself and build the plugin.
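As a quick sanity check, a hedged sketch of guarding against the load failing (the filename value here is hypothetical):

#include <osg/Image>
#include <osg/Notify>
#include <osgDB/ReadFile>
#include <string>

int main()
{
    std::string filename = "digest";  // hypothetical base name
    osg::ref_ptr<osg::Image> image = osgDB::readImageFile(filename + ".dicom");
    if (!image.valid())
    {
        // No plugin recognized the extension (or the file is missing),
        // so the returned pointer is NULL and nothing was initialized.
        osg::notify(osg::WARN) << "Could not load " << filename << ".dicom" << std::endl;
        return 1;
    }
    return 0;
}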
HRESULT CreatePartitionEx(
    ULONGLONG ullOffset,
    ULONGLONG ullSize,
    ULONG ulAlign,
    [in] CREATE_PARTITION_PARAMETERS *para,
    [out] IVdsAsync **ppAsync
);
When I pass ppAsync = NULL, this particular call fails and returns an invalid-argument error (E_INVALIDARG).
Please help me solve this issue.
According to the documentation, concerning the last parameter, ppAsync:
The address of an IVdsAsync interface pointer, which VDS initializes
on return. Callers must release the interface. Use this pointer to
cancel, wait for, or query the status of the operation.
This means that you should provide an actual pointer as the last parameter when calling the function. Since you are passing NULL, that is probably what is causing the problem.
EDIT:
Use it like this:
IVdsAsync *pAsync; // Declare a pointer.
// Then pass its address as the last parameter:
CreatePartitionEx(ullOffset,
                  ullSize,
                  ulAlign,
                  para,
                  &pAsync); // The leading & gives you the address of the pointer.
And that should do it.
Remember that you should release pAsync after you are finished with it, as the documentation states.
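For completeness, a hedged continuation of the snippet above showing the wait-and-release part the documentation mentions (assumes the call succeeded and pAsync is valid; IVdsAsync and VDS_ASYNC_OUTPUT come from vds.h):

VDS_ASYNC_OUTPUT asyncOut;
HRESULT hrOperation = E_FAIL;
HRESULT hr = pAsync->Wait(&hrOperation, &asyncOut); // blocks until the operation completes
if (SUCCEEDED(hr) && SUCCEEDED(hrOperation))
{
    // The partition was created; for CreatePartitionEx the result is
    // reported in the cp member of VDS_ASYNC_OUTPUT.
}
pAsync->Release(); // callers must release the interface, per the documentation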
I am doing the same thing with CreateVolume(), but it returns hResult = E_INVALIDARG. The fourth parameter is the stripe size. The Windows implementation requires the stripe size to be 65536 if the type is VDS_VT_STRIPE or VDS_VT_PARITY; other volume types are not striped, and for those the stripe size must be 0.
I can't seem to find a way of doing this, but it seems strange to me that a registry key wouldn't be given a time stamp at all when created. Does anyone know of a way? Target platform is XP 32 bit.
Thanks.
The documentation for RegQueryInfoKey() states that it can retrieve the last-modified time:
lpftLastWriteTime [out, optional]
A pointer to a FILETIME structure that receives the last write time.
This parameter can be NULL.
The function sets the members of the FILETIME structure to indicate
the last time that the key or any of its value entries is modified.
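A minimal sketch of that call (the key path here is just an example):

#include <windows.h>
#include <cstdio>

int main()
{
    HKEY hKey;
    if (RegOpenKeyEx(HKEY_LOCAL_MACHINE, TEXT("SOFTWARE\\Microsoft"),
                     0, KEY_READ, &hKey) != ERROR_SUCCESS)
        return 1;

    FILETIME ftLastWrite;
    // Pass NULL for every piece of information we don't need.
    if (RegQueryInfoKey(hKey, NULL, NULL, NULL, NULL, NULL, NULL,
                        NULL, NULL, NULL, NULL, &ftLastWrite) == ERROR_SUCCESS)
    {
        SYSTEMTIME st;
        FileTimeToSystemTime(&ftLastWrite, &st); // FILETIME is in UTC
        printf("Last write: %04d-%02d-%02d %02d:%02d\n",
               st.wYear, st.wMonth, st.wDay, st.wHour, st.wMinute);
    }

    RegCloseKey(hKey);
    return 0;
}

Note this gives only the last write time; as far as I know the registry does not record a separate creation time, so on XP this is as close as you can get.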
I have a process under the Run key in the registry. It is trying to access an environment variable that I defined in a previous session. I'm using ExpandEnvironmentStrings to expand the variable within a path. The environment variable is a user-profile variable. When I run my process from the command line it does not expand either, yet if I call 'set' I can see the variable.
Some code...
CString strPath = "\\\\server\\%share%";
TCHAR cOutputPath[32000];
DWORD result = ExpandEnvironmentStrings((LPSTR)&strPath, (LPSTR)&cOutputPath, _tcslen(strPath) + 1);
if ( !result )
{
    int lastError = GetLastError();
    pLog->Log(_T("Failed to expand environment strings. GetLastError=%d"), 1, lastError);
}
When debugging, the output path is exactly the same as the input path, and no error code is returned. What is going on?
One problem is that you are providing the wrong parameters to ExpandEnvironmentStrings and then using a cast to hide that fact (although you do need a cast to get the correct type out of a CString).
You are also using the wrong value for the last parameter. It should be the size of the output buffer, not the length of the input (from the documentation: the maximum number of characters that can be stored in the buffer pointed to by the lpDst parameter).
Putting that all together, you want:
ExpandEnvironmentStrings((LPCTSTR)strPath,
                         cOutputPath,
                         sizeof(cOutputPath) / sizeof(*cOutputPath));
I don't see enough error checking in your snippet; you don't assert on the return value, so if there's a problem you'd never discover it. Also, you are using ANSI strings; beware of the odd requirement for the nSize argument (one extra character).
What about the buffer size? Is it initialized to the right value?
The documentation states that "If the destination buffer is too small to hold the expanded string, the return value is the required buffer size, in characters."
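Along those lines, a sketch of the two-step pattern the documentation describes, using the wide-character version to sidestep the ANSI quirk mentioned above (ExpandPath is just a hypothetical helper name):

#include <windows.h>
#include <string>
#include <vector>

std::wstring ExpandPath(const wchar_t* src)
{
    // With a NULL destination the call returns the required size,
    // in characters, including the terminating null.
    DWORD needed = ExpandEnvironmentStringsW(src, NULL, 0);
    if (needed == 0)
        return std::wstring(); // call GetLastError() for details

    std::vector<wchar_t> buf(needed);
    ExpandEnvironmentStringsW(src, &buf[0], needed);
    return std::wstring(&buf[0]);
}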