C++ issues with Windows UAC

I have software written in C++ installed on some 1,000 PCs that is having some difficulties with Windows UAC. I'm trying to make things work properly, but I need some help to understand the problem and find the correct solution.
The situation is as follows: I need to write some data to some text/XML files, so I started out (on XP) writing them to the executable's folder. Not recommended, I know. When Vista kicked in, all the files started being saved to the VirtualStore folder, which was fine for me, so I left things untouched. I had some issues back then with a couple of users (see problem 1), but I fixed them by hand and that was it. Now with Windows 8 I'm facing different problems (problem 2) and I want to fix them properly once and for all.
First problem: with Windows Vista it happened that some users eventually found their software "empty", as if it had just been installed. All their work in it was gone. It turned out that suddenly Vista was looking for their files in C:\Program Files\{MyApp} instead of in the VirtualStore. Copying the files from the VirtualStore to the program folder solved the problem; I never understood why.
Second problem: now with Windows 8 some users (a minority of them) experience a different, strange behavior: my app does not seem to be able to create files in the VirtualStore, but it can edit existing files. So if I create the files manually, everything works. If not, the app does not work: files are modified neither in the program folder nor in the VirtualStore.
Now I want to fix things up. My plan is to move all the files that need modification to CSIDL_LOCAL_APPDATA and have the software save its data there. Only the executables will remain in the program folder. For backward compatibility, though, it seems I cannot use SHGetKnownFolderPath, which appears to be Vista+ only. So I would use SHGetFolderPath, which is deprecated, I know, but it should work on XP and act as a wrapper around SHGetKnownFolderPath on Vista+, which is good for me.
My questions:
Any ideas about my problems 1 & 2? I'd like to understand them in order to be sure I have defeated them completely.
Is my plan UAC compliant? As far as I understand it is, but...
Is there any way to ensure XP compatibility other than my workaround? I do not feel comfortable using deprecated functions, but I definitely do not want to have two versions (XP and Vista+) to deal with!
Thank you very much for any help you can provide.
Luca

These are my ideas on problems 1 & 2:
1. If your application writes its working files next to the executable, those files end up in a sub-directory of C:\Program Files. To write files at this location, your application must be run as Administrator. For compatibility reasons, Windows Vista redirects the writes to the VirtualStore when you try to write to directories that require Administrator rights.
2. Windows 8 does not use the VirtualStore anymore. Read access, however, does not require Administrator rights.
Yes, your plan is UAC compliant. Your program and the files your program uses must be kept in two different directories.
I would use the SHGetFolderPath function to get the path of the AppData directory; this function is also compatible with Windows XP. You could check the operating-system version and use the appropriate function/interface for that version. Note that GetVersion and GetVersionEx changed in Windows 8, which is why I recommend the Version Helper functions for version checks on Windows 8.
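A minimal sketch of that call (assumes a Unicode build linked against shell32.lib; the "MyApp" sub-folder is just a placeholder):

#include <windows.h>
#include <shlobj.h>
#include <string>

// Returns the per-user local application data folder, e.g.
// C:\Users\<user>\AppData\Local on Vista+ or
// ...\Local Settings\Application Data on XP.
std::wstring GetLocalAppDataDir()
{
    wchar_t path[MAX_PATH] = {0};
    HRESULT hr = SHGetFolderPathW(NULL, CSIDL_LOCAL_APPDATA, NULL,
                                  SHGFP_TYPE_CURRENT, path);
    if (FAILED(hr))
        return L"";                          // let the caller decide how to react
    return std::wstring(path) + L"\\MyApp";  // create this folder on first run
}

The same call works unchanged on XP and on Vista+, so a single binary covers both.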

Related

error in _bitset.h header file: array must have at least one element

I programmed an application in C++Builder 6, compiled on Windows 95, and the application works perfectly.
The error appears when I compile the application on Windows 10. The following error occurs in the header file _bitset.h:
In
template <size_t _Nw>
On the line
_WordT _M_w[_Nw];
array must have at least one element
Any ideas?
Thank you all for answering...
I finally solved the problem...
Although the version I had installed on the new PC seemed the same as on the old PC (boost_1_31_0), the _bitset.h files were different; replacing the ones on the new PC with the old ones was enough and EUREKA.
Thank you all for your time and excuse my English.
Without MCVE source code or BCB6 installed on Win10 we can only guess, so here are a few hints instead of a direct answer...
First OS related hints:
There have been quite a few changes in the OS since BCB6 times. The most likely reason for problems is wrong absolute paths on 64-bit Windows; simply copy the compiler and IDE stuff from:
[Program Files (x86)]
into:
[Program Files]
That usually works for most of the older IDEs and software built before 64-bit Windows (like GC/GCC + Eclipse).
On top of this, Win10 changed process scheduling to the point that a lot of older software does not work properly or at all, and even the compatibility modes are useless on Win10. My experience with the direct successor of BCB6 (BDS2006 Turbo C++ Explorer) is that to run properly you have to:
run IDE as administrator
You can set this in BCB6 icon properties (compatibility).
set IDE process affinity to single CPU
You can set this in Task Manager on the BCB6 process. Without this, the IDE will freeze for a few seconds (up to 45 sec) every few minutes (or seconds).
Beware: if your app is multithreaded, you have to set its affinity back to all CPUs somewhere in your app's init code (see the sketch after this list of hints). This is done like this:
Cache size estimation on your system?
Just look for SetProcessAffinityMask usage in the last code there.
install font fix for user folder
I do not know if BCB6 needs this, but BDS2006 will not work properly without it, as after some Win7 update MS changed the policy for the user folder and having fonts there is no longer allowed without the fix.
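As a rough sketch of that affinity reset (not the linked code verbatim, just the Win32 calls involved):

#include <windows.h>

// Give the process back every CPU the system offers; call this early in
// the application's startup code if the shortcut/IDE pinned it to one CPU.
void RestoreFullAffinity()
{
    DWORD_PTR processMask = 0, systemMask = 0;
    if (GetProcessAffinityMask(GetCurrentProcess(), &processMask, &systemMask))
        SetProcessAffinityMask(GetCurrentProcess(), systemMask);
}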
If nothing works, try using Win7; there, everything usually works on the first try without any problems. Old development tools tend not to work at all, or not properly, on Win10, and newer tool versions are usually much worse than the old ones, to the point of being unusable, especially for MCU and USB stuff. So it's always a good idea to have a backup Win7 PC for development.
Now code related hints:
A different OS means different compiler #define directives, which means some parts of the code might be different than on Win95; see:
C++Builder Compiler Version
So some (most likely built-in) header files you use might be checking an OS/compiler version number that is not handled properly in the code, causing some parts of the code not to be compiled. The remedy is simply to look for those #defines in the code and either change the version numbers or add new entries.
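For illustration only, the kind of compiler-version dispatch you may run into looks like this (the exact symbols and version values vary from header to header):

// If __BORLANDC__ reports a value the header never anticipated,
// whole sections can silently drop out of the compile.
#if   (__BORLANDC__ == 0x0560)   // C++Builder 6
  // declarations enabled for BCB6 ...
#elif (__BORLANDC__ >  0x0560)   // later compilers
  // alternative declarations ...
#else
  // nothing compiled here -> "missing" types/members later on
#endif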
It also looks like the BDS2006 compiler bug fix can remedy some weird bugs on BCB6 too, so see:
bds 2006 C hidden memory manager conflicts (class new / delete[] vs. AnsiString)
Too many initializers error for a simple array in bcc32

visual studio removes backslash from path, generated by cmake [duplicate]

We just moved from storing all files locally to a network drive. The problem is that that's also where my VS projects are stored now. (No versioning system yet; working on that.) I know I've heard of problems with doing this in the past, but I've never heard of a work-around. Is there a work-around?
So my VS is installed locally. The files are on a network drive. How can I get this to work?
EDIT: I know what SHOULD be done, but is there a band-aid I can put on right now to fix this and maintain the network drive?
EDIT 2: I am sure I am not understanding something, but Bob King has the right idea. I'll work with the lead web developer when he gets back into the office to figure out a temporary solution until we get some sort of version control setup. Thanks for the ideas.
While we do use source control, we also run all our projects from network drives (not shared directories; private directories on network drives). The network drives are backed up nightly and also use Volume Shadow Copy, so if you need to revert to something before it made its way to SC, you can.
To get projects to run correctly with the right permission, follow these steps.
Basically, you've just got to map the shared directory to a drive and then grant permission, based on that URL, to all code. Say you map to "N:\"; then use "N:\*" as your URL pattern. It isn't obvious that you need the wildcard, but you do.
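For example, a sketch assuming the .NET 2.0 runtime and the N: mapping above (the framework folder must match the runtime your projects target, as in the CasPol calls quoted further down):

C:\Windows\Microsoft.NET\Framework\v2.0.50727\CasPol -m -ag 1 -url file://N:/* FullTrust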
The question is rather generic so I'll give an answer to one issue I was facing.
I run Visual Studio 2010 in a Parallels virtual machine on my Mac while keeping all my projects on the Mac side via a network share. Visual Studio, however, wouldn't load the projects' assembly files from there. Trying to set the rights using "caspol" alone didn't help in my case.
What finally worked for me to allow Visual Studio to load assemblies from a network share was to edit the file
"C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\devenv.exe.config" (assuming a default installation).
In the XML <runtime> section you have to add
<loadFromRemoteSources enabled="true"/>
You may have to change the permissions on that file to allow write access. Save the file. Restart Visual Studio.
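For reference, the edited section ends up looking roughly like this (only the loadFromRemoteSources line is new; everything else in the file stays as it was):

<configuration>
  <runtime>
    <loadFromRemoteSources enabled="true"/>
    <!-- existing <runtime> entries remain untouched -->
  </runtime>
</configuration>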
In the interests of actually answering the question, I copied this comment from jcarle.com:
Trusting Network Shares with Visual Studio 2010 / .NET Framework v4.0
January 20, 2011, 4:10 pm
If you are like me and you store all your code on a server, you will have likely learned about trusting a network share using CasPol.exe. However, when moving from Visual Studio 2008 (.NET Framework 2.0/3.0/3.5) over to Visual Studio 2010 (.NET Framework 4.0), you may find yourself scratching your head.
If you are used to using the Visual Studio Command Prompt to quickly get to CasPol, you may find that some of your projects will not seem to respect your new FullTrust settings. The reason is that, unless you are carefully paying attention, the Visual Studio Command Prompt defaults to adding the .NET Framework 4.0 folder to its path. If your project is still running under .NET Framework 2.0/3.0/3.5, it will require setting CasPol for those versions as well. Just a note, I have also personally had more success with using 1 as a code group instead of 1.2.
To trust a network share for all versions of the .NET Framework, simply call CasPol for each version using the full path as below:
C:\Windows\Microsoft.NET\Framework\v2.0.50727\CasPol -m -ag 1 -url file://YourSharePath* FullTrust
C:\Windows\Microsoft.NET\Framework\v4.0.30319\CasPol -m -ag 1 -url file://YourSharePath* FullTrust
I would not recommend doing that if you have (or even if you don't have) multiple people who are working on the projects. You're just asking for trouble.
If you're the only one working on it, on the other hand, you'll avoid much of the trouble. Performance is going to go out the window, though. As far as how to get it to work, you just open the solution file from VS. You'll likely run into security issues, but you can correct those using CASPOL. As I said, though, performance is going to be terrible. Again, not recommended at all.
Do yourself and your team a favor and install SVN or some other form of source control and put the code in there ASAP.
EDIT: I'll partially retract my comments. Bob King explains below the reason they run VS projects from a network drive, and it makes sense. I would say that unless you're doing it for a specific reason like Bob is, stay away from it. Otherwise, get your ducks in a row before setting up such a development environment.
So I was having a similar issue: Visual Studio wouldn't recognize a network location I had mapped to a drive letter, for anything. The funny thing is, it worked for a day. I set up my project and began working on it and had no issues. Then I shut down, and the next day nothing worked. I couldn't read/write files in code, output my executables, or anything. My project is local, but my output was intended to go up on the network.
Anyway, the problem is probably related to the administrator context, but one way to fix it, which I found while digging around online, is to get Visual Studio to browse to the drive in question somehow. There are plenty of ways to do this, but afterwards VS will magically be able to recognize mapped drive letters. My solution was to go to the debug output location in the Project Properties, click Browse, and navigate to my previously created output location on my network drive, and voila!
I wanted to put this up because I spent half a day trying to figure this out and figured it might save someone else some time. Thanks much and good luck!!!
Erik
I understand this is an older thread, but it was the best one I found when looking to solve a similar issue: I had Visual Studio 2013 on a virtual box (running Win 8.1) and the code on the host machine (Win 7). Although I could open the solution, I could not compile. All of the other answers here relate to older software, so I am adding this answer to update this frequently found question with the solution that worked for me.
Here's what I did: I made a registry entry to be able to use a UNC path as the current directory.
WARNING: Using Registry Editor incorrectly can cause serious, system-wide problems that may require you to reinstall Windows NT to correct them. Microsoft cannot guarantee that any problems resulting from the use of Registry Editor can be solved. Use this tool at your own risk.
Under the registry path HKEY_CURRENT_USER\Software\Microsoft\Command Processor, add the value DisableUNCCheck (REG_DWORD) and set it to 0x1 (hex).
WARNING: If you enable this feature and start a Console that has a current directory of an UNC name, start applications from that Console, and then close the Console, it could cause problems in the applications started from that Console.
Found this information at link: http://support.microsoft.com/kb/156276
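If you prefer the command line, an equivalent one-liner (run as the affected user, since it writes under HKEY_CURRENT_USER) would be roughly:

reg add "HKCU\Software\Microsoft\Command Processor" /v DisableUNCCheck /t REG_DWORD /d 1 /f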
How about we rephrase this into a question that everyone can answer? I have the exact same problem as the initial poster.
I have a copy of VB 2008 (recently upgraded from VB6). If I store my solutions on the backed-up network drive, then it won't run a single thing, ever. It gives "partially trusted caller" errors for accessing a module, even when "AllowPartiallyTrustedCallers" is set in the assembly. If I store the files on my (not backed-up) C: drive, then it runs wonderfully, until I put it on the share drive for everyone to use, and I'm back to the same problem.
This isn't a big request. I just want to be able to put a solution and executable on the share drive and run it without an absurd amount of nonsense about security. I shouldn't have to cram all my work into form files.
-Edit: I found out why it was ignoring the AllowPartiallyTrustedCallers attribute: I'm trying to reference ADODB, which doesn't allow partially trusted callers. So, no network executable can access a database? What does Microsoft have against intranets anyway?
I was facing the same issue just recently, so this answer is more for the sake of keeping track of my own knowledge. Anyway, should someone find it useful, below are the issue and the solution.
Issue:
.NET 4.0 projects, SVN repo, checkout folders on local drives, referenced assemblies built by the build server and available on a network drive. Visual Studio on W7 is able to add the reference but unable to build the projects.
Solution:
Since .NET 4.0 no longer automatically sandboxes network assemblies, you have to make them fully trusted via a machine.config update: http://msdn.microsoft.com/en-us/library/dd409252.aspx
I had a similar problem with opening Visual Studio projects on a network drive, and I fixed it by creating a symbolic link on my local C:\ drive that points to the UNC directory
e.g.
mklink /D "C:\Users\Self\Documents" "\\domain.net\users\self\My Documents"
then you can just open the project using the C:\Users\Self\Documents\ path, instead of the UNC path
(You have to be careful, because Visual Studio will automatically redirect you to the '\\domain.net..' path if you double click the symlink when you're browsing for the project. I had to copy paste the 'C:\Users\' path to get it to open with the drive letter path)
Don't do it. If you have source control (versioning), you do not want your files on a network drive. It totally bypasses all you want to achieve by using source control, because once your files are on a network drive, anyone can modify them .... even while you're currently building your project. Ka-boooom!
PS: this sounds like a typical case of over-engineering to me.
Are you having any specific problems?
If you allow more than one person to open the solution, your first problem will be that the .NCB file (Intellisense) will be locked exclusively and only one user will be able to browse the class tree. And of course you have the potential for one user's changes to overwrite the other user's changes.
You should be warned that some features in Visual Studio will refuse to work with a network drive.
For example, the .mdf file of a SQL Express user instance must be located on a local drive.
As another example, if you use UNC paths, you have to make sure they are short enough.
I found this helpful while trying to use VC11 with Parallels running on a Mac:
http://social.msdn.microsoft.com/Forums/en-US/toolsforwinapps/thread/2ffdcb01-c511-4961-834b-afd5f2fbb8e1, and specifically:
1) You can switch from local debugging to remote debugging and set the machine name as 'localhost'. This will do a remote deployment on your local machine (thus not using the project's directory). You don't need to install the Remote Debugger tools, nor start msvsmon for this to work on localhost.
In case this helps anyone else, I had to do the steps outlined here to add the network share location to Windows intranet zone. In particular, I was having trouble with Visual Studio hanging on load when opening a solution on a network share (i.e. using VMware Fusion and opening a solution from my Mac's hard drive). I also had problems with PostSharp running in this scenario.
If I understand you correctly, your Visual Studio project files are stored on the network drive and you are running them from there. This is what I do, and I don't have any problems. You will need to make sure that you have set the security policy. You can use CasPol to do this, or do it via the Control Panel > Administrative Tools menu.
"How can I get this to work?"
You have a couple choices:
Choice A:
1. Move all files back to your local hard drive
2. Implement some type of backup software on your machine
3. Test said backup solution
4. keep on coding
Choice B:
1. Get a copy of one of the FREE source control products and implement it.
2. Make sure it's being backed up
3. Test it
Choice C:
Use one of the many ONLINE source control repositories available. Google, SourceForge, CodePlex, something.
Well, my question would be why you are asking this. Is it not working when you store it on a network drive? I haven't tried this myself, and one problem I could envision would be that .NET code running from a network drive (i.e. from the bin\Debug directory, also located on the network drive) would run in sandbox mode, unless you mess around with CasPol (or use 3.5 SP1, which I hear has removed that obstacle).
If you have specific problems, ask about them. Never ask "Why is doing X not working?".
You're not saying whether it's just one person or multiple people accessing the same remote drive, but I'm assuming it's just one per network directory. Is this correct? If not, no, there is no band-aid. Get version control and move the files back to a local disk.

Application only runs if you run as administrator?

Edit: This problem only occurs on Windows 7 and Vista, from what I've heard.
I have a very simple app developed with an external graphics library. If I install this app into a Program Files directory and run it, it crashes immediately, but it works fine otherwise, with exactly the same files. I have realized it is because you need to run the application as administrator for it to work.
I'd understand if this is a problem directly related to the graphics engine I am using, but I don't really think it is (though I'm clueless). Can anyone help me?
Edit for more detail:
The application executable and the files needed to run it are installed into the default program directory; for me, C:\Program Files (x86). If you try to run it without clicking Run as administrator, it will simply freeze and say "App has stopped working. Windows is checking for a solution to the problem..." My question is basically: how can I make it so that Run as administrator isn't necessary?
When a program cannot perform an operation, the operation should fail gracefully. My guess is that your application is attempting to do something it cannot do as a normal user, then fails to check the return code and subsequently crashes. You need to identify what your program is doing that it should not be able to do as a normal user. For example (off the top of my head; a sketch of the first case follows this list):
Write a file to Program Files (x86)
Write to HKLM
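In the first case the write comes back as a failed handle rather than an error dialog, and it is the code that ignores that failure which blows up later. A sketch (the path and file name are only illustrative):

#include <windows.h>

// Under UAC a standard user cannot create files inside Program Files;
// CreateFile fails instead of raising an error, so the result must be checked.
HANDLE h = CreateFileW(L"C:\\Program Files (x86)\\MyApp\\settings.xml",
                       GENERIC_WRITE, 0, NULL, CREATE_ALWAYS,
                       FILE_ATTRIBUTE_NORMAL, NULL);
if (h == INVALID_HANDLE_VALUE)
{
    DWORD err = GetLastError();   // typically ERROR_ACCESS_DENIED (5) here
    // log it, fall back to %LOCALAPPDATA%, or report it -- anything but
    // carrying on as if the file had been written
}
else
{
    // ... write the data ...
    CloseHandle(h);
}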
(Without more details) The problem is most likely related to the fact that your program tries to write into the install directory and then expects the file creation/modification to actually have an effect. UAC prevents applications from writing to the Program Files directories without administrator privileges. The solution is to redesign your application not to rely on such behavior, or to store the files in question in one of the intended locations (the AppData folders, etc.).
If you right-click on the EXE and go to Properties -> Compatibility there are some options that might help. You could try running the app in compatibility mode for a previous Windows version or if that doesn't work at least mark the EXE to run as administrator by default.

Get program files directory from Windows in Borland C++ 6

I had to create an executable (using Borland C++ Builder 6) in place of a batch file for Windows 7, since permissions didn't allow ordinary users (non-admins) to run the necessary batch. We've got a number of different Windows 7 machines, some 64-bit and some 32, etc. The problem I'm running into is that the "Program Files" directory is hard-coded into the program, but it's not always the RIGHT Program Files directory, which leads to errors on some machines.
I'm familiar with the method for getting the Program Files dir from the registry, but I'm afraid this won't work on all machines because of permission settings not allowing programs to access the registry. I've been searching high and low for a function like GetWindowsDirectory, but to no avail. Does ANYONE have any suggestions?
EDIT:
I've programmed this on a Windows XP machine simply to be placed on Win7 (no way to change or avoid the XP/7 thing, crappy as that may be). It's a simple utility that needs no installation; it's just placed in a folder. It just needs to go out and find the Program Files directory to perform some tasks.
This is first of all a deployment problem. You will have to copy/install your program to C:\Program Files (x86) on a 64-bit machine. You can then simply use C:\Program Files in your code; Windows redirects it to the (x86) directory.
There is otherwise no easy cure for trying to bypass UAC. You'll have to embed a manifest in the executable to ask for admin privileges. The user gets the UAC prompt to let her know that you are going to be hacking the private parts. How to do this with such an old tool isn't obvious to me; you'll probably have to embed it in the .rc file, or use a .manifest file.
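The manifest fragment itself is standard; roughly (how you attach it to a BCB6-built executable, via the .rc or a side-by-side .manifest file, is the part that needs experimenting):

<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
    <security>
      <requestedPrivileges>
        <requestedExecutionLevel level="requireAdministrator" uiAccess="false"/>
      </requestedPrivileges>
    </security>
  </trustInfo>
</assembly>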
How to get Program Files folder path (not Program Files (x86)) from 32bit WOW process?
Use SHGetFolderPath with CSIDL_PROGRAM_FILES.
There's a newer version called SHGetKnownFolderPath if you're always on Windows Vista or later, but you might need to update your Platform SDK. If you're still using Borland C++ 6, I suspect your Platform SDK might be older. In that case, you should be able to use SHGetFolderPath.
After installing the software, go to:
C:\Program Files\Borland\CBuilder6\Bin
Right-click on the bcb.exe file and choose
Properties -> Compatibility
Select the option "Run this program in compatibility mode for Windows XP (Service Pack 3)", and under Privilege Level
select the option "Run this program as an administrator", then click Apply.
This works for my problem.
On Windows 7 x64, just create a junction point in "C:\Program Files" pointing to the actual folder where the installation lives in "C:\Program Files (x86)". This should be done by the same user who installs the software. That should not only take care of your problem but also of third-party packages that would not otherwise work on Win 7 x64.
If you don't know what a junction point is, just read the help for mklink.
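For example, something along these lines from an elevated prompt (the folder names are placeholders; elevation is needed because the link is created under Program Files):

mklink /J "C:\Program Files\MyApp" "C:\Program Files (x86)\MyApp"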

Checking if DWM/Aero is enabled, and having that code live in the same binary for 2000/XP/Vista/7

I know the title makes little sense, mostly because it's hard to explain in just one line. So here's the situation:
I have a program whose binary is targeted at Windows 2000 and newer. Now, I went ahead and added some code to check whether the user is running Vista/7, and if so, whether Aero/DWM is enabled. Based on this I'll disable some features that aren't relevant to that particular platform and enable some others. My main problem is that in order to call DwmIsCompositionEnabled from Visual C++ 2008 I have to add the dwmapi.lib file and link against it. Running the binary on anything other than Vista or 7 gives the error "Unable to locate component. The application failed to start because dwmapi.dll was not found." This, of course, is expected, since DWM is new and not available on older platforms.
My question, then, is: will it be possible for me to somehow pull this off? One binary for all OS versions AND including that DWM check code? This program was written in Visual Studio 2008, Visual C++, using MFC.
It turns out I can just tell the linker to delay-load dwmapi.dll.
I'd like to thank ewanm89 because something he said sort of resonated and led me down the path to finding the actual answer.
The normal solution is to use LoadLibrary() and GetProcAddress(). Both can be called after your program has started. But still, +1 for the delay-load solution, which does the same for you behind the scenes.
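A sketch of that LoadLibrary/GetProcAddress variant (the function name and signature are the standard dwmapi ones; error handling kept minimal):

#include <windows.h>

// Resolve DwmIsCompositionEnabled at run time so the same binary still
// starts on 2000/XP, where dwmapi.dll does not exist.
typedef HRESULT (WINAPI *DwmIsCompositionEnabledFn)(BOOL*);

bool IsCompositionEnabled()
{
    BOOL enabled = FALSE;
    HMODULE dwm = LoadLibraryW(L"dwmapi.dll");   // simply fails on pre-Vista systems
    if (dwm != NULL)
    {
        DwmIsCompositionEnabledFn fn = (DwmIsCompositionEnabledFn)
            GetProcAddress(dwm, "DwmIsCompositionEnabled");
        if (fn != NULL)
            fn(&enabled);
        FreeLibrary(dwm);
    }
    return enabled != FALSE;                     // false on 2000/XP
}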