I am working on a Windows application built in C++. I am looking for ideas, approaches, or existing libraries that implement geolocation/geocoding, since I want to limit my C++ Windows application to run only in certain regions or countries.
Any suggestions or comments would be a great help. Thanks.
It won't be possible to prevent an application from running locally in certain regions. A user can always disconnect from the internet and then you'll have no idea where they're located.
What you could do is have some of your app logic run on a server and make requests to that server from the local C++ app. You can then geolocate based on the IP address of the request, which is often a standard feature of cloud platforms.
If you do want to explore getting someone's location, you can look at Apple's Core Location or Microsoft's Geolocation namespace.
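If you go the server-side route, the client end of it can be as small as asking your server for a verdict at startup and bailing out otherwise. Here is a rough WinHTTP sketch; the host name, the /api/country path, the plain two-letter response format, and the allowed country are all placeholders of my own, not anything standard:

    // Ask a server-side endpoint (which geolocates the caller's IP) for a
    // country code and refuse to continue outside the allowed region.
    #include <windows.h>
    #include <winhttp.h>
    #include <iostream>
    #include <string>
    #pragma comment(lib, "winhttp.lib")

    std::string QueryCountryCode()
    {
        std::string body;
        HINTERNET hSession = WinHttpOpen(L"MyApp/1.0",
            WINHTTP_ACCESS_TYPE_DEFAULT_PROXY,
            WINHTTP_NO_PROXY_NAME, WINHTTP_NO_PROXY_BYPASS, 0);
        if (!hSession) return body;

        HINTERNET hConnect = WinHttpConnect(hSession, L"example.com",
            INTERNET_DEFAULT_HTTPS_PORT, 0);
        HINTERNET hRequest = hConnect ? WinHttpOpenRequest(hConnect, L"GET",
            L"/api/country", nullptr, WINHTTP_NO_REFERER,
            WINHTTP_DEFAULT_ACCEPT_TYPES, WINHTTP_FLAG_SECURE) : nullptr;

        if (hRequest &&
            WinHttpSendRequest(hRequest, WINHTTP_NO_ADDITIONAL_HEADERS, 0,
                               WINHTTP_NO_REQUEST_DATA, 0, 0, 0) &&
            WinHttpReceiveResponse(hRequest, nullptr))
        {
            char buf[256];
            DWORD read = 0;
            while (WinHttpReadData(hRequest, buf, sizeof(buf), &read) && read > 0)
                body.append(buf, read);
        }

        if (hRequest) WinHttpCloseHandle(hRequest);
        if (hConnect) WinHttpCloseHandle(hConnect);
        if (hSession) WinHttpCloseHandle(hSession);
        return body;   // e.g. "US", or empty if the request failed
    }

    int main()
    {
        if (QueryCountryCode() != "US") {   // allowed region: placeholder
            std::cout << "This application is not available in your region.\n";
            return 1;
        }
        // ... normal startup ...
    }

Keep in mind this only raises the bar; as noted above, a determined user can block or spoof the request, so anything you genuinely need to restrict has to stay on the server.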
Related
I have an existing Windows desktop application written in C++ that needs to add support for SNMP so that a few pieces of status information are available on some SNMP OIDs. I found the net-snmp project and have been trying to understand how this can best fit into the existing program.
Questions:
Do I need to run snmpd, or can I just integrate the agent code into my application? I would prefer that starting my application does everything necessary rather than worrying about deploying and running multiple processes, but the documentation doesn't say much about doing this. The net-snmp agent daemon tutorial has an option for running the sample code as the full agent rather than a sub-agent, but I'm not sure about any limitations of doing this.
What would the pros/cons be of running a full agent in my application vs. using snmpd and putting a subagent in my application? Is there a third option I should also consider?
If I can integrate the full agent into the existing program, how do I pass it a configuration file via the API? Can I avoid the config file altogether by passing these parameters in via function calls instead?
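For context, my current reading of the agent tutorial is that embedding the full agent would look roughly like the following; the function names are taken from the example code there, but I haven't verified any of this against the Windows build of net-snmp, so please correct me if I have it wrong:

    /* Rough outline of an embedded agent, adapted from the net-snmp
     * example-daemon tutorial.  One flag switches between running as a
     * full (master) agent and an AgentX subagent of snmpd. */
    #include <net-snmp/net-snmp-config.h>
    #include <net-snmp/net-snmp-includes.h>
    #include <net-snmp/agent/net-snmp-agent-includes.h>

    void run_embedded_agent(int run_as_subagent)
    {
        SOCK_STARTUP;                 /* net-snmp's winsock init macro */
        snmp_enable_stderrlog();

        if (run_as_subagent) {
            /* talk AgentX to an already-running snmpd instead */
            netsnmp_ds_set_boolean(NETSNMP_DS_APPLICATION_ID,
                                   NETSNMP_DS_AGENT_ROLE, 1);
        }

        init_agent("myapp");          /* reads myapp.conf from the config path */

        /* register this application's MIB handlers here */

        init_snmp("myapp");

        if (!run_as_subagent)
            init_master_agent();      /* listen for SNMP requests ourselves */

        for (;;)
            agent_check_and_process(1 /* block until a request arrives */);

        /* snmp_shutdown("myapp"); SOCK_CLEANUP; on the way out */
    }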
I was wondering about the best way to deliver private web service instances to lots of users, so that each user can always connect to their own offline version of the service, just like running a web service from Visual Studio while debugging. I was struggling to set this up in VS2013 even with the many online tutorials, but I am not sure whether it isn't working because it was never supposed to work this way.
I have provided this in-depth explanation of my issue as I am not sure I am going about this the right way and would appreciate feedback:
Background:
I have a web service to interface with an engine. This deals with the front-end and builds a set of commands for how to make a CAD model. These commands are for controlling the 3rd party CAD software's API. Therefore the engine can be seen to have two main functions -
Build the CAD's API instructions, which can be saved for later.
Execution, where it catches the instance of the CAD software running on the same computer and it builds the model.
The second function is not available to the general public; only our in-house users should be able to use it. However, they want an otherwise identical front-end and user experience.
The problem is, if they connect to the same engine as the public, which lives on our main server, then the engine will be looking for an instance of the CAD package on the same machine as itself, i.e. the server, as stressed in the second point above. What should happen is that the engine finds the CAD instance running on the machine the controlling UI is on and uses that as its target. I have spoken to the CAD API support team and they say they do not know how to do that.
And so we get to my solution of providing an offline, standalone copy of my web service on each employee's computer. The front-end will check at the start of the session whether a localhost connection is available. If not, it will use the main address, which takes it to my server. Otherwise it uses the local engine, which will perform its default behavior of looking for a CAD package on the same machine as itself. Because it is installed locally, that is now the right machine, and it will find the user's CAD instance successfully.
Final points:
The engine cannot be accessed by the UI directly, as I am using Unity3D for the front-end and there are .NET compatibility issues.
I need a completely self-contained version of the software in the future anyway, so eventually I have to deal with having the engine accessed locally.
I ended up using IIS Express. I had the user install this and then run a batch file installer I made, which sets up the config file and moves my web project to the correct directory.
Say we have an Inno Setup installer script, a native C++/Qt file-loading application, and a .NET client application that we load/update each time and which is the main application. We load files via HTTP POST/GET requests.
So how many different certificates would our application need to prove to antivirus software that we are a real application and not a virus, and which parts of our application would require which certificates?
One certificate. But each .EXE and .DLL must have its own signature. This probably means you have to sign your build output before it's included inside the installer.
The signature states that you (your company) are the author of that particular file, and the certificate proves the signature is not forged.
You will only need to buy one code signing certificate, and you should sign both the executable and the installer. Take a look at this as a good description of code signing; after working through many issues myself, I've concluded that the advice in that guide is right. Either a normal code signing cert or a kernel mode cert will be fine, and I don't believe an EV certificate will give you value. Unless you are providing a driver or a component that is part of the security or kernel infrastructure, the advice in that guide will be sufficient. If you are signing a driver you will also need the /integritycheck option. The Microsoft kernel mode code signing walkthrough is a good read explaining how to do code signing; some of the steps there are more than you need if you are not providing a driver, but they will always be sufficient. Where the walkthrough differs from the first link I provided, trust the first link.
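For illustration, signing both the application binaries and the installer with that one certificate typically looks something like the following; the .pfx file name, password, file names, and timestamp URL are placeholders:

    signtool sign /fd SHA256 /f mycompany.pfx /p <password> /tr http://timestamp.digicert.com /td SHA256 MyApp.exe MyLoader.dll
    signtool sign /fd SHA256 /f mycompany.pfx /p <password> /tr http://timestamp.digicert.com /td SHA256 MySetup.exe

Run the first command on the build output before Inno Setup packages it, and the second on the compiled installer.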
I'm trying to retrieve the path of the profiles directory across various versions of Windows. In older versions that might be [drive]:\Documents and Settings; on newer ones it's [drive]:\Users. There are several ways to do this locally without a problem; however, I need to find the path on a remote machine that I've connected to.
Remote Registry is enabled.
I have an impersonation token and can successfully gather information from the remote host via the Net API, etc.
I have access to the administrative share, and therefore all the files on the remote drive.
Here's a list of other important caveats.
It's a C++ project.
It doesn't seem to be possible for GetProfilesDirectory or GetUserProfileDirectory to operate in a remote context. If I'm wrong about this, please let me know, but in all my experiments these functions have returned results for the local machine.
I can't use WMI; we tried many times to integrate WMI functionality into our project and it just didn't work.
I would prefer to do as little "screen scraping" as possible. If you have an idea that doesn't involve reading from some text file and parsing the result, I'd love to hear it. But I'd appreciate any useful answer really.
The profile directory information in the registry doesn't seem to be useful because it contains environment variables, and, like GetProfilesDirectory, the environment variable expansion function does not seem designed to work with remote hosts. This means the solution wouldn't work if I were making the call from a newer Windows machine to an older one, or vice versa.
The solution should be general enough to work between hosts that might be running any version of Windows from Windows Server 2003 to Windows 7.
Thanks in advance for whatever ideas you might have. Ideally I'd just like to be able to force GetProfilesDirectory to operate on the remote host so if you know how to do that I'll love you forever.
If you have access to the remote registry, you should be able to look at the key where the profile directory and the profile list is kept:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList
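Something along these lines ought to work; this is an untested sketch, and expanding %SystemDrive% by reading the remote machine's SystemRoot value is an assumption of mine rather than a documented recipe:

    // Read ProfilesDirectory from a remote machine's registry (machine name
    // in the form L"\\\\HOSTNAME"), assuming Remote Registry is reachable.
    #include <windows.h>
    #include <string>
    #pragma comment(lib, "advapi32.lib")

    std::wstring ReadRemoteProfilesDirectory(const std::wstring& machine)
    {
        HKEY hRemote = nullptr, hKey = nullptr;
        std::wstring result;

        if (RegConnectRegistryW(machine.c_str(), HKEY_LOCAL_MACHINE, &hRemote) != ERROR_SUCCESS)
            return result;

        if (RegOpenKeyExW(hRemote,
                L"SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion\\ProfileList",
                0, KEY_READ, &hKey) == ERROR_SUCCESS)
        {
            wchar_t raw[MAX_PATH] = {};
            DWORD size = sizeof(raw);
            if (RegQueryValueExW(hKey, L"ProfilesDirectory", nullptr, nullptr,
                                 reinterpret_cast<LPBYTE>(raw), &size) == ERROR_SUCCESS)
                result = raw;   // typically "%SystemDrive%\Users" or
                                // "%SystemDrive%\Documents and Settings"
            RegCloseKey(hKey);
        }

        // Expand %SystemDrive% against the REMOTE machine by reading its
        // SystemRoot value (e.g. "C:\Windows") and taking the drive portion.
        if (result.rfind(L"%SystemDrive%", 0) == 0 &&
            RegOpenKeyExW(hRemote,
                L"SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion",
                0, KEY_READ, &hKey) == ERROR_SUCCESS)
        {
            wchar_t sysroot[MAX_PATH] = {};
            DWORD size = sizeof(sysroot);
            if (RegQueryValueExW(hKey, L"SystemRoot", nullptr, nullptr,
                                 reinterpret_cast<LPBYTE>(sysroot), &size) == ERROR_SUCCESS)
                result = std::wstring(sysroot, 2) + result.substr(wcslen(L"%SystemDrive%"));
            RegCloseKey(hKey);
        }

        RegCloseKey(hRemote);
        return result;          // e.g. "C:\Users"
    }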
My open source project is a C++ dynamic-link library. Most of the reported bugs are crashes.
I want to create a public symbol server to simplify debugging with memory dumps.
See also: Setting up a Symbol Server
I assume you're using Microsoft tools? If so, all you should need to do is expose your 'symstore' directory with a web server, then configure debuggers to access that store:
srv*symbol-cache-location*http://your.web.server.com/symboldir
The "Debugging Tools for Windows" docs (debugger.chm) has details for configuring IIS - I'm sure any other HTTP server will work just as well if you don't need authentication, which I imagine would be the case for an open source project. As far as I know, symsrv.dll just makes normal HTTP GET requests for symbol files when it's trying to get them from an HTTP server.
You'll also need to build the symbol store using the 'symstore' utility. Hopefully that can be integrated into your build or packaging process so it happens automatically. Again, debugger.chm has good docs on the tool.
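As a rough example (the paths, product name, and version are placeholders), adding one build's binaries and PDBs to the store looks something like:

    symstore add /r /f C:\build\output\*.* /s C:\symstore /t "MyLibrary" /v "1.2.3"

The directory tree symstore creates under C:\symstore is then what you expose over HTTP as described above.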
This will not be a real answer, but you might want to take a moment to vote for C++ support in NuGet in work item Support Managed C++ Project Types or have a look at the discussion about C++ Project support. When that gets in, SymbolSource support will follow shortly (currently it only supports hosting symbols for .NET assemblies).