I would like the BHO instances of my IE extension to be able to share common data. I just need them to share a couple of variables, so I am trying to find an easy solution for the problem.
The alternatives I can think of, from easiest to most complex, are:
1) Writing/reading data to/from the file system or the registry; see MSDN article and Codeproject article. Question: is this information accessible from BHO instances running in different threads?
2) Developing a Windows service or a background application that communicates with all BHO instances; see MSDN article. Problem: I have NO IDEA how to build this, or where to start. I am also worried about the user having to install too many things.
3) Providing IPC mechanisms so that the different BHO instances can communicate directly with each other, for example using the IGlobalInterfaceTable; see ookii article. Problem: yes, you can store pointers in the IGlobalInterfaceTable and get cookies to access them later, but how can you share a cookie obtained in BHO instance 1 with BHO instance 2, so that the second instance can access the data inserted into the IGlobalInterfaceTable by the first one? Don't we have the same data-sharing problem all over again?
Well, as you can see, after a whole week of looking for a solution I simply don't know how to start dealing with this problem. Any help would be greatly appreciated.
Often, Memory Mapped Files are used for this purpose. It's a non-trivial amount of work, however, as you must ensure that they are ACL'd properly to allow cross-process access (each tab may be in a different process) and work across multiple integrity levels.
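As a rough sketch of that approach (the section name "Local\MyBhoSharedSection" and the 4 KB size are placeholders invented for this example, and a real implementation needs full error handling and possibly an explicit DACL as well), the mapping can be created with a low-integrity mandatory label so that Protected Mode tab processes may open it:

    // Minimal sketch, not production code: create a named file mapping
    // that low-integrity IE tab processes can open.
    #include <windows.h>
    #include <sddl.h>

    HANDLE CreateSharedSection()
    {
        SECURITY_ATTRIBUTES sa = { sizeof(sa) };
        // "S:(ML;;NW;;;LW)" = low mandatory integrity label, no-write-up.
        ConvertStringSecurityDescriptorToSecurityDescriptorW(
            L"S:(ML;;NW;;;LW)", SDDL_REVISION_1,
            &sa.lpSecurityDescriptor, NULL);

        HANDLE hMap = CreateFileMappingW(
            INVALID_HANDLE_VALUE, &sa, PAGE_READWRITE,
            0, 4096, L"Local\\MyBhoSharedSection");

        LocalFree(sa.lpSecurityDescriptor);
        return hMap; // other instances: OpenFileMapping + MapViewOfFile
    }

Each BHO instance then maps a view of the same named section, so the shared variables live in one place no matter which tab process touches them.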
1) Sort of, except that the locations a BHO loaded for an ordinary web site can write to are isolated from the locations a BHO loaded for a trusted site can access.
2) Writing a service is probably the easiest way, given the abundant documentation on how to write a Windows service (you even get an ATL project wizard if you use Visual C++), and your broker code can survive a tab process crash or even a user logging off.
3) Indeed, you have the same sharing problem again. COM's window messages are blocked by UIPI unless you can change the message filter, and the messages COM uses are not documented. I would use something like a named pipe or file mapping (shared memory) instead.
You need to host the communication broker code somewhere and create it only once. You could do something like the way computers in a workgroup elect the master browser (rather chatty), or have a dedicated broker process do the communication work (e.g. in a Windows service).
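To make the broker idea concrete, here is a bare-bones sketch of its named-pipe server loop (the pipe name and buffer size are placeholders I invented; a real broker also needs a low-integrity label on the pipe so Protected Mode tabs can connect, plus per-client threading and error handling):

    // Sketch of a broker's request loop; one message in, one reply out.
    #include <windows.h>

    void BrokerLoop()
    {
        for (;;) {
            HANDLE hPipe = CreateNamedPipeW(
                L"\\\\.\\pipe\\MyBhoBroker",   // placeholder name
                PIPE_ACCESS_DUPLEX,
                PIPE_TYPE_MESSAGE | PIPE_READMODE_MESSAGE | PIPE_WAIT,
                PIPE_UNLIMITED_INSTANCES, 512, 512, 0, NULL);
            if (hPipe == INVALID_HANDLE_VALUE) return;

            if (ConnectNamedPipe(hPipe, NULL) ||
                GetLastError() == ERROR_PIPE_CONNECTED) {
                char buf[512];
                DWORD read = 0, written = 0;
                // Read one request from a BHO instance and answer it; a
                // real broker would parse it and update the shared state.
                if (ReadFile(hPipe, buf, sizeof(buf), &read, NULL))
                    WriteFile(hPipe, buf, read, &written, NULL);
            }
            DisconnectNamedPipe(hPipe);
            CloseHandle(hPipe);
        }
    }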
Currently we have an application (a diagram editor) that has the ability to save and load (serialize) its state in an XML file.
Now we want this application to behave like the Microsoft OneNote application, where multiple users have the ability to access the same file.
Later we may also need to add other things, like (1) tracking what changed and who changed it, and (2) an option to resolve conflicts, if any.
I came to know about the Sync Framework as a way to solve this; so far, I have not tried it.
All I want is:
A single file that can, in effect, be edited by multiple instances of the same application.
We need a DLL (the Sync Framework) that does the following:
It takes complete responsibility for file handling.
Using this DLL, each instance of the application reports its own changes.
Each instance of the application should be able to detect changes made recently (when, who, and what changed).
My question:
Will the Sync Framework be suitable for this requirement?
If so, is there a demo application that demonstrates this?
No, Sync Fx cannot handle this. There's a file sync provider, but that's not smart enough to determine what has changed in terms of the actual contents of the file.
I've been looking into centralising my computer game saves to make them easier to back up and restore, as well as to put them on the cloud via Dropbox, but they are in so many places that this is quite difficult. I noticed that Windows 7 and Vista now support symbolic links, so I've been playing around with that, but I was wondering the following:
Is it possible (a code example or a pointer in the right direction) for an application (VB.NET or C++) to spoof a file or folder?
E.g. Application A (a game like Diablo III or Civilization V) attempts to read or write file A (the game save); application B (the save repository) detects this read/write request and pipes the request through itself, performing the request on file B (the actual game save in another location). Application A is in no way altered and treats the file normally.
Note: I realise there are many simple ways of performing essentially the same task, such as monitoring the use of application A, or periodically checking file A and copying it if it has been altered since the last check, etc., but all these methods have drawbacks, and I am less interested in making it work than in whether it is possible.
It is entirely possible to do this through a file system filter driver. For information about these, take a look here:
http://msdn.microsoft.com/en-us/windows/hardware/gg462968
Filter drivers can hook into CreateFile operations and redirect the create to a different place if you want, but they are much harder to write than normal applications: they run in kernel mode and must obey the limitations of drivers.
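To give a feel for it, here is a heavily trimmed, hedged sketch of the redirection idea in a minifilter's pre-create callback, modeled loosely on the WDK SimRep sample. PathMatchesGameSave, g_RealPath and g_RealPathBytes are hypothetical placeholders, and everything else a real driver needs (registration, name allocation, error handling) is omitted:

    // Kernel-mode sketch only: redirect opens of the "spoofed" path to the
    // real location by replacing the file object's name and asking the I/O
    // manager to reparse the create.
    #include <fltKernel.h>

    extern BOOLEAN PathMatchesGameSave(PCUNICODE_STRING Name); // hypothetical
    extern PWSTR  g_RealPath;       // hypothetical: the repository copy
    extern USHORT g_RealPathBytes;  // length of g_RealPath in bytes

    FLT_PREOP_CALLBACK_STATUS
    PreCreateCallback(PFLT_CALLBACK_DATA Data,
                      PCFLT_RELATED_OBJECTS FltObjects,
                      PVOID *CompletionContext)
    {
        UNREFERENCED_PARAMETER(FltObjects);
        UNREFERENCED_PARAMETER(CompletionContext);

        PFLT_FILE_NAME_INFORMATION nameInfo;
        if (!NT_SUCCESS(FltGetFileNameInformation(
                Data, FLT_FILE_NAME_NORMALIZED | FLT_FILE_NAME_QUERY_DEFAULT,
                &nameInfo)))
            return FLT_PREOP_SUCCESS_NO_CALLBACK;

        FLT_PREOP_CALLBACK_STATUS status = FLT_PREOP_SUCCESS_NO_CALLBACK;
        if (PathMatchesGameSave(&nameInfo->Name)) {
            // Swap the requested name for the real one (Windows 7 and later).
            IoReplaceFileObjectName(Data->Iopb->TargetFileObject,
                                    g_RealPath, g_RealPathBytes);
            Data->IoStatus.Status = STATUS_REPARSE;
            Data->IoStatus.Information = IO_REPARSE;
            status = FLT_PREOP_COMPLETE;
        }
        FltReleaseFileNameInformation(nameInfo);
        return status;
    }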
You can "fake" special folders, like control panel does, but I don't think you can create anything accessible/writeable (in an easy way). I might be wrong though. I had the same idea once too (as a compatibility step for some company stuff), but couldn't find anything supporting an easy way to do it. It seems like it might be easier to be done on Unix systems (but that's obviously no option here). Also, I wouldn't expect any nice or easy solutions for .net.
The only approach I can think of right now would be hijacking the relevant API calls (e.g. FileOpen) to reroute/manipulate them (similar to what rootkits do), but I wouldn't say that's a good idea, considering it might be detected as possible malware or a cheat by things like PunkBuster or antivirus solutions.
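For illustration only, the user-mode hooking variant typically looks something like the following with a library such as Microsoft Detours (both file paths are placeholders, and you would link against the Detours library; as said above, anti-cheat or antivirus software may flag this technique):

    // Sketch: redirect the game's CreateFileW calls to the repository copy.
    #include <windows.h>
    #include <detours.h>

    static auto TrueCreateFileW = CreateFileW; // original function pointer

    HANDLE WINAPI HookedCreateFileW(LPCWSTR name, DWORD access, DWORD share,
                                    LPSECURITY_ATTRIBUTES sa, DWORD disp,
                                    DWORD flags, HANDLE tmpl)
    {
        // Placeholder paths: swap the expected save file for the real one.
        if (name && wcscmp(name, L"C:\\Game\\save.dat") == 0)
            name = L"C:\\Repo\\save.dat";
        return TrueCreateFileW(name, access, share, sa, disp, flags, tmpl);
    }

    BOOL WINAPI DllMain(HINSTANCE, DWORD reason, LPVOID)
    {
        if (reason == DLL_PROCESS_ATTACH) {
            DetourTransactionBegin();
            DetourUpdateThread(GetCurrentThread());
            DetourAttach(&(PVOID&)TrueCreateFileW, HookedCreateFileW);
            DetourTransactionCommit();
        }
        return TRUE; // a real hook DLL should also detach on unload
    }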
Yes or no depending on (using your terms) the level of abstraction that Application A is using.
If Application A performs a CreateFile to start access and passes a fixed filesystem path, then Application B would need to emulate a file system, and do so in the kernel.
On the other hand, if Application A were to use HTTP with RESTful URLs, then the HTTP server could answer all requests from files or by dynamically creating the content.
So the question can only be answered in specific by knowing the details of Application A.
How should a Windows 8 Metro application connect to a central database?
I've read about local storage, but I haven't read anything about connecting to a central database.
Obviously, this architectural design decision needs to support the disconnected scenario.
WCF web services seem to make sense.
But even if they do make sense, should we really create separate methods for all read/write operations?
Or are OData WCF services the way to go?
It seems like tablet software architecture should be able to borrow a lot from smartphone software architecture (but I am new to both).
Has Microsoft made any recommendations in its app samples?
It appears that others are asking similar questions on the Microsoft Developer Forums.
Here is what I've found:
According to Tim Heuer:
...You cannot directly have a SQL db embedded in your app or use
something like ADO.NET. This is more of an async/services
infrastructure. So if your data was exposed via services, then of
course you could connect that way. There are some other light-weight
methods you could use for local storage as well using things like the
Windows.Storage namespace (which is similar to Isolated Storage in
.NET).
Morten Nielsen agrees:
You can use HttpClient to download pretty much anything from the web.
Why don't you configure your WCF service to return data as JSON, and
use the DataContractJsonSerializer to deserialize the results?
Also, Tim Heuer cautions:
...Please note that while awesome, the SQLWinRT project on codeplex is a
wrapper to communicate with the classic SQLite engine...which uses
APIs that would not pass store validation currently.
Generic Object Storage Helper for WinRT and WinRT File Based Database seem to have some promise.
But Daniel Stolt raises some good points:
It's awesome that there is good support for building OData clients and
other REST clients - but this only addresses the online scenario. The
"structured" part of Windows.Storage is a very limited model,
essentially limited to name/value pairs, insufficient for all but the
most basic scenarios. Yes there is local file storage, which is great
of course. But forcing every app developer out there to build her own
DBMS on top of local file storage will simply not cut it, especially
with all of System.Data having been removed from the profile. If local
file storage was sufficient for most device apps, then things like
SQLCE would have no purpose today already. And SQLCE clearly has a
purpose, and has played a very important role for occasionally
connected device apps for a very long time. There is also a tremendous
need for synchronization with a server-side database such as SQL
Azure, mostly to be able to roam data between devices. Yes there is
the roaming storage model in WinRT, but it shares the same limitations
of local storage mentioned above, and on top of this is very limited
in capacity (currently 30KB if memory serves). It is simply
insufficient for all but the simplest roaming data needs. Again,
forcing every app developer to design and implement her own
synchronization solution is very bad. You can do much better to enable
developers.
Many people are disappointed that the System.Data namespace is not supported in WinRT.
Richard Bethell said:
I don't even have words for this. This is astonishing. Leave aside for
the moment they want to force you to abstract to middleware for
database connectivity - I don't agree, but I can quasi understand a
rationale for that. I can even see pathways for developing like that.
But no System.Data.... at all? Do you even understand what you've done
to us?
What System.Data can do, outside of just having providers for Sql,
OleDb and other custom providers like Oracle, is provide a rich
abstraction of XML datasets that allow you to very quickly build a
data oriented Service Oriented Architecture.
For instance, I can easily create a web service using SOAP or WCF that
returns DataSets or DataTables, and then consume those objects easily
and directly. Being able to do this allows very rapid construction of
n-tier architectures, even without direct data connections available.
Without System.Data, and the power of DataViews, DataTables, etc. this
gets a lot harder. Sure you can custom create structs, put data in
there, and serve up structs, and use Linq to do whatever sorting,
filtering, etc. you want to do.... but it ends up being twice the
work, and makes code reuse a lot harder. And it means using our
existing service oriented architecture is impossible (without a big
overhaul.)
The withdrawal of System.Data is as big a thing for developers to deal
with as the loss of the Printer object in VB6 to vb.net 1.0 was. What
is harder to understand in this case is why it is necessary -
re-enabling it in the Metro profile can't possibly be a technical
difficulty of the product, can it?
It is valuable enough that I would seriously consider including Mono's
System.Data classes as part of any app I create (which would obviously
have to be open source.)
I think that this is another of those "it depends" questions...
The first and most obvious issue is that whether the opening claim ("Obviously... needs to support the disconnected scenario") is actually true depends very much on the context in which the application is running: if the app is an internal corporate app then quite possibly not, and in that case no DB == doesn't work.
Secondly, you could look (hmm, rash... one assumes you could look, which may be a bad assumption) at database synchronisation between a local SQL database and the remote DB, and so on and so forth.
Taking a step back... yes, you're absolutely right to look at it as being the same as phone or Silverlight (although I don't know if there is RIA support yet), but the thing is, at this point it's very hard to be prescriptive, because given a general-purpose platform one can write applications to suit all sorts of purposes.
Not a hugely helpful answer really - but a start.
Having read Jim G's answer, it seems that I should probably withdraw mine?
I am considering building an app in C++ that will parse text from the web and produce some statistical results. I want these results to be fed into an external app in real time. The external app (to whose code I have no access, but for which I can ask for a paid, custom-made addition) will then need some code to read and use these results.
I am wondering what would be the best way to interconnect the two apps, in terms of speed and ease of implementation. I am considering:
disk I/O (slow)
a Windows service
a DLL
a web service
a web page
Perhaps I am missing a better solution? Thank you.
Update: there is an additional need to know how long a data request may take in the worst case.
Sockets?
Shared Memory?
RAM Disk?
TCP/IP?
Windows Messages?
Command line arguments (of the other application)?
What methods does the other application have to support receiving data?
A Windows service would be sensible, but it would still need to communicate with the other app; this is called IPC, and the approaches available on Windows are described here. Named pipes are simple and flexible; file mapping is powerful.
An alternative would be to stick a database in the middle?
There are a number of IPC mechanisms to choose from (sockets, shared memory, pipes, ...). I guess the "best choice" will depend to a large extent on how the other application is structured, i.e. on how much your custom extension will cost you.
I don't know much of your environment but it might be worthwhile to have a look at boost.interprocess:
http://www.boost.org/doc/libs/1_43_0/doc/html/interprocess.html
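As a taste of what that library gives you (the segment name "StatsSegment" and the Stats struct are invented for this example, and concurrent access would additionally need an interprocess mutex), both processes can share a named object like this:

    // Sketch: producer publishes statistics, consumer looks them up by name.
    #include <boost/interprocess/managed_shared_memory.hpp>

    namespace bip = boost::interprocess;

    struct Stats { double score; int samples; };

    int main()
    {
        // Producer side: create (or open) a 64 KB shared segment and put
        // a named Stats object into it.
        bip::managed_shared_memory seg(bip::open_or_create,
                                       "StatsSegment", 65536);
        Stats *s = seg.find_or_construct<Stats>("stats")();
        s->score = 0.97;
        s->samples = 1234;

        // Consumer side (in the other process), look it up by name:
        //   std::pair<Stats*, std::size_t> r = seg.find<Stats>("stats");
        return 0;
    }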
I am currently working on a C++/COM project using ArcEngine (from ESRI). Aside from the fact that there is little to no support in terms of documentation (the SDK is there), I am wondering if anyone here has had any experience making the initialization process of ArcEngine faster. Right now it takes 30-35 seconds just to initialize the engine, and we are going to be running several of these applications. Does anyone have any experience with this?
It's a very weird and odd task, but ESRI's developer forums are no help, and I couldn't find anything on Google.
Any ideas?
It's been almost a decade since I last played with ESRI stuff, so I can't help you with anything specific to ArcEngine.
Maybe you can pool instances? In the best-case scenario you would be able to reuse ArcEngine instances, returning each instance to the pool after you're done with it.
If that's not possible, you could at least try to have a number of instances ready to roll, although whether that is possible and/or useful depends a lot on the specifics of your app.
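In plain C++ terms the pool itself is simple; the sketch below is generic and assumes your ArcEngine wrapper type can safely be handed between users (an assumption worth checking against COM apartment rules):

    // Generic instance pool: pay the expensive initialization once per
    // instance, up front, then check instances out and back in.
    #include <memory>
    #include <mutex>
    #include <vector>

    template <typename T>
    class InstancePool {
    public:
        explicit InstancePool(std::vector<std::unique_ptr<T>> instances)
            : pool_(std::move(instances)) {}

        std::unique_ptr<T> Acquire() {
            std::lock_guard<std::mutex> lock(mutex_);
            if (pool_.empty()) return nullptr; // or block/create on demand
            auto inst = std::move(pool_.back());
            pool_.pop_back();
            return inst;
        }

        void Release(std::unique_ptr<T> inst) {
            std::lock_guard<std::mutex> lock(mutex_);
            pool_.push_back(std::move(inst));
        }

    private:
        std::mutex mutex_;
        std::vector<std::unique_ptr<T>> pool_;
    };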
Is it really COM? In that case, the ArcEngine will be exposing a set of COM interfaces. COM interfaces are not magic, and not uniquely bound to one program. In fact, COM has explicit support for proxying. This is e.g. used by DCOM; you get a local proxy for the remote server.
In this case, it should be possible to write a custom COM proxy that fakes the initialization stuff but forwards everything else. Towards your client, the proxy's COM interface is identical, except faster. Towards ArcEngine, your proxy can wait quite a long time between calls.
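Stripped of the COM plumbing (IUnknown, registration, marshalling), the idea boils down to a lazy decorator; IEngine and CreateRealEngine below are hypothetical stand-ins for the real interfaces:

    // Sketch: a proxy that returns from Init() immediately and defers the
    // real, slow initialization until the first actual call.
    #include <memory>

    struct IEngine {
        virtual void Init() = 0;
        virtual void DoWork() = 0;
        virtual ~IEngine() = default;
    };

    // Hypothetical factory wrapping the real, slow-to-initialize engine.
    std::unique_ptr<IEngine> CreateRealEngine();

    class LazyEngineProxy : public IEngine {
    public:
        void Init() override {}          // fake: return immediately
        void DoWork() override {
            EnsureReal();                // pay the real cost on first use
            real_->DoWork();             // forward everything else as-is
        }
    private:
        void EnsureReal() {
            if (!real_) { real_ = CreateRealEngine(); real_->Init(); }
        }
        std::unique_ptr<IEngine> real_;
    };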
Something that I have found useful with getting ESRI products to start faster (not necessarily ArcEngine, but this probably applies) is to specify the port number (generally 27004) in the registry where the license server is defined.
HKEY_LOCAL_MACHINE\SOFTWARE\ESRI\License\LICENSE_SERVER
HKEY_LOCAL_MACHINE\SOFTWARE\ESRI\ArcInfo\Workstation\8.0\LICENSE_SERVER
When you set this in installation or through the desktop administrator, it is generally something like: #yourserver.name
Change this to 27004#yourserver.name
Again this may not solve your issue, but if you're not doing it, it's worth a try. I've found it to speed things up in our environment, both using a license manager on a network and with a hardware dongle on the local machine.
Well, from my understanding, ArcEngine initialization initializes a special COM environment.
You don't ever get any sort of real handle on the initialized environment. Can you somehow store a COM environment and pass it to other programs? My current idea is:
A Windows service running in the background with an initialized ArcEngine. A program somehow queries the service, and the service returns the COM environment. Is this even possible?
I had a lot of grief with ESRI forums providing very little help. It feels like Arc* developers are largely on their own.
Using ArcEngine + .Net the initialization time for an application has been trivial (maybe 1 second?) in our environment -- are you using a slow remote server or is this JUST the engine with no network or maps being loaded?
Whenever I've had to deal with large data sets, though, ESRI has been a pig.
Good to see some discussion on SO of ESRI products! Not a lot here yet...
Exactly what line is taking 45 seconds? If I had to do some psychic debugging, I would guess that you are running into a problem with your license server.
Check that first.