I'm trying to create a Web API application that can serve as a backend for 3-4 similar web sites. Sometimes I'll need to deploy changes and fixes to my backend that are not equally critical to all sites, so I want to expose the API at URLs that depend on the version (.../v1.0.0.3/StuffController/DoSomething). For that I need the ability to load several versions of my assemblies at once. I could write my own implementation of that using reflection, but I'm using an IoC container that loads my implementations. My question is: could the container also pick the assemblies based on a given parameter (the version number)?
I recently started working on a friend's project in Django, and we want to provide a REST API so other projects can consume our data. I'm starting to learn django-rest-framework (and django-rest-swagger for documentation). Is it possible to create the API as a separate service? That way we could dockerize it and serve the API from one container while keeping the application in its original container, so that heavy traffic to the API doesn't interfere with the application (by bringing it down, for example). If it is not possible, what is the best way to implement the API in the project?
Yes, this is possible. See here. Structurally, you will probably want to keep all presentation-related apps separate from the API-related apps.
I am interested in the pros/cons of refactoring a large application (industrial monitoring software) into a set of libraries / NuGet packages, rather than into stand-alone services. The perception is that they're almost identical, i.e. a piece of functionality can be built either as a library hosted within the application, or as a web service hosted externally to the application, and the only difference is the integration (code-level calls vs. SOAP or REST traffic over the network). I'm not sure it's that straightforward, so I'm looking for the pros/cons of each.
The perception is that they're almost identical...
If you squint your eyes hard enough I think they might look identical. But only up to a point.
You can implement some functionality in a web service; you can implement it in a library too. The web service provides an API to call; the library provides an API to call too. You can invoke the web service; you can invoke the library too.

But there are things you can't do by replacing a web service with a library.
Your libraries are in .dll files? How is a PHP application going to use them? Or a Java application? SOAP/REST are client-agnostic: your web service can use any technology stack, and your client can use any technology stack. With libraries you are stuck with the same technology you used to write the library in.
Web services are individually deployable components. Fix a bug in the web service, deploy it, and your 10 clients get the fix on their next call. Now fix a bug in the library, and you have to update all 10 clients to use the new version of the library.
How about security? A web service can have a database, for example. You don't get access to the database directly; you go through the API the web service provides. You have a library that uses a database? Guess where the connection string, username and password are found.
How about scalability? If you have a web service, you can scale it horizontally and your clients immediately benefit from the improvement. If the library is in the client, how are you going to scale it? By scaling the clients?
You can find similar discussions elsewhere (e.g. Web Service vs. Shared Library), but the point to take home is that it depends on what you are building and for whom. One is not a general replacement for the other.
If you are building a single monolithic application that makes use of all of these components, then maybe it does not matter to you whether the components are microservices or hosted libraries; with libraries it might actually be easier to build. But if your plan is to provide services to other clients as well, then hosted libraries won't take you all the way.
I have a hypothetical web application which is split up into a microservice architecture along these lines (as an example):
Clients A-C are web applications that serve HTML. Services 1-3 are backend services that handle CRUD and serve JSON. There are other clients (not pictured) that would not access the Frontend Service, namely native clients such as Android and iOS. I'm trying to figure out the best way to serve common frontend content (such as the header, footer and CSS) across all web clients. The best way I can think of is to create a Frontend Service that each web client can call to pull this common content. That way, changes to the common front end are reflected in each application immediately, without needing to update versions, recompile or redeploy.
My question is: what is the best way of doing this? I'm using Dropwizard to serve both the web clients and the services. The web clients serve Dropwizard Views (with FreeMarker templates) via Jetty. Is there a way to compose Dropwizard Views so that I can request a header and a footer view from the Frontend Service and wrap these around each view returned by the clients? Or am I going about this completely wrong? I know that FreeMarker supports template inheritance, but as far as I can tell this means the header/footer would have to live in each client or be pulled in from a common JAR (which would require updating version numbers and recompiling).
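To make the idea concrete, this is roughly the kind of endpoint I imagine the Frontend Service exposing (names are made up; Dropwizard resources are plain JAX-RS):

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;

    // Hypothetical resource in the Frontend Service; each web client would fetch
    // these fragments over HTTP and wrap them around its own views.
    @Path("/fragments")
    @Produces(MediaType.TEXT_HTML)
    public class FragmentResource {

        @GET
        @Path("/header")
        public String header() {
            return "<header>Common header markup</header>";
        }

        @GET
        @Path("/footer")
        public String footer() {
            return "<footer>Common footer markup</footer>";
        }
    }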
If you want to keep content synchronized between all the microservices, in your case the header and footer, I'd suggest ZooKeeper: it's designed for distributed coordination and has more of a push model, i.e. you'd update the header in ZooKeeper and all of your services would receive that update almost instantly.
I suggest the Curator library, as it's much easier to work with than ZooKeeper directly; the cache example might be a useful starting point.
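As an illustration, here is a minimal sketch of that kind of cache using Curator's NodeCache recipe, assuming the shared header markup lives at a made-up ZooKeeper node (the connection string and path are placeholders):

    import org.apache.curator.framework.CuratorFramework;
    import org.apache.curator.framework.CuratorFrameworkFactory;
    import org.apache.curator.framework.recipes.cache.NodeCache;
    import org.apache.curator.retry.ExponentialBackoffRetry;

    import java.nio.charset.StandardCharsets;

    public class HeaderWatcher {
        public static void main(String[] args) throws Exception {
            // Connection string and node path are placeholders.
            CuratorFramework client = CuratorFrameworkFactory.newClient(
                    "zk-host:2181", new ExponentialBackoffRetry(1000, 3));
            client.start();

            // NodeCache keeps a local, auto-refreshing copy of a single ZooKeeper node.
            NodeCache cache = new NodeCache(client, "/frontend/header");
            cache.getListenable().addListener(() -> {
                if (cache.getCurrentData() != null) {
                    // Fired whenever the node changes; re-render templates with the new markup.
                    String headerHtml = new String(cache.getCurrentData().getData(), StandardCharsets.UTF_8);
                    System.out.println("Header updated: " + headerHtml);
                }
            });
            cache.start(true); // build the initial cache before returning

            Thread.sleep(Long.MAX_VALUE); // keep the process alive in this sketch
        }
    }

In a real service you would hold the client and cache for the lifetime of the application and feed the cached markup into your template rendering.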
You can also use Hazelcast as a distributed Map/Cache. It is really easy to use (see the code examples), but if you want some of the enterprise features you have to pay a lot.
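A minimal sketch of the idea, with a made-up map name and keys (every member of the Hazelcast cluster sees the same entries):

    import com.hazelcast.core.Hazelcast;
    import com.hazelcast.core.HazelcastInstance;

    import java.util.Map;

    public class SharedFragments {
        public static void main(String[] args) {
            // Each service joins the same Hazelcast cluster and sees the same map.
            HazelcastInstance hz = Hazelcast.newHazelcastInstance();
            Map<String, String> fragments = hz.getMap("frontend-fragments");

            // One service publishes the fragment...
            fragments.put("header", "<header>Common header</header>");

            // ...and any other cluster member can read the same value.
            System.out.println(fragments.get("header"));
        }
    }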
I have a web application which is separated into a GUI project (JSF 2.0, Orchestra, Spring) and a service project (Spring, JPA, Hibernate, ...). Due to network issues between the web server and the database server, I need to split the application completely between the layers and deploy them on two different Tomcats, with the service part close to the database server. I have already generated a web service and a web service client with the Eclipse WTP CXF plugin.
My problem is: for the client it generates a copy of the domain model classes, so I can't use them directly in my GUI project and would need to introduce a conversion layer between the web service client and the GUI layer, which is cumbersome and error prone.
Is there a possibility to generate the web service client (out of the existing web service module and the WSDL) using the shared domain model (the model classes are in a separate project which both the service and GUI projects depend on)?
I'm desperately looking for a solution, as the deployment deadline is close...
Generating a copy of the domain model classes (DTOs) is good practice when you have two physical tiers: your Hibernate POJOs need to be de-proxied before being sent to the other physical tier. You could use Dozer for this, to avoid spending too much time writing the mapping code by hand.
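A minimal sketch of what that mapping might look like with the classic Dozer API (Customer and CustomerDto are hypothetical stand-ins for the real entity and DTO):

    import org.dozer.DozerBeanMapper;
    import org.dozer.Mapper;

    public class CustomerAssembler {

        // Stand-in for the Hibernate entity.
        public static class Customer {
            private String name;
            public String getName() { return name; }
            public void setName(String name) { this.name = name; }
        }

        // Stand-in for the DTO handed to the GUI tier.
        public static class CustomerDto {
            private String name;
            public String getName() { return name; }
            public void setName(String name) { this.name = name; }
        }

        private final Mapper mapper = new DozerBeanMapper();

        public CustomerDto toDto(Customer entity) {
            // Dozer copies properties with matching names; custom field
            // mappings can be configured via XML or the mapping API if they differ.
            return mapper.map(entity, CustomerDto.class);
        }
    }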
Maybe you should use RMI instead of web services if you need performance.
If you're absolutely determined to use your domain objects in the presentation layer, you should look at Gilead (formerly known as Hibernate4GWT).
Pure DTOs, DTOs with Dozer, and the use of Gilead are described in detail here:
http://code.google.com/intl/fr/webtoolkit/articles/using_gwt_with_hibernate.html
I am currently working on a SharePoint project that needs to use the Lists SharePoint web service (Lists.asmx). Therefore, we need to add a service reference to it in Visual Studio. However, we all develop and test on different virtual machines (with different VM names, URLs, etc.). The QA, Test and Production environments all have different names and URLs as well.
Adding a service reference adds a bunch of references to the URL that was specified when the reference was created (in the app.config, .wsdl, .disco files, etc.). This is obviously a problem for us, as code that works on one machine won't work anywhere else (which breaks the build and continuous integration). We also have to delete and re-add the service reference every time we work with code that was checked in by someone else.
This must be a fairly common problem for people developing web services, so I wondered if there was a way around it. I know you can't really create a "dynamic" web reference, but perhaps the impact of the URL change could be minimized somehow?
Thanks!
By default, the web service proxy uses the location where it was initially created. The generated proxy class has a Url property which can be set.
This example shows setting it dynamically: http://www.codeproject.com/KB/XML/wsdldynamicurl.aspx
EDIT:
You're also not limited to using the Add Web Reference feature in Visual Studio. You can use the wsdl.exe tool that ships with the .NET Framework SDK to generate the code file.