Grails plugin to consume a web service with a per-user ID property - web-services

Good morning everyone.
I am developing a plugin for my project that will consume a web service in the system, and my question is how to have a general access pattern where each user has an ID registered in the database, giving them access to the API that will be consumed.
Example DB:
User 1 - Name, API_KEY ...
User 2 - Name, API_KEY ...
My question is: what is the best way to obtain these values while keeping my plugin decoupled from the rest of my system, so that it can even be reused by others? My first idea is a bean that receives its configuration via constructor parameters:
class ExpediaConfiguration implements Configuration {
    private final String cid;
    private final String apiKey;
    private final String minorRev;
    private final String customerSessionId;
    private final String customerIpAddress;

    public ExpediaConfiguration(String cid, String apiKey, String minorRev, String customerSessionId, String customerIpAddress) {
        this.cid = cid;
        this.apiKey = apiKey;
        this.minorRev = minorRev;
        this.customerSessionId = customerSessionId;
        this.customerIpAddress = customerIpAddress;
    }

    @Override
    public void configure() {
        // TODO Auto-generated method stub
    }
}
Maybe having it receive a Properties object would be more interesting?
But I'm not sure this is the ideal implementation; perhaps Spring could help me here, but I don't know how.
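Just to make the idea more concrete, here is roughly the direction I am considering (ApiCredentials, ApiCredentialsRepository and ExpediaConfigurationFactory are placeholder names I made up, not existing classes): a small Spring-managed factory reads the stored credentials for a user and builds the configuration from them.

// Rough sketch only: ApiCredentials and ApiCredentialsRepository are placeholder
// names for whatever ends up reading the per-user row from the database.
interface ApiCredentialsRepository {
    ApiCredentials findByUserId(long userId);
}

class ApiCredentials {
    final String cid;
    final String apiKey;
    final String minorRev;
    final String customerSessionId;
    final String customerIpAddress;

    ApiCredentials(String cid, String apiKey, String minorRev,
                   String customerSessionId, String customerIpAddress) {
        this.cid = cid;
        this.apiKey = apiKey;
        this.minorRev = minorRev;
        this.customerSessionId = customerSessionId;
        this.customerIpAddress = customerIpAddress;
    }
}

// A Spring-managed factory: the host application supplies the repository bean,
// and the plugin builds one ExpediaConfiguration per user on demand.
class ExpediaConfigurationFactory {
    private final ApiCredentialsRepository repository;

    ExpediaConfigurationFactory(ApiCredentialsRepository repository) {
        this.repository = repository;
    }

    ExpediaConfiguration forUser(long userId) {
        ApiCredentials c = repository.findByUserId(userId);
        return new ExpediaConfiguration(c.cid, c.apiKey, c.minorRev,
                c.customerSessionId, c.customerIpAddress);
    }
}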
I appreciate the answers.

Related

aem-mocks property test a servlet

Trying to write some proper AEM integration tests using the aem-mocks framework. The goal is to test a servlet by calling its path, e.g. an AEM servlet:
@SlingServlet(
    paths = {"/bin/utils/emailSignUp"},
    methods = {"POST"},
    selectors = {"form"}
)
public class EmailSignUpFormServlet extends SlingAllMethodsServlet {

    @Reference
    SubmissionAgent submissionAgent;

    @Reference
    XSSFilter xssFilter;

    public EmailSignUpFormServlet() {
    }

    public EmailSignUpFormServlet(SubmissionAgent submissionAgent, XSSFilter xssFilter) {
        this.submissionAgent = submissionAgent;
        this.xssFilter = xssFilter;
    }

    @Override
    public void doPost(SlingHttpServletRequest request, SlingHttpServletResponse response) throws IOException {
        String email = request.getParameter("email");
        submissionAgent.saveForm(xssFilter.filter(email));
    }
}
Here is the corresponding test to try and do the integration testing. Notice how I've called the servlet's 'doPost' method, instead of 'POST'ing via some API.
public class EmailSignUpFormServletTest {

    @Rule
    public final AemContext context = new AemContext();

    @Mock
    SubmissionAgent submissionAgent;

    @Mock
    XSSFilter xssFilter;

    private EmailSignUpFormServlet emailSignUpFormServlet;

    @Before
    public void setup() {
        MockitoAnnotations.initMocks(this);
        Map<String, String> report = new HashMap<>();
        report.put("statusCode", "302");
        when(submissionAgent.saveForm(any(String.class))).thenReturn(report);
    }

    @Test
    public void emailSignUpFormDoesNotRequireRecaptchaChallenge() throws IOException {
        // Setup test email value
        context.request().setQueryString("email=test.only@mail.com");
        //===================================================================
        /*
         * WHAT I END UP DOING:
         */
        // instantiate a new instance of the servlet
        emailSignUpFormServlet = new EmailSignUpFormServlet(submissionAgent, xssFilter);
        // call the post method (simulate the POST call)
        emailSignUpFormServlet.doPost(context.request(), context.response());
        /*
         * WHAT I WOULD LIKE TO DO:
         */
        // send request using some API that allows me to do a POST to the framework
        // Example:
        // context.request().POST("/bin/utils/emailSignUp") <--- doesn't exist!
        //===================================================================
        // assert response is internally redirected, hence expected status is a 302
        assertEquals(302, context.response().getStatus());
    }
}
I've done a lot of research on how this could be done (here) and (here), and these links show a lot about how you can set various parameters for context.request() object. However, they just don't show how to finally execute the 'post' call.
What you are trying to do is mix a UT with an IT, so this won't be easy, at least with the aem-mocks framework. Let me explain why.
Assuming that you are able to call your required code
/*
* WHAT I WOULD LIKE TO DO:
*/
// send request using some API that allows me to do post to the framework
// Example:
// context.request().POST("/bin/utils/emailSignUp") <--- doesn't exist!
//===================================================================
Your test will end up executing all the logic in SlingAllMethodsServlet class and its parent classes. I am assuming that this is not what you want to test as these classes are not part of your logic and they already have other UT/IT (under respective Apache projects) to cater for testing requirements.
Also, looking at your code, the bulk of your core logic resides in the following snippet:
String email = request.getParameter("email");
submissionAgent.saveForm(xssFilter.filter(email));
Your UT criteria is already met by the following line of your code:
emailSignUpFormServlet.doPost(context.request(),context.response());
as it covers most of that logic.
Now, if you are looking for a proper IT that posts the parameters and passes them all the way down to the doPost method, then aem-mocks is not the framework for that, because it does not provide this in a simple way.
You can, in theory, mock all the layers from resource resolver, resource provider and sling servlet executors to pass the parameters all the way to your core logic. This can work but it won't benefit your cause because:
Most of the code is already tested via other UTs.
Too many internal mocking dependencies might make the tests flaky or version dependent.
If you really want to do a pure IT, then it will be easier to host the servlet in an instance and access it via HttpClient. This will ensure that all the layers are hit. A lot of tests are done this way, but it feels a bit heavy-handed for the functionality you want to test, and there are better ways of doing it.
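For illustration, here is a rough sketch of that pure-IT style using the JDK 11 HTTP client (any HTTP client would do). The host, port, credentials and URL below are assumptions for a local author instance, and the selector/extension may need adjusting to match how the servlet is actually resolved:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

import org.junit.Test;

import static org.junit.Assert.assertEquals;

public class EmailSignUpFormServletIT {

    // Assumed local author instance; adjust host, port and credentials as needed.
    private static final String SERVLET_URL = "http://localhost:4502/bin/utils/emailSignUp";

    @Test
    public void postingEmailRedirectsWith302() throws Exception {
        String auth = Base64.getEncoder()
                .encodeToString("admin:admin".getBytes(StandardCharsets.UTF_8));

        HttpRequest request = HttpRequest.newBuilder(URI.create(SERVLET_URL))
                .header("Authorization", "Basic " + auth)
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString("email=test.only@mail.com"))
                .build();

        HttpResponse<String> response = HttpClient.newBuilder()
                .followRedirects(HttpClient.Redirect.NEVER) // keep the 302 visible
                .build()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The servlet is expected to redirect internally, hence 302.
        assertEquals(302, response.statusCode());
    }
}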
Also, the reason context.request().POST doesn't exist is that context.request() is a mocked state for the sake of testing. You want to actually bind and mock HTTP POST operations, which needs some way to resolve to your servlet, and that is not supported by the framework.
Hope this helps.

Unit Testing a Domain Model Containing Lists Populated From The Database

I am currently in the middle of writing unit tests for my domain models.
To give a bit of context, I have a Role Group class which has a list of Roles; it also has a list of Users that currently have this Role Group assigned to them.
A Role Group is assigned to a User so all the methods to do this are in the User domain model. This means the Users property on the Role Group side is basically pulled in from the database.
Both Role Group and User are Aggregate Roots and they can both exist on their own.
What I am struggling with is that I cannot test the CanArchive method below because I have no way of adding a User to the property, apart from the bad option of using the Add method, which I don't want to use as it breaks the whole idea of domain models controlling their own data.
So I am not sure if my Domain Models are wrong or if this logic should be placed in a Service as it is an interaction between two Aggregate Roots.
The Role Group Class:
public bool Archived { get; private set; }
public int Id { get; private set; }
public string Name { get; private set; }
public virtual IList<Role> Roles { get; private set; }
public virtual IList<User> Users { get; private set; }
Updating Archived Method:
private void UpdateArchived(bool archived)
{
    if (archived && !CanArchive())
    {
        throw new InvalidOperationException("Role Group can not be archived.");
    }
    Archived = archived;
}
Method to check if Role Group can be Archived
private bool CanArchive()
{
    if (Users.Count > 0)
    {
        return false;
    }
    return true;
}
Method that sets the User's Role Group in the User class
This is called when a user is created or updated in the user interface.
private void UpdateRoleGroup(RoleGroup roleGroup)
{
    if (roleGroup == null)
    {
        throw new ArgumentNullException("roleGroup", "Role Group can not be null.");
    }
    RoleGroup = roleGroup;
}
A few thoughts:
Unit testing a domain object should not rely upon persistence layer stuff. As soon as you do that, you have an integration test.
Integrating changes between two aggregates through the database is theoretically not a good idea in DDD. Changes caused by User.UpdateRoleGroup() should either stay in the User aggregate, or trigger public domain methods on other aggregates to mutate them (in an eventually consistent way). If those methods are public, they should be accessible from tests as well.
With Entity Framework, none of that really matters since it is not good at modelling read-only collections and you'll most likely have a mutable Users list. I don't see calling roleGroup.Users.Add(...) to set up your data in a test as a big problem, even though you should not do it in production code. Maybe making the list internal with InternalsVisibleTo your test project - as @NikolaiDante suggests - and wrapping it into a public read-only collection would make that a bit less dangerous.
What I am struggling with is that I cannot test the CanArchive method below because I have no way of adding a User to the property.
How does the framework that loads the RoleGroup instances from the database populate the users? This is the question you must ask yourself to find the solution for your unit tests. Just do it the same way.
I don't know what language you use. In Java, for example, you can use the reflection API to set private fields. There are also a lot of test frameworks that provide convenience methods for this job, e.g. Deencapsulation or Whitebox.
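For example, a minimal plain-reflection helper in Java could look like the sketch below. The field name you pass in is an assumption about how the backing field is actually declared, and getDeclaredField only looks at the exact class, not its superclasses:

import java.lang.reflect.Field;

public final class ReflectionTestSupport {

    private ReflectionTestSupport() {
    }

    // Sets a private field on the given target object, for test setup only.
    public static void setPrivateField(Object target, String fieldName, Object value)
            throws ReflectiveOperationException {
        Field field = target.getClass().getDeclaredField(fieldName);
        field.setAccessible(true);
        field.set(target, value);
    }
}

// Usage in a test, assuming a "users" backing field on the role group class:
//
//   ReflectionTestSupport.setPrivateField(roleGroup, "users", listWithOneUser);
//   assertFalse(roleGroup.canArchive());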

Error when deserializing Java client request to WCF service

I have a WCF service with only one method so far:

[OperationContract]
void SaveDocument(InwardDocument doc);

[DataContract]
public class InwardDocument {
    [DataMember]
    public Citizen Citizen { get; set; }
    ....
}

[DataContract]
public class Citizen {
    [DataMember]
    public string LastName { get; set; }
    ....
}
I've tested the service with both WCF test client and a separate .NET console application. In both cases the service works as expected. But when a java client tries to consume it, a deserialization problem occurs. I've put some markers inside the SaveDocument method to see what causes the problem:
public void SaveDocument(InwardDocument doc) {
    if (doc == null)
        throw new ArgumentNullException("InwardDocument");
    if (doc.Citizen == null)
        throw new ArgumentNullException("InwardDocument.Citizen"); // This exception is thrown when consumed by the java client
}
As you can see, the first exception is skipped, which means the doc argument itself is not null, but for some reason the Citizen property is null. The guy who generates the request in the java client confirms by debugging that the InwardDocument.Citizen property is not null. In fact, we had a problem generating the proxy class in that java client, which I describe in this SO thread, so I'm assuming it has something to do with that same problem. Maybe I need to add some more attributes to my classes and their members to take care of any such problems that might occur on other platforms? Any suggestions are appreciated.
Have you tried adding the KnownType attribute to your InwardDocument class? See the link here.
[DataContract]
[KnownType(typeof(Citizen))]
public class InwardDocument {
    [DataMember]
    public Citizen Citizen { get; set; }
    ....
}
The problem was caused by incorrect creation of the corresponding JAXBElement instances. The solution to the problem is in this SO thread answer.
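Roughly, the fix on the Java side looked like the sketch below. The namespace, element name and the JAXBElement-typed setter are assumptions about what wsimport generated from the WSDL; the exact names have to be taken from the generated proxy classes (or built via the generated ObjectFactory).

import javax.xml.bind.JAXBElement;
import javax.xml.namespace.QName;

public class InwardDocumentFactory {

    // Assumed namespace and element name; they must match the service's XSD.
    private static final QName CITIZEN_QNAME =
            new QName("http://tempuri.org/", "Citizen");

    public static InwardDocument create(Citizen citizen) {
        InwardDocument doc = new InwardDocument();
        // If the generated property is JAXBElement<Citizen>, assigning a bare
        // Citizen (or leaving it unset) makes the element disappear from the
        // SOAP payload, which the WCF side then deserializes as null.
        doc.setCitizen(new JAXBElement<>(CITIZEN_QNAME, Citizen.class, citizen));
        return doc;
    }
}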

Http Post with WebAPI Causing "The HttpOperationHandlerFactory is unable to determine the input parameter..."

I have a method that receives more than one parameter. The method signature with attributes looks like the following:
[WebInvoke(Method = "POST", ResponseFormat = WebMessageFormat.Json, RequestFormat = WebMessageFormat.Json)]
public int AddUser(string firstName, string lastName, string emailaddress) { /* actions here */ }
However, when I use this method, I get the following exception:
The HttpOperationHandlerFactory is unable to determine the input
parameter that should be associated with the request message content
for service operation 'Initiate'. If the operation does not expect
content in the request message use the HTTP GET method with the
operation. Otherwise, ensure that one input parameter either has it's
IsContentParameter property set to 'True' or is a type that is
assignable to one of the following: HttpContent, ObjectContent`1,
So, I've created a custom object (such as below) to be passed in.
[DataContract]
public class UserToAdd {
[DataMember] public string firstName { get; set; }
[DataMember] public string lastName { get; set; }
[DataMember] public string emailAddress { get; set; }
}
Using this new signature:
[WebInvoke(Method = "POST", ResponseFormat = WebMessageFormat.Json, RequestFormat = WebMessageFormat.Json)]
public int AddUser(UserToAdd user) { /* actions here */ }
When I do that, I get a 404. It seems I can't win. Any suggestions?
If you want to create routes declaratively you can. I had to do this as I had inherited a bunch of non-restful URIs that had to be supported for backwards compatibility reasons. I created my own attribute to describe URIs, constraints and HTTP methods. This is effectively a replacement for WebInvoke/WebGet. I reflect over my service methods on start-up to discover routes and call MapHttpRoute() as appropriate. Each of my routes specifies the controller and action explicitly. This approach is good for RPC style APIs, but it required quite a lot of heavy lifting. The nice part is that it keeps the definition of routes with the methods - this is not something web api gives you explicitly.
So whilst RPC style is possible, it's not idiomatic. Web API is strongly biased out of the box towards RESTful APIs with convention driven mapping of routes to services and methods - in this way very generic routes are registered and conventions do the rest. That largely avoids the issues of defining routes away from their corresponding actions. If you can I would convert to the RESTful/convention approach of Web API, as you are otherwise fighting the framework a little.

Exposing existing API as a Web service

I am currently working on a task to expose an API as a Web service. The idea here is to package the existing business logic in JAR files into a WAR file and expose the WAR file as a Web service that would return a free-form XML string. When we expose an existing API as a Web service, is it sufficient that we make available an XSD & WSDL file of the returned XML string data? Is that the convention or the standard practice?
It depends on whether you are using SOAP or REST. SOAP is more restrictive; as a result, it's more expected that you'll have a WSDL file to generate the classes that interface with the API.
On the other hand, if you are using REST, just exposing a RESTful URI would be considered enough to meet the constraint of a RESTful web service having a uniform interface.
REST has been gaining ground over SOAP since it is a more permissive architectural style. I would prefer it, and I would recommend it if you're new to developing web services.
Depending on what language you are using (I'm assuming Java), you can use Restlet or Spring 3.0's REST support to help you build a RESTful web service. These tools really make the job a lot easier and help you conform to the 6 Constraints of a RESTful Web Service and meet the 4 Key Goals.
UPDATE:
Assuming you already have existing, object-oriented code and want to expose it as a REST API using Spring 3.0 MVC, create a controller class that wraps your existing package:
Example GET:
Resource: Javadocs for Jackson's ObjectMapper POJO/JSON Marshaller
// this is the wrapper around your existing Java packages.
@Controller
public class UserController {

    protected static final String DATA_TYPE = "json";

    // In REST, the GET method is used to retrieve data with no side effects,
    // meaning that no changes are made to the data on the server.
    @RequestMapping(value = "/users/{username}", method = RequestMethod.GET)
    public void getUserData(@PathVariable("username") String username, HttpServletResponse response) throws IOException {
        // this is your existing class
        UserDataService userDataService = new UserDataService();
        // assume you have a class User, and getUserDetails gives you that POJO object.
        User user = userDataService.getUserDetails(username);
        // marshal the User object to JSON, using Jackson, and write it as output in the response
        ObjectMapper mapper = new ObjectMapper();
        mapper.writeValue(response.getWriter(), user);
    }
}
// assume you have an existing POJO class called User
class User implements Serializable {
    String username;
    String age;
    String birthday;
    String mood;

    public String getMood() { return this.mood; }
    public String getBirthday() { return this.birthday; }
    public String getAge() { return this.age; }
    public String getUsername() { return this.username; }

    public void setMood(String mood) { this.mood = mood; }
    public void setBirthday(String birthday) { this.birthday = birthday; }
    public void setAge(String age) { this.age = age; }
    public void setUsername(String username) { this.username = username; }
}
Request:
http://api.example.com:8080/users/jmort253/
Response:
{
    "username": "jmort253",
    "mood": "good",
    "age": "not too old and not too young",
    "birthday": "Jan 1, 1900"
}
XML instead of JSON:
The main difference between returning XML and returning JSON is the marshaller used. Using the javax.xml.bind.annotation package, you can place annotations on the POJO class so the marshaller can convert it to XML, freeing you from having to hand-code the XML:
Using javax.xml.bind.annotation to convert Java objects to XML and XSD. This resource also explains how to generate the XML Schema, if you deem that a requirement for your REST Web service.
@XmlRootElement
class User implements Serializable {
    String username;
    String age;
    String birthday;
    String mood;

    public String getMood() { return this.mood; }
    public String getBirthday() { return this.birthday; }
    public String getAge() { return this.age; }
    public String getUsername() { return this.username; }

    public void setMood(String mood) { this.mood = mood; }
    public void setBirthday(String birthday) { this.birthday = birthday; }
    public void setAge(String age) { this.age = age; }
    public void setUsername(String username) { this.username = username; }
}
Instead of using the Jackson API's ObjectMapper class to marshal the POJO to JSON, use the JAXB classes from javax.xml.bind in place of ObjectMapper:
JAXBContext context = JAXBContext.newInstance(User.class);
Marshaller marshaller = context.createMarshaller();
// pretty print XML
marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);
marshaller.marshal(user, System.out);
Aside from the other resources, this article has a few examples that use JAXB to serialize an ArrayList of POJO objects to XML.
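As a small illustration of the list case (the UserList wrapper below is an assumption added for the example, not something taken from the article), JAXB needs a root element, so the collection gets wrapped in its own annotated type:

import java.util.ArrayList;
import java.util.List;

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlRootElement;

// Hypothetical wrapper: a bare List<User> has no root element of its own.
@XmlRootElement(name = "users")
class UserList {
    private List<User> users = new ArrayList<>();

    @XmlElement(name = "user")
    public List<User> getUsers() { return users; }

    public void setUsers(List<User> users) { this.users = users; }
}

class UserListXmlExample {
    public static void main(String[] args) throws Exception {
        UserList list = new UserList();
        // ... populate list.getUsers() with User instances ...

        JAXBContext context = JAXBContext.newInstance(UserList.class, User.class);
        Marshaller marshaller = context.createMarshaller();
        marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);
        marshaller.marshal(list, System.out); // <users><user>...</user></users>
    }
}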
My final suggestion when working on the REST Web service wrappers is to set your logging levels to "ALL" or "DEBUG". I find that this helps me more easily determine the root cause of any problems I face when setting up a Web service. The libraries themselves will output helpful debug messages to help you resolve configuration issues, such as missing dependencies, missing annotations, and other issues that you'll likely encounter when dealing with the conversion to XML/JSON process or in setting up Spring 3.0.
Once your uniform interfaces are set up and you can make GET requests and receive responses, you can then set the logging levels back to the previous INFO or WARN levels.
First of all, I would hesitate before exposing an existing API as a web service on a one-for-one basis. Was the existing API written to be accessed over a network? If not, then it was probably not designed with networking constraints in mind.
It may include method calls which involve large numbers of small operations - the kind that cost nothing when used within a single process. Over a network, each call has an associated latency that is much larger than the overhead of calling a method within the same process.
Instead, I would design a service to meet the functional requirements of the API. The service would likely be designed to have a smaller number of operations which perform more work per operation, thereby minimizing the overhead associated with network traffic. The service would likely be implemented by calling on the API (assuming it is written to handle multi-threaded environments like a service).
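As a rough sketch of that idea (all of the type names below are placeholders, not your actual API), the service exposes one coarse-grained operation that internally makes several cheap in-process calls, so the client pays the network latency only once:

import java.util.List;

// Everything here is a placeholder sketch, not part of the original API.
class Account { String id; String name; }
class Order { String id; }
class AccountSummary {
    Account account;
    List<Order> openOrders;
    AccountSummary(Account account, List<Order> openOrders) {
        this.account = account;
        this.openOrders = openOrders;
    }
}

// The existing fine-grained, in-process API being wrapped.
interface AccountApi {
    Account findAccount(String accountId);
    List<Order> findOpenOrders(String accountId);
}

/**
 * Coarse-grained service facade: one remote operation that internally makes
 * several cheap in-process calls, minimizing round trips over the network.
 */
class AccountSummaryService {
    private final AccountApi api;

    AccountSummaryService(AccountApi api) {
        this.api = api;
    }

    AccountSummary getAccountSummary(String accountId) {
        return new AccountSummary(
                api.findAccount(accountId),
                api.findOpenOrders(accountId));
    }
}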
In terms of WSDL, the toolkit you are using may very well construct a WSDL for you. I know that WCF in .NET does that, and I have done the same using IBM Rational Web Developer, so I know that the Java world can do the same.
Otherwise, it's not actually that hard to hand-write a WSDL and corresponding schema. In either case, they do need to be provided so that your clients can consume the service.
There's nothing wrong with using REST for this if your API can be cleanly expressed as a set of operations on resources. In this case, yes, provide the schema to make it easier for your clients to process the XML. I would beware of forcing your API to fit the REST model if it is not cleanly expressible as operations on resources.