Suppose I have a UML model conforming to the org.eclipse.uml2.uml metamodel. Suppose that this model contains a Class, a Property, and an ownedAttribute relationship between them.
At the Ecore level, the Class and the Property are EObjects, while the ownedAttribute is an EReference.
My task is as follows: given an EObject, retrieve all of its EReferences. I can accomplish this using the following code snippet:
for (EReference eRef : myEObject.eClass().getEAllReferences()) {
    if (myEObject.eIsSet(eRef)) {
        // found a relevant EReference
    }
}
Going back to the UML example above, this code snippet would identify all of the following EReferences: ownedElement, ownedMember, member, feature, attribute, ownedAttribute, role.
My problem: out of the identified EReferences, I would like to keep only ownedAttribute, since this relationship subsets all of the others according to the UML standard. However, the Ecore metamodel does not specify any kind of hierarchy between EReferences. What approach could I use to filter out the more general EReferences that I am not interested in?
You can filter out 'derived' references, i.e. keep only those EReferences for which org.eclipse.emf.ecore.EStructuralFeature.isDerived() returns false. In the UML2 metamodel, the more general references you listed (ownedElement, member, feature, attribute, role, ...) are derived unions, while ownedAttribute is not derived.
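For example, a minimal sketch of the filtered loop, reusing myEObject from the snippet in the question; only the isDerived() check is new:

for (EReference eRef : myEObject.eClass().getEAllReferences()) {
    // Skip derived unions such as ownedElement, member, feature,
    // attribute, and role; keep concrete references like ownedAttribute.
    if (!eRef.isDerived() && myEObject.eIsSet(eRef)) {
        // found a relevant, non-derived EReference
    }
}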
I need to create a like/dislike system that can be used on any entity. I'm going to create a Like entity with an 'Id', an 'Entity' (which can be anything), an 'author' (ManyToOne with the User class) and a 'like' (boolean).
I just want to know if there is a good way to do it.
I can't use table inheritance (MappedSuperclass) because this entity will be part of a bundle (SocialBundle) that can be used in several projects (it will be a vendor).
I have no code to show you because I'm still in the analysis phase.
Thanks!
Create an interface for that entity, and later you can map this interface to any concrete entity using the addResolveTargetEntity method. See this.
I have two questions related to coder issues I am facing with my Dataflow pipeline.
How do I go about setting a coder for my custom data types? The class consists of just three items - two doubles and another parameterized property. I tried annotating the type with SerializableCoder but I still end up with the error "com.google.cloud.dataflow.sdk.coders.CannotProvideCoderException: Cannot provide coder based on value with class interface java.util.Set: No CoderFactory has been registered for the class." The Set actually contains the parameterized custom data type, so I am assuming that the custom data type is the problem. I could not find enough documentation/examples on the right way to do this. Please point me to the right place if it's available.
Even without the custom data type, whenever I try switching to a parameterized version of my transform functions, it results in coder errors. Specifically, inside a complex transform which is parameterized, a ParDo works with parameterized types, but when I apply a Combine.PerKey on the resulting PCollection after the ParDo, it results in a CoderNotFoundException.
Any help regarding these two items would be appreciated, as I have been stuck on this for some time now.
It looks like you have been bitten by two issues. Thanks for bringing them to our attention! Fortunately, there are easy workarounds for both while we improve things.
The first issue is that the default coder registry does not have an entry for mapping Set.class to SetCoder. We have filed GitHub issue #56 to track its resolution. In the meantime, you can use the following code to perform the needed registration:
pipeline.getCoderRegistry().registerCoder(Set.class, SetCoder.class);
The second issue is that parameterized types currently require advanced treatment in the coder registry, so the @DefaultCoder annotation will not be honored. We have filed GitHub issue #57 to track this. The best way to ensure that SerializableCoder is used everywhere for CustomType is to register a CoderFactory for your type that will return a SerializableCoder. Supposing your type is something like this:
public class CustomType<T extends Serializable> implements Serializable {
    T field;
}
Then the following code registers a CoderFactory that produces appropriate SerializableCoder instances:
pipeline.getCoderRegistry().registerCoder(CustomType.class, new CoderFactory() {
    @Override
    public Coder<?> create(List<? extends Coder<?>> componentCoders) {
        // No matter what T is, return a SerializableCoder
        return SerializableCoder.of(CustomType.class);
    }

    @Override
    public List<Object> getInstanceComponents(Object value) {
        // Return the T inside your CustomType<T> to enable coder inference for Create
        return Collections.singletonList(((CustomType<Object>) value).field);
    }
});
Now, whenever you use CustomType in your pipeline, the coder registry will produce a SerializableCoder.
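For instance, here is a minimal sketch of the registered factory in action; the string payload and the field assignment are illustrative assumptions (the field is package-private in the snippet above):

// Hypothetical usage: with the factory registered, coder inference
// now succeeds for PCollections of CustomType, e.g. via Create.
CustomType<String> value = new CustomType<>();
value.field = "hello"; // assumes access to the package-private field
PCollection<CustomType<String>> values = pipeline.apply(Create.of(value));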
Note that SerializableCoder is not deterministic (two objects that are equals() do not necessarily encode to the same bytes), so values encoded using this coder cannot be used as keys in a GroupByKey operation.
I'm using POCO to auto-generate my entities from the DAL project into the Entities project. I currently have no need to create view classes manually.
However, I have one problem: when I try to return a POCO object that has navigation properties from a [WebMethod], I get the following error:
Cannot serialize member Entities.City.Customers of type System.Collections.Generic.ICollection`1[[Entities.Customer, Entities, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null]] because it is an interface.
I tried writing context.ContextOptions.LazyLoadingEnabled = false; and
context.ContextOptions.ProxyCreationEnabled = false; to no avail.
If I add [System.Xml.Serialization.XmlIgnore] before the properties, I get no error, but then I lose those properties.
The message is clear: serialization fails because your Entities.City.Customers member is declared as an interface (ICollection).
The interface does not say anything about the implementing type, it only defines the contract that the implementation should follow. As such, the serializer does not know how to represent the implementation in a serialized format.
You might think that it's not that hard to reflect on the type and serialize based on the information you get from introspection, but the problem comes when you try to deserialize from this representation: the same representation could correspond to several implementing types, in which case what should the serializer choose as the concrete type?
There are a few ways to work around this limitation, as you can find in this post: XML serialization of interface property. In your particular case, the simplest one would be to declare the Entities.City.Customers member with a concrete type like List<Customer> instead of ICollection<Customer>.
I have seen the [DebuggerNonUserCode] and [ExcludeFromCodeCoverage] attributes in resources and other SO questions about excluding code from coverage statistics, and wanted to know if it is possible to automatically add this attribute to the classes in the code generated by the Entity Framework using .NET 4.0.
Also would it need to be class level or could it be on the diagram.Designer.cs level, needing one attribute for all code generated by that diagram?
Since partial classes (which Entity Framework creates) merge attributes, extended functionality in other partial classes would also be excluded if the attribute were applied at the class level in the template, so it has to be applied at the method level.
The best way that I've found to do this is using T4 (as recommended in @Craig Stuntz's answer) to:
include: using System.Diagnostics.CodeAnalysis; at the top of the file
Then apply [ExcludeFromCodeCoverage] to getters, setters and Factory methods by searching for:
#>get
#>set
Template_FactoryMethodComment
and placing them in the appropriate place.
This was made a lot easier using Tangible's T4 editor Extension for VS.
This is my first attempt and it seems to work; "your mileage may vary", so complete a test run to make sure everything's working as necessary.
I'm refactoring an existing class to support product import from a CSV file.
Requirements:
1) During import, products are identified by a special product number.
2) A product has a category attribute. If such a category exists, use its id; if not, create it first.
3) If a product with a given number already exists in the DB, update its other fields; otherwise, create a new product.
Goal:
Make a real unit test (no DB interaction) verifying that category creation/reuse works correctly.
Correct me if I'm wrong:
1) I need to inject a list of existing product categories.
2) Loop through the products parsed from the CSV file and check whether each category can be found in the injected categories list.
3) What should be returned? A repository of aggregates? (Should the aggregate root be a product or a product category?)
The problem is that we don't know what IDs the new categories will obtain from the DB.
Please give me some direction on how this problem can be solved.
I'm new to the Repository pattern (and the persistence-ignorant domain concept), and I use mock testing in my daily coding.
As far as I can tell, you need to ask about the previous existence of both products and categories. This hints at two different Repositories: a ProductRepository and a CategoryRepository.
Injecting a list of existing categories is one possible approach, but you would also need to inject a list of existing products.
Another alternative would be to inject both Repositories and simply ask them whether the product or category already exists. If you need other functionality provided by these Repositories, this may be a better option, since you already have the required dependencies.
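For example, a minimal sketch of those two Repositories, written in Java for illustration (the interface and method names are assumptions, not the asker's code):

// Hypothetical repository contracts; names are illustrative only.
public interface ProductRepository {
    Product findByNumber(String productNumber); // null if absent
}

public interface CategoryRepository {
    Category findByName(String name); // null if absent
    Category create(String name);     // returns the persisted Category, including its new id
}

In a unit test, both interfaces can be mocked, and create(...) can be stubbed to return a Category with whatever id you choose, which sidesteps the "we don't know what IDs the DB will assign" problem.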
You might also want to consider doing both to keep closer to the Single Responsibility Principle. One collaborator could be a service that constructs a new Product instance based on the parsed data and the existing categories and products. Another would be responsible for retrieving the existing data. Yet another class would implement the CSV parser.
All types would implement interfaces, so that you might have collaborators such as these:
public class CsvParser : IParser
public class DataRetriever : IDataRetriever
public class ProductCreator : IProductCreator
Your overall class would then be one that takes those three interfaces as dependencies and orchestrates their interaction.
This will allow you to unit test each in isolation using mocks for each dependency.
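For illustration, here is a minimal sketch of that orchestrating class, again in Java (every type and member below is an assumption mirroring the hypothetical collaborators above):

import java.util.ArrayList;
import java.util.List;

// Sketch only: the importer merely coordinates its collaborators,
// so each of them can be replaced by a mock in a unit test.
public class ProductImporter {
    private final IParser parser;
    private final IDataRetriever dataRetriever;
    private final IProductCreator productCreator;

    public ProductImporter(IParser parser, IDataRetriever dataRetriever,
                           IProductCreator productCreator) {
        this.parser = parser;
        this.dataRetriever = dataRetriever;
        this.productCreator = productCreator;
    }

    public List<Product> importFrom(String csvPath) {
        ExistingData existing = dataRetriever.retrieveAll(); // existing products/categories
        List<Product> imported = new ArrayList<>();
        for (ParsedRow row : parser.parse(csvPath)) {
            // Creates or updates a product, reusing or creating its category.
            imported.add(productCreator.create(row, existing));
        }
        return imported;
    }
}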