Google Cloud Dataflow: CloudBigtableScanConfiguration.withScan(), how to pass dynamic filter values? - google-cloud-platform

SourceLocation is the row-key prefix for my Bigtable scan, and it is currently fetched from application.properties. Is there a way to fetch it dynamically while running the Dataflow template?
My Pipeline:
pipeline.apply("ReadTable", Read.from(CloudBigtableIO.read(configSetUp(options))))

CloudBigtableScanConfiguration:

private static CloudBigtableScanConfiguration configSetUp(LocationSetupOptions options) {
    ValueProvider<Integer> pageFilter = options.getPageFilter();
    Scan scan = new Scan(Bytes.toBytes(options.getSourceLocation().get()));
    FilterList filterList = new FilterList();
    PrefixFilter prefixFilter = new PrefixFilter(Bytes.toBytes(options.getSourceLocation().get()));
    filterList.addFilter(new PageFilter(Long.valueOf(pageFilter.get())));
    filterList.addFilter(prefixFilter);
    scan.setFilter(filterList);
    return new CloudBigtableScanConfiguration.Builder()
        .withProjectId(options.getProjectId())
        .withInstanceId(options.getInstanceId())
        .withTableId(options.getTableId())
        .withScan(scan)
        .build();
}

There are two Bigtable connectors: CloudBigtableIO and BigtableIO. The CloudBigtableIO parameters have not been updated so that templates can modify them via a ValueProvider, but BigtableIO is compatible with ValueProviders.
In your particular case, since you want to use a ValueProvider together with a template, I would recommend moving to BigtableIO. A sample can be found in the AvroToBigtable template.
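For illustration, here is a minimal sketch of that approach, assuming your options interface exposes ValueProvider getters (the getter names mirror your LocationSetupOptions; everything else is illustrative, not a definitive implementation):

public interface LocationSetupOptions extends PipelineOptions {
    ValueProvider<String> getProjectId();
    void setProjectId(ValueProvider<String> value);

    ValueProvider<String> getInstanceId();
    void setInstanceId(ValueProvider<String> value);

    ValueProvider<String> getTableId();
    void setTableId(ValueProvider<String> value);

    // Row-key prefix, resolved from the template parameter at execution time
    ValueProvider<String> getSourceLocation();
    void setSourceLocation(ValueProvider<String> value);
}

// BigtableIO.read() accepts ValueProvider parameters, so the values are resolved
// when the template runs rather than when it is built.
PCollection<com.google.bigtable.v2.Row> rows = pipeline.apply("ReadTable",
    BigtableIO.read()
        .withProjectId(options.getProjectId())
        .withInstanceId(options.getInstanceId())
        .withTableId(options.getTableId()));

The SourceLocation prefix can then be applied in a downstream step (for example a DoFn that keeps only rows whose key starts with the prefix), since calling get() on a ValueProvider is allowed at execution time inside a DoFn.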
UPDATE
The @Default.InstanceFactory annotation can be used to specify a user-provided factory that generates the default value for a parameter. With this, you could read the default value from a resource file inside your DefaultValueFactory implementation.
As an example, you can check how WindowedWordCount uses DefaultToCurrentSystemTime to annotate the minTimestampMillis parameter.
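A minimal sketch of that pattern, assuming the default is read from an application.properties file on the classpath (the class and property names below are my own placeholders, not taken from WindowedWordCount, and it is shown on a plain String option for simplicity; the Beam types come from org.apache.beam.sdk.options):

public static class SourceLocationDefaultFactory implements DefaultValueFactory<String> {
    @Override
    public String create(PipelineOptions options) {
        // Assumption: application.properties is bundled on the classpath of the template build
        try (InputStream in = SourceLocationDefaultFactory.class
                .getResourceAsStream("/application.properties")) {
            Properties props = new Properties();
            props.load(in);
            return props.getProperty("source.location");
        } catch (IOException e) {
            throw new RuntimeException("Unable to load default source location", e);
        }
    }
}

// On the options interface:
@Default.InstanceFactory(SourceLocationDefaultFactory.class)
String getSourceLocation();
void setSourceLocation(String value);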

Related

DynamoDb scan not working properly

I am trying to fetch items from DynamoDB. When I fetch a single item using the partition key everything works fine, but when I try to fetch all the items in DynamoDB using a scan I encounter an error. The following is what I am trying to do:
public List<PageCmsDomain> getAllPages() {
    DynamoDBUtil dynamoDBUtil = new DynamoDBUtil();
    AmazonDynamoDB dynamoDBClient = dynamoDBUtil.getDynamoDBClient();
    DynamoDBMapper mapper = new DynamoDBMapper(dynamoDBClient);
    List<PageCmsDomain> scanResult = mapper.scan(PageCmsDomain.class, new DynamoDBScanExpression());
    return scanResult;
}
The following error occurs when I execute this code:
com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException: PageCmsDomain[restrictions]; could not unconvert attribute
at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperTableModel.unconvert(DynamoDBMapperTableModel.java:271)
at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper.privateMarshallIntoObject(DynamoDBMapper.java:456)
at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper.marshallIntoObjects(DynamoDBMapper.java:484)
at com.amazonaws.services.dynamodbv2.datamodeling.PaginatedScanList.<init>(PaginatedScanList.java:64)
at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper.scan(DynamoDBMapper.java:1458)
at com.amazonaws.services.dynamodbv2.datamodeling.AbstractDynamoDBMapper.scan(AbstractDynamoDBMapper.java:216)
at com.astro.ott.dynamodb.cmsapi.impl.PageCmsDaoImpl.getAllPages(PageCmsDaoImpl.java:74)
at com.astro.ott.cmsapi.impl.PageCmsGetServiceImpl.getPages(PageCmsGetServiceImpl.java:41)
at com.astro.ott.cms_api.cms_api.PageCmsTestCase.get(PageCmsTestCase.java:81)
Caused by: com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException: not supported; requires @DynamoDBTyped or @DynamoDBTypeConverted
at com.amazonaws.services.dynamodbv2.datamodeling.StandardModelFactories$Rules$NotSupported.get(StandardModelFactories.java:660)
at com.amazonaws.services.dynamodbv2.datamodeling.StandardModelFactories$Rules$NotSupported.get(StandardModelFactories.java:650)
at com.amazonaws.services.dynamodbv2.datamodeling.StandardModelFactories$AbstractRule.unconvert(StandardModelFactories.java:714)
at com.amazonaws.services.dynamodbv2.datamodeling.StandardModelFactories$AbstractRule.unconvert(StandardModelFactories.java:691)
at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTypeConverter$DelegateConverter.unconvert(DynamoDBTypeConverter.java:109)
at com.amazonaws.services.dynamodbv2.datamodeling.StandardTypeConverters$Vector$ToMap.unconvert(StandardTypeConverters.java:433)
at com.amazonaws.services.dynamodbv2.datamodeling.StandardTypeConverters$Vector$ToMap$1.unconvert(StandardTypeConverters.java:417)
at com.amazonaws.services.dynamodbv2.datamodeling.StandardTypeConverters$Vector$ToMap$1.unconvert(StandardTypeConverters.java:410)
at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTypeConverter$DelegateConverter.unconvert(DynamoDBTypeConverter.java:109)
at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTypeConverter$NullSafeConverter.unconvert(DynamoDBTypeConverter.java:128)
at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTypeConverter$ExtendedConverter.unconvert(DynamoDBTypeConverter.java:88)
at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperFieldModel.unconvert(DynamoDBMapperFieldModel.java:146)
at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperFieldModel.unconvertAndSet(DynamoDBMapperFieldModel.java:164)
at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperTableModel.unconvert(DynamoDBMapperTableModel.java:267)
... 32
I am using the same PageCmsDomain class when fetching a single item as well.
The specified code will work only if the restrictions field is null or an empty map. Otherwise it throws an exception, because DynamoDB does not allow saving or fetching objects that have a field of type Object (even as a generic type argument, in this case the map's value type Object); the field must have a specific type (e.g. String, Integer, Map<String, Integer>).
If the value type of the restrictions map can't be changed to a specific one,
you can fix this by adding the @DynamoDBTypeConvertedJson annotation to store a serialized value of the restrictions field:
@DynamoDBAttribute(attributeName = "restrictions")
@DynamoDBTypeConvertedJson
private Map<String, Object> restrictions;
Another option is to specify your own custom converter class and annotate the restrictions field with
@DynamoDBTypeConverted(converter = CustomConverter.class)
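For example, a minimal sketch of such a converter that serializes the map to a JSON string (the class name and the use of Jackson are my own assumptions, not part of the answer):

import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTypeConverter;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.IOException;
import java.util.Map;

public class RestrictionsConverter implements DynamoDBTypeConverter<String, Map<String, Object>> {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    @Override
    public String convert(Map<String, Object> restrictions) {
        try {
            // Store the map as a single JSON string attribute
            return MAPPER.writeValueAsString(restrictions);
        } catch (IOException e) {
            throw new IllegalArgumentException("Unable to serialize restrictions", e);
        }
    }

    @Override
    public Map<String, Object> unconvert(String value) {
        try {
            // Rebuild the map from the stored JSON string
            return MAPPER.readValue(value, new TypeReference<Map<String, Object>>() {});
        } catch (IOException e) {
            throw new IllegalArgumentException("Unable to deserialize restrictions", e);
        }
    }
}

The field would then be annotated with @DynamoDBTypeConverted(converter = RestrictionsConverter.class).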

How to query based on one-to-many relationships in Spring Data Neo4j

I have defined a simple one-to-many relationship as:
@Relationship(type = "BELONG")
private Set<Category> categories;
I want to query all the objects based on the exact set of Category, i.e. implement something like:
Page<SomeObject> findAllByCategories(Set<Category> categories, PageRequest page);
What is the best way to do it in Spring Data Neo4j?
You have a couple of options on how you can do this.
Your calling code can simply call the following on your SomeObjectRepository:
Iterable<Long> ids = categories.stream().map(Category::getId).collect(Collectors.toSet());
Iterable<SomeObject> someObjects = someObjectRepository.findAll(ids, 1);
Note this method does not support paging though.
Another way is to inject a Session into your calling code and either write a custom Cypher query to retrieve the objects you want or use the load all by instance method:
#Autowired
private Session session;
...
// Option 1: write some cypher
Map<String, Object> params = new HashMap<>();
params.put("ids", ids); // use a stream like above to collect the ids
Iterable<SomeObject> someObjects = session.query(SomeObject.class, "MATCH (n:SomeObject) ...", params);
//Option 2: use the load by instances method which also allows pagination:
Collection<SomeObject> someObjects = session.loadAll(categories, new Pagination(0, 25));
My recommendation would be option 2 as it pretty much does exactly what you want.
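As a rough illustration of the custom-Cypher route (the query below is my own sketch, assuming the BELONG relationship from the question and that Category exposes getId()):

Iterable<Long> categoryIds = categories.stream().map(Category::getId).collect(Collectors.toSet());

Map<String, Object> params = new HashMap<>();
params.put("ids", categoryIds);

// Finds SomeObject nodes that BELONG to any of the given categories;
// the parameter placeholder syntax ({ids} vs $ids) depends on your Neo4j/OGM version.
Iterable<SomeObject> someObjects = session.query(SomeObject.class,
    "MATCH (s:SomeObject)-[:BELONG]->(c:Category) WHERE id(c) IN {ids} RETURN s", params);

Note this matches objects linked to any of the given categories; matching the exact set would need an additional check in the Cypher (for example comparing the node's total BELONG count with the size of the parameter list).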

How to add a list property to an entity using gcloud Java client?

I can set a property on a new entity:
Entity.Builder builder = Entity.builder(actKey);
builder.set("name", someName);
I can see a method to add a list as a property:
List<Value<String>> aliases = new ArrayList<>();
builder.set("aliases", aliases);
I cannot find, however, how to create this Value<String>. There is a DatastoreHelper.makeValue() method in DatastoreV1, but it creates a different Value object.
Looking at the source code for gcloud, the answer is this:
ListValue.Builder aliases = ListValue.builder();
while (someIterator.hasNext()) {
    aliases.addValue(StringValue.builder("some string").build());
}
builder.set("aliases", aliases.build());

Writing a custom condition in WSO2 CEP with left and right argument

I want to extend the WSO2 CEP product for our needs and am trying to write a custom condition as indicated in this official WSO2 CEP link.
I was able to write an extension class that extends "org.wso2.siddhi.core.executor.conditon.AbstractGenericConditionExecutor" and implements its abstract method, as shown below:
@SiddhiExtension(namespace = "myext", function = "startswithA")
public class StringUtils extends
        org.wso2.siddhi.core.executor.conditon.AbstractGenericConditionExecutor {

    static Log log = LogFactory.getLog(StringUtils.class);

    @Override
    public boolean execute(AtomicEvent atomicEvent) {
        log.error("Entered the execute method");
        log.error("Atomic event to string: " + atomicEvent.toString());
        return true;
    }
}
When I use this extension method as:
from allEventsStream[myext:startswithA(name)]
insert into selectedEventsStream *;
In this situation, I want the startswithA method to return true if the name field starts with 'A'. However, when I run this query in CEP, the whole event is passed into my execute function, i.e. there is no indication that the "name" field is being sent to the startswithA method as an argument.
How can I tell which field of the stream is sent to my extension method as an argument?
Also, I want to write conditions like
from allEventsStream[myext:startswith('A', name)]
insert into selectedEventsStream *;
How can I achieve this?
In 'AbstractGenericConditionExecutor' there's another method that gives you the set of expression executors included in the parameters when the executor is instantiated:
public void setExpressionExecutors(List<ExpressionExecutor> expressionExecutors)
You don't necessarily have to override this method and store the list; it is already stored in 'AbstractGenericConditionExecutor' as a list named expressionExecutors. You can pass the event to these executors to retrieve the relevant values from the event, in order.
For example, if you include a variable (like 'name') in the query as the parameter at index 0, you'll get a 'VariableExpressionExecutor' at index 0 in the list that will fetch the value of that variable from the event. Similarly, for a constant like 'A', you'll get a different executor that will give you the value 'A' when called.
To add to Rajeev's answer, if you want to filter all the names that start with 'A', you can override the execute method of your custom Siddhi extension similar to the following segment.
@Override
public boolean execute(AtomicEvent atomicEvent) {
    if (!this.expressionExecutors.isEmpty()) {
        String name = (String) this.expressionExecutors.get(0).execute(atomicEvent);
        if (name.startsWith("A")) {
            return true;
        }
    }
    return false;
}
When writing the query, it would be similar to
from allEventsStream[myext:startswithA(name)]
insert into filteredStream *;
You can extend this behaviour to achieve an extension that supports
from allEventsStream[myext:startswith('A', name)]
type queries as well.
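A rough sketch of that two-argument form, following the same pattern (and assuming, per the explanation above, that executor 0 yields the constant prefix and executor 1 yields the name variable):

@Override
public boolean execute(AtomicEvent atomicEvent) {
    if (this.expressionExecutors.size() >= 2) {
        // Parameter order follows the query: startswith('A', name)
        String prefix = (String) this.expressionExecutors.get(0).execute(atomicEvent);
        String name = (String) this.expressionExecutors.get(1).execute(atomicEvent);
        return name != null && name.startsWith(prefix);
    }
    return false;
}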
HTH,
Lasantha

Adding SignedDataObjects (and consequently add proofOfApproval property) to an enveloped signature

I'm creating an enveloped signature with xades4j using the following statements:
Element elemToSign = doc.getDocumentElement();
XadesSigner signer = new XadesTSigningProfile(...).newSigner();
new Enveloped(signer).sign(elemToSign);
But I also need to put other properties in the signature, such as proofOfApproval, etc.
I see that in the xades4j examples the proofOfApproval properties are added to an enveloped signature using a different set of signing statements, for example:
AllDataObjsCommitmentTypeProperty globalCommitment = AllDataObjsCommitmentTypeProperty.proofOfApproval();
CommitmentTypeProperty commitment = CommitmentTypeProperty.proofOfCreation();
DataObjectDesc obj1 = new DataObjectReference('#' + elemToSign.getAttribute("Id"))
    .withTransform(new EnvelopedSignatureTransform())
    .withDataObjectFormat(new DataObjectFormatProperty("text/xml", "MyEncoding")
        .withDescription("Isto é uma descrição do elemento raiz")
        .withDocumentationUri("http://doc1.txt")
        .withDocumentationUri("http://doc2.txt")
        .withIdentifier("http://elem.root"))
    .withCommitmentType(commitment)
    .withDataObjectTimeStamp(dataObjsTimeStamp);
SignedDataObjects dataObjs = new SignedDataObjects(obj1)
    .withCommitmentType(globalCommitment);
signer.sign(dataObjs, elemToSign);
I see here that a different signing procedure is used. More specifically, the statement where I create a DataObjectReference from the "Id" attribute of the root tag is unusable for me, because the input can be any kind of XML document and I cannot know which attribute (if present) I can use to identify the root tag.
Briefly, can I have some example code where I create an enveloped signature and add a proofOfApproval property using "new Enveloped(signer).sign(elemToSign);", or anyway without knowing the structure of the source XML?
Thanks
M.
The proofOfApproval property has to be applied to the data objects being signed, hence the need to use the SignedDataObjects class.
The Enveloped class is just a helper for straightforward scenarios. If I understood correctly, you want to sign the whole XML document. The XML-Signature spec defines that an empty URI on a reference (URI="") means exactly that. If you check the code of the Enveloped class, you'll see that it adds a DataObjectReference with an empty URI.
To sum up, you'll need something like:
DataObjectDesc obj1 = new DataObjectReference("")
    .withTransform(new EnvelopedSignatureTransform())
    .withCommitmentType(CommitmentTypeProperty.proofOfApproval());
signer.sign(new SignedDataObjects(obj1), elemToSign);