PostgREST Transactions - postgrest

I am using PostgREST to expose DB entities to a Spring Boot app which consumes them.
I have two entities inside my DB: Person and City.
I would like to save the Person entity and the City at the same time; if either of the two fails, I would like the other one to not persist through PostgREST.
I would like to achieve transactional behaviour, but on PostgREST. Is there any way to achieve this natively with the tool, without programmatically deleting the just-created record on exception?

You could create a PL/pgSQL procedure that receives the data for City and Person, inserts into both tables, and rolls back if anything fails.
Here are the docs with some examples:
https://www.postgresql.org/docs/11/plpgsql-transactions.html
PostgREST will expose it at the /rpc/{function_name} endpoint.
Here are the docs:
https://postgrest.org/en/stable/api.html#stored-procedures
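A minimal sketch of the idea, written as a plain function (PostgREST wraps each request in a single transaction, so an error raised by either insert rolls the whole request back); table and column names are hypothetical:

CREATE OR REPLACE FUNCTION save_person_and_city(person_name text, city_name text)
RETURNS void AS $$
DECLARE
  new_city_id int;
BEGIN
  -- PostgREST runs each request in one transaction, so if either insert
  -- raises an error the whole thing rolls back and nothing persists.
  INSERT INTO city (name) VALUES (city_name)
  RETURNING id INTO new_city_id;
  INSERT INTO person (name, city_id) VALUES (person_name, new_city_id);
END;
$$ LANGUAGE plpgsql;

You would then call it with POST /rpc/save_person_and_city and a JSON body like {"person_name": "John", "city_name": "London"}; PostgREST matches the JSON keys to the function's named arguments.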

Related

WSO2 DAS SiddhiQL : Dynamic event table / persist event stream

I would like to know if WSO2 Data Analytics Server allows defining dynamic event tables or dynamic streams.
For example, imagine one event represents a car, and in this event, an attribute is the 'brand' of the car (Ford, Mercedes, Audi ...).
I would like to add a column each time there is a new, different brand, so my table would have one column per brand seen so far.
Thus, if I receive an event with the brand 'Toyota', a 'Toyota' column would be added to my table.
Considering that I don't know in advance the number of different brands I will receive, I need this to be dynamic.
Dynamically changing the schema of an event table is not possible.
This is because the schema of the event table is defined when the Siddhi execution plan is deployed. Once it is deployed, the schema cannot be changed.
On the other hand, it looks like what you need here is not an event table.
Perhaps what you need is to update the schema of an RDBMS table when a certain event happens (for example, when a car event arrives with a new brand). Do you use this updated table in your Siddhi execution plan? If you do not use it, then you do not need an event table.
Please correct me if I have misunderstood your requirement.
If your requirement is to update the schema of a table when a certain event happens, then you might need to write a custom event publisher to do that. If so, please refer to the document on the same topic: Building Custom Event Publishers.

Create a doctrine repository class without an entity

I'm currently in a situation where I need to create a repository class that would contain multiple financial statistics queries. The queries are not exactly tied to one entity but rather to multiple entities, and will select specific data from the database based on various conditions.
Having said that, I'm looking for a way to create a repository class (i.e. StatisticsRepository) which is not associated with an entity at all, so I could store the queries there. Simply creating that repository doesn't seem to work. I'm guessing I probably need to create a service of some kind that loads this repo class? Is this correct, and if so, is there an example I'm missing in the Symfony/Doctrine docs?
You can just create a class like StatisticsService/StatisticsFinder (the naming is up to you).
That service should have the entity manager injected, so define that in your service config.
Create a query builder inside that service, then simply fetch and return the results.

Cross-service references in DB

I am building a service-oriented system with multiple services and applications.
Currently I am not sure how to handle DB references between resources from multiple services and databases.
For example, I have a users service, where I can define all users and their roles.
Next, I have a products service, where I can define my products, their prices and other information.
I also have an invoicing service, which is used to create invoices. This service will use information from the previous two services: it will link products and users to an invoice. Now I am not sure what the best approach for this is.
Do I just save the product ID and user ID that I got from the other two services, without any referential integrity?
If I do this, then I will have a problem when generating reports, because at generation time I will need to send a lot of requests to the products service to get the names and prices of the products on an invoice. The same goes for users.
Or do I create a products table in my invoicing application, and store the name and price of the product at the moment of invoice creation?
If I go with this approach, then if the price or name of a product changes, I will have inconsistent data across my applications.
Is there some well-known pattern for this kind of problem? In other words, what is the best solution?
Cross-service references in the DB are a common data-integrity challenge between multiple web services, especially when real-time access is needed.
There are two approaches for your case:
1- Database replication across your servers
I suppose that you have each application hosted on a separate server, so I will call your servers Users_server, Products_server and Invoices_server.
In your example, your invoice web service needs to grab data from the Users and Products servers; in this case you can create replicas of your Users database and Products database on your Invoices_server.
This way you can run your join queries on the same server and get data from multiple databases.
Query example:
SELECT *
FROM UsersDB.dbo.[User] u
JOIN InvoicesDB.dbo.Invoice i ON u.Id = i.ClientId
2- Main database replication
As a first step, you replicate all your databases onto one main server, call it Base_server, which basically contains all the databases from all your services.
Then you can build an internal web service for your application that provides the needed data in just one call; this answers your question about generating reports.
In other words, you will make one call to the main Base service instead of making 2 or 3 calls to your separate services.
Note: as backend developers we use this organization as a best practice when building a large bundle-based application: we create a base bundle and then create service bundles which rely on it.
If your services are already live, we may need more details about the technology and the database types you are using in order to give you a more accurate solution.
Just because you are using SOA doesn't mean you abandon database integrity. Continue to use referential integrity where your database design requires it.
At the service level, you can have each service be responsible for returning identity information for the entities which it owns. This identity information may or may not be the actual primary key from the database, but it will be used by the clients of the service as though it were the actual primary key.
When a client wants to create an invoice, it will call the User service and receive a User entity, which will contain a User Identifier. It will call the Product service and receive a set of products, each with a product identifier. It will then call the Invoice service to create an invoice, passing the user identifier and the product identifiers. This will likely return an invoice identifier.
You can (and probably should) enforce the integrity by making productId and userId foreign keys in your invoice table. Then your DB makes sure the referenced entities exist. Reports should join tables, not query services for each item. I assume a central DB shared across the system.
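For illustration, a minimal sketch in SQL of that last suggestion (table and column names are hypothetical, and a real invoice would normally keep its products in a separate line-items table):

CREATE TABLE Invoice (
    Id int PRIMARY KEY,
    -- the DB rejects any invoice whose user or product does not exist
    UserId int NOT NULL REFERENCES Users (Id),
    ProductId int NOT NULL REFERENCES Products (Id)
);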

Inserting records in CRM 2011 using SSIS

I am working on an SSIS (2012) package that collects data from our till system into a staging area, and from the staging area into CRM 2011 (on-premise, Rollup 11).
In CRM we have a contact entity and an order entity. These two entities are related via a GUID: contactid (PK in contact) and customerid (FK in order).
When I insert a new order into CRM, how do I ensure that the GUID is created to associate that order with either a new contact or an already existing contact?
I'm assuming that since you're using SSIS, you're doing straight SQL inserts? If so, this is not supported. Ideally you'd be using the SDK, and in that case you can set the GUID manually before actually creating the record, although the contact ID still has to exist when creating the order.
So you'll want to grab all of your existing contacts up front, then determine for each order whether the contact exists or not. If it does, just set the customerId when you create the order and you're all set. If it doesn't, you'll need to create the contact (potentially assigning it an ID) and then set the customerId when you create the order.
I would echo what Daryl has said in that SQL inserts are not supported and generally a bad idea. However, there is a solution: a company called KingswaySoft makes an SSIS component that allows you to read from and write to CRM using the web services. The best part is that it is free if you don't want to run it using SQL Agent. Even if you do want to schedule it, the cost is very small for such an excellent product.
You can download it from here:
http://www.kingswaysoft.com/products/ssis-integration-toolkit-for-microsoft-dynamics-crm

How to monitor database updates from application?

I work with a SQL Server database from C++ via ODBC. I want to detect modifications in some tables of the database: another application inserts or updates rows, and I have to detect all these modifications. It does not have to be an immediate trigger; it is acceptable to poll the database tables periodically for modifications.
Below is the way I think this can be done; I need your opinions on whether this is the standard/right way of doing it, or whether better approaches exist.
What I've thought of is this: I add triggers in SQL Server which, on any modification, insert the identifiers of modified/added rows into special tables that I will check periodically from my application. Suppose there are 3 tables: Customers, Products, Services. I will make three additional tables, Change_Customers, Change_Products and Change_Services, and insert the identifiers of modified rows of the respective tables. Then I will read these Change_* tables from my application periodically and delete the processed records.
Now, if you agree that the above solution is right, I have another question: is it better to have separate Change_* tables for each of the tables I wish to monitor, or is it better to have one fat Changes table which contains the changes from all tables?
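For concreteness, a minimal sketch of one such trigger in T-SQL (the Change_Customers layout and the CustomerId column are hypothetical):

CREATE TABLE Change_Customers (
    CustomerId int NOT NULL,
    ChangedAt datetime NOT NULL DEFAULT GETDATE()
);
GO
CREATE TRIGGER trg_Customers_Change
ON Customers
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    -- "inserted" holds new/updated rows, "deleted" holds old/removed rows
    INSERT INTO Change_Customers (CustomerId)
    SELECT CustomerId FROM inserted
    UNION
    SELECT CustomerId FROM deleted;
END;
GO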
Query Notifications is the technology designed to do exactly what you're describing. You can leverage Query Notifications from managed clients via the well-known SqlDependency class, but there are native OLE DB and ODBC ways too. See Working with Query Notifications, the paragraphs about SSPROP_QP_NOTIFICATION_MSGTEXT (OLE DB) and SQL_SOPT_SS_QUERYNOTIFICATION_MSGTEXT (ODBC). See The Mysterious Notification for an explanation of how Query Notifications work.
This is the only polling-free solution that works with any kind of update. Triggers and polling for changes have severe scalability and performance issues. Change Data Capture and Change Tracking really cover a different topic (synchronizing datasets for occasionally connected devices, e.g. Sync Framework).
Change Data Capture (CDC): http://msdn.microsoft.com/en-us/library/cc645937.aspx
First you will need to enable CDC on the database:
USE db_name
GO
EXEC sys.sp_cdc_enable_db
GO
Then enable CDC on each table you want to track with sys.sp_cdc_enable_table:
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name = N'Customers',  -- e.g. the Customers table from the question
    @role_name = NULL
GO
Then you can query the changes.
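For example, a minimal sketch of reading all captured changes for that table (the capture instance name dbo_Customers follows CDC's default schema_table naming; CDC itself requires SQL Server 2008 or later):

DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('dbo_Customers');
DECLARE @to_lsn binary(10) = sys.fn_cdc_get_max_lsn();
-- N'all' returns one row per net insert/update/delete in the LSN range
SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_Customers(@from_lsn, @to_lsn, N'all');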
If your version of SQL Server is 2005, you may use Notification Services.
If your SQL Server is 2008+, the preferable way is to use triggers to log changes to log tables and periodically poll these tables from the application to see the changes.
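A minimal sketch of that polling step, reusing the hypothetical Change_Customers table sketched in the question above (DELETE ... OUTPUT returns each row as it removes it, so every change is consumed exactly once):

-- read and clear the change table in one atomic statement
DELETE FROM Change_Customers
OUTPUT deleted.CustomerId, deleted.ChangedAt;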