Good day guys,
I need your opinion on this problem. Although I'm using Django for my project, I'm sure this problem is not tied to Django alone. I am working on a services booking system. In my database I have the 3 tables listed below:
User_Table with fields
• Id
• Username
• Fullname
Services_Table with fields
• Id
• name
• Price
Transaction_Table with fields
• Id
• User_id
• Services_id (many to many relationship)
When a service gets booked, I write it to the transaction table using user_id and services_id as foreign keys to the User Table and Services Table, meaning it's the id values that are saved.
When a client wants to view his or her transaction history, I provide it by running queries like:
price = transaction.service.price
service_name = transaction.service.name
total_cost = sum of all services selected
so as not to present the user with raw id values for the price and service name.
Now here is my problem: if, in the future, the admin decides to change the name and price of a service and the client goes back to view his old transaction log, the new values get shown, because I referenced them by id. That is not what I want; I want the client to see the old values, as on a receipt, even after I update the services table.
What do you suggest I do in this case?
You should record every transaction made, including the price and the amount it totalled up to at the moment the transaction was made. The Transaction model should have fields to record every detail about the transaction.
This means:
You would have a txn_service table, where all services in a transaction are saved and linked to the transaction table.
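For example, a minimal Django sketch of that structure might look like this (model and field names are illustrative, not taken from your project):

from django.conf import settings
from django.db import models


class Service(models.Model):
    name = models.CharField(max_length=100)
    price = models.DecimalField(max_digits=10, decimal_places=2)


class Transaction(models.Model):
    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.PROTECT)
    total_cost = models.DecimalField(max_digits=10, decimal_places=2)  # sum at booking time
    created_at = models.DateTimeField(auto_now_add=True)


class TransactionService(models.Model):
    # One line item per booked service; copies name and price so later
    # edits to the Services table never change an old receipt.
    transaction = models.ForeignKey(Transaction, related_name="items", on_delete=models.CASCADE)
    service = models.ForeignKey(Service, null=True, on_delete=models.SET_NULL)
    service_name = models.CharField(max_length=100)
    price = models.DecimalField(max_digits=10, decimal_places=2)

When a booking is saved you copy Service.name and Service.price into the line item and compute total_cost once, so the history reads like a receipt even if the admin later renames or reprices a service.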
I am working in DynamoDB for the first time. My assignment is a Ticket Management System with 3 entities: Department, User and Ticket. The relationship between each entity is as follows.
I have identified the following access patterns:
Fetch a Department.
Fetch all users in Department
Fetch a given user in Department
Fetch all Tickets belonging to the Department
Fetch all Tickets assigned to the User
For these I defined the following data model. I am thinking of creating a GSI with Ticket as PK and User as SK to handle 4 & 5.
At a higher level I need to perform 2 updates: I can update the User to which the ticket is assigned, and I can update the ticket status to in progress or resolved. In the table I have the Ticket details as a JSON object as below.
I need help from experienced people on whether my understanding and approach are efficient.
I think you're on the right track. I'd design it as a table with two Global Secondary indexes. The base table looks like this:
The first Global Secondary Index like this (GSI1):
The second Global Secondary Index like this (GSI2):
Now for the why:
This design allows you to easily update the following things:
A user's department
A ticket's status if you know the ticket Id
A ticket's user if you know the ticket Id
A ticket's department if you know the ticket Id
You can get a bunch of information from this model:
Fetch a Department.
Query the base table with the department name or list all departments
Fetch all users in Department
Query GSI 1 with the Department Name and filter the sort Key using begins_with = USER#
Fetch a given user in Department
Sounds like you know the UserId, so do a GetItem on the base table. If that's not the case, do the query mentioned in "Fetch all users in Department".
Fetch all Tickets belonging to the Department
Query GSI 1 with the department name as the PK and filter the SK using begins_with = TICKET#
Fetch all Tickets assigned to the User
Query GSI 2 with the user id as the PK and filter the SK using begins_with = TICKET#
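As a rough sketch of how two of those queries could look with boto3 (the key attribute names PK/SK, GSI1PK/GSI1SK, GSI2PK/GSI2SK and the table name are assumptions, since the table layouts above aren't reproduced here):

import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource("dynamodb").Table("TicketSystem")  # assumed table name

# Fetch all users in a department: GSI1 with the department as partition key
# and an entity-prefixed sort key.
users = table.query(
    IndexName="GSI1",
    KeyConditionExpression=Key("GSI1PK").eq("DEPARTMENT#Finance")
    & Key("GSI1SK").begins_with("USER#"),
)["Items"]

# Fetch all tickets assigned to a user: GSI2 with the user id as partition key.
tickets = table.query(
    IndexName="GSI2",
    KeyConditionExpression=Key("GSI2PK").eq("USER#42")
    & Key("GSI2SK").begins_with("TICKET#"),
)["Items"]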
I have a column named Company ID that contains various companies with different employees in them. I need a DAX query that will give data according to the Company ID. Suppose there are 3 employees inside a company and each of them has Company ID 1; then they should be able to see each other's reports, but they should not be able to see the reports of Company ID 2 and 3. How can this be achieved?
I know I can do this by creating different roles for each Company ID, but how can this be achieved if I want this built into one particular role?
What you are asking is to implement Dynamic Row Level Security.
Model:
User Table: table that contains user details along with the field on which we will apply security (here, the email field).
Company Table: table containing company data.
User Company Bridge: bridge table that contains permission details, for example user x is a member of company y and z.
Company Data Table: measures or transaction information of the company that is to be filtered.
Defining RLS (Row Level Security):
In Modelling -> Manage Roles, create a new role on the Email column of the User Table with this DAX expression, which returns the email ID of the logged-in user:
[Email] = userprincipalname()
Finalizing:
Go to PowerBI Service -> Dataset -> Security and add users to the roles created.
To test the implementation:
Go to Modelling tab of pbix file.
Click on View As Roles.
Check the Other user checkbox, enter an email ID, and also check the Profile checkbox.
Now you can see the data filtered.
In this manner it becomes easy to maintain roles and security by just modifying the bridge table that stores all permission details.
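If you prefer to express the permission check as a single DAX rule on the Company table (instead of relying only on filter propagation from the User table), a sketch using the illustrative table and column names above would be:

Company[CompanyID]
    IN CALCULATETABLE (
        VALUES ( 'User Company Bridge'[CompanyID] ),
        'User Table'[Email] = USERPRINCIPALNAME ()
    )

Either way, the bridge table remains the single place where permissions are granted or revoked.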
I am using a Django backend with postgresql.
Let's say I have a database with a table called Employees with about 20,000 records.
I need to allow multiple users to edit and verify the Area Code field for every record in Employees.
I'd prefer to allow a user to view the records, say, 30 at a time (to reduce burnout).
How can I select 30 records at a time from Employees to send to the front end UI for editing, without letting multiple users edit the same records, or re-selecting a record that has already been verified?
I don't need comments on the content of the database (these are example table and field names).
One way to do this would be to add 2 more fields to your table, say for example assigned_to and verified. You can update assigned_to, which can be a foreign key to the verifying user, when you allow the user to view that Employee. This will create a record preventing the Employee from being chosen twice. assigned_to can also double as a record of who verified this Employee for future reference.
verified could simply be a Boolean field that keeps track of whether the Employee has already been verified; it can be updated when the user confirms the verification.
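As a sketch, those two fields could look like this on the model (assuming a model named Employee and Django's built-in user model):

from django.conf import settings
from django.db import models


class Employee(models.Model):
    area_code = models.CharField(max_length=10)  # the field being verified
    # Who this record is currently assigned to (and who verified it, for reference).
    assigned_to = models.ForeignKey(
        settings.AUTH_USER_MODEL, null=True, blank=True, on_delete=models.SET_NULL
    )
    # Flipped to True when the user confirms the verification.
    verified = models.BooleanField(default=False)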
The actual selects can be done like this:
employees = Employee.objects.filter(assigned_to=None, verified=False)[:30]
Then
for emp in employees:
    emp.assigned_to = user
    emp.save()
Note: this can still potentially cause a race condition if 2 users make this request at exactly the same time. To avoid this, another possibility could be to partition the employee table into groups for each user with no overlap. This would ensure that no 2 users would ever have the same employees.
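A different way to close that race window, not mentioned above but available since you are on PostgreSQL, is to claim the rows inside a transaction with row locks; a sketch:

from django.db import transaction

with transaction.atomic():
    # skip_locked=True makes concurrent requests skip rows another user is claiming.
    employees = list(
        Employee.objects.select_for_update(skip_locked=True)
        .filter(assigned_to=None, verified=False)[:30]
    )
    for emp in employees:
        emp.assigned_to = user
        emp.save()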
I create a ticket for a customer from an LDAP data source. The customer_id and customer_user_id columns in the ticket table look like this:
customer_id customer_user_id
zhangjq#**.com zhangjq
I think this is the foreign key which OTRS uses to associate this ticket with a customer in LDAP. But even after I update the values of the customer_id and customer_user_id columns, the customer info is still displayed correctly on the ticket view page:
Customer Information
Firstname: Junqian
Lastname: Zhang
Login: zhangjq
Email: zhangjq#**.com
Comment: ##2012.09.03
All of this information is read from LDAP.
So, how does OTRS save the relation between a ticket and an LDAP customer in the DB table? Or does OTRS have another way to manage the relation between tickets and LDAP customers?
When setting up your LDAP connection for CustomerUser, you can define a CustomerKey. This is the key OTRS uses to uniquely identify a customer.
Even after I update the values of the customer_id and customer_user_id columns, the customer info is still displayed correctly on the ticket view page
What exactly are you doing? If you are modifying the SQL database directly, this won't have any impact on the data displayed within OTRS, as this data is cached.
If you are making changes in your LDAP, this won't change the displayed customer information either, as this data only gets updated when you (re-)assign a customer to a ticket.
I have an application that allows "contacts" to be completely customized. My method of doing that is letting the administrator set up all of the fields allowed for a contact. My database is as follows:
Contacts
id
active
lastactive
created_on
Fields
id
label
FieldValues
id
fieldid
contactid
response
So the contacts table only holds whether they are active and their identifier; the fields table only holds the label of the field and its identifier; and the fieldvalues table is what actually holds the data for contacts (name, address, etc.).
This setup has worked just fine for me up until now. The client would like to be able to pull a cumulative report, say a count of contacts per city grouped by state. Effectively the data would have to look like the following:
California (from fields table)
    Costa Mesa (from fields table)  5 (counted in fieldvalues table)
    Newport  2
Connecticut
    Wallingford  2
    Clinton  2
    Berlin  5
The state field might be id 6 and the city field might be id 4. I don't know if I have just been looking at this code way too long to figure it out or what.
The SQL to create those three tables can be found at https://s3.amazonaws.com/davejlong/Contact.sql
You've got an Entity Attribute Value (EAV) model. Use the field and fieldvalue tables for searching only, i.e. the WHERE clause. Then make life easier by keeping the full entity's data in a CLOB on the main table (e.g. Contacts.data) in a serialized format (WDDX is good for this). Read the data column out, deserialize, and work with it on the server side. This is much easier than the myriad of joins you'd need to do otherwise to reproduce the fully hydrated entity from an EAV setup.
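To illustrate the serialized-blob part in Python, with JSON standing in for WDDX and an assumed data column on Contacts (the column and function names here are made up for the example):

import json

def save_contact_blob(cursor, contact_id, fields):
    # Keep writing FieldValues rows for searching; additionally store the
    # fully hydrated contact as one serialized blob on the Contacts row.
    cursor.execute(
        "UPDATE Contacts SET data = %s WHERE id = %s",
        (json.dumps(fields), contact_id),
    )

def load_contact_blob(cursor, contact_id):
    # Read the blob back and deserialize it; no joins against Fields/FieldValues needed.
    cursor.execute("SELECT data FROM Contacts WHERE id = %s", (contact_id,))
    (blob,) = cursor.fetchone()
    return json.loads(blob)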