Laravel 5 checkout form insert foreign key into transaction table

Right now I'm studying the flow of an e-commerce site using Laravel 5.0 and the Crinsane shopping cart package.
I have set up two tables: transactions and orders.
The relationship is that an order has many transactions (one transaction per type of item), and a transaction belongs to an order.
So the transactions table has a foreign key order_id which references the orders table's id.
In routes I set Route::post('checkout', 'OrderController@checkoutpost');
public function checkoutpost()
{
    // Get the input from the checkout form
    $input = Request::all();
    // Insert the form data into the orders table
    Order::create($input);
    // Retrieve the cart session data and insert it into the transactions table
    $formid = str_random();
    $cart_content = Cart::content();
    foreach ($cart_content as $cart) {
        $transaction = new Transaction();
        $products = Product::find($cart->id);
        $transaction->product_id = $cart->id;
        $transaction->form_id = $formid;
        $transaction->qty = $cart->qty;
        $transaction->total_price = $cart->price * $cart->qty;
        // Here is the problem: how do I assign $transaction->order_id
        // to the "id" of the order that was just inserted above?
        $transaction->order_id = $orders;
        $transaction->save();
        Cart::destroy();
        return redirect('product/checkout');
    }
}
The problem is: how do I assign order_id the id of the order that was just inserted?
Any feedback is really appreciated, thank you.

Firstly, when creating the Order you need to assign the return value:
// An instance of Order is returned, so the id is accessible.
$order = Order::create($input);
Then you can use:
// The newly created order's id can now be assigned as the transaction's foreign key.
$transaction->order_id = $order->id;

Have you tried AvoRed? It's an almost fully featured e-commerce package for Laravel. If you like it, give it a try and let me know if you have any feedback.
AvoRed An Laravel E commerce

Related

save() not updating the table in database

I have to update two tables in the database every time a user visits a page.
This is working:
order_order = OrderTable.no_objects.get(order_number=order_id)
order_order.status = 'updated status'
order_order.save()
This is not working (it is related to the first table through a foreign key):
order_item = ItemsTable.objects.filter(order_id__order_number=order_id)
for i in range(order_item.count()):
    order_item[i].qty = i + 5
    order_item[i].qty_nextMonth = i + 30
    order_item[i].save()
Can anyone tell me what's wrong in the second part of the code? It's not updating the database.
Each time you write order_item[i], the queryset performs a separate fetch and returns a fresh ItemsTable instance from the database. Setting the .qty = … attribute on that instance is effectively lost, because the next order_item[i] triggers a new fetch. Your order_item[i].save() does update the record in the database, but it writes back the values that were just fetched, not the ones you assigned.
It is better to simply iterate over the queryset, and thus maintain a reference to the same ItemsTable object:
order_items = ItemsTable.objects.filter(order_id__order_number=order_id)
for i, order_item in enumerate(order_items):
    order_item.qty = i + 5
    order_item.qty_nextMonth = i + 30
    order_item.save()
This is more efficient as well, since enumerating over a queryset forces evaluation, and hence you fetch all objects at once.
As of Django 2.2, you can use .bulk_update(…) [Django-doc] to update an iterable of objects in bulk.
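For example, a rough sketch of that bulk_update() approach (untested), using the model and field names from the question:
order_items = list(ItemsTable.objects.filter(order_id__order_number=order_id))
for i, order_item in enumerate(order_items):
    order_item.qty = i + 5
    order_item.qty_nextMonth = i + 30
# Write all the changes back in bulk instead of issuing one query per object.
ItemsTable.objects.bulk_update(order_items, ['qty', 'qty_nextMonth'])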

how to display different price for same product in opencart?

I want to display different prices for the same product. Multiple sellers will be selling the same product at their own prices, and that is what I want to display when someone views a product...
If I understand you correctly, it's not a coding case.
1 - Go to Admin/Sales/Customers/Customer Groups and create as many groups as you want.
2 - Go to Admin/Catalog/Products and edit an existing product or create a new one. In the Special tab you can assign different prices for each group created in step 1.
My first thought is to make them separate products; however, you might want to display a product on a single page with a list of sellers to choose from, in which case...
The different sellers are product options!
The way I have set this up is by adding fields for them in the SQL products table such as price_a, price_b, price_c and then adding another field to the customers table called price_category with the relevant prefix (A,B,C). Then I wrote a function under getProduct (catalog/model/catalog/product.php) to cater for this.
The reason I took this route is because my files are uploaded automatically to the table and links to another program which generates invoices and sends the result back to the website automatically.
My Function is as follows:
if ($query->rows) {
    foreach ($query1->rows as $row) {
        $price_category = strtolower($row['price_category']);
        $debtor_class = $row['debtor_class'];
        $price_percentage = $row['price_percentage'];
    }
} else {
    $price = ($query->row['discount'] ? $query->row['discount'] : $query->row['price']);
    $special = $query->row['special'];
}
$product_special_query = $this->db->query("SELECT price, to_qty, bonus_qty FROM product_special WHERE debtor_class = '".$debtor_class."' AND product_id = '".(int)$product_id."' AND customer_group_id = '".(int)$customer_group_id. "'");

django annotate count filter

I am trying to count daily records for some model, but I would like the count to include only records whose FK field = xy, so that I get a list of the days on which any new record was created, where some days may return 0.
class SomeModel(models.Model):
    place = models.ForeignKey(Place)
    note = models.TextField()
    time_added = models.DateTimeField()
Say there's a Place with name="NewYork".
data = SomeModel.objects.extra({'created': "date(time_added)"}).values('created').annotate(placed_in_ny_count=Count('id'))
This works, but it counts all records, for all places.
I tried filtering, but then it does not return the days on which there was no record with place.name="NewYork", and that's not what I need.
It looks as though you want to know, for each day on which any object was added, how many of the objects created on that day have a place whose name is New York. (Let me know if I've misunderstood.) In SQL that needs an outer join:
SELECT m.id, date(m.time_added) AS created, count(p.id) AS count
FROM myapp_somemodel AS m
LEFT OUTER JOIN myapp_place AS p
ON m.place_id = p.id
AND p.name = 'New York'
GROUP BY created
So you can always express this in Django using a raw SQL query:
for o in SomeModel.objects.raw('SELECT ...'):  # query as above
    print('On {0}, {1} objects were added in New York'.format(o.created, o.count))
Notes:
I haven't tried to work out whether this is expressible in Django's query language; it may be (see the sketch after these notes), but as the developers say, the database API is "a shortcut but not necessarily an end-all-be-all".
The m.id is superfluous in the SQL query, but Django requires that "the primary key ... must always be included in a raw query".
You probably don't want to write the literal 'New York' into your query, so pass a parameter instead: raw('SELECT ... AND p.name = %s ...', [placename]).
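For reference, newer Django versions (2.0+) can probably express this without raw SQL by using conditional aggregation; a sketch along those lines, using the models from the question (untested):
from django.db.models import Count, Q
from django.db.models.functions import TruncDate

data = (SomeModel.objects
        .annotate(created=TruncDate('time_added'))
        .values('created')
        # Count only the New York rows, but keep every day that had any record.
        .annotate(placed_in_ny_count=Count('id', filter=Q(place__name='New York')))
        .order_by('created'))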

Django AutoField not returning new primary_key

We've got a small problem with a Django project we're working on and our PostgreSQL database.
The project is a site/DB conversion from a PHP site to a Django site, so we used inspectdb to generate the models from the current PHP backend.
It gave us this, and we added primary_key and unique equal to True:
class Company(models.Model):
    companyid = models.IntegerField(primary_key=True, unique=True)
    ...
    ...
That didn't seem to be working when we finally got to saving a new Company entry. It would return a not-null constraint error, so we migrated to an AutoField like below:
class Company(models.Model):
    companyid = models.AutoField(primary_key=True)
    ...
    ...
This saves the Company entry fine but the problem is when we do
result = form.save()
We can't do
result.pk or result.companyid
to get the newly assigned primary key (yet we can see that it has been given a proper companyid in the database).
We are at a loss for what is happening. Any ideas or answers would be greatly appreciated, thanks!
I just ran into the same thing, but during a django upgrade of a project with a lot of history. What a pain...
Anyway, the problem seems to result from the way django's postgresql backend gets the primary key for a newly created object: it uses pg_get_serial_sequence to resolve the sequence for a table's primary key. In my case, the id column wasn't created with a serial type, but rather with an integer, which means that my sequence isn't properly connected to the table.column.
The following is based on a table with this CREATE statement; you'll have to adjust the table names, columns and sequence names according to your situation:
CREATE TABLE "mike_test" (
    "id" integer NOT NULL PRIMARY KEY,
    "somefield" varchar(30) NOT NULL UNIQUE
);
The solution if you're using postgresql 8.3 or later is pretty easy:
ALTER SEQUENCE mike_test_id_seq OWNED BY mike_test.id;
If you're using 8.1 though, things are a little muckier. I recreated my column with the following (simplest) case:
ALTER TABLE mike_test ADD COLUMN temp_id serial NOT NULL;
UPDATE mike_test SET temp_id = id;
ALTER TABLE mike_test DROP COLUMN id;
ALTER TABLE mike_test ADD COLUMN id serial NOT NULL PRIMARY KEY;
UPDATE mike_test SET id = temp_id;
ALTER TABLE mike_test DROP COLUMN temp_id;
SELECT setval('mike_test_id_seq', (SELECT MAX(id) FROM mike_test));
If your column is involved in any other constraints, you'll have even more fun with it.
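Either way, once the sequence is owned by the column, you can check from Django that the backend will now find it; a minimal sketch, assuming the mike_test table from the example above:
from django.db import connection

with connection.cursor() as cursor:
    # pg_get_serial_sequence is the same call Django's PostgreSQL backend
    # relies on to read back the new primary key after an INSERT.
    cursor.execute("SELECT pg_get_serial_sequence('mike_test', 'id')")
    print(cursor.fetchone())  # e.g. ('public.mike_test_id_seq',) rather than (None,)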

Automate the generation of natural keys

I'm studying a way to serialize part of the data in database A and deserialize it in database B (a sort of save/restore between different installations), and I've had a look at Django natural keys to avoid problems due to duplicated IDs.
The only issue is that I would have to add a custom manager and a new method to all my models. Is there a way to make Django automatically generate natural keys by looking at unique=True or unique_together fields?
Please note this answer has nothing to do with Django, but hopefully give you another alternative to think about.
You didn't mention your database, however, in SQL Server there is a BINARY_CHECKSUM() keyword you can use to give you a unique value for the data held in the row. Think of it as a hash against all the fields in the row.
This checksum method can be used to update a database from another by checking if local row checksum <> remote row checksum.
The SQL below will update a local database from a remote database. It won't insert new rows; for that you would use insert ... where id > @MaxLocalID.
SELECT delivery_item_id, BINARY_CHECKSUM(*) AS bc
INTO #DI
FROM [REMOTE.NETWORK.LOCAL].YourDatabase.dbo.delivery_item di
SELECT delivery_item_id, BINARY_CHECKSUM(*) AS bc
INTO #DI_local
FROM delivery_item di
-- Get rid of items that already match
DELETE FROM #DI_local
WHERE delivery_item_id IN (SELECT l.delivery_item_id
FROM #DI x, #DI_local l
WHERE l.delivery_item_id = x.delivery_item_id
AND l.bc = x.bc)
DROP TABLE #DI
UPDATE DI
SET engineer_id = X.engineer_id,
... -- Set other fields here
FROM delivery_item DI,
[REMOTE.NETWORK.LOCAL].YourDatabase.dbo.delivery_item x,
#DI_local L
WHERE x.delivery_item_id = L.delivery_item_id
AND DI.delivery_item_id = L.delivery_item_id
DROP TABLE #DI_local
For the above to work, you will need a linked server between your local database and the remote database:
-- Create linked server if you don't have one already
IF NOT EXISTS ( SELECT srv.name
                FROM sys.servers srv
                WHERE srv.server_id != 0
                AND srv.name = N'REMOTE.NETWORK.LOCAL' )
BEGIN
    EXEC master.dbo.sp_addlinkedserver @server = N'REMOTE.NETWORK.LOCAL',
        @srvproduct = N'SQL Server'
    EXEC master.dbo.sp_addlinkedsrvlogin
        @rmtsrvname = N'REMOTE.NETWORK.LOCAL',
        @useself = N'False', @locallogin = NULL,
        @rmtuser = N'your user name',
        @rmtpassword = 'your password'
END
GO
In that case you should use a GUID as your key; the database can automatically generate these for you (google "uniqueidentifier"). We have 50+ warehouses all inserting data remotely and sending their data up to our primary database using SQL Server replication. They all use a GUID as the primary key, as this is guaranteed to be unique, and it works very well.
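In Django terms (since the question is about a Django project), that could look something like the sketch below; the model and field names are just illustrative, and UUIDField requires Django 1.8+:
import uuid
from django.db import models

class Warehouse(models.Model):
    # A UUID (GUID) primary key generated on the application side; unlike an
    # auto-incrementing integer, it will not collide across installations.
    id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
    name = models.CharField(max_length=100)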
My solution has nothing to do with natural keys but uses pickle/unpickle.
It's not the most efficient way, but it's simple and easy to adapt to your code. I don't know if it works with a complex DB structure, but if that is not your case, give it a try!
When connected to db A:
import pickle
records_a = your_model.objects.filter(...)
f = open("pickled.records_a.txt", 'wb')
pickle.dump(records_a, f)
f.close()
Then move the file, and when connected to db B run:
import pickle
records_a = pickle.load(open('pickled.records_a.txt', 'rb'))
for r in records_a:
    r.id = None
    r.save()
Hope this helps
Make a custom base model by extending the models.Model class, write your generic manager inside it along with a custom .save() method, and then edit your models to extend the custom base model (see the sketch below). This will have no side effect on your DB table structure or your previously saved data, except when you update some old rows; if you have old data, try making a fake update to all your records.
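As a rough sketch of what that base class might look like (NaturalKeyManager, NaturalKeyModel and natural_key_fields are made-up names here; they follow Django's natural-key conventions but are not a built-in API):
from django.db import models

class NaturalKeyManager(models.Manager):
    def get_by_natural_key(self, *args):
        # Look the object up by the fields the model declares as its natural key.
        return self.get(**dict(zip(self.model.natural_key_fields, args)))

class NaturalKeyModel(models.Model):
    natural_key_fields = ()  # override per model, e.g. ('name',)
    objects = NaturalKeyManager()

    class Meta:
        abstract = True

    def natural_key(self):
        return tuple(getattr(self, f) for f in self.natural_key_fields)

# A hypothetical concrete model:
class Place(NaturalKeyModel):
    natural_key_fields = ('name',)
    name = models.CharField(max_length=100, unique=True)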