I have to write code which will automatically add/delete/edit products on an OpenCart website, so my question is the following:
How can I add new products to the database automatically with PHP? Are there methods in OpenCart that I can use, or do I have to write the code from scratch?
P.S.: an example would be great (suppose we have the product information in an array).
Currently I am doing something similar to this; my setup is as follows:
Script: this is where the items are added, deleted or updated. I write directly into the database tables (product, product_description, product_to_store, category_id, product_to_category) using MySQL statements.
Since my sync has to run unattended, I have a cron job trigger the script.
Logging: to keep track of what's happening I added a database table called log, where I keep a record of which files are being processed, at what time, errors, etc.
My conclusion is that you should create a model and controller which handle the request to automatically add/delete/edit products; then, whenever you need to run this, you can just navigate to that script. Another sensible thing to do is to secure that script on the back-end so that only you are able to reach it.
An example of a MySQL insert statement for products is as follows:
$insert_product = "INSERT INTO " . DB_PREFIX . "product ( sku, quantity, stock_status_id, status, date_added, price, taxpercentage, supplier, department, webshop, schedule)
VALUES ('".$sku."','" .$onhand. "','7','1',NOW(),'".$price. "','" .$tax_percentage. "','" .$supplier. "','" .$department. "','" . $webshop . "','" .$schedule. "')";
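If you would rather go through OpenCart's own code instead of raw SQL, the admin application ships with a product model whose addProduct() method takes the product data as an array. A rough sketch of that route (assuming OpenCart 2.x and that you are running inside an admin controller; the exact set of keys the model expects varies between versions, so check admin/model/catalog/product.php first):
$this->load->model('catalog/product');

// hypothetical product array - the keys mirror what admin/model/catalog/product.php reads;
// your version may require additional keys (weight, dimensions, SEO keyword, etc.)
$product = array(
    'model'           => 'DEMO-001',
    'sku'             => 'DEMO-001',
    'quantity'        => 25,
    'minimum'         => 1,
    'subtract'        => 1,
    'stock_status_id' => 7,
    'price'           => '19.99',
    'tax_class_id'    => 0,
    'manufacturer_id' => 0,
    'shipping'        => 1,
    'status'          => 1,
    'sort_order'      => 0,
    'date_available'  => date('Y-m-d'),
    'product_description' => array(
        1 => array( // language_id => texts
            'name'             => 'Demo product',
            'description'      => 'Added automatically',
            'meta_title'       => 'Demo product',
            'meta_description' => '',
            'meta_keyword'     => '',
            'tag'              => '',
        ),
    ),
    'product_store' => array(0), // default store
);

$this->model_catalog_product->addProduct($product);
Because that model lives in the admin application, a standalone cron script cannot call it without bootstrapping the admin registry first, which is why writing directly to the tables, as above, is often the simpler route for bulk imports.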
I want to implement something similar to the question here, Getting stock information in opencart 2.3.0.2, but my problem is that when a product is deleted, its stock information disappears. I want to write that data into the order_product table, but I still don't understand how. Please help.
I can't figure out how to write the product's remaining stock to the database. I don't understand how to implement it.
Hmm... you want to save the product's stock information in the order_product table?
First you need to add a custom field to oc_order_product:
ALTER TABLE `oc_order_product` ADD `stock_status` INT(11) NOT NULL AFTER `reward`;
Then, in catalog/controller/checkout/confirm.php, you must add a stock_status key to the $order_data['products'][] = array( ... ) entry:
'stock_status' => $this->model_catalog_product->getProduct($product['product_id'])['stock_status'],
So we have modified the controller on the customer side, and the order data now receives the stock_status.
After receiving the product data from the customer side, we must save it in oc_order_product.
Open catalog/model/checkout/order.php and edit the addOrder() function... In the products loop, after reward = '" . (int)$product['reward'] . "', add (after the comma):
stock_status = '" . $this->db->escape($product['stock_status']) . "'
That is an example of how to save the product stock data coming from the customer side.
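Pulling those fragments together, the two edits look roughly like this (a sketch, not a drop-in patch; also note that getProduct() returns the stock status as its localised text name, so you may prefer a VARCHAR column, or store stock_status_id instead of the text):
// catalog/controller/checkout/confirm.php - inside the products loop
$order_data['products'][] = array(
    // ... existing keys (product_id, name, model, option, quantity, price, total, reward, ...) ...
    'stock_status' => $this->model_catalog_product->getProduct($product['product_id'])['stock_status'],
);

// catalog/model/checkout/order.php - addOrder(), inside the order_product INSERT:
// ..., reward = '" . (int)$product['reward'] . "', stock_status = '" . $this->db->escape($product['stock_status']) . "', ...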
I have an API which reads from two main tables, Table A and Table B.
Table A has a column which acts as a foreign key to Table B entries.
Inside the API flow, I have a method which runs the logic below:
Raw SQL -> joining Table A with some other tables and fetching entries which have an active status in Table A.
From the result of the previous query, we take the values from the Table A column and fetch the related rows from Table B using Django models.
It looks like this:
query = "Select * from A where status = 1" #Very simplified query just for example
cursor = db.connection.cursor()
cursor.execute(query)
results = cursor.fetchall()
list_of_values = get_values_for_table_B(results)
b_records = list(B.objects.filter(values__in=list_of_values))
Now there is a background process which will insert or update data in Table A and Table B. That process does everything using models, wrapped in:
with transaction.atomic():
    do_update_entries()
However, the update is not just updating the old row. It effectively deletes the old row and the related rows in Table B, and then new rows are added to both tables.
Now the problem: if I run the API and the background job separately, everything is fine, but when both run simultaneously, many API calls find that the second query on Table B returns no data, because the operations execute in the following order:
The raw Table A query executes and reads the old data.
The background job runs in a single transaction, deletes the old data and inserts new data, with different foreign key values relating Table A to Table B.
The Table B model query executes, referring to values that were already deleted by the previous transaction, hence no records.
So, to read everything in a single transaction, I have tried the options below:
with transaction.atomic():
    # Raw SQL for Table A
    # Models query for Table B
This didn't work and I am still getting the same issue.
I tried another approach:
transaction.set_autocommit(False)
# Raw SQL for Table A
# Models query for Table B
transaction.commit()
transaction.set_autocommit(True)
But this didn't work either. How can I read both queries in a single transaction so that the background job's updates do not affect this read process?
I am trying to build a page that returns a table for the logged-in user, and then lets you use a filter to look at other users' records.
User = USERPRINCIPALNAME()
I am having problems filtering the table for the logged-in user without using row-level security and data model changes like in the article below. Is there a way for the Power BI table to return data just for the logged-in user? No SSAS is involved.
https://medium.com/@barrasa8/dynamic-data-masking-in-powerbi-based-on-rls-927eb6a34e5d
The data model is a FACT table linked to a USER dimension. In the USER dimension there is an email address, which is what USERPRINCIPALNAME() resolves to.
I thought about a DAX summary table built with SUMMARIZE and may try that later, then two buttons on the page: one to show the current logged-in user, and the other to show all other users' data, working with all the filters on the page.
So the basic want is:
Logged In User: X
Table - Col 1, 2, 3, ... (filtered to user X only by default)
Then I would like an easy way for the logged-in user to see other users' data.
Unfortunately there is no way to use USERNAME() or USERPRINCIPALNAME() to filter the data model directly (they cannot be used in calculated columns or calculated tables, which are evaluated at refresh time).
What you're left with is row-level security, but that would mean it is not possible to show the data of other users.
The best alternative I can think of is to load the data twice. Then set Row Level Security on one table, display that one as "Your data", don't use RLS on the second table and display that as "Other people's data". Put them side by side for easy comparison.
I just wrote a blog post about a similar challenge on how to make sure you can still filter both tables: https://www.linkedin.com/pulse/calculating-totals-row-level-security-using-powerbi-van-der-pasch
I'm pretty novice at programming (recently learned functions), and have found myself rewriting the same "insert into MySQL table" function (below) from script to script... mainly just to modify these two sections: (name,insert_ts) and VALUES (%s, %s).
Is there a good way to rewrite the code below so it accepts ANY number of values, based on the length of the tuple that holds the values, and builds the column list from the 'labels' list - i.e. generates both the (name,insert_ts) part and the VALUES (%s, %s) part?
list_of_tuples = [] #list of records to be inserted.
#take a list of dictionaries - and create a list of tuples in proper format/order
for dict1 in output:
    one_list = []
    one_list.extend((dict1['name'],dict1['insert_ts']))
    list_of_tuples.append(tuple(one_list))
labels = ['name', 'insert_ts']
#db_write accepts table name as str, labels as a list of str, and output as a list of tuples
def db_write(table,labels,output):
    local_cursor.executemany(""" INSERT INTO my_table
        (name,insert_ts) #this is pulled from 'labels'
        VALUES (%s, %s) #number of %s comes from len(labels)
        """
        , list_of_tuples)
    local_db.commit()
    local_db.close()
    #print 'done posting!'
Or, is there a better way to accomplish what I'm trying to do, using mysqldb?
Thank you all in advance!
After a bit of experience (3 months, heh), wanted to update everyone on the solution that seems to work pretty well!
Instead of using MySQLdb, I spent some time learning the SQLAlchemy Python package, and would recommend everyone do the same!
SQLAlchemy allows you to:
1) Define a table within Python code (I used Excel to come up with column names, etc.).
2) Most importantly, you can pass a dictionary to SQLAlchemy, and as long as the dictionary's keys match the table's column names, everything will magically get posted to your SQL table. If you have 60 columns in your SQL table but your dict has only two keys - BAM, SQLAlchemy will take care of everything, post just the two values, and leave the other columns empty in MySQL. MAGIC!
I am building an app in Symfony2, using Doctrine2 with MySQL. I would like to use fulltext search. I can't find much on how to implement this - right now I'm stuck on how to set the table engine to MyISAM.
It seems that it's not possible to set the table type using annotations. Also, if I did it manually by running an ALTER TABLE query, I'm not sure whether Doctrine2 would continue to work properly - does it depend on the InnoDB foreign keys?
Is there a better place to ask these questions?
INTRODUCTION
Doctrine2 uses InnoDB, which supports the foreign keys used in Doctrine associations. As MyISAM does not support foreign keys yet, you cannot use MyISAM to manage Doctrine entities.
On the other hand, MySQL v5.6, currently in development, will bring support for InnoDB FTS and thus enable full-text search on InnoDB tables.
SOLUTIONS
So there are two solutions:
Using MySQL v5.6 at your own risk and hacking Doctrine a bit to implement a MATCH AGAINST method: link in French... (I could translate if needed, but there are still bugs and I would not recommend this solution).
As described by quickshifti, creating a MyISAM table with a FULLTEXT index just to perform the search on. Doctrine2 allows native SQL queries, and you can map such a query to an entity (details here).
EXAMPLE FOR THE 2nd SOLUTION
Consider the following tables:
table 'user' : InnoDB [id, name, email]
table 'search_user' : MyISAM [user_id, name -> FULLTEXT]
Then you just have to write a search query with a JOIN and a mapping (in a repository):
<?php
public function searchUser($string) {
// 1. Mapping
$rsm = new ResultSetMapping();
$rsm->addEntityResult('Acme\DefaultBundle\Entity\User', 'u');
$rsm->addFieldResult('u', 'id', 'id');
$rsm->addFieldResult('u', 'name', 'name');
$rsm->addFieldResult('u', 'email', 'email');
// 2. Native SQL (bind the search string as a parameter instead of interpolating it)
$sql = 'SELECT u.id, u.name, u.email FROM search_user AS s JOIN user AS u ON s.user_id = u.id WHERE MATCH(s.name) AGAINST(? IN BOOLEAN MODE) > 0';
// 3. Run the query
$query = $this->_em->createNativeQuery($sql, $rsm);
$query->setParameter(1, $string);
// 4. Get the results as Entities !
$results = $query->getResult();
return $results;
}
?>
But the FULLTEXT index needs to stay up to date. Instead of using a cron task, you can add triggers (on INSERT, UPDATE and DELETE) like this:
CREATE TRIGGER trigger_insert_search_user
AFTER INSERT ON user
FOR EACH ROW
INSERT INTO search_user SET user_id=NEW.id, name=NEW.name;
CREATE TRIGGER trigger_update_search_user
AFTER UPDATE ON user
FOR EACH ROW
UPDATE search_user SET name=NEW.name WHERE user_id=OLD.id;
CREATE TRIGGER trigger_delete_search_user
AFTER DELETE ON user
FOR EACH ROW
DELETE FROM search_user WHERE user_id=OLD.id;
That way your search_user table will always reflect the latest changes.
Of course, this is just an example; I wanted to keep it simple, and I know this particular query could also be done with a LIKE.
Doctrine ditched the fulltext Searchable feature from v1 on the move to Doctrine2. You will likely have to roll your own support for a fulltext search in Doctrine2.
I'm considering using migrations to generate the tables themselves, running the search queries w/ the native SQL query option to get sets of ids that refer to tables managed by Doctrine, then using said sets of ids to hydrate records normally through Doctrine.
Will probably cron something periodic to update the fulltext tables.
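A minimal sketch of that id-then-hydrate approach, as a repository method (it assumes the same search_user MyISAM table and Acme\DefaultBundle\Entity\User entity used in the answer above, and the older DBAL 2.x fetch API):
<?php
public function searchUserIds($string)
{
    // 1. Fulltext search against the MyISAM table, fetching only the matching ids
    $stmt = $this->_em->getConnection()->executeQuery(
        'SELECT user_id FROM search_user WHERE MATCH(name) AGAINST(? IN BOOLEAN MODE)',
        array($string)
    );
    $ids = $stmt->fetchAll(\PDO::FETCH_COLUMN);

    if (!$ids) {
        return array();
    }

    // 2. Hydrate the matching records normally through Doctrine
    return $this->_em->getRepository('Acme\DefaultBundle\Entity\User')
                     ->findBy(array('id' => $ids));
}
?>
findBy() with an array value produces an IN (...) clause, so the entities come back fully managed by Doctrine while the MyISAM table is only ever touched for the search itself.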