Remove (delete, replace or unlink) old image file while updating - delete-file

I have implemented the BulletProof uploader in my script, but I run into a problem when I update my records by changing the DB table row. For example:
I want to update and replace the old game image with a new game image (I am using a rename method). The new image is uploaded and renamed successfully, but the old image still exists in the thumbs folder.
Here is my code:
$newthumb = false;
if ($continue) {
    $bulletProof = new ImageUploader\BulletProof;
    try {
        if ($_FILES && $_FILES['avaimage']['name']) {
            $name3 = strtolower($game_name);
            $name3 = str_replace(' ', '-', $name3);
            $thumbnail = $name3 . '-' . generateRandomString();
            $pic1 = $bulletProof->uploadDir("../thumbs")
                                ->shrink(array("height" => 200, "width" => 200))
                                ->upload($_FILES['avaimage'], $thumbnail);
            if ($pic1) {
                $thumb = $pic1;
            }
        }
    } catch (\ImageUploader\ImageUploaderException $e) {
        $error = $e->getMessage();
        $continue = false;
    }
}
I need to delete the old files while updating, to avoid exhausting disk space and to keep my site clean and easy to manage. As it stands, if I change the game image 10 times I end up with 10 different images, and so on, which is a bad thing.
Here is what happens; I can explain it as follows:
1- The user edits the profile.
2- The user chooses a photo for the profile.
3- The user clicks Save to update the profile.
Each time the user uploads a new (or even the same) profile photo, it is uploaded and renamed as if it were a new one, and the old (previous) photo is left intact.
So a new photo is uploaded without removing the old one. If I have 1 million users, and every user uploads 10 photos (but only uses 1), then I'll end up with 9 million unused orphaned files.
Even if I just refresh the page, another photo is uploaded (because the browser re-submits the form).
For example, I clicked refresh 4 times before leaving the page, and each refresh uploaded yet another copy of the image to the thumbs folder.
Can you imagine how many orphaned files will pile up?
Any help will be appreciated.
I am using PHP PDO for updating the records.
Thanks

Okay, thank you everyone for looking at this. I have come up with a solution, as follows:
if ($_FILES && $_FILES['avaimage']['name']) {
    // Check whether the user already has a photo on record
    $stmt = $db->prepare("SELECT avatar FROM users WHERE id = :uid LIMIT 1");
    $stmt->bindValue(':uid', $uid, PDO::PARAM_INT);
    $stmt->execute();
    $result = $stmt->fetchAll(PDO::FETCH_ASSOC);
    $icheck = $result[0];
    if ($icheck['avatar']) {
        // Remove the old file; the @ suppresses a warning if it is already gone
        @unlink("images/avatars/" . $icheck['avatar']);
    } else {
        // No existing photo recorded for this user
        $error = 'Please check photo';
        $continue = false;
    }
}
Now only one photo is kept: nothing more accumulates if the user refreshes the page, and only one image remains on disk.
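For the game-thumbnail case from the original question, a minimal sketch of the same idea could look like the following. The games table, thumb column and $game_id variable are names I am assuming for illustration; the BulletProof upload itself stays exactly as in the question.

// Sketch only: remove the previous thumbnail before saving the new one.
// The `games` table, `thumb` column and $game_id are assumed names.
if ($_FILES && $_FILES['avaimage']['name']) {
    $stmt = $db->prepare("SELECT thumb FROM games WHERE id = :gid LIMIT 1");
    $stmt->bindValue(':gid', $game_id, PDO::PARAM_INT);
    $stmt->execute();
    $old = $stmt->fetchColumn();

    if ($old && is_file("../thumbs/" . $old)) {
        @unlink("../thumbs/" . $old);   // delete the orphan; @ ignores a missing file
    }

    // ...run the BulletProof upload exactly as in the question, which sets $thumb...

    $upd = $db->prepare("UPDATE games SET thumb = :thumb WHERE id = :gid");
    $upd->bindValue(':thumb', $thumb);
    $upd->bindValue(':gid', $game_id, PDO::PARAM_INT);
    $upd->execute();
}

Doing the SELECT and unlink before the upload means the folder never holds more than one thumbnail per game, no matter how many times the form is re-submitted.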
Thank you.


Power BI parameter for image

I wanted to know if there's a way to have a parameter that holds an image.
The problem: I have 10 PBIX files, each containing over 20 pages, and every report shows the client's logo as an image. If the client changes their logo, I'll need to change it 200 times (10 PBIX × 20 pages)!
Wanted solution: is there a way to have a parameter that holds the logo, so I would only change the logo 10 times (once per PBIX)? That would be a HUGE improvement for my time and productivity. Sorry for not including pictures; this is to keep my client anonymous.
Thank you!
Here is how you can achieve your requirement, as described below.
Step-1 Create a custom function in Power Query as shown below. The function is taken from an external blog post, which you can check for more details if you are interested.
let
    BinaryToImage = (BinaryContent as binary) as text =>
        let
            Base64 = "data:image/jpeg;base64, " & Binary.ToText(BinaryContent, BinaryEncoding.Base64)
        in
            Base64
in
    BinaryToImage
Step-2 Create a folder per report in your local directory and keep your image there. I am using a single folder here, but you can adapt this to your own layout once you understand the workaround.
Step-3 Create a Folder data connection and point it to the folder where you keep your logo/image.
You will now have a table with one row, since there is only one image. I named the table PBI_Images.
The Content column holds the binary data of the image, and we have to convert it to Base64 using the function we created earlier.
Step-4 Now let's invoke the function on our table. Select the table and use Add Column > Invoke Custom Function, as in the sketch below.
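Since the original screenshots are not reproduced here, the equivalent M step that the Invoke Custom Function dialog generates would look roughly like this (the step and column names are assumptions, and PBI_Images stands for the previous step of that query):

// Adds a column that converts the binary Content column into a Base64 data URI
#"Invoked Custom Function" = Table.AddColumn(PBI_Images, "ImageBase64", each BinaryToImage([Content]))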
Now you have a new column containing text like data:image/jpeg;base64,... for each image.
Step-5 Get back to the report by clicking the "Close & Apply" button.
Step-6 Download an image visual (one that accepts an Image URL field) from AppSource.
Step-7 Add the newly downloaded visual to your report and put the newly created column into its "Image URL" field.
Step-8 Now add the image visual to all your report pages and bind the Image URL field as in Step-7.
You are all done now. Just change the image in your source folder, keeping the same file name, then refresh your table (PBI_Images in my case) and you will see the image change everywhere in the report.
Finally, you can first try keeping all logos in one single folder; if that does not work as expected, go for a folder per customer as mentioned earlier.

File browse Item uploading to BLOB column

One of the tables in my DB has a BLOB column that stores images. So now I am setting up the page for this table. I have a bunch of IGs and such to process most of the data, but I set up a modal page to process the image.
The modal page gets the ID (which is the PK) into an item, and then it reads the image currently in the table into a 'Display Image' item. And I have a 'File browse...' item to upload new images.
Except I cannot get it to save.
I initially started with the display image item set to "Based on: BLOB column returned by SQL statement", because I couldn't get the source to work with the SQL query (error: "Expected CHAR, source is BLOB"). I managed to resolve this by putting automatic row processing on the page and then having the source be a column.
So now it displays well, with no errors.
But the save does nothing. I have tried saving by having the file browse item reference the column and using automatic row processing, and nothing happens. No errors pop up; it just does nothing.
I have tried saving to APEX_APPLICATION_TEMP_FILES and then having a PLSQL DA or a PLSQL process to
SELECT blob_content
FROM APEX_APPLICATION_TEMP_FILES
WHERE name = :FILE_BROWSER_ITEM
And insert this into the table, but it just pops up a 'No data found' error.
I have gone through every bit of intel my google-fu has found, but I have failed to find a solution.
So I would appreciate any insight any of you might have.
Since no one answered, I stepped away from it for a bit and tried again at a later date, and now I have finally made it work.
I set up automatic row fetch and automatic row processing but disabled both of them; for some reason automatic row processing must be present so that the display image and file browse items can have the column as their source.
Then I set the file browse item to load into APEX_APPLICATION_TEMP_FILES,
and set up a process to be executed on page submit (after the automatic row processing, even though it is disabled and should not matter). The process executes the following code:
BEGIN
    UPDATE MY_TABLE
       SET MY_IMAGE = (SELECT blob_content
                         FROM apex_application_temp_files
                        WHERE name = :FILE_BROWSER_ITEM)
     WHERE id = :ID;
END;
I trigger the page submit through a button whose action is Submit Page, with the database action set to SQL UPDATE.
I am guessing a fair bit of what I did and set up does not even matter, but I don't dare remove it for fear of breaking things. What I have described here finally works for me; if you stumble upon this, you can try it and I hope it works for you too, and you can experiment with removing some of the disabled pieces to see if it still works.

Parse large file and paginate / load its parts with scrolling

I'm looking for suggestions on the most Django-like way of loading large variable content (say, a massive 10,000-line list) to the user's page part by part, displaying only some lines until the user asks for more.
Here is a detailed scenario (I hope it makes sense; it is just a simple example of dealing with large template variables and pagination):
1. The user goes to website.com/searchfiles, which is hosted on my Django backend and returned as a template.
2. The searchfiles.html template contains one form with a Select drop-down menu for choosing a file that already exists on the server (say there are 20 massive log files). Below the drop-down menu there is a text box that lets the user enter a regular expression string. So there are only two items in the form.
P.S. Each file is usually pretty big, e.g. 20-30 MB.
3. When the user selects the file, enters the regular expression in the text box and clicks "Submit", an HTTP POST is made.
4. The Django backend receives the POST, reads the filename + regexp string and executes the function dosearch(FILE, pattern).
5. The dosearch function does something like this:
import re

def dosearch(FILE, pattern):
    result = []
    with open(FILE, 'r') as fh:
        for line in fh:
            if re.match(pattern, line):
                result.append(line)
    return result
6. Now result is a list that, depending on the pattern, can be pretty large (e.g. 10-20 MB).
7. Processing of the file is now complete and I want to present the user with the "result" variable. After the HTTP POST, the user is redirected to website.com/parsed.
As you can imagine, my goal in step 7 is to return this variable to the user after the HTTP POST. But because the "result" variable can be huge, I don't want to just dump, say, 10,000 lines of output directly onto the page. What I want is for maybe the first 200 lines to be displayed, and as the user scrolls down, an additional 200 lines to be loaded once the user reaches the bottom of the page.
To keep it simple, ignore the scroll part; the user could also be presented with a [NEXT] button to click and load an additional 200 entries, and so on.
What is the most Django-like way to achieve this? Do I need to save the result variable to the database and use AJAX?
Also assume that multiple users are going to use the very same page/website, so I need to be able to distinguish between two users searching two different files at the same time.
When the user navigates away, the generated "result" variable should be released from memory.
I can think of two possibilities:
A. using a Model
from django.contrib.auth.models import User
from django.db import models

class ResultLine(models.Model):
    user = models.ForeignKey(User, on_delete=models.CASCADE)
    sequence_number = models.IntegerField()
    line = models.CharField(max_length=1000)
    created_at = models.DateTimeField(auto_now_add=True)
After parsing the file you would store each result line as an instance of this model, using sequence_number to preserve the order of the lines.
In your result view you could use pagination or a generic ListView to show the first lines, or use AJAX to fetch more lines.
You would need to add a delete button to clear the user's data from this model, or run periodic jobs (maybe using crontab and a custom management command) to delete old result lines.
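A minimal sketch of such a result view using Django's Paginator (the view name, template name and page size are assumptions for illustration):

# views.py - sketch only; template name and page size are assumptions
from django.core.paginator import Paginator
from django.shortcuts import render

def parsed(request):
    # Fetch only this user's lines, in their original order
    lines = (ResultLine.objects
             .filter(user=request.user)
             .order_by('sequence_number'))
    paginator = Paginator(lines, 200)                    # 200 lines per page
    page = paginator.get_page(request.GET.get('page', 1))  # get_page is Django 2.0+
    return render(request, 'parsed.html', {'page': page})

Because the queryset is lazy, each request only reads one page of rows from the database; a [NEXT] link or AJAX call simply asks for ?page=N+1.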
B. using session data
Another possibility would be to store the result in the user's session.
request.session['result_list'] = dosearch(FILE, pattern)
Depending on the session engine there could be size restrictions; this post states that the database-backed sessions are only limited by the database engine (which means you could store many MB or even GB of data in the session).
Also, your server needs sufficient RAM to hold the whole result list of multiple users.
And later, in your result view, you just read from the session instead of from a model.
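A corresponding sketch for option B, slicing the session-stored list per page (again, the view and template names are assumptions):

# views.py - sketch only for the session-based variant
from django.shortcuts import render

def parsed(request):
    result = request.session.get('result_list', [])
    page_size = 200
    page_number = int(request.GET.get('page', 1))
    start = (page_number - 1) * page_size
    chunk = result[start:start + page_size]
    return render(request, 'parsed.html', {
        'lines': chunk,
        'has_more': start + page_size < len(result),
    })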
Performance-wise there are differences: both approaches store the data in the database (with database-backed sessions), but option A allows you to do partial reads in your result view, while option B always reads the whole result list into memory on each request (because the whole session dict is stored in encoded format).

Executive dashboard is not showing reports in Sitecore when registering local search using DMS

I am working on registering local search terms in Sitecore DMS, with help from a blog post. First of all I registered a search page event named "Search".
protected void RegisterSearchPageEvent(string searchQuery)
{
    if (!Tracker.IsActive || Tracker.Visitor == null || Tracker.Visitor.CurrentVisit == null)
        return;

    var page = Tracker.Visitor.CurrentVisit.CurrentPage;
    if (Tracker.Visitor.CurrentVisit.PreviousPage != null)
        page = Tracker.Visitor.CurrentVisit.PreviousPage;

    page.Register(new PageEventData("Search")
    {
        Data = searchQuery,
        DataKey = searchQuery.ToLowerInvariant(),
        Text = searchQuery
    });
}
I also defined the page event "Search" in Sitecore. Now, to display the report in the Executive Dashboard, I went under "Site Search", but it does not display anything.
I configured the .config file located here:
\sitecore\shell\Applications\Reports\Dashboard\Configuration.config
There is a setting here called "MinimumVisitsFilter". I changed it from 50 to 5 and also entered the search keywords more than 50 times. The main point is that the above code is inserting the keyword into the Analytics database. Is there some SQL query problem in the Executive Dashboard?
Even with MinimumVisitsFilter set to 5, you still need to generate 5 unique visits to start seeing any data. On your local dev environment you could probably set it as low as 1 or even 0, but I would not recommend doing that on the live environment.
Also make sure all the basics are in place; Analytics is active (Sitecore.Analytics.config), the database is set up and so on.
I followed the same post when registering local search, and the procedure Brian describes here does work.
The above problem was due to the browser cache. The Sitecore DMS search event stores only a single value for a given keyword unless we close the browser, or search from a different browser, before it stores the value again. If this kind of problem occurs, search for different keywords after closing the browser and clearing the cache. This worked for me.

Query to set a value for all items in Amazon SimpleDB

I am trying to set a value for all items in a domain that do not already have a certain value and that have an additional flag set.
Basically for all my items,
SET ValueA to 100 if ValueB is 0
But I am confused about how to achieve this. So far I've been setting the value for individual items by just using a put request like this:
ArrayList<ReplaceableAttribute> newAttributes = new ArrayList<ReplaceableAttribute>();
newAttributes.add(new ReplaceableAttribute("ValueA",Integer.toString(100), true));
PutAttributesRequest newRequest = new PutAttributesRequest();
newRequest.setDomainName(usersDomain);
newRequest.setItemName(userID);
newRequest.setAttributes(newAttributes);
sdb.putAttributes(newRequest);
This works for an individual item and requires me to first get the item name (userID). Does this mean that I have to "list" all of my items and do this one by one?
I suppose that, since I have around 19,000+ items, I would also have to use the token to get the next set after the 2,000 limit, right?
Isn't there a more efficient way? This might not be so heavy right now, but I expect to eventually have over 100k items.
PS: I am using the AWS Java SDK for Eclipse.
If you are asking whether you can do it programmatically by writing your own code, then yes. First you have to know all the item names (in your case the userIDs), and then you need to set the value one by one. You can use BatchPutAttributes in this case: a batch put can update 25 items in one request, and you can run 5 to 20 BatchPutAttributes requests in parallel threads to tune the performance.
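A rough sketch of that approach with the AWS Java SDK, selecting the items where ValueB is 0 and updating them in batches of 25 (the select expression follows the question; sdb and usersDomain are the client and domain name from the question's code):

// Sketch only: page through the domain with select + next token, then update
// the matching items 25 at a time with BatchPutAttributes.
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import com.amazonaws.services.simpledb.model.*;

String nextToken = null;
do {
    SelectRequest select = new SelectRequest(
            "select itemName() from `" + usersDomain + "` where ValueB = '0'")
            .withNextToken(nextToken);
    SelectResult page = sdb.select(select);

    List<ReplaceableItem> batch = new ArrayList<ReplaceableItem>();
    for (Item item : page.getItems()) {
        batch.add(new ReplaceableItem(item.getName(),
                Arrays.asList(new ReplaceableAttribute("ValueA", "100", true))));
        if (batch.size() == 25) {   // BatchPutAttributes accepts at most 25 items
            sdb.batchPutAttributes(new BatchPutAttributesRequest(usersDomain, batch));
            batch = new ArrayList<ReplaceableItem>();
        }
    }
    if (!batch.isEmpty()) {
        sdb.batchPutAttributes(new BatchPutAttributesRequest(usersDomain, batch));
    }
    nextToken = page.getNextToken();
} while (nextToken != null);

Each select call returns a limited page of item names, so the loop keeps following the next token until the whole domain has been covered.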
If you want a more manual workaround, you can use SDBExplorer. Please remember that it will set 100 for all items, because SDBExplorer does not support conditional puts. If you would like to do it anyway, follow these steps:
Download the SDBExplorer zip version from the download page.
Extract it and run the executable.
Download the 30-day trial license.
Once the license has been downloaded, the main UI will open.
Provide valid Access Key and Secret Key and click the "GO" button.
You will see the list of domains in the tree on the left side.
Right-click on the domain in which you would like to set the value for all items.
Choose the "Export to CSV" option.
Export the content of the domain into CSV. http://www.sdbexplorer.com/documentation/simpledb--how-to-export-domain-in-csv-using-sdbexplorer.html
Go to the path where your domain has been exported.
Open the CSV file.
Your first column is the item name.
Delete all columns other than the item name and the "ValueA" column.
Set 100 for every item name under the "ValueA" column.
Save the CSV.
Go back to the SDBExplorer main UI.
Select the same domain.
Click the "Import" option on the toolbar.
A panel will open.
Now import the data into the domain. http://www.sdbexplorer.com/documentation/simpledb--how-to-upload-csv-file-data-and-specifying-column-as-amazon-simple-db-item-name.html
Once the import is done, explore the domain and you will find the value 100 set for all items in the ValueA column.
Please try the steps on a dummy domain first.
What exactly am I suggesting?
To get all item names in your domain, export the entire content of the domain into a CSV file on your local file system. Once you have all item names in the CSV, keep only the item name column and the "ValueA" column, set "100" for all the items in the CSV file, and upload/import the content back into the domain.
Disclosure: I am one of the developers of SDBExplorer.