How to run lynx automated browsing actions in the background

I am trying to extract some content from one of my websites for my report work.
I am unable to keep a session going in lynx, so I am using an automated browser-action script to log in and print the page to a file, and I use that file to build my report.
But whenever I run the script, it shows the browser actions in the foreground. I don't want this and would like to silence it, i.e. I don't want to see the browser actions at all.
I recorded the actions with the following command:
lynx -cmd_log=/tmp/newscript http://example.com
And I replay them with this script:
#!/bin/bash
lynx -cmd_script=/tmp/newscript http://example.com

I found out the answer to this question.
We can modify the line as shown below to make it silent:
lynx -cmd_script=/tmp/newscript http://example.com > /dev/null
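For completeness, a slightly fuller sketch of the replay script (the 2>&1 redirect is my addition, on the assumption that some lynx builds also write status messages to stderr — test it on yours):

#!/bin/bash
# Replay the keystrokes recorded earlier with -cmd_log, silencing the
# browser output that would otherwise appear in the foreground.
lynx -cmd_script=/tmp/newscript http://example.com > /dev/null 2>&1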
Thank you all.

Related

Why am I always getting 'Auth::user() on null' logs in my Blade templates?

I use {{ Auth::user()->name }} in all of my Blade templates. It successfully shows the user's name, but my logs keep saying that Auth::user()->name is Trying to get property 'name' of non-object. These logs just make things go crazy.
What am I doing wrong? Thanks for your help!
Your code might fail on the line below, since you don't check here whether you are logged in or not:
{{ auth()->user()->name }}
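A guarded version might look like this (a sketch, assuming Laravel 5.5 or later, where the @auth Blade directive and the optional() helper exist):

{{-- Only dereference the user when someone is actually authenticated --}}
@auth
    {{ auth()->user()->name }}
@endauth

{{-- Or with an explicit fallback for guests: --}}
{{ optional(auth()->user())->name ?? 'Guest' }}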
First, empty the log file. Assuming it's storage/logs/laravel.log:
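One way to empty it without deleting the file (truncate is GNU coreutils — my assumption is a Unix-like shell; where truncate is missing, : > storage/logs/laravel.log does the same):
{project_root} > truncate -s 0 storage/logs/laravel.log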
In the terminal, use the following command to watch the log file for live changes:
{project_root} > tail -f storage/logs/laravel.log
If the above doesn't work, use the following:
{project_root} > sudo tail -f storage/logs/laravel.log
Now you can keep tabs on the log file whenever anything is logged to it.
Open your browser and terminal side by side, go through your app, and look for the trigger point where the exception is logged and shown on the terminal; start debugging from there.
Did you log in to your application? Only when you are logged in can you get info from Auth.
Don't worry about the login and sign-up process in Laravel: just run php artisan make:auth and Laravel's scaffolding does the work for you.
It gives you a simple template that you can customize however you want.

Access Webpage With Credentials and Cookies From Command Line

I am trying to access a proprietary website which provides access to a large database (many billions of entries). Each entry in the database is a link to a webpage that is essentially a flat file containing the information that I need.
I have about 2000 entries and their corresponding webpages in the database. I have two related issues that I am trying to resolve:
How to get wget (or any other similar program) to read cookie data. I exported my cookies from Google Chrome (using https://chrome.google.com/webstore/detail/cookiestxt/njabckikapfpffapmjgojcnbfjonfjfg?hl=en), but for some reason the HTML downloaded by wget still cannot be rendered as a webpage. Similarly, I have not been able to get Google Chrome to read the cookies from the command line. These cookies are needed to access the database, since they contain my credentials.
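For reference, wget can be pointed at a Netscape-format cookies.txt like the one that extension exports; this is a sketch with a placeholder filename and URL, and it can still fail if the site sets cookies via JavaScript:

wget --load-cookies cookies.txt --keep-session-cookies -O entry.html https://example.com/database/entry/12345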
In my context, it would be OK if the webpage were downloaded as a PDF, but I cannot seem to figure out how to download a webpage as a PDF using wget or similar tools. I tried using automate-save-page-as (https://github.com/abiyani/automate-save-page-as) but I continuously get an error about the browser not being in my PATH.
I solved both of these issues:
Problem 1: I switched away from wget, curl and Python's requests to simply using the Selenium WebDriver in Python. Using Selenium, I did not have to deal with issues such as passing cookies, headers, POST and GET, since it actually opens a browser. This also has the plus that, as I was writing the script to use Selenium, I could inspect the page and see what it was doing as it was doing it.
Problem 2: Selenium has a property called page_source, which returns the HTML of the webpage. When I tested it, the HTML rendered correctly.
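A minimal sketch of that approach (the profile path, entry URL and output filename are placeholders; assumes Chrome and a matching chromedriver are installed):

from selenium import webdriver

# Reuse an existing Chrome profile so the logged-in session cookies come along.
# The path below is a placeholder -- point it at your own profile directory.
options = webdriver.ChromeOptions()
options.add_argument("--user-data-dir=/home/me/.config/google-chrome")

driver = webdriver.Chrome(options=options)
driver.get("https://example.com/database/entry/12345")  # placeholder entry URL

# page_source is a property holding the HTML of the rendered page.
with open("entry.html", "w", encoding="utf-8") as f:
    f.write(driver.page_source)

driver.quit()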

Include $config->get in an external PHP function

I'm building a script for OpenCart that will send emails. It's not an extension; it's a separate panel.
I have added an option in OpenCart where the admin can enable or disable this feature. The problem is that I'm unable to read that setting from OpenCart inside my script.
For example, I have the code below in my PHP script:
if ($config->get('sendemails_status')) {
    $store_id = $config->get('config_store_id');
}
When I run that code I get this error:
Call to a member function get() on a non-object
Can someone let me know how I can get the above code working in my PHP script?
The error is due to permissions set on the cache folder. Setting that folder to 755 and the files to 555 will set you home and dry.
OR
In system/library/session.php, replace the session start with:
session_save_path(realpath(dirname($_SERVER['DOCUMENT_ROOT']) . '/../tmp'));
OR
You did not set those variables; please check your database for the settings.
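If you go the database route, here is a minimal sketch of a standalone script (my assumptions: the constants DB_HOSTNAME, DB_USERNAME, DB_PASSWORD, DB_DATABASE and DB_PREFIX from OpenCart's config.php, and the default setting table layout — this is not OpenCart's official API):

<?php
// Read one setting straight from the OpenCart database.
require_once __DIR__ . '/config.php';

$pdo = new PDO(
    'mysql:host=' . DB_HOSTNAME . ';dbname=' . DB_DATABASE,
    DB_USERNAME,
    DB_PASSWORD
);

$stmt = $pdo->prepare('SELECT value FROM ' . DB_PREFIX . 'setting WHERE `key` = :key');
$stmt->execute([':key' => 'sendemails_status']);

if ($stmt->fetchColumn()) {
    // the feature is enabled in the OpenCart admin
}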

Export a Sitecore Page

I am looking to export an entire Sitecore page. Ideally, upon exporting, the links would change from relative to absolute, but that's not necessary. I want to download the HTML, CSS, images, everything from one of my pages. Any help would be greatly appreciated.
Assuming that you want to download one page only, you should be able to just use the Save option in your browser. Most modern browsers support all of your requirements; a Sitecore web page is like any other HTML page.
If you want to download the whole site, you can try http://www.httrack.com/, wget, or a Firefox plugin as described here.
If none of those are enough, try searching for save entire site with images and css in your favourite search engine; there are plenty of other possibilities.
If I only have to locally save a page hosted in our Sitecore (it really doesn't matter what's hosting it), I use wget.
This seems to work quite well:
wget.exe -E -H -k -K -p --no-check-certificate <your url>
You can change the parameters to recursively spider the whole site, as sketched below; if you do so, make sure to check with your security department so you don't set off any alarms.
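A recursive variant might look like this (a sketch: --mirror and the politeness delay --wait=1 are my additions, not part of the answer above, and I dropped -H so the spider stays on one host):

wget.exe --mirror -E -k -K -p --wait=1 --no-check-certificate <your url>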

Saiku - Clicking on 'New Query' not displaying anything

I downloaded Saiku (Saiku Server 2.4, including the Foodmart DB) from the following link: http://analytical-labs.com/downloads.php.
Following the installation notes, I downloaded the latest Apache software and the JDK. As per the instructions, if I now go to localhost:8080 I see the login page. Upon logging in with username admin and password admin, I am on the Saiku main page.
When I click on the 'New Query' button nothing happens. I am not able to see the Foodmart database, dimensions, or anything else.
Can someone help me figure out where I have gone wrong?
I have got it working. The problem was that, after running the start-saiku.bat file, the command prompt was still sitting at 'Press enter to continue..'. Upon pressing Enter, the startup finished and I am now able to browse Saiku from localhost.