Unload source file with GDB

(gdb) source script.py loaded the script file into GDB.
How do I unload that script? How do I unload all loaded scripts, or view all the scripts that have been loaded?

Think of the (gdb) prompt as a shell. You can't "unload" a script any more than you can "untype" a command you typed into the shell:
$ FOO="bar"
$ source script.sh
Neither of the above commands can be "unloaded".
If you need to reset the state of the shell or of gdb, start a new instance of it.

The script is "sourced", not "loaded": it executed and exited, hence you can't unload it. It may have left things behind (pretty-printers, commands, breakpoints, configuration changes, etc.). You can't unload them all as a group; you have to find them and undo them one by one.
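For example, undoing the common leftovers looks like this (a sketch; which commands apply depends on what the script actually registered). info pretty-printer and info breakpoints list what is there; disable pretty-printer with no argument disables all registered pretty-printers, and delete with no argument removes all breakpoints:
(gdb) info pretty-printer
(gdb) disable pretty-printer
(gdb) info breakpoints
(gdb) delete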

Related

Startup script NOT running in instance

I have an instance running a Flask web app. In order for the app to start when the VM is booted, I have included a startup script:
#!/bin/sh
cd documentai_webapp
cd docai_webapp_instance_gcp
sudo python3 server.py
However, this is not executed at all. Can anyone help me? Thanks!
PS: When I execute this script manually within the VM, it works perfectly fine.
As context, it is necessary to consider the following:
For Linux startup scripts, you can use bash or non-bash file. To use a non-bash file, designate the interpreter by adding a #! to the top of the file. For example, to use a Python 3 startup script, add #! /usr/bin/python3 to the top of the file.
If you specify a startup script by using one of the procedures in this document, Compute Engine does the following:
Copies the startup script to the VM
Sets run permissions on the startup script
Runs the startup script as the root user when the VM boots (this is the step missing from #Andoni's setup)
For information about the various tasks related to startup scripts and when to perform each one, see the Overview.
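Since the script runs as root from an unspecified working directory, the relative cd commands and the sudo call are the usual culprits. A minimal fix might look like the following sketch (the absolute path is an assumption of mine and must be adjusted to wherever the app actually lives):
#!/bin/sh
# Startup scripts already run as root, so sudo is unnecessary;
# use an absolute path because the working directory is not your home.
cd /home/your_user/documentai_webapp/docai_webapp_instance_gcp || exit 1
python3 server.py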

How to make GCP start script to start multiple processes?

So, I'm using Google Cloud Platform and set the startup script below:
#! /bin/bash
cd /home/user
sudo ./process1
sudo ./process2
I was worried about this script because process1 blocks the shell and prevents sudo ./process2 from running. And that is exactly what happened: process1 started successfully, but process2 did not.
I checked that the script itself has no problem starting process1 and process2. Executing ./process2 via SSH worked, but after I closed the SSH session, process2 stopped too.
How can I start both processes at boot time (or even after)?
I tried testing your startup script in my environment; the script itself seems to work.
1. Try checking the process1 and process2 binaries themselves.
2. If you want your process to keep running in the background even after the SSH session is closed, add "&" at the end of your command: { your_command & }
To run a command in the background, add the ampersand symbol (&) at the end of the command:
your_command &
Then the script execution continues and isn't blocked. Alternatively, use Linux's built-in means to auto-run processes on boot (e.g., a systemd service).
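Applied to the startup script above, that might look like the following sketch (nohup and the log redirections are additions of mine, so the processes are detached from the script's lifetime and their output is captured):
#! /bin/bash
cd /home/user
# Background both processes so neither blocks the other.
# sudo is unnecessary here; startup scripts already run as root.
nohup ./process1 > /home/user/process1.log 2>&1 &
nohup ./process2 > /home/user/process2.log 2>&1 &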

Informatica PowerCenter shell not working

I tried to run a Command task to clear the contents of a file using .sh commands and printf, but neither one is working, whereas mkdir, copy, and rm work fine.
Just wondering whether I have to change any configuration settings to clear and write the contents of a file.
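For reference, clearing a file from a shell command is usually done in one of these ways (a sketch; the path is a placeholder):
> /path/to/file.txt                # redirect nothing into the file
printf '' > /path/to/file.txt      # same effect, via printf
truncate -s 0 /path/to/file.txt    # explicit truncation to zero bytes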

Running SAS program via Crontab PuTTY

I am new to the world of PuTTY and hoping this is an easy ask. I have 16 SAS programs that I need to automatically kick off once a month using crontab via a PuTTY environment. I have it set up to email me, but it just tells me the file doesn't exist. What am I missing in my script?
CRONTAB:
SHELL=/bin/bash
* 9 15 * * /prod/file/sas-data2/....../SasProgram.sas
Please help!
You usually need to add the SAS executable to the command. Assuming it is in the path, then just
sas /prod/file/sas-data2/....../SasProgram.sas
should work.
If it is not in the path, then explicitly prefix sas with the path.
I find it is much easier to maintain if the CRONTAB entry points to a shell script that runs the commands. Then, if the list of SAS programs to run changes, you can just edit the script file and not have to mess with CRONTAB again.
Also, jobs run from CRONTAB do not normally run your normal startup file (.profile if using sh-variant shells like bash), so it is useful to source it in the script so that your normal environment variables and search paths exist.
CRONTAB:
* 9 15 * * /mydirectory/nightly_job.ksh
Script file
#!/bin/bash
# Set environment
. /mydirectory/crontab.profile
#
cd /prod/file/sas-data2/....../
sas SasProgram.sas
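One more detail worth checking (my addition, not part of the original answer): cron can only run the script directly if it has execute permission:
chmod +x /mydirectory/nightly_job.ksh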

'Execute Command' keyword does not complete the execution on remote machine

I am trying to run a command (a jar file execution) on a remote machine using the 'Execute Command' keyword of the SSH library, but control returns even before the command execution is complete. Is there a way to wait until the command has finished executing?
Below are the keywords as written:
Run The Job
    [Arguments]    ${machine_ip}    ${username}    ${password}    ${file_location}    ${KE_ID}
    Open Connection    ${machine_ip}    timeout=60
    Login    ${username}    ${password}
    ${run_jar_file}=    Set Variable    java -jar -Dspring.profiles.active=dev ${file_location} Ids=${KE_ID}
    ${output}=    Execute Command    ${run_jar_file}
    Log    ${output}
    Sleep    30
    Close Connection
Use Read and Write instead of "Execute Command", so that you can specify a timeout for command execution.
Refer to: http://robotframework.org/SSHLibrary/latest/SSHLibrary.html#Write
You are explicitly asking for the command to be run in the background (by virtue of adding & as the last character in the command to be run), so the ssh library has no way of knowing when the program you're running exits. If you want to wait for it to finish, don't run it in the background.
In other words, remove the trailing & from the command you are running.
If anyone is still struggling with this one, I have discovered a solution:
Open Connection    ${SSH_HOST}    timeout=10s
Login    login    pass
Write    your_command
Set Client Configuration    prompt=$
${output}=    Read Until Prompt
Should End With    ${output}    ~ $