Is there any way in Perforce to compare the diffs of two changelists?

I regularly need to check whether the changes in a changelist are the same as the changes in a reference changelist. Is there a command or script that would facilitate this?

The first idea that comes to mind is to write a script that runs the p4 diff command (or p4 diff2, if you are comparing two submitted changelists on the server) for each of the two changelists you want to compare and redirects each output to a separate text file. Then run p4merge and pass the two files in as arguments.
p4 diff first_path/...#345 > diff1.txt
p4 diff second_path/...#347 > diff2.txt
p4merge diff1.txt diff2.txt
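If both changelists are already submitted, p4 describe can generate each diff in a single step; a minimal sketch, with 345 and 347 standing in as placeholder changelist numbers:
p4 describe -du 345 > diff1.txt
p4 describe -du 347 > diff2.txt
p4merge diff1.txt diff2.txt
The -du flag requests unified diff output; each file will start with the changelist's description header, which you may want to strip before comparing.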
Hopefully that at least gets you started on some ideas!


Is there a way to apply SAS EG processes to new files?

I'm taking over a project from a coworker that involves several extensive SAS process flows. I have all the files, with all the same names, and a copy of the process flows they used. Since the file paths in their processes point directly to locations on their computer, normally I would just re-import the files with the same output names and run the process from there. In a few cases I would have to recreate a query builder task, as I'm using a few .sas7bdat files from another project.
However, there are quite a few files involved, and I may end up having to pass this on to another coworker in a few months. Since I can't get a good look at exactly what the import task is doing, I'm concerned I may have imported some of the variables incorrectly. Is there an easy way to just change the file path that the import or another task refers to?
Given the updates in the comments, there are two possibilities I see.
If the paths you're changing are, or can be, relative to the location of the EGP file, you can right-click the project, choose Properties -> File References, and check "Use paths relative to the project...". Instead of storing a file as c:\my EGP folder\my code folder\code.sas, the project would then store it as my code folder\code.sas. So if the whole project moves to another computer (or just any other folder), it automatically has the right path. This is mostly useful for code and similar things.
Otherwise, you're going to have to convert things to SAS code modules, where you can use macro variables to define the locations of things.

Batch file to compare one local and one network folder and copy the local folder to another local folder if they match

I have a folder on a network drive that contains sub-folders and files. I want to automate copying the files and folders to my four local computers. Due to bandwidth issues, I have a scheduled task that pulls the update over at night to a single computer. I would like a batch file for the other three local computers that can verify that the two folders on separate devices (one local and one remote) are in sync, then copy the local files to itself.
I have looked through Robocopy and several of the other compare commands, and I see they give me a report of the differences, but what I am looking for is something conditional so that batch processing can continue. I would execute it from a scheduled task, but basically it would perform like:
IF "\\remotepc\folder" EQU "\\localpc1\folder" robocopy "\\localpc1\folder" "c:\tasks\updater" /MIR
ELSE GOTO :EOF
Thanks in advance. Any help is appreciated.
A Beyond Compare script can compare two folders, then copy matching files to a third folder.
Script:
load c:\folder1 \\server1\share\folder1
expand all
select left.exact.files
copyto left path:base c:\output
To run the script, use the command line:
"c:\program files\beyond compare 4\bcompare.exe" "#c:\script.txt"
The # character makes Beyond Compare run a file as a script instead of loading it for GUI comparison.
Beyond Compare scripting resources:
Beyond Compare Help - Scripting
Beyond Compare Help - Scripting Reference
Beyond Compare Forums - Scripting

How to run only certain lines of a long Stata do file? (e.g. lines 30-3200)

I have a very, very long do file which runs a different set of commands every 800 lines or so (about 8000 lines in total).
It is very cumbersome to select, for example, lines 30-3200 every time to get the code to run. Is it possible to write a command which runs only a segment of the code? And possibly multiple segments, e.g. lines 30-3200 and then 4800-5400?
Thanks!
I typically take one of two approaches to this:
For a big project, split the analysis into separate files (e.g. a clean.do file to prepare data, a stats.do file for summary stats, an analysis.do file for regressions, ...). Then, create something like a build.do file that uses the include command to run the other files:
// build.do - Run the full analysis
include clean.do
include stats.do
include analysis.do
You can then re-run selected parts of the pipeline by typing include file.do from the command prompt.
Splitting the analysis into separate files like this is also a good idea if you're using source control and collaborating with others.
Alternatively, you could have one file that takes arguments controlling which parts of the code run:
args run_a run_b // read the do file's arguments into local macros
if `run_a' == 1 {
// run the part A code ...
}
if `run_b' == 1 {
// run the part B code ...
}
You then specify the parts of the code to run by passing arguments to the script. For example, do file.do 1 0 would run part A only. This method might become hard to manage with big files; imagine trying to remember what 10 different arguments do.
Personally, I prefer method 1. Keep the do files short and give them logical names as a way to organize the code. Method 1 might also make it easier to find and re-use code in other projects.

Batch file date check from multiple folders

I'm looking (if possible) for a check-and-copy batch script that I can run remotely to check multiple directories and copy the one with the newest modified date.
To clarify: on the remote machine I'm looking at up to five profile folders (which may or may not be there). I need the script to check the last modified date of two sub-folders (Desktop and Internet Favorites) in each of a user's potential five profiles, then pick the most recent modified date and copy those folders to another location.
So the paths look like:
"\\%asset%\c$\documents and settings\%username%\Desktop"
"\\%asset%\c$\documents and settings\%username%\Favorites"
To check the date and compare it with (potentially)
"\\%asset%\c$\documents and settings\%username%.temp\Desktop"
"\\%asset%\c$\documents and settings\%username%.temp\Favorites"
Or
"\\%asset%\c$\documents and settings\%username%.temp001\Desktop"
"\\%asset%\c$\documents and settings\%username%.temp001\Favorites"
Once it has found the sub-folders with the most recent modified date, it should copy them (only the most recent) to:
"\\%asset%\c$\documents and settings\Backup"
I know I can get the check done on one location, but I don't know how to ask batch to run multiple checks and then pick the most recent.
Is that actually possible, or am I trying this in the wrong language? I've gotten everything but the check written out, and that's where I'm getting stuck...
Any help would be appreciated!
As I said in my comment, I think PowerShell would be better suited to the task. But I did think of one approach that may work in batch without having to resort to textual date comparisons (which are difficult).
You may be able to use robocopy with its /copy:t option, which copies timestamps. Imagine copying all three directories to one staging location, then doing a dir /b /od to list that temporary directory sorted by date, finding the most recent entry, and copying that to your target.
I don't have time to test this theory out or give you real code, but hopefully it gives you an approach to try. Or convinces you to take a look at PowerShell. :-)
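To make the idea concrete, here is a rough, untested sketch of that approach in batch; the staging folder name, the /dcopy:t switch (needed on newer robocopy versions to preserve folder dates, which dir /od sorts on), and limiting it to the Desktop folders are all assumptions:
@echo off
rem Untested sketch: stage each candidate profile's Desktop folder, keeping timestamps.
set "base=\\%asset%\c$\documents and settings"
set "stage=c:\tasks\datecheck"
robocopy "%base%\%username%\Desktop" "%stage%\plain" /e /copy:DAT /dcopy:t
robocopy "%base%\%username%.temp\Desktop" "%stage%\temp" /e /copy:DAT /dcopy:t
robocopy "%base%\%username%.temp001\Desktop" "%stage%\temp001" /e /copy:DAT /dcopy:t
rem dir /od lists oldest first, so the last folder reported is the most recent.
for /f "delims=" %%d in ('dir /b /od /ad "%stage%"') do set "newest=%%d"
robocopy "%stage%\%newest%" "%base%\Backup\Desktop" /e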

How do you effectively compare 15000 files multiple times?

I am comparing two almost identical folders that include hidden .svn folders, which should be ignored, and as files are patched I want to quickly re-compare the folders to see the differences without checking the unchanged, matching files again.
Edit:
Because there are so many options, I'm interested in a solution that clearly exploits knowledge from the previous compare; any other solution is not really feasible when doing repeated comparisons.
If you are willing to spend a bit of money, Beyond Compare is a pretty powerful diffing tool that can do folder-based diffing.
Beyond Compare
I personally use WinMerge and find it very useful. It has filters that exclude .svn files. Under Linux I prefer Meld.
One option would be to use rsync. Something like:
rsync -n -r -v -C dir_a dir_b
The -n option does a dry-run so no files will be modified. -r does a recursive comparison. Optionally turn on verbose mode with -v. (You could use -i to itemize the changes instead of -v.) To ignore commonly ignored files such as .svn/ use -C.
This should be faster than a simple diff; from the rsync manpage:
Rsync finds files that need to be transferred using a "quick check" algorithm (by default) that looks for files that have changed in size or in last-modified time. Any changes in the other preserved attributes (as requested by options) are made on the destination file directly when the quick check indicates that the file's data does not need to be updated.
Since the "quick check" algorithm does not look at file contents directly, it might be fooled. In that case, the -c option, which performs a checksum instead, may be needed. It is likely to be faster than an ordinary diff.
In addition, if you plan on syncing the directories at some point, this is a good tool for that job as well.
Not foolproof, but you could just compare the timestamps.
Use Total Commander! All the cool developers use it :)
If you are on Linux or some Unix variant, you should be able to do:
prompt$ diff -r dir1 dir2 --exclude=.svn
The -r flag forces a recursive comparison. There are a bunch of switches to ignore things like whitespace, etc.
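For example, GNU diff's -w switch ignores whitespace-only changes:
prompt$ diff -r -w dir1 dir2 --exclude=.svn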