Debug script code that is being executed with Roslyn CSharpScript

I created this test console application to run some C# code with the Roslyn scripting engine (in the Microsoft.CodeAnalysis.CSharp.Scripting NuGet package).
string code = "int test = 123;\r\nConsole.WriteLine(\"hello, world!\");";
var options = ScriptOptions.Default.WithImports("System");
var script = CSharpScript.Create(code, options);
await script.RunAsync();
This works, but now I would also like the option of somehow debugging into the script that is being executed. Is there a way to do that?

I figured out a way to do it: write the code to a temporary file and emit debugging information pointing to that file. Then I can step into the RunAsync call, and Visual Studio will load the temporary file, show an execution pointer, and let me inspect variables.
using Microsoft.CodeAnalysis.CSharp.Scripting;
using Microsoft.CodeAnalysis.Scripting;
using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;

namespace RoslynScriptingTest
{
    class Program
    {
        static async Task Main(string[] args)
        {
            string code = "int test = 123;\r\nConsole.WriteLine(\"hello, world!\");";

            // Write the script to a temporary file so the debugger has a source file to load.
            string tmpFile = Path.GetTempFileName();
            var encoding = Encoding.UTF8;
            File.WriteAllText(tmpFile, code, encoding);
            try
            {
                // Emit debug information that points at the temporary file.
                var options = ScriptOptions.Default
                    .WithImports("System")
                    .WithEmitDebugInformation(true)
                    .WithFilePath(tmpFile)
                    .WithFileEncoding(encoding);

                var script = CSharpScript.Create(code, options);
                await script.RunAsync(); // Step into this call to debug the script.
            }
            finally
            {
                File.Delete(tmpFile);
            }

            Console.ReadKey();
        }
    }
}
Debugging only seems to work when "Just My Code" is enabled in the Visual Studio debugger settings.
In my actual use case I'm loading the code from an XML file, so it would be better if I could point to that original file and map the line numbers somehow. But this is already a good start.
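One untested idea for the XML case is to prepend a #line directive so the emitted debug information points back at the original file instead of a temp file. The sketch below assumes the scripting compiler honors #line the same way the regular C# compiler does; the file name scripts.xml and the starting line number are hypothetical placeholders.

// Reuses the usings from the sample above.
string originalFile = Path.GetFullPath("scripts.xml"); // hypothetical file the code was extracted from
int firstLineInXml = 42;                                // hypothetical line where the code starts in that file

string code = "int test = 123;\r\nConsole.WriteLine(\"hello, world!\");";

// Tell the compiler to report the following lines as coming from originalFile, starting at firstLineInXml.
string mappedCode = $"#line {firstLineInXml} \"{originalFile}\"\r\n" + code;

var options = ScriptOptions.Default
    .WithImports("System")
    .WithEmitDebugInformation(true)
    .WithFilePath(originalFile)
    .WithFileEncoding(Encoding.UTF8);

await CSharpScript.Create(mappedCode, options).RunAsync();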

Related

How can I attach a debugger to Google V8?

In my application (Windows 10, VC2017) I added the ability to write and execute scripts using Google V8 and v8pp.
v8pp runs a script like this:
v8::Local<v8::Value> context::run_script(std::string const& source, std::string const& filename)
{
    v8::EscapableHandleScope scope(isolate_);
    v8::Local<v8::Context> context = isolate_->GetCurrentContext();

    v8::ScriptOrigin origin(to_v8(isolate_, filename));
    v8::Local<v8::Script> script;
    bool const is_valid = v8::Script::Compile(context,
        to_v8(isolate_, source), &origin).ToLocal(&script);

    v8::Local<v8::Value> result;
    if (!script.IsEmpty())
    {
        auto res1 = script->Run(context);
        if (!res1.IsEmpty())
            result = res1.ToLocalChecked();
    }
    return scope.Escape(result);
}
How can I attach a debugger (the Chrome debugger) to my code?
I found Google's description at https://v8.dev/docs/inspector, but it leaves some things blank and consists mostly of JS code.
I also found the implementation for v8toolkit at https://github.com/xaxxon/v8toolkit/blob/master/src/debugger.cpp, but it does not seem to run on Windows.
What is an easy way to attach the Chrome debugger to JS code? The code is typically not a file; it is stored in a database and then held in a std::string.
I finally got a Windows version of v8inspector working well with my stand-alone Windows application with integrated V8.
I made my own fork, including descriptions of where to find/build the required third-party libraries (or where to find prebuilt binaries), and made a number of changes/additions:
https://github.com/StefanWoe/v8inspector
In the meantime this has also been merged into the parent project:
https://github.com/hsharsha/v8inspector
EDIT:
I have since been pointed to another implementation, built with boost::beast and no other dependencies. It is much simpler and more robust:
https://github.com/ahmadov/v8_inspector_example

HXCPP Profiler won't create log file

I am using Haxe for a game and compiling for the C++ target using HXCPP. I am trying to get the built-in profiler to work (cpp.vm.Profiler), but I cannot get it to create a dump file. My code is as simple as this:
if (Input.check(Key.P))
    cpp.vm.Profiler.start("profiler.txt");
if (Input.check(Key.M))
    cpp.vm.Profiler.stop();
I use HaxePunk for the input, and I verified that the profiler calls are indeed being executed (I checked with a couple of trace calls). I use the defines HXCPP_STACK_TRACE and HXCPP_PROFILER for compilation.
Am I doing anything wrong, or missing anything?
EDIT: here is some code that, when compiled with haxe -D HXCPP_PROFILER -D HXCPP_STACK_TRACE -main Main -cpp test, doesn't actually create any noticeable "profiler.txt" file:
class Main
{
    static public function main()
    {
        var bleh = haxe.Timer.stamp();
        cpp.vm.Profiler.start("profiler.txt");
        while (haxe.Timer.stamp() - bleh < 5.)
        {
            // Do something I guess
            Math.cos(haxe.Timer.stamp());
        }
        cpp.vm.Profiler.stop();
    }
}
Relevant bug report to hxcpp: #580.
Apparently this was fixed on 17 May 2017 in this commit. The fix should be in the next hxcpp version after 3.4.64.

Can I programmatically collapse/expand all preprocessor blocks of a certain name in Visual Studio 2012?

My current project has a lot of debug preprocessor blocks scattered throughout the code. These are intentionally named differently from the system _DEBUG and NDEBUG macros, so I have a lot of this:
// Some code here
#ifdef PROJNAME_DEBUG
//unit tests, assumption testing, etc.
#endif
// code continues
These blocks sometimes get rather large, and their presence can sometimes inhibit code readability. In Visual Studio 2012 I can easily collapse these, but it would be nice to automatically have all of them collapsed, allowing me to expand them if I want to see what's in there. However, as I also have a bunch of header guards I don't want to collapse all preprocessor blocks, only the #ifdef PROJNAME_DEBUG ones.
Can I do this?
This is the easiest way to achieve it, I think.
First, create an Add-In in C#. (In VS 2013 Add-Ins become deprecated :( )
In the OnConnection method, add your command:
public void OnConnection(object application, ext_ConnectMode connectMode, object addInInst, ref Array custom)
{
    _applicationObject = (DTE2)application;
    if (connectMode == ext_ConnectMode.ext_cm_AfterStartup || connectMode == ext_ConnectMode.ext_cm_Startup)
    {
        Commands2 commands = (Commands2)_applicationObject.Commands;
        try
        {
            // Add a command to the Commands collection:
            Command command = commands.AddNamedCommand2(
                _addInInstance,
                "MyAddinMenuBar",
                "MyAddinMenuBar",
                "Executes the command for MyAddinMenuBar",
                true,
                59,
                ref contextGUIDS,
                (int)vsCommandStatus.vsCommandStatusSupported + (int)vsCommandStatus.vsCommandStatusEnabled,
                (int)vsCommandStyle.vsCommandStylePictAndText,
                vsCommandControlType.vsCommandControlTypeButton);
        }
        catch (System.ArgumentException)
        {
            // If we are here, bla, bla... (Auto generated)
        }
    }
}
Note: you can find how the parameters behave in the reference for AddNamedCommand2.
The template-created version would also be fine, but naturally it is worth naming your command properly.
After that you need to add your logic to the Exec method:
public void Exec(string commandName, vsCommandExecOption executeOption, ref object varIn, ref object varOut, ref bool handled)
{
    handled = false;
    if (executeOption == vsCommandExecOption.vsCommandExecOptionDoDefault)
    {
        if (commandName == "MyAddinMenuBar.Connect.MyAddinMenuBar")
        {
            // The command arguments are the filter words, e.g. "#ifdef PROJNAME_DEBUG".
            List<string> args = (varIn as string).Split(' ').ToList();

            TextSelection ts = (TextSelection)_applicationObject.ActiveDocument.Selection;
            EditPoint ep = (ts.ActivePoint).CreateEditPoint();
            ep.StartOfDocument();
            do
            {
                // Toggle outlining on every line that contains all of the filter words.
                string actualLine = ep.GetLines(ep.Line, ep.Line + 1);
                if (args.TrueForAll(filter => actualLine.Contains(filter)))
                {
                    _applicationObject.ExecuteCommand("Edit.GoTo", ep.Line.ToString());
                    _applicationObject.ExecuteCommand("Edit.ToggleOutliningExpansion");
                }
                ep.LineDown();
            } while (!ep.AtEndOfDocument);

            handled = true;
            return;
        }
    }
}
Note: the name you gave the command is what is checked in Exec.
Then you can build.
Deployment of the Add-In happens through a [ProjectName].AddIn file in ..\Documents\Visual Studio 20[XY]\AddIns\. (It is created by the template; you should copy it if you move the Add-In elsewhere.)
Place your Add-In assembly wherever the Assembly element of that file points; to change the version, modify the text in the Version element.
After you have deployed it and started the Studio, activate the Add-In in the Add-In Manager under the Tools menu.
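For reference, a minimal [ProjectName].AddIn file might look roughly like this; the assembly path, class name, and version below are placeholders, so check the template-generated file for the exact values:

<?xml version="1.0" encoding="UTF-16" standalone="no"?>
<Extensibility xmlns="http://schemas.microsoft.com/AutomationExtensibility">
  <HostApplication>
    <Name>Microsoft Visual Studio</Name>
    <Version>11.0</Version>
  </HostApplication>
  <Addin>
    <FriendlyName>MyAddinMenuBar</FriendlyName>
    <Description>Collapses preprocessor blocks matching a filter.</Description>
    <Assembly>C:\Path\To\MyAddinMenuBar.dll</Assembly>
    <FullClassName>MyAddinMenuBar.Connect</FullClassName>
    <LoadBehavior>1</LoadBehavior>
    <CommandPreload>1</CommandPreload>
    <CommandLineSafe>0</CommandLineSafe>
  </Addin>
</Extensibility>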
You need to expand all collapsible sections in your code file (Ctrl+M, L with the C# IDE settings).
This is required because I only found a way to invert the collapsed state. If you find a better command, you can change it.
Next, activate the Command Window to use the created command.
Now you only need to type your command's name, like this:
MyAddinMenuBar.Connect.MyAddinMenuBar #ifdef PROJNAME_DEBUG
Hopefully magic will happen.
This solution is independent of the language of the code you edit, so it is pretty versatile.

TFS: Query for builds containing a specific changeset

I have a number of build definitions that get executed based upon a single branch in TFS (eg Main).
I'd like to (somehow) query TFS to find all builds containing a specific changeset number that I supply, and return a list of strings with the names of the builds that TFS contains. Any kind of app (VS extension, CLI app, WinForms, whatever) will do.
Note: this isn't a 'plz give me the code' request; I'm willing to hoof it and do serious work on this. Any pointers to documentation on how to query the database or SDK, or an example of how to query builds; just some place to start looking would be extremely helpful. Thanks.
The following snippet will crawl all Build Definitions of all Team Projects of a Collection, and will check each and every build for an association with the input changeset number:
using System;
using System.Linq;
using Microsoft.TeamFoundation.Build.Client;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

namespace FindChangesetInBuild
{
    class Program
    {
        static void Main(string[] args)
        {
            TfsTeamProjectCollection teamProjectCollection =
                TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("http://tfs:8080/tfs/collectionName"));
            var versionControl = teamProjectCollection.GetService<VersionControlServer>();
            var buildService = (IBuildServer)teamProjectCollection.GetService(typeof(IBuildServer));

            var teamProjects = versionControl.GetAllTeamProjects(true);
            foreach (var teamProject in teamProjects)
            {
                var buildDefinitions = buildService.QueryBuildDefinitions(teamProject.Name);
                foreach (var buildDefinition in buildDefinitions)
                {
                    var builds = buildService.QueryBuilds(buildDefinition);
                    foreach (var buildDetail in builds)
                    {
                        var changesets = InformationNodeConverters.GetAssociatedChangesets(buildDetail);
                        if (changesets.Any(changesetSummary => changesetSummary.ChangesetId == Convert.ToInt32(args[0])))
                        {
                            Console.WriteLine("Changeset was built in " + buildDetail.BuildNumber);
                        }
                    }
                }
            }
        }
    }
}
Needless to say, this is a brute-force attack. You can refine the code further if you narrow down the list of buildDefinitions, focus on specific teamProjects, etc.; in any case I can hardly imagine the above being useful as-is! Apart from (obviously) MSDN, a great resource for the TFS SDK is Shai Raiten's blog. For build-specific examples, check also here & here for some possibly interesting SO posts.
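As an illustration of narrowing it down, here is a rough, untested sketch using an IBuildDetailSpec instead of querying every build of every definition; the team project name and the 30-day window are placeholders, and the snippet reuses buildService and args from the program above:

// Only query one team project, and only builds that finished in the last 30 days.
var spec = buildService.CreateBuildDetailSpec("MyTeamProject");
spec.MinFinishTime = DateTime.Now.AddDays(-30);

foreach (var buildDetail in buildService.QueryBuilds(spec).Builds)
{
    var changesets = InformationNodeConverters.GetAssociatedChangesets(buildDetail);
    if (changesets.Any(c => c.ChangesetId == Convert.ToInt32(args[0])))
    {
        Console.WriteLine("Changeset was built in " + buildDetail.BuildNumber);
    }
}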
You can use this little DB query against TFS 2010 and just substitute your changeset id for 90264.
USE Tfs_Warehouse
go
SELECT BuildName
FROM DimBuild
INNER JOIN FactBuildChangeset
ON DimBuild.BuildSK = FactBuildChangeset.BuildSK
WHERE FactBuildChangeset.ChangesetSK = 90264

Storing an INI file in memory

I am writing an application that uses an ini file to store all status codes (errors, success codes, etc.). A very simple version is this:
[success]
000=Status code not found.
[error]
000=Error code not found.
001=Username/Password not found.
And my CF component that works with it uses the following code:
component hint="Set of tools to interact with status codes." {

    public any function init(string codefile) {
        this.codefile = Arguments.codefile;
    }

    public string function getCodeString(string type, string code) {
        var code = getProfileString(Variables.codefile, Arguments.type, Arguments.code);
        return code;
    }

}
What I assume happens when I call the getProfileString is that Railo opens the file, searches for the key and returns the value. So as my application grows and I have a lot more codes, I expect that this process will slow down. So is there a way that I can open the file in my init method and read it all into the variables scope, and call the getProfileString from there?
You can even parse your ini file in onApplicationStart and push the data into the application scope, as recommended for an XML file by @Sergii, if you want to stick with the .ini approach.
Do something like this:
var sections = getProfileSections(variables.codeFile);
var sectionEntries = [];
var indx = 0;

for (key in sections) {
    sectionEntries = listToArray(sections[key]);
    application[key] = {};
    for (indx = 1; indx <= arrayLen(sectionEntries); indx++) {
        application[key][sectionEntries[indx]] = getProfileString(variables.codeFile, key, sectionEntries[indx]);
    }
}
I haven't tested this on Railo, but it should work on ColdFusion 9 at least.
Because you are using Railo, there's possibly an even easier solution: put the file into the RAM filesystem.
The full path to the file will then look like ram:///some/path/to/config.ini.
Obviously, you need to write the file into RAM first, possibly on the first request.
A slightly modified version of the component may look like this:
component hint="Set of tools to interact with status codes." {

    public any function init(string codefile, string ramfile) {
        variables.codefile = arguments.codefile;
        variables.ramfile = arguments.ramfile;
    }

    public string function getCodeString(string type, string code) {
        if (NOT fileExists(variables.ramfile)) {
            // Copy the ini file into the RAM filesystem on first use.
            fileCopy(variables.codefile, variables.ramfile);
        }
        return getProfileString(variables.ramfile, arguments.type, arguments.code);
    }

}
Please note that I've changed this.codefile to variables.codefile in init.
Anyway, I'm also not sure an ini file is the handiest and most maintainable solution. You need to parse it each time anyway, right? If you need a file config, use XML. Just parse it in onApplicationStart and push the data into the application scope.