make cppcheck skip the PACKAGE definition - c++

I'm using the GUI version of cppcheck 1.64 for static code analysis on C++ Builder 6 code. For DLL exports and imports, the definition of PACKAGE is necessary:
/// A dialog exported from a BPL (a VCL-specific kind of DLL)
class PACKAGE MySharedDialog {
public:
// lots of methods to-be checked
private:
// lots of methods to-be checked
// lots of members
};
Cppcheck stops when it encounters PACKAGE because it doesn't know what it means:
The code 'class PACKAGE TAppInfoDialog {' is not handled. You can use -I or --include to add handling of this code.
...and this of course means that the entire class isn't checked. If I could make cppcheck simply ignore the PACKAGE "keyword", it would do exactly the right thing, but how can I do that? Including its original definition via the include path does not seem to be an option: cppcheck then complains at length about headers of the VCL framework that I cannot change...
The manual does not describe an option for this, Google doesn't help, and SO does not have an answer yet.
In the cppcheck issue tracker, I found the analogous problem #4707 (Microsoft 'abstract' and 'sealed' extension for class) – cppcheck. There the lead developer suggests creating a file and (pre-?)including it in the cppcheck run, but I'm using the GUI version and there is no option to include a single file. So I tried adding a directory to the include section of my project options (an XML file) and then editing the corresponding line into a file specification, but that's clearly nonsense, because this section contains include paths.
What can I try next?

A solution is to add the definition of PACKAGE (being empty) to the project file:
<?xml version="1.0" encoding="UTF-8"?>
<project version="1">
    <defines>
        <define name="PACKAGE=" />
    </defines>
</project>
I finally found this solution in the small but valuable project-file description in the project repo: cppcheck/gui/projectfile.txt at master · danmar/cppcheck · GitHub
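For anyone using the command-line version rather than the GUI, the same empty define can be passed directly with -D; a minimal sketch (the file path is a placeholder):
REM Define PACKAGE as empty so the parser no longer stops at 'class PACKAGE ...'
cppcheck -DPACKAGE= --enable=all src\MySharedDialog.cpp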

Related

Problem with using common .editorconfig file (imported in csproj) in Visual Studio 2019 Preview 4

I want to streamline code analysis and the respective rules across multiple projects and teams.
We used to do that with a NuGet package that imported analyzers into projects (Microsoft.CodeAnalysis.FxCopAnalyzers and StyleCop.Analyzers) and a ruleset that defined how each rule was handled by VS (error, warning, etc.).
I have been trying to set this up using a common .editorconfig file instead of the ruleset. The problem is that settings like the following just seem to be ignored when the .editorconfig file is imported from a shared folder.
dotnet_diagnostic.CA1062.severity = error
For the purpose of testing this, I have a very simple scenario that illustrates the problem.
The .editorconfig file is as follows:
[*.cs]
dotnet_diagnostic.CA1062.severity = error
#dotnet_code_quality.null_check_validation_methods = NotNull
Now this file is imported in a csproj like this:
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
  <Import Project="..\..\_Shared\Build.props" />
  <ItemGroup>
    <PackageReference Include="Microsoft.CodeAnalysis.FxCopAnalyzers" Version="2.9.4">
      <PrivateAssets>All</PrivateAssets>
    </PackageReference>
  </ItemGroup>
</Project>
Build.props is like this:
<Project>
  <PropertyGroup>
    <SkipDefaultEditorConfigAsAdditionalFile>true</SkipDefaultEditorConfigAsAdditionalFile>
  </PropertyGroup>
  <ItemGroup Condition="Exists('$(MSBuildThisFileDirectory)\.editorconfig')">
    <AdditionalFiles Include="$(MSBuildThisFileDirectory)\.editorconfig" />
  </ItemGroup>
</Project>
The following code is supposed to trigger an error on CA1062:
public int Calculate(InputData input)
{
    SmartGuard.NotNull(nameof(input), input);

    if (this.Multiply)
    {
        return input.Value * 2;
    }
    else
    {
        return input.Value + 2;
    }
}
But the result is only a warning.
Now, if I change .editorconfig and uncomment the second line:
[*.cs]
dotnet_diagnostic.CA1062.severity = error
dotnet_code_quality.null_check_validation_methods = NotNull
The error goes away, which means that null_check_validation_methods is being considered.
Why is dotnet_diagnostic.CA1062.severity = error being ignored?
This and other problems with the .editorconfig mechanics were reported in the following issues:
https://github.com/dotnet/roslyn/issues/38782
https://github.com/dotnet/roslyn/issues/43080
These have been solved and the original question is answered by implementing the recommendations referenced in those issues.
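For completeness: one mechanism that came out of those issues is the global analyzer config file, which applies regardless of folder hierarchy; a minimal sketch, assuming an SDK/compiler recent enough to support it:
# _Shared\.globalconfig
is_global = true
dotnet_diagnostic.CA1062.severity = error
The shared file can then be referenced explicitly from the csproj:
<ItemGroup>
  <GlobalAnalyzerConfigFiles Include="..\..\_Shared\.globalconfig" />
</ItemGroup>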
The problem you have comes from the fact that the .editorconfig mechanics (which are not defined by Visual Studio or Microsoft - it's a pre-existing standard) are based on where the files are located in the folder structure. They have nothing to do with the mechanics of Visual Studio projects.
See here on Microsoft's mention of this:
When you add an .editorconfig file to a folder in your file hierarchy, its settings apply to all applicable files at that level and below. You can also override EditorConfig settings for a particular project, codebase, or part of a codebase, such that it uses different conventions than other parts of the codebase. This can be useful when you incorporate code from somewhere else, and don’t want to change its conventions.
To override some or all of the EditorConfig settings, add an .editorconfig file at the level of the file hierarchy you want those overridden settings to apply. The new EditorConfig file settings apply to files at the same level and any subdirectories.
[ hierarchy image here ]
If you want to override some but not all of the settings, specify just those settings in the .editorconfig file. Only those properties that you explicitly list in the lower-level file are overridden. Other settings from higher-level .editorconfig files continue to apply. If you want to ensure that no settings from any higher-level .editorconfig files are applied to this part of the codebase, add the root=true property to the lower-level .editorconfig file:
# top-most EditorConfig file
root = true
EditorConfig files are read top to bottom. If there are multiple properties with the same name, the most recently found property with that name takes precedence.
Or here for the EditorConfig project.
Or here for the EditorConfig specification:
File Processing
When a filename is given to EditorConfig a search is performed in the directory of the given file and all parent directories for an EditorConfig file (named “.editorconfig” by default). Non-existing directories are treated as if they exist and are empty. All found EditorConfig files are searched for sections with section names matching the given filename. The search shall stop if an EditorConfig file is found with the root key set to true in the preamble or when reaching the root filesystem directory.
Files are read top to bottom and the most recent rules found take precedence. If multiple EditorConfig files have matching sections, the rules from the closer EditorConfig file are read last, so pairs in closer files take precedence.
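To illustrate that precedence with the rule from the question (the paths are made up):
# C:\repo\.editorconfig - the repository root
root = true
[*.cs]
dotnet_diagnostic.CA1062.severity = error

# C:\repo\src\Legacy\.editorconfig - overrides only what it lists
[*.cs]
dotnet_diagnostic.CA1062.severity = suggestion
Files under C:\repo\src\Legacy get CA1062 as a suggestion; every other .cs file in the repo gets it as an error, because the closer file wins for the keys it repeats.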
I use Visual Studio version 16.11.2, and my experience is that the problem you describe appears, as a bug, after you have added a link to the .editorconfig file as a solution item and then edit a project file in Visual Studio. After such an action, StyleCop no longer listens to the .editorconfig file of the project.
To re-trigger StyleCop errors as build errors, I then have to:
Remove the link to the .editorconfig for the project.
Add a copy of the .editorconfig to the project.
Remove the copy of the .editorconfig from the project.
Re-add the link to the .editorconfig.
Quite awkward, yes, but the above DOES make the errors appear as build errors again.
Moreover, in the version of Visual Studio above, I need to have the lines below in the .csproj file:
<PropertyGroup>
  <EnforceCodeStyleInBuild>true</EnforceCodeStyleInBuild>
</PropertyGroup>
For previous versions of Visual Studio I had to have the lines below instead:
<PropertyGroup>
  <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  <WarningsAsErrors></WarningsAsErrors>
</PropertyGroup>

Set debug/run environment variable in Visual Studio 2017 C++ project?

I'm trying to set run/debug environment variables for my project in Visual Studio automatically.
I mean, is there any CMake or C++ code line to do this, without needing to do it manually?
Here are the instructions for how to do it manually (what I want to avoid).
Here there is a still-unsolved question about how to do it with CMake (it seems not to be possible).
I also tried setenv() and putenv() in different ways, but it didn't work, because the main function doesn't even run to that line of code before an error message shows up: "Some.dll was not found" and the program stops.
If your DLL is one you are intending to use, this answer details how to quickly ensure it is found at runtime (putting the DLL alongside the executable).
If by 'automatic' you mean in code, you can set environment variables in code using _putenv, as described in this answer, similar to what you seem to be describing:
#include <sstream>  // std::ostringstream
#include <cstdlib>  // _putenv (MSVC CRT)

std::ostringstream classSize;
classSize << "classSize=" << howManyInClass;
_putenv(classSize.str().c_str());  // _putenv copies the string, so the temporary is safe
The solution I found is based on this answer.
Steps for the solution:
Create a UserTemplate.vcxproj.user file next to the CMakeLists.txt file, with the following content:
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="15.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
    <LocalDebuggerEnvironment>PATH=..\Your\Path\to\Binaries;%PATH%</LocalDebuggerEnvironment>
    <DebuggerFlavor>WindowsLocalDebugger</DebuggerFlavor>
  </PropertyGroup>
</Project>
Where ..\Your\Path\to\Binaries is the relative path to your binary files (the two dots at the beginning, .., are optional; use them if you need to go up in the relative directory path).
Add the following lines to the CMakeLists.txt file.
# Configure the template file
SET(USER_FILE main.vcxproj.user)
SET(OUTPUT_PATH ${CMAKE_CURRENT_BINARY_DIR}/${USER_FILE})
CONFIGURE_FILE(UserTemplate.vcxproj.user ${USER_FILE} @ONLY)
Where main is the name of the VS project for which you want to define your PATH variable.
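If you'd rather not hard-code the path in the template, a small variation (a sketch; DEBUG_DLL_DIR is a name I made up) is to put an @-variable in the template and let CONFIGURE_FILE substitute it:
<!-- in UserTemplate.vcxproj.user -->
<LocalDebuggerEnvironment>PATH=@DEBUG_DLL_DIR@;%PATH%</LocalDebuggerEnvironment>
# in CMakeLists.txt, before the CONFIGURE_FILE call
SET(DEBUG_DLL_DIR "${CMAKE_SOURCE_DIR}/bin")  # wherever your DLLs live
CONFIGURE_FILE then replaces @DEBUG_DLL_DIR@ with that value when it copies the template.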

ResolveAssemblyReference cannot find dll and I cannot force it to look where it is

I have a solution with n C# projects and a C++ project on top; the C++ project provides interfaces and headers so those C# ones can be used in other C++ solutions.
The build machine is configured to build the C# projects with the AnyCPU architecture, so it produces a single assembly per build in Solution\bin\Release. For C++, AnyCPU is not available, so I build the project twice and store the assemblies in the Solution\bin\Release\x86 and x64 folders.
This is all to get it packaged in NuGet as a single package with a .targets file to ease consumption in other C++ projects.
The issue is that the C++ project looks for the C# assemblies using ResolveAssemblyReference and cannot find them, giving this misleading message:
ResolveAssemblyReferences:
Primary reference "Implementation".
Could not find dependent files. Expected file "C:\Jenkins\Workspace\Solution\bin\Release\x86\Implementation.dll" does not exist.
Could not find dependent files. Expected file "C:\Jenkins\Workspace\Solution\bin\Release\x86\Implementation.dll" does not exist.
Resolved file path is "C:\Jenkins\Workspace\Solution\bin\Release\x86\Implementation.dll".
Reference found at search path location "".
I tried to alter the ResolveAssemblyReferences behaviour using command-line properties and custom targets/properties, but without any luck. The parameters described in https://learn.microsoft.com/en-us/visualstudio/msbuild/resolveassemblyreference-task?view=vs-2017 seem to be computed during the build process, and I cannot inject any value, which in this case should be something like $(OutDir)..
The one feasible solution seems to be to copy the C# DLLs into each C++ folder, but I don't think that is the way to solve it properly.
The closest I got is by using /p:ReferencePath like below:
"C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\MsBuild\15.0\bin\MsBuild.exe" /p:BuildProjectReferences=false /p:Configuration=Release /p:DebugType=full /p:DebugSymbols=true /p:PlatformToolset=v120 /p:WindowsTargetPlatformVersion=8.1 /p:ForceImportBeforeCppTargets="C:\Jenkins\Workspace\Solution\Cpp.props" /p:OutDir="C:\Jenkins\Workspace\Solution\bin\Release\x86\" /p:Platform=Win32 /t:Build Interface\Interface.vcxproj /p:ReferencePath="C:\jenkins\workspace\Solution\bin\Release"
My custom Cpp.props does:
<Target Name="Output" BeforeTargets="ResolveAssemblyReferences">
  <Message Text="AssemblySearchPaths: $(AssemblySearchPaths)" />
</Target>
and by adding /p:ReferencePath it got added to AssemblySearchPaths as the second entry, after {CandidateAssemblyFiles}; but it is still not finding those DLLs.
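A variant I have not tried yet (a sketch, not a verified fix) would be to prepend the shared C# output folder to AssemblySearchPaths inside the force-imported Cpp.props, so that ResolveAssemblyReferences searches it before the other locations:
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <!-- search the shared C# output first; the path matches the layout above -->
    <AssemblySearchPaths>C:\Jenkins\Workspace\Solution\bin\Release;$(AssemblySearchPaths)</AssemblySearchPaths>
  </PropertyGroup>
</Project>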

Guidelines for including TMB c++ code in an R package

I've recently discovered the wonders of TMB and I'm working on a package which would ideally include TMB C++ templates for rather computationally expensive models.
I'm assuming that there's a possibility of:
Automatically compiling the TMB source code on package install
but I can't find any clear guidelines in the TMB documentation regarding this. As of now, my alternative is to write functions that compile the TMB code upon the first call of a function which uses an uncompiled class... but I have a feeling there are nicer ways to do this.
Has anyone successfully included TMB functions within another package and could point me in the direction of relevant documentation or examples?
With a bit more searching I finally found my answer in this thread. I guess I missed it because the resolutions it details were moved to the wiki page titled development, where the content is specifically targeted at users wishing to contribute to the development of TMB, whereas I just want to distribute code which incorporates TMB.
To summarize, the thread suggests some changes which I adopted like this (myPkg should be the name of your package):
src/
Place your .cpp template in myPkg/src. This will then be automatically compiled by R when you build your package.
DESCRIPTION
Add these lines to your DESCRIPTION file so R has all the tools necessary to compile the model template.
Depends: TMB, RcppEigen
LinkingTo: TMB, RcppEigen
R/roxygentags.r
Now we need to add our TMB template to the NAMESPACE file. We can do this easily through roxygen by making a dummy file like so:
#' Roxygen commands
#'
#' @useDynLib myPkg
#'
dummy <- function(){
  return(NULL)
}
The dummy function is just an excuse to have the tag @useDynLib myPkg somewhere in my source code where I won't mess with it. This tag will populate your NAMESPACE with useDynLib(myPkg)... and as I understand it, this loads the shared libraries upon loading the package for you.
Calling the function in your package:
Finally, when calling MakeADFun, set DLL="myPkg". With this setup, you can compile only a single TMB model into your package: the content compiled in your ./src/ folder is automatically renamed according to your package name, so you cannot create uniquely named models.
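For reference, a minimal call under this setup might look like this (data and params being whatever your model needs):
obj <- MakeADFun(data = data, parameters = params, DLL = "myPkg")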
EDIT: Solution for distributing multiple DLLs
After some more searching (same thread as referenced above)... I realized that the solution described in the official wiki (and detailed above) is only relevant for distributing a single DLL (i.e. a single TMB model).
If you want to distribute multiple TMB models in a package, you'll have to use your own Makefile. I've given a more detailed description on my blog, so I'll only briefly describe the steps here with regard to how they differ from the previous steps I described.
src/Makefile
You'll have to define your own Makefile (or Makefile.win for Windows users) and drop it in your src/ directory. Here's an example that works for me (note that each recipe line must be indented with a literal tab):
all: template1.so template2.so

template1.so: template1.cpp
	Rscript --vanilla -e "TMB::compile('template1.cpp','-O0 -g')"

template2.so: template2.cpp
	Rscript --vanilla -e "TMB::compile('template2.cpp','-O0 -g')"

clean:
	rm -rf *o
For Windows, replace so with dll, and use the relevant compiler flags (for debugging). See ?TMB::compile for info regarding compiler flags for debugging.
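For example, a Makefile.win following the same pattern might look like this (a sketch; I have not tested every flag combination on Windows):
all: template1.dll template2.dll

template1.dll: template1.cpp
	Rscript --vanilla -e "TMB::compile('template1.cpp','-O0 -g')"

template2.dll: template2.cpp
	Rscript --vanilla -e "TMB::compile('template2.cpp','-O0 -g')"

clean:
	rm -rf *o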
R/roxygentags.r
This is slightly different from above:
#' Roxygen commands
#'
#' This is a dummy function whose purpose is to hold the useDynLib roxygen tags.
#' These tags will populate the namespace with compiled c++ functions upon package install.
#'
#' @useDynLib template1
#' @useDynLib template2
#'
dummy <- function(){
  return(NULL)
}
Using your models in the package
Finally, the above changes will compile multiple uniquely named TMB templates and load them into the namespace. To call these models in your package, here's an example:
obj <- MakeADFun(data = data,
                 parameters = params,
                 DLL = "template1",
                 inner.control = list(maxit = 10000),
                 silent = FALSE)
Tips...
I had issues when I tried compiling this on a Windows machine... it turned out to be related to not properly cleaning the src folder: I had old Linux-compiled files stuck in there. If you have compilation issues, it's worth manually cleaning out the residual files in your src/ directory from previous builds... or perhaps someone can give some good advice on writing a better Makefile!
If you want access to the CppAD library with the additional code from TMB (which is quite substantial!), you can use the WITH_LIBTMB macro as I do in this header here. This allows you to have multiple .cpp files which you can compile separately. Importantly, you only need to compile the code from the TMB header once, using a file like this which #includes the TMB.hpp header without defining WITH_LIBTMB.
This reduces the compilation time substantially, as you can compile each .cpp on its own without all the code declared in TMB.hpp. Moreover, you can also use the code with Rcpp if you undefine and define a few macros as I do in the link.
You can also have one file which can be used by TMB::MakeADFun. It requires a bit of manual work, but it can be done while also using Rcpp: run Rcpp::compileAttributes, rename the created RcppExports.cpp file to init.cpp, and then include the additional lines in the CallEntries array and the R_init_survTMB function.
Note on using RStudio
RStudio calls Rcpp::compileAttributes (or something similar) each time you build, so you cannot rely on its standard build command. One way around this is to create a custom build script similar to the one here. It essentially calls R CMD INSTALL after having removed the RcppExports.cpp file created by Rcpp::compileAttributes. I also like to run the tests by calling devtools::test(), but you can remove this if you like.

SpecFlow unit test failed due to not able to find "TechTalk.SpecFlow" file

I have a VS2010 unit test project set up to use SpecFlow 1.8.1 and MSTest. In order to get the SpecFlow unit tests working, I've done the following:-
I added references to the following files in my project:-
Microsoft.VisualStudio.QualityTools.UnitTestFramework.dll
TechTalk.SpecFlow.dll
Note that the TechTalk.SpecFlow.dll has been added into my project and the reference points to that file.
I've set the "Copy Local" property of the TechTalk.SpecFlow.dll reference to True.
I've also added an App.Config that specifies "MsTest.2010" as the provider, and regenerated all code-behinds for the SpecFlow features.
Everything works in my VS2010; the tests run successfully in both the SpecFlow test runner and the MSTest test runner. BUT when I try to run the MSTests in TFS 2008 (using a .vsmdi test list file), it fails with the following exception:-
Class Initialization method MyNamespace.MyTestFeature.FeatureSetup threw exception.
System.Configuration.ConfigurationErrorsException:
System.Configuration.ConfigurationErrorsException: An error occurred creating the
configuration section handler for specFlow: Could not load file or assembly
'TechTalk.SpecFlow' or one of its dependencies. The system cannot find the file
specified. (D:\Projects\TestProject\TestResults\administrator_MYPC 2012-06-27
18_30_05_Any CPU_Debug\Out\TestProject.DLL.config line 4) --->
System.IO.FileNotFoundException: Could not load file or assembly 'TechTalk.SpecFlow'
or one of its dependencies. The system cannot find the file specified.
Note that the TFS built the project fine and it runs other unit tests in the same project (normal mstests, not SpecFlow) without problems. It only failed for the SpecFlow test runs.
So what am I doing wrong?
Edit: The contents of my App.Config file looks like this:
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <configSections>
    <section
      name="specFlow"
      type="TechTalk.SpecFlow.Configuration.ConfigurationSectionHandler, TechTalk.SpecFlow" />
  </configSections>
  <specFlow>
    <unitTestProvider name="MsTest.2010" />
    <runtime detectAmbiguousMatches="true"
             stopAtFirstError="false"
             missingOrPendingStepsOutcome="Inconclusive" />
    <trace traceSuccessfulSteps="true"
           traceTimings="false"
           minTracedDuration="0:0:0.1" />
  </specFlow>
</configuration>
Following the instructions on this site and this site:
the command Tools > Library Package Manager > Package Manager Console allows you to type in PM> Install-Package SpecFlow
When the prompt returns "installed successfully", the SpecFlow assembly appears in the references of your project, and the MSTest project now compiles successfully (at least for me).
I got this error as well; in my case the problem was that I was using the \...\obj\Debug||Release\ folder as the target and not the \...\bin\Debug||Release\ folder. Looking in these folders, I saw that the TechTalk.SpecFlow.dll assembly was missing from the former. Simply switching the folder in my .bat file fixed the problem.
Sometimes VS2013 looks for the SpecRun DLLs not in the project folder, but in C:\Users\**YOUR_USER**\AppData\Local\Temp\VisualStudioTestExplorerExtensions\SpecRun.Runner.1.3.0\tools. So you just need to put all the necessary SpecFlow libraries there.
One hack I found to get it working is to add another class for EVERY single SpecFlow feature that I created in the project. The class looks like this:-
[DeploymentItem(@"TechTalk.SpecFlow.dll")]
partial class MyTestFeature { }
// The above class name needs to come from the auto-generated code-behind
// (.feature.cs) for each SpecFlow feature.
I consider this a very nasty hack, but it does provide a clue as to why it didn't work. It would be good if anyone comes up with a more elegant solution.
I finally found a more proper fix for this issue. I just need to add a post-build event to remove the .config file from the build output. (The App.config file is used only to generate the code-behind at design time. It is not used at all at runtime, so it can be removed.)
The command for the post-build event looks like this:-
del /f /q "$(TargetDir)$(TargetFileName).config"
Correction: The .config file is used for generating inconclusive results, so a better post-build event command is as follows:-
if "$(IsDesktopBuild)"=="false" del /f /q "$(TargetDir)$(TargetFileName).config"