This question already has answers here:
How to ignore generated files from Go test coverage
(4 answers)
Closed 4 years ago.
I have a folder structure somewhat like this:
├── executor
│   ├── executor_test.go
│   ├── executor.go
│   ├── excutor_mock.go
│   ├── errors.go
│   └── app.go
├── _includes
│   ├── xyz.go
│   └── abc.go
├── vendor
executor_test.go contains all the unit test cases for executor.go.
So when I run go test --cover ./... it shows me the coverage package by package, which is good, but I also want to exclude errors.go, excutor_mock.go and app.go in the executor folder from showing up in the go test --cover ./... output. Is there a way to exclude them?
You can add all the tests you want to run in my_test.go and run go test my_test.go.
You can use the -run=<regex> flag. This works when the names of the tests you want to run share a common prefix, suffix or term.
If anyone still ends up here: I tried building a command-line application for this. It's still rough around the edges and doesn't import all the packages, but it does what you asked for here: it excludes files based on the contents of a .testignore file.
Here is the link: https://github.com/kurianCoding/ko
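The answers above filter which tests run; another common workaround filters the generated files out of the coverage profile itself before reporting. A sketch of that idea (the module path example.com/executor and the profile contents are made up so the filtering step can be shown on its own; normally the profile comes from go test -coverprofile=coverage.out ./...):

```shell
# A tiny fake coverage profile standing in for real `go test -coverprofile` output.
cat > coverage.out <<'EOF'
mode: set
example.com/executor/executor.go:10.2,12.3 1 1
example.com/executor/excutor_mock.go:5.2,7.3 1 0
example.com/executor/errors.go:3.2,4.3 1 0
example.com/executor/app.go:8.2,9.3 1 1
EOF

# Drop the files that should not count towards coverage...
grep -v -e 'excutor_mock.go' -e 'errors.go' -e 'app.go' coverage.out > coverage.filtered.out

# ...then report from the filtered profile (with a real profile you would run:
# go tool cover -func=coverage.filtered.out)
cat coverage.filtered.out
```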
I am trying to get the proper coverage amount of my code in the processor package using my test files that are in the test directory. I've tried numerous combinations of -cover and -coverpkg and cannot get it to work correctly. Can this be done while having the test files in separate package/folder?
.
├── internal
│ └── processor
└── test
I tracked down an answer and this is how I was able to accomplish my goal:
go test -cover -coverpkg "./internal/processor" "./test"
Based on cmd/go, Testing Flags, the flag usage is:
-coverpkg pattern1,pattern2,pattern3
Apply coverage analysis in each test to packages matching the patterns.
The default is for each test to analyze only the package being tested.
See 'go help packages' for a description of package patterns. Sets -cover.
I am using GoCD to build a project with a large number of modules and I have modeled it as a pipeline with two stages (stage 1 is to build the code, stage 2 is to run the tests). The directory structure after a successful build stage looks about like this:
myproject/
|-- myproject-module1
| |-- build <-- created by stage 1, required by stage 2
| `-- src
|-- myproject-module2
| |-- build <-- created by stage 1, required by stage 2
| `-- src
|-- myproject-module3
| |-- build <-- created by stage 1, required by stage 2
| `-- src
`-- ... many more modules ...
In stage 1 I have configured a Build Artifact with source */build, and in stage 2 I'm trying to fetch all the build folders again with source *, with the intention that they end up in the correct location next to the src folder inside each of the project modules.
Unfortunately, I have found no way to achieve this yet. GoCD seems to create a separate ZIP file of all the */build folders, and during the fetch the file *.zip cannot be found (I assume that it really looks for a file with that exact name instead of expanding the wildcard). Of course I could hard-code all the module names and individually fetch myproject-module[1:n], but that's exactly what I want to avoid.
Does anyone have some advice on how this could be achieved?
In this discussion from 2014, it is claimed that wildcards cannot be used to fetch artifacts. Is that really still the case?!
I don't know if it's possible with the built-in features of GoCD, but it should definitely be possible using the REST API.
Knowing the current pipeline name, you can get all of its stages and work out the previous one. Then, using the ability to download an artifact directory as a zip archive, you can get what you want.
So you can add this as a script in the second stage: it fetches the zipped artifact, and after that you can continue with testing.
To do that, I can recommend my implementation of the GoCD API, yagocd. This Python library lets you program the aforementioned logic in a natural way.
Some tips:
you can get the current pipeline name from GO_PIPELINE_NAME (there are a lot of environment variables at your service)
to find a pipeline by name you can use the PipelineManager: go.pipelines.get($GO_PIPELINE_NAME, $GO_PIPELINE_COUNTER)
having a pipeline instance, you can iterate over its stages through the pipeline_instance.stages object
having a stage, you can get its jobs and download a directory at a given path using the go.artifacts.directory_wait method
If you have questions about the implementation, I can try to help you.
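Putting the tips together, an untested, pseudocode-style sketch (the server address, credentials, stage name, and artifact path are placeholders; only the method names come from the hints above):

```python
import os
from yagocd import Yagocd

go = Yagocd(server='https://gocd.example.com', auth=('login', 'password'))

# Find the current pipeline instance via GoCD's standard environment variables.
pipeline = go.pipelines.get(os.environ['GO_PIPELINE_NAME'],
                            int(os.environ['GO_PIPELINE_COUNTER']))

# Iterate the stages, pick the one that produced the artifacts,
# and pull its artifact directory down as a zip archive.
for stage in pipeline.stages:
    # 'build' and the artifact path below are placeholders
    go.artifacts.directory_wait('...')
```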
You can select artifacts by wildcard, you just have to give the artifact a destination (attribute dest in the XML config). You can do that multiple times inside the same job, just use a different dest each time:
<artifact src="myproject/myproject-module1/build/*" dest="module1/" />
<artifact src="myproject/myproject-module2/build/*" dest="module2/" />
<artifact src="myproject/myproject-module3/build/*" dest="module3/" />
The corresponding <fetchartifact ...> tags then need to use srcdir="module1" etc.
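For completeness, one fetch per module might then look something like this (the pipeline, stage, and job names here are placeholders, not taken from the question):

```xml
<fetchartifact pipeline="myproject" stage="build-stage" job="build-job"
               srcdir="module1" dest="myproject/myproject-module1" />
```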
py.test gives me an "import file mismatch" error. I believe it's because I have test modules with the same name under different packages (see below).
Someone posted an identical question here: py.test - test discovery failure when tests in different directories are called the same. Unfortunately, none of the responses work for me.
I've also read the manual. I've tried not including the __init__.py files in the test directories, as the manual suggests, but that doesn't seem to make a difference.
The "app_class1" and "app_class2" directories each contain a test module named test_create.py.
Here is how my code is organized
/app
-setup.py
/app
-app.py
-__init__.py
/test
- __init__.py
/app_class1
- __init__.py
- test_create.py
/app_class2
- __init__.py
- test_create.py
Any help is greatly appreciated. Thank you.
P.S. I would have commented on the referenced conversation but my reputation isn't high enough to do so.
I've been using PyCharm to develop a submodule to drop into several other projects. I have a Tests directory containing my unit tests and I'd like to run them from PyCharm, but when I test any of my code that contains relative imports, I get:
"ValueError: attempted relative import beyond top-level package"
My structure is roughly:
A
    __init__.py
    ...
B
    __init__.py
    ...
Tests
    __init__.py
    ...
Where I am testing a function in the B module that uses relative imports to import A:
from ..A import some_fn
This thread (pycharm and unittesting - structuring project) suggests marking the test directory as a test source, but when I right-click it, I only have the option to mark it as a source root, which has no effect.
I also can't really change from relative to absolute imports because it will break my ability to use it as a submodule in other projects. Any advice on how to fix this would be much appreciated.
Update: I also came across this thread How to properly use relative or absolute imports in Python modules? and I'm not a huge fan of the solution (I'd prefer not to have mirror imports in a try/except block), but it does somewhat solve the problem. I would still appreciate a more elegant solution, but if not, that does actually fix the error.
The problem here is that A and B are different packages. You want them both to be subpackages of the myproj package.
I think all you are missing is an __init__.py file in the parent directory, allowing something in B to relatively import something in A:
myproj/
├── A
│ └── __init__.py
├── B
│ └── __init__.py
└── __init__.py
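The effect of that parent __init__.py can be demonstrated with a small self-contained script (the file contents and the function name some_fn are illustrative, matching the question):

```python
import os
import sys
import tempfile

# Recreate the suggested layout in a temporary directory:
# myproj/__init__.py, myproj/A/__init__.py, myproj/B/__init__.py
root = tempfile.mkdtemp()
pkg = os.path.join(root, "myproj")
os.makedirs(os.path.join(pkg, "A"))
os.makedirs(os.path.join(pkg, "B"))

# The parent __init__.py makes A and B siblings inside one package.
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "A", "__init__.py"), "w") as f:
    f.write("def some_fn():\n    return 'hello from A'\n")
with open(os.path.join(pkg, "B", "__init__.py"), "w") as f:
    f.write("from ..A import some_fn\n")  # the relative import from the question

sys.path.insert(0, root)
# Resolves because myproj is now the shared top-level package.
from myproj.B import some_fn
print(some_fn())
```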
I'm using webassets in my Flask application using Flask-Assets and I'm having trouble with the depends option when creating bundles.
In my case I'm bundling LESS files from the following directory structure:
/static
└── /css
    ├── /bootstrap
    │   ├── bootstrap.less  // This file @imports variables.less and custom.less
    │   └── variables.less
    └── custom.less
My bundle looks like this:
css = Bundle(
"css/bootstrap/bootstrap.less",
filters="less, cssmin",
output="dist/base.css",
depends="**/*.less"
)
With these settings, the LESS files are rebuilt whenever a change is made to either bootstrap.less or custom.less but NOT variables.less.
From what I understand, the expression used for the depends option is a glob instruction and using the one above should simply go through all directories recursively and pick up any LESS files. However, it never seems to pick up on any changes made to variables.less.
In my attempts to fix this, I've tried the following options for depends:
"*.less" - Doesn't pick up anything (as it's searching in the root of the project directory, I believe, where there are no LESS files anyway)
"**/*.less, myproject/static/css/bootstrap/variables.less" - Doesn't pick up on any changes in any file at all.
"**/*.less, myproject/static/css/bootstrap/variables.less" - Same as the one above.
"myproject/static/css/bootstrap/variables.less" - Strangely enough, this picks up on changes made to both variables.less AND any other LESS files (such as custom.less).
In essence, the last item is the "solution" for my problem but I have no idea why it works the way it does, so it doesn't sit well with me. Can anyone provide an explanation or a nudge in the right direction here?
Thanks!
The problem here is that the recursive glob ** is not supported by Python's glob module (see issue 13968; recursive matching via glob(pattern, recursive=True) was only added in Python 3.5).
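The behavior is easy to reproduce with the standard glob module, using a throwaway directory tree mirroring the question's layout:

```python
import glob
import os
import tempfile

# Build css/bootstrap/variables.less in a temporary directory.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "css", "bootstrap"))
open(os.path.join(root, "css", "bootstrap", "variables.less"), "w").close()
os.chdir(root)

# Without recursive=True, "**" behaves like a plain "*": it matches exactly
# one directory level, so a file two levels deep is missed.
print(glob.glob("**/*.less"))                  # []
print(glob.glob("**/*.less", recursive=True))  # ['css/bootstrap/variables.less']
```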
Here's how I have set this up in one of my projects (not sure if that would suit your needs):
less/
├── bootstrap/
│ ├── bootstrap.less
│ ├── variables.less
│ └── ...
└── style.less  # @import "bootstrap/bootstrap.less";
Bundle configuration:
css = Bundle(
"less/style.less",
filters="less, cssmin",
output="css/all.css",
depends="less/bootstrap/*.less"
)
I fixed this by installing the glob2 module. My depends='**/*.scss' pattern then started working as I expected, watching for changes in nested directories as well as at the top level.