Do ES5 modules accelerate webpack build time compared with ES6 modules?

Conventionally, when we write an ES6 module, we put the source code in a src folder, compile it to ES5 with babel-loader and webpack into a lib or dist folder, point the package's main entry at that folder, and then publish to npm.
On the one hand, a user can consume the module without webpack and the code still runs. On the other hand, when webpack is used, the precompiled ES5 code may reduce babel-loader time, because it is already ES5.
What confuses me is the second point: when building with webpack, does ES5 code in node_modules reduce babel-loader time, so that we can speed up the webpack build?
The question is really about ES5 npm modules and webpack build performance. Although this is a convention we already follow, I just want to understand the build performance implications. Thanks!
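For reference, a minimal sketch of the publishing convention described above (the entry, output, and loader setup are illustrative, not taken from my actual project):

// webpack.config.js of the library being published
const path = require('path');

module.exports = {
  entry: './src/index.js',                 // ES6 source lives in src/
  output: {
    path: path.resolve(__dirname, 'dist'), // compiled ES5 output goes to dist/
    filename: 'index.js',
    libraryTarget: 'commonjs2',            // so consumers can require() the bundle
  },
  module: {
    rules: [
      // Transpile the library's own sources down to ES5 before publishing.
      { test: /\.js$/, exclude: /node_modules/, use: 'babel-loader' },
    ],
  },
};

// package.json then points "main" at dist/index.js before publishing to npm.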

Yes, generally public packages are distributed with sources that have already been transformed. The performance benefit, with regard to webpack and babel-loader, is that you can consume these sources as-is without having to process them with babel-loader, so you'll commonly see:
{
  test: /\.js$/,
  loader: 'babel-loader',
  exclude: /node_modules/
}
That said, I'm not sure why one would want to run ES5 code through Babel in the first place, since no transformation would take place.
Either way, the sources are always parsed by webpack itself, and not having to parse and transform them beforehand with babel-loader should improve performance.
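In current webpack versions the same idea is often expressed by whitelisting only the app's own sources with include rather than excluding node_modules; a minimal sketch, assuming the app code lives in src/ and babel-loader plus @babel/preset-env are installed:

// webpack.config.js of the consuming app: only src/ goes through Babel;
// already-ES5 packages in node_modules are bundled as-is.
const path = require('path');

module.exports = {
  module: {
    rules: [
      {
        test: /\.js$/,
        include: path.resolve(__dirname, 'src'),
        use: {
          loader: 'babel-loader',
          options: { presets: ['@babel/preset-env'] },
        },
      },
    ],
  },
};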

Related

"history" Module is not listed in package.json dependencies

In the Jest tests for my React app in WebStorm, the following line
const { createMemoryHistory } = require("history");
has the following warning:
Module is not listed in package.json dependencies
The tests run as expected, they pass and fail as expected. createMemoryHistory works. And when I hover on history WebStorm actually shows me the documentation for the library.
But history is a native JS library, like fs, is it not? How do I fix this pesky warning?
UPDATE: Okay, I understand that fs is not a native JS library, it's a core node.js module. I was wrong and thanks for setting me straight on that.
I see that my package-lock.json does include an entry for "node_modules/history". It looks like it's two indents deep, but the lockfile is too complex for me to really tell, or get breadcrumbs, or fold the branch to see where this line falls in the tree.
So I guess the real question is: WebStorm is saying that I don't have the dependency, but the lockfile implies that I do. Unless I'm misunderstanding further.
Again, how do I fix this pesky warning? (or what other fact am I missing? Remember, everything does actually work).
fs is a core Node.js module, i.e. its code is compiled into the Node.js binary and doesn't have to be installed. The history library is a regular npm package that is not part of Node.js core and has to be added with npm i history (see https://github.com/remix-run/history/blob/main/docs/installation.md). The IDE is just telling you that you are importing a module that is not listed among the dependencies in your package.json.
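Once installed, package.json should contain an entry along these lines (the version shown is only an example), which is exactly what the IDE checks for:

{
  "dependencies": {
    "history": "^5.3.0"
  }
}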

How to determine the tree of files which are imported during a test case?

When I run a test in Go, is there any way for me to get the list of files that the code imports, directly or indirectly? For example, this could help me rule out changes from certain parts of the codebase when debugging a failing test.
Alternatively, with Git, can we find out what the lowest common ancestor git tree node is for the files exercised in a given test?
Context: I'm looking into automated flakiness detection for my test suite, and I want to be able to know the dependency tree for every test so that I can detect flaky tests better.
For example, if TestX fails for version x of the code, and later on some files in the same codebase which are not used at all by TestX are changed, and then TestX passes, I want to be able to detect that this is a flaky test, even though the overall codebase that the test suite ran on has changed.
You are probably looking for go list -test -deps [packages].
For an explanation of what the flags do, you can check the go command documentation under "List packages or modules":
-deps:
The -deps flag causes list to iterate over not just the named packages but also all their dependencies. It visits them in a depth-first post-order traversal, so that a package is listed only after all its dependencies. [...]
-test:
The -test flag causes list to report not only the named packages but also their test binaries (for packages with tests), to convey to source code analysis tools exactly how test binaries are constructed. The reported import path for a test binary is the import path of the package followed by a ".test" suffix, as in "math/rand.test". [...]
Maybe I'll state the obvious, but remember that list works on packages, not single files, so the command above will include dependencies of the non-test sources (which should be what you want anyway).

julia: rerun unit tests upon changes to files

Are there Julia libraries that can run unit tests automatically when I make changes to the code?
In Python there is the pytest-xdist plugin, which can rerun unit tests when you make changes to the code. Does Julia have a similar library?
A simple solution could be made using the standard library module FileWatching; specifically FileWatching.watch_file. Despite the name, it can be used with directories as well. When something happens to the directory (e.g., you save a new version of a file in it), it returns an object with a field, changed, which is true if the directory has changed. You could of course combine this with Glob to instead watch a set of source files.
You could have a separate Julia process running, with the project's environment active, and use something like:
julia> import Pkg; import FileWatching: watch_file
julia> while true
           event = watch_file("src")
           if event.changed
               try
                   Pkg.pkg"test"
               catch err
                   @warn("Error during testing:\n$err")
               end
           end
       end
More sophisticated implementations are possible; with the above you would need to interrupt the loop with Ctrl-C to break out. But this does work for me and happily reruns tests whenever I save a file.
If you use a GitHub repository, there are ways to set up Travis CI or AppVeyor to do this. This is the testing method used by many of the registered Julia packages. You will need to write the unit test suite (with using Test) and place it in a test/ subdirectory of the GitHub repository. You can search for Julia and those web services for details.
Use a standard GNU Makefile and call it from various places depending on your use case:
Your .juliarc if you want to check for tests on startup.
Cron if you want them checked regularly
Inside your module's init function to check every time a module is loaded.
Since GNU makefiles detect changes automatically, calls to make will be silently ignored in the absence of changes.

Unit testing with Webpack, Jasmine (-core), typescript

I have a project that is using webpack to bundle all code into a single file. The project is using Typescript and it is working fine at the moment.
I've gone to add unit testing and Jasmine seems to be the way (one of the many ways) forward. It's actually jasmine-core that is included in the package.json - not sure how much of a difference that makes.
So running a very simple test such as
it('true is true', function(){ expect(true).toEqual(true); });
works fine.
But when I add tests that require the use of an import - eg
import MyService = require('./MyServices');
then when I run the tests it complains as it doesn't know what 'require' is.
Uncaught ReferenceError: require is not defined
Now I'm guessing this is because I need to package up the test module in a similar way that I package up the main project.
So what is the best way to do this?
Should I have multiple entry points in the webpack.config.js file - one for each *.spec.ts file?
Or is there a way to accept an unknown number of spec files, say entry: ['*.spec.ts'], and have it output a JS file for each one (*.spec.js)?
You can use karma/karma-webpack to run all the tests using webpack for resolving the imports. You can take a look at this repository for a simple configuration.
You can also specify an index.spec.ts as an entry point and make this file require all the spec files, if you don't want one entry point for each spec.ts file in your webpack configuration.
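A common way to build that single entry point is webpack's require.context; a minimal sketch, assuming the spec files live under ./src and the @types/webpack-env typings are installed so require.context is visible to TypeScript:

// index.spec.ts: one webpack entry that pulls in every *.spec.ts file
// (adjust the directory and pattern to match your project layout)
const testsContext = require.context('./src', true, /\.spec\.ts$/);
testsContext.keys().forEach(testsContext);

karma-webpack then only needs this one file in the Karma configuration, and webpack resolves the imports inside each spec.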

Ember Cli - Transpiling vendor ES6 dependency in ember-cli-build?

I'm writing an Ember.js application using Ember CLI, and I want to include a non-Bower dependency - basically a dependency from my vendor folder.
The instructions for doing so tell me to add the following line to my ember-cli-build.js file:
app.import('vendor/dependency-to-include.js');
That would work fine with a normal ES5 flavored dependency, but what if I want to add a dependency written in ES6?
Right now it just delivers it to the browser untouched, which produces an error like:
Uncaught SyntaxError: Unexpected reserved word
because my ES6 flavored dependency uses the following syntax:
import Util from './util';
I'm guessing that I need to tell ember-cli-build to transpile this particular dependency before passing it on to the browser, but how do I go about doing that?
Thanks
For transpiling imported dependencies you need to run the imported file(s) through the Broccoli addon broccoli-babel-transpiler. For a basic example, check out this file: https://github.com/thefrontside/ember-impagination/blob/2fa38d26ef1b27a3db7df109faa872db243e5e4c/index.js. You can adapt this approach to an in-repo addon for your project.
See this link for the background discussion, and @rwjblue and @cowboyd for the actual fix: https://github.com/ember-cli/ember-cli/issues/2949
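A rough sketch of that in-repo addon approach is below; the addon name and the vendor-es6/ source folder are illustrative (see the linked ember-impagination index.js for a real-world version), and the untranspiled file is kept outside vendor/ here so it doesn't collide with the app's own vendor tree:

// lib/transpile-vendor/index.js
'use strict';

const path = require('path');
const Funnel = require('broccoli-funnel');
const BabelTranspiler = require('broccoli-babel-transpiler');
const mergeTrees = require('broccoli-merge-trees');

module.exports = {
  name: 'transpile-vendor',

  // Transpile the ES6 vendor file to ES5 and add it to the vendor tree.
  treeForVendor(vendorTree) {
    const es6 = new Funnel(path.join(this.project.root, 'vendor-es6'), {
      include: ['dependency-to-include.js'],
    });
    // Older broccoli-babel-transpiler versions are called without `new`.
    const es5 = new BabelTranspiler(es6, { /* Babel options */ });
    return vendorTree ? mergeTrees([vendorTree, es5]) : es5;
  },

  // By the time the file is imported, it is plain ES5.
  included(app) {
    this._super.included.apply(this, arguments);
    app.import('vendor/dependency-to-include.js');
  },
};

For Ember CLI to pick the addon up, give it a minimal package.json containing the "ember-addon" keyword and list its path under "ember-addon": { "paths": [...] } in the app's package.json.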
Are you currently including Babel within your project? I would have thought that it checks your vendor directory the same as it does everything else and converts the ES6 code to ES5.
The other option would be to just convert the file to ES5 manually whenever you need to include a vendor file with ES6 syntax. Not necessarily ideal, but if it's a static file then it's something you'll need to do once and then forget about.