https://www.w3schools.com/nodejs/default.asp

Test connectivity to the configured repository: npm ping [--registry http...otherreg]

"npm help x" doesn't work on Arch Linux. Workarounds: man npm-x
Or https://man.archlinux.org/listing/?lang=en&sorting=alphabetical and in the top entry field enter: npm-x

Regularly update npm itself: npm i -g npm@latest
N.b. you get the warning about an available npm update only once, so check explicitly.

GOTCHA: Many npm commands that connect to the npm repository silently fail with exit value 0 when there are connectivity/proxy/authentication/authorization failures. For this, use the (mostly hidden from docs) global-level switch '--verbose' for troubleshooting.

Browser-less JS. It bundles V8.
-c switch to "node" sometimes does nothing at all. Syntax check.
-e switch is great and works precisely like Perl's -e switch.
-p is a great addition -- it is "-e" plus displays the final value.

Should set PATH=%PATH%;%APPDATA%\npm on Windows, to get installed bin scripts into the search path. Be aware that the APPDATA variable is not available in sysdm.cpl "System variables" but is available under "User variables for x".

TEMPLATES in ~/code/templates/nodejs/:
    nodejs.js   CORE SCRIPT TEMPLATE!! Arg handling, quiet exceptions, strict OO classes and objects
    *Class*.js, userOfNodeClass.js   Class defined in an external file.
I think that my OO examples and instructions are for modern ES versions, not just Node.js.
Subclass constructors must call super() before they can use 'this'.

Has great Array.shift/unshift/push/pop().

Destructuring variable initialization. To initialize new var(s) of name 'vName1', 'vName2':
    const|let|var { vName1, vName2 } = { vName2: val2, unus: vs, vName1: val1 };
Vals get undefined if the corresponding key is not in the dictionary.

Simple js-file inclusion/import:
    require("vm").runInThisContext(require("fs").readFileSync("include.js", "utf8"))
This loads directly into the context of the calling file, without any scope limiting as would be done by import or require statements. BUT the included files don't have access to the calling context. Somehow this also works when executed inside of anonymous functions.

Package search: https://www.npmjs.com/package/package

Package vs. module. Modules are things that can be loaded via require(x). A package is a file or directory described by a "package.json" (utf8-encoded).

Text file read. N.b. the encoding is specified as "utf8" -- no hyphen, space, or capitalization.
    console.log(require("fs").readFileSync("date.txt", "utf8"));
Without the encoding specification this would return a Buffer instead of a string.
Text file write:
    require("fs").writeFileSync("data.txt", "content");
    require("fs").appendFileSync("data.txt", "content");

Reading stdin and writing stdout. Simple read-everything is:
    require("fs").readFileSync(0, "utf8")
(Due to buffering concerns, this is MUCH easier than trying to use process.stdin.)
Simple write-everything is:
    process.stdout.write(string);   // Uses default encoding??

Simple LINE reading: Problem with the OOTB readline module is, it is asynchronous. Lots of readline-sync modules don't work with stdin piping and don't distinguish between "" and EOD. Only module 'syncprompt' works just like traditional "read", where prompt("") returns null upon EOD.

File checks/tests
EXISTENCE: fs.existsSync(path)
GOTCHA: Can't use fs.accessSync, e.g. fs.accessSync(path, fs.constants.F_OK), before using fs.existsSync, because accessSync will throw if the file doesn't exist. [WHERE??? accessSync/canRead/isReadable/canWrite/isWritable clumsy]
fs.accessSync(path, fs.constants.W_OK) + R_OK, F_OK (exists), X_OK (unix-only)
GOTCHA: These do not return a boolean!!! They throw on failure, so must wrap in a try block.
Other: fs.[l]statSync(filePath).isDirectory|File()
GOTCHA: Must do an existence test before.   // https://nodejs.org/api/fs.html#fs_class_fs_dirent (?? fs.Stats class)
Use the l variant 'lstatSync' to check on sym links themselves rather than on the target node. Can also get the owner/times/size attributes from this fs.Stats.
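A minimal sketch tying the file I/O and file-check notes above together (file names are hypothetical; the try block is the only way to use accessSync as a test):

    const fs = require("fs");

    const text = fs.readFileSync("data.txt", "utf8");   // string, not Buffer, because of "utf8"
    fs.writeFileSync("out.txt", text);
    fs.appendFileSync("out.txt", "\nmore");

    const wholeStdin = fs.readFileSync(0, "utf8");      // read all of stdin at once
    process.stdout.write(wholeStdin);

    if (fs.existsSync("out.txt") && fs.statSync("out.txt").isFile()) {
        // accessSync throws rather than returning false, hence the try wrapper:
        let writable = true;
        try { fs.accessSync("out.txt", fs.constants.W_OK); } catch (e) { writable = false; }
        console.log("out.txt writable:", writable);
    }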
toString implementation gotcha. Since console.log(obj) does object pretty-printing, it does not honor or use your toString(). To print your toString() value, force execution of it in some way, most directly like: console.log(String(obj));

PRETTY-PRINTING collections (incl. nesting)
'prettyprint' package has recursion bugs.
Printing single-level arrays just works; for any other structures, toString (incl. implicit toString and from `backticks`) just outputs "[object Object]" for objects and flattens array members. Therefore you need to log with any of the Dump methods below. In most cases display of a collection will output just "[object Object]", incl. String(col) and col.toString().
The console varargs substitution/printf functionality is actually from util.format, which can be used as a limited sprintf, with %x placeholders.
GOTCHA: console.* colorizes by default, but util.format does not. To get that, just use this instead: util.formatWithOptions({colors:true}, regular util params...);
Undefined vars throw. If displaying with %s, things are toString'ed.
GOTCHA: Unlike printf, %i is for integers. %d is for either floats or ints. Like printf, there is no boolean format.
GOTCHA: %c is not for Character. It writes nothing!
Only in the console.log* cases do you get simple pretty-printing [with color if output goes directly to the console]. (*actually log/debug/info/warn/error)
    console.log(col);
GOTCHA for the following: multiple params are separated by a space, so if you want a title on its own line via "param1\n", then line #2 will have a friggin space before it. So, for multiple-line output, better to pass a single string with \n's, for example: console.log([line1, line2, line3].join("\n"));
    console.log(other1, col, other2)
https://www.npmjs.com/package/prettyprint   This nests lodash (see "js.txt" about lodash).
Until you can use: import prettyprint from 'prettyprint'; do use: const prettyprint = require("prettyprint").default;
BUT BEWARE! Due to prettyprint recursion, sometimes it will throw because of infinite recursion, and worse it will sometimes just hang. In those cases, need to run a custom non-recursive dumper like AdmcUtil.objDump.
If you console.log(obj), it will pretty-print. Non-% varargs pretty-prints objects just like %O. So if you don't need other strings to frame the data being output, use the shortcut: console.log(obj1, obj2) === console.log("%O %O", obj1, obj2);
N.b. GOTCHA: %s resolves much closer to a pretty-print than to toString!
General problem with Node.js .toString is, it displays just "[object Object]".
If the 1st param has %'s then it's printf-like, without length specifiers. Use %O to pretty-print objects just like console.log(obj) but embedded with other text; or %o for a more verbose rendering. (%O and %o work fine with strings and scalars.)
GOTCHA: Generally do use %O because %o is usually too verbose; but if you are missing representation of some nested structures in there, %o can fix that.
util.formatWithOptions({opts}, regular format params...); gives EXCELLENT control, with: depth, colors, *Lengths, compact, sorted.
It looks like either console.x or util.format always truncates big items. Need to manually output big stuff (with process.stdout).
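A small sketch of the differences described above (the object shape is arbitrary):

    const util = require("util");

    const obj = { a: 1, nested: { b: [2, 3] } };
    console.log(String(obj));           // "[object Object]" -- plain toString
    console.log(obj);                   // pretty-printed, colorized on a TTY
    console.log("result: %O", obj);     // %O embeds the pretty-print inside other text

    // util.format does the same substitution but without color; formatWithOptions adds it back:
    const s = util.formatWithOptions({ colors: true, depth: 4 }, "result: %O", obj);
    process.stdout.write(s + "\n");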
'string.ify'  Ultra-succinct, but seems to drop the closing ] for arrays. https://github.com/xpl/string.ify
Can use Symbols to customize rendering. Can configure depth, max array length, and floating-point precision, both as defaults and ad-hoc.
print-object. Poor type indicators and recursion, but can write HTML.
ypp. Very simple and good, but open/close delims too verbose. Quotes keys :(.

Logging
.debug, .info, .log go to stdout; .warn, .error go to stderr.
To alter where things go, just override the console.x implementation. E.g.
    QUIET DEBUG: console.debug = ()=>{};
    QUIET (all but warn+err): console.debug = console.log = console.info = ()=>{};
    DEBUG to stderr: console.debug = console.warn;
console is implemented just like the browser 'console' per https://www.w3schools.com/jsref/api_console.asp
    console.log/debug/info/warn/error(params)
        // params may be a LIMITED printf-like list [no %20s],
        // or a collection that will be pretty-printed,
        // or just params, each displayed delimited by a space.
    console.dir(dictOrArray[, opts])   // Has depth option.
    console.table(arr[, props])        // Displays an Array of dicts as an ascii table.
Consider ololog, which uses string.ify.
Stack traces: console.trace([msg])     // msg may be a LIMITED printf-like list.
string.padEnd(n) and string.padStart(n) can be used to work around the lack of printf %20s and similar. String padding works great: string.padEnd(n[, "padder"]); string.padStart(n[, "padder"]);
Timer:
    console.time([label]);  and then:
    console.timeLog([label][, ...data]);   // also outputs to stdout
    console.timeEnd([label]);              // also outputs to stdout

Sleeping/polling/etc. https://nodejs.org/en/docs/guides/timers-in-node/
setTimeout and setInterval work just like JavaScript window.X. They take optional params after the ms param that are PASSED THRU to your fn.
This is very effective:
    const Sleep = secs => new Promise(res => setTimeout(res, secs*1000));
    (async()=>{ ... await Sleep(3); ... })();

"npm shrinkwrap" updates file "npm-shrinkwrap.json", specifying versions of all package dependencies.

For some reason if you display dates directly, like with 'console.log(aDate);', it displays with the UTC time zone. V8 in a browser displays these with the local zone. But toString() and implicit conversions to string make a string with the local zone.

File-inclusion/nesting. CommonJS (ES5) module require is the simple file-sharing mechanism. For non-path requires, you will have to manually "npm i x" or "npm link x" for each of them, unless you make a custom script to either automatically run the npm command when you get 'missing module' errors for your script x.js, or scan x.js for require statements.
The minimal requirement for a module is just a JS script that assigns something to the (provided or re-created) module.exports. Once you have that, any other script can load those module.exports with a require statement giving the filepath of the exporting module file or directory. The path must start with / or ./ or ../ (relative paths like "fileHere" or "sub/file" do not work). May exclude the assumed filename extension ".js". If a directory path (not a file path), the script file must be named "index.js". For the module to be resolvable by name, you need a package.json file in the directory with "name" of the module name, and the JSON associates this name with the script ("main"). The parent of that module dir needs to be in the Node lookup path (or the requiring script just needs to be alongside it).
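A minimal sketch of that layout (the 'mymod' and greet names are hypothetical):

    // mymod/package.json:
    //     { "name": "mymod", "main": "index.js" }
    // mymod/index.js:
    module.exports = { greet: (name) => `hi ${name}` };

    // caller.js, sitting next to mymod/:
    const { greet } = require("./mymod");   // directory require resolves via package.json "main" (or index.js)
    console.log(greet("world"));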
require("x") does 2 things:
    1. Executes the script/module that 'x' resolves to (see below).
    2. Returns a reference to the script/module's 'module.exports' member.
Therefore:
    1. To simply execute the script, just ignore require's output: require("x");
    2. Where you need to use the module's module.exports just once, do it inline: ... require("x").member ...
Caller (calling module) and required module share global vars, so they can share by using undeclared variables. Variables declared in any module are local to the module (you can export them with "module.exports" member assignments).

A require path that doesn't start with . or / looks in:
    1. module.paths (computed as described below)
    2. $HOME/.node_modules   According to the docs. Does not work for me!
    3. $HOME/.node_libraries (untested)
    4. <prefix>/lib/node_modules (incorrectly documented as <prefix>/lib/node); <prefix>/node_modules on Windows.
    5. NODE_PATH elements
For installing, same except in this case it is $PWD; it doesn't honor NODE_PATH and prefers either an existing directory or that the parent directory contains the package*.json file(s) (utf8-encoded) and/or a node_modules subdir. If none found it will create $PWD/node_modules (and package*.json files).
    1. require("./...") is relative to the requiring script, not $PWD.
    2. $PWD from execution is NEVER checked (for runs, not installs).
    3. Relative NODE_PATH elements are relative to the execution $PWD.
    4. module.paths is always ./node_modules/ + every x/node_modules/ for every ancestor directory up to the root ("npm prefix"). Beware of crossing sym-links!
    5. prefix is for -g installs and for #4 in the resolution precedence list above: npm prefix -g
QUIRK! On Unix, global installs add to <prefix>/bin/ and <prefix>/lib/; but on Windows global installs (1) are not global but user-specific, (2) the prefix is %APPDATA%\npm, (3) bin scripts + node_modules go directly into it instead of into .../bin and .../lib.
Either way, the filename suffix is optional, and there can be directories relative to the search start location, i.e. "../subdir/modX" or "subdir/modX". (The basename can be either for "basename.js" or "basename/index.js".)
N.b. on Windows, the prefix is personal under AppData, not shared; whereas on ArchLinux it is /usr. Therefore on Linux you need root and it is really global!

Module: Simply set module.exports to precisely what you want to share (anything).
To return functionS, add the shared items to the provided empty module.exports. (It makes no difference if you replace the provided one with: module.exports = {};)
To return a single function or class or anything else, just replace the provided module.exports with your thing: module.exports = (x) => ...   OR   module.exports = class YourClass {...
Caller usage:
    const utils = require("./AdmcUtil"); utils.aMemberFn(...);
    OR const localName = require("./AdmcUtil").member; localName(...);
Load shared things from the export map with: myExportMap = require("./fileBaseName");   // ".js" suffix optional
and const { createTrader } = require("devalpha"); is a shortcut for: const createTrader = require("devalpha").createTrader;

Published CommonJS node modules can automatically be ES6-imported, but local CommonJS module files can't be loaded; you must convert them to ES6. CommonJS require and ES6 imports are mutually exclusive -- can't use both in one script. For both, you can't directly load all exported members into the current namespace; you must explicitly specify which members to load or what new single variable or const to add.
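A sketch of the two export styles and the matching caller lines (file and member names are hypothetical):

    // utilMap.js -- add members to the provided module.exports
    module.exports.upper = (s) => s.toUpperCase();
    module.exports.lower = (s) => s.toLowerCase();

    // Greeter.js -- replace module.exports wholesale with a single thing
    module.exports = class Greeter {
        constructor(name) { this.name = name; }
        hi() { return `hi ${this.name}`; }
    };

    // caller.js
    const utils = require("./utilMap");        // the whole export map
    const { upper } = require("./utilMap");    // destructuring shortcut for one member
    const Greeter = require("./Greeter");      // the single exported class
    console.log(utils.lower("ABC"), upper("abc"), new Greeter("x").hi());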
ES6/ES2015 import from Node.js
BIG DIFFS: With CommonJS you can export just a single item by simply: module.exports = x;
To export just a single item with ES6 you must export it as default: export default x; (can then load it to an arbitrary const name like: import locConst from "...js")
Other than this you must always specify the imported item keys (can rename them with "alias"). Must end the path with the full filename suffix.
N.b.!! you can only use "export" before a declaration, like: export [let|const|class|function] nm...
https://www.geeksforgeeks.org/how-to-use-an-es6-import-in-node-js/
This system uses export (not exports!) and import, and even the import can only be done from within an ES6 module.
ES6 modules always run strict, so don't add "use strict".
The module file must be named *.mjs, at least if it lives in a non-module-type package directory. Must either set 'type: module' in package.json OR node --input-type=module -e 'your scriptlet'   # does not work with -p
Includes, at least for local files outside your dir, require the full filename with suffix. At least for local files outside your dir, you must use suffix .mjs unless that dir has a package.json with type=module.
This also allows you to call 'await'!!! Therefore, if you want to use await you should strongly consider using module type!
Working example: https://masteringjs.io/tutorials/node/import
Many ways to run an ES6 module from node. All listed in the g4g article above. The difficulty is that the node ES6 support module "esm" must load BEFORE your own script. The tactics detailed in the article are:
    1. EITHER (a) add to package.json: "type": "module" OR name the script *.mjs (*.mjs can't be a sym link; the real file node sees must have this name) + execute with: node --experimental-modules yourScript.js [The switch actually works though it is not documented in --help nor the man page!] BUT see above re. -e and -p (eval execution modes).
    2. Install the 'esm' module [it is tiny!], then pre-load it by invoking your script with: node -r esm yours.js
    3. Install the 'esm' module [it is tiny!], then require a wrapper script containing this magic: module.exports = require("esm")(module)("./yourScript.js")   [regular require resolution rules]
import statements: 'import...from' or 'import *'. See https://www.w3schools.com/js/js_modules.asp
ALL VARIANTS: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/import
import * as x from "mSpec". NEW. Quirks with non-default members, so avoid. "as x" is required, so like CommonJS it's impossible to load all members into the current namespace directly.
Default exports have no name (other than default), so you specify a local var name with no "as".
To import from a specific file, as you will need to do when aggregating in a module file, you need to import a relative path to the JS file -- no filename completion (like always with deno).
    require("mSpec");       ⇒  import "mSpec"          // exec-only, no loading
    x = require("mSpec")    ⇒  import x from "mSpec";
For native export modules, the import requires precisely what the module explicitly exports, which is either (a) a single "default" item; or (b) exported mappings. For (b) you can either have multiple 'export var's or 'export {var1, var2,...}'.
Module specification uses the same resolution as require module specification.
When importing from an ES5 module, the system generates an ES6 export: it adds to the module.exports map a new entry 'default' whose contents is a reference (sort of) to the pre-modified (original) module.exports. Imports other than the "*" type always pull this 'default' or pieces of it.
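A minimal ESM sketch (hypothetical greet.mjs / main.mjs files; assumes a Node version with ESM and top-level await support):

    // greet.mjs  -- *.mjs suffix, or "type": "module" in package.json
    export default function greet(name) { return `hi ${name}`; }
    export const VERSION = "1.0";

    // main.mjs
    import greet, { VERSION } from "./greet.mjs";   // full filename suffix required
    console.log(greet("world"), VERSION);
    // top-level await is allowed in ES modules:
    await new Promise((res) => setTimeout(res, 100));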
There is also a special shortcut syntax to get both default + specified values from default: import dVar, { v1, v2 } from "mSpec";
QUIRKS: If you delete map.x, x is not removed from both map and map.default. If you add map.y, map.y does not get added; map.default.y does! Avoid these quirks by not fetching non-default. Don't use '* as x'.

npm
---
Do frequent upgrades of npm. ONLY ON ArchLinux: pacman --noconfirm -S npm. WHEN npm is installed manually: [sudo] npm up -g npm
Version of the current package: npm pkg get version
Gotcha: uninstalls give no indication if they fail, e.g. if you typo the package name.
The help system doesn't work right on Archlinux. On Windows you would do "npm help X" for available man pages; on Archlinux you must do: man npm-X; or better https://man.archlinux.org/listing/?lang=en&sorting=alphabetical and in the top 'Search manual page' entry, enter "npm-X".
Config changes change ~/.npmrc by default: npm config set prefix x/y
Can use env vars in all config files like ${HOME}…
GOTCHA: ENTRY problem when specifying an "npm config set" value that's a ${VAR} of an absolute path. The system can't see that and converts it to be relative to your $PWD. In this case, edit the npmrc file manually.
npmrc files allow comments for lines beginning with ; or #.
npm i [-g]   # installs all dependencies (dev+reg) listed in ./package.json into ./node_modules/ (or an ancestor, per "npm root"), or to <prefix>/lib/node_modules if -g. [env var and switches exist to exclude dev dependencies]
npm i [-g] pkgname|tarfile|tarUrl   # -g to install under /usr/, otherwise into some node_modules/ dir (see INSTALL LOCATION ALGO below).
Nicely: tar file names don't matter (as long as *.tar/*.tar.gz/*.tgz).
Nicely: If a package with this name exists, you will replace it. If you point to a newer version, this is effectively upgrading.
More specifically, -g installs under the prefix ("npm prefix -g") at .../lib/node_modules/npm/node_modules AND always writes an element to the ./package.json dependencies list; and writes source and version details to ./package-lock.json.
INSTALL LOCATION ALGO. Impossible to state this in passing. Starts at PWD looking for a directory that contains either ./package.json or ./node_modules, recursively up ("npm prefix"). It then installs to ./node_modules/ ("npm root") and updates (or creates) the project's ./package.json and ./package-lock.json. If none found, then it uses PWD to make the same updates.
npm rm [-g] pkgname   # uninstall with its dependencies (equivalent to "npm un...")
Resolution (not installation) is via module.paths, which you can print out easily with: node -p module.paths
"npm start" looks for scripts.start in "package.json" (possibly elsewhere) and defaults to "server.js".
See "Load shared things" above for how to pull them in.
To update the dependency module versions pulled into my project, run:
    npm update   # Updates all dep modules which have available qualifying updates, where qualifying means obeying package.json version specifiers:
        ^ prefix means latest-within-major-version (update minor+patch)
        ~ prefix means latest-within-major+minor-version (update only patch)
        NO prefix means no auto-updates.
    (example after this block)
GOTCHA: Unfortunately there is NO WAY to preserve the transient scrolling messages about what actually gets updated. I tried all relevant configs and switches.
npm outdated   # Run anytime to see all available dependency upgrades
Show dependencies: npm ls [-l|-a]
I don't know why the last line of output starts with ` instead of +. The meaning seems the same.
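A hypothetical package.json dependencies fragment illustrating the version-specifier behavior described above (the package names and versions are just examples):

    {
      "dependencies": {
        "express":  "^4.18.0",
        "lodash":   "~4.17.0",
        "left-pad": "1.3.0"
      }
    }

    "^4.18.0"  -> npm update may move to any newer 4.x.y
    "~4.17.0"  -> npm update stays within 4.17.x (patch only)
    "1.3.0"    -> exact pin, never auto-updated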
FS STRUCTURE: Binaries are in node_modules/.bin/
To resolve a require("a/x") statement, a/x must be either:
    a file a/x.js script, or
    a directory a/x containing an a/x/index.js script.

The script/run system is awesome! Add entries to package.json's "scripts" which map from invocation-name to shell command.
    npm set-script name 'com mand'
    npm exec [-npm-switches...] [-- -switches-to-cmd] [args passed to cmd]   ONLY FOR PACKAGES. ???
        # Executes with the project environment set up as a global install would.
        # Most importantly, bins are made available.
    npx [-p pkg] [-switches-to-cmd] [args passed to cmd]   # auto-installs needed boilerplate. ONLY FOR PACKAGES. ???
    npm run                                        # list all configured script commands.
    npm run name [-- -switches-to-invoc] [args passed to invocation]   # Environment is same as for 'npm exec'
    npm test   # == npm run test
Some script names are special triggers: test, preversion, version, postversion. Probably others.
Has something added to PATH IIRC, perhaps the .bin dir.
You get all npm config and package.json values via env vars npm_config_* and npm_package_*.
Check: npm config list    Set: npm config [--location=project] set key=val
Upgrade flexibility. At least on Windows, if you do an "npm i -g npm", then the real global npm binary will somehow redirect to your %APPDATA%\npm npm.

Essential/Critical built-in modules: (You can reference them as anything, but the require name must be...)
    http   http server and client, but non-built-ins are better: axios client, http-server for an OOTB file server, express for a server framework. The http server gives poorly documented req + resp callback params, documented (inconsistently) at https://tutorialspoint.com/nodejs/nodejs_request_object.htm
    url    (for parsing)
    fs     File System. readFile and rename (and perhaps all methods) take a callback function and are asynchronous. fs.renameSync(oldName, newName) works great, silently overwriting when necessary.
    events class MyEmitterClass extends events.EventEmitter [my older note says: generally need to get an EventEmitter: new events.EventEmitter();]
           Raise/Throw an event of my type: eventEmitter.emit("myType", arbitraryParams...);
           Register an event type handler: eventEmitter.on("myType", fn); WHERE fn can take arbitraryParams.
           If you use a traditional function (non-arrow) then 'this' will be set to the eventEmitter instance.
           (See the EventEmitter sketch after this block.)
    util   (templating engine like printf, and a couple of other things)
    dns    (resolution)
    assert
    path   (path parsing)
    process   https://nodejs.org/api/process.html

WEB SERVERS
    OOTB http and https
    http-server   Awesome http FILE server. Just run the default bin script "http-server" with a param of the content dir.
    lws           Uses koa as middleware engine. Plugins implement a 'middleware' function. local-web-server = lws + 14 middleware features
    express       Provides a whole framework, not just a web server
    hapi
    koa

Essential/Critical optional modules:
    syncprompt    Works great on Linux but won't install successfully on Windows. :(
    formidable    (file uploading)
    nodemailer    (email)
    node-chartist (generates Chartist.js SVG HTML code)

process.nextTick(callbackFn, [passThru, parameters]); This causes it to execute after the current "operation" (all of the "user's code") but before the next event phase. Generally use setImmediate instead because it is simpler and more portable.
setTimeout works like regular JS.
setImmediate executes once the current poll phase completes (elsewhere it says in the next "iteration"??? but definitely AFTER the current phase). (The callback will execute sooner than setTimeout(fn, 0) if in an I/O callback.)

THREADING
1 Event Loop + k Worker Pool threads (aka the threadpool). Try to block none of these.
Often, only a few dns.*, fs.*, crypto.*, and zlib functions use Worker Pool threads.
Tactics:
    setImmediate() can be used to queue up partitions of work (all in the event loop)
    Offload via a custom dedicated Worker Pool
    Offload via a custom C++ addon
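A minimal EventEmitter sketch for the events notes above (class and event names are hypothetical):

    const EventEmitter = require("events");

    class MyEmitter extends EventEmitter {}        // subclassing is the usual pattern
    const em = new MyEmitter();                    // new (require("events"))() also works

    // Register a handler; a traditional (non-arrow) function gets `this` bound to the emitter:
    em.on("myType", function (a, b) {
        console.log("got", a, b, this === em);     // -> got arbitrary params true
    });

    em.emit("myType", "arbitrary", "params");      // raise the event with pass-thru params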
------------------------------------------------
EXAMPLES
http Listener: From https://www.w3schools.com/nodejs/ref_url.asp or the Node site.
    var http = require('http');
    var url = require('url');
    http.createServer(function (req, res) {
        res.writeHead(200, {'Content-Type': 'text/html'});
        var q = url.parse(req.url, true).query;
        var txt = q.year + " " + q.month;
        res.end(txt);
    }).listen(8080);
---------------------------------
HIGHLAND
Stream.split doesn't just split on '\n' (or the specified delimiter); it joins all existing elements first. I.e. ["one", "two", "three\nfour"] -> ["onetwothree", "four"]. Problem is, it expects only internal delimiters, so if input ends with "\n" you get an empty element at the end.
EXAMPLES
    let x = highland([1, 2, 3, 4]).through( (s) => { return s.map((i) => { return i*3; }); } );
    highland(fs.createReadStream("/tmp/nums.txt")).split().map((i)=>{return {nm:parseInt(i),c:i.length};}).toArray(utils.consoleObjListDump);
    const csvStream = fs.createReadStream("myData.csv")
    feeds: { myData: csvStream.through(pipeline)

Essential/CRITICAL COMMON STUFF
Determine JS version: console.log(process.versions);
__filename is the absolute path of the containing script. NOT AVAILABLE IN ESM!
__dirname is the directory containing the script. NOT AVAILABLE IN ESM!
(N.b. these are the only 2 global __ variables.)
GOTCHA! Not available in ES6, so use the workaround from https://stackoverflow.com/a/50052194/400306 :
    import { dirname } from 'path';
    import { fileURLToPath } from 'url';
    const __dirname = dirname(fileURLToPath(import.meta.url));
process.cwd() is $PWD
require("os").hostname()   // obviously
require("os").homedir()    // obviously
require("os").tmpdir()     // obviously
process.env map (See my function sortDict in file "tech/js.txt").
Command-line arguments, args, params. May depend on the invocation method, but at least for scripts with a "#!/usr/bin/env node" interpreter line, you have process.argv[0] == the node executable, process.argv[1] == the script, process.argv[2...]
Great way to just get the regular args: const args = process.argv.slice(2);
Exit: process.exit(2);
TO SIMPLY LIST DIR CONTENTS: console.log(require("fs").readdirSync("."));

External command execution via the child_process module. Multiple methods. There's a timeout option for all. My favorite for synchronous is spawnSync, but it doesn't allow interaction with IO: you set input beforehand and get returnVal, stdout, stderr afterward. (Sketch after this block.)
Huge differences between the different methods:
    execFileSync throws if the command returns a non-zero status.
    Return value: ChildProcess for all the async (non-Sync) methods; stdout for execFileSync, execSync; a plain object with optimally convenient members for spawnSync [default stdio not documented, but is empirically: ['pipe', pipe-auto->pObj.stdout, pipe-auto->pObj.stderr]] NOT .output, which is different!
    Use this option if output will not be consumed: {stdio: "ignore"}
    The env option's PATH entry can be used to set the executable search PATH.
    Generally, setting option 'input' to a string overrides 'stdio'[0] to be auto->pipe. What makes this largely useless is that it is silently ignored if you set 'stdio'; consequently you can only use it if you want the default stdio options for stdout and stderr.
    sync: the exec* and execFile methods do not support non-0 return values or stdin. [default output buffer sizes of 200kb can be increased; utf8 by default]
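A minimal spawnSync sketch (the `sort` command is just a hypothetical example; options shown are the ones discussed above):

    const { spawnSync } = require("child_process");

    const r = spawnSync("sort", ["-r"], {
        input: "b\na\nc\n",       // overrides stdio[0]; only works with default stdio
        encoding: "utf8",         // otherwise .stdout/.stderr are Buffers
        timeout: 5000,
    });
    if (r.error) throw r.error;               // e.g. command not found
    console.log("exit status:", r.status);    // null if killed by a signal (see r.signal)
    process.stdout.write(r.stdout);
    process.stderr.write(r.stderr);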
child_process.exec[File]Sync. You get stdout, but stderr passes through. The system handles this with default opt 'stdio: "pipe"' and has pipe handlers set up to return stdout from the *Sync() call and to "inherit" stderr. Can only defeat this by using spawn.
GOTCHA!! This is contrary to the docs! The default opt is supposedly stdio: "pipe", which would pass stdout+stderr to the Sync's just by virtue of that default 'stdio' option. Can easily change to
    'execSync' (i.e. no "File") executes a shell string.
promisified child_process.exec[File]. You get stderr too. 'exec' (i.e. no "File") executes a shell string.
child_process.spawnSync. No limitations.
Programmatic child abortion:
    const ac = new AbortController();
    ... = exec*|spawn*(cmd, args, {... signal: ac.signal ...});
    ... ac.abort();
spawn: spawn() returns an async ChildProcess (still running) vs. spawnSync() returns a plain object.
Why the fuck not always use this for sync??? It's awesome + simple! Because it doesn't support synchronous interaction with input or output.
retVal.stdout, retVal.stderr are Buffers, not streams! Use .toString("utf8")
stdio defaults to "pipe", resulting in:
    async: ChildProcess attrs {
        stdin    stream.Writable   has .write(), .end(), .setDefaultEncoding(), .writable/Ended/Length/Finished
        stdout   stream.Readable   have .read([size]), .readable/Ended/Length/Encoding();
        stderr   stream.Readable     events: 'data' -> sometimes string, sometimes Buffer, can be other, MAYBE dependent upon the 'encoding' option?
                                     'end' upon all output consumed; 'close' upon pipe close
        stdio    Socket[]
        exitCode number
        signalCode
        killed   boolean
    }
    events: error(Error), close(code) == cp.exitCode, exit(code) == cp.exit
    ** event handlers look like: obj.on(evName, fn(providedParam){});
    sync: pObj.status is the same as async .exitCode, except null if a signal was thrown (in which case see .signal).
async: [New option 'options.detached' (only for spawn* funcs?) allows the Node process to terminate while the child continues.]
child_process.exec[File]. Powerful. 'exec' (i.e. no "File") executes a shell string.
child_process.spawn. No limitations.
If you use the detach option, you need to do something with output, such as 'stdio: "ignore"' or stdio: ["ignore", fs.openSync("./out.log"), fs...];
To get from output (of all but .spawn) to a string, use bfrReturnVal.toString("utf8");
To spawn a program and let the user see output as it is written: even harder if the output is colorized or if you want to allow user input as they type it. For the simple case of displaying even colorized text as it is written, you can exec on Linux: mintty -e...  or on Win: start /wait cmd /s /c "..."   :: supports compound commands

Third party modules
difflib has critical bugs. The n option doesn't work. Fix easily by adding the n param to the "difflib.js" getGroupedOpcodes calls. contextDiff just does not represent differences reliably.

Bash getopt-like vargs processing: the yargs module: https://nodejs.org/en/knowledge/command-line/how-to-parse-command-line-arguments/
IMO this is more intuitive than the stricter getopt-like ports. See "~/code-templates/yargs.js".
Somehow require("yargs").argv provides a plain-object deep clone of yargs. But referencing this .argv is the action that actually causes argument parsing.
Switch default values. With no explicit default settings (ultimate default behavior): not in the yargs obj; if you give the switch with no value then true. If you specify a 'default:' then giving "-x" is the same as giving nothing: in the yargs obj with the default value whether you give nothing or "-x".
To manually display help, you need to require("yargs").showHelp(), and process.exit yourself. Good to do for failed value checks with a regexp.
Use requiresArg to require a paired value with a switch.
There is a glitch with .usage where it trims each line. Can only avoid this by having no space character at all on the line. Replace all spaces with either \t's and/or \u2009. Easiest workaround: .usage(....replace(/ /g, "\u2009")).
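A minimal yargs sketch in the singleton style used in the notes above (switch names are hypothetical; newer yargs versions prefer require('yargs/yargs')(process.argv.slice(2)), see ~/code-templates/yargs.js):

    #!/usr/bin/env node
    const yargs = require("yargs");
    const argv = yargs
        .usage("Usage: $0 -f <file> [-v]".replace(/ /g, "\u2009"))   // space workaround from above
        .option("f", { alias: "file", type: "string", demandOption: true, describe: "input file" })
        .option("v", { alias: "verbose", type: "boolean", default: false })
        .strict()
        .argv;                           // referencing .argv triggers the actual parsing
    if (!/\.txt$/.test(argv.file)) {     // manual value check + manual help display
        yargs.showHelp();
        process.exit(1);
    }
    console.log(argv.file, argv.verbose);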
Count of any character in a string: "one\ntwo\nthree".split("").filter((c)=>{ return c==="\n"; }).length

JDBC: Currently waiting for a reply from the 'jdbc' package maintainer. Getting no reply. This module is incredibly over-engineered. I have a working script at "~/tmp/nodejdbc/idunno.js" but it blocks when the work is completed. Module nodejdbc looks great but seems to have been aborted years ago and won't install successfully now. node-any-jdbc was also aborted and won't install successfully. IDEA: Build my own package which bundles a sql module JDK.

HTTP for clients. axios is better! (http.createServer is obviously for the server side.) The built-in SUCKS for http/https clients with PROXIES!! See "~/code-templates/webClientjs" for a working example of GET requests.
    // .get and .request params are optional, other than one of url or options:
    http.ClientRequest  http.get|request(url, {opts}, fn(http.IncomingMessage))
.get (as opposed to .request) automatically calls cr.end. If the 'end' event is not raised then you must call im.resume() to release memory.
http.IncomingMessage (IM): IM.read() OR add a 'data' handler OR IM.resume()
http.IncomingMessage, an EventEmitter. Essential functions:
    im.setEncoding("string");   // input or output???
    im.on("eventName", handlerFn);
    im.resume()   Deallocate if the 'end' event was not raised by the last 'data' callback or by the last string im.read().
    im.read()     Returns either a string or null. When the last of the data is read (not the post-last-data null), 'end' is raised.
Essential events:
    Though the system will execute some combos, use just one of 'readable' or 'data'.
    readable: Raised when data is received and for EOD. In the callback you should call im.read(). (See above about these im.read() calls raising 'end'.)
    data: Handler is called with a param of the input string. When raised with the final data (after handler execution?), 'end' is raised.
    end: Called unless you error. [deprecated?]
http.ClientRequest, an EventEmitter. Essential functions:
    cr.end(...)        Submits the request. Not needed for http.get ClientRequests.
    cr.write("string") Data to transmit.
Essential events:
    error: Handler is called with a param of an Error.

JSON and Pretty-printing
node-jq module for color-pretty-printing without console.x functions:
    node -e '(async () => {console.info(await require("node-jq").run(".", {a:1, b:10}, {input: "json", color: true}));})();'
To use an external jq binary:
    At install & update time: export NODE_JQ_SKIP_INSTALL_BINARY=true
    At runtime: export JQ_PATH=/usr/bin/jq

Base64 encoding: Buffer.from(string, "utf8").toString("base64")
Base64 decoding: Buffer.from(encodedString, "base64").toString("utf8");

Short program/script name: process.argv[1].replace(/.*[/\\]/, "")

ESLint
Uses the AST.
Direct interactive usage: eslint --no-eslintrc --stdin --env node --parser-options '{ecmaVersion: 3|5|6|12|latest}' <<< 'x'
Install with: npm i -D eslint   # answer prompts about the desired execution environment
Install the servicenow plugin with: npm i -D eslint-plugin-servicenow
To create an .eslintrc* file interactively, if you don't have one from some other source: npm init @eslint/config
Beware of the init interaction. Some prompts are multi-sel and you must space-sel items.
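A minimal hypothetical .eslintrc.json starting point reflecting the preferences below (comments are allowed in ESLint's JSON configs):

    // .eslintrc.json
    {
      "root": true,
      "env": { "node": true, "es2022": true },
      "parserOptions": { "ecmaFeatures": { "impliedStrict": true } },
      "extends": "eslint:recommended",
      "rules": {
        "no-unused-vars": ["warn", { "argsIgnorePattern": "^_" }]
      }
    }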
For node-centric packages without TypeScript or ES6 import, prefer "~/code-templates/eslintrc.json".
.eslintrc* files have overriding/cascading similar to node_modules cascading. You can stop cascading by setting "root": true. They have changed behavior over time, but I believe cascading now ends at the home dir, and if not below the home dir, there is no implicit ~/.eslintrc*.
Execute from the project dir with:
    node_modules/.bin/eslint [-f html -o file.html] [-c addl.json|addl.js] [--cache] paths.js
OR
    node_modules/.bin/eslint [-f html -o f.html] [-c ad.json|ad.js] --ext .js [--cache] dir/1/ dir2
    # the directories are searched recursively and node_modules is skipped
The --cache option allows for smart incremental checks using file ".eslintcache" (by default).
Config file selection: --no-eslintrc -c addl.json. Max of 2 config files: .eslintrc* and/or a -c. -c files require one of the same supported suffixes as .eslintrc* files, to indicate the language. The big difference with -c is: paths in the file are relative to $PWD rather than to the config file location.
Custom rules. Generally add them within a custom plugin/extension. But certain cases can be added without that, incl. function calls. All covered by https://medium.com/@_kbremner/make-eslint-do-the-work-adding-custom-rules-b83c2a1879a6
Creating a custom rule extension without a standalone module doesn't work as explained at https://stevenpetryk.com/blog/custom-eslint-rules/ . For this, simply use the --rulesdir switch.
Rules/Environment Plugin
In order for the user to use ANYTHING from a plugin, the user's config must include it in the "plugins" list. There is an awesome utility, RuleTester.
Just a plugin with base name beginning with "eslint-plugin-" (and for a scoped package it may be just @x/eslint-plugin); and the main JS exports a 'rules' and/or 'environments' and/or 'configs' map. (Skeleton sketch after this block.)
    module.exports.rules: the code may be the rules entry values right there.
    module.exports.environments: keys 'globals' and 'parserOptions', same as regular configs except: the globals.x value is not readable/writable/off but a boolean indicating writability.
    module.exports.configs: keys are the extends names to use after "plugin:thisPlugin/". Beware these are only configs; to include colocated rules you must specify that.
Disable comments are just like for codenarc, except that file-level and block-level (enable/disable) must use /*...*/, not //. The *-next-line-* alternative is even better than CN's. https://chaseadams.io/posts/eslint-disable-rule-by-comment-generator/
The comments above are for satisfying rules; there are also comments for satisfying globals:
    GOTCHA! This only applies to the entire file. Location doesn't matter!
    /* global var1, var2:writable */
    Slash-slash does not work (values on the next line); a slash comment only works if the eslint comment is the only comment on that line. A star comment only works if the eslint directive is the only thing in that comment.
    DEPRECATED global mapping values. Now use only: off, readonly, writable
Every type of JavaScript comment is allowed in the .eslintrc.json files: //, /*, partial-line, line-spanning.
The overrides element in config files is awesome. Can use this instead of or with cascading config files. Each element of overrides is a {} that sets a files globs list + any overrides for any root-level elements which make sense. If run with --stdin-filename, that's the string evaluated against the globs.
Files patterns: if --stdin is used without --stdin-filename, then options.filePath gets undefined. Files are matched relative to the config dir, not against the reported file path.
    Must specify a relative path. A **/ prefix means any OR NO dir prefix. **/* matches everything and nothing.
    Can not give an options.filePath of "". There is no way to test for an undefined path.
    files globs which match everything incl. undefined: "", "*", "**", "**/*", "*?*", "?*", "*?"
    "x" and "*x*" match x and *x* only in the final segment.
    Can test for the number of specified relative segments like: */*/* for 3 segments.
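A skeletal sketch of a plugin's main export (plugin and rule names are hypothetical):

    // eslint-plugin-myrules/index.js
    module.exports = {
        rules: {
            "no-foo-call": {
                meta: { type: "problem", docs: { description: "disallow calls to foo()" } },
                create(context) {
                    return {
                        CallExpression(node) {
                            if (node.callee.name === "foo") {
                                context.report({ node, message: "Do not call foo()." });
                            }
                        },
                    };
                },
            },
        },
        configs: {
            recommended: { plugins: ["myrules"], rules: { "myrules/no-foo-call": "error" } },
        },
    };

A consuming config would then need "plugins": ["myrules"] and could either enable "myrules/no-foo-call" directly or extend "plugin:myrules/recommended".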
Config env: env encompasses parser options and globals. Use (node + es2022) or (browser + es6). TODO: Ask about es5 w/ browser at the ESLint forum. "es*" values also set the ecmaVersion parser value.
Config parserOptions: This is very JS-engine-specific and doesn't cover globals. Defaults to ecmaVersion: 5 unless env sets a JS version; sourceType: script; ecmaFeatures.*: false.
The only things I would change are:
    ecmaVersion: a supported integer! or "latest". Defaults to 5 unless env has any "es*" value. I will usually not set this since I should always set an env "es*" value.
    ecmaFeatures.impliedStrict: true   <-- the only parserOption I will usually set
A rule resolution quirk: references to unavailable rules throw, UNLESS it's an "off" setting, in which case the missing rule is ignored.
GOTCHA: Specifying exceptions for no-unused-vars is clumsy. Must specify them as a regex AND there are subtly different option names for argument/param vars and non-argument/param vars.
MESSAGES: The rule's 'message' should be a 1-liner. It appears in results with the code source location. The rule's docs/description can be longer and is for inclusion in documentation.
CLI: The syntax BNF is wrong; and if you specify no input files at all you get useless no-op behavior. GOTCHA: Always specify input files somehow, at least with: .
Rule implementations and tests use the JavaScript AST. The online AST Explorer is an extremely useful utility.

fs.rm vs. fs.unlink: unlink doesn't work on directories. Both return undefined.

mocha plugin, unit test-runner. Install with -D. If you just run "mocha", it executes tests test/*.js, test/*.mjs, test/*.cjs from the current $PWD. Can give as param any other directory or files, so: mocha subdir/tst.d
The assertion equals function's usual param ordering: (actual, expected, message)

joi data type and object schema validation: See "~/code-templates/nodejs/joi.js".
GOTCHA: I don't understand the significance of: joi.object().keys({...}). So far I detect no difference from simply: joi.object({...})
The examples below are for object validation, using require("joi").object({...}).
Specified keys, if in the data, are always required to meet the specified type.
Can also check scalars with a schema like: joi.number().integer().validate(34.1, { presence: "required" }).error  OR  joi.number().integer().required().validate(34).error
To just check for keys without constraining values, use: joi.object().keys({keyA: keyConstraint})
MOST STRICT schema: add option presence: "required". Requires specified keys; prohibits unspecified ones.
MOST LIBERAL schema: tack function .unknown() onto the schema: specified keys are optional; unspecified keys are allowed.
I have used option 'presence: "required"' below, but if you need some required and some not, then use the schema tag-on function .required() instead.
Either:
    // undefined if good:
    const errMsg = require("joi").object({schema}).unknown().validate(data, { presence: "required" }).error;
OR
    const joi = require("joi");
    joi.assert(data, joi.object({schema}).unknown(), msgPrefixOrError, { presence: "required" });
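A concrete sketch of the two styles above (hypothetical data and keys):

    const joi = require("joi");

    const data = { name: "widget", qty: 3, extra: true };
    const shape = { name: joi.string(), qty: joi.number().integer() };

    // Liberal: unspecified keys allowed, specified keys optional
    const liberal = joi.object(shape).unknown().validate(data);
    console.log(liberal.error);                         // undefined when the data is acceptable

    // Strict: every specified key must be present
    const strict = joi.object(shape).validate({ name: "widget" }, { presence: "required" });
    console.log(strict.error && strict.error.message);  // complains that "qty" is required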
OTHER MODULES OF POSSIBLE INTEREST
    joi           datatype and object schema validator
    PropTypes     for property schema validation
    diff          See if I can make a good system out of the "diff" module, which is low-level. Doubtful that I can get it to do context lines like UNIX diff.
    PDFKit
    nodemailer
    restify
    prettier      Dev-time code formatter.
    cjsc          (by dsheiko)
    cheerio       (server-side jQuery)
    marked        markdown
    markdown-toc  generates github-compatible markdown TOC
    requireindex  Returns a plain object of a recursive tree of module.exports from all *.js files (excluding that file itself and _*.js) from the specified dir branch. require("requireindex")("a/dir") would return a {} built from searching the 'a/dir' branch for *.js files, and for each such file sd/f.js it would add: sd: {f: module.exports-from-f}. Typically makes sense to add this to the package.json "main" file and use '__dirname' as the param.

Linting/Code-checkers
    eslint and eslint-watch
    jscodesniffer (by dsheiko)
    jshint linter

json-easy-strip   Strips code comments from JSON. (There is also the newer 'strip-json-comments'.) The former has the ability to load files; the latter works only with strings and replaces with spaces by default.
strip-comments    GOTCHA: Has a bug where /* inside of comments strips everything after.
xmldom + xpath    XML and XPath
7zip-min          7Zip

UNIT TESTING: tst (looks good), mocha, MANY others.

env vars: process.env map

Newer versions have plain old-fashioned: string.replaceAll(a, b);   Great!

Text editing on Windows
    wordpad sucks. It tries to be smart and adds all sorts of binary stuff.
    notepad works only if EOLs are DOS \r\n
    np package
    tiny-cli-editor: node_modules/.bin/editor works only if EOLs are UNIX \n (it will STRIP OUT \r's!). The only supported commands are: Ctrl+C abort without saving; Ctrl+D save the file and exit. Display is still totally fucked up with older Windows CMD terminals where Properties has no 'Terminal' tab.

MEMORY
If you get OOME errors, try the following. Units are MB. Default is 4096 for Node v20.x.
Add node switch: node --max_old_space_size=4096 ...
OR export NODE_OPTIONS=--max_old_space_size=4096