Node.js – File System

The first experiment I tried was accessing the File System. This is not a complete tutorial, but a series of commands to try in your programs, so you will need some programming knowledge to make sense of them. Full tutorial to come later if there’s a demand for it.

Watching a file for changes

Create a file

touch myfile.txt

Create a node program – watchfile.js

const fs = require('fs');
fs.watch('myfile.txt', function() {
  console.log("File 'myfile.txt' just changed");
});
console.log("Watching 'myfile.txt' for changes");

Run the program with node

node watchfile.js
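
Incidentally, the callback you pass to fs.watch also receives the event type and the name of the file that changed, so you can report a little more detail if you like. A minimal variation of the watcher above:

const fs = require('fs');
fs.watch('myfile.txt', function(eventType, filename) {
  // eventType is typically 'change' or 'rename'
  console.log("Event '" + eventType + "' on '" + filename + "'");
});
console.log("Watching 'myfile.txt' for changes");

Edit or touch myfile.txt from another terminal to trigger it.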

ARGV

ARGV is part of the global process object. Use process.argv to access the command-line parameters:

  • 0 = path to the node executable
  • 1 = full path to the script being run
  • 2 onwards = the command-line arguments

const inputArg = process.argv[2];
console.log(inputArg);
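
For example, assuming the snippet above is saved as argv.js (the filename here is just for illustration), you can pass it an argument on the command line and see it echoed back:

node argv.js myfile.txt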

If you require a file argument remember to check for it in the code and throw an Error to halt execution.

if(!inputArg) {
  throw Error("You must enter a valid Argument!");
}

Child Process

You can execute programs using the child_process module, specifically the spawn function. As functions are first-class citizens in JavaScript, they can be assigned directly to variables.

"use strict";
const spawn = require('child_process').spawn;
...
fs.watch(filename, function() {
  let ls = spawn('ls', ['-lh', filename]);
  ls.stdout.pipe(process.stdout);
});

“use strict” disables some of JavaScript’s more problematic legacy behaviours. It must be the first statement in your file (although comments can technically appear above it).
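
For example, in strict mode assigning to a variable that was never declared throws an error instead of silently creating a global:

"use strict";
total = 5;  // ReferenceError: total is not defined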

LET is similar to CONST, but a variable declared with LET can be reassigned; a CONST cannot.
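
A quick illustration:

const limit = 10;
let count = 0;
count = count + 1;  // fine, let variables can be reassigned
limit = 20;         // TypeError: Assignment to constant variable.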

SPAWN takes the program name and an array of arguments to pass.

The ChildProcess object returned by SPAWN has STDIN, STDOUT and STDERR stream properties, which means we can read and write data. PIPE lets us send the child’s output to our own STDOUT, which is the console. You could just store the result, but that’s not much fun at the moment!
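
Putting those pieces together, a complete version of the watcher might look something like this (the file name watchspawn.js and taking the target file from process.argv are my own additions for illustration):

"use strict";
const
  fs = require('fs'),
  spawn = require('child_process').spawn,
  filename = process.argv[2];

if (!filename) {
  throw new Error("You must specify a file to watch");
}

fs.watch(filename, function() {
  // Re-run 'ls -lh' on the target every time it changes
  let ls = spawn('ls', ['-lh', filename]);
  ls.stdout.pipe(process.stdout);
});
console.log("Watching '" + filename + "' for changes");

Run it with node watchspawn.js myfile.txt and touch the file to see the ls output appear.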

EventEmitter

Lots of classes inherit from (by extending) EventEmitter, including STREAMS and CHILD_PROCESS. We can attach ‘on’ listeners to act on specific events. For example, when a stream receives data we can act on it:

ls.stdout.on('data', function(newData) {
  output += newData;
});

Or when a child_process exits it emits the ‘close’ event:

ls.on('close', function() {
  let parts = output.split(/\s+/);
  console.dir([parts[0], parts[4], parts[8]]);
});
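
For completeness, here is how those two listeners might sit together inside the watch callback, with the output variable declared and reset each time the file changes (a sketch based on the snippets above; fs, spawn and filename are set up as in the earlier example):

fs.watch(filename, function() {
  const ls = spawn('ls', ['-lh', filename]);
  let output = '';

  ls.stdout.on('data', function(newData) {
    output += newData;  // accumulate chunks as they arrive
  });

  ls.on('close', function() {
    // Pick out the permissions, size and filename columns of the 'ls -l' output
    let parts = output.split(/\s+/);
    console.dir([parts[0], parts[4], parts[8]]);
  });
});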

Asynchronous File Read and Write (All-At-Once)

Essentially there are two approaches to file access: all-at-once, or buffering through a STREAM. The all-at-once approach is relatively simple and is part of the FS (File System) module.

const fs = require('fs');
fs.readFile('target.txt', function (err, data) {
  if(err) {
    throw err;
  }
  process.stdout.write(data.toString());
});

There are a couple of things to notice here. READFILE is a method that takes a filename and a callback function, to which it passes 2 arguments: ERR and DATA. How do I know that? A quick search for ‘node.js readfile’ brings up the documentation for ‘fs’ (https://nodejs.org/api/fs.html); find the READFILE method there and the description tells you what arguments to expect in the callback you are writing.

ERR will be null (i.e. falsy) if nothing went wrong, so a quick check at the start of the callback function is a common pattern in node.js. Then you can deal with the BUFFER object as you need.
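
For example, once you have the BUFFER you can inspect its size or decode it to a string (a small sketch, reusing target.txt from above):

const fs = require('fs');
fs.readFile('target.txt', function(err, data) {
  if (err) {
    throw err;
  }
  console.log(data.length);            // size of the Buffer in bytes
  console.log(data.toString('utf8'));  // decode the Buffer into a string
});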

Writing a file is just as simple:

fs.writeFile('target.txt', 'Updated Message!', function(err) {
  if(err) throw err;
  console.log("File Saved!");
});

This shows a one-line version of an IF statement, if you want to save on braces. Also notice that the callback receives only one argument, an ERR, to let you know whether the operation was successful.

Asynchronous File Read and Write (Streams)

It is also possible to create STREAMS to read and write data. In this case you can listen for EVENTS, including errors, like so:

const
  fs = require('fs'),
  stream = fs.createReadStream(process.argv[2]);
stream.on('data', function(chunk) {
  process.stdout.write(chunk);
});
stream.on('error', function(err) {
  process.stdout.write("ERROR: " + err.message + "\n");
});

You can try this with a file that doesn’t exist to see the error handler report the problem and let the program finish cleanly instead of crashing with an unhandled exception.
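
Writing via a stream works the same way: createWriteStream returns a writable stream you can push chunks into (a minimal sketch; target.txt is just an example filename):

const
  fs = require('fs'),
  out = fs.createWriteStream('target.txt');
out.write("First chunk\n");
out.write("Second chunk\n");
out.end(function() {
  console.log("Finished writing");
});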

It is also possible to read and write synchronously (readFileSync), but this will block the Event Loop until it has finished. Generally this is not a good idea unless you rely on the file during the initialization phase of your program. REQUIRE actually makes use of this to ensure module code is available before execution continues, since a failure here would cause problems later.
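
A quick sketch of the synchronous version, which simply returns the data (or throws) rather than taking a callback:

const fs = require('fs');
// Blocks the Event Loop until the whole file has been read
const data = fs.readFileSync('target.txt');
console.log(data.toString());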

That’s it. Next up I’m going to start looking at networking.