Developers' guide - nodify/nodify-persist GitHub Wiki

Hi. Welcome to the nodify-persist developers' guide. This page answers the question "what do I need to know to use this package?" But before we start, let's go over what you should know already:

  • JavaScript & Node.JS - If you're expecting this page to be a tutorial on JavaScript or Node.JS, you're in for a big disappointment.
  • SQL and/or Mongo - Remember, this package doesn't completely abstract the database away, it just makes it easier to work with. You still need to know the basics of your database of choice.

What we're going to talk about here:

  • Basic Use Pattern
  • Nodify-persist Instances
  • Using the DAO's CRUD+I accessors
  • What goes into a Database Descriptor?
  • What goes into a Collection Schema?
  • How do I use more than one database?
  • What is the Collection Prototype Helper?
  • How do I export Collection code to a web browser? (and why would I want to?)
  • Any way I can cache DB results?

If you're looking for info on how to install nodify-persist, that's over at the Installing Nodify-Persist page now.

Basic Use Pattern

Nodify-persist performs two primary tasks:

  1. It creates a regular Create, Read, Update & Delete (CRUD) API on top of multiple database drivers.
  2. It maps tables / collections to JavaScript object prototypes so database queries return "real" objects, not just bags of key/value pairs.

Developers using nodify-persist will need to know a few things before starting:

  1. What database are you using?
  2. What tables / collections are you defining?
  3. If a record in a table could be represented by an object, what would it look like?

Once you have that data down, you're ready to start. Information about your database is stored in a database descriptor object that you pass to the nodify-persist constructor and that init() later reads. The database descriptor also lists the JavaScript object prototypes that represent each table / collection in the database. This example shows a database descriptor referencing a MySQL database with two tables, each defined in its own source file:

var db_descriptor = {
  mysql: {
    host: "127.0.0.1",
    database: "example",
    user: "jlee",
    password: "supertaster"
  },
  collections: [
    "user.js",
    "session.js"
  ]
};

A complete list of items in database descriptors is provided in the "What goes into a Database Descriptor?" section below. To create a nodify-persist instance, require the package and use the JavaScript new keyword passing the database descriptor as a parameter.

var persist = require( 'nodify-persist' );
var instance = new persist( db_descriptor );

Next, call the init function, passing a callback that takes an error and a DAO target object. The error parameter will be null or undefined if the request completed without error. The target object is the data access object which defines create, read, update, delete and insert methods for each collection defined in the database descriptor.

var dao;
instance.init( function( err, target ) {
  if( err ) { throw err; }
  dao = target;

  // put application specific code here. for example:
  _load_sample_users();
} );

After this init() callback is invoked, the dao variable will have these 10 methods defined: userCreate(), userRead(), userUpdate(), userDelete(), userInsert(), sessionCreate(), sessionRead(), sessionUpdate(), sessionDelete() and sessionInsert(). It should be clear what the Create, Read, Update and Delete methods do; the Insert method is a convenience function for creating several records at one time.

Before we look at an example of using the userInsert() function, let's talk a little about how tables and collection "classes" are defined. During initialization, nodify-persist reads schema information from the collection object prototypes referenced in the "collections" array in the database descriptor. In the example above, we had two such object prototypes referenced: "user.js" and "session.js". Each of these module files is read in with a standard require() call before processing.

If we looked inside the "user.js" module, we would see a typical JavaScript "class" with a constructor and likely other methods. We would also see a schema object which looks something like this:

{
  name: "user",
  table: {
    uid: "INT NOT NULL AUTO_INCREMENT",
    uname: "VARCHAR(80) NOT NULL DEFAULT ''"
  },
  key: 'uid'
}

This tells nodify-persist the MySQL table the module represents is named "user" and has two fields: "uid" and "uname". The "key" element in this object tells the system the "uid" field is the primary key. Putting it all together, it looks something like this:

function user( data ) {
  this.uid = data.uid;
  this.uname = data.uname;
}

user.prototype.toString = function () {
  return "User(" + this.uid + ") " + this.uname;
};

user.mysql = {
  name: "user",
  table: {
    uid: "INT NOT NULL AUTO_INCREMENT",
    uname: "VARCHAR(80) NOT NULL DEFAULT ''"
  },
  key: 'uid'
};

module.exports = user;

Because this example uses MySQL as the underlying database technology, the "schema object" in this example is named "user.mysql". If we were using SQLite, it would be "user.sqlite" and contain SQLite-specific information. It is possible to define a collection class with "schema objects" for multiple database back-ends.
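
As a sketch of what that might look like, assuming the sqlite schema object follows the same name / table / key shape as its MySQL counterpart (the column types below are illustrative guesses, not taken from a real project), the "user.js" module could carry both schema objects:

// in the same user.js module shown above, alongside user.mysql
user.sqlite = {
  name: "user",
  table: {
    uid: "INTEGER",                    // assumed SQLite-specific column definition
    uname: "TEXT NOT NULL DEFAULT ''"  // assumed SQLite-specific column definition
  },
  key: 'uid'
};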

Getting back to the application... after creating a new instance with a DB descriptor which references collection function definitions and their schemas, we're ready to use the DAO object. In the example above, we were just about to define the function _load_sample_users():

function _load_sample_users() {
  var sample_users = [
    { uid: 1, uname: "Anonymous User" },
    { uname: "John Lee" },
    { uname: "Thomas Edison" }
  ];

  dao.userInsert( sample_users, function( err ) {
    if( err ) { throw err; }

    _read_example();
  } );
}

function _read_example () {
  dao.userRead( { uid: 1 }, function ( err, data ) {
    if( err ) { throw err; }
    if( data.length > 0 ) {
      console.log( data[0].toString() );
    }
    _close();
  } );
}

function _close() {
  instance.close( function( err ) {
    if( err ) { throw err; }
    console.log( "DB Connection Closed" );
  } );
}

To review... the basic pattern is (see the consolidated sketch after this list):

  1. require the nodify-persist package
  2. create a nodify-persist instance passing a database descriptor as a parameter
  3. make sure collection classes referenced in the database descriptor have a schema
  4. call the init() function on the nodify-persist instance
  5. use the DAO object passed as a parameter to the init callback
  6. eventually call the close() function on the instance object
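
Here is the whole pattern in one compact sketch, reusing the MySQL descriptor and the "user" collection from the examples above:

var persist = require( 'nodify-persist' );              // step 1

var instance = new persist( {                            // step 2
  mysql: {
    host: "127.0.0.1",
    database: "example",
    user: "jlee",
    password: "supertaster"
  },
  collections: [ "user.js", "session.js" ]               // step 3: each module carries a schema
} );

instance.init( function( err, dao ) {                    // step 4
  if( err ) { throw err; }

  dao.userRead( { uid: 1 }, function( err, users ) {     // step 5
    if( err ) { throw err; }
    if( users.length > 0 ) { console.log( users[0].toString() ); }

    instance.close( function( err ) {                    // step 6
      if( err ) { throw err; }
    } );
  } );
} );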

Nodify-persist Instances

Nodify-persist instances have four primary public functions: the constructor, init(), query() and close().

the constructor -- i.e. new persist( db_descriptor )

To create a persistence instance, create a database descriptor and pass it to the constructor:

var persist = require( 'nodify-persist' );
var instance = new persist( { mysql: { ... }, collections: [ ... ], ... } );

Details for database descriptors are in the "What goes into a Database Descriptor?" section below.

init( callback )

The constructor does not perform any action other than saving a copy of the database descriptor it is passed. To read the database descriptor, establish connections with databases and create the accessors, call the init() function.

The init() function takes a callback with two parameters: error and dao. If initialization completes without error, the error value should be null and the dao argument should be set to an object with the newly created create, read, update, delete and insert functions:

instance.init( function( err, dao ) {
  if( err ) { throw err; }
  dao.whateverCreate( ... );
} );

query( query, [parameters], callback )

The query() function allows you to send a query directly to the database. The query argument is either a string (for SQL-backed instances) or an object (for mongo-backed instances). The MySQL driver allows the query to be broken up into a parameterized query string and an array of values to fill in. For instance, the following two queries are essentially identical:

// Query 1
instance.query( "SELECT * FROM blargh WHERE id=? AND dingo=?", [144, 1], _callback );

// Query 2
instance.query( "SELECT * FROM blargh WHERE id=144 AND dingo=1", _callback );

The callback to the query function takes two arguments: error and data. The error argument is null if the query completed successfully. The data argument is an array of maps, each representing a result from the query.
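
For example, the _callback used above might look something like this (the id and dingo column names come from the sample query, not from a real schema):

function _callback( err, data ) {
  if( err ) { throw err; }

  // each element of data is a map of column names to values
  for( var i = 0; i < data.length; i++ ) {
    console.log( "id: " + data[i].id + ", dingo: " + data[i].dingo );
  }
}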

close( callback )

The close() function tears down database connections and should be called before exiting. The callback function is passed an error if there were errors on close:

instance.close( function( err ) {
  if( err ) {
    console.log( "Ouch! There was an error closing the database: " + err.toString() );
  }
} );

Using the DAO's CRUD+I accessors

One of the parameters passed to the init() callback is the DAO - Data Access Object. Application programmers use this object's functions to access information in the database. Five functions are created for each collection listed in the database descriptor: create, read, update, delete and insert.

Create( data, function( err, data ) )

Create accessors take a map whose keys represent database table column names and whose values represent the data to place in each field. After inserting the record into the database, the accessor calls the callback function. If an error occurred, err will be non-null. If the call is successful, the data argument will be a collection object.

Some databases support auto-incrementing index values. Objects passed to the callback will include these IDs in their definition (i.e. you don't have to read immediately after creating just to get the auto-assigned key).

For example, if we had a collection named "session", we could create a session record in the database with this call:

dao.sessionCreate( {uuid: "c28097e8-d121-403a-b9a5-9ea82f933194"}, function( err, data ) {
  if( err ) { throw err; }
  console.log( data.toString() );
} );
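
And because the "user" table defined earlier uses an AUTO_INCREMENT uid, the object passed back from userCreate() should already carry the key the database assigned. A sketch:

dao.userCreate( { uname: "Grace Hopper" }, function( err, user ) {
  if( err ) { throw err; }

  // uid was assigned by the database; no follow-up read is needed
  console.log( "new user id: " + user.uid );
} );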

Read( data, function( err, data ) )

Read accessors take a map whose keys represent field names you wish to search for and whose values represent the field values you wish to search for. You can search for more than just key fields and read will return 0, 1 or many results in response. The second argument to the read callback is always an array of collection objects.

For example, if there were a collection named "user" with id and name fields, both of the following queries would be valid:

dao.userRead( {id: 12}, _post_read );
dao.userRead( {name: "James Fenimore Cooper"}, _post_read );

function _post_read( err, data ) {
  if( err ) { throw err; }
  if( 0 == data.length ) {
    console.log( 'Not Found' );
  } else {
    for( var i = 0, il = data.length; i < il; i++ ) {
      console.log( "Result " + i + ": " + data[i].toString() );
    }
  }
}

Update( data, function( err ) )

The update accessor takes a map with key/value pairs representing field names and field values. The query constructed uses ONLY a table's key to search. The remainder of the field values are used to update the database record.
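
A sketch, using the "user" collection defined earlier; uid is the key, so it selects the record, while uname is the value being changed:

dao.userUpdate( { uid: 1, uname: "Anonymous Coward" }, function( err ) {
  if( err ) { throw err; }
  console.log( "user 1 renamed" );
} );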

Delete( data, function( err ) )

The delete accessor removes the row represented by the data argument from the database, calling the callback function upon completion.
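
For example, removing the user with uid 3 might look like this:

dao.userDelete( { uid: 3 }, function( err ) {
  if( err ) { throw err; }
  console.log( "user 3 removed" );
} );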

Insert( data, function( err, data ) )

The insert accessor is a convenience function taking an array of values and calling the create accessor on each. The callback takes an error and an array of objects created based on the data provided. Note! Ordering is not preserved between input and callback arrays. In other words, the first element in the object array passed to the callback may not have been created from information in the first element of the data array passed to the insert accessor.
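
A sketch, again using the "user" collection; note that the created objects may come back in any order:

dao.userInsert( [
  { uname: "Nikola Tesla" },
  { uname: "Hedy Lamarr" }
], function( err, users ) {
  if( err ) { throw err; }

  // users is an array of user objects, not necessarily in input order
  for( var i = 0; i < users.length; i++ ) {
    console.log( users[i].toString() );
  }
} );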

What goes into a Database Descriptor?

Database descriptors describe where a database is located and what kind of database it is (MySQL, SQLite, Mongo). Here is a list of elements in the descriptor:

  • mysql
  • sqlite
  • mongodb
  • providers
  • target
  • loglevel
  • logfunc
  • drop
  • cache
  • cfunc

What goes into a Collection Schema?

What is the Collection Prototype Helper?

How do I export Collection code to a web browser? (and why would I want to?)

Any way I can cache DB results?
