In this second of three posts on Node.js, I will complete the implementation of the plug-in we designed in part one and build and test it on an IBM PureApplication System. Along the way I will cover the details of creating PureApplication System plug-ins, as well as how to use them.
In part one of this series, we looked at an overview of the plug-in mechanism of IBM PureApplication Systems and how we could add support for a new OSS web application stack: Node.js.
As I noted, Node.js makes some interesting architectural and implementation decisions that might make it more performant for some classes of workloads. I also identified a minimal set of three attributes needed to create a Node.js cloud component so that we can support Node workloads in IBM PureApplication Systems.
If you did not read the previous post, or don’t recall its details, now might be a good time to go and peruse it. In this post I am going to complete the plug-in realizing this Node.js component, then build and test it on an installation of IBM PureApplication System.
Creating the Node.js plug-in
Now that we have an idea of how we want to structure our Node.js pattern, which plug-in we should create, and what attributes it needs to have, let’s dive in and create the plug-in.
As mentioned in the previous post, PureApplication System plug-ins have a strict form as far as the directory structure and some of the files located within it.
At the root of each plug-in are two directories containing manifest files that describe the plug-in: META-INF/MANIFEST.MF and OSGI-INF/node_jsXForm.xml. Additionally, the root directory contains build.plugin.xml, an Apache ANT file used to build the plug-in. We will not concern ourselves with these files as they contain mostly boilerplate content that is similar across all IBM PureApplication System plug-ins.
Of the various files and directories in the resulting plug-in, we are mainly concerned with two directories: “plugin” and “src.” The “plugin” directory contains the files for the plug-in, of which the most important are plugin/appmodel/metadata.json, plugin/config.json, plugin/parts/node_js.scripts, and the binary files in the directory plugin/parts/**. The “src” directory optionally contains Java source files and Apache Velocity template files for further customization; in this plug-in I will not use this feature.
The remaining files can mostly be copied from an existing example plug-in with minimal modifications. Let’s take each of the files I modified in turn and discuss its content and the changes needed for Node.js.
The metadata.json file located in the “plugin” directory is the key file providing most of the plug-in’s description. This includes its unique identification and all associated information such as name, description, images and help files.
The bottom section of metadata.json includes the list of attributes that this plug-in supports. For each attribute you list not only its name and description, but also its type (such as string, integer, or Boolean), along with additional information such as a help hover string, an optional regular expression to validate input, and a message string to display when the input data is incorrect.
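To make this concrete, an attribute section in metadata.json might look roughly like the sketch below. The field names and values here are illustrative approximations based on the description above, not the authoritative PDK schema:

```json
[
  {
    "id": "nodejs_version",
    "label": "Node.js version",
    "type": "string",
    "required": true,
    "options": [ "0.8.26", "0.10.26" ],
    "description": "The version of Node.js to install on the VM"
  },
  {
    "id": "run_npm",
    "label": "Run NPM",
    "type": "boolean",
    "required": false,
    "description": "Whether to run NPM to install dependencies before starting the app"
  },
  {
    "id": "git_url",
    "label": "Git repository URL",
    "type": "string",
    "required": true,
    "regex": "^(https?|git)://.*$",
    "message": "Please enter a valid Git repository URL",
    "description": "URL of the Git project containing the application to deploy"
  }
]
```

Note how an attribute with a fixed list of allowed values (here, the version) gives the pattern designer enough information to render a drop-down rather than a free-form text field.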
As I mentioned last time, the IBM PureApplication System pattern designer web tool parses a plug-in’s metadata.json to create a UI stylesheet containing the attributes listed for the plug-in. This stylesheet lists all attributes and, using each attribute’s type and other information, can guide the user toward valid values for each attribute.
The picture above shows the stylesheet for the Node.js plug-in. Note how the version attribute includes a drop-down list for the values since these version values are fixed and noted as a list in the metadata.json section for the Node.js version attribute.
When a plug-in is used to create a pattern in IBM PureApplication System, the pattern can be instantiated as a concrete deployment. Typically the user must provide any missing information. For the Node.js plug-in, most patterns using it will need to be concretized by the user providing the URL of the Git project where the application to deploy resides.
Since the other attributes have defaults, the patterns can typically be instantiated as-is. However, how does the system know which virtual machines (VMs) should be created, how many, and where each plug-in should be deployed?
While IBM PureApplication System is usually able to automate the deployment of most patterns, it uses hints from the plug-in’s plugin/config.json file as its primary means of determining the plug-in’s affinity for various VM parameters. For instance, if the plug-in requires specific CPU or memory lower limits, these can be specified in the config.json file.
Other information, such as the image for the VM as well as the VM type, can also be specified. For our Node.js plug-in, we will simply require the typical image, which contains the default Linux operating system and most of the dependencies we need for Node.js.
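To give a flavor of these hints, a config.json for this plug-in might look something like the following sketch. The keys and values shown are illustrative assumptions based on the description above rather than the exact PDK schema:

```json
{
  "name": "node_js",
  "version": "1.0.0.0",
  "packages": {
    "NODEJS": [
      {
        "requires": { "cpu": 1, "memory": 512 },
        "parts": [
          { "part": "parts/node_js.scripts.tgz" }
        ]
      }
    ]
  }
}
```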
Every plug-in will eventually be deployed when used as part of a pattern. The deployment process is where most of the plug-in’s work is executed. The deployment is essentially a flow that includes well-known sequences in which each plug-in gets configured, installed, and started. As you may have guessed by now, these plug-in steps are achieved by executing the plug-in’s configure.py and install.py scripts, and optionally start.py, and stop.py when the deployment is stopped.
The configure.py script is usually reserved for setting up, configuring, and downloading dependencies for the plug-in. In our case, this is where we install PCRE (Perl Compatible Regular Expressions), a library that we need to run Node.js on the Linux VM that is created. In install.py we divide the installation into three parts:
1. Download and install the appropriate version of Node.js directly from Github.com. This is shown in line 37, which calls a shell script to complete the installation (shown in the screenshot below). In the shell script we get the parameters and download the correct Node.js version in lines 21 to 23.
2. Similarly, in the Python script we download and install Git and NPM (Node Package Manager) in lines 29 and 39.
3. Download the application from the Git repository, lines 44 to 46.
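The three steps above can be sketched in Python along the following lines. This is a simplified stand-in for the actual install.py: the download URL, target paths, and function names are my assumptions for illustration, and the real script runs inside the PDK’s deployment environment rather than building bare command strings:

```python
# Simplified sketch of the three install.py steps. URLs, paths, and
# function names are illustrative assumptions, not the actual plug-in code.

def node_install_command(version, target_dir="/usr/local"):
    """Step 1: build the shell command that downloads and unpacks Node.js."""
    tarball = "node-v%s.tar.gz" % version
    # Assumed download location; the plug-in fetches the requested version.
    url = "https://github.com/joyent/node/archive/v%s.tar.gz" % version
    return ("curl -L -o /tmp/%s %s && tar -xzf /tmp/%s -C %s"
            % (tarball, url, tarball, target_dir))

def tooling_install_command():
    """Step 2: build the command that installs Git and NPM on the VM."""
    return "yum -y install git && curl -L https://npmjs.org/install.sh | sh"

def app_clone_command(git_url, working_dir):
    """Step 3: build the command that fetches the user's application
    from the Git repository supplied as a plug-in attribute."""
    return "git clone %s %s" % (git_url, working_dir)
```

Keeping each step as a separate, small command makes it easy to log and retry individual failures, which matters on VMs with flaky outbound network access.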
The start.py script is used, as expected, to start the application. Since the app might have various NPM-packaged dependencies, we first run NPM and then start the Node.js application server in the app directory. Whether NPM should be run is decided using the plug-in’s Run NPM attribute. The basic command is something like:
exec /usr/bin/nohup /usr/local/bin/node $WORKING_DIR/$NODE_APP_NAME > /dev/null 2>&1 &
This assumes that environment variables are set for the working directory and the Node.js application name—passed as a user parameter from the plug-in.
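Putting these pieces together, a helper that assembles the launch line for start.py might look like the sketch below. The function name and the Run NPM handling are my assumptions based on the description above, not the plug-in’s actual code:

```python
def build_start_command(working_dir, app_name, run_npm=False):
    """Assemble the shell line start.py would hand to the OS.

    run_npm mirrors the plug-in's Run NPM attribute: when true, NPM first
    installs the app's declared dependencies in the working directory.
    """
    start = ("exec /usr/bin/nohup /usr/local/bin/node %s/%s > /dev/null 2>&1 &"
             % (working_dir, app_name))
    if run_npm:
        return "cd %s && npm install && %s" % (working_dir, start)
    return start
```

Running the server under nohup with output discarded detaches it from the deployment script, so the script can return while the Node.js process keeps serving.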
In the plugin/parts directory and its subdirectories we keep the actual installation tarball files for PCRE, NPM, and Node.js. This allows the plug-in to install even if the VM does not have access to the Internet or, more commonly, if a failure occurs when the download is attempted. In general, it’s a good idea to keep all dependent files that the plug-in needs to set up and install correctly. In our case, the correct version of PCRE is contained therein, since the rest of the Node.js setup depends on it being installed on the Linux environment.
Building the Node.js plugin
Building the plug-in requires Java 6 and Apache ANT; Java is needed since Apache ANT is a Java tool. Running the build is easily done with one command:
$ ant -f build.plugin.xml
A correct installation of the Plugin Development Kit (PDK) means that the dependencies and other associated Java JARs will be on your CLASSPATH. The ANT command, if successful, results in a tarball file created in the export directory. This tarball is the plug-in itself; in our case the file is export/node_js-184.108.40.206.tgz. The version is controlled by the value in plugin/config.json; changing that value will result in a new version of the plug-in.
Deploying and testing the Node.js plugin
Now that we’ve created our Node.js plug-in and discussed how to build it, one thing remains: how do we use it? In this section we answer this question by first describing how to install the plug-in into an IBM PureApplication System and then using the plug-in to create a simple Node.js pattern. From there we will test the plug-in by deploying the pattern.
Adding plug-in to IBM PureApplication System
Once your plug-in is built, the next step, before you can create patterns to test and use it, is to deploy the plug-in into an IBM PureApplication System. This can be achieved in one of two ways:
1. If you have access to the IBM PureApplication System controller node via SSH, then you can add and remove plug-ins by executing the shell scripts /opt/IBM/<ibm-pure-app-sys-directory>/plugin_add.sh and /opt/IBM/<ibm-pure-app-sys-directory>/plugin_remove.sh. The plugin_add.sh script takes as its argument the tarball file of the plug-in. To plugin_remove.sh you pass the name and version of the plug-in to remove.
2. Alternatively, there is a web interface for managing plug-ins, accessible in the admin section of the IBM PureApplication System dashboard. The plug-ins are listed there, and new plug-ins are added by uploading the tarball file for the plug-in.
When the plug-in is successfully added to the IBM PureApplication System plug-in catalog, it will show up in the tool palette on the left-hand side when you create a new pattern or edit an existing one.
Creating a Node.js pattern
Now that the plug-in is installed, the next step is to create a simple pattern to deploy a Node.js application. Doing so is simple using the drag-and-drop interface of the IBM PureApplication System pattern designer application. You launch the application by clicking Pattern -> Virtual Applications -> New (+ button) and selecting the “Blank Application” option so that the pattern starts from a blank sheet.
Once the pattern builder application is loaded, you simply drag and drop the Node.js cloud component from the left-hand side list onto the canvas. Selecting the component shows its stylesheet, where you can select the default Node.js version to use as well as specify that NPM should be run to download dependencies. Save the result as a pattern, giving it the name “Node.js pattern.”
Testing a non-trivial Node application
Testing this new pattern is relatively easy. You now need to create a new cloud application using the Node.js pattern, which shows up in the list of available patterns. What remains is to give the application a name and point it to a Git repository. For testing purposes we will deploy Nodejitsu’s Prenup, a collaborative Behavior-Driven Development (BDD) application found on Github.com at: https://github.com/nodejitsu/prenup.
Prenup is a pure Node.js OSS application that facilitates BDD by creating a collaborative engagement between developers and clients. Once the cloud application is saved, it will show up in the list of cloud applications, and all that remains is to click deploy. Once deployed, you can access the details of the deployment, including the status of the VM, its log files, and a link to the deployed application. From there you can manage the deployment, such as stopping or deleting it.
In this post we completed the Node.js plug-in that we designed in the first post of this series. In doing so we covered aspects of the plug-in in detail and also showed how to build and install it on an IBM PureApplication System. Using the plug-in we created a simple Node.js pattern and used it to deploy Nodejitsu’s OSS Prenup app from Github.com.
Next we will complete this series by discussing what can go wrong in the process so far and giving hints for debugging and testing your plug-ins. We will also briefly highlight some advanced features, such as quality of service (QoS) and link plug-ins, which can be used to extend the current plug-in and make it more functional: for instance, using a link plug-in to create a connection between Node.js and a database, or adding scaling QoS properties to the current plug-in.