Writing a Node.js kernel for IPython in half an hour.

This will show you how to write a native kernel for the IPython notebook server using node.js.

For those who don't know what Node is, here is the short description from its website:

Node.js is a platform built on Chrome's JavaScript runtime for easily building fast, scalable network applications. Node.js uses an event-driven, non-blocking I/O model that makes it lightweight and efficient, perfect for data-intensive real-time applications that run across distributed devices.

For those who don't know IPython, and in particular its Notebook frontend: you are probably reading this in a static version of the notebook in which I wrote it. The IPython notebook is a web-based frontend which allows interactive programming and the display of rich media types like images, video, and $\LaTeX$ thanks to MathJax. Python is for now the only natively supported kernel, but we wish to improve this.

A prototype Ruby kernel has already been developed, and I am working on doing the same with Julia.

Let's go.

First, you will need the content of this pull request, if it has not been merged yet.

Make sure you can start the IPython notebook and that everything works.

Create an IPython profile; I'll call it "node":

$ ipython profile create node

Edit your notebook profile (~/.ipython/profile_node/ipython_notebook_config.py) and add the following:

c.SubprocessKernelManager.kernel_launch_program = ['node','~/node-kernel/kernel.js','{connection_file_name}']
c.MappingKernelManager.kernel_manager_class= 'IPython.zmq.subprocesskernel.SubprocessKernelManager'

where ~/node-kernel/kernel.js is where your node kernel will live.

You might also want to install Node itself; it will be helpful.

Install node-zmq. Simply use:

$ npm install zmq

If you don't know what ZMQ is, Clarke's third law states that

Any sufficiently advanced technology is indistinguishable from magic

So ZMQ is a magic messaging library.

Start to code

Our program will have to read its configuration from the filename given as the first argument, and connect three ZMQ channels.

In [*]:
var zmq = require("zmq");
var fs = require("fs");

// the connection file path is passed as the first argument
var config = JSON.parse(fs.readFileSync(process.argv[2]));
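For reference, the parsed config object contains (among other fields) the IP and one port per channel. A sketch with made-up values, since the real ports vary per session:

```javascript
// illustrative connection-file content; real values differ every session
var sample = JSON.stringify({
    ip: "127.0.0.1",
    shell_port: 51000,
    iopub_port: 51001,
    hb_port: 51002,
    key: ""
});

var config = JSON.parse(sample);
console.log(config.ip + ":" + config.shell_port); // 127.0.0.1:51000
```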

I'll let you look at the config object yourself. Set up the connection strings:

In [*]:
var connection = "tcp://" + config.ip + ":";
var shell_conn = connection + config.shell_port;
var pub_conn = connection + config.iopub_port;
var hb_conn = connection + config.hb_port;

Create a context in which to evaluate the code we will receive later (copy-pasted from the docs).

In [*]:
var util = require('util'),
    vm = require('vm'),
    initSandbox = {},
    context = vm.createContext(initSandbox);

First, to avoid the message telling us that the kernel has died, we set up an echo server on the heartbeat channel, which is a simple req/rep pair in ZMQ dialect. For whatever reason, we have to log something to the console, otherwise it does not work (no idea why; I'll figure it out later).

In [*]:
var hb_socket = zmq.createSocket('rep');
hb_socket.bindSync(hb_conn);

// echo every heartbeat ping straight back
hb_socket.on('message', function(data){ hb_socket.send(data); });

console.log("Dummy console log.");

Now we create and bind the two other channels; pub_socket should be of type 'pub', and reply_socket of type 'xrep'.

In [ ]:
var pub_socket = zmq.createSocket('pub');
pub_socket.bindSync(pub_conn);

var reply_socket = zmq.createSocket('xrep');
reply_socket.bindSync(shell_conn);

Now, most of the code will live in the callback which is called when reply_socket receives a message.

In [*]:
reply_socket.on('message', function(data){
    // rest of the code goes here
});

Well, actually we will not use data. As messages are sent in multiple parts, the callback receives a variable number of arguments. We are only interested here in arguments 3 and 6 (zero-indexed): respectively the header and the content of the incoming message.
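Concretely, the frame layout this assumes (following the IPython wire protocol of the time) is: identity, delimiter, signature, header, parent_header, metadata, content. Indexing a mock message shows why we read positions 3 and 6:

```javascript
// a mock multipart execute_request, one string per frame
var frames = [
    "some-identity",                                   // 0: routing identity
    "<IDS|MSG>",                                       // 1: delimiter
    "",                                                // 2: signature (empty here)
    '{"msg_type":"execute_request","session":"abc"}',  // 3: header
    "{}",                                              // 4: parent_header
    "{}",                                              // 5: metadata
    '{"code":"1+1"}'                                   // 6: content
];

var header = JSON.parse(frames[3]);
var content = JSON.parse(frames[6]);
console.log(header.msg_type, content.code); // execute_request 1+1
```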

In [*]:
// the incoming header will become the parent_header of our replies
var parent_header = JSON.parse(arguments[3].toString());
var unparsed_content = arguments[6];

We extract the code to be executed, if there is any.

In [*]:
if (unparsed_content != undefined) {
    var content = JSON.parse(unparsed_content.toString());
}

var code = content ? content.code : undefined;

And run it in the separate context that we created before.

In [ ]:
var result;
if (code != undefined) {
    result = vm.runInContext(code, context, '<kernel>');
} else {
    result = 'undefined';
}
To construct the minimal messages that are accepted by the server, we will need the following. They do not respect the full specification, but should be enough as an example.

In [*]:
var ident = "";
var delim = "<IDS|MSG>";
var signature = "";
var metadata = {};

// header of the reply on the shell channel
var header_reply = {
    msg_id: 0,
    session: parent_header.session,
    msg_type: "execute_reply"
};

// header of the output message on the iopub channel
var header_pub = {
    msg_id: 0,
    session: parent_header.session,
    msg_type: "pyout"
};

The part that depends on the result is the following. Here I explicitly set execution_count to 1, but a correct kernel would count the number of executions done by the user.

As the frontend supports several representations, I choose here the usual one, 'text/plain', and set it to the stringified result returned from the previous evaluation.

In [*]:
var content = JSON.stringify({
    execution_count: 1,
    data: { 'text/plain': String(result) },
    status: 'ok',
    payload: []
});

I can now send the reply on both reply_socket and pub_socket. Here the message parts are similar, except for the fourth one, the header built above.

In [*]:
// on an xrep socket, the first part routes the reply back to the sender
reply_socket.send([arguments[0], delim, signature,
    JSON.stringify(header_reply), JSON.stringify(parent_header),
    JSON.stringify(metadata), content]);

pub_socket.send([ident, delim, signature,
    JSON.stringify(header_pub), JSON.stringify(parent_header),
    JSON.stringify(metadata), content]);

I can now close my callback, save my file, and run the IPython notebook:

$ ipython notebook --profile=node

If everything went all right, I should now be able to write up how I did it, and execute JS on the server side:

In [*]:
["hello from "," to IPython Notebook"].join("Node.js")
hello from Node.js to IPython Notebook

Closing thoughts.

There are, of course, many gotchas with the current implementation, mainly around evaluating code in the vm: the kernel should catch user errors to display tracebacks, and register a display hook to redirect stdin/stdout to the frontend.

My knowledge of Node is really limited, as this is my first program using it, so my guess is that more competent people will be able to do much more with this as a starting point.

I will certainly post this as a gist, directly viewable on nbviewer, with a link to the kernel.js file.

Please send me any corrections or improvements you wish.
