XPages in Bluemix – Looking at the application dashboard

In this article I will explain the XPages in Bluemix application dashboard and what is available to the developer from the Bluemix web interface.

The Dashboard

Once you have created your new XPages application in Bluemix it will appear on the “Dashboard” with a weird spacecraft for an icon.


Clicking on the box will reveal a lot about your application


Link to the application

Clicking on the link at the top left will open your application in a new window

Clicking on the pencil next to it allows you to edit and/or create an additional subdomain for the application (yes, it can have many)


Add Git

Clicking on this link will automatically create a source control site for this application


Memory allocated ($$$)

By default the application has 512MB allocated to it. Once XPages in Bluemix goes live this will become more significant – this is the number by which Bluemix determines how much you get charged each month. Throughout Bluemix you are charged by the gigabyte-hour, that is, the number of gigabytes utilized by a running application each hour. So in this case 512MB for an hour is 0.5 GB-hours. A typical month is 720 hours, so this one application uses 360 GB-hours a month. The free allocation is typically 375 GB-hours, so this one application running for a month is about all you would get “free”. You can reduce this value through the menu, but what you need will be based on your application. Right now (July 2015) XPages is free – keep an eye on it !!!
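To make the billing arithmetic concrete, here is a tiny sketch of the GB-hour calculation (in JavaScript, since that is what the later posts use) – the function name is my own:

```javascript
// GB-hours = (memory in GB) x (hours the application is running).
// Assumes 1 GB = 1024 MB and the 720-hour month used in the text above.
function gbHours(memoryMb, hours) {
    return (memoryMb / 1024) * hours;
}

console.log(gbHours(512, 1));   // one hour at 512MB = 0.5 GB-hours
console.log(gbHours(512, 720)); // a full month = 360 GB-hours
```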

VCAP services

On the bottom of the smaller box for IBM XPages NoSQL Database you will see a down arrow – if you click on it you will see what are called the “VCAP” services. These are the Cloud Foundry Environment Variables.

“Environment variables are the means by which the Cloud Foundry runtime communicates with a deployed application about its environment. This page describes the environment variables that Droplet Execution Agents (DEAs) and buildpacks set for applications.”


This JSON construct is critical to telling Bluemix how the XPages boilerplate application works. When you use the self-service deployment to create the application, all this information is created on the fly. Looking at the labels for the data, they should be relatively self-explanatory for a Domino developer.
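As a hedged sketch of how an application typically consumes these variables in code, a Node app can parse VCAP_SERVICES and pull out the credentials for a bound service. The “XPagesData” label below is my own illustrative name, not the real service key:

```javascript
// Read a bound service's credentials from the VCAP_SERVICES JSON.
// Cloud Foundry sets this variable for a deployed application;
// locally we fall back to a hard-coded stub for testing.
function getCredentials(vcapJson, serviceLabel) {
    var services = JSON.parse(vcapJson || '{}');
    var entries = services[serviceLabel];       // array of bound instances
    return (entries && entries.length) ? entries[0].credentials : null;
}

var vcap = process.env.VCAP_SERVICES ||
    '{"XPagesData": [{"credentials": {"host": "example.bluemix.net"}}]}';
console.log(getCredentials(vcap, 'XPagesData'));
```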






XPages in Bluemix – where is the design and where is the data?

Digging into the code for the example database – aaaah memories – but I digress…..as normal I am mostly blogging this for my own notes and if someone else happens to get something out of it – great 😀

Design and Data are separate?

So the design and the data are separate – well kinda – in fact “some” of the design is separate and “all” of the data is. Following the instructions on how to deploy the starter code and mess with it I managed to change the database and add a new XPage.

BTW for those who cannot find “Start Coding” it is here….took me ages to find it in front of my nose


Notes client

So the database looks like this in the Notes client – pretty normal


and viewing the design of this database we can see forms, views and no XPages to speak of


The data can be changed through the Notes client and is immediately represented on the website


The Design

Looking at the boilerplate code which was downloaded from the starter code we can see – no data, no forms, no views – but there are the XPages


New design elements

So I created an XPage in the Design database and followed the instructions on deploying the new code – it worked


Adding a new form to the VIEW database failed.

Adding a new form to the DESIGN database worked just fine



Conclusion – So where is everything again?

Views and Data are in the NoSQL database

XPages and Forms are in the Design database

There is a lot to learn here……..and a lot of questions….

Creating my first true XPages in Bluemix application

In this post I will document my attempt to create my first XPages application in Bluemix. I will be following the documentation (ish) and also clicking around to see what happens. This was written July 2015 and is likely out of date soon after.


I installed the new ExtLib by downloading the latest release, then installing the new Eclipse plugin in Designer. This is not necessary for the sake of this post but will be later.


Go to the Catalog, then scroll to the bottom and select “Looking for more”


Then from the next page select the XPages Web Starter



You will then get the screen to create the instance – over on the right give it a name – in this case I called it xpages


Select Create

and you will get a screen telling you it is being created




Click on the “Dashboard” link at the top


Click on the box for the new application (xpages in this case)


Then we click on our new application link at the top – http://xpages.bluemix.net and there we have it.




Setting up our first application was a very simple set of clicking options. This is cloud – self service deployment of services and capabilities.

XPages in Bluemix – Experimental is live

Finally – we have XPages in Bluemix. It is currently “Experimental” which means nothing is guaranteed to function as expected, it is basically a good Alpha made available to early adopters for testing. This means that at any time, things could change. The model, the process, the everything. So while there is much to learn, much to play with….

This is absolutely 100% NOT production ready

Here are the links to the documentation

  1. Building apps with the IBM XPages for Bluemix runtime
  2. Getting started with IBM XPages NoSQL Database for Bluemix
  3. The new OpenNTF ExtLib release


There is a lot to cover and I know a number of people are looking at it. XPages exists in Bluemix as two completely separate entities.

  1. The XPages runtime based on JSF
  2. The XPages NoSQL data store

Why are they not one NSF like normal??

Tony McGuckin explained it all to me in Orlando back in January. It was one of those surreal conversations where everything he said made sense as it came out of his mouth, and yet at the end of the conversation I knew nothing more than – oooo that makes sense………and I am clearly now more stupid than I was 20 minutes ago. Hopefully he will be able to better explain it to us some day.

So here is what I think I understood – it is to do with the way that Cloud Foundry underpins Bluemix. The runtime container which is running on top of Bluemix has to be able to run without the NoSQL data piece. For the sake of uptime, failover, scaling and other fundamental aspects of Cloud Foundry I don’t entirely understand, they are separate. A side aspect of this means that the runtime is not dependent on the Notes NSF structure to create applications.

As normal – I have way more questions than answers – so let’s get rocking 🙂




An introduction to creating Domino applications in the Bluemix environment – SocialBizUg Webinar July 23rd

I am very happy to announce that on July 23rd, Toby Samples and I will be presenting a SocialBizUg webinar with the illustrious Martin Donnelly.

The topic as the title suggests is an introduction to creating Domino applications in Bluemix.

A link for the event is provided below – We look forward to showing you this exciting new capability.




Companies are looking to the cloud more and more for cost savings and to facilitate the creation of new modern applications. Join this webcast to find out how Bluemix – IBM’s Platform as a Service (PaaS) cloud offering – can help create applications quickly, securely, and in a scalable manner. Get an introduction to Bluemix, what it offers, and more specifically how XPages and REST play a key role in it.
Join this webcast and learn how to:

  • Easily get Bluemix up and running and how to create your own XPages applications quickly.
  • Integrate with the other services available within the Bluemix ecosystem.
  • Maximize the new and exciting possibilities this integration opens up to the Domino community.


IBM Bluemix – take your skills to the cloud

This article was originally posted on SocialBizUg.org and is being reposted with permission (Feb 2016)


For those of you who do not know me, my name is Mark Roden and I am a Web Developer for PSC Group LLC in Chicago, IL.

I have been awarded the honor of being an IBM Champion for 2014 and 2015. You can find out much more information about me on my blog – http://www.xomino.com and you can find me on Twitter @MarkyRoden

My Evolution

My career has been built on Lotus Notes, Domino and in the last 4 years XPages. I call myself a web developer because that is what I do, I build websites.

What my progression through Domino and XPages has taught me is that evolution and progression as a developer is essential, necessary and a lot easier than having to deal with other people forcing you to adapt on their time frame.

Everyone is different: some are drawn to the logic of a database interaction, some to the user experience, and some sit in the middle. Many classic Notes developers kinda fell into it without having a classic programming background. The great thing was that Notes was the Model (data), the View (Notes form) and the Controller (LotusScript) all at the same time. This made life easier, quicker and ultimately served the point of Rapid Application Development.

Today “RAD” is often done using tools which don’t require any programming whatsoever. With online tools like Quickbase and a host of others, users are able to create websites which collect and process data, run basic workflows, send emails and perform tasks, without programmers. But there is still a pervasive need for programmers, to automate those tasks which cannot be simplified to a couple of screens and an approve button.

Programming The Future Cloud

Cloud is something which has been around forever. From CompuServe to Yahoo mail, cloud as a service has existed for decades. What has changed is not only better marketing of managed services, but a general acceptance that connectivity, security and critical business services can be run on other people’s networks.

Cloud means many things to many people: Software as a Service (SaaS), Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). In our XPages world the cloud we have been able to use for a while now is SoftLayer, which is really IaaS. Someone else provides the hardware and network connectivity, but you have to run the server and patch it yourself. Platform as a Service takes care of that for you as well. With a PaaS you are entirely responsible for how “much” of a server you need, not the server itself. For more on what cloud is, check out this link (http://www.ibm.com/cloud-computing/us/en/what-is-cloud-computing.html)

IBM Bluemix

IBM Bluemix is a Platform as a Service (mostly) cloud offering and XPages is coming to it. In June 2015 IBM will release XPages in two available forms within Bluemix. The release will be “experimental” which means

  1. Free
  2. Not perfect yet but in principle functional.

XPages will be offered as a “Build Pack” which means you can use the underlying XPages JSF runtime to build your apps on it, and it will also be offered as purely a data storage NSF. This means that you will be able to use a non-XPages buildpack to be able to run your web server/application server and use Domino as a data store.

To find out more about IBM Bluemix you can sign up for a free account at http://www.bluemix.net – for at least the next 6 months the XPages runtime and data store will be free to use without limitations. Once it goes from Beta to production ready the cost will be based on usage.

Your Evolution

Going back to my original point of evolving as a developer, this is an amazing opportunity to be able to play with something completely new, while at the same time using something which is completely familiar. Ignoring all the business benefits of a PaaS for the moment and focusing solely on a personal growth perspective, this is an amazing opportunity to learn.

Within the Bluemix environment, one of the things I have learned to appreciate is the working tutorials which are provided. IBM is also pumping out lots and lots of example blog posts and code samples so that you can begin to learn all the cool features.

If you have been struggling with XPages and/or the idea of writing Java – no worries!! Bluemix gives you an opportunity and a chance to still work with something you know, the data model (Domino data), and at the same time learn something new like node.js (a powerful, scalable JavaScript runtime commonly used to build web servers).

Evolve yourself, it will be the best decision you have made in years.



Bluemix docker documentation update

Sunday night I commented on the Bluemix documentation site that the instructions did not work for the Windows-installed version of boot2docker.

Only yesterday I posted the following image taken from the site last Friday night


and commented how it didn’t work and I had to use curl.

I found this, this evening….


How cool is that – I have no idea if my comments/blog post were the reason for the change, but they certainly can’t have hurt.

If you see something wrong, do the right thing and report it and get it fixed for other people 🙂

It also goes to show that a program in BETA is going to undergo some refining – that is kinda the point of being in BETA. I am very impressed at the response time for the changes and I can only hope that the final version is better than what I had to go through.


The other thing of course is that yesterday’s blog is now total rubbish (facepalm)

Bluemix and docker BETA installation (part 1)

*** UPDATE***

I guess I should have seen this coming, but it is a good thing – the Bluemix docker documentation has been updated since this post was published. It is still worth reading to learn a few things about docker – but the referenced documentation site does not look like it is detailed in this post (less than 24 hours later!!)


Currently access to containers within Bluemix is at BETA (May 2015) and you have to request access to get to it. I am writing this for my own sanity and documentation…….

The instructions for installation are here


Having had to figure out how to give my boot2docker access to the internet to download files for Step 1, Step 2 was no easier…

Step 2





Well when I tried that I got this……


So after a quick Google and some help from Toby I found this link about using curl instead of wget

 wget vs curl: How to Download Files Using wget and curl

Using curl -O instead of wget it works


Step 3

  1. Install Python Setuptools, see Installing and Using Setuptools.

The install says to do this – curl https://bootstrap.pypa.io/ez_setup.py -o - | python

and that bombed


Do this instead

curl -O https://bootstrap.pypa.io/ez_setup.py

sudo python ez_setup.py


Step 4

Abandon all faith ye who enter…..ok what next

docker@boot2docker:~$ wget cf.tgz -L -O https://cli.run.pivotal.io/stable?release=linux32-binary
docker@boot2docker:~$ sudo tar -zxvf cf.tgz -C /usr/bin/

well that didn’t work, as we have already seen – so I tried this instead and it failed as well

curl cf.tgz -O https://cli.run.pivotal.io/stable?release=linux32-binary

I tried to download the file (https://cli.run.pivotal.io/stable?release=linux32-binary ) with chrome dev tools open and I saw this URL – http://go-cli.s3-website-us-east-1.amazonaws.com/releases/v6.11.2/cf-linux-386.tgz

so I tried this – and it worked

then this failed – sudo tar -zxvf cf.tgz -C /usr/bin/ because the name of the file is really cf-linux-386.tgz

so do this

  • sudo tar -zxvf cf-linux-386.tgz -C /usr/bin/

Step 5

“Set the following variable: DOCKER_TLS_VERIFY=1

Note: If you are using Boot2docker, set the variable in your Boot2docker VM.”


So I checked the docker git site…https://github.com/boot2docker/boot2docker which states clearly…


I also saw to run boot2docker up and oooooo look at that export !!!



So do that (I wonder what the other two do – that might be helpful later)

  • export DOCKER_TLS_VERIFY=1

“You are ready to install the IBM Containers Extension.”


Steps 1 – 5 and none of them worked as per the instructions. I have done my part and commented on the post stating as such, but wow !!!

You can’t make this up…..more as soon as I can figure it out….or not !

I did learn a bunch about curl, boot2docker and Oracle VirtualBox, so it was not a total waste



Passing authentication information through the Bluemix Hybrid Secure Gateway

In this article I will demonstrate a couple of the things which can be passed through a Bluemix secure gateway, allowing us to create normal web based applications.


In the previous article I demonstrated how to create a TLS secured hybrid Bluemix application. In this article we will look at some of the web properties/headers, cookies etc which we can pass through the gateway.

The Gateway

To demonstrate what can be passed through the gateway I am using a simple Notes form to display the incoming information

The Cookie, Header and username fields are all hidden if the field value is blank



Here is my application running on node, using the secure gateway and once again accessing the Domino server hosted on my laptop.



No username, no cookie, no header.

Changing the code back within the calling application we are going to add some additional information. In the following code snippet you can see that we have added some header “Marky” information.

app.get('/secureTunnel', function(req, res) {
    tunnel.create('8888', function(){
        var options = {
            headers: {"Cookie": "", "Marky": "Hi Marky"},
            port: '8888',
            path: '/xomino/ainx.nsf/testform?readform'
        };

        var callback = function(obj){
            res.writeHead(200, {"Content-Type": "text/html"});
            res.end(obj.body.toString()); //write the returned page out to the user
        };
        var obj = simpleHTTP.run(options, callback);
    });
});


When we refresh the application we can see that the header has been passed through the secure gateway to the application itself:



If we try and log into the application (directly on the Domino server) we can generate a session authentication token. These screenshots are taken directly from the Domino server.



At this point though, just because the Domino window is logged in, the node app still records anonymous.

We then add the cookie to the node application code, so that it is passed through the gateway:

app.get('/secureTunnel', function(req, res) {
    tunnel.create('8888', function(){
        var cookie = "DomAuthSessId=D2BF0063D62C9138E9F723BB88C046F5";
        var options = {
            headers: {"Cookie": cookie, "Marky": "Hi Marky"},
            port: '8888',
            path: '/xomino/ainx.nsf/testform?readform'
        };

        var callback = function(obj){
            res.writeHead(200, {"Content-Type": "text/html"});
            res.end(obj.body.toString()); //write the returned page out to the user
        };
        var obj = simpleHTTP.run(options, callback);
    });
});

We can now see that the user is authenticated within the bounds of the hybrid application


Pushing all this code up into Bluemix you can truly appreciate the authenticated hybrid app



In this article we have seen how we can push basic header information through the gateway and pseudo-demonstrate an authenticated application. There are of course multiple hurdles to overcome between this demo and a real world application, but I hope it has given you an idea for what’s possible.



Creating a secure Bluemix hybrid app using TLS encryption

In this article I will demonstrate how to secure a hybrid IBM Bluemix application using the Secure Gateway and the Mutual TLS encryption option.


In the previous article I demonstrated how to create a sample hybrid app which was insecure, because you could just call the gateway URL and access the application behind the firewall. While this worked well as a concept demo, it is not a production-feasible setup. In this article we will look at how to set up a secure tunnel to the gateway URL and then on to our application.

The basis for this article comes from this developerworks article.


It took me a long time (relatively) to figure out how to make this work in my environment as I did not understand what was being accomplished by the example. I hope to provide a greater level of detail and explanation in this article.

Creating a secure gateway

Following the steps described previously we can set up a Secure Gateway within our Bluemix app. This time we are going to create a gateway which is secured with TLS encryption. As you can see from the image below, when you select the “TLS Mutual Auth” option a grey section appears underneath the form fields.

  • Select Auto Generate cert and Private key



Click on the + Icon at the end of the fields and you will see the new gateway created


Click on the gear icon at the end of the line and you will see an option to “Download Keys”. On selecting that a zip file will be downloaded. You will notice that I have not blurred out the port or Destination ID this time. This is because the point of this is that without those TLS Keys, knowing this information will be of no use to you. (You’re welcome).

Once the Keys are downloaded you need to add them to your node application (in my case in the root)



As you can see from the image below, the .pem files are just text files which are the key files used as part of the encryption handshake when we connect to the gateway.


Once the keys have been added to the project we are then able to create our basic app.

Basic node app 101

The following code creates a basic node website

var express = require('express');
var app = express();
var http = require('http');

app.get('/', function(req, res) {
    res.write("Hi I am the root");
    res.end();
});

var host = (process.env.VCAP_APP_HOST || 'localhost');
var port = (process.env.VCAP_APP_PORT || 4000);
app.listen(port, host);




Secure tunnel

We are going to create a secure tunnel to the Bluemix Secure gateway. We do this using the following code which should be saved as tunnel.js.

var tls = require('tls');
var fs = require('fs');
var net = require('net');

var options = {
    host: 'cap-sg-prd-5.integration.ibmcloud.com',
    port: '15101',
    key: fs.readFileSync('Hqb17PFJ9Oe_5lm_key.pem'),
    cert: fs.readFileSync('Hqb17PFJ9Oe_5lm_cert.pem'),
    ca: fs.readFileSync('DigiCertCA2.pem')
};

var creations = 0;
var server;

//In this case the port value is the port of the tunnel created on the local server
//to the secure gateway - this is NOT the 15xxx port of the gateway
exports.create = function(port, callback) {
    if(creations == 0){
        //server not currently running, create one
        server = net.createServer(function (conn) {
            connectFarside(conn, function(err, socket) {
                if (err){
                    console.log(err);
                    return;
                }
                //pipe the local connection through the TLS socket and back
                socket.pipe(conn);
                conn.pipe(socket);
            });
        });
        server.listen(port, function(){
            console.log('tunnel on port: '+port);
            creations++;
            callback();
        });
    } else{
        //server already running
        creations++;
        callback();
    }
};

function connectFarside(conn, callback) {
    try {
        var socket = tls.connect(options, function() {
            console.log('tunnel connected');
            callback(null, socket);
        });
        socket.on('error', function(err){
            console.log('Socket error: ' + JSON.stringify(err));
        });
    } catch(err) {
        callback(err);
    }
}

exports.close = function(){
    creations--;
    if(creations == 0){
        //close the server if this was the only connection running on it
        server.close();
    }
};

This file needs some explanation. What it is doing is the following:

  • When we call tunnel.create we are going to create a secure tunnel from the current node server to whatever is passed in through the options object
  • The port on which the tunnel is created has NOTHING to do with the port of the secure gateway. This tunnel will connect a specific port on the current server to the port on the gateway server.
  • The tunnel itself connects to the secureGateway on port 15101 (in this case)
  • The port is created as part of the connection to the gateway. When the connection is complete the port is closed. This prevents someone from guessing the new port on the server and using it!
  • This is not the best way of doing it; it is not very flexible for a reusable, in-production service with multiple connections. The port and server should not be hard coded. They are for this example so it is easier to understand. We will look at making it generic later.
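The create/close bookkeeping described above boils down to reference counting – only the first create starts the local server and only the last close stops it. Here is a stripped-down sketch of just that idea (not the actual module):

```javascript
// Reference counter behind the tunnel create/close pattern:
// start() runs only for the first caller, stop() only when the last one leaves.
function makeTunnelCounter(start, stop) {
    var creations = 0;
    return {
        create: function() {
            if (creations === 0) { start(); } // first caller starts the server
            creations++;
        },
        close: function() {
            creations--;
            if (creations === 0) { stop(); }  // last caller shuts it down
        }
    };
}
```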

Connecting to our backend service (simpleHTTP.js)

In this case I am demonstrating connecting to a web page, but there is no reason why you cannot connect to mysql, mongo or anything else. I have a simple http connection module which will connect to the specified webpage on the back end and return the page as a buffered string back to the original app.get(‘/secureTunnel’).

var http = require('http');
exports.run = function(options, callback){
    var response = {};
    var body = "";

    var req = http.get(options, function(res) {
        // Buffer the body entirely for processing as a whole.
        var bodyChunks = [];
        res.on('data', function(chunk) {
            // You can process streamed parts here...
            bodyChunks.push(chunk);
        }).on('end', function() {
            body = Buffer.concat(bodyChunks);
            //put the response into a format which can be easily passed to the callback
            response = {'body': body, 'resHeaders': res.headers};
            callback(response);
        });
    });
    req.on('error', function(e) {
        console.log('HTTP error: ' + e.message);
    });
};

Building out our node app

Building out the rest of the app.js it looks like this:

  • Create the route for app.get('/secureTunnel', function()……..)
  • When called the route does two things
    • Calls tunnel.create
    • Passes the connection request to simpleHTTP
    • Closes the tunnel


var express = require('express');
var app = express();
var http = require('http');
var tunnel = require('./tunnel.js');      //code used to create and manage the secure tunnel
var simpleHTTP = require('./simpleHTTP'); //code used to create the request to the http service (web page)

app.get('/secureTunnel', function(req, res) {
    tunnel.create('8888', function(){

        var options = {
            //host is not necessary in this case because it defaults to this server if blank
            port: '8888',          //The tunnel port
            path: '/xomino/jQinX.nsf/Marky?readform' //the path of the test page on my laptop
        };

        var callback = function(obj){
            res.writeHead(200, {"Content-Type": "text/html"});
            res.end(obj.body.toString());
            tunnel.close(); //close the tunnel once the response has been returned
        };
        //make the http call and display the results out on the page.
        var obj = simpleHTTP.run(options, callback);
    });
});

var host = (process.env.VCAP_APP_HOST || 'localhost');
// The port on the DEA for communication with the application:
var port = (process.env.VCAP_APP_PORT || 4000);
// Start server
app.listen(port, host);

Putting it all together

Here is the process for the connection laid out in bullet points:

  • Request comes in to /secureTunnel
  • Create a secure tunnel on port 8888
  • The tunnel is created on port 8888 by setting:
    • The secureGateway URL
    • The secureGateway port
    • The secure gateway keys
    • and then opening the new tunnel to the secureGateway
  • At this point the host website (Bluemix) on port 8888 is now connected to the secure gateway on port 15101
  • The connection is made and the request to the service is made
    • The request to the connection is not made over port 80 or 4000 (or whatever you are using for the host), it is actually made to the newly created tunnel port 8888
    • The connection to 8888 routes out to secureGateway port 15101 – this is now permitted
    • The secure gateway in turn connects back into the hybrid environment and gets the desired information from within the firewall
    • The firewalled service responds back through the secure gateway and back to the host port 8888
    • The response is packaged up and returned to the user’s screen (for this demo the HI Message)
    • The tunnel on port 8888 is then closed and cannot be accessed by anyone any more.

And we have a result locally


Which is my node server locally, connecting to the secure gateway to come BACK to my local domino server


This of course would look WAAAY more impressive if it was Bluemix making the call. So I committed the code and pushed it up to the xominoKnox Bluemix repository…….et voilà



What I hoped to achieve in this article is a step by step explanation of how a secure tunnel is created to facilitate the secure hybrid environment. As I mentioned, this hard coded version is not ideal for production because the keys are hard coded to the connection. With a little effort the code could be genericised to tunnel connections to multiple secure gateways within Bluemix.

The code for this project can be found here – https://hub.jazz.net/git/mdroden/xominoKnox but I assure you the gateway is no longer open 😉

Very Cool 🙂