Setting up a secure, custom domain, node.js site on Azure

In this article I will demonstrate the steps necessary to set up a Node.js server running HTTPS on a custom domain, hosted in Azure.

Introduction

This article combines my own work with a number of reference blog articles I had to track down along the way to achieve all of this.

Creating a node.js site on Azure

If you follow the instructions in Microsoft's guide (Get started with Node.js web apps in Azure App Service) you should be able to create an Azure site.

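If you just want something minimal to prove the deployment works, a server along these lines is enough. This is a sketch only; the one thing that matters on App Service is taking the port from process.env.PORT rather than hard-coding it.

// server.js - a minimal Express app for Azure App Service
var express = require('express');
var app = express();

app.get('/', function(req, res) {
    res.send('Hello from Azure');
});

// App Service supplies the port (a named pipe under iisnode) via PORT
app.listen(process.env.PORT || 3000);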

Creating a custom domain

Once you have registered your new domain (in my case marky.co) you need to go to the Azure portal and follow the instructions posted here (Configuring a custom domain name for an Azure cloud service). You cannot do this on the free tier though, and this is where you have to choose your plan carefully. To be able to interact with Office Add-ins I need my service to be SSL enabled.


Once you have selected a Basic plan, the custom domain configuration options become available.


Assign your site and, as the instructions state, "bring your own domain" by changing the CNAME record within your domain name provider's DNS management tools.
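For illustration, the DNS entries end up looking something like this (the host names here are hypothetical, and Azure may also ask for an awverify record to prove ownership):

www CNAME mysite.azurewebsites.net
awverify.www CNAME awverify.mysite.azurewebsites.net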


Adding SSL

There are a number of ways to get an SSL certificate, but I have taken to doing it for free. You can use the same process I detailed here (Using Let's Encrypt to create an SSL certificate for my Bluemix hosted web site) to expose your node server and manually collect the Let's Encrypt certificates, which gives you the .pem files.

To turn the .pem files into a .pfx file you need to follow the openssl instructions here (How To: Get LetsEncrypt working with IIS manually):

openssl pkcs12 -export -out "certificate.pfx" -inkey "privkey.pem" -in "cert.pem" -certfile chain.pem
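Before uploading the file it is worth a sanity check; openssl can read the .pfx back (it will prompt for the export password you chose):

openssl pkcs12 -info -in "certificate.pfx" -noout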


The certificate.pfx file can then be uploaded to the Azure portal. When the certificate imports successfully it is displayed on the main blade automatically.
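If you prefer scripting to clicking around the portal, the upload can also be done with the Azure CLI (assuming the az CLI is installed; the site and resource group names here are placeholders):

az webapp config ssl upload --name mysite --resource-group myGroup --certificate-file certificate.pfx --certificate-password <password>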


Add the SSL binding


aaaah we love the cloud….

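The binding itself can also be scripted with the Azure CLI (again, the names are placeholders; the thumbprint is shown in the portal after the certificate import), and the restart in the next step can be done the same way:

az webapp config ssl bind --name mysite --resource-group myGroup --certificate-thumbprint <thumbprint> --ssl-type SNI

az webapp restart --name mysite --resource-group myGroup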

IMPORTANT – Restart your instance and there we go


Conclusion

In this article we have seen how to deploy an instance of node.js on Azure, apply a custom domain to it, create an SSL certificate and add it to the Azure instance. Once this is complete you should have an SSL-secured node.js instance running which can then be used for Office Add-in deployments.

 


Passing authentication information through the Bluemix Hybrid Secure Gateway

In this article I will demonstrate a couple of the things which can be passed through a Bluemix Secure Gateway, allowing us to create normal web-based applications.

Introduction

In the previous article I demonstrated how to create a TLS-secured hybrid Bluemix application. In this article we will look at some of the web properties (headers, cookies, etc.) which we can pass through the gateway.

The Gateway

To demonstrate what can be passed through the gateway I am using a simple Notes form to display the incoming information.

The Cookie, Header and username fields are all hidden if the field value is blank.


Here is my application running on node, using the Secure Gateway and once again accessing the Domino server hosted on my laptop.


No username, no cookie, no header.

Going back to the code within the calling application, we are going to add some additional information. In the following code snippet you can see that we have added a custom "Marky" header.

app.get('/secureTunnel', function(req, res) {
    tunnel.create('8888', function(){
        var options = {
            // a custom "Marky" header; the Cookie is deliberately blank for now
            headers: {"Cookie": "", "Marky": "Hi Marky"},
            port: '8888',
            path: '/xomino/ainx.nsf/testform?readform'
        };

        // relay the response from the gateway straight back to the browser
        var callback = function(obj){
            res.writeHead(200, {"Content-Type": "text/html"});
            res.end(obj.body);
        };
        simpleHTTP.run(options, callback);
    });
});

When we refresh the application we can see that the header has been passed through the secure gateway to the application itself:


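As an aside, if you want to see exactly what arrives on the far side of the gateway without involving Domino, you can temporarily point the gateway destination at a throwaway Express app (entirely hypothetical, not part of the demo) that echoes the incoming headers:

var express = require('express');
var echo = express();

// dump whatever headers arrive so we can confirm "Marky" made it through
echo.get('/xomino/ainx.nsf/testform', function(req, res) {
    res.set('Content-Type', 'text/plain');
    res.send(JSON.stringify(req.headers, null, 2));
});

// listen on whatever port your gateway destination points at
echo.listen(8880);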

If we try to log into the application directly on the Domino server, we can generate a session authentication token.



At this point though, even though the Domino window is logged in, the node app still records Anonymous.

We then add the cookie to the node application code, so that it is passed through the gateway:

app.get('/secureTunnel', function(req, res) {
    tunnel.create('8888', function(){
        // session token copied from the browser after logging into Domino
        var cookie = "DomAuthSessId=D2BF0063D62C9138E9F723BB88C046F5";
        var options = {
            headers: {"Cookie": cookie, "Marky": "Hi Marky"},
            port: '8888',
            path: '/xomino/ainx.nsf/testform?readform'
        };

        var callback = function(obj){
            res.writeHead(200, {"Content-Type": "text/html"});
            //res.write(JSON.stringify(obj.resHeaders)+"<hr/>")
            res.end(obj.body);
        };
        simpleHTTP.run(options, callback);
    });
});

We can now see that the user is authenticated within the bounds of the hybrid application.

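Hard-coding a cookie copied from the browser is obviously only for demonstration purposes. As a rough sketch of how the token could be obtained programmatically (assuming Domino session authentication is enabled and that names.nsf?Login accepts the standard Username/Password POST; the host, port and credentials are placeholders):

var http = require('http');
var querystring = require('querystring');

// POST the standard Domino login fields and capture the session cookie
function getDominoSession(host, port, user, password, callback) {
    var body = querystring.stringify({ Username: user, Password: password });
    var req = http.request({
        host: host,
        port: port,
        path: '/names.nsf?Login',
        method: 'POST',
        headers: {
            'Content-Type': 'application/x-www-form-urlencoded',
            'Content-Length': Buffer.byteLength(body)
        }
    }, function(res) {
        // a successful login responds with Set-Cookie: DomAuthSessId=...
        var cookies = res.headers['set-cookie'] || [];
        var session = cookies.filter(function(c) {
            return c.indexOf('DomAuthSessId') === 0;
        })[0];
        callback(session ? session.split(';')[0] : null);
    });
    req.write(body);
    req.end();
}

// e.g. getDominoSession('localhost', '80', 'Marky', 'secret', function(cookie) { ... });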

Pushing all this code up to Bluemix, you can truly appreciate the authenticated hybrid app.


Conclusion

In this article we have seen how we can push basic header information through the gateway and pseudo-demonstrate an authenticated application. There are of course multiple hurdles to overcome between this demo and a real-world application, but I hope it has given you an idea of what's possible.

 

 

How to add a Node.js Express route in a separate file

In this article I will show how you can manage your routes in a separate file from app.js. It also demonstrates more generally how adding modules to your applications works in node.js.

Introduction

In this article I will create a simple route in a route.js file and reference it from my app.js. This will demonstrate how to keep the code separated and easier to manage.

The example application

This is a very simple Express example with only two routes: the root of the app, which says "Hi I am the root", and a second one which says "I am a new route".

The initial app is a very basic app created using express.

// Startup Express App
var express = require('express');
var app = express();
var http = require('http').Server(app);

http.listen(3000);

// handle HTTP GET request to the "/" URL
app.get('/', function(req, res) {
    res.write("Hi I am the root");
    res.end();
});

Running this produces the expected simple page.


New routes.js

I created the following file, routes.js, which will display a message when going to /marky:

module.exports = function(app) {

    app.get('/marky', function(req, res) {
        res.write("I am a new route")
        res.end();
    });
}

module.exports is node.js-specific code which allows for code includes in this very manner. For more on this check out this article. Notice that (app) is passed to the function so that it is properly scoped to the original code.
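The same mechanism works for anything you want to share between files. A trivial, made-up example:

// math.js - exports a single function
module.exports = function(a, b) {
    return a + b;
};

// app.js - pulls it in with require
var add = require('./math');
console.log(add(2, 3)); // 5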

Back in app.js we add a single line to require this new library, and that's it.

// Startup Express App
var express = require('express');
var app = express();
var http = require('http').Server(app);

//include other libraries
var routes = require('./routes')(app); //This is the extra line

http.listen(3000);

// handle HTTP GET request to the "/" URL
app.get('/', function(req, res) {
    res.write("Hi I am the root");
    res.end();
});


Conclusion

More fundamentally than this simple example, this is the core of how node modules (including express) work. When you “require” express or http or any other module within your node application, this is how it is put together. Kinda cool 🙂

BTE102: The Demonstration application – beyondtheeveryday.com

In the previous post I described how Mark Leusink and I are going to be speaking about Angular.js in our presentation at ConnectED later this month.

We are very proud to announce the demonstration application upon which the presentation will be based

http://beyondtheeveryday.com

This application was created by Mark Leusink and is an amazing example of how simply an Angular application can be integrated with Domino data. The application is fully responsive and is particularly nice to use on a mobile device. All credit goes to Mark for this; I am very flattered to be talking with such an astute and talented developer. You can find out more about the application from Mark's blog – http://linqed.eu/2015/01/14/marky-marks-mobile-first-connected-sessions-demo-app/

The application is running on a Domino server.


During the presentation we will demonstrate how we got the application to work on a node server hosted in Bluemix, and also demonstrate the application running on SharePoint, IBM Connections and as a native mobile application.

If you are still not sure, listen to Pete Janzen talk about his recommendations for the conference 😉

Write once – run anywhere………………. !

Websockets in XPages: Improving on the automated partialRefresh interface

In this article I will discuss how to further improve the user experience of an automated partialRefresh on a user's XPage. Although these posts were originally about using Bluemix to host the node.js server, I kinda feel that the focus has drifted onto websockets more than Bluemix. So, in an attempt to make it easier to find, I am going to use the "Websockets in XPages" title moniker for a few posts and then go back to Bluemix 🙂

Introduction

In the last article we looked at how to push an automated partialRefresh to an XPages application using websockets. In that article it was noted that the user experience was not ideal, because the whole panel refreshed without the user knowing about it. For some apps that is appropriate; for others it may not be. At this point in his career Dave Leedy is impressed that he gave someone else an idea, and I quote: "wow! that's fricken awesome!!!"

So, that’s not a great user experience – what if they were doing something at the time?

Yes, I was thinking that too! So I believe we can improve the user experience a little by taking what Dave suggested and tweaking it. Now where have I seen something which lets the user know there are new data changes but doesn't refresh the page without their action……….


oh yeah that.

Instead of refreshing the control automatically, we will make the message create a "refresh" icon on the page, which the user can then action at their leisure.


The modified code is all in what happens when the page receives the refresh socket message. I added a jQuery rotate function just for some added "je ne sais quoi". When the refresh event is detected by the socket code, the refreshControl function is called. This in turn makes the hidden refreshIcon visible, adds an onClick event and then rotates it. The onClick event performs the partialRefreshGet as we saw in the previous example, briefly turning the page grey. We then hide the icon and remove the click event (to avoid piling up multiple events as the page is continually refreshed).

 

// Show the refresh icon and wire up its click handler
// when a refresh message arrives
var refreshControl = function(data) {

    $('.refreshIcon')
        .css({display: 'block'})
        .on('click', function(){
            // grey out the control being refreshed, then partial-refresh it
            var temp = $('[id*='+data.refreshId+']').css({background: '#CCCCCC', color: 'white'}).attr('id');
            XSP.partialRefreshGet(temp, {});
            // hide the icon and remove the handler so clicks don't pile up
            $(this).css({display: 'none'}).off('click');
        })
        .rotate({
            angle: 0,
            animateTo: 360,
            easing: function (x, t, b, c, d){ // t: current time, b: beginning value, c: change in value, d: duration
                return c*(t/d)+b;
            }
        });
};

// When a refresh message is received from the server, action it
socket.on('refresh', function(data) { refreshControl(data); });
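For context, the server side of this is tiny. Here is a sketch of the node end (the payload shape matches what refreshControl expects; how you detect that data has changed, and the control id used here, are up to your application):

// node server: broadcast a refresh event to all connected XPages clients
var io = require('socket.io')(3000);

function notifyClients(controlId) {
    // refreshId is read by refreshControl() in the browser
    io.emit('refresh', { refreshId: controlId });
}

// e.g. after detecting a document change on the Domino side:
notifyClients('viewPanel1');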

The following video quickly demonstrates the new capability.

Conclusion

In this brief article we concentrated on improving the user experience by notifying users that changes were pending and then allowing them to decide when to apply those changes.

I still don't think this is as optimal as I would like, but you get the idea. As I said a long time ago, the more DOM you are reloading, the worse the user experience. With a viewPanel we are kinda limited on what we can and cannot refresh. A better option may be to architect the application to fetch just the new data and update as appropriate……….