Simple examples of how Google Developer Tools can aid Mobile Development

In this article I will show some simple examples of how Google Developer Tools can be used to help in mobile (responsive) development.


Google Developer Tools (F12) within Chrome is one of those things which you *know* is way more powerful than you have ever cared to look at, but this week I was introduced to a very cool new feature – thanks to @simonreid123. The ability to control the viewport and size of the viewing window, as well as being able to throttle the speed of page load, has helped me better design a site for mobile. This obviously does not replace real testing on a real device, but it definitely helps!

The Mobile button in Developer Tools

In this example we are going to look at the BBC website from today. In a normal browser it looks like this.


Bringing up Developer Tools (F12) we can see a mobile button on the toolbar.


Clicking on that brings up a new view of the page.


Once I do that I can select a “Device” from the menu



After then refreshing the page (as it tells me to) I can see the page as if it were an iPad.

I then chose iPhone 4 and got a smaller screen.

Across the top you can see the screen width and markers – and in orange the media queries (as specified by the style sheet) where changes are going to occur.



Dragging the tab to resize the browser, you can see the changes at the orange lines.



Another nice feature is the ability to arbitrarily slow the page load down as if we were on a slow network (more like a phone).




This is a very useful and cool feature I was not aware of. I *know* there are many many many other things I do not know about developer tools….it’s all a great learning experience 🙂


Binding jQuery code to an XPages partialRefresh using DOM Mutation events


In this article I will demonstrate how to bind to the event which triggers on the completion of an XPages partialRefresh. Using that binding we will then be able to act on the contents of the newly added partialRefresh DOM elements.


In multiple articles I have discussed the use of the onComplete event of a programmatically triggered partialRefresh to be able to re-apply a jQuery plugin’s bindings to an area of an XPage. This works very nicely and integrates with the Dojo events controlling the xhr request to the Domino server.

A problem arises when you do not have a programmatically controlled partialRefresh, say for example in a pager. XPages uses the same technology to execute a partial refresh on a viewPanel – but you and I do not have programmatic access to the onComplete event without hijacking it.

This was brought back to my attention when reading Brad Balassaitis’ excellent article on adding font awesome to his data view. In that case he does not have an event available to him through the XPages designer so he has to hijack the Dojo calls. A practical solution given the tools available.

In general though I have always found using the XPage events a non-elegant way of controlling the page and there has to be a better way – I think upon reflection this is a nice learning experience and “good to know” article but not practical in production.

DOM Mutation events

These events have been around for a while but are now “deprecated” in favor of the new MutationObserver() constructor, which is unfortunately not implemented in Internet Explorer until IE11.

  • DOMAttrModified
  • DOMAttributeNameChanged
  • DOMCharacterDataModified
  • DOMElementNameChanged
  • DOMNodeInserted
  • DOMNodeInsertedIntoDocument
  • DOMNodeRemoved
  • DOMNodeRemovedFromDocument
  • DOMSubtreeModified

As the Mozilla article states – “The practical reasons to avoid the mutation events are performance issues…” – watching the DOM for changes every time a change happens is very processor intensive. Believe me, in my experiments, if you latch onto DOMSubtreeModified and you are using jQuery, which is constantly changing the DOM, you can easily drag your browser to its knees.

So in this article I am going to demonstrate how to use the “old” method for IE<11 and the preferred new method. You can then decide for yourself on the right way to do things – Dojo hijacking, degrading DOM performance or if you are lucky enough to not have to support IE – the way of the future 🙂
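Picking between the old and new approaches at runtime can be done with simple feature detection. Here is a plain-JS sketch – the `pickObserver` helper name is mine, and the fake `window` parameter exists only so the logic is easy to demonstrate outside a browser:

```javascript
// Pick the MutationObserver constructor with vendor-prefixed fallbacks,
// or return null, meaning: fall back to the old DOMNodeInserted-style events.
function pickObserver(global) {
  return global.MutationObserver ||
         global.WebKitMutationObserver ||
         global.MozMutationObserver ||
         null;
}

// Illustration with a fake window object:
function FakeObserver() {}
console.log(pickObserver({ WebKitMutationObserver: FakeObserver }) === FakeObserver); // true
console.log(pickObserver({}) === null); // true – old-IE territory
```

In a real page you would call `pickObserver(window)` once and choose the code path accordingly.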

An example of the general problem

If I have a simple view panel on a page and I use some jQuery to stripe the rows, it looks pretty….. (yes, there are other ways to stripe the rows – this is just to demonstrate the point).


But as soon as I hit the pager – the striping is lost. The DOM elements are removed and the new elements do not have the striping applied.


The partialRefresh

As I am sure most of you know, the partialRefresh, genius as it is, works by POST-ing the field values on the form back to the server, where the JSF lifecycle processes these POST-ed values and then returns a new set of HTML to the browser. That new HTML is inserted as a direct replacement of the DOM element which was being refreshed. Looking at the response from the server you can see below that when paging through a viewPanel the viewPanel1_OUTER_TABLE is re-downloaded from the server and replaces the existing TABLE element in the DOM.


So my striped table is deleted from the DOM and replaced – ergo no more striping.

DOM Node insertion

Using the DOM Mutation event DOMNodeInserted it is actually relatively easy to re-stripe the table.

I first surrounded the viewPanel with a div wrapper “viewPanelWrapper”. This is what I will listen on for changes. Because the whole outer table is replaced I cannot listen for events on it – it would be removed along with my binding.

The first piece of code will demonstrate the event listener

$('.viewPanelWrapper').on("DOMNodeInserted", function(){
    console.log('a node was inserted');
});

When I run the above code snippet through Firebug you will see that nothing changes (1). But when I click Next, the partialRefresh is triggered and “a node was inserted” is logged.

If we then take this a step further we can add in our striping code again

$('.viewPanelWrapper').on("DOMNodeInserted", function(){
    console.log('a node was inserted');
    $('.viewPanel TR:even').css({background: '#FFCCCC'});
});

And that’s pretty much it – pretty simple really.


So then, extending this simple example, you can see how a jQuery plugin could be reapplied to any page after a partialRefresh has been triggered – JUST BE AWARE THAT THERE IS A PRICE TO PAY IN PERFORMANCE. If you are going to do this then make sure that you pick the smallest possible area to watch and that it does not change every second – your browsers and, more importantly, users will not thank you. Oh, and applying a jQuery plugin almost certainly also modifies your DOM – be careful not to create an endless loop of plugging in your plugin.
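One way to avoid that endless-loop trap is a simple re-entrancy guard around the re-striping code: if the handler's own DOM changes fire the event again, the nested call is skipped. This is a plain-JS sketch – the `makeGuarded` name is mine, not part of jQuery or XPages:

```javascript
// Wrap a callback so that DOM changes it makes cannot re-trigger it recursively.
function makeGuarded(fn) {
  var running = false;
  return function () {
    if (running) return false; // we are already inside the callback – skip the nested call
    running = true;
    try {
      fn();
    } finally {
      running = false;
    }
    return true;
  };
}

// Usage sketch: guard the striping handler before binding it, e.g.
// $('.viewPanelWrapper').on("DOMNodeInserted", makeGuarded(restripe));
```

This does not reduce the cost of watching the DOM, but it does stop the handler from feeding itself.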

So the “better way”

This article explains the reasoning behind the new MutationObserver and more importantly why it makes more sense than what I just showed you.

DOM MutationObserver – reacting to DOM changes without killing browser performance.

Check out the “So what are these good for” section at the end – obviously they were talking about XPages 😉

Using a slightly modified version of their example we get this

var MutationObserver = window.MutationObserver || window.WebKitMutationObserver || window.MozMutationObserver;
var list = document.querySelector('.viewPanelWrapper');

var observer = new MutationObserver(function(mutations) {
    mutations.forEach(function(mutation) {
        $('.viewPanel TR:even').css({background: '#FFCCCC'});
    });
});

observer.observe(list, {
    attributes: true,
    childList: true,
    characterData: true
});

Which works the same, but as the article explains is WAY more efficient, and also gives you the control to not screw up your plugins.

Remember though, the caveat is modern browsers – and in Internet Explorer that means IE11 only.


Overall this has been a fascinating learning experience for me. I can’t recommend using the DOMNodeInserted event listener because it definitely caused me pain and anguish in browser performance. The MutationObserver is a very interesting concept but I am not convinced I would use it in an application until I better understand it.

Using jQuery .when() to trigger a screen update after select2 has loaded

We have an XPage with over 100 complex fields on it and we found that applying Select2 to it took a noticeable amount of time – tangible to the user.

Users would see the original “SELECT” fields before the nicer select2 style was applied.


So we determined to hide all the fields on the form until select2 had finished loading, then show them all select2’d. The problem was that we could not find an obvious way to do this. I was looking for a parameter within select2 and I could not get anything to work. So I looked elsewhere…..

We decided to hide all the form fields with a &lt;STYLE&gt; on the main section. Then, when select2 was finished, we brought them into view. The experience is much better for the user.


$.when(
    $("select[class*='select2']").select2({
        allowClear: true
    })
).done(function () {
    $('#mainContent').css({'visibility': 'visible'});
});

Using the jQuery .when() ensures the code runs in sequence, so I know that the .done() callback will be triggered when the select2 code is finished.
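To see why .when()/.done() gives that guarantee, here is a tiny plain-JS sketch of the deferred pattern – a simplification of what jQuery does internally, not its real implementation:

```javascript
// A miniature "deferred": done() callbacks fire only once resolve() has run.
function Deferred() {
  var callbacks = [];
  var resolved = false;
  var value;
  return {
    resolve: function (v) {
      resolved = true;
      value = v;
      callbacks.forEach(function (cb) { cb(v); });
    },
    done: function (cb) {
      if (resolved) { cb(value); } else { callbacks.push(cb); }
      return this;
    }
  };
}

// Usage: nothing runs until the "select2 work" signals it has finished.
var work = Deferred();
var shown = false;
work.done(function () { shown = true; }); // e.g. make #mainContent visible
work.resolve('select2 finished');
console.log(shown); // true
```

A callback attached after resolution fires immediately, which is why late .done() handlers still run.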

For more on .when() check out the API

Integrating the Bluemix Watson Translation service into an XPages application

In this article I will demonstrate how to integrate the Bluemix Watson Translation service into a functioning XPages application.

Bluemix Watson Translation Service

Following on from one from my previous posts on the subject I have been looking for a good workable example of using a Bluemix service within an XPages application. As I said before this is as much an exercise in me learning more about node.js and Bluemix as anything else – but I also love being able to share.

Based on the previous post about how to get the Watson Q and A service up and running, it took me 12 minutes to get a working example of the Watson Translation service up and running on a Bluemix site. Just for disclaimers, I have no idea how good the Watson service is and I am not advocating it – this is purely an exercise in being able to use a Bluemix service, more so than what it does.

So I followed the steps in my previous blog but applied them to the example post on the Watson Translation service. This was pretty simple and once again the only thing to change was the manifest.yml file to name the service and application correctly.


When you type into the TEXTAREA box (1) and then hit “Translate”(2) the result is displayed in the Output (3)

The page does a full refresh and the answer is displayed. The node.js code running behind the application is fairly self explanatory :

  • (looking at the code snippet below) we can see the values POST’d from the application are received on the node.js server in the
  • The posted field values are turned into a separate request to the Watson translation service
  • The Watson request (watson_req) is sent as a server-side request to the Watson service
  • The results from that are sent back to the response object “watson_res”
  • The node.js response (res) back to the user is then sent back to the browser using the jade template (index)
    • return res.render(‘index’,request_data);

// Handle the form POST containing the text and sid, reply with the language'/', function(req, res){

    var request_data = {
        'txt': req.body.text,
        'sid': req.body.sid,
        'rt' : 'text' // return type e.g. json, text or xml
    };

    var parts = url.parse(service_url);
    // create the request options to POST our question to Watson
    var options = {
        host: parts.hostname,
        port: parts.port,
        path: parts.pathname,
        method: 'POST',
        headers: {
            'Content-Type'  : 'application/x-www-form-urlencoded', // only content type supported
            'X-synctimeout' : '30',
            'Authorization' : auth
        }
    };

    // Create a request to POST to Watson
    var watson_req = https.request(options, function(result) {
        var responseString = '';

        result.on('data', function(chunk) {
            responseString += chunk;
        });

        result.on('end', function() { // this is triggered when the response from Watson is completed
            // add the response to the request so we can show the text and the response in the template
            request_data.translation = responseString;
            return res.render('index', request_data); // <------ response sent back to the web page
        });
    });

    watson_req.on('error', function(e) {
        return res.render('index', {'error': e.message});
    });

    // create the request to Watson – write the form data and send it
    // (assumes a querystring helper, e.g. var qs = require('querystring'), is in scope)
    watson_req.write(qs.stringify(request_data));
    watson_req.end();
});

You can see that in action through Firebug – here are the POST parameters

and here is the response – the new HTML containing the answer



Modifying the service

Here is the thought process which occurred to me:

  • Well that is cool, but if I want to integrate this into an XPage then I will need to return the result as JSON and not a whole HTML page
  • But I do not want to break the example, so how can I do that?
  • I can use a new route – so if I POST to a different URL, the node.js server can be made smart enough to do something different
  • Ah…….and then I will have CORS issues because my application will be running at and the Bluemix app is not
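The route idea in the third bullet boils down to a dispatch on HTTP method and URL – Express does this for you, but conceptually it is just the following (the route paths come from this article; the decision labels are mine for illustration):

```javascript
// Conceptual sketch of routing: same server, different behavior per URL.
function route(method, url) {
  if (method === 'POST' && url === '/') return 'render-html';       // original example page
  if (method === 'POST' && url === '/xpages') return 'return-json'; // new route for the XPage
  return 'not-found';
}

console.log(route('POST', '/xpages')); // return-json
console.log(route('GET', '/xpages'));  // not-found – there is no app.get for this route
```

The second log line also explains why visiting the new route in a browser (a GET) produces an error later in this article.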

Replicating the POST as code

So the first thing I did was try to replicate the code in jQuery (which I know best) so that I could mimic a POST event in ajax without using a form. In the following example you can see I added jQuery to my page (jQuerify plugin for Firebug). I then simulated an AJAX POST to the translation service (emulating the form post).



The response is the HTML of the new page, so I know this is at least the right AJAX code.



But when I am on the site and I try to repeat the same thing – as expected, CORS issues.


Ryan Baxter (who frankly has been a huge support and help in learning all this) said to me

“You are in luck, CORS is dead simple with Node  –” and how right he was.
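What the cors middleware does, at its heart, is add an Access-Control-Allow-Origin header when the request's origin is allowed. A much-simplified sketch of that decision – not the real cors module code, and the origins used below are placeholders, not the sites from this article:

```javascript
// Decide which CORS header (if any) to send back for a given request origin.
function corsHeader(allowedOrigin, requestOrigin) {
  if (requestOrigin === allowedOrigin) {
    return { 'Access-Control-Allow-Origin': requestOrigin };
  }
  return {}; // no header – the browser will block the cross-origin response
}
```

The real module also handles preflight OPTIONS requests, wildcard origins and more, which is exactly why you install it rather than hand-rolling this.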

Adding the CORS package to my Bluemix application

The way you add a package to your Bluemix application is by updating the package.json file. This file contains all the npm modules that will be needed and, what is even cooler, these will be npm installed within your node server automagically on deployment.
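For illustration, the dependency entry for cors might look like this in package.json – the name, version fields and version range shown here are a minimal sketch, not the example app's actual file:

```json
{
  "name": "watson-translation-example",
  "version": "0.0.1",
  "dependencies": {
    "cors": "2.x"
  }
}
```

On deployment, Bluemix runs the npm install for everything under "dependencies" for you.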


So then I needed to add the CORS-enabled route to my application. But I did not want to change the core example on the site, so what I did was create a new “route” within the node.js application and told it to do something different….

Originally we had this within the example

// Handle the form POST containing the text and sid, reply with the language
// the route in this case is "/" - so basically the root of the application'/', function(req, res){

and to create a new route we just add a different one:

// Handle the form POST containing the text and sid, reply with the language
var cors = require('cors'); // add the cors module code to the application
var corsOptions = {
    origin: '' // allow copper to be a site which can work with the watson site
};

// note the new cors(corsOptions) parameter in the function'/xpages', cors(corsOptions), function(req, res){ // when a user posts to /xpages, do this code instead of the default '/'

What I then changed was the response code – this was the original code responding with text and rendering using the jade index.jade template

        result.on('end', function() {
            // add the response to the request so we can show the text and the response in the template
            request_data.translation = responseString;
            return res.render('index', request_data); // render the response using the index.jade template, passing the request_data
        });

Instead of res.render I used res.json to send the request_data object back to the browser directly.

        result.on('end', function() {
            // add the response to the request so we can show the text and the response in the template
            request_data.translation = responseString;
            return res.json(request_data);
        });

So in this way I left the default page alone (“/” server root) – the example still works – but I added a new route (“/xpages”) so that my XPage application could POST at it.

I committed the code and pushed back up to Jazz Hub, restarting the service as we went…….and lo and behold, success…..


As you can see from the above image, I am at the website. Using the slightly modified code to now post at

I get a response from the Bluemix website with the JSON from the Translation service. You can see from the headers the CORS header is added for


Note – if you go to you will GET an error. This is because my node application does not have an app.get for the /xpages route.

So then XPages….

Integrating the Watson Translation service into an XPages application

Once I had the working ajax code it was very simple to add it to an XPage and just pick up the field values on the fly….. I cheated to some extent and copied and pasted the HTML from the original example into my XPage….hey, why not, this is an example after all…

I added a couple of classes to the example so that I could easily pick out the field values…


And then the jQuery code to add to the application. In this code I

  • Select the .translate class (the translate button)
  • In the click event I get the data from the form
    • the .sid select box for the language translation
    • The text value from the textbox (.theOriginal) field
  • I then submit to the translation service as before
  • In the .done() of the AJAX request, the incoming msg.translation is then added to the .theTranslation field
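The data-gathering steps above can be separated from jQuery into two small plain-JS helpers, which makes the flow easy to follow – the helper names here are mine, only the field/class names come from the article:

```javascript
// Build the POST body from the two field values (sid = language pair, text = source text).
function buildTranslateRequest(sid, text) {
  return { sid: sid, text: text };
}

// Apply the AJAX response: pull the translation out of the returned message.
function extractTranslation(msg) {
  return msg.translation || '';
}

// In the real page these would be used inside the click handler, roughly:
// $.post(url, buildTranslateRequest($('.sid').val(), $('.theOriginal').val()))
//   .done(function (msg) { $('.theTranslation').val(extractTranslation(msg)); });
```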


 The demo

As you can see from the video – I am able to:

  • type in any value
  • send it to the bluemix service
  • receive the answer and display it on the screen
  • save the original and translated values as notes documents within my XPages application
  • Firebug shows the fact that the POSTs are going to


I think this is a great example of using a “service” on another website to be able to enhance the functionality in your own XPages application. If IBM are successful in their investigation to put Domino inside of Bluemix, this capability will all happen behind the scenes and could be easily written as one true application……bring it on… 🙂



This exercise was particularly gratifying because it feels like a culmination of all the work I have done in XPages over the last 3 years.

  • Without the jQuery and Firebug work I did at the start of XPages development, prototyping this would have been much more tedious.
  • Learning XPages has forced me to have a better understanding of AJAX and jQuery
  • If it were not for Angular and the “Write once run anywhere” work I did for MWLUG, I would not have learned about CORS.
  • If it was not for the Angular work I would not understand “routes” within an application
  • …..and now I am learning about node.js and Bluemix PaaS

Keep learning !



Oh duh, that is why Node Package Manager (npm) is so cool

I am in the process of playing with Bluemix and part of the reason for this is to learn more about node.js. It is really cool to learn something new and to broaden the mind. But I digress – Node Package Manager (npm) makes using node.js so simple it is mind boggling…

Without going into masses of detail on node.js (which I might at some point) here is the simple example of what I am trying to do and why npm is so cool. Node Package Manager (npm) allows you to install any node “module” or code component library, very easily and with the minimum of effort.

I run node.js on my Windows machine within a certain directory. I am looking into CORS-enabling one of my Bluemix applications, so I am messing with the cors module in node.

Here is my directory structure to start with. It is only a simple hello world app but for the sake of this demonstration it is all I need.


Within node_modules we have the following packages which are the modules I used as I was learning the hello world example.


Adding a new module using npm could not be simpler.

To add “cors” to my application I open the hello-world directory in a command prompt (Shift right click)


I install cors with the simple “npm install cors” (yes Ryan I used a Command Line Interface)


The files are downloaded from the npm website and they are added to my directory structure


and it is as simple as that. I now have all the cors code which can be accessed by my application.

The sheer brilliance of this only struck me this morning, as I “muscle memory” installed a new module because it occurred to me I needed it. Simplicity is a thing of beauty to me.

In summary

  • I wanted to install a new module
  • The npm architecture with node centralizes all the code modules you need
  • I did not need to know where the cors package existed
  • I found my example of how to use CORS online
  • I added the code module I needed to my application with 3 words
  • so cool 🙂

Create your own Watson Q and A example with Bluemix, Webstorm and Jazz Hub


In this article I will demonstrate how to get up and running with one of the Bluemix/Watson service examples. I will be using the example provided by IBM in their documentation as the basis for the article but the way in which I achieved the final goal was quite different from the way that they explained it in the example.

This example will use:

I could have added BS words as an attention-seeking headline, and it would fit because there is so much I want to show in this one post (it probably should be 5 separate posts). But I figured that it’s probably better to have a more descriptive title about what this is really about (more googlable that way). So be warned, this is a longer post than normal because of all the pictures. Going through this experience helped me better understand Git, Webstorm (and how it uses Git), Jazz Hub and Bluemix.

Creating an example of using the Bluemix Watson service.

Earlier this week IBM announced that they added the Watson API as a service to Bluemix. I honestly have no idea what I would ever use this for in my line of business but the coolness factor is huge!

In this post I am going to demonstrate how I was able to create the example service without using the same process as laid out in the IBM documented example. As this is a long post (lots of pictures) I will be discussing the separate parts in other posts.

There are also other ways of creating your App from scratch (for example Create an App, select node.js as a runtime and then bind the Watson service.) The reason I did the example this way was to highlight that you do not always have to start with a new service. I needed a node.js runtime, and it happened that one of my services already provided that (the DataCache starter service). That’s kinda cool and kinda the point of Bluemix!

1) Log into Bluemix

You should be presented with your dashboard.

2) Create an App 





Select the Node.js runtime


Name the App (in my case the App is called xominoWatsonQandA and the Host is xominoWatsonQ)


3) Add a service – Watson Question and Answer 



4) Assign meta-data to the new service

  • Give the service a name – you will need this later so don’t over complicate this (qa-service)
  • The Watson service is currently in Bluemix as a Beta at time of writing


5) Confirm you have an application

Now we have the pieces necessary to build the application on.

  1. A Node server
  2. The Watson service.


6) Make sure you have a Jazz Hub account

If you do not already have a Jazz Hub account then go and create one here

NOTE: The Jazz Hub site uses your IBM login userid and your Jazz Hub password – do yourself a HUGE favor and make the password the same as your IBM account. And if you ever change the password on your IBM account… you have been warned 🙂

Here is a shot of my Jazz Hub account before we start – note no “xominoWatsonQandA”


7) Add your new application to Jazz Hub

Select the “Add GIT” option on the top right of your App dashboard


8) Confirm existing service code?

The next prompt asks me if I want to create starter code in the new Git Source Control repository. In this case I do not – un-check it.


9) Go to your new source control Git repository

Click on the Jazz Hub URL displayed on the right hand side.

The Add Git link will be replaced with a link to the repository. Click on the link to go there. We will look at Edit Code in another post.




10) Get the URL of the repository

In the picture below you can see over on the right there is a link for the Git URL – click on that and copy the URL.


11) Open Webstorm IDE

As I was working my way through this example I wanted to learn more about how Webstorm functions as a Git client. You can use Source Tree or the Git command line if you prefer.

You need to ensure that Git is enabled in Webstorm before you proceed – it is not configured out of the box. The instructions for enabling Git are found here. You will need to have Git installed beforehand. Follow the ssh key instructions and it will show you how.

12) Checkout the new repository

Select Checkout from Version Control > Git from the Webstorm VCS menu option.



Paste the URL from the Jazz Hub site into the Vcs Repository URL


Select Clone and a new project is created within Webstorm. As you can see, the files downloaded are those from the repository on Jazz Hub.


12) Download the example code and add to the local repository

Following the example site, download the sample file. Unzip it locally and drag the files into the repository directory created by Webstorm in the previous step.


Clicking back into Webstorm you will see the files refresh in the project – they all appear “red” because although the files are in the repository directory they are not currently added to the Git configuration; it does not know they exist.


13) Add the files to the local Git repository

Right click on the project and add the directory as follows.


The files will all turn green

14) Edit the manifest.yml file

Within Bluemix the manifest.yml file is the key to holding everything together. It contains the “services” and the application they are used in. In my case the service name for Watson was qa-service and the application is xominoWatsonQandA.
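As a sketch, the relevant parts of the manifest might look like this – the name, host, command and service name are the ones from this article, while the memory value is an illustrative default rather than necessarily what the example ships with:

```yaml
applications:
- name: xominoWatsonQandA
  host: xominoWatsonQ
  command: node app.js
  memory: 256M
  services:
  - qa-service
```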

Save the Manifest file


You can also see from the manifest file the command “node app.js“. If you look into the other files currently in the example you will find app.js. This tells Bluemix how to start the application.

15) Commit the files to the local repository

Right click on the project name – select Git > Commit Directory

In this case I chose to Commit and Push back to the repository




Webstorm then does a stupidity check on your files (we all need that)


And then Webstorm tells you that you are stupid!!


Oh dear……….well actually, if you look into this, Webstorm is reporting issues in the example app CSS files and missing semi-colons in the JavaScript. Without going into correcting all of them, the application works despite these errors. So we Commit and continue.


My commit comments are “Modified manifest and all files”



We get confirmation that the files were committed and pushed up to the Jazz Hub repository

16) Confirm Jazz Hub

You can now see by refreshing your Jazz Hub page that all the files are now in the server repository. You can also see next to my picture in the middle “Manifest change and all files” as the last commit comment.


Then comes the really cool and completely not Command Line experience. In the top right of the page you can see Build and Deploy….

17) Build and Deploy


18) Confirm

You will be taken to a page where you can see the results of the build and deploy. Jazz Hub is very smart and self-aware. If you have un-committed changes within your repository it will not let you Deploy. That is for another day though.



Note: what’s interesting is that the Deploy to URL, which I cannot change, is not what I would expect as which is the route I gave the application in the first place. Turns out I now have two routes to the same App. I can see this back on my Bluemix dashboard as there is now a (+1) next to my routes options!

So note to self, name your route the same as the application name!


19) You now own Watson

Click on the route link back on your Bluemix dashboard and you now have a running example of the Watson Healthcare service.


In review

In this example we saw a LOT of cool new features, concepts and ideas for Domino developers – and you know that we have barely scratched the Bluemix, Webstorm, Jazz Hub surface. I learned a significant amount of new and cool things from going through this process. The first time from scratch it took about 90 minutes to figure out what I was doing. The second time I went through it, even taking all the screenshots for the blog, it took 25 minutes.

We only changed two lines in one file to make this work – anyone can do this !!

Like I said at the top of the article, I have no idea how I would use Watson in the day job but that was not the purpose for doing this example. These “services” are componentized capabilities which we can take advantage of. Imagine the growth possibilities as more and more services are added to Bluemix. This has got to be fun to watch and follow along. Remember IBM are investigating putting Domino into Bluemix.

The Watson Cloud REST API information is available here!/Question_Answer

This is so cool 🙂

Getting into IBM Bluemix….this could be interesting

My interest in IBM Bluemix has grown since it was announced. I took part in the Beta, but frankly didn’t do very much because I really couldn’t figure out what it was or why I cared. Since then though IBM has announced that they are investigating putting Domino into Bluemix.

Before I start down this path I have to say at this point that I have no idea what “Domino in Bluemix” even means. I have no idea what that will look like and while I would love to second guess and make a long wish list…..I can’t, so I won’t. What I want to do with the blog posts is to detail what I find out about Bluemix as I go along. There are more and more examples of how to use Bluemix and some of the services it provides. As I go through them I do not want to repeat what the examples say, but rather add my perspective as to why I think it is important and relevant to me, the XPages developer. I *will* make the assumption that there is a reason/purpose/goal behind investigating putting Domino into Bluemix and I want to be ready for if/when it gets there.

Ultimately I want to answer the question for myself: What is Bluemix and why do I give a hoot? – (obviously hoot not being the word I really used) and that would have made quite a nice blog post title I am sure, but sometimes my better judgement wins out 😉

To be clear though, I am no Ryan Baxter – he’s the guy that really gets it. So I will do my best to explain, but I apologize if I screw things up. As Dave Leedy constantly tells me, if you want to know the answer to a problem, post the wrong answer and people will be falling over themselves to make you look stupid and give you the correct answer. Either way we will get there 🙂

So what is Bluemix and why do I care?

Bluemix is designed to be a “Platform as a Service (PaaS)”. There are others out there like which have been around for a while. IBM is not the first to the party on this whole cloud thing but they are definitely “in”. If you want the high level overview of what Bluemix is then check this out – What is IBM Bluemix?

“Bluemix is an implementation of IBM’s Open Cloud Architecture, leveraging Cloud Foundry to enable developers to rapidly build, deploy, and manage their cloud applications, while tapping a growing ecosystem of available services and runtime frameworks.”

Blah blah – what on earth does that mean to me? Off the top of my head I had no idea, and it was not until I started to get into the examples that it became clearer to me.

Without signing up you can actually read some of the documentation – and look there is another definition

“IBM® Bluemix is an open-standards, cloud-based platform for building, managing, and running apps of all types, such as web, mobile, big data, and smart devices. Capabilities include Java, mobile back-end development, and application monitoring, as well as features from ecosystem partners and open source—all provided as-a-service in the cloud.”

Coooooool and Blah Blah what does that mean to me??

To find out more about Bluemix, you have to *do* Bluemix, and it is free. You should go to and create a free account (trial for 30 days). After that you get a sizeable amount of “free” services and disk space to play with.

Getting started – Read the documentation

The first mistake I made was to not RTFM. Funny that – being a man and a developer I was clearly at a disadvantage. Who needs documentation? Heck, I joke about how my users don’t read documentation, why should I be any different (facepalm).

Unfortunately from this XPages developer’s perspective the “Bluemix overview” was interesting, but so far over my head as to be unfortunately useless. Large distributed architecture is not something we are used to in the Domino world.

Start by doing some of the examples

While I realize at this point I have not given any real perspective on Bluemix, I am assuming that because you are reading this you are at least vaguely interested in it……so take an hour and do one of the examples. After that things will start to become clearer and I will talk more on the subject and the examples in the coming weeks.

Go to the first example on the documentation website – build yourself a web application. Go through the motions and just do it. This will do nothing more than familiarize yourself with the interface at this point. You may still be staring at it and thinking “so what”, but you should do it. It gets easier. For most of you it will be the fastest you’ve ever created and used a web server of any sort.

Don’t be scared of the command line

Whatever Ryan says about how awesome the Command Line Interface (CLI) is for cloud foundry, I believe that CLI is scary to XPages developers who are used to tools to do “command line stuff” for them.

Just do the example – and trust me it will get easier and make more sense.

In the next article I will talk about IBM DevOps Services at Jazz Hub and how you shouldn’t have to use the command line to better understand how this all works together.