Friday, October 3, 2014

More Fun Creating Dojo widgets and MVC

Here is another article in my "Fun with Dojo" series.

Dojo has traditionally been considered slow.  One reason is that Dojo loads a lot of modules before it instantiates the page.  Dojo with AMD reduces the number of modules loaded up front, though not the total number of modules, and that alone is a big improvement.  However, one thing that I really dislike is Dojo's Dijit widgets.  They are heavy, slow, and require a lot of modules to be loaded.

We like the Bootstrap widgets because they are lightweight. So instead of using the Dijits that are built into XPages, we created our own lightweight Dojo Bootstrap widgets that use the minimal set of required Dijit modules and dependencies.

One of the things we had to figure out was which bare-bones modules we needed beside dijit._Widget in order to create our own widgets and include them in the loading process.  When we were using Dojo 1.5.3, we created a custom loader with all the required modules contained in a single file.  Rather than including _TemplatedMixin, _WidgetsInTemplateMixin, and other mixin modules, we did everything with our own code within the buildRendering lifecycle.  This helped reduce the number of additional modules that were required.
<div id="happy" class="user">
 <button class="icon"><i class="fa fa-user fa-lg"></i></button>
</div>

However, this approach meant giving up a very powerful module.  In the example above, you can get a handle to the button node of the widget by using:

var btnNode=dojo.query('button',this.domNode)[0];

or

var btnNode=dojo.query('.icon',this.domNode)[0];

For users not familiar with Dojo, "this.domNode" is a handle to the widget's root DOM node, "this" being the widget itself.  Please note that "this.domNode" does not exist until the widget goes through the buildRendering lifecycle.

This is how we were getting a handle, since we did not need all the stuff that was in dijit/_TemplatedMixin and were calling buildRendering directly.  This worked fine, but it added more code to the widget than we wanted.

So we took another look at how we could utilize what was included with Dojo while not loading stuff that we did not need. Within _TemplatedMixin, there is a call to another core module, dijit/_AttachMixin.  This module creates attach points and attach events so you can get a handle to different parts of the widget DOM. So in our previous example, rather than using dojo.query, all you have to do is add the attribute "data-dojo-attach-point" to the DOM fragment and run this._attachTemplateNodes(this.domNode) during the buildRendering lifecycle.

<div id="happy" class="user">
 <button class="icon" data-dojo-attach-point="btnNode">
  <i class="fa fa-user fa-lg" ></i>
 </button>
</div>

So your widget code will be:
define("mywidgets/button",[
 "dojo/_base/declare",
 "dijit/_WidgetBase",
 "dijit/_AttachMixin",
 "dojo/dom-class"
],function(declare,_WidgetBase,_AttachMixin,domClass){
 return declare([_WidgetBase,_AttachMixin],{
  buildRendering:function(){
   // use the existing DOM fragment as the widget's root node
   this.domNode=this.srcNodeRef;
   // wire up the data-dojo-attach-point attributes (creates this.btnNode)
   this._attachTemplateNodes(this.domNode);
  },
  postCreate:function(){
   this.inherited(arguments);
   domClass.add(this.btnNode,'blue');
  }
 });
});

So now we have a handle to the button node using "this.btnNode". Simple and easy. A view/controller in Dojo can just be a Dojo widget with other Dojo widgets inside, so you can extend the same approach if you are using Dojo for your MVC framework.  That will be the topic of a future blog.

"this.srcNodeRef " is the reference to the DOM fragment on your HTML page that represents the widget.

Thursday, October 2, 2014

The Good, Bad, and Ugly about AMD in Dojo

We were testing iPhora Touch 2 with the attendees of MWLUG 2014 in the MWLUG portal, and as usual the interface had issues even after outside users provided feedback before we rolled it out.  When you are so focused on the creation process, it is hard to see issues until a fresh pair of eyes looks at them.  So we are working on a new, faster, and improved interface for iPhora Touch. This is the evolution of any product.

Since we are using a JavaScript/JSON RESTful API approach, most of the changes will be on the front end with only minor changes to the back end, which is great.  In theory we can swap out different interfaces with little impact on the back end.  This is where the newer AMD approach in Dojo comes into play. The use of AMD in Dojo allows us to modularize the UI components even more than before.  Therefore, we have been upgrading all the iPhora widgets from Dojo 1.5.3 to Dojo 1.10.0, and at the same time upgrading from Bootstrap 2.3.2 to 3.2 and the latest version of Font Awesome.  The view/controller of the MVC framework becomes no more than a Dojo widget.  This will be the subject of upcoming blogs.

The AMD approach in Dojo has many advantages, but there are also a number of things that I do not like about it.  For example, below is a simple piece of code that reads a cookie, writes it into a DOM node, and adds a class to a div tag using Dojo.

HTML
<div class="data-display"></div>

Dojo without AMD
dojo.require('dojo.cookie')
var c = dojo.cookie('info');
var node=dojo.query('.data-display')[0];
node.innerHTML=c;
dojo.addClass(node,'show');


Dojo with AMD
require([
 "dojo/cookie",
 "dojo/query",
 "dojo/dom-class"
],function(cookie,query,domClass){
 var c=cookie('info');
 var node=query('.data-display')[0];
 node.innerHTML=c;
 domClass.add(node,'show');
});
As you can see, the amount of code for Dojo with AMD can quickly add up, and if you mismatch a module path with its callback parameter you get an error that won't make much sense.  So you need to be careful when defining the path/parameter pairs.

But like all good developers, I decided to add a new feature to my development tool that uses a template to create the code.   This is where my mustache function comes in handy.  Below is an example of the kind of code template I created.
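As a rough sketch (the mustache tag names here are placeholders, not necessarily the ones the tool actually uses), the template is just the require() skeleton with tags marking where the generated pieces go:

// {{modulePaths}} expands to the list of module path strings,
// {{moduleParams}} to the matching callback parameters,
// and {{body}} to the generated widget code
require([
{{modulePaths}}
],function({{moduleParams}}){
{{body}}
});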



As a result, all I have to do is have my compiler create the Dojo/Bootstrap widgets from the declaration tags, for example <ux:ipField id="hello"/>, assign the appropriate required modules to the corresponding mustache tags, and do a simple LotusScript find and replace.

However, one critical advantage of AMD is that you can create your own module and include it, or replace an existing module, relatively easily. For example, we needed to replace the Dojo JsonRest store module "dojo/store/JsonRest" with our own "iphora/JsonRest", since it did not meet our requirements for how we do REST services. We copied the JsonRest.js code, modified it, placed it in our iphora directory, and now it loads our module instead.

require([
 "dojo/cookie",
 "dojo/query",
 "dojo/dom-class",
 "dojo/store/JsonRest"
],function(cookie,query,domClass,jsonRest){
 var c=cookie('info');
 var node=query('.data-display')[0];
 node.innerHTML=c;
 domClass.add(node,'show');
});

require([
 "dojo/cookie",
 "dojo/query",
 "dojo/dom-class",
 "iphora/JsonRest"
],function(cookie,query,domClass,jsonRest){
 var c=cookie('info');
 var node=query('.data-display')[0];
 node.innerHTML=c;
 domClass.add(node,'show');
});

So we just point the path to our module.  No other changes in the code are required. 
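For the loader to resolve "iphora/JsonRest", it does need to know where the iphora package lives. One way to do that on a plain HTML page (the paths below are illustrative and assume the iphora directory sits alongside dojo/) is to register the package in dojoConfig before dojo.js loads:

<script>
 var dojoConfig = {
  async: true,
  packages: [
   // "iphora" is our custom package; location is relative to dojo.js
   { name: "iphora", location: "../iphora" }
  ]
 };
</script>
<script src="/js/dojo/dojo.js"></script>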

Another disadvantage of the AMD module approach is that you need to be extremely careful if you are trying to create a compressed build of your own modules to reduce the number of HTTP requests during the loading process.

Without AMD, you can easily just concatenate all the modules into a single script file and load that script as part of the page instantiation process. Dojo with AMD, however, requires you to create a build using a more elaborate process that utilizes a build profile and package.json, similar to how npm describes packages in Node.js.  More information can be found in this article:

http://dojotoolkit.org/documentation/tutorials/1.8/build/
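For reference, a build profile for a custom widget package is roughly along these lines (the paths, package names, and layer name below are placeholders; see the tutorial above for the real details):

var profile = {
 basePath: "..",
 action: "release",
 releaseDir: "release",
 packages: [
  { name: "dojo", location: "dojo" },
  { name: "mywidgets", location: "mywidgets" }
 ],
 layers: {
  // everything listed here gets concatenated into one file
  "mywidgets/layer": {
   include: [ "mywidgets/button" ]
  }
 }
};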

Dojo with AMD is more powerful than ever; it just takes a little more work, and it can be fun!

Wednesday, October 1, 2014

Why an Ideal MVC Framework Fails

One of the "Next Big Things" these days is the use an MVC framework for developing Web applications. There are a number of MVC frameworks out there including Backbone, Ember, and the rapidly growing Angular. 

Ideally these MVC frameworks are great: you hook things together, any change to the model (data) updates all the views through the controller code, and if the model changes, the data is automatically updated on the server. That is the ideal case.

However, it assumes one critical thing: that you have the access rights to CRUD data on the server. This is an extremely dangerous assumption.

In our iPhora security model, we assume the opposite: you do not have access rights to any data on the server. We do not trust anything or any request. We assume that you are trying to hack and inject. Your authorization is checked on every request. We looked at using Angular, since it is the hot stuff of the MVC world and everyone seems to be using it, and we also looked briefly at Backbone. However, these would have required us to hack and patch the code to do what we wanted it to do. As a result, it would be hard to maintain in the future when new versions of Angular or Backbone come out.

Since we have been using Dojo for a while, we looked at a couple of Dojo-based MVC frameworks, including Dojorama and Dojox/MVC. These two also assume an ideal MVC approach, which is not what we wanted. Dojox/MVC is also heavy.

Most MVC frameworks work well for simple applications and even for more complex ones. But for iPhora applications like iPhora Touch, within a single session the user's access rights and roles vary constantly depending on the state and on the application the user is accessing.

One thing that all these MVC frameworks lack is good documentation, not on how to use them but on the core architecture and functionality. You have to go through all the code and test, and sometimes it comes down to trial and error and assumptions. What a pain.

So we decided to create our own MVC framework using Dojo. The newer AMD approach in Dojo lends itself to MVC.  Dojo already has core building blocks for MVC, including dojo/store, dojo/Stateful with its watch/unwatch support, and dojo/store/Observable. Also, don't forget the powerful publish/subscribe support in dojo/topic. We used the Dojorama project as a guideline and starting point. That MVC project is a good demonstration of the power of Dojo.
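As a tiny illustration of those building blocks (the topic name and property below are made up), a model can be a Stateful object that views subscribe to:

require(["dojo/Stateful","dojo/topic"],function(Stateful,topic){
 // the model: watch() fires whenever a property changes
 var model=new Stateful({status:"draft"});
 model.watch("status",function(name,oldValue,newValue){
  topic.publish("app/status-changed",newValue);
 });

 // a view/controller somewhere else listens for the change
 topic.subscribe("app/status-changed",function(newValue){
  console.log("view updated to "+newValue);
 });

 model.set("status","published"); // triggers the watch, which publishes the topic
});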

So where are we? Our initial MVC architecture is complete and we have modified our development tools to handle the new MVC approach, though we expect more changes as we test.  The base core code is almost complete and will be ready for a test during the next couple of days. Our existing page-based JSON RESTful services API will require some minor changes that are relatively easy to make. We are halfway done updating our existing iPhora Dojo/Bootstrap widgets to Dojo 1.10/Bootstrap 3.2 with MVC-based hooks. So stay tuned for more updates.

Tuesday, September 30, 2014

Don't What it, Hitch It

Async callbacks and closures are an incredible part of JavaScript.   They are what make environments like Node.js possible, a topic of many blogs to come.  Many JavaScript frameworks, including my favorite framework Dojo, are built around closures and callbacks.  One of the issues you need to deal with is getting a handle to the current scope within a closure.

We discussed this and how to deal with it in Dojo in a previous blog.

http://dominointerface.blogspot.com/2014/03/when-is-this-is-this-and-what-is-this.html


The nice thing about Dojo is how extensive the library is.  This is also a negative, since there are so many features that you might not be aware of. As I am converting our iPhora library from Dojo 1.5.3 to Dojo 1.10.0, I realized there is another way to get a handle to the current scope within a closure: dojo.hitch, which has been around for a long time.

In our previous example, we had a simple widget.

<div class="widget" id="happy">
<ul>
<li id="1">
  <a href="#" tabindex="-1"><i class="icon-tools"></i>Hello</a>
 </li>
<li id="2">
  <a href="#" tabindex="-1"><i class="icon-tools"></i>Hello</a>
 </li>
<li id="3">
  <a href="#" tabindex="-1"><i class="icon-tools"></i>Hello</a>
 </li>
<li id="4">
  <a href="#" tabindex="-1"><i class="icon-tools"></i>Hello</a>
 </li>
<li id="5">
  <a href="#" tabindex="-1"><i class="icon-tools"></i>Hello</a>
 </li>
</ul>
</div>

To get a handle to "this" within the closure, you can use this method:
dojo.query('ul > li',this.domNode).connect('onclick',this,function(e){
 alert(this.id);
});


or define "what" equal to "this" to get a handle
var what=this;
dojo.query('ul > li',this.domNode).connect('onclick',function(e){
 alert(what.id);
});


With Dojo, you can also provide the current scope "this" by using dojo.hitch (lang.hitch in AMD terms):
require(["dojo/_base/lang","dojo/query"],
  function(lang,query){
    query('ul > li',this.domNode).on('click',lang.hitch(this,function(e){
       alert(this.id);
      }));
});


Though you can use dojo.hitch in this way to replace the previous methods, one of the best places to use it is with async XHR callbacks.  dojo.hitch provides an elegant approach when you need a callback for an async request.  I use this all the time for populating widgets or, in MVC terminology, controllers.

If you try this:
require(["dojo/_base/xhr"], function(xhr){
  var args = {
    url: "hostnameurl",
    load: this.callback
  };
  xhr.get(args);
});


you will get an error because "this" is not defined within the closure of the xhr request. You could define var what=this to provide a handle, as we did before. However, dojo.hitch is a better solution.

require(["dojo/_base/xhr","dojo/_base/lang], function(xhr,lang){
  var args = {
    url: "hostnameurl",
    load: lang.hitch(this,callback)
  };
  xhr.get(args);
});

Now, you can get a handle to your current scope "this" and your callback will not bomb.
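Putting it together, here is a minimal sketch of what this looks like inside a widget (the module id, URL, and method name are made up for illustration):

define("mywidgets/infoPanel",[
 "dojo/_base/declare",
 "dijit/_WidgetBase",
 "dojo/_base/xhr",
 "dojo/_base/lang"
],function(declare,_WidgetBase,xhr,lang){
 return declare([_WidgetBase],{
  postCreate:function(){
   this.inherited(arguments);
   xhr.get({
    url:"/api/info",                  // hypothetical endpoint
    handleAs:"json",
    load:lang.hitch(this,"_onData")   // "_onData" is resolved on this widget
   });
  },
  _onData:function(data){
   // thanks to lang.hitch, "this" is the widget here
   this.domNode.innerHTML=data.message;
  }
 });
});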

Monday, September 29, 2014

Installing Nginx Reverse Proxy on CentOS for Domino Our Experience

Over the past few weeks there have been a significant number of discussions about Domino and its lack of SHA-2 support.  Jesse Gallagher had an entire MWLUG 2014 session on this very topic.  When I asked Jesse to present on it, I had no idea what a hot and timely topic it would become.  First, IBM should have fixed these problems years ago.  For us this is a critical issue that, if not addressed, will kill the market for Domino.

Thanks to Jesse's efforts and contributions, there is a workaround that he presented and recently published in a series of blog articles.  His solution is to configure nginx as a reverse proxy for Domino so that SHA-2 certificates can be used with Domino. nginx is not just a Linux solution; it can also be a Windows solution, since it is available for the Windows platform.

Jesse's articles focused on setting up an nginx reverse proxy on an Ubuntu server.  My comments here are about the differences between what Jesse explained for Ubuntu and what you will encounter configuring nginx on CentOS, in particular CentOS 6.5.

A great resource to guide you through the process of installing Nginx on CentOS can be found here:

http://www.rackspace.com/knowledge_center/article/centos-installing-nginx-via-yum

By default the installation will not create the sites-available and sites-enabled directories as it would on Ubuntu, and they are not needed. A set of default configuration files will be created in the /etc/nginx/conf.d directory, including an ssl.conf template file and default.conf.   Make a copy of the ssl.conf file using the renaming convention that Jesse suggested and edit it according to his instructions.  One thing to remember is that you need to rename the default.conf file in order for your new conf file to work.  If you do not, the default.conf settings will override your new conf file.
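For reference, the heart of that conf file is a server block along these lines (the hostname, paths, and the Domino HTTP port are placeholders; see Jesse's articles for the full configuration):

server {
    listen 443 ssl;
    server_name example.com;

    ssl_certificate     /etc/nginx/ssl/example.com.combined.crt;
    ssl_certificate_key /etc/nginx/ssl/example.com.key;

    location / {
        # Domino listening on plain HTTP behind the proxy
        proxy_pass http://127.0.0.1:8088;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}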

After installing nginx, you will need to create a key and a certificate signing request for the SSL certificates.

1. Create a directory for the SSL certificates, # sudo mkdir /etc/nginx/ssl

2. Switch to this directory # cd /etc/nginx/ssl

3. Create the request

# sudo openssl req -new -days 365 -sha256 -nodes -keyout example.com.key -out example.com.csr -newkey rsa:2048 

Notes:
Leave out -sha256 for a SHA-1 request.
The -keyout parameter is the name of the key file that will be created.
The -out parameter is the name of the request file that will be sent to the certificate authority.
In response to this command, you will be asked to provide the organizational information, address, etc.
If the certificate expiration is to be different from one year, change -days 365 to the desired number of days.

4. Limit access to the key, # sudo chmod 400 example.com.key

5. Send the .csr file to the certificate provider for signing. They will send back a .crt file.

6. Limit access to the signed certificate, # sudo chmod 400 example.com.crt

7. The certificate authority may send back multiple files. If so, then concatenate them:

# sudo cat example.com.crt  root1.crt > example.com.combined.crt

Notes:
The CRT file with your hostname (example.com.crt) should be first. If there is more than one root or intermediate certificate, I am not sure of the order of those (try it and see).
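For what it's worth, the order that usually works with nginx is your own certificate first, then any intermediate certificates in signing order, with the root last (the filenames here are just examples):

# sudo cat example.com.crt intermediate.crt root1.crt > example.com.combined.crt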


After concatenating the certificates to create the combined file, make sure there is a line break between the END CERTIFICATE line of one certificate and the BEGIN CERTIFICATE line of the next.  If you do not do this you will get an error.

# sudo vi example.com.combined.crt 


The process is relatively easy even for a Linux newbie like myself.  We are planning to add nginx as the reverse proxy for all our installations of Domino.  One advantage of having nginx as a reverse proxy is that, by having nginx connect to Domino over plain HTTP behind the proxy, there is less load on the Domino server and the system as a whole can handle more HTTPS connections.
 
The installation process for the SSL certificate follows the standard convention used by most solutions rather than the specific approach required by Domino, so there is much more documentation available.

An additional article I suggest reading is:
 
http://www.nginxtips.com/hardening-nginx-ssl-tsl-configuration/

Please note that if you use Network Solutions, they still do not have SHA-2 certificates available yet.  I can't believe that.

To learn more about installing CentOS and Domino, Devin Olson has a series of great videos and PDFs on YouTube.

http://www.youtube.com/watch?v=geBE13qqz7w


If you want to chat about ICS on Linux, sign up for the ICS Linux Chat on Skype that was started by our friend Bill Malchisky.

Thursday, September 25, 2014

Gotcha, Creating Dynamic Script Blocks Using SSJS

For our iPhora applications we use only one XPage and dynamically create the content that appears.  We do this by storing the dynamic content in Notes rich text fields and using SSJS to read it and generate the HTML/JavaScript at runtime.

Since we are moving to a single-page MVC model, we were adding a few additional xp:scriptBlock controls to generate the initial loader, pulling the information from a Notes rich text field.  We were using doc.getFirstValueString('richtextfield') to pull the text and create the script. No problem, until we had a JavaScript object declaration that was longer than 72-odd characters.

Unfortunately, there is an issue with Notes rich text fields that we had encountered in the past using LotusScript and that I had forgotten all about.  When you read a Notes rich text field this way, it automatically adds a carriage return after about 72 characters, and this can drive you crazy. If the carriage return lands in the middle of a JavaScript object declaration, you get a JavaScript error.

To resolve this, switch to doc.getFirstItem("richtextfield").getUnformattedText() to read it.  The automatic carriage returns that Domino/Notes adds are filtered out, but the carriage returns that you typed in remain.
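In SSJS that ends up looking something like this sketch (the field name is illustrative):

// read the stored script without the automatic ~72-character wrapping
var item = doc.getFirstItem("richtextfield");
var script = (item != null) ? item.getUnformattedText() : "";
// "script" keeps only the line breaks that were actually typed into the field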

Tuesday, August 26, 2014

MWLUG 2014 Article in Grand Rapids Business Journal

A great article about MWLUG 2014 appeared in the Grand Rapids Business Journal.  Special thanks to Louise Burton for helping place the article.

http://www.grbj.com/articles/80431-ww-ii-vet-headlines-downtown-lotus-conference

Monday, August 25, 2014

MWLUG Is Heading South

With MWLUG 2014 about to start in a couple of days, we will be announcing the host city for MWLUG 2015 at the closing session on Friday. We have decided to head south for 2015, and yes, it is a bit of a stretch to call some of these cities part of the Midwest.  Here are the potential host cities for MWLUG 2015.  We would like to hear your input, so please fill out the survey.
  • Atlanta, GA
  • Cincinnati, OH
  • Columbus, OH
  • Dayton, OH
  • Louisville, KY

https://www.surveymonkey.com/s/PZGSND2