Friday, September 16, 2022

The iPhora Journey - Part 7 - Transforming Domino with Microservices

Most of the concepts that we have been talking about in our iPhora Journey are neither new nor revolutionary. Collectively, they merely represent the current state of the art in designing web applications.

In other (i.e., non-Domino) platforms, you would need an array of services, each running on a separate server, and you would have to build the integration between the different components and maintain security between all of them. With virtualization and containers, all of the necessary services can now be bundled into a single installation, which is helpful, but the integration of those components can still be fraught with issues, especially compatibility between different versions of those services.

With Domino, all of this is taken care of for you: authentication, directory services, security, web services, and database services are all tightly integrated in a single server, which is an advantage that no other technology offers. While Domino has its quirks and does not support many of the latest features, hopefully HCL will address those in the near future. Even as it stands right now, the simple addition of nginx as a proxy in front of Domino addresses most of the deficiencies.

The iPhora platform is built on Domino, and in doing that we take advantage of the inherent integration of the Domino services. But we also incorporate design methodologies that are usually associated with non-Domino applications. With this approach, iPhora has the capabilities and features of a modern web application, but all it needs to run is a single server with the most minimal of hardware requirements. We challenge any vendor to be able to do everything that iPhora does on a single box as small as this.


Creating a modern application is not solely about using the latest technology. Well-supported technologies with a long-term commitment to maintenance are always preferable to the latest design fad. The important question is what you do with that technology.


The iPhora concept evolved over the span of 15 years, as we developed applications for organizations both large and small and gained experience with how companies handled their internal business processes. Step by step, we abandoned the more monolithic approach normally used when creating Domino applications in favor of a more fluid, plug-and-play architecture. Eventually, we wound up with a service-based and loosely coupled architecture: a microservice architecture. It should be noted that this applies to the iPhora components that we developed, not to the underlying Domino services.


Wikipedia defines a microservice architecture as:

"A microservice architecture – a variant of the SOA (service-oriented architecture) structural style – arranges an application as a collection of loosely-coupled services. In a microservice architecture, services are fine-grained and the protocols are lightweight. The goal is that teams can bring their services to life independent of others. Loose coupling reduces all types of dependencies and the complexities around it, as service developers do not need to care about the users of the service, they do not force their changes onto users of the service. Therefore it allows organizations developing software to grow fast and big, as well as use off-the-shelf services more easily. Communication requirements are reduced. These benefits come at a cost to maintaining the decoupling. Interfaces need to be designed carefully and treated as a public API."

Using the programming tools available in Domino, we were able to build a microservice architecture that passes data in and out using a common bus and transport schema. Each microservice is an isolated black box with only input and output.

Remember that in a previous blog entry, I mentioned the importance of the JSON format and the ability to quickly process JSON. The reason is that the data transport layer between microservices is all JSON. By having all input and output defined as JSON objects, we achieve a standardized and easily extensible communication layer. This allows microservices to be independent and reusable by multiple services and processes. A typical process can chain together a series of microservices to accomplish a specific and highly complex task, as opposed to writing a custom service to accomplish the same thing.
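As a sketch of the idea (in JavaScript, with invented service names, not iPhora's actual API), each microservice is a black box that accepts a JSON object and returns a JSON object, so a process can chain any number of them:

```javascript
// Illustrative microservice chain: each service is a black box that takes a
// JSON object as input and returns a JSON object as output. The service names
// (validate, enrich, summarize) are invented for this sketch.
const validate = (msg) => ({ ...msg, valid: typeof msg.amount === "number" });
const enrich = (msg) => ({ ...msg, currency: msg.currency || "USD" });
const summarize = (msg) => ({
  id: msg.id,
  ok: msg.valid,
  total: msg.amount,
  currency: msg.currency,
});

// A "process" chains microservices: the output of one becomes the input of the next.
const runProcess = (input, services) =>
  services.reduce((json, svc) => svc(json), input);

const result = runProcess({ id: 42, amount: 19.99 }, [validate, enrich, summarize]);
// result: { id: 42, ok: true, total: 19.99, currency: "USD" }
```

Because every service speaks the same JSON transport schema, a new process is just a new list of services; no service needs to know who produced its input or who consumes its output.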

But how do you connect these different microservices together to create your application? That leads to our next discussion: flow-based programming.

Tuesday, September 6, 2022

The iPhora Journey - Part 6 - An Application, Rethinking and Redefining

In our previous articles, we discussed the advantages of using JSON in applications and the importance of being able to effectively process JSON. This leads to an interesting question: what exactly is an application? The answer may differ depending on who you ask. Most people today would assume you are referring to a phone app, and even then there is a wide range of options. A college student might think of apps like TikTok or WhatsApp, and a person looking for a significant other might automatically think of Hinge or Tinder. All of these people would be able to provide you with multiple examples of apps, and they might be able to list some of the common features of an app, even if they do not understand any of the technical details.




In general, a web app today consists of the following four modules, even if the technical details may differ tremendously from app to app:

  • User interface (UI)
  • Database
  • Business Logic
  • Security

Now, let's consider what an app is in the traditional Notes/Domino sense. Here, an application still consists of the same four modules, but everything is mixed together (actually, smeared together may be a better term) rather than separated. A Domino application consists of one or more Domino databases, and access and security are defined within each database using ACLs. This is true for both Notes client and Web-based applications created on Domino -- so far, so good. However, when it comes to the UI and Database modules, Domino uses the concept of a form, which defines both the user interface and the data structure for storage -- modules that are typically separated in more modern apps.

Moreover, the business logic is programmed directly into the UI components all over the form. Even something as simple and basic as controlling which UI elements are visible (i.e., hide-when formulas) requires that the programming logic be dispersed throughout the form, potentially associated with each and every UI component.

This approach can work for stand-alone applications or situations involving only a few applications. However, as the number of applications increases, providing the infrastructure to effectively create and manage the applications and user access becomes more and more difficult. This is especially true for Domino applications, because the business logic can be implemented in so many different ways and in so many different places.

The complexity of Domino applications tends to increase even further because the data is distributed throughout multiple databases using different data structures. The more different forms an application uses, the more views are needed and the more indexing is required, resulting in the need for more storage space.

So our primary goal was to structure iPhora to follow a modern application architecture while still providing the extensive data security one expects from Domino. We wanted to be able to build and manage hundreds of applications with the same ease as one application. Almost all of the development projects that we have been involved with so far began with one critical application that eventually expanded to dozens of related applications, each with different sets of users and different ACL requirements. For one customer, that one initial application grew into 50 applications and over 100 processes, each with different roles and user access requirements, all accessed and managed within a single instance of iPhora. Yet even as more and more applications were added, no additional Domino databases were needed.

An iPhora installation uses only seven Domino databases, each serving a specific purpose within the iPhora architecture: 

  • API / GateKeeper
  • Attachment Store
  • Business Logic
  • Data Store
  • Logging
  • UI Interface
  • User Profile

In other words, there is no longer a one-to-one relationship between applications and Domino databases. Any number of applications, including widely different types of applications, can be accommodated using this structure.

Since data security is of the utmost importance for iPhora, it was extremely important that users never have direct access to the data, which is contrary to a traditional Domino application. We wanted all of the databases to be totally locked down. Our API consists of LotusScript agents that interact with the data according to strict protocols. In general, the only databases that users interact with are the database that holds those agents and the UI interface. The other databases cannot be opened using Domino URL commands.

Since data is stored as JSON, the data store module utilizes a single Domino form to store all records regardless of the application, although the fields may differ from one record to another. Any attachments that are uploaded into a data store document are actually stored in a separate attachment store database, which also utilizes only a single Domino form. With DAOS, we can handle 1 TByte of attachments, and if more is needed, we can simply add another attachment store.

For companies that already have data in Domino databases, iPhora has the ability to read and display data from other Domino databases, including databases that are structured in the traditional Domino way. The only requirement is that the form structure be defined and added to iPhora.

In conclusion, by re-thinking and re-imagining what an application is in Domino, Domino becomes a scalable application platform on which to build and run secure applications that utilize a modern design framework.  The advantages of a structured and scalable application architecture become apparent as one moves towards utilizing no-code/low-code tools to build applications, something we will consider later in this series. 

How we access and communicate with external services, including non-iPhora Domino applications, is the subject of our next discussion: Microservices and Nanoservices.

Saturday, August 20, 2022

The iPhora Journey - Part 5 - Dammit Jim, I'm a LotusScripter not a JavaScripter



As often said by Dr. McCoy in the original Star Trek series, he is a doctor, not a ____________ . However, just like Dr. McCoy, you may sometimes need to work on things that seem very alien to your experience. One of those things might be JSON. LotusScript, which was derived from Visual Basic, was written long before JSON existed and therefore had no built-in capabilities for handling JSON objects.

As we mentioned in Part 4, JSON plays a critical role in iPhora. All data are stored as JSON, and JSON serves as the primary data and communication format between modules, functions, and services. All core components operate using JSON-based configurations. Therefore, it was extremely important that we be able to fluidly create, read, and process JSON.

Creating a JSON string is relatively easy in any programming language. You could create one even using Commodore 64 BASIC, and if built sequentially, one line at a time, it is possible to create JSON representations of very complex objects. The string manipulation capabilities of LotusScript are more than adequate to build large JSON objects in this way. Reading a JSON string can also be accomplished using the same set of LotusScript string functions, but doing so requires a significant amount of filtering and manipulation to locate and retrieve the data you are looking for. The task becomes more and more difficult as the JSON becomes larger and more complex.
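The contrast is easy to show in JavaScript (purely illustrative): building JSON by string concatenation works in any language with string functions, but a parser lets you work with structure instead of strings:

```javascript
// Building a JSON string sequentially, one piece at a time -- the approach any
// language with string functions (including LotusScript) can manage:
let s = "{";
s += '"name":"iPhora",';
s += '"parts":[1,2,3]';
s += "}";

// Building the same data as a native object and serializing it -- the approach
// a real JSON parser enables:
const obj = { name: "iPhora", parts: [1, 2, 3] };
const t = JSON.stringify(obj);

// Both routes yield the same structure once parsed:
const same = JSON.stringify(JSON.parse(s)) === t;
```

The string route is fine for writing; it is reading and targeted manipulation where the lack of structure hurts.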

Starting with HCL Domino 10, LotusScript provided some basic components to read and assemble JSON, known as the NotesJSONNavigator class, and this was a major improvement. It effectively replaced the tedious process of string manipulation and treated JSON as objects for reading and writing. However, beyond the simple objects and arrays it was designed to handle, the tools provided within LotusScript are still very limited, especially compared to JavaScript.

As JavaScript developers, we wanted the LotusScript JSON parser to reflect the JSON processing functionality that was found in JavaScript. Since we had lots of experience manipulating JSON on the front-end (using JavaScript), that set our expectations for what was needed on the back-end. We modeled our approach on JavaScript, and that determined the methods that we had to implement in LotusScript.  After many years and many versions, we finally came up with the Flex JSON Parser.

So why did we call it the Flex JSON Parser? Because it was more flexible than our many other attempts at creating a JSON parser. Below is a comparison of the functions between JavaScript and the Flex JSON Parser. Note that spec is simply a string that uses dot notation to refer to a specific JSON object, array, or keyname, regardless of how many levels down it might be; for example, it could be "[0][25].grass.type".

Create a JSON Array
  JavaScript:  var a = []; or var a = JSON.parse("[]");
  Flex Parser: Set a = New JSON(10) then Call a.parse(|[]|)

Create a JSON Object
  JavaScript:  var a = {}; or var a = JSON.parse("{}");
  Flex Parser: Set a = New JSON(10) then Call a.parse(|{}|)

Convert JSON Object to JSON String
  JavaScript:  var b = JSON.stringify(obj);
  Flex Parser: b = obj.stringify()

Enumerate an Object
  JavaScript:  Object.keys( obj );
  Flex Parser: obj.getKeys(spec)

Push Array
  JavaScript:  objs.push( obj )
  Flex Parser: objs.push( spec , obj )

Push Apply Arrays to Arrays
  JavaScript:  objs.push.apply(objs, bObjs )
  Flex Parser: objs.pushapply( spec , bObjs )

Length of Array
  JavaScript:  objs.length;
  Flex Parser: objs.getCount(spec)

Get Value of Key
  JavaScript:  var a = obj["hello"];
  Flex Parser: a = obj.getValue("hello")

Set Value of Key
  JavaScript:  obj["hello"] = a;
  Flex Parser: obj.setValue("hello", a)

We also adopted some methods from the Dojo Toolkit framework:
  JavaScript:  lang.mixin( aObj , bObj )
  Flex Parser: aObj.mixin( bObj )

But the most important Flex JSON Parser methods, the ones that allow for significant processing of JSON, are:

Set obj = objs.getElement( spec )
Call objs.setElement( spec , obj )
index = objs.getIndexByKeyValue( spec , key , value )

These functions allow us to extract JSON objects from larger JSON objects and then to insert them into specific locations within other JSON objects.
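To illustrate what a dot-notation spec does (this is our own simplified JavaScript sketch of the concept, not the Flex JSON Parser's implementation), here is a resolver that walks a mixed tree of objects and arrays to get or set a nested element:

```javascript
// Simplified sketch of spec resolution: convert a spec such as
// "[0][25].grass.type" into path segments, then walk the tree.
// This is our own illustration, not the Flex JSON Parser's code.
function toPath(spec) {
  return spec
    .replace(/\[(\d+)\]/g, ".$1") // "[0]" becomes ".0"
    .split(".")
    .filter((part) => part !== "")
    .map((part) => (/^\d+$/.test(part) ? Number(part) : part));
}

// getElement: extract a nested element from a mix of objects and arrays.
function getElement(root, spec) {
  return toPath(spec).reduce(
    (node, key) => (node == null ? undefined : node[key]),
    root
  );
}

// setElement: insert a value at a specific location inside another object.
function setElement(root, spec, value) {
  const path = toPath(spec);
  const last = path.pop();
  const parent = path.reduce((node, key) => node[key], root);
  parent[last] = value;
}

const data = [{ grass: { type: "fescue" } }, { grass: { type: "clover" } }];
getElement(data, "[1].grass.type"); // "clover"
setElement(data, "[0].grass.type", "rye"); // data[0].grass.type is now "rye"
```

The point of the spec string is that one flat address reaches any depth, no matter how objects and arrays are interleaved along the way.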

With the Flex JSON Parser we can manipulate and process JSON objects that include a mix of objects and arrays at any level. It gave us the processing capabilities that defined how iPhora works and opened up new approaches that we once assumed were not possible in LotusScript. So a LotusScript developer can now process JSON like a JavaScript developer.

Watch a comparison between JavaScript and the Flex JSON Parser in action:
https://vimeo.com/741483881





Wednesday, August 10, 2022

The iPhora Journey - Part 4 - JSON is King - The How



As we mentioned yesterday, in reimagining Domino we wanted Domino to be a modern web application server, one that utilized a JSON-based NoSQL database and was more secure than other JSON-based NoSQL platforms.

A Domino document existing within a Domino database is the foundational data record used in iPhora, just as it is in traditional Domino applications. But instead of just storing data in individual fields, we wanted to store and process JSON in a Domino document. However, text fields (AKA summary fields) in Domino documents are limited to only 64 KBytes, and that is a serious limitation: 64 KBytes of JSON data does not even touch what the real world typically transfers back and forth between browser and server. We looked into the possibility of using rich text fields to store JSON data, but such fields are messy to work with, and the rich text storage format contaminates the data with carriage returns and other unnecessary artifacts. Instead, we decided to make extensive use of MIME fields. We discovered that reading and writing data in MIME fields is just as fast as accessing summary fields, but they can store vastly more data -- up to 1 GByte in a single field -- without corrupting the data. Realistically, the average size of the data that we store in a MIME field is roughly one MByte.




Since our UI framework was a JavaScript SPA framework, it only seemed natural to use a RESTful API approach to communicate between browser and server, with JSON as the data exchange format. However, LotusScript web agents have an inherent limitation of being able to send and receive only 64 KBytes at a time, a limitation that is incredibly -- well -- limiting to designers. Unfortunately, this limitation does not seem to be going away any time soon, as it even affected the NotesHTTPRequest and NotesJSONNavigator classes that were recently added to LotusScript (though those classes have since been improved). To resolve these two issues, we had to experiment over and over again. For output, we ended up chunking the JSON into 64 KByte pieces and streaming them out to the browser. To receive JSON data over 64 KBytes, we had to resort to receiving the JSON data as a form post and treating the form submission document as merely a container for JSON. After working this out, we can easily send and receive megabytes of JSON data without a problem.
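The output side of that workaround can be sketched generically (in JavaScript; the sink callback is a hypothetical stand-in for whatever streams chunks to the HTTP response):

```javascript
// Sketch of the chunked-output idea: split a large JSON string into 64 KByte
// pieces and hand each piece to a sink. The sink here is a hypothetical
// stand-in for whatever streams chunks to the browser.
const CHUNK_SIZE = 64 * 1024;

function streamJson(jsonString, sink) {
  for (let offset = 0; offset < jsonString.length; offset += CHUNK_SIZE) {
    sink(jsonString.substring(offset, offset + CHUNK_SIZE));
  }
}

// A ~200 KByte payload goes out as four chunks, and the receiver can simply
// concatenate them to recover the original string:
const big = JSON.stringify({ data: "x".repeat(200 * 1024) });
const parts = [];
streamJson(big, (chunk) => parts.push(chunk));
const roundTrip = parts.join("") === big;
```

Because JSON is just text, the chunk boundaries can fall anywhere; the browser sees one continuous response and parses it once it has arrived in full.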

However, Domino had another nasty surprise in store for us. We were happy campers with the ability to send and receive megabytes of JSON, but iPhora is used globally, including in Asia and Europe, so support for the UTF-8 character set is extremely important. Imagine the look of undisguised horror on our faces when we discovered that incoming POST data to LotusScript web agents utterly corrupted all UTF-8 characters. This is because LotusScript web agents receive the incoming POST data in the Lotus Multi-Byte Character Set (LMBCS) format, with no support for UTF-8. We notified HCL regarding this issue, but the complexity of the existing LMBCS implementation made it difficult to change without causing other problems. At the same time, the pressure we were receiving from Asia and Europe to provide support for UTF-8 resulted in some serious depression and near-despair at Phora Group. So it was back to the drawing board, and with the aid of Java we were able to create our own scheme to smoothly and quickly handle UTF-8 characters within LotusScript agents.

As we moved forward in the development of iPhora, JSON became the standard, not just for data exchange between servers, but also for defining security, APIs, data structures, data processing, data exchange between software components and modules, and even between functions and methods. JSON allowed us to decouple components, thus extending the concept of reusability. This is something that JavaScript supports intrinsically, since JSON is inherent to JavaScript -- but LotusScript does not.

So how does one read, write, and process JSON using LotusScript? That is a good question. Over the years, a number of LotusScript JSON parsers have been created by individuals, such as the SNAPPs JSON Writer from Rob Novak and his company and, later, the much faster Turbo JSON Writer from Lars Berntrop-Bos. Eventually, support for JSON was added to LotusScript with a new class in Domino 10. However, that JSON support was severely limited and quite buggy. It has since been improved to handle more than 64 KBytes of data; however, the class is still cumbersome to use when you have to manipulate massive amounts of multi-level JSON data, such as data containing a mix of objects and arrays embedded 7 to 10 levels down. Processing JSON data is not just about reading it or simply creating it one line at a time. You often need to insert a large JSON object into an existing object, remove an object or element from an array, or inject data into an existing array element. From a JavaScript programmer's point of view, those are all very simple tasks. However, they are not so easy within LotusScript, nor even with Java.

As JSON became more and more important to us, we had to find a way to quickly handle and process large amounts of JSON. One alternative was to switch from LotusScript to JavaScript/Node and the Domino AppDev Pack. Unfortunately, that does not support MIME fields, which were now essential to iPhora. Over the years, our team created several different types of JSON parsers to help process JSON. The first versions were slow and awkward to use, but over time they got better and better. After many generations of development, we created our Flex JSON Parser, a true JSON parser that is fast and flexible. It is much more on par with how JSON is processed using JavaScript.

Next time, we will talk about the use of this JSON parser and why having such a parser was a game changer. The next edition is titled "The iPhora Journey - Part 5 - Dammit, Jim, I'm a LotusScript Developer not a JavaScript Developer (AKA Processing JSON Data with the LotusScript Flex JSON Parser)." We will do a quick comparison of JSON processing using JavaScript and the LotusScript NotesJSONNavigator classes.



Tuesday, August 9, 2022

HCL workshops coming to CollabSphere 2022

As part of CollabSphere 2022, HCL will be running three virtual workshops on Tuesday, October 18, 2022, the day before CollabSphere starts. Each workshop will be 4 hours long with breaks and will be limited to 12 attendees. You will be able to sign up for the workshops as part of the CollabSphere 2022 registration process, which starts next Monday, August 15, 2022. You will be placed on a waiting list if a workshop is full. Below is a list of the workshops that will be provided by the HCL Digital Solutions Academy:

=========================================
Tuesday - 10/18/2022 - Morning
Deploying HCL Sametime Premium 12 on Kubernetes
We'll cover setting up your own Kubernetes cluster and deploying Sametime Premium, which includes chat and meeting components. In addition to this, we'll use the time to cover network architecture, configurations, best practices, and troubleshooting.

Tuesday - 10/18/2022 - Morning
HCL Volt MX Development Jumpstart - Domino Developers Edition!
Are you a Domino Developer looking to understand how your skills can be used with HCL Volt MX? If so, the HCL Volt MX Development Jumpstart for Domino Developers Workshop is for you! Attend this class to get your very first MX application up and running.
  • Learn to integrate HCL Volt MX with a Domino back-end database
  • Develop a Photo Blog Mobile & Web Application from scratch
  • Download reusable assets to get you started quickly with development

Tuesday - 10/18/2022 - Afternoon
Deploying HCL Domino on Kubernetes
Wondering how to install and configure your Domino server on Kubernetes? Are you looking to learn how Domino can help your organization in their Cloud Native objectives? Or do you just want to know what namespaces, pods, and containers are or why you should care? If you answered Yes to any of these questions, this workshop is for you! After attending this session, you will be familiar with the fundamental aspects of Kubernetes as well as everything it takes to run not one but multiple Domino servers in a Kubernetes cluster.


The iPhora Journey - Part 4 - JSON is King - The Why


JavaScript Object Notation (JSON) was defined by Douglas Crockford in the early 2000s as a data exchange format that grew out of JavaScript. JavaScript code can in fact be structured as JSON data, and even the HTML DOM can be represented as JSON data (more about that in a future blog).




JSON has now become the dominant data exchange format, superseding XML. It provides data exchange between the server and JavaScript-based clients, including web browsers, and it is also used in web services and webhooks. It is not only readable by humans but also lightweight. Databases such as MongoDB and RavenDB use JSON as their data storage format, and there are a number of derivatives, such as BSON, the binary version of JSON used in MongoDB.
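A small example shows why the format took over: a JSON record is human-readable, and it round-trips losslessly between the wire format and native objects with one call each way (JavaScript shown; every major language has an equivalent):

```javascript
// A typical JSON record: nested objects and arrays, readable by humans,
// and convertible to and from a compact string with one call each way.
const record = {
  id: "DOC-001",
  tags: ["domino", "nosql"],
  meta: { createdBy: "iPhora", version: 2 },
};

const wire = JSON.stringify(record); // what travels between client and server
const back = JSON.parse(wire);       // what the receiver works with
```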

More and more companies are relying on either cloud-based or external web services to operate their business, and the vast majority of those web services rely on JSON. So the ability to process, store, and manipulate JSON is essential for any web application server. When we started this journey, the only thing in Domino that supported JSON was the ReadViewEntries call (with OutputFormat=JSON).

As part of our quest to reimagine Domino, we wanted to have Domino behave like a modern web application server -- specifically, a JSON-based NoSQL database platform similar to MongoDB. Note that Domino has many advantages over competing database technologies. A Domino document can be up to 2 GBytes in size, while MongoDB has a limit of 16 MBytes. Domino also supports master-to-master replication and has a more robust and more granular data security model. So we had to figure out how to send, receive, and store JSON data in Domino.

Another factor in our decision to transform Domino into a JSON-based NoSQL platform was portability. At the time, it was not clear how much longer Domino would be supported by IBM (a question that was happily resolved when HCL acquired Domino), so we needed a design approach that was platform-agnostic. Again, a JSON-based NoSQL design (one that was API-driven) would achieve that. It would have separation between user interface and data store, making it much easier to move to an alternative data store if necessary.

We decided to break this part into two postings since it was getting too long. So, tomorrow: The iPhora Journey - Part 4 - JSON is King - The How.



Wednesday, August 3, 2022

The iPhora Journey - Part 3 - Creating an Integrated UI Framework


There are many ways to create the user interface (UI) for a web application. The HTML page could be created on the server and then pushed out. It could be static with the data generated on the page by the server with JavaScript, providing a more dynamic experience, or the server could generate new HTML content to update portions of the web page. XPages or PHP are good examples of this. Another method is to have the web page partially generated by the server and have JavaScript build the rest of the HTML by pulling data from the server via an API. This is the approach used in the Single Page Application (SPA) model.

In all cases, the approach is still dependent on the web server technology being used. As mentioned previously in this blog, XPages depends on complete integration between form and document, which effectively results in tight coupling between form, UI, and data. This close coupling limits the flexibility on both the server side and the client side.

With its long history, Domino has gone through a number of ownership changes. Before HCL Technologies acquired Domino from IBM, the future of Domino was somewhat in doubt, and during that time we felt it was necessary to have the ability to switch to a different server platform if Domino was no longer an option. Therefore, we concluded that our UI framework had to be totally independent of the server technology. We focused on a pure JavaScript-generated UI framework that only existed in memory and utilized a single HTML container. The layout, navigation, display, updates, etc. would all be driven by API requests and responses to the server.

With this approach, it does not matter how the APIs are created or what server technology is being used. Currently, we implement APIs with LotusScript web agents, which are one of the most secure ways to generate data and communicate with web clients on Domino. However, we could also switch to APIs created using Java or Node. All that matters is that the API framework and structure are consistent. A loosely coupled platform allows for an easier upgrade process, as individual components can be upgraded separately, and provides increased portability. It also allows the back-end team to focus on the business logic running on the server and the front-end team of UI designers to focus on the interface. This also means that the front-end team needs minimal, if any, knowledge of Domino, which greatly expands the talent pool for hiring UI professionals.

This approach evolved into our SPA framework, which initially used a single XPage as an HTML container. We eventually determined that there was no advantage to having it as a single XPage and moved it to a single Domino Page, which did not have all the overhead associated with XPages.

An IBM Business Partner once asked us what would happen to our investment in our framework if we were forced to change server technologies. The answer: nothing, except for minimal changes to the API routing table.

So, how did we create this UI framework?

The Rise of UXPages

So, 13 years ago, we created UXPages, almost as a parody of XPages. It started as a playful experiment to see if we could come up with something to replace XPages. Like XPages, we wanted an XML representation of the interface, but rather than compiling to Java, it would compile to JavaScript. Since we are somewhat lazy and did not want to manually write JavaScript code, we sought a more automatic way of generating code, including support for versioning. We also wanted to avoid using Domino Designer to create and edit the code.


Thus, the iPhora Application Designer was born. It was a Notes client application, written in LotusScript, that compiled the UXPages XML into JavaScript and, via the magic of DXL, pushed the JavaScript onto the Domino server -- without ever having to open Domino Designer. We even incorporated Node.js so that we could utilize UglifyJS (which may be the worst name for a JS library in history) to minify the JavaScript. However, we did not want to write a JavaScript framework from scratch. Our starting point was Dojo, which was already included with Domino. But after researching how Dojo was implemented in XPages, we decided that it would be better to use Dojo on its own and not the version that was included with XPages.

Unlike jQuery, which is a JavaScript library, Dojo was the first true JavaScript framework. Before ES6 came along, Dojo already supported superclasses, classes, inheritance, modules, etc. As a framework, Dojo provides a complete architecture for building widgets, including widget lifecycle management. But standard Dojo widgets, like the ones currently used in XPages, are rather ugly. So, we decided to build all of our widgets from scratch and not utilize any of the existing Dojo widgets. But then how would we make our new widgets beautiful and appealing? We settled on the CSS framework Bootstrap, which was developed by Twitter. In doing so, we avoided using any of the Bootstrap widgets, since they were based on jQuery. Instead, our widgets had to bind to the incoming data from an API request.

As we progressed in our widget development, we realized that we could extend the use of widgets to handle views, panes, and other features of the application. As a result, our SPA is just a multi-layered set of widgets, where each layer inherits from its parent widget. The concept of the widget lifecycle was extremely important. As a user navigates between virtual pages, new widgets, views, and panes are created and old ones destroyed in order to avoid consuming an ever-increasing amount of memory.
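The lifecycle idea can be sketched as follows. This is our own simplified illustration, not actual UXPages code; the Widget and PageManager classes are hypothetical, though destroyRecursive() mirrors the spirit of the Dojo method of the same name.

```javascript
class Widget {
  constructor(name) {
    this.name = name;
    this.children = [];
    this.destroyed = false;
  }
  addChild(child) {
    this.children.push(child);
    return child;
  }
  // Tear down this widget and its whole subtree, as Dojo's destroyRecursive()
  // does; a real widget would also detach DOM nodes and event listeners.
  destroyRecursive() {
    this.children.forEach(c => c.destroyRecursive());
    this.children = [];
    this.destroyed = true;
  }
}

class PageManager {
  constructor() { this.current = null; }
  // Navigating to a new virtual page destroys the previous widget layer first,
  // so memory use does not grow as the user moves around the SPA.
  navigate(rootWidget) {
    if (this.current) this.current.destroyRecursive();
    this.current = rootWidget;
  }
}

const pages = new PageManager();
const page1 = new Widget("ordersView");
const grid = page1.addChild(new Widget("ordersGrid"));
pages.navigate(page1);
pages.navigate(new Widget("customersView")); // old page torn down here
console.log(page1.destroyed, grid.destroyed); // → true true
```

Because every view and pane is itself a widget, one recursive destroy call cleans up an entire virtual page.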

It is true that many in the industry have moved on to other libraries, such as Vue, React, and Next.js. However, long-term support and longevity should not be underestimated. We already have applications running on UXPages that are close to a decade old. Also, the integration and use of Dojo within XPages really gave it a bad reputation, at least within the Domino community. However, it should be noted that even in the latest version of XPages, the version of Dojo is extremely out of date (currently, it is eight versions behind the latest release). When the current version of Dojo is used as we use it, it is capable of generating a modern and engaging UI experience.

As UXPages evolved, we added data binding with the APIs, utilized newer ES6 features of JavaScript, and added listeners/broadcasters to widgets so they can talk to each other and exchange data. The evolution is still continuing. So, what's next for UXPages? Possibly PWA capabilities and more. However, for now the components that make up iPhora Application Designer form the foundation of our No-code/Low-code iPhora AppBuilder product, which will be described in a future blog entry in this series.
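The listener/broadcaster pattern can be sketched with a tiny publish/subscribe bus. This is modeled on the style of dojo/topic, but the bus below is our own simplified illustration, and the topic name is hypothetical.

```javascript
// A minimal topic bus: widgets broadcast on a topic, and any widget that has
// subscribed to that topic receives the data, with no direct references
// between the two widgets.
const topics = new Map();

function subscribe(topic, listener) {
  if (!topics.has(topic)) topics.set(topic, []);
  topics.get(topic).push(listener);
}

function publish(topic, data) {
  (topics.get(topic) || []).forEach(fn => fn(data));
}

// A "detail pane" widget listens for selections made in a "grid" widget.
let received = null;
subscribe("order/selected", data => { received = data; });

// The grid widget broadcasts when the user clicks a row.
publish("order/selected", { id: 42 });
console.log(received); // → { id: 42 }
```

Decoupling widgets this way means a grid, a chart, and a detail pane can all react to the same event without knowing about each other, which keeps the layered widget architecture loosely coupled.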


Next time, The iPhora Journey - Part 4 - JSON is King

The iPhora Journey - Part 3 - Creating an Integrated UI Framework

The iPhora Journey - Part 2 - Domino, the Little Engine that Could

The iPhora Journey - Part 1 - Reimagining Domino

Tuesday, July 26, 2022

The iPhora Journey - Part II - Domino, the Little Engine that Could


We have been working with Notes and Domino since 1995, back when it was still called Lotus Notes. The name "Domino" did not come into the picture until a few years later when the Notes server was renamed. Our company, ReCor, provided computer-based training for Notes and Domino, and our course offerings included training for application development, administration, and end-users. After the early 2000's, demand for new installations of Notes and Domino began to decline, along with the demand for training.

At this point we decided to start a new company, one that focused on IT Support and application development, but with an emphasis on Domino. This is how Phora Group began. Our initial problem was that all of our customer contacts were for training, and while the ReCor customer base was extensive and included many very large companies, those customers were already supported by either internal staff, IBM, or IBM Business Partners. Therefore, our focus had to be on new customers and figuring out what new customers needed.

Our new customers tended to be smaller than what we were used to, but there were many of them and we did not have to contend with internal IT and development staff. We created a digital workspace -- although that term did not exist then -- called the Integrated Business Framework and this was a Notes client-based solution. It included a number of standard modules, including CRM, sales order, inventory, technical support, project management, workflow, and others. As our customer base grew, we added more modules to provide support for an ever-increasing range of business functions.

Over time, support and scalability became a problem because each new module meant changes to the design of multiple databases to accommodate the new features. This is because all modules had to be selectable from within any other module. We resorted to using features like embedded views, which soon became a nightmare to manage.  Moreover, the Notes client is something of a UI beast, and it is extremely difficult to create the kind of user interface that non-Notes users expect. Customers had already become familiar with UIs from other software and even from web applications. For a long time, Chris Blatnick maintained a blog called "Interface Matters" where he described different approaches to improving the look and feel of applications running on the Notes client.


During that time, we created some exceptional Notes client interfaces like the one above, but each one required a massive amount of work. With traditional Notes development, the UI elements and business logic are contained within the form and the data is stored in a document, and the integration between form and document is so complete that it is almost impossible to conceptualize one without the other. Not only that, but there is no centralized place within a form for the UI and logic. They could be in a set of Action buttons or hotspots or agent calls or even in events, such as the QuerySave event. Some code might run on the client and some on the server.

With the rise of web applications, XPages was introduced as the preferred method to take Domino applications to the web. However, XPages was still dependent on complete integration between form and document, an approach that became more and more cumbersome to maintain.

This binding between the form and the data may have worked in the Notes client, but that is not how modern web applications are designed and the linkage makes it extremely difficult to apply the latest web technologies to Domino. It became clear -- at least to us -- that the link had to be severed in order to move forward.  Separating the form and the data allows the developer to more easily create the UI and manage the data. It also means abandoning the Notes client. For us, all applications would now be web applications. This was a watershed moment.

We started by experimenting with an API-driven model rather than the client/server model used with the Notes client. All interaction between the web client and Domino server would be through server-based agents. This was easier to maintain and made the data more manageable. 
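A hedged sketch of what that API-driven interaction looks like from the browser: a Domino agent is invoked over HTTP with the standard ?OpenAgent URL command. The database name, agent name, and parameters below are hypothetical placeholders, not actual iPhora endpoints.

```javascript
// Build the URL for a Domino agent call. ?OpenAgent is the standard Domino
// URL command for running a server-based agent over HTTP; any extra query
// parameters are available to the agent at runtime.
function agentUrl(db, agent, params) {
  const query = new URLSearchParams(params).toString();
  return "/" + db + "/" + agent + "?OpenAgent" + (query ? "&" + query : "");
}

// In the browser, the web client would then call the agent with, e.g.:
//   fetch(agentUrl("orders.nsf", "getOrders", { status: "open" }))
//     .then(r => r.json())
//     .then(data => renderOrders(data)); // renderOrders is hypothetical
console.log(agentUrl("orders.nsf", "getOrders", { status: "open" }));
// → /orders.nsf/getOrders?OpenAgent&status=open
```

All of the business logic lives in the server-based agent; the client only sends requests and renders the JSON that comes back.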

This leads to the obvious question of why use Domino if the focus is only on web applications? There are plenty of platforms out there that are faster, newer, more web-centric, and in many cases open source. For example, you could design a web application using a combination of Nginx/Express/Node/MongoDB. So why use Domino?

First, it is important to look at the essential components of a web application:

  • User and Data Security
  • Web Engine
  • Business Logic
  • Database

As we mentioned in part 1 of this blog series, Domino excels at:

  • Data Security
  • Master-to-Master Replication
  • Integrated Web Application Server

If we went with the Nginx/Express/Node/MongoDB combination, there would be no built-in user security or data security. It would fall upon us to create it. With the current climate of hacking and malware, we would need an entire staff to develop and maintain just the core security features. With Domino, that core security responsibility is handled by HCL, which means that we don't have to spend precious resources creating and maintaining that security infrastructure. Not only that, the Domino security model is already highly granular, including user-level, database-level, document-level, and even field-level security. No competing web database technology has anything similar.

Domino also includes an integrated web application server, which frees us from having to create all the code that is necessary to connect the different components: web server to security infrastructure to database. Domino brings all that is needed into one cohesive platform that can literally run on even the smallest of servers. Because of Covid, we downsized our offices, and the server box pictured was our operational Domino server. In many cases, a single-server Domino web application is fully sufficient for a small business. It represents a complete solution.



If you have ever shopped for a house, you have probably heard the phrase "good bones" used to describe a certain property. This refers to an older house that may have some appearance/image issues but is solidly built. Well, Domino is like that. It may not be the first choice that springs to mind for a web platform, but it has great bones. It is the little engine that keeps on chugging.

So, is it possible to take what Domino has to offer and transform that into a modern web application platform? Can we extend the Domino access and security models to provide an even more secure platform focused exclusively on web access? Could we implement a new vision for Domino, a vision that includes:

  • Web API-based with JavaScript SPA
  • JSON-based application engine
  • JSON-based database
  • Flow-based processing engine
  • Graph API
  • Security fidelity even higher than what Domino provides
  • Portable to other platforms

So that was our quest. Taking Domino from the Little Engine that Could to the Little Engine that Can.

Next time, The iPhora Journey - Part 3 - Creating an Integrated UI Framework.


Tuesday, July 19, 2022

The iPhora Journey - Part I - Reimagining Domino

Domino, which is currently owned by HCL Technologies, is one of the most enduring application platforms ever built. It owes its existence to Ray Ozzie, who was heavily influenced by his use of the PLATO system, a pioneering interactive/educational network at the University of Illinois. The first version of Domino (then called Lotus Notes) was released in 1989, and Domino applications from 1989 can still run on the newest version (12.0.1). You can build Domino applications for deployment on the Notes client, mobile devices, or web browsers, and for programming and customization, you can use Node.js, Java, LotusScript, and Formula language, or any combination of them. Regardless of which programming languages are used, a typical application is usually represented by a single Domino database.

However, the IT landscape has changed significantly since Domino was a dominant player in the market. The migration to web and mobile applications using cloud-based solutions has led to the steep decline of traditional client-server architectures. This, combined with the rise of no-code / low-code development tools for web and mobile applications, has made Domino more of a legacy system than a cutting-edge platform. Easy integration with external services via APIs is now a must, and deployment time is now measured in minutes, not days. Companies are implementing digital workspaces, allowing users to browse through the offerings in an application store, select, pay (if needed), and deploy the application to their workspace all in less time than it takes to get a cup of coffee.

Over the past few years, HCL has made a number of great improvements to the Domino platform, which can now be deployed on Docker and Kubernetes. However, the application development process, whether for Notes clients or Web browsers, really has not changed much at all. Fifty percent of all Domino development still centers around Notes client applications. This is fine for the existing Domino community, but having to install a large client is a very hard sell to new customers.

If we are to reimagine Domino, the first step is to identify the things that Domino excels at. These happen to be the same things that Domino did well from the beginning:

  • Data Security
  • Master-to-Master Replication
  • Integrated Web Application Platform
  • Multiple programming tools with tight integration to data stores

As a fully integrated application server, Domino provides an advantage in ease of deployment and manageability. Of course, this can also be an Achilles heel when it comes to upgrading to the latest versions of critical technologies like Java. The tight integration makes it more difficult to upgrade the individual components that make up Domino.

The next step is to identify new features and capabilities that are desirable to customers today:

  • No-code/low-code development
  • API-driven platform
  • Web and mobile client support
  • Easy private/public cloud deployment
  • App store delivery
  • Easy integration with external services

These are the features that are needed in order to keep Domino relevant in the future. Competitive database solutions, such as MongoDB, RavenDB, and Couchbase, all interact well with a variety of UI and server technologies to achieve these capabilities, while Domino struggles with such integration because of difficulties upgrading to the latest versions of critical components or simply lacking support for certain capabilities. By the way, as NoSQL solutions, all of those database products owe a debt of gratitude to Domino, which pioneered that technology in the corporate world.

This is part one of a fourteen-part series describing our long journey in redefining the Domino platform and how to use it to meet the expectations of today's customers -- specifically new customers. Our focus is totally on what new customers are looking for, not what existing Domino customers expect.

Domino may not be the ideal platform for a modern web application, but at least it provides a solid foundation built on security, data storage, and replication, all nicely integrated. Like all technologies, Domino has many shortcomings. The questions we had to ask ourselves are these: Do the shortcomings outweigh the benefits? Are there workarounds to mitigate the shortcomings, or even turn them into advantages? How can we apply industry best practices for web design to the existing Domino architecture? During this journey, these questions and many others came up again and again.

We discovered early on that it was necessary to abandon all existing notions of what an application is in Domino. The traditional Notes client application is simply not compatible with modern web technologies. A new design approach and a new application definition was needed, one that focuses on web/mobile deployment. With that in mind we started our journey.

Next time, Part II - Domino the Little Engine that Could

Thursday, July 7, 2022

CollabSphere 2022, October 19-20, 2022, "Our Community, Our Stories"

 


I am pleased to announce that CollabSphere 2022 will be held October 19-20, 2022. Registration, sponsorship, and abstract submission will open the week after next. New for this year is the addition of the CollabSphere Discord channel, where the conversation and learning start before and after the conference. All attendees are invited to join us. We will have online activities on Discord, so make sure you join. The conference is about you and your story, and we want to hear your story. More about this in the near future. The cost for CollabSphere 2022 will be $150 USD, with a discount of $150 USD, for a total cost of $0 USD.

Thanks again to the folks at Prominic.net for hosting our web site.

https://collabsphere.org

CollabSphere 2022 Presentation: COL103 Advanced Automation and Cloud Service Integration for your Notes/Nomad Applications

Our presentation for CollabSphere 2022. Learn how to quickly add workflow automation into Notes/Nomad applications using No-code tools that ...