Friday, September 16, 2022

The iPhora Journey - Part 7 - Transforming Domino with Microservices

Most of the concepts that we have been talking about in our iPhora Journey are neither new nor revolutionary. Collectively, they merely represent the current state of the art in designing web applications.

In other (i.e., non-Domino) platforms, you would need an array of services, each running on a separate server, and you would have to build the integration between the different components and maintain security between all of them. With virtualization and containers, all of the necessary services can now be bundled into a single installation, which is helpful, but the integration of those components can still be fraught with issues, especially compatibility between different versions of those services.

With Domino, all of this is taken care of for you: authentication, directory services, security, web services, and database services are all tightly integrated in a single server, which is an advantage that no other technology offers. While Domino has its quirks and does not support many of the latest features, hopefully HCL will be addressing those in the near future. Even as it stands right now, the simple addition of nginx as a proxy in front of Domino addresses most of the deficiencies.

The iPhora platform is built on Domino, and in doing so we take advantage of the inherent integration of the Domino services. But we also incorporate design methodologies that are usually associated with non-Domino applications. With this approach, iPhora has the capabilities and features of a modern web application, but all it needs to run is a single server with the most minimal of hardware requirements. We challenge any vendor to do everything that iPhora does on a box this small.


Creating a modern application is not solely about using the latest technology to build it. Well-supported technologies with a long-term commitment to maintenance are always preferable to the latest design fad. The important question is what you do with that technology.


The iPhora concept evolved over the span of 15 years, as we developed applications for organizations both large and small and gained experience with how companies handled their internal business processes. Step by step, we abandoned the monolithic approach normally used when creating Domino applications in favor of a more fluid, plug-and-play architecture. Eventually, we wound up with a service-based, loosely coupled architecture: a microservice architecture. It should be noted that this applies to the iPhora components that we developed, not to the underlying Domino services.


Wikipedia defines microservice as:

"A microservice architecture – a variant of the SOA (service-oriented architecture) structural style – arranges an application as a collection of loosely-coupled services. In a microservice architecture, services are fine-grained and the protocols are lightweight. The goal is that teams can bring their services to life independent of others. Loose coupling reduces all types of dependencies and the complexities around it, as service developers do not need to care about the users of the service, they do not force their changes onto users of the service. Therefore it allows organizations developing software to grow fast and big, as well as use off-the-shelf services more easily. Communication requirements are reduced. These benefits come at a cost to maintaining the decoupling. Interfaces need to be designed carefully and treated as a public API."

Using the programming tools available in Domino, we were able to build a microservice architecture that passes data in and out using a common bus and transport schema. Each microservice is an isolated black box with only input and output.

Remember that in a previous blog I mentioned the importance of the JSON format and the ability to quickly process JSON. The reason is that the data transport layer between microservices is all JSON. By having all input and output defined as JSON objects, we achieve a standardized and easily extensible communication layer. This allows microservices to be independent and reusable by multiple services and processes. A typical process can chain together a series of microservices to accomplish a specific and highly complex task, as opposed to writing a custom service to accomplish the same thing.
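The chaining idea can be sketched in JavaScript. This is purely an illustration of the pattern, not iPhora's actual code: the service names (validate, enrich, route) and the message fields are invented for the example.

```javascript
// Each microservice is a black box: a JSON object in, a JSON object out.
// Service names and message fields here are hypothetical.
const validate = (msg) => ({ ...msg, valid: typeof msg.payload === "object" });
const enrich = (msg) => ({ ...msg, source: "crm" });
const route = (msg) => ({ ...msg, queue: msg.valid ? "main" : "errors" });

// A "process" chains a series of microservices to accomplish a complex task.
function runProcess(services, input) {
  return services.reduce((msg, service) => service(msg), input);
}

const result = runProcess([validate, enrich, route], { payload: { id: 1 } });
```

Because every service consumes and produces the same JSON shape, services can be reordered or reused in other processes without modification.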

But how do you connect these different microservices together to create your application? That is the subject of our next discussion: flow-based programming.

Tuesday, September 6, 2022

The iPhora Journey - Part 6 - An Application, Rethinking and Redefining

In our previous articles, we discussed the advantages of using JSON in applications and the importance of being able to effectively process JSON. This leads to an interesting question: what exactly is an application? The answer may differ depending on who you ask. Most people today would assume you are referring to a phone app, and even then there is a wide range of options. A college student might think of apps like TikTok or WhatsApp, and a person looking for a significant other might automatically think of Hinge or Tinder. All of these people would be able to provide you with multiple examples of apps, and they might be able to list some of the common features of an app, even if they do not understand any of the technical details.




In general, a web app today consists of the following four modules, even if the technical details may differ tremendously from app to app:

  • User interface (UI)
  • Database
  • Business Logic
  • Security

Now, let's consider what an app is in the traditional Notes/Domino sense. Here, an application still consists of the same four modules, but everything is mixed together (actually, smeared together may be a better term) rather than separated. A Domino application consists of one or more Domino databases, and access and security are defined within each database using ACLs. This is true for both Notes client and web-based applications created on Domino -- so far, so good. However, when it comes to the UI and Database modules, Domino uses the concept of a form, which defines both the user interface and the data structure for storage, modules that are typically separated in more modern apps.

Moreover, the business logic is programmed directly into the actual UI components all over the form. Even something as simple and basic as controlling which UI elements are visible (i.e., hide-when formulas) requires that the programming logic be dispersed throughout the form, potentially associated with each and every UI component.

This approach can work for stand-alone applications or situations involving only a few applications. However, as the number of applications increases, providing the infrastructure to effectively create and manage the applications and user access becomes more and more difficult. This is especially true for Domino applications, because the business logic can be implemented in so many different ways and in so many different places.

The complexity of Domino applications tends to increase even further because the data is distributed throughout multiple databases using different data structures. The more different forms an application uses, the more views are needed and the more indexing is required, resulting in the need for more storage space.

So our primary goal was to structure iPhora to follow a modern application architecture while still providing the extensive data security one expects from Domino. We wanted to be able to build and manage hundreds of applications with the same ease as managing one. Almost all of the development projects that we have been involved with so far began with one critical application that eventually expanded to dozens of related applications, each with different sets of users and different ACL requirements. For one customer, that one initial application grew into 50 applications and over 100 processes, each with different roles and user access requirements, all accessed and managed within a single instance of iPhora. Yet even as more and more applications were added, no additional Domino databases were needed.

An iPhora installation uses only seven Domino databases, each serving a specific purpose within the iPhora architecture: 

  • API / GateKeeper
  • Attachment Store
  • Business Logic
  • Data Store
  • Logging
  • UI Interface
  • User Profile

In other words, there is no longer a one-to-one relationship between applications and Domino databases. Any number of applications, including widely different types of applications, can be accommodated using this structure.

Since data security is of the utmost importance for iPhora, it was extremely important that users never have direct access to the data, which is contrary to how a traditional Domino application works. We wanted all of the databases to be totally locked down. Our API consists of LotusScript agents that interact with the data according to strict protocols. In general, the only databases that users interact with are the one that holds those agents and the UI database. The other databases cannot be opened using Domino URL commands.

Since data is stored as JSON, the data store module utilizes a single Domino form to store all records regardless of the application, although the fields may differ from one record to another. Any attachments that are uploaded into a data store document are actually stored in a separate attachment store database, which also utilizes a single Domino form. With DAOS, we can handle 1 TByte of attachments, and if more capacity is needed, we can simply add another attachment store.

For companies that already have data in Domino databases, iPhora has the ability to read and display data from other Domino databases, including databases that are structured in the traditional Domino way. The only requirement is that the form structure be defined and added to iPhora.

In conclusion, by re-thinking and re-imagining what an application is in Domino, Domino becomes a scalable application platform on which to build and run secure applications that utilize a modern design framework.  The advantages of a structured and scalable application architecture become apparent as one moves towards utilizing no-code/low-code tools to build applications, something we will consider later in this series. 

How we access and communicate with external services, including non-iPhora Domino applications, is the subject of our next discussion: Microservices and Nanoservices.

Saturday, August 20, 2022

The iPhora Journey - Part 5 - Dammit Jim, I'm a LotusScripter not a JavaScripter



As often said by Dr. McCoy in the original Star Trek series, he is a doctor, not a ____________. However, just like Dr. McCoy, you may sometimes need to work on things that seem very alien to your experience. One of those things might be JSON. LotusScript, which was derived from Visual Basic, was written long before JSON existed, and therefore it had no built-in capabilities for handling JSON objects.

As we mentioned in Part 4, JSON plays a critical role in iPhora. All data are stored as JSON, and JSON serves as the primary data and communication format between modules, functions, and services. All core components operate using JSON-based configurations. Therefore, it was extremely important that we be able to fluidly create, read, and process JSON.

Creating a JSON string is relatively easy in any programming language. You could create one even using Commodore 64 BASIC, and if built sequentially, one line at a time, it is possible to create JSON representations of very complex objects. The string manipulation capabilities of LotusScript are more than adequate to build large JSON objects in this way. Reading a JSON string can also be accomplished using the same set of LotusScript string functions, but it then requires a significant amount of filtering and manipulation to locate and retrieve the data you are looking for. The task becomes more and more difficult as the JSON becomes larger and more complex.
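To make the point concrete, here is the sequential string-building approach shown in JavaScript (the data is invented for the example); the same technique works in LotusScript with string concatenation:

```javascript
// Building a JSON string sequentially, one piece at a time (illustrative data).
let json = "{";
json += '"name":"iPhora",';
json += '"modules":["UI","Data","Logic"]';
json += "}";

// The string is valid JSON, but reading a single value back out means
// either parsing the whole thing or doing fragile string searches.
const obj = JSON.parse(json);
```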

Starting with HCL Domino 10, LotusScript provided some basic components to read and assemble JSON, known as the NotesJSONNavigator class, and this was a major improvement. It effectively replaced the tedious process of string manipulation and treated JSON as objects for reading and writing. However, beyond the simple objects and arrays it was designed to handle, the tools provided within LotusScript are still very limited, especially compared to JavaScript.

As JavaScript developers, we wanted the LotusScript JSON parser to reflect the JSON processing functionality that was found in JavaScript. Since we had lots of experience manipulating JSON on the front-end (using JavaScript), that set our expectations for what was needed on the back-end. We modeled our approach on JavaScript, and that determined the methods that we had to implement in LotusScript.  After many years and many versions, we finally came up with the Flex JSON Parser.

So why did we call it the Flex JSON Parser? Because it was more flexible than our many other attempts at creating a JSON parser. Below is a comparison of the functions between JavaScript and the Flex JSON Parser. Note that spec is simply a string that uses dot notation to refer to a specific JSON object, array, or key name, regardless of how many levels down it might be; for example, it could be "[0][25].grass.type".

JavaScript                                Flex JSON Parser

Create a JSON Array
var a = [];                               set a = new JSON(10)
var a = JSON.parse("[]");                 Call a.parse(|[]|)

Create a JSON Object
var a = {};                               set a = new JSON(10)
var a = JSON.parse("{}");                 Call a.parse(|{}|)

Convert JSON Object to JSON String
var b = JSON.stringify(obj);              b = obj.stringify()

Enumerate an Object
Object.keys( obj );                       obj.getKeys(spec)

Push Array
objs.push( obj )                          objs.push( spec , obj )

Push Apply Arrays to Arrays
objs.push.apply(objs, bObjs )             objs.pushapply( spec , bObjs )

Length of Array
objs.length;                              objs.getCount(spec)

Get Value of Key
var a = obj["hello"];                     a = obj.getValue("hello")

Set Value of Key
obj["hello"] = a;                         obj.setValue("hello", a)

We also adopted some methods that we utilize from the Dojo Toolkit framework:

Mixin Objects
lang.mixin( aObj , bObj )                 aObj.mixin( bObj )

But the most important methods, the ones that allow for significant processing of JSON with the Flex JSON Parser, are:

set obj = objs.getElement( spec )
Call objs.setElement( spec , obj )
index = objs.getIndexByKeyValue(spec, key , value)

These functions allow us to extract JSON objects from larger JSON objects and then to insert them into specific locations within other JSON objects.
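To illustrate what a spec-driven getElement makes possible, here is a rough JavaScript re-creation of the idea. This is our own sketch of the concept, not the actual Flex JSON Parser implementation (which is written in LotusScript):

```javascript
// Sketch: resolve a dot-notation spec such as "[0].grass.type" against
// a nested structure of objects and arrays.
function getElement(root, spec) {
  // Tokenize "[0].grass.type" into: "[0]", "grass", "type"
  const tokens = spec.match(/\[\d+\]|[^.[\]]+/g) || [];
  return tokens.reduce((node, token) => {
    if (node == null) return undefined;
    const key = token.startsWith("[")
      ? Number(token.slice(1, -1)) // array index
      : token;                     // object key
    return node[key];
  }, root);
}

const data = [{ grass: { type: "fescue", height: 3 } }];
const value = getElement(data, "[0].grass.type"); // "fescue"
```

The same tokenizing approach supports a setElement counterpart, which is what allows a JSON fragment to be extracted from one structure and inserted at an arbitrary depth in another.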

With the Flex JSON Parser we can manipulate and process JSON objects that include a mix of objects and arrays at any level. It gave us the processing capabilities that defined how iPhora works and opened up new approaches that we once assumed were not possible in LotusScript. So a LotusScript developer can now process JSON like a JavaScript developer.

Watch a comparison between JavaScript and the Flex JSON Parser in action:
https://vimeo.com/741483881





Wednesday, August 10, 2022

The iPhora Journey - Part 4 - JSON is King - The How



As we mentioned yesterday, in reimagining Domino we wanted Domino to be a modern web application server, one that utilizes a JSON-based NoSQL database and is more secure than other JSON-based NoSQL platforms.

A Domino document existing within a Domino database is the foundational data record used in iPhora, just as it is in traditional Domino applications. But instead of just storing data in individual fields, we wanted to store and process the JSON in a Domino document. However, text fields (AKA summary fields) in Domino documents are limited to only 64 KBytes, and that is a serious limitation. 64 KBytes of JSON data does not even touch what the real world typically transfers back and forth between browser and server. We looked into the possibility of using rich text fields to store JSON data, but such fields are messy to work with, and the rich text storage format contaminates the data with carriage returns and other unnecessary artifacts. Instead, we decided to make extensive use of MIME fields. We discovered that reading/writing data in MIME fields is just as fast as accessing summary fields, but they can store vastly more data -- up to 1 GByte in a single field -- without corrupting the data. Realistically, the average size of the data that we store in a MIME field is roughly one MByte.




Since our UI framework was a JavaScript SPA framework, it only seemed natural to use a RESTful API approach to communicate between browser and server, with JSON as the data exchange format. However, LotusScript web agents have an inherent limitation of being able to send and receive only 64 KBytes at a time, a limitation that is incredibly -- well -- limiting to designers. This limitation did not seem to be going away any time soon, as it even affected the NotesHTTPRequest class and the NotesJSONNavigator class that were more recently added to LotusScript (it has since been improved). To resolve these two issues, output and input, we had to experiment over and over again. For output, we ended up chunking the JSON into 64 KByte pieces and streaming them out to the browser. To receive JSON data over 64 KBytes, we had to resort to receiving the JSON data as a form post and treating the form submission document as merely a container for JSON. After working this out, we could easily send and receive megabytes of JSON data without a problem.
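The output side of that workaround can be sketched in JavaScript. In iPhora the chunking is done in LotusScript on the server; the function names and chunking logic below are ours, shown only to illustrate the idea:

```javascript
// Sketch of the chunk-and-stream idea. The 64 KByte boundary comes from
// the LotusScript agent limitation described above.
const CHUNK_SIZE = 64 * 1024;

function chunkJson(jsonString) {
  const chunks = [];
  for (let i = 0; i < jsonString.length; i += CHUNK_SIZE) {
    chunks.push(jsonString.slice(i, i + CHUNK_SIZE));
  }
  return chunks; // each piece is streamed out in sequence
}

// The browser sees one continuous response, so reassembly is simply
// concatenation followed by a parse.
function reassembleJson(chunks) {
  return JSON.parse(chunks.join(""));
}
```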

However, Domino had another nasty surprise in store for us. We were happy campers with the ability to send and receive megabytes of JSON. However, iPhora is used globally, including Asia and Europe. Therefore, support for the UTF-8 character set is extremely important. Imagine the look of undisguised horror on our faces when we discovered that incoming POST data to LotusScript web agents utterly corrupted all UTF-8 characters. This is because LotusScript Web agents receive the incoming POST data using the Lotus Multi-Byte Character Set (LMBCS) format, with no support for UTF-8. We notified HCL regarding this issue, but the complexity of the existing LMBCS implementation made it difficult to change without causing other problems. At the same time, the pressure we were receiving from Asia and Europe to provide support for UTF-8 resulted in some serious depression and near-despair at Phora Group. So it was back to the drawing board, and with the aid of Java we were able to create our own scheme to smoothly and quickly handle UTF-8 characters within LotusScript agents.

As we moved forward in the development of iPhora, JSON became the standard, not just for data exchange between servers, but also for defining security, APIs, data structures, data processing, data exchange between software components and modules, and even between functions and methods. JSON allowed us to decouple components, thus extending the concept of reusability. This is something that JavaScript supports intrinsically since JSON is inherent to JavaScript, but in LotusScript, no.

So how does one read, write, and process JSON using LotusScript? That is a good question. Over the years, a number of LotusScript JSON parsers have been created by individuals, such as the SNAPPs JSON Writer from Rob Novak and his company, and later the Turbo JSON Writer from Lars Berntrop-Bos, which was much faster. Eventually, support for JSON was added to LotusScript with a new class in Domino 10. However, that JSON support was severely limited and quite buggy. It has since been improved to handle more than 64 KBytes of data; however, that class is still cumbersome to use when you have to manipulate massive amounts of multi-level JSON data, such as data containing a mix of objects and arrays embedded 7 to 10 levels down. Processing JSON data is not just about reading it or simply creating it one line at a time. You often need to insert a large JSON object into an existing object, remove an object or element from an array, or inject data into an existing array element. From a JavaScript programmer's point of view, those are all very simple tasks. However, they are not so easy within LotusScript, nor even with Java.
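For comparison, here is what those three tasks look like in JavaScript (the data is illustrative):

```javascript
// Each of the operations described above is a one-liner in JavaScript.
const doc = { items: [{ id: 1 }, { id: 2 }, { id: 3 }] };

// Insert a large JSON object into an existing object:
doc.meta = { author: "demo", tags: ["draft", "review"] };

// Remove an element from an array:
doc.items.splice(1, 1); // removes { id: 2 }

// Inject data into an existing array element:
doc.items[1].status = "approved"; // doc.items[1] is now { id: 3, status: "approved" }
```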

As JSON became more and more important to us, we had to find a way to quickly handle and process large amounts of JSON. One alternative was to switch from LotusScript to JavaScript/Node and the Domino AppDev Pack. Unfortunately, that did not support MIME fields, which were now essential to iPhora. Over the years, our team created several different JSON parsers to help process JSON. The first versions were slow and awkward to use, but over time they got better and better. After many generations of development, we created our Flex JSON Parser, a true JSON parser that is fast and flexible. It is much more on par with how JSON is processed using JavaScript.

Next time, we will talk about the use of this JSON parser and why having such a parser was a game changer. The next edition is titled "The iPhora Journey - Part 5 - Dammit, Jim, I'm a LotusScript Developer not a JavaScript Developer (AKA Processing JSON Data with the LotusScript Flex JSON Parser)." We will do a quick comparison of JSON processing using JavaScript and the LotusScript NotesJSONNavigator classes.



Tuesday, August 9, 2022

HCL workshops coming to CollabSphere 2022

As part of CollabSphere 2022, HCL will be running three virtual workshops on Tuesday, October 18, 2022, the day before CollabSphere starts. Each workshop will be 4 hours long with breaks and will be limited to 12 attendees. You will be able to sign up for the workshops as part of the CollabSphere 2022 registration process, which starts on Monday, August 15, 2022. You will be placed on a waiting list if a workshop is full. Below is a list of the workshops that will be provided by the HCL Digital Solutions Academy:

=========================================
Tuesday - 10/18/2022 - Morning
Deploying HCL Sametime Premium 12 on Kubernetes
We'll cover setting up your own Kubernetes cluster and deploying Sametime Premium, which includes chat and meeting components. In addition to this, we'll use the time to cover network architecture, configurations, best practices, and troubleshooting.

Tuesday - 10/18/2022 - Morning
HCL Volt MX Development Jumpstart - Domino Developers Edition!
Are you a Domino Developer looking to understand how your skills can be used with HCL Volt MX? If so, the HCL Volt MX Development Jumpstart for Domino Developers Workshop is for you! Attend this class to get your very first MX application up and running.
  • Learn to integrate HCL Volt MX with a Domino back-end database
  • Develop a Photo Blog Mobile & Web Application from scratch
  • Download reusable assets to get you started quickly with development

Tuesday - 10/18/2022 - Afternoon
Deploying HCL Domino on Kubernetes
Wondering how to install and configure your Domino server on Kubernetes? Are you looking to learn how Domino can help your organization in their Cloud Native objectives? Or do you just want to know what namespaces, pods, and containers are or why you should care? If you answered Yes to any of these questions, this workshop is for you! After attending this session, you will be familiar with the fundamental aspects of Kubernetes as well as everything it takes to run not one but multiple Domino servers in a Kubernetes cluster.


The iPhora Journey - Part 4 - JSON is King - The Why


JavaScript Object Notation (JSON) was defined by Douglas Crockford in the early 2000s as a data exchange format that grew out of JavaScript. JavaScript code can in fact be structured as JSON data, and even the HTML DOM can be represented as JSON data (more about that in a future blog).




JSON has now become the dominant data exchange format, superseding XML. It provides data exchange between the server and JavaScript-based clients, including web browsers, and it is also used in web services and webhooks. It is not only human-readable but also lightweight. Databases such as MongoDB and RavenDB use JSON as the data storage format, and there are a number of derivatives, such as BSON, a binary version of JSON used in MongoDB.

More and more companies are relying on either cloud-based or external web services to operate their business, and the vast majority of those web services rely on JSON. So being able to process, store, and manipulate JSON is essential for any web application server. When we started this journey, the only thing in Domino that supported JSON was the ReadViewEntries call (with OutputFormat=JSON), and that was it.

As part of our quest to reimagine Domino, we wanted to have Domino behave like a modern web application server, specifically a JSON-based NoSQL database platform similar to MongoDB. Note that Domino has many advantages over competing database technologies. A Domino document can be up to 2 GBytes in size, while MongoDB has a limit of 16 MBytes per document. Domino also supports master-to-master replication and has a more robust and more granular data security model. So we had to figure out how to send, receive, and store JSON data in Domino.

Another factor in our decision to transform Domino into a JSON-based NoSQL platform was portability. At the time, it was not clear how much longer Domino would be supported by IBM (a question that was happily resolved when HCL acquired Domino), so we needed a design approach that was platform-agnostic. Again, a JSON-based NoSQL design (one that was API-driven) would achieve that. It would provide separation between the user interface and the data store, making it much easier to move to an alternative data store if necessary.

We decided to break this part into two postings since it was getting too long. So, tomorrow: The iPhora Journey - Part 4 - JSON is King - The How.



CollabSphere 2022 Presentation: COL103 Advanced Automation and Cloud Service Integration for your Notes/Nomad Applications

Our presentation for CollabSphere 2022. Learn how to quickly add workflow automation into Notes/Nomad applications using No-code tools that ...