I believe that parents fall into two categories during the holiday season: those who are buying Legos by the pound in the belief that they are inspiring their children to be imaginative and develop engineering skills, and those who have banned Legos because the piles from previous years are taking up more space than the bedroom furniture. The genius of Legos is undeniable. The most complex creations can be built by just about anyone, and since the process of creation is so satisfying, everyone gets hooked. I am sure I went through a period when all I wanted to do “when I grew up” was work for Lego, designing new creations.
These days, automation and deriving insights from data define high-performance businesses. As organizations seek to develop these capabilities, I often hear executives who want to “train everyone in programming” and then contract basic classes in Python (perhaps under the guise of Anaconda or Databricks training) for all their analysts and decision-makers. If we assume learning to program is at least as hard as learning a foreign spoken language, then a few weeks of instruction should get us to the programming equivalent of reciting the alphabet and finding the location of a restroom. I doubt that is enough to build a successful business. Further, not everyone enjoys studying the grammar of spoken languages, and the assumption that every analyst will enjoy working with the “grammar” of programming (i.e., syntax, design patterns, algorithms, and architectures) seems like a stretch. Does this mean the dream of universal programming ability, or the “citizen data scientist,” is unreachable?
If we can make programming as simple as logical problem-solving, then organizations could achieve the dream of data-driven insights being generated by every analyst and decision-maker they employ. I haven’t met anyone who can’t create a flowchart of how they would solve their current problem. Outlining the functional step-by-step workflow to get from A to Z is just part of being human. Expressing that flowchart so a computer can automate it is the sticking point. So why not solve this challenge with the ready-made building blocks that can be assembled, the way we do with Legos? It would be wonderful to have the ability to work with Legos in our professional lives!
This is what low-code platforms are trying to give us. Programming with a low-code platform becomes a visual process of dragging functional components from a library and dropping them on a palette, arranging and connecting them into a complete workflow. The library of components provides the pile of Legos, and the workflow lets you stack them and create anything you can imagine.
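To make the building-block metaphor concrete, here is a minimal Python sketch of the idea: each “step” is a reusable, pre-built block, and a workflow is just an arrangement of configured steps. All of the names here (read_rows_step, filter_step, aggregate_step) and the sample data are hypothetical, invented for illustration; they do not correspond to any particular platform’s components.

```python
# Toy illustration of the low-code idea: each "step" is a reusable
# building block, and a workflow is an ordered arrangement of steps.
# All step names and data here are hypothetical.

def read_rows_step(rows):
    """Source step: in a real platform this would load a file or table."""
    return list(rows)

def filter_step(rows, predicate):
    """Row-filter block, configured by a condition rather than hand-written code."""
    return [r for r in rows if predicate(r)]

def aggregate_step(rows, key):
    """Group-and-count block."""
    counts = {}
    for r in rows:
        counts[r[key]] = counts.get(r[key], 0) + 1
    return counts

# "Dragging and connecting" the blocks into a complete workflow:
data = read_rows_step([
    {"region": "east", "amount": 120},
    {"region": "west", "amount": 80},
    {"region": "east", "amount": 45},
])
large = filter_step(data, lambda r: r["amount"] > 50)
summary = aggregate_step(large, "region")
print(summary)  # {'east': 1, 'west': 1}
```

The point is not the code itself but the shape: the user never writes the blocks, only selects, configures, and connects them.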
Specifically, low-code platforms reduce the burden of programming in several ways, opening this world to non-experts. Most obviously, the steps (Lego building blocks) wrap and abstract away all the low-level code to handle tasks of any complexity. Users learn to configure steps through purpose-built, intuitive dialogs which expose the relevant documentation right where it applies. This keeps users from searching in vain for help when they barely understand underlying concepts enough to form useful search queries. A week of training is also much more enabling for the “citizen data scientist” than general programming training because complexity is abstracted away.
Further, these platforms usually allow users to inspect the results of computations at any step in the process, watching incremental transformations unfold without needing to understand things like “heaps” or “Integrated Development Environments” and their debugging machinery. Best practices are built right into the platform and its components, so users can take advantage of parallelism, distributed/asynchronous processing, and elastic scaling without needing to understand, for instance, what a “race condition” is or how to share resources coherently. Even for skilled programmers, the speed of prototyping possible with a low-code platform will outpace custom code development. The overhead of low-code platforms is not always acceptable in production, and not every programming shortcut or algorithmic trick is supported. Still, I find a great way to justify the cost of custom development is to show immediate value with a rapidly functional proof of concept. Often that prototype scales so much better than the manual process it replaces, and adds value so quickly, that building an optimized custom solution becomes a lower priority than prototyping the automation of a different business process.
More people being able to build programs and contribute to automated solutions for their work will lead to more innovation and faster progress. I believe an even bigger advantage lies in knowledge capture and transfer. Organizations fear losing years of experience when a long-time employee departs, and they furiously try to document all the lessons learned, tips, tricks, hunches, and more locked in that employee’s head, only to find the documented narrative is rarely useful when the nuggets inside it are needed. Developers have nightmares of being handed thousands (often millions) of lines of code with little documentation and being told to “figure it out.” Visual flowcharts of every process, especially when annotated to explain why each step is used, communicate with precision and at a functional level that doesn’t require tracing code. When those flowcharts ARE the program, documentation is built in! Versioned workflows expose how a process has changed over time, allowing audits across time. Some of the best low-code platforms exploit the structure of many workflows together, learning how an organization operates and using this knowledge to accelerate the creation of new workflows. For example, if most users who start with step “A” follow it with the same step “B,” the platform can suggest “B” to other users who start with “A.” Corporate knowledge becomes an exploitable asset!
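The step-suggestion idea above is, at its simplest, a matter of counting which step most often follows which. Here is a small Python sketch under that assumption; the workflow histories and step labels are invented for illustration, and real platforms likely use far richer signals than these pair counts.

```python
from collections import Counter, defaultdict

# Hypothetical histories of past workflows, each a sequence of step labels.
histories = [
    ["A", "B", "C"],
    ["A", "B", "D"],
    ["A", "C"],
    ["X", "A", "B"],
]

# Count, for each step, which step followed it in past workflows.
follow = defaultdict(Counter)
for wf in histories:
    for cur, nxt in zip(wf, wf[1:]):
        follow[cur][nxt] += 1

def suggest(step):
    """Return the most frequent follow-on step seen after `step`, if any."""
    if not follow[step]:
        return None
    return follow[step].most_common(1)[0][0]

print(suggest("A"))  # 'B' (followed "A" three times, versus "C" once)
```

Even this naive frequency model captures the post’s example: most users who placed step “A” went on to place step “B,” so “B” is what gets suggested.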
Expert programmers benefit from low-code platforms as well. No one is an expert at everything nor enjoys programming “boilerplate” functionality. A low-code platform that allows custom code to be integrated either as new “Lego” building blocks or just embedded in a “custom code” step lets developers spend all their time focusing on the part of a solution that requires their expertise. The library of low-code building blocks can quickly provide the parts of an application that aren’t unique. This lets organizations focus on what unique value they provide and keeps developers from being bored with mundane tasks.
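A rough sketch of that division of labor, in Python: the library blocks stand in for the platform’s ready-made components, a generic “custom code” step is the escape hatch, and only one function is actually hand-written by the expert. Every name and the sample data here are hypothetical.

```python
# Mixing off-the-shelf blocks with one hand-written custom-code step.
# load_step and dedupe_step stand in for platform library components;
# proprietary_score is the only piece a developer actually writes.

def load_step():
    """Library block: stands in for loading data from a real source."""
    return [{"id": 1, "value": 10}, {"id": 1, "value": 10}, {"id": 2, "value": 3}]

def dedupe_step(rows):
    """Library block: drop exact duplicate rows, keeping first occurrence."""
    seen, out = set(), []
    for r in rows:
        key = tuple(sorted(r.items()))
        if key not in seen:
            seen.add(key)
            out.append(r)
    return out

def custom_code_step(rows, fn):
    """Generic escape hatch: run arbitrary user code inside the workflow."""
    return [fn(r) for r in rows]

def proprietary_score(row):
    # The one piece of unique, expert-written logic in the whole pipeline.
    return {**row, "score": row["value"] * 2}

result = custom_code_step(dedupe_step(load_step()), proprietary_score)
print(result)
```

The boilerplate (loading, deduplicating) comes from the library; the developer’s time goes entirely into `proprietary_score`.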
Low-code platforms are not yet fully realizing the “ideal” assumed throughout this post, and care must be taken in selecting which is appropriate for an organization’s needs. The level of abstraction possible, the library of processing “steps” available, the extensibility/flexibility/maturity, the amount of platform overhead, deployment metaphors supported, and support community robustness are all very different between current platforms.
KNIME and UiPath are close partners of NuWave Solutions that offer full-featured low-code platforms, although I would argue our ETL/BI partners, including Qlik, Talend, Clover, Informatica, and Oracle, all offer variants of low-code platforms for data-oriented businesses. KNIME focuses on data science use cases, while UiPath focuses on automating desktop operations. Because it targets data scientists, who tend to double as programmers, KNIME requires a bit more skill, but it is the most capable platform I have experienced and the one I use most often these days. UiPath is working hard to give anyone who can use a computer the ability to create automations by letting “Artificial Intelligence” watch them work. Both offer unlimited open-source versions to get you started, with enterprise offerings for heavy workloads and team collaboration.
I use KNIME almost constantly these days and find it is the only platform able to scratch every diverse itch without burdening me with little frustrations. KNIME first attracted me with its open-source, Java-centric approach, since I still see myself as a Java programmer. I stayed and committed to the platform because it outperforms alternatives in several areas. KNIME is easy for programmers (Java, Python, or R) to extend, and because of this it has a huge library of steps and integrations. I often use KNIME as a visual interface to Spark, H2O.ai, TensorFlow, Amazon Web Services (AWS), and other frameworks and services. Since I approach problems as a programmer, I dislike platforms that bias the way I can implement solutions; aside from KNIME’s preference for denormalized row operations, there is little platform bias. KNIME has the most robust loop and flow-control support of any low-code platform I have worked with. An easy test is to try to create a recursive loop, which KNIME makes simple. One of KNIME’s best features is the ability to package workflows as “components,” steps that are added to the library for other workflows to leverage. This makes it very easy to create higher-level abstractions and organization-specific, standard reusable steps. KNIME can create any part of an application, from the backend data management through a custom interactive web user interface. There are many more features that differentiate KNIME, but perhaps most compelling is that development with KNIME requires an order of magnitude less time and talent than methods NuWave has employed in the past: one-tenth the time and staff to achieve equivalent (and often better) results! And that is before the knowledge management benefits of visual workflows are realized. Yes, KNIME is impressive. The startup commercializing the codebase is perhaps still too academic to realize the goldmine it is sitting on, but that makes me like them even more.
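For readers unfamiliar with the term, a rough Python analog of the recursive-loop pattern mentioned above: the output of each pass is fed back in as the next pass’s input until a stopping condition is met. This is a hand-written sketch of the concept only, not KNIME code; the transformation and data are invented.

```python
# Rough analog of a recursive loop: each iteration's output becomes
# the next iteration's input until the end condition holds.
# The transformation and sample values are hypothetical.

def halve_large_values(rows):
    """One pass of the loop body: halve any value still above 10."""
    return [r // 2 if r > 10 else r for r in rows]

data = [100, 7, 33]
while any(r > 10 for r in data):  # the "loop end" condition
    data = halve_large_values(data)
print(data)  # [6, 7, 8]
```

In a text-based language this feedback loop is a few lines; the point of the test is that many visual platforms cannot express it at all, while KNIME gives it dedicated support.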
NuWavers are doing more and more incredible things with low-code platforms, and it is easy to get caught up in the excitement. Since we focus on supporting the government, where developers are in short supply, the rise of low-code platforms will be transformative, making the government more efficient, data-driven, transparent, and agile. NuWave looks forward to helping drive this next generation of mission success, and I will be happy to finally achieve my childhood dream of building with Legos all day long.