7 Things You Didn’t Know You Could Do with a Low Code Tool
Surprisingly easy solutions for complex data problems.
Today I want to play myth buster and debunk the myth that low code tools can only solve simple problems. In particular, I would like to take you on a tour of unsuspected applications developed with the open source low code tool for data science, KNIME Analytics Platform, and its commercial counterpart, KNIME Server. For example, did you know that with KNIME software you can connect to a sensor board and transfer data into a repository? Or create music with a deep learning model? Or generate deepfake images with GANs? Or do web scraping? Or connect to data repositories on the cloud? Or deploy an application as a REST service? Or build a dashboard? All of that with low code, very low code.
Let’s start the tour.
1. Acquire Sensor Data from a Sensor Board
The first low code solution I would like to show you is an IoT application: specifically, a web service that transfers data from a sensor board to a data repository. Commonly, an IoT problem involves connections to sensors, the setup of data storage, and optionally some time series analysis to predict the future. We challenged ourselves with such an IoT problem and built a solution for temperature forecasting, relying entirely on KNIME software. The project’s goal was to predict how warm it would be on the following day, i.e., a temperature forecasting solution.
The weather station consisted of two main parts: the data collection part and the data forecasting part.
Data collection:
- A sensor board
- A data repository for data storage
- A REST service to transfer the data from the sensor board to the data storage
Data forecasting:
- A SARIMA model trained to predict the temperature in the next hour
- An application to clean and process the acquired data and to train the SARIMA model
- A dashboard application to predict the next hour’s temperature using the SARIMA model, along with the expected minimum and maximum temperature of the following day.
Figure 1. The web service in charge of the collection of IoT sensor data was implemented with KNIME software.
The REST service for the data collection is triggered by the sensor board every time a new sample is available. The only task of the REST service is then to verify the integrity of the request data and to store them in the data repository.
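The two tasks of this service can be sketched in plain Python. This is a minimal sketch, not the actual workflow: the payload fields (`sensor_id`, `timestamp`, `temperature`), the plausibility range, and the SQLite table are all assumptions for illustration.

```python
import sqlite3

# Hypothetical payload schema; the real sensor board may send different fields.
REQUIRED_FIELDS = {"sensor_id", "timestamp", "temperature"}

def validate_sample(payload: dict) -> bool:
    """Verify the integrity of an incoming sample: all fields present
    and the temperature within an assumed plausible range."""
    if not REQUIRED_FIELDS.issubset(payload):
        return False
    return -50.0 <= payload["temperature"] <= 60.0

def store_sample(conn: sqlite3.Connection, payload: dict) -> None:
    """Append a validated sample to the data repository."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS samples (sensor_id TEXT, ts TEXT, temp REAL)"
    )
    conn.execute(
        "INSERT INTO samples VALUES (?, ?, ?)",
        (payload["sensor_id"], payload["timestamp"], payload["temperature"]),
    )

# One simulated request from the board.
conn = sqlite3.connect(":memory:")
sample = {"sensor_id": "board-01", "timestamp": "2021-06-01T12:00:00",
          "temperature": 21.5}
if validate_sample(sample):
    store_sample(conn, sample)
```

In the KNIME workflow, the same validate-and-store logic is expressed with nodes rather than code; the sketch only makes the service's contract explicit.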
You can find all applications used to build this weather station in the KNIME Hub folder “KNIME Weather Station” and more details in this article on the KNIME blog “Temperature Forecast from IoT sensor data”.
2. Import Book Metadata via iPhone Screenshots
A problem similar to the transfer of data from a sensor board is the transfer of images from a mobile phone. In this second application, we built a book inventory. Here a web service receives a photo of a book from an iPhone, extracts the book metadata and cover, and saves all of this information to a final repository via the KNIME Server.
- A native custom mobile app captures the image of a book copyright page through the iPhone camera, converts it to a base64 encoding, and sends it to a web service on the KNIME Server in the form of a Request object. The code for this iPhone app is implemented following the specs provided by Apple.
- The web service accepts the image, extracts the ISBN identifier via OCR, retrieves book metadata and cover image via Google Books, and finally stores the information into a data repository (e.g., an SQLite database).
The web service operating the transfer is implemented with KNIME Analytics Platform; it is actually quite simple and is shown in Fig. 2. Only one line of Python code was used in the making of this web service, namely in the “Decode base64 image” metanode: a Python Script node calling the base64.decodebytes() function.
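For the curious, that one line of Python boils down to the round trip below. The byte string standing in for the camera image is made up; the encode/decode calls are the standard library functions the metanode relies on.

```python
import base64

# The iPhone app encodes the camera image as base64 text...
image_bytes = b"\x89PNG\r\n\x1a\n..."  # stand-in for real image data
encoded = base64.encodebytes(image_bytes)

# ...and the "Decode base64 image" metanode reverses it with the
# single line of Python mentioned above:
decoded = base64.decodebytes(encoded)
assert decoded == image_bytes  # lossless round trip
```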
You can download the workflow for free from the KNIME Hub.
Figure 2. A codeless RESTful API that imports a screenshot from an iPhone app, retrieves book metadata and cover image from an ISBN identifier, and stores them in a database.
3. Deploy an Application as a REST Service
Speaking of REST services, can you imagine how easy it is to create one with KNIME? An example is shown in Fig. 3. All you need is a Container Input node to accept the data in the request, a Container Output node to provide the data for the response, and all required operations in between. Indeed, every KNIME workflow moved onto a KNIME Server is automatically productionized as a REST service; all you need to add are the two Container nodes for data input and output.
Figure 3. A KNIME workflow including Container nodes to set the Request and Response data structure, to be deployed as a REST service.
The assembly of a REST service via a KNIME workflow is explained in the article “Explore REST capabilities for RESTful workflows” on the KNIME blog. The execution of a KNIME workflow as a REST service on the KNIME Server is explained in the article “KNIME REST Server API”, also on the KNIME blog.
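From the client side, calling such a deployed workflow is an ordinary HTTP request. The sketch below uses only Python's standard library; the endpoint URL and the parameter name "input-table" are assumptions for illustration, since the actual path depends on your KNIME Server installation and how the Container Input node is configured.

```python
import json
from urllib import request

# Hypothetical endpoint; the real URL depends on your KNIME Server
# installation and the workflow's repository path.
ENDPOINT = "https://knime.example.com/rest/v4/repository/my_workflow:execution"

# The request body mirrors the structure configured in the Container
# Input node; "input-table" is an assumed parameter name.
payload = {"input-table": [{"customer_id": 42, "amount": 19.99}]}
body = json.dumps(payload).encode("utf-8")

req = request.Request(ENDPOINT, data=body,
                      headers={"Content-Type": "application/json"})
# response = request.urlopen(req)  # uncomment against a live server
```

Because the request carries a body, `urllib` issues it as a POST; the Container Output node's table comes back as JSON in the response.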
4. Build a Dashboard
Not all applications are put into production as REST services. Some are deployed as web applications, with beautiful dashboards running on a web browser. Building more or less complex dashboards is also possible (and easy) with KNIME Analytics Platform.
Widget and Data Visualization nodes provide view items. Assembling such nodes into a component produces a composite view, combining and tying together all view items of the graphic nodes within the component. An example of the dashboard generated by a very simple yet powerful component is shown in Fig. 4. As you can see, a handful of Bar Chart and Table View nodes generates a sophisticated dashboard of connected, interactive tables and bar charts.
Figure 4. The content of the component (left) responsible for the creation of the dashboard (right).
The reference article for this application is “Creating a dashboard with a very simple component” published in the Medium journal “Low Code for Advanced data science”. The corresponding workflow is also available on the KNIME Hub.
5. Create Music with a Deep Learning Model
Enough with the engineering tasks! The next question is: can a low code tool like KNIME Analytics Platform implement advanced data science applications? Let’s continue the tour and find out.
KNIME Analytics Platform offers a wide range of machine learning algorithms: from clustering to decision trees, from random forest to XGBoost, from gradient boosted trees to regressions, and from neural networks to deep learning. Using the KNIME Deep Learning extension, based on the Keras libraries, nodes implementing different neural layers can be combined to create different types of neural architectures.
In Fig. 5, the brown nodes implement the neural layers. The neural architecture is quite complicated: it includes three parallel input layers and three parallel output layers, because the music is represented by three sequences of vectors: notes, durations, and offsets. The network is trained on Schubert’s music to generate new music. However complicated, the network is built, trained, and deployed without a single line of code. Listen to a sample of the network-generated music in “AI plays piano”.
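The three-sequence representation can be sketched in a few lines of Python. The tiny note/duration/offset sequences below are made up, and the index encoding and sliding windows are only illustrative assumptions about how such sequences are typically prepared for a sequence model.

```python
# Hypothetical four-event fragment; the real workflow derives these
# three parallel sequences from the Schubert corpus.
notes     = ["C4", "E4", "G4", "C5"]
durations = [0.5, 0.5, 1.0, 2.0]
offsets   = [0.0, 0.5, 1.0, 2.0]

def to_indices(seq):
    """Map each distinct value to an integer index, as index-encoded
    input for a sequence model."""
    vocab = {v: i for i, v in enumerate(sorted(set(seq), key=str))}
    return [vocab[v] for v in seq], vocab

def make_windows(seq, size=2):
    """Fixed-length input windows paired with the following element
    as the prediction target."""
    return [(seq[i:i + size], seq[i + size]) for i in range(len(seq) - size)]

note_idx, note_vocab = to_indices(notes)
windows = make_windows(note_idx, size=2)
```

Each of the three sequences gets its own input (and output) branch in the network, which is exactly why the architecture in Fig. 5 has three inputs and three outputs in parallel.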
Details of this project are reported in the article “My AI plays piano for me” in the journal “Low Code for Advanced data science”.
Figure 5. This deep learning network including a layer of LSTM units is trained to produce Schubert-style music.
This project on AI music generation is actually an extension of a previous project on AI rap song (and Shakespeare text) generation. The related article “Use Deep Learning to write like Shakespeare” is also available in the same Medium journal.
6. Create Fake Images with GANs
Another interesting application is the use of Generative Adversarial Networks (GANs) to generate images, in our case faces.
Figure 6. This workflow implements a GAN and trains it on a face dataset.
Since the KNIME Deep Learning extension does not yet offer generator and discriminator layers, a hybrid Python-KNIME approach was adopted here. The few required lines of Python code were packaged within a KNIME component named DL Python Creator. Different instances of the DL Python Creator component, with different parameters, implement the generator and the discriminator parts of the GAN (Fig. 6). Once packaged within a component, the required Python lines can be handled like any other KNIME node, without the need to read or change the underlying code.
A popular face dataset from GitHub was used to train the network. Notice that this dataset contains a very large number of high-resolution color images, so the training of this network was computationally quite intensive. GPU acceleration for the deep learning package made it possible to train the network in a reasonable amount of time.
The network was then used to generate fake faces and you can see the results in Fig. 7.
More details about this application are available in the article “How to create GANs with KNIME Analytics Platform” on the Medium journal “Low Code for Advanced data science”.
Figure 7. A number of fake faces generated by our trained GAN.
7. Build a Twitter Bot
The last application I want to include in this list is a Twitter bot: an application that accesses Twitter, extracts all tweets containing a given hashtag, and retweets them. The workflow implementing this Twitter bot is quite simple (Fig. 8). It accesses a Twitter account, searches for the selected hashtag, performs some cleaning, such as removing retweets, and then reposts the remaining tweets.
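The cleaning step in the middle is the only non-obvious part, so here it is as a sketch. The sample tweets are made up, and detecting retweets by the "RT @" prefix is a simplifying assumption; the actual workflow uses KNIME's Twitter nodes for both the search and the reposting.

```python
# Made-up search results for the hashtag.
tweets = [
    {"id": 1, "text": "Low code is fun #knime"},
    {"id": 2, "text": "RT @someone: Low code is fun #knime"},
    {"id": 3, "text": "Building dashboards #knime"},
]

def remove_retweets(tweets):
    """Drop retweets so the bot does not re-amplify amplifications.
    Assumes the conventional 'RT @' prefix marks a retweet."""
    return [t for t in tweets if not t["text"].startswith("RT @")]

to_repost = remove_retweets(tweets)
# Each remaining tweet would then be retweeted via the Twitter API.
```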
This application is described in the article “Confirm that you are a robot” in the journal “Low Code for Advanced data science”.
Figure 8. A workflow implementing a Twitter bot.
Conclusions
These simple workflows are just examples of how easy some tasks become in the low code world compared to the coding world. Indeed, every connector node implements all operations required to access its data source. This makes the connection as easy as drag & drop and hides all the ugliness of the required permissions, drivers, and configurations from the eyes of the user.
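To appreciate what a connector node hides, consider the coding-world equivalent for even the lightest of data sources, an in-memory SQLite database. The table and its contents are made up for illustration; a real database would add drivers, credentials, and connection strings on top.

```python
import sqlite3

# In code, reading a table means managing the connection and the
# query by hand; a KNIME connector node reduces all of this to a
# drag & drop plus a configuration dialog.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, age INTEGER)")  # made-up table
conn.execute("INSERT INTO people VALUES ('Ada', 36)")
rows = conn.execute("SELECT name, age FROM people").fetchall()
conn.close()
```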
Figure 9. The KNIME connector cheat sheet.
KNIME Analytics Platform, for example, includes a very large number of such high-level connector nodes to access all kinds of data sources: SQL and NoSQL databases, REST services, cloud repositories, big data platforms, Spark, Kafka, all sorts of files, MS Office documents, Google documents, and so on. Most of the data sources accessible via a KNIME connector are reported in Fig. 9. This cheat sheet can be downloaded for free from the page “KNIME Connector Cheatsheet.pdf” on the KNIME website.
Rosaria Silipo is not only an expert in data mining, machine learning, reporting, and data warehousing, she has become a recognized expert on the KNIME data mining engine, about which she has published three books: KNIME Beginner’s Luck, The KNIME Cookbook, and The KNIME Booklet for SAS Users. Previously Rosaria worked as a freelance data analyst for many companies throughout Europe. She has also led the SAS development group at Viseca (Zürich), implemented the speech-to-text and text-to-speech interfaces in C# at Spoken Translation (Berkeley, California), and developed a number of speech recognition engines in different languages at Nuance Communications (Menlo Park, California). Rosaria gained her doctorate in biomedical engineering in 1996 from the University of Florence, Italy.