NTT DATA Business Solutions
Asad Mahmood | September 4, 2020

SAP HANA Cloud in Conjunction with the “New” Development Approach

Author – Asad Mahmood, Principal Solution Architect, NTT DATA Business Solutions UK.

This blog aims to share some of my initial impressions of the new SAP HANA Cloud offering, and I've dovetailed this with my foray into HANA Deployment Infrastructure (HDI). SAP HANA Cloud was released a couple of months ago and will eventually replace the former (but currently available) SAP HANA Service. Both of these run on SAP Cloud Platform (SCP). I will return to SAP HANA Cloud in a few moments, but first, a few words on HDI. HDI has been touted as the new development approach for database developers for some time, but I must confess that I have been reluctant to migrate away from the well-established SAP HANA Studio.

To provide some context, I have been developing on on-premise SAP HANA systems and the SAP HANA Service in SCP using SAP HANA Studio for a considerable time. SAP HANA Studio is a client application installed on my laptop and has been available since the inception of SAP HANA. Amongst many other features, it allows me to create database objects in the “Catalogue” section (e.g. Tables, DB Views, Procedures) and Calculation Views within the “Content” section. Once the Calculation Views are in place, we can connect SAP BusinessObjects (BOBJ) and SAP Analytics Cloud (SAC) to produce and share insights with business users. Importantly, HANA Studio has supported both design-time and runtime artefacts. Bliss.

With SAP HANA Cloud, I don’t see a way to connect SAP HANA Studio, so I figured it was high time to make sense of HDI!

SAP HANA Cloud ships with SAP HANA Database Explorer, which provides HANA Studio-like capabilities insofar as database development artefacts are concerned. It allows me to create tables, drop them, insert into them, and so on. This is known as the classical schema approach and is still supported with SAP HANA Cloud. An example has been inserted below – including the obligatory “Test” table.
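For illustration, the statements involved look something like the sketch below, run directly from the Database Explorer SQL console. The column list is made up for this example; any simple definition will do:

    CREATE TABLE "TEST" (
        "ID"   INTEGER PRIMARY KEY,
        "NAME" NVARCHAR(50)
    );

    -- populate and check the obligatory test table
    INSERT INTO "TEST" VALUES (1, 'Hello SAP HANA Cloud');
    SELECT * FROM "TEST";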

“Everything I have been able to do in SAP HANA Studio” I nonchalantly state at this point. I promise that is the last of me reminiscing about SAP HANA Studio. Critically, SAP HANA Database Explorer does not support the creation of Calculation Views. Panic.

The creation of Calculation Views now requires SAP Web IDE Full-Stack. The IDE and my development project can be seen below:

SAP Web IDE Full-Stack runs on SCP and requires a valid subscription, which can be activated via the SAP Cloud Platform Cockpit:

Curiously, it’s called SAP Web IDE for HANA development in this section. Unlike SAP to change the name of an app. Must be a typo…

Yes, this allows us to create Calculation Views, but not in the way we have known it. There is now a complete separation between design-time and runtime objects, amongst a few other changes…

SAP Web IDE Full-Stack is the new home for HDI solution development. After creating a Multi-Target Application Project and an SAP HANA Module, I was ready to create my scripts in the form of files (each requiring a designated file extension). These scripts represent the design-time definitions of the database artefacts I want to create. Here are a few examples from my current project:

The Date Dimension creation, followed by a Stored Procedure I developed to dynamically populate this Table:
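In outline, those design-time files look something like the sketch below. The object names, columns and population logic are illustrative rather than the exact content of my project, but the .hdbtable and .hdbprocedure shape is representative:

    -- DIM_DATE.hdbtable: design-time definition of the Date Dimension table
    COLUMN TABLE "DIM_DATE" (
        "DATE_SQL"       DATE NOT NULL,
        "CALENDAR_YEAR"  NVARCHAR(4),
        "CALENDAR_MONTH" NVARCHAR(2),
        "CALENDAR_DAY"   NVARCHAR(2),
        PRIMARY KEY ("DATE_SQL")
    )

    -- POPULATE_DIM_DATE.hdbprocedure: fills the table for a given date range
    PROCEDURE "POPULATE_DIM_DATE" (
        IN iv_from DATE,
        IN iv_to   DATE
    )
    LANGUAGE SQLSCRIPT
    SQL SECURITY INVOKER
    AS
    BEGIN
        DECLARE lv_date DATE;
        lv_date = :iv_from;
        WHILE :lv_date <= :iv_to DO
            INSERT INTO "DIM_DATE" VALUES (
                :lv_date,
                TO_NVARCHAR(:lv_date, 'YYYY'),
                TO_NVARCHAR(:lv_date, 'MM'),
                TO_NVARCHAR(:lv_date, 'DD')
            );
            lv_date = ADD_DAYS(:lv_date, 1);
        END WHILE;
    END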

Some JSON to upload data from a CSV file contained in the project:
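A .hdbtabledata file of this kind follows a fairly fixed structure. The version below is a simplified sketch with made-up table, file and delimiter settings rather than the exact content of my project:

    {
      "format_version": 1,
      "imports": [
        {
          "target_table": "DIM_DATE",
          "source_data": {
            "data_type": "CSV",
            "file_name": "dim_date.csv",
            "has_header": true,
            "type_config": { "delimiter": "," }
          }
        }
      ]
    }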

Whilst we can define the same objects, you can see that the definitions (and, in the last example, even the language!) have changed. All of these scripts reside in my project and are design-time objects until I choose to build. Once built, the objects can be accessed in our Runtime HDI Container. Based on all the prodding and poking of the last few days to figure out what was happening under the covers, this is essentially a Schema in SAP HANA. Conveniently, a Database Explorer is included in the SAP Web IDE Full-Stack. I have been using this extensively to test and compile my code before battening it down into a design-time file:
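For anyone who wants to see this for themselves, a quick catalogue query from the Database Explorer SQL console will list the objects that landed in the container’s schema. The schema name below is a placeholder for the cryptic, generated container name in your own tenant:

    SELECT SCHEMA_NAME, TABLE_NAME
    FROM   SYS.TABLES
    WHERE  SCHEMA_NAME = 'MYPROJECT_HDI_DB_1';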

Once we have defined our scripts and built them successfully, we are ready to deploy the solution to an SAP HANA Cloud instance. This provides a cleanly named Container (whereas the build produces a cryptically named one) to which we can then connect our analytical tools.

The deployment is handled by an MTAR file, which is produced once the Project has been built. The MTAR is a deployment-ready archive of all the project artefacts and references the YAML file for its definitions. Once the deployment is complete, we are ready to connect BOBJ, SAC or a third-party tool to the resultant Calculation Views and commence our analysis.
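That YAML file is the project’s mta.yaml descriptor. A stripped-down sketch of the relevant part is shown below; the module, path and resource names are illustrative, but the overall shape matches what the Web IDE generates for an HDI-based project:

    # mta.yaml (simplified, illustrative names)
    ID: hana.cloud.demo
    _schema-version: '3.1'
    version: 0.0.1

    modules:
      - name: db                 # the SAP HANA Module holding the design-time files
        type: hdb
        path: db
        requires:
          - name: hdi_container

    resources:
      - name: hdi_container      # the HDI Container that the build and deployment target
        type: com.sap.xs.hdi-container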

There are many awesome features of SAP HANA Cloud that I intend to share in forthcoming blogs, none more so than the new Relational Data Lakes, but I wanted to use this piece to share my experiences of HDI with SAP HANA Cloud.

I hope this helps you evaluate and make this transition. As always, please get in touch if there are any questions or insights that you would like to share. Contact us here