ABSTRACT
The adoption of enterprise resource planning systems in both the public and private sectors aims to ensure that business transactions are effectively captured, processed and stored. However, for most businesses, running real-time models on transactional and historical data is time-consuming, often happening overnight to prevent system contention. To gain a competitive edge, organizations need instant information: it empowers decision-making and improves the quality of the decisions made. This research implemented an analytics dashboard prototype that uses in-memory computing in the Cloud Foundry environment to leverage the parallel computing offered by the cloud. The prototype was simulated in multicore environments with 16 and 32 CPU cores, and the response times for 1, 2, 4, 16, 64 and 128 cores were calculated using Amdahl's law. A survey on the effectiveness of the IMC-based analytics dashboard in the business context was conducted with 20 key business users. The findings revealed that, except where heavy concurrent load is expected, and subject to the CPU-to-memory bus bandwidth, organizations intending to use in-memory computing technology such as HANA do not need to invest in processing units of more than 16 cores. The research also established that real-time analytics and reporting is realized through in-memory technology's high-end computing performance. Real-time information ensures continuous business transparency; it empowers decision-making at strategic and operational levels and improves the quality of the decisions made.
TABLE OF CONTENTS
CHAPTER ONE
INTRODUCTION
1.1 Background 11
1.2 Problem Statement 12
1.3 Objectives 13
1.4 Research Questions 13
1.5 Significance 13
CHAPTER TWO
LITERATURE REVIEW
2.1 Introduction 14
2.2 OLTP vs OLAP in ERP Design 14
2.3 The Capabilities of In-Memory Computing 16
2.4 The New Programming Model with IMC 24
2.5 Literature Summary 25
2.6 Overall System Architecture 26
CHAPTER THREE
RESEARCH METHODOLOGY
3.1 Introduction 27
3.2 System Development Methodology 27
3.3 Evaluation & Analysis 32
CHAPTER FOUR
IMPLEMENTATION, RESULTS & DISCUSSION
4.1 Introduction 35
4.2 Specifications and Analysis 35
4.3 Inputs and Outputs 36
4.4 The Design of the Open CDS-based Virtual Data Modelling in HANA 37
4.5 Runtime Statistics of the Prototype 38
4.6 Effectiveness of the In-Memory Computing in the organizational context 41
CHAPTER FIVE
SUMMARY, CONCLUSION AND RECOMMENDATIONS
5.1 Introduction 43
5.2 Linking the Study Findings to Objectives 43
5.3 Conclusion 46
5.4 Limitations of the study 47
5.5 Recommendations for future research 47
REFERENCES 48
APPENDICES 49
Appendix I: Prototype Survey 49
Appendix II: The Analytics Dashboard 50
Appendix III: The CDS Views 51
Appendix IV: The Business Logic 53
Appendix V: The OData Project 59
TABLE OF FIGURES
Figure 1: Transaction-consistent snapshot of the OLTP data (Funke et al., 2012). 18
Figure 2: The count of queries executed in two sets of ERP systems (May et al., 2017). 20
Figure 3: NFAE and FDA optimizations in (May et al., 2017). 21
Figure 4: Nested CDS views relationship (May et al., 2017). 23
Figure 5: Traditional Data-to-Code model vs New Code-to-Data model (Heilman et al., 2013). 24
Figure 6: Code Pushdown with Application Server ABAP (Heilman et al., 2013). 25
Figure 7: Overall System Architecture (Pattanayak, 2017). 26
Figure 8: Design-Led Development Process (SAP SE, 2020, https://experience.sap.com) 27
Figure 9: The Prototype Dashboard Design 28
Figure 10: The Proposed Analytics User Interface 29
Figure 11: Development and Deployment Process Steps 31
Figure 12: Persona for the dashboard prototype 36
Figure 13: The Design of the CDS-based Virtual Data Model 37
Figure 14: Response time with an IMC on 32 CPU Cores 38
Figure 15: Response time with an IMC on 8 CPU Cores 39
Figure 16: The Analytics Dashboard’s Average Response Time 40
Figure 17: The Industry Distribution in the Survey on using the IMC-based Analytics Dashboard 41
Figure 18: Results of the Effectiveness of the In-Memory Computing in the organizational context 42
TABLE OF TABLES
Table 1: Inputs & Outputs 36
Table 2: Response times with an IMC on 8 & 32 CPU Cores 39
Table 3: Response times with IMC on CPU Cores between 1 and 128 40
ABBREVIATIONS
ABAP Advanced Business Application Programming
ANSI American National Standards Institute
API Application Programming Interface
BI Business Intelligence
BPM Business Performance Management
DBMS Database Management System
DML Data Manipulation Language
CDS Core Data Services
CFO Chief Financial Officer
CRM Customer Relationship Management
CSS Cascading Style Sheets
DDL Data Definition Language
ERP Enterprise Resource Planning
FDA Fast Data Access
HANA High-Performance Analytic Appliance
HTML5 Hyper Text Markup Language version 5
IMC In-Memory Computing
JSON JavaScript Object Notation
KPI Key Performance Indicator
LLVM Low Level Virtual Machine
MMU Memory Management Unit
MVC Model-View-Controller
NFAE Native For All Entries
ODATA Open Data Protocol
OLAP Online Analytical Processing
OLTP Online Transaction Processing
SAP Systems, Applications and Products in Data Processing
CHAPTER ONE
INTRODUCTION
1.1 Background
The adoption of ERP systems in both the public and private sectors aims to ensure that business transactions are effectively captured, processed and stored. A number of Enterprise Resource Planning systems have emerged to automate business and organizational processes such as Financials, Sales & Distribution, Procurement, Human Capital Management, Enterprise Asset Management, Production Planning, Quality Management and Project Systems. Properly implemented ERPs have been effective in driving business operations. Since the beginning of the century, there has been an explosion in the volume of data from business operations. This has given rise to two types of information systems: Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP).
OLTP is effectively delivered with traditional ERPs. As part of the project cutover, all the data from legacy systems are migrated in preparation for system Go-Live. Post Go-Live, more data continues to be captured in the form of business transactions. Over time, this creates a large and valuable data store, necessary for providing business intelligence. However, the need for real-time data analytics and the never-ending demand for better and faster performance have been limited by the architectural designs of ERP systems. The most recent designs of traditional ERPs comprise either a 2-tier or a 3-tier architecture. Both architectures separate the application layer from the database layer: the code (business logic) resides in the application server and interfaces with the database server through I/O processing. Most of the database servers used in these architectures are disk-based and row-oriented (Farber et al., 2012). This leads to slow performance, especially when millions of records must be processed in under a second for action and/or decision-making. The solution to such reporting and query requirements is OLAP.
In OLAP, analytical processes mostly run overnight to prevent system contention, so obtaining an analytical report can take more than a day. Moreover, much as data warehouse systems are an established component of the information systems landscape in most companies, many implementations have failed due to flawed architectural concepts and data modelling. OLAP has been deployed in many industries, such as manufacturing (order shipment and customer support), retail (user profiling and inventory management), financial services (claims analysis, risk analysis, credit card analysis and fraud detection), transportation (fleet management), telecommunications (call analysis and fraud detection), utilities (power usage analysis) and healthcare (outcomes analysis) (Reddy et al., 2010).
In recent OLAP developments, column stores have become more popular, leading to In-Memory Computing (IMC), as is the case with the Hyper system and the SAP High Performance Analytic Appliance (HANA) database. IMC's capability to execute the business logic inside the database kernel, as opposed to inside the application server, is what makes it more powerful in handling data analytics with lower latency. Leveraging cloud computing and mobile technologies provides an even more powerful user experience by offering faster data analytics irrespective of device or operating system platform. The advent of cloud computing and mobile technologies has created demand for access to real-time, actionable and relevant insights from mobile devices. Real-time analytics is an enabler of revenue growth and operational efficiencies (Agostino, 2004). This research is relevant for businesses running ERPs, especially those with Sales & Distribution modules, and in particular for Chief Commercial Officers, Chief Information Officers, Chief Technology Officers and Heads of Information and Communications Technology. It is also important for System Analysts, System Developers and Solution Architects who have an interest in the code-to-data paradigm, where data-intensive operations are pushed down to the database instead of being executed in the application server.
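The contrast between the data-to-code and code-to-data models can be sketched in miniature. The following is an illustrative Python example only: SQLite stands in for an in-memory column store such as HANA, and the `sales` table, its columns and the sample values are hypothetical. The first approach ships every row to the application layer and aggregates there; the second pushes the aggregation down so only the result set leaves the database.

```python
import sqlite3

# An in-memory database standing in for the ERP's in-memory store (illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EAST", 100.0), ("EAST", 250.0), ("WEST", 75.0)])

# Data-to-code: transfer every row to the application server, then aggregate there.
rows = conn.execute("SELECT region, amount FROM sales").fetchall()
totals_app = {}
for region, amount in rows:
    totals_app[region] = totals_app.get(region, 0.0) + amount

# Code-to-data: push the aggregation down so only the summarized result crosses
# the boundary between database and application layer.
totals_db = dict(conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"))

assert totals_app == totals_db  # same result, far less data movement at scale
```

At the scale of millions of transaction records, the second form avoids moving the raw rows across the I/O boundary at all, which is the essence of the code pushdown discussed above.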
1.2 Problem Statement
With the steady growth of e-commerce, there is a need for real-time analytics on data, especially in sales performance management. For most businesses, running real-time models on transactional and historical data is time-consuming, often happening overnight to prevent system contention. Obtaining real-time query results at low latency is therefore a significant concern for organizations.
1.3 Objectives
a) To examine the capabilities of In-Memory Computing
b) To design and develop a prototype dashboard that leverages an IMC for real-time analytics
c) To experiment with the application of the code-to-data paradigm
d) To evaluate the latency in data processing from the prototype in the business context
1.4 Research Questions
a) What are the capabilities of In-Memory Computing?
b) How much of a paradigm shift does In-Memory Computing have on traditional computer programming?
c) What is the CPU core requirement for real-time data visualization in In-Memory Computing?
d) How applicable is In-Memory Computing in the business context?
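Research question (c) was addressed in this study by applying Amdahl's law to project response times across core counts. The sketch below is illustrative only: the single-core time of 10 s and the 90% parallel fraction are assumed values, not the study's measurements.

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: speedup achievable on `cores` processors when
    `parallel_fraction` of the workload can run in parallel."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

def response_time(t_single: float, parallel_fraction: float, cores: int) -> float:
    """Projected response time on `cores` cores, given the single-core time."""
    return t_single / amdahl_speedup(parallel_fraction, cores)

# Illustrative values: a 10 s single-core query that is 90% parallelizable.
# The projected response time flattens out quickly as cores are added.
for n in (1, 2, 4, 16, 64, 128):
    print(n, round(response_time(10.0, 0.9, n), 3))
```

Under these assumed parameters, the gain from 16 to 128 cores is small relative to the gain from 1 to 16, which is consistent with the study's conclusion that core counts beyond 16 yield diminishing returns outside heavy concurrent workloads.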
1.5 Significance
In order to gain a competitive edge, instant information is key to organizations. It empowers decision-making at operational levels and improves the quality of the decisions made. It also strengthens the relationship between the organization and its customers, suppliers, partners and shareholders.