Irfan Khan

Partnering in the Cloud

Posted by Irfan Khan Sep 19, 2014

We might think of Cloud computing as a relatively recent phenomenon, but in fact the drive towards virtualization and sharing of computer resources that led to the Cloud can be traced to the 1950s and some of the earliest mainframe systems. In the 1970s, the time-sharing model deployed by large mainframe providers reached its apex with the introduction of the virtual machine, providing an early, primitive version of the Cloud. In the early 1990s, large telecom providers began offering Virtual Private Network (VPN) services, enabling customers to leverage resources of the public network as though they were their own.

 

In the mid-1990s, as it became apparent that the Internet would transform computing through connections and shared resources, the concept of the Cloud as a computing platform began to come into focus. By the late 1990s and early 2000s, a few pioneering software vendors introduced multi-tenant Software-as-a-Service (SaaS) solutions, some of which began to enjoy widespread adoption. In roughly the same time frame, the first true Cloud infrastructure offerings became widely available, with services focused primarily on storage and computation. From 2005 to 2010, Infrastructure-as-a-Service (IaaS) offerings began to enable growing numbers of user organizations to move aspects of product development and delivery to the Cloud. In recent years, we have seen the emergence of Platform-as-a-Service (PaaS) offerings that provide both a computing platform and a solution stack, enabling users to fully develop and deploy applications via the tools and resources provided in the Cloud.

Meanwhile, the Cloud isn’t the only thing evolving. Business models are rapidly changing, as are demands from customers, requirements from government and other compliance-seeking entities, risks from cyber-thieves and other bad actors, and, of course, relentless challenges from competitors. To survive, businesses must be faster, more responsive, and more flexible than ever before. They are called on to bring new products and services to market, or to implement whole new ways of interacting with customers, in a fraction of the time that would once have been available. More than just relieving existing administrative burdens, these organizations are looking to the Cloud to transform how they deliver value in this accelerating environment.

 

Throughout the long evolution described above, organizations have enjoyed significant technological and business benefits from the virtualization and resource-sharing that lie at the heart of the Cloud. Chief among these benefits is the increased business focus that moving to the Cloud can enable. Relieved of the burden (and much of the expense) of implementing and maintaining IT infrastructure, organizations can turn their attention and resources more fully to where they belong: their core business functions. As businesses venture more deeply into the Cloud in pursuit of this transformation, they come to vendors of cloud services with a list of significant challenges:

 

  • How do we integrate multiple technologies to provide a single and operationally elegant solution to our customers?
  • How do we deal with hybrid deployments that combine public cloud, SaaS, and PaaS into a single solution that can be governed under GRC regulations?
  • How do we deal with the fact that real-time is no longer a novelty or a luxury, but a necessity regardless of the volume, velocity, and variety of data and regardless of the complexity of the infrastructure?
  • How do we deal with cross-cloud integrations and security?
  • How do we deal with the reality of data locality and privacy laws that are as complex as anything we’ve seen (including different accounting rules)?


To meet such demands, Cloud vendors have had to evolve along with the services they offer. With customers demanding increasingly sophisticated and responsive PaaS solutions, it is no longer enough to be a service provider. Vendors must become true business partners with their customers.

 

Recently I wrote about SAP Simple Finance, which combines secure Cloud delivery for financial processes with the real-time capability of the SAP HANA platform. I was pleased to note that our customer Zurich Insurance is looking at a 65% faster monthly close process and reporting that is 1,000 times faster than their previous financials implementation provided. Other early adopters are talking about even more sweeping changes to their financial processes enabled by the new platform. One customer is looking to cut an 18-day close process to a single day, effectively enabling a daily rather than a monthly closing of the books.

 

Multiply those kinds of sweeping changes by the thousands of business processes that SAP supports across the Enterprise – and then multiply that result by the thousands of SAP customers who have moved to the Cloud or are doing so now – and we see transformation of business process on a global scale. SAP customers are making the move, with some 35 million business users already using the SAP Cloud across HR, finance, procurement, ERP, and more.  SAP offers the most comprehensive cloud computing portfolio for business applications in the market and that is just the beginning of the story.

 

Users of the SAP HANA Cloud Platform leverage an environment that meets them where they are as a business, one that encompasses SaaS, PaaS, on-premise, hybrid, and public cloud. Those implementing PaaS can now develop and deploy customizable and extensible applications to meet any imaginable set of B2B or B2C requirements. The apps can extend existing on-premise or cloud applications, or they can be entirely new creations. They are introduced to solve specific problems or to open up new revenue opportunities. A few examples demonstrate how the SAP Cloud Platform meets customers where they are:

 

Apps to Better Connect with Customers
Multinational food and beverage giant Danone was immediately attracted to the SAP Cloud Platform because of the way it brings together the Cloud, big data, and mobility. They saw the platform as very much in line with their own strategies for reaching out to and connecting with their customers via new purpose-built apps. To date they have deployed a lightweight call center in South Africa and a web shop for selling nutraceuticals in Germany. They have found that the platform allows them to start small and rapidly deploy alternatives as they explore.

 

App Extensions to Enterprise Systems
One of the world’s leading management consulting companies, Accenture, introduced an app for Audit and Compliance as-a-Service in response to the reality of hybrid on-premise and SaaS environments. Accenture’s app, an extension of SuccessFactors, leverages a rules library to identify compliance issues and eliminate inconsistencies between the Cloud and On-Premise HCM solutions. Accenture was so impressed with the results that they are now looking to build extensions to integrate their own cloud-based and on-premise solutions.

 

A Business Built on Apps
Opal is a start-up providing native applications, developed via the SAP HANA Cloud Platform, for the fresh-food retail business. The company delivers pre-sized forecasts to help retailers better plan the demand for their fast-moving goods, reducing costs of stock and product waste. Thanks to HANA, users can manage operations in real time with access to the most up-to-date data.

 

In addition to the requirements for greater speed and flexibility noted above, businesses are called upon to be more extensively connected with one another – as customers, as vendors, as partners – than ever before. In response to such requirements, business networks have evolved to streamline and automate vital inter-business functions.  Users of the SAP HANA Cloud Platform have access to critical business networks to support the apps they deliver. Via SAP Hybris, our customers have access to enterprise software and on-demand solutions for e-commerce, multi-channel commerce, master data management, and order management. The more than 500 Hybris users include such big names as Target, Samsung, and 3M, as well as a large number of startups.

 

Then there is Ariba, which enables our customers to market and sell their apps to over 250,000 customers worldwide with the SAP HANA Marketplace. The Ariba Network is the largest open trading network in the world, used by nearly one million companies in more than 140 countries around the world. Companies like Volvo, OfficeMax, and Aviva are among the many SAP customers experiencing a whole new level of connectedness and opening up tremendous new opportunities in the Cloud via Ariba.

 

Across the board, SAP has made significant advancements in acting as a true partner, elevating the business impact that customers can achieve running applications and key custom function extensions in the Cloud. Powered by the unique capabilities of the SAP HANA Cloud platform, SAP is increasingly positioned to be the Cloud company, leading the way in combining both the infrastructure and the deep business understanding required to enable our customers to overcome the challenges, leverage new capabilities, and deliver a new generation of business results.

 

Twitter:  @i_kHANA


SAP HANA is not only evolving your company’s business; it is also evolving your daily work, whether you are part of an SAP HANA implementation team or operating SAP HANA system landscapes.

 

That is why my colleagues Richard Bremer and Lars Breddemann decided, more than a year ago, to put their SAP HANA administration expertise together. After a writing effort of more than six months, they are happy to hold the SAP HANA Administration book in their hands and to share their experience working with SAP HANA, along with detailed examples and best-practice recommendations. Lars also posted the announcement on SAP Community Network: Book Announcement: SAP HANA Administration.

 

 

Since its official launch on Sep 3, 2014, the SAP HANA Administration book has been available in several formats. Whether you are a project manager or project member performing SAP HANA implementations, or a technical consultant or administrator operating SAP HANA systems, you will definitely benefit from having it handy!

 

To reduce release dependency, the book has a strong focus on teaching the fundamental concepts of how SAP HANA systems work and behave, in addition to conveying hands-on skills.

 

The first two chapters are mostly an introduction to SAP HANA architecture and typical usage scenarios. Chapters 3 to 6 cover topics related to the database system, such as installation and update, persistence and business continuity, and scaling SAP HANA. Chapters 7, 8, and 9 deal with objects in the database in general and database tables in particular. Chapters 10 to 13 describe transaction handling, the repository as a backend for managing development artifacts, and the topics of user management, system security, authentication, and authorization. Chapter 14 summarizes Richard’s and Lars’s SAP HANA administration learnings to provide an end-to-end view of planning and setting up an SAP HANA system, followed by an in-depth discussion of performance and root cause analysis.

 

I had a chance to take a first look at the book yesterday, and whether you read it cover to cover or use it as a reference to look up details, it’s a must-have!

 

Take a look at the sample on the SAP Press page and form your own impression.


 

My team at SAP has been collaborating with the Laboratory for Computational Physiology at MIT, which is part of the Harvard-MIT Health Sciences program. PhysioNet is the premier publisher of anonymized research databases for medical research. We are using SAP HANA to help build the next releases of their MIMIC research database. This is a unique use of HANA in that the clinical and signal information contained in the MIMIC dataset has a complex structure that requires very complex queries and signal processing when mining the data for clinical and treatment innovations in the critical care space.

We had a chance to share some of our innovations with a full house of data scientists at the annual Conference on Knowledge Discovery and Data Mining (KDD), the premier forum for the advancement and adoption of the science of knowledge discovery and data mining. We jointly taught a hands-on tutorial entitled "Management and Analytic of Biomedical Big Data with Cloud-based In-Memory Database and Dynamic Querying", in which we demonstrated some of the techniques we implemented to work with the MIMIC II database on SAP HANA. The data scientists were all given access to the HANA version of MIMIC II running on HANA One (HANA on the Amazon cloud). Participants could follow the tutorial and try out the example code on their own computers alongside the instructors.

None of the participants had previously been introduced to the performance benefits of in-memory computing or HANA, but they finished the tutorial quite impressed with the technology. We heard overwhelmingly positive feedback about the performance of very complex queries, the integration of analytical libraries such as R and AFL, and the built-in visualization capabilities of HANA studio.

 

I have attached a few of the presentations so you can get a flavor of the type of things we are doing with HANA.

 

If you have any questions, feel free to email me at SAP.

As is my regular habit, I caught up on some reading this weekend. Of note were a Wall Street Journal article and an Internet of Things (IoT) guide. The content of DZone's 2014 Guide to Internet of Things, as good and informative as it was – and it was useful, thought provoking, and well worth the read – was not what struck me. It was the description in the opening paragraph, and the mention of a telepresence robot, that resonated and stuck with me. Perhaps because I was in a setting similar to the one described: sitting in a patio chair, listening to a playlist from an internet music source on my Jambox. And because I too can, and do, control some aspects of my home via my mobile devices, and have had experience with a telepresence robot. It put me in the mood and started me thinking about the amazing world in which I am very privileged to live, how far things have come, and wondering what would come along next to marvel me.

 

So, to my experience with a telepresence robot. Ivan Anywhere, our telepresence robot, wanders the halls of the once Sybase, now SAP, Waterloo office, and has been doing so since 2007. The name is a compound of the developer who uses it to work remotely and the SQL Anywhere product that powers it. The feat of creating a mobile telepresence to allow a developer who had moved more than 1,000 miles away to interact more meaningfully with his co-workers was impressive and cutting edge 7 years ago. But time and technology move on. One intern I recently took on a tour of our office said "oh, Skype on wheels" on first seeing Ivan. And now it seems that there are other offices with telepresence robots. Soon they will be everywhere, and chatting with the fridges at that. The ingenuity, creativity, and drive that created Ivan Anywhere are still alive and flourishing at the Waterloo office, now merged with others at SAP and from the acquisitions it has made. A large, synergistic pool of amazing talent driving cutting edge innovation.

 

And that leads to the Wall Street Journal article of August 21, Chinese Gadgets Signal New Era of Innovation. It discusses the startup environment in China, where there is a strong drive and a growing network for innovation. A large portion of this innovation is coming in the form of apps and smart gadgets: from sensor devices for sports equipment to IoT devices to wearables. The description of "tinkerers with big dreams" with an emphasis on cutting edge technology made me think again of the innovators at SAP. And of the SAP Startup Focus program and ecosystem: over 1,500 innovators and visionaries around the world whom SAP is enabling and encouraging. And I realized that not only do I live in an amazing world, but I work for a pretty cutting edge, forward thinking company, which some may find surprising given its size and scale.

 

And I wonder what is coming next to marvel me. Though they are about 8 weeks away, I am really looking forward to Strata Hadoop World NYC, October 15-17, and to SAP TechEd && dcode in Las Vegas the following week, October 20-24. I am going to be at both, and am now really excited to see what I will come across at the shows that will knock my socks off. If you want to experience some innovating from some amazing talent, come to the shows. Maybe our paths will cross and you can share with me what makes you see what a wonderful world we are privileged to live in, and what marvels you.

  • Learn how the SAP HANA platform for Big Data can help you take a simplified approach to making Big Data work for you and deriving true value from it for your business at www.sap.com/bigdata

Whether you’re a technology fanatic or a casual observer of the industry, it’s always helpful to take stock of industry trends in a broader context – particularly as they relate to SAP solutions. Several weeks ago, we sponsored a webinar on “Selecting Cloud Platforms for Customer Engagement,” featuring Forrester’s John Rymer and SAP’s Suresh Ramakrishnan. In their 60-minute presentation, John and Suresh did what they do best: discuss, respectively, cloud platforms in the broadest possible context, and why the SAP HANA Cloud Platform is the ideal Platform as a Service (PaaS) for our customers.

 

John kicked off his presentation with an important baseline: the fact that faster delivery of great software has been a paramount goal in the technology world for decades. But as he points out, the pressure to build amazing apps quickly is now greater than ever: while mobile devices, social media and the proliferation of data have made customers more empowered and demanding, the digital competition to attract and retain them is unprecedented. Hence, the need for real-time, next generation customer engagement applications in the cloud. John provided several excellent examples of such mashup apps, in B2C and B2B industries alike. While the objectives, audience and underlying technology of these applications are diverse, striking commonalities abound:

 

[Webinar slide: common characteristics of these next generation customer engagement applications]

 

This leads to the fundamental question: how does one build such solutions? “It starts with cloud platforms,” according to John, who outlined various services that are essential for the development of such applications, including predictive analytics, real-time integration, and those that foster a cross-channel customer experience. Citing the “tremendous number of questions from clients on selecting cloud platforms,” he then provided a nice summary of the advice given to such customers. I don’t want to steal his thunder, but I hope you find John’s framework and suggestions to be as valuable and thought provoking as I did. Among other things, he offered an intriguing analysis of cloud platform services along two dimensions: Value to Business and Proximity to Business (the latter best understood by answering the question, “does the service provide functions that business people understand and interact with every day,” or is it primarily IT-centric?). Finally, he ended his presentation with a fascinating statistic from Forrester: of 600 application delivery shops recently surveyed in North America and Western Europe, 32% now routinely deliver software in a staggering one to three weeks. Given the unprecedented pace of application development today -- not to mention the scale of such apps -- it’s no wonder Forrester says the public cloud market is now in a “hypergrowth” phase.

 

Enter SAP HANA Cloud Platform, our in-memory Platform as a Service that enables you to build, extend and run applications on SAP HANA in the cloud. While I know many of you are very familiar with SAP’s cloud platform already, you’ll hear from Suresh about how it fits nicely into John’s framework, and how customers like Danone and NFL are using it to build superb, highly engaging next generation customer applications. You’ll also learn about our comprehensive set of services for apps, database and infrastructure, and how you can start building and discovering applications quickly with the SAP HANA Marketplace.

 

I hope you enjoy viewing our webinar with John and Suresh, and learning about the services and capabilities to look for in a cloud platform. On behalf of all of us at SAP, we look forward to hearing about your application development successes, and to seeing your next generation cloud apps in action.

Supply chains are becoming increasingly complex and increasingly central to driving business results. They have transformed from chains into interconnected networks requiring much more fluid and dynamic processes. Balancing supply and demand and getting the forecast right has a real impact on the performance of the business. This process requires powerful technologies to effectively manage and leverage the big data volumes now available from structured and unstructured sources. How can an IT professional help choose the right solution that unifies sales, operations, finance, marketing, and the supply chain through technology? This infographic will help. Put on your Sherlock hat and start searching.

 

STEP 1: Observe your surroundings

Ask your users what problems are most pressing and what objectives they are trying to meet with the new technology. If your sales and operations team spends more of its time assembling data points from large sets of spreadsheets and various systems than planning, the accuracy of their forecast may suffer. Common challenges include the inability to create a “one view” forecast with sales teams spread worldwide, and inaccurate forecasts due to missing data, delays, and a lack of modeling capabilities. These can lead to lost market share and inventory shortages or surpluses.

 

STEP 2: Gather Your Facts

How can you help the sales and operations team access supply chain data in real time, with visibility into planned and actual demand across different product lines, supported by “what-if” scenario modeling (sketched briefly after the list below) and collaboration capabilities, to arrive at a holistic forecast? These capabilities would help:

  • Balance supply and demand
  • Greatly reduce dependency on inventory
  • Integrate financial and operational planning
  • Link high-level strategic plans with day-to-day operations
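
To make “what-if” scenario modeling a little more concrete, here is a toy Python sketch that adjusts a demand assumption and recalculates projected inventory. The figures and structure are invented for illustration only; this is not the SAP S&OP application.

    # Toy what-if sketch (illustrative only, not the SAP S&OP application):
    # adjust a demand assumption and see the effect on projected inventory.
    baseline_demand = {"Jan": 1000, "Feb": 1200, "Mar": 900}
    planned_supply  = {"Jan": 1100, "Feb": 1100, "Mar": 1100}

    def project_inventory(demand, supply, opening_stock=500):
        stock, projection = opening_stock, {}
        for month in demand:
            stock += supply[month] - demand[month]
            projection[month] = stock
        return projection

    print(project_inventory(baseline_demand, planned_supply))
    # What-if: demand comes in 10% higher than forecast
    upside = {m: int(d * 1.10) for m, d in baseline_demand.items()}
    print(project_inventory(upside, planned_supply))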

 

In fact, according to an IDC study[1], companies that run successful S&OP processes claimed up to 20% inventory reduction and up to 25% faster time to market, and were able to gain up to 5% more revenue.

 

STEP 3: Draw Your Conclusion

Give SAP Sales and Operations Planning powered by SAP HANA a try. This cloud-based solution enables real-time planning and on-the-fly calculations across billions of planning points. Your sales and operations team will be able to orchestrate the S&OP process by working with big data from all areas of your business, collaborating across company boundaries, and doing it in real time, increasing the speed of your decision making through rapid analytics and simulation capabilities. Start your trial today.

 


[1] IDC Research, Modern S&OP in Consumer Products

Back in April, I wrote about the University of Kentucky’s success with SAP HANA that was conveyed by an IDC study. The study revealed that SAP HANA increased data reporting speeds by as much as 420 times, making granular data available in minutes rather than days! The university now experiences up to 15 times improvement in query load times, and up to 87% reduction in ETL times. On the business front, the study reported an ROI of 509%!

 

 

There is a human aspect of this amazing technology driven story. It is the impact this technology is having on student lives. Join me on Sep 17, at 10.30 am ET / 7.30 am PT when I will be live with 3 students from the University of Kentucky and their CIO, Dr. Vince Kellen to talk about how the student experience leverages the power of SAP HANA, and what their visionary CIO has been able to accomplish in positively impacting the future of the students at the university.

 

 

Remember to click the link below at 10.30 am Eastern (7.30 am Pacific) on Wednesday, September 17, 2014.

 

Link: http://www.uky.edu/ukit/about/streams/executive-council

 

Follow me on Twitter at: @puneetsuppal

John Appleby

The SAP HANA FAQ

Posted by John Appleby Sep 9, 2014

I’ve been meaning to pen an update to this FAQ for nearly 2 years, with this being the primary listed reference on Wikipedia, but somehow never found the time. When I heard Steve Lucas wanted to collaborate, I thought it was time for a rewrite and update!

 

 

Part 1 – HANA Overview

What is SAP HANA?

 

SAP HANA is an in-memory database and application platform, which is for many operations 10-1000x faster than a regular database like Oracle on the same hardware. This allows simplification of design and operations, as well as real-time business applications.  Customers can finally begin to reduce IT complexity by removing the need for separate and multiple Application Servers, Operational Data Stores, Datamarts and complex BI Tool implementations.

 

SAP HANA is a “reinvention” of the database, based on 30 years of technology improvements, research and development. It allows the building of applications that are not possible on a traditional RDBMS, and the renewal of existing applications like the SAP Business Suite.

 

 

Why did SAP build a database?

SAP co-founder and Chairman Hasso Plattner believed that if a database could be built with a zero response time, business applications would be written fundamentally differently – and IT landscapes could be simplified. Researchers at the Hasso Plattner Institute in Potsdam theorized that, with modern computers and software design, this would be very nearly possible.

 

SAP makes business applications, and since it was clear that none of the incumbent software vendors like Oracle would write such a database and application platform, SAP needed to build its own. In addition, this would be the springboard for a complete renewal and simplification of SAP’s applications to take them through the next 20 years.

 

 

Is SAP HANA just a database?

 

No. When SAP went to build HANA, they realized that the next generation of business applications would require a much more integrated approach than in the past.

 

SAP HANA contains – out of the box – the building blocks for entire enterprise applications. HANA can take care of requirements that would be served by many layers in other application platforms, including transactional databases, reporting databases, integration layers, search, predictive and web. All of this works out of the box, with a single installation.

 


Where does SAP HANA come from?

SAP built SAP HANA from the ground up, including research from the Hasso Plattner Institute in Potsdam, the acquisition of the IP from the p*Time database, the TREX search engine, BWA in-memory appliance and MaxDB relational database. It has been extended with intellectual property from the Business Objects and Sybase acquisitions with products like Sybase IQ and Business Objects Data Federator.

 

Whilst HANA has a legacy and some code from other products, the bulk of the database and platform has been written from the ground up.

 

 

What makes SAP HANA fundamentally different?

SAP HANA is different by design. It stores all data in-memory, in columnar format and compressed. Because HANA is so fast, sums, indexes, materialized views and aggregates are not required, and this can reduce the database footprint by 95%. Everything is calculated on-demand, on the fly, in main memory.  This makes it possible for companies to run OLTP and analytics applications on the same instance at the same time, and to allow for any type of real-time, ad hoc queries and analyses.
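
To make the “columnar, compressed, aggregated on the fly” idea concrete, here is a minimal Python sketch of dictionary encoding, the compression technique column stores typically rely on. It illustrates the principle only; it is not HANA’s internal implementation.

    # Illustrative sketch (not HANA internals): dictionary-encoded column storage.
    # Each column keeps a dictionary of distinct values plus small integer IDs,
    # which is why columnar data compresses well and scans quickly.
    from collections import defaultdict

    def encode_column(values):
        dictionary, ids, lookup = [], [], {}
        for v in values:
            if v not in lookup:
                lookup[v] = len(dictionary)
                dictionary.append(v)
            ids.append(lookup[v])
        return dictionary, ids

    countries = ["DE", "US", "DE", "FR", "DE", "US"]
    dictionary, ids = encode_column(countries)
    # dictionary == ['DE', 'US', 'FR'], ids == [0, 1, 0, 2, 0, 1]

    # An aggregate ("how many rows per country?") is a single scan over the IDs,
    # with no pre-built aggregate table required.
    counts = defaultdict(int)
    for i in ids:
        counts[dictionary[i]] += 1
    print(dict(counts))   # {'DE': 3, 'US': 2, 'FR': 1}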

 

On top of this, SAP built solutions to the classic problems of columnar databases, like concurrency (HANA uses MVCC) and row-level insert and update performance (HANA uses various mechanisms, like a delta store).
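
For readers unfamiliar with the delta-store idea, the short Python sketch below shows the general pattern of a small write-optimized delta sitting alongside a read-optimized main store. It is a conceptual illustration, not HANA code.

    # Conceptual sketch (not HANA code): a column with a read-optimized "main"
    # store and a small write-optimized "delta" store. Inserts go to the delta;
    # a periodic "delta merge" folds them into the main store.
    class Column:
        def __init__(self):
            self.main = []    # sorted and compressed in a real system
            self.delta = []   # append-only, cheap to write

        def insert(self, value):
            self.delta.append(value)        # writes never rewrite the main store

        def scan(self):
            return self.main + self.delta   # reads see both stores

        def delta_merge(self):
            self.main = sorted(self.main + self.delta)
            self.delta = []

    col = Column()
    for v in [42, 7, 19]:
        col.insert(v)
    print(col.scan())    # [42, 7, 19] -- visible before any merge
    col.delta_merge()
    print(col.main)      # [7, 19, 42]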

 

If this weren’t enough, SAP added a set of engines inside HANA to provide virtual OLAP functionality, data virtualization, text analysis, search, geospatial, graph (available soon) and web. It supports open standards like REST, JSON, ODBO, MDX, ODBC and JDBC. There is as much functionality in there as in a whole Oracle or IBM software stack, in one database.

 

 

What kinds of use cases does SAP HANA support?

The first HANA deployments were all analytical use cases like Datamarts and Data Warehouses, because the benefits are there right out of the box. EDWs like SAP BW run like lightning with a simple database swap.

 

With a transactional application like Finance or Supply Chain, most things run a little better after a simple database swap (SAP claim 50% faster for their own core finance). The real benefits come when application logic is optimized and pushed down to the database level, from simplification of the apps (SAP is building a simplified version of its Business Suite), or from ancillary benefits like real-time operational reporting, real-time supply chain management or real-time offer management.

 

Best of all, unlike the other database systems in the market, HANA supports all applications on the same instance of data at the same time.  No more copying, transforming and re-organizing data all over the enterprise to meet the needs of different applications. HANA perfectly serves the needs of all applications with one “system of record” instance.

 

SAP have provided a Use Case Repository that catalogues the various use cases for HANA.

 

What SAP Applications run on SAP HANA?

SAP CEO Bill McDermott said “HANA is attached to everything we have”.

 

Almost all the major SAP Applications now run on the SAP HANA platform. This includes the SAP Business Suite (ERP, CRM, PLM, SCM) and the SAP BW Data Warehouse.

 

The BI Suite including BusinessObjects Enterprise, Data Services and SAP Lumira are all designed to run on the HANA platform.

 

There are a set of Applications Powered by SAP HANA including  SAP Accelerated Trade Promotion Planning, SAP Collection Insight, SAP Convergent Pricing Simulation, SAP Customer Engagement Intelligence, SAP Demand Signal Management, SAP Assurance and Compliance Software, SAP Liquidity Risk Management, SAP Operational Process Intelligence, and SAP Tax Declaration Framework for Brazil.

 

In addition, SAP runs much of its cloud portfolio on HANA, including the HANA Cloud Platform and SAP Business ByDesign. The Ariba and SuccessFactors apps are in the process of migration.

 

What’s the business case for SAP HANA?

We’ve built business cases for HANA deployments of all sizes, and whilst they vary, there are a few common themes:

 

  • TCO Reduction. In many cases HANA has a lower TCO. It reduces hardware renewal costs, frees up valuable enterprise storage and mainframes, and requires much less maintenance.
  • Complexity to simplicity. HANA simplifies landscapes by using the same copy of data for multiple applications. Our implementations have shown that adding additional applications to a HANA dataset is very fast and easy, delivering business benefits quickly.
  • Differentiation. HANA’s performance, advanced analytics (predictive, geospatial, text analytics) and simplicity often mean a business process can be changed to differentiate against competitors. Customer scenarios like loyalty management, personalized recommendations, and anything where speed or advanced analytics capabilities are differentiating are all candidates.
  • Risk Mitigation. Many customers know that in-memory technologies are changing the world and so want to put an application like SAP BW on HANA or LOB Datamarts in place as a first step, so they can react quickly to future business demands.

 

 

Is SAP HANA a database, platform, appliance, or cloud?

 

SAP HANA was designed to be a truly modern database platform, and as a result the answer is: all of the above. A modern database should be a database, platform and be available on-premise or in the cloud.

 

SAP has a large installed base of on-premise ERP customers, and the HANA platform supports their needs, especially the need for an enterprise-class database. Many of those customers are looking for an on-premise database to replace the traditional RDBMS.

 

The demanding needs of an in-memory database mean that SAP elected to sell SAP HANA as an appliance, and it comes pre-packaged by the major hardware vendors as a result.

 

However, the future of business is moving into the cloud, and SAP HANA is available as Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) with the HANA Cloud Platform, as Managed Cloud as a Service (MCaaS) with the secured HANA Enterprise Cloud, and via third-party cloud vendors. Customers can also choose a hybrid deployment model that combines on-premise and cloud. More details on this are available here.

 

 

How does SAP HANA compare to Oracle or IBM?

SAP HANA was designed to be a replacement to Oracle or IBM databases, either for net new installations or for existing customers. In most cases it is possible to move off those databases easily, and gain reporting performance benefits out of the box. Then it is possible to adapt the software to contain functions that were not possible in the past.

 

All three of the major RDBMS vendors have released in-memory add-ins to their databases in the last year. All of them support taking an additional copy of data in an in-memory cache, or in IBM’s case columnar tables.  All of them provide improved performance for custom data-marts.  But make no mistake; caching data has been around for a long time, while an in-memory database platform to run transactions and analytics together in the same instance is a new innovation.

 

Traditional database caching solutions are similar to the GM and Ford response to hybrid cars – take their existing technology and bolt new technology to it. SAP HANA is more akin to Tesla, who rebuilt the car from the ground up based on a new paradigm.

 

And so, from a business application perspective, HANA’s capabilities are three years ahead of what others have.

 

 

How is HANA licensed?

SAP tried to keep licensing simple with HANA.

 

HANA is available in the Cloud as Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and as an application platform (AaaS), and it is possible to buy all those options now, on a monthly basis, from the SAP Website.

 

For on-premise customers, HANA is licensed in one of two major ways:

 

First is as a proportion of your SAP Application Value (SAV), just like you can license other databases from SAP. This could be for your whole estate, or for a specific product like BPC.


Second is by the unit, which is 64GB of RAM. There are a few editions of HANA, depending on your need, that bundle other software and allow more, or less, restrictive usage. The pricing is tiered, depending on the number of units you buy, and accretive.

 

In all cases, HANA licensing includes a lot of functionality that you would pay extra for in other databases. For example, Dev, Test, HA, DR licensing are always included. And if you buy HANA Enterprise, you have access to all functionality at no additional cost – including Predictive Libraries, Spatial, Graph, OLAP, Integration and Web. HANA contains a huge amount of functionality that would require 20-30 different SKUs from Oracle.

 

For those customers who need the base functionality of HANA but not the bells and whistles, there is now a HANA Base Edition, on which you can add other functionality as required, at a lower cost point.

 

Features compared across the HANA Base, HANA Platform and HANA Enterprise editions include:
  • Partitioning, Compression, Security
  • Calculation Engine, Aggregation Engine
  • XS Engine, River, SQLScript, HANA Studio
  • BRM, BFL
  • Smart Data Access
  • Predictive Analytics Library
  • Geospatial
  • Planning
  • Graph
  • Search, Text Analysis
  • System Landscape Transformation, Data Services

 

 

Part 2 – HANA Technology

How big can a SAP HANA database grow? Does it scale?

With current hardware, SAP HANA can scale up to 6TB for a single system, and can scale out to 112TB in a cluster, or more. There is no hard technical limit to the size of a HANA cluster. Higher configurations are tested and certified at customer sites.

 

We are currently working with SAP on 24TB single systems, which we expect to see this year.

 

At Bluefin, we regularly work with 2-10TB of memory in a single HANA DB, and this is where we find most business cases make sense. Remember that a 10TB HANA appliance can store a vast amount of data (as much as 50-100TB from a traditional RDBMS due to HANA’s data compression capabilities); this could represent all the credit card transactions for a top 10 bank for 10 years or more.

 

In addition, we find that customers look to be more intelligent about how they tier data with an in-memory appliance. Once the HANA database grows past 2TB, it makes a lot of sense to use a cold store like Sybase IQ for slow-changing data.

 

 

Is SAP HANA a row- or column-oriented database?

SAP HANA stores data for processing primarily in columnar format. But unlike other columnar databases, HANA’s columnar store was designed from the beginning to be efficient for all database operations (reads, writes, updates). In practice, 99% of the database tables in SAP ERP are columnar tables, including transactional and master data tables.

 

HANA can also store data in row format, but this is primarily used to store configuration information and queues – only scenarios for which the column store is specifically not suited. With HANA, data is stored once, in its most granular form, and aggregated on request. There is no hybrid row/column store, no duplication or replication of data between row and column stores – HANA stores the data in the column store only.

 

 

Does SAP HANA require indexes or aggregates?

Every column in SAP HANA is stored as an index, and therefore HANA has no need for separate primary indexes. Secondary indexes with multiple columns are possible and used for OLTP scenarios like the Business Suite. HANA will also self-generate helper indexes to ensure that multi-column joins are efficient.

 

It is almost never necessary to aggregate data in HANA in advance, because HANA calculates so quickly. HANA processes 3bn scans/sec/core and 20m aggregations/sec/core, which means 360bn scans/sec and 2.4bn aggregations/sec on a typical 120-core appliance. As a result it is much more efficient to calculate the information you require on demand.
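
The appliance-level figures follow directly from the per-core figures; here is a quick back-of-the-envelope check in Python, using the per-core numbers quoted above:

    # Back-of-envelope throughput from the per-core figures quoted above.
    cores = 120
    scans_per_core = 3_000_000_000   # 3 bn scans per second per core
    aggs_per_core = 20_000_000       # 20 m aggregations per second per core

    print(f"{cores * scans_per_core / 1e9:.0f} bn scans/sec")         # 360 bn
    print(f"{cores * aggs_per_core / 1e9:.1f} bn aggregations/sec")   # 2.4 bn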

 

 

Is SAP HANA a Big Data platform?

Yes, although HANA is best suited to high-value data, because it keeps data mostly in-memory. When Big Data is low value (e.g. web logs), HANA is very well suited as the store for high-value aggregated information and applications.  This could be an organization’s hot data, e.g., 4 months of financial information for quarterly reporting.  Other sources could be used to store additional data; for example SAP IQ could store 13 months of financial data for annual reporting (warm data) and Hadoop could store >10 years of financial data for seasonal and long term trend analysis (cool data).  Large volumes of data in both IQ and Hadoop can be analyzed in combination with data in HANA, so it is possible to process the data in HANA into full-text Google-style indexes without storing all the detail in HANA.

 

 

Is SAP HANA Enterprise Ready?

Yes. From its inception, HANA was intended to be a mission-critical database.

 

SAP HANA always stores a copy of data on disk for persistence, so if the power goes out, it will load data back into memory when power is restored (generally on-demand, but this is configurable). It stores logs so a very low Recovery Point Objective is possible.

 

HANA also has inbuilt capabilities to replicate the data to standby systems, so in a cluster, you can have High Availability and in any configuration you can have a cluster for Disaster Recovery and Fault Tolerance for business continuity. Disaster Recovery can be configured at the storage-level (depending on vendor) and also at the database level, which is called system replication.

 

It’s worth noting that most customers implement either HA or DR for HANA. It is exceptionally easy to set up (DR takes just a few clicks), and most customers that invest in HANA find business continuity is important to them.

 

SAP HANA also has interfaces for 3rd party backup and monitoring, like TSM or NetBackup. Solution Manager and SAP Landscape Virtualization Management are supported if you’re an SAP shop.

 

 

What is the SAP HANA release schedule?

SAP HANA was designed to be “timeless software”, meaning that any revision can be updated to any other revision with no disruption. It is possible to update from any revision of HANA to any other, with very few restrictions.

 

Every 6 months there is a major release of HANA, called a Service Pack. Service Pack 8, or SPS08, was released in June 2014. SPS09 is expected in November 2014. These contain new features and major updates, and SAP HANA continues to be developed.

 

Each SPS gets a number of updates, or revisions, and these contain fixes and performance improvements only, as you would expect in enterprise software. Typically these are released every 2-6 weeks, based on demand. As HANA matures, we have seen fewer revisions per SPS.

 

In addition, there are maintenance releases of SAP HANA for the previous SPS for an additional 6 months, to allow customers to apply critical fixes whilst planning an update to the latest SPS. The maintenance releases contain only critical bug fixes.

 

 

What happens if the power goes out?

SAP HANA is a completely ACID-compliant database which is designed to have a low Recovery Point Objective (RPO). HANA writes savepoints to disk at frequent intervals, which contain a snapshot of what is in memory. In-between savepoints, HANA saves a log of each database change to a fast flash disk.

 

If the power goes out, HANA loads the last savepoint and then plays the logs back, to ensure consistency.
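
Conceptually, the recovery sequence is savepoint plus log replay. The Python sketch below illustrates that idea only; it is not HANA’s persistence code, and the key/value structure is invented for the example.

    # Conceptual sketch of savepoint-plus-log recovery (illustrative only).
    # State is periodically snapshotted to a savepoint; every change in between
    # is appended to a log. Recovery loads the last savepoint and replays the
    # log entries written after it.
    savepoint = {}          # last snapshot of in-memory state
    log = []                # changes since the last savepoint

    def write(store, key, value):
        store[key] = value
        log.append((key, value))            # durable log entry per change

    def take_savepoint(store):
        global savepoint, log
        savepoint = dict(store)             # snapshot current state
        log = []                            # log restarts after the savepoint

    def recover():
        store = dict(savepoint)             # load the last savepoint...
        for key, value in log:              # ...then replay the log
            store[key] = value
        return store

    memory = {}
    write(memory, "order_1", "open")
    take_savepoint(memory)
    write(memory, "order_1", "shipped")     # change after the savepoint
    # power failure: the in-memory copy is lost, savepoint + log survive on disk
    assert recover() == {"order_1": "shipped"}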

 

 

What hardware does SAP HANA run on?

HANA appliances must be certified, and come either as pre-built appliances from your vendor of choice or as a custom build using your own storage and networks (“Tailored Datacenter Integration”, or TDI).

 

SAP maintain a list of certified hardware platforms which currently includes Cisco, Dell, Fujitsu, Hitachi, HP, Huawei, IBM (Lenovo), NEC and SGI, and is being extended all the time. Note that this list only contains the new “Ivy Bridge” appliances and not the older “Westmere” appliances.

 

The exact hardware and storage configuration varies by vendor. Some use servers and others use blades; some use a SAN storage network whilst IBM uses local storage with the GPFS distributed file system. In our experience, all these variants work very well.

 

In addition you can buy HANA in the cloud from Amazon, SAP and various other outsource partners like T-Systems or EMC. In this case, you can either pay a monthly subscription fee including license, or use an existing Enterprise license “Bring Your Own License”.

 

 

What Operating Systems does SAP HANA run on?

For Intel x86, both SUSE Linux and RedHat Linux are now supported options. Both have a SAP-specific installer that configures Linux correctly for SAP HANA out the box.

 

For IBM POWER, the SUSE Linux operating system will be supported. At this time it does not look like SAP will support AIX.

 

What development software does SAP HANA use?

 

SAP HANA has two primary development environments. The main desktop software is called HANA Studio, which is based on Apache Eclipse. HANA Studio allows for administration and development in a single interface, which is extremely effective. It is possible to create entire developments in HANA Studio, which provides application lifecycle management and development capabilities for all HANA artifacts – from data model through to stored procedures through to web application code.

 

There is also a web editor and administration panel based on Apache Orion, which continues to be developed and is a useful addition. We expect to see convergence of these two tools in the future, to allow choice for cloud developers in particular.

 

Lifecycle management is entirely managed within a Web application within the XS Application Server.

 

 

What client software does SAP HANA support?

 

SAP HANA has a wide range of interfaces. SAP’s own BI Suite, Lumira, Design Studio and Analysis for Office software all have native HANA connectors. Likewise, many third party applications like Tableau, Qlik and MicroStrategy all have HANA connectors.

 

SAP HANA has open standards support for ODBC, JDBC, ODBO and MDX as well as a raw SQL client, hdbsql. In addition, there are Python libraries for HANA.
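
As a quick illustration of the Python route, here is a minimal connection sketch assuming SAP’s hdbcli driver is available on the client machine; the host, port and credentials are placeholders.

    # Minimal sketch using SAP's hdbcli Python driver (assumed available with
    # the SAP HANA client). Host, port and credentials are placeholders.
    from hdbcli import dbapi

    conn = dbapi.connect(address="hanahost", port=30015,
                         user="MYUSER", password="secret")
    cursor = conn.cursor()
    cursor.execute("SELECT CURRENT_TIMESTAMP FROM DUMMY")  # DUMMY is the one-row system table
    print(cursor.fetchone())
    cursor.close()
    conn.close()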

 

Integration support is possible using the XS Engine for OData and Server Side JavaScript.
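
To illustrate the OData route, the sketch below consumes an XS OData service with the third-party Python requests library; the URL, service path, entity set and credentials are all hypothetical placeholders.

    # Hedged sketch: consuming an XS OData service over HTTP(S) with requests.
    # The URL, entity set and credentials below are hypothetical placeholders.
    import requests

    url = "https://hanahost:4300/myapp/services/sales.xsodata/Orders"
    resp = requests.get(url, params={"$format": "json", "$top": 5},
                        auth=("MYUSER", "secret"))
    resp.raise_for_status()
    for order in resp.json()["d"]["results"]:   # OData v2 JSON envelope
        print(order)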

 

In addition, ETL software like Data Services and Informatica is supported, as well as System Landscape Transformation (SLT) and Sybase Replication Server (SRS) for real-time replication.

 

 

What language was SAP HANA written in?

The majority of the SAP HANA software stack was written in C++. In fact, when you compile SAP HANA objects, they do in turn become C++ code, which is one of the reasons why HANA is so fast. The Predictive Analysis Library and Business Function Libraries are also written in a HANA-specific variant of C++ called L-Language, which provides memory protection.

 

Certain optimizations have been made using C and machine code, which is common for many databases. In addition, a lot of the tooling for HANA was written in Python, for ease of writing and adaption.

 

Since SAP HANA contains a web server, a lot of code is now written in HTML and Server-Side JavaScript – including the SAPUI5 library and Apache Orion-based Web Editors.

 

 

What happens if my data exceeds my memory size? Can I control data temperature?

HANA always stores data on disk and loads parts of database tables on demand into RAM. When RAM is exhausted, HANA will drop out parts of database tables that were least recently used.
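
The unloading behaviour is essentially least-recently-used eviction at the table and column level. The Python sketch below is a conceptual illustration of that policy only, not HANA’s memory manager.

    # Illustrative LRU sketch (not HANA code): when memory is full, the least
    # recently used table is unloaded and reloaded from disk on next access.
    from collections import OrderedDict

    class ColumnStoreCache:
        def __init__(self, capacity):
            self.capacity = capacity
            self.loaded = OrderedDict()          # table name -> in-memory columns

        def access(self, table):
            if table in self.loaded:
                self.loaded.move_to_end(table)   # mark as most recently used
            else:
                self.loaded[table] = self._load_from_disk(table)
                if len(self.loaded) > self.capacity:
                    victim, _ = self.loaded.popitem(last=False)   # unload LRU table
                    print(f"unloaded {victim}")
            return self.loaded[table]

        def _load_from_disk(self, table):
            return f"<columns of {table}>"

    cache = ColumnStoreCache(capacity=2)
    for t in ["SALES", "ORDERS", "SALES", "DELIVERIES"]:   # ORDERS gets unloaded
        cache.access(t)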

 

In addition, the Smart Data Access data virtualization layer allows you to access data in any other database, like Sybase IQ or even Oracle and transparently access it like any other data in HANA. This helps improve the TCO of HANA, and simplifies your IT landscape by reducing the amount of data copied, transformed and moved around the enterprise.

 

In a future release of SAP HANA, we expect to see a transparent disk store, where warm, lower value data can be stored at a lower TCO. This is called dynamic tiering.

 

It’s worth noting that HANA and Hadoop are great friends – you can store documents and web logs in Hadoop and then store aggregated information in HANA for super-fast analysis. Need to add a new measure? Run a batch job in Hadoop from HANA to populate it.

 

 

What's coming next for SAP HANA?

 

The HANA SPS08 release was all about enterprise readiness and stability and there were relatively few new features. In SPS09 we see this changing once more and it looks like there will be lots of new functionality that customers will find useful. These are the themes we expect:

 

  • Support for more hardware platforms (IBM Power, maybe Intel E5) with fewer restrictions on components, plus larger hardware platforms (16- and 32-socket), and multi-tenancy.
  • A built-in disk-based store that supports dynamic data tiering for warm data, to reduce TCO.
  • The start of integration for event processing and ETL, and code push-down, and HANA Studio Integration.
  • Increased support for Hadoop and HDFS access.
  • Improvements to system replication, backup and system copies for Enterprise scenarios.

 

 

Where can I find more detail?

I have taken the time to curate a page on SAP’s Community Network SCN “SAP HANA – a guide to Documentation and Education”, which contains numerous links to other resources. If you’d like to know more about SAP HANA, then this is a great place to start.  SAPHANA.com is also a wonderful resource to find out more about HANA. You can start using HANA today with free SAP HANA developer edition.

 

Notes and sources

Some of this information came from meetings and interviews with the key HANA friends at SAP – Hasso Plattner, Vishal Sikka, Franz Färber, Mike Eacrett, Steffen Sigg, Carsten Nitschke, and many others.

 

Special thanks to Steve Lucas for his efforts collaborating on this FAQ with me. Steve lives and breathes HANA and having his input into this FAQ is awesome! Also, thank you to Mike Prosceno and Amit Sinha for their editorial assistance. All the good stuff in this piece is theirs, and the mistakes are all mine.

 

As an end-note, the questions in this FAQ were compiled from two primary locations - articles and comments on existing HANA sources, and conversations with customers. If you think there are questions missing - please go ahead and ask them in the comments!

Talking to my customers and prospects, I find that many of them have understood that, at its core, SAP HANA is a very fast database. But many of them ask, “OK, so HANA’s fast, but why do I need that speed? How does it help my business?” Let me answer that question by telling you a short story from my life before I joined SAP.

 

When I finished school, I was a driver and assistant to a team of emergency doctors. As the word “emergency” implies, getting to an urgent medical situation as fast as possible had a very direct benefit – it helped save lives. A person who suffers a serious heart attack has about 4 minutes until their brain cells start to die. It’s obvious how speed of response comes into play in this situation. Unless you can get to a heart attack victim in a matter of minutes, that person’s chance of a full recovery – or even survival – is at risk.

 

For an ambulance driver, the speed with which I could get to an accident or heart attack victim was not only critical to the patient, it was critical to how my efficiency as a driver was judged. I know that this is an extreme example, but I tell this story to many of my customers to help them understand that speed isn’t just a nice-to-have; for many businesses and business functions, it has become a must-have.

 

Let’s  look at an example of this from my business life.

 

[Slide: a typical process flow in finance]

I like this slide because it very nicely illustrates a typical process flow in finance. You can see the difference that being able to do a deep dive on the data and perform a root cause analysis makes. There is real money to be saved by understanding the issues hidden in the data that could not be seen until all of the available data – both transactional (OLTP) and analytical (OLAP) – could be analyzed.

 

 

[Slide from Hasso Plattner’s SAPPHIRE 2014 keynote: 20 years of IT infrastructure progress]

This slide was shown by Hasso Plattner during his keynote at SAPPHIRE 2014 in Orlando. It illustrates just how far we have come in terms of IT infrastructure over the last 20 years. For example, the largest available systems have gone from 4 cores with 1 GB of memory to 288 cores with 24 TB of memory. This allows us to do things radically differently. But, using a motor racing analogy, a racecar is not just a road car with a very fast engine. The racecar and all of its individual components have to be designed from the ground up to deliver the speed required to win races and championships.

That is what we have done with SAP HANA – it has been designed from the ground up to be fundamentally different, and speed is a by-product of that design. But SAP HANA is about more than just speed. HANA’s ability to process and analyze both transactional and historical data in volumes and at speeds greater than ever before challenges the way that many companies have traditionally approached business analytics.

So, as my ambulance story shows, speed is good especially when it enables deeper and faster data analysis. In the next part of my blog, I’ll discuss some more ideas on how SAP HANA enables companies to think about analytics in different ways especially when they try to incorporate Big Data with their traditional, internal data.

Twitter: @cochesdiez

Today, aircraft safety holds paramount significance. Spirit AeroSystems, one of the world’s largest manufacturers of structures for commercial, military, and business jets, produces composite and metallic aircraft fuselages, propulsion systems, and wing structures. When Spirit split from Boeing in 2005, its customer base grew substantially, resulting in a $41 billion backlog of orders to fill. To address the resulting data logjam, Spirit chose SAP HANA. Now the company has a single source of truth, in real time, taking the burden of manual reporting off of employees. The result is that each manufactured aircraft fuselage is created efficiently and properly, meaning less time waiting at airports, fewer planes held back for maintenance, and happier, more confident flyers.


Read the full story on Spirit's engagement with SAP here: Fly with Confidence | ZDNet

 

Connect with me on Twitter @CMDonato, LinkedIn, and on Google+.

It is amazing to me how little the benefits of the Suite on HANA are understood or even known in general and by the members of the Americas' SAP Users' Group specifically. The sERP system with reincorporated components like CRM, SRM and SCM is in my mind a bigger step forward than the introduction of R/3 22 years ago. Why is it so difficult to communicate the benefits? Let me try to find an explanation and reiterate the long list of benefits.


First, the HANA database is attacking the established leaders in the database market: Oracle, Microsoft, IBM and Teradata. That is a pretty bold move, and a lot of people might be afraid to bet on the newcomer. The fact that all of them are now following SAP with in-memory database features of their own should give you comfort and confirm that SAP HANA points in the right direction.


Second, speed may not seem to be a value in itself, but that impression is wrong. No customer of an SAP ERP, CRM, or SCM system would accept performance degrading by as little as 10% after a maintenance cycle. So shouldn’t an improvement by a factor of two in OLTP and a factor of 10 to 100 in OLAP be more than welcome? Speed is the number one reason for business processes to be supported by information technology. Just to name a few benefits:

  • earlier period closing
  • better forecasting
  • simulation of results or organizational changes
  • better insights into customer behaviour
  • real-time sales and cost analysis in any fashion
  • flexibility to simulate organizational structures within the system
  • better service levels in the customer-facing applications


The speed allows us to simplify not only how we run the applications but how we use them.


Third, SAP has long been the trusted advisor to its customers. Why should that no longer hold now that SAP is the innovator?

The changes in sERP are so radical, and so against the grain of most business and technology publications of the past, that only a few of the incumbents have managed to change their views. R/3 was similarly radical, but a worldwide movement towards client-server paved the way. Today many say you can do without these changes and be happy. I hope I can demonstrate that staying put means missing a huge opportunity. You have to understand the architectural changes in the applications, and the impact on the business, that are now possible with an in-memory platform.


Fourth, are you afraid of too much concentration in one system?

Unifying ERP, CRM, SCM, SRM, and PLM, and connecting them with the new network applications for purchasing, workforce management, transportation, travel, and so on, or with generic on-demand applications for office, sales support, and recruiting, leads to a massive simplification in both IT and business. The unified system is much less complicated: no aggregates updated transactionally, no redundant data to be managed, no unnecessary data transfers between applications. A much smaller data footprint allows for an unprecedented TCO reduction and a level of integration of vital business functions never experienced before.
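
To make the “no aggregates updated transactionally” point concrete, here is a hedged sketch using SQLite purely as a stand-in database, with invented table names: only the line items are stored, and totals are computed on demand rather than being maintained redundantly on every posting.

    # Illustration only (SQLite as a stand-in, hypothetical table names):
    # keep just the line items and aggregate on demand, rather than keeping
    # a redundant totals table in sync on every posting.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE sales_items (customer TEXT, amount REAL)")
    db.executemany("INSERT INTO sales_items VALUES (?, ?)",
                   [("ACME", 100.0), ("ACME", 250.0), ("Globex", 75.0)])

    # No sales_totals table to maintain -- the aggregate is computed on the fly.
    for row in db.execute(
            "SELECT customer, SUM(amount) FROM sales_items GROUP BY customer"):
        print(row)    # ('ACME', 350.0), ('Globex', 75.0)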


Fifth, was everything wrong in the past?

No, most of it was necessary to achieve decent performance, but now we can radically simplify the architecture of the systems and get, as a free benefit, new flexibility to change and extend the functionality of enterprise applications nearly on the fly. We remove a large part of the code and reduce the maintenance effort for the customer. The early adopters will have the advantage of moving ahead of the competition.


Sixth, nobody believed that SAP could reprogram large parts of the systems without disrupting the business.

Not unlike R/3, where we kept the R/2 functionality intact, sERP keeps the current functionality, the data, and the configuration largely unchanged. Most of the applications are completely read-only, and therefore we can introduce new functionality in parallel without any risk to data integrity. The current users of SAP Business Suite systems needed a clear path into the future, and a disruptive approach was not an option. The transition of BW to BW-on-HANA is an excellent example of this strategy. Unfortunately, the name HANA was once expanded by SAP as "High-Performance Analytic Appliance." This is simply wrong, because HANA is a platform for applications: an in-memory database with tools and a set of application libraries. This platform allows for a superior OLTP and OLAP system, especially when both run together in one instance.


Seventh, for capacity reasons we introduced redundant systems like SAP BW. What’s the strategy now?

With read-only replicas of all data in memory, we can provide this capacity on the original transactional data without any delay. Large parts of a data warehouse can now move back into the transactional system or its copy. Again, this is a simplification that does not increase the data footprint and is only possible with HANA. It does not make the idea of data warehouses obsolete.


Eighth, users complain about the complexity of SAP applications.

Only with HANA has SAP found a way to simplify the architecture dramatically. The fundamental requirements of on-demand software in the cloud were helpful in creating the energy level necessary to start the massive rebuild. Some aspects of the new architecture are also applicable to non-HANA-based on-premise systems.


Ninth, this all doesn’t address the main problem of a complicated and outdated user interface. 

The dramatically improved response times help instantly. More importantly, the speed allows us to redesign most of the transactions and reports using new UI technologies like SAP FIORI for screens and Lumira for visualizations. The new user interface comes completely in parallel, avoiding instant retraining of thousands of users. Everything new arrives for the mobile phone, the tablet, the laptop, and the desktop simultaneously.


Tenth, this sounds good, but the costs are too high and there is not enough know-how available.

The real cost savings are in the cloud, where hardware and operational costs can be managed much more efficiently, but even on premise, managed by SAP, with the right hardware configuration, costs will be much lower than in current systems. Just take the data storage costs, the time for backups, the effort to manage the integration of separate systems, the costs to optimize the infrastructure including the database, and the effort to maintain the system: they will all come down due to the reduction in volume and complexity. Since the current functionality will be carried forward and all new functionality comes in parallel, the risk can be minimized. Any database administrator can be trained on HANA in a very short time, and most of the education is now available as Massive Open Online Courses (MOOCs). The current consulting partners have already retrained large numbers of their people and will continue to do so.


Eleventh, there are no new must-have applications.


Only now are breakthrough applications possible, such as predictive maintenance or applications using sensor data and unstructured text. We can build a comprehensive simulator for corporate performance or analyze the human genome and proteome to enable personalized medicine. In fact, a whole new division within SAP, the Co-Innovation group, is focusing on completely new applications developed together with our valued customers.


Twelfth, some companies claim that they can’t see the business case for HANA.

First you have to understand the potential of a HANA-based system; then you have to trust that the benefits are real. You have to identify the pain points that could be resolved and see the new opportunities to improve the relationship with the customer. Look for the value of simplification, a prerequisite for adopting future innovation, or revisit some of the great ideas that failed in the past. Both might be possible now.



I could continue on and on. A new book, coming out soon, will hopefully give you some insights, answer most of the questions, and portray a future of enterprise computing. One thing I am sure about: the bold ones who adopt the new world early will benefit tremendously. And it’s not about a "good car" versus a "super car"; it’s about replacing ocean liners with jets crossing the Atlantic.

The number of SAP Business Suite on HANA customers continues to grow rapidly, and with it the number of new customer success stories. In her new article titled Over 1200 Customers and Growing, Lauren Bonneau, Managing Editor at Wellesley Information Services, sheds new light on five new stories from Sapphire Now 2014 in Orlando.

 

While there are many factors that influence the rapid adoption of any new product in the market, the following three were clearly evident in the article: clear and compelling business value, lower total cost of ownership (TCO), and ease of deployment compared to existing market alternatives.

 

Lower TCO

I'll start with the lower TCO factor, as this is the one the article zooms in on in its introduction. The lower TCO of Business Suite on HANA compared to classical databases is indeed a consistent theme referenced by customers and a strong driver behind the product’s fast adoption. The following five factors are consistently cited by our customers as contributors to the lower TCO:

Picture: Five factors customers cite as contributing to the lower TCO of Business Suite on HANA

In the article, Unilever, Hidrovias, and Pacific Drilling all talk about the simplified data footprints they have experienced after moving to Business Suite on HANA, while Mercedes AMG discusses how virtualization and Tailored Data Centre Integration allowed them to simplify the integration and operation of HANA within their own datacenter. Hidrovias do Brasil shared a cost analysis showing how running Business Suite on HANA Enterprise Cloud will result in 70% savings over 5 years compared to on-premise alternatives. For a more in-depth analysis of HANA’s lower TCO, check out Forrester's Total Economic Impact study.

 

Business Value

While lower TCO is a strong value proposition, it cannot by itself drive the rapid market adoption of a new product; there also needs to be a clear and compelling business value. As is apparent from the stories and videos in the article, customers today look at Business Suite on HANA as a way to move towards becoming a real-time business. In the case of Suzano, that meant gaining real-time access to data in the general ledger, real-time reconciliation, and real-time stock reassignment. In the case of Hidrovias do Brasil, it was capturing real-time logistics insights from their commodity transportation vessels in the waterways of Latin America. For Pacific Drilling, it was the ability to capture real-time insights from their deep-water drilling vessels off the coasts of Mexico, Brazil, and Nigeria.

 

Beyond the immediate business value, all five customers saw Business Suite on HANA as an innovation foundation for the future of their business. Yoram Boradaty from Pacific Drilling summarized this nicely when he said: “We see our move to SAP Business Suite powered by SAP HANA, as a ramp-up customer, as not just a best approach to solve the problem of providing new insight to the business through real-time operational reporting and analytics, but it also positions the organization strategically for the future.”

 

Ease of deployment

The third factor that was evident in the article is the ease of migration to HANA. Thanks to a strong, global partner ecosystem of certified hardware vendors and system integrators, Business Suite on HANA customers are able to go live with the solution within a short period of time and with no disruption to the business. In the case of Hidrovias, that time was only four months and is expected to drop to three months for future roll-outs. In the case of Suzano, the company logged only six hours of HANA-related downtime during the migration, while Mercedes AMG reported no disruption to the business after going live in April 2014, despite being the first company to go live with Business Suite on HANA in a virtualized productive environment.

 

Conclusion

Every day, new customer stories like the ones in this article are emerging, countering any remaining misperception about HANA lacking business value, being cost-prohibitive, or being too complex to implement. On the contrary, it is for these very reasons that customers are choosing HANA.

SAP is committed to continuously enhancing the openness of SAP HANA, providing its customers and developer community with the greatest possible flexibility and choice. Openness helps customers achieve innovation, increase agility, and lower TCO. This year’s SAP HANA product news certainly reflects this commitment.

 

Since the beginning of the year, SAP has introduced a number of new hardware and software innovations across all layers of the SAP HANA technology stack, further extending SAP’s commitment to openness. Working closely with its partners, SAP has incorporated the latest technology advancements around microprocessors, memory, storage, networking, operating systems, and virtualization into the SAP HANA infrastructure. These innovations not only open up many new deployment choices for SAP HANA, they also deliver real business value to SAP customers through significant cost savings and dramatic performance improvements.

 

Here is a recap of the SAP HANA infrastructure innovations that we brought to the market this year:

 

  • On Feb 18th, Intel released a new generation of the Xeon family of microprocessors (E7 v2, also known as Ivy Bridge). The new Ivy Bridge processors provide up to 50% more cores, up to 4x the I/O bandwidth, and support for up to 3x more RAM than the previous generation. SAP’s hardware partners were ready to deliver pre-certified SAP HANA Ivy Bridge appliances on day one of Intel’s announcement. Thanks to the increased clock speed of the cores and twice the amount of available memory with the new microprocessors, most customers have experienced dramatic (often doubled) performance improvements when running their applications on the new SAP HANA Ivy Bridge appliances. Larger memory sizes enable higher scale-up capabilities for single-node systems. For many customers, these higher scale-up capabilities eliminate the need to deploy scale-out configurations, which can be more costly and more challenging to manage.


Today, there are more than 100 supported SAP HANA Ivy Bridge appliance configurations.  This number continues to grow with new configurations being introduced almost daily. Don’t forget to check the SAP Certified Appliance Hardware for SAP HANA  site for the latest, up-to-date list of supported HANA Ivy Bridge appliances.

 

  • Several important SAP HANA news announcements were made at this year’s Sapphire Conference:

 

 

SAP HANA on VMware provides a new deployment architecture and allows for better utilization of the underlying hardware infrastructure by enabling multiple SAP HANA virtual machines to run on the same host appliance. Customers can also leverage VMware template cloning for fast and easy provisioning of new SAP HANA instances. Zero-downtime maintenance is enabled via VMware vMotion support, allowing customers to dynamically move virtualized HANA instances to new HANA-certified hardware for scheduled maintenance. Finally, native HANA and VMware high-availability capabilities can be combined for increased business continuity and to further streamline and optimize data center operations for customers who have adopted data center virtualization strategies.

 

 

SAP HANA on RHEL provides customers with an additional deployment choice. Customers that already run Red Hat in their IT landscape can now also deploy SAP HANA on Red Hat, thus leveraging existing skills and standardizing their business operations within the data center. The platform also provides military-grade security technology such as Security Enhanced Linux (SELinux), typically required by many companies in financial, government and other highly regulated sectors.

 

As of today, several SAP partners (e.g. Cisco, Dell, Fujitsu, IBM, and NEC) already have RHEL-based SAP HANA appliances available.  This list has been rapidly growing.  You can find the list of certified vendors and configuration information at SAP Certified Appliance Hardware for SAP HANA.

 


Many businesses today run their mission critical SAP applications on IBM Power Systems. SAP HANA on IBM Power provides our joint customers with a choice to preserve/extend their investments (in infrastructure, people, processes) on Power, while adopting next generation technology like SAP HANA to drive innovation in the future. This will further expand the deployment options for SAP HANA, providing customers who want to standardize on IBM Power with the option to deploy SAP HANA on their infrastructure platform of choice (when it becomes generally available).


An early adopter program that will allow customers to go live with single node, scale-up SAP Business Warehouse or ERP on SAP HANA on Power 7+ is planned for the end of November (in the context of HANA SPS09).

 

Also at Sapphire, SAP briefed its partners that it will further relax SAP HANA hardware requirements for non-production systems and extend SAP HANA appliance reference configurations for production systems.  SAP has made great progress on both of these topics since Sapphire. 

 

In the case of SAP HANA non-production systems, hardware requirements have always been less stringent than those for production systems. For example, customers could deploy multiple workloads and consolidate their development and test systems by sizing the non-production systems with a 2x higher core-to-memory ratio than the one used for production systems. Last month, SAP took additional steps to further relax hardware requirements and lower the cost of SAP HANA non-production systems:

 

  • Lower-end members of the E7 processor family (e.g. Westmere EX Xeon E7-x8xx or Ivy Bridge EX Xeon E7-x8xx v2) are now allowed in non-production systems, and there are no longer any restrictions on the core-to-memory ratio. For example, customers can over-provision and use as much main memory as they want if they are willing to accept lower performance on their development and test systems.
  • The minimum storage-to-memory ratio requirement has been relaxed from 5x to 2x for non-production usage.
  • Storage KPIs are no longer enforced in non-production systems. For example, customers can use any local storage or shared storage with standard disks.

 

As for the SAP HANA production systems, several new reference configurations are now being approved for SAP HANA Ivy Bridge appliances:

 

  • The list of supported Ivy Bridge configurations has been enhanced with additional HANA “T-shirt” sizes, such as 2-socket 384 GB and 512 GB, 4-socket 768 GB, and 8-socket 1.5 TB for OLAP scale-up, plus 2-socket 512 GB for OLAP scale-out. New configurations with more granular memory sizes have also been added for OLTP workloads, including 4-socket 3 TB and 6 TB, and 16-socket configurations ranging from 128 GB to 12 TB.


These more granular memory options and larger memory sizes will not only provide more choices for SAP HANA customers, they will also deliver significant cost savings by making it easier for customers to buy only as much memory as they need. In addition, the ability to scale up to 12 TB on a single node will enable many customers to remain on a single system longer before they have to scale out, reducing TCO.
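To make the sizing choice concrete, here is a small, hypothetical helper (plain Python, not an SAP tool) that picks the smallest single-node size from the configurations listed above. It assumes the common rule of thumb that roughly half of the RAM is reserved as working memory; both the size list and the 50% assumption are illustrative only, so always verify against the certified hardware directory.

```python
from typing import Optional

# Illustrative single-node memory sizes in GB, mirroring the reference
# configurations mentioned above (384 GB up to 12 TB).
NODE_SIZES_GB = [384, 512, 768, 1536, 3072, 6144, 12288]

def smallest_fitting_node_gb(compressed_data_gb: float,
                             data_share_of_ram: float = 0.5) -> Optional[int]:
    """Smallest node whose RAM holds the data while leaving working memory free."""
    required_ram_gb = compressed_data_gb / data_share_of_ram
    for size_gb in NODE_SIZES_GB:
        if size_gb >= required_ram_gb:
            return size_gb
    return None  # does not fit a single node: archive data or consider scale-out

print(smallest_fitting_node_gb(1200))  # 3072 -> a 3 TB node would suffice
print(smallest_fitting_node_gb(7000))  # None -> exceeds even the 12 TB node
```

The granular size list is exactly what makes the "buy only as much memory as you need" argument work: the gap to the next larger node stays small.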

 

  • SAP has also relaxed the storage requirements for single-node, scale-up systems. The requirement that “the storage media itself needs to be protected from failures” can now be met with a single SSD card in the system. This will further reduce the cost of entry-level SAP HANA systems.

 

In summary, the SAP HANA ship is steadily moving deeper into the sea of openness.  We are constantly adding new hardware and software choices that provide customers with increased flexibility and cost benefits when deploying SAP HANA into their Data Centers. Stay tuned for SAP news on the many new and exciting innovations that are coming very soon…

 

 

Related Readings:

 

Suite on HANA, Simple Finance, and a really cool explosion video as a bonus

 

Ever since returning from SAP SAPPHIRE in June, I have wanted to write this blog. The reason is that I was so excited to see the official announcement of both SAP Simple Finance and the HP ConvergedSystem 900 for SAP HANA in pretty much the same keynote. And why the excitement? Because it brings together

 

  • a tremendous additional business value proposition for Suite on HANA and
  • the largest scale-up HANA appliance with up to 12TB of RAM for truly mission-critical deployments like ERP and CRM.

 

And this, in my humble opinion, changes the cost-benefit ratio for migrating a customer’s SAP ERP system to HANA so drastically that we will see increasing HANA adoption in this key HANA segment. But what makes me so certain about this? Read on to find out!

 

 

No more aggravating aggregates

 

While SAP HANA has improved the traditional SAP Business Suite applications in many areas like Materials Requirements Planning (MRP) and Operational Reporting, improvements so far have been based on targeted optimizations of existing SAP programs for SAP HANA. With Simple Finance, SAP has for the first time fully rewritten and refactored a major Suite component in its entirety and has been able to take full advantage of the power of SAP HANA.

 

The original trick for Simple Finance was to remove aggregates and the materialized tables they were stored in. Pre-HANA, these aggregates were critical to keeping system response times acceptable, but they had the side effect of increasing system management overhead while decreasing business agility. With SAP HANA, such tricks are no longer needed, since reporting directly against the line-item data is extremely responsive and the additional load is easily affordable.
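As a conceptual illustration of the difference, here is a minimal sketch in plain Python with invented names; it is not SAP code and ignores persistence, locking, and currencies. The pre-HANA style maintains a materialized totals table with every posting, while the HANA style derives totals from the line items on demand.

```python
from collections import defaultdict

# Invented, simplified example: financial postings as (account, amount) line items.
line_items = []

# Pre-HANA style: a materialized aggregate table, updated transactionally with
# every posting: redundant data that must be kept in sync and reorganized.
totals_by_account = defaultdict(float)

def post(account: str, amount: float) -> None:
    line_items.append((account, amount))
    totals_by_account[account] += amount  # extra write just to keep the aggregate current

# HANA style: no aggregate table at all; the total is derived from the line
# items on demand, which an in-memory column store can do fast enough.
def total_on_the_fly(account: str) -> float:
    return sum(amt for acct, amt in line_items if acct == account)

post("travel", 120.0)
post("travel", 80.0)
post("hardware", 1500.0)

assert totals_by_account["travel"] == total_on_the_fly("travel") == 200.0
```

The write path is the point: in a reorg, the totals structure would have to be rebuilt and reconciled, while the on-the-fly variant only ever needs the unchanged line items.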

 

Hasso Plattner explains this nicely in his recent blog, “The Impact of Aggregates,” which makes for some interesting reading. He also touches on a situation that I have seen in the past: those oh-so-rare changes in organizational structures, aka YAR (Yet Another Reorg). So what happens in a traditional SAP Finance system during a management-driven reorganization? Here’s a sampling:

 

Reorgs entail changes in reporting relationships as well as product ownership. And while reporting relationships are simple lines in PowerPoint slides, they have very complex permutations when you look at compensation, bonus pools, and cost allocations for overhead. It gets even more complex on the product and services side, when accounting for cost of goods sold, sales, and revenue, all represented in cost centers, profit centers, or regular accounts as part of your company ledger and/or sub-ledger. For the finance department to manage and report on all these data points in a responsive manner, aggregates such as revenue per product line or compensation per department were stored in materialized tables in traditional finance applications before HANA.

 

Now, when a reorg happens, several or all of these assignments of people, material cost, and revenue to cost or profit centers change. To complicate matters, the effective date of the changes may lie in the past or the future, and the business may want to simulate the proposed changes before committing to them. And while SAP has built a process called Profit Center Reorganization to facilitate such reorgs, a quick look at the help page describing the necessary steps will give you an appreciation of the limitations and the prohibitive complexity of such a reorg.

 

Per Hasso Plattner, “the simplification is huge and makes the whole accounting system so much more flexible.” Having been part of re-org projects in the past, I could not agree more.

 

In short, SAP Simple Finance is a true game-changer for SAP customers and will make the business benefits of HANA highly tangible for business stakeholders. But how about the other two discussion points in every Suite on HANA conversation:

 

  • Sheer size of the ERP database, and
  • The mission critical nature of ERP systems requiring 24x7 availability?

 

It’s true, bigger is better!

 

Most SAP customers started with moderate database sizes when first implementing SAP ERP. Over the years, with a growing worldwide and functional footprint, acquisitions, and years of historical data, ERP database sizes have grown into the double-digit TB range. At Hewlett-Packard, for example, one of the core SAP ERP production systems has reached a massive 40 TB.

 

While SAP HANA is able to scale out linearly for analytic use cases, and it is well understood how to build large landscapes from smaller interconnected nodes, SAP does not yet support this for Business Suite on HANA. That means the only possible and supported way to implement Suite on HANA is a single scale-up system configuration. Let’s consider for a minute the HP example with 40 TB of uncompressed ERP data. It is obvious that this single system requires a massive amount of RAM, so let’s do some math:

 

 

  • HP’s ERP system size: 40 TB
  • Typical HANA compression ratio for ERP: 5:1
  • Compressed HP ERP data volume: 8 TB
  • HANA appliance RAM allocated to ERP data versus working RAM: 50/50
  • Theoretical size of HP’s ERP on HANA system: 16 TB

 

 

As the calculation shows, HP IT would theoretically require a single HANA appliance with 16 TB of RAM. Since such a large HANA appliance does not exist, the HP IT team is well underway with archiving activities to tame the database volume and make the transition easier. So how much data will they have to shave off to fit into a single SAP HANA appliance?
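The arithmetic behind the list above is simple enough to script. Here is a minimal sketch that reproduces it and answers the question above for a 12 TB appliance, assuming the same 5:1 compression ratio and the 50/50 split between data and working memory; both are rules of thumb quoted in this post, not official sizing guidance.

```python
def required_appliance_ram_tb(uncompressed_tb: float,
                              compression_ratio: float = 5.0,
                              data_share_of_ram: float = 0.5) -> float:
    """RAM needed so that compressed data occupies only `data_share_of_ram` of it."""
    compressed_tb = uncompressed_tb / compression_ratio
    return compressed_tb / data_share_of_ram

def max_uncompressed_tb(appliance_ram_tb: float,
                        compression_ratio: float = 5.0,
                        data_share_of_ram: float = 0.5) -> float:
    """Largest source database that fits a given appliance under the same assumptions."""
    return appliance_ram_tb * data_share_of_ram * compression_ratio

print(required_appliance_ram_tb(40))  # 16.0 TB of RAM for HP's 40 TB ERP system
print(max_uncompressed_tb(12))        # 30.0 TB, so roughly 10 TB would need to be archived
```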

 

SAP and HP co-innovation: the out-sized scale-up businesses need

 

In early 2013, SAP challenged HP to build the largest available scale-up HANA appliance for mission critical customer environments. HP’s mission critical server team took the challenge and the two companies teamed up for deep co-innovation. The result: In June 2014, after diligent joint engineering and testing, HP announced HP ConvergedSystem 900 for SAP HANA (CS900) during Bernd Leukert’s SAPPHIRE keynote.

 

The breakthrough CS900 HANA appliance, based on the latest Intel architecture with 16 Xeon E7 (Ivy Bridge-EX) CPUs, is the only system on the market that provides 12 TB of RAM. This compares to only 6 TB of RAM in the next-largest competing system. And best of all, the CS900 solves HP IT’s challenge of migrating their 40 TB system to SAP HANA (after a little housekeeping, as mentioned earlier).

 

While I am discussing the 12TB scale-up solution in this blog, the picture below shows other available configurations as well as the key common components. Please note that this is only the beginning of the product roadmap. There are more versions of CS900 to come.

 


Picture: System family for HP CS900 for SAP HANA

 

 

 

For when failure is not an option

 

Compare a company’s ERP system to the cardiovascular and nervous systems of the human body: failure of those systems quickly becomes life-threatening, and so can an extended ERP outage for a corporation. Therefore, any company considering moving its ERP system to a new platform has to ensure that the new platform provides high availability and disaster recovery, in other words, business continuity. So, how does the HP CS900 stack up against this requirement?

 

The 12 TB scale-up solution is built on Superdome 2 and ProLiant technologies, which draw on the legendary Tandem NonStop fault-tolerant systems originally built for banks and stock exchanges. The high fault tolerance of the Tandem systems in the past, and of the HP CS900 for SAP HANA today, is based on high levels of RAS (Reliability, Availability, and Serviceability), protecting data integrity and allowing for long periods of uninterrupted system uptime. These high RAS levels are achieved through built-in system redundancies and sophisticated self-check and self-healing mechanisms, and they are key to minimizing the risk of failure for a mission-critical SAP ERP system. Unfortunately, even the best systems can fail due to individual components or outside forces. What then?

 

 

But what happens when your datacenter explodes?

 

SAP HANA has come a long way and, since HANA SPS6, provides all the required high availability and disaster recovery (HA/DR) capabilities for failing over single nodes or entire systems, either locally or to a secondary data center. However, these failover scenarios require a certain level of human intervention when relying on native SAP HANA capabilities alone. I think this is why expert opinions are split on the topic of HANA’s data center readiness. Here is my take:

 

SAP offers mature and flexible data replication technologies; failure detection and automatic failover, however, are provided by complementary third-party solutions from all major system vendors for their respective HANA offerings. The capabilities and maturity of these offerings differ, and the broad majority of vendors only offer storage-based replication. The only two system-replication-based offerings, complementing and working on top of the SAP replication technologies mentioned above, are HP ServiceGuard and SUSE Linux Cluster (in beta). They work directly on the in-memory layer within SAP HANA.

 

A full comparison of the two approaches is beyond the scope of this blog, but I hope I have brought across the point that a large variety of HA/DR solutions is available in the market, and that a mission-critical Business Suite on HANA system can be operated in a highly risk-mitigated fashion. The following table shows a high-level comparison of the two major approaches to HA/DR.

 

 

 

Storage Replication vs. System Replication

Vendors
  • Storage replication: HP, IBM, Hitachi, Cisco, Dell, Fujitsu, NEC, VCE, Huawei, Lenovo (China only)
  • System replication: HP ServiceGuard; SUSE Linux Cluster (in beta)

Supported HANA use cases
  • Storage replication: scale-up, scale-out
  • System replication: scale-up, scale-out (HP only)

Replication strategies
  • Storage replication: synchronous, asynchronous
  • System replication: synchronous, asynchronous

Bandwidth requirements
  • Storage replication: higher (partial transactions are replicated, which results in costly roll-backs when a transaction is cancelled, e.g. due to a failure)
  • System replication: lower (cancelled transactions are never transmitted; replication happens only after a full commit, and in synchronous mode only the log files are transferred continuously with the transaction commit, the rest asynchronously)

Disaster recovery (*)
  • Storage replication: performance-optimized setup is slow; cost-optimized setup is slow
  • System replication: performance-optimized setup is fast; cost-optimized setup is medium

Openness
  • Storage replication: hardware vendor dependent
  • System replication: infrastructure agnostic

Additional capabilities
  • Storage replication: n/a
  • System replication: zero-downtime management (aka NetWeaver connectivity suspend); cascading multi-tier system replication (HP only)

Key roadmap capabilities
  • Storage replication: n/a
  • System replication: active/active operation (read-only reporting on the secondary failover site)

(*) Performance-optimized: the secondary system is used entirely to prepare for a possible take-over, resources are used to pre-load data on the secondary, and take-over and performance ramp-up are shortened as much as possible. Cost-optimized: non-production systems operate on the secondary, resources are freed (no data pre-load) and offered to one or more non-production installations, the non-production operation has to be ended during a take-over, and take-over performance is similar to a cold start-up.
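To illustrate the bandwidth difference noted above, here is a small conceptual sketch (plain Python with invented structures, not vendor code): a storage-level replicator ships every write as it happens, including writes from transactions that later roll back, while a system-level replicator ships only the committed transactions' log.

```python
from typing import List, Tuple

# A transaction is a list of writes plus a flag indicating whether it committed.
Transaction = Tuple[List[bytes], bool]

def storage_level_bytes(transactions: List[Transaction]) -> int:
    """Block/storage replication: every write hits the replicated volume,
    whether or not the transaction later commits."""
    return sum(len(w) for writes, _committed in transactions for w in writes)

def system_level_bytes(transactions: List[Transaction]) -> int:
    """System (log) replication: only committed transactions are shipped;
    cancelled transactions are never transmitted."""
    return sum(len(w) for writes, committed in transactions if committed for w in writes)

workload = [
    ([b"x" * 4096, b"y" * 4096], True),  # committed transaction
    ([b"z" * 4096], False),              # rolled back, e.g. due to a failure
]
print(storage_level_bytes(workload))  # 12288 bytes shipped
print(system_level_bytes(workload))   # 8192 bytes shipped
```

The sketch deliberately ignores the log-versus-data distinction of real system replication; the point is simply that cancelled work never crosses the wire in the system replication model.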

 

 

 

A picture (or video) is worth a thousand words

 

The question remains: how do these large-scale, mission-critical systems behave in a live disaster recovery situation? Let’s have a look at this video, which shows exactly such a failover from a primary to a secondary data center. Arguably the circumstances are man-made, but they are close enough to a real disaster in my opinion. See for yourself.

 

 


Video: HP Disaster Proof Solutions

 

 

 

 

Closure

 

Cool stuff, no?

 

I hope you can see the business potential of SAP Business Suite powered by SAP HANA with solutions like SAP Simple Finance, as well as a very feasible, low-risk path toward implementing this solution within your company. I personally believe that this solution is game-changing. Not only does it greatly improve existing company processes like finance or MRP, it also offers vast opportunities for moving your company to an entirely new level. Here is one final example:

 

Also in June at SAPPHIRE, I met a mid-size German producer of industrial equipment (from the so-called “Mittelstand”) that is migrating all of its business systems to SAP HANA. This is part of the company’s goal to transform itself from a product-centric company into a services-centric one, a company that leases its products to customers and complements them with rich services focused on predictive maintenance. Think “Internet of Things” (IoT).

 

With SAP HANA and Business Suite on HANA on a platform for mission-critical operations, every SAP customer has the opportunity to up their game, increase business and IT efficiency, and expand into entirely new business and solution areas. And this is great!

 

Thanks for reading; I am looking forward to your comments. Please also have a look at this technical brief for HP ConvergedSystem 900 for SAP HANA to find out more.

 

Swen

 

 

PS I thought this SPECjbb2013-MultiJVM benchmark may also be worth a look …

 

PPS: A word about myself: I have been part of the SAP HANA journey from the early days as an employee of SAP Labs in Palo Alto. I recently (re-)joined Hewlett-Packard as the SAP Chief Technologist within HP’s SAP Alliance organization and am glad that I am still working on this exciting topic.

 

 


In recent quarters, SAP has announced Tailored Datacenter Integration (TDI) and Virtualized SAP HANA to provide production deployment flexibility and better adaptation to existing, established data center processes and infrastructure standards. These recent innovations help SAP HANA customers realize a better return on investment (ROI) and reduce total cost of ownership (TCO).

 

To further lower TCO and accelerate an already breathtaking adoption rate, SAP has recently relaxed the SAP HANA infrastructure requirements for non-production usage, such as development and testing, where performance is not the focus. What does this actually mean for SAP HANA users, including enterprise customers and partners? It alleviates the budget constraints faced by enterprise customers by making the majority of commodity infrastructure components available for SAP HANA non-production usage, either in a virtualized or a bare-metal environment, without certification. It will help SAP HANA customers radically accelerate their rate of innovation.

 

My colleagues Adolf Brosig, Zora Caklovic, and I would like to highlight the following points (a small configuration-check sketch follows the list):

 

  1. CPU: Starting with the lower end of the Intel E7 family of processors
    • Intel Ivy Bridge v2 CPUs and related chipsets are certified with SAP HANA for production usage at a clock speed of 2.8 GHz (with a turbo speed of 3.4 GHz). However, earlier generations of processors in the Intel E7 family with substantially lower cost, such as Westmere EX (E7-x8xx) or Ivy Bridge EX (E7-x8xx v2), are now also available for non-production usage. Customers therefore have much more flexibility and can re-use some existing infrastructure components for software development and testing with SAP HANA.
  2. Memory: Maximum main memory available in a host machine
    • With Intel Ivy Bridge v2, SAP has been recommending 256 GB of main memory per CPU for production analytical workloads. This ratio is not enforced in a non-production environment: SAP HANA customers can use as much main memory as is available in a host machine for development and testing tasks, where performance is not the focus.
  3. Storage: Any storage with RAID 1 or higher on proven file systems
    • As performance KPIs are not enforced for non-production usage, SAP HANA users can use any local or shared storage with standard disks; RAID 1 or higher is supported. The minimum storage-to-memory ratio requirement is also relaxed from 5x for production to 2x or even less for non-production usage. In addition, any proven file system, such as XFS, NFS, or GPFS (IBM only), can be used for the log and data volumes in a non-production environment.
  4. Network: Any standard networking components
    • In a typical production environment, connections of 10 Gbit/s or more are required for performance optimization. For a non-production environment, any commodity networking components can be used, and customers will likely be able to re-use existing network infrastructure for these non-production tasks.
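As referenced above, here is a minimal sketch of what such a configuration check could look like in practice (plain Python; the structure, names, and thresholds are my own illustration of the relaxed non-production guidelines in the list, not an SAP tool):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class HostConfig:
    cpu_family: str        # e.g. "Westmere EX E7-4870" or "Ivy Bridge EX E7-8880 v2"
    memory_gb: int
    storage_gb: int
    network_gbit: int

def non_production_notes(cfg: HostConfig) -> List[str]:
    """Advisory notes for a non-production HANA host under the relaxed guidelines
    described in this post (illustrative only, not official sizing)."""
    notes = []
    # Storage-to-memory ratio relaxed from 5x (production) to roughly 2x.
    if cfg.storage_gb < 2 * cfg.memory_gb:
        notes.append("storage below ~2x memory; consider adding disk")
    # Core-to-memory ratios and storage KPIs are not enforced for non-production,
    # so only an informational note about the network remains.
    if cfg.network_gbit < 10:
        notes.append("below 10 Gbit/s: acceptable for dev/test, not for production")
    return notes

cfg = HostConfig(cpu_family="Westmere EX E7-4870", memory_gb=512,
                 storage_gb=1024, network_gbit=1)
print(non_production_notes(cfg))  # ['below 10 Gbit/s: acceptable for dev/test, not for production']
```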

 

Picture: Relaxed hardware specifications for SAP HANA non-production usage

 

 

We are in an exciting phase of the SAP HANA revolution. Besides the certified SAP HANA appliances, customers now have much more flexibility with TDI and virtualized environments for production usage, and simplified infrastructure requirements for non-production usage. All of these innovations will further lower TCO and accelerate SAP HANA adoption.

 

Disclaimer: SAP HANA configurations for production environments will only be provided on the certified appliances listed in the reference section below.

 

Reference:

  1. SAP HANA Virtualized: SAP Note 1788665
  2. Certified SAP HANA appliance configurations on Intel E7 v1 (Westmere) processors
  3. Certified SAP HANA appliance configurations on Intel E7 v2 (Ivy Bridge) processors
  4. Certified SAP HANA Enterprise Storage hardware supported under the SAP HANA Tailored Data Center Integration (TDI) program
  5. Recommended Customer Price for the Intel Xeon Processor E7 v2 Family
  6. Recommended Customer Price for the Intel Xeon Processor E7 Family
