
I was asked today if I had "any view on whether the Operational Data Store (ODS) is a dedicated node or is a just a set of processes within a shared machine (RK: shared with EDW and data mart workloads) and prioritised using workload management?"


An Operational Data Store (ODS) is an intermediary between OLTP transactions and data warehouse queries: it handles transactions that are too long-running for the OLTP system and too quick for an EDW. If a single system could handle both OLTP and data warehouse workloads, there would be nothing to distinguish an ODS.


In a perfect scenario there would be one set of tables solving OLTP, ODS, data warehouse, and analytics queries. There would be no redundant data (except to solve for high availability). Figure 1 shows three workloads querying against one database instance on a single server or cluster.

In a nearly perfect scenario there would be one set of processes on a shared platform, with redundant data for each workload. Figure 2 represents three database instances, each with basically the same information, running on a single server or cluster. Note that each DBMS instance often runs a different, specialized DBMS product, which adds to the complexity.

In the real world these things tend to run on separate systems. Figure 3 depicts each workload on a separate server or cluster with separate database instances and with all of the costs associated with data movement between systems, floor space, power, and systems administration.



Figures 1-3. The perfect, near-perfect, and real-world scenarios

I think that the near-perfect world is nearly achievable, but the workload management problem has not been solved to everyone’s satisfaction.

SAP HANA is designed to create a perfect scenario. In Release 1, engineering targeted data warehouse and data mart workloads. In subsequent releases we will support bigger and bigger data and OLTP. We already see less data redundancy and reduced schema complexity as aggregate tables, which hold derived redundant data, are eliminated.

It is an important distinction. No other DBMS is designed to support all of these workloads simultaneously. SAP HANA is very special in this regard.

The 2011 Forrester report "Big Data Patterns for Enterprises" groups the enterprise use cases for Hadoop-based infrastructure projects into five categories:

  1. Exploration and machine learning (e.g., social media networking)
  2. Operational prediction (e.g., predictive analysis)
  3. Dirty operational data store (e.g., operational reporting)
  4. Bulk operations and extreme ETL (e.g., bulk data extractions)
  5. Machine-generated data (e.g., streaming real-time events via complex event processing)

In this blog, I will highlight the fourth category, where the low-cost, scalable, and reliable aspects of Hadoop, along with MapReduce-based batch processing, are put to good use, and show how the SAP Real Time Data Platform (SAP HANA and SAP Data Services) plays an important role in solving the complete end-to-end big data use case. "Big Data" adoption is catching on rapidly with enterprise customers, as new infrastructure tools become available from various software vendors to handle both structured and unstructured data within their organizations. More and more customers are running internal big data projects that bring their business and IT teams together, leveraging existing IT infrastructure and investing in new tools to handle big data, so that they can deliver deeper insights to their executives and support fast day-to-day business decisions.

Big Data technology is designed to extract value economically from very large volumes of wide varieties of data by enabling high velocity capture, discovery, and analysis as shown in Fig 1. As the amount of information continues to explode, organizations are faced with new challenges for storing, managing, accessing, and analyzing very large volumes of both structured and unstructured data in a timely and cost-effective manner.


  Fig 1. Real time insights from a variety of data

In addition, the variety of data is changing enormously. According to Gartner, “Enterprise data will grow 650% over the next few years, with 80% of that data unstructured” – meaning that the data explosion spans traditional sources of structured information (such as point of sales, shipping records, etc.), as well as non-traditional sources (such as Web logs, social media, email, documents, etc.). Also, organizations today are facing new challenges in increasing the speed by which they process data and deliver information to users to ensure competitive advantage.

The SAP Real Time Data Platform includes the in-memory, real-time capabilities of SAP HANA; deep integration with Sybase database products; and EIM solutions for data movement, data integration, data quality, and data governance, with seamless modeling, universal orchestration, and a common metadata repository. With it, enterprise customers can leverage their existing IT infrastructure to integrate with Hadoop MapReduce and Hive and solve their big data business scenarios.


  Fig 2. SAP Real Time Data Platform

When customers start exploring and working on big data projects, they should consider the following costs at each layer of the integration:

  1. The costs to collect and store the data in a Hadoop infrastructure
  2. The costs to analyze and process the data via MapReduce techniques
  3. The costs to integrate with their existing enterprise infrastructure and consume the data


The biggest advantage of Hadoop is inexpensive, scalable, and robust storage, with files distributed across the cluster. SAP Data Services delivers a Hadoop connector that provides high-performance reading from and loading into Hadoop. SAP Data Services identifies, extracts, structures, and transforms the meaningful information from Hadoop/Hive and provisions the data to SAP HANA, Sybase IQ, or other data stores for deeper analysis. This allows for reliable, efficient, and optimized real-time analysis across all enterprise information assets, structured or unstructured.


Fig 3. Hadoop Enterprise Integration with SAP Data Services

SAP Data Services provides an easy-to-use UI design paradigm for Hadoop/Hive components, with behind-the-scenes extensions that leverage the power, scale, and unique functionality of Hadoop for extraction, loading, and text data processing, as well as the Hive data warehouse. SAP Data Services generates queries in Hive Query Language to push operations down; Hive converts these queries into MapReduce jobs against files in Hadoop. Data Services then uses multiple threads to read the data from Hive and load it into SAP HANA using a high-performance bulk-loading mechanism, as shown in Fig 3. The push-down logic is accomplished:

  • through SQL-like operations, utilizing the Hive add-on and Pig scripts
  • for basic functions, including aggregate functions
  • for transformations, via text data processing

Enterprise customers can achieve the following benefits, gaining unprecedented insight for competitive advantage by analyzing unstructured data stored in Hadoop together with structured data in the EDW:

  1. Lower TCO for enterprise organizations that do not have in-house Hadoop developer expertise. With SAP Data Services, you can write a query that is pushed down into Hadoop and translated into MapReduce, so BI experts can discover information without having to program directly in the MapReduce programming language.
  2. Identify emerging trends, opportunities, or risks by processing text data stored in Hadoop. With SAP Data Services, text data processing can be pushed down into the Hadoop file system to extract relevant data hidden in mounds of files, Web logs, or surveys. This enables organizations to  move only relevant data to a data warehouse for deeper analysis with structured data for new, contextual insights.
  3. Increase confidence in decision making by cleansing data from Hadoop before storing in data warehouse.  With the data quality management functionality of SAP Data Services, organizations can clean, validate, and enrich data from Hadoop to ensure compliance with company standards. As a result, organizations gain confidence in the quality of data when making critical business decisions.
  4. Enable real-time analytics by rapidly uploading relevant data from Hadoop into high-performance data stores such as SAP HANA, enabling organizations to glean new insights from mounds of unstructured data combined with billions of rows of transaction data, to perform up-to-the-second sales forecasting, more efficient inventory management, etc.
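Points 2 and 3 above amount to a two-step pipeline: filter the raw files in Hadoop down to the relevant records, then validate and cleanse them before they reach the warehouse. A hypothetical Python sketch of that pipeline (the pipe-delimited record layout and the validation rules are invented for illustration):

```python
import re

def extract_relevant(log_lines, pattern):
    # Step 1: keep only records matching the pattern -- the
    # "move only relevant data to the warehouse" idea
    return [line for line in log_lines if re.search(pattern, line)]

def cleanse(records):
    # Step 2: validate and normalize before loading:
    # drop malformed rows, trim whitespace, upper-case country codes
    clean = []
    for rec in records:
        parts = [p.strip() for p in rec.split("|")]
        if len(parts) != 3 or not parts[2]:
            continue  # reject rows that fail the (invented) company standard
        parts[2] = parts[2].upper()
        clean.append(parts)
    return clean

raw = ["u1 | buy | de", "u2 | view | us", "u3 | buy |", "bad row"]
relevant = extract_relevant(raw, r"\bbuy\b")
rows = cleanse(relevant)
# rows == [["u1", "buy", "DE"]]
```

In the scenario described in the blog, the filtering step would be pushed down into Hadoop via text data processing, while the cleansing rules would come from SAP Data Services' data quality management.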




In conclusion, enterprise customers should evaluate their technology and business use cases to determine the most appropriate technology combination based on factors such as data, storage, cost, concurrency, and latency. They should also consider how to enable new business scenarios and applications from big data through actionable, real-time insights in the business process context (volume and velocity), combined with deep batch-oriented behavior and pattern recognition (volume and variety) (Fig 4). The SAP Real-Time Data Platform provides a strong infrastructure foundation, with technology combinations and deployment options that let enterprise customers handle Big Data analytics in real time in conjunction with Hadoop batch data processing.

It’s clear that mushrooming data growth, coupled with the decline in relative technology costs capable of managing and leveraging that information, has led many organizations to initiate (or consider initiating) Big Data analytics programs.

Large distributed retail networks generate huge amounts of rich data over time - sales, tender, money movements and inventory are the raw fuel for analytical analysis. The challenge is processing that level of information as close to real-time as possible for the benefit of the business.

With the advent of Big Data we are seeing a lot of customers get excited about NoSQL ("Not Only SQL") databases like Cassandra, MongoDB, etc. One of the key value propositions of these new kinds of databases is their ability to run on commodity hardware, plus the flexibility of not being restricted to SQL programming.

With Hadoop, users can load and retrieve data using general-purpose programming languages like Java, which opens up a much wider developer audience. Key advantages of using Hadoop are:


  • Data is distributed across several nodes, so computation can run on the local node that holds the data.
  • Partial failures are easy to handle.
  • MapReduce tasks use commonly available programming models.

Although most aspects of Hadoop are positive, there are scenarios, such as real-time reporting, that Hadoop cannot deliver by default. There are exceptions, but in general Hadoop is strong at batch reporting, where delta loads into enterprise data warehouses run only once or twice a day.

These are scenarios where a solution like SAP HANA, with its real-time data replication support using SLT or Sybase replication (on the roadmap), can deliver value to companies trying to address analytic needs as new data is loaded through their transaction systems. Some scenarios that support this use case are:


  • Track the effect of marketing initiatives close to real-time.
  • Inventory levels and margins are tracked much faster, enabling profit maximizing decisions.
  • Fraud detection, especially around the use of credit cards, can be more reactive.

Customers like T-Mobile and Honeywell have taken advantage of these real-time analytics scenarios to get ahead of their competition by delivering a much more agile experience to their end customers.

MKI is one customer using this powerful combination of SAP HANA and Hadoop to reduce the time it takes to perform genome analysis when treating cancer patients, comparing genome data between healthy individuals and affected patients.

By using SAP HANA as the mission-critical and reliable genome data platform, MKI can deliver advanced medical treatment by reducing the time spent in the following scenarios:

  • Case analysis: comprises fragment extraction, high-speed entry, and genome DNA extraction. First, all cases are collected and the data is preprocessed using Hadoop (fragment extraction). Then HANA is used for fast analysis to find patterns in the genome fragments and the relationships between genomes and cases.
  • Data consumption: with the genome fragment library and its relationships to cases in place, a doctor can collect a patient's genome and send it to the system to compare the genome fragments. Based on the knowledge library, the doctor can recommend the most appropriate treatment for the patient.
  • Case study: each new clinical case is sent to researchers for further study, which improves the correctness of the knowledge library.
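One common way the "fragment extraction" preprocessing step is done in genomics is to break a sequence into overlapping fixed-length fragments (k-mers), which can then be compared across individuals. A hypothetical sketch of that idea in Python (this is an illustration of the general technique, not MKI's actual pipeline):

```python
def extract_fragments(sequence, k=4):
    # Fragment extraction: slide a window of length k over the sequence,
    # producing the overlapping fragments used for comparison
    return [sequence[i:i + k] for i in range(len(sequence) - k + 1)]

def shared_fragments(patient_seq, reference_seq, k=4):
    # Compare a patient's fragments against a reference sequence,
    # returning the fragments the two have in common
    return set(extract_fragments(patient_seq, k)) & set(extract_fragments(reference_seq, k))

frags = extract_fragments("ACGTAC", k=4)
# frags == ["ACGT", "CGTA", "GTAC"]
```

In the scenario above, the massively parallel extraction step maps naturally onto Hadoop, while the pattern-matching across the fragment library is the in-memory comparison work done in HANA.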


Another use case where Hadoop and SAP HANA were used in conjunction to create value based on the criticality of data comes from Cognylitics. This application enabled financial institutions to monitor the risk and performance of their portfolios, streamline operations, and help ensure regulatory compliance in real time to enhance profitability.

This solution stores consumers' mortgage information in Hadoop, applies complex predictive algorithms to identify the mortgages most likely to become delinquent, and runs these algorithms (Business Function Libraries) inside the predictive engine of SAP HANA. The ability to combine customers' corporate data with unstructured data from social media adds previously unthinkable intelligence to the analytics that can be derived from this combination.
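To give a flavor of what "identifying probable delinquent mortgages" means in code, here is a deliberately simplified scoring sketch. The features, weights, and threshold are all invented for illustration; a real solution like the one described would use far richer predictive models:

```python
def delinquency_score(ltv, missed_payments, dti):
    # Toy linear risk score over loan-to-value, missed payments, and
    # debt-to-income ratio -- the weights are invented for illustration
    return 0.4 * ltv + 0.35 * min(missed_payments / 3.0, 1.0) + 0.25 * dti

def flag_probable_delinquents(mortgages, threshold=0.6):
    # Return the ids of mortgages whose risk score exceeds the threshold
    return [m["id"] for m in mortgages
            if delinquency_score(m["ltv"], m["missed"], m["dti"]) > threshold]

portfolio = [
    {"id": "m1", "ltv": 0.95, "missed": 3, "dti": 0.5},
    {"id": "m2", "ltv": 0.60, "missed": 0, "dti": 0.2},
]
flagged = flag_probable_delinquents(portfolio)
# flagged == ["m1"]
```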


Some of the technologies that have enabled SAP HANA and Hadoop to work together effectively are:


  • SAP BusinessObjects Data Services 4.1 – provides the ability to load data from Hadoop into SAP HANA.
  • SAP BusinessObjects BI4.x – can use the Information Design Tool to connect to SAP HANA and Hadoop and create a multi-source universe (Semantic Layer) and report using any of the BI4.x clients.


As an acknowledgment of this success, companies like Cisco and EMC have partnered with SAP to create big data real-time analytics packages that will enable companies to tame the huge amounts of data in their enterprises and make informed decisions from it.

Recently, a lot of folks have been asking about where to find SAP's Roadmaps for HANA and our Data Warehouse portfolio.


Firstly, all roadmaps for SAP are available on service marketplace at http://service.sap.com/roadmap. This site is available to all customers and partners that have access to our software via their service marketplace logon. This site is restricted to customers and partners for legal reasons.


The direct links to the roadmaps are available below:



However, there has been a lot of debate in the community around using HANA vs BW on HANA. Firstly, Steve has written a blog called "Does HANA Replace BW - Part 1". Then Jon Appleby weighed in with "Does HANA replace BW - Part 2". Sven Jensen then wrote two blogs about BW on HANA: BW as an Enterprise Data Warehouse as well as Building the Business Case for BW on HANA.


Now as you look at these blogs, you'll see that there's a lot of debate of whether HANA will replace BW and what people should do. This blog as well as the associated document below describes where we're going with Data Warehousing.


Fundamentally, BW is a management framework for data; companies can also build point data marts. The world has largely evolved from just enterprise BI to supporting other analytical use cases such as ad-hoc analysis, self-service BI, operational analytics, real-time analytics, social media analytics, and big data analytics, even including use cases driven by governance and logs, such as query-able archive analytics.


As these analytical capabilities evolve, we find that the best place to store "high value" analytical data is HANA. However, once you have multiple data marts, you will typically evolve to requiring a data warehouse approach. Fundamentally, a data warehouse is a methodology to manage multiple data marts; common types include the "central data warehouse" and the "logical data warehouse", and each approach has benefits. BW today is a central data warehouse that provides capabilities to manage multiple types of data marts, and HANA fundamentally broadens the types of data marts you can support as it evolves into the Real-Time Data Platform, with new types of analytics driven by new types of data (events, signals, etc.). For examples of new types of data driving analytics today, see Steve's blog entitled "Beyond the Balance Sheet: Run Your Business on New Signals in the Age of Big Data".


While SAP created a centralized data warehouse approach with SAP BW, that is not the only type of data warehouse capability a customer may want, and SAP also supports custom-built data warehouses for customers who don't subscribe to the framework we've provided with BW.


The following document addresses SAP's approach and strategic direction in data warehousing, starting with the types of data marts our platform is evolving to support, followed by the challenges of point data marts and the advantages of adopting a centralized or logical data warehouse. SAP's strategy is to give customers the choice of adopting BW as their data warehouse or building their own.


Finally, SAP is evolving its point technologies into the "Real-Time Data Platform", a single platform for running BW or custom-built warehouses.

See this document for details on the strategic direction for data warehousing at SAP.


For details on the roadmap for each product, see service marketplace here: http://service.sap.com/roadmap


I imagine the debate will continue, as there are always folks who believe one product or approach is superior to another; usually this comes from background and familiarity with a particular product or approach. To meet customer requirements, SAP's strategy is open to customer choice: buy BW on HANA, or leverage the Real-Time Data Platform to build your own warehouse. Either way, the debate will likely go on, largely because customer requirements will dictate the approach needed.






Posted by Scott Shepard Aug 24, 2012

I am happy to announce the launch of the new SAP NetWeaver BW on HANA FAQ space in the Experience SAP HANA community.


The purpose of this knowledge repository (FAQ) is to provide

  • a central place for SAP NetWeaver BW on HANA links, with detailed information and the possibility to engage directly with the subject matter experts via their individual linked blogs
  • a platform for the SAP NetWeaver BW on HANA community, where all customers, prospects, and partners can actively engage, ask questions, answer questions, and provide feedback and comments.


SAP has made significant investments into SAP NetWeaver BW on HANA to protect existing SAP BW customer investments and to innovate one of SAP’s most successful solutions.





SAP NetWeaver BW on HANA is the first existing SAP solution completely enhanced to take advantage of SAP HANA In-Memory technology.  SAP provides a roadmap for all SAP BW customers to move their BW assets from a traditional relational database (RDBMS) to the SAP In-Memory HANA platform.


SAP NetWeaver BW on HANA is the best SAP NetWeaver BW release ever. We invite the SAP NetWeaver BW on HANA community to explore the new capabilities and to engage with their peers to discuss all aspects of the solution.


The BW on HANA FAQ is embedded on the Experience SAP HANA - BW on HANA landing page:

The space can also be accessed directly: Space: BW on HANA FAQ | Experience SAP HANA


Scott Shepard is leading the Solutions Management team for the areas focusing on SAP HANA, SAP BW, and SAP BWA. Scott Shepard, Daniel Rutschmann, and Erich Schneider have created this repository of information so it is easily consumable for the SAP Ecosystem.

Circa 2012, the Income Statement and Balance Sheet continue to be two central sources of information needed to assess a company's financial health and, in some cases, to judge how best to run it. Interestingly, the present-day balance sheet, a company's primary statement of financial position, received its modern form in 1868. The income statement we use today, the financial statement that shows how revenue is transformed into net income, was defined in the 1930s.


It is not my intention to dwell on the riveting topic of the history of modern accounting! Here is the point: The signals of health in a business we have been trained to look for haven’t changed for a century or more, yet the amount of information available today that could indicate the relative health of a business is radically different. It’s sad to say, but Ebenezer Scrooge would be as adept in a board meeting today as the day that Charles Dickens conjured him up in 1843!


The amount of information available today is indeed tremendous, and it continues to grow exponentially. We all know this expanding universe via the term "Big Data", which represents the aggregate of global data from private companies, government entities, social networks, and the like. It's grown to the point where we use the term "zettabyte" to describe it! (That's a 1 with 21 zeroes after it.) The issue is not just that we are dealing with copious quantities of data; we are also dealing with data that has incredible variety: text, multimedia, social media, etc.


Ironically, with all this information at our disposal, we continue to use the same old lagging signals, derived from concepts like the Balance Sheet and Income Statement, as the key indicators of a business's health! This happens not only at quarterly and yearly intervals; it also permeates our operational KPIs, such as "revenue by region" and "cost per unit", providing only a rear-view analysis to drive our business in the present and future. Does it not seem incongruous that while the "noise" surrounding a business has increased millions of times over during the past century, the signals of health we look for in the board room haven't changed at all?


While I am not proposing that customer sentiment on Twitter be added to an organization’s SEC filings, is it that crazy to consider utilizing that information to actively run the business? With all this new “noise” (e.g., constant generation of Big Data) shouldn’t we be grooming our companies to constantly scan for “new signals” that could be powerful indicators of future performance? My previous point on Twitter sentiment is a great example: Would you say that every manager in your business is aware, every day, of whether your customers are feeling positive or negative about your company? It may not be as critical to your balance sheet as inventory on hand, but it certainly has an impact.


Understanding and identifying new signals in all this Big Data noise, within the context of the global digital economy, is critical to any organization's success. Fretting over the sheer size of the data is pointless; we should instead spend our time understanding what all this data is actually telling us! Some of us fret because we are concerned about our ability to deal with the volume, velocity, and variety of the data we are being hit with. There is good news on that front as well: we now have ways to address this quite elegantly. Before I get into that, let us look at an example of a new signal.


So what are the “new signals”?

These range from the hopefully obvious, such as social sentiment analysis (what your customers are saying about you or a newly launched product), to the not-so-obvious examples like using NOAA (National Oceanic and Atmospheric Administration) data to predict shipping delays. And it includes signals that companies have been aware of but have proven difficult to measure, such as proactively reducing fulfillment gaps with real-time supply chain visibility and by predicting customer demand or improving the ratio of successful insurance claims audits by automating fraud detection. These signals vary by industry and are often a leading indicator of business performance.


In keeping with my earlier line of thought, let us consider a financial example. The Income Statement and Balance Sheet are at best rear-view measures of the top line and bottom line.  They provide a snapshot in time of all that has happened, but very little, if any, indication of what is happening in the enterprise. For example, an online retailer with a subscription model experiences a massive drop in stock price because of poor Income Statement results. Further analysis indicates that there was a dramatic drop in subscribers (Churn) in one of their most profitable segments. It would stand to reason that a pre-emptive pulse-check on the churn could have helped stem the bleeding and perhaps prevented the reaction on Wall Street. This churn is an example of a new signal that could have helped this one enterprise run its business more proactively rather than look for explanations with a rear-view perspective. So how does a company gain the ability to know midstream that it has a problem or there is an opportunity going unaddressed? How will a company gain the intelligent actionable insights to make this happen? What is needed is the ability to ask complex questions going across the volume and variety of data in an interactive manner, and most importantly in real-time.  How best to accomplish this? 
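The churn signal in the example above is trivial to compute once the subscriber data is within reach; the hard part, as the questions above suggest, is computing it continuously and in real time across live data. A hypothetical sketch of the metric itself (subscriber IDs are invented):

```python
def churn_rate(start_subscribers, end_subscribers):
    # Churn: fraction of subscribers present at period start
    # who are gone by period end
    lost = start_subscribers - end_subscribers
    return len(lost) / len(start_subscribers)

january = {"a", "b", "c", "d"}
february = {"a", "b", "e"}
rate = churn_rate(january, february)
# rate == 0.5  (c and d churned out of four January subscribers)
```

Tracked per segment and per day rather than per quarter, a measure this simple becomes exactly the kind of pre-emptive pulse-check the example calls for.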


Every company needs a real time data platform

We are so conditioned to the heavy construction, slow paced mode of gathering information and waiting days, weeks or months for anything meaningful to come back that we fail to look for anything better out there. The truth is that something better already exists – a real-time data platform – that can ingest the ever growing data elements that influence your business and compare them with traditional measures from the balance sheet and income statement, giving insight now vs. months down the road. This real-time data platform is based on SAP HANA and has extreme capabilities to collect, move, store and process massive amounts of data. It’s the latest innovation in storing information from any source and discovering new trends, patterns and behaviors in real time…all critical to gaining competitive advantage.


It is a revolutionary platform that’s best suited for performing real-time analytics, and developing and deploying real-time applications. Leveraging its unique ability to handle both transactional and analytical workloads fully in-memory, it delivers blazing fast results in real-time to your queries performed on live raw data. In other words, you can be reporting on and pondering on business events as they occur. No matter what new signal you need to focus on, you gain the ability with SAP HANA to zoom in on it in real-time thus enabling you to take pre-emptive action to stem the flow of negative influences and step up the actions that will bring a positive impact to your bottom line!



Every company needs a data scientist

I point out the capabilities of SAP HANA to illustrate the point that we no longer need to feel inadequate in the arena of measuring new signals. To be successful in the long run, we also need to make the discovery, definition, and deployment of these measures part of a new data-driven culture within the organization. Every company needs a data scientist. Companies on Wall Street have had these for decades. They just call them “quants” (Quantitative Analysis Experts). These are people that comb endless streams of information and look for NEW information, looking for trends, or signals, that could have an impact on a stock, bond or other trading market that would create advantage for the firm. So my question for you is: Who is your “Quant”, or, if you prefer a grander title, who is your Chief Data Scientist? Without this critical thinker in an organization, I fear that firm will be relevant to Mr. Scrooge a hundred years from now!



In order to run a 21st century business, it is imperative to pay attention to the many non-traditional signals that reflect your organization’s health. Do not be inhibited by the ferocity or variety with which data is proliferating – instead, focus on how best to tune into those new signals. Appoint your “quant” or Chief Data Scientist and leverage a real-time data platform, and you’ll lead your company beyond the balance sheet!

I often talk about the unprecedented success of HANA in terms of customers, revenues, and new frontiers; until now, however, I haven't talked much about the unbelievable energy and excitement around the Startup Program, which I commenced almost five months ago. For a young, first-ever program of its kind at SAP, the kind of participation we have seen, not just in the US but across all continents, has been remarkable! It is both humbling and a great testament to times to come when I meet passionate founders and entrepreneurs in the largest startup hubs in the world, energized and enthused to build on our platform. And this is just the beginning. So far we have organized 9 Startup Forums globally; 6 more are planned in the upcoming weeks. It's great to see the program go from strength to strength.


SAP and Startups? Many people have asked us in surprise. The truth is, with SAP HANA, we have a unique opportunity to offer a valuable, disruptive technology to third-party software developers as an open platform for development. Startups, being inherently agile, are well positioned to take advantage of this and really push the technology to its limits. On our part, we have worked hard to live up to our promise as 'The New SAP'. We have been working at startup speed, cutting through red tape wherever possible, reworking processes to be friendlier to young entrepreneurs, and making ourselves available to help them 24x7. In one of our favorite validations, one startup we worked with said "We used to joke in the nineties that SAP stood for 'Slow And Painful'. Now we'll have to eat our words – that just does not apply anymore". When Hasso and I met with the first few startups at SAPPHIRE Orlando, after an energizing one-hour conversation he told me "Vishal, the reason we need to work with these startups is because the stuff they are doing is three years ahead of what our enterprise customers will want". By learning from their imagination we can imbibe the best of the global startup culture and continue delivering the best enterprise solutions on the planet.


So what is this program and how does it work? Currently, we are in the recruitment stage of the Startup Focus program. Besides the popular events in Palo Alto, SAP Startup Forums have been held in Waterloo, Tel Aviv, Seattle, Beijing, Bangalore and, most recently, in Berlin. Yet to come are Startup Forums in Tokyo, Sydney, Dublin, Paris, and Singapore. Overall, we've met with nearly 600 startups and selected the most promising ones to move ahead in the program; only a small percentage of the startups we meet are invited to the next phase. Once the startups with strong HANA use cases are identified through a technical evaluation following the forum, they move into the Development Accelerator (DA) stage, where they receive support in the form of free training, development licenses, and technical support and consulting. Startups that complete a compelling demo in the DA stage are then promoted via a variety of channels, including SAP press releases, SAP events, and social media.


At each of these events, we have been encountering amazing new use cases for HANA. In Palo Alto, we met with Liquid Analytics, whose Liquid Decisions product offers a whole new approach to enterprise BI delivered on a mobile device. In Tel Aviv, we met PrimeQue, offering a unique variation on enterprise semantic search. We were pleased to find the great teams from Cyborg Trading in Waterloo, who are building sophisticated automated trading systems, and Appature in Seattle, who are synthesizing in-depth customer views and using them to measure ROI from marketing campaigns. Everywhere we go we find amazing ideas and inimitable use cases, all dealing with the new possibilities stemming from real-time analytics on large data sets. And once again, new realities that were both unthinkable and unachievable before! So all you founders, entrepreneurs and evangelists out there, if we haven’t met you yet, I look forward to meeting you very soon and working together to make purposeful applications for our future.



For more information about the Startup Focus program, visit the program page at http://www.experiencesaphana.com/community/startups (interested startups can also sign up at the page). You can also find more at the Facebook page at http://www.facebook.com/sapstartups.

August 15 is really vacation time in Europe. Some would consider planning to host the forum on August 15 in Berlin somewhat risky – especially when you have to recruit 30 startups in 6 weeks. The team, all of whom have other “day” jobs at SAP, pulled out all the stops and we had a phenomenal turnout. Thirty-five startups, mostly from Germany and even one from Austria, came to participate in a day of learning and collaboration focused on big data trends and technologies. More importantly, they came to present their HANA use cases and stand a chance to get into the SAP Startup Focus Program.


A day of collaboration packed with an exciting agenda



Cafer Tosun, SVP and Managing Director of the SAP Innovation Center in Potsdam, which sponsored the event, presented the opening note to an audience of 120 invited guests comprising start-ups, representatives from SAP Ventures and Hasso Plattner Ventures, SAP employees and invited media. Matthias Uflacker demoed the results of a HANA use case that the SAP Innovation Center has co-innovated with BigPoint to present real-time offers to online gamers and drive revenue.




The start-ups then got a brief introduction to SAP HANA from Ingo Brenckmann, Senior Development Manager at SAP, before 15 of the startups each gave a 5-minute pitch in 3 different sessions moderated by Craig Cmehil, who was the official “time-keeper”. The remaining start-ups had the opportunity to present their big data solutions in the mix-it session in the afternoon. The agenda additionally featured:

  • A panel discussion with Kaustav Mitra (SAP), Stefan Herr (Ankhor), one of the first “graduates” of the SAP Startup Program, and Rouven Westphal (Hasso Plattner Ventures), moderated by Soenke Moosmann
  • An inspiring executive note from Aiaz Kazi, who reiterated the importance of the focus program, claiming the best ideas for HANA are sure to come from passionate entrepreneurs
  • Startups got to engage with topic experts in intensive deep-dive discussions covering: 
    • SAP HANA by Ingo Brenckmann, Jörn Hartwig (d.Labs)
    • Startup Funding Opportunities by Rouven Westphal/Jens Schmidt-Ehmcke (HPV)
    • SAP Startup Focus Program by Kaustav Mitra and Stefan Herr
    • SAP Innovation Center by Andreas Raab, Gretta Hohl


Exciting Innovative big data use cases from Startups



The start-ups presented use cases from diverse industries such as healthcare, retail, pharmaceuticals, utilities, banking and automotive, to name just a few.

  • HoneyTracks claimed that "games are bigger than Hollywood" and presented a game analytics use case that enables personalized in-game messages to maximize revenue
  • Pixray has image-recognition software that can be used to protect the intellectual property of photographs, digital images and company brands on the web, and already has 50 customers
  • Croking enables companies to analyze and respond to opinions and ideas from consumers on social channels
  • Plista enables personalized reading recommendations and advertisements for web visitors
  • Contatix semi-automates market researchers' work; they developed a model that "learns" and gets smarter the more it is used to crawl the web for documents
  • XQS tracks and traces drugs in the pharmaceutical industry using RFID in order to combat counterfeiting and also find "black holes"
  • SocialMeme analyzes social media data and creates live topic maps to enable businesses to react to impacting news
  • VMS finds correlations between KPIs to detect inefficiencies in operational processes using granular SAP data
  • Acteno helps industrial businesses become energy efficient by enabling performance management of large volumes of smart meter data
  • Next Audience offers "better data than Google" for personalized digital adverts in real time
  • Anfesito presented the next evolution of online banking for consumers, aka "iTunes for personal finances"
  • Myspotworld presented a gaming app that allows consumers to easily identify and buy sustainable products
  • Betterfood made the business case for optimizing the food value chain by noting that Germany wastes 11 million tons of food per year


Opportunities for data collection are growing, especially from the "internet of things", yet companies fail to maximize the value of this data. Recurring themes were:

  1. Personalization – the need to provide more targeted products and services, from medicine to digital messages, offers or adverts.
  2. Optimization – finding ways to reduce “waste”, be it energy or food consumption, or to detect inefficiencies in operational processes.
  3. Better response – be it protecting digital IP or reacting to impacting news or consumer opinion.
  4. Informed consumers – enabling consumers to make better choices, from buying sustainable products to managing personal finances.


The bottom line was that HANA offers startups the ability not only to analyze data faster and initiate corrective responses in real time, but also to scale better than they can today!


Prominent Media Coverage


The event got great media coverage. Here are some key highlights



Social Media Buzz

The Berlin startup event was also a trending topic on Twitter. Thanks to Aiaz Kazi, Hilmar Schepp and Craig Cmehil, who helped drive the buzz on the social channels.




I am already looking forward to being part of further events in the startup program. Follow the conversation on Twitter via #SAP #startups.

Join the global SAP Controlling Community September 24-25, 2012 in San Diego, CA for two exciting days of innovative training sessions led by internationally recognized thought leaders. Connect with a vibrant and growing Controlling community and build lasting relationships.


The latest information on SAP HANA and its impact for controlling users will be presented at the conference.


On Sunday, September 23, Julien Delvat, Chair of the ASUG Managerial Accounting Special Interest Group; Carsten Hilker, Solution Manager at SAP for the Cost and Performance Management portfolio of solutions within ERP (CO) and EPM; and John Jordan, founder and principal consultant at ERP Corp, will present a pre-conference panel discussion on "The future of SAP Controlling and Financials functionality". This session will explore the impact of SAP HANA on SAP's strategic direction for the product.


The following HANA-related sessions will be presented during the conference:

  • Impact of SAP HANA: Revolutionary changes for your SAP Controlling data – Janet Salmon
  • New ways to improve the end user experience in SAP Controlling – Janet Salmon
  • Key considerations for functional and usability upgrades in SAP Controlling with and since ECC 6.0 – Carsten Hilker
  • Reporting options for controlling across the SAP portfolio – Carsten Hilker


Read more about SAP HANA sessions at Controlling 2012. Visit www.Controlling2012.com to learn more about the conference.


Have fun!

Most of you might know the SAP Community Network (SCN), an online network where more than 2 million SAP professionals connect, network and innovate. It is very active on Facebook, Twitter, LinkedIn, YouTube, SlideShare and many more social media sites. SCN also hosts the premier events SAP TechEd and the SAP Technology Forum.


Try to envision the enormous amount of data that has to be handled in the background. Think about the problems that might exist regarding performance and the need to serve customers and partners. Can you imagine the challenges the SAP Community Network team faces in analyzing data in flexible ways and being able to work in an agile manner?


To be honest, the team faced hard times trying to do their job effectively… until they decided to adopt SAP HANA.


Watch the video in which Mark Yolton, SVP Community Network & Social Media, explains the benefits, such as the speed of running analysis, the possibility to ask “what-if" questions and drill down further, the much more beautiful reports using SAP BusinessObjects on top of SAP HANA and, of course, portability and mobility.


SAP HANA provides his team with the support they need to keep the community as happy as it is and let it grow even further.

Robert Klopp

Some External HANA Posts

Posted by Robert Klopp Aug 13, 2012

I have been blogging on data warehousing and HANA-related topics for several months and wanted to share... So here are some specific links from the Database Fog Blog for your consideration.


Exalytics vs. HANA: What are they thinking?

This post has been viewed over 1000 times. It suggests that Exalytics and HANA are not the same beast and will not compete. It is worth viewing the comments.

Who is Massively Parallel?  HANA vs. Teradata and (maybe) Oracle

This note points out the architectural implications of HANA's MPP foundation.

Real-time Analytics and BI: Part 1 - Singing for my Dinner

This considers the implications of real-time analytics and real-time dashboards.

Numbers Everyone Should Know

This note shows the underlying metrics that make HANA fast.

More on Exalytics: How much user data fits? and More on Exalytics Capacity...

These question some of the assumptions made about Exalytics sizing.


There are other notes on Big Data and on hardware architecture as well. These posts are mostly from an outsider peering in. Now, I look forward to posting here and there - as an insider.

In my two previous blogs, SAP BW on HANA and Building the Business Case for SAP BW on HANA, I talked about how IT and Business “BI” governance is needed to fully leverage the potential of SAP BW on HANA. Most readers are probably aware that BI has been struggling to deliver on its business benefits promise. In my opinion, this is in many cases due to the lack of a nimble, Business- and IT-centric governance framework for BI (and its related data). As I said in my previous blogs, there are countless examples of companies that have implemented SAP BW as an afterthought to an SAP ECC effort or viewed SAP BW as simply a list of legacy reports to be “re-invented” on an SAP BW platform.


While most of these implementations were considered successful, they seldom succeeded in delivering meaningful business benefits in terms of “wowing” the business with “cool” functionality or features. SAP BW reporting was viewed as something unglamorous that was produced “in a factory”, where a specification had to be created at the beginning and eventually reports were deployed. Business involvement and ownership was in many cases lukewarm. For many, this approach was considered “good enough” when companies standardized business processes and the focus was on cost savings.


Today, however, I see more companies realizing the importance of making better, fact-based decisions and starting to drive towards the goal of making decisions in real time using incoming online data streams (for instance, fraud detection during a trading session).

Unfortunately, much of today's IT governance for managing reporting takes an “Information Factory Assembly Line” approach, which is not necessarily designed to address these new, fast-changing business demands. I would argue that a new, integrated Business & IT BI/Data Governance approach should be established to accommodate the business's “real-time and consumer” focus.


Let’s take a look at a possible new BI/Data Governance approach through an example of how SAP BW on HANA could function as a significant business enabler. Let's assume that a global company has implemented a Sales & Operations Planning (SOP) business process. At a high level, SOP is a structured process for having one integrated plan for meeting future company goals profitably. This plan is driven by the long-term finance plan and then integrates the Finance, Sales, Marketing, Supply Chain and New Product Development plans. Using a strict business meeting cadence, with a large number of participants and stakeholders across many business functions, SAP BW reports are used as tools to manage the company more effectively. These SAP BW reports use and combine data across the Sales, Finance, Inventory and other domains, requiring an initial complex development effort.

The traditional “Information Factory Assembly Line” approach generally requires a strict process in which the business writes requirements and then hands them over to IT for development. Once IT has completed this task, it hands the reports back for rounds of testing, which finally results in production-ready reports for use in the SOP process. This cumbersome process can be likened to steering an ocean tanker, with long turnaround lead-times.


As you can imagine, this process loses the sustained business accountability and ownership for reports and the underlying data. In addition, for fast business changes the ‘ocean tanker’ steering lead-times are far too long, often leading the business to create their own reports without necessarily having the right data insights, development skills or controls in place. As a result, the SOP cadence becomes a cumbersome process, creating the perception that the SAP BW reports are either a bottleneck or the source of inconsistent data.


With the SAP BW EDW running on HANA, the company now has not only the ocean tanker with all its valuable foundation but also a speedboat that can provide additional information as needed, in real time, as part of the SOP process. For example, let’s assume that right before an SOP meeting an analyst uncovers unexpected sales demand in a certain region while using a BI tool such as BEx on top of SAP HANA. He then easily combines this information with current inventory and factory capacity in SAP BW on HANA and makes new reports available just in time for the SOP meeting. During the meeting these reports could be used to dive into the external sales demand data. This new data can be analyzed on the spot and decisions can be made in real time. Potentially, business work instructions could even be sent out using SAP StreamWork as a result of this analysis. SAP BW on HANA can now be viewed as a powerful and business user-friendly solution in the SOP meetings and become the SAP BI “wow factor” we have been looking for.


So life is great! Well, not so fast! We now have an ocean tanker world and a speedboat world that could potentially collide. Over time, too many reports could be created, data cleansing rules could be defined incompletely, and so on.

These issues could become even more problematic as they potentially remain undetected and only become known “after the fact”.

The traditional “Information Factory Assembly Line” governances are generally of little help, as they do not cover business activities and do not foster business ownership of reports and data.

I would advocate that a new BI/Data Governance model is needed to integrate the ocean tanker and speedboat world that SAP BW on HANA has enabled to meet the real-time and consumer driven business demands. I call this new governance the “Enterprise Data and Reporting Management Governance Model” with the following key responsibilities:

  1. Define and own overall global data and reporting standards
  2. Assign and direct Business and IT standard owners who develop standards within their processes under the umbrella of the global data and reporting standards
  3. Oversee and ensure that Business and IT standard owners implement and enforce compliance with the standards
  4. Manage a cascading governance cadence that connects all relevant Business and IT stakeholders at all levels of the organization, within and across business processes
  5. Facilitate the process of global data and reporting standard innovation, using the standard owners as innovation engines


This new “Enterprise Data and Reporting Management” governance approach would fully expose the power of SAP BW as an EDW on HANA to the business end-user. It would fully deliver on the true self-service model promise while reducing the risk of uncontrolled report/data growth and inconsistencies. It provides the necessary flexibility to adapt to business changes through a self-service approach, while providing integrated business and IT controls that ensure that standards are met and changes are implemented in a timely manner.

It’s only been 12 months since the introduction of SAP HANA to the database world. This revolutionary new database architecture from SAP is an in-memory breakthrough and has captured the imagination and pioneering efforts of enterprises globally. But the reach of HANA has impacted the collective database community all the way out to innovative startups. At SAPPHIRE NOW 2012, SAP’s global customer event in Orlando, 10 startup companies showcased their solutions running on HANA. The upcoming SAP Startup Forum on August 30th in Palo Alto, California will explore big data startups’ potential on HANA.

Even though SAP has been around for 40 years, SAP still has the passion and heart of a startup, and startup forums and the SAP Startup Focus Program are a great example of that mindset. SAP’s objective is to tap into the startup ethos and link some exciting big data or predictive analytics startups with key SAP technologies and decision makers.

With the rollout of SAP Startup Forums worldwide, it is interesting to note that the forums have been receiving significant coverage across a broad spectrum of media channels, including TV coverage on CTV in Canada, print media in Ireland with Silicon Republic (Ireland's leading technology news service) and, of course, multiple touch points across a number of web properties, from GeekWire and the Seattle forum to the Wall Street Journal’s coverage of the new $155 million early-stage venture fund.

Startup forums are a day of collaboration and learning and SAP Startup Forums, including the August 30th event, focus on big data trends and technologies as they relate to SAP HANA. Selected startup companies will have the opportunity to present to a group of attendees from local media, SAP and SAP Ventures as well as exhibit to SAP employees in a tradeshow setting.

Invited startups will contribute to the big data discussion, engage and learn from SAP technical resources, hear from and meet with key SAP leaders, explore customer and funding opportunities, and network and win ‘bragging rights’ over the course of the day.

If you want to follow along virtually, follow #SAP #Startups for all tweets related to these events, or @SAPInMemory for HANA-related news. Feel free to ask questions using #SAP #Startups.

If startups are interested in participating in a future SAP Startup Forum they can register their interest by completing the online form.

For information about the upcoming SAP Startup Forum in Palo Alto, click here.

See what other startups have experienced with SAP HANA and how it is playing a critical role in solving complex business scenarios.

Recently, a senior executive from a well-known technology vendor stated that “economics are not pointing to all in-memory”. This might be one of those quotes that fails the “oops” test in a few years. Memory prices are dropping at 30% every 18 months (according to the same person in the same interview), bringing to mind a quote from another well-known person in the technology business:


“I've said some stupid things and some wrong things, but not that. No one involved in computers would ever say that a certain amount of memory is enough for all time … I keep bumping into that silly quotation attributed to me that says 640K of memory is enough. There's never a citation; the quotation just floats like a rumor, repeated again and again.” – Bill Gates (http://en.wikiquote.org/wiki/Bill_Gates)


Twenty years ago I used to carry IBM tapes on customer calls to give product demos, because I knew everybody had a tape drive for their mainframe since disk was too expensive for anything but the last year or two of data. Good luck trying to find active tape drives in most data centers today.


The same momentum that predicted the move from tape to disk indicates that the move from disk to memory is the next step, and thus that memory-based data solutions are inevitable. Disk will be the near-line option. I think there may be an argument over the timescale (now, two years, maybe even five years), but that is all. Not that there is anything special about memory: if anybody invents a storage capability that is faster than current memory and costs under $5,000 per terabyte, we will all be opening our wallets. Let’s face it, disk I/O is the biggest single technical factor slowing down computing today. Everything else is measured in nanoseconds; disk is measured in milliseconds. It is like comparing a passenger jet to a snail (average speed of a garden snail: 0.03 mph; passenger jet: 500 mph) – but with memory we are talking about a million times faster (our jet would fly at 30,000 mph!). You could travel from San Francisco to New York at a snail’s pace (cost: $60, travel time: 9.5 years), or you could board our hypothetical jet (cost: $4,900, dropping at 20% every year, travel time: 5 minutes). You pick, but I won’t be flying snail class for long, important trips.
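The arithmetic behind the analogy holds up. Here is a quick back-of-envelope check, assuming a San Francisco–New York distance of roughly 2,500 miles (a figure not stated above):

```python
# Back-of-envelope check of the snail-vs-jet analogy.
# Assumption: San Francisco -> New York is taken as ~2,500 miles.
SF_TO_NY_MILES = 2500
SNAIL_MPH = 0.03            # average garden snail, as quoted
MEMORY_SPEEDUP = 1_000_000  # memory vs. disk, order of magnitude

jet_mph = SNAIL_MPH * MEMORY_SPEEDUP              # the hypothetical jet
snail_years = SF_TO_NY_MILES / SNAIL_MPH / (24 * 365.25)
jet_minutes = SF_TO_NY_MILES / jet_mph * 60

print(f"hypothetical jet: {jet_mph:,.0f} mph")    # 30,000 mph
print(f"snail trip: {snail_years:.1f} years")     # 9.5 years
print(f"jet trip: {jet_minutes:.0f} minutes")     # 5 minutes
```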


In the same interview, it was stated that the “Sybase model” was a collection of small data marts. This statement may fall into the Bill Gates 640K comment category. I don’t think Sybase/SAP ever said that. The confusion may arise from our customers who can’t get the speed or flexibility they need from their enterprise data warehouse (EDW) and use our flexible, fast analytics products to handle their burning needs (e.g., senior management needs their report to run in under 15 minutes, not 15 hours).


SAP Sybase IQ, an enterprise data warehouse, can scale up to the largest data sets. As an example, SunGard uses SAP Sybase IQ to address large data volume issues (> 1 trillion rows), to speed query response time, and to reduce administration costs (http://www.sybase.com/detail?id=1095244).

At SAP (which acquired Sybase in 2010) we have a much larger vision for data, and there is more good news for current and future SAP customers. The SAP Real-Time Data Platform, recently announced by Dr. Vishal Sikka, offers a roadmap and vision for the synthesis of SAP’s data management, data movement and analytics products into a cohesive, flexible, and very scalable (note: yes, this means EDW-size) data solution (http://www.sap.com/news-reader/index.epx?pressid=18621).


King Canute commanded the waves to recede, but they didn’t (so he proved his point to his sycophantic court that he was not all powerful, plus he got his feet wet). Commanding customers to stay on their disks and ignore the in-memory wave might be the modern equivalent, sadly without King Canute’s sense of irony!

Lessons from 10,000X SAP HANA Performance Club


SAP HANA is being adopted at an amazing rate, as demonstrated in SAP's strong Q2 results. Then, about a week ago in Beijing at our SAPPHIRE event, more than 100k people listened to my keynote live, and there were crowds 10 deep lined up at the HANA "city" to learn about HANA!  As I was walking the show floor, I ran into Patrick Hoo, the CIO of Nongfu Spring, a beverage company in China, who told me an incredible story. He mentioned how, with HANA, he was able to achieve an astounding 35% improvement in his company's transportation costs by getting dramatically better at calculating and planning optimal truck routes, often on the fly and within a day, when it used to be a day-plus-long batch job before. When I asked him how big a deal this was, he said that transportation costs are by far the largest costs in his company, taking up 15% of Nongfu Spring's revenue! And this got me thinking, on the long flight back, that we have often focused on the breakthrough technological improvements that HANA brings, but not always articulated the business benefits and the end-user empowerment enabled by this technology.


So today, I want to focus on this, and share with you how 17 customers have achieved a 10,000+ times performance improvement in their existing business processes, and what this means to their business.


To be able to easily comprehend the impact of the 10,000x performance improvement that SAP HANA delivers, consider this: a 10,000x improvement in the typical speed of a snail, moving 1 centimeter in 10 seconds, would allow the snail to compete with world record holder Usain Bolt's sub-10s sprint in the 100 meter dash at the London Olympics! From first-hand conversations with many CIOs who have a high innovator's index (see my last blog, Measure of an Innovator: The Innovator’s Index), I have seen this dramatic improvement in performance enable them to change the clock speed of their business, leaving their competitors behind like snails. They can now do more in less time with fewer resources - dramatically improving their current business processes and powering new business models that may provide a snail-versus-Usain improvement in revenue and profitability in the future. How? I have categorized these improvements and opportunities in 3 main areas.
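The snail comparison is easy to verify from the figure quoted above (1 centimeter every 10 seconds):

```python
# Sanity check of the 10,000x snail analogy quoted above.
snail_m_per_s = 0.01 / 10           # 1 centimeter every 10 seconds
boosted = snail_m_per_s * 10_000    # speed after a 10,000x improvement
time_100m = 100 / boosted           # seconds to cover the 100 m dash

print(f"boosted speed: {boosted:.0f} m/s")   # 10 m/s
print(f"100 m time: {time_100m:.1f} s")      # 10.0 s -- right at Bolt's pace
```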


#1. Real-time Analytics

The early use cases with SAP HANA address a conventional pain point by accelerating analytics, and hence insights, on massive volumes of detailed data. End users not only get answers faster (e.g., CO-PA reports that used to take hours can now run within seconds) without disruption, but are also empowered to ask any question without requiring additional investment of IT's time or resources to pre-define queries, pre-aggregate data into tables or cubes, or generate indexes and materialized views. Animal nutrition leader Provimi, for example, sped up their CO-PA report from 10 hours to 2.4 seconds with SAP HANA, covering all company codes and 2 years of data. This accelerated performance (15,000x faster analysis and insight) has enabled Provimi to increase profitability, as business users can make real-time purchasing decisions for raw materials based on the latest demand forecasts. Similarly, Lenovo has improved reporting performance dramatically, cutting down the latency from the time data is entered to the time it is analyzed, thus driving operational excellence in global shipments to meet ever-shrinking product lifecycles. Likewise, Mitsui in the analysis of retail data, Hilti in understanding the profitability of leased tools, and SAP IT in 14,000x faster sales pipeline analysis show how speed of analysis can empower decision makers to analyze massive amounts of diverse data sources with trusted, fresh information in real time and to make sound decisions. Effectively, this helps these organizations drive a culture of data-driven decisions. It bears mention that this speed is not accomplished by sacrificing completeness of data sets or their pristine value, as has often been the technique employed by many in the past.


#2. Real-time Business Process

While real-time insights power operational and strategic decisions, sometimes this same insight can alter business processes, leading to decisions that enable a much greater scale of savings across the company. Surgutneftegas, the oil and gas giant in Russia, enabled an amazing real-time inventory and material management process for 20,000 remote oil wells by delivering the right quantity and type of materials when needed. With SAP HANA, they were able to perform inventory analysis 14,000 times faster to reduce out-of-stock situations - for them this is a mission-critical process in Siberia. Similarly, the Cornell University Bookstore improved inventory turnover and decreased inventory on hand by using SAP HANA to forecast whether a title will be out of stock at any time during the school year. With SAP HANA, the forecasting process was improved by 22,000x (from 10+ days to 39 seconds). Additionally, SAP has further enriched this process by leveraging SAP HANA to recommend both optimal stock levels and sourcing strategies (old vs. new) with an associated confidence range to maximize margins for Cornell - an improvement that was not even considered previously. ARI, a leader in fleet management services, has leveraged SAP HANA to improve critical roadside assistance to customers and can now provide differentiating value to customers with complex fleet services. Nongfu Spring, as I described previously, used SAP HANA to reduce the time (from 24 hours down to 3.7 seconds) it took to do complex supply chain analysis of bottled water from source to retail locations, thus optimizing the business process. Essar, a leader in engineering and construction in India, used SAP HANA to improve its project-level profitability by obtaining real-time insights into the cost, performance and timeliness of projects across the entire group - from 15+ hours down to 4.8 seconds (an improvement of 11,500x). According to Mr. Ram, CIO of Essar, for the first time IT is ahead of the business, and they can now provide new ways of conducting project planning and execution.


#3. Real-time Business Models

Empowering people to make the right decisions and improving the business process would make any CFO or COO happy with SAP HANA. However, what really excites CEOs is the ability to rethink their businesses based on the speed that SAP HANA can deliver. Consider the opportunity to rethink the business models within the airline industry with dynamic pricing optimization based on real-time availability of seats and demand, as Emirates is planning with SAP HANA - a process that took 12 hours to calculate the optimal pricing for a fare class, now takes 0.89 Sec in SAP HANA scanning 165M records (48,539x faster).  One of the largest consumer appliance manufacturers in China - by running the credit verification process 15,000 times faster, they will now be able to perform real-time credit checks and approve loans on the spot. Similarly McLaren's CEO envisions being able to predict and transform the outcome of races based on real-time analysis of car sensor data, leveraging historical data with predictive models for a 14,000x faster data analysis - from 5 hours to 1 second. Yodobashi is transforming its customer relationships (loyalty reward calculation) by leveraging a 100,000x faster incentive calculation (from 3 days to 2-3 seconds). This enables Yodobashi to offer personalized offerings while a customer is at a physical store or on its website. Think about the possibilities for Charmer Sunbelt Beverage, that can now  get the right selection of the perfect beverages to the hottest selling store shelves at the right time through real-time Order-to-Cash calculation across 300 KPIs and multiple systems - this used to take 30 days and now takes only 28 seconds (86,000x improvement). 
Or, consider the potential investment returns for customers of Global Financial Technology (GFT), an investment management software company that uses SAP HANA to accelerate the quantitative analysis of stock portfolios, running its proprietary models to calculate the optimal holding duration that maximizes stock return; they've discovered that with SAP HANA this process can be 10,000 times faster. Think about the social impact that SAP HANA can make when hospitals like Charité can cost-effectively provide personalized healthcare by employing pattern discovery that brings together information from several treatment cases to allow a holistic analysis, eliminating the previously time-consuming manual tasks within the process. These examples clearly demonstrate that speed is a fundamental capability enabling new business models and hence delivering new value and, ultimately, competitive advantage.
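As a quick sanity check on the multipliers quoted in these examples, each improvement factor is simply the old runtime divided by the new one. The following minimal Python sketch (not part of the original post; the figures are taken from the text, and rounding accounts for small differences versus the quoted multipliers) illustrates the arithmetic:

```python
# Sanity-check the speedup multipliers quoted above: speedup = old time / new time.

def speedup(old_seconds: float, new_seconds: float) -> float:
    """Return the improvement factor between two runtimes."""
    return old_seconds / new_seconds

HOUR = 3600  # seconds per hour

# Illustrative before/after runtimes (in seconds) from the examples in the text.
examples = {
    "Emirates fare pricing": (12 * HOUR, 0.89),       # quoted as 48,539x
    "Essar project profitability": (15 * HOUR, 4.8),  # quoted as ~11,500x
    "Nongfu Springs supply chain": (24 * HOUR, 3.7),  # no multiplier quoted
}

for name, (old, new) in examples.items():
    print(f"{name}: {speedup(old, new):,.0f}x faster")
```

Running this reproduces the Emirates figure exactly (12 hours / 0.89 seconds ≈ 48,539x) and comes close to the others, which appear to be rounded.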



Speed in computing has often come at the cost of limiting data or expending resources to curate it. "Real-time" has always meant a quick response, but in the past we seldom saw evidence of measurable impact for businesses from a truly real-time approach. Not until now have businesses experienced resounding business value from speed and real-time response, unimpaired by subjective decisions about which data sets to use or by latency caused by redundant data layers. With SAP HANA, any CIO can dare to dream about delivering innovative business models and achieving higher scores on the innovator's index - not only will they likely see tremendous performance improvements, but they will also witness the execution of processes previously thought impossible owing to the constraints of traditional databases. This renaissance of computing from SAP is well underway, and there is no reason for anyone to be left behind. A dramatic improvement in business is in our hands, if we choose to challenge ourselves. Are you ready to stretch your Innovator's Index? Have you already seen similar benefits? If so, or once you have, the whole HANA and IT community and I would love to hear your thoughts.

Half the readers will be thinking of JAY-Z and half of Billy Joel right now, but that's demographics for you. For various reasons (largely because we believe it is important) I've been heavily involved in creating SAP HANA use cases for the last week or so, and something very interesting happened which I think is worth sharing.


I went around my company, Bluefin, looking for SAP HANA use cases. I went to our vertical Centres of Excellence, to people on projects and - well - pretty much anyone who was willing to talk about it. I tried various approaches, from "give me your top 3 SAP HANA use cases" to harassing and haranguing people. Two people within my team came up with a number of great ideas. The reverse is also true - almost no one else came up with anything at all. What was distinctive about these two people?


First, they both had substantial exposure to SAP HANA, both in terms of creating solutions for SAP HANA and hands-on experience. Second, they also had substantial experience with the wider SAP solution portfolio, including visualisation and integration tools as well as things like Mobile. Third - and crucially - they came up with ideas from pretty much anywhere, including experiential ones.


One had recent experience of someone he knew in a cardiology ward, and had immediately spotted how SAP HANA could improve the patient experience. He didn't need a background in healthcare to spot the SAP HANA benefit. The other had a conversation in a bar about the betting industry, and immediately spotted an opportunity for SAP HANA.


So what did these two have in mind? I believe it is a HANA State of Mind.


What constitutes a HANA State of Mind?


I'm pretty certain it's a "contagious condition": I got it from Steve Lucas and the time we have spent creating solutions and ideas together. I'm also pretty certain it's addictive - because once you start spotting opportunities for SAP HANA, you won't stop until you find ways to disrupt entire industries. I think it's characterised by a few things:


Simplification. The world is complex and good technology makes it less so. SAP HANA is amazing at reducing layers of complexity and entire swathes of IT. Because it is fast, it can replace several existing solutions. When you're in a HANA State of Mind, you see the need for simplification.


Creation. It's easy to come up with ideas to make existing processes better, but where SAP HANA excels is in solving problems that couldn't be solved before. When you're in a HANA State of Mind, you look at the impossible as a perfect opportunity for SAP HANA.


Disruption. There are tons of SAP HANA ideas that make things faster: improved analysis, more detailed analysis, faster response times, real-time. However, these are speed plays, and there will be a new speed play next year, as every CIO knows. But when you're in a HANA State of Mind, you find ways to disrupt industries. Take the Genome Analysis case study, for example - it's simple and obvious when you see it, but it disrupts an industry.


Final thoughts


What I can't figure out is how you spread the condition without immersing people in SAP HANA. The in-memory paradigm requires a slightly different mindset, and I'm not sure people get it until they've experienced it. But what I know for sure is this: it's a pyramid principle - the more people are in a HANA State of Mind, the more will find new uses for it.


And somewhere in all of this, are some absolutely incredible, industry disrupting, ideas. Over to you.

Did you hear about the most recent SAP Inside Track, covering HANA topics recently discussed in the community? With around 100 attendees online and 60 live at SAP's Palo Alto location, there was plenty of engagement during this Inside Track. The on-site attendees asked a lot of questions, and we had an active Twitter feed (#sitpal) with questions coming in virtually as well.

The event started with a hotly discussed topic: BW powered by SAP HANA. Rohit Kamath from the SAP Customer Solution Adoption team demonstrated the modeling and dataflow aspects of a business warehouse system based on SAP HANA. In addition, Jeffrey Krone, Founder and VP of Technical Alliances at Zettaset, presented how automating the Hadoop process with Zettaset can deliver new levels of operational efficiency to the healthcare industry, including faster patient on-boarding.




Yusuf Bashir, Senior Director of Database and Technology Solutions Management at SAP, discussed HANA in the cloud, covering its options and alternatives. Bashir also clarified that SAP HANA cloud is a new deployment option, not a product. In addition, Hari Guleria, BI Business Value Architect and owner of the In-Memory SAP HANA LinkedIn group, discussed HANA in the SAP BI landscape. Chris Hallenbeck, HANA and Analytics Evangelist at SAP, concluded the day by showcasing HANA demos of Advanced Text Search (with an HTML5 UI), Smart Meter Analytics, and BusinessObjects Explorer running on HANA.

I want to thank everyone who made this SAP Inside Track possible. Special thanks go to our external sponsors Zettaset, K2 Partners, and ICM America, and to our speakers - David Hull, Rohit Kamath, Yusuf Bashir, Jeffrey Krone, Hari Guleria, and Chris Hallenbeck - for making this community event successful. And a big shout-out to all our participants, on-site and virtual: your attendance and engagement made this SAP Inside Track fun.

Did you miss the Inside Track? We have the replay (First Part, Second Part) and slides for you! In addition, I highly recommend the Tweet Doc created by Tammy Powlas, SAP Mentor and highly engaged community member. It captures the Twitter conversations around #sitpal and is definitely worth a look.

What is your feedback on the event? Which topics would you like covered next time? Comment below and let us know what we can improve for future events.
