
SAP congratulates the San Francisco Giants and all their fans on a fantastic 2012 World Championship. To help fans re-live the exciting moments of the Giants' journey, we decided to look at some stats from the World Series. See for yourself the five most mind-blowing stats we found.

 

 

1.) Phenomenal Pitching

 

The Giants sent amazing arms to the bump, as the four San Francisco starting pitchers had a combined ERA of 0.98 and the entire staff threw for a gaudy 0.99 ERA in the four Fall Classic victories. Giants ace Matt Cain's 3.86 ERA was the highest for a starter during the four-game sweep, while both Ryan Vogelsong and Madison Bumgarner pitched scoreless frames, the latter allowing only two hits in seven innings. The bearded Giants closer Sergio Romo collected three saves while not allowing a single hit across two ninth-inning outings and the series-clinching 10th-inning outing, in which he struck out the side.

 

The phenomenal Giants pitching held the potent Tigers offense to a dismal .157 batting average. Miguel Cabrera, the first Triple Crown winner since 1967, and Prince Fielder, the Tigers' prime offseason acquisition (usually MLB's most fearsome 3-4 tandem), were only able to muster a meager .148 combined batting average against the monster pitching.

 

 

2.) Panda Power

 

 

With both his big bat and gifted glove, Giants third baseman Pablo Sandoval powered San Francisco to its second World Series title in three seasons. The Kung Fu Panda batted an incredible .500 in the four-game series, knocking in four runs and hitting for 18 total bases. In Game 1, Sandoval sent three balls into the stands at AT&T Park, tying the Major League record for most home runs in a World Series game. Pablo's bat was not the only thing causing havoc for Detroit, as Sandoval also made some spectacular defensive plays at third base, robbing numerous Tigers of base hits. The Venezuelan was honored as World Series MVP for his outstanding performance in helping to bring the Commissioner's Trophy back to the City by the Bay.

 

 

3.) Clutch Two-Out Hitting

 

 

With the Tigers coming off six days of rest after their sweep of the Yankees in the ALCS, Detroit skipper Jim Leyland had ample time to set a pitching rotation that pundits lauded as a formidable four-man starting staff built for the playoffs. Even though they were swept, the Tigers' staff still brought the heat, recording 41 strikeouts in the series. The Giants relied on timely two-out hitting to gain crucial momentum early in Game 1 and rode it all the way to the Commissioner's Trophy.

 

 

Seven of the 16 runs the Giants scored in the series came with two outs, including the first five in Game 1 off reigning AL Cy Young and AL MVP winner Justin Verlander. Marco Scutaro's decisive single in the top of the 10th to seal the sweep also came with two outs.

 

 

4.) Zito’s World Series Redemption

 

 

After being left off the postseason roster in 2010, left-hander Barry Zito shone in Game 1. Ten years after winning the Cy Young Award pitching across the bay, Zito's start marked the longest gap in league history between a pitcher receiving the game's highest pitching honor and starting a World Series. Surprisingly, Zito looked like a Cy Young contender as he defeated reigning AL Cy Young winner and AL strikeout leader Justin Verlander. Zito pitched like it was 2002, allowing only one run over 5.2 innings of work while collecting a crucial Game 1 victory. Zito even drove in a run for his own support, bringing home Brandon Belt with a single to left.

 

 

5.) Freak in the Pen

 

 

After a disappointing 10-15 regular season, two-time Cy Young winner Tim Lincecum was placed in the Giants' bullpen for the postseason. The "Freak" certainly shone in his new duty as a reliever. Lincecum rediscovered his nasty control out of the bullpen, striking out 17 batters and allowing only a single earned run in 13 innings of relief work. Apart from a single walk in Game 3, the "Freak" was perfect in his 4.2 innings of World Series relief work, collecting one hold while fanning eight Tigers. Lincecum's stellar postseason relief work helped crown a difficult personal season with the ultimate team accomplishment: yet another World Series ring.

 

 

How did we accomplish this analysis so quickly? With SAP HANA One!

 

SAP HANA One, the leading in-memory platform SAP HANA delivered in the cloud and hosted by Amazon Web Services, enabled us to quickly and easily put these baseball stats together. If you want to learn more about SAP HANA One, please check here.


Now you can share your HANA experiences with us and get more impressions of the HANA team's day-to-day work! Today we have a special fun task: we will be live at the San Francisco Giants parade to celebrate the latest World Series winners! Our goal is to provide "360 degree parade coverage" for you. For those of you who are not attending, this is also an easy way to connect to the parade via real-time pictures from your friends, colleagues and us.

 


You can get live photo updates from the SF Giants Parade and maybe even some Halloween Shots by completing the following steps:

 

  • Download PhotoTribe (please keep in mind the app is currently only available in the US)
  • Join the HANA tribe: PIN 72599
  • If you are remotely located, share your photos with us – your favorite Giants memories, how you are participating, photos in team gear, etc.
  • Also tweet the photos and share your experience: use these Twitter hashtags:
    #SFGiants, #SFGChamps, #SAP, and #HANA
  • Example Tweet from yesterday:
    #SAP #HANA team will celebrate in person & capture memories for millions of #SFGiants fans worldwide using #PhotoTribe pic.twitter.com/6CaSOexV


Are you wondering how SAP HANA relates to baseball?

To help all fans re-live exciting moments of the SF Giants' journey, we decided to look at the stats of the season and will present the 10 most outstanding facts of the season. SAP HANA One, in the cloud hosted by Amazon Web Services, enabled us to quickly and easily put the baseball analysis together. Find out more about HANA One at http://www.cloud.saphana.com and don't forget to join our HANA Tribe and share your #SFGiants parade experience with us!


Happy HANA!


Here is Vishal Sikka in conversation with Aiaz Kazi, talking about SAP HANA One and what you need to get started.



Did you miss the LIVE Hangout with SAP HANA? I have good news and bad news for you.

 

I'll tell you the bad news first. You missed the live reflection on HANA by our participants, including one surprise guest (actually, even two). Can I get a drumroll, please? Vishal Sikka stopped by spontaneously to talk about the SAP HANA Academy, which offers free SAP HANA tutorials for everyone. In addition, John Appleby joined us on short notice to talk about career opportunities around SAP HANA, including the career guide blogs he has written recently.

 

However, I also have good news for you: we have a replay! You can still watch what our experts had to say about SAP HANA and how happy everyone was to see Vishal join. Here are the top 3 reasons to watch!

 

  1. Get a reflection on recent SAP HANA announcements by key members of the SAP HANA Product Management team, Chris Hallenbeck and Doug Martino; Aiaz Kazi, Head of Technology Marketing at SAP; and John Appleby, Head of Business Analytics & Technology at Bluefin Solutions.
  2. Find out who came up with the name SAP HANA One and why the SAP Startup Focus Program is incredibly important to HANA.
  3. Vishal Sikka's informal live appearance as part of the Google Hangout was the coolest and most special experience of the day! Watch how excited the other participants were to see him join.

 


And as Aiaz proposed, stay tuned for a future HANA Hangout, because it will bring even more fun and games, including some contentious discussions! Thank you to all participants and to Kevin Cassidy, Community Manager at SAP, for moderating the Hangout.

 

Happy Monday!

Based on various comments from my colleagues Ruks Omar, Andrew de Rozairo, David Russell and Stephan Gatien, it became clear that it would be worth making a quick note of how HANA applies to Telco providers. No doubt we'll expand on specific topics in future blogs, but this one will serve to set the ball rolling.

 

Simply put, there are (at least) three main characteristics of HANA that appeal to Telcos.

  1. Scale to handle big data volumes. HANA's unique in-memory approach gives it the ability to very quickly process very large numbers (billions) of records, e.g. CDRs or network telemetry; for example, it can go through 3 billion records in less than a second on an average-sized system. This enables the use of more granular data. A Telco may be more used to dealing with aggregated data; HANA enables them to deal with data at the most granular level, and to do this easily and without being penalised on performance (a hedged sketch of such a query follows this list). Note especially that this data does not need to be data originated by an SAP application.
  2. Simplicity of architecture. Because of the superior performance, plus other factors, HANA provides a much simplified way of approaching analysis by eliminating the need for pre-built layers of data (aggregates, indexes etc.). This makes responding to new requirements, or changes to older ones, much quicker, less costly and more agile; for example, when monitoring a network there is no need to maintain pre-built aggregates or cubes to provide network stats (see later). Other systems may be able to handle the volume, but it is the simplicity and ease that HANA brings to the task that is especially appealing here.
  3. Single platform for multiple types of analysis. The ability to use multiple built-in analytical techniques to analyse data within the same platform, rather than having to use multiple tools, e.g. predictive analytics and text analysis. HANA represents the re-invention of information processing platforms, and one of the principles used in its design was that all the tools you need should be at your fingertips, that is, as callable services, without the need to integrate multiple tool sets simply to do a piece of analysis. HANA already goes a long way to meeting this goal.
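As a rough, hypothetical illustration of the first point, a summary over granular call detail records can be run directly against the raw rows, with no pre-built aggregates. The table and column names below are invented for the sketch and are not from any real schema:

-- Hypothetical sketch: aggregate raw CDRs on the fly instead of
-- reporting against a pre-built summary table.
SELECT
    cell_id,
    CAST(call_start AS DATE)    AS call_day,
    COUNT(*)                    AS calls,
    SUM(duration_sec) / 3600.0  AS call_hours,
    SUM(dropped_flag)           AS dropped_calls
FROM cdr_detail                 -- lowest-level call detail records
WHERE call_start >= ADD_DAYS(CURRENT_DATE, -30)
GROUP BY cell_id, CAST(call_start AS DATE)
ORDER BY dropped_calls DESC;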

HANA Helps Address the Key Elements of Modern Competition

The Telecommunications industry is highly competitive: there is not much margin in voice anymore, and there is intense competition for data packages. However, Telcos have a unique relationship with their customers; they know who they are, potentially where they are, and who they interact with. Thus the problem becomes how to monetize the customer relationship and create new revenue streams (this applies to non-prepay, and needs permission, of course).


  • The need for speed. There is ample commentary about the pace of life, and of business, speeding up. For Telcos, on the business side customers can churn quickly (a 'frictionless' environment), and on the network side congestion problems occur quickly and can disappear again equally fast. Whichever side of the business is impacted, the need to react quickly is very important.
  • A Telco might sell information about customers (with their permission) and make offers for location-based advertising and other offers from third parties. As we said, they have value in knowing who you are and where you are. Being able to quickly and easily navigate and analyse the volume and complexity of data is something HANA is well suited to. Likewise, a challenge is how to build 'sticky' customer sales and services offerings that bind customers to the company. This may equally involve understanding how the customer uses and experiences the Telco services, and the services reached through the Telco.
  • The need for a unified view of the customer, and the need to provide this quickly and easily. This is not just about structured and unstructured data, though this is increasingly important, but also about the basic need to combine BSS and OSS data to get real value. For example, if a customer has just had four dropped calls in the last five minutes (network data), look to see whether they are worth saving (CRM) and what tariff they are on (CRM/billing), and make the right offer with the apology. Whilst this seems obvious, many companies still find it difficult to do (a hedged sketch of such a query follows this list).
  • It is noticeable that, with the move to 4G, Telcos are looking to value-added services that bind the customer close, for example a combined offer of being able to replace a handset within a day with fast backup from the cloud, and to stream movies and other content quickly to the handset. Understanding how to support this and ensure that it is working smoothly may well need real-time analysis, both to ensure good quality customer service and to respond quickly to changing customer requirements. If so, it is easy to see how HANA ticks these boxes.
  • Focusing on 'influencers', either in calling rings or among friends and family. If you have €1m in marketing budget, should you spend it on the 1,800 influencers in your customer base rather than making blanket offers? A competitive threat is that Telcos become 'dis-intermediated', that is, their customers use bandwidth to contact others and buy products from them (e.g. Amazon, eBay, Google Play), with the Telco used simply as a transport mechanism. Understanding the calling patterns between customers and seeing who is an early adopter or influencer can help combat this, but it involves analysis of the calling patterns between customers, not just understanding aggregate calls for a single customer. This is also about analysing social media and seeing who is commenting on a Telco's services, and thus would be expected to be supported by social media analysis too.
  • Web-based data. There is now a wealth of web click data, whether a general stream of social comment from which sentiment can be ascertained, or the ability to analyse customer behaviour on the web (again, permission based).
  • HANA is not just a database. It also contains a text processing engine and can invoke algorithms in its Business Function Library or through 'R', the statistical modelling environment. HANA is already used in conjunction with NoSQL processing such as Hadoop in Finance for risk analysis and for medical genomic processing, and it is natural that it will fill the same kind of role for Telcos. Those of you familiar with the strategy for the SAP Real Time Data Platform will know of the plans to merge the already well developed Hadoop interoperability, proven in the SAP Sybase IQ product, into the combined platform.
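As a hedged sketch of the dropped-calls example above, a query combining OSS (network) and BSS (CRM/billing) data might look like the following; the tables, columns and thresholds are illustrative assumptions only:

-- Hypothetical sketch: find high-value customers who have just suffered
-- several dropped calls, together with their current tariff, so that a
-- retention offer can be made straight away.
SELECT
    c.customer_id,
    c.customer_value_score,            -- CRM: are they worth saving?
    s.tariff_code,                     -- billing: what plan are they on?
    COUNT(*) AS dropped_calls_5_min
FROM network_events e                  -- OSS: raw network events
JOIN customer       c ON c.customer_id = e.customer_id
JOIN subscription   s ON s.customer_id = c.customer_id
WHERE e.event_type  = 'DROPPED_CALL'
  AND e.event_time >= ADD_SECONDS(CURRENT_TIMESTAMP, -300)
GROUP BY c.customer_id, c.customer_value_score, s.tariff_code
HAVING COUNT(*) >= 4                   -- e.g. four dropped calls in five minutes
ORDER BY c.customer_value_score DESC;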

 

Classic Telco Analytic Apps Remain Relevant (but can be improved with HANA)

Having said this, we should not forget that the classic Telco analytical applications are still very much relevant, though they are more likely to be seen these days as a 'ticket to the game' rather than the sources of differentiation they once were. These all require the analysis of large volumes of data, and HANA can do them in a much simplified and more agile way. They include old favourites such as:

 

  • Revenue Assurance, monitoring the whole “call-to-billing” chain looking for opportunities to identify lost revenue, either due to error or fraud.
  • Customer Insight and Customer Segmentation Analysis. The need to gain insight into customers and the various segments they fall into continues to be a key requirement. But it should be remembered that the results are not static; historical segmentation is continually evolving, for example as the "youth" segment of the past splits into various micro-segments.
  • Churn Analysis, predicting what makes people move between providers. Customer profitability is very closely tied to the length of time a customer stays with a provider. In many markets, customers are heavily subsidised up front, with, say, expensive handsets, and the money is made in the later stages of the contract. If they churn, Telcos lose all that 'investment'. This is what drives profitability. Combatting this involves being able to better connect the dots between network activities and individual customer experience, for example to catch potential sources of dissatisfaction such as dropped calls or poor compliance with service level agreements, and to do this early and be proactive in fixing it. It also means working out what drives customer retention or movement (there is no point mailing or calling a customer about improved network performance if they see handsets as fashion items and it is the offer of a new, more fashionable handset that would drive them to change providers).
  • Cross Sell and Upsell – Customer segmentation etc. Understanding the characteristics of sets of customers and how they might be transitioned to other more profitable segments to increase their ARPU (Average Revenue per User).  That is, to understand segmentation so that onward activities such as campaign management can be driven from a position of strength.
  • Tariff Re-Pricing. The ability to go through large volumes of individual calls, re-pricing them to investigate the impact of competitive call plans, lends itself very well to HANA's high-speed, in-memory 'data flow' type architecture. If we need to go through every record and re-price it, touching each and every call as we do, then the responsiveness of HANA can help here (a minimal sketch of this follows the list). By the way, HANA is also excellent for scenario planning of the effect of different tariffs, particularly in conjunction with SAP Convergent Charging.
  • Network Monitoring and Analytics. On the network side, any investment is huge. Bandwidth is constrained, and it is difficult to grow quickly. Any potential to squeeze more out of existing assets is very valuable and drops straight to the bottom line.
    In conversations with Telcos, it seems a common way of doing this is to have a high-volume data feed coming from the network, rolled up and aggregated into a collection of pre-built data cubes that hold pre-aggregated dimensions of data. The reporting is then done against the cubes. This approach works, but it can be very hard to modify or to introduce new reports; if new data is needed then the pre-built cubes have to be redefined and re-aggregated, together with adjusting any partitions and indexes that are used (or new cubes constructed). In addition, the data feed processes need to be modified too. Using HANA, it is easy to see how a simple data model representing the lowest level of data would be implemented, the incoming feeds pointed at it and, er... that's pretty much it. Any aggregation, even against billions of records, would be done on the fly in a few seconds. If a business user wants the data aggregated in a different way, they just go ahead and do it; no fuss, no delay.
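To make the tariff re-pricing idea concrete, here is a minimal, hypothetical sketch that touches every call record and compares the amount actually billed with the revenue under a candidate tariff; the table, columns and candidate rates are invented for illustration:

-- Hypothetical sketch: re-price every individual call under a candidate
-- tariff and compare it with what was actually billed.
SELECT
    SUM(c.billed_amount)                                  AS revenue_current_tariff,
    SUM(c.duration_sec / 60.0 *
        CASE WHEN c.is_peak = 1 THEN 0.10 ELSE 0.04 END)  AS revenue_candidate_tariff
FROM call_detail c                     -- every call record, no pre-aggregation
WHERE c.call_start >= ADD_DAYS(CURRENT_DATE, -90);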

 

Turbo Charging SAP Processing

Lest we forget, many Telcos run SAP. Specific functions such as month-end closing, planning (BPC), financial reporting and Profitability Analysis (CO-PA) can all be considerably sped up and made more flexible by turbo-charging your standard SAP processing with HANA.


If BW is used for operational reporting, then the whole process of loading data, executing reports and making changes can be sped up and made more flexible through the use of HANA. This is because HANA can relax or change many of the previous design requirements needed to ensure performance. Multiple copies of physical data and cubes can be replaced by virtual cubes that are basically views onto the underlying data (a simple, hypothetical illustration follows the links below). This work is known as 'LSA++', in reference to the modifications that have been developed to the SAP Layered Scalable Architecture. If you use BW and want to know how it is being made more flexible and higher performing, including the ability to more easily combine SAP and non-SAP data, then you should get familiar with LSA++.

See: http://spr.ly/bwonhanafaq and http://www.experiencesaphana.com/docs/DOC-2113. Also, we should remember that, whether you use SAP BW or not, if you wish to mix SAP data and non-SAP data then HANA is the ideal vehicle for doing this.
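To illustrate the 'virtual cubes as views' idea, here is a deliberately simplified, hypothetical sketch: instead of persisting a physical aggregate cube, a view over the detail table serves the same reporting need and is always current. The object names are invented for the example and do not represent generated BW objects:

-- Hypothetical sketch: a reporting "cube" expressed as a view over the
-- lowest-level data instead of a physically materialised aggregate.
CREATE VIEW sales_by_region_month AS
SELECT
    region,
    YEAR(order_date)  AS order_year,
    MONTH(order_date) AS order_month,
    SUM(net_amount)   AS net_sales
FROM sales_detail                      -- detail records, loaded once
GROUP BY region, YEAR(order_date), MONTH(order_date);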

 

Real Time Offer Management and Marketing

It is now common to see organisations moving away from regular but relatively infrequent mailings and towards more flexible, frequent and agile targeting of customers. Since Telcos are blessed with being potentially always in touch with their customer, via SMS or their smartphone features, it becomes possible and relatively easy to extend the work done previously with customer segmentation and select smaller and smaller, more targeted 'micro-segments' of customers based on behaviour and demographics. Having done this, it then becomes possible to make offers to small but well-targeted groups on an almost continuous basis; different approaches can be trialled and the responses seen very quickly, which can then be followed up by further steps of the campaign. Thus we can see marketing campaigns potentially being executed hundreds or thousands of times a day, probing the market and measuring what works and what doesn't. See the comments from T-Mobile.
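As a hedged illustration of selecting such a micro-segment, a campaign query might look like the sketch below; the customer attributes and cut-off values are invented for the example:

-- Hypothetical sketch: pick a small, well-targeted micro-segment for an
-- SMS offer, e.g. young prepaid customers with heavy weekend data usage
-- who have not received an offer in the last month.
SELECT customer_id, msisdn
FROM customer_profile
WHERE age_band            = '18-24'
  AND contract_type       = 'PREPAID'
  AND weekend_data_mb_avg > 500
  AND last_offer_date     < ADD_DAYS(CURRENT_DATE, -30);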

 

We can easily see how Real Time Offer Management might be extended into a real-time decisioning capability, calculating the 'best next action'. The real-time detection or sensing of certain activities could lead to better offers, more personalised services and tailored actions. I seem to recall some of the original AT&T research into 'moments of truth', those moments in which your interaction with a company either reinforced or destroyed the illusion of intimacy and customer care; the implication was that you'd really like a company to behave just like a friend: to remember your name, who you are, what your preferences are and the recent experiences you'd gone through. With systems such as HANA it is easy to handle the volumes and complex analysis, and much simpler to set up the system and try out approaches without needing complex systems tuning and design. This is probably the subject for a separate blog.

 

 

Helping Yourself, and Helping Others

Also, we should not forget that some Telcos who are also information providers and systems integrators can create additional revenue streams by providing HANA hosted services and 'HANA in the Cloud' type offerings. The recent HANA Cloud announcements from SAP underline the suitability of HANA in the cloud (after all, what is a cloud made of if not CPUs with attached memory?). So a Telco may make use of HANA for its own analysis, or it may make money by helping others to do the same; for example, see the offering from Savvis.


Summary

As we've seen, the new capabilities that SAP HANA brings to the market (very high performance, simplicity of implementation and the ability to make use of multiple processing techniques in the same platform) are all well suited to addressing Telco problems.

These benefits go beyond simply turbo-charging SAP applications and extend to modern analytical applications, where HANA may be used stand-alone or in complementing other 'Big Data' technologies. At the same time, SAP HANA can bring a fresh and simpler approach to the more classic Telco analytical applications such as Revenue Assurance, Network Monitoring and Churn Analysis.

We hope that this quick and simple summary helps you understand the relevance of HANA within Telco, and we look forward to developing this theme in upcoming blogs.


For further ideas on how HANA can be used across industries, please see the Use Case repository.


The mission of E-intake, a Santa Barbara, CA based startup, is to help the health-conscious, and the not so health-conscious, meet life and health goals with a personalized mobile health app, while letting those who are open to coupons or offers receive them from vendors of the products they consume most often.

 

One mission-critical problem we needed to solve early in the development cycle was how to deliver a real-time mobile experience to a base of a million users who, in the course of a month, will generate billions of user consumption records.

 

SAP HANA's Startup Focus program provided E-intake with a way to get the hands-on training and access to resources needed to verify that HANA would solve that critical problem. The results: analytic queries for product trending and product suggestions that took eight seconds on a conventional relational database were resolved in a couple of milliseconds on HANA, all without spending engineering effort to set up a caching infrastructure. Our developers also appreciated the SAP HANA studio, because having such a robust suite of analytic tools saved them time.

Within 5 years, there will be 50 Billion smart, connected “Things” in the world – sensors, equipment, devices, and machines – all transmitting massive amounts of data.  This growth is creating an opportunity for leading equipment manufacturers to take advantage of these remote connections and create advantages in efficiency, innovation, and responsiveness, displacing their “non-connected” peers and fundamentally altering markets as we know them today.

 

The ThingWorx software platform is enabling this new era of “Connected Businesses” by linking people & systems with these billions of smart sensors, devices & machines.

 

We are working with a number of customers, across different industries, to deliver Remote Monitoring and Remote Service Management solutions that utilize remote equipment data.  In Remote Services Management, large equipment manufacturers as well as highly specialized device manufacturers (such as medical devices, from laboratory equipment to MRI machines) remotely monitor and maintain the equipment that they have sold to their customers through after-market service contracts, collecting massive amounts of data along the way.  They use this data to help diagnose problems, as well as feedback to product design.  Typically, data is collected from each device/machine at relatively high frequencies for the life of the connected equipment.

 

This amount of data can quickly become a big data problem.

 

ThingWorx has the ability to collect this remote equipment data and store it in SAP HANA. We can then use a combination of ThingWorx semantics and analytical UI capabilities on top of the HANA data store and its data analysis to create additional value from the collected data.

 

SAP HANA becomes extremely important in the downstream analysis of this data. The types of analyses seen include the following (a sketch of the first appears after the list):

  • Analyze a machine’s performance metrics against the “optimal” performance metrics
  • Analyze a piece of equipment's history to see where degradation may have started
  • Predictive algorithms for detecting a machine fault or degraded performance BEFORE it happens (this also requires a great deal of data to be collected and analyzed in order to arrive at the correct predictive algorithms)
  • Optimization routines to help customers get better results from the equipment they are running
  • Feedback to sales on equipment that is overloaded, or for replenishment of consumables related to a machine
  • Feedback to product design teams on how a product family is utilized and is performing as a whole
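As a hedged sketch of the first analysis in the list above, comparing each machine's recent readings with the "optimal" values defined for its model might look like the following; the table and column names are illustrative assumptions, not part of the ThingWorx or SAP HANA schema:

-- Hypothetical sketch: compare recent sensor readings per machine and
-- metric with the optimal values defined for the machine's model.
SELECT
    r.machine_id,
    r.metric_name,
    AVG(r.metric_value)                         AS actual_avg,
    MAX(o.optimal_value)                        AS optimal_value,
    AVG(r.metric_value) - MAX(o.optimal_value)  AS deviation
FROM sensor_readings r
JOIN optimal_metrics o
  ON  o.model_id    = r.model_id
  AND o.metric_name = r.metric_name
WHERE r.reading_time >= ADD_DAYS(CURRENT_DATE, -7)
GROUP BY r.machine_id, r.metric_name
ORDER BY deviation DESC;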

 

These examples are just a glimpse of what is possible with ThingWorx and SAP.  Together, ThingWorx and SAP will enable the next generation of innovative solutions for leading companies looking to extend their value to the new world of 50 Billion smart, connected things and transform their business to a Connected Business.

I'm very excited to announce that we are going to hold our first ever SAP HANA and Big Data Google+ Hangout. There's been a lot of talk in the industry lately around in-memory computing, big data, and SAP HANA more specifically, and we feel it's time to discuss. SAP HANA experts Aiaz Kazi, SAP Mentor and Head of Technology & Innovation Marketing, and Amit Sinha, Head of Database & Technology Marketing at SAP, will be joining SAP Community Manager Kevin Cassidy to talk all things big data, streamed LIVE for your viewing pleasure. And we even have a special guest joining us, Irfan Khan, CTO of Sybase, who will share his insights around Big Data.

 

What is a Google+ Hangout? A Google+ Hangout is a modern take on the internet chat room. Panelists can attend via their webcam to hold a live discussion which is then streamed publicly on SAP’s Google+ profile, YouTube channel and/or website.

 

The SAP HANA Hangout will take place on Monday, October 29, 2012 at 9 am PT / 12 pm ET. It will be streamed to SAPHANA.com, YouTube and http://spr.ly/HANAHangout (please keep in mind that the link won't be live until 12 pm ET on Monday). Aiaz and Amit have been on the SAP HANA team since the very beginning and have valuable insights. You don't want to miss this session with three key experts.

 

Let your voice be heard! If you have any in-memory technology/SAP HANA/Big Data related questions, please let us know by commenting below and we'll make sure to add the question to our Google+ Hangout! Please keep in mind that the link will not go live until 9 AM PT/ 12 PM ET next Monday.  http://spr.ly/HANAHangout.



SAP held the first ever SAP Startup Forum in Sydney on 21 September. This global initiative offered startup companies a unique opportunity to create easy-to-access solutions to help solve real-world problems using SAP HANA. We saw some great innovations during this first SAP Startup Forum in Australia where eight startups pitched their solutions and demonstrated their use cases for HANA.


Six have been selected to participate in the Development Accelerator (DA) program. The companies were selected based on a number of criteria: the innovation of the idea; the feasibility of the business case; proximity to SAP's portfolio; fit with SAP HANA; technical feasibility; and whether the solution could be market ready within one year and have a prototype in time for SAPPHIRE NOW Madrid in November 2012.


The DA kicks off with a 2-day boot camp where the startups can learn more about SAP HANA and discuss and validate their business case with HANA experts. The primary objective is to get SAP HANA proof points by testing / porting startup apps to the platform.  SAP provides the participants with a free version of HANA to build and test with as well as additional technical support.


The six startups chosen were:


SenSen Networks: a video and business analytics (VBA) solutions company. It can convert video images into accurate business analytics data to deliver increased productivity, revenue and novel insights for decision makers in a wide range of businesses. SenSen VBA solutions detect, extract and analyse relevant events from video streams and render them as valuable information to businesses.


SummaBI: a cloud business intelligence tool, developed specifically for hospitality and retail sales businesses/franchises.


Resonate: a provider of customer feedback (Net Promoter Score) programs and data visualisation solutions for large brands in the Asia Pacific region. Resonate identifies the key customer drivers, turns unhappy customers into brand advocates and offers a closed-loop feedback program.


baank: a website and mobile app that helps consumers manage their money, achieve saving goals and get the best deals.


EngineRoom: a specialist in enabling the world to leverage, discover and create value from their data. It brings big data processing on demand.


GoCatch: a smartphone application that connects taxi drivers directly with passengers. The taxi-location app uses GPS to allow passengers to book and track taxis close to their location using their smartphones. Thousands of real taxis from Australian cities can now be seen driving in real-time through the 3D world of Google Earth.

GoCatch is the winner of the People’s Choice Award (voted by SAP Staff).

What a Week!

Posted by Vishal Sikka Oct 22, 2012

This morning I woke up to the sight of cirrocumulus cloud formations (“Pakhshabh Megh”, for the Indians amongst us) outside my window; these clouds are indicative of a looming change in weather.  Perhaps an impending cooling of the air, indicative of autumn finally making its way to Silicon Valley.


I also found myself reflecting on quite an amazing week we just had, between the SAP TechEd conference in Las Vegas and our HANA customer council in New York. In our own industry, where companies are reporting soft results, and indeed in the world at large, there is a clear change in weather ahead. Software is transforming the world in a fundamental way. On the one hand, the non-linear improvements in cost-performance of hardware, communication, data storage, and even delivery models and software development tools, mean that our articulation of software's intent is key to the future, and given the non-linear growth in the cost of adding to the incremental complexity of a landscape, a non-disruptive renewal strategy is the only viable one. On the other hand, a singular emerging focus on the end-user and their empowerment means we must focus on amplifying their reach with technology, with design. With these three imperatives, design-thinking, non-disruptive renewal, and a new real-time platform for our growth, we approached the week.

 

This week we made five substantial announcements that demonstrate the steps we are taking towards contributing to this ongoing shift and these imperatives.

 

1. HANA’s evolution from Database to a platform:

  1. SAP HANA began by eliminating the boundary between OLTP and OLAP as a database. It is now dissolving the boundary between database and application servers to create a modern platform.
  2. We showcased our cloud platform, SAP HANA Cloud, with application and data services on a RAM-optimized cloud. It is a much simpler construct that helps create new and relevant experiences for future applications. As part of this vision, we announced the general availability of the SAP NetWeaver Cloud service, which now runs on HANA and already has more than 700 users on it! With free developer licenses available, you have to download and try this out.

 

2. The unmistakable demonstration of HANA’s scale:

  1. We showcased the world's largest in-memory system, with one petabyte of raw data in SAP HANA. Besides the stats in my speech, visitors to the test drive area got live access to the system to throw any query at it and see the results for themselves. It held 1.2 trillion records, about 10 years' worth of transactions from a large retailer, being analyzed in sub-seconds.
  2. For those who doubted a columnar database's ability to process transactions at scale, we showcased 1.5M inserts/second, demonstrating unmistakable transactional excellence. We also launched SAP Business One on HANA, the first app on SAP HANA renewed completely to combine transactions and analytics on one platform.
  3. We showed new platform capabilities, especially real-time programming, application services and RDL (River Definition Language).

 

3. A revolutionary launch of SAP HANA One, on the Amazon public cloud:

  1. SAP can do big deployments with ease; it’s the small ones where people think of SAP as expensive with longer time to value. Well, think again! With SAP HANA One, a single unit deployment of SAP HANA on Amazon Web Services (AWS) platform at $0.99/hour, companies big and small, startups and ISVs can get started in under 10 minutes and deploy applications on HANA productively. SAP HANA One is not only an innovation in enabling in-memory technology on AWS, but also in business, pricing and open partnership models.
  2. On SAP HANA One, we launched SAP EPM OnDemand for Expense Insight at $0.49/hour, and our startup partner Taulia launched CloudFinance, a dynamic discounting process with real-time offers. CloudFinance allows third-party financing organizations to make risk-free, short-term investments by purchasing suppliers' receivables and collecting full payment from buyers. Through SAP HANA, Taulia is able to do extensive risk profiling of suppliers and their receivables in real time, allowing potential third-party payers to provide funds based on up-to-the-minute calculations.

 

4. We demonstrated a true ecosystem forming around HANA, especially developers, innovating rapidly on an open blueprint of innovation:

  1. There are now more than 100 startups on HANA. These startups have utilized SAP HANA for its capabilities, showcasing breakthrough results. They are doing amazing and truly inspirational things, from supply chain integrity to forecasting epidemics to interactive 3D star charts with billions of stars.
  2. The SI ecosystem around SAP HANA is growing rapidly in extent and expertise. TechEd had a crazy race (aka SMACKDOWN) where two SIs, Bluefin and Optimal, competed to migrate the ConAgra Foods BPC+BW system to HANA within 48 hours on the show floor. Optimal won the competition, showcasing end-to-end solution delivery on BW/BPC/mobile, while Bluefin won the HANA technology award for a fast migration (under 3 hours!).
  3. HP launched the first disaster tolerant solution for SAP HANA while IBM and Intel collaborated on the petabyte scale 100 node SAP HANA cluster.
  4. We announced the first SAP HANA Distinguished Engineers from the community who have expertise and contribute to the growing body of knowledge on SAP HANA.
  5. We launched SAP HANA Academy with 125+ videos, all for free.  It provides easy access, easy learning and community contribution direct from the developers and distinguished engineers. This is simply awesome!

 

5. And last but not least, the amazing synchrony and convergence, both of minds and of roadmaps, with Sybase. Our friends, and now our fellow warriors; together, I have no doubt, we make the best database team in the industry. The combination provides innovation, not just integration, with many industry-first breakthroughs such as a unified table, ultra-low-latency analysis, extension to massive data sizes with disk tables, common query federation, Hadoop integration and a single modeling environment.

 

After TechEd I went to New York, where more than 100 customers joined us for the HANA council, and the buzz was just incredible.  We saw an embrace of design-thinking where customers went beyond the speed gains with SAP HANA to rethink their business processes and do non-disruptive renewal of their enterprise landscapes simultaneously. The buzz on SAP’s efforts with consumer applications, app designer and personas to power next generation user experience was palpable.

 

And as we close the week, these numbers are already changing. 37 customers are already on HANA One, 751 users have signed up for SAP NetWeaver Cloud on HANA, there have been 8,166 downloads of SAP Visual Intelligence on HANA, and Optimal and Bluefin demonstrated moving the ConAgra BW system to HANA in 3 hours! It is a different ball game now, a different reality.

 

So as I look ahead to the rest of the year and beyond, SAP HANA will strengthen as the platform for business. It will be a modern platform for startups, developers and customers to build on, without the constraints imposed by previous generations of technology. A great platform provides the foundation for what I call "purposeful innovation" that changes the way we think and work. At SAP TechEd, one did not need to look far to find the seeds of future purposeful innovations. For instance, three middle school kids, Nikola Bura, 11, Jordan Qassis, 12, and Angelo Castro, 13, all students at Roberto Clemente Middle School, presented "Food Agent," an iPhone app that scans quick response (QR) codes on grocery products and provides consumers with information on an item's origin and ingredients. This was awesome! To see these young developers compete with seasoned experts on our new platform with a "purposeful innovation" shows that change is on the way; cirrocumulus clouds are harbingers of a great future ahead.

Liquid Analytics is where mobility meets big data.  We provide users with engaging and inspiring mobile experiences along with real-time, actionable insights on their big data, enabling them to act on current opportunities and issues in real-time.  From our “Insightful Experience” strategy sessions that help companies jumpstart their enterprise mobility strategy, to rapid lightweight “Value Apps” running on HANA, to implementation of “Liquid Decisions”, our goal-driven app that runs on HANA, we bring the promise of mobility, combined with big data, to the SAP product suite.


Liquid Decisions provides real-time actionable insights to the Wholesale/Direct Store Distribution ("DSD"), Logistics, and Warehouse and Factory Materials Management ("IM&C") markets via proprietary predictive algorithms. For example, we provide restock product recommendations tied to accounts to help Wholesale/DSD companies manage utilization of shelf space and ensure their products are never out of stock on retailers' shelves, and we provide recommended products to Wholesale/DSD companies for rapid order execution.


In addition to the Wholesale/DSD industry, our Liquid Decisions solutions for the IM&C space give factory managers the ability to track crucial metrics on components and visualize their factory floors in real time. Finally, we enable Logistics companies to track their shipments to ensure they will arrive at their destination on time.


Liquid Decisions InsightPops enable the execution of large volumes of transactions in a very short period of time, such as sales reps completing 80% of their orders 20 minutes before the end-of-day cutoff. Liquid Decisions mines voluminous amounts of data from SAP, BI systems, cloud services (such as Salesforce.com) and other sources, and performs statistical processing and complex event processing across these sources. SAP HANA enables Liquid Decisions to process our InsightPop algorithms in-memory and in real time.


SAP HANA is more than just hardware optimization. SAP HANA's proprietary underlying data model enables the rapid import of data from external systems as well as rapid calculation of aggregate values. SAP HANA enables Liquid Decisions to generate insights and push them to users' iPads and other mobile devices faster than any alternative solution today. This is useful for our customers who use batch processing. A great example of this is Logistics customers who do end-of-day truck layout and route planning; this can be accomplished up to 525x faster with HANA. The result is insightful analytics and inspiring user experiences, all in real time.

First take from a long-time user of the SAP HANA platform on Amazon Web Services (AWS)


Let me get started with the facts before I expand on my own experience with SAP HANA on Amazon Web Services (AWS):

 

Vishal Sikka announced the availability of the SAP HANA platform in the public cloud during his TechEd keynote, which was packed with exciting news this morning! The brand new offering, called SAP HANA One, is hosted by Amazon Web Services (AWS) and can be consumed via the AWS Marketplace. SAP HANA One allows anybody, including SAP customers, SAP partners, independent ISVs or SAP itself, to build and deploy breakthrough SAP HANA solutions with minimal upfront investment in a do-it-yourself, pay-as-you-go fashion.

 

Starting up an SAP HANA instance on AWS takes only a few minutes, plus some time to download, install and configure SAP HANA Studio. If you have done it before, all those tasks combined will take much less than 30 minutes. And if you have never done it before, we put the required steps at your fingertips at www.cloud.saphana.com.

 

 

What are good SAP HANA One use cases?

While it provides you with the end-to-end platform capabilities of SAP HANA, SAP HANA One is limited to 60GB of main memory (using currently available AWS instance types). Half of that is required to run the database itself, leaving you with about 30GB for data storage. With data compression that ranges up to 70% in some customer examples, this is a good amount of space, but not enough to run an entire company on. Therefore, SAP HANA One is best suited for departmental use cases instead of company-wide deployments.
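To put rough numbers on that (an estimate only, not a sizing guideline): at a 70% compression rate, compressed data occupies about 30% of its raw size, so roughly 30GB of free memory corresponds to about 30GB / 0.3 ≈ 100GB of raw source data; plenty for a departmental data mart, but clearly not an entire enterprise.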

Building and deploying these use cases plays a critical role in your ongoing innovation initiatives as a company. SAP HANA One helps make that innovation easier by delivering on its promise to get mission-critical breakthrough applications up and running quickly, without any upfront investment and at minimal ongoing cost. Have a look at these three examples: one from SAP, one from a partner, and the last one a showcase of a custom solution:

 

  • SAP EPM onDemand for Expense Insight is a solution powered by SAP HANA One. Providing department managers with real-time insight into their expenses, this solution is commercially available via the AWS Marketplace right now.
  • Taulia is an expert in helping large organizations reduce their total spend. An early adopter of SAP HANA One, Taulia has built an application that enables companies to do extensive risk profiling of payment accounts in real time (www.taulia.com).
  • Sentinel is a showcase application running on SAP HANA One. This application fuses algorithmic trading with deep text analytics to boost investment portfolios, time trades, and find the hottest stocks via social media buzz.

All three are good examples of what customers and partners can develop and deploy using SAP HANA One on AWS.

 

 

So Why am I so Excited About this Announcement?

Way back in the dawn of SAP HANA, I had the job of getting some SAP HANA test-drive demos up and running on the www.experiencesaphana.com web site. Things were moving fast (and have not slowed down since). My team needed to go live with the demos in time for a keynote address by Vishal last year. We had no time to wade through a formal purchasing process, so I decided to use the Amazon Web Services (AWS) cloud option. Result: our SAP HANA demos worked and performed beautifully, and have run just as well ever since. Make sure to check out the test drives if you have not done so yet.

As with SAP HANA One, the key value proposition for me and my team was to use the SAP HANA platform with little upfront investment. The thought of purchasing hardware for the deployment of online demos gave me heartburn. AWS was the solution to the problem. In the end, we were able to deliver close to 10k demos in one month alone.

 

 

How to Find out More about SAP HANA One?


Visit us at TechEd Las Vegas!

For anybody at SAP TechEd in Las Vegas this week, please visit us in the Clubhouse and look for the kiosk with the title “SAP HANA One”. There you’ll find advice from SAP HANA and AWS experts. You can also get your own promotional $25 coupon to be used on AWS for deploying SAP HANA One.

 

Not in Las Vegas? No problem!

For anybody else who did not make it to TechEd, we have documented all the facts and necessary steps at www.cloud.saphana.com. This link also guides you to a discussion forum on the SAP HANA website enabling community support for SAP HANA One. The first 2000 customers logging on to this community support page may request a $25 AWS coupon for use on AWS.

 

 

Summary

I hope you are seeing the potential that SAP HANA One provides to SAP customers, and the ease of use and speed of deployment of a true breakthrough SAP HANA solution.

 

I am sure the new offering will take your implementation …

 

 

                                … from Design time to Run Time in No Time at All!

 

Feedback and questions are encouraged!

1.2 Trillion records in less than a second!

 

You may have seen the latest TechEd announcement from Vishal Sikka where we announced our latest performance and scalability test results for HANA.  It is just incredible to see how far we can take HANA.

 

Recently, we built the world's largest in-memory HANA system, with 100 nodes, 4,000 cores, and over 100 TB of RAM in our co-location facility in Santa Clara, California. We then loaded this system with over 1 PB of raw uncompressed data representing 1.2 trillion records. That is like 10 years of sales data for a very large enterprise at over 328 million transactions a day. Now imagine if you wanted to do a year-over-year comparison of sales history for any time window in those 10 years. Not just the last couple of years, but think about comparing year 10 to year 1. In a traditional Oracle database, you would need to find a DBA to get that year 1 data out of the archives to even start the analysis.
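As a hedged sketch of what such a year-over-year comparison might look like when run directly against all ten years of detail records, the table, columns and years below are placeholders invented for illustration:

-- Hypothetical sketch: compare year 10 of sales history with year 1,
-- aggregating directly over the raw transaction records.
SELECT
    product_group,
    SUM(CASE WHEN YEAR(order_date) = 2002 THEN net_amount ELSE 0 END) AS sales_year_1,
    SUM(CASE WHEN YEAR(order_date) = 2011 THEN net_amount ELSE 0 END) AS sales_year_10
FROM sales_transactions               -- all detail rows, no archive retrieval
WHERE YEAR(order_date) IN (2002, 2011)
GROUP BY product_group
ORDER BY product_group;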

 

How about on our 100TB system? Our performance tests on this "big data" show a fast response time of 0.43 to 0.50 seconds for ad-hoc sales and distribution queries, and 1.2 to 3.1 seconds for more complex year-over-year trending queries over different time periods.


No waiting and no DBAs. 


Just breakthrough performance and scalability.


And there's more to come. More test results, white papers, and details.


And if you happen to be at TechEd Las Vegas, be sure to visit the Test Drive area in the Technology Showcase, where you can find out more about the testing, and talk to the performance experts that made it happen.


By Jon Siegal, Founder & CEO, Fan Appz, Inc.

SPECIAL NOTE: For those attending SAP TechEd in Las Vegas this week, you can see the Fan Appz demo today (10/16) or tomorrow (10/17) in the clubhouse demo area. Check the schedule for final times. You can also find Jon on Twitter @jonsiegal (http://www.twitter.com/jonsiegal).

 

Fan Appz, a growing startup in the heart of Los Angeles’ ‘Silicon Beach,’ helps such brands as FOX, NBC Universal, and Zales turn casual fans into loyal customers. Our state-of-the-art Personalized Marketing Platform gives enterprise customers everything they need to grow, engage, learn from and convert their followers into repeat buyers and – ultimately – brand evangelists.

 

Fan Appz clients have some of the largest social media audiences in the world, representing hundreds of millions of fans and followers. Our platform provides brands with an extensive array of Social Experiences to engage their social media audiences, and to capture and remember valuable behavioral and social data from every interaction.

 

Each social interaction a brand secures through one of our Social Experience Applications brings insights that can be used to improve the personalization, targeting and performance of their marketing efforts. Clients can also ask for personal inputs on specific topics to further customize their marketing and communications with fans, prospects and customers alike.

 

For example, one of our large retail clients used our Social Experience Apps to ask moms on Facebook to rank which items they most wanted for Mother’s Day. Our client used that data to create a multi-channel marketing campaign across its social, email and print catalog efforts, as well as on-site and in-store promotions. The campaign drove a significant increase in sales of the items the moms picked for that season, and has inspired a multi-year collaborative engagement with fans.

 

As the sophistication of our clients’ marketing campaigns grows, so does the amount and complexity of the data we’re helping them manage. By March of 2012, that explosion of relevant social data inputs posed both technical and economic challenges that we needed to solve in order to support clients’ growing needs and our own strategic goals.

 

After all, our mission is to help brands use social data insights to improve the targeting, personalization and performance of all of their marketing efforts, from social, email and online advertising, to on-site and in-store promotions. Call it a marketer’s utopian dream, but our vision is for brands to address each fan as an individual, not as part of some mythical mass audience.

 

Hence, this spring we began looking for a technology partner whose underlying technology was capable of addressing our technical and business requirements. At the same time, we sought a partner who embraced our vision and ambitious goals, and could provide a level of collaboration and support that would accelerate our ability to meet product development milestones.

 

Enter SAP HANA. Through our research, we quickly discovered SAP HANA’s next-generation database solution. It enabled us to provide our clients with real-time analytical processing of large volumes of complex data. And through joining the SAP HANA Experience program, we were able to get the hands-on training and access to resources we needed to very quickly begin development.

 

Some might call it a dream to address each fan, prospect or customer as an individual. But we’re confident that, someday soon, we’ll just call it smart marketing.

 

Please let us know if you have any questions about our work with SAP HANA and we welcome your comments below.


SCOOPing your Enterprise - ERP and BI reunited: Steering and Operating Your Enterprise in Real-time

 

What is SCOOP all about?

 

SCOOP is about Seeking Cash Opportunities in Operational Processes.

 

Fascinatingly enough, there is a huge gap between the controlling of enterprises at large and the controlling of their business processes. Nowadays many, if not all, enterprises are managed towards shareholder value. But financial insights are rarely leveraged for business process risk and reward decisions. The reason is simply that there is no common knowledge about how, for example, a change in on-time delivery impacts the Days Sales Outstanding (DSO) of the enterprise.

 

SCOOP closes this gap. SCOOP enables better decisions. Tradeoffs become educated. Risks can be managed. Qualified targets can be set. Operational performance can be controlled.

 

What does SCOOP do?

SCOOP analyzes the actual business operations of an enterprise. Technically, it scans the actual posting documents (Belege) in the live SAP system. By applying sophisticated statistical methods, prognoses and simulations of operational processes can be performed. For this, neither business process modeling nor instrumentation of the live SAP system is required.
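To give a feel for the kind of figure involved (a simplified, hypothetical sketch, not how SCOOP itself is implemented), a Days Sales Outstanding approximation can be computed directly from posting-level data; the table and column names are invented:

-- Hypothetical sketch: approximate DSO from open receivables and the
-- last 365 days of revenue postings.
SELECT
    (SELECT SUM(open_amount) FROM ar_open_items)              -- accounts receivable
    /
    (SELECT SUM(net_amount)  FROM revenue_postings
      WHERE posting_date >= ADD_DAYS(CURRENT_DATE, -365))     -- credit sales, last year
    * 365                                                     AS days_sales_outstanding
FROM dummy;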

 

Why was SCOOP impossible before HANA?

The sheer amount of data to be analyzed forced people to revert to offline processing. Business processes got modeled and reflected in a star schema. Operational (ERP) data got extracted (selectively), transformed (cleansed and aggregated) and loaded (regularly and frequently) into so-called data warehouses. On these, all sorts of business intelligence (BI) analyses could take place, until the pre-thought business process models broke or detailed data was missing from the warehouse. BI turns out to be not only too slow but also too inflexible for fast-running enterprises.

 

With HANA the world has changed. ERP data can be analyzed on the fly. BI analyses can be executed on the original data. Offline becomes online. Hindsight forensics turns into simulations and prognoses of the operational performance of an enterprise.

 

What is VMS all about?

VMS specializes in optimizing the return on investment in SAP systems. VMS provides SAP customers with actionable, objective, fact-based cost and performance prognoses. For this purpose, the actually executed business transactions, the customizing, support and maintenance processes, and the technical environment of the SAP operation are gauged and analyzed. VMS uses sophisticated statistical methods and its collective knowledge of 2,700 measured SAP systems (among them 650 SAP BW systems). The result is optimal business operation by making the best use of SAP software: low costs, high performance, optimal business processes.

 

The author is Chief Delivery Officer at VMS AG in Heidelberg, Germany.


Customers dealing with large volumes of data are looking for the best possible solutions to analyze that data. For example, in the utilities industry there will be an explosion of data as a result of the planned roll-out of smart meters across various countries.


This blog aims to provide an overview of which of the existing SAP in-memory analytic options, "BW on HANA" or "SAP HANA as a data mart", best suits the most complex requirements: the ability to analyze data rapidly, and the significant advantage of a quicker implementation of a standalone instance, with the option of consolidating the existing BW and the fresh BW instance later.


Factors which would influence this decision are:

1. Upgrading the existing BW 7.0 system.


2. Customers are clear on what data to put into HANA, so that the HANA memory sizing can be limited, which is a decisive factor.


3. Time constraints on delivering regulatory and analytics reports to the business – hence the decision between a standalone data mart with SAP HANA or SAP BW on HANA for a faster implementation.


Fig 1: SAP HANA as Datamart

Fig 2: BW on HANA as Datamart


Pros and Cons of using SAP HANA as a Data Mart


Even though SAP HANA as a data mart is powered by in-memory processing, only reporting is accelerated, not ETL. In addition, it requires skills in HANA modeling, SQLScript, IMC Studio, and so on, as well as additional skills for appliance administration.

  • Additional administration & skill set requirements

For customers already running SAP BW, there is an overhead in development, administration, and maintenance skills for both the existing SAP BW instance and a separate SAP HANA data mart instance.


  • Complex reporting requirements

Data-level security options are limited. For customers with complex reporting requirements such as time-dependent master data and dynamic hierarchies, it will be difficult to handle these requirements using SAP HANA as a standalone data mart.

Competitive advantage of BW on HANA as Data Mart

The features of SAP NetWeaver BW can be leveraged with the power of SAP HANA as the database. The data warehousing solution is also flexible and scalable with SAP BW running on HANA as its database. This is the most cost-effective way of introducing the power of SAP HANA into an organisation’s data warehousing landscape. It also offers highly integrated tools for data modeling, monitoring, and reporting, resulting in low development and maintenance effort by leveraging the existing BW team’s skills.

Value Proposition addressing key Technical / Business drivers

Basic business scenarios such as dynamic layered hierarchies and other general time-dependency requirements can be easily modeled using the standard features of SAP BW.


Organizations reduce spend by adopting BW on HANA, as it provides almost double the memory for the same price as HANA as a data mart.

Another dimension to consider for customers with planning application requirements

With BW on HANA, reporting performance is accelerated, and ETL processes also get a performance boost thanks to fewer materialized layers. For planning requirements, the planning logic is pushed down to the SAP HANA layer.

Traditional planning runs the planning functions in the application server, whereas in-memory planning runs all planning functions in the SAP HANA platform.

Data warehouse with Agile modelling and Reporting

HANA-optimized InfoProviders make it possible to design simplified schemas that optimize data loads. For example, with HANA-optimized DSOs, delta calculation is completely integrated in HANA.


In-memory-optimized data structures provide faster access without round trips to the application server, along with reduced redundant data storage, simplified administration, and optimized infrastructure. Column-based storage with high compression rates considerably reduces the number of layers and the amount of data to be materialized, giving greater modeling flexibility.

BW on HANA as Data Mart - Additional Capabilities enabled with In-memory!

Customers adopting a unified instance, with the database and BW merged into one, benefit from simplified administration via one set of admin tools, e.g. for data recovery and high availability.


With the BW on HANA architecture, it is simpler to integrate a near-line storage option as data volumes increase over the years. It is also potentially easier to consolidate the landscape in the near future by migrating existing BW investments into the BW on HANA solution.


Preserving existing Investment without disruption in the Future

SAP indicates a 20% reduction in admin and maintenance FTEs, as existing skills can be leveraged for the project and redeployed to an integrated SAP BW instance. Given SAP’s vision that the entire Business Suite will run on SAP HANA, this gives customers a better prospect when deploying BW on HANA.


In the previous two episodes of Modeler Unplugged (1.1 & 1.2), we shared how you can import BW models as native HANA models and then explore the mappings between the various BW model elements and the corresponding HANA model elements. In this video, we show how you can see the data preview of those imported models from within the HANA Studio and explore them using SAP BO Explorer.

 


 

For licensing questions regarding the usage of this feature, please check with your Account Executives.

If an SAP HANA solution were a car, what kind of car would the customer want it to be? A highly tuned car with many fancy parts, or a standard production car with certain modifications for a specific purpose?

 

Well, at night in your preferred pub at the regulars’ table, you might like to brag about the performance and shiny extras. But the fun will be over the next day, when you recognize that the car needs 100-octane gasoline, which is really hard to find, and lots of it. And imagine what will happen when something needs fixing, and only an extravagant and expensive specialist is willing and able to touch this so-different car.

 

That’s why Cisco and NetApp decided to build an SAP HANA solution with standard and well-tested components, which customers already use in all their standard SAP NetWeaver implementations.

 

The Cisco UCS™ platform, with its unified fabric, is an innovative, off-the-shelf, high-performance platform with simplified management and built-in scalability. The X86 platform is well-proven, and the HA configuration does not require any tuning—pardon, scripting. Low latency and high throughput are guaranteed.

 

The NetApp® platform uses simple and reliable components as well. Standard and well-proven NFS is leveraged, making it very easy to maintain and configure. After all, the Linux® operating system still runs best with NFS. It’s reliable, flexible, simple—and most important, it doesn’t create unwanted complexity or additional costs by requiring special gear. And with 10Gbit Ethernet, it’s fast, as validated by SAP as the certification authority. Not to mention, there’s a roadmap for NFS, and it will be available next year and the year after and the year after that.

 

What will happen when SAP improves the SAP HANA software, adding backup and restore as well as disaster recovery and virtualization? Do you need a new “vehicle” to be able to use all these improvements? Not at all. The Cisco UCS platform is prepared to support virtual machines, and NFS makes it child’s play to use VMs. The NetApp Data ONTAP® operating system is waiting for Snapshot™ based backup and restore and disaster recovery.

 

To come back to the opening question, the reliable, high-quality, standard production car is the better choice, even though you may not brag about it at the pub.


Well, it's that time of year again.  SAP TechEd season has started, and the SAP HANA team has been called on again to go above and beyond the call of duty to deliver a massive amount of SAP HANA knowledge at all four SAP TechEd locations.

 

Here’s a quick summary of what’s going on at SAP TechEd Las Vegas next week for those of you attending in person or online.

 

Vishal will kick things off on Tuesday morning with a massive torrent of SAP HANA awesomeness.  You can expect a “HANA state of the union” update on customer adoption and successes.  You’ll also get some tasty nuggets of the massive “speeds and feeds” achieved over the summer.  There will most certainly be a few killer demos of fresh HANA innovation and I’ve heard around the coffee corners that there will be some massive announcements related to SP05 and maybe even some teasers for SAP ERP on SAP HANA.

 

The real stars of the show are the 341 SAP HANA lectures and hands-on sessions you can attend: a massive firehose of technical knowledge about every aspect of HANA you could want to learn. But wait-- there's MORE!!

 

Right after the keynote, the showfloor opens up and I guarantee you’ll get smacked in the face with SAP HANA the instant you walk in the door (seriously, you really can’t miss it).  The SAP HANA area is the first thing you’ll pass on your way to the clubhouse, Q&A area, Meet the Experts theaters, Partner Pavilion and FOOD.

 


 

The SAP HANA area of the Technology Showcase will feature 8 tables covering 260 degrees of SAP HANA technical topics.   The whole area is staffed with SAP Übergeeks who can drag you neck deep into the code on every angle of HANA.  We’ll also have a ton of live HANA servers running the demos there, so you can warm up your coffee on their chassis while getting your geek on. Here’s the list of table topics and a look at the super-cool layout.

 

 

  1. SAP NetWeaver Business Warehouse on SAP HANA
  2. Administration & Operations for SAP HANA
  3. Data Integration and Big Data with SAP HANA
  4. SAP HANA Application: Accelerators, Real Time Apps, Data Driven Apps
  5. 3rd Party SAP HANA Developers: OEM, Start Ups, ISVs
  6. Analytics on SAP HANA
  7. Data Modeling & Provisioning with SAP HANA
  8. Application Development on SAP HANA

 

[Figure: layout of the SAP HANA showcase area]

 

Inside the center of the area you’ll find the SAP HANA Race (AKA “SMACKDOWN” #RealTimeRace), where Bluefin and Optimal are going to take an actual copy of a customer BW system and migrate it to SAP HANA right before your eyes.  Come by, gawk at the geeks, haze them mercilessly, and poke them with sharp sticks if you like.

 

We’ve basically doubled the amount of educational content and face-to-face learning opportunities for SAP HANA at SAP TechEd this year, so there’s no excuse for anyone to leave the show with any unanswered questions about HANA. So carve out a couple of hours of time in your schedule to sit down and suck up all the SAP HANA awesomeness you can handle.

 

We'll be live tweeting all the SAP HANA activities at the show, so follow @SAPinmemory and @jeff_word and #SAPTechEd next week to keep up with everything.

 

 

 

 

 

 

 


Robert Klopp

HANA and Greenplum

Posted by Robert Klopp Oct 9, 2012

Just recently someone asked me (here) how HANA and Greenplum can coexist. This is a great question, worthy of another blog.

 

If we forget about the business benefits brought by the extreme performance of HANA, and forget about the throughput advantages of (nearly) every query completing in a second, we can still draw some conclusions about HANA vs. a disk-based system like Greenplum based on hardware economics alone. If we include these business benefits, the arguments below become even more favorable for HANA.

 

The Five Minute Rule (see here) suggests that, based on current DRAM prices, any data touched within 50 minutes should reside in memory. If we assume 10X compression by HANA, then the threshold climbs to any data touched within 8 hours. In a year, new memory technologies will extend this to data touched at least once every 2 days. So "hot" data should be in HANA for purely cost reasons.
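
The scaling in that paragraph is simple arithmetic: compression divides the effective in-memory cost per logical byte, so the break-even touch interval grows by the same factor. A minimal sketch, assuming the 50-minute baseline quoted above and treating the compression ratio and the future DRAM cost factor as purely illustrative inputs:

```python
def break_even_minutes(base_minutes, compression_ratio=1.0, dram_cost_factor=1.0):
    """Touch interval below which data is cheaper to keep in memory than to re-read from disk.

    base_minutes      -- break-even interval for uncompressed data at today's DRAM prices
    compression_ratio -- e.g. 10 for the ~10X compression assumed for HANA above
    dram_cost_factor  -- relative DRAM cost vs. today (values < 1 model falling prices)
    """
    return base_minutes * compression_ratio / dram_cost_factor

today = break_even_minutes(50, compression_ratio=10)   # ~500 minutes, i.e. about 8 hours
# The 0.17 cost factor is chosen only to reproduce the "2 days" figure; it is not a price forecast.
next_year = break_even_minutes(50, compression_ratio=10, dram_cost_factor=0.17)
print(f"today: {today/60:.1f} h, next year: {next_year/(60*24):.1f} days")
```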

 

If you have an application "mart" that looks only at fresh data, you can justify HANA over Greenplum without even thinking. But there is more.

 

HANA should be 100X faster than Greenplum, or more, for a typical query. This is not hype, just the performance boost from avoiding disk I/O. Because memory is more expensive than disk, the cost of a HANA system will be 2X-3X the cost of a disk-based system. But 100X faster for 2X the price is a pretty good deal; after all, the correct measure of value should be price/performance, not just price. Note that this value equation holds against Exadata 3 as well, even when the data is held in Flash (due to the limited bandwidth from the Flash to the RAC processors; see here).
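
To make the price/performance point concrete, here is the same arithmetic spelled out, using the illustrative 100X speedup and 2X-3X price premium from the paragraph above:

```python
def price_performance_gain(speedup, price_premium):
    """How many times better the price/performance is, given a speedup and a price premium."""
    return speedup / price_premium

for premium in (2.0, 3.0):
    print(f"100X faster at {premium:.0f}X the price -> "
          f"{price_performance_gain(100, premium):.0f}X better price/performance")
```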

 

This is a pretty airtight argument. It makes it clear that picking any disk-based DBMS is making the choice for a cheaper, but adequate, solution. I think that in every case IT organizations need to think carefully about the definition of "adequate", as the business often finds that the gains in people productivity that come with better price/performance have real value.

 

If your customer already has Greenplum, there are some interesting options. It is possible to define HANA tables as external tables in Greenplum and allow Greenplum to access HANA for queries where hot data and cold data need to be joined. In addition, putting the query load associated with hot data onto HANA will offload that work from Greenplum and make Greenplum more effective. Greenplum may suggest its own in-memory database... but it is neither shared-nothing scalable nor a column store, so there will be a significant difference in cost (no column compression means more memory) and in performance.

 

Note that this hybrid approach, extending an existing data warehouse with a HANA-based hot data mart, will work for other databases; it is not a Greenplum-specific solution.

 

So, the differences between Greenplum and HANA are due in part to the economics of in-memory databases, and the option to have the two products coexist can provide significant value to your customers and users.

Hi everybody,

As you may know, SAP continues to place strategic focus on SAP BW as our fundamental enterprise data warehousing application offering. To that end, we are pleased to announce the release of SAP BW 7.3, SP8, a service pack release that builds upon your investment in SAP BW 7.3 by adding even more capabilities and functionality to take you to the next level of performance, ease of use and flexibility in SAP BW.

SAP NetWeaver BW Support Package 8 includes many enhancements that came from direct customer feedback during the early ramp-up period of SAP BW 7.3 powered by SAP HANA. In this context, additional integration scenarios have been added between SAP BW on HANA and SAP HANA data mart use cases for further flexibility, as well as to enable the “not-active” data concept for optimized RAM sizing and lower TCO. Further enhancements include support for converting existing SPOs to HANA-optimized DSOs and InfoCubes, and partitioning of write-optimized DSOs.

See the attached presentation for all the details.

Best regards,

Lothar


Just published yesterday is a compelling new video featuring SAP, IBM and Intel executives discussing a 100TB DRAM HANA scale-out implementation. 100TB of DRAM today, with the potential to grow to 10PB tomorrow. Talk about scalability.

 

Wes Mukai, VP, Systems Engineering/TIP at SAP, Mark Pallone, Enterprise Software Strategist at Intel and Rich Travis, SAP Infrastructure Architect at IBM provide an outstanding overview of how each of our companies come together to focus on high value solutions, and how SAP HANA represents a fundamental change in the way our joint customers can begin the process of managing their company’s big data. IBM’s expertise in high performance/ mission critical computing, coupled with the latest Intel technology built into that platform are what’s vital to HANA’s performance and at the core of what partner co-innovation and collaboration truly mean.

 

Learn about how SAP approached IBM with a great challenge and an aggressive schedule to implement the largest in-memory database ever built --  and the results of that challenge.

 

Check out the video and see yet another example of how we leverage the power of our partner Ecosystem with global technology leaders to deliver game-changing innovation to our customers – a new paradigm in business computing enabled by SAP HANA today and in the future.

In last week's post, we shared how to import BW models as native HANA models (Episode 1.1). As a follow-up, this week's video digs deeper into the inner workings of such imported models. It is meant for advanced users who have a good understanding of the various BW modeling elements, and it explains how they are mapped to the native HANA model elements.

 

This is an open call for nominations to be HANA Distinguished Engineers. If you caught Vishal's announcement at the HANA Anniversary Event, or saw our prior blog, wiki page, webcasts, video chats or other announcements on the topic, you know what the HDE program is about. If not, what are you waiting for? Go check them out!

 

So what are we looking for in HDE candidates? The first and most important criterion is someone who embodies our core values:

 

Core Values:

    • Restless curiosity
    • Collaborative and sharing by nature
    • Continuously striving for improvement
    • Challenges convention and status quo
    • Mentors others

 

The commitment to these core values will be displayed in the actions of a HANA Distinguished Engineer. We would expect this commitment to be reflected in:

 

A minimum of three pieces of high-quality, substantive, original technical content that reflect knowledge and expertise in some technical area pertaining to SAP HANA. This content includes, but is not limited to:

    • Technical blogs on any publicly accessible blogging platform
    • Technical articles, podcasts, webcasts
    • Peer to Peer speaking engagements
    • Technical instructional or educational videos, seminars or lectures
    • Forum moderation
    • Code snippets/libraries
    • User group activities (SIGs, influence councils, etc.)

This commitment should continue through such efforts not only preceding acceptance as a Distinguished Engineer, but on a continuing annual basis as well.

 

And lastly, we are looking for candidates that have hands-on experience with SAP HANA. This should consist of a minimum of:

    • Six months of full-time verifiable hands-on experience with SAP HANA in a proof of concept, pilot, implementation project or production support capacity, in either a consulting or customer role, or
    • Six months of full-time verifiable experience providing remote support to SAP HANA customers as part of a paid SAP or partner support organization, or
    • Six months of full-time verifiable experience in a product development role within the SAP HANA development or product management organizations.

 

 

And this experience could be in any aspect of SAP HANA technologies, including:

    • Application Developers
    • BI / DM / DW Developers / Modelers
    • ETL / Replication Developers / Modelers
    • Application / Database Administrators
    • Security Administrators
    • ... and other technical capacities specific to SAP HANA

 

OK, that's a lot of bullets, so I'll stop there. If you, or someone you know, qualifies for, and is interested in, becoming a HANA Distinguished Engineer, please fill out the nomination form today at HDE Nomination Form.

Robert Klopp

Exalytics vs. Exadata

Posted by Robert Klopp Oct 3, 2012

Most, if not all, SAP customers find that HANA's in-memory architecture provides the opportunity to eliminate aggregate tables and OLAP cube structures... HANA is so quick to aggregate data on demand that building a fragile and complex hierarchy of derived data and a series of derivation job-streams is no longer required.

 

Oracle recently announced Exalytics as an add-on to Exadata where derived data can reside in-memory to improve OLAP query performance. Oracle has also just upgraded their Exadata offering with a significant injection of Flash memory in-between the disks and the processors. Flash memory is sort-of the poor-man's in-memory database alternative. They say that "hot" data will then reside on Flash silicon, queries will avoid slow disk I/O, and be screaming fast.

 

HANA set the bar for "screaming fast" by defining "fast" as eliminating the requirement for pre-aggregated data. "Fast" is fast enough to aggregate on-demand.

 

So one question is: can Exadata eliminate the requirement for pre-aggregated data? If so, then Exalytics is useless. If Exadata is not quite good enough to eliminate the aggregates but it is as fast as Oracle claims... then the pre-aggregated cubes could reside in Exadata Flash rather than on an expensive add-on server... and again, Exalytics is useless. The only other possibility is that Exalytics was designed by Oracle to solve a real problem... that Exalytics is not useless... but that Exadata is not as good as the hype. There is a disconnect somewhere over there...

At the Oracle Open World keynote this week, Oracle repeated what Hasso showed years ago - "Everything is in memory…Disk drives are becoming passé."  We are, of course, glad they realized this. With SAP HANA, our customers have been benefiting from this reality for more than 18 months now. 


And yet Oracle made statements that are clear distortions and misrepresent HANA.  It has become something of a recurring theme, to mis-state and distort things. As industry leaders, we must do better. It behooves us to tell the truth to our customers, our partners and our employees.  We do not serve our stakeholders well by mis-statements and omissions of key things we know to be true.  They deserve better.  History deserves better. It is true that HANA represents a fundamentally new, rethought, database paradigm, and is receiving tremendous success in the market.  Perhaps it is its disruptive nature that threatens the status quo of database incumbents.  Perhaps it is some other reason.  Regardless, I find myself once again setting the record straight.


The statement Mr. Ellison made about HANA, when talking about the release of a new Exadata machine that has 4TB of DRAM and 22TB of SSD, is false.  He referred to HANA as being "a small machine" with 0.5TB of memory. He said his machine has 26TB of memory, which is also wrong (SSD is not DRAM and does not count as memory; HANA servers also use SSDs for persistence).


Here is the truth:

  1. HANA systems range from the very small to extremely large-scale systems. HANA’s architecture, with full exploitation of the massive parallelism of multi-core systems, and native use of memory via new, totally redesigned data structures, enables nearly unlimited scalability.
  2. For the last several months we have been shipping certified 16-node HANA hardware made by 4 vendors: IBM, HP, Fujitsu and Cisco.  These systems are available with 16TB of DRAM, so they are already 4 times bigger than Oracle's machine, and they have been in the market since spring of this year. The machines can take up to 32TB of DRAM within their current configurations.  In IBM's case, with the Max5 configuration, they can go up to 40TB.
  3. The largest HANA system deployed so far is the 100-node system built earlier this year with Intel & IBM (see picture below).  It has 100TB of DRAM and 4000 CPU cores.  Mr. Ellison is welcome to come to Santa Clara and see the system working himself, with workloads from several customers.  We shared this information in front of 10s of thousands of customers at our SAPPHIRE NOW event earlier this year.  Already today this system can go up to 250TB of DRAM (and with HANA's compression, can hold multiple Petabytes of data entirely in-memory).  Our partner, Steve Mills, Senior Vice President and Group Executive of IBM’s Software & Systems, whose team helped build this system, had this to say in support of this open innovation, “IBM and SAP have partnered to demonstrate an SAP HANA system at 100 TB, making it the largest in-memory database system in the world. That system, running on 100 IBM X5 nodes, can now scale to 250 TB.”

[Figure: the 100-node, 100TB SAP HANA scale-out system]

With the processor and memory roadmaps from our partners, these systems will be doubling in their capacity by this fall/early 2013 (so multiply these numbers above by 2, etc.).  And we don't have to release new versions of hardware to take advantage of this innovation.


HANA is built on a simple notion: advances in hardware, and deep research into the nature of modern enterprise software, enable us to rethink the database.  And we did. I treated this notion as a design principle for HANA’s construction. Others are trying to protect their database systems that were designed in the past.  The use of new SSD access technology, which accelerates access to flash and demonstrates performance improvements, simply reinforces this point.  HANA also benefits from this technology, for reading logs, for restart performance, etc. as do our ASE and IQ databases.  But HANA is built on a basic principle that Hasso had articulated many years ago: when we run everything in-memory, we can get predictable response times on even the most complex queries and algorithms, and everything can execute with massive parallelism. This power gives us the freedom to rethink enterprise software. To renew existing systems without disruption: from OLTP apps (such as our Business One product that we released on HANA last week), to Analytics, from structured data processing, to unstructured.  To rethink systems to run 1000s of times faster, and to eliminate batch jobs everywhere. It also gives us the ability to build completely new applications, unprecedented solutions.  To help software simplify the world, and connect it better, in real-time: from genome sequencing to energy exploration, from real-time customer intimacy, to inclusive banking. To liberate us from the confines and limitations of systems of the past, and enable us to be limited only by our imagination. 


As Alan Kay always told me, the future does not just have to be an increment of the past.  We choose to focus on the future, on building a highly desirable, feasible and viable future, with our hands, with our customers and partners.  Instead of focusing on incrementing limited systems of the past with temporary technologies.  And we think there is no room for lies in that world.  The truth of a HANA based landscape, and its unmistakable success, is open to all, and it is ours to take and build on.


Happy HANA. 


Best,

Vishal

In an organization using SAP HANA, the data resides in memory to achieve massive performance. However, as the data grows, the amount of memory required to store it also increases, which in turn increases cost, since additional memory is needed to accommodate the growth. Enterprises implementing SAP HANA should follow a data persistence strategy, building a smart storage infrastructure based on the business value of data and thereby addressing storage requirements efficiently and at lower cost.

Making Storage Strategy Smarter

In SAP HANA, not all data is accessed frequently, yet it all has to reside in memory, which increases the amount of main memory used. The historic or ‘cold’ data can instead be stored separately, on a less expensive storage option. This data can still be accessed at any time, providing the necessary performance at lower cost. The end result is a storage infrastructure that addresses the storage requirements of the business in the most efficient and cost-effective way.

Data can be classified into

  1. Hot or Active data – data that is used or updated frequently
  2. Cold or Historic or Passive data – data that is not updated frequently or used purely for analytical or statistical purposes

 

When the historic or cold data is stored in separate data storage, main memory usage is reduced, hardware resources are freed up, and the static data remains available. Access to this data still requires fast reads, but at lower cost. Maintaining all data, including infrequently accessed static data, in a high-performance online environment can be very expensive or just impractical due to the limitations of the databases used in the data warehouse.
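
As a very simple illustration of such a classification rule, here is a minimal sketch; the 90-day threshold and the record layout are assumptions for illustration, since the actual rule would come from the organization's Information Lifecycle Management policy:

```python
from datetime import date, timedelta

# Hypothetical threshold: anything not touched within the last 90 days counts as "cold".
HOT_WINDOW = timedelta(days=90)

def classify(last_accessed, today=None):
    """Return 'hot' (keep in HANA main memory) or 'cold' (candidate for cheaper storage)."""
    today = today or date.today()
    return "hot" if today - last_accessed <= HOT_WINDOW else "cold"

# Example: tag a few records before deciding what stays in memory.
records = [
    {"id": 1, "last_accessed": date(2012, 10, 1)},
    {"id": 2, "last_accessed": date(2011, 3, 15)},
]
for record in records:
    record["tier"] = classify(record["last_accessed"], today=date(2012, 10, 20))
print(records)  # record 1 -> hot, record 2 -> cold
```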


What Data needs to be persisted?


This is an important exercise that needs to be undertaken before embarking on any data warehouse project. With in-memory solutions being quite costly, it is worth doing an exercise to understand the organization’s data requirements. Some pointers on what data needs to be persisted, and what does not, are given below.

  1. How frequently is the data required?
  2. How many years of data does the business currently report on a daily basis?
  3. What regulatory reporting data is required to be made available?
  4. What kind of transaction data is required by the business as a must-have for reporting needs?
  5. What data should be consolidated in the data warehouse?
  6. What data can be consolidated in the source systems?
  7. Can some transformations be pushed to an external ETL tool so that the impact on the data warehouse is reduced?
  8. Based on the reporting requirements, the above options should be considered to suit the organization’s data needs.
  9. Redundant data should be explicitly identified and not brought into the data warehouse.
  10. Processing logic that requires intermediate data to be stored should be analyzed so that it can be pushed to runtime at the query level or the database layer.
  11. Rolling data needs to be identified and moved to NLS or handled by another backup mechanism.


Strong Information Lifecycle Management covering the above points is required for an effective data persistence strategy in an organization.


Some of the key benefits of a successful data persistence strategy are:

1)   Better resource usage – in terms of disk, CPU and memory

2)   System availability

3)   System performance

4)   Analysis with right set of data


Option - 1

Implementing a near-line component makes it possible to keep less frequently accessed data, such as aged information or detailed transactions, more cost-effectively. In addition, if the relatively static data can be removed from the data warehouse, regular maintenance activities can be performed more quickly and business users get higher data availability.


Option – 2
Apache Hadoop and Data warehouse

As enterprises start analyzing larger amounts of data, migrating it over the network for analysis becomes unrealistic. Analyzing terabytes of data daily in memory can strain the processing capacity of the system and occupies more main memory space. With Hadoop, data is loaded directly onto low-cost commodity servers just once, and only transferred to other systems when required.


Hadoop is a true “active archive”, since it not only stores and protects the data but also enables users to quickly, easily, and perpetually derive value from it. Hadoop and the data warehouse can work together in a single information supply chain. Cold or archived data can be stored in Hadoop, which can act as an online archive, an alternative to tapes. Used not only as a storage mechanism, Hadoop also helps with real-time data loading, parallel processing of complex data, and discovering unknown relationships in the data.
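
As a minimal sketch of the “online archive” idea, the following illustrative snippet writes cold records to a local CSV file and pushes it into HDFS with the standard `hadoop fs -put` command; the file paths, the export format, and the assumption that the target HDFS directory already exists are all for illustration only, not part of any SAP-delivered integration:

```python
import csv
import subprocess

def archive_to_hadoop(cold_rows, local_path="/tmp/cold_2011.csv", hdfs_dir="/archive/sales/2011"):
    """Write cold records to a local CSV and copy it into HDFS (illustrative only)."""
    with open(local_path, "w", newline="") as f:
        csv.writer(f).writerows(cold_rows)
    # Standard Hadoop CLI upload; assumes hdfs_dir already exists in HDFS.
    subprocess.run(["hadoop", "fs", "-put", local_path, hdfs_dir], check=True)

# Example: rows previously classified as "cold" (see the classification sketch earlier).
archive_to_hadoop([(1, "2011-03-15", 42.0), (2, "2011-07-01", 13.5)])
```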




 

What is Hadoop good at? Hadoop...

1. ...is cheaper in terms of cost

2. ...is fast

3. ...scales to large amounts of big data storage

4. ...scales to large amounts of big data computation

5. ...is flexible with types of big data

6. ...is flexible with programming languages

 

Since NLS has been discussed extensively in various forums and blogs, we shall discuss how Hadoop can be integrated with SAP HANA for an effective data persistence strategy in a subsequent post.
