
The best stories are told by customers themselves. The following eBook, compiled from stories by Bloomberg and Forbes, provides insight into the strong momentum SAP HANA has in the market, illustrated through SAP customer stories.

 

Many organizations are running and reaping the benefits of SAP HANA, including: Adobe, Alliander, ARI, Carefusion, Commonwealth Bank, City of Boston, City of Cape Town, ConAgra, eBay, EMC, Florida Crystals, Globus, HP, HSE24, Johnsonville, Kaeser Compressors, Mercedes-AMG, Norwegian Cruise Line, Nomura Research Institute, National Football League (NFL), Maple Leaf Foods, Southern California Edison, and T-Mobile.

 

Each of the 23 case studies in this eBook provides a complete overview of the SAP customer: the customer’s top objectives, the solution implemented, and the key business and technology benefits of the engagement.

 

The customer case studies feature key innovations such as:

 

  • SAP Business Suite powered by SAP HANA
  • SAP Business Warehouse powered by SAP HANA
  • Big Data
  • SAP HANA Applications

 

Click here to view the eBook today.

Try Simple.  Buy Simple.  Run Simple.

SAP has new offers to help you quickly realize the benefits of SAP in-memory and data management solutions. Today, at TechEd && d-code Las Vegas, we announced the availability of the SAP Simpler Choice DB Program. This program is designed to make it easy for you to adopt SAP in-memory and data management solutions through a range of compelling tools and offers.


Here's how:


Try Simple: We’ll get you started for free


 

SAP data management solutions change the cost equation through simplification. They help save costs on hardware and software and reduce the labor required for administration and development. Now, with the Try Simple program, SAP provides the resources to 1) assess your current IT landscape complexity, 2) discover what it’s costing you, and 3) ascertain where you can save time and resources, enabling you to drive new innovations.

Offers:

  • SAP Industry Value Engineering Services will engage with you in a benchmarking survey to help estimate how SAP databases can significantly reduce the TCO associated with managing data and dramatically simplify IT landscapes
  • Landscape Assessment Services for SAP Cloud (HANA Enterprise Cloud) will help you evaluate and assess the benefits of cloud application deployments
  • SAP Database Trial offers for cloud and on-premise deployments:
  • SAP ERP powered by HANA Trial
  • SAP CRM powered by HANA Trial
  • SAP BW powered by HANA Trial
  • SAP HANA on AWS Test Drive
  • SAP ASE Developer Edition on AWS
  • SAP ASE Developer Edition Download
  • SAP HANA Cloud Platform Trial

Buy Simple: We’ll protect your investment


SAP has simplified licensing terms to allow you to mix and match SAP data management products for deployment in any SAP application scenario – providing greater protection for your SAP database investments as your needs evolve.

 

  • Migration services are provided and compelling offers delivered to lower the risk and cost of a database migration
  • Flexible deployment options are delivered, whether on premise or in the cloud
  • Simpler licensing terms and complete protection for SAP database investments are provided, evolving as your business requirements advance

 

Run Simple: We’ll help you migrate


SAP lowers the risk of migrating to SAP databases — on premise or in the cloud — with SAP services and other compelling offerings.

  • Lower the cost and risk of migration via services credit for database migrations
  • Reduced maintenance costs during the period of migration, so you can fully test the new environment

 

Ready to get started?  Want to learn more?  Please contact your AE or complete the form to have an SAP representative contact you.


Introducing SAP HANA SPS 09: The Platform for All Applications


SAP HANA SPS 09 provides numerous exciting functionalities developed by SAP, as well as additional capabilities provided by SAP co-innovation partners.  This new release renews our commitment to accelerating cloud adoption, providing the most powerful platform for both transactional and analytical applications, creating instant value from big data assets, and enabling co-innovation with our customers and partners.


When it comes to cloud enablement, SAP HANA SPS 09 lets you run multiple independent SAP HANA databases in one single SAP HANA system (SID) and manage them as a single unit. We call this functionality multi-tenant database containers, and it is specifically designed to simplify database administration while maintaining strong separation and isolation of data, users, and system resources among individual tenant databases. This innovation allows you to dramatically lower the TCO of your installations, whether on premise or in a cloud environment. Additionally, co-innovations with partners such as IBM, Hitachi, and Unisys offer a broad array of secure and professionally managed SAP HANA Enterprise Cloud (HEC) services to accelerate cloud deployments and reduce migration risk.
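For readers who want to picture what working with multi-tenant database containers looks like, here is a minimal sketch, assuming an MDC-enabled system and SAP's hdbcli Python driver; the host, ports, passwords, and tenant name are placeholders, not values from this announcement.

```python
# Minimal sketch: create a tenant database from the system database, then
# connect to it. Host names, ports, and passwords below are placeholders.
from hdbcli import dbapi

# Tenant databases are administered from the system database (SYSTEMDB).
sysdb = dbapi.connect(address="hana-host", port=30013,
                      user="SYSTEM", password="SystemDbPassword1")
cur = sysdb.cursor()

# Create a new, isolated tenant with its own SYSTEM user and resources.
cur.execute("CREATE DATABASE DEV_TENANT SYSTEM USER PASSWORD Initial1234")
sysdb.close()

# Applications connect to the tenant through the tenant's own SQL port.
tenant = dbapi.connect(address="hana-host", port=30041,
                       user="SYSTEM", password="Initial1234")
tcur = tenant.cursor()
tcur.execute("SELECT DATABASE_NAME FROM M_DATABASE")
print(tcur.fetchone())
tenant.close()
```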


SAP HANA SPS 09 greatly improves the application development experience with a number of new enhancements to both tools and the core platform. On the tools front, the SAP HANA Web-based Development Workbench gains a new SQLScript editor, SQLScript debugger, and calculation view editor, while SAP HANA Studio extends its code-completion capabilities and offers end-to-end debugging. Additionally, the visual Application Function Modeler (AFM) allows developers to build reusable application functions based on complex algorithms. The AFM provides prebuilt integration with the predictive analysis library (PAL), business function library (BFL), and R, allowing developers to easily tailor the existing algorithms to specific analytical application needs and then run them in SQLScript.
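To make the tooling point concrete, the sketch below shows how a client might consume a SQLScript procedure, for example one built around a PAL algorithm with the AFM. The procedure, schema, and table names are hypothetical, and the hdbcli driver is assumed.

```python
# Hypothetical example: calling a SQLScript procedure (e.g. an AFM-built
# wrapper around a PAL clustering function) and reading its result table.
from hdbcli import dbapi

conn = dbapi.connect(address="hana-host", port=30015,
                     user="DEMO", password="DemoPassword1")
cur = conn.cursor()

# The procedure is assumed to read its input table and write one cluster
# assignment per customer into DEMO.SEGMENTATION_RESULT.
cur.execute("CALL DEMO.RUN_CUSTOMER_SEGMENTATION()")

cur.execute("SELECT customer_id, cluster_id FROM DEMO.SEGMENTATION_RESULT")
for customer_id, cluster_id in cur.fetchall():
    print(customer_id, cluster_id)

conn.close()
```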


On the platform front, SAP HANA SPS 09 turbocharges advanced analytics with many new innovations. It exposes to developers a native Graph Engine to efficiently persist and analyze property graphs without duplicating data. This simplifies the analysis of complex relationships and the uncovering of new insights. Additionally, SAP HANA SPS 09 includes Nokia/HERE maps and spatial content to facilitate the development of applications that leverage location-based services. Text mining capabilities have also been added to the platform to help identify relationships among documents. As an example, documents can be stored, indexed and ranked based on a reference document or key terms.
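As a small illustration of the text-processing side, here is a hedged sketch of a fuzzy full-text search run from Python; the table, column, and connection details are hypothetical, and a full-text index on the text column is assumed.

```python
# Illustrative fuzzy search: CONTAINS() tolerates misspellings and SCORE()
# ranks matches. Table, column, and credentials are made up for this sketch.
from hdbcli import dbapi

conn = dbapi.connect(address="hana-host", port=30015,
                     user="DEMO", password="DemoPassword1")
cur = conn.cursor()

cur.execute("""
    SELECT TOP 10 doc_id, SCORE() AS relevance
    FROM   support_documents
    WHERE  CONTAINS(doc_text, ?, FUZZY(0.8))
    ORDER  BY relevance DESC
""", ("in-memory databse",))   # note the typo: fuzzy search still matches

for doc_id, relevance in cur.fetchall():
    print(doc_id, round(relevance, 3))

conn.close()
```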


Big data and the IoT open new challenges and opportunities for businesses. SAP HANA SPS 09 comes packed with new functionality to help your company operate in this new reality and create instant value from the variety of your big data assets. The new smart data streaming capability captures, filters, analyzes, and takes action on millions of events (streaming data) per second, in real time. To optimize available resources, smart data streaming can be configured to pass high-value data into SAP HANA for instant analysis and direct the remaining data to Hadoop for historical and trending analysis. Of course, HANA can analyze data in both HANA and Hadoop at the same time through Smart Data Access, but HANA can now also call a Hadoop MapReduce job directly and bring the result set back from Hadoop/HDFS for additional analysis. Another powerful feature is dynamic tiering, which can move warm data from memory to disk, still in columnar format, optimizing in-memory usage for extremely large data volumes. This will enable customers to achieve great performance on hot data and great price-performance on warm and cold data. Moreover, smart data integration and smart data quality in SAP HANA SPS 09 enable the provisioning, filtering, transformation, cleansing, and enrichment of data from multiple sources into SAP HANA, eliminating the need for separate ETL or replication stages. Pre-built adapters are available for common data sources such as IBM DB2, Oracle, Microsoft SQL Server, OData, Hadoop, and Twitter, and an open SDK is also available to build new adapters. In addition, special adapters are provided to consume SAP Business Suite data from databases such as IBM DB2, Oracle, and Microsoft SQL Server. Smart data integration is architected for cloud deployment, requiring no firewall exceptions to provision data to HANA in the cloud.
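To illustrate the Smart Data Access idea from the paragraph above, here is a hedged sketch: a virtual table pointing at a Hadoop/Hive table is combined with an in-memory HANA table in one query. The remote source name, schemas, and tables are hypothetical, and the remote source is assumed to be configured already.

```python
# Sketch: federating hot data in HANA with archived data in Hadoop through
# a virtual table. All object names and credentials are placeholders.
from hdbcli import dbapi

conn = dbapi.connect(address="hana-host", port=30015,
                     user="DEMO", password="DemoPassword1")
cur = conn.cursor()

# A virtual table stores only metadata in HANA; queries against it are
# pushed to the remote Hive source at runtime.
cur.execute("""
    CREATE VIRTUAL TABLE sales_archive_vt
    AT "HADOOP_SRC"."<NULL>"."default"."sales_archive"
""")

# One result over both the in-memory table and the Hadoop-resident archive.
cur.execute("""
    SELECT region, SUM(amount)
    FROM ( SELECT region, amount FROM sales_current
           UNION ALL
           SELECT region, amount FROM sales_archive_vt )
    GROUP BY region
""")
print(cur.fetchall())

conn.close()
```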


Openness is an exciting theme in SAP HANA SPS 09. You can now choose from 400+ SAP HANA configurations supplied by 15 different SAP partners, four times the number of server configuration options available in August 2013. Co-innovations with partners such as IBM and Hitachi have allowed us to leverage system virtualization at new levels with the support of LPARs for optimal system resource utilization. Scale-up configurations have also expanded with new certified appliances from Cisco, HP, and SGI. Additionally, relaxed hardware requirements, support for lower-cost Intel E5 processors, and new guidelines for utilizing existing networking components, all part of SAP HANA Tailored Datacenter Integration, provide a new choice to start small and grow your configurations as needed, reducing adoption barriers and improving TCO for SAP HANA.


To learn more about SAP HANA SPS 09, we are delivering resources and hosting a number of Live Expert Sessions to provide a technical view into the new SAP HANA SPS 09 innovations.

 

In the old days before the advent of Big Data, the Cloud and the Internet of Things, managing cash flow was one of the primary concerns of any  enterprise. Today, as we accelerate towards the networked economy, managing the flow of information is becoming just as critical to the success of business.

 

What’s different now?


The big difference between data now and before is that in the past, only humans created and collected data as they went about their lives and their work. Now, with the rise of mobile devices, social media, and machines that create and collect data, not only are we experiencing an explosion of data, but humans are no longer the center of the data system; we are just another node in an increasingly autonomous data universe.


Consider the future for supply chains, for example, where vending machines communicate with each other and make decisions on replenishment and maintenance without human interaction.


Or consider how resourceful retailers use hijack-alert apps to promote discounts. Using a messaging app with GPS tracking technology, the retailer recognizes when customers enter a competitor’s store and sends them a notice about its own promotion.

 

Or think about the benefits to hospital clinics when maintenance for medical devices is predicted in advance and any repairs scheduled to minimize inconvenience to patients.

 

These are typical examples of how data, devices, social media and the Internet of Things are rapidly transforming the way we live, work and do  business, raising a number of questions.

 

Are machines making the right decisions? Is the data on our devices secure? Are the new business models invading our privacy as consumers and citizens? How can we manage vast volumes of data coming from different sources? How do we make data useful for business purposes? How can we achieve insight quickly, in real-time?

 

Taking a holistic approach

 

Providing structure and governance is the first step to getting a grip on data.  Information governance is a discipline that includes people, processes, policies, and metrics for the oversight of enterprise information to improve business value. It has always been crucial for any organization but is now more important than ever.

 

The information landscape is becoming increasingly complex with a proliferation of systems and technology to try and make sense of it all. The danger lies in taking a fragmented approach to governing the many types of information from so many different sources. Managing data in silos is not only expensive; it is not sustainable in the long term because the business lacks the transparency to fully understand what is happening, making it more and more difficult to achieve strategic goals.

 

The solution is to simplify the IT landscape and to take a holistic approach to information governance. SAP has the platform and framework to help enterprises do both.

 

Helping enterprises Run Simple

 

At SAP, we are already engineering our solutions for the changing world around us, building everything on the principle of Run Simple.

 

The SAP Data Management portfolio, for example, which runs on the SAP HANA platform, provides a complete, end-to-end approach, merging transactional and analytical workloads so they are processed on the same data, structured and unstructured, in real time. This avoids bottlenecks and data duplication, saving costs, ensuring compliance, and providing transparency.

 

SAP can support your enterprise with high-performance, next-generation information governance services, smarter and seamless access to all types of data, plus democratised insight and information transparency via advanced analytics. And of course, all this can be deployed on premise, in the cloud, or both, backed up with increased security, identity management, and compliance with emerging privacy regulations.

 

Our advice to enterprise decision makers dealing with modern trends like big data, social media, the cloud and Internet of Things is to take action now because managing the flow of information is just as critical to the success of your business as managing cash flow!

 

In case you missed Steve Lucas's keynote at SAP TechEd && d-code in Las Vegas yesterday, I announced the release of a new SAP thought leadership paper on information governance. This paper covers the substantial business advantages, the emerging challenges, SAP’s recommended framework and holistic platform perspective, as well as real-world success stories, including how we ourselves achieved €31 million in total benefits over two years.


SAP's latest paper on Information Governance can be downloaded here:

The SAP HANA journey towards openness continues today with the announcement of new cost-optimized, entry-level servers for SAP HANA targeting price-sensitive customers at the low end of the enterprise market, and the addition of new large scale-up, high-performance computing configurations for SAP HANA at the high end of the enterprise market.

 

Every customer use case is different. For this reason, SAP introduced the Tailored Datacenter Integration (TDI) deployment model for SAP HANA, providing customers with additional flexibility and enabling significant cost savings when integrating SAP HANA into their data centers. SAP HANA TDI delivered on its promise: today, there are more than 400 certified SAP HANA configurations available, providing customers with ultimate choice and flexibility for choosing the best option for their use case and budget needs.

 

Today’s announcement of TDI Phase 3, SAP HANA on Intel Xeon E5, marks an important milestone on SAP HANA’s continuing journey towards openness. The entry-level, two-socket, single-node SAP HANA E5 configurations, ranging in size from 128 GB to 1.5 TB, bring the power of SAP HANA real-time business to commodity hardware at the extremely attractive price of roughly $10K per server. SAP also made sure that the hardware procurement process for these boxes is quick and easy, allowing customers to size and order systems from their preferred hardware vendor in three easy steps. General availability is planned for November 10, 2014.

 

The rapidly growing partner ecosystem of SAP HANA is another important pillar of its openness strategy. SAP collaborates very closely with its partners to quickly incorporate the latest technology innovations into SAP HANA. The introduction of Intel Xeon Ivy Bridge processors to the market on Feb 18th, followed by Sapphire announcements this summer for the support of SAP HANA on Red Hat, SAP HANA on VMware, and the start of SAP HANA on IBM Power Test and Evaluation program, resulted in a flurry of activities and more than 300 newly certified configurations for SAP HANA. 

 

Today, most hardware partners have certified their SAP HANA system offerings for both SUSE Linux and Red Hat Linux, many offer SAP HANA on VMware configurations in appliance and TDI models, and they constantly look for new and innovative ways to improve their system offerings and lower TCO. Cisco’s certification of an 8-socket HANA appliance, HP’s introduction of a large 16-socket, 12 TB HANA appliance, and the addition of several new storage vendors offering the latest technologies in hybrid and flash storage are only a few examples of the new innovations being introduced almost daily by SAP HANA partners.

 

The latest addition of SGI to the list of SAP certified hardware appliance partners opens up new opportunities for further extending scalability for single node, large SAP HANA deployments requiring extreme performance and scalability.

 

Customers that prefer to run SAP HANA virtualized will also have more choices with SPS 09: Hitachi LPAR logical partitioning provides server virtualization at the firmware level, enabling server hardware resources to be divided into multiple partitions, resulting in increased utilization and reduced licensing costs.

 

So have it your way!  Whether you are a small business looking to gain new business value from real-time analytics, or a large global enterprise looking to simplify your landscape and maximize IT performance, SAP HANA has the answer for you.  So fasten your seat belt and start your SAP HANA journey now: SAP HANA will take your business to a new high at the speed of thought!

Are you looking to understand the SAP HANA innovation adoption process? New customer adoption journey maps will guide you through 5 simple steps to get to value quickly.

 

  1. EXPLORE business cases in your industry and how the solution can help you meet your business needs
  2. IDENTIFY the value of business cases with a tailored design thinking workshop and get a personalized roadmap
  3. TRY solutions with free trial offers and starter editions and make your own decision
  4. DEPLOY with a choice of deployment, on-premise, in the cloud or hybrid
  5. EXPERIENCE the value that SAP HANA and UX innovations can bring to your business

 

11 new customer journey maps for SAP HANA and UX innovations are now available:

 

  1. SAP Business Suite powered by SAP HANA
  2. SAP Business Warehouse  powered by SAP HANA
  3. SAP Simple Finance powered by SAP HANA
  4. SAP Customer Engagement Intelligence powered by SAP HANA
  5. SAP Fraud Management powered by SAP HANA
  6. SAP Demand Signal Management powered by SAP HANA
  7. SAP Sales and Operations Planning powered by SAP HANA
  8. SAP HANA Enterprise Cloud
  9. SAP HANA Cloud Platform
  10. SAP Fiori UX
  11. SAP Screen Personas

 

SAP HANA now has over 4,000 customers globally, over 1,000 use cases listed on saphana.com, and hundreds of live customers that are a testament to the great progress and innovation obtainable with SAP HANA. You can start on a path of innovation, acceleration, and radical simplification now: follow the 5 simple steps on the map and start driving value today.

In the first part, I discussed the system landscape for the Suite on HANA, looking at large implementations. A crucial feature was the split of data into actual and historical data and the fact that unused tables can be purged from main memory after a certain period of time.

 

The smaller the data footprint of actual data becomes, the faster the scan and filter operations in HANA will run. Now, this might not have any real impact on response times for the user anymore, because they are fast already, but it reduces the overall CPU consumption on the database server. We can simply run more transactions on the same system. The same is true for the move to sERP, where all redundant tables and the aggregate tables are being replaced by SQL views for transactional data entry. Since the data entry transactions become much slimmer, the CPU times in the database are significantly reduced. This is important for taking in sales orders coming from other systems, direct order entry, warehouse movements, automatic replenishment of products, manufacturing purchase orders, processing of incoming and outgoing payments, and many more. All of these can be time-critical processes in a company. Yes, displaying an account balance, for example, now consumes a bit more CPU, but such transactions are rare in comparison to the gains in data entry. And remember, HANA no longer needs the large number of database indices; it scans the attribute vectors (columns) of the tables instead.
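To make the "aggregates become views" point concrete, here is an illustrative sketch (the table and view names are made up, not SAP's actual ERP objects): postings write only line items, and the former totals table is replaced by a view that the column store evaluates on the fly.

```python
# Illustrative only: replacing a maintained totals table with an on-the-fly
# view over the line items. Object names and credentials are placeholders.
from hdbcli import dbapi

conn = dbapi.connect(address="hana-host", port=30015,
                     user="DEMO", password="DemoPassword1")
cur = conn.cursor()

# Old pattern: every posting also updates a redundant totals table.
# New pattern: only line items are written; the "aggregate" is a view that
# a columnar scan computes fast enough on demand.
cur.execute("""
    CREATE VIEW account_balance AS
      SELECT account_id, fiscal_year, SUM(amount) AS balance
      FROM   accounting_line_items
      GROUP  BY account_id, fiscal_year
""")

cur.execute("""
    SELECT balance FROM account_balance
    WHERE  account_id = ? AND fiscal_year = ?
""", ("0000100123", 2014))
print(cur.fetchone())

conn.close()
```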

 

Many reporting functions migrated away from the transactional system over the last decade. They will come back now, because they can. The system load shifts clearly in the direction of more and more sequential and read-only processing. The vast majority of the database activities will become read-only and access only the actual data. The other main cost factor for the hardware is main memory. We want to keep the data mainly in memory, but a lot of data is unused or at least rarely used. As you may know, unused columns do not take up any space at all (in contrast to conventional row-oriented databases). With the purging algorithm we can further reduce the data footprint in memory: tables, or parts of tables, that have not been accessed for a certain period of time will be dropped from memory. As explained in the first part of this blog, historical data cannot change anymore and as such does not have to be considered with regard to cache coherence.

 

When the data footprint plus workspace becomes much smaller than the typical blades used in the cloud (6, 3, or 1 terabytes), multiple systems can run on the same blade using virtualization. The tradeoff between capacity and performance shifts towards capacity (the main cost) and, with even smaller systems, eventually towards multi-tenancy, since the performance is more than sufficient for the users. Test and development systems will always be virtualized and can vary in size by demand (at system start).

 

I hope that as soon as there is enough experience in the cloud with sERP (or sFIN), SAP will make these features also available to non-cloud implementations of the Suite on HANA. The non-HANA-based ERP systems operate differently: their transactional performance is mainly achieved by using a large number of database indices, and the database tries to find the data in its caches. The split into actual and historical data would only help for truly sequential processes reading all data of a given table.

Learn, Share and Grow with SAP HANA: SAP HANA Operation Expert Summit in two locations.


  • November 20, 2014, in Newtown Square, PA
  • December 04, 2014 in Palo Alto, CA


As an IT expert working with the SAP HANA platform, you are invited to be part of our inner circle at an exclusive event: the SAP HANA Operation Expert Summit.


 

Don't expect your standard summit with speakers and coffee breaks. This is an interactive occasion that welcomes your full participation.


Panel discussions and breakout sessions offer unique opportunities to share your experience and ideas with us.

We want to hear what you think of SAP HANA operation.


We want to know:

  • What are your pain points or challenges?
  • What advice, tips, or tricks do you have for other users?
  • What features would you like to see in the future?


In addition, speed-networking sessions with SAP experts from the SAP HANA development organization will offer you advice on how to best operate SAP HANA — from planning and building, all the way to running, giving you knowledge and insights you can start using immediately.
Space is limited!


Register here for Newtown Square. Get the full AGENDA here.

Register here for Palo Alto. Get the full AGENDA here.


Do you know another SAP HANA operation specialist in your company? Don't hesitate to forward this message. Let's keep making progress!


We look forward to welcoming you in Newtown Square, PA, on November 20, 2014, and in Palo Alto, CA, on December 4, 2014.

SAP HANA Marketplace (http://marketplace.saphana.com) is a key part of SAP’s cloud vision (The Cloud company Powered by HANA) and execution. It is the online commerce ecosystem for all things related to the SAP HANA platform. Here, customers can discover, try, and buy solutions based on SAP HANA across 25 industries and 12 lines of business – and partners gain a low-touch, low-cost channel to connect with SAP customers and commercialize their innovations. 

 

Also, the SAP HANA Cloud Platform (HCP), the in-memory platform-as-a-service offering from SAP, can ONLY be purchased at the SAP HANA Marketplace. HCP enables customers and developers to build, extend, and run applications on SAP HANA in the cloud and is available in many flexible subscription models for apps, database, and infrastructure, from 32 GB to 1 TB+ memory capacities. With a few clicks and within minutes, you get instant access to the full power of SAP HANA.

 

The details of our presence can be found in this blog: http://scn.sap.com/community/cloud-platform/blog/2014/09/23/come-experience-the-sap-hana-marketplace-at-techeddcode-2014

With recent announcements on SAP HANA reaching headlines, we thought, what better way to get your SAP HANA questions answered than during a tweetchat! We held a #SAPChat last week that had over 200 tweets within an hour. Our participants Steve Lucas, Andy Sitison and John Appleby were actively tweeting away and responding to questions that came through our stream.  If you were not able to participate or follow along the stream, no worries, I’ve highlighted a few key topics below:


1. SAP TechEd && d-code

SAP TechEd && d-code Las Vegas is just around the corner. Get all the information you need here, from searching for sessions to attend live or online: http://www.sapdcode.com/2014/usa/home.htm, to building your personalized agenda here: http://sessioncatalog.sapevents.com/go/agendabuilder.home/?l=84.


Q: @tpowlas: @D_Sieber @SAPMentors @nstevenlucas sure, any sneak previews to share for #SAPtd to share? #SAPHANA #SAPChat

  • @applebyj: @tpowlas @D_Sieber @SAPMentors @nstevenlucas I expect we will hear a lot about the upcoming #SAPHANA SP09 release #sapchat
  • @applebyj: @tpowlas @D_Sieber @SAPMentors @nstevenlucas I hear improvements in performance, scale-up support, core SQL, data tiering #sapchat


Q: @ASitison: So TechED is just around the corner, what should the masses keep an Eye on? #SAPChat

  • @tweetsinha: @ASitison > customer momentum in 1000s, openness, truth, platform, analytics on HANA, all social streams on HANA, IOT on HANA... #sapchat

 

2. SAP HANA Customers

Customers are important to us and we value their comments and concerns. Many of our customers have shared their stories and experiences with us and you can view them through our SAP HANA Customer Reference eBooks: http://www.saphana.com/community/learn/customer-reference-ebooks or hear them share their stories through these videos: http://www.saphana.com/community/learn/customer-stories/


Q: @Rafikul_Hussain: How many live customer for BW & ERP on HANA in production environment #SAPHANA #SAPChat

  • @tweetsinha : @Rafikul_Hussain > suite is 160 live, all are high availability


Q: @Rafikul_Hussain: How many customer using ERP on HANA in High availability #SAPHANA #SAPChat

  • @applebyj: @Rafikul_Hussain I'm working on right now with over 100TB of HANA. HA/DR, multiple datacenters. It's awesome so far. #sapchat
  • @applebyj: @Rafikul_Hussain But to be honest... I would think of the 350 live SoH customers. You don't run SoH without HA/DR :-) #sapchat
  • @applebyj: @Rafikul_Hussain Sorry correction for SoH 160 live. #sapchat
  • @nstevenlucas: RT @applebyj: @Rafikul_Hussain ALL HANA customers have HA/DR scenarios - you dont run suite of BW w/o that  #sapchat


Q: @satdesai: @D_Sieber @kensaul @nstevenlucas @ASitison @HenrikWagner @applebyj How many customers are in production on SAP HANA #saphana #sapchat

  • @applebyj: @satdesai @D_Sieber @kensaul @nstevenlucas @ASitison @HenrikWagner Over 1400 total prod customers. #sapchat


Q: @Beyanet_tr: #SAPChat #SAPHANA any customer success story about BW on HANA and HANA Live Rapid Deployment on the same appliance?

  • @applebyj: @Beyanet_tr There are some doing this. More common to separate it TBH for TCO reasons. #sapchat


Q: @tweetsinha: #sapchat @nstevenlucas > as a customer why should I not flip the switch to 12c vs. #HANA?

  • @nstevenlucas: @tweetsinha: #sapchat @nstevenlucas > as cust. why should I NOT flip the switch to 12c vs. #HANA? SL: that propagates WRONG architecture
  • @nstevenlucas: RT @tweetsinha: #sapchat why #HANA vs. Orcl 12c? HANA is about less data modeling @applebyj
  • @nstevenlucas @tweetsinha @nstevenlucas My answer would be - it only works for simple scenarios. Real-world scenarios break. #sapchat with better performance...Orcl 12c. is opposite.

 

3. The future of SAP HANA

There has been tremendous progress around SAP HANA throughout the years, and Steve Lucas explains why SAP HANA is the future: http://www.saphana.com/community/blogs/blog/2014/09/28/sap-hana-is-the-future 


Q: @naveenketha: Whats #SAP future plans to increase #HANA  adoption ? #SAPChat

  • @applebyj: @naveenketha I think a focus on helping customers build the business case, plus more and more use cases. @nstevenlucas


Q: @applebyj: What's the most important #saphana trend for 2015? @nstevenlucas #sapchat

  • @nstevenlucas: RT @applebyj: What's the most important #saphana trend for 2015? @nstevenlucas #sapchat SL:SoH is PROVEN. need cust. to move OFF oracle


Q: @tweetsinha: #sapchat @nstevenlucas > why do you think HCP is a big deal?

  • @nstevenlucas: RT @tweetsinha: #sapchat @nstevenlucas > why do you think HCP is a big deal? SL: reduces ext & cust of SAP on prem costs by upwards of 100x

 

I hope this tweet chat has shed light on some of the burning questions you’ve had, and I want to thank everyone who participated in asking as well as answering all the questions.

 

This tweet chat may be over, but it certainly won’t be our last! Are there more questions you have around SAP HANA? Feel free to leave a comment below with your questions and we will be sure to get you an answer or set up another tweet chat.

What’s in It for You? – Get an Overview of SAP HANA Activities during SAP TechEd && d-code 2014

SAP TechEd && d-code 2014 Las Vegas is only a couple of days away. The Berlin event follows in mid-November. Get excited to meet the SAP HANA experts from SAP HANA Development and Product Management!

 

Get comprehensive insights into sessions around SAP HANA and do SAP TechEd && D-code your way!

 

Take part in expert-led exercises and classroom trainings; deep dive into SAP HANA platform topics during demo-rich lectures, review future product directions and chat with the SAP HANA experts in person!

 

Activities covering SAP HANA platform topics are spread across different tracks. Find the most interesting sessions for your needs by reading the SAP HANA activities blog posts, clustered by track, below:

 

And don’t forget to check out the 30-minute Expert Networking Sessions to talk with SAP HANA conference speakers in a smaller setting.

 

Looking forward to meeting you.

I recently participated in the Very Large Data Bases Conference (VLDB), the #1-ranked academic research conference in the database field, hosted in Hangzhou, China. During this conference, Professor Dr. Hasso Plattner delivered the keynote presentation “The Impact of Columnar In-Memory Databases on Enterprise Systems”. Hasso Plattner shared his vision of future enterprise applications being aggregation and redundancy free, fundamentally changing business applications. “Aggregation and redundancy free” can be described in one single word: “simplification”.

 

The term “simplification” doesn’t necessarily mean fewer or less. The real meaning of “simplification” includes the following:

1. Simplification means less redundancy:

There have been reasons to keep redundant data, for example poor performance and the reciprocal effects between transactional and analytical workloads sharing the same disk I/O. Thus, OLTP and OLAP systems were separated. But there are also hidden costs to copy and manage the data and to maintain data consistency.

Now, with new hardware technologies, we have the ability to run analytics directly on top of transactional data while reducing redundant data, a concept Hasso Plattner shared during the conference. Here is an example of how the new SAP Simple Finance helps reduce redundancy.


Look at Hasso Plattner’s paper on this topic. You will find an explanation of how the data size drops from 7.1 TB in a DB2 system to 0.2 TB of hot data storage. The reduced data without redundancy can also accelerate all downstream processes such as system backup, replication, and recovery. In addition, the applications can focus on the business logic at lower cost.



2. Simplification means more dynamic applications:

Modern business applications should be dynamic. For example, web pages are usually dynamic, allowing more user interaction. Business users now want applications that support their decisions as business needs change dynamically. However, aggregates pre-define the semantics of the data, so applications built on aggregates can hardly be viewed as dynamic and flexible enough.




SAP HANA introduces breakthrough technologies to leverage in-memory computing and the power of modern hardware. Applications can aggregate data on the fly, separating the logic from the data.

3. Simplification means less data movement

Loading data from databases and processing it in an application server is the traditional way to process data. But is this a good way? The cost of data movement can be very high. Is it possible to move the computing logic to where the data sits?



Several years ago, I built the scientific modeler for a pricing optimization engine: a non-linear regression modeler using the Newton-Raphson method to solve the equation. The data volume can easily exceed 10 GB for two years’ worth of history, and believe me, it is a headache to accelerate the transfer of 10 GB of data. Another lesson learned for this type of solver is “do not put the SQL statement in the solver”. Sometimes one solver run takes more than 10,000 iterations to reach the convergence point, and a SQL statement within the solver will trigger that many round-trip calls. But with SAP HANA, these challenges are solved. SAP HANA allows pushing the computation down to the in-memory platform where the data sits, which reduces unnecessary time spent on data movement and round trips.
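The sketch below illustrates that pushdown idea under stated assumptions (hypothetical table and column names, the hdbcli driver): instead of streaming the raw history into the solver, the database computes the sufficient statistics, and only a handful of numbers cross the wire per call.

```python
# Sketch of computation pushdown: let the database aggregate, move only the
# small result. Table, columns, and credentials are placeholders.
from hdbcli import dbapi

conn = dbapi.connect(address="hana-host", port=30015,
                     user="DEMO", password="DemoPassword1")
cur = conn.cursor()

# Anti-pattern: pull every raw row and aggregate in the application
# (tens of gigabytes over the network, repeated for every solver run).
# cur.execute("SELECT price, units FROM sales_history WHERE product_id = ?", ("SKU-42",))

# Better: compute the sums and cross-products the regression needs in-database.
cur.execute("""
    SELECT COUNT(*)           AS n,
           SUM(price)         AS sum_p,
           SUM(units)         AS sum_u,
           SUM(price * units) AS sum_pu,
           SUM(price * price) AS sum_pp
    FROM   sales_history
    WHERE  product_id = ?
""", ("SKU-42",))
n, sum_p, sum_u, sum_pu, sum_pp = cur.fetchone()

# The Newton-Raphson iterations then run on these few aggregates in memory,
# with no further database round trip per iteration.
conn.close()
```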


Besides the three points covered in Hasso Plattner’s talk, SAP HANA actually provides more:
Simplification means Deeper Insights from Big Data

Is big data complex? Yes, but the pattern identified from big data might be very simple. Is social media data too much to be fully consumed? Yes, but what we want to know is often just a simple sentiment trend within the social media discussion. Our goal is not to own big data, but to gain useful insights. With SAP HANA, applications can process different types of data on a single server through integrated engines, such as the geospatial, text, or graph engine. Now a single server can have a huge amount of memory; for example, HP recently released “HANA Hawk”, a single server with 12 TB of memory. Moreover, SAP HANA compresses data so it can process more “big data” within one single server, which is easier and cheaper to maintain. Thus, business applications may now need one server instead of ten, and the software that runs on it avoids the overhead of coordinating multiple servers in a cluster. SAP HANA also includes the most popular analytic algorithms to help you analyze data, such as linear regression, k-means, and C4.5 decision trees.

I like the “simplified” future for business applications. Simplification helps you add apps quickly and change them easily. Your business application can be less redundant and move less data, while being more dynamic and providing deeper insights. The architecture will be more efficient and well organized, and the application can run faster and be easier to change and maintain.

IT departments everywhere face the challenge of diminishing budgets and increasing demand. Blade servers can help to increase data center density and minimize power use, but compute and I/O resources may still be over allocated or underutilized. IT managers hence also look into virtualization technologies, to allow such unused resources to be put to more effective use, increasing return on investment or even delaying the need to buy new equipment.


When starting to think about virtualization, most people immediately call to mind the well-known software virtualization hypervisors. With the introduction of logical partitioning on the Intel x86 platform, such as the Hitachi Compute Blade Logical Partitioning (LPAR) feature, customers can now also benefit from mainframe-class virtualization in blade computing.


Since the end of September, Hitachi LPAR 2.0, developed and optimized to operate SAP HANA on SAP HANA-certified Hitachi servers, can be used for production and non-production scenarios in controlled availability. This means SAP will support selected customers, depending on their scenarios and system sizes, in going live with SAP HANA on Hitachi LPAR 2.0. For further details on the controlled availability process, see our updated slide deck on the SAP HANA virtualization roadmap or feel free to contact us at: sap_hana_tailored_data_center_integration@sap.com


Hitachi LPAR virtualization enables physical server resources to be allocated among multiple securely isolated partitions to maximize the efficiency and utilization of blade server hardware. Each logical partition hosts its own independent guest operating system and application environment. Individual CPU cores can either be assigned to specific logical partitions for maximum security (dedicated mode), or shared between partitions for maximum utilization (shared mode). CPU and other resources assigned to shared mode partitions can be dynamically re-allocated to allow rapid response to changes in application workload.


As of today, SAP HANA on Hitachi LPAR support covers a single SAP HANA virtual machine, or up to four running in parallel, on a dedicated SAP HANA-certified server. SAP HANA database scale-out configurations are not supported. Support is limited to 2- or 4-socket SAP HANA-certified Intel E7 v2 Ivy Bridge EX processor-based configurations in single-node, scale-up setups.


For more information and details on supported scenarios for SAP HANA on Hitachi LPAR, check out SAP Notes 1788665 and 2063057 on SAP HANA support for VMs on Hitachi LPAR in production (controlled availability), as well as the Hitachi best practices for running SAP HANA on Hitachi LPAR, available from Hitachi / HDS - Best Practices Guide for using ... | SCN.


Participants in the Critical Care Data Marathon hard at work


Critical Care Data Marathon

 

If you are a hospital patient, the intensive care unit is the most dangerous as well as the most expensive place to be. This is why many medical professionals believe that improving the effectiveness of treatment for ICU patients will have a significant impact on the outcome and overall cost of hospital stays.

 

One thing that you may not realize is that because the treatment of ICU patients is tremendously complex, it has become highly automated, which results in huge amounts of data documenting the most minute details of the ICU visit. This includes everything from medications, vital signs sampled every 1/125th of a second, and laboratory tests to detailed text notes entered by the medical staff. Our team at SAP is collaborating with a team at MIT which has collected comprehensive patient records on 48,000 patients during their stays in an intensive care unit of a large Boston teaching hospital. This data has been cleaned up and de-identified for research purposes and is available as the MIMIC II database through the Laboratory for Computational Physiology website http://www.physionet.org/

 

Several times per year, an MIT-based group called Hacking Medicine organizes the Critical Care Data Marathon in Boston and Europe, which focuses on improving treatment outcomes for intensive care unit patients. This year saw over 100 participants in Boston, 100 in London, and 50 in Paris working from Friday evening through Sunday afternoon on their own medical analytics projects. The teams were a mix of healthcare professionals and data scientists, with the medical professionals formulating the hypotheses and guiding the teams. The MIMIC database was loaded on HANA One and made available to participants in the US and Europe. A team from SAP was available to help participants with HANA questions, and the LCP helped them with the database details.


Let the Hacking Begin...

 

The event started off on Friday evening with a "pitch session" where participants tried to persuade the audience to join them in working on their idea.

Over 30 pitches were given and recorded on video which you can watch.

 

 

Afterwards there was a mixer in the main atrium where the initial teams were formed and plans made for the rest of the weekend.

 

Saturday was go time for the teams. Our SAP support team spent the day helping folks with HANA; questions ranged from HANA Studio installation to complex SQL query syntax. The MIT team was kept busy explaining how to interpret the data in the MIMIC database.

 

HANA is well suited as a tool for exploring a database like MIMIC. MIMIC contains a large amount of data and is structurally complex, with over 105 tables of information. Of more significance is the complexity of the queries needed to extract information. Unlike business data, there are few (if any) aggregations; queries look at many discrete data elements. The queries are so complex that they usually have to be done in stages, which is where HANA's templating facility is extremely helpful.

 

At midday on Sunday, the judges began visiting each team to check on progress. Final presentations and judging were held in the renowned MIT Media Lab after lunch. The results were quite impressive, and for many teams this is just the start of a longer research project. We videotaped the final session if you would like to see for yourself. The judges declared the following teams to be prize winners:

  1. Team PEMI looking at Elderly Patient Mortality (Video @ 58:00)
  2. Team Inspire looking at Sepsis (Video @ 42:00)
  3. Tie between:
    • Team Critical Mass looking at the obesity paradox (Video @24:30)
    • Team Septic Shockers looking at Sepsis (Video @9:45)

 


This is the second data marathon our SAP team has supported, and they just keep getting more awesome. One thing that is always challenging is that participants at an open event like this show up with all manner of computers and operating systems. They often want to use their favorite tools in addition to HANA, so we ended up supporting everything from .csv extracts to Excel and Python (via ODBC/JDBC). HANA One managed to accommodate all their working styles.
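For anyone curious what the Python route looked like in practice, here is a minimal sketch assuming SAP's hdbcli driver; the host, credentials, schema, and query are placeholders rather than the actual MIMIC setup.

```python
# Minimal sketch: connect to a HANA One instance from Python, run a query,
# and dump the result to CSV for use in other tools. All connection details
# and object names are placeholders.
import csv
from hdbcli import dbapi

conn = dbapi.connect(address="hana-one-host", port=30015,
                     user="MIMIC_USER", password="MimicPassword1")
cur = conn.cursor()
cur.execute("""
    SELECT subject_id, charttime, itemid, value
    FROM   mimic.chartevents
    WHERE  itemid = ?
""", (211,))

with open("query_result.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cur.description])  # header row
    writer.writerows(cur.fetchall())

conn.close()
```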

 

Finally, I should mention that in addition to SAP, this event was supported by our friends at Philips Healthcare.

About a year ago, my colleague's son, Danny, a student at the University of Amsterdam, guided me around Amsterdam during my short visit to this amazing city.

 

Upon meeting Danny, my travel mates and I realized immediately that we limited ourselves by only considering schools in the United States.  Danny stepped outside of his comfort zone and experienced a new exciting culture every day – and it paid off.

 

Opportunity is endless at the University of Amsterdam (UvA).  Seventy thousand students call UvA home and with about 10,000 employees, UvA is the largest university in the Netherlands.  But even the distinguished and nationally recognized UvA has its challenges.  Managing data for 70,000 students is no easy task when running an outdated IT infrastructure. UvA strives to empower its students to be successful.


To achieve that goal, the school needed high-speed data computing. To begin the change, UvA migrated its existing SAP Business Warehouse (SAP BW) application to the SAP HANA platform, cutting its reporting times from 30 seconds to 5. UvA now runs SAP Business Suite powered by SAP HANA and SAP BW powered by SAP HANA, allowing the school to spend less time on behind-the-scenes reporting and more time focusing on student success.

 

Together with SAP, UvA can enhance the learning experience at a deeper level because it has the software it needs to run large learning analytic data reports in real time.  And although I missed my chance at a Bachelor’s degree abroad, it’s never too late to go for my Master’s in Amsterdam…and I’m sure that Lindsey and Danny will be eager to visit me.

For the full story, click here.

 

To hear more about the University of Amsterdam and its engagement with SAP, watch this interview video with Bert Voorbraak, the director of ICT Services at UvA:

 

Follow my adventures on Twitter and LinkedIn.
