SCN Blog List - SAP Planning and Consolidation, version for SAP NetWeaver

Meet the Expert - Transports in SAP Business Planning and Consolidation 10 and 10.1


Don't miss the opportunity to gain even more knowledge about an important topic for NetWeaver systems, focusing specifically on Business Planning and Consolidation.

This session will cover the following topics:

  • SAP BPC-NW 10/10.1 transport architecture
  • Basic configuration
  • Transport and activation processes
  • Essential Tables and parameters
  • Comments on ChaRM
  • Reading transport logs (checking the activation)

 

When: Tuesday, June 14, 2016 from 2:00 PM to 3:00 PM (BRT)


Link for registration:

https://service.education.sap.com/sap/bc/webdynpro/sap/zlso_esa_home?suser=I839719&hash=0E23F7062BD8FF2F0BD96DA278459632…

 

Please leave a comment below if you have any questions about this session.


Tenth time's a charm: SAP's EPM solution once more ranked as a leader by Gartner


Predictably, SAP's EPM offering lands in the leaders' quadrant once again as Gartner releases the results of its latest Magic Quadrant study, making it the 10th year in a row. This time, Gartner split the evaluation into two separate categories, Financial CPM and Strategic CPM. SAP was ranked a leader in both, a feat only one other vendor managed to pull off.

SAP BPC Write back parameters - RECLEVEL_NR


Here you can find the results of an investigation I carried out on the SAP BPC write-back parameters, which can be maintained as model parameters for Planning and Consolidation in transaction SPRO (a general description can be found in the BPC admin guide).

 

RECLEVEL_NR specifies the maximum number of records to be written for which record-based locking is always applied.


This is quite important because a wrong setting of this parameter can affect performance and overload the enqueue server by generating a huge number of lock selection ranges.

 

If the number of submitted records in a BPC input form or in the result of a Data Manager package exceeds the value of RECLEVEL_NR, we get 2 different behaviours:

  1. NON-SPARSE records
  2. SPARSE records

In theory, the choice between these 2 scenarios should depend on the parameter SPARSITY_COEF, but in the implementation I checked, the sparse behaviour was never possible, no matter what the value of SPARSITY_COEF is.

[Screenshot: method CHECK_CONCURRENCY_LOCK of class CL_UJO_WRITEBACK, taken from BPC release 801, SP level 12]

As a result of these 2 scenarios we can have:

  • record-based locking (the same behaviour we get when RECLEVEL_NR is greater than the number of submitted records)
  • BETWEEN-range locking: instead of adding one record for every member to lock in a given dimension, a single record defines an interval from the lowest to the highest member. This avoids overloading the enqueue server with a huge lock selection, but at the risk of locking too many members, especially in a sparse situation. The sketch below illustrates the decision.
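The decision logic, as far as I could reconstruct it while debugging, roughly looks like the ABAP sketch below. This is a simplified illustration only, not SAP's actual code; all variable names are mine, and only RECLEVEL_NR, SPARSITY_COEF, MULTIPLY_COEF and INTERVAL_NR correspond to real parameters.

  * Simplified sketch of the CHECK_CONCURRENCY_LOCK decision (not SAP's code).
  DATA: lt_records     TYPE STANDARD TABLE OF string, " submitted records
        lv_reclevel_nr TYPE i VALUE 10,               " from the SPRO model parameters
        lv_is_sparse   TYPE abap_bool,                " derived via SPARSITY_COEF
        lv_mode        TYPE string.

  IF lines( lt_records ) <= lv_reclevel_nr.
    lv_mode = 'RECORD'.           " one precise lock entry per record
  ELSEIF lv_is_sparse = abap_true.
    lv_mode = 'SPARSE_RANGE'.     " would use MULTIPLY_COEF; never reached in 801 SP12
  ELSE.
    lv_mode = 'NONSPARSE_RANGE'.  " uses INTERVAL_NR
  ENDIF.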


NON-SPARSE records

The locking selection is created using the BPC parameter INTERVAL_NR. From the BPC admin guide we know that:


"In the situation where record level locking is not being implemented and the data set being saved is NOT sparse, any dimensions with           less than this number of distinct member values in the dataset will be locked using their single values. If the dimension has more than this           number of records, the range between the low to high values will be locked."


So if the number of members of a dimension is less than or equal to INTERVAL_NR, record-based locking will be applied for that dimension; otherwise, the locking will use an interval spanning from the lowest to the highest member.

E.g.

If I am going to write 3 Time members (2016.01, 2016.02 and 2016.12) and INTERVAL_NR is 2, the interval will lock every Time member from 2016.01 to 2016.12. As a consequence:

  • I have generated just 1 record instead of 3 in the locking table (a good thing)
  • I have locked 12 months instead of the 3 actually affected by the change (a bad thing)

The sketch below shows what the two lock selections would look like.
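In terms of the standard select-option pattern, the two cases correspond to lock selections like these (a minimal sketch; the generic range structure is used here for illustration, not necessarily the exact table type BPC uses internally):

  TYPES: BEGIN OF ty_range,
           sign   TYPE c LENGTH 1,
           option TYPE c LENGTH 2,
           low    TYPE string,
           high   TYPE string,
         END OF ty_range.
  DATA lt_lock TYPE STANDARD TABLE OF ty_range WITH EMPTY KEY.

  * Record-based locking: one 'EQ' entry per submitted member (3 entries)
  lt_lock = VALUE #( ( sign = 'I' option = 'EQ' low = '2016.01' )
                     ( sign = 'I' option = 'EQ' low = '2016.02' )
                     ( sign = 'I' option = 'EQ' low = '2016.12' ) ).

  * Interval locking (INTERVAL_NR exceeded): a single 'BT' entry, which
  * also locks the 9 months that were not actually changed
  lt_lock = VALUE #( ( sign = 'I' option = 'BT' low = '2016.01' high = '2016.12' ) ).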


SPARSE records (never actually reached in my current version)

The locking selection is created using the BPC parameter MULTIPLY_COEF.


"In the situation where record-level locking is not being implemented and a sparse data set is being saved, this value specifies the maximum           number of members for which you can implement record level locking (that is, when to swap to using a BETWEEN range in the lock table)."


In this case, the range selection is applied based on the number of members in every dimension.


E.g.

See the example below, where the model has 4 dimensions and MULTIPLY_COEF is equal to 15. Please note that the table is sorted by the number of members in each dimension: Category (1 member) comes first and Account (7 members) last.

[Screenshot: resulting lock selection for the 4-dimension example]

Conclusion

I would advise keeping the parameter RECLEVEL_NR at its default value of 10 and not increasing it.

The type of locking implemented will then depend on the parameter INTERVAL_NR (assuming we are always in a NON-SPARSE scenario).

 

Impact on parallelisation

When a parallel BPC process (RUNLOGIC_PH or the BPC parallel framework) is implemented, it is highly recommended to keep RECLEVEL_NR low.

 

With RUNLOGIC_PH and RECLEVEL_NR equal to PACKAGE_SIZE, some processes usually fail because the enqueue server is overloaded.

Below you can see the results of some tests I ran on our BPC system:

 

| RECLEVEL_NR | INTERVAL_NR | Result                | Runtime     |
| 40000       | 10          | Some processes failed | ~14 min     |
| 10          | 10          | Succeeded             | ~8 min 30 s |
| 10          | 1000        | Succeeded             | ~8 min 40 s |

Unable to open EPM add-in from BPC web client


Being unable to open the EPM add-in from the BPC web client is a very common problem in BPC 10.X. Here we discuss the common errors and how to deal with them.

 

First, you need to know which client software is supported by BPC 10.X:

 

SAP BPC 10.0 NW:

https://support.sap.com/content/dam/library/ssp/infopages/pam-essentials/BPC_10_NW.pdf

 

SAP BPC 10.1 NW:

https://support.sap.com/content/dam/library/ssp/infopages/pam-essentials/BPC_101_NW.pdf

 

Note:

- Microsoft has stopped providing support for Windows XP, so we no longer suggest using it.

- Google Chrome is supported with Analysis Office, not the EPM add-in (Note 2258301).

- IE11 is supported in compatibility view or enterprise mode as of CPMBPC 801 SP09.

- IE11 is supported in standard (native) mode as of CPMBPC 801 SP11.

- IE11 is the last version of Internet Explorer. After January 12, 2016, Microsoft will no longer provide security updates or technical support for older versions of Internet Explorer.

 

Case 1 - Symptom

Unable to open the Excel link through a BPF (Error launching office client: Automation server can't create object).

 

The message "Install EPM Add-in for Microsoft Office" is displayed when clicking EPM Office Add-in in the BPC 10.1 web client.

The same message is displayed in the BPC 10.0 web client under Home > Start Page > Launch.

 

Case 1 - Checklist

  • Use IE 64 bit with Office 64 bit (EPM add-in .NET 4.0)
  • Use IE 32 bit with Office 32 bit (EPM add-in .NET 3.5 or 4.0)
  • Uncheck option Enable Protected Mode in Internet Options > Security.
  • Add the BPC web URL to Local intranet/Trusted sites in Internet Options > Security.

 

Make sure the EPM add-in is loaded when opening Excel directly; if it is not, you need to fix that issue first.

Make sure the following Add-on is enabled in IE.


Reset the Internet Explorer settings.

The following warning should appear after resetting the Internet Explorer settings; you need to allow it to run. The same warning appears in IE11.

Try opening Internet Explorer or Firefox through Run as administrator.

 

Case 2 – Symptom

Exception from HRESULT: 0x80029C4A (TYPE_E_CANTLOADLIBRARY)

Case 2 – Solution

http://social.technet.microsoft.com/wiki/contents/articles/18919.c-ppt-to-pdf-unable-to-cast-com-object-of-type-microsoft-office-interop-powerpoint-applicationclass-to-interface-type-microsoft-office-interop-powerpoint-application.aspx


Case 3 – Symptom

Library not registered (registry ID: {000208D5-0000-0000-C000-000000000046}), Exception from HRESULT: 0x8002801D (TYPE_E_LIBNOTREGISTERED)

Case 3 – Solutions

  • Apply the solution of SAP Note 1978201.
  • Try opening Internet Explorer or Firefox through Run as administrator.


Case 4 – Symptom

Library not registered (no registry ID), Exception from HRESULT: 0x8002801D (TYPE_E_LIBNOTREGISTERED)

Case 4 – Solutions

  • Try opening Internet Explorer or Firefox through Run as administrator.
  • Do not delete WPS Office if you have installed it, or install WPS Office again if you have deleted it. (WPS Office is an office suite compatible with Microsoft Office.)


Case 5 – Symptom

Error launching office client: a.PushActionsByValue is not a function



Case 5 – Solution

Make sure the following Firefox plug-in is activated; otherwise, install NP.EPM.Plugin.xpi from the installation folder of the EPM add-in.


Case 6 – Symptom

You get the warning "This add-in could not be installed because it has not been verified" when trying to install the EPM plug-in for Firefox.


Case 6 – Solution

Set xpinstall.signatures.required = false in about:config


BPC 10.1 embedded: How DAPs determine Authorizations for HCPRs


Abstract

The authorization concept of SAP BusinessObjects Planning and Consolidation 10.1, version for SAP NetWeaver (short: BPC embedded), is based on BW Analysis Authorizations. There are three layers of authorizations in place: BW backend authorizations, environment authorizations, and Data Access Profiles (DAPs).

The difference between DAPs and the other two layers of authorizations is that DAPs are defined per BPC model, not per provider as the other two layers are.
The DAP is defined on the basic providers of the model; for CompositeProviders (HCPRs) or MultiProviders, a "lifting" technique is used to derive the authorization on the higher-level provider from the basic providers.

In the first part of this text, we describe the lifting algorithm and discuss its implications. In the second part, we give some examples of authorization configurations on the different layers.

In the usual case, where the characteristics of the basic providers are mapped to the same characteristics in the HCPR, this lifting is straightforward. The lifting only does something special if there is a non-trivial mapping between the characteristics of the basic providers and the HCPR/MultiProvider.

Preface

Notes 2314554 and 2332057 need to be implemented for this feature to work as described here.

For the sake of readability, this text only talks about HCPRs. However, the same algorithms are valid for MultiProviders: a MultiProvider behaves just like an HCPR of the same structure.

Analysis Authorizations and DAPs

If a query is run in the context of a BPC environment and model, the applicable authorizations are calculated in the following way:

  1. The Analysis Authorizations B maintained in /nRSECADMIN and assigned to the user directly or via /nPFCG role are extended by the environment authorizations E for the current environment as maintained in /nRSECENVI. The extension is done by simply taking the union of the two lists of authorization objects. The result of this is a set of Analysis Authorizations which potentially grants access to more objects than the initial set B.
  2. This extended set of authorizations is then restricted again by the DAP of the BPC-model in question. This restriction is an intersection of the authorization objects.

In effect, the user can only see or write data which he/she is authorized for by the DAP and by the union of environment and BW authorizations. In short, this can be summarized as

A = (B ∪ E) ∩ D     (1)

where A is the resulting authorization, B is the ordinary BW Analysis Authorization, E is the environment authorization for the current environment, and D is the DAP authorization for the current model.

Note 1: If no DAP is maintained, this is interpreted as D = ∅ (the empty set). Hence, no access to the data will be granted.

Note 2: If no Environment/Model context is used, only the Analysis Authorizations B are considered (see the last chapter).
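As a quick worked illustration with invented values: suppose B = {DE}, E = {FR, GB} and D = {DE, FR}. Then A = ({DE} ∪ {FR, GB}) ∩ {DE, FR} = {DE, FR, GB} ∩ {DE, FR} = {DE, FR}. The environment authorization first widens the BW authorization, and the DAP then cuts the result back down.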

Calculating the DAP-Authorization Part

Relevant Providers

The basic difference between ordinary BW analysis authorizations (as well as environment authorizations) and BPC’s Data Access Profile (DAP) concept is that DAPs are defined per model for all authorization-relevant characteristics (“dimensions” in BPC terminology). It is not possible to set an authorization for an individual provider of the model within the DAP.

The list of relevant characteristics is determined by considering all authorization-relevant characteristics of all basic providers of the model. This includes all InfoCubes and DSOs (ordinary or advanced) which are added to the model explicitly. It also includes all part providers of all HCPRs of the model.
In short: the DAP is defined on the characteristics of the basic providers of the model.

The DAP authorizations on HCPRs (no matter whether the HCPRs are members of the model themselves or just their part providers are) are calculated using a lifting algorithm.

The Lifting Algorithm

When the authorization for an HCPR is requested, the authorizations B and E in equation (1) are maintained on the HCPR level, but D is defined on the model level. In order to evaluate (1), the DAP authorizations are lifted using the following rules:

  1. For a target characteristic in the HCPR, the pre-image in the basic providers is looked up and the authorization defined in the DAP is copied.
  2. If a characteristic which is not part of the DAP (not part of the model) supplies a characteristic in the HCPR, the *-authorization (all-authorization) is used for the target.
  3. If a characteristic of the HCPR is supplied by different source characteristics, the intersection of the source authorizations is taken.

This leads to the lifted DAP authorizations D' which are now defined on the characteristics exposed by the HCPR. So in this case, the resulting authorization is calculated as

A = (B ∪ E) ∩ D'     (2)
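Rule 3 is the one that does real work. The ABAP sketch below illustrates it for a characteristic supplied by two sources, anticipating the 0COUNTRY case of example 2 below; this is purely illustrative code with invented names, not the actual BW implementation.

  * Illustration of rule 3: 0COUNTRY is supplied by both ZCOUNTRY and
  * ZCNTRY2, so its lifted authorization is the intersection of the two.
  DATA(lt_zcountry) = VALUE string_table( ( `DE` ) ( `FR` ) ( `GB` ) ).
  DATA(lt_zcntry2)  = VALUE string_table( ( `DE` ) ( `ES` ) ( `IT` ) ).
  DATA lt_0country  TYPE string_table.

  LOOP AT lt_zcountry INTO DATA(lv_member).
    IF line_exists( lt_zcntry2[ table_line = lv_member ] ).
      " keep only members that every supplying source authorizes
      APPEND lv_member TO lt_0country.
    ENDIF.
  ENDLOOP.
  " lt_0country now contains only `DE`, matching the table in example 2

A source that is not part of the model contributes the *-authorization (rule 2), which leaves the intersection unchanged.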

Examples

1. Unique Mapping

Suppose we have an InfoCube contained in an HCPR (as the sole part provider). In the HCPR, characteristics may be mapped to other characteristics, provided they all refer to a common base characteristic.

So let’s assume that the characteristic ZCOUNTRY references 0COUNTRY and we have the following mapping between the InfoCube and the HCPR:

| InfoObject (InfoCube) | Mapping target in HCPR |
| ZCOUNTRY              | 0COUNTRY               |
| 0CALYEAR              | 0CALYEAR               |

 

The DAP of the model is defined on the characteristics of the basic providers of the model. If the HCPR or the InfoCube is part of the model, a possible DAP could look like this:

| InfoObject | DAP D: authorized values |
| ZCOUNTRY   | DE, FR, GB               |
| 0CALYEAR   | 2015, 2016               |

Note that 0COUNTRY is not part of the DAP. This is true no matter whether the HCPR or the InfoCube or both providers are explicitly added to the model.

If a query is run on the HCPR, the resulting DAP-authorizations are:

| Characteristic (of HCPR) | Source characteristic for mapping | Resulting authorization D' |
| 0COUNTRY                 | ZCOUNTRY                          | DE, FR, GB                 |
| 0CALYEAR                 | 0CALYEAR                          | 2015, 2016                 |

The authorization for 0CALYEAR is simply read from the DAP, while for 0COUNTRY, the pre-image ZCOUNTRY is looked up and its authorization is applied to 0COUNTRY.

2. Complex Mappings

In HCPRs with multiple part providers, a characteristic exposed by the HCPR might be supplied by different characteristics of the part providers (all of them need to reference a common characteristic).

In the following example, we have one InfoCube which supplies the HCPR's characteristic 0COUNTRY from the InfoCube's characteristic ZCOUNTRY, and a second InfoCube which supplies the HCPR's characteristic 0COUNTRY from the characteristic ZCNTRY2.
There's also some cross-mapping happening here: InfoCube 1 supplies ZCROSS2 from ZCROSS1, and InfoCube 2 supplies ZCROSS1 from ZCROSS2. This essentially swaps the names of the two characteristics between the InfoCube layer and the HCPR layer.
Let's not spend too much time on the question of whether this kind of modeling helps in understanding the data.

| InfoObject (InfoCube 1) | Mapping target in HCPR |
| ZCOUNTRY                | 0COUNTRY               |
| 0CALYEAR                | 0CALYEAR               |
| ZCROSS1                 | ZCROSS2                |
| ZCROSS2                 | <not mapped>           |

 

| InfoObject (InfoCube 2) | Mapping target in HCPR |
| ZCNTRY2                 | 0COUNTRY               |
| 0CALYEAR                | 0CALYEAR               |
| ZCROSS1                 | <not mapped>           |
| ZCROSS2                 | ZCROSS1                |

The DAP is defined on the union of all characteristics of all part providers. Let’s assume it looks like this:

| InfoObject | DAP D: authorized values |
| ZCOUNTRY   | DE, FR, GB               |
| ZCNTRY2    | DE, ES, IT               |
| 0CALYEAR   | 2015, 2016               |
| ZCROSS1    | 1, 2, 3                  |
| ZCROSS2    | 3, 4                     |

Now the DAP authorizations are lifted to the HCPR like this:

| Characteristic (of HCPR) | Source characteristics for mapping | Authorization from source | Resulting authorization D' |
| 0COUNTRY                 | ZCOUNTRY, ZCNTRY2                  | DE, FR, GB / DE, ES, IT   | DE                         |
| 0CALYEAR                 | 0CALYEAR                           | 2015, 2016                | 2015, 2016                 |
| ZCROSS1                  | ZCROSS2                            | 3, 4                      | 3, 4                       |
| ZCROSS2                  | ZCROSS1                            | 1, 2, 3                   | 1, 2, 3                    |

Note that the cross-mapping of ZCROSS1 and ZCROSS2 defined in the HCPR leads to a cross-mapping of the authorization.

3. HCPR with External Part Provider

Up to now, we looked at cases where all part providers were considered providers of the model, either because the part providers were explicitly added to the model's provider list or because the HCPR itself was defined as a provider of the model.

With the HCPR definition as in example 2, we now assume that InfoCube 1 is added to the provider list of the model, but not InfoCube 2 or the HCPR. Let's further assume that there is some other provider, InfoCube 3, in the model and that the characteristic ZCROSS2 is part of InfoCube 3 (but ZCNTRY2 is not).

The DAP definition could look like this:

| InfoObject | DAP D: authorized values |
| ZCOUNTRY   | DE, FR, GB               |
| 0CALYEAR   | 2015, 2016               |
| ZCROSS1    | 1, 2, 3                  |
| ZCROSS2    | 3, 4                     |

Note that 0COUNTRY is not in the DAP definition, because it is not a characteristic of any of the basic providers of the model: it is supplied by InfoCube 2 only, and InfoCube 2 is not part of the model.
But as we assumed that the model contains InfoCube 3 and InfoCube 3 contains ZCROSS2, this characteristic is part of the DAP.

In this case, the lifting algorithm calculates the following result:

| Characteristic (of HCPR) | Source characteristics for mapping | Authorization from source | Resulting authorization D' |
| 0COUNTRY                 | ZCOUNTRY, ZCNTRY2                  | DE, FR, GB / <all>        | DE, FR, GB                 |
| 0CALYEAR                 | 0CALYEAR                           | 2015, 2016                | 2015, 2016                 |
| ZCROSS1                  | ZCROSS2                            | 3, 4                      | 3, 4                       |
| ZCROSS2                  | ZCROSS1                            | 1, 2, 3                   | 1, 2, 3                    |

The resulting authorization for 0COUNTRY is created by intersecting the authorization for ZCOUNTRY from the DAP (that is: DE, FR, GB) with the authorization of ZCNTRY2, which is lifted as * (the all-authorization) because ZCNTRY2 is not part of the model.

Intersecting the DAP with Backend/Environment Authorizations

In the above section, we looked at the D in equation (1). As we have seen, the DAP authorizations D are lifted to the HCPR to create the authorizations D'.

The authorizations for the HCPR are hence calculated as A = (B ∪ E) ∩ D', i.e. equation (2). In particular, the lifting is done first and then the intersection is taken (rather than intersecting on the base level first and then lifting the result).

Examples

1. Unmapped Characteristics

Consider a BPC model containing an HCPR. The DAP for this model is defined on all authorization-relevant characteristics of all part-providers of the HCPR.

Suppose the HCPR contains one InfoCube with the characteristics ZCOUNTRY and 0CALYEAR. Assume further that only ZCOUNTRY is mapped in the HCPR (using the trivial mapping).
Both ZCOUNTRY and 0CALYEAR are DAP-relevant, but it is possible to access the HCPR even if no values are maintained for the non-mapped characteristic in the DAP:

| InfoObject | Backend authorization (InfoCube) | Backend authorization (HCPR) | DAP authorizations D | Resulting authorization for HCPR in Env/Mod context |
| ZCOUNTRY   | DE, GB                           | FR, IT                       | DE, FR, GB           | FR                                                  |
| 0CALYEAR   | 2015, 2016                       | <not mapped>                 | <none>               | <aggregation>                                       |

Note that the resulting authorization on the mapped characteristic is calculated by intersecting the DAP with the backend authorizations of the HCPR, not the InfoCube.

2. Different Backend Authorizations on HCPR and Part Providers

Looking at the above example again, we can see the effect of different authorizations on the HCPR and the part provider. In BW Analysis Authorizations, it is possible to grant different access rights to the same data depending on the provider that data is consumed with.

In our example, if the user consumes the data of the InfoCube through a query built on top of the InfoCube rather than the HCPR, he gets the following access rights:

| InfoObject | Backend authorization (InfoCube) | Backend authorization (HCPR) | DAP authorizations D | Resulting authorization for InfoCube in Env/Mod context |
| ZCOUNTRY   | DE, GB                           | FR, IT                       | DE, FR, GB           | DE, GB                                                  |
| 0CALYEAR   | 2015, 2016                       | <not mapped>                 | <none>               | <none>                                                  |

So because of the unmaintained DAP for 0CALYEAR, the user is not able to see any data when looking at the InfoCube directly. The authorization for ZCOUNTRY has also changed completely, because now the intersection of D is taken with the authorization of the InfoCube.

Here is another example with an authorization set for 0CALYEAR in the DAP:

| InfoObject | Backend authorization (InfoCube) | Backend authorization (HCPR) | DAP authorizations D | Resulting authorization (InfoCube) | Resulting authorization (HCPR) |
| ZCOUNTRY   | DE, GB                           | FR, IT                       | DE, FR, GB           | DE, GB                             | FR                             |
| 0CALYEAR   | 2015, 2016                       | <not mapped>                 | 2015                 | 2015                               | <aggregation>                  |


Use Case: Enforcing the Use of Environment/Model

DAP authorizations as well as the work status feature only take effect if an Environment/Model context is set.

So by running a query in a frontend without an Environment/Model context, it might be possible to plan on regions which would be locked by a work status if Environment/Model were set. The restrictions coming from DAPs and the environment authorizations are not applied in this scenario.

To enforce the use of BPC by enforcing the setting of Environment/Model in all frontends, Analysis Authorizations can be used. By granting no basic Analysis Authorizations on the relevant providers at all (i.e. B = ∅ in the previous notation) and putting all backend authorizations into environment authorizations (E in our previous notation), these providers will not be accessible if Environment/Model is not set. This follows from the fact that only B is evaluated if Environment/Model is not set.

How to Create a Dynamically Referencing Local Member


Local Members are very useful and easy to use when it comes to creating simple formulas, and they generally work without breaking. But even simple calculations tend to break when formulas reference specific members. For example, in a report where rows/columns with zero or empty values are eliminated, a Local Member summing a few members will break if one of those members is not present in the report after a refresh. We can use a positional Local Member instead, but its formulas can end up referencing the wrong rows/columns after elimination. Or we end up with a big report if rows/columns are not eliminated.

 

In the report below there are 4 nodes, and many company codes carry zero values for a specific period; the company codes in NODE4 have no data at all. If a sum of the 4 nodes has to be created, we can use a simple Excel calculation, which results in a Local Member formula like the following:

 

=SUM(EPMMEMBER([COMPANYCODE].[PARENTH1].[NODE1]),EPMMEMBER([COMPANYCODE].[PARENTH1].[NODE2]),EPMMEMBER([COMPANYCODE].[PARENTH1].[NODE3]),EPMMEMBER([COMPANYCODE].[PARENTH1].[NODE4]))

 

and if you are using a positional Local Member, the formula will look like this:

 

=SUM(EPMPOSITION(8),EPMPOSITION(26),EPMPOSITION(51),EPMPOSITION(69))

 

The problem with the above 2 formulas is that if rows with zero values are eliminated and one of the nodes is removed, the Local Members will not work.

 

There are 2 ways in which we can resolve the issue.

One way is to simply hide the rows/columns instead of removing them: the Local Members keep working even if one of the nodes is hidden.

 

The positional Local Member below still works even if NODE4 is hidden.

 

But let's say one member (1002) is moved from NODE1 to NODE2 as below: the positional Local Member will still work, but with a wrong formula. Here, the Local Member formula still references D42, which is wrong.

 

When multiple reports are involved, hiding zero or empty rows may not be an option. In that case, use the formula below to locate a specific node ID and extract its value.

 

Below is a complex-looking but simple formula that calculates a Local Member for specific member IDs.

 

=SUM(IFERROR(VLOOKUP("NODE1",$C$35:INDIRECT(ADDRESS(ROW()-1,COLUMN())),COLUMN()-2,FALSE),0),

IFERROR(VLOOKUP("NODE2",$C$35:INDIRECT(ADDRESS(ROW()-1,COLUMN())),COLUMN()-2,FALSE),0),

IFERROR(VLOOKUP("NODE3",$C$35:INDIRECT(ADDRESS(ROW()-1,COLUMN())),COLUMN()-2,FALSE),0),

IFERROR(VLOOKUP("NODE4",$C$35:INDIRECT(ADDRESS(ROW()-1,COLUMN())),COLUMN()-2,FALSE),0))

 


 

What the above formula does is find the specific nodes in the report area using VLOOKUP, dynamically referencing the rows and columns relative to the cell in which the Local Member is calculated. The IFERROR returns 0 if the VLOOKUP returns an error or #N/A.

 

There are a few things to remember while using this approach. Be sure to use only IDs in the report. There is no restriction on using descriptions, but if descriptions are used and they later change, the formula might break. The report might also get bigger and pose a maintenance problem if many member IDs are used in the calculation.

Otherwise, you can use this as one option for creating Local Members with dynamic rows/columns, and then build further Local Members on top of it.

 

Any comments and/or suggestions are welcome.

Business Planning with SAP. What? Which? When?


In the past few years, SAP has introduced many changes to existing planning products and launched new planning-focused products, in response to the shift towards cloud and in-memory computing. With SAP HANA being the major driver of future SAP products, many of these products have undergone significant changes to address customers' need for robust, integrated planning tools.

 

In this blog I will try to list all these planning products and the benefits each one offers over the others.

 

SAP Business Planning and Consolidation


In short, SAP BPC is the main product of focus for most planning with SAP; many customers have chosen it for its ease of use and its integration with MS Excel and SAP BW. SAP BPC also provides consolidation along with planning (the only other product that offers consolidation is SAP Financial Consolidation), so most customers choose BPC for its multiple benefits.

 

SAP BPC has been split into 2 separate products:

  • SAP BPC Standard: the existing BPC tool, used for both planning and consolidation. Can be used with or without HANA.
  • SAP BPC Embedded: an extended implementation using SAP BW cubes and SAP BW-IP/PAK planning functionality, running on top of HANA. Not possible without HANA. Also supports consolidation (using the same consolidation engine as BPC Standard).

 

BPC Standard and Embedded offer more or less the same features, including consolidation, and more and more features are being ported from Standard to Embedded. BPC Embedded has the advantage when it comes to integration with SAP BW and reuse of existing BW implementations. SAP BPC Standard will interest those who want flexibility, ease of use, and business ownership.


SAP BW Integrated Planning

 

SAP BW Integrated Planning (BW-IP) is another planning product that leverages SAP BW. Existing BW implementations can be reused to create planning scenarios, with a single user interface (BEx) for both reporting and planning. Despite the advantages of reusability, the lack of complete functionality and the fact that planning models are not so easy to build and change mean that BW-IP has never been a very popular planning product.

 

With SAP HANA there is a new product, the Planning Application Kit (PAK), that can be used along with BW-IP. PAK has introduced much functionality that was not possible earlier, and it provides in-memory execution of complex planning processes. BW-IP and PAK can be used together to build complex planning scenarios that execute in SAP HANA.

 

At present, most of the BW-IP/PAK features are being folded into a product called BPC Embedded, which in turn is a sub-classification of the existing BPC.

 

 

SAP Integrated Business Planning

 

This is the original supply chain planning (SCM) product, which also includes SAP APO. SCM has offered planning functionality for supply chain processes such as Demand Planning, Supply Planning and many more. You can read more here.

SCM offers robust, stable functionality that integrates well with existing ERP and BW implementations. With APO included, most scenarios such as Demand Planning and Plan to Inventory can only be delivered using IBP.

 

Again, with SAP HANA, SCM has optimized most of its functionality in-memory. More complex processes and algorithms that increase forecast accuracy and improve supply chain planning have been introduced with IBP. Here is a nice blog on how IBP and BPC can be used together.

 

And with S/4HANA, IBP has been introduced with separate planning functionality for different departments such as Finance and SCM. All the new planning processes are integrated with SAP S/4HANA and work together with the ERP implementation on S/4HANA. This also replaces the SAP GUI-based planning.

 

IBP for Finance (SAP Simple Finance Add-On for SAP Business Suite powered by SAP HANA (IBP)):

IBP for Finance provides planning features such as cost center planning, project planning, internal order planning, market segment planning, profit center planning, functional area planning and P&L planning within the ERP implementation. All the planning runs directly on ERP instead of SAP BW, and it works with existing FI/CO implementations without creating new models for planning (see the SAP Help page for more details). IBP for Finance also uses the BPC 10.1 Embedded technology for its planning processes. You can read more on planning with IBP Finance here.


All the content is delivered with SAP Accounting powered by SAP HANA as local BI Content (very similar to BI Content in SAP BW). This is only applicable to customers who have deployed SAP ERP on SAP HANA with the SAP Simple Finance add-on for Suite on HANA.


IBP for SCM (on SAP HANA, for cloud deployment):

An integrated supply chain planning product built on SAP HANA for cloud deployment. Various planning scenarios are supported (each separately licensed), such as sales and operations, demand (IBP for Demand), inventory (IBP for Inventory), supply (IBP for Supply), response planning (IBP for Response) and the supply chain control tower. You can find more details here on the supported planning scenarios.

 

IBP for SCM supports planning across various departments and sharing plans in real time. It also supports real-time dashboards for analytics, and it provides predictive modelling and simulation capabilities. It uses pre-delivered supply chain algorithms to integrate into existing business processes and is incremental to an existing APO implementation.

 

 

SAP BusinessObjects Cloud for planning


Another cloud-based planning tool, which can work independently or use SAP BPC for its reporting and planning capability. You can find more info here.


Planning Product Feature Flow

A solid line represents the logical successor to a product; a dashed line represents features being shared with other products.

[Diagram: planning product feature flow]

I will try to follow up with details on the different ways in which each product can be used, as a single solution or in combination with other products.

Any comments and suggestions are welcome.

BPC 10.1 Embedded Model - Important Information for Office frontend


You are implementing the BPC 10.1 Embedded Model and would like to know the official SAP recommendation for the Office front ends/clients.

 

In SAP Note 2312162, we clearly recommend the Analysis for Office (AO) plug-in. This recommendation is based primarily on some architectural advantages the AO plug-in has on the SAP BW meta model compared to the EPM plug-in.

 

 

However, you can still use the EPM plug-in/add-in on a BPC Embedded Model if you are aware of the architectural (functional and performance) limitations listed in that note.

 

 

Please note that with the next support packages of the EPM add-in (EPM 10 SP27, RTC planned for September) and Analysis Office (2.3 SP2, RTC planned for September), access to a BPC 10.1 Embedded Model is deactivated by default.

 

If you are already using or plan to use the EPM add-in/plug-in on top of a BPC 10.1 Embedded Model, please follow the process described in SAP Note 2327742.


Dummies for BPC NW Embedded system (1)


I would like to start posting a couple of blog articles about the BPC NW embedded system.


As you know, BPC started on the Microsoft platform in 1999 and evolved into the BPC NW standard model.

Now it runs on top of S/4HANA as well.

 

I believe it is a dramatic evolution, and I think all BPC consultants and customers who have been using the BPC MS and NW standard versions should know the differences for the future.

 

SAP has already published many slides and videos, but I would like to make it really simple for dummies(?) like me, because I also had a hard time moving from the BPC MS world to the BPC NW world. (I have been a developer/software architect for BPC MS since 2001.)

 

Here are the topics I would like to post about:

 

1. System and concept differences between the current BPC and BPC embedded

2. The difference between BPC Embedded and BPC on S/4HANA

3. EPM Excel add-in vs. Analysis Office as the client tool

4. BPC script logic vs. FOX formula

 

 

If you have any questions, or anything about the BPC NW embedded system (including BPC on S/4HANA) is unclear, please leave a comment.

 

Thank you.

Regards,

James Lim

Exit Based Characteristic Relationships with BW-PAK


In BW Integrated Planning, SAP provides the functionality of defining characteristic relationships on a planning cube. These relationships can be used in 4 different ways: attribute, exit, hierarchy and DSO. More information on this can be found here. Exit-based characteristic relationships are used when the derivations are customer-specific and cannot be achieved directly via the other 3 types.

 

To check whether a planning sequence can be executed in-memory, SAP recommends running the program RSPLS_PLANNING_ON_HDB_ANALYSIS. Based on the report results, we can analyze what is blocking the planning sequence from being executed in-memory. In our case, we found that if a characteristic relationship of type ABAP exit class is used, the planning sequence is NOT executed in-memory. In this blog post, I would like to describe how to analyze the results of this program and implement a quick fallback method that SAP recommends.

 

In transaction SE38, execute the program RSPLS_PLANNING_ON_HDB_ANALYSIS. In the selection screen, any one of the options can be selected: InfoProviders, planning functions or planning sequences. We have chosen a planning function as an example.

After execution, we see that the status is red.



When double-clicking the red status, the program provides additional information about what is blocking the planning function from executing in-memory.

 

AMDP in Exit Based Characteristic Relationships?

Based on the analysis from running the program RSPLS_PLANNING_ON_HDB_ANALYSIS, we checked whether we could use an AMDP (ABAP Managed Database Procedure) in the exit class, as it is SQLScript-based and would allow the planning cube to be PAK compliant.

 

When trying to use the AMDP marker interface IF_AMDP_MARKER_HDB and implementing an AMDP method via IF_RSPLS_CR_METHODS~CREATE_HDB, the system throws a syntax error, because the importing parameter I_TSX_SELDR in the method must be structured and all components of the row type must be elementary. According to SAP's AMDP documentation, the parameter interface of an AMDP method has to fulfill specific prerequisites (such as the one we hit here). Thus, it can be concluded that the methods available for characteristic relationships are not AMDP compatible.

 

Alternative Approach – ABAP as a Fallback

As an alternative approach, SAP recommends ABAP as a fallback. In this case, the exit class needs to implement the interface IF_RSPLS_CR_EXIT_HDB. The two methods of the interface, IF_RSPLS_CR_EXIT_HDB~GET_SQLSCRIPT_INFO and IF_RSPLS_CR_EXIT_HDB~GET_SQLSCRIPT_PARAMETERS, have to be implemented, but they can be empty. The characteristic relationship calls the interface methods at run time, and because the returned names of the SQLScript implementations are empty, the corresponding ABAP implementation is called as a fallback. The interface mentioned above only serves as a marker, indicating that PAK deep HANA integration is requested. A minimal skeleton is sketched below.
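A minimal sketch of such an exit class could look like this (the interface and method names are taken from the text above; the base class name and everything else is an assumption and should be checked against your own exit class):

  CLASS zcl_my_cr_exit DEFINITION PUBLIC
      INHERITING FROM cl_rspls_cr_exit_base.  " assumed typical base class
    PUBLIC SECTION.
      " marker interface: signals that PAK deep HANA integration is requested
      INTERFACES if_rspls_cr_exit_hdb.
  ENDCLASS.

  CLASS zcl_my_cr_exit IMPLEMENTATION.
    " Both methods stay empty: since no SQLScript implementation names are
    " returned, the planning runtime falls back to the ABAP implementation
    " of this exit class at run time.
    METHOD if_rspls_cr_exit_hdb~get_sqlscript_info.
    ENDMETHOD.
    METHOD if_rspls_cr_exit_hdb~get_sqlscript_parameters.
    ENDMETHOD.
  ENDCLASS.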

 

After making the above changes in the exit class, re-run the program RSPLS_PLANNING_ON_HDB_ANALYSIS. The green status makes it evident that the planning function is now ready to execute in-memory.

 

Conclusion and Next Steps

It is evident that AMDP cannot yet be used in exit-based characteristic relationships. The ABAP-as-a-fallback approach is a quick and easy win to make the planning cube PAK compliant. However, there are some limitations on how and when this method can be used, and the developer needs to exercise caution. Please refer to SAP Note 1956085 for more details.

 

Based on the above findings, we are considering the following next steps:

  • Test the behavior of all planning functions built on the cube which has a characteristic relationship with the ABAP fallback method implemented. It is possible that a planning function gives undesirable results and its code needs to be revisited.
  • Raise an idea with SAP to make the characteristic relationship interface AMDP compliant.
  • Check the possibility of using a normal SQLScript procedure in the exit class, instead of the fallback method.

BPC 10: Backup and Restore with Transport Express


I was recently working with a customer who uses the Transport Express (TE) tool for transporting changes through their BPC landscape. Their requirement was to update the Development and Quality systems from Production in order to align the technical names of all the BPC dimensions and attributes.

 

The following are the steps I had to take to ensure a successful restore using the Production backup file.

 

1) Disable the Transport Express settings.

This step requires a Basis resource to perform the disabling.

It is a mandatory requirement: whenever a change is made in the system (in this case a deletion), a TE request is triggered in the background, but the deletion log does not capture this, so the successful-delete message is in fact misleading.

 

2) Delete the BPC environment that needs to be restored by running the program UJS_ACTIVATE_CONTENT and selecting "Clean the Environment".

 

 

3) Make sure that the /CPMB/<Environment ID> node is deleted.

 

4) Check that all BPC items are deleted by verifying that there are no items with a technical name starting with /CPMB/* under "Unassigned Nodes". If the Transport Express tool was active when executing step 2, you will find items here.

 

The following is an example of 2 items that were not deleted but moved to "Unassigned Nodes":

 

5) If there are scrap items that were not cleared, use the program UJAA_CLEAN_DELETED_APPSET to delete them.

 

APPSET: the Environment ID.

APPS_PFX: the environment's technical name prefix, i.e. the first 2 characters after the /CPMB/ namespace. In the following example the value is "FU".

 

Example of technical name prefix

 

 

6) If the log shows items that were not deleted, manually delete the remaining items (this might require deleting some internal tables as well; the table references are indicated in the delete log).

 

Example of log

 

7) When all items for the environment that needs to be restored are deleted, run transaction UJBR.

 

It is important to select "Use Tech Names from Backup Files" in order to keep the same technical names.

In our case, we had to load the backup file onto the server under /usr/sap/trans/, as the file was 1.4 GB and the program errored when trying to read it from the local PC.

 

The restore starts by building the BPC structures and then loads the master data. In this example I restored both metadata and master data at once, but this uses a lot of resources, so I would suggest restoring the metadata tables first and then the master data.

 

8) The ultimate check is to connect to the BPC Admin client and see whether all the models, dimensions, business rules etc. are visible. If all the structures are there, go into each dimension and check whether the master data is in place.

 

9) Reactivate the TE tool.

How to implement Matrix Security using Hierarchies in SAP BPC NW


In this blog I will show you a possible approach to implementing matrix security based on 2 different hierarchies in the same dimension.

In particular, I will describe how you can use Business Add-Ins (BADIs) for BPC to enhance the standard security model:

  1. BADI_UJE_DYNAMIC_DAP will be used to dynamically generate a “Deny” rule for every member that is not visible in one of the hierarchies.
  2. BADI_UJ_SQE_POST_PROCESS will recalculate total members without the amount posted on denied members.

 

1. Business scenario


Many organizations use a group-level matrix with two reporting lines, by geography and by function. Therefore, in our BPC entity dimension there will be a geographic hierarchical view and a functional hierarchical view.

E.g.: we can consider a computer retailer which operates in Europe and America, selling and providing assistance on personal computers.

In our Profit Centre dimension we can model this with 2 different hierarchies:


a. A geographic hierarchy, where the company's profit centres report to their region

b. A functional hierarchy, where the company's profit centres report to their function

 

If we need a role for Europe's Sales Manager, we won't have any total node available for him, because he needs access only to the nodes highlighted in light blue.

2. The options available in the standard model


The standard model is able to cover most possible scenarios, and we can choose between different kinds of approach:

a. Assign leaves directly to the users and do not provide access to either hierarchy for this type of user. Only base members will be available, and any aggregation between them has to rely on a property instead of a hierarchy.

b. Update the dimension in order to merge the two hierarchies: in our example this means creating new nodes for Europe Sales and America Sales.


c. Put the second hierarchy in a different dimension: the first hierarchy (usually the geographic one) remains in the entity dimension, while the second (the functional one) is moved to a user-defined dimension.

In our example, we would have to duplicate the profit centre base members in 2 dimensions and then build our 2 hierarchies.

Members shown in grey will be restricted for the Europe Sales Manager with a "Deny" rule in the Data Access Profiles.


3. In which situations could we need a non-standard solution?


We will need a custom solution if we have a big entity dimension with many different hierarchical nodes in the 2 hierarchies.

  1. If we give access only to leaves, the usability of the system will be affected:
    1. In data access profile maintenance, a specific Read/Write access rule has to be defined for every leaf;
    2. In reporting, we will have to use multiple member selections in the page key range: the more leaves we have to select, the more difficult the report becomes to build and maintain.
  2. If we merge the hierarchies, instead:
    1. we will need to create a node for every intersection between functions and regions that we need to consider in security and reporting;
    2. we will generate inconsistencies between the BPC dimension and the original dimension imported from ECC/BW.
  3. If we move the second hierarchy to a different dimension:
    1. if the enhancement is requested in an optimization phase, it will imply a massive review of every logic and report;
    2. it will provide additional value if we have to give the users the ability to build reports broken down along the geographic hierarchy as well as the functional hierarchy at the same time, but it is only additional complexity if we merely need to maintain security (e.g. we will have an additional secured dimension in the model, which also affects performance);
    3. it will make reading and saving data more complicated: we will have to query the model from total PC sales and save the data back to the same profit centre selected in the entity dimension. This means we may need to implement both the Shared Query Post Process BADI and the Write Back Pre Process BADI.


4. The custom solution


In this section, I will assume you are already confident in creating BADIs and writing ABAP for BPC.

As explained before, I will also assume that we have an entity dimension with thousands of members and many different hierarchical levels in the 2 hierarchies.


4.1 How to generate dynamically a deny rule using the BADI_UJE_DYNAMIC_DAP?


As explained in the introduction, I want to dynamically generate a deny rule for every member that is not visible in the intersection of the two hierarchies.

E.g.: access will be given to Europe and PC Sales, and the BADI will automatically add deny rules for Europe PC Services and America Sales for Business.


In this way, the user will be able to see only the members that are visible in both hierarchies.

In the Dynamic DAP BADI implementation, I maintain a filter to activate the enhancement only for the profile IDs that require the hierarchy matrix security.

E.g.

ENVIRONMENT = SALES_PC

PROFILE_ID = GEO_FUNC_USER (the dynamic Data Access Profile ID)

 

In the implementing class, I call the method that generates the deny rules.

Following this approach, only the profile IDs that start with A_ENT_D_FUNC apply the new rules; all the others keep the standard behaviour.

The "ct_access" internal table will already contain the member access rules defined in the Data Access Profile, which in our example consist of:

  1. Read/Write access to Europe
  2. Read/Write access to Sales

In "exclude_method", we have to add the deny rules, in order to give access only to the intersection of the base members visible in both hierarchies.

For each access rule, I perform the following steps (see the sketch after this list):

  • Get the base members of the parent specified in the rule
  • Identify the hierarchy to which the parent belongs
  • If a base member is found in both hierarchies, mark it as accessible
  • Generate a deny rule for every remaining base member that is not present in both hierarchies
  • Generate a deny rule for every parent that has only denied base members below it
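The core of the exclude logic boils down to appending one deny entry per inaccessible member, roughly as in this sketch (the structure of ct_access and its component names are illustrative assumptions, not the real BADI interface; check the signature of BADI_UJE_DYNAMIC_DAP in your system):

  * Hypothetical sketch: one deny rule per base member that is not
  * visible in both hierarchies (lt_denied computed in the steps above).
  LOOP AT lt_denied INTO DATA(lv_member).
    APPEND VALUE #( dimension = 'ENTITY'
                    member    = lv_member
                    access    = 'D' )      " 'D' = deny (illustrative)
           TO ct_access.
  ENDLOOP.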


4.2 How to recalculate totals without the denied members' values using the UJ_SQE_POST_PROCESS BADI


The totals shown in BPC reports will still include the values posted on denied members; therefore we need to use the Shared Query BADI to recalculate them.

In the filter of the BADI we can only specify the environment and the model; therefore, in the ABAP code we have to make the BADI act only for users who have a data access profile starting with A_ENT_D_FUNC assigned.

 

Therefore, in the method IF_UJQ_SQE_POST_PROCESS~POST_PROCESS, you have to find out whether the user has the hierarchy security matrix Data Access Profile assigned, and execute a different logic for this kind of user.

The post-process method executes an additional read to get the denied members' values and then removes them from the totals. The steps in detail:

  • Replace the selection of a parent entity member with its denied base members. Note that the selection of the entity can be in either:
    • it_axis: if entities are exploded in one of the axes
    • it_slicer: if the entity is selected in the context or in the page, or is a single member in an axis
  • Read the denied members' values, bypassing security (the boolean parameter i_pass_by_security of the standard method if_ujo_query~run_axis_query_symm is set to true).
  • Remove the denied members' values from the original query (see the sketch after this list):
    • If the entity is in the slicer, just remove the denied members' value from the total
    • If the entity is in an axis, look up the parent of the denied member in the axis and decrease it

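The subtraction step could look roughly like this (again a hypothetical sketch: the result table and its component names are illustrative, except SIGNEDDATA, which is the usual BPC measure field):

  * Hypothetical sketch: subtract each denied member's value from the
  * total of its parent in the original query result.
  LOOP AT lt_denied_result INTO DATA(ls_denied).
    READ TABLE ct_data ASSIGNING FIELD-SYMBOL(<ls_total>)
         WITH KEY entity = ls_denied-parent.
    IF sy-subrc = 0.
      <ls_total>-signeddata = <ls_total>-signeddata - ls_denied-signeddata.
    ENDIF.
  ENDLOOP.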

 

5. Conclusion


I wanted to give an example of the flexibility offered by SAP BPC as a planning tool, which gives you the possibility of implementing complicated scenarios such as the one described above.

The tool provides many ways to meet a business requirement, and it is up to the solution architect to choose the one that achieves the best result in terms of usability, performance and maintenance cost.
