
BPC Implementations - Why is it not only about the technology?

What companies often don't get about BPC implementations is that they are not only about solving a business problem using technology. Don't get me wrong: solving a business problem and using the right technology to do so is very important. But every implementation focuses on getting the right technology, so what is different about BPC implementations, and why are so many BPC implementations still not deriving the full benefits of the technology?

Why, after a BPC implementation, are companies still struggling with questions like these? Should we have considered the latest release, or different software? Why was our legacy system so flexible and able to do so much more? Should we have waited for the next release? Are our employees resistant to change? Is there too much internal politics in my company?

At the ASUG annual conference in May 2015:
David McDonough, Director of Enterprise Data Management at Dolby, and Suvir Shahani, Senior Managing Consultant, EPM Practice at TekLink International, Inc., will be presenting 'Gaining Performance, User Adoption and Self-service while rapidly upgrading across an existing landscape'.

David and Suvir will discuss how getting end users engaged early and investing in change management initiatives in parallel while upgrading from BPC 7.5 to BPC 10.0 helped Dolby derive the most benefit from BPC and exceed its target ROI.

Please stop by at ASUG to introduce yourself, and if you are on a journey to implement BPC, we hope to answer some of your questions.

I found the links below very informative, especially for those of you who are looking to upgrade to or implement BPC 10.1:

BPC 10.1 A Unified Model - ASUG recording and slides | https://www.asug.com/discussions/docs/DOC-37609
BI on Excel Roadmap - an ASUG webcast | http://scn.sap.com/community/businessobjects-analysis-ms-office/blog/2014/11/04/asug-bi-on-excel-roadmap-an-update


Incorporating WBS into the CAPEX model of the BPC 10.1 NW Extended Financial Planning RDS

The as-delivered capital expenditure planning process in the Extended Financial Planning RDS does not utilize the Work Breakdown Structure (WBS). WBS data is neither imported into the system from source systems nor generated during the project planning process. As developers of the RDS, we will use this article to discuss our thoughts on potential methods for modifying the RDS to incorporate WBS.

 

[Note: If you want to learn more about the Extended Financial Planning RDS, please visit SAP Service Marketplace - SAP Extended Financial Planning or register for the EPM RDS Academy 2015 at https://www.surveymonkey.com/s/878J92K]

 

The current CAPEX planning process in the Extended Financial Planning RDS has two main elements:

  1. Planning new capital projects. This is a two-step process. First, the project spending is proposed and approved at a high level. Second, one or more CARs (capital asset requests) attached to the project are proposed and approved. Each CAR contains budget data for up to 10 asset types. This is the lowest level of granularity used in the planning process.
  2. Updating existing capital projects. Based on the current integration process, limited data about current projects (original budget, spend-to-date and commitments) is imported into the system. All imported data from the source system is at the project level (i.e. no asset-level details). Using the delivered input form, the RDS user can then assign the remaining project budget to the 10 asset types over the forecast horizon.

 

If the customer desires to incorporate WBS and the source system integration is expanded to include WBS data, a design decision must first be made:

Option 1 – Incorporate WBS only into the existing-project aspect of the process (i.e. utilize WBS data for reporting on actuals and in the update process for existing projects). New projects would continue to be planned at the Project and CAR level.

Option 2 – Incorporate WBS into both aspects of the CAPEX process (new and existing projects).

 

 

The first option can be accomplished by modifying the structure of the PROJECT dimension, with limited impact on the delivered input forms and reports. The second option requires the addition of a new dimension (WBS), which entails material modification of the delivered input forms and reports. A clear understanding of the customer's needs should be obtained before proceeding with Option 2.

 

Implementation of Option 1

In this option, WBS data is being imported from source system(s) for existing projects (note: the expansion of the currently delivered integration capabilities is beyond the scope of this article). This data will be used for any desired reporting based on actuals and within the reforecasting process for these existing projects.

 

The delivered design of the PROJECT dimension is as follows:

[Image: 1.png]


The blue-shaded elements indicate the generic, sequentially numbered projects used for planning new projects. The green-shaded elements indicate the project IDs and descriptions of existing projects imported into the CAPEX model. (Note: the test script EG1_BPC101_BPD_EN_XX.doc provides further details on the design of the PROJECT dimension.) The blue-shaded elements are used exclusively in input forms and reports related to planning new projects, whereas the green-shaded elements are used exclusively in input forms and reports related to reforecasting existing projects.


The updated design would not require a change to the blue-shaded elements, but for existing projects a concatenation of the project ID and WBS element would be used. For example:

[Image: 2.png]

By concatenating Project and WBS, we avoid the addition of another dimension, which would require material modification of the delivered input forms and reports. Those input forms and reports related to actuals/existing projects will use the concatenated Project/WBS as the "Project", which may be somewhat cumbersome. However, we propose the addition of two property columns which can be used by the implementer to improve the usability of the input forms and reports by displaying the Project and WBS in a format familiar to the customer.
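
As a purely hypothetical illustration (the member IDs, property names and descriptions below are invented for this article and are not delivered RDS content), the concatenated members and the two proposed property columns might look like this:

ID               PROJECT_ID   WBS_ID    Description
P100045_WBS010   P100045      WBS010    Plant Expansion - Engineering
P100045_WBS020   P100045      WBS020    Plant Expansion - Construction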

[Image: 3.png]

As noted before, the enhancement of the integration capabilities to import the source data in a suitable manner is beyond the scope of this article.

 

Implementation of Option 2

In this option, WBS data is being imported from source system(s) for existing projects and WBS data will be generated for new
projects.  In other words, new projects will be planned not at the PROJECT and CAR level, but the PROJECT, WBS and CAR
level.

 

This requires the addition of a new dimension to the CAPEX model called WBS.  This dimension will have a structure similar to the PROJECT dimension:

[Image: 4.png]


The blue-shaded elements represent generic WBS elements to be used with the generic project numbers for planning new projects and WBS’s. The green-shaded elements represent the IDs and descriptions resulting from the importing of WBS data from the source system(s).  These elements are used for reporting of actuals and for updating existing projects and WBS’s. Note that unlike in Option 1, no changes are required to the PROJECT or other dimensions.


The main impact of this change is that the delivered input forms and reports need to be modified to accommodate the addition of the WBS dimension to the CAPEX model.   For example, the input form 20 CAPITAL (ASSIGNED) REQUEST.XLTX has a user selection area that is arranged as follows:

[Image: 5.png]

 

The design would need to be modified similar to the example below:

[Image: 6.png]

Similar changes are required for the other input forms and reports and the customer may desire additional reports focused on WBS analysis.

 

It must be mentioned that the addition of WBS to the selection area further increases the complexity of the user's task. The additional dimension may also have a negative effect on system performance (send data and refresh data times). The need for WBS when planning new projects should be carefully weighed with the customer before following the Option 2 path.

 

One final thought: the current design includes approval cycles first at the Project level, then at the CAR level. Would the addition of the WBS dimension make a third approval level necessary or desirable? Sounds like a topic for a future article.

 

We welcome all feedback and suggestions on this article as well as the Extended Financial Planning RDS.

Topics in BPC Sizing. Part 0: Preliminary Information

In this and subsequent blog posts we will explore several topics related to the sizing of the SAP BPC 10.0 NW application. In this first post (Part 0: Preliminary Information) we present a brief recap of the standard sizing guide and methodology, while subsequent posts will dive into an array of extended subjects. Topics anticipated for inclusion: expert sizing of CPU / SAPS requirements, a probabilistic approach to buffering these SAPS calculations, special considerations for a variety of customer scenarios, and empirical tests of simple hardware configurations.

 

Part 0: Preliminary Information


The primary source for sizing of the SAP BPC 10.0 NW application is the sizing guide available on the SAP service marketplace (see below).


http://service.sap.com/sizing > Sizing Guidelines > Analytics > EPM: SAP Business Planning and Consolidation NW & HANA 10.0

 

 

It is important to note: the sizing requirements contained within the BPC sizing guide are for the BPC application only, not for the entire BPC system. SAP BPC sizing must be done in two steps: you need to size the BW foundation first and then add the BPC requirements on top of this foundation. (Please refer to http://scn.sap.com/docs/DOC-60565 for details.)


For example, you will find in the sizing guide that the sizing for the simplest implementation class (in terms of complexity / number of concurrent users) of BPC 10.0 NW on HANA is: 900 CPU SAPS and 8 GB memory for the application server, and 700 CPU SAPS, 42 GB memory and 73 GB disk for the HANA database. But if you use these numbers as sizing for the entire BPC system, the system will most likely be undersized. The application server sizing amounts to just 1 CPU core, which is not enough to run a BPC 10.0 NW system. And even without populating the database with any business data, a newly-installed BW system will already use more than 42GB of memory on HANA.

 

 

Within the SAP BPC 10.0 NW sizing guide you will find two methods for sizing the planning functionality.

  1. The simplest method is to assess the customer's environment by level of complexity and user load, matching it to one of nine predetermined classes for which sizing recommendations already exist. Specifically, you will find recommendations for the application server SAPS and memory requirements, as well as database server SAPS, memory, and disk requirements.
  2. The more complicated method is to calculate the application server and database server requirements from a customized user profile distribution, where the customer specifies how many users are expected to be simultaneously running each of 10 predetermined user profile tasks.

 

In method 1 above, the customer's environment is first classified as belonging to one of three complexity classes – Categories 1, 2 or 3 – depending on whether the implementation is medium-sized, large, or a large enterprise deployment. The guide itself contains further details, where each such category is defined in terms of statistics such as the number of dimensions, the maximum number of dimension members, the number of transactional records in the environment, the script logic complexity, etc.


Next, the expected number of concurrent users is specified: 50, 150 or 300 users. Based on both the complexity category and the concurrent user count, the included tables provide the estimated application server and database requirements for each of the nine possible configurations. See below for several sample tables.

 

[Image: p0im1.jpg]


The above tables assume that the customer is using a combination of both the EPM Add-In frontend and the web frontend. A separate set of tables exists for customers using only the EPM Add-In frontend.


So for example for a Category 3 implementation of BPC 10.0 NW on HANA with 150 concurrent users using both the EPM Add-In and web frontends, the above tables recommend: 34,000 CPU SAPS and 30 GB of memory for the application server, and 31,000 CPU SAPS, 52 GB of memory and 108 GB of disks for the HANA database. Again, this is in addition to the sizing requirements of the BW foundation on which the BPC application sits.


Sizing via method 2 above is a refinement of method 1 in that the customer is given further control over the number of concurrent users and the user profile distribution itself, i.e. how many of the expected concurrent users belong to each of the different user profiles.


There are a total of 10 BPC user profiles presented in the sizing guide, and provided for each is a definition of the actions that such a user performs and the "think time" (or wait time) between each repeated cycle of such actions. See below for a sample.

[Image: p0im2.jpg]


The BPC sizing guide then provides the system requirements of a single user executing the tasks of each such user profile, where again the requirements are separated into one of three categories depending on the complexity of the implementation. See below for one such table describing the single-user application server requirements for each of the user profiles.

 

[Image: p0im3.jpg]


The total sizing requirements for the application server are then calculated by multiplying the single-user requirements above by the number of concurrent users belonging to the corresponding user profile, and then summing across all such relevant user profiles. There are similar tables for sizing both traditional and HANA database requirements as well. See the sizing guide for the full details and for example calculations.
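
Expressed as a formula (our notation, not the guide's), the total application server requirement is roughly: total SAPS = sum over all user profiles p of (Np x sp), where Np is the number of concurrent users assigned to profile p and sp is the single-user application server SAPS for that profile in the relevant complexity category; memory and database requirements are aggregated the same way from their respective tables. As a purely illustrative calculation (invented numbers, not taken from the guide): 100 users on a profile requiring 150 SAPS each plus 50 users on a profile requiring 200 SAPS each would total 100 x 150 + 50 x 200 = 25,000 SAPS.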


Note that sizing by method 1 is simply a special case of sizing by method 2, where the user profile distribution across 50, 150 or 300 users is already pre-determined (and hence the sizing calculations pre-performed).


Please see the sizing guide for further information about each of these sizing techniques, as well as for additional background information and for some similar details on the sizing of consolidation functionality.


While the preceding sizing methodology is convenient, it is not without its limitations. In the next post we will explore some of these limitations, and we will use this as a jumping-off point for further techniques and topics related to the sizing of the BPC 10.0 NW application.

How to consolidate manual adjustment entries in BPC?

Some customers have a business requirement to make manual adjustments to the automatic consolidation result. To do this, they configure an 'M' type data source (audit ID) specifically for this purpose and post journal entries against it to adjust the consolidation result.


In this scenario, there is a common requirement that manual entries posted at a lower consolidation group need to be reflected (consolidated) in the higher-level group.


Following is an example:


Group1 is the parent of Group2 and Group3. The manual adjustment in February 2015 is:

TIME     GROUP   RPTCURRENCY  ENTITY  INONE   CATEGORY  ACCOUNT  AUDITID    SIGNDATA
2015.02  Group1  USD          E100    I_NONE  Actual    A1000    Journal_M  1000
2015.02  Group2  USD          E100    I_NONE  Actual    A2000    Journal_M  100
2015.02  Group3  USD          E100    I_NONE  Actual    A3000    Journal_M  100

The customer expects the data of Group2 and Group3 to be consolidated into Group1 as below:

TIME     GROUP   RPTCURRENCY  ENTITY  INONE   CATEGORY  ACCOUNT  AUDITID    SIGNDATA
2015.02  Group1  USD          E100    I_NONE  Actual    A1000    Journal_M  1200 (1000 + 100 + 100)


The question here is: when executing the BPC consolidation business rules, only data sources (audit IDs) whose type is set to 'A' are taken into account. How can this requirement be achieved?

It is not well known that the property 'DATASRC_STAGE' of the AUDITID (data source) dimension can be used to achieve this.

How to make this configuration is not documented in the SAP Help, but it is described in SAP Note 2049932. This functionality is available since BPC 10 NW SP15.

Please note: there are some subsequent fixes for this functionality in later SPs; refer to Notes 2073249 and 2098172 and KBA 2163803.

Enjoy.

Topics in BPC Sizing. Part 1: Limitations of the Standard Sizing Guide

In this series of blog posts we will explore several topics related to the sizing of the SAP BPC 10.0 NW application.

 

 

To see the first post (Part 0: Preliminary Information), in which we present a brief recap of the standard sizing guide and methodology, please click here: http://scn.sap.com/community/epm/planning-and-consolidation-for-netweaver/blog/2015/05/18/topics-in-bpc-sizing-part-0-preliminary-information

 

 

In this post (Part 1: Limitations of the Standard Sizing Guide) we will explore some of the sizing challenges raised by the flexibility of the BPC application itself.

 

Part 1: Limitations of the Standard Sizing Guide


The BPC application itself is extremely flexible. In addition to giving customers control over the basic implementation structure (e.g. the dimensions, the structure of their hierarchies, all data access profiles and rules, etc.), BPC provides a wealth of powerful customization options (by way of member formulas, script logic, SQE and WriteBack BAdI's, etc.). Application users as well have a tremendous amount of flexibility in their reporting and query design. This flexibility is an important part of BPC's overall power and utility.


This flexibility, however, raises challenges for sizing. These sources of flexibility have an impact on the overarching complexity of the application, and thus necessitate special consideration in terms of sizing. Some such considerations include:

 

  • The level of complexity of the calculations taking place on the application server and the database.

    There is a significant amount of variability in calculation complexity as governed by the use of custom member formulas, script logic, BAdIs, etc. And even within each of these customization methods there is a large variance in complexity level, e.g. a custom member formula can be anything from a simple sum to a complex nested formula spanning multiple dimensions.

    The sizing guide attempts to divide the complexity of a BPC implementation into one of three categories, but there are simply too many dimensions along which BPC can be customized for such a categorization to be exhaustive.

  • The nature and complexity of user profiles.

    The sizing guide addresses sizing in terms of user profiles: profiles that capture a particular user activity in the system. The per-user demand of each profile, combined with the distribution of concurrent users across these profiles, determines the overall sizing requirements of the BPC application.

    But these profiles are not exhaustive of the tasks that can be performed in a BPC system. And they do not take into account the varying complexity of the tasks included; again, the complexity is divided into only three categories. For example, the system demands of the user profile for "Run Report (EPM Add-in Client)" depend heavily on the complexity of the report being run, and this may not be adequately captured by a single user profile.

    And as we will explore in a future section, the "think time" – i.e. the time between repetitions of a user profile's tasks – has a very important impact on the CPU requirements associated with a given user profile. The think time is baked into the user profiles described in the BPC sizing guide, and the guide provides no way to customize these values when computing the sizing.

  • Use of the BW vs. HANA MDX engines.

    For BPC 10.0 NW on HANA systems, the choice of an MDX engine has an impact on the sizing requirements of the application server and of the database. Opting for the BW MDX engine will result in a heavier CPU requirement in the application server, while opting for the HANA MDX engine will shift some of that computational burden to the database. Currently the sizing guide does not distinguish between these two customization scenarios.

  • The complexity of Excel-side calculations, e.g. through formatting or the use of complex VBA logic.

    The performance of a report is governed not only by its complexity from a query standpoint, but by its client-side complexity as well. Reports of significant Excel complexity (formatting, VBA usage, etc.) may see poor performance regardless of the application server and database sizing. While this is not a traditional concern of system sizing, it is a customer pain point worthy of awareness.

  • The data volume level in reporting and planning.

    The performance of reporting and planning can also be impacted by the volume of data being sent over the network. Again this is not a problem that can be alleviated by adjusting the sizing of the client or the server side systems, but it is a topic worth understanding when assessing the overall performance of the system.


Many of the considerations above illustrate limitations of the standard BPC sizing guide.

 


The guide itself is further hindered by the fact that the relevant environment and testing details are not publicly available. So while the guide may describe the complexity level of each category, and while it presents some general guidelines around the complexity of the reporting objects that were tested, it is difficult to assess exactly how well the template environment and objects match up to those being sized. In other words, actual customer reports could be very different from those used in the standard sizing guide. And with no visibility into the raw computations used to produce the sizing guide's tables, it is impossible to tweak individual parameters (like the think time) to further customize the sizing results.

 


The BPC sizing guide is a good place to start in sizing the BPC 10.0 NW application, but should any of the above or further considerations arise it may be necessary to engage in some expert sizing techniques, i.e. sizing tailored directly to the customer's scenario.

 


In the next section we will explore the basic logic behind using a sandbox environment to estimate CPU sizing, both in terms of the number of cores and in terms of a rough SAPS measurement.

Logic by HANA stored procedure - SAP BPC on HANA

I am not sure whether this information has already been shared in another blog or discussion. If you already know how to use HANA stored procedures (SPs) with BPC, there is no need to read further.

Let me explain how we overcame BPC performance issues with a HANA stored procedure.

The project covered: (1) 4-month rolling forecasting, (2) yearly planning, (3) long-range planning, and (4) operating income simulation by LE condition. The industry was petrochemicals and the company is located in Korea.

During the design phase, the customer strongly requested a performance test in an isolated environment with PA data. We took 1 million records as a sample and tested the same logic implemented in BPC script logic, ABAP, and a HANA stored procedure. The results were as follows.

Taking BPC script logic as the baseline at 10 seconds, ABAP came in at 7 and the HANA SP at 1. The test logic was fairly simple; we believe the gap would grow in proportion to the complexity of the logic.

 

Here is how we set up the link between the HANA SP and SAP BPC.

 

 

1. Once you have finished the HANA SP, you need to create a DB proxy.

- Assuming you already know how to create the SP, you need to make the output parameters the same as the module you created in BPC.

- In HANA Studio, you need to install the ABAP tooling; this enables you to create the DB proxy.

 

[Image: ABAP_02.jpg]

 

- Do not forget to synchronize and then activate.

2. Once you have created the DB proxy, you need to create the BAdI.

- Assuming you already know how to work with BPC BAdIs:

- Create Filter

 

[Image: BADI.jpg]

- Create Implementation

[Image: BADI_Implementation.jpg]

3. Activate the BADI implementation.

4. Create LGF file in BPC Admin

[Image: Logic Scripot_01.jpg]

 

 

You need to follow the steps above to avoid validation errors.


Besides the performance benefit, you get the following advantages.

1. Debugging the stored procedure - Whether you implement the Classic model (where the model supports a single key figure) or the Embedded model (multiple key figures), you can build an internal table with multiple key figures (just like PA table CE1XXXX) and verify the logic you built in debug mode. It is easier!

2. Access to all the ECC tables you need - you can access them from the SP after staging the data in a DSO. For all LE events, you can design the integration between LE and FI/CO data for BPC planning; for example, you can design an MRP explosion for planning using ECC tables.

3. ANSI SQL is more widely known and easier to maintain.

 

If anybody knows another way to run HANA SP with SAP BPC, let me know. ^^;;

SAP BPC Introduction with BW

Dear SAP Users,

I have tried to prepare a document that introduces and illustrates SAP Business Planning and Consolidation (BPC), also with respect to BW.

I hope this document will help beginners and BW consultants who are looking to explore a BPC on BW career path.

Do give your inputs and suggestions to improve and collaborate more in this space.

 

SAP Business Planning and Consolidation ( BPC )

SAP BusinessObjects Planning and Consolidation (formerly OutlookSoft) is a corporate performance management tool that can cater for all types of planning, consolidation, forecasting and budgeting, from simple, small processes to complex multi-layer processes, while also providing easy-to-use reporting.

SAP BPC offers a robust, multi-user platform which is fully integrated with Microsoft Excel.

The multi-dimensional nature allows users to "slice and dice" the data according to their needs.

As SAP BPC utilizes OLAP technology, data is available as soon as it has been entered – there is no need to aggregate data.

It enables an organization to keep track of its progress through a budget/forecasting process.

 

Planning and Consolidation is part of Enterprise Performance Management

[Image: Planning and Consolidation is part of Enterprise Performance Management.png]

SAP BPC Versions and Requirements

SAP BPC 7.5, version for Netweaver, requires SAP BW 7.0

SAP BPC 10.0, version for Netweaver, requires SAP BW 7.3

SAP BPC 10.1, version for Netweaver, requires SAP BW 7.4 SP05

 

 

 

 

 

 

SAP BPC Architecture

[Image: SAP BPC Architecture.png]

The Excel interface of the EPM add-in can be used to access data from multiple sources. The data retrieval options are the same, regardless of the source. However, when used for the planning and consolidation application, additional features are available, including data input and Data Manager, for example.

 

Data Manager is used to import data and run planning functions such as copy, delete, and move.

 

Since it is easy to use, IT does not always need to be involved in the configuration of SAP Business Objects Planning and Consolidation.

[Image: easy to use.png]

 

 

BPC Objects with BW objects Comparison

 

SAP BPC        SAP BW
Environment    InfoArea
Model          InfoCube
Dimensions     InfoObjects
Member         Characteristic Value
Property       Attribute
Hierarchy      Hierarchy
SignedData     Key Figure
Measures       Calculated Key Figures

 

 

 

 

High Level BPC – BW Solution

[Image: High Level BPC – BW Solution.png]

 

 

 

 

It is necessary to understand the key figure-based and account-based models. At a high level, the comparison is shown below.


Key Figure to Account Based Modelling

[Image: Key Figure to Account Based Modelling.png]

Key Figure Cube to BPC Cube

[Image: Key Figure Cube to BPC Cube.png]

 

 

 

Modelling Infocubes

[Image: Modelling Infocubes.png]

 

 

 

 

 

Data Load High Level

 

Flat File Scenario

[Image: Flat File Scenario.png]

 

 

Master Data Import

[Image: Master Data Import.png]

 

 

BW to BPC Scenario

[Image: BW to BPC Scenario.png]

 

 

 

 

My main focus is the BPC on BW integration and the technical architecture and details.

Therefore, my next step will be to illustrate and explain the BPC on BW integration, followed by a step-by-step explanation of each topic, such as modelling, loading, and so on.

 

Good Luck and hope this helps.

BW Tables for BPC 10.1 (Embedded)

Hi,

 

On many occasions we have run into inconsistencies within the SAP BPC 10.1 Embedded system, such as:

 

 

- Not being allowed to delete BPFs

- Not being allowed to delete BPF instances

- Some action buttons not running if the administrator does not have BPC Web set to English


Therefore, the only way I found to delete BPFs and BPF instances is to delete the components directly via the BW tables.


Note that the BW transactions and tables for the classic model are not the same as those for the embedded model.


After searching, I did not find any documentation on the tables for the embedded model. But, as the saying goes, "he who seeks, finds": I eventually tracked down some of the tables that apply to the embedded model,


which I want to share today.


I hope they are of use.

 

 

Table              Description
RSBPC_EN_BCSPOA    SBC E&N BCS Send Document Fiel
RSBPC_EN_DEFPRE    E&N Master User Preferences for De
RSBPC_EN_EMAILPR   E&N Email Preferences
RSBPC_EN_MYMSGPR   POA SBC Event & Notification My Mes
RSBPC_EN_PREF_T    E&N User Preferences Item Descript
RSBPC_EN_USRPRE    E&N User Preferences
RSBPC_EN_V_DEFPR   E&N: User Channel Preferences
RSBPC_WEB_CONN     BPC Web Connections
RSBPC_WEB_DISP     BPC Web resource handler dispatch
RSBPC_WEB_UP       BPC user preference
RSBPC_WEB_V_DISP   BPC: Web control program
RSBPC0_AUTO_UPDT   BPC: Autoupdate
RSBPC0_PARAM       BPC BW Extension: Global Parameter
RSBPC0_PARAM_APP   BPC Extension: Environment/Model P
RSBPC0_PARAM_DEF   BPC BW Extension: Default Paramete
RSBPC0_TR_CONT     BPC: request for transport deletio
RSBPC0_TR_DELETE   BPC: request for transport deletio
RSBPCA_APPL        BPC model information
RSBPCA_APPL_IPRV   BPC information about infoprovider
RSBPCA_APPLT       BPC Model description text
RSBPCA_APPSET      BPC Environment information
RSBPCA_APPSETT     BPC Environment text table
RSBPCA_ENVM        BPC Environment information
RSBPCA_ENVMT       BPC Environment text table
RSBPCB_ACTION      BPC: Activity action
RSBPCB_ACTION_CV   BPC: BPF action context
RSBPCB_ACTION_PA   BPC: BPF action parameter
RSBPCB_ACTIONT     BPC: Activity action text
RSBPCB_DRV_DIM     BPC: BPF Driver Dimension
RSBPCB_EMAIL       BPC: BPF Email Template
RSBPCB_EMAILT      BPC: BPF Email Template Text
RSBPCB_INST        BPC: BPF Instance
RSBPCB_INST_OWN    BPC: BPF Instance Owner
RSBPCB_MEM_ATTR    BPC: BPF Driver Dimension Attr
RSBPCB_MEM_HIE     BPC: BPF Driver Dimension Hier
RSBPCB_REPORT      BPC: BPF view for report
RSBPCB_RGN_ACC     BPC: BPF Step Region Access
RSBPCB_SOBJ        BPC: BPF Semantic Object
RSBPCB_SOBJ_LK     BPC: Semantic object link
RSBPCB_SOBJ_LKP    BPC: Semantic object link para
RSBPCB_SOBJ_LKPT   BPC: Semantic object link para
RSBPCB_STEP        BPC: BPF Step
RSBPCB_STEP_MAP    BPC: BPF Step Map
RSBPCB_STEP_RGN    BPC: BPF Step Region
RSBPCB_STEP_ROLE   BPC: BPF Step Role
RSBPCB_STEPT       BPC: BPF Step Text
RSBPCB_TEMPLATE    BPC: BPF Template header
RSBPCB_TMPL        BPC: BPF Template
RSBPCB_TMPL_ACC    BPC: BPF Template Access
RSBPCB_TMPLT       BPC: BPF Template Text
RSBPCC_F_DAO_V     BPC feed DAO configuration
RSBPCC_FEED        BPC Comment feed table
RSBPCC_FEED_COM    BPC Comment Feed comment table
RSBPCC_FEED_DAO    BPC Comment Feed DAO table
RSBPCE_DAP_DET     BPC: Data access profile details
RSBPCE_DAP_DIM     BPC: DAP Dimension & Hierarchy
RSBPCE_MEMACCESS   BPC: Dimension member access table
RSBPCE_PROFILE     Profile/Role Mapping
RSBPCE_PROFILET    BPC: Profile text
RSBPCE_TEAM        BPC: Team
RSBPCE_TEAMT       BPC: Team text
RSBPCE_USER_TEAM   BPC: User team
RSBPCF_APSRV_DIR   BPC File Service Application Server Direct
RSBPCF_CLUSTER     BPC File Service Document Cluster Table
RSBPCF_DOC         BPC File Service Document Master Table
RSBPCF_DOC_ACC     BPC: File Service Security Check Table, on
RSBPCF_DOCMAP      BPC File Service Document Directory Path M
RSBPCF_DOCTREE     BPC File Service Document Tree Table
RSBPCK_AO          BPC IP Extension: Primary Object Table
RSBPCK_AO_R        BPC IP Extension: Workbook relationship
RSBPCK_AOT         BPC IP Extension: Workbook description
RSBPCPS_ALIAS      BPC: Persistence resource alias
RSBPCPS_TEAM_RES   BPC: Team Folder resource
RSBPCR_ACT_PROV    BPC: Activity workspace action provider
RSBPCR_APPS_PROV   BPC: Resource related appset handler
RSBPCR_CNTXT_HDR   BPC: Resource context header
RSBPCR_CNTXT_ITM   BPC: Resource context item
RSBPCR_FILE_UPLD   BPC: Upload files
RSBPCR_MODEL_DAO   BPC: Model DAO configuration
RSBPCR_OWNER_PRV   BPC: Owner authority provider
RSBPCR_PROPERTY    BPC: Resource Property
RSBPCR_RES_PROVD   BPC: Resource provider
RSBPCR_RESOURCE    BPC: Persistence Resource
RSBPCR_V_ACT_PRV   BPC: Action provider for workspace
RSBPCR_V_APS_PRV   BPC: Appset control program
RSBPCR_V_OWN_PRV   BPC: Authorization check for owner
RSBPCR_V_RES_DAO   BPC: Resource DAO
RSBPCR_V_RES_PRV   BPC: Provider for resource type
RSBPCR_V_VIW_PRV   BPC: View provider
RSBPCR_VIEW        BPC: Persistence Resource View
RSBPCR_VIEW_PROV   BPC: View provider
RSBPCR_VIEW_VALS   BPC: Resource View Values
RSBPCR_VIEWT       BPC: Persistence Resource View Text
RSBPCU_DA_CTL      BPC-PAK: Data Audit Control Table
RSBPCW_APP_STATU   Application lock status cache
RSBPCW_EMAIL       BPC Work Status - Email Setting Table
RSBPCW_EMAIL_CFG   Work Status Email Configuration
RSBPCW_EMAILT      BPC Work Status - Email Setting Text Table
RSBPCW_LCK_DIM     BPC Work Status - Lock Dimension Table
RSBPCW_MODTAB      Model Table: Work Status
RSBPCW_STATCODE    BPC Work Status - Status Code Table
RSBPCW_STATCODET   BPC Work Status - Status Code Text Table
RSBPCW_STATCOMP    BPC Work Status - Component Enablement Sta
RSBPCW_UPD_LOG     BPC Work Status - Locks Update Log Table
RSBPCW_WK_STATUS   BPC Work Status: Meta Info for Work Status

 

Best Regards,

Juan Pablo Chaparro


Important T-Codes Related to BPC on the BW Side

I'm posting some important T-codes on the BW side that will certainly help while working with BPC.

Transaction Code   Description
UJ_VALIDATION      Validation Maintenance (old version - currently implemented using CONTROLS in BPC)
UJ0_IMG_01         BPC Parameters
UJ0_IMG_02         BPC AppSet Parameters
UJ0_IMG_03         Model Parameters
UJ00               BPC Configuration
UJBPCTR            Transports - Create Request
UJBR               P&C: Backup & Restore Tool
UJETC              UJE Transaction Data Check
UJFS               File Service
UJKT               Script Logic Tester
UJLD               Auto-update MSP file upload
UJQ0               Query Runtime Parameters
UJR0               Writeback Runtime Parameters
UJSTAT             Performance Statistics Report
UJUT_DATA          Unit Test Data Maintain

 

Thanks and Regards,

Anurag Yadav

How to perform a volume test for BPC reports using statistics information

As we know, the BPC NW version can collect statistics in transaction UJSTAT, which contains very detailed runtime information about a report or input form. Besides helping to analyze query performance bottlenecks, this can also be used to perform volume tests. In the following sections, I'll describe the steps in more detail.


1. Preparation

Firstly, we need to enable the BPC_STATISTICS parameter and run the reports/input forms we want to include in the volume test, so that UJSTAT can collect statistics for them.


For how to enable this parameter, refer to SAP Note 1708178 - How to turn on statistics monitoring in BPC 10 NW.


After the statistics information has been collected, what we need to do is trigger the reports/input forms according to the volume test plan. For that, we define a user interface based on UJSTAT.


2. User Interface

You can implement your own user interface for this volume test; here we chose to customize the UJSTAT program in order to leverage users' familiarity with the tool:


First, copy the UJSTAT program (UJ0_STATISTICS_RPT) to a custom package and add the new function to the copy. As shown in the following figure, we add a new button, "Run all queries in batch", to trigger the test.

Note: Don't change the code of UJSTAT directly, since it will be overwritten when the system is upgraded.


[Image]


After selecting the queries we want to run, click the button "Run all queries in batch"; a dialog is shown to let the user enter the parameters:


[Image]

Parameters:

Think time:

The time that a user waits between performing successive actions is known as the think time.


Iteration:

An iteration is a single execution of the test scenario.


Concurrent user:

The number of users involved in the test.


Pace between users:

The time between different users joining the test. For example, if the pace is 180 seconds and user 1 starts the test at 9:00, user 2 will start at 9:03.

 

[Image]

(R*T: report time, TT: Think time)
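
As a rough estimate derived from these definitions (and ignoring any queuing on the server), the elapsed time until the last user finishes is approximately pace x (concurrent users - 1) + iterations x (report time + think time). For example, with a 180-second pace, 10 users, 5 iterations, a 60-second report time and a 120-second think time, the test runs for roughly 180 x 9 + 5 x (60 + 120) = 2,520 seconds, i.e. about 42 minutes (illustrative numbers only).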

 

After specifying the parameters, click the "RUN" button to start the test. You can monitor the running test in SM50:

[Image]


3. Key points

In order to simulate different users, we use asynchronous RFC (aRFC) to trigger different tasks for each user.

[Image]

During the think time, the dialog work process is released for other use.

4. Next enhancements

  • We are considering adding DM package support to this volume test tool.

Handling Locking in an Offline Application

Introduction

EPM offline mode is a key feature when it comes to procuring input from users who do not have the EPM Add-in installed. However, the offline mode comes with its own set of challenges.

 

Scope

This blog covers the offline mode provided by the EPM Excel Add-in, which is required when converting a file to offline mode - either manually or via the Distribution functionality.

 

Software Components

- SAP BW7.31

- SAP BPC 10.0 Version for SAP NW

- EPM Excel Addin - Version 10 SP21 .NET4

- MS Excel 2013 (32-bit)

 

Challenge

The offline mode overrides the protection provided in the application and disables the permitted Excel features.

 

 

Additional Details

SAP Note 1738690.

 

Case

Let us say we have an input form with a few thousand rows with auto-filter enabled and EPM protection applied with the permission to use the auto-filter. It is to be sent to people in the field to capture the forecast closer to the customer for better accuracy and visibility. These representatives do not have the EPM Add-in. The input form is sent to them after turning on offline mode.

 

Challenge

Even though the representatives are able to input numbers, they are not able to use the auto-filter, which is disabled by the protection forced by offline mode.

 

Cause

When a report or input form is switched to offline mode, it forces an additional protection on the worksheet which overrides the existing one. The default password for offline mode is "PASSWORD" - refer to SAP Note 1728690. Even if the same password is supplied under EPM protection or in your VBA code, the template/form/report stays locked except for the data entry cells.

 

Approach to Resolve

A small VBA macro can be executed when the file is opened to remove the offline mode password and re-apply the protection with the required permissions. In this case, we need the auto-filter to be enabled.

 

Sub Offline_Tackling()
    ' Check the fixed reference cell for the offline condition formula
    If InStr(CStr(Cells(7, 3).Formula), "_epmOfflineCondition_") > 0 Then
        ' Remove the protection applied by offline mode
        ActiveSheet.Unprotect Password:="PASSWORD"
        ' Re-apply the protection, this time allowing the auto-filter
        ActiveSheet.Protect Password:="PASSWORD", AllowFiltering:=True
        ActiveSheet.EnableAutoFilter = True
    End If
End Sub

What this macro does is look at a cell of the form (any fixed cell that can be referenced for the offline condition formula) to check whether the workbook is offline or not. It can be executed when the workbook is opened or at the user's discretion. It enables the auto-filter while the rest of the protection remains intact, making the template a little more flexible to use.
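
For example, a minimal sketch of hooking this into the workbook open event (place it in the ThisWorkbook module; this assumes you are happy to run the check on every open):

Private Sub Workbook_Open()
    ' Re-apply the offline protection handling whenever the file is opened
    Call Offline_Tackling
End Sub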

 

When the template/form is brought back online, the original password is restored and the data can be saved successfully.

How to Display Formatted Save Results

In this blog, I would like to share one simple solution I have implemented with a couple of HTML tags in order to display formatted custom messages after the save operation. Maybe some of you are already aware of it, but I thought it was worth sharing.

 

In my recent implementation, I was expected to display customized success or failure messages to users after they perform a save. All save-related calculations and validations are performed in a write-back BAdI, and the requirement was to display details about them as formatted text; for example, the user should see a 'No Records are stored!' message when validation fails.

 

For this, the ET_MESSAGE exporting table parameter of the write-back BAdI's Pre-Process method can be used by populating message rows in the BAdI logic. These messages are displayed in the Save Results log, in the Error Message block.

 

Additionally, you can use simple HTML markup to highlight a specific message (when tested, we observed that this screen supports HTML tags).

 

For example, if you are rejecting records in your BAdI logic and want to show specific text for such cases to users in red and bold, you can use markup like:

 

<b><font color="red">No Records are stored!</font></b>

 

which results in:

[Image: 1.png]

Sample Code:

DATA: ls_message TYPE uj0_s_message.

.

.

.

ls_message-message = '<b><font color="red">No Records are stored!</font></b>'.
ls_message-msgid   = 'FAILED'.
ls_message-msgno   = '01'.
ls_message-msgty   = 'E'.

APPEND ls_message TO et_message.

 

Furthermore, such messages can be maintained in a message class (transaction SE91) to make them reusable.

 

The Save Results error message log with some HTML formatting makes the save output more descriptive and appealing.

 

References:

Customize ET_MESSAGE and ET_ERROR_RECORDS

Want to show my own Error message after sending data through Input Schedule

 

I am still looking for a way to replace the 'Error Message' header text with 'Message', to avoid a misleading header when customized messages report successful operations.

 

---Tested on SAP EPM SP20, SAP BPC NW 10---

 

If you have any suggestions, comments or inputs, please do share.

 

Thanks and Regards,

Prashant Vankudre

How To Download the EPM Add-In for SAP BPC

Hi

 

To download, follow these steps.

 


  • Click on Software Downloads

[Image: Imagen1.png]

  • Click on Support Packages & Patches

[Image: Imagen2.png]

  • Click on Alphabetical List of My Products

[Image: Imagen3.png]

  • Select the letter B

[Image: Imagen4.png]

  • Search for and select SAP BPC for SAP NetWeaver

[Image: Imagen5.png]

  • Click on your version

[Image: Imagen6.png]

  • Click on Entry by Component and select the component to download

[Image: Imagen7.png]

[Image: Imagen8.png]

[Image: Imagen9.png]

 

I hope you like it


Best Regards!!!

Juan Pablo Chaparro

Do you use SAP BusinessObjects Analysis, edition for Microsoft Office and EPM Add-in?

Both products allow you to connect to SAP BW InfoProviders and other datasources, each one with its own functionalities and characteristics. This has led many users to use both tools together.

 

The new version of Analysis Office, version 2.1, comes with the EPM Add-in integrated into it.

 

Analysis Office and EPM Add-in are still separated in two different tabs (DataManager functionalities also still come in a third separate tab), but all three tabs are now more compatible than ever.

 

There is one important point to call attention to: the version of the EPM Add-in (EPM plug-in) that comes with Analysis Office 2.1 does not support the EVDRE function.

 

Please review this great blog for further information Analysis Office - BI and Planning Clients Convergence ASUG Webcast

 

The help files are located here: SAP BusinessObjects Analysis, edition for Microsoft Office 2.1

Executing Data Manager Package Silently (No Prompts)

Introduction

This blog post does not describe a new approach; it is an attempt to explain the process of running Data Manager Packages without any prompts. It delves a little deeper into the concept of the local response file and how to leverage it for various purposes.

 

Scope

The scope of this blog is to provide a way to run a Data Manager Package without any prompts. This is especially applicable in cases where a user refreshes on a predefined context and then wishes to trigger some post-processing - say, via script logic.

 

Software Components

- SAP BW7.31

- SAP BPC 10.0 Version for SAP NW

- EPM Excel Addin - Version 10 SP21 .NET4

- MS Excel 2013 (32-bit) on Windows 7 machine

 

Additional Details

Referenced link:

How to deal with BPC Data Manager packages programmatically

 

Case

Let us say we have an input form where a user enters a value for one month of the year and then triggers a script logic to copy the same value across all other months of that year. We will be referring to a Data Manager Package named Copy to remaining Periods, which contains the script logic to perform the copy. The input for this DMP is the value of a month (a base-level member here) of the TIME dimension.

 

Explanation

Every time you run a DMP in dialog mode, selections have to be entered as requested before it can be run. When you execute a DMP manually for the first time, an XML file named DMUserSelection.xml is created; the default location of the file is:

<USERPROFILE>\Documents\PC_NW\<DOMAIN>\<user name>\AppInfo\<Environment>\<Model>\DataManager\<user name>\

 

This XML contains the details of all the DMPs and their selections for a given model in the environment. Now let us take a look at the content of this XML file:

[Image: DMUserSelection.jpg]

Every package's response is captured between "<Package" and "</Package" as highlighted above. This XML will contain the latest selections for that DMP on that machine. That is why, the next time you execute the DMP, you see your selections automatically filled in based on the selections made last time. In this case, the package has been executed for TIME = 2017.01.

 

Now, if we can somehow pass this XML response to the DMP, we can avoid the user prompts and execute it silently. VBA is capable of generating text and XML files, and we will leverage that for our purpose. Perform the following steps:

- Copy the xml code between "<Answer>" and "</Answer>" of the highlighted xml code in a notepad or wordpad or Excel

- Replace all "&lt;" with "<" and all "&gt;" with ">"

 

Here is what the code will look like:

Copy to remaining Periods{param_separator}<?xml version="1.0" encoding="utf-16"?><ArrayOfAnswerPromptPersistingFormat xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">  <AnswerPromptPersistingFormat>    <_ap>      <Name>%SELECTION%</Name>      <Values />    </_ap>    <_apc>      <StringListPair>        <str>TIME</str>        <lst>          <string>2017.01</string>        </lst>      </StringListPair>    </_apc>  </AnswerPromptPersistingFormat></ArrayOfAnswerPromptPersistingFormat>

Now notice that the top two lines contain quote marks, and VBA will not be able to read quotes within quotes, so we will replace them with a variable called vQuote. We will create a function to generate a local response XML based on the DMP ID, dimension and selection. This is what the VBA function will look like:

Private Sub CreateLocalResponseXMLFile(iFileName As String, iPackageId As String, iDimName As String, iTimeId As String)
'Local declarations
Dim vQuote As String
vQuote = Chr$(34)
'Close any open files
Close
'Open XML file
Open iFileName For Output As #1
'Populate XML file line by line
Print #1, iPackageId & "{param_separator}<?xml version=" & vQuote & "1.0" & vQuote & " encoding=" & vQuote & "utf-16" & vQuote & "?>"
Print #1, "<ArrayOfAnswerPromptPersistingFormat xmlns:xsi=" & vQuote & "http://www.w3.org/2001/XMLSchema-instance" & vQuote & " xmlns:xsd=" & vQuote & "http://www.w3.org/2001/XMLSchema" & vQuote & ">"
Print #1, "  <AnswerPromptPersistingFormat>"
Print #1, "    <_ap>"
Print #1, "      <Name>%SELECTION%</Name>"
Print #1, "      <Values />"
Print #1, "    </_ap>"
Print #1, "    <_apc>"
Print #1, "      <StringListPair>"
Print #1, "        <str>" & iDimName & "</str>"
Print #1, "        <lst>"
Print #1, "          <string>" & iTimeId & "</string>"
Print #1, "        </lst>"
Print #1, "      </StringListPair>"
Print #1, "    </_apc>"
Print #1, "  </AnswerPromptPersistingFormat>"
Print #1, "</ArrayOfAnswerPromptPersistingFormat>"
'Close XML file
Close #1
End Sub

Notice that "2017.01" is replaced by iTimeId so that the desired value can be passed. Let us say that we want to pass the selection as the first month of the next year every time we execute it. We will have a DMP that triggers a script logic to copy the value from one period to all other periods. We will generate and save the local response file under "USERPROFILE\Documents\". The macro will check for an old response file and delete it before creating the new one.

 

Here is what the macro will look like:

Sub Copy_to_Periods()
Dim cPackageGroupId As String
Dim cPackageId As String
Dim cDimName As String
Dim cTimeId As String
Dim myFolderName As String, myFileName As String
Dim vFileName As String
'General type declarations
Dim oAutomation As Object
Set oAutomation = CreateObject("FPMXLClient.EPMAddInAutomation")
Dim oAutomationDM As Object
Set oAutomationDM = CreateObject("FPMXLClient.EPMAddInDMAutomation")
Dim oPackage As Object
Set oPackage = CreateObject("FPMXLClient.ADMPackage")
cTimeId = Year(Now()) + 1 & ".01"
cDimName = "TIME"
cPackageGroupId = "Your_Package_Group_ID"
cPackageId = "Copy to remaining Periods"
'Declare package
With oPackage
    .Filename = "/CPMB/DEFAULT_FORMULAS" '<----Enter the process chain ID
    .GroupID = cPackageGroupId
    .PackageDesc = ""
    .PackageId = cPackageId
    .PackageType = "Process Chain"
    .TeamID = "Your_team_ID"
    .UserGroup = "Your_user_Group"
End With
'Path of the local response XML file
vFileName = Environ$("USERPROFILE") & "\Documents\LocalResponse.xml"
'Locate and delete old local response files
'Folder to delete files from
myFolderName = Environ$("USERPROFILE") & "\Documents\"
myFileName = Dir(myFolderName & "*.xml")
'Delete all XML files found in the folder (old response files)
Do While myFileName <> "LocalResponse.xlsx"
    If myFileName = "" Then Exit Do
    Kill myFolderName & myFileName
    myFileName = Dir
Loop
'Create the local response XML file
Call CreateLocalResponseXMLFile(vFileName, cPackageId, cDimName, cTimeId)
'Run the package silently, passing the local response file
Call oAutomationDM.RunPackage(oPackage, vFileName)
End Sub

Once this macro is executed, you can look for the generated XML file for the local response under the predefined folder. On opening that file, you will see the desired value that we wanted to pass.
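
As a small follow-up sketch (not part of the original approach), a common pattern is to refresh the active sheet before triggering the silent package so that the script logic works on the latest saved data; this reuses the same FPMXLClient objects created above:

Sub Refresh_Then_Copy()
    Dim oEPM As Object
    Set oEPM = CreateObject("FPMXLClient.EPMAddInAutomation")
    'Refresh the active sheet so the DMP runs against current data
    oEPM.RefreshActiveSheet
    'Run the silent Data Manager Package defined above
    Call Copy_to_Periods
End Sub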

[Image: Generated XML.jpg]

This is a very simple case, but I hope it helps in understanding the basics of the local response XML and how to use it.


How-To Guide: Dynamically Enable FPMXLClient reference


Purpose of the document

 

If you're going to create BPC 10 or BPC 10.1 reports and input schedules that use the EPM API for functions like refreshing sheets or saving data, you may find that Excel throws a VBA error when you try to use methods from the EPM API. The reason for this is a missing FPMXLClient reference. The same behavior can be experienced when users of the reports or input schedules execute them for the first time, unless they activate the reference in the VBA editor. To avoid this behaviour, the following guide provides a possible solution.

 

Step by Step Procedure

 

Activating FPMXLClient


If you are going to create an input schedule or report in BPC using the EPM Add-In and you plan to use the VBA API provided, you have to enable the FPMXLClient reference. You can achieve this by opening the VBA editor (press ALT + F11, or activate the developer tools in the MS Excel options) and choosing Tools --> References ... from the menu. Activate the FPMXLClient reference by setting the flag shown in the following screen and pressing OK:

[Image: Ref.png]

 

After doing this you are able to use the EPM classes and methods within VBA.


Caution:


Any user who is going to use an input schedule or report which includes elements of the EPM API will have to activate the FPMXLClient reference. Otherwise VBA errors will occur during the usage of the linked functionalities.

 

 

Creating VBA


After activating the reference, you will have to create VBA code which checks and enables the reference dynamically any time the input schedule or report is executed - independent of user or machine.


1. Create Module


The first thing you have to do is create a module within your VBA project. Simply right-click on your VBAProject and choose Insert --> Module.

 

2. Create VBA Coding


Copy the following VBA code in your newly created module:


----------------------------------------------------------------------------------------------------------------------------------------------------------------

Private Const cVerweisname As String = "FPMXLClient"
Private Const cEPMDateiname As String = "C:\Program Files (x86)\SAP BusinessObjects\EPM Add-In\FPMXLClient.tlb"
Public var As Boolean


' Function - check if FPMXLClient has loaded.
' Return Value:
' true: if loaded
' false: if not loaded
Private Function EPMCheckReference() As Boolean
    Dim lReturn As Boolean
    Dim x As Integer

    lReturn = False

    With Application.VBE.ActiveVBProject
        For x = 1 To .References.Count
            If UCase(.References(x).Name) = UCase(cVerweisname) Then
                lReturn = True
            End If
        Next x
    End With
    EPMCheckReference = lReturn
End Function


' Function to load FPMXLClient
' Return Value:
' true: Loading OK
' false: Loading not OK
Private Function EPMLoadReference() As Boolean
    Dim lReturn As Boolean
    lReturn = False

    ' Try to add the reference from the type library file; suppress the error
    ' if the file cannot be found or the reference cannot be added.
    On Error Resume Next
    Application.VBE.ActiveVBProject.References.AddFromFile cEPMDateiname
    If Err.Number = 0 Then lReturn = True
    On Error GoTo 0
    EPMLoadReference = lReturn
End Function

 

' Function - check if user has logged on
' Return Value:
' true: logged on
' false: not logged on
Private Function CheckEPMConnection() As Boolean
    Dim lString As String
    Dim lBoolean As Boolean
    lString = ""

    Dim epm As New FPMXLClient.EPMAddInAutomation
    ' GetActiveConnection returns an empty string if the sheet has no active connection
    On Error Resume Next
    lString = epm.GetActiveConnection(ActiveSheet)
    On Error GoTo 0

    If lString <> "" Then
        lBoolean = True
    Else
        lBoolean = False
    End If
    CheckEPMConnection = lBoolean
End Function


' Function - check if FPMXLClient has loaded.
' If not - load.
' Check if active connection exists.
' Return Value:
' - true: FPMXLClient loaded and active connection
' - false: FPMXLClient loaded, but no connection (Comment: Info- Dialogue will pop up)
Public Function EPMActivate() As Boolean
    Dim lBoolean As Boolean
    lBoolean = EPMCheckReference()

    ' Load the reference if it is not available yet
    If lBoolean = False Then
        lBoolean = EPMLoadReference()
    End If
    ' Only check the connection once the reference is available
    ' (VBA's And does not short-circuit, so the checks are kept separate)
    If lBoolean Then
        lBoolean = CheckEPMConnection()
    End If
    If Not lBoolean Then
        MsgBox "No active EPM Connection." & Chr(13) & "Please log in again.", vbInformation, "Caution"
    End If
    EPMActivate = lBoolean
End Function

----------------------------------------------------------------------------------------------------------------------------------------------------------------


Now use this functionality within your functions and macros. Please see the example for refreshing the sheet below:


----------------------------------------------------------------------------------------------------------------------------------------------------------------

' Function - create refresh

Public Function runEPMRefreshWorkSheet()

    Dim epm As New FPMXLClient.EPMAddInAutomation

    epm.RefreshActiveSheet

End Function

 

' Macro for calling the refresh function

' To be used within the sheet for buttons and more

' Make sure the module containing EPMActivate has been included

Sub EPM_RefreshWorkSheet()

    If EPMActivate Then
        runEPMRefreshWorkSheet
    End If
End Sub

----------------------------------------------------------------------------------------------------------------------------------------------------------------

 

Once each macro calls EPMActivate as shown above, the FPMXLClient reference is checked and reloaded if required every time a user executes a macro. Manually activating the reference is no longer needed. A further sketch for saving data is shown below.
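The same pattern can wrap a save macro, as sketched here. This is only a sketch; the method SaveAndRefreshWorksheetData is assumed to be available in your EPM Add-in version.

----------------------------------------------------------------------------------------------------------------------------------------------------------------

' Sketch: save and refresh the active worksheet, guarded by EPMActivate.
' Assumption: SaveAndRefreshWorksheetData is available in your EPM Add-in version.
Sub EPM_SaveWorkSheet()
    Dim epm As FPMXLClient.EPMAddInAutomation
    If EPMActivate Then
        Set epm = New FPMXLClient.EPMAddInAutomation
        epm.SaveAndRefreshWorksheetData
    End If
End Sub

----------------------------------------------------------------------------------------------------------------------------------------------------------------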

 

Best regards,

Karsten

SAP EPM Design Council Workshop (Americas) - BPC (Las Vegas, Nov. 12, 2015)

Dear SAP Customer,

Following the SAP Conference for Financial Planning, Consolidation, and Controls, the SAP EPM Product Management, Development, and Go-to-Market teams cordially invite you to participate in a post-conference Design Council workshop focused on SAP BPC. We will share features, capture feedback, and co-innovate with customers for continuous improvement. During the session, please be prepared to discuss business planning-related use cases within your organization.

Conference registration is NOT required to attend this workshop. A signed Feedback Agreement is required. The workshop will be limited in attendance to ensure a high-quality discussion. Please RSVP by email to janet.tran@sap.com (with Full Name, Company, Role, Email, Phone) to take advantage of this influence opportunity.

Date:  Thursday, November 12, 2015
Time: 
9:00 AM – 4:30 PM PDT
Location:
Las Vegas, NV, US
Agenda
• Workshop Expectations and Goals
• BPC Roadmap & Innovation Updates
• SAP Cloud for Planning Overview & Updates
• Customer Presentation & Discussion
• Development Topic Discussion & Feedback
• Wrap-Up

Click here for more detailed information on SAP EPM Design Council.

BW-BPC10 Upgrade XPRA_AIMMRG Downtime Issue

Hi,

 

During the downtime, phase XPRA_AIMMRG fails with method UJT_TLOGO_AFTER_IMPORT.

 

 

Cheers,

Tushar





 

 

Applies to:
SAP NW 7.3x to 7.4 Upgrade with BPC 10 to BPC 10.1

 

Summary:
This white paper explains the probable solutions for the downtime issue faced during the upgrade, as per SAP Note 2147932 - Method UJT_TLOGO_AFTER_IMPORT failed during XPRA phase.

 

Author: Tushar Pardeshi

Created on: 28th Sept 2015

 

 

 

System Information:-

  • OS: Windows 2012 R2
  • DB: MSSQL 2012 R2
  • SAP Version: SAP NW 731 to SAP NW 740 upgrade

 

Reproducing Issue:
The issue can be reproduced during the downtime run of the upgrade.

 

 

 

Phase: MAIN_NEWBAS/XPRAS_AIMMRG, Logs (XPRASUPG) & Reference Note.

[Screenshots: 00.jpg, 01.jpg, 02.jpg]

 

 

 

Probable Causes:-

  1. BPC 10-specific post-processing steps were not followed as per the guide.
    This especially points to the MDX_PARSER and RSTPRFC related RFCs and their configuration. The global parameters also need to be maintained properly, and the user ID/password must be correctly maintained for these RFCs (normally the logical client ABAP RFC).
  2. If any relevant IDs were locked as part of the downtime preparation, those IDs (BWREMOTE etc.) need to be unlocked.
  3. The transport tool (STMS/tp) check fails. This can be recognized via STMS -> System Overview -> Checks.

 

 

 

 

Solutions:-

We can work through this step by step. First unlock the system (using CMD), then proceed as follows:

  1. The BPC 10 post-processing can be cross-checked either while the issue occurs or before the downtime commences; follow the BPC 10 Installation Guide for this. Below are a few screenshots from the BPC 10 Installation Guide. Also note that MDX_PARSER is a TCP/IP RFC and is Non-Unicode by default; it should be kept that way. Also maintain the global parameters for the RFC (for the detailed procedure, follow the BPC 10 Installation Guide).
    Note: If a system copy happened in the past, this needs to be updated accordingly.

    [Screenshots: 03.jpg, 04.jpg, 05.jpg]




  2. Check whether the required users (BWREMOTE etc.) are unlocked and their passwords are properly maintained. MDX_PARSER requires the LIBRFC32.DLL file in the kernel directory, which will normally not be available because the kernel is updated during downtime; this is also a cause for the MDX_PARSER RFC to fail. Copy the LIBRFC32.DLL file into the kernel directory
    (//hostname/sapmnt/SID/SYS/exe/uc/NTAMD64).

  3. Go to STMS -> System Overview -> SID -> Check -> Transport Tool.
    If this check fails, take the relevant measures. In our case it failed at the transport profile with "cannot read tp profile" and a few other related errors, such as "TP profile has invalid format", while the tp call also failed with "Link to database failed" and "ERROR: see tp's stdout". All of these errors point to one single thing: TP_DOMAIN_SID.PFL not being available or readable at the SUM/abap/bin location. Once the proper file is copied (or, if it already exists, its permissions are reset to full), the check will show green. After these changes you may have to rename/delete the buffer file in path SUM/abap/buffer (file name SID) so that it is refreshed. Always take a backup before renaming/deleting.

[Screenshots: 06.jpg, 07.jpg]

 

 

 

Conclusion:-

That's it, folks. This has been tested on a couple of systems, as we faced this issue during our non-production system upgrades, and the upgrades completed successfully.

 

Appendix:-

SAP BPC 10 Installation Guide

SAP BPC 10.1 Upgrade Guide

SAP SUM Guide

SAP NW Upgrade Guide

SAP Note 2147932 - Method UJT_TLOGO_AFTER_IMPORT failed during XPRA phase

Drill through SAP FI/CO from SAP BPC NW 10.1 Classic

Hi all,

 

 

I need to create a drill through to "FS10N" with preloaded values from EPM. I have the How-To for BEx access, but it is not helping. Has somebody used this already?

 

 

 

 

 

Kind regards Dario

BPC 10.1 Training Courses

I'm often asked what BPC courses are available. SAP Education currently offers five courses on BPC 10.1. Full course outlines, as well as the schedule for each course, are available by selecting the hyperlinks contained within this blog.


 

Administration

There are 2 courses covering Administration.  BPC410 covers Administration on the Microsoft Platform.  BPC420 covers Administration on the NetWeaver Platform. It is recommended to take the BW310 class before BPC420. BPC410 and BPC420 cover the following topics:

  • Creating Environments, Models, Dimensions
  • Maintaining Dimension Members
  • Using Script Logic
  • Creating and Using Data Manager Packages
  • Business Process Flows for Planning
  • Work Status
  • Security

 

Consolidation

There is 1 course covering Consolidation, BPC440.  The BPC440 covers:

  • Setting up models for consolidation
  • Creating dimensions and maintaining properties for consolidation purposes
  • Script logic for consolidation
  • Business rules
  • Controls
  • Consolidation Central
  • Consolidation Business Process flows

 

Reporting and Planning

There is 1 course covering Reporting and Planning, BPC430.  The 10.1 version of the course covers the new EPM Add-in. BPC430 covers:

  • Creating reports and input forms
  • Cell based EPM functions
  • Distribution and Collection, Books, Comments, Planning Functions
  • Business Process Flows for Planning
  • Web Integration
  • Drill Through
  • Dashboards

 

 

BPC 10.1 Embedded

With the introduction of SAP Business Planning and Consolidation 10.1, SAP has combined its two planning solutions into one version. This "unified" version, now called Embedded, integrates the functionality of SAP Business Planning and Consolidation with SAP Integrated Planning.

 

The BPC450 class will bring you up to date on version 10.1 Embedded:

  • The Embedded (previously called Unified) features for SAP Business Planning and Consolidation
  • Modeling BPC Embedded
  • Use Integrated Planning with BPC
  • Administering BPC Embedded
  • Reporting and Planning in the EPM Add-in
  • Using Analysis for Office and Design Studio
  • Maintaining and Using Embedded Business Process Flows (BPF's)
  • Web Reporting and Planning
  • How to create Real-time InfoCubes, Aggregation Levels, Planning Functions, and Planning queries
  • Embedded work status, data audit, Embedded transports, and migration paths
  • How to create ABAP Managed Database Procedures to execute custom planning functions in memory.

 

Hope to see you in class!
