Sunday 15 June 2014

REPORTING TOOLS IN SAP BI/BW

The SAP NetWeaver Business Intelligence suite, Business Explorer (BEx), provides flexible reporting and analysis tools for strategic analyses and decision-making support within a business. These tools include query, reporting, and analysis functions. As an employee with the appropriate access authorization, you can evaluate historical or current data at various levels of detail and from different perspectives, on the Web, in the portal, and in Microsoft Excel.
You can also use the Business Explorer tools to create planning applications, and for planning and data entry in BI Integrated Planning. More information: BI Integrated Planning.
You can use BEx Information Broadcasting to distribute business intelligence content by e-mail, either as precalculated documents with historical data, or as links with live data. You can also publish this content to the portal (in Knowledge Management folders or collaboration rooms).

SAP NetWeaver 7.0 provides two versions of the following tools:
  BEx Query Designer
  BEx Web Application Designer
  BEx Broadcaster
  BEx Analyzer
This documentation describes the BEx tools in SAP NetWeaver 7.0. For more information about the SAP BW 3.5 BEx tools, see help.sap.com/nw04 -> SAP NetWeaver -> Information Integration -> SAP Business Information Warehouse -> BI Suite: Business Explorer.

Features

The functional areas of Business Explorer are outlined below.
Query, Reporting and Analysis
The data in SAP NetWeaver BI is divided into self-contained business data areas (InfoProviders). You analyze the data of the BI system by defining queries for InfoProviders in BEx Query Designer. By selecting and combining InfoObjects (characteristics and key figures) or reusable structures in a query, you define how you will evaluate the data in the selected InfoProvider.
Analyzing data based on multidimensional data sources (OLAP reporting) makes it possible to analyze multiple dimensions simultaneously (like time, location, and product). You can implement any number of variance analyses (such as plan-actual comparison, fiscal year comparison). The data, which is displayed in a table, acts as the starting point for a detailed analysis, which can answer a variety of questions. A range of interaction options, such as sorting, filtering, swapping characteristics, and local calculations allow flexible navigation through the data at runtime. You can also display data graphically (such as bar charts or pie charts). You can also evaluate geographical data (for example, characteristics such as Customer, Sales Region, and Country) on a map. You can also use exception reporting to determine deviating and critical objects, to broadcast messages on deviating values by e-mail, or to distribute the messages to the universal worklist in the portal.
You can perform a detailed analysis of BI information on the Web and in Microsoft Excel.
BEx Web
Web Application Design
Web Application Design allows you to use the generic OLAP navigation in Web applications as well as Business Intelligence Cockpits for simple or highly individual scenarios. You can use standard markup languages and Web Design API to implement highly individual scenarios with user-defined interface elements. Web application design comprises a broad spectrum of interactive Web-based business intelligence scenarios that you can adjust to meet your requirements by using standard Web technologies.
BEx Web Application Designer
You can use BEx Web Application Designer, the desktop application used to create Web applications, to generate HTML pages that contain BI-specific content such as tables, charts, or maps. Web applications are based on Web templates that you create and edit in the Web Application Designer. You can save the Web templates and access them from the Web browser or the portal. Once they are executed on the Web, Web templates are referred to as Web applications.
BEx Web Analyzer
BEx Web Analyzer is a standalone, convenient Web application for data analysis that you can call using a URL or as an iView in the portal. In Web Analyzer, you can open a data provider (query, query view, InfoProvider, or external data source) and use ad hoc analysis to create views of BI data (query views) that you can then use as data providers for other BI applications. You can also distribute and save the results of your ad hoc analysis as required.
Report Designer
Report Designer is an easy-to-use design tool that you can use to create formatted reports that are optimized for presentation and printing. Report Designer provides extensive formatting and layout functions that you can use to create corporate balance sheets or HR master data sheets to suit your needs, for example.
PDF Generation
The integrated PDF generation function allows you to print Web applications and reports in various formats. More information: Creating Print Versions of BI Applications.
BI Patterns
BI patterns are Web applications that are tailored to the requirements of particular user groups. BI patterns are used to provide a uniform display of content from BI. You can configure BI patterns to a certain extent using the Pattern Wizard from the Web Application Designer.
BEx Analyzer
BEx Analyzer is an analysis, reporting, and design tool in Business Explorer, which is integrated into Microsoft Excel. In BEx Analyzer, you can analyze selected InfoProvider data and use it for planning by navigating in queries created in BEx Query Designer. You can use the context menu or drag and drop functions to do this.
You can design the interface for your queries by inserting design items such as dropdown boxes, radio button groups, and pushbuttons into your Excel workbook. A workbook thus becomes a complete query application.
See Analysis & Reporting: BEx Analyzer.
BEx Information Broadcasting
BEx Information Broadcasting allows you to make objects with content from business intelligence available to a wide spectrum of users, according to your requirements.
Using BEx Broadcaster, you can precalculate Web templates, queries, query views, reports, and workbooks and publish them to the portal, distribute them by e-mail, or print them. In addition to the precalculated documents that contain historical data, you can also generate online links to queries and Web applications.
The Business Explorer portal role illustrates the various options that are available when you are working with BI content in the portal.
More information: Information Broadcasting.
Integration in the Portal
You can integrate business content from BI seamlessly in the portal. Integration is carried out using BEx Broadcaster, KM content, SAP Role Upload, or the Portal Content Studio. Depending on the type of integration, you create objects with various display types in the portal. More information: Overview: Integration and Display Types for Content from BI.
The portal allows you to access applications from other systems and sources, such as the Internet or intranet. Using one single entry point, you can access both structured and unstructured information. In addition to content from Knowledge Management, business data from data analysis is available from the Internet and intranet.
More information: Integrating Content from BI into the Portal.
Integration with SAP BusinessObjects
Integration with SAP BusinessObjects broadens the scope of the reporting tools in Business Explorer. This integration comprises the following:
  Integration with Xcelsius Enterprise to visualize BI data in dashboards
  Integration with Crystal Reports to create form-based reports based on BI data
More information: Integration with SAP BusinessObjects.

Data Flow in the Data Warehouse in SAP BI/BW

The data flow in the Data Warehouse describes which objects are needed at design time and which objects are needed at runtime to transfer data from a source to BI and cleanse, consolidate and integrate the data so that it can be used for analysis, reporting and possibly for planning. The individual requirements of your company processes are supported by numerous ways to design the data flow. You can use any data sources that transfer the data to BI or access the source data directly, apply simple or complex cleansing and consolidating methods, and define data repositories that correspond to the requirements of your layer architecture. 
With SAP NetWeaver 7.0, the concepts and technologies for certain elements in the data flow were changed. The most important components of the new data flow are explained below, together with the changes compared to the previous data flow. To distinguish them from the new objects, the previously used objects are labeled 3.x.

Data Flow in SAP NetWeaver 7.0

The components of the data flow in the Data Warehouse are described below.
In BI, the metadata description of the source data is modeled with DataSources. A DataSource is a set of fields that are used to extract data of a business unit from a source system and transfer it to the entry layer of the BI system or provide it for direct access.
There is a new object concept available for DataSources in BI. In BI, the DataSource is edited or created independently of 3.x objects on a unified user interface. When the DataSource is activated, the system creates a PSA table in the Persistent Staging Area (PSA), the entry layer of BI. In this way the DataSource represents a persistent object within the data flow.
Before data can be processed in BI, it has to be loaded into the PSA using an InfoPackage. In the InfoPackage, you specify the selection parameters for transferring data into the PSA. In the new data flow, InfoPackages are only used to load data into the PSA.
Using the transformation, data is copied from a source format to a target format in BI. The transformation process thus allows you to consolidate, cleanse, and integrate data. In the data flow, the transformation replaces the update and transfer rules, including transfer structure maintenance. In the transformation, the fields of a DataSource are also assigned to the InfoObjects of the BI system.
InfoObjects are the smallest units of BI. They map the information in a structured form that is required for constructing InfoProviders.
InfoProviders are persistent data repositories that are used in the layer architecture of the Data Warehouse or in views on data. They can provide the data for analysis, reporting and planning.
Using an InfoSource, which is optional in the new data flow, you can connect multiple sequential transformations. You therefore only require an InfoSource for complex transformations (multistep procedures).
You use the data transfer process (DTP) to transfer the data within BI from one persistent object to another object, in accordance with certain transformations and filters. Possible sources for the data transfer include DataSources and InfoProviders; possible targets include InfoProviders and open hub destinations. To distribute data within BI and in downstream systems, the DTP replaces the InfoPackage, the Data Mart Interface (export DataSources) and the InfoSpoke.
You can also distribute data to other systems using an open hub destination.
In BI, process chains are used to schedule the processes associated with the data flow, including InfoPackages and data transfer processes.

Uses and Advantages of the Data Flow with SAP NetWeaver 7.0

Use of the new DataSource permits real-time data acquisition as well as direct access to source systems of type File and DB Connect.
The data transfer process (DTP) makes the transfer processes in the data warehousing layers more transparent. The performance of the transfer processes increases when you optimize parallelization. With the DTP, delta processes can be separated for different targets and filtering options can be used for the persistent objects on different levels. Error handling can also be defined for DataStore objects with the DTP. The ability to sort out incorrect records in an error stack and to write the data to a buffer after the processing steps of the DTP simplifies error handling. When you use a DTP, you can also directly access each DataSource in the SAP source system that supports the corresponding mode in the metadata (also master data and text DataSources).
The use of transformations simplifies the maintenance of rules for cleansing and consolidating data. Instead of two sets of rules (transfer rules and update rules), as in the past, only the transformation rules are needed. You edit the transformation rules on an intuitive graphical user interface. InfoSources are no longer mandatory; they are optional and are only required for certain functions. Transformations also provide additional functions such as quantity conversion and the option to create an end routine or expert routine.

Extraction Using the SAP Query

SAP Query is a comprehensive tool for defining reports. It supports many different forms of reporting and allows users to define and execute their own evaluations of data in the SAP system without requiring ABAP programming know-how.
To define the structure of evaluations, you enter texts in SAP Query and select fields and options. InfoSets and functional groups allow you to easily select the relevant fields.
An InfoSet is a special view of a set of data (logical database, table join, table, sequential file). It serves as the data source for SAP Query. An InfoSet determines which tables, or fields of these tables, are referenced in an evaluation. InfoSets are usually based on logical databases.
The maintenance of InfoSets is one component of SAP Query. When an InfoSet is created, a data source is selected in an application system. Since the number of fields for a data source can be high, fields can be combined into logical units, the functional groups. A functional group combines several fields into a logical unit within an InfoSet. Any fields that you want to use in an extraction structure have to be assigned to a functional group. In generic data extraction using an InfoSet, all the fields of all functional groups for this InfoSet are available.
The relevance of SAP Query to SAP NetWeaver BW lies in the definition of the extraction structure by selecting fields of a logical database, a table join, or other datasets in an InfoSet. You can use generic data extraction for master or transaction data from any InfoSet. A query is generated for the InfoSet; it retrieves the data and transfers it to the generic extractor.
InfoSets represent an additional, easily manageable data source for generic data extraction. They allow you to use logical databases of all SAP applications, table joins, and further datasets as data sources for SAP NetWeaver BW. For more information about SAP Query, in particular InfoSets, see the SAP Query documentation -> System Administration.
In the following section, the terms SAP Query and InfoSet are used independently of the source system release. Depending on the source system release, SAP Query may also be called ABAP Query or ABAP/4 Query, and the InfoSet may also be called a functional area.

How to Create Financial Reports by State for Tax Purposes

The state is encoded in the tax jurisdiction code, which has the format USTX77705XXXXX: the country (US) in the first two characters, followed by the two-character state code. In spreadsheet terms, Right(Left(Tax Jurisdiction, 4), 2) returns TX.
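
The same logic can be sketched in ABAP with an offset/length substring; this is an illustrative snippet only (the variable names and the sample value are not part of the original post):

* Illustrative only: extract the two-character state code from a
* US tax jurisdiction code such as 'USTX77705XXXXX'.
DATA: lv_txjcd TYPE c LENGTH 15 VALUE 'USTX77705XXXXX',
      lv_state TYPE c LENGTH 2.

lv_state = lv_txjcd+2(2).   "offset 2, length 2 -> 'TX'

WRITE: / lv_state.          "prints TX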

Step 1:
Enhance the extractor (DataSource 0FI_GL_40) with a customer exit
1. Go to RSA6.
2. Select your DataSource 0FI_GL_40.
3. Choose Enhance Extraction Structure.
4. Enter the append structure name. In our example we used ZZFIGL40APPEND. Note: you need a new append structure for each extractor.
5. Append the fields to the extraction structure (ZZ*). In our example we use ZZTXJCD. Note: use SE11 to check the component type of the fields you want to append.
6. Go to CMOD.
7. Enter the project name and choose Display. In our example we used ZBWEX.


 8. Choose Create -> Components and enter enhancement RSAP0001.
 9. Double-click EXIT_SAPLRSAP_001, since we are enhancing a transaction data extractor:
EXIT_SAPLRSAP_001 -> Transaction data
EXIT_SAPLRSAP_002 -> Master data attributes
EXIT_SAPLRSAP_003 -> Master data texts
EXIT_SAPLRSAP_004 -> Master data hierarchies
10. Double-click include ZXRSAU01.
11. Write your code in this exit.

***Enhance 0FI_GL_40 extract with the tax jurisdiction code (ZZTXJCD)

DATA: l_tabix                    LIKE sy-tabix,
      l_s_fagl_s_srep_line_items LIKE fagl_s_srep_line_items.

CASE i_datasource.

  WHEN '0FI_GL_40'.

    LOOP AT c_t_data INTO l_s_fagl_s_srep_line_items.
      l_tabix = sy-tabix.

*     Read the tax jurisdiction code of the document line item from BSEG
      SELECT SINGLE txjcd FROM bseg
        INTO l_s_fagl_s_srep_line_items-zztxjcd
        WHERE belnr = l_s_fagl_s_srep_line_items-belnr
          AND buzei = l_s_fagl_s_srep_line_items-buzei.

*     Write the enriched line back to the extractor's data package
      MODIFY c_t_data FROM l_s_fagl_s_srep_line_items INDEX l_tabix.
    ENDLOOP.

ENDCASE.


12. Save and activate the code.
13. Go to RSA6, select 0FI_GL_40, and deselect ZZTXJCD in the Hide column (by default, newly added fields are hidden).
14. Save the DataSource and choose Generate DataSource.
15. Check the data in RSA3.




Step 2: 
Enhance BW 0FIGL_D40

1. Replicate the DataSource on the BI side.
2. In BW, create an InfoObject for the state (XREGION in our example) and add it to 0FIGL_D40. In the transformation, edit the rule for this field and create a substring formula. Note: you cannot use Left/Right logic here because the value is a text field; use a substring formula instead (see the sketch after this list).
3. Now you should be able to write a BEx report on the G/L data by state.
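
A minimal sketch of that rule logic, assuming it is implemented as a field routine and that the source field is named ZZTXJCD as in Step 1 (the formula builder's substring function achieves the same result):

* Hypothetical field routine for the state InfoObject in the transformation.
* SOURCE_FIELDS and RESULT are the standard routine parameters.
  RESULT = source_fields-zztxjcd+2(2).   "offset 2, length 2 -> 'TX'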

Tuesday 20 May 2014

Transporting BI Objects (DEV->Quality->PROD)

Use

Development projects for Business Intelligence (BI) are generally not performed in a productive system. They are normally carried out in one or more development systems depending on their scope.
If you carry out your development projects in a development system, you have to transport the development to a target system (a test or production system).
To do this, the standard transport system has been enhanced in BI to provide the BI transport connection. Using the transport connection in your BI development system, you can collect a consistent set of new or changed BI objects and then transport them using the Change and Transport Organizer (CTO).

Procedure

  1.  In the Data Warehousing Workbench, choose the Transport Connection functional area (transaction RSOR) by choosing the pushbutton in the left navigation window or by choosing the menu option Goto -> Transport Connection.
Note
You can display or hide the left-hand navigation window in the Data Warehousing Workbench by choosing Navigation Window On/Off.
  2.  Collect objects that you want to transport and check the settings.
More information: Collecting Objects

Objects Being Transported for the First Time (Package $TMP)

  3.  To transport the selected objects, choose Transport Objects. The Change Object Directory Entry dialog box appears.
  4.  Under Attributes, in the Package field, enter a transportable customer package (beginning with Y or Z).
 Note
This prompt only appears once for all collected objects. If you want to transport objects in different packages, you must collect the objects more than once.
You can change the package assignment subsequently.
More information: Changing Package Assignments
  5.  Save your entries. A dialog box appears where you are prompted for a transport request.
  6.  In the Request field, enter a transportable request or create a new request first.
  7.  Choose Continue.
  8.  Choose CTO Transport Organizer: Overview. The Workbench Organizer: Requests screen appears.
  9.  Release the individual tasks first and then the complete request.
Caution 
Ensure that you release any requests with locked objects beforehand.
Note
In the Collected Objects screen area, the system displays the corresponding information for individual objects in the following columns:
  Last Changed
  Last Changed By
  Transport Request
  Transport Request Holder
  Package
  10.  In the log display in the lower screen area of Collected Objects, check the status of the transport.

Objects That Have Already Been Transported

From here on, BI objects are subject to automatic change recording since they were assigned to a transportable package when transported for the first time. This means that all changed objects are written automatically to a request, and can be transported.
Caution
Note the following for automatic change recording: If you create new InfoObjects for an InfoCube that is subject to automatic change recording, these are not automatically written to the InfoCube request. Instead, they are assigned the $TMP package. You must use the BI transport connection to collect and transport these objects or the InfoCube.
Note
Refrain from collecting objects with the BI transport connection only if your changes do not produce any new objects.

Result

Depending on their object type, the transported objects are imported into the target system in the A, M, or T version and are automatically activated using the after-import method RS_AFTER_IMPORT. The dependent DDIC and program objects are generated in the target system after the import. If object deletions have been transported, the corresponding objects are deleted in the target system.
With the implementation of Business Add-In (BAdI) RSO_BADI_TRANSPORT, you can automatically trigger follow-on activities such as executing or rescheduling process chains once the objects have been imported. After the standard BI after-import methods have run, method AFTER_IMPORT from interface IF_EX_RSO_TRANSPORT is executed. For more information, see Implementing BAdIs in Enhancement Builder and the method documentation for IF_EX_RSO_TRANSPORT AFTER_IMPORT.
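
As a rough sketch, an implementation of this BAdI might restart a process chain after the import; the chain name and the exact AFTER_IMPORT signature are assumptions here, not taken from the BAdI documentation:

METHOD if_ex_rso_transport~after_import.
* Illustrative follow-on activity: start a daily load chain once the
* imported BI objects have been activated. The chain name is an example.
  CALL FUNCTION 'RSPC_API_CHAIN_START'
    EXPORTING
      i_chain = 'ZPC_DAILY_LOAD'.
ENDMETHOD.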