SAP Explore Community Network
 
 
1.  Next SAP workshop on ALE IDOC and Partner Profiles
As part of project training, it is important for SAP consultants to know about interfaces and IDocs. This session is applicable to consultants of all modules. All project learners, please show your interest in this free SAP session by Faisal Majid.
 
2.  SAP MOBILITY
I am pleased to announce that I can now add SAP MOBILITY to my professional portfolio.
My thanks to CAREERDRAGONS for all your support.
 
3.  PRICING SCHEMA
Many get confused about how to interpret a pricing schema; the following is an example you might see in an exam. I have seen many students come up with EUR 1,054.50, as the same mistake is made over and over again.
  1. A vendor offers you a material at a gross price (PB00) of EUR 1200. In addition, the vendor gives you a 15% discount (RB01) and a 5% cash discount (SKTO). The vendor charges EUR 90 for freight costs (FRB1). What is the effective price if you use the calculation schema shown below?

Level  Counter  Condition type  Description              From
1      1        PB00            Gross Price
10     1        RB01            Discount %               1
15     1        ZC01            Surcharge %              1
20     0                        Net value
30     1        FRB1            Absolute freight amount  20
35     1        SKTO            Cash discount            20
40     0                        Effective price
 

  1. EUR 1,059
  2. EUR 1,042
  3. EUR 1,032
  4. EUR 1,050
Who would like to offer me an answer, and why?
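For anyone who wants to check their reasoning, here is a minimal Python sketch (variable names are mine) of how such a calculation schema is evaluated. The key point is the "From" column: a percentage condition is applied to the subtotal of the step it references, not to the running total.

    # Evaluate the calculation schema step by step.
    # Percentage conditions use the subtotal of the step in their "From" column.
    subtotals = {}

    subtotals[1] = 1200.00                       # Step 1:  PB00 gross price
    rb01 = -0.15 * subtotals[1]                  # Step 10: RB01 15% discount, from step 1
    subtotals[20] = subtotals[1] + rb01          # Step 20: net value (ZC01 is not used here)

    frb1 = 90.00                                 # Step 30: FRB1 absolute freight amount
    skto = -0.05 * subtotals[20]                 # Step 35: SKTO 5% cash discount, from step 20
    subtotals[40] = subtotals[20] + frb1 + skto  # Step 40: effective price

    print(subtotals[40])  # taking 5% of the freight-inclusive total instead gives 1,054.50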
 
4.  The Advanced SAP Consultants Handbook - By Glynn Williams
All, 
 
I am very encouraged and excited to showcase Glynn's publication "The Advanced SAP Consultants Handbook" to our community, and I am thankful to Glynn for not only providing the Background and Synopsis below, but also for providing 5 complimentary copies of his very successful, best-selling book to SAP Explore users. If you would like to receive a copy of his book, please register your interest via the link below, following the Synopsis.

Background 
Leading SAP authority Glynn C Williams, author of “Implementing SAP R3 Sales and Distribution” and “Implementing SAP ERP Sales and Distribution”, has released his personal library of more than 230 SAP tips and tricks, compiled after implementing SAP in more than 39 countries and consulting in more than 5 functional domains. 
These Tips and Tricks are cross functional and easy to reference, empowering the reader with valuable time saving tips. 
Even at $1 per tip, the book should retail at $230. 
"The Advanced SAP Consultants Handbook" will enhance any reader's SAP skill set.
 
Synopsis 
• The "Advanced SAP Consultants Handbook", is a summary of tips and tricks gained on more than 18 SAP projects over a period of more than a decade. 
• It contains more than 230 SAP Tips and Tricks, with more than 200 screenshots. 
• It is designed so that the reader can grasp the fundamentals of each SAP tip or trick within minutes, in most cases using a single page per tip in the 364-page book. 
• The reader learns valuable, advanced, time saving SAP Tips and Tricks, not taught in training centres. 
• This is the same reference manual that is used by countless professionals on countless SAP projects worldwide. 

If you are interested in a complimentary copy of the book, please register your interest here: http://sapexplore.com/StaticPages/frmShowInterest.aspx. The winners of the complimentary copies will be announced and informed by mid-December. 
 
Please note that in addition to the complimentary copies, which will be awarded by a draw, Glynn has also offered a generous discount to every SAP Explore user who registers an interest in the book; the discount code will be sent directly by Glynn.
 
 
 
5.  Inbound Delivery


 

Definition:

                        “Inbound Delivery (ID) is a record that holds all the information required to start and monitor the inbound delivery process. The ID process begins with the goods receipt in the yard and closes with the final putaway of the goods.”

 

In practice, it is a notification from the vendor, against a PO, that the goods will be delivered on specific dates.

 

Creation of Inbound Deliveries:

 

If your vendor sends you an advance shipping notification before the goods actually arrive, an inbound delivery is created in this situation, so that space can be reserved in the storage location for the goods.

 

E.g.: you issue a PO to a vendor specifying the delivery date of the required material. After reviewing the PO, the vendor informs you that, due to a shortage of resources, it is unable to deliver the required material by the dates in the PO and needs one more week. An inbound delivery is then created against that PO with the mutually agreed dates, and these dates are recorded in the inbound delivery.

 

The inbound delivery can be created using Transaction Code VL31N.

On the initial screen for creating an inbound delivery, enter the vendor and the delivery date; the system automatically proposes the current date as the delivery date. If you want to create an inbound delivery for a specific PO, enter the PO number; otherwise the system will find all POs due for inbound delivery automatically.

 

 

 

The overview screen then appears, and the purchase order data is copied into the inbound delivery. Additional data can be entered on the header and item level screens (e.g. transportation planning, route). After making the necessary changes, save the inbound delivery; on saving, the system generates the inbound delivery number.

The inbound process at goods receipt involves all the stages of an external procurement process. Basically, the inbound delivery is a follow-on activity to the purchase order, created using transaction code VL31N. The process starts when the goods are presented at the vendor's shipping point and ends when the goods receipt is posted at the receiving end. After creation of the PO, the process comprises the following steps:

 

  • Notification
  • Inbound Delivery
  • Subsequent Putaway of goods
  • Posting of goods receipt

 

 

Process Flow:

 

  1. ME21N – At item level in the purchase order, choose the confirmations tab and select the confirmation control key “Inbound Delivery”.
  2. Create an inbound delivery (VL31N) against the purchase order.
  3. Post the goods receipt against the inbound delivery via MIGO.

 

Accounting Difference:

 

First, the accounting document for a material document where the goods are received without an inbound delivery:

 

300000   Inventory - Raw Mate
191100   Goods Rcvd/Invoice R
281000   Income - price varia

 

 

And the accounting document for a material document where the goods are received with an inbound delivery:

 

300000   Inventory - Raw Mate
191100   Goods Rcvd/Invoice R
231000   Loss - price varianc

 

 

Comparing the two accounting documents, the only difference is the price variance account: one shows income and the other shows a loss.
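For context, here is a hedged Python sketch (my simplification, not part of the original post) of how the price variance line arises under standard price control: the stock account is posted at the standard price, the GR/IR clearing account at the PO price, and the difference lands on an income or loss price variance account depending on its sign.

    # Simplified goods receipt posting under standard price control.
    def gr_posting(qty, standard_price, po_price):
        stock_debit = qty * standard_price    # stock account, e.g. 300000
        gr_ir_credit = qty * po_price         # GR/IR clearing account, e.g. 191100
        variance = gr_ir_credit - stock_debit
        # A negative variance balances as a credit: income (e.g. 281000);
        # a positive variance balances as a debit: loss (e.g. 231000).
        side = "income" if variance < 0 else "loss"
        return stock_debit, gr_ir_credit, variance, side

    print(gr_posting(10, standard_price=12.0, po_price=11.0))  # income variance
    print(gr_posting(10, standard_price=12.0, po_price=13.0))  # loss variance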

 
6.  Predicting BW Database Volume

Revisiting the Technical Content in BW Administration Cockpit with SAP Predictive Analysis

 

The following blog post demonstrates how to use the technical content of SAP BW as the forecast data basis for a prognosis model in SAP Predictive Analysis. The aim is to show a smooth and straightforward process, avoiding additional modelling outside of BW as much as possible. In the described use case, the Database Volume Statistics[1] have been chosen as an example.

 


 

The official SAP Help summarizes the Technical Content in the BW Administration Cockpit as follows: “The technical BI Content contains objects for evaluating the runtime data and status data of BW objects and BW activities. This content is the basis for the BW Administration Cockpit, which supports BW administrators in monitoring statuses and optimizing performance.”[2]

 

The Technical Content, with its pre-delivered web reporting, might look a bit old-fashioned; nevertheless, the variety, quality, and quantity of data that is “generated” at any time in the system is very useful and important for further analysis. The data has a strong focus on performance (e.g. query runtimes, loading times), but other system-related data such as volume statistics is also available.

 


 

BW on Hana and SAP Predictive Analysis[3] together extend the possibilities of how to view the data and what (potentially more) to do with it.[4]

Technically, there are simply the following 3 steps to follow[5]:

  1. Expose cube information model to Hana (SAP BW)
  2. Adjust data types to PA-specific format (Hana Studio)
  3. Create forecast model (SAP PA Studio)

 

The Database Volume statistics in the technical content are designed with a simple data model consisting of just one cube with some characteristics (day, week, month, DB object, object type, DB table etc.) and key figures (DB size in MB, number of records etc.). Following the above steps with this set of data and choosing a certain type of algorithm results in the bar chart shown below, integrating forecast figures for the past and for some months into the future.

 

The blue bars represent the actual database size by month. The green line represents the calculated figures of the forecast model (in this case double exponential smoothing) for the past 20 months and 10 months into the future.

[Screenshot: 1.png]

 


Below are some technical details for each of the mentioned steps:

 

(1) Expose information model of Infocube 0TCT_C25 to Hana Studio[6]

  • Edit the Infocube in BW and set the flag for “External SAP HANA view”:

[Screenshot: 2.png]

 

Immediately the information model is generated as an Analytic View and can be viewed in Hana Studio:

  • Content -> system-local -> bw -> bw2hana -> 0 -> Analytic Views -> TCT_C25

[Screenshot: 3.png]

 


(2) Adjust data types to PA-specific format (Hana Studio)

  • The generated Analytic View of Infocube 0TCT_C25 looks like below:

[Screenshot: 4.png]

SAP Predictive Analysis currently needs a specific time-ID column, and the key figures must be of data type DOUBLE. The new Calculation View CV_TCT_C25_1 is created on top of the generated Analytic View TCT_C25:

  • Column [Month] (PA_TIME_ID_MONTH) = <unique sequential number for each month>[7]
  • Column [Database Size] (PA_TCTDBSIZE) = DOUBLE(0TCTDBSIZE)

[Screenshot: 5.png]

 


(3) Create forecast model (SAP PA Studio)

 

Creating a forecast model in SAP Predictive Analysis follows the same standard tasks as for any other data source.

 

  • Select the data source, i.e. the prepared calculation view, including the (time) key ID column and the relevant key figures
  • Select and configure the components for the model:
    • Use a [Filter] component (if necessary, restrict columns and rows, e.g. filtering the relevant database object types, time range, etc.)
    • Choose an adequate [Algorithm] component; in the following case a double exponential smoothing algorithm (PAL) has been chosen for forecasting several months into the future

[Screenshot: 6.png]

 

And finally the resulting trend diagram is shown (see above).

 

 

 


[1] Infocube 0TCT_C25

[2] SAP Help Portal -> Technology -> SAP NetWeaver Platform

[3] This post deals with SAP BW on Hana 7.40/SP6 and SAP Predictive Analysis 1.19

[4] The blog post is focusing on the technical aspects to get a forecast model successfully executed. The chosen algorithm might not be statistically appropriate.

[5] Assuming the technical content has been activated in SAP BW

[6] Unfortunately it’s not yet possible to expose the information model of a Multiprovider

[7] Data used is from April 2013 to November 2014. To get a unique ID the following calculation is used (in order to get a sequence starting from 1):

    (int("0CALYEAR") - 2013)*12 + int(rightstr("0CALMONTH",2)) - 3

 

Reference : http://scn.sap.com/community/predictive-analysis/blog/2014/11/18/predicting-database-volume-in-bw

 
7.  How-to Load a File into BW-Integrated Planning (Version 3)
Hello Friends of Integrated Planning,


thank you very much for all the feedback I received on the File Upload/Download how-to over the past years. I have great news: Basically every development request has been implemented! Yes, this means that there is a big load of new features available with version 3. Upload and download of CSV files and a new user interface that allows the preview of the file and plan data before you save it are just two of the highlights. The new version is also compatible with SAP NetWeaver BW 7.4.

 

Prerequisites

 

Minimum release to use the new version is SAP NetWeaver BW 7.30.

 

Download


You can download the complete how-to guide as well as the solution from SAP Note 2053696.

 

Enhancements


The following list shows the changes and enhancements compared to the previously published version 2.4:

  • Enabled conversion exit in variable screen
  • Removed context info from message output (can be enabled again with show_messages parameter)
  • Added search help for all selection fields including special characteristics like fiscal period
  • Added check that 0INFOPROV must be filled for uploads on MultiProviders
  • Support for CSV format for upload and download (new parameters for data separator and escape character)
  • Improved auto-detection of file format
  • Added info messages to display version and detected file format
  • New parameter for checking for duplicate records
  • New parameter to define display of +/- sign for download
  • New and improved alternative user interface
  • Function to generate the required master data for ZIP_* InfoObjects
  • Added BADI for performing custom transformations during upload and download
  • Integrated File Upload/Download with Report-Report-Interface
  • New parameter setting for download to select field description instead of technical name in header line
  • Automatic recognition of UTF byte-order-mark during upload
  • Added ready-for-input variables for all parameters
  • Added support for XLS format for upload with SAPGUI
  • Added load from application server (which enables upload from and download to Analysis Office)
  • File Upload is now supported for SAP NetWeaver BW 7.4
  • Updated screen shots in how-to guide to show GUI-based planning modeler and web-based file upload/download application

 

 

Preview


Here are a few screen shots of the version 3 user interface (Note: The old UI is still available in the version 3 transport). For more details, please refer to the how-to guide (see "Download" section above).


File Upload Selection Screen:

[Screenshot: BW-IP_File_Upload_Example_02.png]

File Upload Preview Screen:

[Screenshot: BW-IP_File_Upload_Example_03.png]

File Download Selection Screen:

[Screenshot: BW-IP_File_Download_Example_01.png]

File Download Preview Screen:

[Screenshot: BW-IP_File_Download_Example_02.png]
Your Feedback


As always, I appreciate your feedback. It's as simple as adding a comment to this blog.


Enjoy the new File Upload and Download for BW-Integrated Planning!

 

Reference : http://scn.sap.com/community/data-warehousing/business-planning/blog/2014/08/13/how-to-load-a-file-into-bw-integrated-planning-version-3

 
8.  Optimize the performance of your SAP BusinessObjects Design Studio solutions – tips and tricks


 

 

     SAP BusinessObjects Design Studio allows you to build solutions on SAP NetWeaver, SAP HANA and, with the new version 1.2, on BusinessObjects Universes. With the technical possibility to create very advanced dashboards, one of the main criteria driving end-user adoption is performance: whatever you may bring into these dashboards in terms of content and nice design, you will get the buy-in you are looking for only with very high interactivity – there is no time to wait!

 

     After a first blog on Design Tips and Tricks, here is a second article that specifically deals with performance. Based on our experience with real implementations (SAP Runs SAP), we have collected and documented a list of things to do to make sure our Design Studio Apps are optimized to offer the highest responsiveness!

 

      Note: for each item in the list below, we have made an assessment of the potential performance improvement. This potential is an indicator; it does not mean the change systematically brings the same percentage of speed gain, as there are many parameters entering into the overall result.

 

     Disclaimer: all the information below is the result of investigations made by myself and some colleagues from the team. There is no commitment from SAP regarding the points below. We just want to share our point of view and our investigations about how we have improved our performance.

 

 

          1. PageBooks and Tabs – management of the initial view 

               –>  Performance improvement potential : very high

 

     This is the most important aspect of all! With careful handling of such design configurations, you can drastically influence the perceived performance at each navigation step.

 [Screenshot: pagebook.png]
    [Screenshot: tab.png]

 

     When your app includes Tabs or Pagebooks, this means that there are several “screens” where only one is active and visible. Therefore, only the data sources on this active screen (page or tab) should be processed.

- When leaving this active view, the corresponding data sources should not be “reset” (the screen is not visible anyway). 

- None of the data sources of the other, inactive views should be the object of any action as long as these views are not visible.

 

The code in this event handler must be structured in a way that only the data sources related to the displayed view are processed:

 

if (PAGEBOOK_1.getSelectedPageIndex() == 0) {
    // Touch only the data sources relevant for your first page
    // ...
} else if (PAGEBOOK_1.getSelectedPageIndex() == 1) {
    // Touch only the data sources relevant for your second page
    // ...
} else if (PAGEBOOK_1.getSelectedPageIndex() == 2) {
    // Touch only the data sources relevant for your third page
    // ...
}

 

 

         2. Carefully select the data sources that are loaded at startup

               –>  Performance improvement potential : very high

 

     By default, all data sources in Design Studio are loaded at startup. For simple dashboards, this does not alter the end user's perception and the data shows up in a few seconds – with the advantage that the data is then already there.

 

     For multi-page applications, the number of data sources and components to be loaded has a more significant impact, and the end user may need to wait approximately 30 seconds to 1 minute if nothing is done to change this. 

 

     Solution: since only the data sources for the first screen are needed at startup, all other data sources should have “Load in Script” set to “true”.

 

[Screenshot: load_in_script.png]

These other data sources must then be loaded within the script that makes the related components visible:

     DS_1.loadDataSource();

 

 

     Note: when a startup script is used (for example in the case of multiple views, to handle the fact that only one view is visible), not only must “Load in Script” be properly set as described above, but you also need to pay attention to the “loadDataSource()” calls in the script, to avoid loading data sources unnecessarily, with the associated performance impact. To make it simpler:

 

 

  • Load in Script = false, loadDataSource() not called: the KPI is visible and the data source is loaded. Positive impact – the expected setting for KPIs visible at startup; performance depends on the data source and affects the opening time.
  • Load in Script = false, loadDataSource() called: the data source is loaded twice. Negative impact.
  • Load in Script = true, loadDataSource() called: the KPI is not visible at startup, but you load the data source when needed. Positive impact – the expected setting for KPIs not visible at startup; performance depends on the data source and affects the time to switch to the KPI.
  • Load in Script = true, loadDataSource() not called: error – you try to display a KPI while the data source is not loaded. Negative impact.

 

 

          3. Initialize data sources in “background processing” (available since 1.2)

               –>  Performance improvement potential : very high

 

     This feature can help if: 

- Multiple data sources per screen are really needed

- The components relying on these data sources are visible at the same time.

 

     With “background processing”, the “most important” data sources can be initialized first (and these components show the data right away) and the other components are initialized shortly after in the “On Background Processing” script of the application.

 

 

     Example:

Suppose that when you open your dashboard, you have several data sources/charts to show on the first page, and other data sources/charts on other pages.

What background processing allows you to do is:

  • Show the data chart after chart for the first page (you define which chart you want to load/show first, then the second one, the third one, …)
  • As soon as all charts from page 1 are loaded, load the charts for the other pages in background processing mode. This improves your opening time, and the loading of the data sources from pages 2, 3, … is transparent to end users.

 

 

     In Design Studio:

  • For datasources/charts that you need first, keep “Load in script = false”
  • For all other datasources, please select “Load In Script = true”. Then in the “On Startup” script this line must be added:

                APPLICATION.doBackgroundProcessing();

 

     This will trigger the “On Background Processing” script after the rendering has finished. Inside the “On Background Processing” script, the other data sources can be initialized with calls like:

               DS_XX.loadDataSource();

 

 

Note that this concept is not limited to two levels (“main” data sources vs. the others); it is also easily possible to create a chained sequence of data sources to be initialized.

 

Note: the “On Background Processing” concept is not limited to the application startup. On newly displayed pages of a pagebook or tabstrip the most important data can also be shown first and the other data sources can be initialized subsequently.

 

 

Taking an example:

     To use a tile effect, in which data sources are loaded one by one, the application designer can use this recursive functionality.

 

     If 6 different data sources are used within the application, and they should show their data one after the other, a script like the one below could be used:

 

[Screenshot: tiles.png]

 

On the “On Startup” script, please add 

      APPLICATION.doBackgroundProcessing();

 

Create a variable Variable1, initialize it to 0

 

Then, in the background processing function, add:

if (Variable1 == 0) {
    DS_1.loadDataSource();  // the chart linked to DS_1 will appear
}
if (Variable1 == 1) {
    DS_2.loadDataSource();  // the chart linked to DS_2 will appear
}
if (Variable1 == 2) {
    DS_3.loadDataSource();  // the chart linked to DS_3 will appear
}
if (Variable1 == 3) {
    DS_4.loadDataSource();  // the chart linked to DS_4 will appear
}
if (Variable1 == 4) {
    DS_5.loadDataSource();  // the chart linked to DS_5 will appear
}
if (Variable1 == 5) {
    DS_6.loadDataSource();  // the chart linked to DS_6 will appear
}
Variable1 = Variable1 + 1;
if (Variable1 < 6) {
    // schedule the next background pass until all six data sources are loaded
    APPLICATION.doBackgroundProcessing();
}

 

     With this code, the tiles appear one after the other. While DS_2, DS_3, etc. are loading, end users can already read and interact with DS_1, and the loading time of DS_2, DS_3, etc. is transparent to them. The perception of the loading time is therefore completely different.

 

 

          4. Upgrade to version 1.2 SP1

               –> Performance improvement potential : very high

 

     In this release, a number of changes have been introduced in the Designer and in the BIP Add-on. In particular, SAP BusinessObjects Design Studio uses an SAP library (BICS – Business Intelligence Consumer Services) to access the data, which has been enhanced with improved memory consumption, leading to significant improvements.

 

 

          5. SetVariableValue/setFilter/setDataSelection

               –> Performance improvement potential : very high

 

     Depending on the definition of the source BW queries, you may have to create some filters in Design Studio, especially where end users require dropdown lists, list boxes, etc.

 

     To achieve this, there are various possibilities with different performance costs. In particular, it is preferable to use “setFilter” instead of “setVariableValue” wherever possible. “setVariableValue” impacts the variable container and therefore refreshes all the data sources embedded in this container. This may significantly impact performance, executing more data refreshes than needed.

 

     To summarize, here is a general impact of the different types of filters on the performance:

          Highest: “setVariableValue”: BW roundtrips for multiple data sources

          Medium: “setFilter”: BW roundtrip for one data source

          Lowest: “setDataSelection”: no BW roundtrip

 

 

          6. Use one datasource for multiple charts (available since 1.2)

               –> Performance improvement potential : medium / high

 

     If several charts in your application show the result set of the same query in different ways, every version up to 1.1 required distinct data sources for each chart, each using a different “Initial View” setting to display the different data.

 

     Design Studio 1.2 offers the possibility to use a single data source in multiple charts. The data source has an Initial View that contains the superset of all needed data. The chart itself has a new property “Data Selection” which allows filtering only the relevant data for the specific chart.

 

[Screenshot: data_selection.png]

 

     The “Data Selection” can be set via a “picking tool” in the property sheet or by editing the “Data Selection” manually. In most cases the picking tool should be enough; the syntax for manual editing is not yet fully documented. 

 

     As data sources are the most significant performance driver in Design Studio applications, minimizing the number of data sources is always a significant performance gain.

 

 

          7. Take care of issues in the “Error Log”

               –> Performance improvement potential : small / medium

 

     While the application is running, you should carefully check the “Message View” or the Designer's “Error Log” view. These entries sometimes indicate programming errors or issues, and some programming errors can impact performance.

 

     For example, if a data source is asked for data that does not exist (warning message: “There is a problem with the measure or dimension that you wanted to select using getData or getDataAsString”), the tool will check for the non-existent element at high cost (generally via a backend round trip).

 

 

          8. Use “Global Script Variables” instead of “hidden text fields”

               –> Performance improvement potential : small / medium

 

     In the past, Global Script Variables were not available in Design Studio; therefore, you had to use “Hidden text fields” or Dropdown lists to simulate variables.

 

     This has changed since version 1.1: global variables are available and use far fewer resources than a hidden text field, as they reduce the overall number of components. They also clean up the application's “Outline” view.

 

 

          9. Use CSS classes in one text field instead of two text fields

               –> Performance improvement potential : small / medium

 

     In your app, if you need:

  • to increase the size when an item is selected
  • to display a text when an item is selected
  • to show values in red or green depending on the value (e.g. above or below 0)

 

     you should leverage dynamic JavaScript code (setCSSClass) instead of:

  • overlapping two texts (one bold and one not) and playing with dynamic visibility
  • overlapping two texts (one red and one green) and playing with dynamic visibility

 

     Doing this, the number of text fields is reduced and performance improved. The reduced number of components saves resources and makes the complex tree in the “Outline” view more usable.

 

 

          10. Summary : parameters influencing the performance

 

  • Load in Script – when using multiple tabs/pagebooks: set it to true for KPIs that are not visible at startup, and to false only for the KPIs that are visible at startup.
  • DS.loadDataSource() – when called, the data source is loaded; used in scripts when the data source is linked to a component that becomes visible.
  • APPLICATION.doBackgroundProcessing() – when called, for complex dashboards: allows the processing of non-visible KPIs to be delayed into the background while the first KPIs are processed.

 

      In addition, the choice between setVariableValue, setFilter and setDataSelection can have a huge impact on performance.

 

 

Conclusion

 

     Following all of these recommendations, with careful management of your JavaScript, your queries, and your components, you can expect to have a very fast dashboard.

 

     Design Studio offers a smart way of managing data, processing the bare minimum of sources (only the information that your business can view). In parallel, background processing helps to load the data for other KPIs and other views in a way that is transparent to the end user. With this, the data is already there when the user switches to another view, and the information comes up instantaneously.

 

 

Have a good time on Design Studio!!

 
9.  STOCHASTIC BLOCK
Like most people, when confronted with this most wonderful word from the English language, I indeed had to look it up in a dictionary!
 
The definition of stochastic is simply RANDOM.
 
I was asked to explain a recent SAP MM exam sample question...
 
The following values are defined for the stochastic block:

    Threshold value: £1000

    Percentage: 40

 

For an invoice value of £500, what is the probability that the invoice will be blocked stochastically?

For an invoice value of £1000, what is the probability that the invoice will be blocked stochastically?

For an invoice value of £5000, what is the probability that the invoice will be blocked stochastically?

 

Answers with explanations will be posted on Friday.
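In the meantime, here is a minimal Python sketch of the stochastic block rule as I understand it from the SAP documentation (you can check it against Friday's answers): at or above the threshold, the blocking probability equals the configured percentage; below the threshold, it shrinks in proportion to the invoice value.

    # Stochastic block: probability (in %) that an invoice is blocked.
    def block_probability(invoice_value, threshold=1000.0, percentage=40.0):
        if invoice_value >= threshold:
            return percentage                          # capped at the configured percentage
        return percentage * invoice_value / threshold  # proportional below the threshold

    for value in (500, 1000, 5000):
        print(value, block_probability(value))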

 
10.  Big data & SAP

Big data has been a revolutionary area, particularly due to the sheer bulk of data being processed and its underlying usage on social platforms. A technology base like SAP, primarily due to its huge adoption and user base, could benefit from any integration on the big data storage and processing fronts. There are interesting articles on the web that depict the current trends in this field. Here are links to some interesting reads on the topic:

Big data and SAP - http://www.sap.com/solution/big-data/software/platform.html

HANA Vs Hadoop - http://scn.sap.com/community/hana-in-memory/blog/2014/02/01/hana-vs-hadoop--showdown

HANA Hadoop Integration - http://saphanatutorial.com/sap-hana-and-hadoop/

Sources: SCN, SAP & HANA.

 
 
11.  RESEARCH AND VALIDATION TECHNIQUES
The internet provides a wealth of information to SAP users, who sometimes post the most challenging questions, and we must thank all the wonderful people who have offered tireless support by posting such well-thought-out responses. In my early days, the internet was for sure my #1 research tool!
 
When using the internet to research your subject matter, it's important to validate the information you find. 
 
As a recent example, a user new to MM asked me to confirm the answer to the following typical SAP MM exam question. What this user had found was that answers 1 and 3 were correct... or were they?
 
You post a GR into stock for a purchase order item for which the indicator Free item is set. The material has a material master record and a material type for which quantity and values are updated. The price control parameter has the value Standard price for the material. To which general ledger accounts are the postings made? 
  1. Stock account
  2. Consumption account
  3. GR/IR clearing account
  4. Price difference account

To thoroughly understand any question posed, do your research, but support your research by testing the theory. In this instance, create a PO with the 'Free item' field checked, create an identical PO without the 'Free item' field checked, and then compare the FI documents you created via the PO history.
 
A gold star to anyone who responds with the correct answer.
 
12.  BI reporting
 
Hi!
 
I'm looking for an SAP BI reporting course. Do any of you know of such a course, and what opportunities it might open up in future?
 
Regards,
Dinesh
 
13.  SAP Fiori - Using SAP in a web browser - Simplified!

SAP Fiori is perhaps best described as a very simplified version of SAP provided to the user. SAP Fiori sits on SAP NetWeaver Gateway and offers rich out-of-the-box business process capabilities by leveraging your existing platform and mobilising it through the browser rather than through a mobile platform.

You can create a sales order, approve a purchase order, and use some 25 such profiles, all in a web browser connected to the SAP backend.

Here is a demonstration example: 

In the internet browser on your phone or laptop, open the link (after the proper installations). A page in the browser will then prompt the user to enter a user ID and password – the same authorisation you use in the SAP backend system. Depending on your profile, you will be prompted to select from profiles 1 to 25.

You are then given the ability to create a sales order from scratch in the web browser, enter shipping instructions, etc., and confirm.

The best part is that you can execute the sales order in a web browser on a mobile, laptop, or desktop, as it is built on HTML5 as a mobile-responsive site.

 
14.  SAP HANA Architecture

The SAP HANA database consists of the Index Server, Name Server, Statistics Server, Preprocessor Server and XS Engine.

 
  1. The Index Server contains the actual data and the engines for processing the data. It also coordinates and uses all the other servers.
  2. The Name Server holds information about the SAP HANA database topology. This is used in a distributed system with instances of the HANA database on different hosts. The name server knows where the components are running and which data is located on which server.
  3. The Statistics Server collects information about status, performance and resource consumption from all the other server components. From SAP HANA Studio we can access the Statistics Server to get the status of various alert monitors.
  4. The Preprocessor Server is used for analysing text data and extracting the information on which the text search capabilities are based.
  5. The XS Engine is an optional component. Using the XS Engine, clients can connect to the SAP HANA database to fetch data via HTTP.

 

The SAP HANA Index Server performs 7 key functions to accelerate and optimize analytics. Together, these functions provide robust security and data protection and enhanced data access.

  • Connection and Session Management – This component initializes and manages sessions and connections for the SAP HANA Database using pre-established session parameters. SAP has long been known for excellence in session management through its integration of SAPRouter into the SAPGUI product used as a front end for accessing the ABAP stack. SAP HANA retains the ability to configure connection and session management parameters to accommodate complex security and data transfer policies.
  • Authentication – User and role-based privileges are authenticated by the SAP HANA Database. (The Users, Authorizations and Roles within the SAP ERP system are not applicable or transportable to the SAP HANA instance.) The SAP HANA authentication model allows granting of privileges to users or roles, and a privilege grants the right to perform a specified SQL operation on a specific Object. SAP HANA also utilizes a set of Analytic Privileges that represent filters or hierarchy drilldown limitations for analytic queries to protect sensitive data from unauthorized users. This model enforces “Segregation of Duty” for clients that have regulatory requirements for the security of data.
  • SQL Processor – The SQL Processor segments data queries and directs them to specialty query processing engines for optimized performance. It also ensures that SQL statements are accurately authored and provides some error handling to make queries more efficient. The SQL processor contains several engines and processors that optimize query execution:
    • The Multidimensional Expressions (MDX) Engine queries and manipulates the multidimensional data stored in OLAP (OnLine Analytical Processing) data cubes.
    • The Planning Engine enables the basic planning operations within the SAP HANA Database for financial planning operations.
    • The Stored Procedure Processor executes procedure calls for optimized processing without reinterpretation.  (e.g. converting a standard InfoCube into an SAP HANA Optimized Infocube)
    • The Calculation Engine converts data into Calculation Models and creates Logical Execution Plans to support parallel processing.
  • Relational Stores – SAP has further segmented the storage of In-Memory data into compartments within memory for speedier access.  Data not needed immediately is stored on a Physical Disk as opposed to RAM.  This allows quick access to the most relevant data. The SAP HANA Database houses four relational stores that optimize query performance:

 

    • The Row Store stores data in a row-type fashion and is optimized for high performance of write operations; it is derived from the P-Time “in-memory system”, which was acquired by SAP in 2005. The Row Store is held fully in RAM.
    • The Column Store stores data in a column-type fashion and is optimized for high performance of read operations; it is derived from TREX (Text Retrieval and Extraction), which SAP unveiled in the SAP NetWeaver Search and Classification product. This technology was further developed into a full relational column-based store. The Column Store is held fully in RAM.
    • The Object Store is an integration of SAP liveCache technology into the SAP HANA Database.
    • The Disk Based Store is used for data that does not need to be held in memory and is best suited to “tracing data” or old data that is no longer used. The Disk Based Store is located on a hard disk and is pulled into RAM as needed.
  • Transaction Manager – The SAP HANA Database processes individual SQL statements as transactions.  The Transaction Manager controls and coordinates transactions and sends relevant data to appropriate engines and to the Persistence Layer. This segmentation simplifies administration and troubleshooting.
  • Persistence Layer – The Persistence Layer provides built-in disaster recovery for the SAP HANA Database. The algorithms and technology are based on concepts pioneered by MaxDB and ensure that the database is restored to the most recent committed state after a planned or unplanned restart. Backups are stored as savepoints in the data volumes via a savepoint coordinator, which is typically set to run every five to ten minutes. Any changes that occur after a savepoint are designated as uncommitted transactions and are stored in the transaction log volume. Typically, these volumes are saved to media and shipped offsite for a cold-backup disaster recovery remedy. (A toy sketch of this log-plus-savepoint idea follows this list.)
  • Repository – The Repository manages the versioning of metadata objects such as Attribute Views, Analytic Views and Stored Procedures. It also enables the import and export of Repository content.
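To make that concrete, here is a hedged, toy Python sketch of a write-ahead-log-plus-savepoint scheme (all names are mine; this illustrates the concept, not SAP code): committed changes are logged before being applied in memory, a savepoint persists the full state, and recovery restores the last savepoint and replays the log written after it.

    import json, os

    # Toy write-ahead log + savepoint persistence (concept illustration only).
    class ToyPersistence:
        def __init__(self, log_path="txlog.jsonl", savepoint_path="savepoint.json"):
            self.log_path, self.savepoint_path = log_path, savepoint_path
            self.state = {}

        def commit(self, key, value):
            # Write-ahead: persist the log entry before the in-memory change.
            with open(self.log_path, "a") as log:
                log.write(json.dumps({"key": key, "value": value}) + "\n")
            self.state[key] = value

        def savepoint(self):
            # Persist the full state; the log up to this point is no longer needed.
            with open(self.savepoint_path, "w") as f:
                json.dump(self.state, f)
            if os.path.exists(self.log_path):
                os.remove(self.log_path)

        def recover(self):
            # Restore the last savepoint, then replay log entries written after it.
            self.state = {}
            if os.path.exists(self.savepoint_path):
                with open(self.savepoint_path) as f:
                    self.state = json.load(f)
            if os.path.exists(self.log_path):
                with open(self.log_path) as f:
                    for line in f:
                        entry = json.loads(line)
                        self.state[entry["key"]] = entry["value"]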

[Figure: SAP HANA connectivity overview]

 

Now let us check the architecture components of SAP HANA Index Server.

SAP HANA Index Server Architecture:

[Figure: Block diagram of the engine components]

  1. The Connection and Session Management component is responsible for creating and managing sessions and connections for the database clients. Once a session is established, clients can communicate with the SAP HANA database using SQL statements. For each session, a set of parameters is maintained, such as auto-commit and the current transaction isolation level. Users are authenticated either by the SAP HANA database itself (login with user and password), or authentication can be delegated to an external authentication provider such as an LDAP directory.
  2. The client requests are analyzed and executed by the set of components summarized as Request Processing and Execution Control. The Request Parser analyses the client request and dispatches it to the responsible component. The Execution Layer acts as the controller that invokes the different engines and routes intermediate results to the next execution step. For example, Transaction Control statements are forwarded to the Transaction Manager, Data Definition statements are dispatched to the Metadata Manager, and object invocations are forwarded to the Object Store. Data Manipulation statements are forwarded to the Optimizer, which creates an Optimized Execution Plan that is subsequently forwarded to the Execution Layer.
    • The SQL Parser checks the syntax and semantics of the client SQL statements and generates the Logical Execution Plan. Standard SQL statements are processed directly by the DB engine.
    • The SAP HANA database has its own scripting language named SQLScript that is designed to enable optimizations and parallelization. SQLScript is a collection of extensions to SQL. SQLScript is based on side effect free functions that operate on tables using SQL queries for set processing. The motivation for SQLScript is to offload data-intensive application logic into the database.
    • Multidimensional Expressions (MDX) is a language for querying and manipulating the multidimensional data stored in OLAP cubes.
    • The SAP HANA database also contains a component called the Planning Engine that allows financial planning applications to execute basic planning operations in the database layer. One such basic operation is to create a new version of a dataset as a copy of an existing one while applying filters and transformations. For example: Planning data for a new year is created as a copy of the data from the previous year. This requires filtering by year and updating the time dimension. Another example for a planning operation is the disaggregation operation that distributes target values from higher to lower aggregation levels based on a distribution function.
    • The SAP HANA database also has built-in support for domain-specific models (such as for financial planning) and it offers scripting capabilities that allow application-specific calculations to run inside the database.

    The SAP HANA database features such as SQLScript and Planning operations are implemented using a common infrastructure called the Calc engine. The SQLScript, MDX, Planning Model and Domain-Specific models are converted into Calculation Models. The Calc Engine creates Logical Execution Plan for Calculation Models. The Calculation Engine will break up a model, for example some SQL Script, into operations that can be processed in parallel. The engine also executes the user defined functions.

  3. In HANA database, each SQL statement is processed in the context of a transaction. New sessions are implicitly assigned to a new transaction. The Transaction Manager coordinates database transactions, controls transactional isolation and keeps track of running and closed transactions. When a transaction is committed or rolled back, the transaction manager informs the involved engines about this event so they can execute necessary actions. The transaction manager also cooperates with the persistence layer to achieve atomic and durable transactions.
  4. Metadata can be accessed via the Metadata Manager. The SAP HANA database metadata comprises a variety of objects, such as definitions of relational tables, columns, views and indexes, definitions of SQLScript functions, and object store metadata. Metadata of all these types is stored in one common catalog for all SAP HANA database stores (in-memory row store, in-memory column store, object store, disk-based). Metadata is stored in tables in the row store. The SAP HANA database features such as transaction support and multi-version concurrency control are also used for metadata management. In distributed database systems, central metadata is shared across servers. How metadata is actually stored and shared is hidden from the components that use the metadata manager.
  5. The Authorization Manager is invoked by other SAP HANA database components to check whether the user has the required privileges to execute the requested operations. SAP HANA allows granting of privileges to users or roles. A privilege grants the right to perform a specified operation (such as create, update, select, execute, and so on) on a specified object (for example a table, view, SQLScript function, and so on).The SAP HANA database supports Analytic Privileges that represent filters or hierarchy drilldown limitations for analytic queries. Analytic privileges grant access to values with a certain combination of dimension attributes. This is used to restrict access to a cube with some values of the dimensional attributes.
  6. The Database Optimizer takes the Logical Execution Plan from the SQL Parser or the Calc Engine as input and generates the optimised Physical Execution Plan based on the database statistics. The optimizer determines the best plan for accessing row or column stores.
  7. The Database Executor executes the Physical Execution Plan to access the row and column stores and also processes all the intermediate results.
  8. The Row Store is the SAP HANA database's row-based in-memory relational data engine. It is optimized for high performance of write operations and is interfaced from the calculation/execution layer. Optimised write and read operations are possible due to storage separation, i.e. Transactional Version Memory and the Persisted Segment. [Figure: Row Store block diagram]
    • Transactional Version Memory contains temporary versions, i.e. recent versions of changed records. This is required for Multi-Version Concurrency Control (MVCC). Write operations mainly go into Transactional Version Memory; an INSERT statement also writes to the Persisted Segment.
    • The Persisted Segment contains data that may be seen by any ongoing active transaction, i.e. data that was committed before any active transaction started.
    • Version Memory Consolidation moves the recent versions of changed records from Transactional Version Memory to the Persisted Segment, based on Commit ID. It also clears outdated record versions from Transactional Version Memory. It can be considered the garbage collector for MVCC.
    • Segments contain the actual data (the content of row-store tables) in pages. Row store tables are linked lists of memory pages. Pages are grouped in segments. The typical page size is 16 KB.
    • The Page Manager is responsible for memory allocation and keeps track of free/used pages.
  9. The Column Store is the SAP HANA database's column-based in-memory relational data engine. Parts of it originate from TREX (Text Retrieval and Extraction), i.e. SAP NetWeaver Search and Classification. For the SAP HANA database this proven technology was further developed into a full relational column-based data store. It offers efficient data compression and is optimized for high performance of read operations, interfaced from the calculation/execution layer. Optimised read and write operations are possible due to storage separation, i.e. Main and Delta storage (see the sketch after this list). [Figure: Column Store block diagram]
    • The Main Storage contains the compressed data in memory for fast read access.
    • The Delta Storage is meant for fast write operations. An update is performed by inserting a new entry into the delta storage.
    • The Delta Merge is an asynchronous process that moves changes in the delta storage into the compressed, read-optimized main storage. Even during the merge operation the columnar table is still available for read and write operations. To fulfil this requirement, a second delta and main storage are used internally.
    • During a read operation, data is always read from both main and delta storage, and the result sets are merged. The engine uses multi-version concurrency control (MVCC) to ensure consistent read operations.
    • As row tables and columnar tables can be combined in one SQL statement, the corresponding engines must be able to consume intermediate results created by each other. A main difference between the two engines is the way they process data: Row store operators process data in a row-at-a-time fashion using iterators. Column store operations require that the entire column is available in contiguous memory locations. To exchange intermediate results, row store can provide results to column store materialized as complete rows in memory while column store can expose results using the iterator interface needed by row store.
  10. The Persistence Layer is responsible for the durability and atomicity of transactions. It ensures that the database is restored to the most recent committed state after a restart and that transactions are either completely executed or completely undone. To achieve this goal in an efficient way, the persistence layer uses a combination of write-ahead logs, shadow paging and savepoints. The persistence layer offers interfaces for writing and reading data. It also contains SAP HANA's logger, which manages the transaction log. Log entries can be written implicitly by the persistence layer when data is written via the persistence interface, or explicitly by using a log interface.
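As an illustration of the main/delta separation described in point 9, here is a hedged toy sketch in Python (names and structures are mine, and grossly simplified): writes are appended to a small delta structure, reads combine main and delta, and a delta merge periodically folds the delta into the read-optimized main storage.

    # Toy model of a column store's main/delta storage (illustration only).
    class ToyColumn:
        def __init__(self):
            self.main = []    # read-optimized storage (sorted; would be compressed)
            self.delta = []   # write-optimized storage (append-only)

        def insert(self, value):
            # Writes go to the delta storage only.
            self.delta.append(value)

        def read_all(self):
            # Reads always combine main and delta storage.
            return self.main + self.delta

        def delta_merge(self):
            # Fold the delta into main and rebuild the read-optimized layout.
            self.main = sorted(self.main + self.delta)
            self.delta = []

    col = ToyColumn()
    col.insert(42)
    col.insert(7)
    print(col.read_all())   # [42, 7] - served partly from delta before the merge
    col.delta_merge()
    print(col.main)         # [7, 42] - now in read-optimized main storage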

Distributed System and High Availability

The SAP HANA appliance software supports high availability. SAP HANA scales beyond one server and can remove the possibility of a single point of failure. A typical distributed scale-out cluster landscape has many server instances in a cluster, so large tables can be distributed across multiple servers and queries can be executed across servers. A distributed SAP HANA system also ensures transaction safety.

Features

  • Active servers, or worker hosts, in the cluster.
  • Standby server(s) in the cluster.
  • A shared file system for all servers; several instances of SAP HANA share the same metadata.
  • Each server hosts an Index Server and a Name Server.
  • Only one active server hosts the Statistics Server.
  • During startup, one server gets elected as Active Master.
  • The Active Master assigns a volume to each starting Index Server, or no volume in the case of cold standby servers.
  • Up to 3 Master Name Servers can be defined or configured.
  • A maximum of 16 nodes is supported in High Availability configurations.

 

Name Server Configured Role   Name Server Actual Role   Index Server Configured Role   Index Server Actual Role
Master 1                      Master                    Worker                         Master
Master 2                      Slave                     Worker                         Slave
Master 3                      Slave                     Worker                         Slave
Slave                         Slave                     Standby                        Standby

 

Failover

  • High Availability enables the failover of a node within one distributed SAP HANA appliance. Failover uses a cold standby node and is triggered automatically. When an active server X fails, standby server N+1 reads the indexes from the shared storage and takes over the logical connection of the failed server X (as sketched below).
  • If the SAP HANA system detects a failover situation, the work of the services on the failed server is reassigned to the services running on the standby host. The failed volume and all the included tables are reassigned and loaded into memory in accordance with the failover strategy defined for the system. This reassignment can be performed without moving any data, because all the persistency of the servers is stored on a shared disk. Data and logs are stored on shared storage, where every server has access to the same disks.
  • The Master Name Server detects an Index Server failure and executes the failover. During the failover, the Master Name Server assigns the volume of the failed Index Server to the cold standby server. In the case of a Master Name Server failure, one of the remaining Name Servers becomes Active Master.
  • Before a failover is performed, the system waits for a few seconds to determine whether the service can be restarted. A standby node can take over the role of a failing master or a failing slave node.
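Here is a hedged toy model of that failover in Python (all names are mine; a concept illustration, not SAP code): the master name server reassigns the failed worker's volume to a cold standby, which is possible because persistency lives on shared storage.

    # Toy model of scale-out failover (illustration only).
    class Host:
        def __init__(self, name, role, volume=None):
            self.name, self.role, self.volume = name, role, volume

    def fail_over(hosts, failed_name):
        # Reassign the failed worker's volume to a cold standby host;
        # no data moves, since persistency is on shared storage.
        failed = next(h for h in hosts if h.name == failed_name)
        standby = next(h for h in hosts if h.role == "standby")
        standby.volume, standby.role = failed.volume, "worker"
        failed.volume, failed.role = None, "failed"
        return standby

    hosts = [Host("server1", "worker", volume=1),
             Host("server2", "worker", volume=2),
             Host("server3", "standby")]
    takeover = fail_over(hosts, "server2")
    print(takeover.name, takeover.volume)  # server3 takes over volume 2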
 
15.  Best Practice to Publish SAP Syclo Code

The following steps describe a best-practice scenario for publishing Agentry code when developers are located in different countries.

1. Developers (Developer A, Developer B and Developer C, located in 3 different countries) 

        1.1 Unit test their code using their local environment (Eclipse, Agentry server).
        1.2 Commit their code to the central repository (Agentry share, SVN) once unit tested.
         => This ensures there is no impact on testers or other developers.

For a future release, for example Sprint 2 (in Syclo terminology):


2. Demo Sprint 2 builders (Developers A and B)
        2.1 Publish the Agentry application to the central Agentry server.
        2.2 Export the Java code to the central Agentry server.
        2.3 Restart the central Agentry server.
        => This builds a coherent application, with the possibility to revert to a previous working level.


3. Functionality testers (X, Y and Z)
        3.1 Run functional tests of the build using an Agentry client on their laptops, connected to the central Agentry server.

 
16.  BC T-Codes, Tables & Functions
SAP BC TRANSACTIONS / T-CODES
LSMW Legacy System Migration Workbench. An add-on available from SAP that can make data conversion a lot easier. Thanks to Serge Desland for this one.
DI02 ABAP/4 Repository Information System: Tables.
OSS1 SAP Online Service System
OY19 Compare Tables
SM13 Update monitor. Will show update tasks status. Very useful to determine why an update failed.
S001 ABAP Development Workbench
S002 System Administration
SA38 Execute a program
SCAT Computer Aided Test Tool
SCU0 Compare Tables
SE01 Old Transport & Corrections screen
SE03 Groups together most of the tools that you need for doing transports. In total, more than 20 tools can be reached from this one transaction.
SE09 Workbench Organizer
SE10 New Transport & Correction screen
SE11 ABAP/4 Dictionary Maintenance
SE12 ABAP/4 Dictionary Display
SE13 Maintain Technical Settings (Tables)
SE12 Dictionary: Initial Screen - enter object name
SE13 Access tables in ABAP/4 Dictionary
SE14 Utilities for Dictionary Tables
SE15 ABAP/4 Repository Information System
SE16 Data Browser: Initial Screen
SE16N Table Browser (the N stands for New, it replaces SE16). Provided by Smijo Mathew.
SE17 General Table Display
SE24 Class Builder
SE30 ABAP/4 Runtime Analysis
SE32 ABAP/4 Text Element Maintenance
SE35 ABAP/4 Dialog Modules
SE36 ABAP/4: Logical Databases
SE37 ABAP/4 Function Modules
SE38 ABAP Editor
SE39 Splitscreen Editor: Program Compare
SE41 Menu Painter
SE43 Maintain Area Menu
SE48 Show program call hierarchy. Very useful to see the overall structure of a program. Thanks to Isabelle Arickx for this tcode.
SE49 Table manipulation. Show what tables are behind a transaction code. Thanks to Isabelle Arickx for this tcode.
SE51 Screen Painter: Initial Screen
SE54 Generate View Maintenance Module
SE61 R/3 Documentation
SE62 Industry utilities
SE63 Translation
SE64 Terminology
SE65 R/3 document short text statistics
SE66 R/3 Documentation Statistics (Test!)
SE68 Translation Administration
SE71 SAPscript layout set
SE71 SAPScript Layouts Create/Change
SE72 SAPscript styles
SE73 SAPscript font maintenance (revised)
SE74 SAPscript format conversion
SE75 SAPscript Settings
SE76 SAPscript Translation Layout Sets
SE77 SAPscript Translation Styles
SE80 ABAP/4 Development Workbench
SE81 SAP Application Hierarchy
SE82 Customer Application Hierarchy
SE83 Reuse Library. Provided by Smiho Mathew.
SE84 ABAP/4 Repository Information System
SE85 ABAP/4 Dictionary Information System
SE86 ABAP/4 Repository Information System
SE87 Data Modeler Information System
SE88 Development Coordination Info System
SE91 Maintain Messages
SE92 Maintain system log messages
SE93 Maintain Transaction
SEARCH_SAP_MENU From the SAP Easy Access screen, type it in the command field and you will be able to search the standard SAP menu for transaction codes / keywords. It will return the nodes to follow for you.
SEU Object Browser
SHD0 Transaction variant maintenance
SM04 Overview of Users (cancel/delete sessions)
SM12 Lock table entries (unlock locked tables)
SM21 View the system log; very useful when you get a short dump. Provides much more info than the short dump itself.
SM30 Maintain Table Views
SM31 Table Maintenance
SM32 Table maintenance
SM35 View Batch Input Sessions
SM37 View background jobs
SM50 Process Overview
SM51 List of SAP application servers
SM62 Display/Maintain events in SAP, also use function BP_EVENT_RAISE
SMEN Display the menu path to get to a transaction
SMOD/CMOD Transactions for processing/editing/activating new customer enhancements.
SNRO Object browser for number range maintenance
SPRO Start SAP IMG (Implementation Guide)
SQ00 ABAP/4 Query: Start Queries
SQ01 ABAP/4 Query: Maintain Queries
SQ02 ABAP/4 Query: Maintain Funct. Areas
SQ03 ABAP/4 Query: Maintain User Groups
SQ07 ABAP/4 Query: Language Comparison
ST05 Trace SQL Database Requests
ST22 ABAP Dump analysis
SU53 Display Authorization Values for User
WEDI EDI Menu. IDOC and EDI base.
WE02 Display an IDOC
WE07 IDOC Statistics
   
SAP BC TABLES
DD02L Tables in SAP
DD02T Tables description
DD03L Field names in SAP
DD03T Field description in SAP
Workbench
TADIR Directory of R/3 Repository Objects
TRDIR System table TRDIR (directory of ABAP programs)
TFDIR Function Module
TLIBG Person responsible for function class
TLIBT Function Group Short Texts
TFTIT Function Module Short Text
TSTC Transaction codes in SAP
TSTCT Transaction codes texts
T100 Message texts (e.g. E000)
VARID Variant data
D020T Screen texts
TDEVC Development class
TDEVCT Texts for development classes
User administration
USR01 User master
USR02 Logon data
USR03 User address data
USR04 User master authorizations
USR11 User Master Texts for Profiles (USR10)
UST12 User master: Authorizations
USR12 User master authorization values
USR13 Short Texts for Authorizations
USR40 Prohibited passwords
TOBJ Objects
TOBC Authorization Object Classes
TPRPROF Profile Name for Activity Group
DEVACCESS Table for development user
Batch input queue
APQD DATA DEFINITION Queue
APQI Queue info definition
Job processing
TBTCO Job status overview table
TBTCP Batch job step overview
Spool
TSP02 Spool: Print requests
Runtime errors
SNAP Runtime errors
Message control
TNAPR Processing programs for output
NAST Message status
NACH Printer determination
EDI
EDIDC Control record
EDIDD Data record
EDID2 Data record 3.0 Version
EDIDS EDI status record
EDPAR Convert External < > Internal Partner Number
EDPVW EDI partner types
EDPI1 EDI partner profile inbound
EDPO1/2/3 EDI partner profile outbound
Change documents
CDHDR Change document header
CDPOS Change document positions (items)
JCDS Change Documents for System/User Statuses (Table JEST)
Reporting tree table
SERPTREE Reporting: tree structure
LIS structure/control tables
TMC4 Global Control Elements: LIS Info Structure
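
As a quick illustration of how these cross-reference tables can be read, here is a minimal ABAP sketch (the report name ZSHOW_TCODE_TEXT is hypothetical) that looks up the short text of a transaction code in TSTCT, the transaction-code texts table listed above:

REPORT zshow_tcode_text.

PARAMETERS p_tcode TYPE tstc-tcode DEFAULT 'SE38'.

DATA lv_ttext TYPE tstct-ttext.

* TSTC holds the transaction codes, TSTCT their language-dependent texts
SELECT SINGLE ttext FROM tstct
  INTO lv_ttext
  WHERE sprsl = sy-langu
    AND tcode = p_tcode.

IF sy-subrc = 0.
  WRITE: / p_tcode, lv_ttext.
ELSE.
  WRITE: / p_tcode, 'has no text in the logon language'.
ENDIF.

The same pattern works for most of the tables above, e.g. reading TDEVCT for development class texts.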
   
SAP BC FUNCTIONS
ABAP_DOCU_DOWNLOAD Download ABAP documentation in HTML format.
APPL_LOG_DISPLAY With this function module you can analyze logs in the database.
APPL_LOG_INIT This function module checks whether the specified object or sub-object exists and deletes all existing associated data in local memory.
APPL_LOG_READ_INTERN With this function module you read all log data whose log class has at least the specified value, from local memory, for the specified object or sub-object.
APPL_LOG_WRITE_DB With this function module you write all data for the specified object or sub-object in local memory to the database. If the log for the object or sub-object in question is new, the log number is returned to the calling program.
APPL_LOG_WRITE_LOG_PARAMETERS With this function module, you write the name of the log parameters and the associated values for the specified object or sub-object in local memory. If this function module is called repeatedly for the same object or sub-object, the existing parameters are updated accordingly. If you do not specify an object or sub-object with the call, the most recently used is assumed.
APPL_LOG_WRITE_MESSAGES With this function module you write one or more messages, without parameters, in local memory.
ARFC_GET_TID will return the IP address of the terminal in hex.
BAL_CNTL_FREE Release Control
BAL_DB_DELETE Delete logs from the database
BAL_DB_ENQUEUE Lock log
BAL_DB_LOAD Load log(s)
BAL_DB_SAVE Save log(s)
BAL_DB_SEARCH Find logs in the database
BAL_DSP_LOG_DISPLAY Display Log
BAL_DSP_LOG_TECHNICAL_DATA Output all log header data
BAL_DSP_MSG_PARAMETERS Either output extended long text or call a callback routine (based on the data in BAL_S_MSG-PARAMS)
BAL_DSP_OUTPUT_FREE End output
BAL_DSP_OUTPUT_SET_DATA Set dataset to be displayed
BAL_DSP_PROFILE_NO_TREE_GET Display without tree (fullscreen)
BAL_DSP_PROFILE_SINGLE_LOG_GET Standard profile (SLG1) for one log
BAL_GLB_AUTHORIZATION_GET Assign authorization
BAL_GLB_CONFIG_GET Read configuration
BAL_GLB_MEMORY_EXPORT Put function group memory in ABAP-MEMORY
BAL_GLB_MEMORY_REFRESH (Partially) reset global memory
BAL_GLB_MSG_CURRENT_HANDLE_GET Get current message handle
BAL_GLB_SEARCH_LOG Find logs in memory
BAL_LOG_CREATE Create log with header data
BAL_LOG_DELETE Delete log (from database also at Save)
BAL_LOG_HDR_CHANGE Change log header
BAL_LOG_HDR_READ Read log header and other data
BAL_LOG_MSG_ADD Put message in log
BAL_LOG_MSG_CHANGE Change message
BAL_LOG_MSG_CUMULATE Add message cumulated
BAL_LOG_MSG_DELETE Delete message
BAL_LOG_MSG_READ Read message and other data
BAL_LOG_REFRESH Delete log from memory
BAL_MSG_DISPLAY_ABAP Output message as ABAP-MESSAGE
BAL_OBJECT_SUBOBJECT Check whether object and subobject exist and the combination is allowed
BP_EVENT_RAISE Trigger an event from ABAP/4 program
CHANGEDOCUMENT_READ_HEADERS Get the change document header for a sales document, and put the results in an internal table.
CHANGEDOCUMENT_READ_POSITIONS Get the details of a change document, and store them in an internal table. This will tell you whether a field was changed, deleted, or updated.
CLAF_CLASSIFICATION_OF_OBJECTS Return all of the characteristics for a material
CLPB_EXPORT Export a text table to the clipboard (on presentation server)
COMMIT_TEXT To load long text into SAP
CONVERSION_EXIT_ALPHA_OUTPUT removes leading zeros from a number, returning a simple integer (e.g. '0000123' becomes '123')
CONVERT_OTF Convert SAP documents (SAPScript) to other types.
CONVERT_TO_FOREIGN_CURRENCY Convert local currency to foreign currency.
CONVERT_TO_LOCAL_CURRENCY Convert from foreign currency to local currency
DATE_CHECK_PLAUSIBILITY Check to see if a date is in a valid format for SAP. Works well when validating dates being passed in from other systems.
DATE_GET_WEEK will return the week that a date is in.
DATE_IN_FUTURE Calculate a date N days in the future.
DAY_ATTRIBUTES_GET Return useful information about a day. Will tell you the day of the week as a word (Tuesday), the day of the week as a number (2 would be Tuesday), whether the day is a holiday, and more.
ENQUE_SLEEP Wait a specified period of time before continuing processing.
EPS_GET_DIRECTORY_LISTING return a list of filenames from a local or network drive
F4_DATE displays a calendar in a popup window and allows user to choose a date, or it can be displayed read only.
F4IF_INT_TABLE_VALUE_REQUEST F4 help that returns the values selected in an internal table. Very handy when programming your very own F4 help for a field.
F4IF_SHLP_EXIT_EXAMPLE documents the different reasons to use a search help exit, and shows how it is done.
F4IP_INT_TABLE_VALUE_REQUEST This function does not exist in 4.6 and above. Use F4IF_INT_TABLE_VALUE_REQUEST instead.
FM_SELECTION_CRITERIA_PRINT Print out selection criteria. Nicely formatted.
FTP_COMMAND Execute a command on the FTP server
FTP_DISCONNECT Close the connection (and log off) the FTP server
Function Group GRAP is now obsolete; SAP recommends using the functions in function group SFES instead.
GET_CURRENT_YEAR Get the current fiscal year.
GET_GLOBAL_SYMBOLS Returns a list of all tables, select options, texts, etc for a program. Even includes the text definitions for the selection screen
GET_JOB_RUNTIME_INFO Get the current job number from a program. Also returns other useful info about the current job.
GUI_CREATE_DIRECTORY Create a directory on the presentation server
GUI_DOWNLOAD Replaces WS_DOWNLOAD. Download a table from the application server to the presentation server
GUI_GET_DESKTOP_INFO Replaces WS_QUERY. Delivers information about the desktop (client)
GUI_REMOVE_DIRECTORY Delete a directory on the presentation server
GUI_UPLOAD Replaces WS_UPLOAD. Upload a file from the presentation server to the application server
HELP_VALUES_GET_WITH_TABLE Show a list of possible values for F4 popup help on selection screens. This function module pops up a screen that is just like all the other F4 helps, so it looks like the rest of the SAP system. Very useful for providing dropdowns on fields that do not have them predefined.
HOLIDAY_CHECK_AND_GET_INFO Useful for determining whether or not a date is a holiday. Give the function a date, and a holiday calendar, and you can determine if the date is a holiday by checking the parameter HOLIDAY_FOUND.
HR_DISPLAY_BASIC_LIST is an HR function, but can be used for any data. You pass it data and column headers, and it provides a table control with the ability to manipulate the data and send it to Word or Excel.
HR_GET_LEAVE_DATA Get all leave information (includes leave entitlement, used holidays/paid out holidays)
HR_IE_NUM_PRSI_WEEKS Return the number of weeks between two dates.
CLOI_PUT_SIGN_IN_FRONT Move the negative sign from the left-hand side of a number to the right-hand side. Note that the result will be left justified (like all character fields), not right justified as numbers normally are.
CLPB_IMPORT Import a Text Table from the Clipboard (on presentation server)
CONVERSION_EXIT_ALPHA_INPUT pads a number with leading zeros, with the number at the extreme right (e.g. '123' becomes '0000000123')
CONVERT_ABAPSPOOLJOB_2_PDF convert abap spool output to PDF
CONVERT_OTFSPOOLJOB_2_PDF converts a OTF spool to PDF (i.e. Sapscript document)
DATE_COMPUTE_DAY Returns a number indicating what day of the week the date falls on. Monday is returned as a 1, Tuesday as 2, etc.
DATE_TO_DAY Converts a date in internal format to a text description of a day. For example 20030529 returns Thursday
DOWNLOAD download a file to the presentation server (PC)
DYNP_VALUES_UPDATE Similar to DYNP_VALUES_READ, this function will allow the updating of fields on a dynpro. Very useful when you want to change a field based on the value entered for another field.
ENQUEUE_ESFUNCTION Lock an abap program so that it cannot be executed.
EPS_GET_FILE_ATTRIBUTES Pass in a filename and a path, and will return attributes for the file
F4_IF_FIELD_VALUE_REQUEST Use values from a DDIC table to provide a list of possible values. TABNAME and FIELDNAME are required fields, and when MULTIPLE_CHOICE is selected, more than one value can be returned.
FILENAME_GET popup to get a filename from a user, returns blank filename if user selects cancel
FORMAT_MESSAGE Takes a message ID and number, and puts the result into a variable. Works better than WRITE_MESSAGE, since some messages use $ as a placeholder and WRITE_MESSAGE does not accommodate that; it only replaces the ampersands (&) in the message.
FTP_CONNECT Open a connection (and log in) to an FTP server
FU CSAP_MAT_BOM_READ You can use this function module to display simple material BOMs, as in transaction CS03. You cannot display BOM groups (for example, all variants of a variant BOM). Current restrictions: You cannot display long texts. You cannot display sub-items. You cannot display classification data of BOM items for batches. You can only display one alternative or variant; you cannot enter an alternative for module CSAP_MAT_BOM_READ, so you always see alternative 01. The following example came from a posting on the SAP-R3-L mailing list.
G_SET_GET_ALL_VALUES Fetch values from a set.
GET_INCLUDETAB Returns a list of all INCLUDES in a program
GET_PAYSLIP Returns a fully formatted payslip, ready for displaying
GUI_DELETE_FILE Replaces WS_FILE_DELETE. Delete a file on the presentation server
GUI_EXEC Replaces WS_EXECUTE. Start a File or Program Asynchronously with WinExec
GUI_RUN Start a File or Program Asynchronously with ShellExecute
HELP_START Display help for a field. Useful for doing AT SELECTION SCREEN ON VALUE REQUEST for those fields that do not provide F4 help at the DDIC level.
MS_EXCEL_OLE_STANDARD_OLE will build a file, and automatically start Excel
OTF_CONVERT wraps several other function modules. Will convert OTF to ASCII or PDF
POPUP_TO_CONFIRM_LOSS_OF_DATA Create a dialog box asking whether the user wishes to perform a processing step with loss of data.
POPUP_TO_CONFIRM_STEP Create a dialog box asking whether the user wishes to perform the step.
POPUP_TO_CONFIRM_WITH_MESSAGE Create a dialog box in which you inform the user about a specific decision point during an action.
POPUP_TO_CONFIRM_WITH_VALUE Create a dialog box asking whether the user wishes to perform a processing step with a particular object.
POPUP_TO_DECIDE Provide user with several choices as radio buttons
POPUP_TO_DECIDE_WITH_MESSAGE Create a dialog box in which you inform the user about a specific decision point via a diagnosis text.
POPUP_TO_DISPLAY_TEXT Create a dialog box in which you display a two line message
POPUP_TO_SELECT_MONTH Popup to choose a month
POPUP_WITH_TABLE_DISPLAY Provide a display of a table for user to select one, with the value of the table line returned when selected.
PRICING Return pricing conditions in an internal table. Use structure TCOMK for parameter COMM_HEAD_1 and structure TCOMP for parameter COMM_ITEM_1, and set CALCULATION_TYPE to B. The pricing conditions will be returned in XOMV. You must fill TCOMP and TCOMK with the appropriate values before calling the function in order for it to work.
PROFILE_GET Read an Entry in an INI File on the frontend
PROFILE_SET Write an Entry in an INI File on the frontend
READ_TEXT To load long text into SAP
REGISTRY_GET Read an Entry from the Registry
REGISTRY_SET Set an entry in the Registry
RH_DATA_COLLECTOR_ORGCHART get org info and put in tables suitable for displaying an org structure
RH_GET_ACTIVE_WF_PLVAR Return the active HR Plan
RH_GET_DATE_DAYNAME Return the day name based on the date provided
RH_READ_INFTY generic PD infotype read with authorization checks
RH_STRUC_GET Returns all related org info
RHP0_POPUP_F4_SEARK is a matchcode for any type of HR Planning object, including the possibility to fill the field that you want
RKD_WORD_WRAP Convert a long string or phrase into several lines.
RP_CALC_DATE_IN_INTERVAL Add/subtract years/months/days from a date
RP_LAST_DAY_OF_MONTHS Determine last day of month
RP_PNP_ORGSTRUCTURE Show a dialog box with the org structure displayed. User is then allowed to choose org units.
RPY_DYNPRO_READ Read dynpro, including screen flow
RPY_TRANSACTION_READ Given a transaction, return the program and screen or given a program and screen, return the transactions that use the program and screen.
RS_COVERPAGE_SELECTIONS Returns an internal table that contains a formatted list of all the selection parameters entered for a report. Table is ready to print out.
RS_REFRESH_FROM_SELECTOPTIONS Get the current contents of selection screen
RS_SEND_MAIL_FOR_SPOOLLIST Send message from ABAP/4 program to SAPoffice.
RS_VARIANT_CONTENTS Returns the contents of the specified variant in a table.
RSPO_DOWNLOAD_SPOOLJOB Download the spool from a program to a file. Requires spool number.
RSPO_RETURN_ABAP_SPOOLJOB Fetch printer spool according to the spool number informed.
RZL_READ_DIR If the server name is left blank, it reads a directory from local presentation server, otherwise it reads the directory of the remote server
RZL_READ_DIR_LOCAL Read a directory on the Application Server
RZL_READ_FILE Read a file from the presentation server if no server name is given, or read a file from the remote server. Very useful to avoid the authority checks that occur with OPEN DATASET. This function uses a SAP C program to read the data.
RZL_SLEEP Pause the current application for 1 to 5 seconds.
RZL_SUBMIT Submit a remote report.
RZL_WRITE_FILE_LOCAL Saves table to the presentation server (not PC). Does not use OPEN DATASET, so it does not suffer from authority checks.
SAP_CONVERT_TO_XLS_FORMAT Convert data to Microsoft Excel format.
SAPGUI_PROGRESS_INDICATOR Display a progress bar on the SAP GUI, and give the user some idea of what is happening
SAVE_TEXT To load long text into SAP
SCROLLING_IN_TABLE If you are coding a module pool and using a table control, you can use this function SCROLLING_IN_TABLE to handle any scrolling.
SD_DATETIME_DIFFERENCE Give the difference in Days and Time for 2 dates
SO_NEW_DOCUMENT_ATT_SEND_API1 Send a document as part of an email. The documentation is better than normal for this function, so please read it.
SO_SPLIT_FILE_AND_PATH Split a fully pathed filename into a filename and a path.
SO_SPOOL_READ Fetch printer spool according to the spool number informed. See also RSPO_RETURN_ABAP_SPOOLJOB
SO_WIND_SPOOL_LIST Browse printer spool numbers according to user informed.
SPELL_AMOUNT Convert a number to the corresponding words
SWD_HELP_F4_ORG_OBJECTS HR Matchcode tailored for organizational units. Includes a button so that you can browse the hierarchy too.
SX_OBJECT_CONVERT_OTF_PDF Conversion From OTF to PDF (SAPScript conversion)
SX_OBJECT_CONVERT_OTF_PRT Conversion From OTF to Printer Format (SAPScript conversion)
SX_OBJECT_CONVERT_OTF_RAW Conversion From OTF to ASCII (SAPScript conversion)
SXPG_COMMAND_CHECK Check whether the user is authorized to execute the specified command on the target host system with the specified arguments.
SXPG_COMMAND_LIST_GET Select a list of external OS command definitions.
TERM_CONTROL_EDIT Edit a table of text with a very nice text editor. Just call TERM_CONTROL_EDIT and supply the function with a table of text. Table entries are modified in the editor after clicking "ok".
TERMINAL_ID_GET Return the terminal id
TH_DELETE_USER Logoff a user. Similar results to using SM04.
TH_ENVIRONMENT Get the UNIX environment
TH_POPUP Display a popup system message on a specific users screen.
TH_REMOTE_TRANSACTION Run a transaction on a remote server. Optionally provide BDC data to be used in the transaction
TH_USER_INFO Give information about the current user (sessions, workstation logged in from, etc)
TH_USER_LIST Show which users are logged into an app server
TMP_GUI_DIRECTORY_LIST_FILES Retrieve all of the files and subdirectories on the Presentation Server (PC) for a given directory.
UNIT_CONVERSION_SIMPLE convert weights from one UOM to another.
UPLOAD upload a file to the presentation server (PC)
UPLOAD_FILES Will load one or more files from app or presentation server
WEEK_GET_FIRST_DAY For a given week (YYYYWW format), this function returns the date of the Monday of that week.
WRITE_LIST Useful for writing out the list contents that result from the function LIST_FROM_MEMORY.
WS_DOWNLOAD Save Internal Table as File on the Presentation Server
WS_EXCEL Start EXCEL on the PC
WS_EXECUTE execute a program on a windows PC
WS_FILE_DELETE Delete File at the Frontend
WS_FILENAME_GET Call File Selector
WS_MSG Create a dialog box in which you display a one-line message
WS_UPLOAD Load Files from the Presentation Server to Internal ABAP Tables
WS_VOLUME_GET Get the label from a frontend device.
BP_JOBLOG_READ Fetch job log executions filling the structure TBTC5
RFC_SYSTEM_INFO Fetch information from the current instance filling the structure FRCSI
SD_PRINT_TERMS_OF_PAYMENT Format terms of payment according to base line date and payment terms
SO_USER_LIST_READ List of all users filling the structure SOUD3
TH_SAPREL Gather information from the current system including upgrade activities. It completes fields from the structure KKS03
TH_SERVER_LIST Gather information of all instances filling the structure MSXXLIST
TH_WPINFO List of work processes filling the structure WPINFO
WWW_LIST_TO_HTML After running a report, call this function to convert the list output to HTML.
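
To show how the function modules above are typically called, here is a minimal sketch using RP_CALC_DATE_IN_INTERVAL from the list (add/subtract days, months or years from a date). The report name is hypothetical and the parameter names are the commonly documented ones; check the exact interface in SE37 on your release.

REPORT zdate_shift_demo.

DATA lv_new_date TYPE sy-datum.

* Add 30 days to today's date; SIGNUM '+' adds, '-' subtracts
CALL FUNCTION 'RP_CALC_DATE_IN_INTERVAL'
  EXPORTING
    date      = sy-datum
    days      = 30
    months    = 0
    years     = 0
    signum    = '+'
  IMPORTING
    calc_date = lv_new_date.

WRITE: / 'Today:', sy-datum, 'Plus 30 days:', lv_new_date.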
view comments...
 
18.  SAP MM & SD Tables
SAP MM TABLES
 
EINA Purchasing Info Record- General Data
EINE Purchasing Info Record- Purchasing Organization Data
MAKT Material Descriptions
MARA General Material Data
MARC Plant Data for Material
MARD Storage Location Data for Material
MAST Material to BOM Link
MBEW Material Valuation
MKPF Header- Material Document
MSEG Document Segment- Material
MVER Material Consumption
MVKE Sales Data for materials
RKPF Document Header- Reservation
T023 Mat. groups
T024 Purchasing Groups
T156 Movement Type
T157H Help Texts for Movement Types
MOFF Lists what views have not been created
A501 Plant/Material
EBAN Purchase Requisition
EBKN Purchase Requisition Account Assignment
EKAB Release Documentation
EKBE History per Purchasing Document
EKET Scheduling Agreement Schedule Lines
EKKN Account Assignment in Purchasing Document
EKKO Purchasing Document Header
EKPO Purchasing Document Item
IKPF Header- Physical Inventory Document
ISEG Physical Inventory Document Items
LFA1 Vendor Master (General section)
LFB1 Vendor Master (Company Code)
NRIV Number range intervals
RESB Reservation/dependent requirements
T161T Texts for Purchasing Document Types
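
As an illustration of how the purchasing tables above hang together, here is a minimal ABAP sketch (hypothetical report name; verify the field names in SE11) that lists the items of a purchasing document by joining the header table EKKO and the item table EKPO on the document number EBELN:

REPORT zpo_items_demo.

PARAMETERS p_ebeln TYPE ekko-ebeln OBLIGATORY.

TYPES: BEGIN OF ty_item,
         ebeln TYPE ekpo-ebeln,  " purchasing document
         ebelp TYPE ekpo-ebelp,  " item number
         matnr TYPE ekpo-matnr,  " material
         menge TYPE ekpo-menge,  " order quantity
       END OF ty_item.

DATA lt_items TYPE STANDARD TABLE OF ty_item.
FIELD-SYMBOLS <ls_item> TYPE ty_item.

* Header (EKKO) and items (EKPO) share the document number EBELN
SELECT p~ebeln p~ebelp p~matnr p~menge
  FROM ekko AS k INNER JOIN ekpo AS p ON k~ebeln = p~ebeln
  INTO TABLE lt_items
  WHERE k~ebeln = p_ebeln.

LOOP AT lt_items ASSIGNING <ls_item>.
  WRITE: / <ls_item>-ebeln, <ls_item>-ebelp,
           <ls_item>-matnr, <ls_item>-menge.
ENDLOOP.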
 
 
 
SAP SD TABLES
 
 
KONV Conditions for Transaction Data
KONP Conditions for Items
LIKP Delivery Header Data
LIPS Delivery: Item data
VBAK Sales Document: Header Data
VBAP Sales Document: Item Data
VBBE Sales Requirements: Individual Records
VBEH Schedule line history
VBEP Sales Document: Schedule Line Data
VBFA Sales Document Flow
VBLB Sales document: Release order data
VBLK SD Document: Delivery Note Header
VBPA Sales Document: Partner
VBRK Billing: Header Data
VBRP Billing: Item Data
VBUK Sales Document: Header Status and Administrative Data
VBUP Sales Document: Item Status
VEKP Handling Unit - Header Table
VEPO Packing: Handling Unit Item (Contents)
VEPVG Delivery Due Index
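
Similarly, for the SD tables, a minimal sketch (hypothetical report name; verify the fields in SE11) that reads the document flow table VBFA listed above: given a sales order number, it lists all subsequent documents (deliveries, billing documents, etc.):

REPORT zdoc_flow_demo.

PARAMETERS p_vbeln TYPE vbak-vbeln OBLIGATORY.

TYPES: BEGIN OF ty_flow,
         vbeln   TYPE vbfa-vbeln,    " subsequent document
         vbtyp_n TYPE vbfa-vbtyp_n,  " category of subsequent document
       END OF ty_flow.

DATA lt_flow TYPE STANDARD TABLE OF ty_flow.
FIELD-SYMBOLS <ls_flow> TYPE ty_flow.

* VBFA links a preceding document (VBELV) to its subsequent
* documents (VBELN), e.g. order -> delivery -> billing document
SELECT vbeln vbtyp_n FROM vbfa
  INTO TABLE lt_flow
  WHERE vbelv = p_vbeln.

LOOP AT lt_flow ASSIGNING <ls_flow>.
  WRITE: / <ls_flow>-vbeln, <ls_flow>-vbtyp_n.
ENDLOOP.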
view comments...
 
19.  SAP MM & SD TCODES
MM MODULES
 
 
Tcodes    Description
ME01 Maintain Source List
ME03 Display Source List
ME04 Changes to Source List
ME05 Generate Source List
ME06 Analyze Source List
ME07 Reorganize Source List
ME08 Send Source List
ME0M Source List per Material
ME11 Create Purchasing Info Record
ME12 Change Purchasing Info Record
ME13 Display Purchasing Info Record
ME14 Changes to Purchasing Info Record
ME15 Flag Purch. Info Rec. for Deletion
ME16 Purchasing Info Recs. for Deletion
ME17 Archive Info Records
ME18 Send Purchasing Info Record
ME1A Archived Purchasing Info Records
ME1B Redetermine Info Record Price
ME1E Quotation Price History
ME1L Info Records Per Vendor
ME1M Info Records per Material
ME1P Purchase Order Price History
ME1W Info Records Per Material Group
ME21 Create Purchase Order
ME21N Create Purchase Order
ME22 Change Purchase Order
ME22N Change Purchase Order
ME23 Display Purchase Order
ME23N Display Purchase Order
ME24 Maintain Purchase Order Supplement
ME25 Create PO with Source Determination
ME26 Display PO Supplement (IR)
ME27 Create Stock Transport Order
ME28 Release Purchase Order
ME29N Release purchase order
ME2A Monitor Confirmations
ME2B POs by Requirement Tracking Number
ME2C Purchase Orders by Material Group
ME2J Purchase Orders for Project
ME2K Purch. Orders by Account Assignment
ME2L Purchase Orders by Vendor
ME2M Purchase Orders by Material
ME2N Purchase Orders by PO Number
ME2O SC Stock Monitoring (Vendor)
ME2S Services per Purchase Order
ME2V Goods Receipt Forecast
ME2W Purchase Orders for Supplying Plant
ME308 Send Contracts with Conditions
ME31 Create Outline Agreement
ME31K Create Contract
ME31L Create Scheduling Agreement
ME32 Change Outline Agreement
ME32K Change Contract
ME32L Change Scheduling Agreement
ME33 Display Outline Agreement
ME33K Display Contract
ME33L Display Scheduling Agreement
ME34 Maintain Outl. Agreement Supplement
ME34K Maintain Contract Supplement
ME34L Maintain Sched. Agreement Supplement
ME35 Release Outline Agreement
ME35K Release Contract
ME35L Release Scheduling Agreement
ME36 Display Agreement Supplement (IR)
ME37 Create Transport Scheduling Agmt.
ME38 Maintain Sched. Agreement Schedule
ME39 Display Sched. Agmt. Schedule (TEST)
ME3A Transm. Release Documentation Record
ME3B Outl. Agreements per Requirement No.
ME3C Outline Agreements by Material Group
ME3J Outline Agreements per Project
ME3K Outl. Agreements by Acct. Assignment
ME3L Outline Agreements per Vendor
ME3M Outline Agreements by Material
ME3N Outline Agreements by Agreement No.
ME3P Recalculate Contract Price
ME3R Recalculate Sched. Agreement Price
ME3S Service List for Contract
ME41 Create Request For Quotation
ME42 Change Request For Quotation
ME43 Display Request For Quotation
ME44 Maintain RFQ Supplement
ME45 Release RFQ
ME47 Create Quotation
ME48 Display Quotation
ME49 Price Comparison List
ME4B RFQs by Requirement Tracking Number
ME4C RFQs by Material Group
ME4L RFQs by Vendor
ME4M RFQs by Material
ME4N RFQs by RFQ Number
ME4S RFQs by Collective Number
ME51 Create Purchase Requisition
ME51N Create Purchase Requisition
ME52 Change Purchase Requisition
ME52N Change Purchase Requisition
ME52NB Buyer Approval: Purchase Requisition
ME53 Display Purchase Requisition
ME53N Display Purchase Requisition
ME54 Release Purchase Requisition
ME54N Release Purchase Requisition
ME55 Collective Release of Purchase Reqs.
ME56 Assign Source to Purch. Requisition
ME57 Assign and Process Requisitions
ME58 Ordering: Assigned Requisitions
ME59 Automatic Generation of POs
ME59N Automatic generation of POs
ME5A Purchase Requisitions: List Display
ME5F Release Reminder: Purch. Requisition
ME5J Purchase Requisitions for Project
ME5K Requisitions by Account Assignment
ME5R Archived Purchase Requisitions
ME5W Resubmission of Purch. Requisitions
ME61 Maintain Vendor Evaluation
ME62 Display Vendor Evaluation
ME63 Evaluation of Automatic Subcriteria
ME64 Evaluation Comparison
ME65 Evaluation Lists
ME6A Changes to Vendor Evaluation
ME6B Display Vendor Evaln. for Material
ME6C Vendors Without Evaluation
ME6D Vendors Not Evaluated Since...
ME6E Evaluation Records Without Weighting
ME6F Print
ME6G Vendor Evaluation in the Background
ME6H Standard Analysis: Vendor Evaluation
ME6Z Transport Vendor Evaluation Tables
ME80 Purchasing Reporting
ME80A Purchasing Reporting: RFQs
ME80AN General Analyses (A)
ME80F Purchasing Reporting: POs
ME80FN General Analyses (F)
ME80R Purchasing Reporting: Outline Agmts.
ME80RN General Analyses (L,K)
ME81 Analysis of Order Values
ME81N Analysis of Order Values
ME82 Archived Purchasing Documents
ME84 Generation of Sched. Agmt. Releases
ME84A Individual Display of SA Release
ME85 Renumber Schedule Lines
ME86 Aggregate Schedule Lines
ME87 Aggregate PO History
ME88 Set Agr. Cum. Qty./Reconcil. Date
ME91 Purchasing Docs.: Urging/Reminding
ME91A Urge Submission of Quotations
ME91E Sch. Agmt. Schedules: Urging/Remind.
ME91F Purchase Orders: Urging/Reminders
ME92 Monitor Order Acknowledgment
ME92F Monitor Order Acknowledgment
ME92K Monitor Order Acknowledgment
ME92L Monitor Order Acknowledgment
ME97 Archive Purchase Requisitions
ME98 Archive Purchasing Documents
ME99 Messages from Purchase Orders
ME9A Message Output: RFQs
ME9E Message Output: Sch. Agmt. Schedules
ME9F Message Output: Purchase Orders
ME9K Message Output: Contracts
ME9L Message Output: Sched. Agreements
MEAN Delivery Addresses
MEB0 Reversal of Settlement Runs
MEB1 Create Reb. Arrangs. (Subseq. Sett.)
MEB2 Change Reb. Arrangs. (Subseq. Sett.)
MEB3 Displ. Reb. Arrangs. (Subseq. Sett.)
MEB4 Settlement re Vendor Rebate Arrs.
MEB5 List of Vendor Rebate Arrangements
MEB6 Busn. Vol. Data, Vendor Rebate Arrs.
MEB7 Extend Vendor Rebate Arrangements
MEB8 Det. Statement, Vendor Rebate Arrs.
MEB9 Stat. Statement, Vendor Rebate Arrs.
MEBA Comp. Suppl. BV, Vendor Rebate Arr.
MEBB Check Open Docs., Vendor Reb. Arrs.
MEBC Check Customizing: Subsequent Sett.
MEBE Workflow Sett. re Vendor Reb. Arrs.
MEBF Updating of External Busn. Volumes
MEBG Chg. Curr. (Euro), Vend. Reb. Arrs.
MEBH Generate Work Items (Man. Extension)
MEBI Message, Subs.Settlem. - Settlem.Run
MEBJ Recompile Income, Vendor Reb. Arrs.
MEBK Message., Subs. Settlem.- Arrangment
MEBM List of settlement runs for arrngmts
MEBR Archive Rebate Arrangements
MEBS Stmnt. Sett. Docs., Vend. Reb. Arrs.
MEBT Test Data: External Business Volumes
MEBV Extend Rebate Arrangements (Dialog)
MECCP_ME2K For Requisition Account Assignment
MEDL Price Change: Contract
MEI1 Automatic Purchasing Document Change
MEI2 Automatic Document Change
MEI3 Recompilation of Document Index
MEI4 Compile Worklist for Document Index
MEI5 Delete Worklist for Document Index
MEI6 Delete purchasing document index
MEI7 Change sales prices in purch. orders
MEI8 Recomp. doc. index settlement req.
MEI9 Recomp. doc. index vendor bill. doc.
MEIA New Structure Doc.Ind. Cust. Sett.
MEIS Data Selection: Arrivals
MEK1 Create Conditions (Purchasing)
MEK2 Change Conditions (Purchasing)
MEK3 Display Conditions (Purchasing)
MEK31 Condition Maintenance: Change
MEK32 Condition Maintenance: Change
MEK33 Condition Maintenance: Change
MEK4 Create Conditions (Purchasing)
MEKA Conditions: General Overview
MEKB Conditions by Contract
MEKC Conditions by Info Record
MEKD Conditions for Material Group
MEKE Conditions for Vendor
MEKF Conditions for Material Type
MEKG Conditions for Condition Group
MEKH Market Price
MEKI Conditions for Incoterms
MEKJ Conditions for Invoicing Party
MEKK Conditions for Vendor Sub-Range
MEKL Price Change: Scheduling Agreements
MEKLE Currency Change: Sched. Agreements
MEKP Price Change: Info Records
MEKPE Currency Change: Info Records
MEKR Price Change: Contracts
MEKRE Currency Change: Contracts
MEKX Transport Condition Types Purchasing
MEKY Trnsp. Calc. Schema: Mkt. Pr. (Pur.)
MEKZ Trnsp. Calculation Schemas (Purch.)
MELB Purch. Transactions by Tracking No.
MEMASSIN Mass-Changing of Purch. Info Records
MEMASSPO Mass Change of Purchase Orders
MEMASSRQ Mass-Changing of Purch. Requisitions
MENU_MIGRATION Menu Migration into New Hierarchy
MEPA Order Price Simulation/Price Info
MEPB Price Info/Vendor Negotiations
MEPO Purchase Order
MEQ1 Maintain Quota Arrangement
MEQ3 Display Quota Arrangement
MEQ4 Changes to Quota Arrangement
MEQ6 Analyze Quota Arrangement
MEQ7 Reorganize Quota Arrangement
MEQ8 Monitor Quota Arrangements
MEQB Revise Quota Arrangement
MEQM Quota Arrangement for Material
MER4 Settlement re Customer Rebate Arrs.
MER5 List of Customer Rebate Arrangements
MER6 Busn. Vols., Cust. Reb. Arrangements
MER7 Extension of Cust. Reb. Arrangements
MER8 Det. Statement: Cust. Rebate Arrs.
MER9 Statement: Customer Reb. Arr. Stats.
MERA Comp. Suppl. BV, Cust. Rebate Arrs.
MERB Check re Open Docs. Cust. Reb. Arr.
MERE Workflow: Sett. Cust. Rebate Arrs.
MEREP_EX_REPLIC SAP Mobile: Execute Replicator
MEREP_GROUP SAP Mobile: Mobile Group
MEREP_LOG SAP Mobile: Activity Log
MEREP_MIG SAP Mobile: Migration
MEREP_MON SAP Mobile: Mobile Monitor
MEREP_PD SAP Mobile: Profile Dialog
MEREP_PURGE SAP Mobile: Purge Tool
MEREP_SBUILDER SAP Mobile: SyncBO Builder
MEREP_SCENGEN SAP Mobile: SyncBO Generator
MERF Updating of External Busn. Volumes
MERG Change Curr. (Euro) Cust. Reb. Arrs.
MERH Generate Work Items (Man. Extension)
MERJ Recomp. of Income, Cust. Reb. Arrs.
MERS Stmnt. Sett. Docs. Cust. Reb. Arrs.
MEU0 Assign User to User Group
MEU2 Perform Busn. Volume Comp.: Rebate
MEU3 Display Busn. Volume Comp.: Rebate
MEU4 Display Busn. Volume Comp.: Rebate
MEU5 Display Busn. Volume Comp.: Rebate
MEW0 Procurement Transaction
MEW1 Create Requirement Request
MEW10 Service Entry in Web
MEW2 Status Display: Requirement Requests
MEW3 Collective Release of Purchase Reqs.
MEW5 Collective Release of Purchase Order
MEW6 Assign Purchase Orders WEB
MEW7 Release of Service Entry Sheets
MEW8 Release of Service Entry Sheet
MEW9 mew9
MEWP Web based PO
MEWS Service Entry (Component)
ME_SWP_ALERT Display MRP Alerts (Web)
ME_SWP_CO Display Purchasing Pricing (Web)
ME_SWP_IV Display Settlement Status (Web)
ME_SWP_PDI Display Purchase Document Info (Web)
ME_SWP_PH Display Purchasing History (Web)
ME_SWP_SRI Display Schedule Releases (Web)
ME_WIZARD ME: Registration and Generation
J1IS Outgoing Excise Invoice (Others)
J1IH Excise JV
J1IEX Capture/Display/Post Incoming Excise Invoice
 
 
 
SD MODULES
 
 
VS00 Master data
VC00 Sales Support
VA00 Sales
VL00 Shipping
VT00 Transportation
VF00 Billing
At configuration
VOV8 Define sales document types (header)
OVAZ Assign sales areas to sales document types
OVAU Order reasons
VOV4 Assign item categories (item category determination)
VOV6 Schedule line categories
OVAL Assign blocks to relevant sales document types
OVLK Define delivery types
V/06 Pricing
V/08 Maintain pricing procedure
OVKP Pricing proc determination
V/07 Access sequence
Enduser
VD01 / XD01 Customer Master Creation
VD02 Change Customer
VD03 Display Customer
VD04 Customer Account Changes
VD06 Flag for Deletion Customer
XD01 Create Customer
XD02 Modify Customer
XD03 Display Customer
MM00 Create Other material
VB11 To create material determination condition record
CO09 Material availability Overview
VL01 Create outbound delivery with reference to sales order
VL04 Collective processing of delivery
VA11 Create Inquiry
VA12 Change Inquiry
VA13 Display Inquiry
Sales & Distribution
   
Sales order / Quote / Sch. Agr. / Contract
VA01 Create Order
VA02 Change Order
VA03 Display Order
VA05 List of sales orders
VA32 Scheduling agreement change
VA42 Contract change
VA21 Create Quotation
VA22 Change Quotation
VA23 Display Quotation
Billing
VF02 Change billing document
VF11 Cancel Billing document
VF04 Billing due list
FBL5N Display Customer invoices by line
FBL1N Display Vendor invoices by line
Delivery
VL02N Change delivery document
VL04 Delivery due list
VKM5 List of deliveries
VL06G List of outbound deliveries for goods issue
VL06P List of outbound deliveries for picking
VL09 Cancel goods issue
VT02N Change shipment
VT70 Output for shipments
General
VKM3 / VKM4 List of sales documents
VKM1 List of blocked SD documents
VD52 Material Determination
 
 
 
 
view comments...
 
20.  Physical Inventory

What is physical inventory?

 

Physical inventory is a process in which all transactions related to the movement of goods are stopped and the company physically counts its inventory. It may be required by financial accounting rules, or for placing an accurate value on the inventory for tax purposes.

 

view comments...
 
21.  Payment Terms - Customer pays within 21 days he gets 10% discount, and within 30 days 5%

Payment Terms

 
Question:
We raise an invoice to a customer. Now, if the customer pays within 21 days he gets a 10% discount, and if he pays within 30 days he gets a 5% discount. How do we configure this scenario?

Answer
In the above scenario, we can choose payment terms as one of the fields in the condition table. Payment terms can be defined as follows:

NT21 Within 21 days due net (for NT21 the customer would get a 10% discount)
NT30 Within 30 days due net (for NT30 the customer would get a 5% discount)

Upon selecting the relevant payment terms, the system would determine the percentage discount in the document.
------------------------------------------------------------------------------------------------

Now, if the scenario is 'until the customer actually pays, the payment date is not known, hence we don't know which payment term to use and which discount to apply', e.g.:

payment terms => 21 days 10% cash discount; 30 days 5% cash discount
 
Then we have to use the condition type SKTO. It is a special condition type used strictly for this scenario, i.e. determining which discount should be applicable based on the payment term. This condition type is not passed to accounting, and generally not to COPA either (as you can see, no account keys are maintained for this condition type in the pricing procedure).
The condition category E (cash discount, set in V/06) tells the system to read the payment terms and calculate the potential/actual value, i.e. 10% within 21 days and 5% within 30 days.

Whichever payment term applies at the time of payment, the invoice value does not change; SKTO only carries the potential value, and the discount is calculated in A/R instead.
The discount is applied on posting, and an error will be raised if the payment amount does not equal the net minus the calculated cash discount. In the above case, if the payment is made within 30 days (but after 21 days), the system will throw an error if the customer takes 10% instead of 5%. For example, on a net invoice of EUR 1,000 this means a payment of EUR 900 within 21 days, or EUR 950 within 30 days.

 
 
Hope I was able to explain; post your doubts via comments.
view comments...
 
22.  Delta Management for SAP Datasource

Delta Process in SAP system

 

MAKING THE DATASOURCE DELTA CAPABLE

Activity in the ERP system:

Make the datasource delta capable

 Since we create the datasource in the SAP source system, we have to make the changes in the source system and replicate the changes to the BW system.

Making datasource delta capable

This activity is carried out in the SAP ERP system


REPLICATION OF THE DATASOURCE:

Activity in BW system:

 

After the datasource is made delta capable, we replicate the changes to the BW system.

 

Once the datasource is replicated, we initialise the delta process to create the queue.

 

The queue is created in the ERP system; however, it is initiated from the BW system using an InfoPackage with the option “Initialize delta process”.


TO CHECK THE DELTA QUEUE IN THE R/3 SYSTEM:

Activity in ERP system:

 

To check the queue in the ERP system from the customising screen:

Use menu (SBIW) → Customizing extractors

General settings → Check delta queue

T-Code – RSA7


 

Once the queue is created, its name is the same as the datasource name.

 

CREATING DELTA INFO PACKAGE:

This activity is in the BW system:

Now the delta update option will be visible in the InfoPackage; this is the option we select to pull the delta data from the delta queue.


 

view comments...
 
23.  BI Performance Tuning with Infocubes and Query

Ø   When the query is running slow, how should we improve it? → Query Performance

Ø   When we are extracting data from the source system to BI and loading is slow → Loading Performance

Ø   Query Performance - Query Execution Process: Whenever we execute a query it triggers the OLAP processor, which first checks the availability of data in the OLAP cache. If the cache is not available, it identifies the InfoProvider on which the BEx report should be executed, triggers the InfoProvider, selects the records, aggregates them based on characteristic values in the OLAP processor and transfers them to the front end (BEx), where the records are formatted.

Ø   Frontend Time: The time spent in BEx to execute the query is called frontend time

Ø   OLAP Time: The time spent in the OLAP processor is called OLAP time

Ø   DB Time: The time spent in the database to retrieve the data for the processor is called DB time

Ø   Total time taken to execute a query = frontend time + OLAP time + DB time

Ø   Aggregation ratio: number of records selected from the database into the OLAP processor / number of records transferred to BEx (see the sketch at the end of this section)

Ø   How to collect the statistics: RSA1 → Tools → Settings for BI Statistics (T-code RSDDSTAT) → (the RSDDSTAT_DM & RSDDSTAT_OLAP tables collect the statistics) → If the tables already contain data, delete the contents of the tables → You can find a button Delete Statistical Data → It will ask for the period → Delete → Observe the job in SM37 → Now select the InfoProvider & query for which you want to maintain the statistics → Make the necessary settings

Ø   Save → Now, whenever anyone executes the query, the statistics will be maintained in the statistical tables

Ø   How to analyse the collected statistics: 1) By looking at the contents of the tables RSDDSTAT_DM and RSDDSTAT_OLAP

Ø   Another Way: By using the Transaction code ST03N

Ø   Another Way: By Implementing BI statistics

Ø   Go to the statistical tables → Contents → Settings → List Format → Choose Fields → Deselect all → Select INFOCUBE & QUERY ID (name of the query) & QDBSEL (number of records selected from the database) & QDBTRANS (number of records transferred to BEx) & QTIMEOLAP (time spent in OLAP) → QTIMEDB (DB time) → QTIMECLIENT (frontend time) → Transfer → Observe the statistics

Ø   Another way: ST03 → Expert Mode → Double-click on BI Workload → Select the dropdown for aggregation → Select Query → Filter your query → Go to the All Data tab → Observe the statistical information

Ø   Another option is implementing the BI statistics content (RSTCC_INST_BIAC) → Instead of looking at the data in the tables, SAP has provided ready-made queries, InfoCubes, transformations and MultiProviders; install them & load the data into these cubes → There are ready-made BEx queries which give the analysis of the reports

Ø   0TCT_C01 (Front-End and OLAP Statistics (Aggregated))

Ø   0TCT_C02(BI Front-End and OLAP Statistics (Details))

Ø   0TCT_C03(Data Manager Statistics (Details))

Ø   0BWTC_C04(BW Statistics - Aggregates)

Ø   0BWTC_C05(BW Statistics - WHM)

Ø   0BWTC_C09(Condensing Info Cubes),

Ø   0BWTC_C11(Data deletion from info cube),

Ø   0TCT_MC02 (MULTIP PROVIDER - Front-End and OLAP Statistics (Details))

Ø   0TCT_MC01 (Multi Provider - Front-End and OLAP Statistics (Aggregated))

Ø   0BWTC_C10 (Multi Provider - BW Statistics)

Ø   Most of the system maintenance reports come from this content, e.g. how many users used certain reports, and administration reports

Ø   STEPS: Install the Business Content datasources → RSA5 → Expand the application component Business Information Warehouse → Expand application component TCT → Install the datasources (6 in total) → Replicate the datasources using the Myself connection → RSA13 → Select the Myself connection → Datasource overview → Expand BW datasources → Expand Business Information Warehouse → Technical Content → Context menu → Replicate

Ø   Install all the other content, like InfoCubes, reports, MultiProviders, InfoPackages, transformations and DTPs

Ø   RSOR → Expand the MultiProviders → Double-click on Select Objects → Find → 0BWTC_C10 → Select 'In Data Flow Before and Afterwards' → Install in the background → Once the installation is done

Ø   Load the data into all the cubes by scheduling the InfoPackages & DTPs

Ø   2 reports are mainly used for report analysis → OLAP utilisation per query (0BWTC_C10_Q012) & OLAP utilisation per InfoCube (0BWTC_C10_Q013)

Ø   Open the query Q012 in the Analyzer → Execute → Specify the cube name & query name → Execute → Observe the statistics

Ø   Different aspects we can address to improve query performance:

Ø   If DB time is high: 1. Modelling aspects 2. Query design 3. Compression 4. Aggregates 5. Partitioning 6. Read mode of the query 7. Pre-calculated web templates 8. Line-item dimensions 9. Indexes
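
As referenced at the aggregation ratio bullet above, here is a minimal ABAP sketch of reading the statistics table yourself. The report name is hypothetical and the field names (INFOCUBE, QDBSEL, QDBTRANS) are taken from the list above; verify them in SE11 on your release. It prints records selected versus records transferred per InfoProvider, i.e. the aggregation ratio:

REPORT zagg_ratio_demo.

TYPES: BEGIN OF ty_stat,
         infocube TYPE rsddstat_dm-infocube,
         qdbsel   TYPE rsddstat_dm-qdbsel,
         qdbtrans TYPE rsddstat_dm-qdbtrans,
       END OF ty_stat.

DATA: lt_stat  TYPE STANDARD TABLE OF ty_stat,
      lv_ratio TYPE p DECIMALS 2.
FIELD-SYMBOLS <ls_stat> TYPE ty_stat.

* Sum records selected (QDBSEL) and transferred (QDBTRANS) per InfoProvider
SELECT infocube SUM( qdbsel ) SUM( qdbtrans )
  FROM rsddstat_dm
  INTO TABLE lt_stat
  GROUP BY infocube.

LOOP AT lt_stat ASSIGNING <ls_stat>.
  IF <ls_stat>-qdbtrans > 0.
    " Aggregation ratio = records selected / records transferred
    lv_ratio = <ls_stat>-qdbsel / <ls_stat>-qdbtrans.
    WRITE: / <ls_stat>-infocube, lv_ratio.
  ENDIF.
ENDLOOP.

A high ratio (many records read, few transferred) suggests that aggregates or compression on the InfoCube would pay off.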
view comments...
 
24.  Step by step procedure to transport SAP BI/BW Objects

Contents

Introduction
Including a BW object into a transport request
Release a transport request
Import queue
Importing Requests
a) Starting the Import for Individual Requests (Single Transport Strategy)    
b) Starting the Import of All Requests in an Import Queue
c) Starting the Import of All Requests in One or More Projects
Import History and Return Codes
Re-import a transport request
Delete a transport request from the Import Queue
Updating the Import Overview and the Import Queues

Introduction
A transport request is a package that is used to collect developed objects and move them from one SAP system to another. Implementing newly created objects directly in the production environment is discouraged, to prevent risks like loss of data, unintended data-flow changes etc.; hence transport requests are used. The required objects are included in a transport request and transported from the development system through one or more testing systems (such as Quality Assurance, Regression, Pre-Production), tested, and finally moved to Production. So the required object(s) are first included in the transport request and released from the source system, and then imported in the target system.

Including a BW object into a transport request                                    
There are many ways to include a BW object in a request, one of which is shown here. Call T-Code RSA1 in the source system (here DEV) > Transport Connection functional area > Search for the object that needs to be transported and drag/drop it to the right side of the screen, as shown in Figure 1. Consistent requests that take object dependencies into consideration are especially important in BI, because the metadata objects are activated in the import post-processing step. If dependent objects are missing in the transport request, this results in errors during activation. These dependencies are mapped to the grouping modes when the objects are collected.


                                                                             Figure 1

Grouping Mode

Only Necessary Objects: 
Only those objects that are really required for the action (copying or transporting the selected objects) are taken into account (minimal selection).

In Data Flow Before: 
The objects that pass data to a collected object are collected. For an InfoCube, for example, all the objects that are in the data flow before the InfoCube, and are therefore necessary for providing data to the InfoCube, are collected. This includes transformation rules and InfoSources, for example.

In Data Flow Afterwards: 
The objects that get their data from a collected object are collected. For an InfoCube, for example, all the objects that are in the data flow after the InfoCube, and are therefore reporting objects that display the data stored in the InfoCube, are collected. This includes queries and Web templates, for example. 

In Data Flow Before and Afterwards: 
All objects that provide or pass on data are collected. 
For example, if you are using an InfoCube, the objects required to activate the InfoCube are collected together with other objects that are required to activate those objects as well. This includes objects positioned both before and after the InfoCube in the data flow.

Save for System Copy:
This setting is used when a source system needs to be copied and renamed. Hence having to re-create objects for both the SAP source systems and the BI systems can be avoided.

Collection Mode

Collect Automatically (default setting): The data is collected as soon as the objects are selected.
Start Manual Collection: The data is not collected until you choose Collect Dependent Objects.

Once the grouping and collection modes are selected, click on the symbol shown in Figure 2 to create a transport request, select the package name (specific to a project; get it from the Basis team) and save. The request ID is generated as SIDKXXXXXX (SID = System ID). By default, all objects are included in the package $TMP.

                                                                         Figure 2

Release a transport request

Use: Initially the request is in the Modifiable state; it must be released from the development system to move it into further systems.

Procedure
Call T-Code SE01/SE09 in the source system (here DEV) > Enter the transport request, release the task (here DEVK123452) and then the main transport (here DEVK123456) as shown in Figure 3.

                     Figure 3

To release a request, click on Release Directly. As soon as the transport request is released, it should be available in the import queue of the target system (here the testing system). Make sure that the connection exists between these two systems.

Import queue

Use: The import queue displays all transport requests flagged for import for a particular SAP System. 

Procedure
To check the import queue, call T-Code STMS in the target system (here TST). It will take you to the TMS screen shown in Figure 4. Now click on the Import Overview symbol; it will take you to the Import Overview (Figure 5), where all the defined systems can be seen. Double-click on the target system (here the testing system) to check the import queue.


Import queue is shown in Figure 6, where

 

Number- Serial number
Request- Transport request
Clt- Client ID
RC-Return Code (Explained in further section) 
Owner- Developer name
Project- Project ID
Short Text- Description of the transport request

For performance reasons, the data required in the queue is read from the transport directory the first time the TMS is called. After that, information buffered in the database is always shown. To refresh the buffered information, choose Edit → Refresh (F5). Sometimes, even after refreshing the queue, a warning icon appears next to the transport request, as shown in Figure 7. Click on Adjust Import Queue and choose 'Yes' (Figure 8). Here TMS transfers the data files and co-files belonging to this project and confirms the transfer in the import queue. Now the transport request is ready to be imported into the target system.


Importing requests

Before you import the requests from an import queue into an SAP system, ensure that no users are importing other objects in this SAP system, because only one transport request can be imported at a particular instant of time. If multiple transports are started simultaneously, they are imported one after the other, i.e. sequentially rather than in parallel. There are three ways to import a request.

a) Starting the Import for Individual Requests (Single Transport Strategy)
The TMS allows importing individual requests from the import queue. The requests you choose are imported in the order in which they are placed in the import queue. Select the transport request and click on Transport Request as shown in Figure 5. The screen displayed (Figure 9) helps you choose the options for importing the transport request, which are explained below.

Starting an Import: Date Tab

All the options for starting an import in TMS are listed here.

                                               Figure 9

The options you have depend on which import type you have chosen (project or individual import, import all requests, transport workflow).

  1. Immediate: If you want the import to start immediately in a dialog, choose Immediate.
  2. At start time: If you want the import to start at a later time, choose this option. The import is scheduled as a background job in the target system. If you enter a date and time in the field No start after, the import is started in the time frame between Planned start and No start after. If there is no background process available in this window, the import will not happen. If you want the import to be performed regularly, you must choose a period in the field Period. The Period option does not exist for single transports and the transport workflow.
  3. After event: If you want the import to start only after an event is triggered, choose this option. If you choose the option Execute import periodically, the import is started each time the specified event is triggered. Otherwise, the import is started only when the event is triggered the first time. The Execute import periodically option does not exist for single transports and the transport workflow.


Starting an Import: Execution Tab

On the tab page Execution, you can specify how you want the transport control program tp to start:


  1. Synchronously: If you want the dialog or background process to wait until the import has been completely performed by tp, choose this option (figure 10). It is useful, for example, if subsequent actions are to be performed in the system after the import. If you schedule the import to run synchronously in the background, the background job, which performs the subsequent actions, can wait until the end of the import. A dialog process or background process is blocked until the import has ended.
  2. Asynchronously: If you want to release the dialog or background process after the transport control program has been started, choose this option (figure 11). It is useful if there are a lot of requests waiting for import, which would make the import take a long time. After tp has been started by the dialog or background process on the operating system level, the SAP process ends and tp imports the requests.

    The option Asynchronously is the default setting for importing projects or importing all the requests in an import queue. However, the option Synchronous is the default setting for importing single requests. For other import types, it is always asynchronous.


Starting an Import: Options Tab

All the options for starting an import in TMS are listed here (Figure 12). The options you choose depend on which import type you have chosen (project or individual import, import all requests, transport workflow).


                                                                    Figure 12

  1. Leave transport request in queue for later import: This causes these requests to be imported again in the correct order with the next import of all the requests. This option is useful if you have to make preliminary imports for individual requests. This prevents older objects from being imported at the next regular import of all the requests.
  2. Import transport requests again: The transport control program also imports the transport request if it already has been completely imported.
  3. Overwrite originals: The transport control program also imports objects if the objects are the originals in the target system. The object directory entry determines the SAP System where the original version of an object is located.
  4. Overwrite objects in unconfirmed repairs: The transport control program also imports objects if they were repaired in the target system and the repair is not yet confirmed.
  5. Ignore unpermitted transport type: The transport control program imports the transport request if this transport type was excluded by particular settings in the transport profile.
  6. Ignore predecessor relations: You can choose this option if you want to import all the requests for one or several projects, but additional requests from other projects exist for which there are dependencies. This option is switched off by default, which means the predecessor relationships are checked before the import. The import only occurs if the predecessor relationships will not be damaged.


Here in our case, we select only the option Ignore predecessor relations and proceed.

b)   Starting the Import of All Requests in an Import Queue

When you import all the requests from an import queue, they are imported in the order in which they are placed in the queue. Each import step is performed for all requests: first, all the Dictionary objects in the requests are imported, then all the Dictionary objects are activated, and then the main import is performed for all requests.

      c) Starting the Import of All Requests in One or More Projects

If you have assigned your transport requests to projects, you can import all requests that belong to a single project together. The requests are imported in the order in which they are placed in the import queue. This also applies if you want to import all the requests from multiple projects together: it is not the case that all the requests in one project are imported first, followed by all the requests in the next project; instead, they are imported in the order in which they are placed in the import queue.


Import History and Return Codes

Import History

Use: The import history displays all the requests imported into a particular system during a specific time interval, together with their maximum return codes.

Procedure: 
To check the import history, go to the Import Queue > Click on Import History.

Return Codes

Use: To check whether a transport request has been successfully imported, return codes (Figure 13) are generated by the programs used for the transport.

Procedure:
Click on Import History in the Import Queue screen. The return code for a particular request can be seen next to it.
                                                                        Figure 13

Re-import a transport request

Use: Sometimes a transport request needs to be imported again. To do this, the transport request has to be moved from the import history back into the import queue.

Procedure

Go to History > Extras > Other Requests > Add. 
Enter the transport request that needs to be re-imported and the target client, and check Import Again, as shown in Figure 14 and Figure 15.

21.JPG

The transport request is now back in the import queue, ready to be imported again. Select the request and click Transport Request, then select the required options on the Date and Execution tabs. On the Options tab you need to select the options below (Figure 16), because the request is being re-imported. Each option works as explained in the previous section.

16.jpg

                                              Figure 16

Delete a transport request from the Import Queue

Use: In exceptional cases, you may have to delete a transport request from the import queue so that the request is not imported into the target system. If you delete change requests from the import queue, inconsistencies may occur during the next import because of shared objects. Suppose you delete request 1, which contains a data element that is therefore not imported. In request 2, you transport a table that references this data element. Since the referenced data element does not exist in the target system, an activation error occurs when request 2 is imported.

Procedure
To delete a request from the queue: select the transport request (F9) > Request (menu) > Delete.

Updating the Import Overview and the Import Queues

 

Use: You can refresh the display of the import overview and of individual import queues. However, it is more convenient to update the import queues periodically in the background.

Procedure
To update import queues in the background > Enter transaction STMS; the Import Overview appears > Choose Extras > Update All Import Queues; the dialog box Update All Import Queues in Background appears > Choose the required option (Immediate, At Start Time, or Period) and enter the required data; use the input help to select a period > Choose Continue.

Schedule a periodic update of the import queues, at least in the SAP system where you use TMS the most (the default period, Daily, is recommended).

 

Related Content

http://help.sap.com/saphelp_nw2004s/helpdata/en/3d/ad5a8a4ebc11d182bf0000e829fbfe/frameset.htm
http://help.sap.com/saphelp_nw04/helpdata/en/0b/5ee7377a98c17fe10000009b38f842/content.htm
http://help.sap.com/printdocu/core/print46c/en/data/pdf/bcctstms/bcctstms.pdf

Posted by: Sai Prasad

Src : http://scn.sap.com/docs/DOC-25970

view comments...
 
25.  Transporting Objects in BW/BI

IMPORTANT THINGS WHILE TRANSPORTING OBJECTS IN BW

As beginners, we normally have some issues transporting objects across the landscape.
 
So I thought this would throw some light on collecting objects and transporting them across the landscape in SAP BW.
 
Firstly, we need to know about the T-codes SE09/SE10/SE01, which are normally used for transport purposes across the landscape.
 
 
 
I want to explain a scenario with the following data flow.
 
Example 1: 7.0 data flow
 
Multiprovider <-- Infocube <-- Transformation <-- DSO <-- Transformation <-- Datasource
 
The following objects need to be checked in a transport before transporting:
 
1. Multiprovider
2. Info cube
3. Transformation
4. DSO
5. Transformation
6. Data source.
Collect all the above objects in one transport.
 
The other example concerns the collection of process chain variants.
 
Example 2: For a process chain
 
Collect all the variants in the process chain before transporting (from the Extras menu) and create an object directory entry.
 
Apart from this, we also have the Transport Connection tool available.
 
SRC : http://bwdude-sapbi.blogspot.co.uk/2013/08/transporting-objects-in-bwbi.html
view comments...
 
26.  What is Master data

What is Master Data

 

 

Master data is a table that contains details about an entity; the entity here can be, for example, a customer, a material, or an employee. The master data for customers holds the details of all the customers in a sales organisation, such as name, date of birth, address, and phone number. This list of details for all customers is called customer master data; likewise, a list of all materials is called material master data.

 

 

The figure below illustrates a table with two customers and their respective details: an example of customer master data.

 

 What is Master Data

 

The table has the following fields

  1. Customer ID
  2. Name
  3. Customer Address
  4. Customer Phone
  5. Language
  6. Description

 

Of the above fields, Name, Customer Address, and Customer Phone are details of the respective customer ID, while the Language and Description fields record the language in which the record was entered and a description in that language.

In BW/BI terminology, the detail fields are called attribute fields, and the Language and Description fields are called the text fields.

 

As illustrated in the figure, master data can have only attributes, only texts, or both attributes and texts, depending on the requirements.

 

In the example below we have two tables: a material master data table holding the details of each material, and a material group master data table holding the details of the various material groups.

 

 What is Master Data reference

 

In the figure, the fields Name, Price, Price Unit, and Material Group ID are details (attributes) of the Material ID field; the Material Group ID field in turn has its own attribute, Group Name, and its own text, Description.
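To make the attribute/text split concrete, here is a minimal sketch of how BW physically stores such master data for a hypothetical InfoObject ZCUSTID (the generated tables would be named along the lines of /BIC/PZCUSTID for attributes and /BIC/TZCUSTID for texts; the field names below are illustrative, not the generated ones):

TYPES:
  " Attribute table of ZCUSTID, e.g. /BIC/PZCUSTID
  BEGIN OF ty_p_zcustid,
    customer_id TYPE c LENGTH 10,   " key: the characteristic value
    name        TYPE c LENGTH 35,   " attribute
    address     TYPE c LENGTH 60,   " attribute
    phone       TYPE c LENGTH 16,   " attribute
  END OF ty_p_zcustid,

  " Text table of ZCUSTID, e.g. /BIC/TZCUSTID
  BEGIN OF ty_t_zcustid,
    customer_id TYPE c LENGTH 10,   " key: the characteristic value
    langu       TYPE c LENGTH 1,    " language in which the record was entered
    txtmd       TYPE c LENGTH 40,   " description in that language
  END OF ty_t_zcustid.

The language key in the text table is what allows the same customer to carry one description per logon language.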

view comments...
 
27.  TVs – The Newest Mobile Device

“Mobile OS” may no longer be the best term for the software powering our phones and tablets. Google’s Chromecast and Apple TV are tangible examples of the cool stuff mobile platforms can do when they kick back in our living rooms. With hugely popular Kickstarter campaigns putting similar products on store shelves and persistent rumors of an Amazon-branded set-top box, it’s fair to assume the media server is making itself at home.

New products, same story

Chromecast is to TVs as Android is to smartphones. Like all Google devices, Chromecast has far fewer barriers to entry than its competitors. According to a recent PCMag.com article, the Google Cast SDK lets brands and media properties stream their content with a few modifications instead of having to build an app from the ground up.

The other leading cable box alternative, Apple TV, takes a characteristically insular approach to software. Most of the movies, music and shows streamed through Apple TV come from the iTunes store, which offers a diverse à la carte menu of media content. That’s not to say they’re completely resistant to outside content, as seen with their built-in Hulu Plus and Netflix capabilities. Third parties can feasibly bring their apps to Apple TV, but is it worth the extra effort now that Chromecast is on the market?

Apple TV’s backdoor entry

Though there’s no real App Store for Apple TV – permitted apps come pre-installed – it’s possible to sneak onto viewers’ televisions through Apple’s proprietary streaming technology, AirPlay. When paired with a mobile iOS device, AirPlay turns the user’s TV screens into an independent source of visual information and their iPad, iPhone or iPod Touch into a sophisticated remote control.

Vidora is one of the first apps to seize the streaming opportunity. Used on its own, this iPad app lets viewers pull content from popular online destinations like Hulu and Amazon Instant Video. Connecting it to an Apple TV via AirPlay brings the video content from the small screen to any display you can imagine (as long as it’s Apple TV-compatible).

Supersizing your video content isn’t Vidora’s only benefit. The app also provides pertinent recommendations – much like Netflix – and shows you exactly where to find the video you’re searching for. Once you’ve navigated the peaceful waters of Vidora with your iPad, you’ll never want to use your Apple TV remote again.

Follow the demand

According to Russ Crupnick, senior vice president of industry analysis at the NPD Group, over half of streaming TV viewers are between 18 and 34 years old. In the same way millennials have largely eschewed landlines for cell phones, today’s youth will have no problem cutting cable television for a cheap, convenient alternative like mobile-based media servers.

It makes perfect sense from their financial perspective. While it may sting to lose the few bits of content that require a cable agreement – like live sports – paying $50 or more each month for a bunch of TV shows you don’t even watch hurts worse, especially for young adults on a shoestring budget.

The abundance of unwanted channels is a large part of the reason people clamor for services like HBO GO to offer a standalone option, even at a higher cost. Paying more for a service that better suits your lifestyle doesn’t seem illogical when it eliminates a larger monthly bill. The only thing that doesn’t make sense is why so many people are still attached to their cable box.

Source : http://blogs.sap.com/innovation/mobile-applications/tvs-the-newest-mobile-device-0431646

view comments...
 
28.  SMS Still Rulz

SMS is a killer app. It’s so simple. There’s no app to download. There’s no unexpected crashing. You don’t have to have a smartphone. In a world with a lot of complicated technology, I love to see companies still finding creative, useful ways to use SMS. (Yes, I’ve beaten this drum before.)


Like this one: Birds Eye has partnered with a couple of health-related nonprofits to help end childhood obesity in the U.S. The program is called Dinner Made Easy. Text a shortcode to subscribe, and you get a couple of text messages, each with recipes, nutritional information and tips about making healthy food choices.

At London’s Charing Cross underground station, the police have a new campaign that solicits information about non-emergency incidents—by text. You simply send a text to a shortcode. The police gather more information this way, and can probably save time on typing up reports with the old copy-and-paste.

Another at Charing Cross: Brook Street recruitment firm is using SMS as a simple and immediate call-to-action for managers looking to attract and retain new staff. SMS makes for an ideal CTA, as you can quickly fire off a text. Very unlikely you would stop and send an email on your way through the station, let alone download an app.

Bank of Queensland customers find the nearest ATMs and branches by texting their location (as city and state or postcode) to a shortcode. The bank will return an SMS with the location of up to four ATMs or branches.


Earlier this year, Citi introduced a super-advanced ATM that extends how much banking you can do outside of a branch. It has biometric identity authentication, an online banking connection, video conferencing and—SMS! SMS still provides the universal mobile service to Citi account holders, which the bank uses for sending information, alerts, dispute resolution notices and one-time PINs for online banking.

And one more train station example: Colgate has been running a promotion for its new electric toothbrushes at London’s Waterloo Station. In addition to the electronic billboard ads, radio spots and newspaper inserts, Colgate used SMS so you could set a reminder to go to the station at the right time. Although by the huge queues I saw that day, perhaps it was a little too successful!

 
 Source : http://blogs.sap.com/innovation/mobile-applications/sms-still-rulz-0353964
view comments...
 
29.  What Can Mobile App Development Teams Learn from a Spinning Top?

For many, it is hard to imagine a world when simpler, non-electronic, toys were the primary options for fun. How quickly we seem to forget!

IT and C-level executives might be surprised to discover there are three business lessons that can still be learned from a simple toy like a spinning top.

Here are three for consideration:

Simplicity can increase durability

Sometimes, the simplest concept can stand the test of time. Archeologists have found spinning tops that date back over five thousand years. And, here’s the most amazing stat: they still function today exactly as they did then. What about your mobile app? How will it stand the test of time? Is anyone taking bets that a cell phone, or any of its apps, will still be working five thousand years from now? How about one year from now?

Tipping points

Without going into the physics of how a spinning top works, suffice it to say that once a top is spinning correctly on a smooth surface, it will continue to do so for as long as its spinning inertia can maintain balance. Once inertia begins to slow, balance falters, and the spinning top reverts to being just an inert object. Eerily, this description fits mobility software programs, too. Finding the right balance of software features and functions, without making it overloaded, may make the critical difference in the lifespan of the product. There are tipping points at which all software programs stop being useful. And an unused software app is another definition of an inert object. Have you identified your tipping points?

User interfaces

A complex concept implies complex user interfaces. Plus, a complex concept has more points of failure than a simpler concept. A spinning top is an intuitive product. The very design of a top invites the user to give it a spin with a flick of the wrist. When users look at your mobile app, what is appealing and inviting about it? Is it intuitive or intimidating? Are users ready to give it a flick or a swipe to get started?

Final thoughts

What are your best case hopes and aspirations for the life of your mobile app? A year? More than two years? More than five years?? Perhaps emulating the lessons learned from a spinning top will help produce positive influences on your mobile application projects. Aim high!

Source from: http://blogs.sap.com/innovation/mobile-applications/what-can-mobile-app-development-teams-learn-from-a-spinning-top-0426731

 

view comments...
 
30.  SAP ERP Simulation Game by BATON Simulations

The SAP ERP application has the potential to transform your business.   But getting the best possible return on your software investment ultimately comes down to several things including the strength of your implementation team, business user ability, and how well managers and executives understand what the software can do.  The likelihood of a successful implementation increases dramatically when your team is committed and excited to learn how to use the software.  But instilling this excitement can be a challenge.

 

Research shows that traditional training methods focusing on transactions and keystrokes aren't as effective as experiential, hands-on learning practices.  To take full advantage of the power and potential of SAP software, your business needs to be engaged and invested in the learning process.  The SAP ERP Simulation game by Baton Simulations offers your organization a proven way to get new users to accept SAP software.  It also helps existing users increase their understanding of the software so they can use it more effectively and collaboratively in your organization.

 

As a hands-on learning game, SAP ERP Simulation is played in a classroom setting, in teams, on a live instance of the SAP ERP application.   The interactivity of the game  encourages your learners to work together to achieve true collaboration during the execution of enterprise business processes.

 

Here is how the game is played:  Throughout the game, participants interact in the software to demonstrate how their individual contributions impact other parts of the business.  For example, one team member prepares a forecast and orders raw materials.   To do so, they access screens and transactions relating to independent requirements and material resource planning.  At the same time, another team member adjusts pricing and makes marketing decisions based on sales data, and market intelligence, that is being monitored by a third team member.  Your learners will see that the best results require not only great individual execution, but great teamwork. 

 

During the course of the game, team members will:

 

  • interact with suppliers and customers by sending and receiving orders,
  • manage inventory levels,
  • document the production and delivery of products,
  • manage cash flow, and
  • make decisions about marketing, plant, and distribution-system improvements.

 

The game accelerates participants along the learning curve.  It also generates tremendous motivation among business users, executives, and project teams.

 

After playing the game, participants report:

 

  • significantly increased skill,
  • more positive attitudes,
  • greater confidence in their ability to master software transactions and reports, and
  • stronger belief in the potential of SAP software to add value to the enterprise.

 

In short, they are ready to go -- with enthusiasm, understanding, and positive expectations.  Positive attitudes and adequate preparation can reduce your organization's training and support costs while shortening the ramp-up time for new users.  The game provides deep learning embedded in engaged doing -- with results that help your organization achieve the best possible return on your software investment.  Ultimately, SAP ERP Simulation adds value to your enterprise by helping business users leverage the power and potential of SAP software.  

 

PRODUCT DETAILS

 

SAP ERP Simulation is a classroom based competitive business game, played in a live SAP environment. It provides a compelling way for the learning 2.0 generation members of your workforce to harness the power of SAP solutions in their day to day activities. SAP ERP Simulation can be purchased two ways:

 

  • As a six (6) month subscription, allowing multiple members of your organization to access and utilize the system at their convenience.
  • As a single-day game delivered at the customer location.

 

LINK TO THE US SAP ERP SIMULATION BY BATON SIMULATIONS WEBPAGE:   http://www.sap.com/usa/services/education/softwareproducts/erp-simulation/index.epx

 

LINK TO OVERVIEW DEMONSTRATION OF SAP ERP SIMULATION BY BATON SIMULATIONS:  https://sap.na.pgiconnect.com/p43457272/

 

LINK TO QUICK OVERVIEW OF SAP ERP SIMULATION BY BATON SIMULATIONS:  http://www.sap.com/demos/richmedia/rm_3p_mediaPlayer.epx?movieSource=/demos/RichMedia/videos/sap-erp-simulation-game-by-baton-simulations-inspiring-employees-to-master-sap-software-through-collaborative-business-games-10-ov-us.epx&mWidth=720&mHeight=405

Src: http://scn.sap.com/people/kenneth.schieffer/blog/2010/07/16/sap-erp-simulation-game-by-baton-simulations-inspiring-employees-to-master-sap-software-through-collaborative-business-games

 

 

view comments...
 
31.  SAP BW 7.30 : Performance Improvements in Master-Data related scenarios and DTP Processing
With data warehouses around the world growing rapidly every day, the ability of a data warehousing solution to handle mass data, and thus fit into the ever-shrinking time windows for data loads, is fundamental to most systems.
BW 7.3 recognizes the need of the hour with several performance-related features. In this blog I will discuss the performance features related to data loads in SAP BW 7.3, focusing mainly on master data loads and DTP processing.
Here is the list of features addressed in this blog:

Master Data
  1. Mass Lookups during Master Data Loads
  2. The “Insert-Only” flag for Master Data Loads.
  3. The new Master Data Deletion
  4. SID Handling
  5. Use of Navigational Attributes as source fields in Transformations. 
DTP Processing
  1. Repackaging small packages into optimal sizes.

MASTER DATA

1.   Mass Lookups during Master Data Load

Data loads into a master-data-bearing characteristic require database look-ups to find out whether records already exist on the database with the same keys as the ones being loaded. In releases prior to SAP BW 7.3, this operation was performed record-wise, i.e. for every record in the data package a SELECT was executed on the database table(s). Obviously, this resulted in a lot of communication overhead between the SAP application server and the database server, slowing the master data loads down. The effect is pronounced in data loads involving large data volumes.
This overhead between the SAP application server and the database server has now been addressed by performing a mass lookup on the database, so that all records in the data package are looked up in one attempt. Depending on the DB platform, this can bring up to a 50% gain in load runtimes.
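As a rough illustration of the difference (this is not the actual BW kernel code), compare one SELECT per record with one SELECT per data package. The generated P table /BIC/PZCUST and its field /BIC/ZCUST are hypothetical stand-ins for a master-data-bearing characteristic:

TYPES: BEGIN OF ty_key,
         customer TYPE c LENGTH 10,
       END OF ty_key.

DATA: lt_package  TYPE STANDARD TABLE OF ty_key,  " keys arriving in the data package
      lt_existing TYPE STANDARD TABLE OF ty_key,  " keys already on the database
      ls_key      TYPE ty_key.

" Pre-7.3 style: one database round trip per record
LOOP AT lt_package INTO ls_key.
  SELECT SINGLE /bic/zcust FROM /bic/pzcust
    INTO ls_key-customer
    WHERE /bic/zcust = ls_key-customer.
ENDLOOP.

" 7.3 style: one mass round trip per data package
IF lt_package IS NOT INITIAL.  " FOR ALL ENTRIES on an empty table would select everything
  SELECT /bic/zcust FROM /bic/pzcust
    INTO TABLE lt_existing
    FOR ALL ENTRIES IN lt_package
    WHERE /bic/zcust = lt_package-customer.
ENDIF.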

2.   The ‘Insert-Only Flag’ for Master Data Loads

  • Starting with NW 7.30 SP03, this flag is renamed to "New Records Only". The renaming was done to align with a similar feature supported by the activation of DSO data. (See the blog Performance Improvements for DataStore Objects.)
As mentioned above, the Master Data Load performs a look-up on the database for every data-package to ascertain which key values already exist on the database. Based on this information, the Master Data load executes UPDATEs (for records with the same key already existing in the table) or INSERTs (for records that don’t exist) on the database.
With the ‘Insert-Only’ feature for Master Data loads using DTPs, users have the opportunity to completely skip the look-up step, if it is already known that the data is being loaded for the first time. Obviously, this feature is most relevant when performing initial Master Data loads. Nevertheless, this flag can also be useful for some delta loads where it is known that the data being loaded is completely new.
Lab tests for initial Master Data loads indicate around 20% reduction in runtime with this feature. 
The ‘Insert-Only’ setting for DTPs loading Master Data can be found in the DTP Maintenance screen under the ‘UPDATE’ tab as shown below.
 
Insert-Only Flag
Note :
  • If the 'Insert-Only' flag is set and data is found to already exist on the database, the DTP request aborts. To recover from this error, the user simply needs to uncheck the flag and re-execute the DTP.

3.   The New Master Data Deletion

Deleting master data in BW has always been a performance-intensive operation. The reason is that before any master data can be physically deleted, the entire system (transaction data, master data, hierarchies, etc.) is scanned for usages. Therefore, if a lot of master data is to be deleted, it takes some time to establish which data is deletable (i.e. has no usages) and which is not (has usages). In addition, with the classical master data deletion involving large data volumes, users sometimes ran into memory-overflow dumps.
To address these issues, the Master Data Deletion was completely re-engineered. The result is the New Master Data Deletion. In addition to being much faster than the classical version, the new Master Data deletion offers interesting new features like Search-modes for the usage check, Simulation-mode etc. The screen shot below shows the user interface for the new Masterdata Deletion when accessed via the context menu of InfoObjects in the DataWarehousing Workbench.
 
New Master Data Deletion
 
Although the new Master Data Deletion has been available for some time (since BW 7.00 SP 23), it was never the default version in the system, which meant that BW system administrators needed to switch it on explicitly. As of BW 7.30, the New Master Data Deletion is the default version and no further customizing is necessary to use it.
All further information about this functionality is documented in SAP Note 1370848 under https://websmp130.sap-ag.de/sap(bD1lbiZjPTAwMQ==)/bc/bsp/spn/sapnotes/index2.htm?numm=1370848
It can also be found in the standard SAP BW documentation under http://help.sap.com/saphelp_nw73/helpdata/en/4a/373cc45e291c67e10000000a42189c/frameset.htm

4.   SID Handling

This feature relates to the handling of SIDs in the SAP BW system and while it is certainly relevant for Master Data loads, it is not restricted to it. The performance improvements in SID handling are relevant for all areas of SAP BW where SIDs are determined, for example – Activation of DSO Requests, InfoCube Loads, Hierarchy Loads and in some cases, even Query processing.
In BW 7.30, SIDs are determined en masse, meaning that database SELECTs and INSERTs that were previously executed record-wise have been changed to mass SELECTs (using the ABAP SELECT ... FOR ALL ENTRIES construct) and mass INSERTs. The system switches to this mass-data processing mode automatically when the number of SIDs to be determined is greater than a threshold value; the default value of this threshold is 500.
The threshold value is, of course, customizable. This can be done in the SAP IMG (transaction SPRO) by following the path: SAP NetWeaver -> Business Warehouse -> Performance Settings -> Optimize SID-Determination for MPP-Databases.
Note: As the threshold value corresponds to the minimum number of SIDs to be determined in one step, setting the threshold to a very high value (for example, 100000) causes the system to switch back to the classical behavior.

5.   Use of Navigational Attributes as source fields in Transformations

Quite often there are scenarios in SAP BW where data being loaded from a source to a target needs to be augmented with information looked up from the master data of InfoObjects, for instance loading sales data from a source that contains data at material level into a DataTarget where queries require the sales data aggregated by material group. In such cases, the Master Data Lookup rule-type in Transformations is used to determine the material group for any given material (given that Material Group is an attribute of Material).
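For reference, the classic lookup that this alternative can replace is typically coded along the following lines in a start routine. This is a hedged sketch: SOURCE_PACKAGE and its MATERIAL field depend on your source structure, while /BI0/PMATERIAL is the attribute table of 0MATERIAL:

" Buffer the material group for all materials in the package
TYPES: BEGIN OF ty_mat,
         material   TYPE /bi0/oimaterial,
         matl_group TYPE /bi0/oimatl_group,
       END OF ty_mat.

DATA: lt_mat TYPE HASHED TABLE OF ty_mat WITH UNIQUE KEY material.

IF source_package IS NOT INITIAL.
  SELECT material matl_group
    FROM /bi0/pmaterial
    INTO TABLE lt_mat
    FOR ALL ENTRIES IN source_package
    WHERE material = source_package-material
      AND objvers  = 'A'.
ENDIF.

" The individual rules can then READ TABLE lt_mat to fill Material Group.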
Although the performance of the Master Data Lookup rule-type was already optimized in earlier versions of BW (starting with BW 7.0), BW 7.30 offers an alternative to this rule-type: navigational attributes of InfoObjects are now available as source fields in Transformations. The benefits of this feature are two-pronged.
  • The fact that the data from the navigational attributes is available as part of the source structure allows the data to be used in custom logic in Transformations (example : Start Routines).
  • Secondly, the data from the navigational attributes is read by performing database joins with the corresponding Masterdata tables during extraction. This helps in improving the performance of scenarios where a lot of look-ups are needed and/or a lot of data is to be looked-up.  
To use this feature in Transformations, the navigational attributes need to be switched ON in the source InfoProvider in the InfoProvider maintenance screen as below -
 
Navigation Attributes in InfoProviders
 
Once this is done, the selected navigational attributes are available as part of the source structure of Transformations as shown below – 
 

Navigation Attributes in Transformations

DATA TRANSFER PROCESS (DTP) 

1.     Repackaging small packages into optimal sizes

This feature of the DTP is used to combine several data packages in a source object into one data package for the DataTarget. This feature helps speed up request processing when the source object contains a large number of very small data packages.
This is usually the case when memory limitations in the source system (for example, an SAP ERP system) result in very small data packages in the PSA tables in BW. This DTP setting can be used to propagate the data to subsequent layers in BW in larger chunks.
Also, InfoProviders in BW used for operational reporting using Real-time Data Acquisition contain very small data packages. Typically, this data is propagated within the DataWarehouse into other InfoProviders for strategic reporting.  Such scenarios are also a use-case for this feature where data can be propagated in larger packets.
As a prerequisite, the processing mode for the DTP needs to be set to ‘Parallel Extraction and Parallel Processing’. Also note that only source packages belonging to the same request are grouped into one target package.
Below is a screenshot of the feature in the DTP Maintenance.
view comments...
 
32.  Help with updated 2LIS_03_BF
 I have noticed that there are additional fields in 2LIS_03_BF in our current Business Content.  
 
1. Is it possible to install and utilize this updated datasource without having to deactivate the queue in LBWE and request downtime on R3 Side?
 
2. Will the new fields be automatically populated once I just install this datasource?
 
 
Here's a list of the additional fields I'm seeing in the Business Content version.
 
ATTYP Material Category
INVKZ Indicator: Movement type is physical inventory
ITEM_CAT EA Retail BW Extr. Enhancement: Item Type
KORR Indicator: Movement type is inventory correction
MATST Structured material
REC_TYPE EA Retail BW Extr. Enhancement: Data Record Type
RETKZ Indicator: Return from call
SAUTO Item automatically created
UMLKZ Indicator: Movement type is stock transfer
UMMATKZ Transfer Posting Material to Material
VKSTA Value at sales prices excluding value-added tax
VKSTT Value at Sales Prices Including Value-Added Tax
VLFKZ Plant category
WAERSST Currency Key



Sol: These fields are all appended at the extract structure level. To fetch data into these fields, you have to write either a user exit or a BAdI; a sketch follows below.
 
Use the following link for enhancing the structure and filling the data for these fields:

http://bi.sdn-sap.com/2013/06/help-with-updated-2lis03bf.html
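For the user-exit route, the classic place is CMOD enhancement RSAP0001, whose function exit EXIT_SAPLRSAP_001 handles transaction data; the code goes into include ZXRSAU01. A hedged sketch, assuming MC03BF0 is the extract structure of 2LIS_03_BF (verify it in RSA6) and ZZMYFIELD is one of your appended fields:

" Include ZXRSAU01 (function exit EXIT_SAPLRSAP_001)
DATA: ls_mc03bf0 TYPE mc03bf0.

CASE i_datasource.
  WHEN '2LIS_03_BF'.
    LOOP AT c_t_data INTO ls_mc03bf0.
      " Derive the appended fields for the current record here,
      " e.g. by reading MARA or MSEG.
      " ls_mc03bf0-zzmyfield = ...
      MODIFY c_t_data FROM ls_mc03bf0.
    ENDLOOP.
ENDCASE.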


Sol:
You need to replicate the DataSource in the BI system in order to get the changes.
 
It is not best practice to enhance the DataSource with many fields, as you may get performance issues.
 
If you need these fields, consider creating a generic extractor on a table/view and extracting the data that way.
view comments...
 
33.  SAP BI Can we do remodelling on DSO
I need to populate data in a DSO for a new characteristic that has been added to the DSO. Can the SAP BI remodelling tool be used on this DSO to populate the data?
The DSO already contains historical data that has been loaded. Can somebody help me with a solution?
view comments...
 
34.  Create Index is taking more time to complete
In my scenario we are loading a full load from one DataSource and a delta load from a DSO (the DSO's data comes from a different DataSource) into the same target, 0FIAP_C30.
We are not deleting the data from the cube because of the delta records, so the number of data records in the cube increases day by day.
Because of this, the Create Index variant is taking a long time, around 2:30 hours.
 
Process chain flow:
 
Data is loaded from 0FI_AP_30 (DS) to the cube directly with a full update, and the data of 0FI_AP_3 and 0FI_AP_4 is loaded to a (standard) DSO with a full update; from the DSO we load a delta to the cube.
 
Could someone please help me reduce the completion time of the Create Index variant?

Sol: 


Hopefully you are dropping the indexes before you load to the cube; this improves loading performance. Creating secondary indexes on cubes is not mandatory unless your reporting performance has degraded. If there is no such issue, you can remove the drop/build index steps from your process chain, because the data in the cube will keep growing and it will become very difficult to maintain in terms of roll-up of aggregates, compression, etc.

These indexes are just secondary indexes on the cube. The actual performance of the cube depends on the design of your dimensions/characteristics. You can improve its performance with aggregates, compression, partitioning, etc.
Creating indexes is always time-consuming when your cube is loaded with full loads, because the data volume keeps increasing rapidly.


It is not mandatory to drop the index before loading data into the cube. If your daily delta load is more than 10% of the data in the cube (not a hard and fast rule), it makes sense to drop the index and recreate it after the load. Also, if the cube is not compressed, the create-index time will be longer, because each request forms a segment and index creation happens on each of these segments. So do the following for your issue:
1) Try loading without dropping the index.
2) If you get a DBIF_RSQL_SQL_ERROR during the load, which happens when the index is not dropped, go for dropping and recreating the index.
3) Compress the cube if you are dropping and recreating the index.

If the data volume is high, creating the indexes is going to take time.
Generally it is advisable to delete and rebuild the indexes around the data load; the benefit is a faster load, as loading data into an already indexed table is slower than loading without indexes.
If you keep the indexes as they are and load the data, you are trading a longer load time for the saved index-creation time.
One more thing to consider in the above case: a deadlock may arise if your DTP uses more than one batch process, so set it to 1 to avoid an Oracle deadlock during loading (this further increases the data load time).
You have to make the decision based on your scenario and timings.
view comments...
 
35.  SAP BI Questions and Answers I

Some free basic questions and answers on SAP BW / SAP BI.

Hope this helps all users. These are some basic questions on SAP BW; soon I will be posting some more. Enjoy!

 

What are the advantages of the extended star schema of BW vs. the star schema?

 

  1. Uses generated numeric keys and holds aggregates in their own tables for faster access.
  2. Uses external hierarchies.
  3. Supports multiple languages.
  4. Contains master data common to all cubes.
  5. Supports slowly changing dimensions.

 

What is the "myself data mart"?

 

A BW system feeding data to itself is called the myself data mart. It is created automatically and uses ALE for data transfer.

 

 

How many dimensions are there in a cube?

 

There are a total of 16 dimensions in a cube. Of these 16, three are predefined by SAP (time, unit, and request), which leaves the customer with 13 dimensions.

 

What is an aggregate?

 

Aggregates are mini-cubes. They are used to improve performance when executing queries; you can equate them to indexes on a table. Aggregates are transparent to the user.

 What is the transaction for the Administrator workbench?

Transaction RSA1

 

 

What is a calculated key figure?

 

A calculated key figure is used to do complicated calculations on key figures such as mathematical functions, percentage functions and total functions. For example, you can have a calculated key figure to calculate sales tax based on your sale price.

 

What is the enhancement user exit for BEx reporting?

 

RSR00001

 

What is a characteristic variable?

 

You can have dynamic input for characteristics using a characteristic variable. For example, if you are developing a sales report for a given product, you will define a variable for 0MATERIAL.

 

What is a condition?

 

If you want to filter on key figures or do a ranked analysis then you use a condition. For example, you can use a condition to report on the top 10 customers, or customers with more than a million dollars in annual sales.

 

What are the differences between OLAP and OLTP applications?

 

OLAP: summarized data; read-only; not optimized for transaction processing; a lot of historical data.

OLTP: detailed data; read/write; optimized for transactional applications; less historical data.

 

What is a star schema?

 

A fact table at the centre, surrounded by (linked to) dimension tables.

 

What is a slowly changing dimension?

 

A dimension containing characteristics whose values change over a period of time. For example, take an employee's job title: it changes over time as the employee moves through an organization. This is called a slowly changing dimension.

 


 What is the namespace for BW?

 

All SAP objects start with 0, and the customer namespace is A-Z. Tables begin with /BI0/ for SAP objects and /BIC/ for customer objects. Generated objects start with a digit from 1 to 8 (for example, export DataSources start with 8). The prefix 9A is used in APO.

 

What is an InfoObject?

 

InfoObjects are business objects, e.g. customer and product. They are divided into characteristics and key figures. Characteristics are evaluation objects such as customer; key figures are measurable objects such as sales quantity. Characteristics also include special objects like unit and time.

 

What are time-dependent texts / attributes of characteristics?

 

If a text (for example the name of a product or person) or an attribute changes over time, it must be marked as time dependent.

 

Can you create your own time characteristics?    

 

No

 

 

What is meant by Alpha conversion?

 

Alpha conversion is used to store data consistently. It does this by storing numeric values prefixed with zeros, e.g. if you have defined a material number of length 6 (type NUMC), then material number 1 is stored as 000001 but displayed as 1. This removes inconsistencies such as 01 vs. 001.
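The padding and stripping are done by the standard function modules CONVERSION_EXIT_ALPHA_INPUT (external to internal format) and CONVERSION_EXIT_ALPHA_OUTPUT (internal to external); a minimal sketch:

DATA: lv_matnr TYPE matnr.

" External value '1' is padded with leading zeros for storage
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
  EXPORTING
    input  = '1'
  IMPORTING
    output = lv_matnr.   " now holds the zero-padded internal value

" ...and the zeros are stripped again for display
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_OUTPUT'
  EXPORTING
    input  = lv_matnr
  IMPORTING
    output = lv_matnr.   " now holds '1' again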

 

What is the alpha check execution program? 

 

This is used to check consistency for BW 2.x before upgrading the system to 3.x. It is RSMDCNVEXIT.

 

What is the attribute-only flag?

 

If this flag is set, no master data is stored; the InfoObject can only be used as an attribute of other characteristics, for example comments on an Accounts Receivable document.

 

What are the data types allowed for key figures?

 

  • Amount
  • Quantity
  • Number
  • Integer
  • Date
  • Time

 

What are the aggregation options for key figures?

 

If you are defining prices, you may want to set "no aggregation", or you can define MAX, MIN, or SUM. You can also define exception aggregation (first, last, etc.). This is helpful for getting a headcount, e.g. if you define a monthly inventory count key figure, you want the count as of the last day of the previous month.

 

What is the maximum number of key figures you can have in an InfoCube?

 

233

 

What is the maximum number of characteristics you can have per dimension?   

 

248

 

What is a SID table and what are its advantages? 

 

The SID table (Surrogate ID table) is the interface between master data and the dimension tables.

 

Advantages include:

Using 8 byte integer values as indexes for faster access

Master data is independent of InfoCubes

Supports multiple languages

Supports slowly changing dimensions

 

 

 

What is the transfer routine of the InfoObject?

 

It is like a start routine: it is independent of the DataSource and valid for all transfer rules. You can use it to define global data and global checks.

 

 What is the DIM ID?  

 

These are dimension IDs. DIM IDs link the dimensions to the fact table. Like the SID, a DIM ID is an 8-byte integer.

 

What is a table partition?

 

By partitioning we split the table into smaller tables, which is transparent to the application. This improves performance (when reading as well as deleting data). SAP uses fact table partitioning to improve performance. Note that you can only partition on 0CALMONTH or 0FISCPER.

 

Remember that the partition is created only on the E fact table; the F fact table is partitioned by request number by default.

 

Advantages of a partition:

  • Makes use of parallel processes
  • Allows a smaller set of data to be read
  • Allows fast deletion

 

How many extra partitions are created, and why? Can you partition a cube with data?

 

Usually 2 extra partitions are created: one to accommodate data before the beginning of the partitioning period and one for data after the end of it.

 

No, you cannot partition a cube with data: a cube must be empty to partition it. One workaround is to copy cube A to cube B, export the data from A to B using an export DataSource, then empty cube A, create the partitions on A, re-import the data from B, and delete cube B. Note that this is going to change in NetWeaver 2004s (BW 7).

 

What is a source system?  

 

Any system that sends data to BW, such as R/3, a flat file, an Oracle database, or a non-SAP system.

 

 

 

What is a DataSource and what is an InfoSource?

 

Data source: the source that sends data to a particular InfoSource in BW. For example, we have the 0CUSTOMER_ATTR DataSource to supply attributes to 0CUSTOMER from R/3.

 

InfoSource: a group of logically related objects. For example, the 0CUSTOMER InfoSource contains data related to the customer, with attributes like customer number, address, phone number, etc.

 

What are the 4 types of InfoSources?

 

  • Transactional
  • Attributes
  • Text
  • Hierarchy

 

What is a communication structure?   

 

It is an independent structure created from an InfoSource. It is independent of the source system / DataSource.

 

What are transfer rules, and what is the global transfer rule?

Transfer rules: the transformation rules for data from the source system to the InfoSource / communication structure. These are used to clean up the data from the source system.

 

For example, when you load customer data from a flat file, you can convert the name to upper case using a transfer rule, as in the sketch below.
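A minimal sketch of the idea (the exact FORM signature generated for a transfer routine varies by release, so only the core is shown; the name field is an assumed field of the transfer structure):

" The heart of such a routine is the TRANSLATE statement:
DATA: lv_name TYPE c LENGTH 35 VALUE 'john smith'.

TRANSLATE lv_name TO UPPER CASE.   " lv_name is now 'JOHN SMITH'

" In the generated transfer routine you would assign the transfer-structure
" field to RESULT, TRANSLATE it, and set RETURNCODE = 0.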

 

Global transfer rule: this is a transfer routine (ABAP) defined at the InfoObject level. It is common for all source systems.

 

What is the process of replication and what menu path would you use to perform it?  

This copies DataSource structures from R/3 to BW. For example, assume you added a new DataSource in R/3; it will not be visible in the BW system until you replicate it.

You replicate using transaction RSA1 > Source Systems > right-click on the system > Replicate. You can also replicate at an InfoArea level.

 

What is the update rule?

The update rules define the transformation of data from the communication structure to the data targets. They are independent of the source systems / DataSources. For example, you can use an update rule to change data globally, independent of the source system.

What are the options in update rules? 

 

  • One-to-one move of an InfoObject value
  • Constant
  • Lookup of a master data attribute value
  • Formula
  • Routine (ABAP)
  • Initial value

 

What are the special conversions for time in update rules?  

Time dimensions are converted automatically. For example, if the cube contains calendar month and your transfer structure contains a date, the date is converted to calendar month automatically.

 

What is the start routine?  

The first step in the update process is to call the start routine. Use it to fill global variables that the update routines will then use; see the sketch below. It is also the first step in the transformation process, before the transfer rules.
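A hedged sketch of the idea (the generated FORM parameters differ by release; G_EXCHANGE_RATE is an illustrative global variable declared in the global part of the routine):

" Global part of the update-rule routine pool:
DATA: g_exchange_rate TYPE p DECIMALS 5.

" Start routine: executed once per data package, before the update routines
FORM startroutine CHANGING abort TYPE sy-subrc.
  " Do expensive work once per package instead of once per record,
  " e.g. buffer a rate that the individual update routines then reuse.
  g_exchange_rate = '1.23450'.
  abort = 0.   " 0 = continue processing the package
ENDFORM.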

 

What is the conversion routine for units and currencies in the update rule?

Using this option you can write ABAP code for unit/currency conversion. If you enable this flag, the unit of measure of the key figure appears in the ABAP code as an additional parameter. For example, you can use this to convert a quantity in pounds to a quantity in kilograms.

  

How do you create the  "myself data mart"?   

The BW system feeding data to itself is called the myself data mart. It is created automatically and uses ALE for data transfer.

 

  1. Right-click and create the export DataSource for the ODS/cube or PSA.
  2. Replicate the DataSource in the target system.
  3. Create transfer rules and update rules.
  4. Create an InfoPackage to load the data.

 

 

view comments...
 
36.  Safety Upper Limit and lower limit SAP BW Generic Datasource
Hello
 
Can somebody help me understand what the safety upper limit and lower limit are when creating a generic DataSource in the generic delta options?
 
Thanks
view comments...
 
37.  SAP BW REPLACEMENT PATH IN VARIABLES
WHAT IS A REPLACEMENT PATH IN SAP BW?
 
Replacement path is used in variables so that, instead of prompting the user, the value is taken from another object.

In a sample report, if we want the user to be able to select the airline, we can create a variable in the selection under the characteristic Airline ID.

However, the description field is something we need to populate automatically based on the selected airline, so in this case we create a variable under Description.

 

Replacement Path Variable SAP BW

When we create the new variable, we give the description and technical name, and for Processing By we choose Replacement Path instead of Manual Entry (which would prompt the user).

Replacement Path in Variables SAP BW

Once we select the replacement path, we define which object the value should be taken from; in this case we use Airline ID.

 Replacement Path in Variables SAP BI

On the next tab, Replacement Path, we select whether the value should be taken from the key or from an attribute, and select the attribute.

Now when we run the query, the description is picked from the attribute value of the Airline ID selected by the user.

view comments...
 
38.  COPA Extraction Steps (SAP BW / BI )

COPA Extraction Steps

Below are the command steps and explanation: COPA extraction steps.

 

R/3 System

1. KEB0

2. Select Datasource 1_CO_PA_CCA

3. Select Field Name for Partitioning (Eg, Ccode)

4. Initialise

5. Select characteristics & Value Fields & Key Figures

6. Select Development Class/Local Object

7. Workbench Request

8. Edit your Data Source to Select/Hide Fields

9. Extract Checker at RSA3 & Extract

 

BW

1. Replicate Data Source

2. Assign Info Source

3. Transfer all Data Source elements to Info Source

4. Activate Info Source

5. Create Cube on Infoprovider (Copy str from Infosource)

6. Go to Dimensions and create dimensions, Define & Assign

7. Check & Activate

8. Create Update Rules

9. Insert/Modify KF and write routines (const, formula, abap)

10. Activate

11. Create InfoPackage for Initialization

12. Maintain Infopackage

13. Under Update Tab Select Initialize delta on Infopackage

14. Schedule/Monitor

15. Create Another InfoPackage for Delta

16. Check the Delta option

17. Ready for Delta Load

 

LIS, CO/PA, and FI/SL are customer-generated generic extractors, while LO extractors are BW Content extractors.

 

LIS is a cross-application component of SAP R/3 which includes the Sales Information System, Purchasing Information System, Inventory Controlling, and so on.

 

Similarly CO/PA and FI/SL are used for specific Application Component of SAP R/3.

 

CO/PA collects all the OLTP data for calculating contribution margins (sales, cost of sales, overhead costs). FI/SL collects all the OLTP data for financial accounting and the special ledger.

 

1) Add the fields to the operating concern so that the required field is visible in the CE1XXXX table and the other concerned tables (CE2XXXX, CE3XXXX, etc.).

 

2) After you have enhanced the operating concern, you are ready to add it to the CO-PA DataSource. Since CO-PA is a regenerating application, you can't add the field directly to the CO-PA DataSource; you need to delete the DataSource and then re-create it using transaction KEB2.

 

3) While re-creating the DataSource, use the same old name so that there won't be any changes on the BW side when you assign the DataSource to the InfoSource. Just replicate the new DataSource on the BW side and map the new field in the InfoSource. If you re-create it under a different name, you will need extra build effort to take the data into BW through the InfoSource all the way to the InfoCube. I would personally suggest keeping the same old DataSource name as before.

 

If you are adding fields from the same operating concern, go to KE24, edit the datasource, and add your fields. However, if you are adding fields from outside the operating concern, you need to append the extract structure and populate the fields in a user exit using ABAP code. Reference OSS Note 852443.

 

1. Check RSA7 on your R/3 system to see if there is a delta queue for COPA. (Sometimes there is nothing here for the DataSource, sometimes there is.)

2. On BW go to SE16 and open the table RSSDLINIT

3. Find the line(s) corresponding to the problem datasource.

4. You can check the load status in RSRQ using the RNR from the table

5. Delete the line(s) in question from RSSDLINIT table

6. Now you will be able to open the InfoPackage, so you can re-init. But before you try to re-init ...

7. In the InfoPackage, go to the Scheduler menu > 'Initialization options for the source system' and delete the existing init (if one is listed).

 

 

You may also refer these docs:

 

COPA extraction steps:

http://bi.sdn-sap.com/2012/12/co-pa-extraction.html

http://www.slideshare.net/mdsadiqdvg/co-pa-extraction 

Source: http://scn.sap.com/thread/1243577

view comments...
 
39.  SAP BI SAP BW - WHAT IS FORMULA COLLISION

SAP BI - WHAT IS FORMULA COLLISION

 

A formula collision occurs when a query uses two structures and there are formulas in both structures. The point at which the formulas intersect is called a formula collision.

Because of a formula collision, we cannot be sure whether the result comes from the formula in the rows or the formula in the columns.

To resolve this, we need to eliminate the formula collision, and there are two ways to do so:

  1. Extended options in the properties
  2. Cell Editor

Extended options in the properties

Select the formula in the rows; in its properties, choose the Extended tab and select the result of the competing formula.

Then, in the columns, choose the competing formula and, under the same options, choose 'Use results of this formula'.

In this way we have resolved the formula collision using the first option

Cell Editor

The Cell Editor is only enabled if there are two structures, one in the rows and one in the columns, where the structure of the report won't change with new data.

 

  1. The Cell Editor allows direct definition of specific cells in a query. It also allows you to mark a specific cell as a 'reference cell'
  2. so that it can be referenced in further calculations.
  3. Cells are the intersection of two structures in a query definition; therefore, the Cell Editor can only be used in query definitions that contain two structures.

In the Cell Editor, select the cell, choose New Cell Reference from the right-click context menu, and on the properties tab (Display) select 'Always Hide'.

view comments...
 
40.  Free SAP BW certification Questions
SAP Free BW certification Questions , Free SAP BI certification questions.
 
http://sapexplore.com/SAPTutorial/frm_FreeLesson.aspx?sck=2
view comments...
 
41.  EXPERIENCE SAP FIORI
Experience SAP Fiori 
view comments...
 
42.  Standard SAP SYCLO deficiencies

While implementing SAP Syclo at a large oil and gas company, an assessment was conducted to evaluate the efficiency of using the mobility solution for materials management. It determined that the current solution, built as a common interface between SAP and mobility, did not provide the business with the efficiencies expected from a mobile materials management solution: the amount of data analysis and UI interaction distracted from the execution of the task at hand.

To overcome this, we needed to create a usability enhancement layer placed on top of the original mobile application without disrupting the business process already developed. This independent layer uses the same fetches (data load), rules (business validation), and transactions (data sync) while simplifying the user experience, allowing the field user to focus on the execution of the task being performed and providing the efficiencies expected from a mobile solution.

The efficiencies gained by using a mobile device to execute the materials management functions are tied directly to the user's ability to scan barcode labels in order to locate materials loaded on the device and navigate to the data input screens. Scan-enabled list screens provide these efficiencies. These screens have a limited viewing time span, if any, and are conduits to the ultimate goal of the task: data input to sync back to SAP.

view comments...
 
43.  GOOGLE PROJECT GLASS AND SAP

Google's Project Glass 

Google Glass (styled "GLΛSS") is a wearable computer with an optical head-mounted display (OHMD) developed by Google with the mission of producing a mass-market computer. Google Glass displays information in a smartphone-like, hands-free format and can interact with the Internet.

Now SAP and Vuzix have teamed up to create augmented reality glasses, presenting them for manufacturers, logistics companies, and service technicians as well.

The smart glasses can connect to a smartphone to access data, which is displayed on a screen in front of the person wearing them. The wearer can control the device through voice commands. For example, smart glasses can guide warehouse workers to the products on their pick lists: at the shelf they can scan the barcode to make sure they have the right item and confirm in the system that it has been picked. Forklift drivers can use the glasses to request help or instructions on how to resolve a technical problem, for instance.

SAP products that can be used with smart glasses are: SAP Inventory Manager, SAP Work Manager, SAP CRM Service Manager, SAP Rounds Manager, and SAP Machine-to-Machine Platform. According to Vuzix, the smart glasses can run on iOS and Android.

Sources

http://en.wikipedia.org/wiki/Google_Glass

http://scn.sap.com/community/business-trends/blog/2013/05/29/data-glasses

view comments...
 
44.  SAP MM - STEPS TO CREATE A COST CENTER

 

To create a cost centre, use transactions KS01, KS02, and KS03 to create, change, and display cost centre master data. A cost centre in SAP is created to collect the costs of a particular area.

A cost centre master record must contain the following information: basic data, a long text description, and the classic system data entries made in the Address and Communication dialog boxes.

Menu Path:


Accounting >> Controlling >> Cost Center Accounting >> Master Data >> Cost Center >> Individual Processing > Create (KS01)

Following initial screen will appear for creating a Cost Centre

Enter the relevant entries in the following fields: (Definitions taken from Standard SAP) 

  • Cost Centre - Enter a name for the cost centre.
  • Valid From - Enter the desired start date from which you want the cost centre to be used.
  • Valid To - Enter the date after which the cost centre will prevent further postings to it.
  • Reference - In this section you can enter another cost centre from any selected controlling area, and the new cost centre being created will automatically receive all the same attributes as the one you choose to copy.
  • Cost Centre (reference) - Enter the cost centre number whose attributes you would like for your new cost centre. You can always change the copied attributes once the KS01 transaction is executed, but it saves time to have most of the fields filled automatically.
  • Controlling Area - Select the controlling area to which the cost centre you wish to copy from is assigned.
    Once all data has been entered, click the Master Data button or press Enter.
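If you need to create cost centres programmatically rather than through KS01, there is also a standard BAPI. The sketch below is hedged: verify the function module BAPI_COSTCENTER_CREATEMULTIPLE, the structure BAPI0012_CCINPUTLIST, and its exact field names in SE37/SE11 for your release; the controlling area and values are purely illustrative:

DATA: lt_list   TYPE STANDARD TABLE OF bapi0012_ccinputlist,
      ls_list   TYPE bapi0012_ccinputlist,
      lt_return TYPE STANDARD TABLE OF bapiret2.

" One line per cost centre to be created
ls_list-costcenter = 'ZCC_TEST'.
ls_list-valid_from = '20140101'.
ls_list-valid_to   = '99991231'.
ls_list-name       = 'Test cost centre'.
APPEND ls_list TO lt_list.

CALL FUNCTION 'BAPI_COSTCENTER_CREATEMULTIPLE'
  EXPORTING
    controllingarea = '1000'          " illustrative controlling area
  TABLES
    costcenterlist  = lt_list
    return          = lt_return.

" Check lt_return for errors, then make the creation permanent
CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
  EXPORTING
    wait = 'X'.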
view comments...
 
45.  SAP Fiori - Collection of SAP Mobile Apps
SAP Fiori is a collection of apps with a simple and easy to use experience for broadly and frequently used SAP software functions that work seamlessly across devices – desktop, tablet, or smartphone. The first release of SAP Fiori includes 25 apps for the most common business functions, such as workflow approvals, information lookups, and self-service tasks. SAP customers can get started with SAP Fiori immediately and bring instant value to all their employees.
view comments...
 
46.  ABAP PROGRAMMING

This report takes a delivery document number and delivery date from the user, fetches the details from the delivery tables, reads the corresponding sales order and billing details, and displays the sales order details in an ALV list.

REPORT sales_order_report.

* This report reads delivery header/item data for the entered
* selection, joins it with sales order, customer master and billing
* data, and displays the result in an ALV grid.
*
* NOTE: The original listing used numbered messages (S000, E001-E004)
* of an unnamed message class; literal-text messages are used below so
* the report stays self-contained. Screen 1500, PF-status SALES_ORDER,
* title SALES and the custom control CCONTAINER are Screen Painter
* objects that must exist alongside this source.

*" Table declarations...................................................
TABLES: likp.                          " SD document: delivery header data

*" Selection screen elements............................................
SELECT-OPTIONS:
  s_deldoc FOR likp-vbeln,             " Delivery
  s_dldate FOR likp-lfdat.             " Delivery date

*"--------------------------------------------------------------------*
* Structure types for the required delivery, customer, sales and      *
* billing fields                                                      *
*"--------------------------------------------------------------------*
TYPES:
* Delivery header data
  BEGIN OF type_s_likp,
    vbeln TYPE likp-vbeln,             " Delivery
    lfdat TYPE likp-lfdat,             " Delivery date
    kunnr TYPE likp-kunnr,             " Ship-to party
  END OF type_s_likp,

* Delivery item data
  BEGIN OF type_s_lips,
    vbeln TYPE lips-vbeln,             " Delivery
    posnr TYPE lips-posnr,             " Delivery item
    vgbel TYPE lips-vgbel,             " Reference document number
    vgpos TYPE lips-vgpos,             " Reference item number
    lfimg TYPE lips-lfimg,             " Delivered quantity (sales units)
    vrkme TYPE lips-vrkme,             " Sales unit
  END OF type_s_lips,

* Customer master data
  BEGIN OF type_s_kna1,
    kunnr TYPE kna1-kunnr,             " Customer number
    name1 TYPE kna1-name1,             " Name 1
  END OF type_s_kna1,

* Sales document header data
  BEGIN OF type_s_vbak,
    vbeln TYPE vbak-vbeln,             " Sales document
    erdat TYPE vbak-erdat,             " Creation date
    aufnr TYPE vbak-aufnr,             " Order number
  END OF type_s_vbak,

* Sales document item data
  BEGIN OF type_s_vbap,
    vbeln  TYPE vbap-vbeln,            " Sales document
    posnr  TYPE vbap-posnr,            " Sales document item
    matnr  TYPE vbap-matnr,            " Material number
    arktx  TYPE vbap-arktx,            " Short text for sales order item
    kwmeng TYPE vbap-kwmeng,           " Cumulative order qty (sales units)
    vrkme  TYPE vbap-vrkme,            " Sales unit
  END OF type_s_vbap,

* Billing item data
  BEGIN OF type_s_vbrp,
    vbeln TYPE vbrp-vbeln,             " Billing document
    posnr TYPE vbrp-posnr,             " Billing item
    vgbel TYPE vbrp-vgbel,             " Reference document number
    vgpos TYPE vbrp-vgpos,             " Reference item number
    fklmg TYPE vbrp-fklmg,             " Billing qty (stockkeeping unit)
    vrkme TYPE vbrp-vrkme,             " Sales unit
  END OF type_s_vbrp,

* Combined record: sales header/item, delivery item, billing item
  BEGIN OF type_s_order,
    vbeln  TYPE vbap-vbeln,            " Sales document
    posnr  TYPE vbap-posnr,            " Sales document item
    erdat  TYPE vbak-erdat,            " Creation date
    kunnr  TYPE likp-kunnr,            " Ship-to party
    name1  TYPE kna1-name1,            " Name 1
    aufnr  TYPE vbak-aufnr,            " Order number
    matnr  TYPE vbap-matnr,            " Material number
    arktx  TYPE vbap-arktx,            " Short text for sales order item
    kwmeng TYPE vbap-kwmeng,           " Cumulative order quantity
    vrkme  TYPE vbap-vrkme,            " Sales unit
    vbeln1 TYPE lips-vbeln,            " Delivery
    posnr1 TYPE lips-posnr,            " Delivery item
    lfimg  TYPE lips-lfimg,            " Delivered quantity
    vrkme1 TYPE lips-vrkme,            " Sales unit
    vbeln2 TYPE vbrp-vbeln,            " Billing document
    posnr2 TYPE vbrp-posnr,            " Billing item
    fklmg  TYPE vbrp-fklmg,            " Billing quantity
    vrkme2 TYPE vbrp-vrkme,            " Sales unit
  END OF type_s_order.

*" Data declarations....................................................
* Work variables: references for the custom container and the ALV grid
DATA:
  w_container TYPE REF TO cl_gui_custom_container, " Container reference
  w_grid      TYPE REF TO cl_gui_alv_grid.         " Grid reference

* Field strings (work areas) for the records read above, plus field
* catalog and layout records
DATA:
  fs_kna1  TYPE type_s_kna1,           " Customer master record
  fs_vbak  TYPE type_s_vbak,           " Sales header record
  fs_vbap  TYPE type_s_vbap,           " Sales item record
  fs_likp  TYPE type_s_likp,           " Delivery header record
  fs_lips  TYPE type_s_lips,           " Delivery item record
  fs_vbrp  TYPE type_s_vbrp,           " Billing item record
  fs_order TYPE type_s_order,          " Combined sales order record
  fs_cat   TYPE lvc_s_fcat,            " Field catalog record
  fs_lay   TYPE lvc_s_layo.            " Layout record

* Internal tables for the records above
DATA:
  t_kna1  LIKE STANDARD TABLE OF fs_kna1,  " Customer master records
  t_vbak  LIKE STANDARD TABLE OF fs_vbak,  " Sales header records
  t_vbap  LIKE STANDARD TABLE OF fs_vbap,  " Sales item records
  t_likp  LIKE STANDARD TABLE OF fs_likp,  " Delivery header records
  t_lips  LIKE STANDARD TABLE OF fs_lips,  " Delivery item records
  t_vbrp  LIKE STANDARD TABLE OF fs_vbrp,  " Billing item records
  t_order LIKE STANDARD TABLE OF fs_order, " Combined sales order records
  t_cat   TYPE lvc_t_fcat.                 " Field catalog records

*"--------------------------------------------------------------------*
* AT SELECTION-SCREEN EVENTS                                          *
*"--------------------------------------------------------------------*
AT SELECTION-SCREEN.
  PERFORM check_for_initial.

AT SELECTION-SCREEN ON s_deldoc.
  PERFORM check_delivery_document.

AT SELECTION-SCREEN ON s_dldate.
  PERFORM check_delivery_date.

*"--------------------------------------------------------------------*
* START-OF-SELECTION EVENT                                            *
*"--------------------------------------------------------------------*
START-OF-SELECTION.
  PERFORM data_selection.

*&--------------------------------------------------------------------*
*& Form DATA_SELECTION
*&--------------------------------------------------------------------*
* Selects the required fields from the sales header, sales item,
* delivery header, delivery item, billing item and customer master
* tables. No interface parameters are passed to this subroutine.
*----------------------------------------------------------------------*
FORM data_selection.

* Get delivery document number, delivery date and customer number from
* the delivery header table
  SELECT vbeln                         " Delivery
         lfdat                         " Delivery date
         kunnr                         " Customer number
    FROM likp
    INTO TABLE t_likp
    WHERE vbeln IN s_deldoc
      AND lfdat IN s_dldate.

  IF sy-subrc EQ 0.

*   Get the customer names from the customer master table
    SELECT kunnr                       " Customer number
           name1                       " Name 1
      FROM kna1
      INTO TABLE t_kna1
      FOR ALL ENTRIES IN t_likp
      WHERE kunnr EQ t_likp-kunnr.

    IF sy-subrc EQ 0.

*     Get delivery item, reference sales document/item and delivered
*     quantity from the delivery item table
      SELECT vbeln                     " Delivery
             posnr                     " Delivery item
             vgbel                     " Reference document number
             vgpos                     " Reference item number
             lfimg                     " Delivered quantity
             vrkme                     " Sales unit
        FROM lips
        INTO TABLE t_lips
        FOR ALL ENTRIES IN t_likp
        WHERE vbeln EQ t_likp-vbeln.

      IF sy-subrc EQ 0.

*       Get sales document, item, material, description and ordered
*       quantity from the sales item table
        SELECT vbeln                   " Sales document
               posnr                   " Sales document item
               matnr                   " Material number
               arktx                   " Short text for sales order item
               kwmeng                  " Cumulative order quantity
               vrkme                   " Sales unit
          FROM vbap
          INTO TABLE t_vbap
          FOR ALL ENTRIES IN t_lips
          WHERE vbeln EQ t_lips-vgbel
            AND posnr EQ t_lips-vgpos.

        IF sy-subrc EQ 0.

*         Get sales document number, creation date and purchase order
*         number from the sales header table
          SELECT vbeln                 " Sales document
                 erdat                 " Creation date
                 aufnr                 " Order number
            FROM vbak
            INTO TABLE t_vbak
            FOR ALL ENTRIES IN t_lips
            WHERE vbeln EQ t_lips-vgbel.

          IF sy-subrc EQ 0.

*           Get billing document, billing item, reference delivery
*           document/item and billing quantity from the billing item
*           table
            SELECT vbeln               " Billing document
                   posnr               " Billing item
                   vgbel               " Reference document number
                   vgpos               " Reference item number
                   fklmg               " Billing qty (stockkeeping unit)
                   vrkme               " Sales unit
              FROM vbrp
              INTO TABLE t_vbrp
              FOR ALL ENTRIES IN t_lips
              WHERE vgbel EQ t_lips-vbeln
                AND vgpos EQ t_lips-posnr.

          ENDIF.                       " IF sy-subrc EQ 0 (vbak)
        ENDIF.                         " IF sy-subrc EQ 0 (vbap)
      ENDIF.                           " IF sy-subrc EQ 0 (lips)
    ENDIF.                             " IF sy-subrc EQ 0 (kna1)
  ELSE.
*   Display a message if no records are found for the entered values
    MESSAGE 'No records found for the entered selection' TYPE 'S'.
    EXIT.
  ENDIF.                               " IF sy-subrc EQ 0 (likp)

* Loop over the delivery items and assemble the combined order records
  LOOP AT t_lips INTO fs_lips.

*   Get delivery date and customer number for the delivery document
    READ TABLE t_likp WITH KEY vbeln = fs_lips-vbeln INTO fs_likp.
    IF sy-subrc EQ 0.

*     Get the customer name for the customer number
      READ TABLE t_kna1 WITH KEY kunnr = fs_likp-kunnr INTO fs_kna1.
      IF sy-subrc EQ 0.

*       Get the sales item referenced by the delivery item
        READ TABLE t_vbap WITH KEY vbeln = fs_lips-vgbel
                                   posnr = fs_lips-vgpos INTO fs_vbap.
        IF sy-subrc EQ 0.

*         Get creation date and purchase order number from the header
          READ TABLE t_vbak WITH KEY vbeln = fs_vbap-vbeln INTO fs_vbak.
          IF sy-subrc EQ 0.

*           Get the billing item that references this delivery item
            READ TABLE t_vbrp WITH KEY vgbel = fs_lips-vbeln
                                       vgpos = fs_lips-posnr INTO fs_vbrp.
            IF sy-subrc EQ 0.

*             Assign the sales, delivery and billing fields to the
*             combined order record
              fs_order-vbeln  = fs_vbap-vbeln.
              fs_order-posnr  = fs_vbap-posnr.
              fs_order-erdat  = fs_vbak-erdat.
              fs_order-kunnr  = fs_likp-kunnr.
              fs_order-name1  = fs_kna1-name1.
              fs_order-aufnr  = fs_vbak-aufnr.
              fs_order-matnr  = fs_vbap-matnr.
              fs_order-arktx  = fs_vbap-arktx.
              fs_order-kwmeng = fs_vbap-kwmeng.
              fs_order-vrkme  = fs_vbap-vrkme.
              fs_order-vbeln1 = fs_lips-vbeln.
              fs_order-posnr1 = fs_lips-posnr.
              fs_order-lfimg  = fs_lips-lfimg.
              fs_order-vrkme1 = fs_lips-vrkme.
              fs_order-vbeln2 = fs_vbrp-vbeln.
              fs_order-posnr2 = fs_vbrp-posnr.
              fs_order-fklmg  = fs_vbrp-fklmg.
              fs_order-vrkme2 = fs_vbrp-vrkme.
              APPEND fs_order TO t_order.
              CLEAR fs_order.

            ENDIF.                     " IF sy-subrc EQ 0 (t_vbrp)
          ENDIF.                       " IF sy-subrc EQ 0 (t_vbak)
        ENDIF.                         " IF sy-subrc EQ 0 (t_vbap)
      ENDIF.                           " IF sy-subrc EQ 0 (t_kna1)
    ENDIF.                             " IF sy-subrc EQ 0 (t_likp)
  ENDLOOP.                             " LOOP AT t_lips INTO fs_lips

* Check whether the final table is initial
  IF t_order IS INITIAL.
    MESSAGE 'No records found for the entered selection' TYPE 'S'.
    EXIT.
  ELSE.
*   Call the screen that displays the sales order records
    CALL SCREEN 1500.
  ENDIF.                               " IF t_order IS INITIAL

ENDFORM.                               " DATA_SELECTION

*&--------------------------------------------------------------------*
*& Form CHECK_FOR_INITIAL
*&--------------------------------------------------------------------*
* Validates the selection screen: at least one of delivery document
* number and delivery date must be entered.
*----------------------------------------------------------------------*
FORM check_for_initial.
  IF s_deldoc IS INITIAL AND s_dldate IS INITIAL.
    MESSAGE 'Enter a delivery document number or a delivery date'
            TYPE 'E' DISPLAY LIKE 'S'.
  ENDIF.                               " IF s_deldoc IS INITIAL AND...
ENDFORM.                               " CHECK_FOR_INITIAL

*&--------------------------------------------------------------------*
*& Form CHECK_DELIVERY_DOCUMENT
*&--------------------------------------------------------------------*
* Validates the entered delivery document number.
*----------------------------------------------------------------------*
FORM check_delivery_document.
* Check whether a high value is entered without a low value
  IF s_deldoc-low IS INITIAL AND s_deldoc-high IS NOT INITIAL.
    MESSAGE 'Enter a valid interval' TYPE 'E' DISPLAY LIKE 'S'.
  ELSE.
*   Check whether the delivery document exists in the header table
    SELECT vbeln                       " Delivery
      FROM likp
      UP TO 1 ROWS
      INTO fs_likp-vbeln
      WHERE vbeln IN s_deldoc.
    ENDSELECT.
*   Display a message if no record exists for the entered number
    IF sy-subrc NE 0.
      MESSAGE 'Delivery document does not exist' TYPE 'E' DISPLAY LIKE 'S'.
    ENDIF.                             " IF sy-subrc NE 0
    CLEAR fs_likp.
  ENDIF.                               " IF s_deldoc-low IS INITIAL...
ENDFORM.                               " CHECK_DELIVERY_DOCUMENT

*&--------------------------------------------------------------------*
*& Form CHECK_DELIVERY_DATE
*&--------------------------------------------------------------------*
* Validates the entered delivery date.
*----------------------------------------------------------------------*
FORM check_delivery_date.
* Check whether a high value is entered without a low value
  IF s_dldate-low IS INITIAL AND s_dldate-high IS NOT INITIAL.
    MESSAGE 'Enter a valid interval' TYPE 'E' DISPLAY LIKE 'S'.
  ELSE.
*   Check whether the delivery date exists in the header table
    SELECT lfdat                       " Delivery date
      FROM likp
      UP TO 1 ROWS
      INTO fs_likp-lfdat
      WHERE lfdat IN s_dldate.
    ENDSELECT.
*   Display a message if no record exists for the entered date
    IF sy-subrc NE 0.
      MESSAGE 'No delivery exists for the entered date' TYPE 'E' DISPLAY LIKE 'S'.
    ENDIF.                             " IF sy-subrc NE 0
    CLEAR fs_likp.
  ENDIF.                               " IF s_dldate-low IS INITIAL...
ENDFORM.                               " CHECK_DELIVERY_DATE

*&--------------------------------------------------------------------*
*& Module STATUS_1500 OUTPUT
*&--------------------------------------------------------------------*
* Sets the PF-status and title for the ALV list display.
*----------------------------------------------------------------------*
MODULE status_1500 OUTPUT.
  SET PF-STATUS 'SALES_ORDER'.         " PF-status of screen 1500
  SET TITLEBAR 'SALES'.                " Title of the list output
ENDMODULE.                             " STATUS_1500 OUTPUT

*&--------------------------------------------------------------------*
*& Module USER_COMMAND_1500 INPUT
*&--------------------------------------------------------------------*
* Handles screen navigation for the back, exit and cancel commands.
*----------------------------------------------------------------------*
MODULE user_command_1500 INPUT.
  CASE sy-ucomm.
    WHEN 'BACK' OR '%EX' OR 'RW'.
      SET SCREEN 0.
  ENDCASE.                             " CASE sy-ucomm
ENDMODULE.                             " USER_COMMAND_1500 INPUT

*&--------------------------------------------------------------------*
*& Module SALES_DISPLAY OUTPUT
*&--------------------------------------------------------------------*
* Populates the field catalog and layout, creates the container and
* grid, and calls the grid method that displays the sales orders.
*----------------------------------------------------------------------*
MODULE sales_display OUTPUT.

* Build the catalog and controls only once: this module runs at every
* PBO, and repeating it would duplicate catalog rows and controls
  CHECK w_grid IS INITIAL.

* Populate the field catalog (field name, column text, column position)
  PERFORM pop_fcat USING 'VBELN'(006)  'Sales Doc.'(007)           '1'.
  PERFORM pop_fcat USING 'POSNR'(008)  'Item'(009)                 '2'.
  PERFORM pop_fcat USING 'ERDAT'(010)  'Goods Issue'(011)          '3'.
  PERFORM pop_fcat USING 'KUNNR'(012)  'Sold-to Party'(013)        '4'.
  PERFORM pop_fcat USING 'NAME1'(014)  'Sold-to Description'(015)  '5'.
  PERFORM pop_fcat USING 'AUFNR'(016)  'Purchase Order No.'(017)   '6'.
  PERFORM pop_fcat USING 'MATNR'(018)  'Material'(019)             '7'.
* The original listing passed 'ARTKX' here - a typo that would leave
* the column empty; the structure field is ARKTX
  PERFORM pop_fcat USING 'ARKTX'(020)  'Material Description'(021) '8'.
  PERFORM pop_fcat USING 'KWMENG'(022) 'Ordered Quantity'(023)     '9'.
  PERFORM pop_fcat USING 'VRKME'(024)  'Sales Unit'(025)           '10'.
  PERFORM pop_fcat USING 'VBELN1'(026) 'Delivery Doc. No.'(027)    '11'.
  PERFORM pop_fcat USING 'POSNR1'(028) 'Delivery Item'(029)        '12'.
  PERFORM pop_fcat USING 'LFIMG'(030)  'Delivery Quantity'(031)    '13'.
  PERFORM pop_fcat USING 'VRKME1'(032) 'Sales Unit'(025)           '14'.
  PERFORM pop_fcat USING 'VBELN2'(033) 'Billing Doc. No.'(034)     '15'.
  PERFORM pop_fcat USING 'POSNR2'(035) 'Billing Item'(036)         '16'.
  PERFORM pop_fcat USING 'FKLMG'(037)  'Billing Quantity'(038)     '17'.
  PERFORM pop_fcat USING 'VRKME2'(039) 'Sales Unit'(025)           '18'.

* Assign the grid title
  fs_lay-grid_title = 'List to display sales order details'(040).

* Create the container for the records
  CREATE OBJECT w_container
    EXPORTING
      container_name = 'CCONTAINER'
    EXCEPTIONS
      OTHERS         = 1.
  IF sy-subrc NE 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
            WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
  ENDIF.                               " IF sy-subrc NE 0

* Create the grid inside the container
  CREATE OBJECT w_grid
    EXPORTING
      i_parent = w_container
    EXCEPTIONS
      OTHERS   = 1.
  IF sy-subrc NE 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
            WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
  ENDIF.                               " IF sy-subrc NE 0

* Display the table contents; the field catalog drives the columns
* (the original also passed i_structure_name = 'FS_ORDER', which is
* not a dictionary structure and is therefore omitted here)
  CALL METHOD w_grid->set_table_for_first_display
    EXPORTING
      is_layout                     = fs_lay
    CHANGING
      it_outtab                     = t_order
      it_fieldcatalog               = t_cat
    EXCEPTIONS
      invalid_parameter_combination = 1
      program_error                 = 2
      too_many_lines                = 3
      OTHERS                        = 4.
  IF sy-subrc NE 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
            WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
  ENDIF.                               " IF sy-subrc NE 0

ENDMODULE.                             " SALES_DISPLAY OUTPUT

*&--------------------------------------------------------------------*
*& Form POP_FCAT
*&--------------------------------------------------------------------*
* Populates one field catalog record.
* -->P_FNAME  Field name
* -->P_CTEXT  Column text
* -->P_CPOS   Column position
*----------------------------------------------------------------------*
FORM pop_fcat USING value(p_fname) TYPE c
                    value(p_ctext) TYPE c
                    value(p_cpos)  TYPE i.
  fs_cat-fieldname = p_fname.
  fs_cat-coltext   = p_ctext.
  fs_cat-col_pos   = p_cpos.
  APPEND fs_cat TO t_cat.
  CLEAR fs_cat.
ENDFORM.                               " POP_FCAT

 

view comments...
 
47.  Storage Location Determination in SD

Scenario - Material 'ABC' is stored in plant 'PLA' in two different storage locations, 'STR1' and 'STR2'. On creating a sales order, which storage location will the system pick, and why?

Explanation

Let me first explain how the storage location is determined.

Storage location is determined at the delivery document by a storage location rule, viz. MALA, RETA or MARE.
The most commonly used one is MALA.
One rule is defined against a delivery type (ref: check in transaction 0VLK - delivery type LF).
MALA determines the location as follows:
STORAGE LOCATION = SHIPPING POINT + PLANT + STORAGE CONDITION
RETA = PLANT + SITUATION + STORAGE CONDITION
MARE = MALA, then RETA (MALA is applied first; if no entry is found with that rule, the system falls back to RETA)
Check which rule is applied to the delivery type.
So if MALA is used for storage location determination, the system picks the storage location based on the parameters defined for shipping point, plant and storage condition.
We set storage location determination under Shipping --> Picking --> Determine Picking Location --> Assign Picking Locations.
But COPY CONTROL also plays its role in copying the storage location into the delivery document if it is created with reference to the SO. (Ref: transaction VTLA; under Item data one routine is defined, viz. "101".)
Please have a look at the relevant portion of routine "101":

IF CVBAP-LGORT NE SPACE.               " LGORT = storage location
  LIPS-LGORT = CVBAP-LGORT.
ENDIF.
IF NOT CVBAP-CHARG IS INITIAL.         " CHARG = batch number
  LIPS-CHARG = CVBAP-CHARG.
ENDIF.

Coming back to the question: the above portion of SAP standard routine 101 tells us that if the storage location is blank in the SO, it will be picked by the determination rule, e.g. MALA; and if the SO already has a value for the storage location, then LIPS-LGORT = CVBAP-LGORT. (LIPS is the delivery at item level and VBAP is the SO at item level. A lookup sketch for the MALA assignments follows below.)
* SO = Sales Order
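To make the MALA rule concrete: the assignments maintained under the customizing path above are stored, to the best of my knowledge, in table TVKOL, so you can check what the system would determine with a direct read. A minimal sketch with illustrative placeholder values ('SP01', 'PLA', '01'):

REPORT ztvkol_check.                   " hypothetical helper report

PARAMETERS: p_vstel TYPE tvkol-vstel DEFAULT 'SP01', " Shipping point
            p_werks TYPE tvkol-werks DEFAULT 'PLA',  " Plant
            p_raube TYPE tvkol-raube DEFAULT '01'.   " Storage condition

DATA: lv_lgort TYPE tvkol-lgort.       " Determined storage location

START-OF-SELECTION.
* Read the MALA assignment: shipping point + plant + storage condition
  SELECT SINGLE lgort
    FROM tvkol                         " Picking: storage location determination (assumed table)
    INTO lv_lgort
    WHERE vstel = p_vstel
      AND werks = p_werks
      AND raube = p_raube.

  IF sy-subrc EQ 0.
    WRITE: / 'MALA storage location:', lv_lgort.
  ELSE.
    WRITE: / 'No entry - the delivery item would get no storage location.'.
  ENDIF.

The storage condition itself comes from the material master (Plant Data/Storage view), which is why two materials in the same plant and shipping point can still end up in different storage locations.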

view comments...
 
48.  Routine "2": Explaining the Requirement column in Pricing Procedure

Requirement: denoted by a number and maintained in VOFM, this is a condition that must be fulfilled before a particular condition type is processed.
E.g. PR00 carries requirement 2, i.e. "is the item relevant for pricing or not?"
* Pricing is turned on in item category configuration

FORM KOBED_002.
  SY-SUBRC = 4.
  IF KOMP-KPOSN NE 0.                  " KOMP = pricing communication item; KPOSN = condition item number in the SO
    CHECK: KOMP-PRSFD CA 'BX'.         " PRSFD = carry out pricing
    CHECK: KOMP-KZNEP = SPACE.         " KZNEP = condition exclusion indicator
  ENDIF.
  SY-SUBRC = 0.
ENDFORM.

* Prestep
FORM KOBEV_002.
  SY-SUBRC = 0.
ENDFORM.

For the technically inclined, the routine above is self-explanatory.
Explanation:
Routine "2" is entered in the Requirement column of the pricing procedure. It means the system first checks whether the item category attached to the item at that condition item number in the SO is relevant for pricing.
Only if the item category is pricing-relevant does the system fetch the price from the condition records (maintained via VK11). Conversely, if the Requirement column does not contain routine 2 against a condition type, the system does not consider this parameter and reads the condition records directly.
This routine improves system performance. (A sketch of a custom requirement built on the same pattern follows below.)
Note: if an item category is marked as NOT relevant for pricing, the system will not fetch a price in the sales order even if condition records for the condition types are maintained.
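For a project-specific condition, the same two-part pattern (a rough pre-check in KOBEV_nnn, the full item check in KOBED_nnn) can be copied into a custom requirement via VOFM. The following is a minimal sketch under the hypothetical routine number 901; the extra plant check is an invented example of a custom condition, not part of the standard routine:

* Hedged sketch of a custom VOFM requirement (hypothetical number 901):
* price only items that are pricing-relevant AND have a plant filled.
FORM KOBED_901.
  SY-SUBRC = 4.                        " Default: requirement not met
  IF KOMP-KPOSN NE 0.                  " Item-level call
    CHECK: KOMP-PRSFD CA 'BX'.         " Item category relevant for pricing
    CHECK: KOMP-WERKS NE SPACE.        " Custom condition: plant must be filled
  ENDIF.
  SY-SUBRC = 0.                        " Requirement met
ENDFORM.

* Prestep: cheap pre-check before the item-level evaluation
FORM KOBEV_901.
  SY-SUBRC = 0.
ENDFORM.

Roughly speaking, the KOBEV prestep runs first as a cheap filter, and returning SY-SUBRC = 0 there simply tells the system to go on and evaluate the item-level KOBED check. VOFM generates the surrounding include, so only the FORM bodies are written by the developer.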

view comments...
 
49.  Purchase order creation
Currently I am working on a migration project. I would like to know what scenarios we can find, and also how the purchase order will be created; please give an example relevant to my project, migrating from plant xxxx to plant yyyy. view comments...
 
50.  sap fico ticket
Hi, I am Hemant. While practicing I have been running into some problems, so could you kindly provide me SAP FICO tickets (incidents and change requests), or forward screenshots to my mail id: hemanta.sap2012@gmail.com view comments...
 
51.  i need sap sd tickets in incidents
Hi, I am Rajesh. Kindly provide me SAP SD tickets (incidents and change requests). view comments...
 
52.  Preparation for exams
I am planning to take a certification exam, but I have a small problem: I do not know how to prepare or what the exam is going to be like, as I have no clue what type of exam it will be. I would appreciate some help. Thanks view comments...
 
53.  sap fi
Can anyone tell me what interview questions are frequently asked in SAP FI interviews? Please provide information. view comments...
 
54.  FICO Tables
Hi, can anyone give me more info about SAP FICO tables? view comments...
 
55.  sap sd related tickets
Please send me SD-related tickets and their solutions. view comments...
 
56.  sap fico intex it
my sap view comments...
 
57.  bank reconciliation
The bank reconciliation statement matches, but payments lying in the bank-in and bank-out accounts do not get cleared after posting all the processed statements. Open item management is on. view comments...
 
58.  sap fi
I am getting a table error in customer and vendor payments. Please help. view comments...
 
59.  SAP SD
Hi Experts view comments...
 
60.  learner
SAP FICO TICKETS REQUIRED :- Dear Experts, please provide SAP FICO support tickets to the following mail id: pdgouds@gmail.com view comments...
 
61.  learner
SAP FICO TICKETS REQUIRED :- Dear Experts, please provide SAP FICO support tickets to the following mail id. view comments...
 
62.  naidusapsd
hi experts view comments...
 
63.  This is Test Blog
This is Test Message view comments...
 
64.  SAP MM CIN
Can anyone elaborate on CIN and, if possible, share material? view comments...
 
65.  sap fico
Hi experts, I have an interview the day after tomorrow. They require a candidate who knows and understands the documentation process (SDLC) and can work with specific testing tools as the need arises. Can anybody help me prepare on those things? Thanks in advance view comments...
 
66.  Related to Free goods determination...!
Dear All, I want to determine free goods at delivery level. Is there any standard configuration? If not, kindly let me know the procedure. Regards SK view comments...
 
67.  Use of Purchasing group in SAP MM Module.
Experts, please explain the significance of the organizational level "Purchasing Group". Thanks in advance... view comments...
 
68.  Purpose of select views in Material master - SAP MM
Can anybody explain the purpose of the Select Views step when creating a material master in the SAP MM module? Thanks in advance.. view comments...
 
69.  CUSTOMIZATION MATERIAL FOR CONTROLLING
Hi.. Can anyone kindly send me customization material for Controlling? My id is harsha.adiraju@gmail.com Thanks view comments...
 
70.  Query?
Can anyone explain to me in detail the workflow when I do a goods receipt of a material? I mean the steps which occur in the background in the ERP system, as in which G/L account it will hit. view comments...
 
71.  Hello,
Can anyone explain accounts payable issues? view comments...
 
72.  field selection procedure
Hi experts, I wanted to know how you work with the field selection procedure. Help me out with screenshots if you can. view comments...
 
73.  sap sd
Posting period 03 2012 is not open. view comments...
 
74.  CO- Product Costing & Material Ledger Consultant
. view comments...
 
75.  Sales Order Overwriting in ECC
Hi Team, I created a service order in CRM with spare parts and a service material (servicing charges). When I save the order with released status at the item level (service material/servicing charges) and released status on the service order header, the order is saved in CRM and replicated into ECC with the spare parts order type. However, if I edit the same order in CRM and change the status to completed at both item level and service order level, the first created spare parts sales order is overwritten with the debit memo service order type, making the spare parts quantity zero. Note: two different sales documents should be generated in ECC, with different number ranges and order types (one for the spare parts, the other for the debit memo); thereafter delivery and billing for the spare parts, and billing only for the service material (in ECC). Kindly give a probable solution.. view comments...
 
76.  http://www.sapficoissue.blogspot.in/
SAP FICO AND MM ISSUE/TICKET/SOLUTION AND MANY MORE....... view comments...
 
77.  Please Don't Post .DOCX Documents - Only Post .DOC
Dear all friends, please post .DOC documents, because .DOCX files are causing problems with opening and saving. view comments...
 
78.  Onsite Opportunity SAP SD/CRM/MM Consultants
Warm greetings to SAP SD/CRM/MM consultants. Can you guys share your onsite (abroad/India) experience as a SAP SD/MM/CRM consultant? I want to understand how frequently onsite opportunities become available to us SD and other module consultants, and what the prerequisites are to be eligible for onsite projects within the company. Also, how is the CRM functional market for Indian consultants in terms of Indian projects and onsite abroad? Everybody says CRM offers a good career, but where are the job openings? Why are there very few openings for CRM and lots of openings for SD on all job portals? Thanks, AR view comments...
 
79.  SAP FICO currency conversion error
I am getting the error "Enter rate INR / EUR rate type M for 18.12.2011 in the system settings". Can anyone help me find a solution? I have configured OB08 as well as all the tables, but this error still comes up. view comments...
 
80.  SAP MM
Come and discuss everything about the MM world! view comments...
 
81.  White Paper on Production Support in SD
Hello All, please share white papers on production support in SAP SD. Thanks & Regards, Ravi view comments...
 
82.  SAP SD
HI view comments...
 
83.  Quantity change while invoicing from order
There is a business scenario in my company which is as follows: we have order-related billing. I create an order for, say, 10 units, but have to invoice according to updates from the business, so the invoice quantity is not fixed; say after 2 months it is decided to invoice 3 units, and the remainder after 4 months. How can I map this situation? I tried doing this, but the problem is that the quantity in the invoice is shown in a grayed-out field and cannot be changed. Please advise. view comments...
 
84.  sap sd
Hi, at the time of posting the material I am getting an error: "Enter the exchange rate INR/EUR from 25.10.2011 in the system settings". Can anybody give a solution for this? view comments...
 
85.  Makka
I have 10 line items in a sales order but I have to process only 6 line items in the delivery. How do I configure this? view comments...
 
86.  SAP FI/CO
Hello, please upload all SAP FI/CO material so that freshers like me can make use of it. view comments...
 
87.  Regarding Solution Manager
Hi to all, can anyone send Solution Manager material? Please do me this favor. view comments...
 
88.  navarun81
Can anyone please send me some solved tickets in SAP SD. view comments...
 
89.  navarun81
New Message view comments...
 
90.  SQL error
Hi to all, while I am activating master data (CID) I am getting this error: SQL error 8102 occurred when accessing program "saplrsd_gui_iobj_maint". If anyone knows about this, please reply to this blog ASAP. view comments...
 
91.  Problem in integration
Hi friends, while I am defining pricing procedure determination, it shows that condition type PR00 is not in procedure 2011 (my company code) A V. If anybody knows about this, please solve my problem ASAP. view comments...
 
92.  mintu
help for sap sd view comments...
 
93.  FICO End User Material
Hello, could anyone please provide FICO end-user material? It would be very helpful. Thank you. view comments...
 
94.  Radhakrishna
Hi everyone, I am Radhakrishna and I am new to this blog. I have a few issues in SD & FI/CO; please support me with these issues. view comments...
 
95.  sap sd cin
welcome to sap sd-cin view comments...
 
96.  Problem in billing
Hi friends, I am in the process of rolling out a project. The requirement is: I raise a sales order and issue goods to the customer from the plant, but when I raise the bill, the bill amount is one the company has to pay to the customer. The scenario: there is a huge amount of scrap material in my stock; the scrap material is taken out of plant stock by an agent, and per kg of scrap the company pays Rs. 50 to the agent. If there is any solution, please help. Santrupta view comments...
 
97.  ** SAP FI CO EXPERTS NEEDED **
** SAP FI CO expert needed ** I am currently searching for an SAP FI CO functional expert based in Germany with around 5 years experience who would be interested to join an international consultancy company! 80% travel Consulting background Based anywhere in Germany Max €90k salary If this is you or you know someone who may be interested please can you send me the details to bprowse@redcommerce.com as I am very interested to discuss this opportunity further. I look forward to your reply. Thanks, Barrie. view comments...
 
98.  Boart
Hi, I'm new to this forum; please lend me a hand with SD issues! view comments...
 
99.  SAP Concepts for beginners-II
So, as we saw in my previous blog, SAP or any ERP package is just a simulation or replica of a business. Now how do we proceed to create this replica? What are the things required to run a business, apart from capital? We require offices, plants, distributors, a few materials to sell and, obviously, a few vendors and customers to supply and purchase. We also require purchasing and selling prices for these materials. Thus we first create the organizational structure, and then we create the relationships between its elements. It is nothing but a structure of the different offices, plants and other units required to run the business. Then we create master data. Master data is nothing but a simple list of customers, materials, vendors, prices, etc. view comments...
 
100.  SAP Concepts for beginners
Hi All. This blog tries to help beginners understand some very basic things about SAP in a very non-bookish language and develop an inside vision. So guys, what is SAP? What does it actually do? SAP, just like any other ERP package, basically simulates business scenarios. Imagine a video game, for example FIFA Soccer. What does it do? It simulates the soccer played in the real world at a virtual level. Similarly, you can imagine SAP as a game where our task is to run a business. Now every game has some rules; same here. The difference is that we can modify the rules according to the requirements of the business. This is called configuration. You have to make the rules so that the game is an exact replica of the real business. view comments...