SAP Interview Questions – Part 7
1. What does a system analyst do?
Generally speaking, system analysts provide technical support for hardware and software, or for using hardware and software tools. Analysts install, maintain, and research hardware and software in order to optimize performance. They can also provide specifics about the types of hardware and software they have actually worked with, and in what context.
2. What characteristics should a system analyst possess to be good at his job?
System analysts are technical experts and outstanding analytical thinkers. They are highly organized and knowledgeable about many different types of technological systems. They are also able to adapt to the various unique technological conditions which each new company brings. They possess reasonable communication skills and are able to prepare effective reports, flowcharts, and diagrams.
5. What experience do you have with teamwork?
The job of a system analyst often requires communicating with relevant staff to coordinate information and operation in order to optimize the system at all levels, in all departments. Effective team communication is vital.
4. Can you describe the documentation required of a system analyst?
Describe the various types of documentation you keep as part of your daily responsibilities. Analysts document problems as well as the solutions required or implemented. They keep records of inventory and maintain system documentation.
5. What is your technical expertise in this field?
Be specific about what you actually have training and experience in. But generally speaking, system analysts are usually experts in computer systems, hardware platforms, and programming techniques.
6. How do you stay current with the market?
Equity analysts always carry out current research on specific markets, industries, companies, sectors, or countries. Research is done through the internet, by interacting directly with corporate employees, clients, and company headquarters. They regularly read relevant magazines and newspapers.
7. What expertise does the job require?
In general terms, the job requires financial expertise – the equity analyst must be a highly trained financial expert, with knowledge of every rapidly-changing aspect of the financial world.
8. What key qualities allow you to succeed in your job?
Try not to speak theoretically, about analysts in general. Speak about yourself. Talk about multi-tasking and ability to work under pressure. Give examples of your analytical skills.
9. What exactly do you do in the capacity of an equity analyst?
More specifically, you analyze a particular niche of the financial world (be explicit). Then you communicate back with the clients and make suggestions on what or how to buy/sell/hold.
10. Do you have any experience preparing financial reports?
The ability to communicate effectively, both in speech and writing, is part of the job. Equity analysts prepare written reports and analysis briefs. They are conversant in the various formats and templates of analyses and reports typically required in the financial world.
21. What steps and processes do you follow when considering credit for a customer?
The analyst’s job is to analyze customers as well as the market. The analyst must know how reliable the paying habits of the client are. The analyst studies customer records and meets customers to discuss various issues.
12. What characteristics are most important to be successful as a credit analyst?
It may sound obvious, but it must be stressed: analytical thinking is vital to one’s success as a credit analyst. Professionals in this field do a lot of evaluating; they study customer records, meet clients in person, and become familiar with their history and habits. Analysts must be able to put all these together and decide if it is productive for the company to extend credit in this case.
13. What is the role of interpersonal and communication skills in the career of a credit analyst?
Interpersonal communication is crucial to realizing your full potential as a credit analyst. A credit analyst communicates regularly with internal and external business representatives regarding credit information. He/she also meets clients in person to answer queries, solve problems, respond to complaints, etc.
14. Are you skilled in financial analysis?
Financial analysis is part of the job. Analysts must understand things like financial and cash-flow statements, market share, management accounts, income growth, etc. They are required to generate financial ratios to understand a customer’s financial situation.
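As an illustration of the ratios mentioned above, here is a minimal Python sketch. The function names and formulas are standard textbook ratios, not part of any specific analyst's toolkit:

```python
# Illustrative sketch: a few common ratios an analyst might derive from
# a customer's financial statements. Names and inputs are hypothetical.

def current_ratio(current_assets, current_liabilities):
    """Liquidity: ability to cover short-term obligations."""
    return current_assets / current_liabilities

def debt_to_equity(total_debt, total_equity):
    """Leverage: how much the customer relies on borrowed funds."""
    return total_debt / total_equity

def income_growth(prev_income, curr_income):
    """Year-over-year income growth, expressed as a fraction."""
    return (curr_income - prev_income) / prev_income
```

A customer with assets of 200 against liabilities of 100 would show a current ratio of 2.0, for example.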
15. Are you proficient with relevant financially-oriented software and technology?
Professionals are typically required to use specialized software to perform tasks like generating financial ratios and developing statistical models to assess and predict information. Mentioning your ability to use computers in general for related activities, such as market research, is also relevant.
16. What are the steps included in the data integration process?
Stage data in an operational datastore, data warehouse, or data mart.
Update staged data in batch or real-time modes.
Create a single environment for developing, testing, and deploying the entire data integration platform.
Manage a single metadata repository to capture the relationships between different extraction and access methods and provide integrated lineage and impact analysis.
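The staging and batch-update steps above can be sketched conceptually in plain Python. This is a toy illustration using dicts in place of a real operational datastore and warehouse; all names are hypothetical:

```python
# Conceptual sketch of staging and batch update, not Data Services code.

source_rows = [
    {"id": 1, "name": "alice"},
    {"id": 2, "name": "bob"},
]

staging = {}   # stand-in for the operational datastore / staging area
target = {}    # stand-in for the data warehouse table

def stage(rows):
    """Stage extracted rows, keyed by primary key."""
    for row in rows:
        staging[row["id"]] = row

def batch_update():
    """Push staged rows into the target in one batch pass."""
    for key, row in staging.items():
        target[key] = {**row, "name": row["name"].title()}  # sample transform

stage(source_rows)
batch_update()
```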
17. Define the terms Job, Workflow, and Dataflow
A job is the smallest unit of work that you can schedule independently for execution.
A work flow defines the decision-making process for executing data flows.
Data flows extract, transform, and load data. Everything having to do with data, including reading sources, transforming data, and loading targets, occurs inside a data flow.
18. Arrange these objects in order by their hierarchy: Dataflow, Job, Project, and Workflow.
Project, Job, Workflow, Dataflow.
19. What are reusable objects in DataServices?
Job, Workflow, Dataflow.
20. What is a transform?
A transform enables you to control how datasets change in a dataflow.
21. What is a Script?
A script is a single-use object that is used to call functions and assign values in a workflow.
22. What is a real-time job?
Real-time jobs extract data from the body of the real-time message received, and from any secondary sources used in the job.
23. What is an Embedded Dataflow?
An Embedded Dataflow is a dataflow that is called from inside another dataflow.
24. What is the difference between a data store and a database?
A datastore is a connection to a database.
25. How many types of datastores are present in Data Services?
Database Datastores: provide a simple way to import metadata directly from an RDBMS.
Application Datastores: let users easily import metadata from most Enterprise Resource Planning (ERP) systems.
Adapter Datastores: can provide access to an application’s data and metadata, or just metadata.
26. What is the use of the Compact repository?
Remove redundant and obsolete objects from the repository tables.
27. What are Memory Datastores?
Data Services also allows you to create a database datastore using Memory as the Database type. Memory Datastores are designed to enhance processing performance of data flows executing in real-time jobs.
28. What are file formats?
A file format is a set of properties describing the structure of a flat file (ASCII). File formats describe the metadata structure. File format objects can describe files in:
Delimited format — Characters such as commas or tabs separate each field.
Fixed width format — The column width is specified by the user.
SAP ERP and R/3 format.
29. Which is NOT a datastore type?
30. What is repository? List the types of repositories.
The DataServices repository is a set of tables that holds user-created and predefined system objects, source and target metadata, and transformation rules. There are 3 types of repositories.
A local repository
A central repository
A profiler repository
31. What is the difference between a Repository and a Datastore?
A Repository is a set of tables that hold system objects, source and target metadata, and transformation rules. A Datastore is an actual connection to a database that holds data.
32. What is the difference between a Parameter and a Variable?
A Parameter is an expression that passes a piece of information to a work flow, data flow or custom function when it is called in a job. A Variable is a symbolic placeholder for values.
33. When would you use a global variable instead of a local variable?
When the variable will need to be used multiple times within a job.
When you want to reduce the development time required for passing values between job components.
When you need to create a dependency between the job-level global variable and job components.
34. What is a Substitution Parameter?
A value that is constant in one environment, but may change when a job is migrated to another environment.
35. List some reasons why a job might fail to execute?
Incorrect syntax, Job Server not running, port numbers for Designer and Job Server not matching.
36. List factors you consider when determining whether to run work flows or data flows serially or in parallel?
Consider the following:
Whether or not the flows are independent of each other
Whether or not the server can handle the processing requirements of flows running at the same time (in parallel)
37. What does a lookup function do? How do the different variations of the lookup function differ?
All lookup functions return one row for each row in the source. They differ in how they choose which of several matching rows to return.
38. List the three types of input formats accepted by the Address Cleanse transform.
Discrete, multiline, and hybrid.
39. Name the transform that you would use to combine incoming data sets to produce a single output data set with the same schema as the input data sets.
The Merge transform.
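The Merge behavior described above, a UNION ALL of inputs that share one schema, can be sketched in plain Python (an illustration, not actual Data Services code):

```python
# Toy sketch of the Merge transform: combine data sets with identical
# schemas into one output set, keeping every row (like UNION ALL).

def merge(*datasets):
    schemas = {tuple(sorted(rows[0])) for rows in datasets if rows}
    if len(schemas) > 1:
        raise ValueError("Merge requires identical schemas")
    out = []
    for rows in datasets:
        out.extend(rows)   # no de-duplication
    return out
```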
40. What are Adapters?
Adapters are additional Java-based programs that can be installed on the Job Server to provide connectivity to other systems, such as Salesforce.com or a Java messaging queue. There is also a Software Development Kit (SDK) that allows customers to create adapters for custom applications.
41. List the data integrator transforms
Pivot, Reverse Pivot.
42. List the Data Quality Transforms
Data Cleanse, Global Address Cleanse, USA Regulatory Address Cleanse, and Match.
43. What are Cleansing Packages?
These are packages that enhance the ability of Data Cleanse to accurately process various forms of global data by including language-specific reference data and parsing rules.
44. What is Data Cleanse?
The Data Cleanse transform identifies and isolates specific parts of mixed data, and standardizes your data based on information stored in the parsing dictionary, business rules defined in the rule file, and expressions defined in the pattern file.
45. What is the difference between Dictionary and Directory?
Directories provide information on addresses from postal authorities. Dictionary files are used to identify, parse, and standardize data such as names, titles, and firm data.
46. Give some examples of how data can be enhanced through the data cleanse transform, and describe the benefit of those enhancements.
Gender Codes — determine gender distributions and target marketing campaigns.
Match Standards — provide fields for improving matching results.
47. A project requires the parsing of names into given and family, validating address information, and finding duplicates across several systems. Name the transforms needed and the task they will perform.
Data Cleanse: Parse names into given and family.
Address Cleanse: Validate address information.
Match: Find duplicates.
48. Describe when to use the USA Regulatory and Global Address Cleanse transforms.
Use the USA Regulatory transform if USPS certification and/or additional options such as DPV and Geocode are required. Global Address Cleanse should be utilized when processing multi-country data.
49. Give two examples of how the Data Cleanse transform can enhance (append) data.
The Data Cleanse transform can generate name match standards and greetings. It can also assign gender codes and prenames such as Mr. and Mrs.
50. What are name match standards and how are they used?
Name match standards illustrate the multiple ways a name can be represented. They are used in the match process to greatly improve match results.
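A minimal sketch of the idea, with a toy alias table (the aliases here are examples I chose, not Data Cleanse reference data):

```python
# Hedged sketch: match standards are alternate representations of a name
# that the match process can compare against.

MATCH_STANDARDS = {
    "robert": {"rob", "bob", "bobby"},
    "william": {"will", "bill", "billy"},
}

def names_match(a, b):
    """True if the names are equal or share a match standard."""
    a, b = a.lower(), b.lower()
    if a == b:
        return True
    for canonical, aliases in MATCH_STANDARDS.items():
        group = aliases | {canonical}
        if a in group and b in group:
            return True
    return False
```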
51. What are the different strategies you can use to avoid duplicate rows of data when re-loading a job.
Using the auto-correct load option in the target table.
Including the Table Comparison transform in the data flow.
Designing the data flow to completely replace the target table during each execution.
Including a preload SQL statement to execute before the table loads.
52. What is the use of Auto Correct Load?
It prevents duplicated data from entering the target table. It works like Type 1: rows are inserted when they do not match existing data, and updated when they do.
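The insert-else-update behavior is essentially an upsert. A minimal Python sketch, modeling the target table as a dict keyed by primary key (names are illustrative):

```python
# Conceptual sketch of auto-correct load: insert the row if its key is
# new, otherwise overwrite the existing row (Type 1 behavior).

def auto_correct_load(target, rows, key="id"):
    for row in rows:
        target[row[key]] = row   # insert if new, update if matched
    return target
```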
53. What is the use of Array fetch size?
Array fetch size indicates the number of rows retrieved in a single request to a source database. The default value is 1000. Higher numbers reduce requests, lowering network traffic and possibly improving performance. The maximum value is 5000.
54. What are the differences between Row-by-row select, Cached comparison table, and Sorted input in the Table Comparison transform?
Row-by-row select — looks up the target table using SQL every time it receives an input row. This option is best if the target table is large.
Cached comparison table — loads the comparison table into memory. This option is best when the table fits into memory and you are comparing the entire target table.
Sorted input — reads the comparison table in the order of the primary key column(s) using a sequential read. This option improves performance because Data Integrator reads the comparison table only once. Add a query between the source and the Table_Comparison transform; then, from the query’s input schema, drag the primary key columns into the Order By box of the query.
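Logically, whichever mode is chosen, the transform compares incoming rows with the comparison table and flags each as an insert or an update. A plain-Python sketch of the cached-comparison-table idea (an illustration, not the actual transform):

```python
# Toy sketch of Table Comparison logic: flag each input row INSERT or
# UPDATE against a comparison table cached in memory as a dict.

def table_comparison(input_rows, comparison_table, key="id"):
    flagged = []
    for row in input_rows:
        existing = comparison_table.get(row[key])
        if existing is None:
            flagged.append(("INSERT", row))
        elif existing != row:
            flagged.append(("UPDATE", row))
        # unchanged rows produce no output row
    return flagged
```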
55. What is the use of Number of loaders in the target table?
Loading with one loader is known as single-loader loading; loading with more than one loader is known as parallel loading. The default number of loaders is 1. The maximum number of loaders is 5.
56. What is the use of Rows per commit?
Specifies the transaction size in number of rows. If set to 1000, Data Integrator sends a commit to the underlying database every 1000 rows.
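The commit-every-N-rows behavior can be sketched in a few lines of Python. The "database" here is just a list plus a commit counter, purely for illustration:

```python
# Sketch of rows-per-commit batching: commit after every N rows, plus a
# final commit for any remaining partial batch.

def load_with_commits(rows, rows_per_commit=1000):
    committed, pending, commits = [], [], 0
    for row in rows:
        pending.append(row)
        if len(pending) == rows_per_commit:
            committed.extend(pending)       # the commit
            pending, commits = [], commits + 1
    if pending:                             # final partial batch
        committed.extend(pending)
        commits += 1
    return committed, commits
```

Loading 2500 rows with a setting of 1000 would therefore issue three commits.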
57. What is the difference between lookup(), lookup_ext(), and lookup_seq()?
lookup(): returns a single value based on a single condition.
lookup_ext(): returns multiple values based on single or multiple conditions.
lookup_seq(): returns multiple values based on a sequence number.
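A toy Python illustration of the distinction between a simple and an extended lookup. The function names and signatures here are mine for illustration; they are not the real Data Services signatures:

```python
# Sketch: a simple lookup returns one value for one condition; an
# extended lookup applies several conditions and returns several columns.

TABLE = [
    {"id": 1, "region": "EU", "name": "alice", "tier": "gold"},
    {"id": 1, "region": "US", "name": "alice", "tier": "silver"},
]

def lookup(table, result_col, cond_col, cond_val):
    for row in table:
        if row[cond_col] == cond_val:
            return row[result_col]          # first match, single value

def lookup_ext(table, result_cols, conditions):
    for row in table:
        if all(row[c] == v for c, v in conditions.items()):
            return {c: row[c] for c in result_cols}   # multiple columns
```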
58. What is the use of History preserving transform?
The History_Preserving transform allows you to produce a new row in your target rather than updating an existing row. You can indicate in which columns the transform identifies changes to be preserved. If the values of certain columns change, this transform creates a new row for each row flagged as UPDATE in the input data set.
59. What is the use of the Map_Operation transform?
The Map_Operation transform allows you to change operation codes on data sets to produce the desired output. Operation codes: INSERT, UPDATE, DELETE, NORMAL, or DISCARD.
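A small sketch of the idea in Python. The mapping below, which turns UPDATEs into INSERTs and discards DELETEs, is one common configuration I chose for illustration, not a default:

```python
# Toy sketch of Map_Operation: remap each row's operation code and drop
# rows mapped to DISCARD.

MAPPING = {"NORMAL": "NORMAL", "INSERT": "INSERT",
           "UPDATE": "INSERT", "DELETE": "DISCARD"}

def map_operation(rows):
    out = []
    for opcode, row in rows:
        new_op = MAPPING[opcode]
        if new_op != "DISCARD":
            out.append((new_op, row))
    return out
```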
60. What is Hierarchy Flattening?
Constructs a complete hierarchy from parent/child relationships, and then produces a description of the hierarchy in vertically or horizontally flattened format.
Parent Column, Child Column
Parent Attributes, Child Attributes.
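The vertical flattening described above can be sketched as a small Python routine: from parent/child pairs, produce one row per (ancestor, descendant) pair with the depth between them. This is my own minimal illustration of the concept:

```python
# Sketch of vertical hierarchy flattening from parent/child relationships.

def flatten(pairs):
    children = {}
    for parent, child in pairs:
        children.setdefault(parent, []).append(child)
    rows = []
    def walk(ancestor, node, depth):
        for child in children.get(node, []):
            rows.append((ancestor, child, depth + 1))
            walk(ancestor, child, depth + 1)
    nodes = {p for p, _ in pairs} | {c for _, c in pairs}
    for node in nodes:
        walk(node, node, 0)
    return rows
```

For the chain A → B → C, this yields (A, B, 1), (B, C, 1), and the derived pair (A, C, 2).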
61. What is the use of Case Transform?
Use the Case transform to simplify branch logic in data flows by consolidating case or decision-making logic into one transform. The transform allows you to split a data set into smaller sets based on logical branches.
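The branching idea can be sketched as routing each row down the first branch whose condition it satisfies, with unmatched rows going to a default label (a plain-Python illustration, not the transform itself):

```python
# Toy sketch of the Case transform: route each row to exactly one branch.

def case_split(rows, branches, default="OTHER"):
    """branches: list of (label, predicate) pairs, checked in order."""
    routed = {label: [] for label, _ in branches}
    routed[default] = []
    for row in rows:
        for label, pred in branches:
            if pred(row):
                routed[label].append(row)
                break
        else:
            routed[default].append(row)
    return routed
```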
62. What must you define in order to audit a data flow?
You must define audit points and audit rules when you want to audit a data flow.
63. What is BW Statistics and how is it used?
BW Statistics is a set of info cubes delivered by SAP as part of SAP BW that are useful for measuring performance, such as how quickly a query is calculated or how quickly data is loaded into BW. As the name suggests, BW Statistics show data about the costs associated with BW queries, aggregated data, OLAP, and SAP Business Warehouse management.
64. Explain what a table partition is?
Partitioning the fact table improves performance. In SAP BW, the fact table can be partitioned only on 0CALMONTH or 0FISCPER.
65. When ABAP code is required during the transfer rule, what are the options available in the transfer rule?
The options available in a transfer rule are assigning an info object, assigning a constant, an ABAP routine, or a formula; ABAP code is required when the ABAP routine option is used.
66. How many dimensions are in a cube?
There are 16 dimensions, of which 3 are predefined (time, unit, and request); the customer is left with 13 dimensions.
67. How are the dimensions optimized?
Use as many dimensions as possible for performance. For instance, assume 100 products and 200 customers: if one dimension is used for both, the size of the dimension can reach 20,000 rows, whereas if they are made individual dimensions, the total number of rows will be only 300. Even when more than one characteristic is placed in a dimension, do the math considering the worst case and decide which characteristics may be combined in a dimension.
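The arithmetic behind this sizing rule is simply product versus sum of the cardinalities:

```python
# Worked check of the dimension-sizing arithmetic: combining two
# characteristics in one dimension can hold up to the product of their
# cardinalities, while separate dimensions hold only the sum.

products, customers = 100, 200
combined_rows = products * customers   # worst case in one shared dimension
separate_rows = products + customers   # two individual dimensions
```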
68. As per the update rule, what are the conversion routines for units and currencies?
As per the update rule, the time dimensions are automatically converted. For example, if the cube contains calendar month and the transfer structure contains a date, the date is automatically converted to calendar month.
69. What are the advantages of the SID table?
The SID (Surrogate ID) table is the interface between master data and the dimension table. Its advantages are:
Uses numeric keys as indexes for faster access.
Master Data independent of info cubes.
Slowly changing dimension support.
70. How can an info object be used as an info provider, and why?
To report on characteristics or master data, you can make them info providers. For example, to make 0CUSTOMER an info provider and do BEx reporting on 0CUSTOMER, right-click on the info area and select “Insert Characteristic as data target”.
71. For an info object, what is the transfer routine?
The transfer routine is like a start routine. It is independent of the data source and valid for all transfer rules, and it can be used to define global data and global checks.
72. What are the load process and post-processing types?
The load process and post-processing types are: info package, read PSA and update data target, save hierarchy, update ODS object, data export (open hub), and delete overlapping requests.
73. What is a DIM ID?
DIM IDs link the dimension tables to the fact table.
74. What are the data target administration tasks?
The data target administration tasks are: delete index, generate index, construct database statistics, initial fill of new aggregates, roll up of filled aggregates, compression of the info cubes, activate ODS, and complete deletion of the data target.
75. How many extra partitions is it possible to create, and why?
Two extra partitions are created, to accommodate data before the begin date and after the end date.
76. When there are locking problems, which parallel processes are involved?
The parallel processes that can cause locking problems are:
Hierarchy/attribute change runs.
Loading master data for the same info object, for example loading master data from different source systems at the same time.
Rolling up the same info cube.
Selective deletions of an info cube/ODS while loading in parallel.
Activation or deletion of an ODS object while loading in parallel.
77. What is the procedure to convert an info package group into a process chain?
To convert an info package group into a process chain, double-click on the info package group, click on the Process Chain Maintenance button, and type in a name and description; the individual info packages are inserted automatically.
78. How can data be transformed through Open Hub?
Data can be transformed through Open Hub by using a BAdI.
79. How can a cube be partitioned when data already exists?
The cube cannot be partitioned if data already exists; the cube must be empty to do this. One workaround is to make a copy of cube A to cube B, export the data from A to B using an export data source, empty cube A, create the partitions on A, re-import the data from B, and delete cube B.
80. What is the transaction code for the Administrator Workbench?
The transaction code for the Administrator Workbench is RSA1.
81. What is a source system?
The source system is the system that sends data to BW, such as R/3, a flat file, an Oracle database, or an external system.
82. What is a data source?
A data source is the source sending data to a particular info source on BW. For example, the 0CUSTOMER_ATTR data source supplies attributes to 0CUSTOMER from R/3.
83. What is an info source?
An info source is a group of logically related objects. For example, the 0CUSTOMER info source contains data related to customers and attributes like customer number, address, phone number, etc.
84. What is an Operational Data Store?
An Operational Data Store (ODS) object is one in which existing data can be overwritten.
85. What are the types of Info Source?
The Types of Info Source are Transactional, Attributes, Text and Hierarchy.
86. In what way are BW Statistics useful?
BW Statistics is a set of cubes delivered by SAP that is used to measure performance for queries, data loading, etc. It also shows the usage of aggregates and the costs associated with them.
87. What is the communication structure?
The communication structure is an independent structure created from the info source; it is independent of the source system and data source.
88. What is the procedure for passing a query result from a master query to a child query?
If a characteristic value is selected with a replacement path, it uses the results from the previous query. For example, assume we have query Q1, which displays the top 10 customers, and query Q2, which receives the top 10 customers through a variable on the info object 0CUSTOMER with a replacement path and displays a detailed report on the customer list passed from Q1.
89. Give some available formulas?
Some available formulas are: concatenate, substring, condense, left/right characters, l_trim, r_trim, replace, date routines like DATECONV, date-week, add to date, and date difference, as well as logical functions.
90. How do you define exception reporting in the background?
By using the reporting agent: from the AWB, click on the exception icon on the left, give a name and description, and select the exception from the query for reporting (drag and drop).
91. When defining aggregates, what are the options available?
The groupings are according to the characteristics:
H – Hierarchy
F – Fixed Value
B – None
92. What are transfer rules?
Transfer rules are the transformation rules for data moving from the source system to the info source or communication structure.
93. What is the procedure to debug errors with the SAP GUI, like an ActiveX error?
Run BEx Analyzer > Business Explorer menu item > Installation check. This shows an Excel sheet with a Start button; click on it. This verifies the GUI installation; if you find any errors, either reinstall or fix them.
94. What is the global transfer rule?
A global transfer rule is a transfer routine (ABAP) defined at the info object level; it is common for all source systems.
95. What is the first step performed when a user exit for variables is written?
When a user exit for variables is written, the first step in the ABAP code is a conditional check on the variable being processed.
96. What is the procedure to insert an inspection checkpoint at the end of an operation?
Define it in the process sample: assign inspection type 03 in the material master, then create MICs and assign them in the routings. The system then automatically generates the inspection lots.
97. Explain why the work scheduling view is required for semi-finished and finished products?
The work scheduling view is required for semi-finished and finished products because, while the other details are maintained in the MRP views, the production scheduler and production scheduler profile are maintained in this view. If it is not maintained, conversion of a planned order to a production order will not be possible.
98. What is Batch?
The partial quantity of material managed separately from other quantity of same material in stock is called Batch.
99. How do you delete a product group that was created in screen MC84?
The product group is created as a material master record with material type PROD. First delete the members from the product group, and then archive the product group record in transaction SARA using archiving object MM_MATNR.
100. What is the procedure for running MPS in the background or online for a plant?
MPS can be run for a plant either in the background or online, and it can also be run for a single material/plant. If it does not work, run a planning file update and a consistency check first (transactions OMDO and MDRE respectively); this requires setting up two jobs, but it is straightforward. Once that is done, set up the background MPS job to run with processing key NEUPL the first time, and change it to NETCH after that.
101. What is the output of an MPS run?
MPS is run to plan the materials that are of ‘A’ type in the ABC analysis (80% dollar value), which have to be planned first. MPS is also a type of MRP in which only the components just below the materials on which the MPS run takes place are planned.
102. List some components of BOM?
Components of BOM are List of Components, Quantity of Components and Unit of Measure of Components.
103. How does SAP know that an operation has components as indicated by the component allocation indicator?
Component allocation is done by routing maintenance or BOM maintenance for the assembly. This is master data maintenance. As soon as a PO is created, the master data is read into the PO.
104. Is it possible to attach a drawing for a material to a BOM, and what is the process?
Yes, it is possible to attach a drawing for a material to a BOM. Create a document using transaction code CV01N (Create Document) and attach the drawing in that document. The system will automatically generate a document number; then assign this document in the BOM with item category D.