Software testing terms part 3

call graph: 
An abstract representation of calling relationships between subroutines in a program.
Capability Maturity Model (CMM): 
A five-level staged framework that describes the key elements of an effective software process. The Capability Maturity Model covers best practices for planning, engineering and managing software development and maintenance. [CMM] See also Capability Maturity Model Integration (CMMI).

 

Capability Maturity Model Integration (CMMI):
A framework that describes the key elements of an effective product development and maintenance process. The Capability Maturity Model Integration covers best practices for planning, engineering and managing product development and maintenance. CMMI is the designated successor of the CMM. [CMMI] See also Capability Maturity Model (CMM).

capture/playback tool: 

A type of test execution tool where inputs are recorded during manual testing in order to generate automated test scripts that can be executed later (i.e. replayed). These tools are often used to support automated regression testing.

CASE: 

Acronym for Computer Aided Software Engineering.

CAST: 

Acronym for Computer Aided Software Testing. See also test automation.

causal analysis: The analysis of defects to determine their root cause.

cause-effect diagram: 

A graphical representation used to organize and display the interrelationships of various possible root causes of a problem. Possible causes of a real or potential defect or failure are organized in categories and subcategories in a horizontal tree-structure, with the (potential) defect or failure as the root node. [After Juran]

cause-effect graph: A graphical representation of inputs and/or stimuli (causes) with their associated outputs (effects), which can be used to design test cases.

cause-effect graphing: A black box test design technique in which test cases are designed from cause-effect graphs.

certification: The process of confirming that a component, system or person complies with its specified requirements, e.g. by passing an exam.

change control: See configuration control.

change control board: See configuration control board.

change management: (1) A structured approach to transitioning individuals, teams, and organizations from a current state to a desired future state. (2) Controlled way to effect a change, or a proposed change, to a product or service. See also configuration management.

changeability: The capability of the software product to enable specified modifications to be implemented. [ISO 9126] See also maintainability.

charter: See test charter.

checker: See reviewer.

checklist-based testing: An experience-based test design technique whereby the experienced tester uses a high-level list of items to be noted, checked, or remembered, or a set of rules or criteria against which a product has to be verified. See also experience-based testing.

Chow’s coverage metrics: See N-switch coverage.

classification tree: A tree showing equivalence partitions hierarchically ordered, which is used to design test cases in the classification tree method. See also classification tree method.

classification tree method: A black box test design technique in which test cases, described by means of a classification tree, are designed to execute combinations of representatives of input and/or output domains. [Grochtmann]
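
As a sketch of the idea, using a hypothetical login form with two classifications (user type and password validity; the names and partitions are assumptions for illustration), the combinations of representatives can be enumerated:

```python
from itertools import product

# Hypothetical classifications (equivalence partitions) for a login form:
user_type = ["guest", "registered", "admin"]
password = ["valid", "invalid"]

# Each test case combines one representative from each classification.
test_cases = list(product(user_type, password))
print(len(test_cases))  # 6 combinations of representatives
```

In practice the classification tree method also allows restricting the combinations (e.g. pairwise instead of the full product) when the full cross-product is too large.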

clear-box testing: See white-box testing.

code: Computer instructions and data definitions expressed in a programming language or in a form output by an assembler, compiler or other translator.

code analyzer: See static code analyzer.

code coverage: An analysis method that determines which parts of the software have been executed (covered) by the test suite and which parts have not been executed, e.g. statement coverage, decision coverage or condition coverage.
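
A minimal sketch of how statement coverage measurement works, using Python's built-in tracing and a hypothetical `absolute()` function under test (real coverage tools use the same principle with more machinery):

```python
import sys

def absolute(x):
    if x < 0:
        return -x
    return x

executed = set()

def tracer(frame, event, arg):
    # Record every source line executed inside absolute()
    if event == "line" and frame.f_code.co_name == "absolute":
        executed.add(frame.f_lineno)
    return tracer

sys.settrace(tracer)
absolute(5)              # exercises only the non-negative path
partial = len(executed)  # 2 of the function's 3 statements covered
absolute(-3)             # now the 'return -x' statement is covered too
sys.settrace(None)

print(partial, len(executed))  # 2 3
```

The gap between `partial` and the final count is exactly what a coverage report flags: the `return -x` statement was unreachable by the first test alone.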

code-based testing: See white box testing.

codependent behavior: Excessive emotional or psychological dependence on another person, specifically in trying to change that person’s current (undesirable) behavior while supporting them in continuing that behavior. For example, in software testing, complaining about late delivery to test and yet enjoying the necessary “heroism” of working additional hours to make up time when delivery is running late, thereby reinforcing the lateness.

co-existence: The capability of the software product to co-exist with other independent software in a common environment sharing common resources.

commercial off-the-shelf software: See off-the-shelf software.

comparator: See test comparator.

compatibility testing: See interoperability testing.

compiler: A software tool that translates programs expressed in a high order language into their machine language equivalents.

complete testing: See exhaustive testing.

completion criteria: See exit criteria.

complexity: The degree to which a component or system has a design and/or internal structure that is difficult to understand, maintain and verify.

compliance: The capability of the software product to adhere to standards, conventions or regulations in laws and similar prescriptions. [ISO 9126]

compliance testing: The process of testing to determine the compliance of the component or system.

component: A minimal software item that can be tested in isolation.

component integration testing: Testing performed to expose defects in the interfaces and interaction between integrated components.

component specification: A description of a component’s function in terms of its output values for specified input values under specified conditions, and required non-functional behavior (e.g. resource-utilization).

component testing: The testing of individual software components.

compound condition: Two or more single conditions joined by means of a logical operator (AND, OR or XOR), e.g. ‘A>B AND C>1000’.

 

concrete test case: See low level test case.

concurrency testing: Testing to determine how the occurrence of two or more activities within the same interval of time, achieved either by interleaving the activities or by simultaneous execution, is handled by the component or system.
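
A minimal sketch of a concurrency test in Python (the counter component is a hypothetical system under test): several threads drive the same operation simultaneously, and the test checks that the interleaved activity is handled correctly:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(times):
    global counter
    for _ in range(times):
        with lock:  # the component under test must serialize this update
            counter += 1

# Four concurrent activities interleave their increments.
threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000 -> concurrent updates were handled correctly
```

Without the lock, lost updates could leave the counter below 40000, which is precisely the kind of defect concurrency testing aims to expose.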

condition: A logical expression that can be evaluated as True or False, e.g. A>B. See also test condition.

 

condition combination coverage: See multiple condition coverage.

condition combination testing: See multiple condition testing.

condition coverage: 
The percentage of condition outcomes that have been exercised by a test suite. 100% condition coverage requires each single condition in every decision statement to be tested as True and False.

condition determination coverage: 
The percentage of all single condition outcomes that independently affect a decision outcome that have been exercised by a test suite. 100% condition determination coverage implies 100% decision condition coverage.

condition determination testing:
 A white box test design technique in which test cases are designed to execute single condition outcomes that independently affect a decision outcome.
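
A sketch for the hypothetical decision ‘A and B’: three of the four possible test cases suffice, because each single condition is shown to independently flip the decision outcome while the other condition is held fixed:

```python
def decision(a, b):
    # A decision composed of two single conditions, A and B
    return a and b

# Minimal set for 100% condition determination coverage of 'A and B':
tests = {(True, True): None, (True, False): None, (False, True): None}
for (a, b) in tests:
    tests[(a, b)] = decision(a, b)

# A independently affects the outcome: B held True, A flipped
print(tests[(True, True)], tests[(False, True)])  # True False
# B independently affects the outcome: A held True, B flipped
print(tests[(True, True)], tests[(True, False)])  # True False
```

The fourth combination, (False, False), adds no new information here, which is why this technique typically needs far fewer tests than multiple condition testing.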

condition outcome: 
The evaluation of a condition to True or False.

 
condition testing: 
A white box test design technique in which test cases are designed to execute condition outcomes.

configuration: 
The composition of a component or system as defined by the number, nature, and interconnections of its constituent parts.

configuration auditing: 
The function to check on the contents of libraries of configuration items, e.g. for standards compliance.

configuration control: 
An element of configuration management, consisting of the evaluation, co-ordination, approval or disapproval, and implementation of changes to configuration items after formal establishment of their configuration identification.

configuration control board (CCB): 
A group of people responsible for evaluating and approving or disapproving proposed changes to configuration items, and for ensuring implementation of approved changes.

configuration identification: 
An element of configuration management, consisting of selecting the configuration items for a system and recording their functional and physical characteristics in technical documentation.

configuration item: 
An aggregation of hardware, software or both, that is designated for configuration management and treated as a single entity in the configuration management process.
 

configuration management: A discipline applying technical and administrative direction and surveillance to: identify and document the functional and physical characteristics of a configuration item, control changes to those characteristics, record and report change processing and implementation status, and verify compliance with specified requirements.

configuration management tool:
 A tool that provides support for the identification and control of configuration items, their status over changes and versions, and the release of baselines consisting of configuration items.

 
consistency: 
The degree of uniformity, standardization, and freedom from contradiction among the documents or parts of a component or system.


content-based model: 
A process model providing a detailed description of good engineering practices, e.g. test practices.

 
continuous representation: 
A capability maturity model structure wherein capability levels provide a recommended order for approaching process improvement within specified process areas. [CMMI]

control flow: 
A sequence of events (paths) in the execution through a component or system.

control flow analysis:
 A form of static analysis based on a representation of unique paths (sequences of events) in the execution through a component or system. Control flow analysis evaluates the integrity of control flow structures, looking for possible control flow anomalies such as closed loops or logically unreachable process steps.

control flow graph: 
An abstract representation of all possible sequences of events (paths) in the execution through a component or system.

conversion testing: 
Testing of software used to convert data from existing systems for use in replacement systems.

corporate dashboard: 
A dashboard-style representation of the status of corporate performance data. See also balanced scorecard, dashboard.

cost of quality: 
The total costs incurred on quality activities and issues and often split into prevention costs, appraisal costs, internal failure costs and external failure costs.

COTS: 
Acronym for Commercial Off-The-Shelf software. See off-the-shelf software.

coverage: 
The degree, expressed as a percentage, to which a specified coverage item has been exercised by a test suite.

coverage analysis: 
Measurement of achieved coverage to a specified coverage item during test execution referring to predetermined criteria to determine whether additional testing is required and if so, which test cases are needed.

coverage item: 
An entity or property used as a basis for test coverage, e.g. equivalence partitions or code statements.

coverage tool: 
A tool that provides objective measures of what structural elements, e.g. statements, branches have been exercised by a test suite.

critical success factor: 
An element necessary for an organization or project to achieve its mission; one of the critical factors or activities required to ensure success. See also content-based model.

Critical Testing Processes: 
A content-based model for test process improvement built around twelve critical processes. These include highly visible processes, by which peers and management judge competence, and mission-critical processes, in which performance affects the company’s profits and reputation.

cyclomatic complexity: 
The number of independent paths through a program. Cyclomatic complexity is defined as L – N + 2P, where:
– L = the number of edges/links in a graph
– N = the number of nodes in a graph
– P = the number of disconnected parts of the graph (e.g. a called graph or subroutine) [After McCabe]
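
A worked example of the formula for a hypothetical function containing one if/else and one while loop (two decisions, so the expected complexity is 3):

```python
# Control flow graph of a function with one if/else and one while loop.
# Nodes: entry, if, then, else, loop (loop test), body, exit
edges = [
    ("entry", "if"),
    ("if", "then"), ("if", "else"),      # the if/else decision
    ("then", "loop"), ("else", "loop"),  # both branches rejoin at the loop
    ("loop", "body"), ("body", "loop"),  # the while decision and back-edge
    ("loop", "exit"),
]
nodes = {n for edge in edges for n in edge}

L = len(edges)  # 8 edges
N = len(nodes)  # 7 nodes
P = 1           # one connected graph

complexity = L - N + 2 * P
print(complexity)  # 3 = number of decisions (if, while) + 1
```

For a single connected graph this agrees with the common shortcut “number of decisions + 1”.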