
Sunday, January 15, 2012

Testing Skills

Software Testing Concepts
Quality can be defined as meeting the customer's requirements, or fitness for purpose.

Quality means
-conformance to requirements
-fit for use
-never-ending improvement

Quality is a term that applies to all of the products we use in our daily life.

In terms of software services, quality means various aspects such as:
-the software is free from defects
-completed within budget
-completed within schedule

To achieve quality consistently year after year, we need to have a process for quality assurance and quality control

QA:
Planned and systematic activities/processes that provide confidence that products and services will conform to specified requirements and meet user needs.
-proactive
-involves defining and implementing processes and measurements
-includes audits of the quality management system against standards
-the CMMI model is widely used to implement QA
-e.g. creating and enforcing standards and methods to improve the development process and to prevent bugs

QA is defined as a procedure or set of procedures intended to ensure that a product or service under development meets specified requirements.
-It also ensures that the defined set of processes is complied with.
-It tries to uncover any weak areas in the process and improve on them.
-It is a staff function and a management responsibility.

QC:
Quality control is a set of procedures used by an organization to ensure that
-the product/service meets a defined set of quality standards, within budget
-the organization continually improves its ability to produce software products
It is the actual operational technique used to ensure that the quality of the product meets the requirements of the client or customer.
-leads to the idea of verification and validation testing
-reactive
E.g.:
structured walkthroughs, reviews, and testing

QC inspects the product and looks for any inconsistency with the defined standards.
-related to a specific product or service
-identifies defects for correction
-It is a line function and everybody's responsibility.

Finally, QA aims to assure that quality work and quality deliverables are built in before the work is done (a preventive activity).

QC checks that quality did occur after the work is done (a detective activity) and is conducted by inspection.

Verification: "are we building the product right?"
The process of determining whether the output of one phase of development conforms to its previous phase.

Validation: "are we building the right product?"
Whether a fully developed system conforms to its requirements document.

Glenford Myers defined testing as the process of executing a program with the intent of finding errors.

Testing:
A test is an act of exercising software with test cases, with the objective of finding failures and demonstrating correct execution.

Paul Jorgensen:
testing is obviously concerned with errors, faults, failures, and incidents.

A test case includes
inputs, expected outputs, the conditions involved, the constraints involved, as well as the steps to be checked.

Importance of testing:
-to find any bugs or errors in the developed application
-to test the difference between actual and expected behavior
-to check whether all requirements in the RTM (requirements traceability matrix) have been fulfilled as per customer requirements
-to deliver a product of good quality, traceable to client requirements and free of defects

Software is mind-crafted by humans and hence is prone to errors.
Testing is done throughout the SDLC.

Cost of quality = prevention cost + appraisal cost + failure cost
Prevention cost: keeping the process in line; the cost required to prevent errors, such as process-related training and defect prevention (DP) efforts
Appraisal cost: testing, walkthroughs, inspections, and reviews; in-process and inter-process inspection and testing
Failure cost: customer complaints, the cost of defective products moved into production, rework costs (fixing review defects, testing defects, and customer-reported defects)
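The cost-of-quality formula above is simple arithmetic; a worked example (the figures are hypothetical, chosen only to illustrate the calculation):

```python
# Worked example of: cost of quality = prevention + appraisal + failure.
# All figures below are illustrative, not from the notes.

def cost_of_quality(prevention, appraisal, failure):
    """Total cost of quality is the sum of the three cost buckets."""
    return prevention + appraisal + failure

prevention = 10_000   # training, defect-prevention efforts
appraisal = 25_000    # reviews, inspections, testing
failure = 40_000      # rework, customer-reported defects

total = cost_of_quality(prevention, appraisal, failure)
print(total)  # 75000
```

Tracking the three buckets separately shows where spending more on prevention can reduce the (usually much larger) failure cost.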

The TechM VV&T specialist group executes projects with the understanding that unit testing will be done by the development team.

The two arms of the V also indicate that the VV&T activity is independent of the development activity.

The artifact review process (verification) goes hand in hand with each phase of development and testing. TechM's processes and practices are based on the V model.
As per the V lifecycle of testing, VV&T activities start along with the first phase, requirement capture.

This ensures that the VV&T team also has clarity regarding scope, technical details, and business functionality.

Error:
represents a mistake made by people

Fault:
is the result of an error
Fault of commission:
-we enter something into a representation that is incorrect
Fault of omission:
-any stakeholder can make one; the resulting fault is that something is missing that should have been present in the representation

Failure:
occurs when a fault executes

Incident:
the behavior of a fault. It is the symptom associated with a failure that alerts the user to the occurrence of that failure.

Test Case:
associated with program behavior
It carries a set of inputs and a list of expected outputs.
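The test-case fields described above can be sketched as a small data structure; the field names here are illustrative, not from any specific tool:

```python
from dataclasses import dataclass, field

# A minimal sketch of a test case with the fields listed in the notes:
# identifier, requirement reference, inputs, expected output, and steps.

@dataclass
class TestCase:
    identifier: str                # unique identifier
    requirement_ref: str           # traceability back to the requirement
    inputs: dict                   # input values for the test
    expected_output: object        # what the system should produce
    steps: list = field(default_factory=list)  # steps to be checked

tc = TestCase(
    identifier="TC-001",
    requirement_ref="REQ-4.2",
    inputs={"username": "alice", "amount": 100},
    expected_output="transfer accepted",
    steps=["log in", "enter amount", "submit", "verify confirmation"],
)
print(tc.identifier)  # TC-001
```

Keeping the requirement reference on each test case is what later makes the traceability matrix possible.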

VV&T Activities
strategy and planning
artifacts review
test case preparation and review
test env setup
test data creation
test execution and defect reporting
retest after defect resolution
test summary report
progress reporting/milestone reviews
Lessons learned and corrective actions to be taken

Finally: test case preparation and review is the most important of the VV&T activities listed above.

Testing methodologies
The objective of the testing process is to reduce risks inherent in a s/w being developed.
Agile s/w development movement is a test-driven s/w development model that is used widely.

The testing cycle consists of the following phases:
Initiation: requirements are understood.
Test plan: define the test cycle methodology, strategy, tools, effort estimate, and schedule of testing.
Test design: test environment setup; test cases, stubs, data, and the test automation framework.
Test execution: test cases are executed and test metrics are collected.
Test reporting: results are analyzed and reports are generated.

E2E testing experience in TELECOM
-starts with the component testing
1.Component testing:
both functional and non functional requirements.

2.System Integration testing (SIT)
interaction between system components, integrated as per specification.

3.System Acceptance testing (SAT)
concerned with the behavior of a whole system. compares the system behavior against business requirements. Includes non functional requirements such as accuracy, speed, security, and reliability.

4.Operational Readiness testing (ORT)
concerned with the behavior of the whole system along with the normal processes functioning in the business environment. ORT looks into the readiness of the organization, systems, and networks to provide the business services that will be sold to customers.

5.Production network and service testing (PROD)
ensures QoS and operational stability. Some of the testing performed: provisioning testing (capacity and topology), service activation testing, preventive maintenance testing, and repair-and-test.

Manual testing helps discover defects in the usability and GUI testing areas.
Test cases are a list of instructions, or "test scripts".

If the GUI is not displayed correctly but the functions work properly, such bugs are not detectable using automation tools.

Why manual testing?
-Business critical heavily tested software.
-new to testing
-Script based automation tools not living up to their hype
-Full automation is simply not possible.
-Agile development

Repetitive manual testing can be difficult to perform. This drawback is compensated for by using manual black box techniques such as equivalence class partitioning (ECP) and boundary value analysis (BVA).

There is no complete substitute for manual testing.

Problems with manual testing:
-requires human resources
-takes time
-less reliable
-inconsistent

Testing during each phase of SDLC

Requirement :
verification of requirement
determine test strategy
determine adequacy of requirements
generate functional test condition

Design Phase:
determine consistency of design with requirement
determine adequacy of design
generate structural and functional test conditions

Coding Phase:
determine consistency with design
determine adequacy of implementation
generate structural and functional test conditions for units

Every step of STLC corresponds to some activity of SDLC

testing during SDLC phase:
1.testing Phase
establish test objective
design tcs , write tcs
review tc, execute tc
examine results, report

2.Installation phase
-Version control

3.Maintenance phase:
modify and retest
regression testing
documentation update

90% of defects are process problems.
60% of software defects originate in the requirements phase of a project.
The probability of detecting new defects is directly proportional to the number of defects already detected.

A test environment consists of testlab or labs and test artifacts.

The testing process produces artifacts such as test plan, test case, test script, test suite, test data and test harness.

The test plan is the test specification.

A test case normally consists of a unique identifier and a requirement reference from the design specification.

Test script: a combination of test case, test procedure, and test data. A test script may be manual, automated, or both.

Test data: multiple sets of values or data that are used to test the same functionality of a particular feature.

Test suite: a collection of test cases.

Test harness: a collection of tools, software, data inputs or outputs, and configuration.
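These artifacts map naturally onto a unit testing framework; a minimal sketch using Python's unittest, where each test method is a test case, a TestSuite groups them, and the runner plays the role of a simple test harness (the `add` function is an illustrative unit under test):

```python
import unittest

# Each test method below is a test case; the TestSuite is the test
# suite; the TextTestRunner acts as a minimal test harness.

def add(a, b):          # trivial unit under test (illustrative)
    return a + b

class AddTests(unittest.TestCase):
    def test_positive(self):
        self.assertEqual(add(2, 3), 5)

    def test_negative(self):
        self.assertEqual(add(-2, -3), -5)

suite = unittest.TestSuite()
suite.addTest(AddTests("test_positive"))
suite.addTest(AddTests("test_negative"))

runner = unittest.TextTestRunner(verbosity=0)
result = runner.run(suite)
print(result.wasSuccessful())  # True
```

The same structure scales up: real harnesses add environment setup, test data loading, and reporting around this core loop.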

development --development team
system test --testing team or testing group
acceptance test ---customer
E2E test --- testing team or independent testing group
production ----live data --customer

Test data:
means creating representative processing conditions using test transactions.
Test data is usually identified for use in executing test scripts and is used to verify the expected results obtained.
It is created by the testing team; sometimes an extract of live data may be added.

A test script in software testing is a set of instructions that will be performed on the system under test to verify that the system functions as expected. These steps can be executed manually or automatically.

test strategy :
Only one test strategy is created for the entire project, at the beginning of the project. This document is prepared by the test manager with the approval of the GM/SPM.

test plan:
is a systematic approach to testing a system such as a machine or s/w. It covers the following
-scope of testing
schedule
test deliverables
release criteria
risk and contingencies
prepared by test lead and reviewed by test manager

The test plan is prepared per work package in case of multiple work packages.

The test plan document will be prepared based on test plan template (ITS-E-T005A)

The process of preparation of test plan is documented in procedure document called defining test strategy and test plan procedure (ITS-E-P005)

after this TRTM needs to be prepared based on test requirement traceability matrix template (ITS-E-T027A)

Review and approval of test plan document is done by customer

test requirement traceability matrix is done through internal peer review.

test planning/designing done by test lead

test case preparation done by test lead

test plan review done by test manager.

The purpose of traceability matrix is to tie up the requirements, with the functions and built test cases to test these requirements.

It depends as test engineers may also be able to wear the hat of technical engineer.

Agile approach towards testing:

significantly reduces testing cycles time in a project

it will quickly establish that with the addition of new functionality, what worked before still works

It will also provide development teams with regular insights into the quality of the deliverables.

agile methodology advises:
continuous testing in parallel with the development
parallel testing in diff test environs

Agile testing is a result of proven practices implemented in many projects currently executed by techm for Bt

it includes intelligent use of tools for various aspects of the test management, preparation, execution and reporting.

Agile process implementations are not static and are constantly being adapted and improved as a new learning is uncovered.

Important methods of an Agile methodology

start early
Early and frequent drops
parallel drops
fail fast
Automation before object of test ready
stubbing
dashboards

------------------------------------------------------------------------------------------------
levels of testing

Unit--performed on module by developer
component--done by tester
Integration--
System
System Integration testing
End to End testing--is a last testing
Regression
Acceptance--usually performed at user level

testing approaches:

White box testing
testcases depend on implementation
Control flow testing
data flow testing
branch testing

Black box testing
testcases don't depend on implementation
Equivalence Partitioning
BVA
decision table testing
pairwise testing
Use case testing
Cross Functional testing
State Transition Tables
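The first two black box techniques above, equivalence partitioning and BVA, can be sketched for a field that accepts integers from 1 to 100 (an illustrative range):

```python
# Equivalence partitioning and boundary value analysis for an input
# field that accepts integers from 1 to 100 (illustrative range).

LOW, HIGH = 1, 100

def is_valid(value):
    return LOW <= value <= HIGH

# Equivalence classes: one representative value per partition.
partitions = {
    "below range (invalid)": 0,
    "in range (valid)": 50,
    "above range (invalid)": 101,
}

# Boundary values: just outside, on, and just inside each boundary.
boundary_values = [LOW - 1, LOW, LOW + 1, HIGH - 1, HIGH, HIGH + 1]

for name, v in partitions.items():
    print(name, v, is_valid(v))

for v in boundary_values:
    print(v, is_valid(v))
```

Equivalence partitioning keeps the test count small (one value per class), while BVA adds the boundary values where off-by-one defects tend to hide.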

White box testing is applicable at the unit, integration, and system testing levels, though it is typically applied at the unit level.
Although this method of test design can uncover an overwhelming number of test cases, it might not detect unimplemented parts of the specification or missing requirements; however, one can be sure that all paths through the test object are executed.

Black box testing looks at the external interfaces of the test object to derive test cases.
These tests can be either functional or non-functional; however, they are normally functional.
The test designer selects valid and invalid inputs and determines the correct output.
This method of test design is applicable to all levels of testing: unit, integration, system, and acceptance testing.
While this method can uncover unimplemented parts of the specification, it does not ensure that all the paths are tested.

Hence we can say that white and black box testing are complementary, and each uncovers different areas.

The function of unit testing is to isolate each part of the program and show the individual parts are correct.

Unit testing:
-fits modular design
-locates errors in a smaller region
-reduces debugging effort
-is the basis for extreme programming

Component testing :
done by the tester; a black box testing technique
In this approach, parts of the software project are outsourced to other development organizations and, finally, third-party components (commercial off-the-shelf, or COTS) are integrated to form the software.
One of the greatest problems with component technology is fault isolation of individual components in the system.

It defines when to test and which s/w component to test.
Components characteristics which are relevant to component testing
component observability:
the ease with which the component can be observed in terms of its operational behavior, input parameters, and outputs.

Component traceability:
the ability to track the status of its attributes and behavior. The former is called behavior traceability, where the component facilitates the tracking of its internal and external behaviors; the latter is called trace controllability, the ability of the component to facilitate the customization of its tracking function.

component controllability:
the ease of controlling the component.

Component understandability:
how much component information is provided and how well it is presented.

Integration testing
when we start with integration testing it is not essential to have all the modules ready

when we are integrating in a top-down fashion, the lower level modules are replaced by stubs.

When we are integrating in a bottom-up manner, the calling program/function is replaced by a driver.
The calling component is the driver.
Drivers are the exact opposite of stubs.

System Testing:
includes non-functional testing such as installation testing, documentation testing, stress testing.

mainly three types of system testing
alpha: carried out by test team within the developing organization
Beta: performed by select group of friendly customers
Acceptance: performed by customer himself to accept or reject.

System Integration testing:
more relevant when multiple systems work together.
Testing is performed to verify that all systems interact correctly with each other.
E.g., a bank front end may run on Windows NT while the back end runs on a mainframe.
Another eg -- correct data interchange between saving a/c and loan a/c

End to end testing:
refers to verifying whether business requirements are delivered accurately, irrespective of which systems are involved in the process.

E2E testing becomes relevant as the organization grows: its business requirements gradually evolve, resulting in a multi-system architecture with complex interfaces.

Examples of end-to-end testing:
a credit card user's journey
COTS products, which are designed to be easily installed and to interoperate with existing system components

One major advantage of COTS software, which is mass-produced, is its relatively low cost.
Another example is OSS/BSS: billing, network, and management systems.

Regression testing:
e.g. a new field added to a screen
It is not applicable when a system is developed fresh from scratch and version 1.0 is moved into production.
It is basically used to test the functionality of the untouched parts after code changes.

Acceptance testing:
testing from the end user's point of view, also called customer acceptance testing (CAT) or business acceptance testing (BAT). Test data is usually a live data extract.
Test specifications are written by the business analyst.
Functionality is tested against the business requirements specified by the customer.
In case the system is maintained by an IT service vendor, acceptance testing is done by the client.
----------------------------------------------------------------------------------------------
Defect detection and reporting:

Reviews are mainly required because authors are blind to some trouble spots in their own work. The advantage is that even non-executable artifacts can be reviewed, for example requirement specifications and design documents.

Reviews and testing complement each other.
The advantage of reviews is that defects are detected much earlier in the life cycle; testing is also essential and effective.

It is essential to keep an open mind when the defects are identified by the reviewer.

Review Techniques:
Peer review:
done by colleague
comments are recorded formally

Inspection:
formal review process with predefined roles and process

Walkthrough
not as formal as an inspection; normally used for specifications and high-level design

The inspection process was developed by Michael Fagan in the mid-1970s.
A checklist is used to focus the reviewers' attention.
Data is collected at each inspection and fed into future inspections to improve them.
The goal of inspection is to identify defects.

The stages in the inspection are:
Planning--moderator
Overview meeting---author
Preparation--inspector
Inspection Meeting--reader
rework--author
Follow-up--moderator

Moderator is a leader of the inspection and recorder or scribe is a person who documents the defects that are found during the inspection.

A reviewer might be the programmer who is going to implement the design; for code, it might be an experienced programmer.

Walkthrough
not formal
used for requirement specifications and high-level design
hosted by author
It is a flexible concept and can be adapted to specific needs of an organization.

A defect is a non-conformance to user requirements.
Software is especially defect-prone since it is mind-crafted.

Defect :
any non-compliance that moves from one development lifecycle phase to another.

Defect severity:
High: e.g. a defect in a reservation system that will stop reservations (weight 5)
Medium (weight 3)
Low (weight 1)

Priority:
Defines the urgency with which a defect needs to be fixed
usually a service level imposed by the business on IT
used to define the turnaround time given to IT

Examples of priorities:
Fix immediately: 12 hours; fix in the next build
Normal queue: 3 days; fix as soon as possible
Medium: 10 days; fix in the next release
Low priority: 1-3 months; fix if time allows

Defect priority will depend on the impact of the defect on the functioning of the application as well as the probability of occurrence.

Severity is assigned by the test engineer.
Priority is assigned by the manager/customer.

Defects found during system testing, SIT, E2E testing, or in production need to be fixed by the development team.

Defect life cycle
open---assigned---fixed---rejected---retest--validated--closed
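The life cycle above can be sketched as an allowed-transitions map; the exact transitions here are a simplified reading of the chain in the notes (e.g. a failed retest reopening the defect is my assumption):

```python
# A sketch of the defect life cycle as an allowed-transitions map.
# The transition set is a simplification of the chain in the notes;
# "retest" -> "assigned" (reopen on failed fix) is an assumption.

TRANSITIONS = {
    "open":      {"assigned", "rejected"},
    "assigned":  {"fixed", "rejected"},
    "fixed":     {"retest"},
    "rejected":  {"closed"},
    "retest":    {"validated", "assigned"},
    "validated": {"closed"},
    "closed":    set(),
}

def can_move(current, new):
    """Return True if a defect may move from `current` to `new`."""
    return new in TRANSITIONS.get(current, set())

print(can_move("open", "assigned"))  # True
print(can_move("open", "closed"))    # False
```

Encoding transitions explicitly lets a defect tracker reject illegal jumps (e.g. closing a defect that was never retested).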

Six steps for defect management
defect analysis
root cause identification
statistical data analysis
abstraction/generalization
Modify the faulty process
Review data and maintain

Find defect:
Reviews, Walkthrough, and Inspection (static technique)
Testing (dynamic )
Found by users, customers etc (Operational )

Steps for defect prevention are:
Identify critical risks
estimate expected Impact
Minimize expected Impact

Defects uncovered in testing can be logged using test log ITS-F-041A. These can be combined with review defects in the defect log template ITS-F-041A.

defect template:
Item name
version
detected in phase
author
round
reviewer
defect location and description
severity
defect class and sub-class
stage at which the defect was introduced
defect status

PM ensures that Defect Prevention meetings are conducted as per the project plan

All team members attend the DP meeting

Artifacts that will be subjected to V and V
-Review strategy and testing strategy

Measurements and Metrics:
Measurements:
Quantitative data gathered from the field.
E.g. distance covered, number of lines of code

Metric
A quantity derived from the measurements
eg average speed, productivity per day

Some of the VV&T metrics are:
Internal review defects (Ir): weighted defects found offshore
External review defects (Er): weighted defects found onsite by the customer
DRE (defect removal efficiency) = Ir/(Ir+Er) * 100; the ideal is 100%
Defect density = (Ir+Er)/size; the lower the better

Defect detection :
Di/(Di+De)*100

Test effectiveness
(Di-Dr)/Di*100

Dr : defects rejected
Di: defect found during testing
De: defects found after testing

Metrics at Tech Mahindra
project management metrics

Schedule variance = (actual duration - planned duration) / planned duration
Effort variance = (actual effort - planned effort) / planned effort
Productivity = size (function points) / effort (person-days)

testing Metrics:
Defect density = weighted defects/size
defect removal efficiency = In phase defects/total defects
-----------------------------------------------------------------------------

Testing Approaches:

White box testing
also referred as structural testing

Disadvantages:
-does not ensure that user requirements are satisfied
-the tests may not reflect real-world data
-not suitable for verifying screen interactions
-the tester has to understand the particular programming language

The two main approaches to white box testing are

statement coverage (code coverage) and decision coverage (design coverage)

Decision coverage:
covers boolean expressions: if, while, switch
Disadvantage: ignores branches within a boolean expression

Statement coverage:
also called line coverage or basic block coverage
Statement coverage does not report whether loops reach their termination condition.

Difference between statement and decision coverage, e.g.:

if (a > b)
    ;            (nothing to do)
else
    do_this();

One test case (a=1, b=2) is enough for statement coverage, since it executes do_this(), but two test cases, (a=1, b=2) and (a=2, b=1), are needed for decision coverage, since both outcomes of the condition must be exercised.

Black box testing
derives test data from the specified functionality, irrespective of the internal logic/structure of the program. It is applicable to all levels of testing and exclusively uses validation testing techniques.

Three test case design techniques are basically used for black box testing:

ECP, BVA, and negative testing

Guidelines for writing Black box testing:

Look for inputs which are mutually exclusive and then make all the realistic combinations.

Mutually exclusive inputs are those whose values can influence the output independent of each other.

Test cases are written in the following format
test case number, reference for traceability, test case data and description, expected result

black box billing example: 6 test cases

If there are three or more inputs, it is best to form "groups" of test cases.
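Forming the realistic combinations of independent inputs is a cross product; a sketch with hypothetical billing inputs that yields the six test cases mentioned above:

```python
from itertools import product

# Combining mutually exclusive (independent) inputs into test cases.
# The fields and values below are hypothetical billing inputs.

customer_type = ["retail", "corporate"]        # 2 values
payment_mode = ["card", "cash", "online"]      # 3 values

# All realistic combinations: 2 x 3 = 6 test cases.
test_cases = list(product(customer_type, payment_mode))

print(len(test_cases))  # 6
for tc in test_cases:
    print(tc)
```

With three or more inputs the product grows quickly, which is why the notes suggest grouping test cases (or using pairwise selection) instead of enumerating every combination.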
------------------------------------------------------------------------------------
Nonfunctional testing:

Performance testing
stress
documentation
usability
recovery
compatibility
Installability
Non functional ORT
security

Testing of all aspects of an application other than functionality is termed non-functional testing. It is usually conducted along with or after functionality testing.

Performance:
used to determine the maximum sustainable load, such as the maximum number of users. LoadRunner is commonly used.

Stress:
It checks robustness, availability, and error handling under less-than-ideal circumstances.
E.g. a website can be tested under peak load conditions with various denial-of-service tools,
or a mail server can be tested when the outgoing links are down.

Documentation testing:
testing of all documents supplied with the software.
In case documentation is not included, the tester must raise an issue.
documents supplied with s/w are
-user manual including installation instructions
-online help
-Tutorials supplied with s/w
-computer-based training

Usability testing:
means how appropriate, functional, and effective the interaction between the software and its user is.

the objective is to check the efficiency, accuracy, emotional response and recall of the user to uncover errors or areas of improvement.
testing methods used are
-scripted instructions
-pre- and post-test questionnaires
An example is a banking application.

Recovery:
implies bringing a system to a halt artificially, so that its recovery procedure can be verified.
e.g. an ATM machine: a transaction is completed but the money is not reflected in the account

Installability testing:
for s/w products that have a complex installation, one needs to test installation on different platforms.

Compatibility testing:
verify that a system interacts with and shares data as per the specification with other systems.
eg. the application must provide the same functionality irrespective of the browser used.
validation checking is done here

Operational Readiness testing(ORT)
Before a system is ready to move into a production, it is tested for various aspects as seen earlier.

Nonfunctional areas need to be tested for ORT
-network readiness
-hardware readiness
-user readiness
-commercial readines
-legal readiness

Security testing:
process to determine that an Information system protects data and maintains functionality as intended
Other methods for security testing are syntax testing, property-based testing, fault injection, mutation testing, and Gligor's testing.

Six basic security concepts:
confidentiality, integrity, authentication, authorization, availability, non-repudiation

Security testing involves identifying points with high risk of penetration and potential perpetrators. It involves five tasks.

task1
identify potential perpetrators
task2
Identify potential points of penetration
task3
Create a penetration point matrix
(penetration points on one axis, perpetrators on the other)
task4
identify high risk points of penetration
task5
Execute security test
-------------------------------------------------------------------------------------------
Specialized areas testing:

White box testing of websites

Dynamic and customizable content is delivered using VBScript, ASP, ActiveX, and XML.
Test areas:
-dynamic content
-client-side and server-side scripts
-database-driven web pages
-web pages populated with rich data

Black box testing of website
hyperlinks
forms, i.e. check that field lengths are adequate, that fields accept only valid data, etc.

Nonfunctional testing of website

compatibility testing
Usability testing
Security testing

datawarehouse testing
Major concerns in this areas are
security, accessibility, and integrity

process of testing data warehouse
1. identify relevant concern areas
2.identify data warehouse activity process to test
3.test the adequacy of data warehouse activity

CRM application testing:
Testing a CRM application would involve external customers.
The cost of CRM testing depends on the testing software used and the thoroughness of testing.

Focus is on:
-Installation
-Access control
-Workflow
-API testing
-Performance impacts
-User acceptance testing

ERP testing

Areas tested in ERP
Security control and customization areas of ERP
ERP systems are thoroughly tested before they go into production. ERP testing includes unit testing, component testing, regression testing, performance testing and user acceptance testing.
Testing of an ERP system is often limited to the workers within the organization.
Authorization controls are enforced at the application level. Most of the data is centralized and also accessed by multiple modules.
It is therefore very essential to test the security control.
-----------------------------------------------------------------------------------------

test Automation:
Speed , reliable , repeatable, reusable

Test automation enables us to:
-measure the coverage of test cases
-analyze test cases
-develop new test cases

Types of testing tools
Capture/playback tools for automation of functionality testing
e.g. WinRunner, QA Run, Rational Robot, SilkTest

Performance testing
LoadRunner, Rational Performance Studio

Defect tracking tools
ClearQuest, Bugzilla

test coverage tools
Quantify

test management tool
TestDirector

The biggest barriers to automation are high costs and the inability to automate visual references.

The transactions used for performance testing have to be functionally stable before conducting the test.

QTP 9.2 integrates with other mercury testing solutions including:
TestDirector/Quality Center
WinRunner
LoadRunner

QTP utilizes an add-in architecture for compactness and extensibility.

Windows structure of QTP
test pane
Active Screen
data table
debug viewer

By default, three add-ins are available in QTP:
ActiveX, VB, Web

Additional add-ins:
Java, Oracle, PeopleSoft, .NET, Terminal Emulator (RTE), SAP, Siebel, Delphi

Testing process
create object repository -> create tests -> debug tests -> run tests

A QTP test is called a test script, but it is actually a folder containing several files. Each test is broken down into sub-parts known as actions.

Actions are reusable and can be invoked multiple times within a test; these actions can also be called from other tests.

QTP provides a WYSIWYG editor to use when writing your test script.

Keyword view:
icon-based, for non-programmers
Expert view:
provides a VBScript editor

Object repository:
Objects are visual (e.g. button, text box) or non-visual (e.g. dictionary, reporter) elements in the application. Each object has:
-properties, methods, and events

QTP object identification process
object identification
object spy
object repository

Creating an object repository
per action and shared

process in editing test
-insert synchronization points
-checkpoints
-insert output values
-insert parameterization
-include VB script code

there are two points of synchronization
specific time and until the event occurs

Different types of checkpoints are :
standard
text
text area
bitmap
database
accessibility
XML

When the functionality provided by QTP is not sufficient to test an application, VBScript code is used; VBScript is a scripting language developed by Microsoft.

Important points:
checkpoints : check
synchronization points : wait
output values : retrieve information
Parameterization: configuration data
VBScript: adds programmable logic

Testing is divided into QA and QC activities.

QAI : quality assurance association.


System test vs scale test vs stress test vs load test vs performance test

System testing, scale testing, stress testing, load testing, and performance testing are different types of testing activities conducted to evaluate various aspects of a software system. Each focuses on specific objectives and scenarios.

1. System testing:
Objective: evaluate the overall system's compliance with specified requirements. It involves testing the complete, integrated system to ensure that it behaves as expected.
Scope: the entire system, including all integrated components and modules.

2. Scale testing:
Objective: assess a system's ability to handle an increasing amount of load, typically related to the number of users or transactions.
Scope: evaluates the system's performance as it scales up in terms of users, transactions, or data volume.

3. Stress testing:
Objective: evaluate a system's robustness by subjecting it to conditions that exceed normal operating parameters, to identify how the system behaves under extreme conditions.
Scope: pushes the system beyond its expected capacity to determine its breaking point, the point at which performance starts to degrade.

4. Load testing:
Objective: apply a specific workload to a system to evaluate its performance under normal and peak load conditions.
Scope: tests the system's ability to handle a specific number of concurrent users or transactions without degrading performance.

5. Performance testing:
Objective: a broad term encompassing testing activities aimed at assessing different aspects of a system's performance, including speed, responsiveness, and scalability.
Scope: encompasses load testing, stress testing, and other tests that evaluate the system's overall performance characteristics.

In summary, these testing activities serve distinct purposes within the software development life cycle, helping identify and address different aspects of a system's behavior and performance under various conditions. The choice of testing type depends on the specific objectives and requirements of the testing phase.
