TU Wien: Software Engineering VU (Christaki)/Test 2 - Summary
Requirements - Life Cycle Stages
- Requirements Elicitation - Design - Implementation - Validation
Requirements - What are Requirements
- Functionality - User Interaction - Error handling - External Interfaces
Requirements - Functional Requirements
- Functionality - Outputs to Inputs - Abnormal Situations - Sequence of Operations - Validity checks - Parameters - External Interfaces - Purpose - Source, Destination - Range, accuracy, tolerance - Units of measure - Relationships to other IO - Formats
Requirements - Non-Functional Requirements
- Performance - can be static or dynamic numerical requirements - Design constraints - Standards - Implementation (Tools, etc.) - Operation (Administration/Management) - Legal (Licences, etc.) - Quality Criteria - Correctness - Consistency - Completeness - Clarity - Traceability - Realism - Verifiability
Requirements - Requirements Validation
Quality assurance step after elicitation - Reviews - Prototyping
Requirements - Steps
Identifying … - actors - scenarios - use cases - non-functional requirements
Requirements - Actors
Represent roles of - users - external systems - physical environments
Requirements - Scenarios
Describes what people do and experience using the system - Behavior of the system from User POV - Common Cases - Understandable
Requirements - Use Cases
Like Scenarios but much more general. A list of steps in the interaction between actor and system - Describe all possible cases - Completeness
Requirements - Non-Functional Requirements
- almost always in relation to functional requirements - identified using checklists - almost always in conflict with each other
Requirements - Sources of Information
- Client - Users - Documentation - Observations
Requirements - Design Specification
Models of the system using abstraction; abstractions are built using modeling
Modeling - Informal Models
Like UML - Mostly visual - easy to discuss - leave details open (not precise) - focus on particular views of the system
Modeling - Formal Models
- based on math - focus on some aspect - enable automatic analysis
Alloy - Consistency
- Formula is consistent if there is at least one structure satisfying the formula under a set of constraints - done using **run** command - positive answers are definite (structures)
Alloy - Validity
- Formula is valid if all structures under constraints satisfy it - done using **check** command - check by searching for counterexamples - negative answers are definite (counterexamples)
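The run/check asymmetry above can be illustrated with a small-scope brute-force search. This is a Python sketch only, not Alloy: the "structures" here are just integers in a bounded scope, where Alloy searches over relational structures.

```python
# Small-scope search illustrating Alloy's run/check distinction.
# The scope is an arbitrary bound chosen for illustration.

SCOPE = range(-10, 11)

def run(formula):
    """Consistency: return a satisfying structure, or None."""
    for x in SCOPE:
        if formula(x):
            return x   # positive answer is definite: a concrete structure
    return None        # no instance found *within the scope*

def check(formula):
    """Validity: search for a counterexample, or None."""
    for x in SCOPE:
        if not formula(x):
            return x   # negative answer is definite: a counterexample
    return None        # no counterexample *within the scope*

print(run(lambda x: x * x == 4))    # consistent: prints a structure (-2)
print(check(lambda x: x * x >= 0))  # valid in scope: prints None
print(check(lambda x: x < 5))       # invalid: prints a counterexample (5)
```

As in Alloy, a `None` from either search only says something about the chosen scope, not about all structures.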
Design by Contract - Preconditions
- Caller needs to fulfill - Method can expect
Design by Contract - Checking Preconditions
- Error output - Exception - Assert
Design by Contract - Postconditions
- Caller can expect - Method needs to fulfill
Design by Contract - Checking Postconditions
Assertions
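A minimal sketch of checking pre- and postconditions with assertions (one of the checking options listed above); the function and its contract are made up for illustration.

```python
import math

def sqrt_floor(n: int) -> int:
    """Return the integer square root of n (floor)."""
    # Precondition: the caller must fulfill it, the method can expect it.
    assert n >= 0, "precondition violated: n must be non-negative"
    result = math.isqrt(n)
    # Postcondition: the method must fulfill it, the caller can expect it.
    assert result * result <= n < (result + 1) ** 2, "postcondition violated"
    return result

print(sqrt_floor(10))  # 3
```

Calling `sqrt_floor(-1)` raises an `AssertionError`, making the contract violation visible at the caller's side.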
Design by Contract - Inheritance (DbC in Subtypes)
- Preconditions can be weaker or equal - Postconditions can be stronger or equal
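A hypothetical example of the subtype rule: the subclass accepts more inputs (weaker precondition) and promises more about its result (stronger postcondition), so it can safely stand in for the base class. Both classes are made up for illustration.

```python
class Discount:
    def rate(self, amount: int) -> float:
        assert amount > 0                  # precondition
        r = 0.1 if amount >= 100 else 0.0
        assert 0.0 <= r <= 0.5             # postcondition
        return r

class FriendlyDiscount(Discount):
    def rate(self, amount: int) -> float:
        assert amount >= 0                 # weaker: also allows amount == 0
        r = 0.1 if amount >= 100 else 0.05
        assert 0.05 <= r <= 0.1            # stronger: narrower guarantee
        return r
```

Any caller that satisfies the base precondition also satisfies the subclass one, and anything the base postcondition promises is still promised by the subclass.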
Design by Contract - Strong vs Weak Pre-Conditions
Trade-off - Usability for the client - Burden on the method
Effective & Systematic Testing - Requirement Analysis
- get requirements (UML, etc.) - analyse - write code
Effective & Systematic Testing - Test-Driven-Development
- Write tests, then code - Rapid Feedback - Easy refactoring
Effective & Systematic Testing - Contracts and Testability
- Requirements are large - Multiple Units with different Contracts implement Features - Write code with testability in mind
Effective & Systematic Testing - Unit Testing
Once the requirements are analysed and the code is written. Testing techniques: - Domains - Boundary Values - Structural - …
Effective & Systematic Testing - Larger Tests
Unit tests only focus on singular units. Larger tests: - Integration tests - System tests
Effective & Systematic Testing - Intelligent Testing
Automated tools: - generate test cases - mutation testing
Effective & Systematic Testing - Effective
Writing the right tests: find the maximum number of bugs with minimum effort
Effective & Systematic Testing - Systematic
Tests shouldn’t depend on who’s writing them
Effective & Systematic Testing - Principles of Software Testing
- Exhaustive testing is impossible - Prioritise - Effective tests - Know when to stop - not too few and not too many tests - adequate criteria to determine when to stop - Variability - No single technique finds all bugs - use multiple types of tests - Bugs are not uniformly distributed - look at data to search for bugs, not just code - prioritise areas - Not all bugs can be found - set expectations/boundaries for the amount of bugs - Context - amount and type of bugs depend on what you are doing - different types of software produce different bugs - Verification vs Validation - Verification: having the system right (testing the implementation) - Validation: having the right system (do users like the system)
Effective & Systematic Testing - Testing Pyramid
- Manual Tests - System Tests - Integration Tests - Unit Tests
Effective & Systematic Testing - Unit Tests
- very easy - very fast - only individual methods/classes/packages - input some values and check the outputs against expectations - test units individually
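A minimal unit test in the sense above: exercise one unit with some inputs and check the outputs against expectations. The function under test is made up for illustration.

```python
import unittest

def clamp(value: int, low: int, high: int) -> int:
    """Restrict value to the interval [low, high]."""
    return max(low, min(value, high))

class ClampTest(unittest.TestCase):
    def test_value_inside_range(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_value_below_range(self):
        self.assertEqual(clamp(-3, 0, 10), 0)

    def test_value_above_range(self):
        self.assertEqual(clamp(42, 0, 10), 10)

if __name__ == "__main__":
    unittest.main(argv=["clamp_test"], exit=False)
```

Each test case pins down one expectation; because `clamp` has no external dependencies, the tests are fast and fully deterministic.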
Effective & Systematic Testing - Integration Tests
- how do different parts interact - core system and external interfaces
Effective & Systematic Testing - System Tests
- no focus on how the system works - Input should give certain output
Effective & Systematic Testing - Manual Testing
- not automated - manually explore system - used for validation
Effective & Systematic Testing - Pros and Cons between Unit and System Tests
- Unit Tests - fast - easy control - easy writing - thorough - less real - Integration/System Tests - slower - hard to control - hard to write - flaky - more real
Developing for Testability - Testability Definition
How easy it is to write automated tests. Implement the system so tests are easy: - dependency injection - dependency inversion
Developing for Testability - Infrastructure vs Domain
- Domain code is main business logic - Infrastructure is external (Database, Email-Service, etc.)
Developing for Testability - Hexagonal Architecture
- Implement interfaces as ports to which the infrastructure adapts - Domain code doesn't know what's behind a port
Developing for Testability - Dependency Injection
Domain classes only know the interface. We implement a fake interface that simulates the external infrastructure to test domain code more easily. Pros: - fake dependencies - more explicit dependencies - separation of concerns - more extensibility
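A sketch of dependency injection: the domain class only knows the interface, so tests can inject a fake that simulates the infrastructure. All class and method names are made up for illustration.

```python
from typing import Protocol

class MailService(Protocol):            # the "port" the domain code sees
    def send(self, to: str, body: str) -> None: ...

class InvoiceSender:                    # domain code
    def __init__(self, mail: MailService) -> None:
        self.mail = mail                # dependency injected, not created here

    def send_invoice(self, customer: str, amount: int) -> None:
        self.mail.send(customer, f"You owe {amount} EUR")

class FakeMailService:                  # fake simulating the infrastructure
    def __init__(self) -> None:
        self.sent: list[tuple[str, str]] = []

    def send(self, to: str, body: str) -> None:
        self.sent.append((to, body))

fake = FakeMailService()
InvoiceSender(fake).send_invoice("alice@example.com", 100)
print(fake.sent)  # [('alice@example.com', 'You owe 100 EUR')]
```

In production a real SMTP-backed implementation would be injected instead; the domain code is unchanged either way.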
Developing for Testability - Dependency Inversion
- High-level modules (Domain) only know interfaces - Interfaces don't depend on the infrastructure (it's the other way around)
Test Driven Development - Definition
- Read Requirements - Write Tests - Write Code
Test Driven Development - Pros
- Developers can choose their development speed - Instant feedback - Code is more testable - Design feedback
Specification-based and boundary testing - Definition
Uses the program requirements to derive tests
Specification-based and boundary testing - Approach
1. Understand requirements - Behaviour - Inputs - Outputs 2. Explore - call with different parameters 3. Identify Partitions - Input classes that are equivalent - Identify - Look at each variable's value ranges - Look at variable interactions - Look at outputs 4. Analyze Boundaries - Boundaries are where inputs cross from one partition to another (the output behaviour changes) - Each boundary has an **on-point** and multiple **off-points** 5. Devise Tests - Combine partitions - usually test exceptional cases individually (don't combine them with other partitions) 6. Automate Tests - Write the test cases - identify inputs - identify outputs - make tests identifiable 7. Augment Test Suite - make variations
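The steps above can be sketched for a made-up requirement: tickets are free under age 6, half price up to 17, and full price from 18. The partitions, on-points, and off-points in the comments follow the approach described.

```python
def ticket_price(age: int, full_price: int) -> int:
    if age < 0:
        raise ValueError("age must be non-negative")  # exceptional partition
    if age < 6:
        return 0
    if age < 18:
        return full_price // 2
    return full_price

# Partition [0, 5]: free           -> representative age 3
# Partition [6, 17]: half price    -> representative age 12
# Partition [18, ...]: full price  -> representative age 40
assert ticket_price(3, 10) == 0
assert ticket_price(12, 10) == 5
assert ticket_price(40, 10) == 10

# Boundary "age < 6": on-point 6 (condition false), off-point 5
assert ticket_price(6, 10) == 5 and ticket_price(5, 10) == 0
# Boundary "age < 18": on-point 18, off-point 17
assert ticket_price(18, 10) == 10 and ticket_price(17, 10) == 5
```

The exceptional partition (negative age) would be tested on its own rather than combined with the price partitions.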
Structural Testing - Definition
- Using the source-code to make tests - Complements Specification-based and boundary testing
Structural Testing - Approach
1. Read the implementation 2. Identify non-covered parts 3. Understand why they are not tested 4. Write tests
Structural Testing - Control Flow Coverage
- Basic Block - Branch - Path - Loop - Decision Coverage - Condition + Branch - MC/DC
Structural Testing - Basic Block
- a sequence of instructions (that runs in one go, atomic) - one entry - one exit
Structural Testing - Statement Coverage
$$ \text{Statement Coverage} = {\text{Executed Statements} \over \text{All Statements}} $$
Structural Testing - Branch Coverage
The most used criterion in industry $$ \text{Branch Coverage} = {\text{Executed Branches} \over \text{All Branches}} $$
Structural Testing - Path Coverage
A path goes from the entry node to the exit node: all possible ways the code can run. Not useful for input-dependent loops $$ \text{Path Coverage} = {\text{Executed Paths} \over \text{All Paths}} $$
Structural Testing - Loop Coverage
For each loop in the code, test it with 0, 1 and multiple iterations Typically combined with other criteria $$ \text{Loop Coverage} = {\text{Executed Loops with 0, 1, more Iterations} \over \text{All Loops} \cdot 3} $$
Structural Testing - Condition + Branch Coverage
Each condition should evaluate to true and false at least once, and each branch must be taken at least once $$ \text{C + B Coverage} = {\text{Executed Branches} + \text{Condition Values} \over \text{All Branches} + \text{All Condition Values}} $$
Structural Testing - MC/DC
- Make a boolean table for all conditions - Map test cases to the truth assignments - For each condition there must be a pair of test cases (an independence pair) in which only that condition changes between true and false - The two test cases of an independence pair must have different decision outcomes - Repeat for all conditions and then pick a minimal set of test cases that covers all conditions
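Independence pairs can be found by brute force over the truth table. This sketch uses a made-up decision `a and (b or c)`: for each condition, it lists the pairs of assignments that differ only in that condition and flip the outcome.

```python
from itertools import product

def decision(a: bool, b: bool, c: bool) -> bool:
    return a and (b or c)

def independence_pairs(cond_index: int):
    """Pairs of truth assignments differing only in one condition
    with different decision outcomes."""
    pairs = []
    for row in product([False, True], repeat=3):
        flipped = list(row)
        flipped[cond_index] = not flipped[cond_index]
        flipped = tuple(flipped)
        # row < flipped keeps each pair only once
        if decision(*row) != decision(*flipped) and row < flipped:
            pairs.append((row, flipped))
    return pairs

for i, name in enumerate(["a", "b", "c"]):
    print(name, independence_pairs(i))
```

Picking one pair per condition and minimising the union of test cases gives an MC/DC-adequate test suite; for n conditions this typically needs around n+1 tests instead of 2^n.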
Structural Testing - Data Flow Coverage
Control flow coverage is not always feasible; instead, consider paths where a computation affects another computation somewhere else. Data flow coverage complements control flow coverage
Structural Testing - DU-Pairs
- Definition and Use Pairs - A pair of a definition node and a usage node such that there is no other definition $n_i$ of the same variable in between - The def-clear path is the path between those two nodes $$ \text{DU Coverage} = {\text{Executed DU Pairs} \over \text{All DU Pairs}} $$
Structural Testing - Determining DU Pairs
- For each basic block determine the reaching definition sets - ReachIn = all definitions that reach the current block from its predecessors - ReachOut = ReachIn minus the definitions overwritten by the current block, plus the block's own definitions - A DU pair for a definition in block $i$ exists if the definition is in the ReachIn set of a block that uses it
Structural Testing - Measuring DU-Pairs Coverage
Two maps: - $\text{defCover}: \text{Variable} \mapsto \mathbb{N}$ - defCover["x"] = … - $\text{useCover}: \text{Variable} \times \mathbb{N} \times \mathbb{N} \mapsto \mathbb{N}$ - useCover["x", defCover["x"], …]++
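A hand-instrumented sketch of the two maps: defCover remembers, per variable, which definition was executed last, and useCover counts executed (definition, use) pairs. The function and the definition/use IDs are made up for illustration; real tools insert these updates automatically.

```python
from collections import defaultdict

defCover: dict[str, int] = {}
useCover: dict[tuple[str, int, int], int] = defaultdict(int)

def max_of(a: int, b: int) -> int:
    defCover["m"] = 1                  # def 1 of m
    m = a
    if b > a:
        defCover["m"] = 2              # def 2 of m
        m = b
    useCover[("m", defCover["m"], 1)] += 1   # use 1 of m (the return)
    return m

max_of(1, 2)   # executes DU pair (def 2, use 1)
max_of(2, 1)   # executes DU pair (def 1, use 1)
print(dict(useCover))
```

With both DU pairs of `m` executed, DU coverage for this function is 2/2.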
Property-based testing - Motivation
- rather than focusing on single examples (example-based testing), focus on types of values and properties - let the framework search for counterexamples - explores the input domain better
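A standard-library-only sketch of the idea; real frameworks such as Hypothesis generate data more cleverly and shrink failing inputs. The harness and the unit under test are made up for illustration.

```python
import random

def my_sort(xs):             # unit under test (here: a thin wrapper)
    return sorted(xs)

def check_property(prop, trials=200, seed=42):
    """Generate random integer lists and search for a counterexample."""
    rng = random.Random(seed)
    for _ in range(trials):
        xs = [rng.randint(-100, 100) for _ in range(rng.randint(0, 20))]
        if not prop(xs):
            return xs        # counterexample found
    return None              # property held for all generated inputs

# Properties of sorting: output has the same length and is ordered.
assert check_property(lambda xs: len(my_sort(xs)) == len(xs)) is None
assert check_property(
    lambda xs: all(a <= b for a, b in zip(my_sort(xs), my_sort(xs)[1:]))
) is None
```

Instead of hand-picking example inputs, we state what must hold for *all* inputs and let the harness explore the input domain.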
Property-based testing - Issues
- more complex - more creativity - generating data is expensive or impossible - boundaries need to be expressed correctly - input may not always be fairly distributed
Test doubles and mocks - Motivation
- we only want to test core code - external dependencies hinder testing - doubles mimic dependencies for testing (give us control over the interface) - implement fake dependencies
Test doubles and mocks - Pros
- more control - faster simulations - more conscious about class interactions
Test doubles and mocks - Cons
- less realistic - difficult on large scale - doubles need to be updated too when contracts change - more coupling between code and tests
Test doubles and mocks - Dummy Objects
Objects passed to the class but never used - just filler
Test doubles and mocks - Fake Objects
Working implementations that simulate the real behaviour
Test doubles and mocks - Stubs
Don't really work - just hard-coded answers
Test doubles and mocks - Mocks
Like Stubs but also log the interactions
Test doubles and mocks - Spies
Wrappers around real working dependencies that log interactions
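A sketch of a stub and a mock using `unittest.mock`; the collaborating classes are made up. A `Mock` can act as a stub (hard-coded answers) and, as described above, also logs the interactions so they can be verified.

```python
from unittest.mock import Mock

class PaymentGateway:                  # real dependency (unused in the test)
    def charge(self, customer: str, amount: int) -> bool:
        raise NotImplementedError("talks to external infrastructure")

def checkout(gateway, customer: str, amount: int) -> str:
    return "paid" if gateway.charge(customer, amount) else "declined"

gateway = Mock(spec=PaymentGateway)
gateway.charge.return_value = True     # stub aspect: hard-coded answer

result = checkout(gateway, "alice", 100)
print(result)                          # paid

# Mock aspect: the interaction was logged and can be verified.
gateway.charge.assert_called_once_with("alice", 100)
```

`spec=PaymentGateway` keeps the double honest: accessing an attribute the real class doesn't have raises an error, which helps when the contract of the dependency changes.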
Test doubles and mocks - When to use them
- dependencies too slow - dependencies on external infrastructure - hard to simulate cases - non-deterministic behavior