Software metric
A software metric is a standard of measure of the degree to which a software system or process possesses some property. Although a metric is not strictly a measurement (metrics are functions, while measurements are the numbers obtained by applying them), the two terms are often used as synonyms. Since quantitative measurements are essential in all sciences, there is a continuous effort by computer science practitioners and theoreticians to bring similar approaches to software development. The goal is to obtain objective, reproducible and quantifiable measurements, which may have numerous valuable applications in schedule and budget planning, cost estimation, quality assurance, testing, software debugging, software performance optimization, and optimal personnel task assignments.
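To make the distinction above concrete, the following minimal sketch (an illustration added here, not part of any standard; the function name and sample module are hypothetical) treats a lines-of-code metric as a function, and the number it returns for a particular artifact as the measurement.

```python
# A minimal sketch: the metric is a function, a measurement is the number it
# returns when applied to one piece of software.

def lines_of_code(source: str) -> int:
    """A very simple size metric: count non-blank source lines."""
    return sum(1 for line in source.splitlines() if line.strip())

example_module = """\
def add(a, b):
    return a + b
"""

measurement = lines_of_code(example_module)  # the measurement is the number 2
print(measurement)
```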
Common software measurements
Common software measurements include:
- Balanced scorecard
- Bugs per line of code
- Code coverage
- Cohesion
- Comment density[1]
- Connascent software components
- COSMIC Function Points
- Constructive Cost Model
- Coupling
- Cyclomatic complexity (McCabe's complexity); an approximate computation is sketched after this list
- Defect density - defects found per unit of size of a component (for example, defects per thousand lines of code)
- Defect potential - expected number of defects in a particular component
- Defect removal rate
- DSQI (design structure quality index)
- Function Points and Automated Function Points, an Object Management Group standard[2]
- Halstead Complexity
- Instruction path length
- Maintainability index
- Number of lines of code
- Program execution time
- Program load time
- Program size (binary)
- Weighted Micro Function Points
- CISQ automated quality characteristics measures
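As an illustration of how some of the measurements listed above can be computed, the sketch below approximates comment density and McCabe's cyclomatic complexity for a fragment of Python source. It is a rough, hedged sketch: the helper names are hypothetical, and the keyword-counting estimate of cyclomatic complexity ignores strings, comments and other language-specific subtleties that a real analyzer would handle.

```python
# Hedged sketch: approximate two of the measurements listed above for a
# fragment of Python source. The decision-keyword list is an illustrative
# simplification, not a full McCabe analysis.
import re

def comment_density(source: str) -> float:
    """Comment lines divided by total non-blank lines."""
    lines = [l for l in source.splitlines() if l.strip()]
    comments = [l for l in lines if l.strip().startswith("#")]
    return len(comments) / len(lines) if lines else 0.0

def approx_cyclomatic_complexity(source: str) -> int:
    """Rough McCabe estimate: 1 + number of branching keywords/operators."""
    decision_points = re.findall(
        r"\b(if|elif|for|while|and|or|except|case)\b", source)
    return 1 + len(decision_points)

sample = """\
# Return the larger of two numbers.
def biggest(a, b):
    if a > b:
        return a
    return b
"""

print(comment_density(sample))               # 0.2 (1 comment line out of 5)
print(approx_cyclomatic_complexity(sample))  # 2 (one decision point)
```

In practice such measurements are produced by dedicated static-analysis tools rather than ad hoc scripts like this one.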
Limitations
As software development is a complex process, with high variance in both methodologies and objectives, it is difficult to define or measure software qualities and quantities and to determine a valid and concurrent measurement metric, especially when making such a prediction prior to detailed design. Another source of difficulty and debate is determining which metrics matter, and what they mean.[3][4] The practical utility of software measurements has therefore been limited to the following domains:
- Scheduling
- Software sizing
- Programming complexity
- Software development effort estimation (see the Basic COCOMO sketch below)
- Software quality
A specific measurement may target one or more of the above aspects, or the balance between them, for example as an indicator of team motivation or project performance.
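For example, effort estimation with the Constructive Cost Model listed above combines a size measurement with empirically derived coefficients. The sketch below implements the Basic COCOMO effort formula Effort = a × KLOC^b (person-months) using the published organic, semi-detached and embedded coefficient pairs; the function name is illustrative.

```python
# Hedged sketch of Basic COCOMO effort estimation. The coefficients are the
# published Basic COCOMO values; everything else here is illustrative.

COEFFICIENTS = {
    # mode: (a, b) in Effort = a * KLOC ** b (person-months)
    "organic": (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded": (3.6, 1.20),
}

def basic_cocomo_effort(kloc: float, mode: str = "organic") -> float:
    """Estimate development effort in person-months from size in KLOC."""
    a, b = COEFFICIENTS[mode]
    return a * kloc ** b

print(round(basic_cocomo_effort(32, "organic"), 1))  # roughly 91 person-months
```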
Acceptance and public opinion
Some software development practitioners point out that simplistic measurements can cause more harm than good.[5] Others have noted that metrics have become an integral part of the software development process.[3] The impact of measurement on programmer psychology has raised concerns about harmful effects on performance due to stress, performance anxiety and attempts to cheat the metrics, while others find that it has a positive impact on the value developers place on their own work and prevents them from being undervalued.[6] Some argue that the definitions of many measurement methodologies are imprecise, and consequently it is often unclear how tools for computing them arrive at a particular result,[7] while others argue that imperfect quantification is better than none ("You can't control what you can't measure.").[8] Evidence shows that software metrics are widely used by government agencies, the US military, NASA,[9] IT consultants, academic institutions,[10] and commercial and academic development estimation software.
See also
- Goal-Question-Metric
- List of tools for static code analysis
- Orthogonal Defect Classification
- Software crisis
- Software engineering
- Software package metrics
References
- ^ "Descriptive Information (DI) Metric Thresholds". Land Software Engineering Centre. Archived from the original on 6 July 2011. Retrieved 19 October 2010.
- ^ "OMG Adopts Automated Function Point Specification". Omg.org. 2013-01-17. Retrieved 2013-05-19.
- ^ a b Binstock, Andrew. "Integration Watch: Using metrics effectively". SD Times. BZ Media. Retrieved 19 October 2010.
- ^ Kolawa, Adam. "When, Why, and How: Code Analysis". The Code Project. Retrieved 19 October 2010.
- ^ Kaner, Dr. Cem, Software Engineer Metrics: What do they measure and how do we know?, CiteSeerX 10.1.1.1.2542
- ^ "ProjectCodeMeter (2010) "ProjectCodeMeter Users Manual" page 65" (PDF). Retrieved 2013-05-19.
- ^ Lincke, Rüdiger; Lundberg, Jonas; Löwe, Welf (2008), "Comparing software metrics tools" (PDF), International Symposium on Software Testing and Analysis 2008, pp. 131–142
- ^ DeMarco, Tom. Controlling Software Projects: Management, Measurement and Estimation. ISBN 0-13-171711-1.
- ^ "NASA Metrics Planning and Reporting Working Group (MPARWG)". Earthdata.nasa.gov. Archived from the original on 2011-10-22. Retrieved 2013-05-19.
- ^ "USC Center for Systems and Software Engineering". Sunset.usc.edu. Retrieved 2013-05-19.