Cyclomatic complexity using jar files

Source: https://www.devze.com 2022-12-16 14:08
I am working on a project which requires me to find the cyclomatic complexity of Apache Ant (versions 1.1 through 1.6). I have been asked to use the jar files for this purpose. I used a couple of tools (Xdepend trial version and Cyvis) to have a look at the results. Then I tried to validate those results against results from the source code of Ant 1.6. For analysing the source I used a NetBeans plugin and also manually found the CC of some methods.

What I found was that in many cases the CC from the jar files was almost the same, but in some cases there was a large discrepancy. I examined one such method and found that it contained quite a few try and catch blocks. My questions are:

  1. Does the Java compiler carry out optimisations (say, loop unwinding) which may significantly affect the CC value? Is it advisable to use jar files for such an analysis?
  2. Is there some specific problem with try and catch blocks, in which case I may consider other methods for analysis?
  3. Are there any better (more accurate) tools for such an analysis?
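To make question 2 concrete, here is a simplified, hypothetical sketch (not the actual Ant method) of the kind of method where the counts diverged. Counting each if, loop, and catch clause as a decision point gives a source-level CC of 4; a tool that ignores catch clauses would report 3:

```java
// Hypothetical sketch, not real Ant code. Source-level CC:
// base 1 + catch 1 + if 1 + for 1 = 4 (3 if catch is not counted).
public class ParseSketch {
    public static int parsePositive(String s) {
        int value;
        try {
            value = Integer.parseInt(s);     // may throw NumberFormatException
        } catch (NumberFormatException e) {  // decision point (for some tools)
            return -1;
        }
        if (value < 0) {                     // decision point
            return -1;
        }
        int sum = 0;
        for (int i = 0; i < value; i++) {    // decision point
            sum += i;
        }
        return sum;
    }
}
```

Whether a bytecode analyser sees the catch clause as one branch, several exception-table edges, or none at all could by itself explain the discrepancy I observed.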

Kindly share your experience on this topic. Thanks in advance.

Cheers


Complexity measures like cyclomatic complexity are about measuring the structure of a program as it affects a programmer's ability to understand the codebase, e.g. to maintain it. Since programmers view code at the source level, measures of source code complexity are what really matter.

If the measures you are getting by analysing bytecode files are different from regular source code measures, they are worthless and you should abandon the idea.

(It wouldn't surprise me if CC measures from bytecode files were different. On the one hand, a compiler can reorganize code to make it appear simpler, e.g. by unrolling loops. On the other hand, a compiler may need to generate complex-looking bytecode sequences for simple language constructs, simply because of limitations in what bytecodes can express.)
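To illustrate the loop-unrolling point with a hand-written sketch (equivalent Java, not actual compiler output): unrolling a fixed-count loop removes the loop's back-edge, and with it one decision point, so the two methods below would measure differently even though they compute the same value.

```java
public class UnrollSketch {
    // Loop form: CC = 2 (base 1 + loop condition).
    public static int sumOfSquaresLoop() {
        int sum = 0;
        for (int i = 1; i <= 4; i++) {  // one decision point
            sum += i * i;
        }
        return sum;
    }

    // Fully unrolled form: straight-line code, CC = 1.
    public static int sumOfSquaresUnrolled() {
        return 1 * 1 + 2 * 2 + 3 * 3 + 4 * 4;  // no decision points
    }
}
```

The same transformation in the other direction (source construct expanding into branchy bytecode) pushes the measured CC up instead of down.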


This is not a full answer to your question.

The Java compiler will carry out many optimisations, like any other good compiler. These include loop-invariant code motion, common subexpression elimination, strength reduction and variable allocation.
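As a hand-written illustration of two of these (equivalent Java, not actual compiler output), loop-invariant code motion and common subexpression elimination rewrite a loop body without changing its result:

```java
public class OptSketch {
    // Before: recomputes the invariant n * n every iteration
    // and reads the subexpression a[i] twice.
    public static int before(int[] a, int n) {
        int total = 0;
        for (int i = 0; i < a.length; i++) {
            total += a[i] * (n * n) + a[i];
        }
        return total;
    }

    // After: invariant hoisted out of the loop,
    // common subexpression computed once per iteration.
    public static int after(int[] a, int n) {
        int total = 0;
        int nSquared = n * n;      // loop-invariant code motion
        for (int i = 0; i < a.length; i++) {
            int ai = a[i];         // common subexpression elimination
            total += ai * nSquared + ai;
        }
        return total;
    }
}
```

The rewritten body is structurally different code even though the results are identical, which is exactly why metrics taken after compilation can drift from source-level measurements.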

But HotSpot, the Java bytecode execution engine, will dynamically compile bytecode to optimise it at runtime.

So there are many factors that can skew the source-code CC relative to the bytecode CC. But this doesn't mean that improving the CC is not a valuable job: CC evaluation of source code is vital to maintainable, clean code.

