Author: admin

Date: 2008-09-17T13:53:28.000000Z


git-svn-id: https://svn.eiffel.com/eiffel-org/trunk@3 abb3cda0-5349-4a8f-a601-0c33ac3a8c38
This commit is contained in:
jfiat
2008-09-17 13:53:28 +00:00
parent 4fee9356ea
commit 2ee31ab9c7
763 changed files with 36576 additions and 0 deletions


@@ -0,0 +1,15 @@
[[Property:title|Detailed Result Panel]]
[[Property:weight|1]]
The Detailed result panel is where detailed metric results and archive comparison results are displayed. The following two figures show a detailed metric result and an archive comparison result.
Detailed metric result:
[[Image:interface4|Defining an input domain]]
Archive comparison result:
[[Image:interface5|Defining an input domain]]


@@ -0,0 +1,12 @@
[[Property:title|User interface basics]]
[[Property:weight|3]]
EiffelStudio includes a Metric tool based on the previously defined metric theory. This tool provides many facilities, such as computing measures over a project or over smaller scopes, defining new metrics according to users' needs, and handling archives to compare projects. The tool's interface is organized into the following panels:
* [[Metric Evaluation Panel|Metric Evaluation panel]]
* [[Detailed Result Panel|Detailed Result panel]]
* [[Metric Definition Panel|Metric Definition panel]]
* [[Metric History Panel|Metric History panel]]
* [[Metric Archive Panel|Metric Archive panel]]


@@ -0,0 +1,33 @@
[[Property:title|Metric Archive Panel]]
[[Property:weight|4]]
The metric archive panel is used to calculate and restore metric archives and to compare archives.
==Metric Archive Calculation==
Let's have a look at the buttons and options related to metric archive calculation:
[[Image:metrics-tool--debug-run-icon|calculate archive]] Start metric archive calculation<br/>
After selecting the metrics you want to archive and the input domain, use this button to start a metric archive calculation.
[[Image:metrics-tool--debug-stop-icon|stop archive]] Stop metric archive calculation<br/>
Use this button to stop a metric archive calculation.
[[Image:interface22|archive file location]] Specify archive file<br/>
The file to store metric archive results is specified here.
[[Image:interface23|reset archive]] Reset archive file<br/>
If the specified archive file already contains some archive information, this option will be sensitive. If it is enabled, the information contained in that archive file will be cleared before new archive information is written to that file; otherwise, new archive information will be appended to that file.
==Metric Archive Comparison==
After you specify two archive files, you can compare them, as shown in the following image:
[[Image:interface24|Compare archives]]
The archive comparison result is shown in the following figure:
[[Image:interface25|archive comparison result]]


@@ -0,0 +1,69 @@
[[Property:title|Metric Definition Panel]]
[[Property:weight|2]]
In the metric definition panel, you can do the following:
* Manage user-defined metrics, such as defining new metrics and modifying or removing existing ones. Metrics with a small lock in their icons are predefined metrics, which cannot be changed or removed
* Import metrics from other systems
* Back up user-defined metrics
The following figure shows the layout of the metric definition panel:
[[Image:interface6|Metric definition panel]]
Let's have a look at the buttons in the main toolbar, which is highlighted in the following figure:
[[Image:interface7|Main toolbar buttons]]
[[Image:metrics-tool--new-metric-icon|New metric]]
[[Image:metrics-tool--new-document-icon|Clone metric]]
[[Image:metrics-tool--general-remove-icon|Remove metric]]
[[Image:metrics-tool--general-save-icon|Save metric]]
[[Image:metrics-tool--command-send-to-external-editor-icon|External editor]]
[[Image:metrics-tool--general-open-icon|Reload metrics]]
[[Image:metrics-tool--metric-export-to-file-icon|Import]]
==Define New Metrics==
To define a new metric, you need to choose the metric type and, for basic and linear metrics, the unit. The following figure shows how to choose metric type and unit:
[[Image:interface8|Choose metric type]]
The following figure shows a new basic class metric:
[[Image:interface9|basic metric]]
When defining basic metrics, you can press Ctrl+Space in a cell of the criterion column (the first column in the "definition" area) to get a list of all applicable criteria. You can also type "and" or "or" in that cell to get criterion connectors, and you can put "not" in front of a criterion name to get its negation. After typing the criterion name and pressing Enter, if the criterion needs further setup, the property cell of that criterion will be highlighted. For a domain criterion such as ancestor_is or caller_is, you can pick an item and drop it into this property cell.
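As an illustration, a basic class metric counting the effective descendants of COMPARABLE could be defined with criterion rows along the following lines. This is only a sketch: ancestor_is is described above, but the is_deferred criterion name is an assumption here, and the authoritative list of applicable criteria is whatever Ctrl+Space shows for your metric unit.
 criterion             property
 ancestor_is           COMPARABLE   (pick the class COMPARABLE and drop it into the property cell)
 and not is_deferred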
The following figure shows a new linear feature metric:
[[Image:interface10|linear metric]]
For every metric referenced in a linear metric, you need to specify a coefficient. You can pick a metric and drop it into a cell of the "Metrics" column in the "Metric Definition" area.
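As a numeric illustration (the coefficients and values below are made up, and a predefined '''Features''' metric is assumed alongside '''Classes'''), a linear metric defined as 1 * '''Classes''' + 0.1 * '''Features''' evaluates as follows over an input domain containing 242 classes and 2000 features; the value of a linear metric is simply the sum of each referenced metric's value multiplied by its coefficient:
 1 * 242  +  0.1 * 2000  =  442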
The following figure shows a new ratio metric:
[[Image:interface11|ratio metric]]
For the numerator or denominator metric, you need to specify a coefficient. When the denominator part evaluates to zero, the result of the ratio metric will be "Undefined". You can pick a metric and drop it into the numerator or denominator metric area.
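Continuing the illustration with made-up values (and assuming '''Uncommented features''' and '''Features''' metrics), a ratio metric defined as (1 * '''Uncommented features''') / (1 * '''Features''') evaluates as follows over a domain with 2000 features of which one is uncommented; over a domain containing no features at all, the denominator is zero and the result is "Undefined":
 (1 * 1) / (1 * 2000)  =  0.0005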
==Import Metrics==
In order to reuse metrics defined in a different system, you need to import them into the current system. The following figure shows how to import metrics:
[[Image:interface12|Import metrics]]
==Backup User-defined Metrics==
The following figure shows how to back up user-defined metrics:
[[Image:interface13|Backup metrics]]


@@ -0,0 +1,38 @@
[[Property:title|Metric Evaluation Panel]]
[[Property:weight|0]]
The Metric Evaluation panel is the place to do metric evaluation. After selecting a metric from the "Select metric" area and setting an input domain in the "Setup input domain" area, you can click the run button to start metric evaluation. You can start metric evaluation with an empty input domain, but the result will always be zero.
[[Image:interface1|Defining an input domain]]
Let's first have a look at the buttons in the main toolbar; see the following figure, in which the main toolbar is highlighted:
[[Image:interface2|Defining an input domain]]
[[Image:metrics-tool--debug-run-icon|Defining an input domain]] Start metric evaluation <br/>
Press this button to start evaluating the currently selected metric.
[[Image:metrics-tool--debug-stop-icon|Defining an input domain]] Stop metric evaluation<br/>
Press this button to terminate a running metric evaluation.
[[Image:metrics-tool--metric-send-to-archive-icon|Defining an input domain]] Send last result to metric history<br/>
After a metric evaluation this button will be sensitive, and clicking it will record the evaluated metric as well as its input domain and result in the metric history. This makes it easy to evaluate the metric again later and lets you compare different metric runs.
[[Image:metrics-tool--metric-run-and-show-details-icon|Defining an input domain]] Keep detailed result when evaluating metric<br/>
Normally, evaluating a metric will give you a number as the result, but sometimes you want to investigate the items which make up that value. For example, evaluating the '''Classes''' metric over the base library gives you 242, which means there are 242 classes in the base library, and sometimes you want to know which classes they are. With this option enabled, you'll get a detailed result listed in the detailed result panel after a metric evaluation. This option only has an effect when evaluating basic metrics, because the detailed result has no meaning for derived metrics (linear or ratio metrics) in general: suppose you have a linear metric defined as 5 * Classes; then the notion of a detailed result is meaningless. Another reason to use this option is performance: keeping a detailed result can be quite expensive in some cases, such as when you calculate the metric '''Lines of code''' for a large system, which may result in hundreds of thousands of lines in the result. Turning it off in such a case may be a good idea.
[[Image:metrics-tool--metric-filter-icon|Defining an input domain]] Filter result which is not visible from input domain<br/>
If this option is enabled, all items which are not visible from the input domain will be filtered out. For the definition of "non visible" items, please see the documentation of the criterion is_visible.
[[Image:metrics-tool--context-sync-icon|Defining an input domain]] Automatically go to result panel after metric evaluation<br/>
If this option is enabled, the metric tool will switch to the detailed result panel after a metric evaluation.
[[Image:metrics-tool--metric-quick-icon|Defining an input domain]] Define quick metric<br/>
Sometimes you want to calculate a metric which is not defined already, for example to find a feature which is named "foo". This may be just a one-time thing, so there is no need to go to the metric definition panel, define and save a metric, and then go back to the evaluation panel and run it. Quick metrics are designed for this situation: you can define any basic metric in the quick metric definition area, which works the same way as the basic metric definition area in the metric definition panel. Just define your metric and run it.
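For the "foo" example, a quick feature metric with a single criterion row is enough. The sketch below uses name_is as the criterion name, which is an assumption; Ctrl+Space in the criterion cell shows the criteria that are actually available.
 unit:        feature
 criterion    property
 name_is      foo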
In the following figure, a defined quick metric is shown:
[[Image:interface3|Defining an input domain]]


@@ -0,0 +1,65 @@
[[Property:title|Metric History Panel]]
[[Property:weight|3]]
The metric history panel lists all recorded metric evaluations. You can select them and re-evaluate them to see the new value and whether it differs from the old result.
Let's have a look at the buttons in the main toolbar, highlighted in the following figure:
[[Image:interface14|Metric history panel]]
[[Image:metrics-tool--debug-run-icon|Run history]]
[[Image:metrics-tool--debug-stop-icon|Stop history]]
[[Image:metrics-tool--metric-run-and-show-details-icon|Keep detailed result]]
[[Image:metrics-tool--metric-unit-assertion-icon|Check warning]]
[[Image:metrics-tool--general-reset-icon|Remove detailed result]]
[[Image:metrics-tool--general-remove-icon|Remove history]]
[[Image:metrics-tool--metric-group-icon|Show tree]]
[[Image:interface16|Hide old metrics]]
[[Image:select-all|select all history]]
[[Image:deselect-all|deselect all history]]
[[Image:select-recalculatable|select all recalculatable]]
[[Image:deselect-recalculatable|deselect all recalculatable]]
==Recalculate Metric History==
To recalculate metric history items, you need to select the items that you want to recalculate. In the following figure, the metric history item '''Uncommented features''' is selected.
[[Image:interface17|Select metric history]]
After recalculating the selected metric history items, the result will be highlighted, as shown in the following figure:
[[Image:interface18|History recalculation result]]
In the above figure, the row '''Uncommented features''' is highlighted, indicating that this item has been recalculated. From the row we can see that the current value is 1 while the previous value is 0, meaning that there is now one uncommented feature in the cluster sample, while there was no uncommented feature when this metric was last calculated.
==Metric History Warning Checking==
Another thing you can do in the metric history is to assign a warning tester to each item. When the metric history is recalculated with metric history warning checking enabled, the warning tester will be evaluated against the metric value to see whether its condition is satisfied.
Let's use an example to demonstrate the idea. Suppose we have set up a metric history warning as shown in the following two figures:
[[Image:interface20|warning tester]]
[[Image:interface19|warning tester]]
This warning means that when the metric '''Uncommented features''' is calculated over the input domain {sample}, the value should be zero; otherwise a warning should be emitted.
After recalculating the metric history item, we get the following result:
[[Image:interface21|warning tester]]
From the above result, we can see that the value of the metric '''Uncommented features''' over the input domain {sample} is 1, while our warning says it should be 0. So we get a warning message.