AHP Ontology Evaluation System and User Guide
About AHP Ontology Evaluation System

About the tool. The AHP-based ontology evaluation system is decision-aiding software that automatically evaluates ontologies based on their complex characteristics. The user specifies, in a simple manner, the importance of each criterion, as well as the domain the desired ontology should cover. We use a hierarchical model of independent characteristics that describe ontologies. The hierarchy is used for analyzing the problem from different perspectives and at different abstraction levels. The decision is based on the concrete end-node measurements and their relative importance at more abstract levels.
User's manual (pdf)
Paper: A. Groza, I. Dragoste, I. Sincai and I. Jimborean -  An ontology selection and ranking system based on analytical hierarchy process, 16th International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (SYNASC2014), Timisoara, Romania, 22-25 September 2014 (preprint pdf) (presentation)

Installation Guide

This section describes the prerequisites for running the application. The application has been developed using Java SE Development Kit 7 on a Windows x64 operating system, using the Eclipse IDE.

It is a desktop application which relies on an internet connection to update ontology measurements based on their URIs, but it can also function without an internet connection by using a local ontology repository, which may not be up to date.

Software dependencies
  • In order to run the application on a Windows operating system, you need to have a Java Runtime Environment version 1.7 or higher installed.

  • The dictionary application WordNet 2.1 needs to be installed prior to running the application, for access to the dictionary knowledge base. The path of the WordNet installation home directory (e.g. "d:/jde/WordNet") needs to be set as a system variable under the name WNHOME. This system variable name is set in the application as a static field in the ApplicationConstants class, from where it can be edited.

  • The application requires communication with a running instance of MySQL Server, version 5.6 or higher. A local server connection (jdbc:mysql://localhost:3306/) needs to exist with the following credentials:
    username - root; password - root

    The database schema ontologies needs to be imported on the above server instance from the dump file located at project_environment/prerequisites/ontologies_database_schema.sql. The application database connection can be reconfigured by editing the file META-INF/persistence.xml.

  • File Resources
    The file structure of the project_environment folder contains the file resources needed to run the application.
    • prerequisites folder contains the database dump needed for creating and loading the database prior to running the application.
    • The ontology evaluation files folder contains the initial .pwc file which loads the AHP decision problem. It also contains problem files with pre-filled pairwise comparisons at different degrees of inconsistency (consistent1.pwc and consistent2.pwc, medium inconsistency, high inconsistency, and the demo file used as the example in this chapter's screenshots), which can be imported using the AHP evaluation module GUI.
    • The local_ontology_repo folder contains the downloaded ontology files. They are used by the system as a local backup when the corresponding online resources are not available or the internet connection is disabled.
    • The AHP Ontology Evaluation System folder contains the runnable Java application.
    If one desires to alter the resources file structure, the constants of the ApplicationConstants class need to be updated with the new file paths: ONTOLOGY_LOCAL_REPOSITORY_FOLDER, EVALUATION_FILES, REPORTS_FILES.
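The WNHOME lookup described above can be sanity-checked from Java before launching the application. A minimal sketch; the resolve helper and the fallback path are hypothetical and not part of the tool:

```java
import java.util.HashMap;
import java.util.Map;

public class WordNetHomeCheck {
    // Resolve the WordNet home directory from an environment map,
    // falling back to a caller-supplied default (hypothetical helper).
    static String resolve(Map<String, String> env, String fallback) {
        String home = env.get("WNHOME");
        return (home == null || home.isEmpty()) ? fallback : home;
    }

    public static void main(String[] args) {
        Map<String, String> env = new HashMap<String, String>();
        env.put("WNHOME", "d:/jde/WordNet");
        // In the running application this would use System.getenv().
        System.out.println(resolve(env, "c:/WordNet"));
    }
}
```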
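A minimal persistence.xml fragment matching the database connection settings described above might look as follows. The persistence-unit name is a placeholder and the properties use the standard JPA 2.0 names; the file shipped with the application may differ:

```xml
<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="2.0">
  <persistence-unit name="ontologies-unit"> <!-- hypothetical unit name -->
    <properties>
      <!-- Standard JPA 2.0 connection properties -->
      <property name="javax.persistence.jdbc.driver" value="com.mysql.jdbc.Driver"/>
      <property name="javax.persistence.jdbc.url" value="jdbc:mysql://localhost:3306/ontologies"/>
      <property name="javax.persistence.jdbc.user" value="root"/>
      <property name="javax.persistence.jdbc.password" value="root"/>
    </properties>
  </persistence-unit>
</persistence>
```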
User's Manual
This section describes the main functionality of AHP Ontology Evaluation System.

View ontology measurements.
Before and while running the application, the user can consult MeasurementReport.pdf from the reports folder, in order to become familiar with the available ontologies and their most recent measurements. The application opens with the domain coverage screen (Figure 1), where the user can evaluate the knowledge coverage of the available ontologies for a given domain and preselect a subset for AHP evaluation.

Figure 1. Computing the domain coverage.

Define domain using synonyms.
Write a noun inside the Concept(noun) text field and click the Get Synonyms button to get the list of synonyms, grouped by word meaning, in the right panel. If you wish to add the current Concept(noun) to the domain definition, press Add Concept to Search Terms List. If you wish to add some of the suggested synonyms to the search term, write them one by one in the Synonym text field and press Add synonym to concept. The user can reset the Search Terms List that defines the domain by clicking the Reset button.
Add new concepts by repeating the steps above. As can be seen in Figure 2, some concepts may have different word meanings; the synonyms corresponding to the desired sense must be selected. The user can add concepts to the Search Terms List without adding synonyms for them, but this decreases the chances of finding classes corresponding to those concepts.

Figure 2. Selecting synonyms from WordNet

Calculate domain coverage.
When you consider the Search Terms List complete, press Done. This is the most time-consuming step of the application; you must wait until processing completes.
Once the domain coverage processing has completed, the dialog box shown in Figure 3 appears. The user can consult the generated DomainCoverageReport.pdf from the reports folder to see the values for each ontology.
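As an illustration of what such a coverage value can represent, the sketch below computes the fraction of search terms matched by an ontology's class names. This is only an assumed measure for explanatory purposes; the exact formula used by the system is defined in the SYNASC 2014 paper:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class DomainCoverageSketch {
    // Illustrative coverage: the fraction of search terms for which the term
    // itself or one of its synonyms matches an ontology class name.
    static double coverage(List<Set<String>> searchTerms, Set<String> classes) {
        if (searchTerms.isEmpty()) return 0;
        int matched = 0;
        for (Set<String> term : searchTerms) {
            for (String word : term) {
                if (classes.contains(word)) { matched++; break; }
            }
        }
        return (double) matched / searchTerms.size();
    }

    public static void main(String[] args) {
        List<Set<String>> terms = Arrays.<Set<String>>asList(
                new HashSet<String>(Arrays.asList("car", "automobile")),
                new HashSet<String>(Arrays.asList("engine")),
                new HashSet<String>(Arrays.asList("driver", "motorist")));
        Set<String> classes = new HashSet<String>(Arrays.asList("automobile", "engine", "wheel"));
        System.out.println(coverage(terms, classes)); // 2 of 3 terms matched
    }
}
```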

Figure 3. Domain Coverage Screen

Pre-select ontologies for AHP evaluation.
In order to pre-select ontologies for an AHP evaluation, input the minimum domain coverage value in the dialog text field and press Ok. The default value 0 selects all ontologies for the next step. By pressing Cancel, the user can proceed with a new domain coverage evaluation.

View criteria hierarchy.
Clicking Ok with a valid input loads the AHP evaluation screen, presented in Figure 4. The Stimuli panel presents the children of the element selected in the Criteria panel. The Judgements section presents the matrix of pairwise comparisons corresponding to the selected criterion, which needs to be completed. To the left are the inconsistency measurements. The sub-criteria that need to be compared can also be visualized via Decision Aid -> Graph View and Equalizer View. The inconsistency measurements are the Consistency Ratio (CR), Consistency Measure (CM) and Congruence (Θ) for cardinal inconsistency, and the Number of Three-way Cycles (L) and Dissonance (Ψ) for ordinal inconsistency.
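For reference, CR follows Saaty's standard definition: CR = CI/RI, where CI = (λmax − n)/(n − 1) and RI is the random index for matrices of size n. A minimal sketch using the common geometric-mean approximation of the priority vector; the system's internal computation may differ:

```java
public class ConsistencyRatio {
    // Saaty's random index (RI), indexed by matrix size n = 1..10.
    static final double[] RI =
            {0, 0, 0, 0.58, 0.90, 1.12, 1.24, 1.32, 1.41, 1.45, 1.49};

    // Priority weights via the geometric mean of each row (a common
    // approximation of the principal eigenvector).
    static double[] weights(double[][] a) {
        int n = a.length;
        double[] w = new double[n];
        double sum = 0;
        for (int i = 0; i < n; i++) {
            double product = 1;
            for (int j = 0; j < n; j++) product *= a[i][j];
            w[i] = Math.pow(product, 1.0 / n);
            sum += w[i];
        }
        for (int i = 0; i < n; i++) w[i] /= sum;
        return w;
    }

    // CR = CI / RI with CI = (lambdaMax - n) / (n - 1); defined for n >= 3.
    static double consistencyRatio(double[][] a) {
        int n = a.length;
        double[] w = weights(a);
        double lambdaMax = 0;
        for (int i = 0; i < n; i++) {
            double rowDotW = 0;
            for (int j = 0; j < n; j++) rowDotW += a[i][j] * w[j];
            lambdaMax += rowDotW / w[i];
        }
        lambdaMax /= n;
        return ((lambdaMax - n) / (n - 1)) / RI[n];
    }

    public static void main(String[] args) {
        // A perfectly consistent 3x3 matrix (a_ik = a_ij * a_jk): CR is ~0.
        double[][] m = {{1, 2, 4}, {0.5, 1, 2}, {0.25, 0.5, 1}};
        System.out.println(consistencyRatio(m));
    }
}
```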

Figure 4. AHP Evaluation Screen

Express preference judgements.
The pairwise comparison preference judgements can also be input in the Equalizer View (Figure 5), which graphically suggests the relation between direct and indirect judgements. Direct judgements are represented as larger circles, while indirect judgements are smaller circles. The segment between two criteria is a two-directional preference axis, its middle point being preference equivalence. In a perfectly consistent matrix, the circles are concentric for all pairs. A latent violation is visible when a direct judgement and an indirect one point in opposite directions on the axis, and is highlighted in yellow. In this example, three intransitive judgement three-way cycles have resulted from inconsistent input (L = 3).
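The ordinal cycle count L can be illustrated as follows: with a reciprocal matrix where a[i][j] > 1 means i is preferred to j, a triple {i, j, k} forms a three-way cycle when i beats j, j beats k, and yet k beats i. A minimal sketch, not the system's own implementation:

```java
public class ThreeWayCycles {
    // Count intransitive triples in a reciprocal pairwise comparison matrix:
    // a[i][j] > 1 means alternative i is preferred to alternative j.
    static int count(double[][] a) {
        int n = a.length, cycles = 0;
        for (int i = 0; i < n; i++)
            for (int j = i + 1; j < n; j++)
                for (int k = j + 1; k < n; k++)
                    // check both orientations of a cycle over {i, j, k}
                    if ((a[i][j] > 1 && a[j][k] > 1 && a[k][i] > 1)
                            || (a[j][i] > 1 && a[k][j] > 1 && a[i][k] > 1))
                        cycles++;
        return cycles;
    }

    public static void main(String[] args) {
        // A > B and B > C, yet C > A: one ordinal three-way cycle.
        double[][] cyclic = {{1, 2, 0.5}, {0.5, 1, 2}, {2, 0.5, 1}};
        System.out.println(count(cyclic)); // 1
    }
}
```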

Figure 5. AHP Evaluation Screen: Equalizer View

Assess preference consistency.
Inconsistent judgements can also be identified, evaluated and corrected in the Judgements panel (Figure 6). The Dissonance button reveals the Dissonance and Congruence of each individual judgement. The Triad for CM button highlights in blue the most ordinally inconsistent judgement. Figure 6 highlights in red the third set of intransitive judgements (L3) listed in the Equalizer View.

Figure 6. AHP Evaluation Screen: Judgements View

View evaluation results.
When all pairwise comparisons have been provided, the user can press the Evaluate! button to obtain the final evaluation values for the pre-selected alternatives. The user can consult the generated AHPEvaluationResultReport.pdf for detailed documentation of the current evaluation process.
The final values for the ontology alternatives (given by id) are shown in the Problem tab (Figure 7). The correspondence between alternatives and evaluation values is colour-coded. This tab also contains the problem description and usage guidelines.
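The final values follow the standard AHP synthesis: each alternative's global score is the sum, over all leaf criteria, of the criterion's global weight times the alternative's local weight under that criterion. A minimal sketch with illustrative numbers:

```java
public class AhpSynthesis {
    // Global score of each alternative: sum over criteria of
    // criterionWeights[c] * localWeights[c][alternative].
    static double[] synthesize(double[] criterionWeights, double[][] localWeights) {
        int nAlt = localWeights[0].length;
        double[] scores = new double[nAlt];
        for (int c = 0; c < criterionWeights.length; c++)
            for (int a = 0; a < nAlt; a++)
                scores[a] += criterionWeights[c] * localWeights[c][a];
        return scores;
    }

    public static void main(String[] args) {
        double[] criteria = {0.6, 0.4};              // two leaf criteria
        double[][] alts = {{0.7, 0.3}, {0.5, 0.5}};  // two ontology alternatives
        double[] scores = synthesize(criteria, alts);
        // The first alternative dominates under the heavier criterion.
        System.out.println(scores[0] + " " + scores[1]);
    }
}
```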

Figure 7. AHP Evaluation Screen: Problem View

View evaluation accuracy.
By selecting a non-leaf element in the Criteria panel, the elicited weights of its sub-criteria can be seen in the Vectors tab. The Gantt View displays the values in a manner similar to Gantt diagrams, visually suggesting the relation between value magnitudes (Figure 8). The Numeric Values tab (Figure 9) also displays the elicitation accuracy measurements: Total Deviation from Direct Judgements (TD), Total Deviation from Indirect Judgements (TD2) and Number of Priority Violations (NV).
Figure 8. AHP Evaluation Screen: Criterion Weights

Figure 9. AHP Evaluation Screen: Criterion Weights

By selecting a leaf element in the Criteria panel, the elicited weights of the alternatives for the corresponding criterion can be seen in the Vectors tab. The example in Figures 10 and 11 displays the alternative weights for the Average Number of Sub-classes atomic criterion. As in the previous step, the relation between weight values and Stimuli is colour-coded.

Figure 10. AHP Evaluation Screen: Alternative Weights

Figure 11. AHP Evaluation Screen: Alternative Weights

Import/export .pwc files.
The user can export the current problem to a .pwc file or import a new problem for evaluation.
The application is exited by clicking the exit button in the upper right corner of the window. Unless exported, the decision problem is not saved. Before running the program again, the user is advised to save the generated reports in a different location or with a different name, as they will be overwritten.

irinadragoste [at] gmail.com

Adrian.Groza [at] cs.utcluj.ro