ABSTRACT

Contents

14.1 Introduction
14.2 Agile Policy and Control Loops
14.3 Assessment Framework
     14.3.1 Toward the Driving Model
     14.3.2 Some Results in Policy
     14.3.3 Robust Control with Agile Policies
     14.3.4 Cognition and Assessment
     14.3.5 Objective Monitoring Metrics
     14.3.6 Specific Objectivity Framework
     14.3.7 Assessment of Process Correctness
14.4 Practical Assessment of Cognitive Networking
     14.4.1 The E3: Methodology and Approach
     14.4.2 Building a Consensus on Assessment
14.5 Open Issues

Acknowledgments
Abbreviation List
References

Cognitive radio (CR) networks are believed to be self-managed. Do we know how to engineer such systems? And, if such engineering is done, do we know how to assess the properties of a self-managed system? In addition to performance evaluation and testing that guarantee the consistency, safety, purposefulness, security, and efficiency of the operation of such systems, engineers require assessment: the systematic evaluation of properties that are nonconventional for previously engineered systems, such as cognition, learning, aware sensing, and perception. Technology-wise, we analyze the problem area of flexible spectrum management (FSM) and suggest that both the (self-)management and the assessment problems can be solved within the same framework, situation-aware behavioral composition, which is the key to our policy approach. We describe a model-driven assessment framework: the driving model in the cognition cycle is the enabler of robust control with agile policies. Here the “subject-object” policy-based self-management framework is introduced and interpreted from the control-theoretic viewpoint. We then compare assessment with the performance evaluation of cognitive networking and conclude that the evaluation of process correctness is the main difference between the two. We derive primary assessment metrics and build some derivatives that, using the specific objectivity framework of G. Rasch, allow us to define the assessment of process correctness in a way that does not depend on a particular cognition algorithm. We report on the ongoing research and show some early results, followed by a section on open issues.
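For background, the specific objectivity framework of G. Rasch mentioned above is commonly stated through the dichotomous Rasch model; the sketch below is the standard textbook formulation, not necessarily the exact metric construction developed later in this chapter. The probability of a correct (positive) outcome of item $i$ for subject $n$ depends only on the difference between a subject parameter $\theta_n$ and an item parameter $\delta_i$:

```latex
P(X_{ni} = 1 \mid \theta_n, \delta_i) \;=\; \frac{e^{\theta_n - \delta_i}}{1 + e^{\theta_n - \delta_i}}
```

Specific objectivity follows from this separation of parameters: the comparison of two subjects, $\theta_n - \theta_m$, can be estimated independently of which items were used, and the comparison of two items, $\delta_i - \delta_j$, independently of which subjects responded. This separability is what allows an assessment of process correctness to be defined without reference to a particular cognition algorithm.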