ABSTRACT

Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 308
14.4 Compilation of a Web-Based Editor . . . . . . . . . . . . . . . . . . . . . . . 309
14.4.1 Requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 309
14.4.2 Overview of JHipster . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 309
14.4.3 Targeting JHipster . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 311
14.5 Testing a Rule Set . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 313
14.5.1 Random Intensive Testing . . . . . . . . . . . . . . . . . . . . . . . . . 314
14.5.2 Exhaustive Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 315
14.5.3 Pairwise Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 316
14.6 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 316

One characteristic common to all games is that they are defined by rules. These rules can of course be coded in a programming language, either to automate the computation of complex data (such as scores) or to let people play with (or against) the computer. The usual trade-off then applies: if the rules have been stable for centuries, as in chess, there is probably no better option than hard-coding them using the abstractions provided by the chosen programming language (or, for a minimum of flexibility, a design pattern such as the type-object pattern [71]). However, when the rules are unstable, evolve often, or have many variants, representing them as data interpreted by a rule engine can be a better choice.
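As an illustration of the first option, the type-object pattern keeps the rules hard-coded but factors the per-kind data out of subclasses into shared "type" objects. The sketch below is a hypothetical example, not taken from this chapter: the names `PieceType`, `Piece`, and `canReach` are ours, and the movement model (directions plus a maximum number of steps, ignoring blocking pieces) is deliberately simplified.

```java
import java.util.List;

// The "type object": a shared, data-like description of one kind of piece.
// All rooks on the board reference the same PieceType instance.
record PieceType(String name, int maxSteps, List<int[]> moveDirections) {}

// An individual piece refers to its type instead of subclassing it.
record Piece(PieceType type, int file, int rank) {
    // Can this piece reach (toFile, toRank) on an empty board?
    boolean canReach(int toFile, int toRank) {
        for (int[] d : type.moveDirections()) {
            for (int steps = 1; steps <= type.maxSteps(); steps++) {
                if (file + d[0] * steps == toFile
                        && rank + d[1] * steps == toRank) {
                    return true;
                }
            }
        }
        return false;
    }
}

public class TypeObjectDemo {
    public static void main(String[] args) {
        // One shared type object describes every rook.
        PieceType rook = new PieceType("rook", 7, List.of(
                new int[]{1, 0}, new int[]{-1, 0},
                new int[]{0, 1}, new int[]{0, -1}));
        Piece a1 = new Piece(rook, 0, 0);
        System.out.println(a1.canReach(0, 5)); // straight move: true
        System.out.println(a1.canReach(3, 3)); // diagonal move: false
    }
}
```

Adding a new kind of piece only requires a new `PieceType` instance, not a new class, but the rule *shape* (directions and step counts) remains fixed in code, which is exactly why this pattern suits stable rules better than frequently changing ones.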