ABSTRACT

In parallel programming, the programmer must decompose the computation into several processes that can be executed in parallel on different processors by the operating system. Parallel symbolic processing languages, such as Concurrent Lisp, require a different architecture from procedural numerical processing languages because of their irregular memory-access patterns and high processing-power requirements. The roots of practically all the well-known parallel logic programming languages can be traced to Prolog; in these languages, interprocess communication takes place through the instantiation of shared logical variables, usually achieved by unification. The programming language Modula-2 also provides limited concurrent programming features, but its concurrency constructs were included to simulate problems naturally expressed as parallel processes rather than to improve efficiency. Parallel programming on a transputer network in any of these high-level languages is done much as it is on a hypercube: the application is organized as a collection of programs, each running on a separate transputer.
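The communication model mentioned above, in which processes synchronize by instantiating a shared logical variable, can be sketched outside a logic language as well. The following is a minimal illustrative sketch in Python (not from the paper; the `LogicVar` class and its methods are hypothetical names): a single-assignment variable on which a consumer suspends until a producer binds it, with a double binding to a different value treated as a unification failure.

```python
import threading

class LogicVar:
    """A single-assignment (logical) variable: readers block until some
    process binds it, mirroring how concurrent logic languages
    communicate by instantiating shared variables."""
    def __init__(self):
        self._bound = threading.Event()
        self._value = None

    def bind(self, value):
        # Unifying with an unbound variable instantiates it; binding
        # again to a different value is a unification failure.
        if self._bound.is_set():
            if self._value != value:
                raise ValueError("unification failure")
            return
        self._value = value
        self._bound.set()

    def read(self):
        self._bound.wait()  # consumer suspends until the variable is bound
        return self._value

# Producer and consumer share the variable X.
X = LogicVar()
out = []

consumer = threading.Thread(target=lambda: out.append(X.read()))
consumer.start()
X.bind(42)       # producer instantiates X; the blocked consumer wakes up
consumer.join()
print(out[0])
```

The point of the sketch is the dataflow-style synchronization: the consumer needs no explicit message receive, only a read of a variable that may not yet be bound.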