ABSTRACT

This chapter introduces some of the most useful parts of popular APIs for parallel programming on modern computing systems. The description of each API focuses on the functions and methods most often used in solving practical problems. The basic component of an MPI application is a process. A process executes an algorithm expressed in code, making use of resources such as memory, caches, and disks, and communicates with other processes in order to send and receive data and to synchronize. Several collective communication calls that are useful in a variety of applications are available. In MPI, in contrast to the point-to-point mode, all processes involved in a collective call invoke the same function. In many cases, an implementation requires more complex messages than those that contain elements of only one data type; MPI supports these through derived datatypes. The idea of the one-sided communication API is to implement remote memory access and potentially increase concurrency over the traditional message-passing paradigm.