Re: Parallel programming with FORALL?



On 2011-01-11 16:17:14 -0400, Daniel Carrera said:

Hello,

I want to write a simple program that uses both my CPU cores in
parallel. I don't care what the program does, I just want to learn how
to use both cores. So I wrote a dumb program with a non-trivial FORALL
statement. Unfortunately, it still uses only one core.

Here is my program. I tried to design it to prevent the compiler from
just pre-computing the result, and I tried to make the FORALL
expensive.


program test
integer, parameter :: N = 10000
integer :: i, j
real :: A(N), B(N,N)

! Generate a pseudo-random "A" seeded with the system clock.
! Now the compiler cannot just pre-compute the answer.
!
call seed_prng()
call random_number(A)

! Bulk of the program is a FORALL statement.
!
forall (i = 1:N, j = 1:N)
B(i,j) = gamma(A(i) + i/j)
B(i,j) = cosh(B(j,i)) + tanh(B(i,j) + j/i)
end forall

! Select a random value to print.
!
i = floor(1 + A(1)*N)
j = floor(1 + A(2)*N)
print *,"B(i,j) = ",B(i,j)

contains
... define seed_prng() here ...
end program


When I compile this with GFortran (flags: -O3 -Wall) and run it, it
only uses one CPU core. I asked on the GFortran list, and someone
suggested that you have to write programs differently to make them
parallel, but didn't know how.

I was hoping that someone here might know something about parallel
programs and could help me get this program running on two cores.

Thanks for the help.

Cheers,
Daniel.

FORALL is an elaborate array assignment meant to cover the cases that
simpler array syntax misses; think about the diagonal of a 2-D matrix.
It computes the full right-hand side before doing any assignment. It
could be executed in parallel, but that is not guaranteed, just as a
lot of other array assignments might or might not be.
Oh, the joys of "as if"!
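For example, the diagonal case mentioned above is one that whole-array syntax cannot express but FORALL handles directly (a standard textbook illustration, not from the original post):

```fortran
! Zero only the diagonal of a square matrix B.
! Plain array syntax (B = 0.0) would clobber the whole matrix.
forall (i = 1:n) B(i,i) = 0.0
```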

If you want to use multiple cores with gfortran you will need to look at OpenMP
and all that flows from that.
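As a sketch only (not tested against the poster's setup), the same loop nest rewritten with an OpenMP directive might look like the following. Compile with gfortran -fopenmp. Note that the original's B(j,i) reference would be a data race in a plain parallel loop, so this version uses B(i,j) throughout, and integer divisions are made explicit with real():

```fortran
program test_omp
  implicit none
  integer, parameter :: N = 10000
  integer :: i, j
  real :: A(N), B(N,N)

  call random_number(A)

  ! Distribute the outer loop across threads. Each B(i,j) element
  ! is computed independently, so no synchronization is needed.
  !$omp parallel do private(i, j)
  do j = 1, N
     do i = 1, N
        B(i,j) = gamma(A(i) + real(i)/real(j))
        B(i,j) = cosh(B(i,j)) + tanh(B(i,j) + real(j)/real(i))
     end do
  end do
  !$omp end parallel do

  print *, "B(1,1) = ", B(1,1)
end program test_omp
```

The number of threads can be controlled at run time with the OMP_NUM_THREADS environment variable, e.g. OMP_NUM_THREADS=2.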

I have a six-core Mac Pro, and for suitably embarrassingly parallel problems it
is "easy" to get 12 threads going flat out. You might find that "easy" is
only defined in context, as little about parallel programming is easy in the
common sense of the word.




