Re: Other than php/perl/lisp/c/c++/java, anybody have a favorite computer-programming language?

On Mon, 5 Feb 2007, robert maas, see wrote:
From: "Arthur J. O'Dwyer" <ajonos...@xxxxxxxxxxxxxx>
[Someone else wrote, and Robert snipped the attribution for]
>>> Ideally, an algorithm is designed independently of data type. While
>>> this is not always possible, it most often is. A language that
>>> supports generics/templates (C++, Eiffel, Ada, and now Java) can
>>> enable this kind of generality of design.

>> Basically true if we assume "data type" means "primitive C-like
>> data type" (integer, char, bool, floating-point, bignum, even string),

> I disagree with both of you. Many of the specific operations that are
> meaningful for one datatype aren't meaningful at all for another datatype.
> For example, arithmetic (add, subtract, multiply, divide) makes
> sense for numbers (integers including bignums, floating-points,
> rationals) but not for the other types (char, bool, string).

We were talking about algorithms. You're now talking about operations;
that is, you're talking about mappings from S*S to S, where S is some
set (for example, the set of integers modulo 255, where the operation
is addition).
Algorithms --- to take a specific example, Euclid's GCD algorithm ---
can be applied to many different primitive types, as long as those types
provide operations which are isomorphic to each other.

> Picking an element based on index makes sense for strings and other
> kinds of vectors (1-d arrays), but not for char or bool or even
> integer except if the integer is treated as if a bitvector.

"Strings aren't vectors." "Yes they are; they're vectors of
characters." "Well then, integers are vectors of bits." This is
very close to the point I was making: algorithms typically rely on
particular data /structures/ (for example, vectors), but not on the
primitive data types (for example, characters or bits).

Potentially interesting tidbit: If your language of choice
chooses to represent strings as /lists/ of characters rather than
random-access /vectors/, then you might find that algorithms expecting
vectors of characters have suddenly acquired a factor of O(n) or two!

-Arthur, who has nothing against Lisp