Re: Cracking DES with C++ is faster than Java?
From: Paul Schmidt (wogsterca_at_yahoo.ca)
Date: Sun, 02 May 2004 20:22:51 -0400
Paul Schlyter wrote:
> In article <tuYkc.60514$OU.firstname.lastname@example.org>,
> Paul Schmidt <email@example.com> wrote:
>>Bryan Olson wrote:
>>>Hey -- there are more important things to optimize than clock-
>>>cycle counts. I too am old enough to have learned FORTRAN IV,
>>>and let's not kid the kids: FORTRAN IV sucked! Wasting a few
>>>machine cycles is one thing, but don't waste large amounts of my
>>>time and claim to be doing me a favor.
>>You need to look at conditions of the time though, you could hire a
>>programmer for $5.00 an hour, computer time cost over $1000 per machine
>>second, so if wasting 400 hours of programmer time saved 5 machine
>>seconds you were ahead of the game.
> Could you give an actual example of computer time costing over $1000
> per machine second? That would amount to over $86 _million_ per day
> or $31 _billion_ per year ---- quite a lucrative business for just
> one single computer!!!! And even if the computer would be used by
> paying customers only 10% of the time, it would still mean over $3
> billion per year --- several decades ago when money was worth much
> more than today!
> Also, remember that the computers of the late 60's were, by today's
> standards, quite slow. The amount of computation made during CPU
> second on one of those machines could be duplicated by a human on a
> mechanical calculator in less than 10 hours. And if a programmer
> cost $5/hour, a human doing calculations could probably be obtained
> for $2.5/hour. So why waste over $1000 on computer time when the
> same amount of computations could be done by a human for less than
> $25 ?????
In trying to prove that my implementation of the theory was wrong, you
missed the point of the theory. In the 1960s computer time was
expensive and labour was cheap, so systems attempted to use as little
computer time as possible. If writing 400 lines of Fortran, COBOL or
Assembler saved a few seconds of computer time, it was worth it. Today
computers are cheap and labour is expensive, so new languages have to
be oriented toward reducing labour at the expense of computer
resources. Using large libraries of precanned classes, and reusing
those classes, is a good way of reducing labour.
The fact that the more general code may not be as machine efficient is a
price worth paying.
>>Today we look at different conditions, you can get a year of computer
>>time for $1,000 but the programmer costs that for a week, so tools need
>>to be programmer efficient rather than machine efficient. If you waste
>>5 hours of machine time and save a week of programmer time, you're ahead
>>of the game.
>>Java becomes more programmer efficient through two of the three Rs
>>(reduce is the missing one): reuse and recycle. Because a class is an
>>independent entity, you can use the same class over and over again, in
>>different programs.
> There was software reuse before classes -- the subroutine was
> invented for that specific purpose: to wrap code into a package
> making it suitable for reuse in different programs.
Subroutines only dealt with the code; you had to be very careful with
the data. A lot of programs used global data, and it was common for one
subroutine to step on another subroutine's data. C and Pascal
allowed for local data, but still relied largely on global data. Objects
cured this to a large extent: it's easier to black-box an object than
to black-box a subroutine.
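To make that concrete, here's a minimal sketch (all the names are
invented for illustration) of the kind of clash shared global data
invited, next to a class that black-boxes its own state:

```java
// Hypothetical sketch: global data stepped on by "subroutines",
// versus an object that owns its data privately.
public class Encapsulation {
    // The old style: one global shared by every routine.
    static int total = 0;

    static void tallyOrders()  { total = 3; }  // sets the global
    static void tallyRefunds() { total = 5; }  // steps on it

    // The object style: each instance owns its own private data.
    static class Tally {
        private int total = 0;                 // invisible outside the class
        void add(int n) { total += n; }
        int get()       { return total; }
    }

    public static void main(String[] args) {
        tallyOrders();
        tallyRefunds();                        // clobbers tallyOrders' result
        System.out.println("global total: " + total);

        Tally orders = new Tally();
        Tally refunds = new Tally();
        orders.add(3);
        refunds.add(5);                        // no interference between them
        System.out.println("orders: " + orders.get()
                         + ", refunds: " + refunds.get());
    }
}
```

The point is the `private` field: nothing outside `Tally` can step on it,
which is exactly what a subroutine sharing globals could never promise.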
> Also: in real life, classes aren't as independent as you describe
> here: most classes are dependent on other classes. And in extreme
> examples, trying to extract a class from a particular program for use
> in another program will force you to bring along a whole tree of
> classes, which can make moving that class to the new program
> impractical.
You missed the point: you CAN write classes with the idea of writing a
class once and then using it over and over again in each new project
that needs that kind of class. Put it in a package or class library and
just bolt in the library or package.
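A rough sketch of what I mean (the class is made up; in practice you'd
put it in a package of its own and import it from each project): write
it once, then two unrelated "projects" use it unchanged:

```java
// Hypothetical sketch: one reusable class serving two unrelated uses.
public class Reuse {
    // Imagine this living in its own package, e.g. util, imported
    // by every project that needs it.
    static class Averager {
        private double sum = 0;
        private int count = 0;
        void add(double x) { sum += x; count++; }
        double mean()      { return count == 0 ? 0 : sum / count; }
    }

    public static void main(String[] args) {
        // "Project" one: averaging grades.
        Averager grades = new Averager();
        grades.add(80);
        grades.add(90);

        // "Project" two: averaging temperatures. Same class, untouched.
        Averager temps = new Averager();
        temps.add(20.5);
        temps.add(21.5);

        System.out.println("grades: " + grades.mean());  // 85.0
        System.out.println("temps:  " + temps.mean());   // 21.0
    }
}
```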
>>I think the future will be more descriptive, in that a program will
>>describe what an object needs to accomplish rather than how the object
>>does it. The compiler will then figure out how to do that.
> Like Prolog ? It was considered "the future" in the 1980's .....
Okay, so it's not a new idea, and previous implementations have failed.
Objects were the same way: the first attempt to objectify C was
Objective-C, and who uses it today? You don't see a big call for
Smalltalk programmers either. Everybody seemed to like C++, and Java
has been popular enough. We have objects; we will eventually move away
from low-level object handling to high-level object handling.
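For what it's worth, Java itself later took a small step in this
direction: streams (added to the language well after this thread) let
you state what result you want and leave the how to the library. A
purely illustrative sketch:

```java
import java.util.List;

// Hypothetical sketch of "describe what, not how" in modern Java.
public class Declarative {
    public static void main(String[] args) {
        List<Integer> nums = List.of(1, 2, 3, 4, 5, 6);

        // Imperative "how": spell out the loop and the accumulator.
        int sumHow = 0;
        for (int n : nums) {
            if (n % 2 == 0) sumHow += n;
        }

        // Declarative "what": the even numbers, summed. The library
        // chooses the loop, the order, even the threading if asked.
        int sumWhat = nums.stream()
                          .filter(n -> n % 2 == 0)
                          .mapToInt(Integer::intValue)
                          .sum();

        System.out.println(sumHow + " " + sumWhat);  // 12 12
    }
}
```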