[oclug] Re: Fwd: Re: Why we have Source code
linuxdoctor at yahoo.com
Thu Jan 25 09:14:42 EST 2001
--- "David F. Skoll" <dfs at roaringpenguin.com> wrote:
> On Wed, 24 Jan 2001, Francis Pinteric wrote:
> > Indeed, I'm talking big picture here. For about a year now I have
> > undertaken the task of writing an Encyclical Letter to the World
> > concerning modern computer technology and its moral impact on the
> > world.
> Technology cannot have a "moral impact" on the world.
That of course is the whole point. I say that it can and it already
has. I also assert that because it can directly affect how man
behaves there is such a thing as good technology and bad technology.
Nuclear weapons would be a case in point of bad technology. On the
other hand, food irradiation would be a case of good technology. You
can quibble about the morality of nuclear research in general but the
specific application in itself does have an independent morality.
Nothing good can come from nuclear weapons. There is no point to
having these weapons except to perpetrate mass destruction. Even the
use of such weapons as a deterrent presupposes that sole purpose.
A simplistic analysis will point out that it is man who is the
author of these evils; that is true enough, and he can take the
blame for its creation and its use. But that does not change the
moral value of the technology itself.
A more sophisticated analysis would point out that in order for any
creature to have a moral culpability it must have the capacity to
recognize the difference between good and evil and also be able to
make that choice. As a creature, a nuclear weapon cannot help being
what it is and in fact cannot choose to detonate itself and therefore
cannot be culpable for the mass destruction that it may inflict. This
would be the classical moral argument.
My own thesis basically argues that while this is true, a technology
can still have a moral value not based on culpability but based on a
double effect principle. That is to say, the end use determines its
moral truth-value. It may be that it is man who causes the weapon to
do evil, but it is the fact that its sole purpose (in this instance)
is to do evil which makes it evil. The double effect principle in
this case being that man does evil because of the near occasion of
evil that the weapon presents.
In other words, culpability does not necessarily determine moral
value. Anyway, this is very cursory and there are many holes in the
above argument which I can't address here.
> > Secondly, to assume that technology cannot "improve society or
> > make people better citizens" simply ignores history. Technology
> > has influenced human behaviour.
> Has it influenced human _moral_ behaviour? I would say not really.
The response of "not really" means that it "does in some way."
To say that something is 'nonessential' is to say that
it is 'unnecessary.'
GUIs, while nice, are nonessential.