Eliza - the Computer as a Conversation Partner
Author: Andre Adrian
Version: 03.Feb.2025
The Eliza Effect
The Eliza program was written in 1966 by Joseph Weizenbaum. Eliza
analyzes the conversation partner's text input according to simple
rules and assembles a reply from parts of that input and its own
canned phrases. Eliza became famous with a rule set that imitates the
client-centered talk therapy of Carl Rogers.
With Eliza, Joseph Weizenbaum created an artificial intelligence
program that still receives attention more than 40 years later. Its
success apparently made the author somewhat uneasy. Yet the Eliza
mania only shows that humans anthropomorphize everything. Lightning
and thunder were personified as gods. Fables, fairy tales and novels
such as "Watership Down" feature talking, humanized animals.
Humans seem unable to set aside the human point of view. Everything
non-human in the environment is regarded as a more or less strange
fellow human. Perhaps this is how the mirror neurons work: we can put
ourselves in the place of a rolling wheel that is about to tip over.
We can sensibly predict the future: "it is about to fall over". And
because we have the necessary "physical feel" for the wheel, human
empathy works. The chain of thought ends with the conclusion: "If I
can put myself in its place, it must be a fellow human being."
All the Tamagotchis of this world, and probably a good share of all
dachshunds, work for their owners according to this pattern: the
behavior is understandable because it is predictable, so the owner
can empathize. The mirror neurons fire as if a fellow human were
performing the observed action. And because the mirror neurons are
active, the observed object must be a fellow human. Perhaps made of
plastic, with three buttons and an LCD screen, but a fellow human
nonetheless.
Seen this way, the Eliza effect is nothing new and, above all,
nothing uncanny.
Unfortunately, the field of artificial intelligence has confused "the
program's author is intelligent" with "the program is intelligent".
The Eliza program shows no intelligence in the sense of a "program
that learns".
Client-Centered Psychotherapy
Carl Rogers' therapy is firmly established today, first of all in the
improvement or cure of psychological and psychosomatic disorders.
Even more important is its role in the communication between
"healthy" people. Niccolò Machiavelli says "man is bad"; Carl Rogers
says "man is good". Both authors can point to empirical evidence.
Perhaps Machiavelli and Rogers mark the two extremes of human
behavior.
Client-centered psychotherapy turns the therapist into a midwife of
self-help. The client actually knows best what is good for him;
without a therapist, however, this knowledge does not surface. The
therapist's midwife role requires three behavioral elements:
- Unconditional positive regard
- Empathy
- Genuineness towards the client (congruence)
The Eliza program met these therapist requirements, or more
precisely: Eliza did not disappoint its conversation partners. If the
Eliza rule set contains only statements expressing positive regard,
Eliza can only show positive regard. If the client's sentences are
mirrored back as questions, a "kinship of souls" between client and
program is suggested. If the answer phrases triggered by the client's
keywords are well chosen, there is even an impression of "initiative"
on the program's part. And as long as the answer phrases match the
keywords, Eliza does not "lie".
Joseph Weizenbaum writes about this: "It is important to note that
this assumption is one made by the speaker. Whether it is realistic
or not is an altogether separate question. In any case, it has a
crucial psychological utility in that it serves the speaker to
maintain his sense of being heard and understood. The speaker
further defends his impression (which even in real life may be
illusory) by attributing to his conversational partner all sorts of
background knowledge, insights and reasoning ability. But again,
these are the speaker's contribution to the conversation."
A Conversation Between Two AIs
Besides Eliza, other similar artificial intelligence programs have
been written. RFC439 contains a conversation between Eliza and Parry
from 18 September 1972. Eliza plays the doctor and Parry the patient.
The program Parry, written by Kenneth Colby, simulates a paranoid
schizophrenic. An excerpt (Parry speaks first):
*Do you know anything about bookies?
What makes you think I know anything about bookies?
*A bookie didn't pay me off once.
What does that suggest to you?
*I beat up the bookie who did not pay off, and I thought he might use
his friends in the underworld to get even with me.
Original Eliza
The original Eliza program was written in SLIP, more precisely
MAD-SLIP, on an IBM 7094. The complete Eliza source code has since
been found. Anthony Hay has published an Eliza JavaScript version on
the internet that gives the same answers as the original Eliza.
The "patient" inputs are taken from Weizenbaum's Eliza article.
Output (Eliza) is in upper case:
HOW DO YOU DO. PLEASE TELL ME YOUR PROBLEM
Men are all alike.
IN WHAT WAY
They're always bugging us about something or other.
CAN YOU THINK OF A SPECIFIC EXAMPLE
Well, my boyfriend made me come here.
YOUR BOYFRIEND MADE YOU COME HERE
He says I'm depressed much of the time.
I AM SORRY TO HEAR YOU ARE DEPRESSED
It's true. I am unhappy.
DO YOU THINK COMING HERE WILL HELP YOU NOT TO BE UNHAPPY
I need some help, that much seems certain.
WHAT WOULD IT MEAN TO YOU IF YOU GOT SOME HELP
Perhaps I could learn to get along with my mother.
TELL ME MORE ABOUT YOUR FAMILY
My mother takes care of me.
WHO ELSE IN YOUR FAMILY TAKES CARE OF YOU
My father.
YOUR FATHER
You are like my father in some ways.
WHAT RESEMBLANCE DO YOU SEE
You are not very aggressive but I think you don't want me
to notice that.
WHAT MAKES YOU THINK I AM NOT VERY AGGRESSIVE
You don't argue with me.
WHY DO YOU THINK I DON'T ARGUE WITH YOU
You are afraid of me.
DOES IT PLEASE YOU TO BELIEVE I AM AFRAID OF YOU
My father is afraid of everybody.
WHAT ELSE COMES TO MIND WHEN YOU THINK OF YOUR FATHER
Bullies.
DOES THAT HAVE ANYTHING TO DO WITH THE FACT THAT YOUR
BOYFRIEND MADE YOU COME HERE
The Eliza Rules
Joseph Weizenbaum described the Eliza rules in the paper "ELIZA--A
Computer Program For the Study of Natural Language Communication
Between Man and Machine". Keywords and sentence transformation are
the two core elements of Eliza.
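The following small Python sketch (my own illustration, not
Weizenbaum's MAD-SLIP code and not his original rule notation) shows
these two core elements in isolation: a keyword with a decomposition
pattern takes the input apart, and a reassembly template builds the
answer out of the captured fragment.

import re

# One illustrative rule: a decomposition pattern for the keyword "I am"
# and a reassembly template that reuses the captured fragment.
RULE = (re.compile(r'.*\bI am (.*)', re.IGNORECASE),
        "HOW LONG HAVE YOU BEEN {0}?")

def answer(sentence):
    match = RULE[0].match(sentence)
    if match:
        # strip trailing punctuation, then put the fragment into the template
        return RULE[1].format(match.group(1).rstrip('.!'))
    return "PLEASE GO ON."

print(answer("He says I am depressed much of the time."))
# prints: HOW LONG HAVE YOU BEEN depressed much of the time?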
Eliza Keywords
The conversation partner's input is searched for keywords. The
keywords in the small Python Eliza program therapist by Joe Strout
are:
- "I" sentence openings:
  I need .., I think .., I feel .., I have .., I would .., I want ..,
  I can't .., I am .., I'm .., I don't .., Can I .., My ..
- "You" sentence openings:
  Why don't you .., Are you .., Can you .., You are .., You're .., You ..
- "It" sentence openings:
  Is it .., It is .., Is there ..
- Question-word sentence openings:
  Why can't I .., Why .., What .., How ..
- Other sentence openings:
  Because .., Hello .., Yes
- Nouns:
  .. mother .., .. father .., .. child .., .. friend .., .. computer ..
- Other keywords:
  .. sorry ..
Eliza Transformations
Certain other keywords in the input sentence are replaced. This
substitution allows part of the input sentence to be reused as part
of the output sentence. In the Python program the replacements are:
"am" :
"are",
"was" : "were",
"i"
: "you",
"i'd" : "you
would",
"i've" : "you
have",
"i'll" : "you
will",
"my" : "your",
"are" : "am",
"you've": "I have",
"you'll": "I will",
"your" : "my",
"yours" :
"mine",
"you" : "me",
"me" : "you"
An Eliza Session
Die "Patienten" Eingaben sind aus dem Weizenbaum Eliza Artikel.
Eingabe (Patient) beginnt mit > Prompt. Ausgabe (Eliza) ist
ohne Prompt.
>>> import eliza
>>> eliza.command_interface()
Therapist
---------
Talk to the program by typing in plain English, using normal upper-
and lower-case letters and punctuation. Enter "quit" when done.
========================================================================
Hello. How are you feeling today?
>Men are all alike.
Please tell me more.
>They're always bugging us about something or other.
Can you elaborate on that?
>Well, my boyfriend made me come here.
Let's change focus a bit... Tell me about your family.
>He says I'm depressed much of the time.
Why do you say that he says i'm depressed much of the time?
>It's true. I am unhappy.
Very interesting.
>I need some help, that much seems certain.
Why do you need some help, that much seems certain?
>Perhaps I could learn to get along with my mother.
Good family relations are important.
>My mother takes care of me.
I see, your mother takes care of you.
>My father.
When your father, how do you feel?
>You are like my father in some ways.
Does it please you to think that I'm like your father in some ways?
>You are not very aggressive but I think you don't want me to notice that.
Perhaps you're really talking about yourself?
>You don't argue with me.
We should be discussing you, not me.
>You are afraid of me.
Perhaps you would like me to be afraid of you.
>My father is afraid of everybody.
When your father is afraid of everybody, how do you feel?
>Bullies.
Please tell me more.
>quit
Good-bye.
>>>
Eliza in Python Source Code
The source code is compatible with Python 2.7.18.
#----------------------------------------------------------------------
# eliza.py
#
# a cheezy little Eliza knock-off by Joe Strout <joe@strout.net>
# with some updates by Jeff Epler <jepler@inetnebr.com>
# hacked into a module and updated by Jez Higgins <jez@jezuk.co.uk>
# changed whrandom to random by Andre Adrian
# last revised: 21 April 2007
#----------------------------------------------------------------------

import string
import re
import random

class eliza:
    def __init__(self):
        self.keys = map(lambda x: re.compile(x[0], re.IGNORECASE), gPats)
        self.values = map(lambda x: x[1], gPats)

    #------------------------------------------------------------------
    # translate: take a string, replace any words found in dict.keys()
    # with the corresponding dict.values()
    #------------------------------------------------------------------
    def translate(self, str, dict):
        words = string.split(string.lower(str))
        keys = dict.keys()
        for i in range(0, len(words)):
            if words[i] in keys:
                words[i] = dict[words[i]]
        return string.join(words)

    #------------------------------------------------------------------
    # respond: take a string, a set of regexps, and a corresponding
    # set of response lists; find a match, and return a randomly
    # chosen response from the corresponding list.
    #------------------------------------------------------------------
    def respond(self, str):
        # find a match among keys
        for i in range(0, len(self.keys)):
            match = self.keys[i].match(str)
            if match:
                # found a match ... stuff with corresponding value
                # chosen randomly from among the available options
                resp = random.choice(self.values[i])
                # we've got a response... stuff in reflected text where indicated
                pos = string.find(resp, '%')
                while pos > -1:
                    num = string.atoi(resp[pos+1:pos+2])
                    resp = resp[:pos] + \
                        self.translate(match.group(num), gReflections) + \
                        resp[pos+2:]
                    pos = string.find(resp, '%')
                # fix munged punctuation at the end
                if resp[-2:] == '?.': resp = resp[:-2] + '.'
                if resp[-2:] == '??': resp = resp[:-2] + '?'
                return resp

#----------------------------------------------------------------------
# gReflections, a translation table used to convert things you say
# into things the computer says back, e.g. "I am" --> "you are"
#----------------------------------------------------------------------
gReflections = {
    "am"    : "are",
    "was"   : "were",
    "i"     : "you",
    "i'd"   : "you would",
    "i've"  : "you have",
    "i'll"  : "you will",
    "my"    : "your",
    "are"   : "am",
    "you've": "I have",
    "you'll": "I will",
    "your"  : "my",
    "yours" : "mine",
    "you"   : "me",
    "me"    : "you"
}

#----------------------------------------------------------------------
# gPats, the main response table. Each element of the list is a
# two-element list; the first is a regexp, and the second is a
# list of possible responses, with group-macros labelled as %1, %2, etc.
#----------------------------------------------------------------------
gPats = [
    [r'I need (.*)',
     ["Why do you need %1?",
      "Would it really help you to get %1?",
      "Are you sure you need %1?"]],

    [r'Why don\'?t you ([^\?]*)\??',
     ["Do you really think I don't %1?",
      "Perhaps eventually I will %1.",
      "Do you really want me to %1?"]],

    [r'Why can\'?t I ([^\?]*)\??',
     ["Do you think you should be able to %1?",
      "If you could %1, what would you do?",
      "I don't know -- why can't you %1?",
      "Have you really tried?"]],

    [r'I can\'?t (.*)',
     ["How do you know you can't %1?",
      "Perhaps you could %1 if you tried.",
      "What would it take for you to %1?"]],

    [r'I am (.*)',
     ["Did you come to me because you are %1?",
      "How long have you been %1?",
      "How do you feel about being %1?"]],

    [r'I\'?m (.*)',
     ["How does being %1 make you feel?",
      "Do you enjoy being %1?",
      "Why do you tell me you're %1?",
      "Why do you think you're %1?"]],

    [r'Are you ([^\?]*)\??',
     ["Why does it matter whether I am %1?",
      "Would you prefer it if I were not %1?",
      "Perhaps you believe I am %1.",
      "I may be %1 -- what do you think?"]],

    [r'What (.*)',
     ["Why do you ask?",
      "How would an answer to that help you?",
      "What do you think?"]],

    [r'How (.*)',
     ["How do you suppose?",
      "Perhaps you can answer your own question.",
      "What is it you're really asking?"]],

    [r'Because (.*)',
     ["Is that the real reason?",
      "What other reasons come to mind?",
      "Does that reason apply to anything else?",
      "If %1, what else must be true?"]],

    [r'(.*) sorry (.*)',
     ["There are many times when no apology is needed.",
      "What feelings do you have when you apologize?"]],

    [r'Hello(.*)',
     ["Hello... I'm glad you could drop by today.",
      "Hi there... how are you today?",
      "Hello, how are you feeling today?"]],

    [r'I think (.*)',
     ["Do you doubt %1?",
      "Do you really think so?",
      "But you're not sure %1?"]],

    [r'(.*) friend (.*)',
     ["Tell me more about your friends.",
      "When you think of a friend, what comes to mind?",
      "Why don't you tell me about a childhood friend?"]],

    [r'Yes',
     ["You seem quite sure.",
      "OK, but can you elaborate a bit?"]],

    [r'(.*) computer(.*)',
     ["Are you really talking about me?",
      "Does it seem strange to talk to a computer?",
      "How do computers make you feel?",
      "Do you feel threatened by computers?"]],

    [r'Is it (.*)',
     ["Do you think it is %1?",
      "Perhaps it's %1 -- what do you think?",
      "If it were %1, what would you do?",
      "It could well be that %1."]],

    [r'It is (.*)',
     ["You seem very certain.",
      "If I told you that it probably isn't %1, what would you feel?"]],

    [r'Can you ([^\?]*)\??',
     ["What makes you think I can't %1?",
      "If I could %1, then what?",
      "Why do you ask if I can %1?"]],

    [r'Can I ([^\?]*)\??',
     ["Perhaps you don't want to %1.",
      "Do you want to be able to %1?",
      "If you could %1, would you?"]],

    [r'You are (.*)',
     ["Why do you think I am %1?",
      "Does it please you to think that I'm %1?",
      "Perhaps you would like me to be %1.",
      "Perhaps you're really talking about yourself?"]],

    [r'You\'?re (.*)',
     ["Why do you say I am %1?",
      "Why do you think I am %1?",
      "Are we talking about you, or me?"]],

    [r'I don\'?t (.*)',
     ["Don't you really %1?",
      "Why don't you %1?",
      "Do you want to %1?"]],

    [r'I feel (.*)',
     ["Good, tell me more about these feelings.",
      "Do you often feel %1?",
      "When do you usually feel %1?",
      "When you feel %1, what do you do?"]],

    [r'I have (.*)',
     ["Why do you tell me that you've %1?",
      "Have you really %1?",
      "Now that you have %1, what will you do next?"]],

    [r'I would (.*)',
     ["Could you explain why you would %1?",
      "Why would you %1?",
      "Who else knows that you would %1?"]],

    [r'Is there (.*)',
     ["Do you think there is %1?",
      "It's likely that there is %1.",
      "Would you like there to be %1?"]],

    [r'My (.*)',
     ["I see, your %1.",
      "Why do you say that your %1?",
      "When your %1, how do you feel?"]],

    [r'You (.*)',
     ["We should be discussing you, not me.",
      "Why do you say that about me?",
      "Why do you care whether I %1?"]],

    [r'Why (.*)',
     ["Why don't you tell me the reason why %1?",
      "Why do you think %1?"]],

    [r'I want (.*)',
     ["What would it mean to you if you got %1?",
      "Why do you want %1?",
      "What would you do if you got %1?",
      "If you got %1, then what would you do?"]],

    [r'(.*) mother(.*)',
     ["Tell me more about your mother.",
      "What was your relationship with your mother like?",
      "How do you feel about your mother?",
      "How does this relate to your feelings today?",
      "Good family relations are important."]],

    [r'(.*) father(.*)',
     ["Tell me more about your father.",
      "How did your father make you feel?",
      "How do you feel about your father?",
      "Does your relationship with your father relate to your feelings today?",
      "Do you have trouble showing affection with your family?"]],

    [r'(.*) child(.*)',
     ["Did you have close friends as a child?",
      "What is your favorite childhood memory?",
      "Do you remember any dreams or nightmares from childhood?",
      "Did the other children sometimes tease you?",
      "How do you think your childhood experiences relate to your feelings today?"]],

    [r'(.*)\?',
     ["Why do you ask that?",
      "Please consider whether you can answer your own question.",
      "Perhaps the answer lies within yourself?",
      "Why don't you tell me?"]],

    [r'quit',
     ["Thank you for talking with me.",
      "Good-bye.",
      "Thank you, that will be $150. Have a good day!"]],

    [r'(.*)',
     ["Please tell me more.",
      "Let's change focus a bit... Tell me about your family.",
      "Can you elaborate on that?",
      "Why do you say that %1?",
      "I see.",
      "Very interesting.",
      "%1.",
      "I see. And what does that tell you?",
      "How does that make you feel?",
      "How do you feel when you say that?"]]
]

#----------------------------------------------------------------------
# command_interface
#----------------------------------------------------------------------
def command_interface():
    print "Therapist\n---------"
    print "Talk to the program by typing in plain English, using normal upper-"
    print 'and lower-case letters and punctuation. Enter "quit" when done.'
    print '='*72
    print "Hello. How are you feeling today?"
    s = ""
    therapist = eliza()
    while s != "quit":
        try:
            s = raw_input(">")
        except EOFError:
            s = "quit"
            print s
        while s[-1] in "!.":
            s = s[:-1]
        print therapist.respond(s)

if __name__ == "__main__":
    command_interface()
Eliza in BASIC

Image: the Eliza program in Microsoft MBASIC-85 under the RunCPM CP/M
emulator, inside the DOSBox MS-DOS emulator under MS-Windows 10
In 1977 the home computers Apple II, Commodore PET and Tandy TRS-80
appeared, and with them an Eliza version in BASIC for the Microsoft
BASIC built into these machines. The article "Eliza" appeared in the
magazine Creative Computing, issue Jul/Aug 1977. Jeff Shrager
probably wrote an Eliza in BASIC as early as 1973. Steve North
adapted this BASIC program to the capabilities of 8 KByte MITS BASIC,
i.e. 8 KByte Microsoft BASIC.
In 1979 there was a Commodore PET at my school. For me, BASIC Eliza
was the most interesting program for this computer. Its 8 KByte of
RAM (7167 bytes free) were too small for the published BASIC program;
some of the answer sentences had to be dropped.
The Eliza article lists suggestions for extensions and improvements:
- Just as a suggestion, if you decide to insert "ME" and "YOU"
in the translation string list, put a nonprinting (control)
character in YOU to prevent ELIZA from substituting I -> YOU
-> ME.
- The structures found in lines 120, 420, and 590 could be
replaced by RESTORE NNNN
- The use of an INSTR, SEARCH or POS function to determine if
one string is a substring of another would probably speed things
up
An Eliza BASIC Session
As with the Python Eliza, the inputs are the text from the Weizenbaum
paper. In my opinion, BASIC Eliza fits these inputs better than the
Python Eliza. Input (patient) is in mixed case. Output (Eliza) is in
upper case:
HI! I'M ELIZA. WHAT'S YOUR PROBLEM?
Men are all alike.
IN WHAT WAY?
They're always bugging us about something or other.
CAN YOU THINK OF A SPECIFIC EXAMPLE?
Well, my boyfriend made me come here.
WHY DO YOU BRING UP THE TOPIC OF FRIENDS?
He says I'm depressed much of the time.
DID YOU COME TO ME BECAUSE YOU ARE DEPRESSED MUCH OF THE
TIME.
It's true. I am unhappy.
DID YOU COME TO ME BECAUSE YOU ARE UNHAPPY.
I need some help, that much seems certain.
SAY, DO YOU HAVE ANY PSYCHOLOGICAL PROBLEMS?
Perhaps I could learn to get along with my mother.
WHAT DOES THAT SUGGEST TO YOU?
My mother takes care of me.
I SEE.
My father.
I'M NOT SURE I UNDERSTAND YOU FULLY.
You are like my father in some ways.
WHAT MAKES YOU THINK I AM LIKE YOUR FATHER IN SOME WAYS.
You are not very aggressive but I think you don't want me to
notice that.
DOES IT PLEASE YOU TO BELIEVE I AM
NOT VERY AGGRESSIVE BUT YOU THINK I DONT WANT Y0U TO
NOTICE THAT.
You don't argue with me.
WE WERE DISCUSSING YOU-- NOT ME.
You are afraid of me.
PERHAPS YOU WOULD LIKE TO BE AFRAID OF ME.
My father is afraid of everybody.
COME, COME ELUCIDATE YOUR THOUGHTS.
Bullies.
CAN YOU ELABORATE ON THAT?
shut
SHUT UP...
Ok
Eliza in MBASIC-85, GW-BASIC Source Code
GW-BASIC version 3.2.3 for MS-DOS was the last "traditional"
Microsoft BASIC. Under DOSBox this BASIC interpreter can still be run
today under MS-Windows, Linux and other operating systems. BASIC-85
Rev. 5.29 was a late CP/M Microsoft BASIC version. RunCPM is a CP/M
emulator that runs under DOSBox.
I have added the extensions suggested above. My trick for the ME to
YOU substitution: I write YOU with the digit 0 instead of the letter
O. In addition I added conversion to upper case, because the
Commodore PET only knew upper-case letters. I use LINE INPUT instead
of INPUT. This BASIC program should also run under Tandy Model 100
BASIC. The Model 100 was a "laptop" from 1983 with an 8-bit 80C85
processor and a 300 baud modem.
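The purpose of the Y0U trick is easy to demonstrate. The conjugation
pairs are applied one after another; if the ME pair were written with
the letter-O YOU, it would re-swap the YOU that the I pair has just
produced, and the user's I would collapse to ME. A small Python
sketch of this (my own illustration; the swap helper with its
placeholder is not how the BASIC code works internally, it only
reproduces the pairwise swap):

# Pairwise swap as in the conjugation loop: each pair (a, b) turns a into b
# and b into a.
def swap(s, a, b):
    # swap all occurrences of a and b in s (a placeholder keeps them apart)
    return s.replace(a, "\0").replace(b, a).replace("\0", b)

text = " YOU SAY I AM WRONG "
step1 = swap(text, " YOU ", " I ")      # -> " I SAY YOU AM WRONG "
wrong = swap(step1, " YOU ", " ME ")    # -> " I SAY ME AM WRONG "  (I -> YOU -> ME)
right = swap(step1, " Y0U ", " ME ")    # -> " I SAY YOU AM WRONG " (YOU stays untouched)
print(wrong)
print(right)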
10 REM
20 REM ELIZA/DOCTOR
30 REM CREATED BY JOSEPH WEIZENBAUM
40 REM THIS VERSION BY JEFF SHRAGER
50 REM EDITED AND MODIFIED FOR MITS 8K BASIC 4.0 BY STEVE NORTH
60 REM CREATIVE COMPUTING PO BOX 789-M MORRISTOWN NJ 07960
65 REM EDITED AND MODIFIED FOR GW-BASIC 3.2.3 BY ANDRE ADRIAN
70 REM
80 REM -----INITIALIZATION-----
100 DEFINT A-Z:DIM S(36),R(36),N(36)
110 N1=36
120 RESTORE 2500
130 FOR X=1 TO N1
140 READ S(X),L:R(X)=S(X):N(X)=S(X)+L-1
150 NEXT X
160 PRINT "HI! I'M ELIZA. WHAT'S YOUR PROBLEM?"
170 REM
180 REM -----USER INPUT SECTION-----
190 REM
200 LINE INPUT K$
210 REM GET RID OF APOSTROPHES, CONVERT TO UPPERCASE
215 I$=" ":FOR L=1 TO LEN(K$):X=ASC(MID$(K$,L,1)):IF X=39 THEN 230
220 IF X>=97 AND X<=122 THEN X=X-32
225 I$=I$+CHR$(X)
230 NEXT L
235 I$=I$+" ":IF INSTR(I$,"SHUT")>0 THEN PRINT "SHUT UP...":END
255 IF I$=P$ THEN PRINT "PLEASE DON'T REPEAT YOURSELF!":GOTO 170
260 REM
270 REM -----FIND KEYWORD IN I$-----
280 REM
290 RESTORE
295 S=0
300 FOR K=1 TO N1
310 READ K$
315 IF S>0 THEN 360
320 L=INSTR(I$,K$):IF L>0 THEN S=K:T=L:F$=K$
360 NEXT K
365 IF S>0 THEN K=S:L=T:GOTO 390
370 K=36:GOTO 570:REM WE DIDN'T FIND ANY KEYWORDS
380 REM
390 REM TAKE RIGHT PART OF STRING AND CONJUGATE IT
400 REM USING THE LIST OF STRINGS TO BE SWAPPED
410 REM
420 RESTORE 1200:REM SKIP OVER KEYWORDS
430 C$=" "+RIGHT$(I$,LEN(I$)-LEN(F$)-L+1)
440 FOR X=1 TO 7
450 READ S$,R$
460 S=INSTR(C$,S$):R=INSTR(C$,R$):IF S<R THEN 500
470 IF S>0 THEN C$=LEFT$(C$,S-1)+R$+RIGHT$(C$,LEN(C$)-S-LEN(S$)+1)
480 IF R>0 THEN C$=LEFT$(C$,R-1)+S$+RIGHT$(C$,LEN(C$)-R-LEN(R$)+1)
490 GOTO 520
500 IF R>0 THEN C$=LEFT$(C$,R-1)+S$+RIGHT$(C$,LEN(C$)-R-LEN(R$)+1)
510 IF S>0 THEN C$=LEFT$(C$,S-1)+R$+RIGHT$(C$,LEN(C$)-S-LEN(S$)+1)
520 NEXT X
555 IF MID$(C$,2,1)=" " THEN C$=RIGHT$(C$,LEN(C$)-1):REM ONLY 1 SPACE
560 REM
570 REM NOW USING THE KEYWORD NUMBER (K) GET REPLY
580 REM
590 RESTORE 1300:REM SKIP TO REPLIES
600 FOR X=1 TO R(K):READ F$:NEXT X:REM READ RIGHT REPLY
610 R(K)=R(K)+1: IF R(K)>N(K) THEN R(K)=S(K)
620 IF RIGHT$(F$,1)<>"*" THEN PRINT F$:P$=I$:GOTO 170
630 PRINT LEFT$(F$,LEN(F$)-1);C$
640 P$=I$:GOTO 170
1000 REM
1010 REM -----PROGRAM DATA FOLLOWS-----
1020 REM
1030 REM KEYWORDS
1040 REM
1050 DATA "CAN YOU","CAN I","YOU ARE","YOURE","I DONT","I FEEL"
1060 DATA "WHY DONT YOU","WHY CANT I","ARE YOU","I CANT","I AM","IM "
1070 DATA "YOU ","I WANT","WHAT","HOW","WHO","WHERE","WHEN","WHY"
1080 DATA "NAME","CAUSE","SORRY","DREAM","HELLO","HI ","MAYBE"
1090 DATA " NO","YOUR","ALWAYS","THINK","ALIKE","YES","FRIEND"
1100 DATA "COMPUTER","NOKEYFOUND"
1200 REM
1210 REM STRING DATA FOR CONJUGATIONS
1220 REM
1230 DATA " ARE "," AM ","WERE ","WAS "," YOU "," I ","YOUR ","MY "
1235 DATA " IVE "," YOUVE "," IM "," YOURE "," Y0U "," ME "
1300 REM
1310 REM REPLIES
1320 REM
1330 DATA "DON'T YOU BELIEVE THAT I CAN*"
1340 DATA "PERHAPS YOU WOULD LIKE TO BE ABLE TO*"
1350 DATA "YOU WANT ME TO BE ABLE TO*"
1360 DATA "PERHAPS YOU DON'T WANT TO*"
1365 DATA "DO YOU WANT TO BE ABLE TO*"
1370 DATA "WHAT MAKES YOU THINK I AM*"
1380 DATA "DOES IT PLEASE YOU TO BELIEVE I AM*"
1390 DATA "PERHAPS YOU WOULD LIKE TO BE*"
1400 DATA "DO YOU SOMETIMES WISH YOU WERE*"
1410 DATA "DON'T YOU REALLY*"
1420 DATA "WHY DON'T YOU*"
1430 DATA "DO YOU WISH TO BE ABLE TO*"
1440 DATA "DOES THAT TROUBLE YOU?"
1450 DATA "TELL ME MORE ABOUT SUCH FEELINGS."
1460 DATA "DO YOU OFTEN FEEL*"
1470 DATA "DO YOU ENJOY FEELING*"
1480 DATA "DO YOU REALLY BELIEVE I DON'T*"
1490 DATA "PERHAPS IN GOOD TIME I WILL*"
1500 DATA "DO YOU WANT ME TO*"
1510 DATA "DO YOU THINK YOU SHOULD BE ABLE TO*"
1520 DATA "WHY CAN'T YOU*"
1530 DATA "WHY ARE YOU INTERESTED IN WHETHER OR NOT I AM*"
1540 DATA "WOULD YOU PREFER IF I WERE NOT*"
1550 DATA "PERHAPS IN YOUR FANTASIES I AM*"
1560 DATA "HOW DO YOU KNOW YOU CAN'T*"
1570 DATA "HAVE YOU TRIED?"
1580 DATA "PERHAPS YOU CAN NOW*"
1590 DATA "DID YOU COME TO ME BECAUSE YOU ARE*"
1600 DATA "HOW LONG HAVE YOU BEEN*"
1610 DATA "DO YOU BELIEVE IT IS NORMAL TO BE*"
1620 DATA "DO YOU ENJOY BEING*"
1630 DATA "WE WERE DISCUSSING YOU-- NOT ME."
1640 DATA "OH, I*"
1650 DATA "YOU'RE NOT REALLY TALKING ABOUT ME, ARE YOU?"
1660 DATA "WHAT WOULD IT MEAN TO YOU IF YOU GOT*"
1670 DATA "WHY DO YOU WANT*"
1680 DATA "SUPPOSE YOU SOON GOT*"
1690 DATA "WHAT IF YOU NEVER GOT*"
1700 DATA "I SOMETIMES ALSO WANT*"
1710 DATA "WHY DO YOU ASK?"
1720 DATA "DOES THAT QUESTION INTEREST YOU?"
1730 DATA "WHAT ANSWER WOULD PLEASE YOU THE MOST?"
1740 DATA "WHAT DO YOU THINK?"
1750 DATA "ARE SUCH QUESTIONS ON YOUR MIND OFTEN?"
1760 DATA "WHAT IS IT THAT YOU REALLY WANT TO KNOW?"
1770 DATA "HAVE YOU ASKED ANYONE ELSE?"
1780 DATA "HAVE YOU ASKED SUCH QUESTIONS BEFORE?"
1790 DATA "WHAT ELSE COMES TO MIND WHEN YOU ASK THAT?"
1800 DATA "NAMES DON'T INTEREST ME."
1810 DATA "I DON'T CARE ABOUT NAMES-- PLEASE GO ON."
1820 DATA "IS THAT THE REAL REASON?"
1830 DATA "DON'T ANY OTHER REASONS COME TO MIND?"
1840 DATA "DOES THAT REASON EXPLAIN ANYTHING ELSE?"
1850 DATA "WHAT OTHER REASONS MIGHT THERE BE?"
1860 DATA "PLEASE DON'T APOLOGIZE!"
1870 DATA "APOLOGIES ARE NOT NECESSARY."
1880 DATA "WHAT FEELINGS DO YOU HAVE WHEN YOU APOLOGIZE."
1890 DATA "DON'T BE SO DEFENSIVE!"
1900 DATA "WHAT DOES THAT DREAM SUGGEST TO YOU?"
1910 DATA "DO YOU DREAM OFTEN?"
1920 DATA "WHAT PERSONS APPEAR IN YOUR DREAMS?"
1930 DATA "ARE YOU DISTURBED BY YOUR DREAMS?"
1940 DATA "HOW DO YOU DO ... PLEASE STATE YOUR PROBLEM."
1950 DATA "YOU DON'T SEEM QUITE CERTAIN."
1960 DATA "WHY THE UNCERTAIN TONE?"
1970 DATA "CAN'T YOU BE MORE POSITIVE?"
1980 DATA "YOU AREN'T SURE?"
1990 DATA "DON'T YOU KNOW?"
2000 DATA "ARE YOU SAYING NO JUST TO BE NEGATIVE?"
2010 DATA "YOU ARE BEING A BIT NEGATIVE."
2020 DATA "WHY NOT?"
2030 DATA "ARE YOU SURE?"
2040 DATA "WHY NO?"
2050 DATA "WHY ARE YOU CONCERNED ABOUT MY*"
2060 DATA "WHAT ABOUT YOUR OWN*"
2070 DATA "CAN YOU THINK OF A SPECIFIC EXAMPLE?"
2080 DATA "WHEN?"
2090 DATA "WHAT ARE YOU THINKING OF?"
2100 DATA "REALLY, ALWAYS?"
2110 DATA "DO YOU REALLY THINK SO?"
2120 DATA "BUT YOU ARE NOT SURE YOU*"
2130 DATA "DO YOU DOUBT YOU*"
2140 DATA "IN WHAT WAY?"
2150 DATA "WHAT RESEMBLANCE DO YOU SEE?"
2160 DATA "WHAT DOES THE SIMILARITY SUGGEST TO YOU?"
2170 DATA "WHAT OTHER CONNECTIONS DO YOU SEE?"
2180 DATA "COULD THERE REALLY BE SOME CONNECTION?"
2190 DATA "HOW?"
2200 DATA "YOU SEEM QUITE POSITIVE."
2210 DATA "ARE YOU SURE?"
2220 DATA "I SEE."
2230 DATA "I UNDERSTAND."
2240 DATA "WHY DO YOU BRING UP THE TOPIC OF FRIENDS?"
2250 DATA "DO YOUR FRIENDS WORRY YOU?"
2260 DATA "DO YOUR FRIENDS PICK ON YOU?"
2270 DATA "ARE YOU SURE YOU HAVE ANY FRIENDS?"
2280 DATA "DO YOU IMPOSE ON YOUR FRIENDS?"
2290 DATA "PERHAPS YOUR LOVE FOR FRIENDS WORRIES YOU."
2300 DATA "DO COMPUTERS WORRY YOU?"
2310 DATA "ARE YOU TALKING ABOUT ME IN PARTICULAR?"
2320 DATA "ARE YOU FRIGHTENED BY MACHINES?"
2330 DATA "WHY DO YOU MENTION COMPUTERS?"
2340 DATA "WHAT DO YOU THINK MACHINES HAVE TO DO WITH YOUR PROBLEM?"
2350 DATA "DON'T YOU THINK COMPUTERS CAN HELP PEOPLE?"
2360 DATA "WHAT IS IT ABOUT MACHINES THAT WORRIES YOU?"
2370 DATA "SAY, DO YOU HAVE ANY PSYCHOLOGICAL PROBLEMS?"
2380 DATA "WHAT DOES THAT SUGGEST TO YOU?"
2390 DATA "I SEE."
2400 DATA "I'M NOT SURE I UNDERSTAND YOU FULLY."
2410 DATA "COME, COME ELUCIDATE YOUR THOUGHTS."
2420 DATA "CAN YOU ELABORATE ON THAT?"
2430 DATA "THAT IS QUITE INTERESTING."
2500 REM
2510 REM DATA FOR FINDING RIGHT REPLIES
2520 REM
2530 DATA 1,3,4,2,6,4,6,4,10,4,14,3,17,3,20,2,22,3,25,3
2540 DATA 28,4,28,4,32,3,35,5,40,9,40,9,40,9,40,9,40,9,40,9
2550 DATA 49,2,51,4,55,4,59,4,63,1,63,1,64,5,69,5,74,2,76,4
2560 DATA 80,3,83,7,90,3,93,6,99,7,106,6