Serious Word Document Problems - Please, PLEASE Help! :(


Beth Rosengard

Congratulations, Jeff! And by all means hang around. McGhie needs someone
to keep him in line (not to speak of Elliott ;-).

Cheers,

Beth
 

Jeff Wiseman

Beth said:
Congratulations, Jeff! And by all means hang around. McGhie needs someone
to keep him in line (not to speak of Elliott ;-).


The main reason I've mostly been lurking here for the last
several months is that I haven't been using the tool (Word) on a
regular or intensive basis, so there hasn't been any pressure to
learn a lot in a short time.

I suspect that with this new job I'll get there, and they will
promptly give me a PC with Office on it, and that will then
become my primary documenting and communications tool--something
I've desperately tried to avoid over the last two decades :)

If this is true, I will likely need all of your expert input as I
come up to speed on these new tools very quickly. I've got some
references here that I've determined to read but haven't gotten
to yet (#1 will likely be "Bending Word").

In any event, you all won't be free of me; you'll just likely get
more trouble reports than philosophy, I suspect. John's weekends
are likely to get a lot more boring :)
 

Elliott Roper

The analogy is that Word is like a bacterium in a healthy body
when it is run over Darwin. The prioritizing algorithm of Darwin
sees this process that wants to just keep running because of its
silly "idle loop" design, and so it will attempt to protect the
rest of the system by dropping Word's run time priority in stages.

You are being hard on Word here. It is not so much an idle loop as an
'event loop'. Nearly everything with a GUI has something like an event
loop in it. It is horrible, but that's the way it is. If Word is
waiting for input, it will use very little CPU. Looking at top forces
you to believe that it is scheduling itself to run more than it ought
to, probably looking for stuff from other bits of Office in a dumber
than normal way, but it is never sitting in a hard loop as you claim.

PS I like your and John's discussion about users getting into a working
pattern. It helps to explain how proponents from each camp stay
polarized. The other day I was showing someone a few neat Mac tricks
(keyboard nav mostly) and he was mildly impressed. Then he countered
with a virtuoso display on his Windows lappy, and so was I.
 

Jeff Wiseman

Elliott said:
You are being hard on Word here. It is not so much an idle loop as an
'event loop'. Nearly everything with a GUI has something like an event
loop in it. It is horrible, but that's the way it is. If Word is
waiting for input, it will use very little CPU. Looking at top forces
you to believe that it is scheduling itself to run more than it ought
to, probably looking for stuff from other bits of Office in a dumber
than normal way, but it is never sitting in a hard loop as you claim.


In your response to John, you also discussed issues of this
nature imposed by the GUI implementation. I must admit a lot went
over my head, as my days of working as a real-time embedded
systems software engineer were quite a ways back. Most
applications I worked on tended to work directly off the standard
input and output in a Unix type system. Keystrokes were handled
via interrupts in the device driver and sent to the application
grouped statistically. Although applications could be set up to
poll for everything, the correct way was for the application to
block while waiting for a semaphore or message in a task queue.
When this occurred, that task was totally dormant pulling no CPU
cycles whatsoever until it was passed the appropriate event from
the OS itself.
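The blocking pattern described above — a task fully dormant, consuming no CPU until a message arrives — can be sketched with Python's standard library (the event names here are made up for illustration):

```python
import queue
import threading

def worker(q: queue.Queue, results: list) -> None:
    # The task blocks inside q.get() -- completely dormant, burning
    # no CPU cycles -- until the OS satisfies the wait with a message.
    while True:
        event = q.get()          # no polling, no busy-waiting
        if event == "shutdown":
            return
        results.append(f"handled:{event}")

q = queue.Queue()
results: list = []
t = threading.Thread(target=worker, args=(q, results))
t.start()

q.put("keystroke")   # the "device driver" delivers an event
q.put("shutdown")
t.join()
print(results)  # → ['handled:keystroke']
```

The same contrast with polling applies in any language: the blocked task is scheduled only when the queue's underlying condition variable is signalled.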

However, using a GUI to essentially replace the I/O device
drivers obviously will have issues I'm not familiar with. A
device driver by its hardware interrupt nature must run at top
priority in the system. I'm guessing that a formalized GUI (e.g.,
the finder, Openwindows, X11, etc.) is forced to have large
software components that must deal with multiple priority
components to handle the data flow control issues. Are these the
reasons you make the above comment? Because certainly from a
command line, all applications can be made (if the developer is
smart enough to choose it) to fully block when waiting for any
events.

Also, this "horrible" circumstance that the GUI introduces and
you refer to--is it a common issue in all GUIs that you know of?
For example, would this issue exist in typical Unix GUIs as well
(e.g., the OpenLook window manager or other Solaris interfaces,
X11, and even something like Aqua over Darwin)?

PS I like your and John's discussion about users getting into a working
pattern. It helps to explain how proponents from each camp stay
polarized.


I was going to elaborate a bit on that but I figured Sunday was
over in Australia anyway :)

I have an example that I frequently use in presenting this
concept. As a kid, I had a backyard badminton set that I
enjoyed playing with others. Since I was "self-taught" for many
years, I learned to hold the badminton racket like a tennis racket
(i.e., if you hold your arm out straight in front of you with the
racket in your hand, you are looking through the face of the
racket). When I got to high school, we were taught in gym class
that this was not correct and that I needed to rotate the racket
90 degrees in my hand, so that when holding it out in front of
you, you only see the edge of the racket.

I really struggled unlearning the old technique and my game was
very poor for quite a while using the new grip. Eventually, as I
became adjusted I began realizing that I could do so many things
that were never possible for me to do before. For example, a hard
smash coming directly at your face could be instantly returned
with a slight flick of the wrist (backhand).

I've attempted to learn from this experience over the years that
when people with more experience than I in some subject suggest
that how I am doing something can be improved a lot by changing
something, it is worth my time and at least some effort to
explore it and see if I need to unlearn something that is holding
me back. This is not a natural thing to do. The pain of change
keeps many people (myself included) from doing things better.

Another thing is that superficial components of a problem sometimes
seem to get much more weight in decision-making. The Aqua-based
Finder in the new OS X may be more cosmetically appealing
to newcomers than, say, a Windows XP desktop. Is that alone really
enough to base a decision on? I'm more interested in how much of
the OS has been built in an orthogonally structured fashion. That
way it will have fewer bugs, and the bugs that are there will be
detected and removed more easily. That kind of OS is
strengthened over time. A poorly structured OS has only a
future of increasing entropy. Structure begets structure; entropy
follows entropy.

I would rather have a more limited GUI knowing that what's
underneath it is more solid. I want my life to be easier; it
would be awfully disappointing to realize how easy or intuitive
something could have been just by changing how you think of it.

Case in point: try getting a light user/newbie to understand
what a hanging indent is and that you don't use multiple spaces.
If the application makes this easy to learn and apply (as
MacWrite and ClarisWorks used to), it's easy for the person to
cross over. Using paragraph styles and formatting to
set up hanging indents in Word is still scary to ME!

This is the basis for so many of my rants about MS products. Features
that are not intuitive, and that carry severe penalties and side effects
when not totally, fully, and extensively applied, will NOT
be learned by many, many people, because they will give up and
even reject the product itself (if they are given the
choice--many are stuck with Word because they have no choice, as
John has pointed out before). If a product is structured
cohesively, it will tend to be intuitive: if newbies make
a guess as to where they'd find an answer or how to do
something, and after trying it find they were correct, it fosters
confidence, so they will more readily try even more new
concepts. If they spend 4 frustrating hours and then STILL fail
due to some inane, obscure gotcha in the product, they will give
up and probably never try searching for or trying any new aspect
of that product on their own again!

(and most folks are not tenacious enough to stay at it 4 hours. I
am, so if I've still failed--especially on multiple aspects of
that product--I tend to become that product's marketing
department's worst nightmare because after all that time I spent
trying to make it work, I will be able to rattle off a long list
of undeniable idiocies that exist in that product that will scare
the daylights out of anyone considering it :)

The other day I was showing someone a few neat Mac tricks
(keyboard nav mostly) and he was mildly impressed. Then he countered
with a virtuoso display on his Windows lappy, and so was I.


I will not deny that the interface flexibility--especially
customization--on the PC can be quite extensive. However, were
all of those "tricks" simply logical extensions of things you
already knew, or were they special exception capabilities that
were put in just to provide some kind of useful trick?

For example, the object paradigm should be maintained everywhere
possible, since that is how we think. If I select an object and
then perform a sizing function on it, then when I select a group of
objects (i.e., an object which is a set of objects) and want to
size it, that should be performed in exactly the same way, using the
same menu items, etc. What you see in a legacy product is a menu
item for sizing single objects and a separate item for sizing
groups of objects. What you are seeing in the GUI there is a
reflection of the lack of cohesion in the innards of the
software. There are two software functions so you get two menu
items even though intuitively they are functionally identical.
The only reason there are two items is because two sets of
software, each different, were developed at two different times
or by two different development groups.

Case in point: I have a spreadsheet. I select a full column, and by
positioning my cursor over the head of the column I can make it
wider. What if I wanted to take three adjacent columns and scale
them all together as a single unit? Well, I would typically just
select all three columns, position the cursor, and drag it to make
them all scale up together--exactly the same operation. This is
usually how you do it in most spreadsheets that I've used,
EXCEPT of course for Excel. For some reason (likely one of the
two I just mentioned), scaling a single column and a set of them
are done totally differently. In fact, if you select a set of (say)
three columns and use the click-and-drag method to size them up, even
though all three are selected, the silly application sizes only
ONE of them. It conveniently ignores the fact that the other two
columns have also been selected--probably because scaling them as
a set is too complicated. But instead of simply putting up a
denial indicating that it can't do that (bad marketing policy),
it just pretends that you told it to do something else.

In other words, it didn't do what you told it to do, nor what
intuitively should have happened. This is a reason that, when I'm
in a hurry and need to put together a fast spreadsheet, I do not
use Excel--too much risk. Very little confidence. Why should I
change? Well, in this case, corporate standards seem to revolve
around Excel for spreadsheets, and therefore I really need to
understand it better to function well at my job.

So after I calm down from being so steamed about a stupid
behavior, I will start over and again try to figure out how to
size groups of columns differently from how I've done it for the
last 20 years using at least three different major spreadsheet
programs.

Get the point? :)
 

Elliott Roper

Jeff Wiseman said:
In your response to John, you also discussed issues of this
nature imposed by the GUI implementation. I must admit a lot went
over my head, as my days of working as a real-time embedded
systems software engineer were quite a ways back. Most
applications I worked on tended to work directly off the standard
input and output in a Unix type system. Keystrokes were handled
via interrupts in the device driver and sent to the application
grouped statistically. Although applications could be set up to
poll for everything, the correct way was for the application to
block while waiting for a semaphore or message in a task queue.
When this occurred, that task was totally dormant pulling no CPU
cycles whatsoever until it was passed the appropriate event from
the OS itself.

However, using a GUI to essentially replace the I/O device
drivers obviously will have issues I'm not familiar with. A
device driver by its hardware interrupt nature must run at top
priority in the system. I'm guessing that a formalized GUI (e.g.,
the finder, Openwindows, X11, etc.) is forced to have large
software components that must deal with multiple priority
components to handle the data flow control issues. Are these the
reasons you make the above comment? Because certainly from a
command line, all applications can be made (if the developer is
smart enough to choose it) to fully block when waiting for any
events.
When you set up your application's GUI interface, you register with the
OS all the events you are interested in and then wait. The operating
system will schedule you whenever one of them occurs in one of your
windows. That part of the OS is the window manager, and it is
typically a monster.
When you get control, you run about looking at why you were scheduled
and decide what to do about it. That is your event loop. You may say
you were interested in time expiry as well. Then you do whatever is
needed and wait.
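That register-then-wait shape can be illustrated with, e.g., Python's selectors module; a minimal sketch in which a pipe and the "keyboard" label stand in for real GUI event sources:

```python
import os
import selectors

# Register the event sources we are interested in, then block in
# select() until the window-manager analogue has something for us.
sel = selectors.DefaultSelector()
r, w = os.pipe()
sel.register(r, selectors.EVENT_READ, data="keyboard")

os.write(w, b"x")            # simulate an event arriving

handled = []
for key, _mask in sel.select(timeout=1.0):   # the "wait" of the event loop
    # We got scheduled: figure out why, then act on it.
    payload = os.read(key.fileobj, 1)
    handled.append((key.data, payload))

sel.unregister(r)
os.close(r)
os.close(w)
print(handled)  # → [('keyboard', b'x')]
```

A real event loop would wrap the select() call in a `while` loop and dispatch on `key.data`, but the register/wait/dispatch skeleton is the same.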
Also, this "horrible" circumstance that the GUI introduces and
you refer to--is it a common issue in all GUIs that you know of?
For example, would this issue exist in typical Unix GUIs as well
(e.g., the OpenLook window manager or other Solaris interfaces,
X11, and even something like Aqua over Darwin)?

Yes, it is almost universal. I preferred VWS on VMS, where ASTs were
delivered right to the door, in chronological order, but that was
thrown out years ago in favour of X11/DECWindows/Motif like all the
others used, where you scuttle about your event loop, emitting
callbacks willy-nilly, possibly in a different order from the user's
actions.

Either way, it is in contrast to the CLI way, where you simply do not
have to worry about the pesky customer from one return to the next.

Cramming CLI thinking into a GUI is often unsatisfactory. If the
computation takes a long time, the interface goes to pot. If you
counter that by scheduling threads, you end up writing a mini operating
system inside your application to keep the little snivellers apart.
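The thread-offload pattern Elliott alludes to — keeping the interface responsive by running the long computation elsewhere and posting the result back to the event queue — might look like this minimal sketch (function names are invented):

```python
import queue
import threading

def long_computation(n: int) -> int:
    # Stand-in for a slow calculation that would freeze a GUI
    # if run directly inside the event loop.
    return sum(i * i for i in range(n))

def start_in_background(n: int, events: queue.Queue) -> None:
    # Run the slow work on a thread; post the result back to the
    # event queue so the event loop itself never blocks on it.
    def run() -> None:
        events.put(("done", long_computation(n)))
    threading.Thread(target=run, daemon=True).start()

events: queue.Queue = queue.Queue()
start_in_background(1000, events)

# The event loop would pick this up alongside user input events.
kind, value = events.get(timeout=5)
print(kind, value)  # → done 332833500
```

The "mini operating system" complaint arises once several such workers share state and must be synchronized; here the queue is the only shared object, which is the easy case.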
I was going to elaborate a bit on that but I figured Sunday was
over in Australia anyway :)
Another thing is that superficial components of a problem sometimes
seem to get much more weight in decision-making. The Aqua-based
Finder in the new OS X may be more cosmetically appealing
to newcomers than say a Windows XP desktop. Is that alone really
enough to base a decision on? I'm more interested in how much of
the OS has been built in an orthogonally structured fashion. That
way it will have fewer bugs, the bugs that are there will be
detected and removed more easily. This kind of an OS is
strengthened over time. A poorly structured OS will only have a
future of increased entropy. Structure begets structure, entropy
follows entropy.

I think Windows and Mac OS X suffer equally in this regard.
Unix never progressed beyond X11 for GUI stuff. It is still a
multi-user CLI minicomputer operating system. The NT kernel is much the
same, without the multi-user history. Neither sits well with its GUI.
NT's entropy is Win32, dumped unceremoniously into the kernel, mixed
with ActiveX and released as a security nightmare. OS X's is the unix
file system, stunted at birth by lack of metadata, and totally bereft
of structure. How its proponents can claim this to be a feature has
always escaped me.
I would rather have a more limited GUI knowing that what's
underneath it is more solid. I want my life to be easier, it
would be awfully disappointing to realize how easy or intuitive
something could be by just changing how you think of it.
Heh! We would have found it by now if it were there. Actually, one of
the pleasant aspects of GUI working is the diverse alternatives. Once
in a while it is pleasant to use the mouse. The programs you get to
really know and love often have several ways of doing the same job.
You just have to watch the different answers to users' problems on this
list.
Case in point: try getting a light user/newbie to understand
what a hanging indent is and that you don't use multiple spaces.
If the application makes this easy to learn and apply (as
MacWrite and ClarisWorks used to), it's easy for the person to
cross over. Using paragraph styles and formatting to
set up hanging indents in Word is still scary to ME!

Trying to get Word to do *anything* with typographic elegance is almost
impossible, but here you have a point. Look at the way specifying
numbers and specifying indents are intertwingled in Word.
This is the basis for so many of my rants with MS products. Items
that are not intuitive and have severe penalties and side effects
when not totally, fully, and extensively implemented, will NOT
be learned by many, many people because they will give up and
even reject the product itself (if they are given the
choice--many are stuck with Word because they have no choice as
John has pointed out before). If a product is structured
cohesively, it will tend to be intuitive such that if a newbie
made a guess as to where he would find an answer or how to do
something, if after trying it finds he was correct, it fosters
confidence such that they will more easily try even more new
concepts. If they spend 4 frustrating hours and then STILL fail
due to some inane obscure gotcha in the product, they will give
up and probably never try searching for or trying any new aspect
of that product on their own again!

Indeed. What you need is a decent framework underneath it all. You can
see the difference when applications are built on the Cocoa Framework
by developers who are not fighting it. Look at OmniOutliner or
OmniEverythingElse or TextEdit or Pages or Xcode itself to see how the
structure builds on itself. With Core Image and Core Data this will end
up as powerful stuff.

Word on the Mac is between a rock and a hard place. It is built on
Carbon and has to work like its cuz on the PC, and work like it used
to.

It would be nice if the XML-ish next version of Word could be
compatible with Core Data... wheee! is that a pig circling overhead?

Not the least of Office on Mac's problems is that the Windows frameworks
are not standing still either. There will be a lot of original work
needed to map new Windows stuff to new Mac stuff as they try to keep
the two in line. It must be something like herding cats.
I will not deny that the interface flexibility--especially
customization--on the PC can be quite extensive. However, were
all of those "tricks" simply logical extensions of things you
already knew, or were they special exception capabilities that
were put in just to provide some kind of useful trick?

I think he was holding his racquet the right way.
For example, the object paradigm should be maintained everywhere
possible since that is how we think....
Get the point? :)
Oh yes. Like the famous yokel said, "I wouldn't start from here".
 

Jeff Wiseman

Elliott said:
When you set up your application's GUI interface, you register with the
OS all the events you are interested in and then wait. The operating
system will schedule you whenever one of them occurs in one of your
windows. That part of the OS is the windows manager and is typically a
monster.
When you get control, you run about looking at why you were scheduled
and decide what to do about it. That is your event loop. You may say
you were interested in time expiry as well. Then you do whatever is
needed and wait.

I kinda thought that was what you were getting at. In the custom
systems that I was involved in building, we got around all those
problems by restricting the unblocking of a task to a single
event type--the receipt of a message on a pipe. All messages
were defined to have a "message type" field, so as soon as the
task came unblocked and picked up its message, it simply passed
it to a case statement--basically a simple state machine to
process the event. Even interrupts were simply converted to
messages by the task and sent to itself. This solves a multitude
of glare (race) conditions by GUARANTEEING that all events are
processed exactly in the order that they occurred IN REAL TIME. It
also had the advantage that, during state changes, events still in
the pipe/queue that were no longer valid could be selectively
purged appropriately during the state transition.
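A toy version of that single-event-type design — every stimulus arriving as one generic message with a "type" field, feeding a case-statement state machine — can be sketched in Python (message types and states are invented for illustration):

```python
import queue

def run_state_machine(events: list) -> list:
    # Every stimulus is one generic message with a "type" field, so
    # processing order is exactly arrival order, and the handler is a
    # single dispatch (the "case statement") on that field.
    q: queue.Queue = queue.Queue()
    for e in events:
        q.put(e)
    q.put({"type": "stop"})          # sentinel so the sketch terminates

    state, log = "idle", []
    while True:
        msg = q.get()                # the sole unblocking event: a message
        mtype = msg["type"]
        if mtype == "stop":
            return log
        # The "case statement": transitions keyed on (state, type).
        if state == "idle" and mtype == "start":
            state = "running"
        elif state == "running" and mtype == "halt":
            state = "idle"
        log.append((mtype, state))

log = run_state_machine([{"type": "start"}, {"type": "halt"}])
print(log)  # → [('start', 'running'), ('halt', 'idle')]
```

Because everything funnels through one FIFO queue, ordering guarantees come for free; selective purging during a state transition would just be a filtered drain of the queue.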

And this was all done in a "Unix"-type environment (e.g., SCO
Xenix, etc.). The upshot was the need to disallow programmers from
using the "other" legacy signalling mechanisms in the OS. Unix
has become efficient enough that sending a message to a pipe is
just as fast as sending a signal or trap. Forcing the programmers
to use the single generic message event concept solved an entire
host of problems of the nature which you describe. The problem is
that most OSs out there now are not trimmed up this way as there
is too much concern about "legacy" applications that have all of
this confusion of inter-task communication types that need to be
maintained.

It CAN be done, I've worked on teams that have done it but the
Legacy beast is a hard one to kill in marketing (and as Mr.
McGhie has pointed out on occasion, if there's no money coming
in, there is no payroll for programmers...)

Yes, it is almost universal. I preferred VWS on VMS, where ASTs were
delivered right to the door, in chronological order, but that was
thrown out years ago in favour of X11/DECWindows/Motif like all the
others used, where you scuttle about your event loop, emitting
callbacks willy-nilly, possibly in a different order from the user's
actions.


OK, so it does appear that it's the same problem, in that there are
too many ways to report and process those events (i.e., as in
Unix: signals, traps, messages, etc.). I assume that by AST you
are referring to Asynchronous System Traps? I also assume your
comment was referring to how, in VMS, by processing only ASTs as
events you are guaranteed real-time sequencing? My point is that
you CAN have that in the Unix environment, but it requires the
OS-wide restrictions and programmer constraints that I just pointed
out. You would be amazed at how well this works. However,
everyone wants Unix to stay Unix. And these WERE custom GUIs
(albeit much simpler than the current Finder, etc.)

Either way, it is in contrast to the CLI way, where you simply do not
have to worry about the pesky customer from one return to the next.


Well, true, but maybe that's only because CLI-based applications
typically don't have a widely divergent set of event types to
deal with. It seems to me that basically the only
real difference is the much wider range of event types the GUI
itself has to handle, and all of the real-time implications. E.g.,
the one thing I always hated most about Windows was when
you lost control of the cursor, or its movement across the screen
wasn't smooth. You sometimes couldn't move it when a process
started spinning, and the "feel" of that was always disturbing to
me (in general, the Mac never really did this). It doesn't have
to be this way, but it requires a significant infrastructure
change, and we are again back up against both the Legacy beast AND
the programmers' ability to realize a proper implementation.

Cramming CLI thinking into a GUI is often unsatisfactory. If the
computation takes a long time, the interface goes to pot. If you
counter that by scheduling threads, you end up writing a mini operating
system inside your application to keep the little snivellers apart.


Yes, as just mentioned above. The other issue is that most
programmers think procedurally, when you really want folks who are
fluent in state manipulation at the top of the application
itself. Unfortunately, that is not a common skill. Although there
are many programmers who can partition software modules into fairly
cohesive sections, not many know how to partition a massive state
machine into its cascaded Mealy- and Moore-type components
(although as folks become more proficient in the object paradigm
this is changing some). If your entire application task is
reduced to nothing but a state machine's event handler (which I
declare is the "correct" solution here), you are really going to
need people who know how to do it effectively.
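One common way to make an application "nothing but a state machine's event handler" is a transition table rather than nested conditionals; a Mealy-style sketch with invented states and events:

```python
# (state, event) -> (next_state, output): a Mealy-style transition
# table, where the output depends on both the state and the event.
TABLE = {
    ("idle", "open"): ("editing", "show_window"),
    ("editing", "save"): ("editing", "write_file"),
    ("editing", "close"): ("idle", "hide_window"),
}

def step(state: str, event: str):
    # Unknown (state, event) pairs are silently dropped rather than
    # crashing -- mirroring an event handler that discards stale events.
    return TABLE.get((state, event), (state, None))

state, outputs = "idle", []
for event in ["open", "save", "close"]:
    state, out = step(state, event)
    if out:
        outputs.append(out)

print(state, outputs)  # → idle ['show_window', 'write_file', 'hide_window']
```

Partitioning a massive machine then becomes a matter of splitting the table into cascaded sub-tables, which is far easier to review than equivalent branching code.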

BTW, I suspect that the multi-threading type programming is
actually a bottom-up approach to the partitioning of a high level
application state machine to deal with the real time issues, but
again, with so many people thinking in terms of linear processing
instead of pure event handling, this type of design becomes more
of an art form than a science. Hence the complexities that you
have already referred to.

I think both Windows and Mac OS X each suffer equally in this regard.
Unix never progressed beyond X11 for GUI stuff. It is still a


I thought that Motif was kinda the next step since it sorta sat
on top of X11, didn't it? In recent years Sun's Solaris started
having lots of applications that insisted on the Motif stuff.
Again, I'm not as familiar as I'd like to be, but I thought Motif had
actually gotten beyond X11. Is X11 still the only really solid
standard in that area? Has Motif been dying?

multi-user CLI minicomputer operating system. The NT kernel is much the
same, without the multi-user history. Neither sits well with its GUI.
NT's entropy is Win32, dumped uncermoniously into the kernel, mixed
with ActiveX and released as a security nightmare. OS X's is the unix
file system, stunted at birth by lack of metadata, and totally bereft
of structure. How its proponents can claim this to be a feature has
always escaped me.


Heh! Ever try setting up a desktop on a Sun? What a nightmare!
Once it's set up, you're OK, but don't try to change it. The
tools that Sun provided just to do things like adding an application
to a toolbar were the only things I saw that would
consistently crash the entire workstation.

The issue, as you point out, is of course that the metadata concept
is an object that contradicts the basic standard I/O data structures
of Unix. We're back to the Legacy beast again. It's hard to add
the metadata concept, since it really needs to replace some things,
IMHO.

Trying to get Word to do *anything* with typographic elegance is almost
impossible, but here you have a point. Look at the way specifiying
numbers and specifying indents are intertwingled in Word.


Heh, "intertwingled" is right!

Indeed. What you need is a decent framework underneath it all. You can
see the difference when applications are built on the Cocoa Framework
by developers who are not fighting it. Look at OmniOutliner or
OmniEverythingElse or TextEdit or Pages or Xcode itself to see how the
structure builds on itself. With Core Image and Core Data this will end
up as powerful stuff.


Right. What you are seeing in those frameworks is a natural,
slow-but-sure evolution toward the inherent "objectness" of the
problem domain for a GUI. Unfortunately, the Legacy sits
underneath; in the case of Mac OS X, it is a Unix clone. Now if
they could just retrofit those concepts into the OS itself--sorta
like when Microsoft tossed the DOS layer in order to allow its
Windows objects to run freely (not all of the Windows objects
would map nicely through DOS--not to mention the layering
problems). Obviously, one of the things this would do for Darwin
is integrate the metadata in a native fashion, as you've pointed
out already. The "objectness" of the way data is viewed in the
abstract would be more closely modeled in the OS's implementation.

I think he was holding his racquet the right way.


I have come very close to using a racket on a PC...

Oh yes. Like the famous yokel said "I wouldn't start from here"


But the million dollar questions are "Can you GET there from
here?" and if you can, "Will marketing LET you?"
:)
 

Jeff Wiseman

Beth said:
Hi Jeff,

Have you seen the previews of the new WinOffice 12 UI? It seems to me to
have been designed with your comments in mind. I have no idea if/when we'll
see something similar for MacOffice, but I like what I see.

There's a video at the following URL that shows some of the UI. You'll need
to have Windows Media Player (for the Mac) to view it. Even so, some people
have had trouble downloading the high resolution version of it (including
me). If the download button triggers a stream instead of a download, right
click on the download button and choose the download option. (Sounds
ridiculous, I know).

<http://channel9.msdn.com/showpost.aspx?postid=114720>

I've started a download but it's 35 minutes on my DSL. If I get a
chance later on, I'll have a look at it. In the meantime, I'm
swamped with all these forms I gotta get in for the next round of
checks and background checks I need to have done for the job I'm
going into.

I'm just having too much fun these days in m.p.m.o.word
 

Elliott Roper

This is becoming a private discussion innit?

I *strongly* agree with just about everything you said.
Motif sits on top of X11 and is alive and well.
I too love finite state machines and transaction pipes.
ASTs are asynch system traps.
I do look forward to Word's XML-ish framework and hope like hell it
works beautifully.

There. That's sorted the world out. I hope your new job works out well.

;-)
Elliott
 

Jeff Wiseman

Elliott said:
This is becoming a private discussion innit?


Maybe so. I guess the Subject line should've been changed to
"Wiseman and Elliott..."

'nuff said.

There. That's sorted the world out. I hope your new job works out well.


If it pays me money, it should :)

However, I suspect that the current tool set is going to require
me to develop a much deeper understanding of Office and Word than
I currently have, which is why I've been watching this group more
closely recently. Thanks for all the feedback.
 

Clive Huggan

I'm just having too much fun these days in m.p.m.o.word

So are we lurkers, Jeff and Elliott! Thank you for your comments in this
thread -- they have certainly filled up some gaps in my knowledge.

Cheers,

Clive
======
 

John McGhie [MVP - Word and Word Macintosh]

{Whew} Wiseman and Elliott both have jobs... The rest of us may get some
sleep now :)

Gentlemen, I thank you -- a fascinating discussion :)

I agree: I would currently back the WinXP task scheduler over the Darwin
one. Note: I said XP, not Windows 2000. They are all very closely related
to the DEC VAX VMS OS, because the same architect designed both.

However, they have come a long way. Perhaps the most elegant was Windows
NT 3.51. That was the "purest" expression of the idea... Everything had to
be abstracted through the Hardware Abstraction Layer: absolutely *nothing*
was allowed to get its grubby little hands on the hardware directly. Fixed
the crashes! And the entire OS was built in C. None of this evil Assembly
language where people could do naughty things...

Problem: Stable as a rock, sure, but slow as a wet week.

Along comes NT4: They let the damned drivers back into the hardware ring.
Bad idea! Much faster, but a bad graphics driver could take out the system
(and they did...)

Windows 2000 fixed that: "we know who you are, we know your parents, and if
you install naughty things we will silently roll you back out again!" Fixed
the crashes, but not the hangs. The system had only one request queue...
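That single request queue is why one stalled call could hang everything behind it. A toy model of the queueing idea (this is head-of-line blocking in the abstract, not Windows internals; the costs are arbitrary units):

```python
# Toy model of head-of-line blocking. With a single request queue, one
# stalled request (cost 50) delays every request behind it; splitting the
# work across independent queues confines the damage to one queue.
def finish_times(queue):
    t, times = 0, {}
    for name, cost in queue:
        t += cost
        times[name] = t
    return times

# One queue: b and c are stuck behind the stalled request.
single = finish_times([("a", 1), ("stalled", 50), ("b", 1), ("c", 1)])
print(single["c"])  # 53

# Two queues: the stalled request only delays its own queue.
per_worker = {**finish_times([("stalled", 50)]),
              **finish_times([("a", 1), ("b", 1), ("c", 1)])}
print(per_worker["c"])  # 3
```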

Windows XP (Windows NT 5.1...) is where they really got it together.
Multiple request queues. Only one processor architecture supported, so they
could pull all those nasty branches around Alpha code out. We know for
certain we're on a WinTel architecture now, so we can precisely tune the
memory model and branch prediction to the Pentium... But the real gift was
that, having done all that, they could finally begin to tune the thing
properly... The hourglass that turned into a calendar was finally gone
forever. Some very serious science went into tuning that kernel...

Of course, we have no idea what will happen in Windows Vista (Longhorn, the
"new one"...) But opportunities abound. Look at that video Beth referred
you to earlier. Then put your thinking caps on :) Ask yourself: "Why did
they do that?" I think Jeff will feel vindicated: it's a much better UI
structure. But that isn't the reason they did it, I bet. Sure, they have
64-bit memory addressing to play with: but THAT's not the reason: no user
this side of Bill Gates can AFFORD 64 bits worth of memory :)

My guess? They finally seized the opportunity to multi-thread Office. They
HAD to re-architect it anyway at some point. Windows Vista gave them the
excuse (read: "budget") they needed to actually DO it. Read the tea-leaves
with me for a moment: consider the direction software is taking -- much
more collaboration. Much more sharing of stuff. Much more seamless
integration between what used to be separate applications. Much more
dot-Netty-style distributed computing. A demand for better manageability
and security: one way to do that is to put all the "works" and all of the
"data" back in the glasshouse where it belongs, looked after by
professionals who do data security/integrity/availability as a lifetime
career choice. (You got any idea how many terror-dollars worth of data is
currently sitting on crappy IDE hard drives on the un-backed-up workstations
of the world's corporations?)

Now, in the glasshouse, you can afford a seriously-parallel computer with 64
bits worth of memory. But you want your application to work properly in the
glasshouse? Start thinking multi-threaded, multi-user, and
horror-of-horrors, multiply-reentrant code. Even Klingon programmers run
screaming into the night at the thought of designing multiply-reentrant
multi-threaded code :)
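For anyone lucky enough never to have met the term: "re-entrant" code must tolerate being entered again before an earlier activation has finished, on the same thread or another. A minimal sketch of the classic trap and its standard escape, a re-entrant lock (all names hypothetical, and nothing to do with Word's actual internals):

```python
import threading

# A toy "engine" whose public methods may call one another while several
# threads are active. save() takes the lock and then calls flush(), which
# takes it again on the same thread; a plain Lock would deadlock there,
# so a re-entrant lock (RLock) is used instead.
class Engine:
    def __init__(self):
        self._lock = threading.RLock()
        self.flushes = 0

    def flush(self):
        with self._lock:        # entered directly, or re-entered via save()
            self.flushes += 1

    def save(self):
        with self._lock:
            self.flush()        # same thread acquires the lock a second time

engine = Engine()
threads = [threading.Thread(target=engine.save) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(engine.flushes)  # 8: every save() completed, no deadlock, no lost update
```

Multiply that pattern across every shared structure in a word processor and the Klingon screaming becomes easier to sympathise with.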

But if you WERE going to pull the whole thing to pieces and build it again
from scratch, and this time do it right, what's the first thing you would
do? Me? I would probably saw the user interface off of the thing and see if
I could get that bit right first. Much easier to design the underlying
engine if you have some idea how the driver is going to drive the car!

Look at that movie again. Think about it. What ARE they up to? For me, a
pattern emerges... The thing is massively context-sensitive, but there's
almost no sign of modality. Could this suggest the underlying engine is
event-driven? Controls being rigidly grouped into functional areas, those
entire areas are coming and going as the user works. Multiple threads under
the hood, anyone?
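The "context-sensitive but non-modal" behaviour being guessed at here is what an event-dispatch core buys you: handlers register interest, events queue up, and nothing ever blocks waiting for one particular input. A minimal sketch (all names illustrative, not anything from Office):

```python
from collections import deque

# Minimal event-dispatch core: handlers register for event types, a loop
# drains the queue, and no handler blocks waiting for a specific input --
# which is what keeps a UI non-modal. Names are illustrative only.
handlers = {}
events = deque()

def on(event_type, fn):
    handlers.setdefault(event_type, []).append(fn)

def post(event_type, data=None):
    events.append((event_type, data))

def run():
    log = []
    while events:
        event_type, data = events.popleft()
        for fn in handlers.get(event_type, []):
            log.append(fn(data))
    return log

on("keypress", lambda d: f"typed {d}")
on("resize", lambda d: f"relayout to {d}")
post("keypress", "a")
post("resize", (800, 600))
result = run()
print(result)  # ['typed a', 'relayout to (800, 600)']
```

Whether the next Office really works this way is anyone's guess; the sketch only shows why an event-driven engine and a non-modal UI fit together so naturally.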

All of this would seem like a good idea. An idea they would have had years
ago when they began designing Longhorn. So why do it now?

Could it have something to do with Microsoft Office's need to take
citizenship in FOUR different nations next year? We always knew it would
have to run in Win32 and Win64. But now, it has to run in MacIntel and
MacOrolla as well. Tricky. Unless you designed it that way. And if you
WERE going to design it that way, would you have multiply-reentrant
multi-user multi-threading in your future?

I would...

Just my $US 0.02 worth. Notice: it's $US this week: this comes to you from
Los Angeles -- I'm at Beth's place on my way up to Redmond for the annual
conference where we try to find out whatinthehell they actually ARE up to.

And I *know* they won't tell us :)

Cheers



--

Please reply to the newsgroup to maintain the thread. Please do not email
me unless I ask you to.

John McGhie <[email protected]>
Microsoft MVP, Word and Word for Macintosh. Consultant Technical Writer
Sydney, Australia +61 4 1209 1410
 
J

Jeff Wiseman

John said:
{Whew} Wiseman and Elliott both have jobs... The rest of us may get some
sleep now :)


Only because I don't even have the time to feed myself anymore.
Man am I tired...

And as far as others getting sleep, I strongly suspect that only
those with the strongest of constitutions would follow a thread
this deep :)

Also, my suspicions have been confirmed. What do government
contractors use as their main tool on $280 million/year contracts?
Windows 2000 and Office of course (with a little DOORS on the
side). I'll never be able to use Unix at work again! I'm going to
have to totally rely on our IT department for support now! I feel
like such a wimp...

I agree: I would currently back the WinXP task scheduler over the Darwin
one. Note: I said XP, not Windows 2000. They are all very closely related
to DEC's VAX/VMS OS, because the same architect designed both.


Man, I really wish I knew the underpinnings of XP at least as
well as I know Berkley Unix. For a multiuser, multitasking
scheduler, the BSD system is actually very good. If what you say
is really true, you are driving me towards a far greater respect
for XP. But because I don't, I can't have an intelligent,
detailed, and long winded discussion with you on it.

<<stuff deleted>>
Windows 2000 fixed that: "we know who you are, we know your parents, and if
you install naughty things we will silently roll you back out again!" Fixed
the crashes, but not the hangs. The system had only one request queue...


Ah! The perfect system for government contractors

<<more goodies gone>>
Now, in the glasshouse, you can afford a seriously-parallel computer with 64
bits worth of memory. But you want your application to work properly in the
glasshouse? Start thinking multi-threaded, multi-user, and
horror-of-horrors, multiply-reentrant code. Even Klingon programmers run
screaming into the night at the thought of designing multiply-reentrant
multi-threaded code :)


But if they DO devise such a beast and it "escapes", you want a
system that will contain it. Windows has always had a problem
doing that. UNIX has always been very stable in trapping and
locking down that type of application (unless of course the
Klingons were root administrators)

But if you WERE going to pull the whole thing to pieces and build it again
from scratch, and this time do it right, what's the first thing you would
do? Me? I would probably saw the user interface off of the thing and see if
I could get that bit right first. Much easier to design the underlying
engine if you have some idea how the driver is going to drive the car!


Exactly. It is the means to an end, not the end itself (although
you'd be hard pressed to sell that to the marketeers at Ford or
Chrysler).

Just my $US 0.02 worth. Notice: it's $US this week: this comes to you from
Los Angeles -- I'm at Beth's place on my way up to Redmond for the annual
conference where we try to find out whatinthehell they actually ARE up to.

And I *know* they won't tell us :)


Please do what you can anyway John. We need you there doing what
you do.
 
J

John McGhie [MVP - Word and Word Macintosh]

Hi All:

Only because I don't even have the time to feed myself anymore.
Man am I tired...

{Chortle} I'll press my advantage then :) With only Elliott awake, I
might even win one... :)
Also, my suspicions have been confirmed. What do government
contractors use as their main tool on $280 million/year contracts?
Windows 2000 and Office of course (with a little DOORS on the
side). I'll never be able to use Unix at work again! I'm going to
have to totally rely on our IT department for support now! I feel
like such a wimp...

Windows 2000 was picked for its "Stability" :) Windows is like wine --
"2000" was a good year. NT4 was "not" :) XP is, but only if you know what
you're doing.

You'll find that most of your Unixy tricks will work in Windows 2000, and at
least half the time, it will respond well to such ministrations. Even uses
many of the same commands in the Command Prompt.
Man, I really wish I knew the underpinnings of XP at least as
well as I know Berkeley Unix. For a multiuser, multitasking
scheduler, the BSD system is actually very good. If what you say
is really true, you are driving me towards a far greater respect
for XP. But because I don't, I can't have an intelligent,
detailed, and long winded discussion with you on it.

Yeah, well they wouldn't give ME the source code, either. Actually, they
WILL, but there's three million lines of it... I'd rather argue with
Wiseman when I can't sleep...

I am told that it's very complex, but very very carefully optimised for
efficiency. The "feel" of a heavily-loaded XP box certainly bears that out.
Until the CPU fans become really deafening, you hardly realise the box is
working at all :)
But if they DO devise such a beast and it "escapes", you want a
system that will contain it. Windows has always had a problem
doing that. UNIX has always been very stable in trapping and
locking down that type of application (unless of course the
Klingons were root administrators)

Fixed. Fixed utterly in Win 2000. "Excuse me sir? Your application had a
naughty thought, so I shot it. Where shall I dispose of the carcase?"
Please do what you can anyway John. We need you there doing what
you do.

I need you guys here doing what you do far more... MVPs do this as a hobby.
In some cases, it's a principal source of pleasure in our lives (yeah, some
of us "do" have one of those...). But imagine what it would be like if this
place was full of over-opinionated MVPs with no "questions" to answer? Bit
pointless, wouldn't you think? :)

Cheers

 
E

Elliott Roper

John McGhie [MVP - Word and Word Macintosh] said:
Hi All:

Only because I don't even have the time to feed myself anymore.
Man am I tired...

{Chortle} I'll press my advantage then :) With only Elliott awake, I
might even win one... :)

Heh! You only got that far because I was in France re-stocking the wine
cellar and doing stupid things on a new mountain bike. Not a good idea
for someone of my age and lack of fitness.

I'd just like to add that I'm a long-time fan of that Windows architect
you mentioned but didn't name (Dave Cutler). He doesn't know it, and
wouldn't care even if he did, but he taught me to program properly.

Lots of people say that Windows owes a lot to VMS (Dave was very
heavily involved in version 1 thereof). However, another project of
Dave's at Digital Equipment, called VAXELN, was in many ways closer to
the original NT 3.51 than VMS. The HAL (hardware abstraction layer) is
pure Cutler. Indeed the whole I/O model would be very familiar to
anyone who knows VMS or its predecessor RSX-11M. It was the source code
of the latter that was Cutler's contribution to my education. Back when
the world was new and all.

I'm looking forward with great interest to Vista. A lot of people will
be looking at the eye-candy and comparing it with OS X, but there is a
*lot* more going on in there.

We live in interesting times.
 
