Essays
Odds and ends from around the web


"Data! data! data!" he cried impatiently. "I can't make bricks without clay."
— Sherlock Holmes to Dr. Watson in "The Adventure of the Copper Beeches" by Arthur Conan Doyle. 


"I like deadlines," cartoonist Scott Adams once said. "I especially like the whooshing sound they make as they fly by."
"There is nothing like that feeling of spending days and days banging your head against a wall trying to solve a programming problem then suddenly finding that one tiny obscure and seemingly unrelated piece of the puzzle that unlocks the solution. Oh yeah!"

- Chris Maunder, CodeProject Newsletter 28 Jan 2002
"Management at eSnipe, which is me, is also feeling the pain of the 2002 bear market. So rather than pout about it, I bought some stuff on eBay that I really didn’t need, but made me feel better."

- Tom Campbell, president of eSnipe

 Saturday, March 12, 2005
  10:32:13 AM  

On Error Resume Next

I was instructed to fix a problem that a "star" programmer had discovered with my code. Apparently his VBScript routine was working fine, but he was calling my COM component's functions and they were corrupting the database. Imagine my joy when I started to trace through his code:

Function WorksFine(EmployeeID)
    On Error Resume Next
    EmployeeRecord = LoadEmployeeData(EmployeeID)
    On Error Resume Next
    Calculate(EmployeeRecord)
    On Error Resume Next
    SaveEmployeeData(EmployeeRecord)
    WorksFine = SUCCESS
End Function

As you might expect, sometimes the Calculate function would fail and return one of its many documented error codes -- such as, for instance, when the LoadEmployeeData call failed and returned one of its many documented error codes. What you might not expect was that sometimes the EmployeeRecord object had enough valid data left over from the previous transaction that the SaveEmployeeData call would actually succeed in trashing some unrelated employee's record, before returning with or without one of its many documented error codes.

The other programmer was quite upset when I put error handling into his routine.

It turned out he had written it this way deliberately so he didn't need to write an error handler. This was how he achieved the high productivity that made him a "star" programmer.
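For anyone who hasn't fought VBScript's error model: On Error Resume Next suppresses every runtime error (including the errors a COM component raises, which land in the Err object) until you inspect Err yourself. Here is a minimal sketch of the missing checks -- the three routines and SUCCESS are carried over from the story, while the FAILURE constant is my own placeholder:

Function WorksBetter(EmployeeID)
    Dim EmployeeRecord
    WorksBetter = FAILURE      ' assume failure until every step succeeds

    On Error Resume Next       ' errors now land in Err instead of aborting the script
    EmployeeRecord = LoadEmployeeData(EmployeeID)
    If Err.Number <> 0 Then Exit Function   ' don't Calculate on a record that never loaded

    Calculate EmployeeRecord
    If Err.Number <> 0 Then Exit Function   ' don't save the output of a failed calculation

    SaveEmployeeData EmployeeRecord
    If Err.Number <> 0 Then Exit Function   ' don't claim success for a failed save

    WorksBetter = SUCCESS
End Function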

Moral: You get what you reward.

[True tales from the trenches]

  9:13:03 AM  

"Patch it until it's robust"

After months of panic mode, we finally managed to ship a functional port of a legacy program. The horrible hash of years-old FORTRAN code was impossible to decode, let alone fix, but we finally got it to run by turning off all execution error trapping.

While I was recovering, I took two weeks off my regular job as team leader to do a thorough code review. We knew the program didn't work too well (when it worked at all), but we weren't sure why. It turned out that not only did the program rely on overwriting past the end of one array into another (which we already knew), but that it didn't implement its core algorithms correctly. The results were more consistent than random numbers, but only a little more useful.

I concluded that the only way the program could be made to work was with a complete rewrite, which would take 4-6 calendar weeks.

I explained the situation at our next project meeting. The boss (and sole owner of the company) explained in turn that we'd already made a number of sales, and we couldn't delay delivery for merely technical reasons. When I reminded him that the program rarely ran without crashing, that the results were wrong, and that the code was unfixable, he reminded me that the top priority was to eliminate crashes during demos.

Finally, he concluded, "So it's settled - we'll patch it until it's robust."

...

Not too long after that my immediate boss complained that I wasn't showing the proper team spirit -- "We've noticed you've started taking Sunday afternoons off ... is this likely to continue?"

Well, yes.

Soon after that I left the company. The remainder of the development team beavered away putting patches on the patches. Finally, a year later, they got up the nerve to tell the boss the program really needed a complete rewrite before it could work.

Faced with the inevitable, the boss laid off the entire programming staff.

Instead, he boosted the marketing effort -- and, finally, sold the company to a group of investors.

Moral: Make sure you understand the business model before you think about the software.

[True tales from the trenches]

 
 Thursday, November 21, 2002
  9:22:04 PM  

Business Technology: Do The Report--Or Fix The Problem?

I had a beer the other night with the two junior execs whom we first met about a year ago as they pondered the ROI of ROI projects ("ROI Mania Is Upon Us"), and it seems their new fixation is a real-time business-feasibility report they've been asked to prepare. Was it the beer, was it them, or is it me?

"I'm telling you, he won't do it. Not in a million years. Heck, that guy still tells the salespeople to squeeze extra hard when they shake the hands of customers who don't increase their orders by at least 10%."

[Dubious stare.] "Who told you that?"

"Well, nobody exactly told me that, but the last couple of times I've met with him to talk about Project Open Book, he steered me to the door and offered a handshake and then crushed my hand so hard that he cracked three knuckles while giving me that big phony smile and asking me to tell him just how the project would help him increase sales by 10%. So that's how I know."

[Long-suffering sigh.] "OK, I'll take this slow: Did you tell him that by sharing with customers our forecasting information, general customer profiles, marketing plans, and competitive overviews, he can help his customers be more successful?"

"Of course! Well, I was just about to, and then he started telling me about this eagle he had the last time he played golf with our biggest customer's CEO. And he said that allowed the two of them to beat two other execs from the customer's company and that the 200 bucks he and the CEO shared from the match is the ONLY kind of sharing we should be doing with customers."

"Is that right? Do I need to remind you that our job is to CONVINCE the top managers to get behind this project by SHOWING them that real-time business can help them and our customers and our suppliers be more successful?"

"Spare me your sarcasm. You can lead a horse to water ... "

"Good point. I'll try to get that in our report, if it'll fit. Just remember what tops the list of potential disasters: Clogging the system with excessive and irrelevant data is as bad as not gathering the stuff we really need."

"I thought the No. 1 danger was failing to get buy-in on new and more-valuable business processes from all stakeholders?"

[Long stare.] "Y'know, sometimes you surprise me. But I digress. What about accounting?"

"They were great! They said they're supporting us--wait a second, let me check my notes--175%! The only problem, they said, is that they need more space for signatures on the electronic forms."

"No! That's the point! We've changed the business process so that our top priority is getting valuable, relevant, and timely information to the right people so they can make rapid, effective, customer-focused decisions! The answer is no--no additional room for redundant signatures because we're ELIMINATING the redundant signoffs that lead to redundant signatures and wasted time."

"I think I get the point. So you probably don't want to hear what the CFO said about that data-quality thing."

"Oh, go ahead--stick a knitting needle in my other ear."

"She said that if our feasibility report makes sense, she might be willing to allocate $25 million of the $30 million we're requesting, spread over 18 months. But she said she'll chew her leg off before giving us $2 million for data quality and integration. She said she's teaching us a lesson in the importance of getting it right the first time."

"But without that, the project won't work!"

"Of course it won't. But that's not our problem--our problem is finishing this report."

Having heard enough, I burped, paid the check, and said good night.

- Bob Evans is editor-in-chief of InformationWeek. E-mail him at bevans@cmp.com. [InformationWeek - 12 Nov 02]

 
 Monday, November 18, 2002
  9:33:08 PM  

Code counting metrics should give a big bonus for negative lines of code - they're vastly more effective than the other kind.

However you look at it, each line of code you write adds at least one more potential bug. Excess code is bad. Simple is good. Reuse is bliss.

Sometimes I've managed to drop hundreds of lines of code in a day - especially when debugging and troubleshooting other people's code. Obviously, someone "achieved" ten times the "productivity" by pasting the same code ten times instead of using a subroutine. Over time, though, you end up with one fix in four of those copies, and another in three ...
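A tiny made-up illustration of the paste-versus-subroutine arithmetic (the field names and the validation rule are invented): paste a check like this ten times and every future fix must find all ten copies; factor it into a subroutine and there is exactly one place to fix.

Dim FirstName, LastName
FirstName = "Ada"
LastName = ""

' The pasted version: each copy is free to drift out of sync with the others
If Len(Trim(FirstName)) = 0 Then WScript.Echo "FirstName is required"
If Len(Trim(LastName)) = 0 Then WScript.Echo "LastName is required"
' ...imagine eight more copies of the same line...

' The subroutine version: one fix covers every caller
Sub RequireField(FieldValue, FieldName)
    If Len(Trim(FieldValue)) = 0 Then WScript.Echo FieldName & " is required"
End Sub

RequireField FirstName, "FirstName"
RequireField LastName, "LastName"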

Even with new code, you can often generalize the existing code to get improved functionality from fewer lines.

Finally, one of the smartest things a programmer can do is to use other people's components. Sure, you *could* write your own XML parser and "produce" thousands of lines of code -- along with thousands of bugs, and more lines of code to patch them.

We all know that, but I've never had much luck convincing management that less is better. After all, quality is hard to measure, but lines of code is easy. The same managers who like to track the number of hours spent on each task tend to like counting code. ["What do you mean, you spent three days working on it and you only wrote two lines of code?"].

By the way, I think lines of code is a meaningless concept to start with - how does one line of assembler code compare to one line of APL?

 
 Tuesday, November 05, 2002
  4:05:29 PM  

Feature Article: Know Thy User

David S. Platt

In this issue, I want to depart a little from my practice of writing hard-core programming articles and do a little ranting about the absolutely TERRIBLE user interfaces I've been seeing lately. Is it only me, or have they gotten worse, by an order of magnitude? With .NET, programmers have all kinds of tools with which to write user interfaces, on the Web or with rich clients on the desktop. And most of them SCREW IT UP ROYALLY!! The UIs are cumbersome. They're cryptic. They require a user to think like a computer, not the other way around. They're just plain stupid.

How do they get this way? Programmers have to have a certain level of intelligence in order to program, and most of them are at least somewhat smart when dealing with code. How can they be such lobotomized morons when designing a user interface? One simple reason: they don't know their users. And it never occurs to them that they don't know their users. Every programmer thinks he knows exactly what users want. After all, he uses a computer all day, he OUGHT to know. He says to himself, "If I design a user interface that I like, the users will love it."

WRONG, TURKEY! You don't know what the users want, because you're not one of them. Your user is not you. Write that down. Engrave it on your heart, along with F = MA (any physics majors in the audience?) and "always cut the cards". Your user is not you. Your user is not you. Unless you are writing programs for the use of burned-out computer geeks, your user is not you.  Here is Platt's first, last, and only law of user interface design. Get it right, and you can't screw up too badly. Get it wrong, and your product is guaranteed to bomb:

Know Thy User

For He Is Not Thee

Here's an example. Every time I teach a class at a company (How about yours? Call me at 978-356-6377.), I ask the class how many of the students drive cars with a manual, stick-shift transmission (as I do). I usually get about half the hands. I then ask how many more WOULD drive stick shifts if their wives would let them, or if they came on the minivans (like my next car, probably) that they need to drive because they're turning into old farts, curmudgeons like me. I usually get about half the remaining hands. (Try this test around your company and tell me what results you get.) "Now would you not agree," I ask, "that driving a stick shift takes more work than an automatic to learn and to use, but gives somewhat better performance, flexibility, and efficiency?" They know they're being led somewhere they don't want to go, but they can't usually wriggle out at this point, so they agree suspiciously. "Now, what percentage of cars do you think are sold with stick shifts in the US?" They wriggle uncomfortably and say something like, "I bet it's low. Thirty percent?" You wish. Sales estimates vary, but automotive correspondent Tom Whitehurst says roughly 10%, and Edmunds comes in with 13.5 percent. Let's call it 12.5 percent, or 1 out of 8, for the purpose of comparison.

This means that 6 out of 8 programmer geeks value a slight increase in performance and flexibility so highly that when they spend $25,000 or more on Motor City iron, they're willing to do more work (they'll say not much more) continuously over the life of the product to get it. But only 1 out of 8 of the general population makes the same decision when confronted with the same choice. And it's actually even lower than that, because all 6 of those geeks are in that 1 out of 8. The percentage of normal people willing to tolerate the extra effort is probably half that, maybe one out of 15. You value power and performance and configurability. Your user values ease of use, by a factor of 10 to 1 or more. Your user is not you.

For another example, I keep putting these pictures of my daughter into this newsletter (see her with a baby puffin in Iceland at the bottom of this page). Why am I laboring under the misconception that you actually enjoy looking at them? Because I enjoy looking at them, so I assume you enjoy them as well. (What the hell do you mean, you don't? You'll have to agree that she's far better looking than me. And will probably be smarter, too [don't touch that line]. Let me tell you about the cool stuff she just did ...) And because it's my newsletter, so if you don't like it, tough. That attitude doesn't usually translate into sales of commercial products, though. If this newsletter didn't carry content of such scintillating brilliance, you wouldn't put up with that nonsense. And the fact that it's free doesn't hurt either.

So who is your user? That's the first question in any work of communication. And the second and the third. And arguably the fourth, fifth, and sixth.  I've met enough of you and taught enough of you that I think I have a handle on who you are, in the aggregate, anyway. But again, it's not enough to actually THINK you know who your users are. You have to KNOW, and that's much harder to do than you think it is. Finding real users is harder than you think. Letting a marketing guy represent them, as often happens, is like playing Russian Roulette with an automatic: quick suicide. Find some real ones. A client of mine, a Canadian company who will probably recognize themselves, is designing a major Web application without talking to any real users. I told them that they are committing malpractice. I haven't heard anything about changes.  

Once you've found real users, interviewing them isn't enough. Sure, ask them what they want, ask them about their background, ask them where they're coming from and where they're trying to get to. But this gets you only so far. They often don't know exactly what they want. Or they'll politely tell you what they think you want to hear (this problem is particularly bad in Canada), guided by clues in your questions. You'll say, "Wouldn't it be cool if [some UI feature you like]?", and what can they say? Especially because if they do say, "No, that would suck, and you're an idiot," the interviewer often starts arguing: "But haven't you always wanted to ..." The act of observation changes the result. A physicist would call it "getting Heisenberged", after the author of the famous Uncertainty Principle.

And once you're finished with your user design, you need to test it on real users. You'd never ship a product without testing its internal algorithms (OK, you SHOULDN'T), so why would you think that you can get away without testing a user interface? A computer that your users can't figure out how to use is a very expensive paperweight. You could give them the application to try and ask them afterwards how they liked it. But they often won't be able to remember what they did, or won't want to tell you about problems because they feel stupid that they couldn't figure it out, or won't want to insult you by telling you what a complete pile of crap the product of your last two years of professional life has turned out to be (this is a problem that I do not have.)  So you can get some idea by talking to them, but you really have to observe what they do in the act of dealing with your user interface. And you have to do that in a manner that doesn't affect their behavior. This means that you have to put them in front of the application in an isolated room, having access to only whatever support materials (e.g. online documentation) they will have in real life. You have to watch them through one-way glass, videotaping their reactions, and have logging software so you can see exactly which keystrokes and mouse clicks they used to try to deal with your application. 

When you do this, the light bulb goes on. As Alan Cooper wrote in his classic book About Face: The Essentials of User Interface Design (IDG Books, 1995): "Programmers fight desperately against the insistence that their creations can't be valid until they are tested by users, and usability professionals seem to have retreated to the empirical as their only way to convince the logical, rational, engineering mind that there is another, better approach to designing the user interface. They drag programmers into dark rooms, where they watch through one-way mirrors as hapless users struggle with their software. At first, the programmers suspect that the test subject has brain damage. Finally, after much painful observation, the programmers are forced to bow to empirical evidence. They admit that their user interface design needs work, and they vow to fix it." 

Here's an example of doing it right. I once taught Windows at an insurance company that was writing a Windows terminal emulator application to replace some expensive IBM terminals. Unusually for an insurance company's internal applications, they actually did the usability testing that I just told you about. And they did it well, too, with video tape and one-way glass, programmers watching, the whole thing. They found that the users basically liked the application and found it usable. But the users had the habit of pressing the "Enter" key to move from one input control to the next, as their terminals did, rather than the Tab key as Windows applications do, and had to keep going back to redo things when it didn't work. Couldn't the developers change that, they asked? After hashing it over, the developers decided that, while it was quite easy technically, it wouldn't be a good idea from a user standpoint. Sure, they could make this application work the old way. But all the new Windows applications that the users were going to have to use wouldn't work like that, and the users would soon go schizoid switching back and forth many times per day. So the developers convinced the users to bite the bullet and make the change. And after a period of squawking, the users calmed down, helped by the abysmal job market in that area at that time. My point is not that you should cram the features you think would be good for them down users' throats. You usually can't get away with that; this was a special case. I'm relating the story to show you how a client of mine did a good job of usability testing. They did the testing they needed to do. They found what there was to find. And then they made the right decision based on what they found. I wish more companies would do that.

So remember: Your user is not you. They care about ease of use first, and everything else fifth or worse. Observe them as I've told you, and you'll quickly see that. Now go write your programs for your users and not for yourself. 

I hope you enjoyed my UI rant. Give me a call if I can help you out. Until next time, as Red Green would say, "Keep your stick on the ice."

Thunderclap, the Newsletter of Rolling Thunder Computing - Volume 5, Number 1 Fall 2002
This newsletter is Copyright © 2001 by Rolling Thunder Computing, Inc., Ipswich MA. It may be freely redistributed provided that it is redistributed in its entirety, and that absolutely no changes are made in any way, including the removal of these legal notices.

 
 Monday, October 28, 2002
  1:05:26 PM  

Cathy Rogers Responds Without Crashing

Responding to your questions today in finestkind all-lowercase form is Cathy Rogers, former co-host (the technical term is "presenter") of Scrapheap Challenge and Junkyard Wars, now presiding over a brand-new show, Full Metal Challenge. [Slashdot] Posted by Roblimo on Monday October 28, @12:00PM from the large-hunks-of-metal-slamming-into-each-other dept.

1) Time...
by AmigaAvenger:
On Junkyard Wars it always seemed that the teams had something in running condition before the end of the time limit. Was there ever a time when a team had ABSOLUTELY nothing worth sending into competition? (Wouldn't make for much of a show though...)

Cathy:

absolutely nothing? hmmm. i think that's a question of interpretation... did you see the hydrofoils show? neither of the machines worked at all. so what did we do... repeated the challenge for the british version of the show and that time... neither of them worked again. we just won't learn. but its funny - people used to think i was just being a smart arse when i would go in and give the teams a hard time for being behind, having nothing ready etc - but really i was terrified that we wouldn't have a last part of the show and was imagining that we'd all have to do the can-can or something...

2) Why do you think Engineering is so male dominated?
by Anonymous Coward: You have said in the past that it would be good to have an all female team, but as yet, we haven't seen this. Why do you think so few women are interested in technology?

Cathy:

oh lord i don't know. i vacillate so much on this one - sometimes i think it is all just habit and training and sometimes i think there really is some different configuration of men's and women's brains - like when i see my little niece desperately wanting to wear pink and play dollies and my nephew constantly deconstructing the alphabet / numbers etc.

but we have actually had all-female teams a couple of times now - twice on junk and in the new show full metal challenge. (in fact there is a fabulous all women team in the show next week - the flamin' aussies who are all drag-racers and are cooool) and they've done well - but they're always a real battle to find. i thought it would be easier in america, where in many ways women's position in society generally is more evolved - but i was wrong. it seems just as tough. and its odd because in other areas of science women are ahead of men. its just something about wiry stuff and digit stuff and big hammer stuff. but any tech-keen ladies reading this, please please apply! you have my ear.

3) how do you do it?
by Suppafly: A lot of people don't realize that not only do you work on all of these shows, you help conceive the initial ideas behind them. How do you do it? Did you just one day have an idea and present it to a network, or did you work from the inside to have your concepts realized? What in your past got you interested in the whole build things from junkyard parts concept?

Cathy:

i was working for an independent tv company (rdf media) when we first hatched the idea for scrapheap challenge (the british name for junkyard wars). so i was in a good position in that i was talking to people at the networks here all the time about all kinds of ideas. and that was just one that hit home. the idea actually first came from the movie apollo 13 and being transfixed by the 'houston we have a problem' part. that scene in which all the very non-typical-hero boys at ground control had to figure out how to save the astronauts lives with nothing but a bit of knicker elastic and a plastic knife. it was that that got us thinking - making life-saving stuff out of rubbish - brilliant, and making the people who aren't normally heroes (i call them the grubby fingernail brigade) into heroes - fantastic. the junkyard and all the rest kind of followed from there. don't know quite how i have managed to end up doing so many shows about boy stuff though. i would much rather go to a nice art gallery.

4) American vs. British contestants
by banda: Have you found any differences between the contestants in different iterations of the show? Speaking as an American who spent part of his youth in England, I find the British contestants much more entertaining, insightful and engaging. Was it easier to work with any particular group? Were there any contestants that made the show difficult?

Cathy:

well here's a funny thing - a lot of americans prefer the british teams and a lot of british people prefer the american teams... what can it all mean? are we all riddled with self-loathing? are we all superbly positive and outward-looking and natural anthropologists? i don't know. i think there is part of the show which is about observing people doing their thing in their natural habitat, a bit like how we might watch a natural history film about barracudas. and in that sense it is easier to watch people who are a bit removed from ourselves. i would say in terms of being a host (yuk yuk hate that word) - it is easier to do the american shows because american people are more 'tv-articulate' - they understand what is required for tv - i guess simply because tv is the most dominant medium in american life and history. whereas for brits, other media are still dominant if you look over the whole period of our history; we haven't quite let go of a time when we read dickens serialised in pamphlets, so we are more used to sitting quietly taking things in - rather than 'putting them out there' ourselves. americans can get away with saying things like 'i am the big kahuna' whereas british people just sound silly saying things like that. the only downside of the american show is that americans seem to be more competitive, which can mean that things get a bit serious sometimes. in the new show FMC the brits often lose and find it all rather funny and are very self-deprecating. but the americans sometimes cry!

5) Sounds from the indie records
by Mikey-San: Before the 'Heap, you were in a British indie-crash-twee-pop band called Marine Research, and before that, Heavenly. Do you keep in touch with Amelia and Rob these days?

Cathy:

indie crash twee pop?! yikes. don't let that get out. yes i do keep in touch with the old indies though i must say i don't go and shuffle along to shows as much as i used to. i saw britney in vegas so the tortured lollipops at the dublin castle will never feel quite the same...

6) As a musician, what do you think of...
by CSG_SurferDude: As a musician, what do you think of the music industry these days, specifically about the slave-labor-like recording contracts, industry ownership of copyrights, peer-to-peer song sharing (MP3s), and the current fruitless attempts to copy-protect CDs? Is there anything that you can do in your current position to help change any of that to the betterment of recording artists and consumers everywhere?

Cathy:

is this a leading question?! do you have a letter drafted for me to sign?!

er.. where to start? big corporations are scary in many many ways and the music industry is obviously no exception. but although there seem to be so many new issues today where normal people / artists / whatever are exploited i wonder whether it is really that different from when i was a kid and me and my mates used to tape everything off the radio and make compilation tapes (one of the greatest and most overlooked art forms) and never buy a record in our life. except if it was a local band or a band on a really cool label or a record where we just loved the cover and had to have it. its a big discussion - the only incontrovertible good is to support your truly independent labels. k records / kill rock stars / many others have proved that you can have integrity, great music and not go under.

7) Role of expert
by naarok: Watching on TV, it often seems that the expert provides some good initial insight into a problem, but then often becomes superfluous. Having sat through many hours of actually watching the challenges unfold, how valuable were the experts compared with the teams' own general inventiveness?

Cathy:

it depends a lot on the challenge. if its something innovative and thought-provoking like 'build a car that fits in a suitcase' then most teams who have the necessary know-how to get on in the first place would be able to make a pretty good stab at it expert-less. but in other challenges, such as making gliders or submarines, they are dependent. it also depends of course how well they all get along....

8) massive disruption to geeks everywhere....
by gclef: So, have you ever been tempted to wander into somewhere like a LinuxWorld conference, just to see if you could stop all productive work from occurring? (you probably could, you know...) If not, are you tempted now?

Cathy:

er. i blush easily. my sister and i used to have a fantasy about going to this event called 'crufts' (a really pompous but very-seriously-taken dog show in england (like, they show it on tv! ) where people parade their over-coiffured hounds around doing daft tricks and generally proving that to be english is to be humorous in this fairly tragic way) and doing a streak. but maybe just with bottom halves! it would be a totally pointless act of sort-of-harmless-sabotage of a worthless institution and this amused us.

i suppose what i mean (ie not evading your question quite so obviously) is that the notion of committing a minor act that leads to massive disruption is an appealing idea. but i'm not quite sure about yours....

9) Off screen testing?
by The Mutant: How much testing goes on off screen? For example, the episode where participants had to build a diving bell, descend to the bottom of a small pond, and retrieve a chest of gold. I don't believe that this was not tested off camera, if for no other reason than to ensure you didn't inadvertently end up making a snuff episode. Same thing goes for pretty much any device where explosives were used, or even the airplanes.

Cathy:

worryingly little. its always the hardest decision - test them and make sure they work but risk them breaking during the test (which you're not filming) and then you have no show, or fail to test them and have true spontaneity and excitement about the outcome but risk them failing during the show or being dangerous or whatever. we debate it endlessly and there is often a half way house - the diving bells you can put in the water and test-pump some air, the gliders you can tow up on a winch without a person on them. but it never gives you the full picture and what you see in the show is invariably the first time the machines have been properly tested, people and all. scary isn't it?

10) Why Rollins? Why!!
by SanLouBlues: What's the coolest thing you've ever built yourself? Or, what's the coolest thing you've ever tried to build yourself?

Cathy:

well who else would look as good in a power station? i mean, just say the words 'disused power station' and you think of henry. i think he is fantastic - a force of nature. and he makes me laugh a lot.

what have i built? lord how embarrassing. you have outed me. the sad truth is the things i have made which have been the most impressive feats of engineering and construction have been cakes. sshhhhhh.

 
 Thursday, October 24, 2002
  8:12:41 AM  

mpt
the Weblog of Matthew Thomas

Rage, rage against the bloating of the preferences

In Havoc Pennington’s usability essay, he described a phenomenon which curses Free software development:

Reading dozens of GNOME and Red Hat bugs per day, I find that users ask for a preference by default. If a user is using my app FooBar and they come to something they think is stupid — say the app deletes all their email — it’s extremely common that they’ll file a bug saying “there should be an option to disable eating all my email” instead of one saying “your craptastic junk-heap of an app ate my email.” People just assume that FooBar was designed to eat your email, and humbly ask that you let them turn off this feature they don’t like.

This also happens with Mozilla — people assume that a bug is a feature (though given Mozilla’s design, sometimes it’s hard to tell the difference), and file a bug report asking for the option to turn it off. For example, for years Mozilla has had occasional bugs where a browser window jumps to the front when it loads a page. They’re hard bugs to track down, and eventually people start thinking they’re deliberate. So again and again and again, they request not that the bug be fixed, but that they be allowed to turn the bug off.

Now Ian Hickson documents a related phenomenon: a module owner decides that something should behave a particular way, whereupon someone who disagrees will immediately file a bug report asking for the option to have the opposite behavior. In these cases it doesn’t particularly matter what behavior was chosen — tossing a coin and hard-coding the result would be better overall than providing a user option would. Nevertheless, the prefs dialog becomes mere territory in the battle of wills:

Annoying Person
Do what I say!
Module Owner
No.
Annoying Person
Ok, make it a GUI pref!
Module Owner
No.
Annoying Person
Could it be a hidden pref, then?
Module Owner
No.
Annoying Person [feebly]
Ok, how about asking the user at install time?
Module Owner
No.
Annoying Person [whimpering]
There could be a pref to turn off the pref …
Module Owner
No.

(Ian asked me earlier for an example of the last extreme suggestion, and eventually I found one. Brian Ryner: “I could do a pref to hide [the Mouse Wheel preferences panel] from the prefs page easily enough … Would that be acceptable to everyone?”)

Unlike Ian, I don’t find this “disturbing” or “confusing” — it happens often enough that it’s almost certainly a social problem in the community, rather than a psychological or psychiatric problem with the individuals concerned. As with many social problems, it’s difficult to solve, so it’s easier to feign bewilderment — or, as Blake Ross and I do, to make fun of it.

http://mpt.phrasewise.com/2002/05/12#a207 Posted by mpt on 5/12/02; 7:35:40 PM Copyright © mpt.

  8:07:36 AM  

Expand your infobase

Russell Pavlicek
October 18, 2002 01:01 PM PST

EVERY ONCE in a while, I get a message from someone who says "We had such-and-such problem under Linux. We called our vendor support line. They suggested a couple things that did not help. They said they would look at it, but they could not find an answer. We can't implement unless we can get this solution!"

There are a couple of problems here. First, if these accounts are accurate, it would appear that some companies supporting Linux are using an old methodology that assumes that all answers will be found in an in-house infobase or internal engineering organization.

Any support organization running that way is long overdue for an overhaul. Software that was born on the Web sometimes needs to be supported with information gleaned from the Web. If you rely solely on in-house information, you will eventually hit a brick wall.

The second problem is that clearly a large number of people in the IT industry -- both support specialists and IT techs alike -- do not seem to know how to use the Internet well. Now, I understand that sounds like an immensely stupid and arrogant statement. With so many IT professionals "living" on the Internet, how can they possibly not know how to use it?

As implausible as this sounds, I've begun to accept that this must be the case. I've lost track of how many times I have found answers to supposedly unsolvable "showstopper" problems in under 30 minutes. It's not a matter of technical expertise. Half the time I know less about the subject matter than the person with the problem. And it's not a matter of tools. I can find most solutions with little more than Google and a decent Internet connection.

No, the Web-based nature of open source has highlighted a real problem. The computer industry can no longer afford the luxury of technical myopia. IT professionals need to learn how to locate answers on the Web. Support vendors have to stop relying solely on pat answers in their own infobase and learn how to search for answers. It's time to apply 21st-century solutions to 21st-century problems.

People who cut their teeth on open source are often quite good at these skills. The amount of information on the Net regarding open-source software is astounding. But finding the precise answer can take a little effort. Still, many open-source folks are quite clever at seemingly pulling answers from thin air.

Unfortunately, some support organizations -- even those supporting open-source products -- don't seem to have enough of these people. And that is a major problem.

In today's Internet-driven world, we can't afford a single support organization's single point of failure. The Web has gobs of support information, even for many closed-source products. And a company looking for an edge will learn how to use this information effectively. Those who rely on 20th-century methods will run behind.


Contact Contributing Editor Russell Pavlicek at pavlicek@linuxprofessionalsolutions.com or log on to his forum at www.infoworld.com/os.

http://www.infoworld.com/articles/op/xml/02/10/21/021021opsource.xml
Copyright 2001 InfoWorld Media Group, Inc.

 
 Thursday, October 17, 2002
  9:37:22 AM  
Cuddling a cat beats talking to spouse


Study finds pets ease stress (with some exceptions)


By ANDRÉ PICARD
PUBLIC HEALTH REPORTER
Tuesday, September 24, 2002 – Page A1

Spending a few minutes cuddling a pet can do more to relieve stress than trying to talk about problems with your spouse, a new study says.

Researchers also found that having a pet present when you carry out unpleasant tasks is more effective than human support.

"While the idea of a pet as social support may appear to some as a peculiar notion, our participants' responses to stress, combined with their descriptions of the meaning of pets in their lives, suggest to us that social support can indeed cross species," said Karen Allen, a psychologist at the State University of New York in Buffalo, and lead author of the research.

To conduct the study, published in today's edition of the journal Psychosomatic Medicine, researchers turned to 240 married couples, half of whom had a cat or dog as a pet.

Each participant underwent two "stress tasks": mental arithmetic problems and submerging one hand in ice water for two minutes. They conducted the tasks in the presence of, separately, their spouse and their pet, and their blood pressure and heart rate were monitored.

Those who did arithmetic in the presence of a spouse made the most errors. When pets were present, participants not only did better at math, but were less stressed by the cold-water test and recovered more quickly.

There was no difference in results between dog and cat owners. "The findings demonstrate that pets can buffer reactivity to acute stress as well as diminish perceptions of stress," Dr. Allen said.

A second study, also published in Psychosomatic Medicine, found that arguing with your spouse is not only stressful; it can damage your heart.

Laura Glynn, an assistant professor of psychiatry at the University of California, San Diego, said an argument not only causes a person's blood pressure to rise, but ruminating about it drives blood pressure up again in a way that physical stresses do not.

"Exposure to emotional stress may be of greater potential harm to cardiovascular health than stresses that lack emotion, even though both types of stress may have been provoked by the same initial responses," she said.

The study was conducted on 72 students who were given four tasks, two designed to induce an emotionally driven rise in blood pressure and two designed to provoke strictly a physical rise. Students who were left alone after the emotional task tended to ruminate over it and their blood pressure stayed high. But students who were distracted saw their blood pressure return quickly to normal.

Chronic stress is considered an important factor in elevation of blood pressure, which is a major cause of cardiovascular disease. As many as 20 per cent of Canadians suffer from high blood pressure. Chronically high blood pressure can lead to heart attacks, atherosclerosis, strokes and hardening of the arteries.

Cardiovascular disease is the leading cause of death in Canada.

A recent poll conducted for The Globe and Mail and CTV News revealed that the greatest stressor in the life of Canadians is work. Forty-three per cent of respondents said work was the main cause of stress in their life; 39 per cent said finances; another 10 per cent pointed to children, and 7 per cent said it was their health.

  9:34:00 AM  

eBusiness Journal

August 2002, Vol. 4 No. 8  

Geeks: The true innovators

by Dave Cosgrave

It's fair to say that Web services have plenty of work to do before they even begin to live up to the hype. Instances of Web services in action — whether slashing costs, hoisting profits, or enabling the ultra-lean and super-nimble corporation — seem few and far between.

Of the few, none really demonstrate the kinds of ingenuity or creativity that marked the early (and equally hyped) days of the Web. Most are straightforward enterprise application integration (EAI) projects.

EAI may be interesting to some, but it just doesn't do it for me.

To find truly innovative applications of Web services technology, you have to stop looking in corporate IT departments, and start paying attention to the geeks.

The techno-blogs are buzzing these days with clever initiatives that utilize open APIs from Google, Amazon, eBay and a growing list of others.

Most of these cyber-contraptions are silly, even useless. None is even close to any sort of commercial application. But that's not the point.

It took a critical mass of "outsiders," fiddling around with e-mail and HTML, to kick-start the Web. The same thing is happening with Web services, as the development community has begun using XML and SOAP (or its substitutes) to exploit IT and business processes in new and appealing ways.

Google's API in particular is generating much of the activity. On the silly and useless side of things try Google Smackdown, where you can compare the relative popularity of two keywords in a single search. (Note: Google references to Dave still outnumber references to "Osama" by more than 10 to one.)

You'll probably find the Googlematic a more useful tool. This handy little application lets you query Google, as well as receive a subset of the results, via MSN and AOL instant messenger. Just send an IM to Googlematic, and the rest is magic.

The most interesting use of the Google API I have seen is the Touch Graph GoogleBrowser. It queries Google's similar pages database to visually represent complex linkages between Web pages. In other words, it helps you to "see" the Web. The GoogleBrowser has been heralded as a breakthrough in the presentation of associative information, and a social networking killer app.
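To give a flavor of how little code these experiments needed, here is a hedged sketch of calling Google's SOAP API from a Windows script. It assumes the Microsoft SOAP Toolkit is installed and that you have one of Google's free developer license keys; the key below is a placeholder. doSpellingSuggestion is shown because, unlike doGoogleSearch, it returns a plain string.

' Generic SOAP client from the Microsoft SOAP Toolkit (assumed installed)
Dim Google
Set Google = CreateObject("MSSOAP.SoapClient")
Google.mssoapinit "http://api.google.com/GoogleSearch.wsdl"

' Ask Google to correct a deliberately misspelled word
WScript.Echo Google.doSpellingSuggestion("YOUR-GOOGLE-KEY", "webllog")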

Other initiatives grabbing attention tap into the considerable IT power of Amazon.

Delicious.org, for one, maps the live playlist of an independent Seattle radio station onto Amazon's CD catalogue, resulting in a new kind of retail organization.

Bookwatch trolls the blog universe for Amazon references, which it then compiles in a linked Top 10 list.

In a variation on the GoogleBrowser (which is open source), the Amazon Vista browser draws on Amazon's collaborative filtering application to graph associations between products, based on buying patterns of customers. Amazon, a persistent innovator in its own right, never thought of that.

I expect the next wave of grassroots Web services innovation will see the development community experiment with combinations of these elemental Web services. Think of it as open source — or maybe Lego — raised to the level of business design. With a little imagination, you can picture processes from Google, Amazon and eBay snapping together into intriguing new structures. Not all of them will be useful or even stable at first. Evolutionary forces will sort the winners from the losers.

I also expect to see more and more Web-based services, like MapQuest or ePinions.com, publish open APIs in order to capitalize on this eager and highly capable innovation engine.

Among economists there is growing recognition of the crucial role users play in the supply of ingenuity, from the design of mountain bike components to modules in computer operating systems.

The early users of the Web, the stereotypical computer geeks, drove its evolution through innovation.

Now, they're interested in Web services. Best keep an eye on them.

Dave Cosgrave works for Digital 4Sight (formerly The Alliance for Converging Technologies), an international consulting, research and education organization based in Toronto. He is also co-author of Chips & Pop: Decoding the Nexus Generation. He can be reached at dcosgrave@digital-4sight.com

http://www.itbusiness.ca/index.asp?theaction=61&;sid=49470#
Copyright © 2002 Transcontinental Media Inc. All rights reserved.

  9:16:57 AM  

A Blogger Code of UnProfessional Ethics

My readers:

...know me. They will judge me according to context.

...are smart. They will not be misled by some stray comment I may happen to make.

...are kind. They make allowances and forgive me ahead of time.

In return:

I will speak my mind about what I care about.

I will not revise too much or too carefully: Blogging about opera is still jazz.

I will not anticipate and reply to every objection: Punctiliousness in pursuit of the appearance of propriety kills voice.

If I apologize, it will be because I have actually betrayed my readers' trust, not because I may have, might have, or could be misread as having done so.

I pledge to keep the reading of my weblog purely optional.

I love you, Doc.
JOHO the Blog Wednesday, October 16, 2002

Some mighty fine blogging on this topic going on over at AKMA's place. For example, he writes: "When we’ve been most effectively seduced, we’re not aware of it ourselves." As they say in churches around the land: Bingo!

Also, I am reminded of Chris Pirillo's Blogger's Manifesto from February '02 (as well as my parody of it).
8:41 AM | PermaLink

Full Disclosure

Why I Can't Ever Tell the Truth about Microsoft, Ever

To satisfy the requirements of the new Standards of Integrity and Professional Ethics for bloggers (for a discussion, see Dave, Doc and Mitch), I am hereby posting all the influences Microsoft has had on me, pro and con.

I use and like many Microsoft products

Microsoft products have been crashing on me regularly for over 15 years

I got a reviewer's copy of XP for free

I've bought thousands of dollars worth of Microsoft stuff, including upgrades that I felt had been forced on me

I competed against them at three companies

I cooperated with them at the same three companies

The Microsoft Word product manager listed the product I was flogging on a slide at Documation 1992 as an excellent complement to Word

Word eventually incorporated many of that product's best ideas. (You like them right mouse button menus? You're welcome.)

I got a free beta of their Word-to-SGML software

I couldn't get the free beta of their Word-to-SGML software to work

When I was liaison to Microsoft for a company I worked for 1991-1993, the Microsoft manager I was working with paid for my lunch a couple of times in their cafeteria

During all the time that I was liaison, they never once took me to a nice place for dinner

I have routinely installed single-user Microsoft products on two or more household machines up until XP and Office 2002

Having to pay to multiply install software I've bought means this was my last upgrade, pal

Every Microsoft engineer I have ever met has impressed me with his or her intelligence, customer focus, and integrity

I routinely curse the stupidity of the assholes who design dumbass fucking Microsoft products. What are they, a bunch of morons?

I hate Microsoft's de facto monopoly of office productivity software

I am happy that everyone uses PowerPoint because it makes complex events so much simpler

I am very glad I am not Bill Gates

I am envious of Bill Gates

(PS: When Dave asks, "It's a matter of what kind of blogging we want -- do we want it to be sloppy or crisp," my answer is an emphatic yes.)

8:20 PM | PermaLink JOHO the Blog Tuesday, October 15, 2002

 
 Monday, October 14, 2002
  7:46:04 PM  

A few words ... Business Object Programming

I started to write an editorial about reflection in .NET, but set it aside to focus on a change I see in the development industry: a paradigm shift that is just beginning.

About a week ago, I helped arrange a presentation on "What is Microsoft .NET". This was a technical presentation given to a little over 50 technical business leaders here in my home town.

These were not Microsoft cronies; rather, the group provided a good cross section of the technical industry. Most of these people were leaders in IT departments.

A realization came to light as I listened to the questions being asked about .NET and the direction Microsoft seems to be taking its tools and platforms. This insight goes deeper and wider than just Microsoft's .NET; it extends to the very concepts and standards taking form today. The development world is changing, and most people don't realize the extent of the change that is happening. Worse, the change is so subtle that most developers may not fully recognize it as a change.

I was once told that the paradigms in the development world change about every 10 years. About ten years ago, OOP started to take hold. In simplistic terms, OOP was a different way for programmers to approach an application. OOP changed the way programmers needed to think about both design and development. I could argue that OOP is just now--in the last few years--being truly implemented on a broad scale. Many people still don't really understand the basic concepts in OOP, including encapsulation, polymorphism, and inheritance. Having said that, I believe there are few people who would argue against OOP being a positive approach that offers significant advantages for developing applications.

Just as OOP was a significant change, the new paradigm that is coming into existence will also have a huge impact. Some of you may already be developing using the new paradigm. Many more may believe you are. Regardless, the change may not seem very big, but its long-term impact can be as great as OOP's. It could possibly be even bigger.

What is this change? I don't know the word for it yet. I'll call it BOP.

With standardizations occurring in a number of areas, the ability to create interoperable applications is becoming much easier.

Better yet, the ability to create interoperable components is now easier.

With new tools such as .NET, the programming language is becoming less of an issue. While various programming languages have their benefits, for the most part it doesn't matter what language you use to create your applications. Add to this the fact that building user interfaces has become easier. Standards such as SOAP and XML are making it easy to communicate between applications or between processes. Additionally, infrastructures such as the .NET Framework are providing the flexibility to create a single application that can adapt to a number of different interfaces.

Because the form factor of devices is changing, and will continue to change, it is becoming imperative not to develop applications that assume a specific set of device or interface characteristics.

Additionally, building forms can be done with much less effort.

Building forms that can adapt to different device interfaces is also becoming easier.

Where is all this leading? To a change in what the focus of application development should be. Enter BOP.

The paradigm shift that I see happening is a movement towards Business Object Programming. Business logic is the unique, most valuable aspect of most programs. It is often the business logic that is core to an application meeting a user's needs. The trend will be to build the business logic of the application (Business Objects) as a completely independent part of the application. The core business logic should be programmed into modules that stand independent of the interface. By doing this, you will leave yourself much more flexible to react and adapt to the changing interfaces. Additionally, just like regular OOP, you'll be able to adapt these business objects to your changing business rules.
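To make that concrete, here is a toy sketch in script form -- the class, the discount rule, and the numbers are all invented for illustration. The point is that the business rule lives in a class that knows nothing about any interface, so a web page, a rich client, or a Web Service wrapper can all call it unchanged:

' Business object: pure logic, no interface assumptions baked in
Class OrderPricer
    Public Function TotalFor(Quantity, UnitPrice)
        Dim Total
        Total = Quantity * UnitPrice
        If Quantity >= 100 Then
            Total = Total * 0.9    ' hypothetical volume-discount business rule
        End If
        TotalFor = Total
    End Function
End Class

' Any interface layer -- ASP page, desktop form, SOAP wrapper -- stays a thin caller
Dim Pricer
Set Pricer = New OrderPricer
WScript.Echo "Order total: " & Pricer.TotalFor(120, 4.95)

When the discount rule changes, only OrderPricer changes; every interface picks up the new behavior for free.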

You'll find that standards for accessing business objects are being created. Many are already in place. Additionally, standard ways of making business logic available are also being created.

This includes the use of Web Services, remoting, and more.

BOP has been happening. With easier standards coming, and with the ability to communicate with such objects on any machine anywhere in the world (via the Web and other standard protocols), interacting with Business Objects is becoming easier. It doesn't take a COM or CORBA programmer to create and use business objects. Programmers of any language can create them without much additional effort.

BOP is not really new. Business Objects have been around for a while. Additionally, there are a number of related concepts such as Business Intelligence.

As a developer, your focus should be on the business logic. Look for BOP to come to an application near you.

Until next week...

Brad!

Brad Jones
www.CodeGuru.com
webmaster@codeguru.com

CodeGuru Newsletter (09-24-2002)
This newsletter is published by Jupitermedia Corporation
http://internet.com - The Internet & IT Network Copyright (c) 2002 Jupitermedia Corporation. All rights reserved.

 
 Wednesday, September 18, 2002
  7:30:04 PM  

Don’t Touch Me!

In the 1990s my son had a computer game that allowed him to build whole cities. He controlled multiple serfs, specialized tradesmen, and soldiers. The sound track fit what the workers were doing, and most of them mumbled or graciously cried “Yes my lord” when prodded into action by his mouse clicks. But one particular worker bee had a different retort. “Don’t Touch Me!” he’d cry.

I’m beginning to feel like that person. Every time I turn around, some new utility on my desktop wants to auto-update itself. Working on its own, it contacts some update URL, downloads new code and installs it without asking me!

I’m not talking about Windows XP here; I can elect to turn that automatic update off. It’s a very simple chore to access Control Panel | System | Automatic Updates and select “Turn off automatic updating. I want to update my computer manually.” And yes, I believe you should do this. What I’m complaining about is the audacity with which an ever-increasing number of companies think that they know what’s best for me, and that they have the right to touch my computer and change the code running on it. We used to call that cracking and it used to be illegal.

An example of what I’m talking about is EarthLink’s Update Manager. This little bugger runs all the time and periodically checks for changes that EarthLink would like me to believe are critical. Not only is it using resources on my machine, but it doesn’t ask me before an update. I’ll keep my EarthLink account, but recent updates to my system blocked me from getting any mail via dialup the last time I was on the road. I don’t mean to single out EarthLink--it just happens to be the one I’ve had the most problems with lately. EarthLink at least displays their update service on the task bar, and defines it in their help system. They’re also extremely responsive to customer calls.

The most egregious example of this desire to control my computer is the EULA for Windows 2000 Service Pack 3 (and others tell me it’s the same for XP SP1). It gives blanket permission for Microsoft to change code on your computer any time it wants to. The statement below is taken from the EULA.

"...You acknowledge and agree that Microsoft may automatically check the version of the OS Product and / or its components that you are utilizing and may provide upgrades or fixes to the OS Product that will be automatically downloaded to your computer."

To me this says that when I click “I agree”, Microsoft can come calling anytime it wants and download upgrades and fixes. At least XP allowed me to turn the feature off. This new EULA seems to say I can’t. I know there are those among you who think automatic updates are the only thing that will save us. You think a software company should do this automatically and transparently.

Hogwash. Do you allow the company that made your refrigerator, furnace, or television set to come inside your house without your permission to see if those appliances need updating? Would you think it’s OK if your leased car got pulled over, inspected, and updated? Who’s to say these upgrades and fixes won’t crash the computer or interfere with something else I’m doing? (It’s not like a service pack or hotfix has ever done that before, is it?) I want--and you should demand--the right to control what fixes are applied, when they’re applied, and to which version of the OS or any other software you’re running.

Another issue is that you can’t guarantee integrity and privacy. If a software company’s allowed to muck about in your system’s internals, you can’t prove in a court of law that they haven’t violated the privacy of your employees, customers or patients. In fact, I know several organizations subject to HIPAA (the Health Insurance Portability and Accountability Act of 1996) that aren’t installing SP3 because of this issue. HIPAA affects hospitals, doctors, clinics, insurance companies and any organization that deals with patient data. It requires strict protection of patient information and proof that access is denied to unauthorized individuals. Since when is Microsoft authorized to see my medical history?

Let’s stop the automatic editing of our computer systems by companies that think they know better than us.

Roberta Bragg, MCSE, MCT, CISSP, runs her company, Have Computer Will Travel Inc., out of a notebook carrying case. She's an independent consultant specializing in security, operating systems and databases. Send her your questions or comments at mailto:roberta.bragg@mcpmag.com.

By Roberta Bragg
Security Watch September 16, 2002
Copyright 2002 101communications LLC.

 
 Sunday, August 25, 2002
  3:40:59 PM  

Builder.com May 1, 2002
The architect's role in team development

One of Microsoft's key assets is its continual investment in research and development (R&D). Its R&D budget is larger than most companies' annual revenues. But the company doesn't just spend its R&D money on investigating new products and services; Microsoft also makes significant investments in the human factor (i.e., looking at how people use existing products and what improvements can be made to enhance the usability of those products). As part of the product planning for Visual Studio .NET, Microsoft did extensive analysis of how enterprise architects participate on development teams and what tools it could provide to make the architect's job easier and his or her time more productive.

The architectural investment

So what did Microsoft discover when analyzing the role of the architect? Its major conclusion was that although companies invest millions of dollars each year in designing and architecting their systems, much of the investment never makes it into the final product because there's no effective way to disseminate the architectural recommendations and, more importantly, enforce key architectural decisions about any particular application design. Microsoft also found that companies using external contractors have an additional complicating factor in the design-to-product investment. Contractors often use their own tools and methodologies, which tend to detract from, rather than enhance, their customer's internal architectural recommendations. Based on these observations, Microsoft added some key features to the Enterprise Architect version of Visual Studio .NET.

Propagating architectural decisions

The most common approach to publishing architectural decisions to the rest of the development community is to put up a "standards" Web site or to publish architectural documents for specific projects on an intranet. The problem with either approach is that asking programmers to be proactive about reading architectural documents before beginning a project, or to refer to them during a project, is like asking a teenage boy to call home when he goes out with his friends: It's just not going to happen. To help companies enforce architectural policies, Microsoft included Enterprise Template functionality in the Enterprise Architect version of VS.NET.

Enterprise Templates allow system architects to provide architectural blueprints, reusable components, development policies, or instructions delivered in the context of a particular application. Templates include two key elements. First, the template project includes an initial structure for an application, standard components allowed in the application, and any reusable internal components or systems appropriate for this particular project. (For examples of how these policies work, you can launch Visual Studio and select File | New | Projects | Other Projects | Enterprise Template Projects and choose from templates like Business Facade, Business Rules, and Data Access projects.) The template project policies are enforced by the second key template element--a policy definition file. The policy definition file includes a set of instructions for what a developer can't do in the development environment.

After a project template and policy file are developed, these are deployed to a development workstation. When the developer selects one of these projects, Visual Studio first creates an initial project structure from the template. Then it applies the policy file, which customizes elements of the Visual Studio IDE, including the TaskList, Toolbox, Property Browser, Server Explorer, and Help System. For example, an architect could develop a template for building Windows clients for the Help Desk System. The template would include a base Windows form and Help Desk business objects with predefined references, and it would load the Help Desk technical documentation into the help system. The TaskList could be preloaded with instructions on creating a new form by inheriting from the base Windows form, as well as directions for starting the application using the supplied business objects. More importantly, the policy (TDL) file can remove from the Windows forms palette all of the controls that shouldn't be used in this application. It can also remove all of the data objects from the data palette and disallow all references to the System.Data namespace to ensure that the developer accesses the Help Desk data only through the supplied business objects.
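To picture the result, here is a small hypothetical C# sketch -- every name in it (HelpDeskBaseForm, HelpDeskTickets) is invented for illustration -- of the only coding pattern such a policy would leave open to the developer:

using System;

// All names hypothetical: this is the shape of code the template allows.
// The form inherits from the supplied base form, and data flows only
// through the supplied business object -- never through System.Data.
public class TicketEntryForm : HelpDeskBaseForm
{
    private HelpDeskTickets tickets = new HelpDeskTickets();

    public void SaveTicket(string summary)
    {
        // No DataSet, no SqlConnection: the business object owns data access.
        tickets.Create(summary, Environment.UserName);
    }
}

// Stand-ins so the sketch is self-contained.
public class HelpDeskBaseForm { }
public class HelpDeskTickets
{
    public void Create(string summary, string reportedBy)
    {
        // Data access is encapsulated behind the business object.
    }
}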

Although many developers will complain about the restrictions imposed by these templates, they're one of the most effective ways to ensure that companies benefit from the investment they've made in systems architecture and design. Using the templates also makes it much easier to enforce architectural policies when using outside contractors.

As founder and president of eAdvantage, Tim Landgrave provides business strategy consulting services to VARs and xSPs.

  3:38:04 PM  

#21: Designing on both sides of your brain
By Scott Berkun, scott@uiweb.com

Someone once asked me if, as a thinker, I was rational or creative. Left-brained or right-brained. I considered it and asked in reply: do I have to choose? Is it possible to be both? I didn’t think I could afford to discriminate. I wanted to be good at designing things, and I needed all the brainpower I had available.

Later in life, having read about our best thinkers and problem solvers, I learned that there is a natural balance that can be mastered between intensely imaginative and passionately logical lines of thought. It’s my claim, echoing many people before me, that we need to seek out this synergy to be good at design.

The myths of rational thinking and the methods of science

Before learning about design in college, I studied advanced logic theory. At parties, it was the last thing I wanted to mention, since it was certain to bring yawns and glares of boredom from beer-holding peers. The general reputation for the subject was that, like mathematics and science, it was dreadfully dull. These fields were seen as predictable and highly structured: you learn a formula for this, an equation for that, repeat a proof someone discovered decades ago, and then call it a day. It followed that the scientific method, considered a pillar of progress in the academic world, had an equally poor reputation. But this status is decidedly unearned. The surprising truth is that for designers everywhere, the scientific method can be an extremely powerful tool for finding and evangelizing your great ideas.

The ultra-compressed version of the scientific method has two parts. Part one: when you have an idea, you must expend time and energy to prove that it works. Part two: you must also expend time and energy trying to prove that it doesn’t work. That’s it. Welcome to the world of science. This simple pair of opposing approaches to evaluating ideas can improve the quality of your entire thought process and the value of the work you produce. The power of the method is that it asks anyone who would call themselves a scientist, or a designer, to attack problems from both sides.

It’s not enough to take a pet idea and make claims about its value. Instead, you must seek an objective view of the idea and invest time in disproving your own claims. If it truly is a good idea, it should be able to withstand the scrutiny of your critical evaluation. If it crumbles under your own inspection, you will have saved your team and your users some time by returning yourself to the drawing board without interrupting anyone else.

The value to designers is twofold. First, it improves the clarity with which you view your work: your ideas might be good in some abstract sense, but we should not confuse them with meaningful solutions to the customer or business problems at hand. Second, by making yourself comfortable with critiquing your own work, you improve the quality of any work you output. Like proofreading your own essays, you become more self-sufficient as a designer.

Beyond your design skill, your ability to convince your team of your ideas will improve. Instead of only offering the positive attributes of your design, you can present the challenging questions you asked and explain how your proposed design excelled in the face of those challenges. The more scrutiny you can apply yourself, the more confidence you will have. Even better, your comfort level with discussing designs with others will improve. Your internal dialog about your work will more naturally match your external dialog with peers and teammates.

(Note: While many teams thrive on communal critiquing of work in an open and supportive environment, which is good, many others are not so fortunate. The overall goal is not to spend more time in isolation, nor to control your development team like puppets, but instead to derive a system, both internal and collaborative, for surfacing the best ideas and delivering them to customers.)

Live by analysis, die by analysis

To the frustration of creative thinkers everywhere, poor usability engineering often causes a team to focus on design problems in the narrowest way. Without proper counsel from an experienced usability engineer, a team can miss large opportunities, falling into the trap of minimizing problems instead of solving them. Quick changes rarely address the underlying and systemic causes. Often problems run deep, down to assumptions made early in a project and manifested at different levels in the code. Without the wisdom to look deeper, no amount of usability studies will significantly improve a design.

At the most negative extreme, an analysis-dominated development team will achieve only turd polishing: the constant refinement of bad ideas, without any hope of arriving at substantially new approaches to the problem. What’s required to maximize the value of analysis is creative thinking. Knowing when and how to switch gears between analysis and creative exploration is the key.

The synergy of solutions

Cars are designed to switch gears easily. Modern humans are not. When we learn a technique or a tool, it’s natural for us to fixate on its use. If you have a hammer, everything is a nail.

It follows that people who learn to take comfort in analysis and process tend to have difficulty rejecting those models and switching to inspiration and creative passion as their guideposts. Conversely, the creative-minded often struggle with structured evaluation, or with models, methods, and processes for problem solving. The good news is that you don’t have to choose. You can be both creative and rational.

At the moment when project goals or usability issues are identified, someone has to take control of the response. How deep a solution is required? If shallow corrections or additions are all that’s needed, then it’s appropriate to engage in a quick brainstorming session before proceeding in a single direction. But if more serious problems are found, or the project goals are more substantial and involved, deeper exploration into new alternatives is justified. This is where the team has to shift gears and invest in exploration.

All problems have multiple solutions. The larger the problem, the more open the solution space. Understanding the different choices requires a creative approach. Someone must lead the way in expressing the different possible directions. What are three alternative navigation designs, and how would each improve the design relative to known customer behavior? Can we reduce the number of categories by a third to simplify some user decisions?

This line of exploration might be led by someone skilled at asking the right questions, or by a surge of inspiration and expression of ideas by someone gifted in those traits. But the method is less important than the result: A set of alternatives, manifested in prototypes or pictures, that can be compared and evaluated relative to the needs and problems at hand.

Comparison vs. creativity?

Some designers bristle at the comparison process. They’d prefer to allow their personal choice of approach to surface as the direction for change. This is almost always a mistake. It’s often impossible to understand the merit of a design without comparing it against several potential alternatives. It’s hard to know whether something is good or bad if it is standing alone.

Recalling the scientific method, it’s in the designer’s interest to work to prove and disprove her own firmly held beliefs. If her heart is concerned with the customer’s experience, she should want the best idea to surface, regardless of her own initial preferences. She should be willing to be convinced that there is another way that’s better, regardless of who came up with it. Often it’s only when comparing two ideas that the best idea—a hybrid of the two—is discovered.

At the moment when the team has arrived at some good ideas, hopefully through evaluating the tradeoffs of alternatives, analysis becomes the greatest need. Perhaps there is time for a quick usability study, or heuristic evaluation, to help confirm that the proposed changes will truly have a positive impact on the customer. There are always more details in the world than can be considered by the designer’s mind, and it’s much cheaper to learn from mistakes in prototypes than in production code.

Design is a reflective process

The entire process of idea exploration, evaluation and implementation is reflective. No one mindset or attitude prevails. Instead, it’s the judgment of the designer, or the team leaders, to approach each kind of problem in the appropriate way. Some moments require an emphasis on the logical and rational. Others demand creative exploration and expression driven work. Often the entire project cycle is spent shifting between different modes of thought, exploring, evaluating, and exploring again.

Few things of importance arrive from either/or thinking. It’s the wise and the successful who are able to derive approaches to difficult situations that unify and combine, rather than separate and divide. Think yin/yang, or chocolate and peanut butter. The greatest opportunities for the mindful designer are in exploring how to build complementary relationships from seemingly competing traits.


Hfactor/Uiweb column information

This issue can be found online at http://www.uiweb.com/issues/issue21.htm.
Column archive, author info, and other design/usability resources:
http://www.uiweb.com
To get new columns direct via email: hfactor-subscribe@topica.com
Question, comment or something you'd like to see me write about?   scott@uiweb.com


Copyright (c) 2000-2002, Scott Berkun - All rights reserved
  3:34:16 PM  

.NET UPDATE -- brought to you by the Windows & .NET Magazine Network
(contributed by Paul Thurrott, news editor, thurrott@winnetmag.com)
June 27, 2002

There but for the Grace of the Legal System Go We

In a meeting with Microsoft officials last week, I heard a bit of trivia that should have been obvious but that surprised me nonetheless.

When asked whether Microsoft would support Sun Microsystems' server-based Java technology in the next Windows .NET Server (Win.NET Server) version, John Montgomery, the group product manager for the Developer Platform and Evangelism Group at Microsoft, said no but noted that Sun was as free to build on Microsoft's server platform as anyone else. Montgomery then explained that the world would have been a very different place if Sun had made different decisions at crucial points in its relationship with Microsoft.

Here's what happened, and how things might have turned out much differently.

In December 1995, Microsoft announced an abrupt strategy shift in which the company reorganized its business around the Internet. The announcement, touched off by Bill Gates's earlier "Internet Tidal Wave" memo to employees at the company, included a couple of shocking interoperability revelations: Microsoft would expand its licensing of Spyglass's Web browser code to port the Microsoft Internet Explorer (IE) product to Windows 3.x, UNIX, and the Macintosh; and the company would license Java, the popular Web-oriented programming language.

Java had an interesting beginning at Sun, the high-end server and enterprise-class UNIX vendor. According to popular legend, developer James Gosling had decided to create a new programming language for a set-top box that the company was then designing but later scrapped. Looking out his office window for inspiration, Gosling saw an oak tree and dubbed the object-oriented language "Oak." But the name Oak was already taken, so Gosling settled on Java.

Over the next few years, Java sat in limbo as Sun's set-top box plans folded. But with the rise of the Internet, Gosling saw that a small and elegant programming language like Java could be quite useful, and so it was redesigned for that purpose.

Microsoft originally didn't trust Sun or Java because of the possibility that developers would prefer Java over Windows. Java applications, applets, and services run in a software "sandbox," a protected environment that sits on top of various OSs such as UNIX, Windows, and the Mac OS. If developers were to accept Java as the overlying platform for their OSs, Windows would lose its importance. So Microsoft surprised everyone and licensed the Java technology from Sun.

According to the licensing agreement, Microsoft would be able to "modify, adapt, and create derivative works of Sun's Java technology," which is exactly what the company did, creating a Java version that ran better on Windows and offered unique Windows-only features. During 1996 and 1997, Microsoft invested heavily in Java, releasing Java interfaces to various Microsoft applications and creating a Java development environment called Visual J++ (VJ++). The company even held Java-oriented developer events and was reportedly working on Java components for Microsoft Office. A future version of Visual Studio (VS) was going to generate Java byte codes--the underlying executable format that programs written in Java use.

The company's distributed Java strategy was interesting. Sensing a move to Internet-based software subscriptions--a term more common today than it was 4 or 5 years ago--Microsoft began adapting its Windows-only component technologies (which the company referred to by the umbrella term Windows DNA) to include cross-platform hooks through Java. Java would have been the gatekeeper, or glue, between software components running on Windows servers. And with Java, Microsoft finally had an interoperability strategy for communicating with non-Windows systems.

If you're familiar with what became .NET, the information in the preceding paragraph should sound familiar. But somewhere along the line, Microsoft's plans for Java completely fell apart. Sun sued Microsoft in October 1997 for violating the Java licensing terms. And when it became clear that Sun was going to win the case--as it eventually did in an early 2001 settlement--Microsoft slowly but surely walked away from Java and did its own thing.

"Had Sun not decided to compete through the courts, we would have happily continued using Java, and .NET never would have happened," Montgomery said.

Faced with the prospect of not being able to improve Java, Microsoft started working up internal strategies for cross-platform, Internet-based subscription software. The company had already hired longtime Borland software architect Anders Hejlsberg, who created the Turbo Pascal compiler and the Delphi Visual Component Library (VCL), to work on Java technologies. Hejlsberg was set to work on a new, Java-like programming language, eventually named C#. But C# isn't Hejlsberg's most important contribution to Microsoft. His work on the logical and powerful class library and runtime environment--the Common Language Runtime (CLR)--is credited with making .NET a huge hit with developers.

In some ways, the Sun-Microsoft feud was a good thing. The .NET platform as it now stands is more powerful than anything the company could have accomplished with Sun, thanks to .NET's from-scratch architecture, support for multiple programming languages, and use of standards-based technologies such as XML and Simple Object Access Protocol (SOAP). And, in response to criticism of Sun for not doing the same for Java, Microsoft has taken the high road and worked to make key portions of .NET open standards that are available for anyone to use. Microsoft will also open more of .NET to standards bodies over the next few years, Montgomery said.

Sun, of course, is still positioning Java for Web services, and the possibility exists that Java will be in place years from now, working side by side with .NET technologies. But if Sun could have found some way of working with Microsoft, avoided its lengthy and ultimately damaging court case, and positioned Java as an open standard, Sun might have driven the move to the Web services platforms of the future. Instead, although Java will be a player in Web services, its position will probably be as a minor player relegated to non-Microsoft platforms.

Do you think Sun would do it differently if it had a second chance?

This biweekly email newsletter is brought to you by Windows & .NET Magazine, the leading publication for Windows professionals who want to learn more and perform better. Subscribe today: http://www.winnetmag.com/sub.cfm?code=wswi201x1z. Receive the latest information about the Windows and .NET topics of your choice. Subscribe to our other FREE email newsletters: http://www.winnetmag.com/email

Copyright 2002, Penton Media, Inc.


Click here to visit the Radio UserLand website. © Copyright 2005 Eric Hartwell.
Last update: 3/13/2005; 2:21:21 PM.
This theme is based on the SoundWaves (blue) Manila theme.