Thursday, January 25, 2007

Changing your development platform

http://discuss.joelonsoftware.com/default.asp?joel.3.443537

There are certain milestones in the life of a product when developers are free to ask if it’s time to change the platform it’s developed on. Typically you’ve shipped a major version and gone into maintenance mode. Planning has started for the next version, and you wonder if you should stick with what you’ve got or if, knowing what you know now, it might be better to switch from .NET to PHP, or from PHP to Java.

You might think that checking Netcraft would be a good idea. You can see if your current platform is gaining or losing market share, and who doesn’t like market share? If you look at the latest chart you’ll see that Microsoft is gaining on Apache.

But keep in mind that while Apache's market share has gone down marginally, the total number of sites has still gone up. Most of Microsoft's gain is from new sites, not from existing sites switching. (The exception being large site-parking operations switching to IIS.)

But really the important question is whether your preferred platform faces a reasonable possibility of becoming obsolete or unsupported. This is actually one place where the Unix world's slower upgrade cycles help: you rarely see applications "sunsetted" by the vendor.

Am I arguing in favor of dropping .NET? Not at all. I think you should use what works for you. What I'm saying is that unless your chosen platform is in danger of becoming unsupported, and that would actually cause a problem for you, looking at market share charts should never convince you to switch.

Now if you hadn't already chosen a platform, and you wanted to know what platform had a larger market, then you'd care about market share. But that's a subject for another post.

Tuesday, January 16, 2007

Geeks still don't know what normal people want

If you listen to geeks, locking out development of third-party applications will doom the iPhone in the market. But remember the now-famous review when the iPod was released:

No wireless. Less space than a nomad. Lame.

The market quickly decided they didn't care about wireless and bought the things in droves. And current versions have more space than the Nomad did when the iPod came out. Now that the iPhone has been shown, geeks are again claiming that it's going to fail, this time because it won't be open to third-party applications.

Apple doesn't care if you can extend it because they believe their target customer doesn't want it extended. They want something that works well, the same way, every time. The iPod wins because it does pretty much what people want, close enough to how they want, without making them think about how to do it.

The iPhone may not be open to developers, but it's upgradable. When Apple finishes writing software to make the Wi-Fi automatically pick up a hotspot and act as a VoIP phone, that functionality can be rolled out transparently. First-gen iPhones will become second-gen iPhones without the users having to do anything.

The upgrade path will be to higher HD capacity, so people can carry more movies with them. I see these things as hugely popular for people who take trains to work. If I could take a train where I work now, I'd already be on a waiting list for an iPhone.

Thursday, January 11, 2007

Principles: Deployment

  • Code that only runs on the developer's workstation isn't finished.
  • Any programmer who can't be bothered to ensure his code can be deployed is only doing half his job.
  • Being responsible for the integration issues caused by your code can be a real eye-opener.

Meet the new boss, same as the old boss v2

Sometimes you read something that you can't summarize without losing a lot. I just can't find any extra words in this post, so here it is in its entirety:

1) Whatever language is currently popular will be the target of dislike for novel and marginal languages.

2) Substitute technology or methodology for language in #1. In the case of methodology, it seems a straw man suffices.

3) Advocates will point to the success of toy projects to support claims for their language/methodology/technology (LMT).

4) Eventually either scale matters or nothing matters. Success brings scale. An LMT is worthy of consideration only after proving out at scale.

5) Feature velocity matters in early stage Web 2.0 startups with hyperbolic time to market, but that is only a popular topic on the Web for the same reason Hollywood loves to hand out Oscars.

6) Industry success brings baggage. Purity is the sign of an unpopular LMT. The volume of participants alone will otherwise muddy the water.

7) Popularity invites scrutiny. Being unfairly blamed for project failure signals a maturing LMT; unfairly claiming success, immature LMT. Advocates rarely spend much time differentiating success factors.

8) You can tell whether an LMT is mature by whether it is easier to find a practitioner or a consultant. Or by whether there is more software written *with* or prose written *about* the LMT.

9) If you stick around the industry long enough, the tech refresh cycle will repeat with different terminology and personalities. The neophytes trying to make their bones will accuse the old guard of being unable to adapt, when really we just don't want to stay on this treadmill. That's why making statements like "Java is the new COBOL" is ironic; given time, "N+1 is the new N" for all values of N. It's the same playbook, every time -- but as Harlan Ellison said of fiction, every story has already been told, but nobody was listening the first time.

10) Per #9, I could have written this same post, with little alteration, ten, twenty or thirty years ago. It seems to take ten years of practice to truly understand the value of any LMT. Early adopters do play the important role of exploring all the dead ends and limitations, at their cost. It's cheaper to watch other people fail, just like it hurts less to watch other people get injured.

11) Lisp is older than I am. There's a big difference between novel and marginal, although the marginal LMTs try to appear novel by inserting themselves into every tech refresh cycle. Disco will rise again!

12) If an LMT is truly essential, learning it is eventually involuntary. Early adopters assume high risks; on the plus side they generate a lot of fodder for blogs, books, courses and conferences.

13) I wonder if I can get rich writing a book called Agile Lisp for Web 2.0 SOA. At least the consulting and course revenue would be sweet. Maybe I can buy an island. Or at least afford the mortgage payments on a small semi-detached bungalow in the Bay area.

14) It requires support from a major industry player to bootstrap any novel LMT into popularity. The marginal LMTs often are good or even great, but lack sponsors.

15) C/C++ remain fundamental for historical reasons. C is a good compromise between portability and performance -- in fact, a C compiler creates more optimal code than humans on modern machine architectures. Even if not using C/C++ for implementation, most advocates of new languages must at least acknowledge how much heavy lifting C/C++ does for them.

16) Ditto with Agile and every preceding iterative methodology. Winding the clock back to waterfall is cheating. I'm more sophisticated than a neanderthal, but that won't work as a pick up line.

17) Per #13, I don't think so, because writing this post was already a chore, let alone expanding the material to book length. Me an Yegge both need a good editor.


This covers the technology pretty well. All he left out was the reason so much is coming back.

Get your boxes in order

Everyone seems to have an opinion on downloading music and TV shows, everything from "Information wants to be free" to "Skipping commercials with your TiVo is theft." Some of the views are self-serving, some are rationalizations, and some people have strong opinions based on what they believe is right and just.

Here's the thing a lot of people are missing, though: Breaking the law does not count as civil disobedience unless you go out of your way to do it publicly. Obviously I'm referring to people who upload and download music, movies or software without permission from the copyright holders. Some of them are just in it for the free tunes. Some of them think the law is wrong. But the ones who believe copyright laws have gone too far damage their case when they quietly violate the law, expecting to protest the law if they are caught.

Think the law has tilted too far in favor of the copyright industry? Great, so do I. Have you written to your congressman? If not, then don't complain about the law when you get busted. It makes it look like you're just trying to stay out of jail -- which you are -- and supports the MPAA and RIAA next time they try to get copyright extended.

Before you end up in a jury box, you should really try the ballot box. Time for me to get off my soapbox.

What is Steve Jobs thinking?

http://apple.com/iphone

We all knew Cisco had the trademark on the name. According to their press release:

"Cisco entered into negotiations with Apple in good faith after Apple repeatedly asked permission to use Cisco's iPhone name," said Mark Chandler, senior vice president and general counsel, Cisco. "There is no doubt that Apple's new phone is very exciting, but they should not be using our trademark without our permission."

They negotiated, Cisco said no, and Apple released it anyway. Apple's response:

Apple responded by saying the lawsuit was "silly" and that Cisco's trademark registration was "tenuous at best".

"We think Cisco's trademark lawsuit is silly," Apple spokesman Alan Hely said. "There are already several companies using the name iPhone for Voice Over Internet Protocol (VOIP) products."

It's "silly"? Come on, that sounds like they're daring Cisco to take it to court. And claiming that the trademark has already been diluted by other products is a dangerous game. If that argument prevails, then Apple will have no standing to prevent anyone else from releasing their own iPhone.

What the hell are they thinking?


[Update]

See the Joel on Software forums for some discussion of this.

Wednesday, January 10, 2007

Pay the man

http://discuss.joelonsoftware.com/default.asp?joel.3.434525

IT people are frequently highly educated, with extensive formal and on-the-job training. And we all, if you look at our résumés, think that we're fast learners. That's probably because everything we work with changes every couple of years, so anyone who's been doing this for very long has learned multiple generations of tools. Many of our jobs also require us to be generalists, with a broad range of knowledge across multiple unrelated fields.

It's probably not surprising, then, that we tend to be DIYers. Never changed a light fixture? No problem. Give me a few minutes with a book and I'll know enough to do it. House needs painting? Heck, I've always wanted an excuse to go get one of those power sprayers, I'm on it! That's why we're shocked to hear how much people pay to have someone do work that, after all, we could do ourselves with little or no training.

That was my frame of mind when I had to replace the shower door. The frame was mounted on tiled walls. I only cracked two of the tiles a little bit trying to get the old frame off, and lifted about a dozen away from the wall. No problem, just ran to the hardware store for some tile adhesive. And I only put the adhesive on a little too thick, so two of the tiles fell off the next day when I started mounting the frame. And I only cracked one more because I was unfamiliar with the mounting hardware.

I had to remove all the tiles and start over, because the adhesive was actually nowhere near dry. This time I wanted to make sure it dried all the way, since I wasn't completely sure I had done it right. When I tried again three days later, only one tile fell off, because I had gone too thin with the adhesive. But after waiting a day for the grout on the rest to dry, I was able to scrape out that space, get the last tile up, and grout it. The caulk and grout I used to patch the cracks look mostly okay ... for now ... while they're still white.

All in all, it only took me a week and a half to hang that door. And the cracked and patched tiles will probably still look good when I go to sell the house. (At least I hope they will; the color was discontinued years ago, so I'd have to re-tile the whole damn bathroom otherwise.) I'm so glad I didn't pay a hundred bucks to some barely-trained tradesman to do it for me.

Lipstick on a pig

If you've ever seen one of my project plans, there's a chance you've seen a task at the end that says "Add pretty". With good use of stylesheets, you can radically improve -- or damage -- the look of a website even after all the coding and most of the testing are done. A different person or group with a different skill set can take over from the programmers and work some magic with little interaction.

You might think, based on this, that other parts of development can be pushed to the end after "real" development is done. You'll know someone was thinking that when you see a task late in a project plan that says "Add fast".

I suppose I can live with the idea that there will be some performance tuning that's best done once everything else is complete. And on some projects just throwing more hardware at the problem is cheaper than a programmer's time to fix it. But actually improving the performance of an application is hard, and the changes pervasive.

The second manifestation of specialized groups, one that always raises the brown flag, is when I see "Add security" at the end of a plan. It's simply inexperience that allows anyone to think they can graft a security model onto a codebase after the fact without significant amounts of rewriting.
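
To make that concrete, here's a minimal sketch of why bolting on security is so invasive (Python for illustration; all the names are hypothetical, not from any real project). The moment data access has to be authorized, the caller's identity must be threaded through every layer between the request and the data, and every caller of every caller changes with it:

    from dataclasses import dataclass

    @dataclass
    class User:
        id: str
        is_admin: bool = False

    def db_fetch(table, key):
        # Stand-in for a real data layer; returns a canned row.
        return {"id": key, "owner": "alice", "total": 100}

    # Before: the "add security later" version. Nothing in the call
    # chain knows or cares who is asking.
    def get_invoice_v1(invoice_id):
        return db_fetch("invoices", invoice_id)

    def report_v1(invoice_ids):
        return [get_invoice_v1(i) for i in invoice_ids]

    # After: access must be authorized, so a user parameter ripples
    # through every function that touches data. Multiply that by every
    # entry point in a real codebase and you have a rewrite.
    def get_invoice_v2(user, invoice_id):
        invoice = db_fetch("invoices", invoice_id)
        if invoice["owner"] != user.id and not user.is_admin:
            raise PermissionError(f"{user.id} may not view {invoice_id}")
        return invoice

    def report_v2(user, invoice_ids):
        return [get_invoice_v2(user, i) for i in invoice_ids]

And that's the easy part; a real system also has to decide where the checks live so they can't be bypassed, which is a design question, not a patch.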

"But this is a quick hack, and we only need the numbers for this one meeting." Sure, a report you'll only ever need once. I guess such a thing could exist, but I've never seen it. In the first place, nothing lasts as long as a temporary fix that works well enough. And in the second place, many (most?) large, successful products started out as small, successful products.

End/begin dependencies look really great on a Gantt chart. Activities that invite and incorporate feedback don't look so neat and clean. Treating security as something that can happen to a product after it's already done is no better than ... well, see the title of this post.

Design = function + aesthetics

Ask your local programmer if he knows how to design user interfaces and invariably he'll say he does. Go ahead, ask. I'll wait.

...

You're back? Good. Now go look at the new iPhone. Has your guy ever made anything remotely that cool? Unless you're reading this from Cupertino, odds are he hasn't. The UI is more beautiful and, as near as I can tell from the demo movies, more usable than any other phone or music player I've seen. But I wonder, how much of the perceived usability is a response to the beauty?

It's becoming conventional wisdom that you don't want to make the demo look done. Excessive visual polish early in the process not only limits the feedback you get to comments about the superficial details, it also suggests equally finished interaction with the system. It literally makes it look like it's doing more than it really is doing.

I've avoided this problem in my career by not being very good at graphics, and avoided realizing that by not working with any real visual artists to compare my work to. Yes, I used to think I was good at it, just like every programmer. Eventually I realized that consistency and predictability were a poor subset of what an artist can add.

Now, whenever I make up a project plan, there is a task at the end for "Add Pretty". And my name isn't on that task.

Tuesday, January 9, 2007

The Digital Dark Ages

I've been paying my mortgage for about three years now. Unless I change something, I'm going to keep paying on it for another 27 years. I try not to think about the fact that although I have an actual physical copy of the mortgage agreement, with real pen-and-ink signatures, I don't have any proof that I've ever made a payment.

At the risk of sounding like a Luddite, it bothers me that I have to trust the bank's computer system to keep track of all 360 payments I'll have made by the time it's over. I'm not just being paranoid. I had an issue where a bank said my wife still owed money on a loan we had paid off three years earlier. We didn't have anything in writing for each payment. The bank couldn't even tell us the history of the loan; just that the computer showed we still owed money. And if a bank says you owe money, unless your lawyers are bigger than their lawyers, then you owe them money.

If you go to museums, you'll see ledgers from banks in the 1800s and earlier. Two hundred years later, we still know who paid their bills and when. But five years in the past ... it doesn't exist.

This could change with new regulations and retention requirements. But the big difference is what is standard vs. what you have to work at. A hundred years ago everything was written down. If you wanted to get rid of records you had to make an effort to identify what you wanted to delete, somehow separate it from the rest, and physically destroy it. Today, we only keep data as long as we have to. We only bother with long-term storage when the law or financial necessity makes us.

Let's assume we have some data that we really want to keep "forever". What is that going to take?

First, you'll want to store it on something that doesn't degrade quickly. Burning it to a CD or DVD seems to offer better longevity than VHS. Well, maybe. Second, you want to store it in a format that you'll be able to read when you want to. This might be a harder problem than the physical longevity, when you start to consider how much data goes into a modern file format.

Look at the problem from the user's perspective: the document format (the same applies to music and video) is just a way of saving the document so that it can be opened later and look the same, maybe on the same computer, maybe not. When Word 97 handles table formatting and text reflow around images a certain way, for instance, the document format has a way of capturing the choices the user made.

If I open that Word 97 document in Word 2003, either the tables, text and images look the same or they don't. If they look the same, it's because there's an import filter that understands what the old format means, and Word 2003 has a way of representing the same layout. If I then save as Word 2003, the specific way of representing the layout has changed, but the user neither sees nor cares.

If, on the other hand, that Word 97 document doesn't look the same in Word 2003, it really doesn't matter to the user whether the problem is a bad import filter or Word 2003 not supporting the same features as Word 97. (Maybe they used flame text.) So a format that technically captures all the information needed to exactly recreate a document is utterly useless without something that can render it the same way.

Okay, so we need long-term media, and we need to choose a format that is popular enough that there will still be import filters for it in the foreseeable future. Eventually we'll still reach the end of those paths. Either the disks will degrade, or the file format will be so out of date that no one makes import filters any more. When that happens, the only way to keep our data will be to copy it to new media, and potentially in a new format.
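
As a sketch of what that ongoing copying could look like (purely illustrative Python; the converter registry and function names are hypothetical), the heart of a preservation strategy is a periodic migration pass over the whole archive, run before the old format's last import filter disappears:

    import shutil
    from pathlib import Path

    def convert_doc_to_odt(src: Path, dest: Path) -> None:
        # Placeholder: a real converter would re-save the content in
        # the newer format, with whatever fidelity its import filter
        # allows. Here we just copy bytes to keep the sketch runnable.
        shutil.copy2(src, dest.with_suffix(".odt"))

    # When a format nears obsolescence, register a converter for it.
    CONVERTERS = {".doc": convert_doc_to_odt}

    def migrate(archive: Path, fresh_media: Path) -> None:
        # Copy every file onto new media, converting formats that are
        # about to become unreadable.
        for src in archive.rglob("*"):
            if not src.is_file():
                continue
            dest = fresh_media / src.relative_to(archive)
            dest.parent.mkdir(parents=True, exist_ok=True)
            converter = CONVERTERS.get(src.suffix.lower())
            if converter:
                converter(src, dest)
            else:
                shutil.copy2(src, dest)  # format still current

The uncomfortable part is the placeholder: someone has to keep writing those converters, which is exactly the dependence on import filters described above.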

What should that format look like? We've already got PDF, which is based on how something looks in print. We've got various audio and video formats, which deal with playing an uninterrupted stream. But what about interactive/animated documents designed for online viewing?

Believe it or not, I'm going to suggest a Microsoft solution, though it's one they haven't thought to apply this way: PowerPoint. Today nearly everyone has a viewer, but not so long ago most of the slideshows I got were executables. If you had PowerPoint installed you could open the executable and edit the slideshow the same way you can edit a PDF if you have Acrobat.

As much as people complain about the bloat that Word adds to simple files, I think the future of file distribution will be to package the viewer along with the file. At some point storage becomes cheaper than the hassle of constantly updating all those obsolete file formats. The only question is how low a level the viewers will be written to: OS family, processor architecture, anything that runs C, etc.

Monday, January 8, 2007

Meet the new boss, same as the old boss

In case you haven't noticed yet, we're going through another round of power struggles in the IT industry. Oh, it might not look like that's what's going on. On the surface, what people are saying is that it's a matter of web-based vs. desktop applications. Frequently these conversations proceed on the premise that it's a debate over technical merits.

Nope. It's the return of the glass house. Peel back all the rationalizations about easier deployment, easier support, more consistency, and what it really comes down to is more control. If we can just keep the software out of the users' hands then everything will be okay.

But what history shows us is that users like having control of their "stuff". Taking that control away requires either redefining "their stuff" to be "our stuff", or convincing them that they aren't qualified to handle their stuff.

Is this what your customers are hearing from you?

Sunday, January 7, 2007

The War on Laundry™

Let's see:

  • Not a finite thing that can be destroyed, nor a group that can be defeated.
  • No one qualified to declare surrender for it.
  • There are better and worse ways to deal with it, none of which are able to completely eliminate it.
  • No matter how much you fight it, there will always be more soon.
  • No one really likes it, but the only way to avoid it is to change your lifestyle so profoundly that the alternative is worse.

Hmm, sounds about right.


Any relation to other Wars on Nouns is completely intentional.

Friday, January 5, 2007

The day I got a lot smarter

One sign of intelligence is the ability to learn from your mistakes. An even better sign is the ability to learn from someone else's mistakes. Unfortunately, we don't always have the luxury of watching someone else learn a valuable lesson, and we have to do it ourselves. But if we pay attention, sometimes we get to learn multiple lessons from one mistake. (Lucky us.)

Case in point: Dealing with a crisis. I was managing a group of web developers, and the project lead on an integration with our largest client was going on vacation. He assured me his backup was fully trained, and would be able to deal with any issues. He left on Friday, and we deployed some new code on Monday. Everything looked good.

On Wednesday at about 4 p.m., we got a call asking about an order. We couldn't find it in our system. From what we could tell, the branch that placed the order wasn't set up to use our system yet, so we shouldn't have the order. At 5 I let the backup go home for the day while I worked on writing up what we'd found. I sent an internal email explaining what I believed had happened. I said that I would call the client and explain why we didn't have the order, and that they should check their old system.

While double-checking the deployment plan, I discovered that the new branch actually was on our new system ... as of that Monday. That's part of what was included in the new code. That's when I got the shiver down my spine. By that time the backup, whose house was conveniently in a patch of bad cell coverage, was gone. The lead was on vacation. "Okay," I thought, "I've seen most of this code, in fact I've written a good bit of it. I can figure this out."

Stop laughing. It sounded good at the time.

To make a long story short (Too late!) we hadn't been accepting orders for three days from several branches, but had been returning confirmations for them. It was somewhere around 3 a.m. when I finally thought I knew exactly how many orders we had dropped, though I hadn't found the actual bug in the code yet. I created a spreadsheet with the list of affected orders. At one point I used Excel's drag-to-copy feature to fill a range of cells with the branch number for a set of orders.

Did you know Excel will automatically increment a number if you drag to copy? Yes, I know it too. At 11:30 in the morning today I know it. At 3 a.m. that night I apparently didn't know that. So I sent it to the client with non-existent branch numbers that I didn't double-check. "Oops" apparently doesn't quite cover it.

The next morning on a conference call with the client, my boss, his boss, and several other people, we were going over the spreadsheet when someone noticed the problem. To me, it seemed obvious that it was a simple cut-and-paste error on the spreadsheet. But someone -- a co-worker, believe it or not -- decided to ask, "Are you sure? Because I don't see those other two branches on here either." After dumbly admitting that I didn't know anything about any other two branches, I ended the call so I could go figure out what was happening.

Now I had apparently demonstrated that I didn't actually know what was wrong, that I had no idea of the scope of it, and that I was trying to cover it up. Yay me. We called in the lead (whose vacation was at home doing renovations) and started going through the code. I finally found the cause of the error, and it accounted for exactly the list of orders I had sent out early that morning, except for the cut-and-paste error. The "other two branches" turned out to be from the previous night's email, where I had specifically said those branches were not affected by the problem.

Within two hours, we had the code fixed and all the orders recovered. So everyone's happy, right? If you think so, then you haven't yet learned the lessons I did that day.

  1. No matter how urgently someone says they need an answer, the wrong answer won't help.

  2. If it looks like the wrong answer, it might as well be the wrong answer. This doesn't mean counter-intuitive answers can't be right. It means that presentation and the ability to support your conclusion count.

  3. If you didn't create the problem, always give the person who did the first chance to fix it.

  4. If someone knows more about a topic than you do, have them check your work.

  5. Don't make important decisions on too little sleep.

  6. Before making a presentation to a client, review the materials with your co-workers.

  7. Don't make important changes when key people are unavailable.

Looking at that list, I realize I already knew several of those lessons. So why did it take that incident to "learn" them? Because there's a difference between knowing something and believing it.

Wednesday, January 3, 2007

When design is not design

"How is software production like the car industry?"

Oh no, not again. Yeah, well, most people are getting it wrong. So here's another shot at it.

There are aspects of car design that strictly deal with measurable quality: performance of the electrical system, horsepower, fuel economy, reliability. But the shape and style of the car are much more loosely coupled to hard-and-fast measurements. That facet of the design -- the way it looks, the demographic it will appeal to -- is not amenable to Six Sigma processes.

Granted, there are some cars that are strictly (or nearly so) utilitarian. Some people only care about efficiency and reliability. They buy Corollas by the boatload. But the FJ Cruiser is not the result of a logical, statistical analysis, with high conformance to the mean and low variation of anything.

I think what I'm trying to say is that marketing design is building the right thing, while production design is building the thing right. The auto industry is mature enough that you need both. Success in the software industry still relies more on building the right thing.

There are no IT projects ... mostly

http://www.issurvivor.com/ArticlesDetail.asp?ID=556

Whenever someone says something I've been thinking or saying for a while, it's clear evidence of how smart they are. (Don't laugh, you think so too.) So when Bob Lewis published the KJR Manifesto - Core Principles, he confirmed his intelligence when he wrote:

There are no IT projects. Projects are about changing and improving the business or what's the point?

The variation that I've been telling people for years is that people don't want software, they want the things they do with the software. So if you're working on an IT project and can't explain the benefits in terms that matter to the business, you probably shouldn't be doing the project. Then in the middle of making this point to someone, I realized it's not always true.

The one case I thought of was a steel manufacturer that I interviewed with. While the factory was computer-controlled, the people who worked on those systems were in Engineering. The non-production computer system -- email, financials, advertising, etc. -- was IT. In that case, IT really was a support function, no more important to the company than telecom.

That doesn't mean it was unimportant. They could no more survive without their back-office system than they could do without phones. But that system really had no bearing on how they ran their business. It was something that was expected to Just Work™, like the electricity or plumbing.

The thing I don't know is whether this is the exception that proves the rule, or whether it's more common than I thought to find a place where IT really isn't a strategic partner in the business.

Tuesday, January 2, 2007

Maybe I'm the one missing something

http://www.infoworld.com/article/06/12/11/50OPopenent_1.html

Magicians make a living at misdirection, getting you to look at their right hand while they're hiding the ball with their left. You'd think journalists would want to be a little more direct than that. But Neil McAllister pulled a whopper of a sleight-of-hand recently, using more than half his column to summarize a Joel Spolsky post before jumping to a completely unrelated conclusion.

Joel's point, and the first more-than-half of Neil's summary, was shooting down the idea beloved of suits that programming can be reduced to a set of building blocks that can be snapped together by a non-programmer. (For a hysterically painful example of how wrong this is, and how far people will go to try to do it anyway, see The Customer-Friendly System at The Daily WTF.)

Joel covered the ground pretty well, so I was wondering where Neil was going with this. Once I got to it, I had to re-read the segue three times to see what connection I was missing:

Don't you believe it. If, as Brooks wrote, the hard part of software development is the initial design, then no amount of radical workflows or agile development methods will get a struggling project out the door, any more than the latest GUI rapid-development toolkit will.

And neither will open source. Too often, commercial software companies decide to turn over their orphaned software to "the community" -- if such a thing exists -- in the naïve belief that open source will be a miracle cure to get a flagging project back on track. This is just another fallacy, as history demonstrates.

If there's a fundamental connection between open source and "Lego programming" I don't know about it. Maybe Neil makes the connection for us:

As Jamie Zawinski recounts, the resulting decision to rewrite [Netscape's] rendering engine from scratch derailed the project anywhere from six to ten months.

Which, as far as I can see, has nothing to do with the fact that it was open source. In fact it seems more like what Lotus did when they delayed 1-2-3 for 16 months while they rewrote it to fit in 640k, by which time Microsoft had taken the market with Excel. Actually that's another point that Joel made, sooner and better.

Is Neil trying to say that Lego programming assumes that code can be interchangeable, and man-month scheduling assumes that programmers are interchangeable? Maybe, and that's even an interesting idea. But that's not what he said, and if I flesh out the idea it won't be in the context of a critique of someone else's work.

Or maybe it was an opportunity to take a shot at the idea of "the community". Yet in his very next column he talks about the year ahead for the open source community, negative community reaction to the Novell/Microsoft deal, and praise from the community for Sun open-sourcing Java. Does he really dispute the existence of a community, or was it hit bait?

Okay, so where did I start? Right, with misdirection. So the formula seems to be: quote a better columnist making a point that I like, completely change the subject with the word "therefore", summarize another author making my second point, and send it to InfoWorld. Am I ready to be a "real" pundit yet?