Archive for June, 2007

Recently Michael Gorman at the Encyclopedia Britannica Blog has published a series of screeds against Wikipedia and Web 2.0. Clay and Danah at Many To Many have written some fantastic responses and even though the posts are longer than the average blog post, their writing is so luscious that it is worth the effort. Here are my favorites:

Gorman, redux: The Siren Song of the Internet

Knowledge access as a public good



My family and I recently took a road trip from Philadelphia southward. Google Maps wanted to route us through the Washington D.C. beltway, which can be a parking lot at times, so I re-routed us through Frederick, which only added 5 minutes. Google Maps has the ability to add “waypoints,” but it is a hassle: you have to know or guess at intermediate addresses, even if you don’t actually want to go to those places.

You may ask why I haven’t gotten one of those nifty in-car navigation modules – maybe I will write about that another time.

For years I have been irritated that Mapquest doesn’t allow you to add intermediate stops to driving directions. I know they could do it if they wanted to. When we were making the first version of Mapquest in 1994, we already had consumer CD-ROMs with very clean and user-friendly methods, usually involving right-clicking on the map. I suspect that AOL (owner of Mapquest) has done some sort of focus group that indicated that people don’t care about this feature.

update – 6-30-07 —–
I just realized that Mapquest added a way to add intermediate stops a year ago. I guess I wasn’t paying attention! My apologies to Mapquest.
update —–

Well, someone at Google didn’t get the memo, because yesterday they came out with an awesome way to add waypoints to driving routes, and it is better than what we were doing in 1994. As far as I can tell, this is the first time an online map has been easier to use than the desktop applications from 13 years ago. I think it is an important turning point. Here’s what you do: First, plan your route.

google-map-routing-waypoint1.png

Second, grab the blue “route” line near Washington DC and drag it to Frederick.

google-map-routing-waypoint2.png

Done!
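
Just to make the idea concrete, here is a minimal sketch of what that drag is doing conceptually, in code. The route_between() stub and the place names are mine, standing in for whatever routing service actually does the work; the point is simply that a detour is nothing more than an ordered list of stops.

    def route_between(a, b):
        # Stand-in for a real routing call; here we just record the leg.
        return (a, b)

    def route_with_waypoints(origin, destination, waypoints=()):
        # A detour is just an ordered list of stops: origin, waypoints, destination.
        stops = [origin, *waypoints, destination]
        return [route_between(a, b) for a, b in zip(stops, stops[1:])]

    # Dragging the route line onto Frederick amounts to asking for:
    print(route_with_waypoints("Philadelphia, PA", "your destination", ["Frederick, MD"]))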


Phil Conrad, after reading my essay about engineering goodness, pointed me to Jeff Atwood’s blog post, “Bridges, Software Engineering, and God”, which nicely demonstrates that software development has very little in common with classical engineering disciplines like Civil, Mechanical or Electrical Engineering.

Atwood concludes that “…Software development is unquestionably a profession, but I don’t think we can learn as much from the fields of mathematics or traditional engineering as is so often assumed…”

I completely agree with him and I share his frustrations with naive comparisons (I am speaking as a mechanical engineer, computer engineer and a software developer).

engineers-software-engineers-2.png

I think one cause of misunderstanding is that people tend to confuse the professions with the professionals. Even though software development and engineering are quite different, the people in those professions are very similar – engineers are just like software developers. I’m thinking in particular about “creative technical people.” You know the type; they are drawn to engineering and computer science because they like to make things. They are caricatured in literature, movies and TV as the typical inventor-geek.

engineers-software-engineers.png

Of course, engineers and software developers vary in how well they fit this “technical creative” stereotype, and some don’t fit it at all. And some of the very best don’t have formal degrees in their fields but were instead irresistibly drawn to their careers from other professions. My children fit the mold, but in different ways; my daughter is more of an engineer and my son is more of a computer person.

Although engineers and software people must necessarily follow very different processes for building things, their ultimate value to society is measured in terms of what they have created. This ties in nicely with my observations on engineering goodness: no matter what process you follow to create new things, you should measure your progress against some standard, evaluate how well you are doing and follow strategies to improve your record. This could be said of most endeavors, but for engineering and software development the application of the principle is the same.


I ordered Cinema 4D for my kids who currently are using Bryce for 3D modeling. But I clicked on the wrong button

cinema-4d-mistake-3.png

and received the “CINEMA 4D WIN NON CG STUDENT/PROFESSOR” instead of the “CINEMA 4D MAC NON CG STUDENT/PROFESSOR.” In case you missed it, the kids use a Macintosh (a sweet Mac Pro – the quietest computer I’ve ever owned), but I bought the Windows version by accident. I didn’t notice the difference, so I opened the box. Even though the software installs fine on the Macintosh, the serial number in the box won’t work. I found out from the Academic Superstore, the reseller, and Maxon, the manufacturer, that they won’t take a return on an opened box, but I can pay a $100 fine to switch the license from Windows to Macintosh. I must also sign a document promising not to use my Windows serial number. I tried to persuade them to cut me some slack for being honest but careless, but no dice. Luckily, after much groveling on my part, the reseller agreed to take back the opened box for a 15% restocking fee plus shipping. I am proud that I’m only paying a $55 stupidity tax instead of $100!

Obviously Maxon is concerned about software piracy, but it is amazing to me that they haven’t been able to think of a better strategy than this. Software companies have more options than ever for creative distribution of their products, yet they seem to be stuck in the 1990s. It reminds me of the Recording Industry Association of America’s battles over online music, which I hope will end badly for the RIAA.

Cinema 4D is an awesome tool that is used to make movies and TV productions. Here’s one of Jonathan’s latest creations.

cinema-4d-blob2.png


In an earlier post, I speculated that one might use the variance between Quantcast and Alexa as a predictor for success. The basis for this was that Alexa toolbar users seem like early adopters. Another possible explanation can be found in the general feeling that Alexa users are heavily represented in Asia. Do a search on Alexa user demographics and you will find a number of blogs and forum posts suggesting this.

If this idea is correct, then under some circumstances you may still be able to use Alexa as a predictor of success, since much of the growth in the population of internet users is coming from Asia.
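
Here is a rough sketch, with invented rank numbers, of the comparison I have in mind; the “early-adopter ratio” is just my own shorthand, not something Alexa or Quantcast publish.

    # Invented numbers, purely to illustrate the comparison (lower rank = more traffic).
    sites = {
        "example-a.com": (1200, 9500),   # (alexa_rank, quantcast_rank)
        "example-b.com": (8000, 7500),
    }

    for site, (alexa, quantcast) in sites.items():
        # A ratio well above 1 suggests Alexa's toolbar users found the site
        # before the broader population did.
        ratio = quantcast / alexa
        print(f"{site}: Alexa {alexa}, Quantcast {quantcast}, early-adopter ratio {ratio:.1f}")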

Here’s the latest data on my Wikia watch:

wikia-rank-6-26.png

According to Alexa, Wikia’s rankings were

Yesterday: 849
1 Wk Avg: 1102
3 Mo Avg: 1456

     

Phil Butler reported on June 22 that Wikia has gotten a facelift. I think the UI and AJAX features are insignificant compared to the new collaboration features: tagging, voting and sharing.

wikia-sharing.png

They still haven’t implemented a way to get an RSS feed of the edits to a page. This is a very useful feature in Confluence, and I wonder why they haven’t done something similar.
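
To show what I am asking for, here is a sketch that pulls an RSS feed of a page’s edit history and prints the recent changes. The feed URL is a MediaWiki-style guess on my part, not a documented Wikia endpoint, so treat the whole thing as hypothetical.

    # Sketch only: the feed URL is a MediaWiki-style guess, not a documented
    # Wikia endpoint. The parsing is ordinary RSS.
    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "http://example.wikia.com/index.php?title=Some_Page&action=history&feed=rss"

    with urllib.request.urlopen(FEED_URL) as response:
        tree = ET.parse(response)

    for item in tree.iter("item"):
        # Each RSS item is one edit: when it happened and a summary of the change.
        print(item.findtext("pubDate"), "-", item.findtext("title"))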

I’ve also noticed that Wikia is gaining some traction with Alexa users:

wikia-alexa-6-26.png


When Tolstoy wrote War and Peace in the 1860s, he sprinkled it with side chapters where he ranted against the historians of the day. He complained that they told history as a progression of major events precipitated by “great men,” when in fact history is a much more complicated progression of cause and effect. Tolstoy was particularly sarcastic when he highlighted the conflicting conclusions of various historians (for instance English vs. French vs. Russian). At one point, he proposed applying the scientific method to history, asserting that a complete understanding of an event could be obtained by slicing that event into smaller and smaller pieces, in exactly the same way that a math student performs integral calculus.

Perhaps Tolstoy’s idea can finally be realized by using the power of mass collaboration?
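
Restated loosely in notation of my own (not Tolstoy’s): rather than explaining an event by the acts of a few great men, sum the small contributions of everyone involved and let the slices shrink, the way an integral is built from infinitesimal pieces.

    % Illustrative notation only: an event H as the limit of summed individual
    % contributions, rather than the act of a handful of "great men".
    H \;=\; \lim_{N \to \infty} \sum_{i=1}^{N} \Delta h_i \;=\; \int dh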

—-

I’m posting this from Cashiers, North Carolina. I wanted to visit some local Civil War battlefields on the drive home (Interstate 81 in Virginia), but it is turning out to be quite difficult to find the information on the web.


In part 1, I used an undergraduate mechanical engineering design competition to introduce the concept of Engineering Goodness.
In part 2, I provided a vocabulary for measuring engineering goodness.
In this final installment I will describe some goodness math, show why most of us aren’t nearly as good as we think we are and then provide some suggestions for improving our odds.

Goodness Math

I’ve defined engineering goodness in terms of a grid of values and percentages and shown why it is difficult in practice to fill out the grid. But pretend that you could actually calculate goodness not just for one project but for every project you have ever worked on. Or you can consider every project your team, company, competitors or industry have ever undertaken. You would get a bell curve (Figure 1):

goodness-graph1.png

Figure 1 – Goodness Bell Curve

The best thing about this curve is that it lets you think past your current project or problem. To use the familiar sports analogy, you can work on your average without sweating every single game.

Now apply this concept to the coffee can competition (Figure 2):

goodness-graph2.png

Figure 2 – Coffee Can Competition Goodness Curve

The curve is skewed over to the “Not so good” side of things because most entries didn’t even meet the primary requirement, which was to cross the finish line. Delorean and Buggy were good and Oatmeal was great.

I think most creative technical people “get” this basic concept but have no idea what their own curve looks like. For instance, we sometimes work within “segments” (companies, industries, etc.) where the curve is off to the left. Creating a “great” solution in a “not so good” company isn’t always a great accomplishment.

I have also observed that most technical people think they are on the right side of this bell curve. This tendency to overestimate our own capabilities seems to be an inherent part of human nature, and it has been documented over and over again in the literature. Sadly, we can’t all be above average.

Another thing I’ve noticed is that many individuals, teams and businesses exert little conscious effort toward moving their own curve to the right and making it narrower (Figure 3).

goodness-graph3.png

Figure 3 – Move Your Curve to the Right.

What they seem to be missing is that the overall value of the solution is not linear as you move along the horizontal axis (Figure 4):

goodness-graph31.png

Figure 4 – The Value of Goodness

In fact, the difference between a good solution and a great solution can be huge. Depending on your key metric, it could be millions of dollars, thousands of man-hours, millions of unique visitors to your site or many customer satisfaction points.

If Figure 4 is correct, then small amounts of effort expended in moving your bell curve over to the right can mean big payoffs in the long run. Unfortunately, these payoffs are not easily quantifiable and it is difficult to establish a positive correlation between the initial actions and the end result. Further, it requires a long view of events because results will only show in aggregate over a number of “solutions”.
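
Here is a toy simulation of that claim, with made-up numbers: goodness scores on a 0–10 scale, a convex value function, and two bell curves, one of them shifted slightly to the right and narrowed. Even a modest shift raises the expected payoff disproportionately.

    # Toy numbers only: goodness on a 0-10 scale, value growing convexly with goodness.
    import random

    random.seed(1)

    def value(goodness):
        # Convex payoff: a great solution is worth far more than a merely good one.
        return goodness ** 3

    def expected_value(mean, stdev, trials=100000):
        total = 0.0
        for _ in range(trials):
            g = min(max(random.gauss(mean, stdev), 0.0), 10.0)  # clamp to the 0-10 scale
            total += value(g)
        return total / trials

    print("current curve:", round(expected_value(mean=5.0, stdev=1.5)))
    print("shifted right:", round(expected_value(mean=6.0, stdev=1.0)))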

Now let’s look at the Coffee Can Competition one more time (Figure 5):

goodness-graph6.png

Figure 5 – Coffee Can Competition Goodness and Value

This chart nicely sums up my original observation about Oatmeal: “No matter how you weight the engineering factors, the oatmeal box was far superior to other designs. It took less time to build, cost less, had fewer parts, was more reliable and it was faster. It was obviously a great solution.”

Steps to Improve your Odds
Anything you do to move your curve to the right and reduce the variability is usually worth the effort! Here is a short list of ideas:

  • Follow a disciplined creative design process – there is plenty of guidance on this subject
  • Challenge fundamental assumptions
  • Always evaluate multiple solutions in parallel
  • Measure against competitors (even if there are no direct ones)
  • Don’t trust your instincts until they are measurably proven right
  • Don’t assume your solution is good just because it works – let the bell curve be your guide

Engineering Goodness, part 1

Engineering Goodness, part 2


This is a story of the first years of Mapquest, told in order to illustrate a point about the power and weakness of ideas. Retelling any event is risky because memory is imperfect and always colored by the brush of personal bias, so with apologies to those who remember things differently…

The idea
In early 1995, Mosaic and Netscape (created by Marc Andreessen and others) were the most popular browsers, Yahoo was still the hobby of two Stanford grad students, and many businesses were unsure about whether to get a domain name or not. The company I worked for, GeoSystems Global Corp., was a cartography company with a small group of software developers who, among other things, built very cool trip-routing and CD-ROM mapping applications for companies like AAA, National Geographic and Compton’s Encyclopedia. I was an enthusiastic proponent of the new World Wide Web, and so with a little persuading, I got permission to build a prototype to demonstrate the feasibility of using a web browser to display and manipulate maps. Travis and I built it by hacking one of our CD-ROM mapping applications to snapshot thousands of GIF image tiles at three different “elevations”. Then we wrote a quick CGI application to navigate the map. Later on, more people joined the team and we ripped code out of our other applications to make the first map servers and web front-end software. A big part of the project was back-end work – map data licenses and data processing. Our secret code name was Project Bandwagon, and with much effort from a group of talented and committed technical and business people, we eventually launched as mapquest.com on Feb 5, 1996 (see the launch T-shirt below). The company renamed itself Mapquest a year or two later.
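
For the curious, here is a sketch of the pre-rendered tile idea with a hypothetical naming scheme. The real 1995 system was a hacked CD-ROM application plus a CGI front end, not this code, but the lookup was in the same spirit: carve the map into fixed-size GIF tiles at a few “elevations” and serve whichever tile covers the requested point.

    # Hypothetical tile scheme, for illustration only: degrees covered per tile
    # at each "elevation" (zoom level).
    TILE_SIZE_DEG = {0: 10.0, 1: 2.0, 2: 0.5}

    def tile_for(lat, lon, elevation):
        # Map a coordinate to the pre-rendered GIF tile that contains it.
        size = TILE_SIZE_DEG[elevation]
        col = int((lon + 180.0) // size)
        row = int((90.0 - lat) // size)
        return "tiles/elev%d/%d_%d.gif" % (elevation, row, col)

    # The tile covering a point in southeastern Pennsylvania at the middle elevation:
    print(tile_for(40.0, -76.3, elevation=1))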

mapquest-feb5-1996.png

The environment
Early on, there were mixed feelings about the project within the company. Some complained that the web was a huge step backward for users. Our CD-ROM products were way more interactive; you could right-click, drag and drop, shift-select and change all sorts of layer parameters, so why would anyone want to use a poky 28k or 56k modem with a buggy web browser to do the same thing you could do with a cheap CD? Others, myself included, maintained that there was no business model; I wanted to somehow find a way to charge money for each map (we eventually made money with advertising and business services). Still others felt we were outside our core business, which was publishing and IT services.

But our leaders made some unconventional decisions that paid off handsomely in the long run. The site generated a lot of buzz and was on Netscape’s What’s Cool page for ages. Our traffic grew day by day and our servers didn’t catch up for at least a year. We kept adding equipment until the corner of the office was uncomfortably hot. One day, Brian tripped over an extension cord and the whole site went down for about an hour. I distinctly remember the day we generated 1000 maps per minute; we were all standing around the monitor watching the summarizer log file scroll down the screen, waiting for the count to creep past the magic number. The log also showed our queue wait time, which indicated that our servers weren’t handling the load very well.

The competition
There were competitors in 1995 and 1996, but they never amounted to anything significant. One company, a private firm that published both paper maps and a very popular mapping CD-ROM application, demonstrated an interactive map on their web site. This company could have easily created something like Mapquest, but they never did. I guess they were making too much money with paper and CD-ROMs to try something new. There was another company that created a downloadable application (presaging Google Earth?) that let you add points, modify layers and annotate your maps. At first I was quite concerned that they would be a big competitor, but they quickly dropped off the radar. I think the download-and-install process was too big a barrier for consumers (I’d love to know what happened to them).

Later on, more serious competitors appeared, including Microsoft and Vicinity (later bought by Microsoft), but by that time we were already established as the place to go for maps and driving directions. Yahoo Maps and Google Maps came even later.

Why a good idea isn’t good enough
Those early competitors had the right technology and plenty of people with good ideas, but they never made it. So what was it about our group that allowed us to be the ones to create Mapquest? I think the answer is a familiar one – it was a critical confluence of factors. In this case those factors were:
  • We took technical risk
  • We took business risk
  • We had good timing
  • We were a venture-backed company

The right conditions
Afterward, I worked at a number of internet startups. Some were spectacular failures, others did ok, but none were like Mapquest. The experience left me with an appreciation for how difficult it is to create the right conditions for commercially successful innovations.

Marc Andreessen and Fred Wilson have recently written some interesting notes on the investment logic of venture capitalists: most startups fail, and the ones that succeed pay for the ones that don’t. I think the story of Mapquest and its early competitors is a graphic illustration of that principle.


Despite its many innovations, Wikipedia is still a very traditional encyclopedia, following a pattern that was laid down in the late 1700s by Diderot and others. Each article summarizes a particular topic, discusses details and provides references. Each article is a linear discourse that starts at the beginning and reaches a conclusion about the topic, which in Wikipedia is termed “consensus.”

The problem is that there can be only one consensus (as they said in Highlander). One of the biggest criticisms of Wikipedia is that its articles are not accurate. According to Wiktionary, accuracy means “…exact or careful conformity to the truth…” Since everyone has a different view of what is true and false, by definition every article in Wikipedia is inaccurate.

It turns out that this is nothing new. Encyclopedias have been controversial since the very beginning. For instance, the encyclopedists in eighteenth-century France were considered to be radicals, distorting the truth in order to weaken the might of the Catholic Church and the monarchy.

Wikipedia proponents feel that it harnesses the “wisdom of the masses” in order to optimize the truth of articles. On the other hand, critics could claim it optimizes the truthiness of articles.

I believe the biggest problem with Wikipedia is the encyclopedia format itself, because most interesting topics defy consensus. Take my original example of the American War of 1812: was it the result of an upstart United States taking advantage of Britain’s preoccupation with Napoleon, or was it the U.S. finally fighting back after years of oppression? The answer is “yes to both.” You might argue that a skillful writer can illustrate both viewpoints in a single article, but that is an over-simplification. Even seemingly objective areas such as biology and physics can be fantastically controversial.

Why does Wikipedia need to be like this? Is Wikipedia an anachronism, an eighteenth-century idea repackaged as a modern-day internet phenomenon? What would happen if Wikipedia somehow removed the imperative for consensus, instead embracing and requiring differing viewpoints? It certainly would no longer fit the established pattern of an encyclopedia, but perhaps that pattern is no longer useful.

I’ll conclude with a quote about history and historians from Leo Tolstoy’s War and Peace, Book 11 (which I am reading and enjoying right now):

The first fifteen years of the nineteenth century in Europe present an extraordinary movement of millions of people. Men leave their customary pursuits, hasten from one side of Europe to the other, plunder and slaughter one another, triumph and are plunged in despair, and for some years the whole course of life is altered and presents an intensive movement which first increases and then slackens. What was the cause of this movement, by what laws was it governed? asks the mind of man.

The historians, replying to this question, lay before us the sayings and doings of a few dozen men in a building in the city of Paris, calling these sayings and doings “the Revolution”; then they give a detailed biography of Napoleon and of certain people favorable or hostile to him; tell of the influence some of these people had on others, and say: that is why this movement took place and those are its laws.

But the mind of man not only refuses to believe this explanation, but plainly says that this method of explanation is fallacious, because in it a weaker phenomenon is taken as the cause of a stronger. The sum of human wills produced the Revolution and Napoleon, and only the sum of those wills first tolerated and then destroyed them.

“But every time there have been conquests there have been conquerors; every time there has been a revolution in any state there have been great men,” says history. And, indeed, human reason replies: every time conquerors appear there have been wars, but this does not prove that the conquerors caused the wars and that it is possible to find the laws of a war in the personal activity of a single man. Whenever I look at my watch and its hands point to ten, I hear the bells of the neighboring church; but because the bells begin to ring when the hands of the clock reach ten, I have no right to assume that the movement of the bells is caused by the position of the hands of the watch.


Breaking news! – Earlier I pointed out a trivial example of conflicting editorial views on history within Wikipedia. Recently, someone resolved the conflict by deleting the entire sentence:

Smaller scale conflict occurred in North America with the USA finally reacting to years of British assaults on US shipping, but the conflict ended inconclusively.

The poster gave the following reason:

I edited a line about the war of 1812, because it isn’t actually relevant to the Napoleonic Wars.

Some people will disagree with this line of reasoning, and so the attempt to resolve the conflict has actually created a new conflict. Luckily, there is a mostly reasonable discussion of the problem on the talk page, and I suspect we haven’t heard the last of this ongoing drama.

Wikipedia’s detractors like to point to this type of conflict as an example of the problems with community driven content, but to me it is a beautiful thing. Here you see a small group of regular people who, in the process of creating a summary of the Napoleonic Wars, are interacting and learning in a deep, personal and permanent way. At the same time they are creating something that is pretty good, and it is useful to many, many other people.


“Candy Mountain! Fill me with sweet sugary goodness” – Charlie the Unicorn

In a previous post, I wrote about what I learned from an undergraduate engineering design competition. In this installment, I explain why I think that most engineers and software developers don’t know how good they are. In a future post I will show why I think most of us are rarely as good as we think and suggest some methods for improving our average.

While the example is drawn from a bunch of mechanical engineering students (myself among them), it applies equally well to any technical creative endeavor, especially software development.

Why you don’t know how good you are

The superficial results of the coffee can competition were:

  • The ramp stopped most entrants: only three or four successfully completed the course
  • The difference between the fastest entrant and the next fastest was an order of magnitude: 7 sec, 65 sec, 240 sec

But that only tells part of the story. Table 1 ranks entrants by other key design factors, with 10 being the best and 0 being the worst. I’ve created six design archetypes into which most of the designs fit neatly.

goodness-math11.png

Table 1 – Design Factors

Based on these rankings, you can create a weighted average in order to quantify the success of each design. Table 2 shows results weighted towards “test results”. In other words, “time to complete” is more important than other factors such as cost and reliability.

goodness-math2.png

Table 2 – Weighted Average Test Results

If you weight “Time to Market” (e.g. how many hours it took to build and test), you would get something like Table 3.

goodness-math3.png

Table 3 – Weighted Average Time to Market
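
Since the tables are images, here is a small sketch of the arithmetic behind them. The scores and weights below are invented for illustration, not the actual competition data; the point is only how the same factor scores produce different rankings under different weightings.

    # Invented scores (0 = worst, 10 = best) and weights, to show the arithmetic
    # behind the weighted averages; not the real competition data.
    designs = {
        "Oatmeal":  {"time_to_complete": 10, "cost": 10, "reliability": 9, "build_hours": 10},
        "Buggy":    {"time_to_complete": 7,  "cost": 6,  "reliability": 8, "build_hours": 5},
        "Delorean": {"time_to_complete": 6,  "cost": 2,  "reliability": 7, "build_hours": 2},
    }

    def weighted_score(scores, weights):
        # Weighted average of the factor scores.
        return sum(scores[f] * w for f, w in weights.items()) / sum(weights.values())

    test_weights   = {"time_to_complete": 5, "cost": 1, "reliability": 2, "build_hours": 1}  # Table 2 style
    market_weights = {"time_to_complete": 1, "cost": 1, "reliability": 1, "build_hours": 5}  # Table 3 style

    for name, scores in designs.items():
        print(name,
              round(weighted_score(scores, test_weights), 1),
              round(weighted_score(scores, market_weights), 1))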

I have identified a few patterns of failure (sometimes called anti-patterns), along with a couple of successes, among the various entrants:

  • Failure to “Paradigm Shift”: Turbine, Flywheel, Rubber Band
  • Poor execution: Other electric vehicles
  • Gold Plated: Delorean
  • Good job!: Buggy
  • Great job!: Oatmeal

My entry was in the “Failure to Paradigm Shift” group. I think all of the terms are self-explanatory.

No matter how you weight the engineering factors, the oatmeal box was far superior to other designs. It took less time to build, cost less, had fewer parts, was more reliable and it was faster. It was obviously a great solution. But what about the other successful entrants – were they good or great? More importantly, when you finish a project, how will you know whether you did a great job?

The saying goes, “Great is the enemy of good enough,” but in this discussion great is exactly equal to good enough. More specifically, the best engineering solution is the simplest one that meets the requirements. So it should be a simple matter of making a few tables (like above) and adding some weighted factors and you will know how you did, right? Unfortunately, there are two major problems with this.

Firstly, if you are doing something new or inventive, you can never know all the requirements, because your customers can’t articulate what needs to be invented. Malcolm Gladwell does a great job of discussing this type of thought process in his book Blink: The Power of Thinking Without Thinking.

Secondly, in real life we aren’t all given the same set of rules and asked to solve the same problem, so it is often difficult, if not impossible, to find any concrete “competition” with which to compare ourselves. For example, there is no direct comparison for an internal IT project for a customer service application. If it is over budget and one year late, how do you know whether the planners were too optimistic or the implementation was sloppy and inefficient? Conversely, if it is perfectly on time and budget, how do you know that it didn’t cost ten times what it should have cost?

Here are a few more ways you might be blinded from the truth:

  • Project Hell. Often things go so badly wrong that there is no sorting out what actually happened. When this happens, it usually makes no sense to try to make sense of the disaster.
  • Lax standards. Sometimes people don’t care whether they got the Oatmeal Box or the Delorean, even though they should care.
  • Technical Obfuscation. Technical people are people too. As in any population, there are always a few who will hide behind technical jargon and plausible but false explanations.

For all of these reasons, the sad reality is that the vast majority of engineers and software developers will rarely know whether what they created was great, good or bad.

Fortunately, there are ways to deal with this problem, which I will blog about later.

Engineering Goodness, Part 3

Engineering Goodness, Part 1


In an earlier post, I mentioned some new ways to avoid using lollypops, but I forgot an important one that was announced at Where 2.0. As Stefan Geens recently mentioned, GeoCommons has some great ways to visualize data. For instance, here’s a map of immigration in the US:

geocommons-immigration-map.png

GeoCommons is designed to be used by regular people to do this sort of mapping.


Here are a few more notes from the 2007 Where 2.0 conference.

Google’s pride of ownership
Michael Jones, the CTO of Google Earth, was a very entertaining speaker. He has a real tricorder from the original Star Trek show and isn’t afraid to use it. He reiterated Google’s aim to “…organize the world’s information and make it universally accessible and useful…” and I couldn’t help thinking, with some irony, that he said it with a certain pride of ownership. Just the kind of thing someone says after purchasing a valuable but rundown building – “it’s mine now, and think how much nicer it will be when I clean it up.” Don’t get me wrong – I like Google (may Google be great and live forever) and use many of their products. Some of my best friends work for Google. But sometimes it is scary what they can do.

Open Map Data
I’m very impressed with OpenStreetMap, an initiative to make a Creative Commons map of the world. In the US, we take for granted the access we have to cartographic data. In other parts of the world (e.g. the UK, China), the data is government-owned and tightly regulated. These folks are using open source tools to create a community of amateur cartographers. Their goals are ambitious, but their strategy is to take small steps first.

Closed Map Data
One of the most entertaining talks was by Ian White of Urban Mapping. He spoke about the difficulties in obtaining theoretically public data (such as metro stop locations) from governmental bureaucracies in the US. Hilarious but sad.

API
Everyone was showing off their web services. The new Mapquest ActionScript API for Adobe Flex looks very interesting. It allows you to quickly make very interactive mapping applications that run within the Flash virtual machine. As I mentioned in my earlier post, ESRI also showed a very neat API for adding GIS-like functions to maps.

Something in the air
The show had some of the feeling of those early Internet World conferences. Though it was much smaller and definitely less pompous, there was a charge in the air. There were at least a few VCs in attendance. It feels like 1998 all over again.
