20061230

Satellite Radio - Are you Sirius??

I cannot for the life of me understand the American consumer. I admit this freely; that's why I prefer to rely upon objective data when making decisions about marketing and product feature selection. Suffice it to say that you can never completely predict how customers will react to a product introduction. You either know it in your gut, or you get the answers directly from the people.

I do, however, claim to fully understand American business motivations. This is simple. American business is motivated by greed and short-term gain. Key metrics include stock price, revenue, and earnings, all measured within the near-term, say the next 5 quarters. Therefore, in order to optimize these metrics, businesses need to motivate consumers to buy things and, preferably, continue buying over an extended period.

Enter the "annuity stream" concept of business:

The concept is simple. Get a consumer to buy in at a ridiculously low price, then force them to continue paying a "nominal" fee for a very, very long time. It seems obvious that the American consumer is OK with this. Data clearly suggests that the American consumer will opt for a low entry price regardless of the ongoing "service" fees attached to it. This has been proven by the cell-phone industry. We are, apparently, OK paying $40, $50, even $100 a month for cell phone service, as long as the phone itself is "free". The logic is ridiculous. Yet it works.

This concept was initially applied to TV by the cable industry. People used to watch TV for free by using an antenna. Not much of a revenue stream there. Enter the cable companies, who "gave" the consumer a cable box (which actually costs several hundred dollars), and then provided a "service" for a "nominal" monthly fee of $40-$100 or more. Great business model.

Now the same is being applied to...believe it or not...radio. Same idea. You get a satellite receiver for a very low cost (sometimes free), and then pay between $10 and $15 a month to listen to (supposedly) commercial-free radio stations with CD-quality sound.

I cannot for the life of me understand why people would do this, especially for listening to music in the car. Now, in case you're wondering, I am a music lover. I love listening to great music. But when I'm driving my car around Boston, I just don't consider it imperative that I get the absolute best quality sound in the cabin of my Volkswagen Passat. And I would never consider paying someone a monthly fee for the privilege.

Compounding that are the myriad options available to people without a monthly fee. FM radio ain't bad, especially in the city, but I can see that some rural locations may not consider that an option. However, tapes, CD players, and even iPod connections are very low cost, and then you can listen to what you want when you want, without paying a monthly fee.

It also appears the car companies, accessory vendors, and car stereo vendors are complicit in this farce. This entire problem could go almost completely away by providing an AUX input jack (the same kind of jack your iPod uses for headphones) on the front panel of every car stereo. Then you could simply plug your iPod into this jack and listen to what you want using the iPod controls. It amazes and baffles me that no one has suggested this (NOTE: in the future all stereos should have a stereo/multi-channel Bluetooth input available so any Bluetooth device can connect wirelessly). Yet it's the endless round-ups of overly complex, substandard, and completely ridiculous FM transmitter accessories for players like the iPod that get all the coverage. Do you know why? Because those things MAKE MONEY, and a stupid $0.50 AUX jack would NOT make anybody a cent.

As long as the American consumer is willing and able to part with their money so easily, American businesses will continue to take it. It's a match made in heaven.

20061204

Biometrics, take me away!

Passwords are a real pain. Every program that your company "lets" you use for your job requires a password. I get that, and I'm fine with it in general, but sometimes the IT folks, in their diligence to be measured favorably, don't seem to give a hoot about usability. Let's face it: without me doing my job, we really don't need them, so how easy or hard it is for me to do my job matters, at least a little. Requiring me to change my password "periodically" seems innocuous...at first. Until you read the fine print.

Let's see...I have only a small number of corporate accounts requiring password authentication. They are:

  1. General corporate Internet access account.
  2. My computer's workgroup account.
  3. My company's email client account.

--> Each of these accounts requires me to use a password, and change it "periodically". What does that really mean? Here are the rules:

  1. You need to use a "strong" password. That usually means something difficult to guess. The IT folks define "strong" as having both letters and numbers, and not using any known or easily guessed names, words, or phrases. A password like "b42wd3fg" is considered "strong".
  2. You must change your password periodically. For the IT folks, periodically means every 90 days.
  3. You cannot repeat the use of a password for a "while". This is where things get interesting. For our wonderful IT folks, that means NEVER. But they are more reasonable than that. They'll let you repeat a password, but only after you have used FIFTY other passwords. Yep, that's right. No repeats until after 50 unique passwords have been used.
--> Now combine these three rules over the three separate accounts I have, and you begin to see the issue. This "simple" procedure mushrooms into an ungodly mess.

Every 90 days I'm forced to come up with a new, unique, strong password that hasn't been used within my last 50 passwords, for each of my three accounts.
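
Just to make the absurdity concrete, here's a minimal sketch of the policy as described (in Python; the eight-character minimum, the function names, and keeping the history as hashes are my assumptions, not the IT department's actual code):

```python
import hashlib

def is_strong(pw: str) -> bool:
    # "Strong" per the rules above: letters and digits both present.
    # (The "no guessable words" check is left out of this sketch.)
    return (len(pw) >= 8
            and any(c.isalpha() for c in pw)
            and any(c.isdigit() for c in pw))

def may_use(pw: str, history: list[str]) -> bool:
    # No repeats until 50 other passwords have been used. The history is
    # assumed to be a list of hashes of previously used passwords.
    digest = hashlib.sha256(pw.encode()).hexdigest()
    return is_strong(pw) and digest not in history[-50:]
```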

Sure, I could try to use the same password for each account. Problem is, the 90-day renewal cycles are not synchronized, and eventually they "beat" against each other mercilessly.

Bottom line: Anybody who thinks this is the best way to safeguard our security is an idiot. Here are two better ways:

  1. Dongle. Use a USB key as a physical dongle that needs to be inserted into the USB port of the computer you are trying to access. The key generates a random, rotating code that cannot be copied or subverted. Add to it a simple, easy-to-remember password that I don't have to change very often and you have GOOD ENOUGH security unless you are in charge of nuclear weapons.
  2. Biometrics. Please, PLEASE, someone get this right. If the USB dongle alone is not enough, add a simple fingerprint scanner to the USB key itself. I've seen these things in the wild, but they don't really work well enough or correctly yet. Here is just one example. There is no reason this could not work extremely well if someone really wanted to perfect the technology.
---> Combining #1 and #2 above would give a robust, secure, AND simple-to-use system, with acceptable security for 90+% of the world's applications.
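
For what it's worth, the "rotating key" part is not exotic. Hardware tokens typically do something along the lines of this minimal RFC 4226/6238-style sketch (in Python; the shared secret here is a stand-in, since a real dongle keeps its secret sealed inside the hardware):

```python
import hashlib
import hmac
import struct
import time

def rotating_code(secret: bytes, interval: int = 30, digits: int = 6) -> str:
    # The code is an HMAC over the current 30-second time step, so it
    # changes constantly and a copied code is useless moments later.
    counter = int(time.time()) // interval
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % 10 ** digits).zfill(digits)

print(rotating_code(b"example-shared-secret"))
```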

Why is this not done today? Simple. The IT folks are not measured on "simple-to-use", so they don't care. No one is sufficiently motivated to make this problem go away. As with most things, wherever there is a "loose connection" between a problem and its solution, the solution never happens.

20061201

RFID tags for US passports - what is the big deal?

I travel all the time, mostly for my job. Often, this takes me out of the U.S. I'm what the airlines call a "frequent", short for "frequent flyer". What this really means is that I get routinely abused by the travel industry. But that's another blog. This one is about RFID (Radio-Frequency IDentification) tags. These little buggers have been around (technologically) for decades, but only now are they starting to become ubiquitous. Bottom line: these things will be everywhere.

First, what the heck is an "RFID tag"? Put simply, it's a small electrical circuit that gets stuck to an object that someone wants to track, identify, or otherwise hold descriptive information for. An example would be a package that is being shipped across the country. The shipper places a sticker containing an RFID tag on or in the package. That way, the shipper can easily track the package wherever it goes. It's much harder to lose something you can track this easily. These things can be incredibly small (like a postage stamp or smaller), and, lately, pretty cheap to make. How cheap? Literally cents.

Given these things WILL be everywhere, allowing the simple, cheap tracking of objects, the U.S. Government has decided that it might be a good thing to place one of these things inside all of our U.S. Passports. Therefore, starting early 2007, some if not all new or renewed U.S. Passports will contain RFID chips that mimic the information printed on the passport. Eventually, readers will be used at airports and in other places to verify the passports' contents.

Many people are not happy about this. Expect more media coverage and public discussion as the date for the first of these new passports to be issued draws nearer. The media simply has not noticed this issue yet, but they will. To prepare, read on, and check out these other sites:

This link will take you to the US Gov site and many links they provide to describe the new epassport. My take is that this is a good thing for our security, but not everybody agrees with me. Here is a general story from CNet. And here is a more technical site if you want more detail. You can read all the Orwellian horror predictions here.

For me, I simply do not understand why certain people are continually afraid of new technology like this. RFID tags will not reduce your security, they will increase it. Here's how:
  1. Access to the digital info is more difficult. In order to read the digital info stored in the RFID tag in your passport, an unscrupulous person would have to: 1) illegally obtain a reader device, 2) get to within 12-24 INCHES of your passport, 3) circumvent any RF-shielding present (I plan to get a new passport holder that incorporates a simple RF-shield, like aluminum foil), and 4) bypass all the data encryption inherent in the RFID tag. Compare this to simply stealing your passport by hitting you over the head in a dark alley...
  2. The digital info simply encodes the analog info. There is not supposed to be anything encoded in the RFID tag that is different from what you could see and read if you opened the passport and simply looked at it. It's not like there is some additional, more sensitive info in there. Of course, it is the government ;)
  3. If somebody does steal your passport, or if it is lost, it will be much easier to replace and re-validate. The government can simply "invalidate" the RFID tag in your stolen passport instantly, such that it will be flagged by every computer from here to Bangladesh. Then they can issue you a new one, including the picture, instantly.
I know it's scary to some folks when a government encodes data about you and stores it. But really this is no different from credit card companies issuing you a credit card with a magnetic stripe. What do you think is on that stripe? Are you sure it's protected? How easy is it to read? How many of those cards do you have in your wallet right now?

As a frequent, I'm happy to see anything that can be used to protect my security and enhance the efficiency of the travel industry. As long as the potential for abuse is constantly monitored and minimized, these technologies will make things better, not worse.

20061029

LEDs are the future of lighting

To many of you, this is probably obvious by now. I'm convinced that, in the not too distant future, all lights will be LED (or some similar semiconductor based) lights. We finally have come to a point where LEDs can be manufactured in white or near-white variants, and that was the big roadblock.

The advantages are many. LEDs are incredibly robust, with MTBFs (mean time between failures) in the 20,000-hour range. At a typical 2-3 hours of use per day, that works out to roughly 20 YEARS! Further, LEDs consume much less power than incandescents, and even less than those low-power fluorescents we all thought would take over the world a few years back. Oh, well, that never really happened, did it?

Wanna try out this new frontier of lighting easily and cheaply? Then get some LED-based night lights. These little cuties can be had for around $4 apiece. Check out this one offered through one of my favorite online sellers, Amazon:


You can get these at most hardware stores like Home Depot also. The cool thing about going with an LED nightlight is that you'll probably never have to replace it! Yep, since they plug into the wall, there are no batteries, and the photocell and LED will last virtually forever. You'll probably move or die before needing to replace this bugger (now that's a weird thought).

What's really weird is the notion that, in a few years, ALL your lights could be like this! Imagine never replacing a light bulb!

The really interesting thing is:

No matter how many of these "timesavers" we seem to implement in our daily lives, we remain way too busy...figure that one out and then we'll have something to discuss.

20061020

Why buy a KRZR?



I recently decided to buy a new cell phone. Like most folks, I wanted something as small as possible. I wanted a really good phone. Something that was tested and used by a lot of folks. Buying the latest and greatest tech is the best way to be disappointed.

I had decided for a while now to buy a RAZR. They've been out since 2004, and MOT has sold millions of them. They work great as phones, and are still one of the coolest looking designs out there.

Then, right before I decided to make the purchase, MOT came out with the KRZR. This phone was originally hailed as being the "latest and greatest" and even smaller than the RAZR. What a dilemma.

After checking out both phones, I decided to get a RAZR. I learned through several reviews that the KRZR is essentially a repackaged RAZR V3m, so from a functional standpoint there was very little difference. That left only the size metric to consider. I REALLY wanted a SMALL phone.

So how much smaller is the KRZR? Well, here are the facts:


RAZR: 14mm x 53mm x 99mm, which gives a total volume of 73,458 cubic millimeters

KRZR: 16mm x 42mm x 103mm, which gives a total volume of 69,216 cubic millimeters

--> This implies the KRZR is 4,242 cubic millimeters smaller than the RAZR, which represents a mere 5.77% reduction in volume.
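
If you want to check my math, here's a quick sketch in Python (the dimensions are the published specs quoted above):

```python
razr = (14, 53, 99)    # thickness x width x length, in millimeters
krzr = (16, 42, 103)

def volume(dims):
    t, w, l = dims
    return t * w * l

v_razr, v_krzr = volume(razr), volume(krzr)   # 73,458 and 69,216 cubic mm
savings = v_razr - v_krzr                     # 4,242 cubic mm
print(f"KRZR is {savings:,} mm^3 smaller ({100 * savings / v_razr:.2f}% less volume)")
```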

So, let's review:

1. KRZR is the same phone internally as the RAZR V3m.
2. KRZR is 4mm longer and 2mm thicker than the RAZR.
3. KRZR is less than 6% smaller in overall volume.
4. KRZR costs at least $100 more!

Therefore, unless you HAVE to own the "latest and greatest", or you need a skinnier (albeit longer and thicker) version of the RAZR V3m, the RAZR seems like a no-brainer.

RAZR still has legs!

I've had the phone for over two weeks, and think it's great!

20061009

Phone companies - from Ma Bell to God Awful...

Back in the day, you did not have any "choice". Ma Bell was the only option. You either dealt with her or had no phone service. As a result, the phone company had absolutely no motivation to make customers happy. You constantly got the run around for service. You felt frustrated, like you never got your money's worth. The "phone company" was the butt of many a late-night joke. You had no choice...you were "locked-in".

Well, we've come a heck of a long way in 30 years. Granted, now we have many different "phone companies" to choose from. But are they any different? Service is still a joke. I still feel very frustrated whenever I have to deal with them. Subsidies aside, I STILL do not feel like I get my money's worth. Instead of paying $20/mo for phone service like we did 20 years ago, we now get three bills from three different companies and pay $100/mo. Factor in that technology dramatically lowers the cost of electronics over time, and that is an amazing number. Somebody, somewhere, is making a TON of money on all of us.

Now, instead of feeling locked-in because there is only one phone company, I'm locked-in because some fast-talking bottom-feeder salesman (who clearly works on commission and is spiffed for pushing certain phones over others) gets me to sign a 2-year "commitment" with an outrageous "early termination fee".

I think it's incredibly convenient that all of their "satisfaction guarantees" expire way before you ever get a bill. That way, once you pick your jaw up off the floor, it's too late to do anything about it.

Thanks, deregulation. Some improvement. From "no-choice" to indentured servitude.

20061004

MIT Emerging Technology Conference MITETC - Day Two

MIT Emerging Technologies Conference
Thursday, Sept 28, 2006
Cambridge, MA

Day two of this conference was decidedly different from day one. While day one consisted of more light-hearted, even optimistic viewpoints on techy topics such as web services, online applications, how to define and nurture innovation, and what AOL is up to, day two was much more sobering.

The day started with an interesting, if not optimistic, talk from George M. Whitesides. George discussed the recent Senate committee report on U.S. competitiveness. Here are the high (uh, maybe low) lights:

  • The U.S. is now a net importer of high tech products
  • U.S. companies are now ranking lower with respect to high tech patents than ever before
  • U.S. students are now doing worse compared to their peers outside the U.S.
Recommendations given to deal with this reality include:

  1. Capital - increase teacher and student support, and strengthen funding for basic research
  2. Labor - get the best teachers, increase teacher status and salaries
  3. Energy - clearly the most controversial recommendation: create a DARPA-like organization whose goal is to fund advanced energy-saving research. Called "ARPA-E" by the committee, it did not fare well, and I can't for the life of me understand why. Seems we need to push and reward BIGTIME efforts to do amazing things in alternative energy. My cynical view: legislators are either too ignorant to know what to do with this organization should they create it, or they have been bought off by existing energy interests.
My take is that we are and have been in a true innovation crisis in America. We only support incremental and/or piecemeal solutions that, while being cheap and attainable, are woefully inadequate. We will only achieve big when we think big. Seems all the big thinkers are dead. Incrementalism will never get us where we need to go.

Former VC Roger McNamee, Co-Founder and Managing Director, Elevation Partners

This guy was truly one of the best and most exciting speakers in the entire two-day event. He talked on almost any subject, and kept the audience engaged constantly. Here are some of his most notable quotes:

"People don't want software, they want outcomes."

"Technology will make a difference [at least first] through media."

Although the Internet has done a great job in aggregating information, it does not prioritize it effectively so people can easily get at what they want.

I totally agree with this one. In fact, now that any 13-year-old can create a web site, you can't really tell if someone is an expert (like me ;) ) or a nut-job (take your pick). This is why the Internet is the "great equalizer", or more appropriately, the "great mediocre-maker". We truly need, as Roger says, to "move up the stack from 'information to wisdom'".


Big issues with moving up the stack are things like personalization, trust, and authority. We need to KNOW that the info we get is the best info, the most relevant for our needs. In accordance with my earlier comments about incrementalism, Roger states that:

"We are harvesting our national economy." -

This is exactly right. We are doing nothing to plan the great things of the future. We are merely milking the cash cow of the present for our own current needs, and to Hell with the future.

--> As if this was not enough to chew upon, the next panel was even more disturbing:

Panel Discussion on Global Warming

As I said in an earlier post, this discussion was really depressing. Here are the basics:

Once countries figure out the reality and danger of the global warming crisis, reducing carbon emissions will be THE central focus for investment in the future. There will simply be no alternatives. This was the premise put forth by a group of esteemed experts on the subject. All were in agreement as to the dire nature of this threat. Problem is, we only have between 10 and 20 years to act, and most countries are not convinced of the threat yet. The panelists seem to think that, by the time we all figure this out, it will be essentially too late to do anything about it.

Their assertion is that you need to stay below 500 ppm of carbon dioxide in the atmosphere to avoid an irreversible global catastrophe, a so-called "tipping point", so if we act too late we will not be able to avoid it. Apparently this is because it takes on the order of 3,000 years to get carbon out of the atmosphere once it gets in. All of this stuff is essentially cumulative, and we are already dangerously close to the limit.

IF you believe all this stuff...it's very scary indeed.

Next topic ended my day on a lighter, albeit more frustrating, note.

Panel discussion on DRM: "Making PCs safe for Hollywood"

Discussion of activities around the Trusted Computing Group's efforts to create HW and SW DRM that works. Newest version of this is called the AACS, for "Advanced Access Content System". This basically will allow a certain number of "managed copies" of an asset to exist simultaneously on different media with different devices, so one person can listen to music, say, on portable devices as well as PCs.

I believe that this DRM stuff is basically the music industry's attempt to "hold back the ocean". My opinion is that it will work, but we will all be made miserable in the process. I personally resent being made miserable in order to prevent something being done by a small minority of people. However, this is not the first time in history this has occurred, and it won't be the last.

That's it...all in all one of the most interesting and thought-provoking conferences I've attended in a while. I would highly recommend it to anyone.

20061003

MIT Emerging Technology Conference MITETC - Day One

Finally here are my highlight notes about this conference:

Last Wed and Thursday, Sept 27 and 28, 2006, I attended the MIT Emerging Technology Conference on the MIT campus in Cambridge, MA. This two-day event brought speakers from around the world to discuss the key technologies that will impact our world. Its scope was broad, and the speakers were varied. The one thing they all had in common was a passion for the key technologies of the day. Here are my recollections from Day One:

MIT Emerging Technologies Conference
Wednesday, Sept 27, 2006
Cambridge, MA

Keynote - Jeff Bezos - Chairman and Founder, Amazon

You can view Jeff's entire keynote on video here.

Jeff talked about how Amazon is providing web services to developers. They are "eating their own dog food": they develop web services that they use internally for the Amazon web site, and offer those same services to developers for their own web applications. The fee structures are very appealing, with zero startup cost and usage fees that are essentially linear, so small startups can start with very little money, yet use the same services to scale to super-large proportions with no changeovers required. Jeff calls this commoditizing the "undifferentiated heavy lifting", or "muck". Offered services:

1. Mechanical Turk - encoding human intelligence. Using humans to answer questions. This is like Microsoft Live QnA.

2. S3 - Simple Storage Service - a web service that ASPs can use to store data in the cloud. Redundant, 24/7 storage/backup. SmugMug uses S3 today. Cost: $0.15/GB/mo to store data.

3. EC2 - Elastic Compute Cloud - web service that provides CPU cycles on a standard platform.

4. Amazon fulfillment services - just launched.

--> For more information, check out www.aws.amazon.com.
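
To give a feel for how thin the programming interface to something like S3 is, here is a minimal sketch using Amazon's current boto3 Python SDK (the SDK, bucket name, and key are my stand-ins, not anything shown at the talk; credentials are assumed to be configured already):

```python
import boto3

s3 = boto3.client("s3")

# Store an object, then read it back -- the storage itself is Amazon's problem.
s3.put_object(Bucket="example-bucket", Key="hello.txt", Body=b"Hello from S3")
response = s3.get_object(Bucket="example-bucket", Key="hello.txt")
print(response["Body"].read().decode())
```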

Bezos quote: "We make muck, so you don't have to."

Another good quote from Bezos:

"In the future, the best applications will be a hybrid of client-side code and server-side code. "


John Miller, CEO, AOL

John gave an "OK" overview of the Internet industry, and focussed mainly on AOL's new video distribution business, ostensibly an effort to get in on the YouTube phenomena. His message was while consumption is being disaggregated by all these new web sites, the monetization of video and other media is actually being aggregated by a small number of companies. I guess John hopes that AOL will be one of them. After hearing his talk, I'm not so sure...

Panel discussion led by David Faber of CNBC, with reps from CNET, Reuters, etc. - Interesting but not particularly ground-breaking discussion.

Panel discussion led by the CTO of Motorola. Another (barely) interesting discussion about innovation in various types of companies. Some of the notable statements:

- Innovation - can it be taught? Interestingly, the moderator stated at the end that the group "concluded" that yes, indeed, innovation can be taught. This is of course garbage: the group clearly indicated that, although the basics of innovative processes can be presented and taught, examples can be given, and innovation can be nurtured, it is just like any other talent. Some people have it, and some clearly don't. So in my opinion the entire question was a red herring (much like most of these panel questions, because they want to generate discussion, not necessarily answers).

- Behavior - You get what you measure and reward. If you want a certain outcome, you need to measure it and reward it effectively. I really agree with this one. Way too many times people wonder why they don't get something when they are not measuring and rewarding what they want!

- The most interesting panelist by far was Jay Walker - had many great comments. One of his best was:
"Leadership is like soul - it can get 10X output from 1X input." This guy really gets it.

Panel discussion - "Online Application Wars" - reps from Google, Salesforce.com, Amazon and two small ASP companies (37 signals, Goowi) discussing how online apps will fare in the future. Microsoft was invited to send a rep but did not. Interesting title since on nearly every topic, all the panelists were in violent agreement. It was interesting, but it would be nice to have such strong, articulate minds engaged in more controversial, passion-inducing topics of the day. Here is my quote of the day:

"Life is too short to talk about the things we all agree on..." - Tom Berarducci.

That's it for day one...

MIT Systems Engineering Conf - Sept 26, 2006

Finally getting around to blogging about the conferences I attended last week.

Last Tuesday, I attended the MIT Systems Engineering conference at the MIT Media Lab. Billed as essentially a "supply chain" conference, it seemed like it might be a royal yawn. Boy, was I wrong! I never, ever saw engineering and supply chain issues brought more alive. The speakers were dynamic, and kept the topics interesting and relevant. Many of the issues were broadened to encompass general business, and even life, issues.

Here are the nuggets I mined from the day's events:

Yossi Sheffi - Author of The Resilient Enterprise.

www.theresiliententerprise.com

- winner of the "best quote of the day" award:

" 'A' people hire 'A' people, but 'B' people hire 'C' people." - Awesome.

and another good one:

"Focus on those few aspects of your products that your customers find most valuable, and [commoditize] the rest." - great advise for all of the engineers out there...

Professor Leveson -

"Safety is an emergent property." - This can be of course extended to almost anything. The key point here is that we need to strive in our systems, product designs, etc. to create "emergent properties" that are more than the sum of the parts. This is where true value lies. A more contemporary example would be "Style is the emergent property of the iPOD." It seems bigger than the simple little music player itself. Very cool.


Professor Nightingale -

Book that invented the term "lean" was:

The Machine That Changed the World

...did not get to confirm this one.

Joel Cutcher-Gershenfeld -

Lateral Alignment in Complex Systems - an interesting talk about how to relate, horizontally, the systems of vertically integrated organizations.

Irving Wladawsky-Berger -

"Highly Visual interactive interfaces" - discussed how we are now moving past the simple desktop metaphor into more graphical, interactive interfaces. "Second Life" was his example of something which is on the forefront of this initiative.

And last but certainly not least:

Dr. Michael Hammer -

www.hammerandco.com

- This guy was a riot. Basically he talked about organizing the enterprise for the next century. I never thought a talk about organization could be so amusing, entertaining, and educational at the same time. His premise was that all the organizational metaphors we are using are based upon antiquated techniques invented a century ago. Instead of simple functional organization, we need end-to-end process organization. Very interesting stuff.

If you can ever hear this guy speak, go and see him!

That's it. All in all a very interesting day.

20060929

MIT Emerging Technology Conference MITETC

Just got back from the MIT Emerging Technology Conference - very cool indeed.

I'll post a summary when I get around to it...which may be a while the way I'm going these days.

What was the coolest thing?? Hard to say. I think the most "descriptive" aspect of the conference was not who was there, but rather who was not there. MIT did a great job of bringing in luminaries, thought leaders, Nobel laureates, and other heavyweights from around the country and around the world. I was very impressed with the quality of the speakers. What was interesting was the rather large hole created by the absence of one major company - Microsoft. During all the exciting discussions of Web2.0, ASPs, web apps, web services, etc., the only time the Redmond-based company came up was as the butt of a joke by one of the speakers. True, they are an easy target, and it did get some cheap laughs, but the reality is much more telling.

Our computing and Internet industry is nearing a significant crossroads, with Web2.0 and other open-standards proponents clearly on one side, and Microsoft on the other. The Vista operating system will be a step-function in the negative direction in this respect, with its myriad proprietary protocols and formats.

The first step toward a solution in any negotiation is dialogue. When Microsoft does not even care to be in the room...that is truly sad.

20060815

My monitor rant...and why Microsoft must change to beat Apple

Yes, my new wide-screen 20-inch Samsung flat-panel monitor is great. Near-HD resolution (1680x1050), a 16:10 aspect ratio, and crisp, clear graphics with good uniformity and great color fidelity are a joy to behold. That is, until you try to power it down.

Don't get me wrong, this is NOT the fault of the monitor. Problem is, determining exactly WHOSE fault it is is very, very difficult, and that is what leads me to my Rant, and more broadly, to why Microsoft has to become more vertically integrated (like what they are currently doing with Zune) in order to beat Apple at its own game.

First the Monitor Rant.

When you leave a Mac computer unused for a while, the computer goes to sleep (standby mode) in a very friendly way. It just works. The machine goes into a low-power mode, and the display quietly fades to black. When you want to go back to work, you simply touch a key or move the mouse, and the display INSTANTLY resumes, with a nice, friendly fade-in to full brightness.

Simplicity. That's what Apple has perfected in their products.

So why can't a Windows machine do that? Well, the answer, although perfectly legitimate, is not so simple to explain. Here goes:

Microsoft's entire system strategy depends upon partners adhering to specs laid down by Microsoft (and certain standards organizations) so that the various interdependent interfaces in the system work together. The case of system standby operation is an excellent example of how this approach tends to lead to what I call "least common denominator" user experience.

When a computer goes into standby (sleep) mode, here is what has to happen. The OS decides it's time to sleep (based on a user setting). Then it needs to tell all the devices attached to the computer to go into sleep mode. This mode is defined by some sort of standard; one such standard, covering the system and its devices, is ACPI, the Advanced Configuration and Power Interface. After all the devices have gone to sleep, the OS can tell the motherboard to enter a low-power state, and then the system goes to sleep. Waking up is triggered by the user touching a key or moving the mouse. That requires the motherboard BIOS (basic input output system) to recognize this action and in turn wake up the OS, which then wakes up the devices, and you are back in business.

Problem is, on a Microsoft system many of these devices are not designed by Microsoft. And if any ONE of these devices does not fully comply with the interface spec, you will get unwanted results. This may include the screen not coming back on after standby, or the system not going into standby at all. Sometimes you'll notice this, sometimes not. Thus, the system operation is at the mercy of the "least common denominator" component.

Why doesn't Apple have this problem? Simple. Apple controls all the interfaces and devices attached to the Mac much more closely than Microsoft does. In many cases, Apple engineers design all the hardware. This way they can make sure the system performs as desired, not simply as the spec allows. The figure below illustrates what I'm talking about for the case of system standby turning off the monitor:


Because of this tight integration, Apple systems simply work better. Thus, my nice Samsung monitor may look great when on, but for some reason, the system will not resume from standby. Therefore I need to disable standby in the display settings, and simply use a screen saver. I don't like this because it uses a lot more power leaving the computer on 24-7. It also does not make use of the monitor's Energy Star power saving capabilities. It's also Just Plain Wrong.

Unfortunately, the only way Microsoft will ever get past this issue is to copy what Apple has done. They need to field "whole product solutions" so they can control more tightly the entire system's user experience.

Now the part that relates to Zune.

Windows-compatible music players have been around for years. They have had much less success than the iPod. Why? While I firmly believe a lot of the credit goes to the fantastic marketing prowess of Apple (who could probably sell ice cubes to Eskimos), that is not the only reason. Fact is, the iPod would not sell if it sucked. The reason Windows-compatible music players have not succeeded is that they simply don't work as well on Windows as the iPod works on Macs. The Zune player may hope to overcome this limitation by controlling the interfaces, and therefore the user experience, more tightly.

Microsoft must do this because you simply cannot leave system integration to standards organizations, and hope that a set of specs, designed by committee, will "enable" the desired emergent behavior. In short, you cannot spec a killer product by remote control. It must be a close, hands-on, collaboration between design, manufacturing, engineering, and marketing. Nothing else will work. Period.

Will Zune succeed? Only time will tell...but my guess is that it will...not. Microsoft is moving in the right direction with this product, but it appears to be targeted at a mature market which already has a competent, aggressive, clear leader. Unless Microsoft somehow disrupts the product space with a completely new approach and functionality, they really are too late to this party.

But I think they should keep on plugging. One thing about Microsoft. They are tenacious.

20060806

Bigger is not always better

An additional piece of info regarding the Samsung 205BW 20-inch widescreen monitor review posted before. When considering monitors, you want the sharpest picture available at a price you are willing to pay. Unfortunately, it's not necessarily in the best interest of the manufacturer to make that decision an easy one. Specsmanship has for a long time been the bane of the electronics buyer.

As I indicated in my previous review of the Samsung, I believe that, for now at least, this monitor is at the sweet spot in terms of price and technology. What I meant by that is that the pixel size of this monitor is optimum for a sharp, crisp picture without breaking the bank.

Well, now you don't have to take my word for it. As you can see in the following graph, I've plotted the relative pixel sizes (in terms of total volume) for several representative monitors with diagonal measures ranging from 19" to 24". I'm sure you'll recognize the model numbers I've used to create the plot, since I've tried to pick some of the newest and hottest models around. Take a look:

Monitors used for this graph:


If you want a clear, crisp, sharp picture, you need a lot of small pixels. With today's technology, that means a 20" widescreen monitor with 1680x1050 pixel resolution. Bigger screens simply stretch the same number of pixels over a larger area and, at the same viewing distance, will make the same picture less sharp! Even the Dell 24" flat panel, going for approx. $800 today, has larger pixels than the 20" Samsung, which can be had for around 300 bucks.
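
You can work this out yourself. Here's a quick sketch (in Python; the three sizes and resolutions are representative of the monitors discussed in these posts, not the exact model list from my graph) that computes pixels per inch and pixel pitch from the diagonal and the resolution:

```python
import math

# (label, diagonal in inches, horizontal pixels, vertical pixels)
monitors = [
    ("19in 1440x900",  19, 1440,  900),
    ("20in 1680x1050", 20, 1680, 1050),
    ("24in 1920x1200", 24, 1920, 1200),
]

for label, diag, w, h in monitors:
    ppi = math.hypot(w, h) / diag      # pixels per inch along the diagonal
    pitch = 25.4 / ppi                 # pixel pitch in millimeters
    print(f"{label}: {ppi:5.1f} ppi, {pitch:.3f} mm pixel pitch")
```

The 20" panel comes out around 99 ppi, while the 24" comes out around 94 ppi: bigger pixels, exactly as claimed above.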

If you are going to hang the monitor on a wall and step far away from it, then bigger is usually better. But if you are going to be sitting the monitor on a desk and trying to do some serious work on it, you don't necessarily want more diagonal inches.

Buyer beware...and good luck!

20060725

Samsung Syncmaster 205BW Review

Since it seems that no one has yet reviewed this little baby I thought it was my duty to put my $0.02 in. Here goes.

The Samsung 205BW 20" widescreen LCD monitor is quite nice, offering a clean design, with simple, intelligent controls, nice thin 3/4" bezel surrounding a gorgeous 1680x1050 resolution LCD panel. Take a look:





Here's a look at the monitor full-view, showing the pedestal stand:




What's included in the box: Monitor, AC line cable, 15-pin analog video cable, and (a very nice touch) a nice DVI cable. Also included is a software disc that includes the driver and some accessory software, and a printed manual.

Connections: In the picture below, you can see the connections available on the back of the monitor:




As you can see, they are all located nicely in a recessed area at the bottom of the back of the monitor.

Operation: Operating this monitor is a breeze. Simply plug it all in and boot your computer. If there is no driver present, the in-box VGA driver will work well enough to let you see the video. Then either the plug-n-play system will ask for a driver, or you can simply slam in the CD and load it yourself. After loading the driver, reboot the computer and you will be in business.

Quality: This monitor rocks. At 20" diagonal, with a 16:10 aspect ratio and 1680x1050 pixels, the display is crisp and clear. This combination of form factor, diagonal size, and spatial resolution is simply optimal for today's technology and general digital imaging work. 19-inch monitors have lower resolution, typically 1440x900, yet are not much smaller, so their pixels are larger and the image less sharp. Larger monitors, be they 21 or 22-inch types, almost all have exactly the same resolution as the 20" (1680x1050), again making the resultant pixels larger and the image less sharp. You need to go up to a 24" diagonal to get more pixels (1920x1200, beyond full 1080p resolution), but yet again the pixels will not be that much smaller, the image will not be that much sharper, and your pocketbook will be much, much lighter. So, for today's technology at least, this monitor is exactly at the sweet spot. When you consider it is available for around $300, and the 24" models are sitting at $500 today, this deal is really sweet.

What's cool: Besides the simple, clean design, and fantastic image quality, one of the coolest things about this monitor is the neat articulating stand. The stand, a simple pedestal mount, is attached to the monitor via a standard VESA-compliant mount. You can see how the monitor attaches to the stand here:



If you want, you can remove this mount and attach an after-market type, perhaps for mounting the unit to a wall. But believe me, for general tabletop use, this stand is great.

Once you have released the monitor stand from its shipping condition by pulling the cotter pin from the base, you can easily raise the monitor with a finger. It can be easily lowered to nearly 2" from the surface, and raised to approx. 5 inches off the surface. Moreover the monitor can be rotated by approx 180 degrees around the base by simple finger pressure!

After seeing how many manufacturers mess up a simple thing like articulation, it is truly a joy to see Samsung do such a marvelous job. Why can't everyone do that?


Obviously I think this monitor offers a great combination of features, quality, and price. You could wait for the new 22-inch Samsung widescreen, but honestly I don't think it's worth it. If you are interested in more in-depth monitor reviews, along with reviews of other computer-related equipment, I suggest you check out Tom's Hardware Guide's (no relation) review pages, or epinions.com. Here is the link to the latest LCD monitor review at Tom's Hardware. At the time of this post, the 205BW was not reviewed there, but I'm sure they'll get around to it soon, so check back. Until then, you'll just have to take my word for it.

20060720

19" wide-screen monitors: a good deal if you know what you are getting...and what you are giving up.

Ever wonder why things are put on "sale"? Think about it, it actually COSTS money to put something on sale. The vendor has to advertise the fact that they are willing to sell something for less. So not only do they lose the difference between "regular" price and "sale" price, they lose much more because they had to advertise the sale.

So why do this?

There is only one reason items are put on sale: to STIMULATE SALES, either of that item (a legitimate sale) or of other items (so-called "loss leaders", or cross-effects).

But why put things on sale at certain times? Standard practices dictate that you place items on sale when you need to move them. In the case of technology, you often need to move items to make way for new technology. The last thing you want is some old "obsolete" piece of junk sitting next to the NextBigThing on the shelf.

That is the case with 19" monitors RIGHT NOW. Two things are happening in the industry wrt monitors right now. Here's the scoop on both of them:

#1. Shift from 4:3 aspect to "Widescreen" aspect. HD now has enough critical mass, and manufacturers now have enough supply, to get behind massive promotions of new HD-format monitors, which will have the same "aspect ratio" (ratio of width to height) as our movie theaters have had for decades. This "new" aspect ratio is 16:9, much wider than the old television standard of 4:3. For whatever reason, the whole world is moving this way.

#2. New display technology which will bring near-HD resolution to a desktop monitor. New monitors will have 1680x1050 pixels in 19-22" diagonal sizes, with an approx. 16:10 aspect ratio. The next meaningful breakpoint will be so-called "full 1080p" resolution of 1920x1080 pixels, which will be in monitors over 23" diagonal and cost a LOT more (these prices are never linear until commoditization occurs).

--> So the reason 19" monitors are on sale right now is that newer monitors with near-HD resolution and widescreen format are coming to a storeshelf near you! The stores know that once they do, there is NO WAY they'll sell these old units for what they want to.

So you have a choice: buy old technology now, or wait until the new stuff comes online. Or better yet, wait. The new stuff will most likely be horribly over-priced (at least until Black Friday, the day after Thanksgiving), and when it does arrive, if you can find one of those old monitors, I guarantee you'll get it for a steal.

My advice to all technology buyers is the same. Know exactly what you need and don't buy anything more. The rest is just useless indulgence. Further, if you can get away with it, buy at least one and preferably two technology iterations behind the current "Best" product. That way, you'll save boatloads of money, and have pretty good stuff. You can usually find items like this on eBay, the day after the new stuff goes on sale...

Using this method, today you should be buying proven technology like:

- iPods (which are only now falling out of favor)
- Motorola RAZR phones (which are now really cheap, and have proven to be great units)
- ALL 4:3 aspect ratio monitors...which will start dropping like flies this year.
- And, yes, those 19" wide-screen monitors (approx. 1440x900, 16:9 aspect ratio) which are now on sale for $199 at many stores!

Bottom line: Things go on sale for a reason. Never pay retail if you can get away with it, and it's ok to buy "old" technology, as long as you are ok with giving up what will be in the stores the next week.

20060717

Why is my computer so FREAKIN SLOW??

Now here's a topic everyone can relate to. We have come a tremendously long way in the computer industry over the last 30+ years. I can still remember (unfortunately) some of those early machines. They were "clunky". They were big. The screens were low-res: black and white, or green on black, or something similarly boring. And, interestingly, they were also

Dog Slow.

Which is something we ALL can relate to. Because, with all our wonderful reductions in size, increased resolution, graphics "acceleration", etc., for everyday tasks (booting up, starting an application, switching between applications, shutting down, opening windows, closing windows, etc., etc.) most of our computers are still DOG SLOW. So much for 30+ years of computer industry "improvements".

But what really gripes me is not that our computers are slow at these everyday tasks, it's that

---> Nobody seems to acknowledge this fact and, furthermore, nobody seems to be doing anything about it.

Yes, yes...I'm an engineer. So please don't start telling me that "it's all very complicated", or something like "things really are better"...blah, blah, blah. The reality is that engineers think performance tuning is boring. Further, most development engineers have the most tricked-out, memory-maxed, highest-possible-end systems on the face of the planet that they have personally tuned to the max for themselves. So guess what? It's not their problem. It's ours.

For what it's worth, I think we have allowed the architectural concept of multi-tasking to be taken to extremes, and that is a big part of this problem. Microsoft has created a "system" that allows processes to be spawned at will by almost any ISV; these processes simply sit there and take up CPU time, disk access, and God knows what else, and there is very little control placed over them at any level. That is why 1) game manufacturers don't have this problem (they try their best to take over the machine and not let the offending code run at all), and 2) after only a short time, our once fast-running new computers slow down to an abysmal crawl.

Yes, our computers are wonderfully fast when they are new. All of them. Not just the expensive ones. And they ALL slow down after a few weeks of use. All of them. Not just the cheap ones. Why? Because of a combination of the OS and ISVs. ISVs write code that abuses the operating system, and the operating system lets the code execute.

We need our computer OS to guard against performance-hogging processes the same way we guard against viruses (BTW, crappy anti-virus software is one of the biggest sappers of performance). The APIs need to enforce this process.

For example, why does an application, once installed, slow down your system EVEN WHEN NOT BEING USED? Suffice it to say that it does, and this is unacceptable. The ONLY way an application I install should be allowed to run a single process is with my explicit permission; it should not be allowed to spawn process after process, leaving them to run not only when the application is called, but whenever the machine is booted. These processes are often hidden, and almost impossible for even a trained expert to find and extinguish.
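
If you want to see this for yourself, here's a rough sketch (in Python, assuming the third-party psutil package; the two-second sample window and the top-ten cutoff are arbitrary choices) that lists the background processes quietly consuming your CPU right now:

```python
import time
import psutil

# Prime the per-process CPU counters, then sample again after a short interval.
for proc in psutil.process_iter():
    try:
        proc.cpu_percent(None)
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

time.sleep(2)

samples = []
for proc in psutil.process_iter(['pid', 'name']):
    try:
        samples.append((proc.cpu_percent(None), proc.info['pid'], proc.info['name']))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

# Show the ten busiest processes, most of which you probably never asked for.
for cpu, pid, name in sorted(samples, reverse=True)[:10]:
    print(f"{cpu:5.1f}%  pid {pid:>6}  {name}")
```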

And there are other reasons why, as I install more and more software, my system slows down. One is the registry itself. The whole concept of "installing" software needs to be rethought. And soon.

What can you do? Unfortunately, not much. You could choose not to install any software on your new computer, but that's easier said than done, as it is impossible to browse the Internet without (sometimes) being asked to install something. There are also Windows "critical" updates, virus attacks, anti-virus attacks, etc.

What else can you do? Go buy a better computer... oh, yeah, not much choice there either. You could do what the industry wants you to do: buy a faster computer, one with more memory, a more expensive graphics card, etc.

And when you do...after a few weeks, it will still be Dog Slow.

20060707

Metadata, Metadata, everywhere, but not a bit to use...

Technically speaking, the term "metadata" means "data about data", or simply information that describes other information. When it comes to digital pictures, typical metadata includes:

Date taken - the date the picture was captured (typically by a camera).
Caption - a user-entered descriptive word, sentence, or phrase about the picture.
Keywords - separate descriptive words or properties about the picture.
Author/Creator - Identifies the person who took the picture.
Owner - Identifies the person who owns the picture. It's important to note that this is not always the Creator. Also note that, although a picture typically has only one Creator, it can have many Owners.

There are many, many other examples of picture metadata. Also, although this post deals specifically with pictures, this discussion easily extends to all other types of files (digital assets).
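
To see some of this metadata for yourself, here's a minimal sketch (in Python, assuming the Pillow imaging library; the filename is a placeholder) that dumps the EXIF tags embedded in a JPEG. Note that captions and ownership info typically live in IPTC/XMP blocks rather than EXIF:

```python
from PIL import Image, ExifTags

# Open a JPEG and print whatever EXIF tags it carries (date, camera, artist, ...).
image = Image.open("photo.jpg")
for tag_id, value in image.getexif().items():
    tag_name = ExifTags.TAGS.get(tag_id, hex(tag_id))
    print(f"{tag_name}: {value}")
```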

Still further, some metadata is "static" and some is "dynamic". Static metadata are items that don't change when the picture is edited, saved, modified, etc. The examples shown above are largely static metadata. Other metadata, like content IDs generated by some hash of the image data, or simply the "date modified" flag, will generally change every time the file is changed somehow. Or at least they *should* change.

Confused yet? It gets better. Although many "standards" exist for metadata, very few software programs completely adhere to them. That's because they all depend upon the goodwill of the software vendors themselves, who quite frankly have bigger fish to fry than fretting over some seemingly unimportant adherence to a spec that will not generate a dime more revenue in the near term. Thus the problem: the world is becoming littered with mountains of digital image files that have illegal, non-compliant, erroneous, or simply no metadata at all.

"So what"" you say? Well, don't say that to the commercial stock photographers. They clearly want to keep track of the very pictures that form the basis for their livelihood. But what about everybody else? Well, you tell me. Go to your computer, and (if you can figure out how) do a search of all your hard drives for .JPG files and sort them by file size. I wagering what you will find is exactly what I find on my computer. Thousands of files of course. But also, many many files with exactly the same size. Now THAT's odd. Why is that?? I'll tell you why...these files are duplicates. Copies of the same image, over and over again, in the same or different directories, partitions, or separate hard drives. They are there because you have NO IDEA how to get rid of them! An effective metadata management system would make this a thing of the past. Further, wouldn't it be nice to know WHICH version of a picture you had? Ever edit a picture, but want to save a backup just in case something went wrong? But you are intentionally creating duplicates! IF you had the proper metadata in these files, the system could automatically manage all this crap for you.

Bottom line. Effective metadata management can make the tangle of duplicates, backups, different versions, different renditions, etc. simple and automatic.

So why don't we have this? If everyone realizes this is a problem, why hasn't something happened? Well, it's simply because doing this right is a heck of a lot of work, requires a lot of collaborative effort among several companies, will take years to implement fully, and ...here is the big one... it is not obvious to most company managers how to make money from the investment! No one wants to pay for it unless they are going to get a big piece of the pie.

Actually...it's worse than that. Current metadata schemes are "fragile". That is, metadata takes a LOT of hard work to create, and yet one non-compliant program or nasty user can destroy all the good intentions. Since most metadata systems rely upon binary or text data embedded inside the file itself, anybody can write (or use) a program to remove, modify, edit, or erase the metadata that has been placed there. So what good is working hard to put my "Creator" or other metadata in my digital files, if once they leave my computer, any schmuck can simply change the info and make "my" picture his??

There must be a better way. One possibility is embedding a unique identifier INSIDE the IMAGE DATA itself. Companies such as Digimarc and others have been touting this type of digital watermarking for years. Supposedly it survives even edits, so someone could take my picture, crop it, apply "autofix" image processing, color balance, etc., to it, and the watermark would remain.

IF we had a standard system that linked this digital watermark to all the other relevant metadata for an image, then we just might be able to claw our way out of this digital mess about to be foisted upon us by our own technology.

NOW...ALL WE NEED IS SOMEONE TO READ THIS BLOG AND DO SOMETHING...!

Interested? Here are some relevant links:

Adobe XMP metadata format
Stock Artists Alliance (SAA)
International Press Telecommunications Council (IPTC)
Dublin Core
Universal Photographic Digital Imaging Guidelines (UPDIG)

Welcome to TomsTechBlog

There comes a time in every man's life when his thoughts turn to...blogging. Or something like that. I've decided to augment my other web postings with this blog, focused specifically on technology. Since I've been in the computer field for over 20 years, I will be discussing computers, networks, digital imaging, photography, portable devices, and related enabling technologies. Take a look at my general blog if you are more interested in my sometimes irreverent opinions on current events of the day. Or check out my website for other stuff. But here we are all about tech. Thanks for reading; I'll try to be at least as interesting as all the other blogs...

Tom Berarducci