Saturday, June 16, 2012

The Fake cloud, the Real Cloud


This article is a little rant about all of the "cloud" proposals on the market. And I think a little history is in order.

During the late 90's, a group of companies led by Oracle (including Netscape, Sun, IBM, and even a pre-Jobs Apple) proposed the Network Computing Architecture, or NCA. Some technologists (like myself) bought into the idea right away. Back then, nobody spoke about “the cloud”.

The idea behind the NCA was to have big application servers serving the needs of clients, moving computing power to the network, with thin clients that had no storage units. Some models even used an ID card as the login credential.

The model was attractive: cheaper than the client/server model, and easier to maintain, update, back up, and support. The computer (called a “network computer”) was going to be an appliance, almost disposable.

The NCA was ridiculed by a big sector of the tech press. They claimed “nobody wants to go back to the dumb terminal again” (not true, though, since network computers had an interesting level of computing power), and many went as far as saying that the web was OK for a shopping cart but was never going to be able to support a whole business system.

Back then, during the dawn of the public Internet (or the www era, or the .com boom, whatever you want to call it), the terms used for the available services were:


  • Hosting: a company rented storage on their servers to host HTML code (a web site), files (via FTP) and email. Later on, the service expanded to provide database access, back-end programming and scripting, and some additional services.
  • Application Service Providers (ASP): companies building web-based applications (in many cases not strictly “web based”, but “plugin based” or “applet based”) that customers could pay a fee to use.
  • Housing: a customer could set up their own server and place it in the provider's data center. Housing companies would provide the networking services to access the servers. Some people called it “caging”, after the cages used to store the servers.


Those services still exist. But with minor modifications, they have been rebranded as “cloud computing”.

To me, the “fake cloud” is just an old service with new packaging. And to me, the “fake cloud” doesn't deserve the right to be called “a new paradigm” or “an innovative model”. It's just not fair.

Let's see some models. Let me start with Amazon's Elastic Compute Cloud (EC2). It's a great service. It's helping a lot of start-ups. But it's still nothing more than housing. The only difference is that the servers are virtual instead of physical boxes. The service's core is the same: placing my box in someone else’s data center. And the skill level needed to set it up is no different from the skill level you need to set up a server sitting on your desk.

Google Apps is another example. I love Google Apps. Despite its limitations, it's a great service for those not willing to spend tons of money on traditional productivity suites, while gaining collaboration features. But once again, that is ASP to me. Nothing more than a bunch of applications offered via the web. It's not a new or innovative model. It's a great implementation of an already existing model. And it doesn't deserve the cloud “medal” either.

Microsoft Azure allows developers to build apps with Visual Studio and SQL Server and put them to work in Microsoft’s data centers. If you're a Microsoft developer, I think it's a great way to deploy applications. But it's still hosting to me. It gives me a database, a web server and a place for my back-end code. Shouldn’t that service be called simply “hosting”?

In that regard, proposals from Oracle, IBM, HP and most big vendors are not much different from the ones mentioned.

Let me insist: those are NOT worthless services. I think it's great we have those available. But to me, if “cloud” is supposed to define a new paradigm, we cannot include “reheating old dishes”.

What would I expect from something called “the Real Cloud”? First, I’d try to inherit as much from the NCA as possible, meaning the workstation should be as thin as possible. In a way, that means the operating system should become irrelevant. Real Cloud applications must be 100% platform agnostic.

A Real Cloud platform should include development tools. And those development tools should be cloud based too! We should be able to leave nothing at the client level. As we promise the power of using an application in the cloud, with no client software installed, we should also be able to promise the ability of BUILDING applications in the cloud. Real applications, which we can make available to desktop computers, laptops, tablets and phones, effortlessly and in one shot.

The Real Cloud platform should also be able to talk to legacy systems. This world is not a binary world, where it's “cloud or nothing”. Cloud apps should be able to talk to legacy applications, in a comprehensive and complete way.
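To make the legacy-integration idea a bit more concrete, here is a minimal sketch (every name, record layout and format below is hypothetical, invented only for illustration): a thin bridge that translates between the JSON a platform-agnostic cloud app would speak and the fixed-width records an old back-end system might understand.

```python
# Hypothetical sketch of a cloud-to-legacy bridge. The record layout
# (10-char id + 20-char name) is invented for illustration only.

import json

class LegacyBridge:
    """Translates between JSON (the cloud side) and fixed-width
    records (the legacy side), so neither side needs to know
    anything about the other's platform."""

    def to_legacy(self, payload: dict) -> str:
        # Pad fields to the fixed widths the legacy system expects.
        return payload["id"].ljust(10) + payload["name"].ljust(20)

    def from_legacy(self, record: str) -> dict:
        # Slice the fixed-width record back into named fields.
        return {"id": record[:10].strip(), "name": record[10:30].strip()}

if __name__ == "__main__":
    bridge = LegacyBridge()
    record = bridge.to_legacy({"id": "C-42", "name": "Acme Corp"})
    print(json.dumps(bridge.from_legacy(record)))
    # prints {"id": "C-42", "name": "Acme Corp"}
```

The point of the sketch is the design choice, not the code: the cloud app never learns the legacy format, and the legacy system never sees JSON, which is the kind of comprehensive, two-way conversation I'm asking for.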

In order to achieve that Real Cloud, we need to change the way we know databases, and the way we understand development. We’ll need to kill old paradigms, not because they’re wrong, nor because they’re old, but because they cannot take us to the Real Cloud dream.

The Real Cloud has the potential to change computing as we know it. And to me, changing computing as we know it has a name: A revolution.

Saturday, May 19, 2012

The Future of Microsoft


It is no secret that Microsoft is in decline. Steve Ballmer is regarded by many as the worst CEO in the industry. They are losing traction. They went from being the #1 technology company to being worth half of an Apple that was once almost dead. And they have meant no gains for stockholders in the last 10 years.

Microsoft is doomed. At least, that's what many analysts say. On the other side, I believe there's a way out for Microsoft.

People who know me know I'm not very fond of Microsoft products. I'm also not very fond of some of their business practices. But one thing I'm not is a destructive thinker. I like to think positive, and in that sense, I'd rather see a new living Microsoft than an old dead Microsoft.

What would I do to give Microsoft new life?

1. Learn from Apple.

If we remember the early 80's, Apple was the big honcho in the Personal Computing world. Microsoft was an important player, but their business was building software for other platforms. Apple had a vision that went astray when Steve Jobs got out. I will not discuss the how or blame anybody for what happened then. The story is complex enough to ignite passions. But I can remark on the fact that when Steve Jobs went back to Apple, he brought back the important things that made Apple great. He went back to its origins, to Apple's mission: make it for the masses, make it insanely great, make a dent in the universe. And bringing its soul back made Apple turn around, in the most amazing way.

I believe Microsoft is going through the same. Their vision went astray. Remember "a computer on every desk, and Microsoft software in it"? Well, computers are not only on desks now. They are also in pockets, briefcases, and under our arms, moving everywhere. I would start by updating the vision for modern days: "A device in every hand and Microsoft software in all of them".

2. Accept the fact that the Windows days are over. 

Microsoft started out building software for other platforms. In fact, it made its fortune building software for other platforms. And even today, they make products for the Wintel platform, built by other manufacturers. During the 90's, Steve Jobs said "The desktop wars are over. Windows won". True enough, Windows is still on most desktops. But they have barely any presence on other platforms. On the server side, Linux grows more popular every day. On the mobile side, iOS and Android have most of the market share. But, hey, those are OTHER platforms! And Microsoft is too stubborn to build software for them, tying their products to Windows, Windows Server, Windows Mobile, and Windows whatever. That's not the Microsoft created by Bill Gates. Microsoft moved from being a software developer to being a closed-platform integrator. And that is killing them.

Is it too far-fetched to think about Microsoft Office for iOS or Android? Or SQL Server and Visual Studio for Linux? And not a crippled version. I'm talking about competing toe-to-toe with Apple's software for iOS. Microsoft has the talent and the money to make it happen. I'm sure it would be a best-seller. Build connectors for the enterprise services: SQL Server, Exchange, etc. Make those other platforms coexist with their products. It's interesting that Apple and Google could be the companies that might save Microsoft.

Paraphrasing Steve Jobs: "The mobile war is over. Microsoft lost". It's time to accept it, and move on.

3. Bring Bill Gates back to the CEO chair.

I was just going to say "get rid of Steve Ballmer", and simply get a new CEO. But Bill Gates brings what I believe Microsoft needs desperately: the founder's vision. I don't want to compare Bill Gates to Steve Jobs. But the parallelism between them is remarkable. The hard fact is, both became living legends. Sadly, we lost Steve. But Bill Gates remains a living legend. And at this point, that's what Microsoft needs to turn around.

4. Focus, Focus, Focus.

Stop the copy machines. Get rid of product lines that make the company lose money. Remember the Zune: a half-fast copy of the iPod, and a complete disaster. I don't know the state of the Xbox, but last I checked, it was losing money as well. Don't think you can compete with a half-fast version of other products. Those days are over as well. Recruit innovators. Disrupt your own business model. Play ahead. Don't rest on your laurels. Don't wait for the future: build the future.

Meaning, all the things Microsoft has not done in the last 10+ years.

5. Be Humble.

Fact: there are better products than Microsoft's. Fact: Microsoft is not the best, even at its own game. Fact: there has been no innovation coming from Microsoft in the last 10+ years. Fact: all of the technology world knows it.

And fact: every time Steve Ballmer talks about "innovation", and every time he mocks other manufacturers (his mocking of the iPhone at its release made him look like a complete moron, considering the iPhone's sales results), he reflects what Microsoft is becoming: a joke. Here's another fact: Steve Ballmer doesn't see the future. He makes himself look like a mental dinosaur. I don't know if he is one. I don't know him in person. But that's what he displays.

It's time to be honest, and that starts with being honest with themselves, and then with the public. It's time for a "mea culpa". And it's time for redemption. And by "redemption", I mean great products, products so awesome we all want to use them. Only that can save Microsoft. Any other way, they are as good as dead.

Monday, September 5, 2011

Gary Kildall: The Tesla of the Technology Industry

We all know the story of Tesla, the forgotten genius, left out of the spotlight by an inferior but more business-savvy Thomas Alva Edison.

I believe it's time to remember Gary Kildall, to kill some myths, and to place him in the position he should have always held in the PC industry.

Gary was not like other PC pioneers. He actually studied in college, earning a doctorate in computer science. Companies like Intel and Microsoft owe a lot of their success to Gary Kildall.

Gary created the first personal computer operating system: CP/M. Let's put it in perspective. Without an operating system, a program can run only on the computer it was developed for. The operating system is a layer between the machine and the program, allowing compatibility among computers. The PC revolution would not have happened without operating systems.

CP/M's history is great. Gary was working at Intel as a consultant. When Intel released the 8080 processor, they thought they had created just a microcontroller. The people at Intel didn't know the jewel they had created. It was Kildall who showed them the chip was an actual computer. To prove it, he created PL/M, the first high-level programming language for microprocessors, and CP/M, the operating system.

Gary started Digital Research Inc. and ported CP/M to the Zilog Z-80 microprocessor. CP/M was so popular that a card was created for the Apple II to allow it to run CP/M. This card had a Z-80 processor. The card's manufacturer was a company having success with two products. One was a programming language and the other was a word processor. The language was a version of BASIC, which became the most popular. The word processor was called simply Word. The company's name was Micro Soft, which changed to today's Microsoft.

Gary Kildall and Digital Research invented much more than CP/M, though. When the 286 processor hit the market, the first company to release a multitasking product was Digital Research. Multitasking feels natural today, like having your word processor and email open and running at the same time. But back then, you could have only one program running at a time. Gary Kildall made this change happen before anybody else.

Gary was a pioneer of GUIs on PCs as well. GEM was one of the first GUIs for the IBM PC, as well as for other computers, like the Atari ST. Introduced in 1985, GEM was superior to Microsoft Windows in every sense.

Gary Kildall was an innovator in optical media, too. Before the CD-ROM, Kildall presented a prototype of an electronic encyclopedia using videodiscs. Later, in 1985, Gary introduced the first CD-ROM encyclopedia: the Grolier Electronic Encyclopedia.

Gary Kildall was an excellent car and plane pilot. He had a private plane, which he used to fly to his business meetings across the USA. This skill would eventually make him the victim of an infamous myth.

This is the myth: Digital Research lost the contract to license CP/M for the IBM PC because Gary Kildall went out flying and forgot about IBM.

The facts are very different. I know them from someone I had the privilege to work with, who worked at Digital Research and was a personal friend of Gary Kildall.

By the time IBM was building their IBM PC, they wanted it to run CP/M as its primary operating system. The decision was obvious: CP/M was a well-established and successful operating system in the business world. Gary Kildall was not in charge of the legal department at Digital Research, so he delegated that part while he flew back to meet with the IBM executives and resume the meeting.

IBM wanted to force something called a "unilateral non-disclosure agreement", which basically said Digital Research could not reveal anything about the conversations with IBM (not even the fact that they had met), but IBM could reveal anything about them.

Let's remember that back then (1980), IBM was practically synonymous with computers. But Digital Research was already a well-established company and demanded a certain standard in its legal agreements, and IBM's conditions were not acceptable. Digital Research wanted to work with IBM, but they didn't like the terms of the negotiation.

Let's remember the CP/M card built by Microsoft. IBM thought Microsoft had some sort of licensing agreement with Digital Research, and contacted them.

Today we might say Microsoft was smarter. But the fact is, Microsoft had much less to risk. At the time, Microsoft was just another player in a big market. Their best seller was the BASIC language. They had nothing to offer as an operating system. The opportunity with IBM was very big, and came with small risk.

The story was different for Digital Research. Their core business was operating systems. They had CP/M running on a lot of different computers. Licensing CP/M had to be done the right way.

The outcome is well known: Microsoft bought a CP/M clone called QDOS and repackaged it as PC-DOS, and later as MS-DOS.

Gary Kildall passed away in 1994. His name deserves to stand with the names of the great pioneers of the PC industry: Steve Jobs, Steve Wozniak, Bill Gates, Paul Allen, Ed Roberts, and others. It's extremely unfair for him to be remembered as "the guy who blew the IBM deal".

I want to close this post with a quote from Bill Gates, after Gary's death:

"Gary Kildall was one of the original pioneers of the PC revolution. He was a very creative computer scientist who did excellent work. Although we were competitors, I always had tremendous respect for his contributions to the PC industry. His untimely death was very unfortunate and he and his work will be missed."

With this, I finish my homage to Gary Kildall, a superior mind, a pioneer, a visionary, and an example of what innovation is.

Sunday, August 28, 2011

Computer Chronicles: A History Jewel

I discovered Computer Chronicles back in the 90s, and it quickly became my favorite show.

There's nothing like it nowadays. Having the genius Gary Kildall as a host was a big plus.

I found this archive, with episodes that ran well before I discovered the show:

http://www.archive.org/details/computerchronicles

For those interested in computer history, it's an amazing source of information.

Enjoy!!

Saturday, August 27, 2011

The Difference Between "Hearing" and "Listening" to our Customers

Many companies claim to listen to their customers, when what they really do is just hear them. The difference might be subtle, but it's important. What's the difference?

Let's look at the "focus group" approach. We meet with the customers, they say what they want, and we build exactly what they request. It looks like the right way, right? What could be better than giving the customers exactly what they request!

But where's the flaw in this? The flaw is that, most of the time, customers don't know what they need. It's not that they are dumb. Most of the time, customers are way smarter than we are. What they do know better than anybody is what their problems are.

Those who just hear the customer don't care about the problem. They limit themselves to building what the customer asks for.

Those who listen to the customer put special care into what the problem is. They inquire about it. They analyze it. They try to understand it beyond the scope of the process. They make it personal. The customer's request becomes a mere suggestion. And based on that, the next step should be to deliver BEYOND the customer's expectations.

Beyond the customer's expectations means seeing the problems behind the problems, solving as many as we can with a simple, comprehensive solution, where the customer will not only see his problem solved, but his life improved.

And that's what makes the difference between top of the line and half-fast, between excellence and mediocrity, between being a leader and a follower. Real innovation comes from listening to our customers, from having empathy with them, from really understanding their issues, from seeing their problems and frustrations as our own. And from coming up with the best ways we would like things to be for ourselves.

Listening to our customers is becoming a lost art, especially in the technology business. As an exercise, think about some of the companies you work with. Which ones really listen to you? Which ones limit themselves to hearing you?

When a customer is listened to, a loyalty relationship develops. The customer becomes a friend, someone who can ask us "what do you think about this? Is this good or bad for me?", pretty much as we do on a personal basis. If you limit yourself to hearing your customer, you become a butler, a clerk, someone who just follows orders, regardless of whether those orders are good or bad for the customer.

And that makes a big difference: A butler or a clerk can be replaced. A friend is forever.

So, challenge yourselves. Do you listen to your customers? The day you start doing it, your business will change forever.

Thursday, August 25, 2011

Why do we still need Steve Jobs

The news fell like a bomb. Steve Jobs is no longer Apple's CEO.

In less than 24 hours, a lot has been written about it. Coverage of Bill Gates's departure from Microsoft was not even close. So, why do I dare to write more? Simply because, while most people know the Steve Jobs who saved Apple from bankruptcy, to me he was one of my childhood heroes.

My first time in front of a computer was in 1979, when I was 8 years old. I started programming in 1983, at the age of 12. Back then there was no Macintosh, and the most coveted computer for a kid like myself was the Apple IIe, a commodity almost unreachable in my country back then, because of the import taxes set by our incompetent politicians.

Even then, I used to enjoy collecting every single brochure (all in English) from every computer I saw at the fairs. How I wish I had them now! Those would be nice collection pieces today. In one of those brochures was the short tale of two kids (almost teenagers) who founded a company by selling a van and a calculator. It was inspiring.

Steve Jobs has always been a complex character, a misunderstood person, a man with a different view of technology. To Steve Jobs, technology is a part of being human. It should not be something arcane and complex that divides geniuses from common people, but something that makes us better human beings, that brings us together and makes our lives better. Nothing describes his vision better than the metaphor "a bicycle for the mind".

Let's highlight what Steve Jobs is NOT: he's not a brilliant programmer or engineer. He has no academic degrees. He did not design the Apple I or II. That merit goes 100% to his partner, Steve Wozniak. In that sense, people like Ed Roberts, Paul Allen, Gary Kildall and others are to be recognized for the creation of the technology itself.

Now, what Steve Jobs IS: he's an excellent judge of people's talent, who gets rid, without any remorse, of those he calls "bozos", keeping only the "geniuses". He's stubborn and driven. He believes in his own vision. He can fail. Projects like the Apple III, the Lisa and the Power Mac G4 Cube were big flops. He's perhaps the best presenter in the world. He's an unsatisfied perfectionist. And he has a passion for taking technology to the people.

Steve Jobs's innovation is sometimes misunderstood, so I think it's worth recalling a little history. The IBM PC was born as a reaction to the new personal computer trend started by the Apple II. Windows and most of the other GUIs were born as a reaction to the Apple Macintosh. The newer generations can remember how listening to music was before and after the iPod, how the iPhone changed the phone as we knew it, and how the iPad became the standard for tablet computers.

None of these inventions were created by Apple. The personal computer was created by Ed Roberts; the GUI, by Xerox; there were already MP3 players on the market, as well as smartphones (BlackBerry and Palm). Regarding the tablet, I'd suggest searching for the Compaq Concerto on Google. Nevertheless, none of those earlier products took off. And the reason is easy enough to understand: they lacked the human touch. And it's there where Steve Jobs is above anybody else.

In today's world, technology companies are going through very dark times. Dell is a glorified assembler. HP is not even a shadow of what it used to be. Microsoft has no actual leadership or direction. IBM has no consumer products. Oracle has dedicated itself to buying other companies and competing in areas it should never have entered. Google lacks the human touch technology needs to blossom. Wall Street has contaminated Silicon Valley to the point where getting rid of creative minds is considered worthwhile if it makes the stock go up one point. There's no long-term thinking anymore. I've seen it myself at HP.

In this environment, Steve Jobs is a beacon of light in the darkness. He is the one who raised the bar for what a technology product must be. Steve Jobs made Apple what every technology company should be: a product-focused company, not a Wall Street-expectations-focused company. And funny enough, it is that vision that made Apple the most valuable company in the world.

Technology moves the world. And the world still needs Steve Jobs. As a matter of fact, the world needs more and newer Steve Jobses. Something not easy to achieve, but a reachable goal, if one has enough willpower to pursue the objective. And funny enough, that's another of Steve's lessons: with willpower we can achieve the impossible.

My best wishes to Steve Jobs, and may he be doing what he always does, for many, many more years.

Wednesday, October 8, 2008

A practical way to close your Apple Mighty Mouse after cleaning

I love the Apple Mighty Mouse. Except (as most users do) when it's time to clean the little scroll ball.

I'm not going to list all of the solutions (rubbing it, opening it) since most of them are posted on the web. From my experience, it's unavoidable: you have to open it to clean it from the inside.

Unfortunately, this means you have to unseal (or crack open) your Mighty Mouse. This is a good instructable:

http://www.instructables.com/id/How-to-clean-your-Apple-Mighty-Mouse/

But instead of re-sealing using super glue, use a couple of thin pieces of scotch tape.

Super glue fumes might damage the circuits, and the glue will leave your mouse stained and bumpy if you're not careful.

Also, you would need to crack it open again the next time it needs cleaning.

Scotch tape works like a charm, and it will allow you to open it again in the future.