Friday, September 4, 2009

Cloud Independence Ahead ...

Cloud computing is getting its fair share of technology news these days. In a nutshell, it is a way of providing and/or using remote computing resources over the internet. The business model is still evolving, but one way or another the end users pay for the usage of the resources instead of purchasing additional equipment. This is of interest to the hardware vendors, as they would be able to sell their products as a service, which means a constant flow of money, albeit in smaller chunks than selling hardware outright. Their vision is that they can then control the deployment of new hardware and not rely on the end users to make hardware purchasing decisions directly. To make this work, and to make it work seamlessly, layers of software are needed on the end-user side and on the remote resources.

Like many other technology advances, in the first phase the cloud computing vendors needed to provide all the pieces themselves, since there was not much else in place. As a result, the current vendors each have their own solution that is largely incompatible with the others, making migration or multi-vendor configurations difficult or impossible.

Deltacloud, a recently announced open source project within Red Hat, is trying to solve that problem by providing a "cloud broker" that translates the communications between the consumer end and the cloud end of one vendor to another, making interoperability possible (see the diagram). Their goal is to enable an ecosystem of developers, tools, scripts, and applications which can interoperate across public and private clouds.
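To make the "cloud broker" idea concrete, here is a minimal sketch of the pattern in Python. This is not Deltacloud's actual code or API; the vendor names and method names are made up purely to illustrate how one consumer-facing call can be translated into each vendor's own dialect.

```python
# Illustrative sketch of a "cloud broker": one common interface on the
# consumer side, translated into each vendor's own API behind the scenes.
# All vendor names and methods below are hypothetical.

class VendorADriver:
    def launch(self, image_id):
        # Vendor A speaks of "launching" an image
        return f"vendorA:launched:{image_id}"

class VendorBDriver:
    def create_server(self, image_id):
        # Vendor B calls the same operation "creating a server"
        return f"vendorB:created:{image_id}"

class CloudBroker:
    """One consumer-facing call, many vendor back ends."""
    def __init__(self):
        self.drivers = {"vendorA": VendorADriver(), "vendorB": VendorBDriver()}

    def start_instance(self, vendor, image_id):
        # translate the common request into the vendor-specific call
        driver = self.drivers[vendor]
        if vendor == "vendorA":
            return driver.launch(image_id)
        return driver.create_server(image_id)
```

With such a layer in place, a script written against `start_instance` keeps working even when the back end is swapped from one vendor to another, which is exactly the interoperability being described.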

While Deltacloud is a young project and will more than likely face competing products, the problem it is attacking, interoperability, is a major requirement for users, developers, and IT departments if cloud computing is to grow rapidly. Eventually, there will be a very small number of (or ideally just one) standardized means of connecting to cloud services (also referred to as an API, or Application Programming Interface), but in the meanwhile, this translation layer is the next step before the standardization of APIs.

To a large extent, when all is said and done, this will be transparent and invisible to the end users. In the end, the average user is interested in the end services that can increase their efficiency and productivity. The lower total cost of ownership has been the primary selling point of cloud computing. But low cost has been the norm, not a new trend. In my opinion, cloud computing needs a killer application to become a difference maker; otherwise it will remain yet another step in the ever-evolving technology quest.

And that is how I see it ...

Sunday, August 30, 2009

Time for My Monthly Vision Thing - Flash as a Computer: FaaC

I am always thinking. I don't know if it is a good thing or not, but it is something I cannot stop, and frankly why should I? After I am gone it will stop forever so I might as well let it go while it wants to. So I have decided to write about some of my crazy ideas once a month. Here is my first ever monthly vision thing.

I think the time has come for what I refer to as "Flash as a Computer" or FaaC. I must admit it is not entirely a new concept, but my twist to it might be somewhat different, or at least it is not based on what I have read or seen anywhere else.

We have all seen flash memory cards, right? Those little memory modules that we use in cameras and, increasingly, in camcorders. Their capacity is getting large enough (I have seen 64GB in stores) that they are now comparable in size to the disk drives in some laptops. And that comparison triggers the thought that maybe the entire computer environment can be saved on one. In other words, one way or another, one might be able to hold most if not all of the settings and unique data that ordinarily live on a laptop's disk on a large-capacity flash memory card. Then why can't we have a new generation of computers and laptops that can read the settings and data off these flash memory cards and act as if they were our own computers? If instead of carrying my own laptop I had a flash card that I could take with me and stick into any available computer in a hotel, in the office, at the airport, or wherever, and if that computer could act just like my own, it would be very convenient, don't you think? Probably one would have one FaaC for the home computer and one for the office, and maybe, for practical reasons, these FaaCs could be made the size of credit cards. I find the idea intriguing.

But is this practical? One potential approach would be to replace the boot disk of one's computer with a portable flash memory card. Since there are many different chipsets out there, however, such a disk would not necessarily work everywhere. So what if we create a virtual system on the flash card? That way it would work on any computer that can load that virtual system. And if the ability to load a virtual machine is built into the hardware (it is technologically doable today), then we have a solution. There can be a few enhancements too. For example, there could be a storage hierarchy. Sensitive or archived data could be stored on secure local disks. These storage devices would be accessible only locally, some at home, some at work, etc. Once disconnected they would not be available, which is desirable; at the same time this would reduce the required storage space on the flash memory card. Other data, the kind that is not very sensitive and also not needed all the time, could be stored remotely via the internet (also known as "in the cloud"), on internet-based storage spaces like Microsoft's "SkyDrive". An example of files that could be stored in the cloud is the old multimedia files that we no longer listen to or watch on a regular basis but don't want to throw away either. They would be available only when online, and that could be a reasonable tradeoff to further reduce the need for larger flash memory cards. The files that are used frequently enough can be stored on the FaaC, where they would remain available all the time. A clever piece of software could monitor file usage and make sure the most recently used ones (the last 30 days?) are moved to the FaaC (actually copied and kept synchronized with the original files). In addition, before getting on the road, one could manually "take" additional files and data that might be needed while offline onto the FaaC.
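That "clever piece of software" could be quite simple at its core. Here is a rough sketch in Python of the idea: walk the home directory and copy anything modified in the last 30 days onto the card. The directory paths are examples; a real implementation would also handle deletions and keep the copies synchronized.

```python
# Sketch of the FaaC sync idea: copy files touched in the last `days` days
# from a source directory onto the flash card, preserving the folder layout.
import os
import shutil
import time

def sync_recent_files(source_dir, faac_dir, days=30):
    cutoff = time.time() - days * 24 * 3600
    copied = []
    for root, _dirs, files in os.walk(source_dir):
        for name in files:
            path = os.path.join(root, name)
            if os.path.getmtime(path) >= cutoff:
                rel = os.path.relpath(path, source_dir)
                dest = os.path.join(faac_dir, rel)
                os.makedirs(os.path.dirname(dest) or faac_dir, exist_ok=True)
                shutil.copy2(path, dest)  # copy2 keeps timestamps for later syncs
                copied.append(rel)
    return copied
```

Running something like `sync_recent_files("/home/me/Documents", "/media/faac")` on a schedule would keep the card populated with the files one actually uses.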

Software licensing would be the next issue. Currently, when there are two computers at home and five users, there are only two licenses (of Microsoft Office, for example), one on each computer. Would they need five copies with FaaC? The cost could be a problem. There needs to be some way of associating all those FaaCs with the available licenses (two) and having only two of them authorized at any given time. This is possible using some sort of internet-based authorization, but what happens when the machines are offline? Maybe they could check out a license before going offline? No matter what, the software vendors need to embrace the concept and come up with workable options and solutions.

So what does it take for this to become real? The hardware vendors need to agree on and implement hardware-based virtualization, software application vendors need to sort out the licensing options, the operating system vendors (Windows, OS X, Linux, as a minimum) need to make the FaaC concept and the overall storage hierarchy work seamlessly and transparently for the users, and businesses, hotels, airport lounges, etc. all need to buy into this concept and deploy the base hardware and software. Somebody needs to pull the trigger and make others follow their lead. Somebody, please? To start with, maybe the Linux community could create a stripped-down kernel that in essence would turn existing hardware into a system that does nothing other than wait for a flash memory card to be inserted, at which time it would look for and run any virtual machine it could find on it. This is not fancy, but it is enough to allow people to take an image of their systems with them and run it on other hardware. If the concept proves itself, I am sure money will follow.
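The stripped-down launcher described above could be sketched as follows. This is only an illustration, with QEMU as one possible hypervisor; the mount path `/media` and the image name `faac.img` are my own assumptions, not an established convention.

```python
# Hypothetical "FaaC launcher": scan the mount points where removable media
# show up, and if a VM disk image is found, build the command to boot it.
# QEMU is used as an example hypervisor; paths and names are assumptions.
import os

def find_faac_image(media_root="/media", image_name="faac.img"):
    # look at each mounted card/stick for a virtual machine image
    for entry in sorted(os.listdir(media_root)):
        candidate = os.path.join(media_root, entry, image_name)
        if os.path.exists(candidate):
            return candidate
    return None

def qemu_command(image_path, memory_mb=1024):
    # the launcher loop would hand this list off to subprocess.run(...)
    return ["qemu-system-x86_64", "-m", str(memory_mb), "-hda", image_path]
```

A minimal init script could simply loop: call `find_faac_image()`, sleep when nothing is found, and boot the image when a card appears.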

And that is how I see it ...


Friday, August 28, 2009

Will cost cutting result in more jobs being outsourced?

This is a repost of an answer I recently posted on LinkedIn.

I think in the short run no, in the long run yes.

Outsourcing, especially to other countries, started more than a decade ago (in its current definition). It was overdone and rushed, before the right processes and procedures were in place to measure, monitor, manage, and adjust. There were some successes and lots of failures too. The failures slowed down the practice and in some cases even reversed the trend. As methodologies became more established and better implemented, and as vendors learned to meet the needs of their clients better, outsourcing started becoming the norm rather than the exception, and I believe this trend will continue.

From a bigger-picture perspective, to remain competitive companies need to do things horizontally (focusing on their particular expertise and added value) and outsource functions that are required to run a business but are not particular to their company. This is not necessarily bad for the workforce, although it does shift jobs around. Nor does it necessarily mean jobs will move outside the country; although in some cases they do, for other jobs there should be an increase via insourcing (if that is a word). This process, over time, should level the playing field.


Wednesday, August 26, 2009

To Be or Not to Be (online)

Many small businesses believe they don't need to be online. They feel it is complex and/or costly. Most of those who are online don't realize enough real benefits. The fact is, more than ever, businesses of any size need an online presence. For example, a one-employee company (maybe a consultant) might want to have their resume, past clients, expertise, and contact information on a website that is bookmarked by clients or prospects, in addition to being searchable when others are looking for someone with their background. A small local store might want to offer coupons, specials, or other information that would connect them to the community. Furthermore, an online presence is not expensive, time-consuming, or too difficult for a small business, and with a little planning and foresight it can effectively increase market penetration, which often results in higher revenue.

Here are a few tips to get online, successfully.

1. Start with a website

There are quick and easy ways to get online and have a presence. Depending on the nature of your business, you might need more interaction with your partners, clients, and prospective customers through your website; in that case you might want to consider getting professional help to create an original design that sets your company and business apart from the competition.

2. Get your own email address

As a business, your email address should not be from yahoo.com or comcast.net. Your emails should originate from your own company domain, for example John.Smith@HisOwnCompany.com. This goes a long way in establishing credibility with everyone you deal with.

3. Host your website based on your needs

You will need to host your website somewhere. There are many hosting companies, but there are tradeoffs, just like anything else in business. The primary factors include storage capacity (how much data you will have on the host), access speed (how fast the pages will load in the browsers), reliability (what percentage of the time the host is operational), flexibility, and cost. If your website is frequently unavailable, it could hurt your business, especially if your revenue is generated directly from your website (for example, if you have an online store). On the other hand, if the site merely displays your name and email address, it might not be worth an extra $1,000 a month to go from 99.9% uptime to 99.999% uptime.
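To put those uptime percentages in perspective, a quick back-of-the-envelope calculation shows what each one actually allows in monthly downtime (assuming a 30-day month):

```python
# Convert an advertised uptime percentage into allowed downtime per month.

def monthly_downtime_minutes(uptime_percent, days=30):
    total_minutes = days * 24 * 60  # 43,200 minutes in a 30-day month
    return total_minutes * (1 - uptime_percent / 100)

# 99.9% uptime allows roughly 43 minutes of downtime a month,
# while 99.999% allows well under one minute.
```

So the question for the "business card" site is whether shaving those 43 minutes down to under one minute is really worth the premium.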

4. Update your contents

People are more inclined to visit your website often if the content is always fresh and changing. The more they come to your website, the higher the likelihood of them transacting with you. Also, search engines (like Google) seem to rank you higher if you have a lot of visitors. This means when someone is "searching" for you (using Google, for example), your website will be among the first few shown, making it more likely for new prospective customers to visit.

5. Consider advertising

It is not inexpensive, but it could pay off if you have the right products or services. There are some simple solutions to get started. You can check out Google's AdWords as a starting point. It charges you by clicks and/or impressions, so in a way the cost is tied to the benefits. You can also try direct email marketing.

6. Connect with your customers

Many small businesses use blogs and podcasts to broadcast content to their existing and potential customers. You can influence them, and they can come to see you as a subject matter expert when you share your knowledge, vision, and plans. You might feel you are giving away free information, but the results often outweigh any perceived loss.

7. Collect data, analyze, and adjust

You can track how many visits your website gets, who the visitors are, which page they start with, how much time they spend, which page they leave from, and many other useful facts. Such information can help you figure out why people are attracted to your pages and what they want, which in turn can help you maximize what you want to get out of your website.
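Even without a dedicated analytics service, the basics can be mined from a plain access log. Here is a tiny illustration in Python; the three-field log format used here is a simplified example, not any real server's format:

```python
# Toy analytics: count page views from simplified log lines of the form
# "<ip> <page> <seconds-on-page>" and report the most visited pages.
from collections import Counter

def top_pages(log_lines, n=3):
    pages = Counter()
    for line in log_lines:
        _ip, page, _secs = line.split()
        pages[page] += 1
    return pages.most_common(n)
```

Knowing that, say, a coupons page draws twice the traffic of the home page is exactly the kind of signal that should drive the "adjust" step.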

8. Give customers what they want

It is important that the information on your website is easily discoverable, accessible, and usable. Pictures and diagrams are often better received than words. Also, the main points should be near the top, or even in the title, instead of several pages into the article.

9. Have a clear idea and implement it

There should be an objective behind everything on your website. If you are selling products online, help customers easily find the product they are looking for, describe it clearly (including pictures if applicable), and make the navigation to the order processing page seamless. On the other hand, if the website is simply a "business card," put all the contact information on the first page in a clearly visible area, such as the top 25% of the page.


Saturday, August 22, 2009

New Business Cards

A few years ago, while attending a convention, I noticed a self-proclaimed consultant who was handing out his business card. I am using the word card rather generously here, as they were clearly cut out of regular paper. Seemingly he had printed them on paper using a laser printer. It looked unprofessional, yet it was thought-provoking. Why does one need a business card anyway, and what do we do with them after getting them? In the old days, when I used to have a notebook for taking notes, I taped them onto the page where I jotted down any notes from our meeting. In most cases I never went back to the cards. The phone numbers of the few whom I called back were transferred to my phone book (and later my electronic PDA or computer). The rest were trashed or forgotten for all practical purposes.

Things have changed, but not necessarily the principles. I only use paper and pencil to take notes when I don't happen to have my laptop with me. When I get a business card, I usually use it to read the name to make sure I got it correctly, and maybe as a reference for the rest of the meeting. As for the contact information, I try to do an email exchange so we both can transfer the info (and, in a way, verify the email address). From there we can move the relevant pieces easily to our electronic phone books. This works for me, although it is not as efficient as it could be. In addition, I wish there were a picture to go with the contact info. It is embarrassing when I meet with someone whom I had met a couple of months earlier but cannot remember which one of the eight people I met with on that day he or she is.

Here is what I think would help. The new business cards should be similar to the traditional ones on the front (a kind of backward compatibility)! On the back, there should be a photo, plus a barcode with everything on the front side encoded in it. The recipient would use their smart phone to take a picture of the back of the card, which would then be decoded and stored in the phone. From there the information can be transferred to (or synced with) the contacts database on one's computer. Wouldn't that be neat? I would think so, but is it practical?
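One natural payload for that barcode is a standard vCard string, which contact applications already understand. Here is a small sketch that packs the front-of-card details into vCard form; a barcode library (the third-party `qrcode` package is one option) could then render this text as a scannable image, a step omitted here.

```python
# Build a vCard 3.0 string from the information printed on the card's front.
# This is the text a barcode on the back of the card could encode.

def make_vcard(name, company, phone, email):
    return "\r\n".join([
        "BEGIN:VCARD",
        "VERSION:3.0",
        f"FN:{name}",       # formatted name
        f"ORG:{company}",   # organization
        f"TEL:{phone}",
        f"EMAIL:{email}",
        "END:VCARD",
    ])
```

Because vCard is a widely supported standard, the decoded contact drops straight into the phone's address book with no manual retyping.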

Well, there are some solutions out there already. For example, Microsoft has Microsoft Tag, an online application that can create a barcode image from the provided information. Microsoft also has free applications for smart phones, including the iPhone, BlackBerry (RIM), and Nokia devices, that can take a picture of the barcode and decode it. It worked very nicely when I tried it. By the way, the barcode generator is not limited to contact info. It can be used to encode free text, web pages (URLs), and phone numbers. There is even a "custom" barcode option where one can turn the barcode into something more eye-pleasing. Microsoft Tag is still in beta, although it appears to be useful even now.



I am not sure how long it will take for this idea to become commonplace, but sooner would be better than later. I, for one, will have my next business card printed with a photo and a barcode. Would you please do the same? Oh, as for that consultant, I don't think this will help him, but I still think he should have spent a few bucks on normal business cards.

And that is how I see it …


Monday, August 17, 2009

I am a netbook - I am a notebook

Have you seen those Mac commercials? Yet the question I am asked these days, even more than Mac versus PC, is about netbooks versus notebooks.

With advancements in integration, notebooks became more desirable than desktops, since they were portable and could do everything that the desktops did a year or two earlier. The desktops became mostly specialty computers, typically where higher-end processing and graphics, large amounts of memory and disk space, larger (or multiple) screens, or some combination of these were required. And then came along the netbooks, to further enhance three aspects of the notebooks: the size and portability, the battery life, and the price.

A typical netbook measures less than 11 inches diagonally, compared to more than 17 for some notebooks, and weighs about 2 pounds, compared to about 7 pounds for an average notebook. This is a huge improvement for those who lug around their laptops all day. And to save space, most (or maybe even all) of these netbooks use solid state drives instead of mechanical drives, which also weigh less, are less noisy, and don't break (as easily). But the smaller size is one of the most significant drawbacks too. The keyboard is too small for writing large documents, and the screen might be too small to be comfortable as the primary display. And the flash drives mean less disk space, which could become an issue for those who need to keep a lot of files on their systems.

The battery charge might last up to 8 hours on some of the netbooks, versus 2-3 hours for a typical notebook. For someone who is out of the office for most of the day, this might be a deal maker, as a single charge could last the entire day. The longer-lasting battery is made possible by using lower-power processors (and other low-power-consumption components), which achieve the lower power draw by reducing processing power. For some applications this might not be a big issue, while for others it will be.

However, the last enhancement, the price, is the number one reason that attracts people to the netbooks, in my experience. They can be had for anywhere between (supposedly) free and about $300 in most cases, compared to $600-$3,000 for notebooks. The most intriguing and least understood part of the very low end of the price range is that part (or even all) of the actual cost is subsidized by the cell phone providers, almost exactly the same way they subsidize the cost of cell phones. That means a two-year (in almost all cases) contract for a wireless data plan at around $60 a month, which, if you do the math, will cost $1,440. If one needs the data plan (and some, but not all, do) then it is OK. Otherwise this has to be calculated into the total cost of ownership. Additionally, the lower cost is achieved by eliminating or reducing some features (graphics acceleration, processor, memory size, disk size, IO connectors, etc.)
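The subsidy math above is worth spelling out, because the "free" sticker price hides the real number:

```python
# Total cost of ownership for a subsidized device: the up-front price
# plus every monthly payment over the contract term.

def total_cost(device_price, monthly_plan, months=24):
    return device_price + monthly_plan * months

# A "free" netbook on a $60/month, two-year plan: total_cost(0, 60) == 1440
```

In other words, the "free" netbook can easily end up costing more over two years than a mid-range notebook bought outright, unless the data plan was something you needed anyway.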

So there is no clear "winner" here, at this time. Many use their computers primarily to access the internet, and a netbook might be a perfect solution for them (that is why it is called a NETbook). On the other hand, some do a lot of intensive work on their laptops (spreadsheets, heavy document editing with graphics, charts, etc., and other similar applications), needing a large keyboard and screen and lots of memory and disk. For them a notebook is the right answer. Which one is right for you? Only you know the answer.

Down the road, the netbooks will displace the notebooks in much the same way that the notebooks have been replacing the desktops, although they might not be called netbooks. We will be able to pack more functionality at a lower price into lighter laptops with larger screens and keyboards, similar to the Apple MacBook Air, but at a much lower cost (closer to $300 than $3,000). In the meanwhile, the competition between the netbooks and notebooks is bringing down the price of all laptops, which is good news for the rest of us.

And that is how I see it ...



Monday, August 10, 2009

BodyBugg: SaaS to Lose Weight

There is a product called BodyBugg, which is featured on the TV show “The Biggest Loser”. For those who are not familiar, it is a little gadget that one wears on the back of the upper arm throughout the day. It has some sensors and a very accurate pedometer. The sensors measure things like body temperature, the amount of sweat, and “heat flux” (whatever that means), and the pedometer counts the number of steps one walks. Using all that information, the company behind this technology, BodyMedia, can calculate the number of calories burned, or so they claim.

Anyone who has ever tried to control their weight can attest that such a tool could be invaluable. BodyBugg does not directly show how much is burned. It only collects and stores up to several days' worth of data internally. Using a personal computer, the data then needs to be uploaded to the BodyMedia servers via a Java application that runs in a browser. By the way, their website can track what one eats, which is then converted to calories too. In addition, it can track body parameters (weight and various body measurements). And if that is not enough, depending on the goals, the website puts together suggested daily meal plans for breakfast, lunch, dinner, and snacks, as well as daily exercise routines. Overall, the combination of BodyBugg and this website is a very powerful fitness tool.

When I started using the BodyBugg, which I had received as a gift last Christmas, I noticed the calories burned at night while I was asleep were very close to what one would expect by just calculating the resting metabolic rate (RMR) for an average person in my situation. I had always experienced a higher calorie burn rate compared to everyone else around me and couldn't agree with seeing an “average” metabolism. As I read more about the device, I started picking up clues as to how their website was calculating the results. Based on that, I modified some of my inputs (age, weight, etc.) to achieve the expected RMR (more on this in a later blog, maybe). After the adjustments, I thought it did a reasonable job of helping me track my calories. After several months of usage, my weight was relatively close to what the tool was predicting, so I was fairly happy.

There are a few things that I don't like about it, though. First, it does not show all the statistical data that I am interested in, and there is no (known) way of downloading the data into a spreadsheet that I can use to do my own analysis. Also, I had to fake some data to make it work better. I would have preferred if it allowed me to enter the RMR directly (one can measure their RMR in a clinic) instead of using an average RMR based on age, height, and weight. Next, if their website is down (which happens a couple of times a month), one cannot use it at all. In other words, there is no offline solution. This can be problematic. Lastly, there is no way to enhance what they offer any further, as there is no way to interoperate with other online or installed fitness tools. Whatever they offer is all you get, and there isn't much else you can do.

So, while I like what they have done, they need to do more. There needs to be a solution for data portability. They need an offline application that at the very least can function for a few days before needing to interact with their servers. I wish they would open up their APIs. That could allow other vendors (or even the end users) to use the collected data to provide enhancements and functionality beyond the current scope. And lastly, interoperability with other tools like heart rate monitors and GPS could really take BodyBugg to a different level.
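To show how little it would take, here is a purely hypothetical sketch of the data-portability piece. BodyMedia offers no such public API today; the record format below is invented to illustrate what "download my data into a spreadsheet" could look like.

```python
# Hypothetical export of daily activity records to CSV, the kind of data
# portability the post is asking for. The field names are assumptions.
import csv
import io

def export_to_csv(daily_records):
    # daily_records: list of dicts like {"date": ..., "calories_burned": ...}
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["date", "calories_burned"])
    writer.writeheader()
    writer.writerows(daily_records)
    return out.getvalue()
```

With an export like this, anyone could open their own history in a spreadsheet and run exactly the statistical analysis the service itself does not offer.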