If you have been following the news these past few weeks, four major delays have been announced: the Sony PS3, Microsoft Windows Vista, Microsoft Office 2007, and the Toshiba HD-DVD player. Is this all a coincidence, or a case of over-promising on complex technology?
Based on what I have read, Sony said the PS3 delay is due to the final Blu-ray copy protection specifications not being ready for a spring 2006 delivery. OK, I'll buy that, since it can be confirmed throughout the industry. The delay here stems from a specification ratification that is, in essence, part of the software engineering project. Waiting for it is a dependency risk avoidance strategy and looks like a sound management decision. See "Playstation 3 delay - a good thing?".
Then you have Toshiba delaying its HD-DVD launch to line up with the release of HD-DVD content in April 2006. This makes sense too. Who wants to buy a device that has no content? I would not. From a marketing perspective, launching early would put a dent in initial sales. From a software engineering perspective, timing the release of a product to coincide with the release of its content is a market risk avoidance strategy and looks like a sound management decision.
Then you have the new bombshell announcement that the already-late Windows Vista will be delayed nine more months due to quality and security programming issues. Recall that Windows Vista is already two years late. I read an article stating that as much as 60% of Windows Vista may need to be reprogrammed in order to deliver it by January 2007: "60% Of Windows Vista Code To Be Rewritten". This sounds like a rumor or FUD, but if there is even an ounce of truth to it, and you are in the software engineering field, it is pretty significant. If you have to rewrite 60% of the code within the last 15% of the effort, then there are some major design quality problems in the project. Has the complexity of the product and the age of the Windows code base finally caught up with itself? Or is there some requirements creep happening internally on the project? According to some blogs I have read, Microsoft is even moving programming resources from Xbox onto the Vista project. That should be a challenge for both teams.
Adding more developers to an already late project just makes the project, well, even later. If you are familiar with Fred Brooks's essays on project management, then it looks to me like Microsoft is having some serious internal project management problems. For a consumer-level product, which Windows Vista has become, all these delays and excuses do not go over well with the average non-technical buyer. Primarily, they affect the public's perception of your brand or company.
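To make Brooks's point a little more concrete, here is a quick sketch of my own (the team sizes are made up, and this is only the communication-path half of his argument, ignoring ramp-up time) showing how pairwise communication paths grow roughly with the square of team size:

```typescript
// Pairwise communication paths on a team of n people: n * (n - 1) / 2.
// Illustrative numbers only; the team sizes below are invented.
function communicationPaths(teamSize: number): number {
  return (teamSize * (teamSize - 1)) / 2;
}

for (const size of [5, 10, 20, 40]) {
  console.log(`${size} developers -> ${communicationPaths(size)} communication paths`);
}
// 5 -> 10, 10 -> 45, 20 -> 190, 40 -> 780:
// doubling the team roughly quadruples the coordination overhead,
// before you even count the time spent bringing new people up to speed.
```

That quadratic growth, plus the training drag on the existing team, is why throwing bodies at a late project so often backfires.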
So are all these product delays now the norm in today's ultra-fast-paced high technology world? For proprietary software the answer depends on the size of the project: the larger the project, the more likely there will be delays, because of the thousands of dependencies within it. The open source model appears to attack size and complexity in a different manner, but since there really aren't any open source projects as large as the Sony PS3, Microsoft Windows Vista, or HD-DVD, that remains to be determined.
Being the optimist that I am, I think the jury is still out on whether project delays are the norm today. Project size, over-promising, marketing hype, increasing complexity, and requirements creep are probably the real causes of delays. Each of these is a manageable part of the software engineering and product development life cycle. Of course, that is much easier said than done.
Sunday, March 26, 2006
Saturday, March 25, 2006
Next Generation AJAX Applications
I started this blog last year inspired by my first impression of an AJAX application, Google Maps, and how I thought it would disruptively change the future of software. Well, I got my first glimpse of that promising Web 2.0 future in the form of a beta AJAX application called AJAXwrite 0.9, an MS-Word clone developed by Linspire. You have got to see this thing.
I was totally surprised at how fast AJAXwrite loads: about six seconds, thanks to its relatively small size of 400 KB as stated on the web site. It looks nothing like first-generation AJAX applications such as Google Maps and Yahoo! Mail, which tend to look and feel like rich browser-based applications and, in my experience, have a few issues with load times. AJAXwrite looks and feels more like a traditional native GUI application that uses the browser engine as its platform; in my case that engine is Mozilla Firefox. AJAXwrite still has some maturing to do, but as a proof of concept, or beta as everyone calls these today, it really shows what is possible.
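For anyone wondering what makes these applications feel so different from ordinary web pages, the core AJAX trick is issuing asynchronous requests from script in the page and updating the document in place, with no full reload. Here is a minimal sketch of that pattern; the endpoint, element id, and callback are hypothetical examples of mine, not anything taken from AJAXwrite:

```typescript
// Minimal AJAX pattern: fetch data asynchronously, then update the page in place.
// The "/document/load" endpoint is an invented example, not a real AJAXwrite API.
function loadDocument(docId: string, onLoaded: (text: string) => void): void {
  const request = new XMLHttpRequest();
  request.open("GET", `/document/load?id=${encodeURIComponent(docId)}`, true);
  request.onreadystatechange = () => {
    // readyState 4 means the response is complete; update the UI without a reload.
    if (request.readyState === 4 && request.status === 200) {
      onLoaded(request.responseText);
    }
  };
  request.send();
}

// Usage: swap the editor pane's contents when the document arrives.
loadDocument("draft-1", (text) => {
  const pane = document.getElementById("editor-pane");
  if (pane) {
    pane.textContent = text;
  }
});
```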
Just as I predicted, AJAX is the disruptive technology that changes everything on the web. If AJAXwrite 0.9 is a sign of things to come this year, we will see some exciting new web-based software that redefines what a web application is.
I wonder what frameworks are being used to make AJAXwrite work. I need to do some research into this, because for such a rich AJAX application it provides a lot of the rich client widgets and capabilities I am accustomed to building in non-browser GUI application development.
Wednesday, March 15, 2006
Web Services and Small Companies
Innovation continues to take place in small companies. In today's overexposed world of instant information, web services, and IT in general, it is difficult to sort through the background noise of everything published to find new products and services, especially those from new or small startup companies. They just do not get the press.
I just read a really interesting article about a small company that has been able to monetize web services and do quite well at it. Google, Yahoo!, eBay, and Amazon all have web services in place but are still trying to make them a profitable business. The article, "Web Services By the Dozen", was published in the March 13, 2006 issue of Information Week.
The small company is called StrikeIron, and it provides an online marketplace for general-purpose web services. StrikeIron acts as a broker: organizations search for web services and establish deals to consume them through StrikeIron, a business model similar to the one that makes Amazon and eBay successful. They are the "80-pound gorilla" dominating a field that, so far, has few competitors.
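To make "consuming a web service" concrete, here is a minimal sketch of a client calling a hypothetical lookup service over HTTP and pulling a value out of the XML response. The URL, parameters, and response shape are invented for illustration; they are not StrikeIron's actual API or that of any real provider:

```typescript
// Hypothetical web service client: look up a US address's ZIP+4 code.
// The endpoint and XML format below are invented examples.
function lookupZipCode(street: string, city: string, state: string,
                       onResult: (zip: string | null) => void): void {
  const url = "https://example.com/services/zipcode"
    + `?street=${encodeURIComponent(street)}`
    + `&city=${encodeURIComponent(city)}`
    + `&state=${encodeURIComponent(state)}`;

  const request = new XMLHttpRequest();
  request.open("GET", url, true);
  request.onreadystatechange = () => {
    if (request.readyState !== 4) {
      return;
    }
    if (request.status !== 200 || request.responseXML === null) {
      onResult(null); // service unavailable or returned something unexpected
      return;
    }
    // Expecting a response like <result><zip>27709-1234</zip></result>.
    const zipElement = request.responseXML.getElementsByTagName("zip")[0];
    onResult(zipElement ? zipElement.textContent : null);
  };
  request.send();
}

lookupZipCode("123 Main St", "Durham", "NC", (zip) => {
  console.log(zip ?? "lookup failed");
});
```

The broker's value is that a consumer writes a small client like this once, and discovery, billing, and the deal with the actual provider are handled through the marketplace.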
In my local area, I recently had the opportunity to talk with a few small startup companies (micro companies) that are just as innovative. They are counting on web services and their agility to distinguish their products and services, stay ultra-competitive, and grow in the future. After reading about StrikeIron, I think the creativity and innovation driving the future of IT will come from the small companies. Google recently purchased Writely, yet another small company, whose AJAX-based web service lets you use your browser to edit word processing documents in MS-Word, RTF, and OpenOffice formats.
The main threats these micro companies face are managing their own growth and being acquired too soon (e.g., by Google or others) while they are still half-baked. We live in yet another exciting period in which innovation is driving new ideas, and most of that innovation is coming from the small companies.
Tuesday, March 14, 2006
SOA, Data Warehouses and Modeling
For the past few years, Service Oriented Architecture (SOA) and data warehouses have not been discussed in the same context. It is as if the two were mutually exclusive, or at least as if analysts and vendors wanted you to perceive them that way. This never really made sense to me. I am not an analyst, so I did not pay much attention to it until recently, when I had to field questions in a discussion about SOA and data warehousing.
I have been involved in the data warehousing, data mining, and business intelligence domain for the past ten years, so its problems and technologies are familiar to me. I have not been involved in an SOA solution yet, but as I researched SOA I discovered that very similar problems exist. The deeper my research went, the more commonalities I saw between the two domains.
I just finished reading "Whipping Data Into Shape", published in the 02/06/2006 issue of InfoWorld. The premise of the article is that solving an SOA problem is very similar to solving a data warehouse problem. IBM, Informatica, and Oracle have even gravitated toward what amounts to an operational data warehouse concept and architecture for their SOA solutions. This surprised me, but it makes sense: what is old is new again, reborn with a new label and a few twists.
Defining a data architecture is coming to the forefront again in the SOA problem space because of the issues encountered over the past few years. The InfoWorld article discusses the concept of a Master Data Management architecture and provides a nice 50,000-foot graphical view of it. It makes sense on paper.
The primary problem is the metadata, which leads me to my other topic: the apparently forgotten art of modeling, more specifically information and data modeling. In my experience, the past six years have seen a backlash against modeling applications, systems, and data. I think this coincided with the dot-com boom and the need to get web sites, services, and applications up and running as quickly as possible; the agile movement appears to address that need.
Well, in an SOA or a data warehouse, hasty architecture and design decisions will get you into 'deep kimchi' very fast. I suspect that is what has been happening in the SOA industry lately, and a regrouping, a rethinking of these rapid approaches, is coming to the forefront again. This is a good thing. In the past few months I have heard many technical and management discussions about SOA centering on requirements, modeling, use cases, business rules, getting the design right, and so on. If you were involved with the last push for data warehousing and business intelligence in the late 1990s, this is just a repeat of what was done then, and it appeared to work. It did for me: the organization I work for has been managing a data warehouse quite successfully for the past nine years.
The twist with the massive amounts of data and services that have to be managed today is getting a handle on the context and semantics of it all. From an enterprise perspective, you are dealing with applications, files of all types, databases, web sites, and information all over the organization. What does it all mean from an SOA perspective? This is where a metadata repository and good models (data, information, semantics) are critical for success. The solutions and tools for this requirement are not yet built, at least from what I have seen. What is required is some type of enterprise-level content management, metadata repository, and modeling facility. I am not sure who is going to solve this problem, but I think platforms like EMC Documentum are headed in the right direction. Whether they actually solve it remains to be seen in the coming years.
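To make the idea of a metadata repository a bit more concrete, here is a minimal sketch of the kind of catalog entry such a facility might keep for each data asset. The fields and sample values are my own invention for illustration, not any vendor's actual schema:

```typescript
// Hypothetical shape of a metadata repository entry; fields and sample
// values are illustrative only.
interface DataAssetMetadata {
  name: string;            // logical name of the asset
  kind: "table" | "file" | "service" | "document";
  owner: string;           // business owner responsible for its meaning
  sourceSystem: string;    // where the data originates
  semantics: string;       // plain-language definition of what the data means
  lastUpdated: Date;
}

// A tiny in-memory catalog standing in for an enterprise repository.
const catalog = new Map<string, DataAssetMetadata>();

catalog.set("customer_master", {
  name: "customer_master",
  kind: "table",
  owner: "Sales Operations",
  sourceSystem: "CRM",
  semantics: "One row per active customer account, deduplicated across regions.",
  lastUpdated: new Date("2006-03-01"),
});

// A service or warehouse load can then ask what an asset means
// before consuming it, instead of guessing from column names.
console.log(catalog.get("customer_master")?.semantics);
```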
From the modeling perspective, I think getting back to fundamentals and building the models is key to success. If you can't define the problem you are trying to solve, how can you measure how successful you have been? I am glad that old Problem Solving 101 still applies to today's much more complex world; it has yet to fail me. I follow the KIS (keep it simple) or KISS (I won't spell out the acronym here) principle: if you can make the complex appear simple, you have a better chance of getting more people involved in understanding the problem and, ultimately, in being part of the solution.
Sunday, March 05, 2006
IBM's Quasar Impact
Earlier this year I discussed the impact I expected the Cell chip to have on the industry. Well, over at IBM the impact is huge. Just a few days ago, Fortune published an article describing the Quasar project at IBM: "IBM's Quasar: Is it the future of computing? The computer giant is betting a new chip and a reorganized corporate structure will make IBM exciting again."
The impact of what IBM learned on the Cell chip project over the last six years will be sweeping. The Quasar project is fueling an internal reorganization of IBM based on lessons learned from the Cell project and on new sources of innovation within the corporate giant. The Cell concept came from Sony; the chip is a collaborative effort between Sony, IBM, and Toshiba, and it will power the soon-to-be-released Sony Playstation 3, HDTV-related consumer electronics devices, and IBM blade servers.
Quasar can be considered the next release of the Cell and will be a centerpiece of all of IBM's future hardware designs. The future of information technology will be image- and speech-centric instead of primarily text-based as it is today, and in that heavily image- and speech-centric processing world, the power of Cell and Quasar is what will drive innovation, from IBM's viewpoint.
Once Sony releases the Playstation 3 this year, we will have more tangible evidence of what the excitement is all about from an IBM perspective. IBM is trying to bring the 'wow' and 'excitement' factors back into its products, and with the Cell and Quasar chips the future again looks to be quite a fun place, if all goes as planned.