Today was a sad day for me. I just read in the latest Software Development Times (March 1, 2006) issue on page 5 that the creator of Symantec's Visual Cafe and founder of Java software company M7, Mansour Safai, died on 2/9/2006. He was 43. The cause of death was brain cancer.
Safai was a brilliant software engineer and architect. While working for Logitech he created the Multiscope debugger. At Symantec he was vice president and general manager of their Internet tools division. In 1997, he developed Visual Cafe, a multiplatform RAD tool for writing, debugging, and deploying Java applets and applications. I did get to research and evaluate Visual Cafe in the late 1990s, and I thought it was quite impressive. Java at the time was just not mature enough for adoption at my organization.
If you are not familiar with these tools that Safai created, they were all highly capable and innovative. His later company, M7, developed NitroX, a very nice tool permitting real-time WYSIWYG development of Struts and JSF applications within the Eclipse environment. I had the opportunity to evaluate and try out NitroX last year (2005). What NitroX does behind the scenes is quite impressive. M7 holds a patent on the algorithms it implemented for coordinating and managing all the XML files involved in Struts, JSF, Eclipse, and Java development.
Mansour Safai played guitar and was an athlete. He played competitive tennis in high school and enjoyed skiing, according to the SDT article.
The passing of a great software engineer, architect, and human being at such an early age is a great loss to humanity. His soul has moved on to the next level of existence and I am sure that he will continue his work in another realm.
This morning I offered a moment of silence and prayer for the passing of this great software engineer and architect. This news will probably go unnoticed with all the events happening in our modern world. If you happen to read this blog, out of respect for Mansour Safai please reserve a brief moment today in your life to remember him.
Tuesday, February 28, 2006
Wednesday, February 22, 2006
The Octopiler
How do you get the current and future generations of computer scientists, software engineers, and programmers productive with parallel programming techniques? The Octopiler. The article "Octopiler seeks to arm Cell programmers" was the first announcement I have seen of an effort to simplify the development of complex parallel programs for the new Cell chip. (See my Jan 2006 blog entry about the Cell chip, "The Cell Processor and the Future".)
The complexity of the new Cell chip, and the need to write programs that take advantage of it, was always known to be a programming challenge. Most computer science curricula just don't teach the current generation of students how to write 8-way parallel algorithms. IBM's solution is a compiler called the Octopiler that uses some artificial intelligence to break down your program's algorithms and optimize them for the Cell chip's eight synergistic processing elements (SPEs).
What is most interesting about this new technology is that, in theory, it will make programming for a parallel processing chip as simple as writing single-threaded programs. At least that is the goal. Whether this is realized in research and practice remains to be seen.
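To make that goal concrete, here is a minimal sketch in Java (not the Cell/SPE C code the Octopiler actually targets) contrasting a sequential loop with a hand-split eight-way parallel version. The class and method names are invented for illustration; the manual chunking and result-merging shown here is exactly the busywork an auto-parallelizing compiler aims to do behind the scenes.

```java
import java.util.*;
import java.util.concurrent.*;

// Illustrative only: a sequential sum vs. a hand-decomposed 8-way
// parallel sum. The eight chunks mirror the Cell's eight SPEs.
public class EightWaySum {
    static long sequentialSum(long[] data) {
        long total = 0;
        for (long v : data) total += v;
        return total;
    }

    static long parallelSum(long[] data) throws Exception {
        int chunks = 8; // one chunk per SPE in this analogy
        ExecutorService pool = Executors.newFixedThreadPool(chunks);
        List<Future<Long>> parts = new ArrayList<>();
        int size = data.length / chunks;
        for (int i = 0; i < chunks; i++) {
            final int lo = i * size;
            final int hi = (i == chunks - 1) ? data.length : lo + size;
            // Each task sums one slice of the array independently.
            parts.add(pool.submit(() -> {
                long sum = 0;
                for (int j = lo; j < hi; j++) sum += data[j];
                return sum;
            }));
        }
        // Merge the partial results back into a single total.
        long total = 0;
        for (Future<Long> f : parts) total += f.get();
        pool.shutdown();
        return total;
    }

    public static void main(String[] args) throws Exception {
        long[] data = new long[1_000_000];
        for (int i = 0; i < data.length; i++) data[i] = i;
        System.out.println(sequentialSum(data) == parallelSum(data));
    }
}
```

The point of the sketch: the two methods compute the same answer, but the parallel one forces the programmer to manage decomposition, scheduling, and merging by hand. A compiler that generates the second form from the first is the promise being made.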
I predicted that the Cell chip would change many things in computer science in the near future. This new compiler technology is the first tangible proof that I have seen to date. IBM says that the Octopiler technology is available today in limited form on a 64-bit machine running Fedora Linux and a specialized version of the GCC compiler. The migration towards 64-bit computing just gained another justification if you are interested in developing future applications for the Cell processor.
If you are in the game programming industry and want to develop games for the Sony PlayStation 3, then you really do not have a choice but to start learning about the Octopiler and other emerging technologies surrounding the Cell processor. I have never worked in the game programming domain, so my viewpoint may be entirely wrong; after all, games have already been under development for the PlayStation 3 for the past year or so.
Anyway, the Cell chip will keep our industry dynamic and interesting for years to come.
Saturday, February 18, 2006
The Softer Side of Product Design
There are many myths and misconceptions within (and especially outside of) the information technology (IT) and software engineering (SE) industry that this is primarily a technical business. The tenets of software engineering are at most approximately one-third technical. The remaining two-thirds is the other 'grey' stuff, or softer side.
What I mean by the softer side is the non-technical aspects of the business of SE. During my usual early Saturday morning ritual of drinking coffee, perusing my favorite sites, and reading various articles, I stumbled upon "Software Development's Evolution towards Product Design". Reading this entertaining yet insightful article made me think about the two-thirds non-technical side of SE. I really like all the "poo" depictions in the article's graphics.
The Lost Garden article describes the human and social side of software development that is coming into focus today. This is where technology and techniques have less impact on the success of a product. What comes to mind are modern concepts such as social engineering, social portals, human centered design and other terms used today to describe the new products and interactions that occur with software and technology in the modern world of 2006.
Given that we know where SE has been, the future is the great unexplored territory. Evolving towards 'product design' as the above article states is just one direction that we are heading. The continuous evolution towards product design is especially relevant in the consumer product development industries (gaming, multimedia, entertainment). Other problem spaces in IT are not quite yet as mature and have different SE dynamics.
One area where the evolution towards 'product design' is not quite so significant is the 'grey business' of business intelligence, data mining, data warehousing, modeling/simulation, and decision support. This is an SE problem space where designing products for 'emotional needs' is not necessarily quite as beneficial yet. In this particular domain, 'practical needs' still far outweigh 'emotional needs'. Here the concentration is still primarily on information and knowledge creation vice a specific product that has a look, feel, and social behavior.
In the long run I think emotional needs will eventually catch up with and surpass practical needs in the 'grey business'. The continual evolution of information technology will permit this to occur. The evolutionary software development model toward product design described in the Lost Garden article will eventually be realized over time. However, I am sure there will be other evolutionary SE theories, techniques, and models that we will observe, discover, and analyze, forcing a revisitation of the 'softer side'. Maybe this will be dubbed 'knowledge design', since some of the 'grey business' focuses more on knowledge and information vice products.
Saturday, February 11, 2006
The Eclipse of Modern IDEs
Five years after open source Eclipse was unveiled to the world as the future of IDEs by IBM, it appears that the future is imminent. Just this past week Borland announced that it is getting out of the IDE business to focus on its ALM business ("Borland To Dump JBuilder"). According to the InfoWorld article (02/08/2006), "Borland to exit IDE business, focus on ALM", this is due to declining sales, income, and profitability in the IDE business.
Eclipse appears to be on track to become the future of IDEs. Recent surveys now put it at the top of the heap of Java IDEs. Things have changed drastically in the Software Engineering business as anticipated especially in the business of IDEs. I recall reading articles a few years ago about how tool vendors in the future will either become an Eclipse plugin or fade out of existence. Well with each passing year and corporate announcement like the one Borland just made, we see this happening right now.
This trend will continue as far as I can tell. At one point a few years ago, Borland JBuilder, IntelliJ, and a long-since-gone product by Symantec (I think it was called Visual Cafe) were the leaders. Today, each of these competing proprietary IDEs is either competing successfully for survival (IntelliJ), has since been sunsetted (Visual Cafe), or is being sold off to better opportunities elsewhere (JBuilder). The Sun NetBeans product continues its evolution, heavily driven by the Eclipse phenomenon.
Many of the reasons for this are identical to my personal scenario for choosing Eclipse as my preferred Java IDE a few years ago while evaluating Java IDEs. Eclipse was young and free; JBuilder and its peers were expensive and hard to get. The expensive tools were hard to get due to declining budgets, not due to accessibility on the web. I downloaded Eclipse, experimented, learned how to use it, and became proficient with it. Prior to getting into Eclipse, I had used or evaluated VisualAge (yeah, remember that IBM tool which was the basis for Eclipse), Sun Forte, NetBeans, JBuilder, and even Visual Cafe. I never did get to try IntelliJ, although I have heard many good things about it. Using the free community editions or evaluation versions of the proprietary tools put quite a bias on my perception.
As Eclipse was evolving and improving (shining brighter), I was trying to justify the funds for the proprietary tools. Then Eclipse 2.1 rewrote the playing field in Java IDE performance and made the other tools appear very painful. I was able to get Eclipse 2.1 running acceptably on an archaic Pentium II 366 MHz notebook with 384 MB of RAM. Performance improved dramatically with Eclipse 3.x, which is dominant today. As far as IDE pain is concerned, Sun Forte and early versions of NetBeans were the worst. The sole reason I never used early versions of those tools was their painfully slow user interface performance.
At any rate, I could go on about my IDE experiences over the past few years, but this posting is about continuous change in modern IDEs. Eclipse has fully embraced the open source model and after five years has risen to the top by being agile, open, adaptable, and reliable, and by focusing on performance. I remember all the early debates about Swing vs. SWT, and it appears that performance does matter. At least for the voting public (developers) it does. When it comes to IDEs, regardless of the language or environment, I have always chosen performance over features. Well, after being in this business since the late 1980s, it looks like going with my instinct about IDEs has paid off again.
For the future, I would postulate that Eclipse will continue its ascent and start spreading its wings. This will provide more lift, allowing it to fly higher and farther. IBM is already migrating much of what it does into some type of Eclipse plug-in (Rational, Domino/Notes, etc.). Use of Eclipse with other languages like C/C++, Ruby, PHP, Python, and Perl will mature. Microsoft, Sun, Oracle, and whoever else is left in the IDE business will all have to keep their eyes on Eclipse so they don't fall into the Borland scenario. This will keep the IDE business very interesting for years to come.
As for the modern IDE landscape, I think Eclipse has a bright future. That is where I put my vote for now.
Sunday, February 05, 2006
Frameworks
This is one of my favorite topics of late, especially in the web development and Java universe. If you are working with Java or any of the object-oriented dynamic scripting languages (Python, PHP, Ruby), then you are most likely working with a set of frameworks. If you are not using frameworks, then I am not sure which of us is in the better situation today.
I just read a blog posting, "Why I Hate Frameworks", which pretty much sums up the state of frameworks today in the Java universe. What is ironic about this posting is that most of my experience with frameworks fits the hammer metaphor described in the article. I dub this 'framework hell', which is analogous to the 'DLL hell' that exists in the Windows universe today.
If you are not currently working with a language/framework set, or are just getting involved, then you will encounter the framework complexity and integration fiasco sooner or later. When you do, all this will make sense to you. The general-purpose framework days are over in modern software development. Frameworks are the trend for solving just about everything today, and you can find at least a dozen frameworks to solve every single little part of your application in a highly granular fashion. Note that not all frameworks work or play well together. Most of them are designed independently of one another by teams of programmers that may not even be aware of each other's framework efforts or existence. This is the root of the problem.
The complexity and state of frameworks today is mind-boggling. At least to me it is. Every few months, there is significant progress in the frameworks that I use, or in the frameworks I am researching to justify a possible change in technical direction. I don't mean just following the technical winds of the day. Many of these advances are significant enough to contemplate a disruption in forward technical progress.
Obsolescence of a framework is a real risk. In the last four years in the Java community I have seen some frameworks and APIs become obsolete as a newer, better, faster, lighter technique picks up momentum and takes over (e.g., Spring, AJAX). This continues today in 2006 as the convergence of AJAX techniques takes shape. Recall that AJAX was a disruptive technique in 2005.
Maybe things will get better later this year or next; however, for the foreseeable future, I think the state and complexity of frameworks will continue on course. What course that is depends on the forward progress of the researchers, programmers, practitioners, and companies that are driving it. Don't get me wrong, frameworks do work well when appropriately understood, applied, tested, and proven. It is just that your framework may become obsolete sooner than you think, leaving you with deprecated technology.
I guess this is the cost of rapid forward progress. I am not complaining about it, just stating that rapid innovation in and of itself can be like one huge experiment that can shift and change directions without any warning. One month you are using the best of breed, and the following month you are re-engineering, refactoring, or simply researching better techniques which may cause you to consider making changes soon. If you follow framework integration best practices (I am not sure which ones; there are so many, and everyone has their own viewpoints), loosely couple, design for change, use agile techniques, and keep an open mind, you will probably be successful in your framework endeavors. Good luck.
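As a minimal sketch of the "loosely couple, design for change" advice, here is one way to insulate application code from framework churn in Java: hide the framework-specific API behind an application-owned interface so that a framework swap touches one adapter class instead of the whole codebase. All class and method names here are invented for illustration, not taken from any real framework.

```java
import java.util.*;

// Application-owned abstraction: business code depends only on this,
// never on a framework's persistence API directly.
interface UserStore {
    void save(String id, String name);
    Optional<String> find(String id);
}

// Adapter for whatever framework is in fashion this month. If that
// framework becomes obsolete, only this one class is rewritten.
class InMemoryUserStore implements UserStore {
    private final Map<String, String> db = new HashMap<>();
    public void save(String id, String name) { db.put(id, name); }
    public Optional<String> find(String id) {
        return Optional.ofNullable(db.get(id));
    }
}

// Business logic stays framework-free; the store is injected.
class RegistrationService {
    private final UserStore store;
    RegistrationService(UserStore store) { this.store = store; }
    String register(String id, String name) {
        store.save(id, name);
        return store.find(id).orElse("missing");
    }
}

public class LooseCouplingDemo {
    public static void main(String[] args) {
        RegistrationService svc =
            new RegistrationService(new InMemoryUserStore());
        System.out.println(svc.register("u1", "Mansour"));
    }
}
```

The design choice is the classic ports-and-adapters idea: the interface is the stable contract the application controls, and each framework gets a thin, disposable adapter behind it.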