Saturday, December 30, 2006

Project Looking Glass 1.0 3-D UI

The future of user interfaces will be 3-D, and it looks like Sun has been busy with its Project Looking Glass 1.0 3-D user interface: "Sun Releases 1.0 Looking Glass Interface". As for the future, the 3-D user interfaces are quite impressive from the screenshots I have seen. You have got to check them out for yourself.

Just noting it here since Sun recently put a 1.0 out there for the open source community to experiment with and evolve.

Friday, December 29, 2006

OpenOffice.org 2.1 on the Mac OS X

I have been using OpenOffice.org on Windows and Linux for many years now. I just recently got myself a MacBook and have spent the last three weeks using it and learning how Mac OS X 10.4 (Tiger) works. I have to say that I am getting more comfortable using Mac OS X. I am using Firefox as my browser, have gotten used to the various Mac OS X applications, and am now getting serious about using my new Mac as a software development machine. I primarily develop in Java, so getting all my development tools set up on the Mac is a must. This includes Eclipse, NetBeans, Tomcat, JBoss, Nvu and OpenOffice at a minimum.

Well, I must say that OpenOffice runs fine on Mac OS X after I finally got it installed last week. Currently you must have X11 running in order to use OpenOffice on the Mac. A pure Aqua version of OpenOffice is in beta and should be ready for primetime in 2007. Look here for details. I also downloaded NeoOffice, which runs fine; however, I noticed a few performance issues with it. So I reverted to OpenOffice as my word processing, presentation, spreadsheet, and desktop database suite of productivity applications.

What really amazes me is how brainwashed so many users are with the perception that they 'must have' Microsoft Office in order to function on a computer. I have met several Mac users who said they purchased Office for the Mac just to do word processing. Wow, the marketing machine at Microsoft sure does have everyone fooled.

This is the reason why I am blogging about OpenOffice on the Mac. It is available today and works as advertised. If only more Mac users or recent converts were aware of OpenOffice and its capabilities, they could save themselves a few dollars and rid themselves of Microsoft. Personally, I enjoy having a Microsoft-free laptop with my new MacBook. I have had the same on my Linux notebook for years, and it sure does feel great not having to be dependent on Microsoft for my personal computing requirements.

Too bad that I work in a predominantly Windows environment. It would be nice to see Macs brought into the workplace. Based on my experience, the Mac just works, like my refrigerator. No strange behavior, bugs, or just plain oddness to worry about anymore.

Sunday, December 10, 2006

My Initial Mac

After years of working with Windows, Linux, and OS/2, I have always wanted to get into the Mac but never had the motivation or reason to spend the money on one. Well, with the release of the Intel-based Macs earlier this year and the ability to run Linux and Windows in virtualization via Parallels (and soon VMware), I took the plunge and bought myself a MacBook on 11/29/2006. After seeing the features of Windows Vista and waiting for its release, I don't see any compelling reason to upgrade to Vista within the next year or so unless it happens at work. For home, entertainment, and personal multiplatform research use, I am going with Mac OS X.

After using it for a few days, I am happy to say that I really like Mac OS X 10.4 Tiger and the Mac in general. There are some extremely intuitive and unique capabilities in Mac OS X that I have not seen on Linux, OS/2, or Windows: the Dashboard, Exposé, and Hot Corners UI features, to name just a few. These are the features that I have been using extensively. There is also a UI automation feature which I haven't even gotten into yet. I forget the name of it, but it looks like exactly the type of thing I was looking for in a 21st century operating system.

As far as multimedia is concerned, the Mac just blows away Windows in my opinion and experience. I have used Windows Media Center Edition, and compared to the Mac it looks like a half-baked experiment. The Mac multimedia experience is like an extremely polished and intuitive appliance. It is just something you have to experience personally; that is all I can say.

I still have a lot to learn about the Mac architecture, Darwin, Cocoa, Aqua, X11 on the Mac and more. After running on it for just a few days, I am excited to say that I like it! I will be blogging a lot more about the Mac here in the next few years. Given that I do almost everything on the web or online now, switching to a Mac was a piece of cake for me. Mozilla Firefox and Safari work nicely, and I had almost nothing to learn from a user perspective.

From a software engineering perspective, Mac OS X totally embraces Java, which is good for me since I have been working with Java technology for a long time. I have been finding quite a bit of open source stuff for the Mac, which is nice to know. I really like all the Dashboard widgets. Looking ahead, I have been reading the previews of Mac OS X 10.5 Leopard that Apple has been putting on its website, and it has some innovations that will make my next Mac OS X upgrade in 2007 a more capable Mac. The new Spaces and Time Machine capabilities are a step into the future.

I will continue working with Linux and Windows since that is how I make a living, but it would be nice to be able to make a living doing web development on Mac OS X. At any rate, I hope your future computing is as enjoyable as I foresee mine will be. After just a few days of using my MacBook, I feel better about my computing future already. In addition to all the above, I forgot to mention that my MacBook just aesthetically looks and physically feels cool under my fingers.

Friday, November 17, 2006

Playstation 3

Today, 11/17/2006, Sony unleashed its PlayStation 3 (PS3) on the US market. OK, so what does the Sony PlayStation 3 have to do with software engineering? Quite a bit if you look out into the digital future, at least in my opinion. The PS3 is much more than a game machine. It is a supercomputer based on the IBM/Sony/Toshiba-designed Cell processor (I blogged about the Cell earlier this year in The Octopiler) that runs Linux, has a 60GB hard drive, 802.11g WiFi, Gigabit Ethernet, and a Blu-ray disc drive, and is designed to be the center of a digital entertainment hub for all your multimedia desires. Oh yeah, it is also an incredible high-definition (up to 1080p), networked gaming machine.

I will not bore you with the technical details because they can be found all over the internet and in printed media whenever you need them. One very interesting aspect of the PlayStation 3 that is downplayed, at least initially, is its inter-connectivity with the PSP. I really like the portability aspect of the PSP and use it quite frequently as a portable web browsing device and gaming machine. I'd like to see the PS3 and PSP collaborate with one another in games and other forms of digital entertainment that have not been explored yet.

I don't expect to be able to get a PlayStation 3 anytime soon due to sell-outs and limited initial supplies, but when I do eventually get one in the near future, I intend to set it up in my family room and give it a spin as a centerpiece for the digital living room of the 21st century. I have intentionally held back from getting a large HDTV until after the PS3 was released. Now that the PS3 is out, I have begun researching and shopping for an LCD HDTV that will provide me with 1080p capabilities.

I think the PS3 will make the digital life interesting and fun again in 2007 and beyond. Something I would like to see is a Cell-based notebook computer running Linux! Now that would be a really cool evolution for this family of technology, say in 2008. For now, in 2006 and 2007, we have the PS3. I can't wait to get mine.

The Open Source Battle Front

November 2006 will be remembered as the month when proprietary software companies opened a battle front in the inevitable war between open source and proprietary technologies. Here are some of the significant events that have occurred:

  • Oracle announced that it will compete directly with Red Hat in the Linux IT services sector.

  • Microsoft announced and signed a deal with Novell covering licensing and collaboration with the Linux community.

  • Microsoft also signed a deal with Zend, the open source PHP vendor.

  • Sun made Java open source under the GPL on 11/14/2006.


These are just the major announcements. There are many more than this; however, the big proprietary companies are significant since they command so much market share and industry influence.

So what is the agenda under the agenda? It is an acknowledgement and action by the big proprietary vendors that they have decided to compete directly with open source companies that threaten their proprietary software business model. I thought this battle front was going to open in 2007/2008, so it caught me off guard how soon the proprietary software companies (i.e., Oracle and Microsoft) chose their battle fronts.

At any rate, the next few months and 2007 will definitely bring adventure, changes, and interesting developments in the open source world. What is interesting is how Google, Yahoo, Amazon, IBM, and other pro-open source companies have stayed out of the fireworks so far this month. It looks like they are all managing to proceed with their open source strategies, whatever they are, and moving forward changing the world.

As a supporter of open source, I think big changes are in store for all of us in 2007 and beyond.

Saturday, October 21, 2006

Unstructured Information Management

Unstructured information makes up most of the information content on the internet today. Estimates run as high as 90% of the available information on the internet being unstructured. So with all the databases, portals, websites, repositories, hard drives and trillions of files that exist today, how do you harness this information? This is where the field of information management faces the technical challenge of turning all this content into useful information and knowledge. This is entirely conceivable given sufficient time, computing power and storage. The challenge is making it happen in near real time.

To meet this challenge, DARPA funded IBM Research in 2005 to create UIMA, which stands for the Unstructured Information Management Architecture. It is an open, industrial-strength, scalable and extensible platform for creating, integrating and deploying unstructured information management solutions from combinations of semantic analysis and search components. IBM makes UIMA available as a free SDK (alpha), and makes the core Java framework available as open source software (UIMA at SourceForge) to provide a common foundation for industry and academia to collaborate and accelerate the worldwide development of technologies critical for discovering the vital knowledge present in the fastest growing sources of information today. IBM developerWorks has a tutorial for using the UIMA SDK with Eclipse.
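
To make the framework more concrete, below is a minimal sketch of a UIMA annotator. This is hedged: the package names follow the open source Java framework releases and may differ by SDK version, and the KeywordAnnotator class and the keyword it scans for are my own illustration, not anything shipped with the SDK.

    // Minimal UIMA annotator sketch. Package names follow the open source
    // UIMA Java framework; the class and the keyword scan are illustrative.
    import org.apache.uima.analysis_component.JCasAnnotator_ImplBase;
    import org.apache.uima.analysis_engine.AnalysisEngineProcessException;
    import org.apache.uima.jcas.JCas;
    import org.apache.uima.jcas.tcas.Annotation;

    public class KeywordAnnotator extends JCasAnnotator_ImplBase {
        private static final String KEYWORD = "acquisition";

        public void process(JCas jcas) throws AnalysisEngineProcessException {
            String text = jcas.getDocumentText();
            int pos = text.indexOf(KEYWORD);
            while (pos >= 0) {
                // mark each occurrence as a span of interest in the CAS,
                // where downstream search components can pick it up
                Annotation ann = new Annotation(jcas, pos, pos + KEYWORD.length());
                ann.addToIndexes();
                pos = text.indexOf(KEYWORD, pos + 1);
            }
        }
    }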

Since IBM released UIMA as open source in early 2006, it has been widely adopted. Open source projects such as GATE, OntoText, and many others have been utilizing UIMA as the framework for unstructured information management research. As research into managing and harnessing unstructured information grows, there will be more solutions available to solve these problems.

Sunday, October 01, 2006

The Expanding Google Earth

If you have not used Google Earth lately, then you are missing out on one of the killer applications that merges the web with rich native applications. The 3D Warehouse of Google Earth models has been growing steadily since Google released SketchUp, the 3-D modeling application. SketchUp comes in a free version and a professional version. The nice thing about this approach is that you get to use a fully functioning program that scales to the professional level if you need that capability.

"The growing world of Google Earth" provides some insight into just how Google Earth has been evolving. I use Google Earth quite frequently as the means to find places and to explore places that I have never been. You can spend hours doing this and is quite entertaining.

The real power of Google Earth is the API provided by Google that allows you to build upon Google Earth's capabilities by integrating it into your custom applications. Additionally, an SDK is provided to allow even more customization. Google Earth is becoming a platform for creating new applications and mashups.
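
As a hedged taste of that integration, the small Java program below writes a minimal KML placemark file that Google Earth can open directly; the place name, coordinates and file name are made up for illustration.

    import java.io.FileWriter;
    import java.io.IOException;

    // Hedged sketch: emit a minimal KML placemark that Google Earth can open.
    // The name, the (longitude,latitude,altitude) values and the file name
    // are illustrative only.
    public class KmlExport {
        public static void main(String[] args) throws IOException {
            String kml =
                "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
              + "<kml xmlns=\"http://earth.google.com/kml/2.0\">\n"
              + "  <Placemark>\n"
              + "    <name>Sample placemark</name>\n"
              + "    <Point><coordinates>-122.084,37.422,0</coordinates></Point>\n"
              + "  </Placemark>\n"
              + "</kml>\n";
            FileWriter out = new FileWriter("placemark.kml");
            out.write(kml);
            out.close();
        }
    }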

The imagery database is constantly being updated, and I can see the day when 3-D models for just about everything are integrated and available. I can think of several applications for integrating Google Earth technology and designing 3-D programs for everyday use. For now, we all get to watch this killer application evolve slowly, which is OK with me.

Tuesday, September 05, 2006

Open Source To Rescue OCR

This blog is not intentionally all about open source. It just happens that open source technology is where all the action and innovation appears to be happening in software today. I just bumped into an article about OCR that interested me: "Google releases open source OCR tool". If you thought that OCR was a stagnant technology, then you have just joined the crowd, and Google's viewpoint.

What is interesting is the fact that it took an open source mentality, a stagnant technology (OCR), and a few bright Google engineers to reinvigorate the OCR technology space. The new open source product, called Tesseract, is released under the Apache Software License. Google hopes that the open source community will create solutions and new products for the OCR technology.

Prior to reading this article, I was not aware that OCR technology had literally stagnated for the last 10 years. HP and UNLV released their OCR technology into open source to regenerate interest. Well, it appears that Google does have the interest. Remember the digital library initiative that caused an uproar? Well, open source OCR looks like par for the course in working towards that goal.

Saturday, August 26, 2006

The Next Set of Open Source Waves

I have written about my viewpoint of open source technology and how it is transforming the IT industry. Well, I am glad that this mentality, or at least thought process, is becoming mainstream. I really like the angle that InfoWorld has put on this in the article "Who are the Losers, Now That Open Source is Winning?".

What is even more ironic is how Microsoft has approached the Mozilla Foundation to collaborate on future projects. This is just more evidence that open source is a strategic aspect of any company working in the IT industry. You can no longer ignore open source if you want to remain viable in the future.

The successes of Google, Yahoo, Amazon, eBay, SalesForce.com, Red Hat, JBoss... are case studies for businesses built on open source technology. Getting back to who is winning and the next set of waves, the InfoWorld article above makes a pretty good assessment of the current state of open source vs. proprietary technology in the IT field. The bottom line is that for the next few years there will be peaceful coexistence. After that, the two will be on a head-on collision course, and if you are not a player with or in open source technology, you will probably not be doing business successfully in the future.

As always in this business, time will tell, and we are always just one major breakthrough away from a completely disruptive technology shift. So far this year, I think the cards are falling in favor of open source and its increasing critical importance to the IT industry.

Sunday, July 30, 2006

LAMP Stack Development

Well, I have recently been pretty heavy into PHP development and must say that it is quite a change compared to Java/Struts and J2EE-type development. One nice thing I have found is that Eclipse and Zend are working together to make this a smooth process with the PHPIDE. If you are a Java developer and already use Eclipse, you will feel quite at home using the Zend PHPIDE, which is an Eclipse 3.2 plugin that works quite nicely even in beta form.

There are pros and cons to using the LAMP (Linux, Apache, MySQL, Perl/PHP) or WAMP (Windows, Apache, MySQL, Perl/PHP) stack. I will just refer to this as the LAMP stack from now on in this posting. One benefit is that all the tools are open source; there are very few financial barriers to using these technologies. The primary con I see is that you have to ensure that you are using compatible versions of each of the components in your technology architecture. One solution is to just try a VMware appliance or the WAMP or XAMPP packages pre-built for Linux or Windows. The downside to the pre-packaged approach is that if you have to deploy and support a LAMP/WAMP stack, you will not have the necessary configuration skills for troubleshooting issues that may occur on your servers. I recommend that if you need to support this stuff, you learn to do it the hard way at least once so you will know the configuration and mechanics of the LAMP stack. The "Install & Configure Apache, PHP, JSP, Ruby on Rails, MySQL, PHPMyAdmin & WordPress on Windows XP/2000" and "Building a LAMP Server" HOWTOs really helped me out with this stuff, among other HOWTOs I found. As usual, the HOWTO articles were not bullet-proof or 100% accurate; I found about a 75% accuracy rate for configuring LAMP. Most issues I have found concern PHP 5 and MySQL 5 (see this Zend post).

In other words, I discovered that MySQL 5 and PHP 5 have introduced some connectivity issues and incompatibilities with the latest Apache HTTP Server. PHP 5 doesn't even work with Apache 2.2 in my experience, which I confirmed after doing quite a bit of research on the web. From what I have seen, the PHP community is working on supporting Apache 2.2. For MySQL 5 and PHP 5 you need to be using Apache HTTP Server 2.0. I learned this the hard way. Additionally, I was not aware that MySQL 5 breaks quite a few PHP applications due to changes in password encoding and the introduction of object-orientation in PHP 5. Then there is the PHP mysqli extension that supersedes the mysql extension that is built into PHP. What a frustrating experience that is.
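
For what it is worth, the Apache side of the working combination comes down to a few httpd.conf lines. This is a hedged sketch for Apache HTTP Server 2.0 on Windows; the paths are illustrative and will differ on your system.

    # Hedged httpd.conf sketch: wiring PHP 5 into Apache HTTP Server 2.0 on
    # Windows. php5apache2.dll is the Apache 2.0 module; paths are illustrative.
    LoadModule php5_module "C:/php/php5apache2.dll"
    AddType application/x-httpd-php .php
    PHPIniDir "C:/php"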

Anyway, what I have learned with all this experimentation is that the LAMP stack is like all the other open source technology solutions I have been working with in the past few years. It does work, with a few caveats. LAMP works only as well as your understanding, your skills, and your ability to use the web to research all the tools and components. Paraphrasing this: LAMP works if you are willing to put in the time and have the right mentality and the necessary open source skills to solve problems.

Friday, July 21, 2006

Open Source Technology and National Security

I have read several articles that support the idea that the US DoD should adopt open source and open technologies more readily to match the productivity gains that many companies in the private sector have attained. The recent article, "Open Source in the national interest", provides a strong case for adopting open source in the DoD more readily. The 79-page April 2006 report mentions specifically how IBM has transformed itself by adopting open source.

There was a 2002 report by MITRE that advocated the same, though it does not go into as much depth as the more recent article noted above.

If Google, Yahoo, eBay, Amazon, IBM and many others are doing so well with open source technology on a large scale, then I think our government can do the same. It would just require a transformation of the government IT organizations. This is definitely more difficult than it sounds, but when I see entire countries and regions adopting open source technology (e.g., the EU, Japan, Korea, China, India), I get the feeling that we might be falling behind or losing our edge by not following suit. I think it is in our national interest, and so do many others in this ever-changing computer science field.

Agile Web Development and the Lighter Languages

I have spent the past few months developing and experimenting with Java (JSP, Struts, Spring, Hibernate), Perl, PHP, ASP, Visual Basic and Ruby. Well, I will have to say there is something to be said about agile processes and open source dynamic languages. Recently I have had the opportunity to work with Java/JSP, Perl, and PHP. I am porting a PHP web application to JSP and intentionally sticking with a JSP Model 1 implementation. This is a few steps backwards in development technique, but I am testing out a theory; I can always refactor to an MVC design or other if necessary.

The theory that I am testing is: "It is possible to ignore elegance temporarily for attaining speed and practicality of getting a solution working using a 'quick and clean' process (agility)." The old programming cliche of doing something 'quick and dirty' just to get the job done is a concept I never really liked but have had to apply occasionally in my career. The 'quick and dirty' approach almost always got me in trouble, so I try to avoid it at all costs if possible. Based on my experiences with agile 'quick and clean' development, I feel that it is possible to attain software development speed in a practical, sustained manner without sacrificing too much elegance. In other words, getting the cleanest job done in the best possible manner in the shortest period of time.

What I have found interesting in the days I've spent porting PHP to JSP is that it is relatively straightforward if you ignore your inner programming voice that says you should refactor the PHP to a more elegant Java/JSP design. In just a few days' work, I have functioning webapps in JSP that rival the development speed of the looser PHP. I have to note that the PHP webapp is not using any PHP framework, design patterns (Five Common PHP Design Patterns), or other MVC pattern frameworks. That would of course complicate the design and make porting PHP to Java/JSP slower, in my opinion, but it would require an experiment to prove it.
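
To make the Model 1 style concrete, here is a hedged sketch of what such a page looks like after the port: request handling, logic, and markup all live in the JSP itself, with no controller servlet. The Catalog.search() helper is hypothetical.

    <%-- Hedged Model 1 sketch: the page handles the request, the logic and
         the rendering itself; Catalog.search() is a hypothetical helper. --%>
    <%@ page import="java.util.List" %>
    <%
        String query = request.getParameter("q");
        List results = (query == null)
            ? java.util.Collections.EMPTY_LIST
            : Catalog.search(query);
    %>
    <html>
    <body>
      <form action="search.jsp" method="get">
        <input type="text" name="q" /> <input type="submit" value="Search" />
      </form>
      <% for (int i = 0; i < results.size(); i++) { %>
        <p><%= results.get(i) %></p>
      <% } %>
    </body>
    </html>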

One other thing I must note is that having a really good IDE is critical to making this work. I am currently using Eclipse 3.1, NetBeans 5, and Zend Studio 5 in these 'quick and clean' experimental processes, and they are definitely a must. I recently added NetBeans 5 to my palette of Java programming tools, and it has yielded quite a productivity improvement for JSP and servlet development for me.

My next set of experiments is to get into Ruby on Rails (RoR) development, which is a 'quick and clean' architecture. I have been talking about this all year; however, I have not made the time to start an experiment with RoR.

Saturday, July 01, 2006

Archaeology and FOSS Trends

Archaeology has been a passion of mine since I was a kid. Prior to getting into computer science and music, archaeology was one of the professions I wanted to pursue. To this day, using the internet and reading as much as I can, I try to stay involved in what is current in the archaeological field.

Much of the software technology used in the archaeological field has been proprietary until very recently. I just bumped into an open source product called ArcheOS, a GNU/Linux distribution specifically configured for archaeology! The 1.0 release was presented late in 2005 and has since been evolving following the open source model.

In the article, "Uncovering progress in FOSS-based archeology", it discusses the progress made to date with FOSS for the archaology. I will research more into what ArcheOS does and should have a cursory working knowledge of this stuff sometime this year if I can devote a few days to it.

I am surprised at how FOSS has been finding its way into the archaeological field. I guess with continuing technology trends, open source technology appears to be invading just about every aspect of life where software is required and used. It makes logical sense that open source technology is finding its way into fields other than business, science and technology.

Saturday, June 10, 2006

Windows Genuine Advantage Spyware

I need to stand on top of my Microsoft soapbox and say a few things about Windows security. I use Linux, Windows, and whatever devices make my digital life more productive and entertaining. Well, if you have not heard, just this past week Microsoft was caught and tagged as the largest spyware source to date: "Microsoft Big Brother". For roughly the past year, all modern Windows PCs have been phoning home every day. This was put into service in July 2005 as the Windows Genuine Advantage (WGA) service, which was required to allow Windows patches to continue working. The true intent of WGA was never revealed by our trusted Big Brother, Microsoft.

The thing that bothers me the most about this is that Microsoft claimed it did this for security reasons and to make using Windows better for everyone, while actually using it for some yet unknown purpose (marketing). This sounds like a smokescreen to me and makes me wonder about the new integrated security features, like anti-spyware, that Microsoft has embedded into the upcoming Windows Vista.

Boy, am I glad that I have options and know how to use Linux and other personal computing platforms effectively. As I look to the future, I see a less Windows-dependent future for me, at least as far as home computing, entertainment, and security are concerned. With Linux, the Mac and the Sony PS3 coming online soon, I think in the near future I will further diversify my personal computing technology architecture and use Windows less. Choices exist; you just have to exercise your right to choose based on good solid information.

Wednesday, June 07, 2006

Online Everything Is Getting Closer

I moved to using e-mail on the web in the mid 1990s with Yahoo! Mail. It was the best thing since sliced bread for me. Prior to using Yahoo! Mail exclusively for my personal email, I was always on the wrong computer or in the wrong place when I wanted to access the e-mail folders containing the information I needed. This could be at home needing to access a work e-mail, or at work needing to access a home e-mail. Keeping track of the e-mail accounts and which computers could access them was just a pain.

To make a long productivity story short, once I moved 100% to web e-mail, the problem just disappeared and is now a no-brainer. I do not understand why so many people I know are still tied to thick-client e-mail (MS Outlook) today. Why? Who knows.

Well, if you are still using a thick-client calendar or e-mail, then you are stuck in the 1990s. With Yahoo! Calendar and most recently Google Calendar, and the ease of collaboration over the web requiring only internet access and any browser, I just don't understand the folks I hear talking about Outlook and wishing they were at home to access their personal email. Huh!?! Hello?! It's the 21st century, and that is an old 1990s problem. Why are you still stuck in the 1990s?

Well, with the recent release of Google Spreadsheets and Google's acquisition of Writely, I think the days of being tied to a thick client for spreadsheets and word processing are numbered. The only pieces missing are a nice web-based presentation application and a quick webtop database for business productivity. I suspect this will happen within the next year.

I can think of a few other web apps that would be nice replacements for their thick-client equivalents; however, for now I'd like to reserve some room for innovation in 2007. By the way, I just got my Google Spreadsheets account today and have already been inspired with a few ideas for its use within the first few minutes of using it. As for the "Online Everything" concept, I think this was dubbed Web 2.0 last year.

Wednesday, May 17, 2006

Google Web Toolkit makes AJAX Easier In Java

Today Google released the Google Web Toolkit (GWT), a Java framework that makes AJAX web applications easier to develop. I got my first taste of GWT today and have to say that Google has an AJAX product that makes building Java AJAX applications much easier, at least compared to my experiences in the past year.

The standout feature is that you don't have to write any JavaScript. With other Java-based AJAX frameworks like DWR, you still had to write some JavaScript code; GWT generates all the necessary JavaScript for you! Google provides good examples for you to learn GWT.
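
Here is a hedged flavor of what that looks like: a minimal GWT entry point written entirely in Java, based on my reading of the GWT 1.0 API. The GWT compiler turns this into the browser-side JavaScript for you.

    // Hedged sketch of a minimal GWT entry point (GWT 1.0-era API);
    // no hand-written JavaScript anywhere.
    import com.google.gwt.core.client.EntryPoint;
    import com.google.gwt.user.client.ui.Button;
    import com.google.gwt.user.client.ui.ClickListener;
    import com.google.gwt.user.client.ui.RootPanel;
    import com.google.gwt.user.client.ui.Widget;

    public class HelloAjax implements EntryPoint {
        public void onModuleLoad() {
            final Button button = new Button("Say hello");
            button.addClickListener(new ClickListener() {
                public void onClick(Widget sender) {
                    // this Java runs in the browser as generated JavaScript
                    button.setText("Hello, AJAX!");
                }
            });
            RootPanel.get().add(button);
        }
    }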

Based on my first impression of GWT, they also provide decent documentation of GWT online. Anyway, this was a surprise that I happened to see in "Google Releases AJAX Framework".

Tuesday, May 09, 2006

The Modern Configuration Management System Solution

The de facto standard for source code version control is the Concurrent Versions System, or CVS as it is better known. Well, CVS has been around for a while and has a few flaws that most organizations just overlook. I used to work for one of those organizations that applied the "if it ain't broke, don't fix it" mentality toward source code version control. I have used PVCS, CVS, TeamSource, a few others, and most recently Subversion.

CVS was good enough, and it did what we required, most of the time. There were issues with how CVS handles file renames, directory name changes, etc., which we just lived with like everyone else using CVS.

Well, if you happen to use CVS today and just deal with all its binary file handling, file renaming, and directory handling flaws, I have some news for you. You should really consider migrating to the modern successor to CVS: Subversion. I had the opportunity to research, evaluate, and test Subversion 1.3.1. I really should have done this a few years ago, but now is as good a time as any.

Subversion was designed to be a better CVS than CVS, and based on my experience it has attained this goal. The primary reason I say this is that I installed, tested and evaluated Subversion on Windows servers, and it works as advertised. CVS is best installed and run on a Linux server; Subversion runs well on both Linux and Windows.

Now I see why many of the Apache Software Foundation projects have switched from CVS to Subversion. Subversion handles binary files with ease and has all the nice GUI tools and Eclipse plugins that CVS has. I installed, tested and really like RapidSVN, SmartSVN and TortoiseSVN. The Subclipse plugin for Eclipse 3.1 works very nicely. I did not find a good NetBeans 5.0 plugin, which was my only disappointment. Other than that, Subversion 1.3.1 is definitely ready for prime time. The Subversion website is an excellent source of information and tools for Subversion. There are CVS-to-Subversion conversion tools which should make your transition smooth. I have not tried any of these repository conversion tools myself, so I caveat my previous statement by saying your mileage may vary.

I found that the information in the "Subversion UI Shootout" article is still relevant today. Although the article "Setting up a Secure Subversion Server" is geared towards Linux, the concepts can be applied to a Windows server deployment. For J2EE/Java EE environments, I found JavaSVN, described in "Configuration Management in Java EE Applications Using Subversion", to be an excellent framework for Java server-side development if you happen to need a servlet interface to your Subversion repository.
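
As a hedged taste of JavaSVN, here is roughly what talking to a repository from Java looks like. The package names reflect the open source releases I looked at and may differ between versions, and the URL and credentials are made up.

    // Hedged JavaSVN sketch: connect to a repository over HTTP and read the
    // latest revision. URL and credentials are illustrative only.
    import org.tmatesoft.svn.core.SVNURL;
    import org.tmatesoft.svn.core.internal.io.dav.DAVRepositoryFactory;
    import org.tmatesoft.svn.core.io.SVNRepository;
    import org.tmatesoft.svn.core.io.SVNRepositoryFactory;
    import org.tmatesoft.svn.core.wc.SVNWCUtil;

    public class LatestRevision {
        public static void main(String[] args) throws Exception {
            DAVRepositoryFactory.setup(); // enable the http:// protocol
            SVNRepository repo = SVNRepositoryFactory.create(
                SVNURL.parseURIEncoded("http://svn.example.com/repos/project"));
            repo.setAuthenticationManager(
                SVNWCUtil.createDefaultAuthenticationManager("user", "password"));
            System.out.println("Latest revision: " + repo.getLatestRevision());
        }
    }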

For configuration management solutions today, I will be migrating everything I do related to configuration management to Subversion. It is definitely a better CVS than CVS and, in my opinion, a good candidate for a modern open source configuration management system beyond just source code version control. I recommend that you use it for all your configuration management requirements if your organization is not averse to open source technology. Good luck.

Monday, May 01, 2006

Google's Latest Products In April 2006

Google has been busy in their software engineering division. They recently released SketchUp, a 3D modeling tool: "Google Launches Free 3D Modeling Software". SketchUp allows you to create 3D models and place them in Google Earth.

Google released Google Calendar in April 2006, and in the usual Google style it is dubbed a 'beta'. Well, if you have not had a chance to try it, you should. I have used many web-based calendar products like Yahoo Calendar, Lotus Domino, Mozilla Calendar, Sunbird, and even Microsoft Outlook Web Access. I will have to say that Google has done a good job of entering an already crowded market and has managed to make a dent and establish some traction.

Like their other recent products, Google has integrated many AJAX features, which makes this product really useful. I am most impressed with the ease of use and responsiveness of Google Calendar compared to the competition. On first impression, it is as responsive as Google Maps, which I suspect is not an easy feat. The software engineers at Google have been quite busy.

The ease of use is the most important aspect of the new Google product. Scheduling events and appointments is the easiest I've seen in any web calendar or desktop calendar. You just click on the date, and an AJAX-style window pops up allowing you to enter your event. This surprised me. The ease of sharing your calendar while keeping a private calendar all in one interface is innovative. This feature is much easier to use compared to others I have seen.

The intuitiveness of Google Calendar is the most impressive. To reschedule an event or appointment, you just drag and drop. Wow! All this from within my web browser.

For now, Google Calendar has become my favorite web-based calendar software. The integrated AJAX features and the responsiveness of the product are what impress me the most. There are still some refinements to be made, like integration with hand-held devices and mobile phones. I am sure these features will eventually find their way into Google Calendar. This will be an interesting product to watch evolve.

Friday, April 14, 2006

Enterprise Portals

Portals, SOA and web services are the new buzz this spring. Well, if you are involved in software engineering, you are probably involved in projects that deal with content management, search, portals, SOA and some form of web services to expose all this content. So what is a portal? For the context of this posting I am using this definition of a portal. For open source Java technology, I try to stay focused on JSR-168 compliant technology.
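
For readers who have not seen one, a JSR-168 portlet is just a small Java class written against the javax.portlet API. Here is a minimal hedged sketch of what compliance boils down to at the code level; the HelloPortlet class is my own illustration.

    // Minimal JSR-168 portlet sketch using the javax.portlet API; any
    // compliant portal (eXo, Liferay, Jetspeed, Pluto, ...) can host it.
    import java.io.IOException;
    import java.io.PrintWriter;
    import javax.portlet.GenericPortlet;
    import javax.portlet.PortletException;
    import javax.portlet.RenderRequest;
    import javax.portlet.RenderResponse;

    public class HelloPortlet extends GenericPortlet {
        protected void doView(RenderRequest request, RenderResponse response)
                throws PortletException, IOException {
            // a portlet renders a fragment of the portal page, not a whole page
            response.setContentType("text/html");
            PrintWriter out = response.getWriter();
            out.println("<p>Hello from a JSR-168 portlet.</p>");
        }
    }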

I have had the opportunity to spend some time researching portals lately and have found quite a few decent open source tools that are quite mature. eXo is an enterprise-level product used by many organizations. The eXo platform is a powerful JSR-168 compliant enterprise portal built from several modules. It is based on JavaServer Faces, PicoContainer, JBossMX and AspectJ.

One product I like, at least based on initial research, is Liferay. It leverages all the best Java frameworks (Struts, Spring, Hibernate, Velocity, WSRP, MyFaces, etc.) and provides a really mature open source portal product. It has been in development for six years and is currently at its 4.0 release.

Over at the Apache Portals site, they are brewing Pluto (the JSR-168 reference implementation), Jetspeed-1, Jetspeed-2, WSRP4J and Graffito. All the Apache products are actively managed and are evolving fast.

For additional Java open source portal technology, the following index contains a listing of quite a few more portal products. On the non-Java front, Plone looks promising. It is built with Python. There are quite a few more open source portal products out there, all of which are evolving.

On the commercial side, there is Documentum, a suite of enterprise-level web applications. Documentum is large and deep. It is another JSR-168 compliant product. I mentioned it in a previous blog posting, "SOA, Data Warehouses and Modeling". Vignette is another product which I am still researching. Lotus Domino is even converging with WebSphere and becoming a player in the portal market.

So which enterprise portal products are the best? With the complexity of portal technologies, that is a very difficult question to answer. I have been involved in the debate over whether Microsoft SharePoint is the solution. Based on my experience as a user of SharePoint, I think it is somewhat limiting and rudimentary as a portal compared to the technologies I have seen. Everyone seems to be evolving in the direction of some type of enterprise portal. I like to stay focused on JSR-168 since it seems to have the momentum in enterprise portal technology today.

Saturday, April 01, 2006

A Universal OS and new RAD Web Tool

Well, it looks like the operating system wars are coming to an end. Metaphorically speaking, this is the equivalent of the destruction of the Berlin Wall. It looks like, due to the overwhelming economics of fighting open source Linux, Microsoft has decided to embrace the open source model, restructure, and support Linux with all its energy. You could see this coming with the recent reorganization announcements within Microsoft's management ranks. Recall that this type of adaptation worked for Apple when it made a similar decision in the late 1990s for its Mac OS X project.

Economically, sustaining the battle against the open source movement was draining the corporate giant. With the EU on the verge of fining Microsoft $2.4 million a day for violation of EU antitrust rulings, this has proven to be too much business risk to Microsoft's bottom line. Microsoft did not have any intention of complying with any of the EU's rulings, which is exactly how they have operated in the US for the past 10 years. Lobbying the EU just does not appear to be working the way it has in the US.

In addition to the Singularity project that Microsoft R&D is working on, they have decided to join the Eclipse Foundation, support the Novell Mono project, and have announced that internally they have been working on a Mac OS X-like project where the Windows kernel will now be based on the Linux 2.6 kernel. The operating systems engineering group has been working on this for the past three years, and they are nearing the point where they can unveil their progress to the world in true Microsoft marketing style: PowerPoint presentations. We will start getting the marketing briefings shortly as Microsoft prepares to adapt. There is a rumor that this project was initiated to counter the growing competition between Microsoft and Sony.

Sony has been collaborating with Apple, IBM, and Toshiba on engineering a Linux-based operating system that makes heavy use of virtualization and can seamlessly run Linux, Mac, and Windows applications with ease. It is designed for the Cell chip, and we should be seeing alpha versions of this new, yet-to-be-named OS in early 2007. I even saw that it will run on the PlayStation 3, which makes a lot of sense.

Now you can see why a 'Universal OS' is on the horizon, or at least why OS convergence is happening on all fronts. All the major players are working on some form of Linux kernel-based operating system for the future.

On the software development front, the Eclipse Foundation and Mono have announced a secret project that will permit developers to build Java, .NET, PHP, and Ruby web applications using a universal web GUI/AJAX RAD plugin. Based on rumored information about early alpha versions, its capabilities can be summed up as the equivalent of what the initial Borland Delphi 1.0 did for GUI client/server development in 1995. It changed everything as far as RAD client/server development is concerned. As a matter of fact, I saw some early hints that IBM was purchasing Borland's IDE tools (Delphi, JBuilder), which are now on the market, primarily to keep the talented Borland software engineers employed within the Eclipse Foundation. Things that make you go hmmm.

I have been waiting for a 'RAD for the web' type of tool that will give developers back the productivity levels that were attained with Delphi/Visual Basic in the 1990s. In the past few years, open source tools have come to dominate the web development space outside of .NET. With the infancy of Web 2.0, it looks like tools consolidation and new dynamics have the prevailing winds blowing in the direction of the open source model. I personally have seen many decisions made in favor of this movement, allowing organizations to adopt open source technology where Microsoft once ruled. Look at what open source technology adoption has done for the emerging commercial space industry: "The Software of Space Exploration".

2006 is shaping up to be a year of innovation, excitement, and surprises, just as I anticipated. It is only the first of April, and yet so much change is happening! I anticipate the next nine months of this year being just as exciting.

Sunday, March 26, 2006

Product Delays Becoming the Norm?

If you have been following the news these past few weeks, there have been four major delays announced: the Sony PS3, Microsoft Windows Vista, Microsoft Office 2007, and Toshiba HD-DVD. Is this all a coincidence, or over-promised complex technology?

Well, based on what I have read, Sony said the PS3 delay is due to the final Blu-ray copy protection technology specifications not being ready for a spring 2006 delivery. OK, I'll buy this since it can be confirmed throughout the industry. However, the product delay here is due to the ratification of a specification, which in essence is part of the software engineering project. This is a dependency risk avoidance strategy and looks to be a sound management decision. "Playstation 3 delay - a good thing?".

Then you have Toshiba delaying their HD-DVD launch to wait for the release of HD-DVD content in April 2006. This makes sense also. Who wants to buy a device that has no content? I would not. From a marketing perspective, launching early would put a dent in initial sales. From a software engineering perspective, timing the release of a product to coincide with the release of content is a market risk avoidance strategy and looks to be a sound management decision.

Then you have the bombshell announcement that the already late Windows Vista will be delayed nine more months due to quality and security programming issues. Recall that Windows Vista is already two years late. I read an article stating that a possible 60% reprogramming of Windows Vista is required in order to deliver it by January 2007: "60% Of Windows Vista Code To Be Rewritten". This sounds like a rumor or FUD. But if there is even an ounce of truth to it and you are in the software engineering field, it sounds pretty significant. If you have to rewrite 60% of the code within the last 15% of the effort, then there are some major design quality problems within the project. Has the complexity of the product and the age of the Windows code base finally caught up with itself? Or is there some requirements creep happening internally to the project? Well, according to some blogs I've read, Microsoft is even moving programming resources from Xbox into the Vista project. This should be a challenge for both teams.

Adding more developers to an already late project just makes the project, well, even later. If you are familiar with Fred Brooks's essays on project management, then it looks to me like Microsoft is having some serious internal project management problems. For a consumer-level product, which Windows Vista has become, all these delays and excuses really do not go over well with the average non-technical buyer. They primarily affect the public's perception of your brand or company.

So are all these product delays now part of the norm in today's ultra fast-paced high technology world? If you look at proprietary software technologies, the answer depends on the size of the project. The larger the project, the more likely there will be delays due to the thousands of dependencies within it. If you look at the open source universe, the open source model appears to attack size and complexity in a different manner. Since there really aren't any open source projects as large as the Sony PS3, Microsoft Windows Vista or HD-DVD, this is yet to be determined.

Being the optimist that I am, I say the jury is out on whether project delays are the norm today. I think project size, over-promising, marketing hype, increasing complexity, and requirements creep are probably the real causes of delays. Each of these aspects is a manageable component of the software engineering and product development life-cycle process. Of course, this is much easier said than done.

Saturday, March 25, 2006

Next Generation AJAX Applications

I started this blog last year inspired by my first impression of an AJAX application, Google Maps, and how I thought it would disruptively change the future of software. Well, I got my first glimpse of that promising Web 2.0 future in the form of a beta AJAX application called AJAXwrite 0.9, an MS-Word clone developed by Linspire. You have got to see this thing.

I was totally surprised at how fast AJAXwrite loads: six seconds. This is due to its relatively small size of 400 KB, as stated on the website. It looks nothing like first-generation AJAX applications such as Google Maps and Yahoo! Mail, which tend to look and feel like rich browser-based applications and, in my experience, have a few issues with load times. AJAXwrite looks and feels more like a traditional native GUI application that utilizes the browser engine as its platform. In my case I am using Mozilla Firefox. AJAXwrite still has some maturing to do, but for a proof of concept, or beta as everyone calls these today, it really shows what is possible.

Just as I predicted, AJAX is the disruptive technology that changes everything on the web. If AJAXwrite 0.9 is the trend of things to come this year, we will see some exciting new web-based software that will redefine what a web application is.

I wonder what frameworks are being used to make AJAXwrite work. I need to do some research into this because, for such a rich AJAX application, it sure does provide a lot of the rich client widgets and capabilities that I am accustomed to building in non-browser GUI application development.

Wednesday, March 15, 2006

Web Services and Small Companies

Innovation continues to take place in small companies. In today's overexposed world of instant information, web services and IT in general, it becomes difficult to sort through the background noise of what is published today to find new products and services, especially from new or small startup companies. They just do not get the press.

I just read a really interesting article about a small company that has been able to monetize web services and do quite well at it. Google, Yahoo!, eBay, and Amazon all have web services in place; however, they are still trying to make them a profitable business. The article was published in the March 13, 2006 issue of InformationWeek, entitled "Web Services By the Dozen".

The small company is called StrikeIron, and it provides an online marketplace for web services for general use. StrikeIron acts as a sort of broker, letting organizations search for web services and establish deals to consume them through StrikeIron, in a business model similar to the one that makes Amazon and eBay successful. They are like an "80-pound gorilla" that dominates a field with few competitors.

In my local area, I recently had the opportunity to talk with a few small startup companies (micro companies) that are just as innovative. They are touting web services and their agility (as in nimbleness) as what will allow them to distinguish their products and services, be ultra-competitive, and grow in the future. After reading about StrikeIron, I think the creativity and innovation that will drive the future of IT are in the small companies. Google recently purchased Writely, yet another small company, whose AJAX-based web service allows you to use your browser to edit word processing documents in MS-Word, RTF, and OpenOffice formats.

The only threats these micro companies face are managing their own growth and being acquired too soon (i.e., by Google or others) while they are still half-baked. We live in yet another exciting period where innovation is driving new ideas, and most of these innovations are coming from the small companies.

Tuesday, March 14, 2006

SOA, Data Warehouses and Modeling

For the past few years, Service Oriented Architecture (SOA) and data warehouses have not been discussed within the same context. It is as if the two are mutually exclusive, or at least the analysts and vendors want you to perceive they are. This never really did make sense to me. I am not an analyst, so I really did not pay too much attention to this until recently, when I had to address questions in a discussion about SOA and data warehousing.

I have been involved in the data warehousing, data mining, and business intelligence domain for the past ten years, so the problems and technologies in that domain are familiar to me. I have not been involved in an SOA solution yet. As I did my research into SOA, I discovered that very similar problems exist. The deeper my research went into SOA, the more commonalities I saw between these two domains.

I just finished reading "Whipping Data Into Shape", published in the 02/06/2006 issue of InfoWorld. The premise of this article is that solving an SOA problem is very similar to solving a data warehouse problem. IBM, Informatica and Oracle have even gravitated towards what they call an operational data warehouse type of concept and architecture for their SOA solutions. This surprised me but makes sense. What is old is new again, just reborn with a new label and a few twists.

Defining a data architecture is coming to the forefront again in the SOA problem space due to the issues encountered over the past few years. The InfoWorld article talks about the concept of a Master Data Management architecture and provides a nice 50,000-foot graphical view of it. It makes sense on paper.

The primary problem is the metadata, which leads me to my other topic: what appears to be the forgotten art of modeling, more specifically information and data modeling. In my experience, the past six years have seen a form of backlash against modeling applications, systems and data. I think this coincided with the dot-com boom and the need to get web sites, services and applications up and running as quickly as possible. The agile technique movement appears to address this need.

Well, in an SOA or data warehouse, agile and fast architecture and design decisions will get you into 'deep kimchi' really fast. I guess that is what has been happening in the SOA industry lately, and a regrouping or rethinking of these rapid approaches is coming to the forefront again. This is a good thing. In the past few months I have heard many technical and management discussions centering on requirements, modeling, use cases, business rules, 'by design', doing it right, etc. with respect to SOA. Well, if you were involved with the last push for data warehousing and business intelligence in the late 1990s, then this is just a repeat of what was done in the past and appeared to work. It did for me, since the organization I work for has been managing a data warehouse quite successfully for the past nine years.

The twist on the massive amounts of data and services that have to be managed today is getting a handle on the context and semantics of it all. From an enterprise perspective, you are dealing with applications, files of all types, databases, web sites, and information all over the organization. What does it all mean from an SOA perspective? This is where a metadata repository and good models (data, information, semantics) are critical for success. The solutions and tools for this requirement are not yet built, at least from what I have seen. What is required is some type of enterprise-level content management, metadata, repository, and modeling facility. I am not sure who is going to solve this problem, but I think platforms like EMC Documentum are headed in the right direction. Whether or not they solve it remains to be seen in the coming years.

To keep it all simple, from the modeling perspective, I think getting back to fundamentals and building the models is key to success. If you can't define the problem you are trying to solve, then how can you measure your level of success? I am glad that old problem solving 101 still applies to today's much more complex world. I have yet to see my problem-solving skills fail me. I follow the KIS (keep it simple) or KISS (I won't spell out the acronym here) principle. If you can make the complex appear simple, you have a better chance of getting more people involved in understanding the problem, and thus ultimately helping to be part of the solution.

Sunday, March 05, 2006

IBM's Quasar Impact

Earlier this year I discussed the impact I foresaw the Cell chip having on the industry. Well, over at IBM the impact is huge. Just a few days ago, Fortune published an article describing the Quasar project at IBM: "IBM's Quasar: Is it the future of computing? The computer giant is betting a new chip and a reorganized corporate structure will make IBM exciting again."

The impact of what IBM learned on the Cell chip project over the last six years will be sweeping. The Quasar project is fueling an internal reorganization of IBM based on lessons learned from the Cell project and new sources of innovation within the corporate giant. The Cell concept came from Sony. It is a collaborative effort between Sony, IBM and Toshiba. It will power the soon-to-be-released Sony PlayStation 3, HDTV-related consumer electronics devices, and IBM blade servers.

Quasar can be considered the next release of the Cell and will be a centerpiece of all of IBM's future hardware designs. Future information technology will be image- and speech-centric instead of primarily text-based as it is today. In a heavily image- and speech-centric processing world, the power of Cell and Quasar will be what drives the innovation, from IBM's viewpoint.

Once Sony releases the PlayStation 3 this year, we will have more tangible evidence of what the excitement is all about from an IBM perspective. IBM is trying to introduce the 'wow' and 'excitement' factor back into its products, and with the Cell and Quasar chips the future again looks to be quite a fun place if all goes as planned.

Tuesday, February 28, 2006

Passing of a Great Software Engineer and Architect

Today was a sad day for me. I just read in the latest Software Development Times (March 1, 2006) issue on page 5 that the creator of Symantec's Visual Cafe and founder of Java software company M7, Mansour Safai, died on 2/9/2006. He was 43. The cause of death was brain cancer.

Safai was a brilliant software engineer/architect. While working for Logitech he created the Multiscope debugger. At Symantec he was vice president and general manager of their internet tools division. In 1997, he developed Visual Cafe, which was a multiplatform RAD tool for writing, debugging and deploying Java applets and applications. I did get to research and evaluate Visual Cafe in the late 1990s. I thought that it was quite impressive; Java at the time was just not mature enough for adoption at my organization.

If you are not familiar with these tools that Safai created, they are all highly capable and innovative. His later company, M7, developed NitroX, which was a very nice tool permitting real-time WYSIWYG development of Struts and JSF applications within the Eclipse environment. I had the opportunity to evaluate and try out NitroX last year (2005). What NitroX does behind the scenes is quite impressive. M7 has a patent on the algorithms implemented for coordinating and managing all the XML files in Struts, JSF, Eclipse and Java development.

Mansour Safai played guitar and was an athlete. He played competitive tennis in high school and enjoyed skiing, according to the SDT article.

The passing of a great software engineer, architect, and human being at such an early age is a great loss to humanity. His soul has moved on to the next level of existence and I am sure that he will continue his work in another realm.

This morning I offered a moment of silence and prayer for the passing of this great software engineer and architect. This news will probably go unnoticed with all the events happening in our modern world. If you happen to read this blog, out of respect for Mansour Safai please reserve a brief moment today in your life to remember him.

Wednesday, February 22, 2006

The Octopiler

How do you get the current and future generations of computer scientists, software engineers and programmers productive with parallel programming techniques? The Octopiler. "Octopiler seeks to arm Cell programmers" was the first announcement I have seen about simplifying the development of complex parallel programs for the new Cell chip. (See my January 2006 entry about the Cell chip, "The Cell Processor and the Future".)

The complexity of the new Cell chip and the need to write programs that take advantage of it were always known to be a programming challenge. Most computer science curricula simply don't teach the current generation of students how to write 8-way parallel algorithms. IBM's answer is a compiler called the Octopiler that uses some artificial intelligence to break down your program's algorithms and optimize them for the Cell chip's eight synergistic processing elements (SPEs).

What is most interesting about this new technology is that, in theory, it will make programming for a parallel processing chip as simple as writing single-threaded programs. At least that is the goal; whether it is realized in research and practice remains to be seen.
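To make the challenge concrete, here is a minimal sketch (my own illustration in Java, not Octopiler output or actual Cell code) of the partition-spawn-combine bookkeeping a programmer must write by hand today just to sum an array across eight workers. An auto-parallelizing compiler aims to generate this kind of structure for you from a plain sequential loop:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class EightWaySum {
        public static void main(String[] args) throws Exception {
            final int WORKERS = 8; // conceptually, one per SPE
            final long[] data = new long[8000000];
            for (int i = 0; i < data.length; i++) data[i] = i;

            ExecutorService pool = Executors.newFixedThreadPool(WORKERS);
            List<Future<Long>> parts = new ArrayList<Future<Long>>();
            final int chunk = data.length / WORKERS;

            // Partition the array and hand one slice to each worker.
            for (int w = 0; w < WORKERS; w++) {
                final int start = w * chunk;
                final int end = (w == WORKERS - 1) ? data.length : start + chunk;
                parts.add(pool.submit(new Callable<Long>() {
                    public Long call() {
                        long sum = 0;
                        for (int i = start; i < end; i++) sum += data[i];
                        return sum;
                    }
                }));
            }

            // Combine the partial sums back into one result.
            long total = 0;
            for (Future<Long> part : parts) total += part.get();
            pool.shutdown();
            System.out.println("total = " + total);
        }
    }

Everything in there except the two inner loops is parallelization overhead, which is exactly the kind of code the Octopiler is supposed to take off the programmer's hands.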

I predicted that the Cell chip would change many things in computer science in the near future, and this new compiler technology is the first tangible proof I have seen to date. IBM says the Octopiler technology is available today, in limited form, on 64-bit machines running Fedora Linux with a specialized version of the GCC compiler. The migration towards 64-bit computing just gained another incentive if you are interested in developing future applications for the Cell processor.

If you are in the game programming industry and want to develop games for the Sony Playstation 3, then you really do not have a choice but to start learning about the Octopiler and other emerging technologies surrounding the Cell processor. I have never worked in the game programming domain, so my viewpoint may be entirely wrong; after all, games have already been under development for the Playstation 3 for the past year or so.

Anyway, the Cell chip will keep our industry dynamic and interesting for years to come.

Saturday, February 18, 2006

The Softer Side of Product Design

There are many myths and misconceptions within (and especially outside of) the information technology (IT)/software engineering (SE) industry that this is primarily a technical business. In truth, software engineering is at most approximately one-third technical. The remaining two-thirds is the other 'grey' stuff, the softer side.

What I mean by the softer side is the non-technical aspects of the business of SE. During my usual early Saturday morning ritual of drinking coffee, perusing my favorite sites and reading various articles, I stumbled upon "Software Development's Evolution towards Product Design". Reading this entertaining yet insightful article made me think about the two-thirds non-technical side of SE. I really like all the "poo" depictions in the article's graphics.

The Lost Garden article describes the human and social side of software development that is coming into focus today, where technology and techniques have less impact on the success of a product. What comes to mind are modern concepts such as social engineering, social portals, human-centered design and other terms used today to describe the new products and interactions that occur with software and technology in the modern world of 2006.

Given that we know where SE has been, the future is the great unexplored territory. Evolving towards 'product design', as the above article states, is just one direction in which we are heading. This continuous evolution is especially relevant in the consumer product development industries (gaming, multimedia, entertainment). Other problem spaces in IT are not yet as mature and have different SE dynamics.

One area where the evolution towards 'product design' is not quite so significant is the 'grey business' of business intelligence, data mining, data warehousing, modeling/simulation and decision support. This is an SE problem space where designing products for 'emotional needs' is not necessarily as beneficial yet; 'practical needs' still far outweigh them. In this area the concentration is still primarily on information and knowledge creation vice a specific product that has a look, feel, and social behavior.

Over the long run, I think the emotional needs will eventually catch up to and surpass the practical needs in the 'grey business'. The continual evolution of information technology will permit this to occur, and the evolutionary software development model toward product design described in the Lost Garden article will eventually be realized. However, I am sure there will be other evolutionary SE theories, techniques and models that we will observe, discover and analyze, forcing a revisitation of the 'softer side'. Maybe this will be dubbed 'knowledge design', since some of the 'grey business' focuses more on knowledge and information vice products.

Saturday, February 11, 2006

The Eclipse of Modern IDEs

Five years after open source Eclipse was unveiled to the world as the future of IDEs by IBM, it appears that the future is imminent. Just this past week Borland announced that it is getting out of the IDE business to focus on its ALM business: "Borland To Dump JBuilder". According to the InfoWorld article (02/08/2006), "Borland to exit IDE business, focus on ALM", this is due to declining sales, income and profitability in the IDE business.

Eclipse appears to be on track to become the future of IDEs. Recent surveys now put it at the top of the heap of Java IDEs. Things have changed drastically in the software engineering business, as anticipated, especially in the business of IDEs. I recall reading articles a few years ago about how tool vendors in the future would either become an Eclipse plugin or fade out of existence. With each passing year and corporate announcement like the one Borland just made, we see this happening right now.

This trend will continue as far as I can tell. At one point a few years ago, Borland JBuilder, IntelliJ, and the long-since-gone Symantec product (I think it was called Visual Cafe) were the leaders. Today, each of these proprietary IDEs is either competing successfully for survival (IntelliJ), has been sunsetted (Visual Cafe), or is being sold off to better opportunities elsewhere (JBuilder). Sun's NetBeans product continues its evolution, heavily driven by the Eclipse phenomenon.

Many of the reasons for this are identical to my personal scenario for choosing Eclipse as my preferred Java IDE a few years ago while evaluating Java IDEs. Eclipse was young and free; JBuilder and its peers were expensive and hard to get, due to declining budgets rather than accessibility on the web. I downloaded Eclipse, experimented, learned how to use it and became proficient with it. Prior to Eclipse, I had used or evaluated VisualAge (yes, remember that IBM tool which was the basis for Eclipse), Sun Forte, NetBeans, JBuilder and even Visual Cafe. I never did get to try IntelliJ, although I have heard many good things about it. Using the free community editions or evaluation versions of the proprietary tools certainly biased my perception.

As Eclipse was evolving and improving (shining brighter), I was still trying to justify the funds for the proprietary tools. Then Eclipse 2.1 redrew the playing field in Java IDE performance and made the other tools appear very painful. I was able to get Eclipse 2.1 running acceptably on an archaic Pentium II 366MHz notebook with 384MB of RAM. Performance improved dramatically with Eclipse 3.x, which is dominant today. As far as IDE pain is concerned, Sun Forte and early versions of NetBeans were the worst; the sole reason I never used those early versions was their painfully slow user interface performance.

At any rate, I could go on about my IDE experiences over the past few years, but this posting is about continuous change in modern IDEs. Eclipse has fully embraced the open source model and after five years has risen to the top by being agile, open, adaptable, reliable, and focused on performance. I remember all the early debates about Swing vs. SWT, and it appears that performance does matter, at least to the voting public (developers). When it comes to IDEs, regardless of the language or environment, I have always chosen performance over features. After being in this business since the late 1980s, it looks like going with my instinct about IDEs has paid off again.

For the future, I would postulate that Eclipse will continue its ascent and start spreading its wings, providing more lift and allowing it to fly higher and farther. IBM is already migrating much of what it does into some type of Eclipse plug-in (Rational, Domino/Notes, etc.). Use of Eclipse with other languages like C/C++, Ruby, PHP, Python and Perl will mature. Microsoft, Sun, Oracle and whoever else is left in the IDE business will all have to keep their eyes on Eclipse so they don't fall into the Borland scenario. This will keep the IDE business very interesting for years to come.

As for the modern IDE landscape, I think Eclipse has a bright future. That is where I put my vote for now.

Sunday, February 05, 2006

Frameworks

This is one of my favorite topics of late, especially in the web development and Java universe. If you are working with Java or any of the object-oriented dynamic scripting languages (Python, PHP, Ruby), then you are most likely working with a set of frameworks. If you are not using frameworks, then I am not sure which of us is in the better situation today.

I just read a blog posting, "Why I Hate Frameworks", which pretty much sums up the state of frameworks today in the Java universe. What is ironic about this posting is that most of my experience with frameworks fits the hammer metaphor described in the article. I dub this 'framework hell', which is analogous to the 'DLL hell' that exists in the Windows universe today.

If you are not currently working with a language/framework set or are just getting involved, then you will encounter the framework complexity and integration fiasco sooner or later. When you do, all this will make sense to you. The general-purpose framework days are over in modern software development. Frameworks are the trend for solving just about everything today, and you can find at least a dozen frameworks to solve every single little part of your application in a highly granular fashion. Note that not all frameworks work or play well together. Most of them are designed independently of one another by teams of programmers that may not even be aware of each other's framework efforts or existence. This is the root of the problem.

The complexity and state of frameworks today is mind boggling, at least to me. Every few months, there is significant progress in the frameworks that I use or the frameworks I am researching to justify a possible change in technical direction. I don't mean just following the technical winds of the day; many of these advances are significant enough to contemplate a disruption in forward technical progress.

Obsolescence of a framework is a real risk. In the last four years in the Java community I have seen some frameworks and APIs become obsolete as a newer, better, faster, lighter technique picks up momentum and takes over (e.g. Spring, AJAX). This continues today in 2006 as the convergence of AJAX techniques takes shape. Recall that AJAX was a disruptive technique in 2005.

Maybe things will get better later this year or next; however, for the foreseeable future, I think the state and complexity of frameworks will continue on course. What course that is depends on the researchers, programmers, practitioners and companies driving this forward progress. Don't get me wrong, frameworks do work well when appropriately understood, applied, tested and proven. It is just that the frameworks you use may become obsolete sooner than you think, leaving you with deprecated technology.

I guess this is the cost of rapid forward progress. I am not complaining about it, just stating that rapid innovation in and of itself can be like one huge experiment that can shift and change directions without any warning. One month you are using the best of breed, and the following month you are re-engineering, refactoring, or simply researching better techniques which may cause you to consider making changes soon. If you follow framework integration best practices (I am not sure which ones; there are so many, and everyone has their own viewpoint), loosely couple, design for change, use agile techniques, and keep an open mind, you will probably be successful in your framework endeavors. Good luck.
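As a concrete illustration of 'loosely couple, design for change', here is a minimal, hypothetical Java sketch (all the names are my own invention): the application programs against an interface it owns, so a deprecated framework costs one adapter rewrite instead of a re-architecture.

    import java.util.HashMap;
    import java.util.Map;

    // The application depends only on this interface, never on a framework type.
    interface DocumentStore {
        void save(String id, String content);
        String load(String id);
    }

    // In-memory stand-in; a real adapter would wrap whichever persistence
    // framework is in favor this year, and only this class would change.
    class InMemoryStore implements DocumentStore {
        private final Map<String, String> docs = new HashMap<String, String>();
        public void save(String id, String content) { docs.put(id, content); }
        public String load(String id) { return docs.get(id); }
    }

    public class StoreDemo {
        public static void main(String[] args) {
            DocumentStore store = new InMemoryStore(); // swap implementations here
            store.save("note-1", "frameworks come and go");
            System.out.println(store.load("note-1"));
        }
    }

It is not a silver bullet, but it does confine the framework churn to the edges of your application.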

Sunday, January 22, 2006

The Cell Processor and the Future

On the cover of the 30JAN2006 issue of Forbes magazine is the new Cell processor, jointly developed by Sony, IBM, and Toshiba. The new Cell chip, or '64-bit 8-way supercomputer system on a chip', is what Sony is using for the Playstation 3 (PS3), scheduled to debut this year (2006). The Forbes article, "Holy Chip!", is a great case study of the history of the Cell processor's development and reveals some of its possibilities for the future beyond the PS3.

This chip has been in development for the past five years, starting as an engineering challenge from Sony to IBM back in 2000 just as the Playstation 2 (PS2) was being launched. Sony executives wanted a 1000x increase over what the PS2 could do. Apparently the IBM, Sony and Toshiba team managed a 50x increase over the PS2, which pushed the capabilities of all the computer engineers and scientists involved to the limit. I suspect that a Cell chip version 2, or some future iteration, is probably in development and will attain the original goal of a 1000x improvement over the PS2 chip, the Emotion Engine, within the next 2-3 years.

The design and architecture of the Cell processor took many years and has pushed and shifted the envelope of chip technology. "IBM's CELL Processor: Preview to Greatness?" If all goes as planned, this chip will not only be the centerpiece of Sony's and Toshiba's digital entertainment devices, it will drive future computers and sensors. According to the Forbes article, Raytheon studied the chip for 15 months and decided to use it in future weapons systems, primarily because of its order-of-magnitude graphics performance advantage over all existing technologies.

The Cell processor can render dynamic, computer-generated, full-motion imagery at a full 30 frames per second at high-definition resolution in real time! This is significant because photo-realistic computer graphics imagery becomes possible in real time using the Cell processor, which will open many new doors of innovation for previously unimaginable product possibilities. One of the Cell-based demos developed by Toshiba, dubbed "Magic Mirrors", turns your LCD monitor into a virtual real-time mirror: the demo simulates a real-world mirror in software. This is only possible due to the advances in the Cell processor. Gaming and entertainment will not be the same.
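For a rough sense of the scale involved (assuming 1920x1080 resolution, which the article does not specify), the raw pixel throughput works out as follows:

    // Back-of-the-envelope pixel throughput for high-definition real-time rendering.
    // The 1920x1080 resolution is my assumption; the article just says "high-definition".
    public class PixelRate {
        public static void main(String[] args) {
            long pixelsPerFrame = 1920L * 1080L;        // about 2.07 million pixels
            long pixelsPerSecond = pixelsPerFrame * 30; // about 62 million pixels/sec
            System.out.println(pixelsPerSecond + " pixels rendered per second");
        }
    }

Over 62 million photo-realistic pixels computed every second, rather than played back from storage, is what makes a demo like "Magic Mirrors" possible.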

IBM has published research papers about the Cell project ("The Cell project at IBM Research"), which is normally not how IBM does business. I guess IBM is making changes and getting the word out on this one long before products exist, given that Sony will soon have its PS3 on the market.

I'd like to see how the Cell processor will change the personal computing and supercomputing arenas. Given that the Cell is designed for grids and massively parallel processing, the next few years will be quite interesting as new products appear.

At any rate, there has not been a huge architectural shift in chip design for the past 20 years. The new Cell chip required a completely new architectural and design approach compared to past chip designs. It is definitely innovative, and time will tell what type of impact it has on the computing industry.

Friday, January 20, 2006

Continuing Open Source Technology Inroads

Evidence is mounting that open source technology adoption is picking up steam. If you use Mozilla Firefox as your web browser or OpenOffice.org for your word processing, spreadsheet, presentation and desktop database needs, then you are using open source technology. As a matter of fact, open source is what is driving innovation in information technology today.

In Europe, a recent report stated that over 30% of users are now using Firefox, and the figure is growing daily. The percentage is highest in the Scandinavian countries, which makes sense because that is the area where Linux originated. These percentages are much higher than I expected so early in 2006. Back in May 2005, 10% of business users were using Firefox; eight months later, in January 2006, I think this figure is approaching the 20% mark, and according to this site, that mark was attained in December 2005.

Not only is open source use expanding, the cost savings realized from its use are becoming more widely accepted: "Consultants report corporations embracing, saving with open source". This is not a panacea for reducing costs, since there are inherent additional costs involved with open source technology adoption. But if you have skills with Linux and a good, intelligent team that can readily assimilate the technology and make good decisions, then you will probably realize similar cost savings.

Another benefit to open source technology is innovation. With open source technology, you are only limited by your motivation, knowledge and ability to apply your skills to solve business problems or create new products. This cycle inspires the creativity required to innovate with information technology. With many proprietary technologies you are limited by concerns like

  • Do you have legal licenses?

  • Are the licenses expired?

  • Do you have enough licenses?

  • Do you have the funding to get licenses?

  • Do you have time to justify the cost for the licenses?

  • When will you acquire the technology so you can put it to use?



All of the above just creates roadblocks to innovation, in my opinion. Instead of focusing on innovative solutions, you tend to focus on the licensing costs. This is discussed with respect to the weapons and defense industry in "Why open source works for weapons and defense".

I made the complete jump to using, integrating and researching open source technology a few years ago and can gladly report that all these facts, figures, percentages and postings are real, based on my experience. I can confidently state that running a business heavily utilizing open source technology is possible today in 2006. If this were not the case, Google, IBM, Yahoo! and Amazon would not be successful today. It is also well known that these companies spend millions on research and development of open source technology; indeed, these are the types of companies that are primarily funding and driving the open source revolution.

There are scenarios where proprietary technologies are much more mature and solve the business problem better. This is particularly true in the multimedia and groupware arenas. However, even in these areas I am seeing evolutionary improvements in open source technology that I am sure will rival the proprietary products in the near future. This viewpoint is also discussed in "Open Source's Commercial Future".

Open source technology does have its own types of associated hidden costs and requires much more skill than proprietary solutions. Mitigating these costs requires doing your homework, assembling a skilled team and making sure you know your business requirements. Having skilled open source personnel on staff is a must, and acquiring those skills takes a different breed of knowledge worker. Open source technology is definitely not Windows, and it requires good multi-dimensional people to make it work.

Friday, January 13, 2006

Security in 2005 and Linux

Information security in 2005 was really bad. Actually, it was the worst year on record, as anticipated. Leading the pack in perceived insecurity is Microsoft Windows. The monthly 'Patch Tuesday' has become a beacon and a target for hackers. We saw the realization of zero-day exploits and Microsoft's slow response to well-known vulnerabilities.

There were a few bright spots: "Linux Security: A Good Thing Keeps Getting Better". If you happen to be using Linux, then 2005 was not a bad year and was actually somewhat predictable. If you are using Windows, then 2005 was a really bad year.

If you can anticipate, predict and adapt to vulnerabilities, then you have a better chance of defending your systems. Comparing Linux to Windows, I would have to say that Linux is the environment where anticipating threats is easier.

I am involved in securing both Windows and Linux systems at work and at home. From my viewpoint over the past year, I was more concerned about the Windows machines than about my Linux machines. One of the nice things with Linux is that I don't have to continually worry about mail-bomb viruses, macros and various scripts invading my network. The recent WMF flaw is an example of what is lurking inside of Windows.

The permissions model in Linux/Unix, and especially in SELinux, is much more robust than what Windows provides. So for the next year, I still feel that running Linux systems is a much safer way to compute than running Windows. We are in January 2006 now, so all that can change with another exploit against Linux or Windows. If I were a betting person, I would place my bet on Linux being more secure. I am not a gambler, though, so I prefer to remain adaptable and flexible to address any potential scenario.

Maybe it's time to get a Mac in addition to my Linux and Windows machines! Now that would round out the platform scenario to include another architecture and make my computing networks highly diversified.

I guess that's the type of computing world we live in today. Change is rapid and constant. At any time or even overnight a security issue can manifest itself thus requiring some type of remediation. At least this stuff is not going to bore anyone anytime soon.

Data Warehouse Growth (2006) Keeps Going and Going

Databases, and more specifically data warehouses, are growing at an ever faster rate. "Data, Data, Everywhere" provides some metrics on the largest known data warehouses. The size of these complexes is roughly doubling every 12-18 months. We have seen this rate, and even greater, where I work. Wal-Mart is approaching 600 terabytes today (Jan 2006) and is projected to be above the petabyte mark later this year.
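As a quick sanity check on those figures, here is a small Java sketch projecting 600 terabytes forward under the doubling assumption (the 15-month doubling period is my midpoint of the article's 12-18 month range):

    // Projects warehouse size under exponential growth: size(t) = size0 * 2^(t/doubling).
    public class WarehouseGrowth {
        public static void main(String[] args) {
            double startTerabytes = 600.0;  // Wal-Mart's reported size, Jan 2006
            double doublingMonths = 15.0;   // midpoint of the 12-18 month range
            for (int month = 0; month <= 24; month += 6) {
                double size = startTerabytes * Math.pow(2.0, month / doublingMonths);
                System.out.printf("month %2d: %.0f TB%n", month, size);
            }
        }
    }

At the fast end of the range (a 12-month doubling), 600 terabytes crosses the petabyte mark (roughly 1,000 TB) in under nine months, which is consistent with the 'later this year' projection.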

I remember in the late 1990s when a terabyte was a milestone. Today, most of the larger data warehouse complexes are approaching the petabyte mark. EBay and Yahoo each have over 100 terabytes today. Google is not mentioned; as usual, they are relatively quiet about their metrics, but I would expect Google to be in Yahoo's range or larger, and the same goes for Amazon. Yahoo is mentioned as the largest commercial data warehouse, based on the Winter Corp. survey conducted in mid-2005.

One common architectural characteristic of all these large data warehouses is the use of massively parallel clustering. The Wal-Mart complex is a massively parallel 1000-processor system; there are no details about how many server machines are used. Rumor has it that Google runs a massively parallel system of 100,000 servers based on open source technology. The power consumption of a system like that is mind-boggling on its own.

The growth rates mentioned are staggering. Wal-Mart adds over a billion rows of new data a day. EBay adds approximately 750k rows a day. "Database Lessons, Petabyte Style" mentions a Stanford University research database (Stanford Linear Accelerator Center) that was adding 500GB, yes gigabytes, of data a day in 2004. That means every 29 days (roughly 14.5 terabytes) they were accumulating data equal to all the books in the Library of Congress! Whoa!

What do all these data warehouse complexes have in common? They all require massive clusters of servers, and all have issues with managing their storage capacity. As Inmon put it, 'volume, volume, volume'. That is, and always will be, the #1 problem with data warehousing.

Monday, January 09, 2006

Organizational Culture and Environment

I was just reading Fortune's 100 Best Places To Work 2006 this morning and read about the top company, Genentech. They are one of the low-key Bay Area (as in San Francisco) companies that has risen to the top through its unique culture and environment. Genentech is a company for people who are smart, work for the greater cause, and are not necessarily pro-business. Its secret to success is its culture.

"Genentech: The best place to work now" This company has a very flat hierarchy, no titles, no special parking spaces, no dress code, has many onsite services (day care, Friday parties, etc.), spends a large percentage of its profits on research (the article states approx 40-50%), and has a culture and philosophy that prospective employees are carefully screened before hired. What is interesting is that Genentech's culture is compared to Apple and Google who are also low-key and high visibility companies. As a matter of fact, Apple does not even participate in the Fortune annual survey and Google is too young as a public to participate.

The Genentech CEO has a 9x12 office with low-end steel furniture. For a multi-billion-dollar major biotech company, this reminds me of a Wal-Mart-like mentality. If you get a chance, read "Made in America", the book about Sam Walton, and you will see the similarities; Wal-Mart has a very similar unique corporate culture.

Genentech stays laser-focused on its roots: research, innovation and the drive toward a cause greater than the individual. Everything environment-related appears to be centered around collaboration, progress, innovation, and value-added products.

So what does this have to do with software engineering? Everything. The culture and environment of an organization is critical to its long-term success. This is something that organizations like Genentech, Apple and Google understand. If you are in a highly competitive field like IT services, the internet, or biotech, you have to keep innovating in order to survive. You must be able to attract talent that shares your values and vision and remains focused on and passionate about what you are doing. In Genentech's case, that is research looking for cures to tough problems like cancer.

There are several software engineering books that spend a great deal of time discussing how the environment affects the productivity of workers. In the software engineering field specifically, "Peopleware" (DeMarco, Lister) comes to mind. Others discuss the impact of software engineering cultures and environments, such as the quintessential "The Mythical Man-Month" (Brooks). That timeless classic talks a great deal about mentalities, culture and environment back in the 1970s.

It seems as though the companies that are extremely successful (i.e. Apple, Genentech, Google) have applied the lessons learned from the past and make it happen today in 2006. As the saying goes, those who forget history are doomed to repeat it. Conversely, those who study history will learn from its lessons and will have the insight to steer a course to the future.

Tuesday, January 03, 2006

Forecasts for 2006

Good, you made it through to the new year, like I did. Like everyone else, this is the time to look out to the horizon and predict what may happen in the next 12 months. Notice I did not say crystal ball, because that creates an imagery of mysticism, and the technology field is not mystical. At least not in my viewpoint.

At the start of 2005, AJAX was not even on the radar, yet it became the hottest topic of the year for web development. So what is on the radar this year? I'd put my bets on dynamic web development (i.e. Ruby on Rails, PHP frameworks, and probably something like Trails for Java). Web platforms like Amazon.com, Salesforce.com and others will emerge and become dominant. Google and Yahoo are not sitting still, opening their services and APIs for building applications on their platforms. Microsoft, with its aging .NET and emerging Windows Live initiatives, will follow this wind.

The dynamic OO languages will continue to make strides and gain mind share, especially Ruby on Rails and frameworks like it.

Java will continue to get more complex with its frameworks and technologies. Hopefully, Ruby will influence the course Java is charting and steer it towards simplicity and ease of use, which is definitely not where Java is today. The maturing of an open source Java VM and of component-based frameworks like Apache MyFaces could make this happen. For enterprise development, Java will continue to remain significant.

The web platforms (Amazon, EBay, Google, Salesforce, Yahoo) will make the operating system irrelevant. This may be the year that Windows loses its monopolistic grip; the desktop is becoming less relevant. At least I like to dream big. The Mac using Intel processors will have a huge impact if Apple does it right.

The web platforms use web services, APIs and widgets to provide you with all the tools you need to build applications. What about all the hype surrounding web services? Web services are the interim building parts for the web platforms; they will continue to exist and will evolve into the building blocks of the web platforms.

In the browser space, Firefox will continue to chip away market share from IE. It is already used on as much as 25% of computers in Europe and approximately 10% in the USA, although these figures are unconfirmed since such metrics are not easily gathered. I suspect it will do the same in the rest of the world. In Europe, penetration may even go as high as 35% by the end of 2006. Overall, I will predict a 20% worldwide market share for Firefox by December 2006.

Virtualization will continue its ascent, and VMware and Xen will emerge as leaders in this field. This is where 64-bit and 128-bit computing makes sense. It's time to get 64-bit processors; AMD is shipping them and they are relatively inexpensive. The next machine I purchase will be 64-bit.

Security issues will continue to plague the IT industry. 2005 was the worst year on record, and 2006 is going to be a lot worse. Zero-day exploits and the continuing evolution of organized cyber crime make this a profitable business for the 'dark side'.

Google will do some amazing things in the next year. Most of it will be unexpected yet sweeping, as it was in 2005. It is only 1/3/2006 and there are already rumors about a low-end, non-Windows Google PC to be sold at Wal-Mart. What is next? Maybe an AOL/Google web platform. Google remains secretive, and only Google knows where it is going.

Blu-ray and the Sony PS3 will land this year. This is going to be the start of the mass migration to high-definition everything, not just HDTV.

That's my first shot at forecasting the next year. This is a dynamic industry, and I am sure there will be a lot of unexpected events and technologies waiting to be unleashed on the world. Happy New Year!