Thursday, November 10, 2011

Three common mistakes that plague IT projects

I have been involved with IT for several years now. Since my high school years I've seen many IT initiatives fail miserably (disastrous implementations, terrible solutions, abandoned projects, and so on), and I've seen others become tremendous successes.

I recently came across a blog post by Ty Kiisel about common mistakes that plague IT projects. Looking back, I can identify, in one way or another, with the mistakes he mentions.

For example, he talks about project managers setting unrealistic deadlines for the team. The author suggests that while some projects require a hard deadline, most of them don't. In my experience that holds true; at the very least, you shouldn't set unrealistic deadlines, especially when you aren't expected to set any deadline at all.

I remember a time when one of our clients wanted to implement a new 3D scanning system in his manufacturing plant as part of a new quality control system. The system consisted of a combination of cameras, sensors, and software that took several pictures of an object and compared them against a previously defined "quality" product. The problem was that it was the first time anybody on the team had tried to "cluster" the cameras to shoot at the same time, or to automate the cameras' functions programmatically. The client was very interested in the project because it would speed up quality control without requiring more staff. During negotiations, my team leader committed to delivering the finished product on what we thought would be a very tight schedule, assuming we knew the technology. When we found out what we were actually up against, we discovered the deadline was totally unrealistic. Fortunately, the client was gracious enough to accept a delay of over four times the promised delivery time. Lesson: never commit to a deadline (especially an unrealistic one) just to impress people at the beginning; in the end you will impress them in exactly the opposite way.

Kiisel also talks about risk not being managed, and how ignoring a risk does not make it go away. I worked with a partner on a significant academic project as part of our thesis: a vehicle traffic simulator running on several computers at the university. We had heard how a recent electrical failure in an adjacent computer lab had fried two research servers used by another group's thesis project, and how they lost nearly two years of research to the resulting data damage. We were aware of the risk and agreed we should take precautionary measures to avoid data loss. We made backups of our data the following week, then forgot about the matter as time went by. Nearly six months later the same problem happened again, but this time the fried computer was our server. If it hadn't been for a backup made a week earlier by an automated backup system recently installed by the Research Department, we would have lost a tremendous amount of time and data critical to our final project.

Finally, Kiisel talks about the mistake of not involving stakeholders in the project. I have seen this several times in third-party IT projects. For example, I remember how a supposedly high-tech emergency communication system installed at the school where I worked failed miserably. The US Department of State, through the US Embassy in my country, gave a substantial grant to install an emergency communication system in every classroom. The sponsor unilaterally decided to outsource the project to an external company, which did the installation over a period of two months, working on weekends. The company never interviewed any of the stakeholders (faculty members, support staff, students, etc.) to learn about our needs, the stakeholders' technical proficiency, or other relevant factors. The result was a high-tech system so complicated that nobody could actually operate it. The old system (walk over and notify the nearest secretary, even if she was in a different building) was brought back, and the new system was abandoned, left installed as a symbol of failure. Personally, I think there was more to this case than bad project management, as I suspect the sponsor had some dubious interests in the company and the technology chosen. That suspicion grew stronger when he was fired soon after the failed implementation.