Every year, tens of thousands of senior-year Computer Science undergraduates work on year-long projects. Most of them go to waste after the course ends. Surely, people learn a lot from these projects, but at the end of the day hundreds of man-hours are wasted along with the code.

Code is trivial but “the change” isn’t. These senior-year projects could be sustainable, in a way that the authors, academia, or the public benefit from them. Here are some stats about the projects done at my school:

• Research projects: 8. These are mentored by professors, and at the end the students leave something behind them – even if the source code is crap and no engineering practices were applied.

• R&D: 2. These are really interesting proof-of-concept projects. They prove that there is room for progress in that area.

• Productized projects: 2. These are completed and ready to be used by people: well-thought-out, well-designed ones that have a chance to be sold to a company.

• Total waste: 15. These don’t solve any problem in the world, and nobody will ever use them, including the authors.

In the end, from a pragmatic point of view, it was

Total Waste > Something Useful. I am pretty sure these fellows learned a lot while doing such projects. And although at least half of the research projects were inconclusive, they give the students an idea of academic life. This is really cool.

The problem is: why don’t we build stuff that is useful and that we learn a lot from? What prevents us from doing such projects? Here are a few points.

  • Academicians don’t know the real world. A project should not be managed by professors unless it is entirely an academic research project. 90% of academicians suck at developing products. They don’t see anything beyond crappy user interfaces; a considerable number of them don’t even use smartphones. They are conservative and usually have no idea how the real world works. So building real stuff is usually not their thing. No offense, that’s the truth.

  • Students don’t find real problems. Okay, you came up with an idea. That’s great. If you are solving a “real problem” or making something simpler and more useful, that’s it. At least you made an improvement. You don’t have to earn money or anything; you solved a problem, and that is what engineers are supposed to do. If you are solving an artificial problem, well, I might say you are just masturbating.

  • Students don’t learn to contribute. If you are wasting your time on a project that has no future, consider contributing instead. There are lots of open source projects out there waiting to be improved. I believe developing a big feature is much more valuable than pouring months of work into waste. Reading someone else’s code is difficult, but that is the real challenge.

  • **Academic projects should be open-sourced.** I haven’t heard of anybody coding missile launchers or making significant progress on cancer detection projects. Also, none of the research projects are funded or will ever bring in money. So why don’t we contribute to the public domain as open source science? This is not just about my school but about most Computer Science schools. There is always room for improvement in science, so why don’t we set our aims on making the world a better place? At the very least, can’t we simply disallow waste projects?