Saturday, March 14, 2015

Tale of Two Speaking Engagements a.k.a. What is Wrong with Linux

We are very grateful for the two events we participated in during the first quarter of 2015.

The first was IT Mania 2015, held on 31 January at AMA University, Quezon City, organized by the Junior Philippine Computer Society (JPCS) led by Marigold Cardeño, in coordination with Freelancer's community manager, Nikko Magalona. I was a JPCS Research and Development Committee Director during my college days, so it felt like meeting old friends.

The second was the Web and Database Development seminar, held on 27 February at Panpacific University Northern Philippines (PUNP), Urdaneta, Pangasinan, organized for and by PUNP Computer Engineering students, namely Joemar F. Arreola, Butz M. Bautista, Marlon E. Daileg, Aljon S. Equila, Khert Patrick V. Gumangan, Julius P. Manangbao, Junipher Manangbao, Greggy T. Mendoza, Andrew Martin M. Millan, Don Lorence C. Panayon, Mark Oliver G. Rabena, Ralph Christopher S. Sadio, Darien Vaughn E. Soria, Christian Justine E. Velasco, and Mark Jayson E. Zambale.

Moral lesson: if you want to be seen in the group photo, don't stand behind a brightly lit monitor, and don't be X-Men's Nightcrawler.
Hopefully, what we imparted will prove relevant and valuable as they move forward into their careers in the IT industry.

Attending these events, we came to realize and enumerate what is wrong with Linux, or GNU/Linux to be politically correct:
  • Inconsistent software development processes and practices across distributions
  • Unreliable hardware support, even on distributions using newer kernel versions
  • Varying support library versions across distributions
  • Difficult skillset transfer between distributions
  • The distribution standard (Linux Standard Base) can be sacrificed for market share
  • Performance can be good, but security is not a top priority
  • Various distributions target the same problem domain, with varying degrees of effectiveness
  • Inconsistent and non-comprehensive documentation

Inconsistent software development processes and practices across distributions
Each GNU/Linux distribution has become a silo unto itself, each with its own, mutually incompatible way of solving the same problem. Because every distribution maintains its own patches for bugs and security updates, a single problem gets solved in multiple, not necessarily effective, ways, which ultimately does not benefit the GNU/Linux community and its ecosystem as a whole.
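
One way to see this fragmentation first-hand is to compare how each camp documents its patches to the same upstream package. A minimal sketch, assuming openssl is installed on an RPM-based system and on a Debian-based one:

    # RPM-based (Fedora, CentOS): the changelog embedded in the package
    # records that distribution's own patches and backports
    rpm -q --changelog openssl | head -n 20

    # Debian-based (Debian, Ubuntu): fetch the distribution changelog,
    # which records a separate, independent patch history
    apt-get changelog openssl | head -n 20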

Unreliable hardware support, even on distributions using newer kernel versions
Back in early 2001, Red Hat Linux 6 was new, shipping the hot and fresh 2.4.x kernel, while Slackware Linux 3.5 was considerably older, still on a 2.0.x kernel. Guess which distribution recognized most, if not all, of the IBM hardware my previous employer had bought at the time? If you guessed Slackware 3.5, the mind-boggling question is how an older distribution with an older kernel was able to detect the hardware. Red Hat Linux 6 could not recognize it until we moved to the incremental 6.x releases; either way, how did Slackware manage it? The lesson: the overall structure of the system matters more than having the latest and greatest Linux kernel version.
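
When hardware goes unrecognized, the kernel version alone tells you little; what matters is what the running system actually bound drivers to. A rough diagnostic sketch, assuming pciutils is installed:

    # Which kernel is running -- newer is not automatically better
    uname -r

    # List PCI devices and the kernel driver bound to each; a device
    # with no "Kernel driver in use" line went effectively undetected
    lspci -k

    # Scan kernel boot messages for probing and firmware failures
    dmesg | grep -i -e error -e firmware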

Varying support library versions across distributions
There are free and open source distributions, and then there are free and open source distributions, and therein lies the rub. Different GNU/Linux distributions have different release schedules, and with them come varying degrees of support for certain libraries. Take PHP5 as a case in point: on Ubuntu it could be the bleeding-edge 5.4.x as of this writing, while on CentOS it is still 5.2.x for the sake of stability. Two or more GNU/Linux distributions may be contemporaries, but deploying a common piece of software across them becomes a support library nightmare when the required library version differs from one distribution to the next. Consequently, other free and open source software ends up written against specific library versions, depriving the remaining distributions simply because they ship older or incompatible libraries. The Linux From Scratch project further confirms this problem.
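
Before deploying, the only safe move is to check what each target distribution actually ships. A minimal sketch using the PHP5 example above:

    # What the installed interpreter reports
    php -v

    # What the distribution's repository would install:
    apt-cache policy php5    # Debian/Ubuntu
    yum info php             # CentOS/RHEL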

Difficult skillset transfer between distributions
As if variation in support libraries across GNU/Linux distributions were not bad enough, the variation in skillsets between distributions poses another concern. Linux distributions currently fall into two broad camps: Red Hat-based and Debian-based. Red Hat-based distributions, like Fedora and CentOS (openSUSE uses RPM as well), rely on the Red Hat Package Manager (RPM) to manage software, while Debian-based distributions, like Ubuntu, Linux Mint and, well, Debian, use apt-get and related tools. Even distributions sharing a package management system differ in their package repositories, which can and will cause trouble, not to mention the variation in support libraries and package versions in each distribution. And we have not even touched on Ubuntu's PPA package distribution, which adds another dimension to this concern. Need to start a particular server daemon? Red Hat-based distributions require steps that differ from those on non-Red Hat-based systems.
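
To illustrate the gap, here are roughly equivalent day-to-day operations in each camp; note that even the package names differ (Apache is apache2 on one side and httpd on the other):

    # Debian-based (Debian, Ubuntu, Linux Mint)
    apt-get update              # refresh repository metadata
    apt-get install apache2     # install the Apache web server
    dpkg -l | grep apache2      # query installed packages

    # Red Hat-based (Fedora, CentOS)
    yum check-update            # refresh metadata and list updates
    yum install httpd           # the same web server, different name
    rpm -qa | grep httpd        # query the RPM database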

The distribution standard (Linux Standard Base) can be sacrificed for market share
If breaking the Linux Standard Base is a severe crime, Red Hat drew first blood and is guilty of high treason. Other distributions followed suit: Red Hat broke it and became a commercial success, so why couldn't they? This becomes apparent when you have to start a daemon: on some systems the script lives in /etc/rc.d/, on others in /etc/rc.d/init.d/, and on others still in /etc/<daemon name>/.
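
A concrete taste of that divergence, assuming pre-systemd systems of the era with Apache as the daemon:

    # Red Hat-based: service wrapper, chkconfig for boot-time enablement
    service httpd start
    chkconfig httpd on

    # Debian-based: invoke the init script, update-rc.d for runlevels
    /etc/init.d/apache2 start
    update-rc.d apache2 defaults

    # Slackware and kin: BSD-style rc scripts
    /etc/rc.d/rc.httpd start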

Performance can be good, but security is not a top priority
When Microsoft Windows was king of the computing landscape during the first decade of the 2000s, with virus vulnerability, a propensity to crash at any moment, and the supposed Easter egg of the Blue Screen of Death as standard features, a platform that performed better than the status quo and could keep a server doing its job for more than a few days a week was much sought after. Sadly, Microsoft wised up and slowly caught up on performance; Windows Server can now also do its job as a server for more than a few days a week, but GNU/Linux distributions were slow to move to the next level, settling into a complacent "been there, done that" response to proponents of Windows Server. Much as we hate to admit it, Microsoft upped the ante with Azure, whose Hyper-V can run CentOS for the Red Hat-based camp and Ubuntu for the Debian-based camp, satisfying many wishlists. In terms of security, SELinux and similar add-ons are still hard to use, configure and manage, while Azure and similar cloud platforms at least have built-in systems to address, if not mitigate, this concern; in the short term, that is, since we have observed that cloud platforms are potentially the new monopoly platforms, where compatibility between clouds is taboo. Security remains a distant third on the priority list of many platforms, and the impact and repercussions are hard felt when Heartbleed and similar security issues come to light.
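
For a flavor of why SELinux is considered hard to manage, here are a few of the basic commands an administrator must juggle just to diagnose a denial; a sketch assuming a Red Hat-based system with the policycoreutils tools installed:

    # Is SELinux enforcing, permissive, or disabled?
    getenforce
    sestatus

    # Temporarily switch to permissive mode while debugging
    setenforce 0

    # Relabel a directory whose security contexts have drifted
    restorecon -Rv /var/www/html

    # Flip a policy boolean, e.g., allow httpd outbound connections
    setsebool -P httpd_can_network_connect on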

Various distributions target the same problem domain, with varying degrees of effectiveness
It is said that you must stand for something or you will fall for anything. Beyond marketing campaigns and mass perception management, what does each GNU/Linux distribution stand for? It may not be heard as often now, but most if not all Linux distributions have strived to be desktop alternatives to Windows, and each did a poor job in the role. Many distributions want to be an answer-all, cure-all system: CentOS, openSUSE, Gentoo, Debian and similar distributions offer desktop as well as server functions. While they can serve as both desktop and server platforms, their effectiveness in each role is inconsistent. As desktops, they have neither the polish of the Mac nor the hardware support of Windows. As servers, with their tangled dependencies among support libraries, an installation may not fit within the server hardware's storage capacity while leaving enough room for server-generated files. And don't get us started on the embedded market, where the sheer size of the kernel is enough to choke it off.

Inconsistent and non-comprehensive documentation
If there is one feature that takes a successful free and open source project beyond a mere itch to be scratched, it has to be consistent and comprehensive documentation. You can google the answers to some pressing questions, but a project whose main site gives only a sparse or vague notion of the steps implementers must take will easily be eclipsed by a well-documented one. The source code of a free and open source project is available for close scrutiny, but nothing beats comprehensive documentation for conveying the rationale of the project.

Such are the concerns with GNU/Linux. While it is a free and open source platform, it lives by the principle that "if it smells like chicken, looks like chicken, sounds like chicken, and tastes like chicken, it must be chicken." In the long run, it will not thrive on uncertainty and a hit-or-miss attitude towards security, consistency, standards compliance and comprehensive documentation.
