Linux 2017 – The Road to Hell

Also known as a really good song by Chris Rea. Also known as, ladies and gentlemen, I am terrified. What will become of Linux in 2017? Will it even boot? Now, now, please, relax. I am not trying to be an attention seeker of fiscally questionable nature. I am just trying to share my fears with you.

And I’m also a man of science. Of numbers. The way I move through space with minimum waste and maximum joy is all about mathematical probabilities. I look at things happening around me and try to extrapolate what they will be like in the future. I seek patterns in the numbers, and what I see ain’t pretty. Linux is slowly killing itself.

Joker

Why so negative?

I am not being negative. Not at all. I am merely being realistic. Looking at the state of the Linux desktop over the past 6-7 years, I can see a definite negative trend in quality. Overall, it held steady for a good solid four years, around 2010-2014, and then things sort of started to slip. Let me demonstrate. To wit, my Lenovo G50 test box, a very Linux-unfriendly system.

In 2015

When I purchased this system, it gave me all sorts of grief. Chief amongst them was the wireless network card connectivity. But then, I found a workaround, and I was able to start using my laptop without having to worry about frequent network drops. There were several other issues, like UEFI support, which were also gradually resolved, as more distributions started implementing mechanisms to boot this state-of-the-art technology first launched in 2001.
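By the way, if you are fighting a similar UEFI battle, you can usually inspect and reorder the firmware boot entries from a running live session with efibootmgr. A minimal sketch, and the entry numbers below are purely illustrative:

    # List the UEFI boot entries the firmware knows about
    sudo efibootmgr -v
    # Put entry 0003 first in the boot order; substitute the
    # numbers from your own list, these ones are made up
    sudo efibootmgr -o 0003,0001,0002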

I was hopeful that the glitches and problems would be gradually fixed upstream, and that newer editions of various distributions would emerge with more streamlined hardware support, allowing me to use the laptop without worrying about the seemingly trivial basics that any normal user would take for granted. Alas, come the new year, things started turning for the worse.

Now, in 2016 …

What did I have in 2015? On this particular system, I was able to use Samba shares with anonymous or guest access. I was able to use the Realtek card without network drops after adding the modprobe.conf change. I was able to play multimedia files, battery life was decent, smartphone support was okay, and prospects were optimistic.
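For the curious, the wireless workaround boils down to a couple of module options. A sketch, assuming the card is the Realtek RTL8723BE commonly shipped in the G50; the exact options that help may vary between kernel versions:

    # /etc/modprobe.d/rtl8723be.conf
    # Disable the power-saving features that seem to cause the drops
    options rtl8723be fwlps=0 ips=0

After a reboot, or an rmmod followed by a modprobe of the module, the connection would hold steady.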

In 2016, with all recent distros using a more recent kernel (post CentOS 7 and Trusty release), the network connectivity drops even with the modprobe tweak in place. Heavy network usage leads to an outage, which somewhat luckily can be resolved without reboots. Previously, my testing would usually have a single incident per live testing session and none in the installed systems. Now, a typical live session has 3-4 drops, whereas in the installed system, it happens whenever I’m downloading multiple files from the Web.

On the multimedia front, things are not looking bright. Linux Mint recently decided to stop providing codecs by default, adopting the Ubuntu method, which eliminates one of its chief advantages. Samba sharing is no longer possible anonymously. You now need to provide credentials before you can connect to shares on your Windows machines.
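For comparison, the kind of server-side setup that used to give you anonymous browsing looks roughly like this. A sketch of smb.conf, with the share name and path invented for illustration:

    # /etc/samba/smb.conf
    [global]
    # Treat unknown usernames as guest instead of rejecting them
    map to guest = Bad User

    # Illustrative share name and path
    [media]
    path = /srv/media
    guest ok = yes
    read only = yes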

Battery life, on the SAME hardware, has gone down from roughly 3.5 hours to about 2.5 hours with pretty much every single distribution, and the results are consistent across different systems and desktop environments.

Ubuntu, bad battery life

Kubuntu, bad battery life
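If you want to put numbers on the drain rather than trust your gut, upower can report the discharge rate directly. A quick sketch, assuming a single battery exposed as BAT0:

    # Show the battery state, discharge rate and time-to-empty estimate
    upower -i /org/freedesktop/UPower/devices/battery_BAT0 | \
        grep -E 'state|energy-rate|time to empty'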

When it comes to smartphones, Plasma, a brand new and shiny desktop environment, fails to mount iPhone or Windows Phone in 2016. Nothing highlights the problem better than a comparison of the CentOS 7.2 KDE and Gnome editions. A seemingly outdated, non-home-friendly system shows a drastic difference in quality and results purely because of the desktop choice. Then, Plasma came around and superseded KDE, and still, the same issues with smartphone (MTP) support are present and carried over into subsequent releases.
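When the desktop fails you, the fallback is mounting the phone manually over MTP. A sketch using the libmtp tools and jmtpfs, both commonly packaged, though package names and mount points may differ on your distro:

    # Check whether the phone is detected at the MTP level at all
    mtp-detect
    # Mount the device into a local directory via FUSE
    mkdir -p ~/phone
    jmtpfs ~/phone
    # ... copy photos, sync music ...
    # Unmount when done
    fusermount -u ~/phone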

Linux Mint 18 Sarah delivers an abysmal performance with smartphones, ALL of them. Four or five previous releases had excellent support in this space, with the very same Cinnamon desktop. On top of this, we have the rest of the regressions introduced in Ubuntu 16.04, an LTS edition no less!

iPhone regression

Lucid was the breakthrough Ubuntu release, followed by a rock-solid Pangolin and a supreme Trusty, which still remains the most balanced distro on the market. Xerus, on the other hand, struggles with lots of bugs and issues, on both the software and hardware fronts. For instance, the new Software Manager was unable to locate some of the applications that can be found and installed using the command line interface. That’s not the kind of bug that should be allowed into production.
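The command line, by contrast, finds these applications just fine. A sketch of the comparison on an Ubuntu system; gparted is merely an illustrative package name:

    # Search the package cache and install from the terminal
    apt-cache search gparted
    sudo apt-get install gparted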

The regressions plague pretty much every distro. OpenSUSE 42.1 was another highly sought, highly anticipated release that ended up drowning in bugs and issues, none of which have been resolved half a year down the road. Usually, when I’d retest a particular distro roughly two months after the initial launch, as I’ve done with Kubuntu and openSUSE on many occasions, the issues would be ironed out. Not this time.

More and more distributions are also failing the basic boot test, either from USB or DVD, and I have launched a series of so-called rejection reports highlighting my frustration with the ordeal. Again, it is very important to be consistent, which is why using the same hardware and the same testing methods helps highlight the differences and the drift between operating system versions over time.
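For the record, the USB media are written the boring, reliable way. A sketch, where distro.iso and /dev/sdX are placeholders, so double-check the device with lsblk before pulling the trigger:

    # Identify the USB stick first; dd will happily destroy the wrong disk
    lsblk
    # Write the ISO image raw to the whole device, not to a partition
    sudo dd if=distro.iso of=/dev/sdX bs=4M status=progress oflag=sync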

Do the math, and project the results for 2017.

Why so buggy?

This is a really good question. And I think the answer has two parts. One, old problems are rarely if ever fixed, especially hardware-related ones. My woes with the N-band Intel card on the T400 laptop remained until the last day I had this machine, and they were never resolved in any one distribution. The same applies to my Broadcom adapter on the HP Pavilion system, which behaved pretty badly across the wider distro estate for a good many years, especially with systems installed to an external disk.
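When chasing these hardware woes, the first stop is usually to confirm which driver and firmware are actually in play. A small diagnostic sketch that should work on most distros:

    # Show the network adapters and which kernel driver claims them
    lspci -nnk | grep -A 3 -i network
    # Look for firmware loading errors around the wireless module
    dmesg | grep -i firmware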

The other part is that the distroscape is accelerating. There are more and more changes being added to the desktop, and as a whole, the desktop is becoming more complex. The typical cycle of about six months might once have been sufficient to review and QA the distributions, but it no longer suffices for the modern crop of systems. I believe it has become too difficult to keep pace with the rate of change and still deliver high-quality results.

Plasma crash

Limited testing definitely entails the risk of bias, but I believe my statistical pool is fairly large and consistent. With hundreds of distros tested over many years across a range of devices, including no fewer than seven notebooks, it is fair to assume that my experience is representative of what Linux has become, and where it is headed.

Yes, it is friendlier and more convenient than before. There is no reason to pine after the good ole days. That is the illusion of time and warped memories. Linux has become more accessible, but at the same time, it is now being used and tested on a much bigger scale than ever before, and more problems are being found. That is a good sign in itself, but only if the corrective part of the feedback loop actually happens, and right now, it does not.

The distro devs have always waited for the golden moment when their systems would finally become a mainstream commodity. Android and Steam have definitely changed the perception out there, and now, the demand outruns the supply. Worse, the dwindling resources and outdated methods are causing havoc and irreparable damage to the Linux reputation. The pace is too fast, too wild, and our favorite operating system is unravelling.

Stop. Collaborate and listen.

If things continue the way they are, Linux will just become irrelevant. No one needs a Microsoft-alternative operating system if it can’t do half the things that a typical Windows system can. We’re not talking about miracles. Basic expectations. Connectivity. Consistency.

Consistency is just as important, if not more so. Imagine you have a production setup with one version of Linux, and now you upgrade to a new one. Can you afford to have things break? Think about your network setup. Or your smartphones. What if you can no longer browse the Web or download stuff without drops, and you cannot connect your gadgets to your laptop? That is devastating. Do you want to gamble your important stuff, your work, on whimsical, erratic changes in software?

Yes, things break everywhere! But not so rapidly and so randomly. Again, if we look at my G50 system, as good or as bad as it may be, my ability to perform the exact SAME operations I did a year ago has diminished significantly. I can no longer enjoy seamless Samba sharing, I cannot use my phones the way I used to for downloading photos and syncing music, I can’t use my laptop without frequent battery recharging, installing software is buggier and more difficult, and my network drops every 15 minutes.

That is all you need to know when you consider your odds for 2017.

The bigger question is, can Linux be saved?

The answer is, of course, it’s just software.

The answer is simple. Slow down. That’s all it takes. Having more and more git commits in the kernel is all nice and fun, but at the end of the day, people just want to be entertained. They want things to be as they were yesterday. Progress has to be seamless. For most users, the huge advancements in kernel security and architecture are completely irrelevant, and go completely unnoticed. But userland bugs are terrible. And noticeable.

I think the distro world needs to gear down a notch or two. Bi-annual releases contribute nothing to the quality of the end product and distract people from focusing on delivering high-quality, robust products. It’s just noise for the sake of noise, generating activity the likes of the Civil Service in Yes, Prime Minister. No one will get a medal for releasing their distro twice a year. But people may actually appreciate solid products, as infrequently as they come, because at the end of the day, it makes no difference. Most people are happy to replace their software come the end of life of their hardware. And that means once every six years.

We don’t need to be so conservative. But let’s try slowing down to one release a year. That gives everyone twice as much time to focus on fixing problems and creating beautiful, elegant distributions with the passion and love they have, and the passion and love and loyalty that their users deserve. Free does not mean you can toss the emotions down the bin.

Conclusion

The Year of Linux is the year you look at your distribution, compare it to the year before, and have that sense of stability, the knowledge that no matter what you do, you can rely on your operating system. Which is definitely not the case today. If anything, the issues are worsening and multiplying. You don’t need a degree in math to see the problem.

I find the lack of consistency to be public enemy no. 1 in the open-source world. In the long run, it will be the one deciding factor that determines the success of Linux. Sure, applications matter, but if the operating system is not transparent and predictable, people will not choose it. They will seek simpler, possibly less glamorous, but ultimately more stable solutions, because no one wants to install a patch and dread what will happen after a reboot. It’s very PTSD. And we know Linux can do better than that. We’ve seen it. Not that long ago. That’s all.


Cover image by Brigitte Werner for Pixabay.com.