
50 reasons why Linux has no future

awaydesu
Registered Member
Posts: 1 | Karma: 0

50 reasons why Linux has no future

Wed Oct 20, 2010 1:56 am
The StopLinux project has finished translating its well-known “50 facts and myths about Linux, or why Linux has no future” document into English. Since its initial release, the document has been heavily revised and abridged: all specific examples have been removed, leaving only general concepts.

Despite some controversial points, the document's author believes he has shattered the myths about open source software, particularly Linux.

http://linux-faq.org/eng/index.html
TheBlackCat
Registered Member
Posts: 2945 | Karma: 8
How long ago was this written? It seems like it is talking about how Linux was maybe 5 years ago. Having to manually change the internals? Most individual pieces of hardware taking several nights to set up? No support from device manufacturers? That was true years ago, but it is far from true today. It doesn't look worth reading since it doesn't appear to be relevant to the situation today.


Man is the lowest-cost, 150-pound, nonlinear, all-purpose computer system which can be mass-produced by unskilled labor.
-NASA in 1965
Primoz
Moderator
Posts: 859 | Karma: 1
I've seen this before, both on the actual Linux-faq site and on a similar site called Linuxsux (or something like that).
And I for one don't think that any of these sites are bad for Linux. Maybe outdated, as TheBlackCat said, but not necessarily bad.
The problem I see, and the reason behind sites like this, is that there are some Linux users who wish to convert everyone they know to Linux, and who try to convince people that Ubuntu (or any other popular distro, but I've seen this behaviour mostly on the Ubuntu Forums) is the same as or better than Windows.
The problem is that Ubuntu, or any other Linux distro for that matter, isn't like Windows. Yeah, it can be like it if you know how to configure it, but if someone just tells you to install Linux and you'll have everything your previous Windows had, you'll be greatly disappointed.

So, all in all, I personally believe that sites like this aren't bad, because even if their facts about Linux are outdated, it's still better to inform the user of everything that could go wrong than to just say "Linux is so much better than M$ Windoze!!11" :D


P.S.: I am a member of a community called Linsux, but that doesn't mean that I or many people in that community think that Linux in general sucks. A lot of them are actual users of Linux or have used Linux for an extended period of time. It's more that we're aware of the problems Linux has, but mostly (at least for me) can't stand users who preach the way of GNU/Linux and try to convert the masses to it.


Primoz, proud to be a member of KDE forums since 2008-Nov.
ngativ
Registered Member
Posts: 66 | Karma: 0
When my wife bought a laptop, it shipped with Windows Vista. So I tested that OS and, oh well, you know... Vista sucks!

So I installed Kubuntu on that laptop and everything went just fine. She hates the CLI and she doesn't like geek stuff, so my first concern was whether Linux would work for her, and I asked her if everything was fine using Linux. She said everything is fine because she can do all that she needs to do with Linux, so it works just fine for her.

But! I have to fix those damn bugs shipped with every Kubuntu upgrade. With the 10.10 upgrade I had to fix 3 annoying bugs.

What I am trying to say is that Linux is not for everyone, just as Windows or Mac OS is not for everyone either (me). I never cared about others using Linux; I just like Linux, and every person who likes using Linux does so for a good reason, even those who like Linux just because of some sort of ideology.

What do people think about Linux? Well, most people think that Linux is one of the most advanced, secure and complicated operating systems out there. Cheers!
cando
Registered Member
Posts: 9 | Karma: 0
ngativ wrote:What do people think about Linux? Well, most people think that Linux is one of the most advanced, secure and complicated operating systems out there. Cheers!


Yeah, right. It is one of the most complicated OSes out there, but most advanced and most secure?

To be honest - Linux is an old-fashioned monolithic-kernel OS with all the advantages and disadvantages of such a design.

Pros:

In a dedicated appliance this is a great advantage. One can compile the kernel exactly for the underlying hardware - many routers, firewalls, NAS boxes and even servers dedicated to single tasks (LAMP, file servers, DNS servers, DHCP servers...) can benefit from this design.

Single-tasking desktops and thin clients can also get a lot out of it - if all you need is to browse the web, chat, read email or write small documents, it is perfect for PDAs, thin clients, nettops and home computers.

Security by obscurity. No serious virus developer would invest time to write malicious code for Linux - it does not pay off - however, the "well documented" and, after 20 years, still buggy kernel offers plenty of opportunities to do so, but you have to rewrite your code for every distro. In Windows it "just works" and you can rely on well-documented and stable interfaces.

Cons:

Well, the design has grown like a cancer over the last 20 years - standardisation of interfaces, layers etc. is almost non-existent. There is no consistent modular model / set of building blocks that can ensure backward compatibility and interoperability of software. You always have to rebuild your work for the next kernel or compiler version etc.

A device driver model / plugin framework / security abstraction layer between privileged OS kernel functions and user mode / hardware does not really exist, which means a driver failure can compromise the whole OS.

No real multitasking model with process isolation. If a task eats up resources, the whole beast hangs. For the average Joe it is OK - he is accustomed to rebooting often on a BSoD when using Windows and downloading tons of malicious freeware from the internet... For a serious - and well-defined - working environment both systems have a chance.

...
The bottom line:

Admins can successfully lock down both Linux and Windows for the user and force the use of approved software only - whether open or closed source. There is no need to upgrade to the next version every half year. A desktop can run for years in a closed client-server environment almost unchanged.

Thin clients and virtualization can cut some more costs and ease upgrades of the underlying hardware. The lack of a clean, well-defined abstraction layer for device drivers in Linux can easily be compensated for that way.

One disadvantage is the lack of good software for commercial use - there are no significant and reliable open-source ERP systems out there, no financial and tax software, only rudimentary CRM systems and so on.

When it comes to software purchases, OS costs are not relevant. If I invest millions of dollars in an ERP system, it is ridiculous to save 1000 bucks on the server OS. So Windows and Unix are the OSes of choice. I would not bet on Linux for such mission-critical and expensive stuff.

I would place Linux in the segment of non-critical low-end desktops / servers and appliances of any kind. It is OK for consumer electronics, and as a home & student OS.


When it comes to business, it could be an OS for locked-down daily office workplaces handling spreadsheets or writing letters etc., and for thin clients. I would not place it in development areas, e.g. for CAD or things like that; in the server segment it is best for dedicated firewalls, DNS servers, DHCP servers, small business file servers, as an OS for large network storage appliances, or for non-critical web servers. Have you seen how many Linux-based forum servers and web servers have been compromised by rootkits in the past? A secure system is something else.

However - it is an outdated 30-year-old design (even though it has existed for only 20 years) - and not a "state of the art" operating system.
TheBlackCat
Registered Member
Posts: 2945 | Karma: 8
cando wrote:To be honest - Linux is an old-fashioned monolithic-kernel OS with all the advantages and disadvantages of such a design.

So are Windows and Unix; I am not exactly sure why this is a problem particular to Linux.

cando wrote:Security by obscurity. No serious virus developer would invest time to write malicious code for Linux - it does not pay off

Except, of course, for the major online systems carrying massive amounts of sensitive financial information that run on Linux, such as Google and Amazon.com. Considering most hackers and virus-makers today are affiliated with organized crime and are after financial information like credit card numbers, Linux would be an ideal target, especially since a lot of the critical systems run on just two distributions: Red Hat Linux and SUSE Linux Enterprise.

cando wrote:A device driver model / plugin framework / security abstraction layer between privileged OS kernel functions and user mode / hardware does not really exist, which means a driver failure can compromise the whole OS.

Same with Windows and Unix.

cando wrote:No real multitasking model with process isolation. If a task eats up resources, the whole beast hangs. For the average Joe it is OK - he is accustomed to rebooting often on a BSoD when using Windows and downloading tons of malicious freeware from the internet... For a serious - and well-defined - working environment both systems have a chance.

This is absurd. I've run software on a dual-core system with dozens of simultaneous tasks eating up almost all my resources, and the system does not "hang". My understanding is that the multitasking model in Linux is far superior to the one on Windows.

I find it hard to believe that over 90% of the top 500 fastest supercomputers run on an operating system (Linux) that does not have a "real multitasking model". That is the whole point of using a supercomputer in the first place. In fact, Linux is the standard for high-performance computing; almost none of the fastest supercomputers run Windows (about 1%, and less than 4% use Unix).

cando wrote:When it comes to business, it could be an OS for locked-down daily office workplaces handling spreadsheets or writing letters etc., and for thin clients. I would not place it in development areas, e.g. for CAD or things like that; in the server segment it is best for dedicated firewalls, DNS servers, DHCP servers, small business file servers, as an OS for large network storage appliances, or for non-critical web servers. Have you seen how many Linux-based forum servers and web servers have been compromised by rootkits in the past? A secure system is something else.

Is that why every major stock exchange in the world is Linux-based? The London Stock Exchange was the last hold-out, but when a Windows crash brought their system down for over a day, possibly costing billions of dollars, they ditched Windows and went straight to Linux (as all the others had long ago).

cando wrote:However - it is an outdated 30-year-old design (even though it has existed for only 20 years) - and not a "state of the art" operating system.

So are both Windows and Unix. How is this a problem with Linux in particular?

I find it interesting that you make a lot of criticisms that apply equally to Linux, Windows, and Unix, then somehow use this as justification for using Windows or Unix. You didn't provide any real reason, besides changing interfaces (which, as you said, is not a problem if you only do security upgrades), why Windows or Unix is a better choice, and the choices of major businesses for mission-critical systems and of others for high-performance computing contradict your position.


Man is the lowest-cost, 150-pound, nonlinear, all-purpose computer system which can be mass-produced by unskilled labor.
-NASA in 1965
cando
Registered Member
Posts: 9 | Karma: 0
Maybe it is the X Window subsystem, maybe it's KDE or Metacity. I don't know exactly who to blame - but if one of my Kubuntu desktop applications crashes (and it does so often), there is no way to switch to another working application; even Ctrl-Alt-F1 to F6 sometimes does not work - the keyboard and mouse do not respond (though sometimes it is possible to connect remotely via ssh). So the whole GUI subsystem is not really multitasking in the user experience.

As I said, in a server environment where servers are dedicated to a specific task it is great - in a multipurpose environment it is far behind Windows, Unix or BSD.

Regarding the architecture - Windows is far ahead with its driver isolation model and with stable interface definitions.

There are a lot of Linux boxes out there compromised by rootkits and misused as spam servers. Maybe there are companies who successfully run large Linux installations, but they do not rely only on OS security but on dedicated IDS systems, firewalls and a lot more to keep hackers away.

BTW, hacking and stealing data is a different story from viruses. It is a different target.

Viruses are made to increase sales of AV protection software and target the mass market; rootkits and hacker attacks are focused on servers - and Linux systems are potential targets for that. There is no AV software market for Linux - it does not pay off to put resources there.

Regarding the architecture: it is time to move towards a microkernel architecture and put all device drivers in user mode to increase security and make the kernel leaner and safer.

It would be much better to put brains and resources there to gain advantages over the competition, instead of polishing the UI with things like placing the window minimize and close buttons on the left side of the title bar etc...
TheBlackCat
Registered Member
Posts: 2945 | Karma: 8
cando wrote:Maybe it is the X Window subsystem, maybe it's KDE or Metacity. I don't know exactly who to blame - but if one of my Kubuntu desktop applications crashes (and it does so often), there is no way to switch to another working application; even Ctrl-Alt-F1 to F6 sometimes does not work - the keyboard and mouse do not respond (though sometimes it is possible to connect remotely via ssh). So the whole GUI subsystem is not really multitasking in the user experience.

This has nothing to do with how Linux is designed as an operating system, nor does it have anything to do with the multitasking model; it is a problem with your particular window manager. It sounds like your window manager is somehow able to crash X11 when an application crashes, which is quite a feat and something I have never seen myself.

KWin does not have this problem: I have no trouble switching applications when one freezes or crashes, which is pretty rare anyway. This may be an issue with Kubuntu or Metacity; it is certainly not the fault of KDE, nor an inherent problem with Linux.

On Windows, on the other hand, you cannot switch applications if Windows Explorer crashes or freezes, which it does constantly for me on Windows 7 (on at least 4 different computers).

That is actually one of the big advantages of how window managers like KWin operate. If a program freezes on Windows, it is in control of its own window painting, so the title bar freezes as well. You also can't move it, push it to the background, or really do anything with it; you need to open the Task Manager to kill it. In KWin, if a program freezes, the title bar is still operational, so you can close it, move it, minimize it, push it to the background, etc.

cando wrote:As I said, in a server environment where servers are dedicated to a specific task it is great - in a multipurpose environment it is far behind Windows, Unix or BSD.

You have provided no reasons to back this up; practically all of your arguments apply equally well to Windows and Unix (BSD is a type of Unix), and the one that does apply to Linux (changing architectures) does not apply to the enterprise distributions that would actually be used on such systems. And for critical multipurpose environments, like supercomputers, Linux is still the operating system of choice.

cando wrote:Regarding the architecture - Windows is far ahead with its driver isolation model and with stable interface definitions.

It is better with stable interface definitions, I'll grant you that. In terms of driver isolation, I don't know. Unfortunately, and I don't mean to be rude by this, but your knowledge of other matters (like mixing up window manager issues with multitasking, or not knowing BSD is a type of Unix) does not inspire confidence in your understanding of the systems in question.

cando wrote:There are a lot of Linux boxes out there compromised by rootkits and misused as spam servers. Maybe there are companies who successfully run large Linux installations, but they do not rely only on OS security but on dedicated IDS systems, firewalls and a lot more to keep hackers away.

Same with Windows. User error can affect any operating system. But all the security reviews I have read say Linux's fundamental design is far more secure than Windows's, and the history of Linux servers indicates that they have far fewer security and stability problems than Windows ones.

cando wrote:BTW, hacking and stealing data is a different story from viruses. It is a different target.

Years ago you would have been right. But nowadays viruses are often used to steal sensitive information and send it back to the people who made the virus.

cando wrote:Viruses are made to increase sales of AV protection software and target the mass market,

Wait, are you seriously saying viruses are made by antivirus companies?

cando wrote:rootkits and hacker attacks are focused on servers - and Linux systems are potential targets for that.

Windows servers get viruses all the time.

cando wrote:There is no AV software market for Linux - it does not pay off to put resources there.

Except, of course, that most of the major antivirus companies do release Linux versions. Those are really only for preventing Linux systems from inadvertently passing viruses on to Windows systems, but they certainly think there is a market for antivirus software on Linux.

cando wrote:Regarding the architecture: it is time to move towards a microkernel architecture and put all device drivers in user mode to increase security and make the kernel leaner and safer.

Ironically, Linux is already in the process of setting up user-mode driver frameworks. Are Windows and Unix?

cando wrote:It would be much better to put brains and resources there to gain advantages over the competition, instead of polishing the UI with things like placing the window minimize and close buttons on the left side of the title bar etc...

That is what Tanenbaum said in his famous debate with Torvalds in 1992, shortly after Linux first came out; he was pushing for a microkernel design (the approach GNU Hurd took). Look where that got them. Microkernels have proven to be extremely difficult to implement in practice.


Man is the lowest-cost, 150-pound, nonlinear, all-purpose computer system which can be mass-produced by unskilled labor.
-NASA in 1965
cando
Registered Member
Posts: 9 | Karma: 0
I know that BSD is a free Unix derivative - and in contrast to Linux it has a more modular kernel.

Anyway. Regarding the user experience, I cannot confirm that Windows since XP has been unstable or crashes everything if one app hangs. It is a real multitasking system with a scheduler, process isolation, prioritization etc. Don't mix it up with experiences from Windows for Workgroups / 95.

Also, the kernel is modular - there is the HAL, and everything else is dynamic libraries and loadable drivers - it is not a monolithic kernel. I agree that most hardware drivers in Windows still run in kernel-mode context - but this is changing. The OS design is client-server based, using IPC and multithreading within the general multitasking design to speed up applications. It is by now far removed from the first approaches in Windows 3.x with cooperative multitasking based on the message queue.

I have used Windows for almost two decades at my workplace, as both a server and desktop OS in large-scale deployments, with almost no issues - also for business-critical applications in cluster configurations etc.

It is able to compete with Unix systems on the market on comparable hardware configurations (I am not talking about beasts like large Sun servers or mainframe installations etc.).

Anyway - I have used Linux on my private computers for several years and I am happy with it. GNOME is great and stable, and my Linux server runs non-stop (as my Windows server did before). But with KDE I very often have issues, so I try every upcoming distro to give it a chance, and after 2 or 3 weeks I dump it because of instability, crashes etc. and switch back to GNOME.

I like the idea of open source software, and even though it is always unfinished, undocumented and sometimes buggy - I mean, one bug gets fixed and two versions later the same bug shows up again - I want to give it a chance.

The problem is, there is a lot of software out there for Linux, but you cannot bet that in half a year it will still exist or be supported.

A lot of features one gets to love and use every day just disappear without notice. This is something I would not accept in a business environment.

I know - it is free - and therefore there are no liabilities. This is not the way I would do business - so I completely understand that CIOs prefer to spend money on software and get support, software assurance and backward compatibility.

Your point that it is used on critical servers, e.g. for the stock exchange, makes my feelings about the financial markets even worse. Imagine what could happen if some smart guys on the kernel team have some great ideas in the next version and that stuff crashes existing functionality or corrupts data - like my KDE desktop crashed last week.

I love Linux - but in my opinion it is not yet ready to compete with commercial software. But I hope one day it will - so I support it.
airdrik
Registered Member
Posts: 1854 | Karma: 5
cando wrote:I like the idea of open source software, and even though it is always unfinished, undocumented and sometimes buggy - I mean, one bug gets fixed and two versions later the same bug shows up again - I want to give it a chance.

The problem is, there is a lot of software out there for Linux, but you cannot bet that in half a year it will still exist or be supported.

A lot of features one gets to love and use every day just disappear without notice. This is something I would not accept in a business environment.

I suppose one difference between two little one-off projects, one produced by a commercial entity and one open source, is that the commercial entity is more likely to still be around to collect money for supporting the project in six months to a year, whereas the open source project may well be abandoned if there is little interest in it. On the other hand, the more popular a project, the more likely people will still be supporting and developing it in six months to a year or longer. The kernel isn't going away any time soon. KDE and GNOME aren't going away any time soon. Generally, if you are on a popular distro and stick to the stuff that's in the repos, you aren't going to run into many packages that don't have a future, because there are people making sure the continuity is there.
cando wrote:I know - it is free - and therefore there are no liabilities. This is not the way I would do business - so I completely understand that CIOs prefer to spend money on software and get support, software assurance and backward compatibility.

Your point that it is used on critical servers, e.g. for the stock exchange, makes my feelings about the financial markets even worse. Imagine what could happen if some smart guys on the kernel team have some great ideas in the next version and that stuff crashes existing functionality or corrupts data - like my KDE desktop crashed last week.

I love Linux - but in my opinion it is not yet ready to compete with commercial software. But I hope one day it will - so I support it.


The people using Linux in the financial markets (and other mission-critical deployments) aren't going to be using bleeding-edge kernels in bleeding-edge Linux distributions. They're going to be using distros such as Red Hat Enterprise Linux and SUSE Linux Enterprise, which take well-tested kernel versions (which have already gone through the wringer of kernel audits and testing, as well as testing in more cutting-edge distros like Fedora and openSUSE), do more testing and bug-fixing, and then, as they build the software collection around the central kernel and library suite, do yet more testing and bug-fixing to make sure that everything in the distribution works well together, works as expected, and is stable and reliable. Finally, after all the tests have passed, they declare the whole suite of kernel + compilers + libraries + software collection stable and begin releasing betas and release candidates for clients to test.

After they've released the final stable version, they sell it together with 3-8 year support contracts (depending on the company and the license), giving you the guarantee that everything will continue to work as-is (plus bug fixes and security updates) for the duration of the contract. I would expect there to be very little difference between the kinds of licences and contracts that Red Hat sells and those of any other major Unix vendor, and I would expect similar levels of stability and reliability between the two as well.

On the other hand, more popular desktop distributions tend to lean more towards providing up-to-date software and don't have as rigorous testing procedures. Of course, there are also a number of desktop distributions (like PCLinuxOS) which do focus more on providing that stability and reliability while also providing an up-to-date and well-designed desktop experience.


airdrik, proud to be a member of KDE forums since 2008-Dec.
Kryten2X4B
Registered Member
Posts: 911 | Karma: 4
cando wrote:Regarding the architecture - Windows is far ahead with its driver isolation model and with stable interface definitions.


I agree on the stable interface definitions, but I certainly do not agree with the rest of the quoted part. Driver isolation sounds great in theory, but in practice a sub-optimal driver can still bring down the entire system. It happened to me just yesterday on a Windows 7 machine (the culprit was the graphics driver in this case; reverting to an older version fixed it). It wouldn't respond to anything but the power switch. If it had been Linux, I could at least have ssh'ed my way in and seen what the problem might be.

I'm not saying Linux is perfect, since that would be a lie, but in my ten years of Linux experience (4 or so of those being me just experimenting with it on a spare machine on occasion, though) I've never seen a driver able to halt the entire system. Faulty hardware has made that happen, though. Sure, a bad graphics driver may kill X, or a bad network driver may cause the network to go down in what seem like unpredictable ways, but the rest of the system generally keeps working.

Not much of a comfort if you rely on either X or the network and don't know how to fix it, though.

cando wrote:There are a lot of Linux boxes out there compromised by rootkits and misused as spam servers. Maybe there are companies who successfully run large Linux installations, but they do not rely only on OS security but on dedicated IDS systems, firewalls and a lot more to keep hackers away.


Of course they do. But they would (or at least should) do that no matter what OS(es) they use. And maybe it's just me, but I've seen a lot more compromised Windows boxes than Linux ones. And worst of all: the owners of the Windows boxes, company-owned or otherwise, are more likely to be unaware of it.

cando wrote:The problem is, there is a lot of software out there for Linux, but you cannot bet that in half a year it will still exist or be supported.


And the difference from any other OS would be?

cando wrote:Imagine what could happen if some smart guys on the kernel team have some great ideas in the next version and that stuff crashes existing functionality or corrupts data


Sure, that could happen - with any OS - if the system administrator(s) are lazy. I maintain servers for a living (not of that magnitude, though), both Windows and Linux ones, and one thing we always do: test updates with real data, but on backup systems, before letting any critical components be updated on the important machines. Living on the edge is fine for your home system, but it's a big no-no if you want to keep your job and reputation.


OpenSUSE 11.4, 64-bit with KDE 4.6.4
Proud to be a member of KDE forums since 2008-Oct.
cando
Registered Member
Posts: 9 | Karma: 0
Yes, both worlds have their fans.

When I started with Linux several years ago, I was very happy that there was a built-in ability to do remote X sessions via XDMCP, or even windowed X terminal sessions with Xnest, and to encrypt the traffic using ssh or OpenVPN etc.

Who needs to pay for a terminal server if you can have thin clients with an X server installed and work on a central instance? No software distribution, etc.
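Just to illustrate the kind of setup I mean - a rough sketch only, with a made-up host name, assuming Xnest is installed and the remote display manager still answers XDMCP:

# Rough sketch: open a windowed X terminal session against a remote
# XDMCP-capable display manager, the way Xnest used to make easy.
# "central-server.example" is a placeholder host name.
import subprocess

subprocess.run([
    "Xnest", ":1",                       # nested X server on local display :1
    "-query", "central-server.example",  # request a graphical login via XDMCP
    "-geometry", "1024x768",             # size of the nested session window
])

# Or, for a single application instead of a full desktop, plain ssh X forwarding:
# subprocess.run(["ssh", "-X", "user@central-server.example", "xterm"])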

A year ago Xnest / XDMCP disappeared from the Ubuntu log-on screen, and with the current version it is not working at all any longer.

This is what I mean - not small one-off projects, but basic OS functionality. What we got instead was a "cloud" service auto-integrated into the distro and things like hard-coded Nepomuk stuff one cannot get rid of. So these distros are really sliding away from a rock-solid component OS towards "cheap" consumer bloatware. This is the price of trying to beat MS - with the same UI and product strategy (Explorer): glue things together so the user has no choice.
TheBlackCat
Registered Member
Posts: 2945 | Karma: 8
That is more an assessment of Ubuntu's particular strategy than of Linux as a whole. Ubuntu has gotten a reputation for trying to emulate Microsoft's and Apple's strategy rather than behaving like other Linux distributions. You cannot blame Linux for Ubuntu's particular decisions. The sorts of changes you are complaining about are common in Ubuntu, but less so in many other distributions.

As I keep saying, if you don't want your system to change, you should stick with a distribution that focuses on maintaining long-term consistency, like Red Hat Enterprise Linux and SUSE Linux Enterprise (or free derivatives like CentOS if you prefer). The whole point of these distributions is that they don't change core components of the OS.

There are different types of markets for Linux. For businesses and other projects that need long-term consistency, there are distributions that cater to that market. For users who are more interested in cool new features but don't care about the underlying software, there are distributions like Ubuntu, openSUSE, Fedora, and many others. For users who want bleeding-edge systems which they can tweak to their hearts' content, there are distributions like Arch and Gentoo.

Even within those, different distributions have different philosophies. Ubuntu has the reputation for being aimed solely at mass-market users, and cuts out anything that is not relevant to them. A distribution like openSUSE is targeted at more advanced users, so tends to come with the sorts of tools you are discussing but isn't always as easy to use or configure. Fedora has a reputation for absolute bleeding-edge software. Rather than blaming Linux for one distribution not matching up with your particular approach to computing, you should find out which distribution best matches your philosophy and use that.

People who want long-term stability shouldn't use a version of Linux aimed at everyday consumers. Everyday consumers should not use a version of Linux aimed at hard-core programmers.

Mission-critical systems using Linux are never going to see the sorts of problems you saw, because no one setting up such a system would use Ubuntu. They would use Red Hat, SUSE, or some other similar enterprise-level distribution that changes very, very rarely (except for security and sometimes stability updates).

Your problem is that you want long-term consistency, but rather than choosing a distribution that promises this, you chose a distribution that is known for major changes on a routine basis. This is like trying to use Windows Home Basic as a server when Windows Server is available.


Man is the lowest-cost, 150-pound, nonlinear, all-purpose computer system which can be mass-produced by unskilled labor.
-NASA in 1965
CraigPaleo
Registered Member
Posts: 73 | Karma: 0
Cando,
That missing Xnest / XDMCP was a bug with GDM, and the fix was to use KDM instead. https://bugs.launchpad.net/ubuntu/+sour ... bug/689988

Edit: that bug fix will only land in a new release, not as an update. Yeah, probably better to go with one of the server-oriented distros.


CraigPaleo
Registered Member
Posts: 73 | Karma: 0
mikewilliams wrote:Hello,

Thanks for posting this list. This is an interesting topic. I am fairly new to Linux. Would most of you agree that it has no future?

Thanks again!
Mike Williams


I can only speak for myself. I do believe Linux has a bright future. There have been naysayers for almost 20 years but Linux is still here and just keeps getting better. :)



