I've found that Windows and Linux suck equally; neither is better or superior to the other. Because of that, my choice comes down to the users: whichever has the nicer users wins. In my case, Windows users have been nicer than most Linux users I've met offline or online, because they have no emotional attachment to Windows, nor are they loyal to it. They use it not as fans but as users; they don't think they're special for using it. Name-calling, flaming, and trolling are pretty much non-existent among them when it comes to Windows-related discussions.
Windows users rarely compare Windows with Linux just for the sake of saying Linux is bad. They have no reason to, and they don't have to. Linux fans usually say, "Windows breaks on a frequent basis and it's Windows' fault; in Linux, when things break, it's your fault." This is bullshit. Whether the system breaks by itself or we broke it, the point is that we have to take care of it whenever something is broken, regardless of whether it's Windows or Linux. Linux is nothing special in that sense. Unless it can fix itself, I don't think there's anything about Linux that I can brag about or feel proud of.
"Windows sucks" is not reason enough for me to give it up completely, let alone switch to Linux. I see Linux as sucking just as much, so there's no point in switching to it. Linux may not break by itself (assuming Windows really does break itself that often), but even when it breaks through our own fault, it's just as frustrating as when Windows does. In my experience with Linux, there's no such thing as everything running perfectly the first time after installation. Whether it's Linux or Windows, both require a certain level of manual configuration at some point. A tailor-made, cookie-cutter system like Mac OS X is an exception, but it's not accessible to the masses.
I think Mac users contribute very little to the geek world. I'm sorry if any Mac users out there are offended by this, but I have to say it: your favorite system is static. You can't experience the same enjoyment of building a system up from scratch.
Sorry, Mac users, but your system is now nothing more than a "Windows PC" that comes with its own OS, one that can't legally be installed on a different-branded system. Sorry, Apple, but it's not our fault if we never use your OS when the only way to use OS X is to get a Mac. Your systems are premium, though. I know it may be cliché to say "premium", considering that the Windows world has premium brands too, like Sony, Lenovo, and Toshiba, but that's not a problem there: a user who just wants a Windows system can always get cheaper hardware. Mac users can never enjoy that.
As far as I know, Mac OS X is cheaper than Windows when sold separately. Not only that, Mac users don't face the same confusion as Windows users, since there are only two versions available: desktop and server. Not to mention that the server and desktop versions will be merged in the upcoming "Lion" release. In Windows there are probably at least 7 (pun intended) different versions to choose from, and the Ultimate edition is really pricey. That's one good thing I know about Mac OS X.
However, one good thing does not make it OK. Not allowing OS X to be installed on non-Mac systems is like allowing only Ferraris to use Shell gas. To me, vendor lock-in is the stupidest business model; it's a relic of the past. I'm not saying Apple should give up on vendor lock-in anytime soon, or even go completely open source, but isn't it about time to consider a fresh business model so as not to look evil? So my final words for now: allow Mac clones, or at least allow Mac OS to be installed on non-Macs, and we'll talk.
It's because Windows users are able to build their systems from scratch that they have contributed more to the geek world. They share with everybody how they do things, they have access to resources (IT stores), and they get to experiment with different hardware setups to achieve the best results with their own hands. On a Mac, things can only be done in an environment pre-configured by Apple. Of course, there is some upgradeability in Macs, but most of it arrived only recently (since the transition to Intel CPUs), whereas in the Windows world it had existed for at least a decade before Mac users could enjoy an almost similar (but inferior) experience.
The most you can do with a Mac is upgrade the hard drive and RAM; the CPU is usually a no-no, unless you're adventurous enough to try it, provided it's not hardwired to your system and your system doesn't use a custom-designed CPU like most Macs do. In the Mac world, users have to wait for Apple to adopt a newer CPU, and that means buying entirely new hardware instead of upgrading the old machine. The system is already on the premium side, so getting a new one means more money spent. Sorry, but that's not something I'd want to do. Rather than paying for a complete system that's deemed "the best your money can buy", I'd use the same amount of money to get awesome hardware and build a superior system instead.
That said, I have to say it again: sorry, dear Mac users. Your system is cool, but the moment I learned how restrictive it is, I stopped loving it. Don't blame me for saying you contribute very little to the geek ecosystem, and I don't buy the cliché rationalization that "Macs don't need to be meddled with that often". Sorry, but that too is a relic of the past. People should enjoy building their systems because they can; it's not enjoyable to be forbidden from doing something that is actually possible.
All Linux distros should follow Windows and Mac OS X by default, where applications are installed from a single downloaded file that, when double-clicked, brings up an installation wizard with graphical prompts, rather than through the traditional package management systems that many Unix-like systems use.
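For what it's worth, the single-file model is technically possible on Linux today. Below is a toy sketch, assuming a POSIX shell, of how such a "one downloaded file" installer works under the hood: a self-extracting archive, as used by .run-style installers. Every file name here is made up for illustration, and a real installer would launch a graphical wizard instead of printing text.

```shell
# Build a toy self-extracting installer: a shell script with a
# base64-encoded payload appended after a marker line.
cat > installer.sh <<'EOF'
#!/bin/sh
# Find the line right after the __PAYLOAD__ marker in this very file.
PAYLOAD_LINE=$(awk '/^__PAYLOAD__$/ {print NR + 1; exit}' "$0")
# Decode everything from that line onward into the application binary.
tail -n +"$PAYLOAD_LINE" "$0" | base64 -d > hello
chmod +x hello
echo "installed ./hello"
exit 0
__PAYLOAD__
EOF
# Append the "application" (here just a tiny script) as the payload.
printf '#!/bin/sh\necho hello from the app\n' | base64 >> installer.sh

sh installer.sh   # prints: installed ./hello
./hello           # prints: hello from the app
```

The trick is simply that the archived application rides along in the same file as the install script, which is what lets one double-clickable file carry a whole program.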
Oh my gosh, I really don't get it. There are people who think it's bad and "monopolistic" for Microsoft to be an all-round tech company, say, by making its own hardware (a PC) and making Windows exclusive to it. Yet they think it's great, or at least perfectly fine, for Apple to be an all-round tech company that has kept Mac OS X vendor-locked to its Macintosh machines all this time. Isn't it a bit biased to say Apple can do that while Microsoft can't? Isn't it a bit unfair to Microsoft, too? As far as I can remember, the same hateful sentiment toward Microsoft was once directed at IBM as well, back when IBM still had its PC division. IBM made OS/2 for its PC platform (a platform it had already lost control of by that time) and earned the "Big Blue" moniker for trying to be an all-rounder (it is a long-running company, after all). Well, it wasn't hate until some people came up with "Big Bad Blue", implying that IBM is an evil company.
Image by scottpowerz via Flickr
My first experience with Linux was ten years ago. Although I started my computing experience earlier than that (I had attended computer classes since 1995), I remained a n00b because the beige boxes scared me. I only started falling in love with computers and IT when my father bought our first family PC (powered by a Pentium III) in 2000. With our own PC, I could tinker with it freely without worrying too much. Although the PC came pre-installed with Windows 98 SE, an article in a local PC magazine piqued my curiosity about Linux, and I ended up installing Red Hat in a dual-boot setup. I admit I fell in love with Linux, but I loved Windows more for its ease of use, so I set Windows as the default OS. In fact, nobody in my family knew there was Linux on the PC, because it was only bootable via a boot diskette. Despite using Windows most of the time, I kept using Linux occasionally out of curiosity, and unknowingly started mastering it in the process.
A couple of years after my introduction to Linux, I learned of a special kind of software that would let me use Linux without setting my beige box up as a dual-boot system. That software is known as a virtual machine. I installed Connectix Virtual PC and began experimenting with various Linux flavors, often more than two at a time. Sure, carrying such a load on a Pentium III box with a maximum of 512 MB of RAM was painful, but for a geek it was a pain worth bearing. Still, it wasn't enough to make me a Linux convert, because I still thought Linux was too immature for a beginner's use. Around the same time, I introduced Linux to my family, and nobody accepted it. Yes, the heavy reliance on the CLI freaked my family members out; unlike Windows, which they could fix themselves, they were left dumbfounded whenever they faced a problem in Linux. That convinced me even more that Linux was still not a good beginner's OS. Well, perhaps I used the wrong distro, but how was I to know that the one I'm comfortable with might be too scary for others?
Fast forward a few years, and I had almost given up on being a Linux evangelist to my family. Yeah, I knew it wouldn't succeed, because I still wasn't using Linux as my main OS by then. It's not that I didn't want to, but virtualization software consumed too much of my limited system resources, even though my PC was among the most powerful of its time. Even if I had set my PC up as a dual-boot machine and dedicated all system resources to whichever OS I booted, it wouldn't have helped, because I wasn't happy with the hassle of rebooting the machine just to switch to the other OS. So I thought: why don't computer developers simplify this? My computing knowledge was pretty limited at the time. All I could think of for improvement was either making virtualization less resource-hungry or something I described as "hardware-level virtualization". The former seemed impossible, because no matter how small a footprint the virtualization software has, the overall system resources are still shared between the host OS and the guest OS. As for the latter, I thought it was ridiculous, until I read an article in another local PC magazine about so-called "hardware-assisted virtualization" in 2005, around the same time multi-core consumer CPUs emerged.
From what I understood, hardware-assisted virtualization is similar to my vision of hardware-level virtualization, where system resources are partitioned at the hardware level instead of at the software level as in traditional software virtualization. The article covered both AMD's "Pacifica" and Intel's "Vanderpool". I thought the technology I had been waiting for had arrived, but I was wrong; it was all lies. The article described a machine where we could boot into both systems at once without installing virtualization software, switch between the OSes in real time without rebooting (let's call it "double-boot" instead of dual-boot), or reload the same OS without restarting it. It sounded nice: should the current working environment crash, the loaded copy of the OS would take over, without the user even noticing. However, I still haven't seen my dream of a "double-boot" system come true, even though the technology is already available. The technology becomes useful only when virtualization software is installed, which means it still needs a host-guest relationship between the OSes, and that, I think, defeats the purpose of having hardware-level virtualization. I am highly disappointed. There was one time at a local PC expo, though, when I saw an Apple representative demonstrate switching between Mac OS X and Windows XP in real time using a certain key combination. I asked him whether any virtualization software was installed, and he answered that the Mac used only Boot Camp. I'm not sure whether that was true or just a trick, since I never really had a chance to use Windows on a Mac, but whatever the system was, I only want to see the "double-boot" system come true.
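As a side note, it's easy to check from Linux whether a CPU actually has these extensions: the kernel exposes the feature flags in /proc/cpuinfo, where "vmx" denotes Intel VT-x (the technology formerly codenamed "Vanderpool") and "svm" denotes AMD-V ("Pacifica"). A minimal check, assuming a Linux shell:

```shell
#!/bin/sh
# 'vmx' = Intel VT-x ("Vanderpool"), 'svm' = AMD-V ("Pacifica").
# Either flag in /proc/cpuinfo means the CPU supports
# hardware-assisted virtualization.
if grep -Eq 'vmx|svm' /proc/cpuinfo; then
    echo "hardware virtualization extensions present"
else
    echo "no vmx/svm flag found"
fi
```

Note that even when the flag is present, a hypervisor still has to be installed to make use of it, which is exactly the host-guest dependency complained about above.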
Would Apple be run out of business if its so-called "platform" were open and people were allowed to install Mac OS X on non-Macs? Well, contrary to the popular belief that Apple is a strong IT company (based on its current performance), and given that the Apple "platform" is not really a platform anymore since the transition to x86 (Intel) CPUs, things might not favor Apple much in that situation. In fact, Apple is not so confident that it would remain as strong as it is right now should that ever happen, which explains why it is not legally possible to clone a Mac today. Heck, if Apple were confident enough, it would have removed every kind of vendor lock-in from its products and prepared itself to compete in a completely open and fair market as a standalone brand.
Linux should learn more from Windows in another way too: use proper, corresponding icons for executable files. Executables are the programs that come with applications, together with their own icon sets. The problem is that the icons are only available in the application menus and on the desktop, so why not on the executable files as well? I remember having a hard time browsing the /usr/bin folder for the executable of Transmission (the BitTorrent client), scanning file names instead of just spotting the familiar Transmission icon. Yeah, I found it, but I can do the same thing much faster in Windows thanks to the corresponding icons on Windows applications. Besides, Windows applications are usually sorted neatly into their own program directories with proper application names (Program Files\App name), instead of all the cryptically named executables being dumped in one place. And why the heck does the system still use near-cryptic acronym names for directories, like "usr" instead of "user", "bin" instead of "binary", or "lib" instead of "library"? (Strangely enough, it manages four-letter words for some other directories, like "home" and "root".) And what the heck is "sda", actually? (Yeah, I know what "sda" is; I'm just exaggerating to show you how cryptic naming in Linux can be.) Linux has been in development for decades, yet the developers still refuse to abandon UNIX conventions? C'mon, it's just a matter of a few additional letters, so it shouldn't be that much extra work. And isn't modern Linux supposed to be a hybrid, pragmatic, standalone system instead of being heavily and strictly modeled after UNIX? It's been decades since the first version of Linux was coded, so by now it could be made less UNIX-like while remaining POSIX-compliant. Heck, even Mac OS X and Windows NT* are POSIX-compliant, yet they manage to exist as completely different systems.
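To be fair, the command line does offer quicker ways to locate a binary than eyeballing /usr/bin. These are standard POSIX-shell tricks; "trans" merely stands in for the Transmission example above:

```shell
#!/bin/sh
# Ask the shell where a command actually lives, instead of
# scanning /usr/bin by eye.
command -v sh     # prints the full path of the 'sh' binary

# List everything in /usr/bin whose name contains a substring.
# The '|| true' keeps a no-match result from counting as an error.
ls /usr/bin | grep -i 'trans' || true
```

This doesn't refute the icon complaint, but it's faster than visual scanning once you know the rough name of the program.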
In other words, modern Linux should be partially modeled after other systems as well if it wants to seize OS market domination from Windows.
*Windows NT may or may not be fully POSIX-compliant, but I won't fuss over the details. Nobody really cares whether I'm correct, because there will always be people disagreeing with me either way.
Come to think of it, Apple has not contributed to the emergence of computer geeks. Computer geeks exist mainly because people can build their own systems from scratch and freely customize them, a "pleasure" Mac users will never enjoy (unless Apple acknowledges the "Hackintosh", legalizes Mac cloning, or even allows Mac OS X on non-Macs, which is very unlikely to happen anytime soon; the possibility is almost zero).
I remember that when I was working as a temporary teacher, the Education Dept. provided teachers with Macs as teaching tools. Many teachers complained, though, especially the veterans who couldn't get used to them, so the Education Dept. stopped supplying Macs and provided other PCs instead. Later on, the Education Dept. also decided that Macs were expensive and required special maintenance, and consequently dropped them from the list.