Opinion: Computers nowadays are designed with layers upon layers upon layers of abstractions. They will be drastically simplified within 50 years.
      – BorgClown, 2009-09-21 at 20:59:13   (31 comments)

On 2009-09-21 at 21:04:48, BorgClown wrote...
I certainly don't know, but this industry looks like it's heading for a fresh start. It's hitting or foreseeing limits in several key areas.
On 2009-09-21 at 21:58:08, Lee J Haywood wrote...
For example?
On 2009-09-22 at 01:27:21, BorgClown wrote...
An example of what? Multi-fricking-layered design, simplification or limits?
On 2009-09-22 at 10:29:54, Lee J Haywood wrote...
Of the specific limits which are being reached.
On 2009-09-22 at 19:09:43, Thelevellers wrote...
Well, people are looking into light gates and quantum computing, so at some point all this tech will be redundant, but I don't think that's quite your point?
On 2009-09-22 at 20:21:32, BorgClown wrote...
On the hardware side, clock speeds and miniaturization can't go much further without turning chips into radios. If new technologies change the hardware radically, that would be a chance for a fresh start without the weight of backwards compatibility. On the software side, computers have become unreliable because so much depends on deeper layers over which the developer has little or no control. We've learned to expect occasional, unexplained software crashes and bugs. There are very stable operating systems (no, not talking about *nix), but the stability comes with a noticeable performance penalty. As personal computers get more powerful, I think they're getting ready for stricter stability controls and simpler development tools. On the user side, I don't know, but the keyboard-mouse-monitor interaction can't go on forever, although I don't see what could replace it. What I expect is something to replace them and, if needed, emulate them.
On 2009-09-23 at 15:21:18, Lee J Haywood wrote...
When computers are unreliable it's because almost no-one uses formal verification in software. http://en.wikipedia.org/wiki/Formal_verification Software performance issues are largely offset by increases in underlying hardware speeds. The problem is that efficiency improvements come in two forms. First are the minor improvements that come from optimisation, which are rarely worthwhile because they'll be negated by hardware improvements. Second are the drastic improvements that can be made by not using brain-dead methods of processing which fundamentally ignore the bottlenecks of hard drives and physical memory limits. Unfortunately, most software (particularly when bloated) suffers from the second form of inefficiency whilst the developers mistakenly think the problems will be solved by hardware improvements that only address the first category.
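A toy sketch of the difference, just to be concrete (a hypothetical WordCount class in plain Java; the input is whatever oversized log file you have to hand, and the point is the shape of the code, not the word counting):

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class WordCount {
        // Naive version: slurp the whole file into one String, then split it.
        // Works on small files, falls apart once the file outgrows physical memory,
        // and no plausible hardware improvement rescues it.
        static long countNaive(String path) throws IOException {
            StringBuilder sb = new StringBuilder();
            BufferedReader in = new BufferedReader(new FileReader(path));
            for (String line; (line = in.readLine()) != null; ) sb.append(line).append(' ');
            in.close();
            return sb.toString().trim().split("\\s+").length;
        }

        // Streaming version: one line in memory at a time, so memory use stays flat
        // no matter how big the input gets. This is the drastic, second-category fix.
        static long countStreaming(String path) throws IOException {
            long words = 0;
            BufferedReader in = new BufferedReader(new FileReader(path));
            for (String line; (line = in.readLine()) != null; ) {
                String t = line.trim();
                if (t.length() > 0) words += t.split("\\s+").length;
            }
            in.close();
            return words;
        }

        public static void main(String[] args) throws IOException {
            System.out.println(countStreaming(args[0]));
        }
    }

Both methods give the same answer; only the second keeps working when the file is ten times the size of RAM.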
On 2009-09-23 at 19:21:49, DigitalBoss wrote...
Where I work, we use VMWare 4. We just bought a new IBM 3950 that has 48 cores with 64 GB of RAM, and 12 Ethernet cards. We installed the ESX host operating system, which is really RedHat, on it and have about 20 virtual machines running on one piece of hardware. 20 virtual RedHat EL5.3 servers running on one computer. It is pretty cool. It is starting to get hard to remember if a server is a real server or a virtual server.
On 2009-09-24 at 01:59:46, BorgClown wrote...
@Lee J Haywood: I've worked mostly with minis and mainframes, where it's almost impossible to crash the OS or corrupt its memory, but even in those very stable environments I've seen many problems of the second type you describe. So yes, no amount of simplification will solve that problem. I've also seen that people who make those mistakes do it out of ignorance or laziness, so a simpler and more stringent development environment may help keep them in check.
On 2009-09-24 at 02:03:35, BorgClown wrote...
@DigitalBoss: That's what this topic is about: now that it's become common to have a virtualization layer, we keep adding more stories to the building. I'd love an OS that's designed from the ground up to partition its resources, so you could have dynamic reallocation between partitions without rebooting emulated virtual machines. I guess that's my line of work influencing me again.
On 2009-09-24 at 03:17:29, DigitalBoss wrote...
The virtualization is really cool, and it is really complicated, but if set up right, it can give you some really neat options. We have used the VMotion feature a good bit; it is cool. We also use Network Attached Storage (iSCSI volumes).
On 2009-09-24 at 03:19:27, DigitalBoss wrote...
@LeeJ: Have you ever used Eclipse? I have been using it lately on a J2EE SOAP web service project.
On 2009-09-24 at 08:51:47, Lee J Haywood wrote...
Personally I'd like to go back to a computer that is ready to use when you switch it on, rather than having to wait for it to seek all over the place trying to find fragmented files. (If you put all the files needed for booting sequentially on the disk, there's no seeking and it's about 100 times faster). The irony is that as things get more modularised they become more stable but make the software much bigger. @DigitalBoss: I don't use IDEs, I use vim for everything.
On 2009-09-24 at 14:18:50, DigitalBoss wrote...
Cool, I use vim, but not for everything.
On 2009-09-24 at 14:23:08, Melchior wrote...
@BorgClown: How much programming have you done? Correct me if I'm wrong, but the abstractions make it easier to program. Sure, you could remove the abstractions and directly interact with the hardware, but that's a horrible way to produce a program.
On 2009-09-25 at 01:13:14, BorgClown wrote...
Well, at least you folks are not on plain vi.
On 2009-09-25 at 01:34:00, BorgClown wrote...
@Melchior: I've been programming for a living for about 13 years, mostly mainframe development. As for abstractions easing development, they really do. My gripe is the seemingly unneeded abstractions over abstractions over... well, more abstractions. For example, a few months ago there was an interesting article about how the Linux sound subsystem works. The ideal layering would be something like this: Hardware <-> Sound Driver <-> Application. The funny thing is, it was like this before, when the OSS drivers were used. Now your Linux distro can have OSS drivers, ALSA drivers (which can emulate OSS and do mixing), a mixing layer like ARTS or ESD or PulseAudio (which can emulate ARTS or ESD or ALSA), or any other niche flavor. Say you use PulseAudio, which is the choice Ubuntu made. You can theoretically use your preferred emulation mode and it should automagically work, but it doesn't, and even using it explicitly doesn't give you the best results.
On 2009-09-25 at 01:55:19, BorgClown wrote...
Another pet peeve of mine: legal abstraction layers. Closed-source products that use GPL libraries need to wrap the libraries so they can disclose only the wrapper's source code and not the full application's, which introduces an unneeded abstraction layer purely because of copyright law. And what if the product includes an API? Another kind of legal abstraction is encryption and the downgrading of analog signals on DVD players. And don't get me started on development practices. If you want two servers to share an invoicing transaction, you might be in for a client-server SOA/XML/SQL cocktail just to get the transaction from one disk to another. Sometimes it's too much looping the loop. I fear it's wishful thinking to hope that computers (and IT in general) will be simplified, as everything tends to become more and more complex each year. But who knows? It might become too kludgy and crash on itself.
On 2009-09-25 at 03:11:41, DigitalBoss wrote...
I use vi when on some of our older Solaris machines; I like vim much more. I think PulseAudio will eventually get squared away, and will work well. I know, it can be a pain now.
On 2009-09-25 at 11:35:51, Lee J Haywood wrote...
XML might be good if it were easy to use, and at least it's text-based. In reality, you need the DTD to even begin to make sense of an interface and it's a nightmare when your programming language doesn't support XML parsing out of the box. It is an improvement over delimited text some of the time, but is often overused and over-complicates things.
On 2009-09-25 at 20:28:14, BorgClown wrote...
Oh god, now we also need pesticides to develop? =)
On 2009-09-25 at 20:30:40, Thelevellers wrote...
Heh, the last few comments are largely meaningless to me :) I feel foolish...
On 2009-09-25 at 20:34:47, BorgClown wrote...
Just remember to keep a can of DDT nearby. Don't let the bugs bite you.
On 2009-09-25 at 20:54:49, Lee J Haywood wrote...
DTD, not DDT. On a related note, am I the only one who can immediately say out loud dichlorodiphenyltrichloroethane every time someone mentions DDT?
On 2009-09-25 at 21:05:33, BorgClown wrote...
Is DTD even more poisonous? Does it give cancer too? Does it make you utter superlong words?
On 2009-09-25 at 21:08:38, Lee J Haywood wrote...
It just makes writing an interface to someone else's XML slightly less painful, since you can generate an XML parser directly from the DTD (in Java, at least) if you know what you're doing. http://en.wikipedia.org/wiki/Document_Type_Definition
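For the cases where generating the parser isn't an option, the stock JAXP classes will at least enforce the DTD while parsing. A rough sketch (this is plain DTD validation, not the generated-parser route, and invoice.xml here is a made-up document that declares its own DOCTYPE):

    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.xml.sax.ErrorHandler;
    import org.xml.sax.SAXParseException;

    public class DtdCheckedParse {
        public static void main(String[] args) throws Exception {
            DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
            factory.setValidating(true);  // check the document against its declared DTD
            DocumentBuilder builder = factory.newDocumentBuilder();
            builder.setErrorHandler(new ErrorHandler() {
                public void warning(SAXParseException e) { System.err.println("warning: " + e.getMessage()); }
                public void error(SAXParseException e) throws SAXParseException { throw e; }
                public void fatalError(SAXParseException e) throws SAXParseException { throw e; }
            });
            Document doc = builder.parse("invoice.xml");  // hypothetical input with a DOCTYPE declaration
            System.out.println("Root element: " + doc.getDocumentElement().getTagName());
        }
    }

Without the explicit ErrorHandler the validating parser just reports errors to stderr and carries on, which rather defeats the purpose.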
On 2009-09-25 at 21:13:18, BorgClown wrote...
Damn your British phlegm, I could argue that your hair is purple and you would calmly explain to me why it isn't.
On 2009-09-25 at 21:16:52, Lee J Haywood wrote...
Oh, hey, didn't you know that DDT is still useful stuff? Some reckon that it'd be better if it were still used to fight malaria, rather than just having a blanket worldwide ban. http://en.wikipedia.org/wiki/DDT#Criticism_of_restrictions_on_DDT_use
On 2009-09-25 at 22:09:18, Thelevellers wrote...
I am practicing saying dichlorodiphenyltrichloroethane... :)
On 2009-09-26 at 01:09:16, BorgClown wrote...
Try saying dichlorotrichloroethanediphenyl, that's DTD =P
On 2009-09-26 at 01:10:10, BorgClown wrote...
It just doesn't roll off the tongue like dichlorodiphenyltrichloroethane does.