
digitalmars.D - Re: DVCS (was Re: Moving to D)

reply Jean Crystof <news news.com> writes:
Walter Bright Wrote:

 My mobo is an ASUS M2A-VM. No graphics cards, or any other cards plugged into 
 it. It's hardly weird or wacky or old (it was new at the time I bought it to 
 install Ubuntu).

The ASUS M2A-VM has the 690G chipset. Wikipedia says: http://en.wikipedia.org/wiki/AMD_690_chipset_series#690G "AMD recently dropped support for Windows and Linux drivers made for Radeon X1250 graphics integrated in the 690G chipset, stating that users should use the open-source graphics drivers instead. The latest available AMD Linux driver for the 690G chipset is fglrx version 9.3, so all newer Linux distributions using this chipset are unsupported."

Fast forward to today: http://www.phoronix.com/scan.php?page=article&item=amd_driver_q111&num=2 The benchmark page says the only available driver for your graphics delivers only about 10-20% of the real performance. Why? ATI sucks on Linux. Don't buy ATI. Buy Nvidia instead: http://geizhals.at/a466974.html

That's the third-latest Nvidia GPU generation. How long does support last? Ubuntu 10.10 still supports every Geforce from the 10-year-old Geforce 2 up. I foretell Ubuntu 19.04 will be the last one supporting it. Use Nvidia and your problems are gone.
Jan 11 2011
next sibling parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
Did you hear that, Walter? Just buy a 500$ video card so you can watch
youtube videos on Linux. Easy. :D
Jan 11 2011
parent reply Jean Crystof <news news.com> writes:
Andrej Mitrovic Wrote:

 Did you hear that, Walter? Just buy a 500$ video card so you can watch
 youtube videos on Linux. Easy. :D

Dear Sir, did you even open the link? It's the cheapest Nvidia card I could find by googling for 30 seconds. 28.58 euros translates to about $37. I can't promise that very old Geforce chips support 1920x1200, but at least the ones compatible with his PCI Express bus work perfectly. Maybe you were trying to be funny?
Jan 11 2011
next sibling parent reply Jean Crystof <news news.com> writes:
Andrej Mitrovic Wrote:

 Notice the smiley face -> :D
 
 Yeah I didn't check the price, it's only 30$. But there's no telling
 if that would work either. 

I can tell from our hobbyist group's experience with Compiz, native Linux games, Wine, multiplatform OpenGL game development on Linux, and hardware accelerated video that all of these tasks had problems on our ATI hardware and no problems with Nvidia.
 Also, dirt cheap video cards are almost
 certainly going to cause problems. Even if the drivers worked
 perfectly, a year down the road things will start breaking down. Cheap
 hardware is cheap for a reason.

That's not true. I suggested a low end card because if he's using integrated graphics now, there's no need for high end hardware. The reason the price is lower is that cheaper cards have smaller heatsinks, fewer fans or none at all, no advanced features (SLI), low frequency cores with most shaders disabled (they've sidestepped manufacturing defects by disabling broken cores), smaller memory bandwidth, and fewer & cheaper memory modules without heatsinks. Just look at the circuit board. A high end graphics card is physically at least twice as large or even more. No wonder it costs more. The price goes up $100 just from the bigger heatsinks and fans. Claiming that low end components have a shorter lifespan is ridiculous. Why does Ubuntu 10.10 still support the cheap Geforce 2 MX then?
Jan 11 2011
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
Andrej Mitrovic wrote:
 On 1/12/11, Jean Crystof <news news.com> wrote:
 Claiming that low end components have shorter lifespan is ridiculous.

You've never had computer equipment fail on you?

I've had a lot of computer equipment. Failures I've had, ranked in order of most failures to least:

keyboards
power supplies
hard drives
fans
monitors

I've never had a CPU, memory, or mobo failure. Which is really kind of amazing. I did have a 3DFX board once, which failed after a couple years. Never bought another graphics card.

The keyboards fail so often I keep a couple spares around.

I buy cheap, bottom of the line equipment. I don't overclock them and I make sure there's plenty of airflow around the boxes.
Jan 12 2011
next sibling parent Jesse Phillips <jessekphillips+D gmail.com> writes:
Walter Bright Wrote:

 The keyboards fail so often I keep a couple spares around.
 
 I buy cheap, bottom of the line equipment. I don't overclock them and I make 
 sure there's plenty of airflow around the boxes.

Wow, I have never had a keyboard fail. I'm still using my first keyboard from 1998. Hell, I haven't even rubbed off any of the letters. I guess the only components I've had fail on me have been a hard drive and a CD/DVD drive. A monitor was about to go once.
Jan 12 2011
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
Vladimir Panteleev wrote:
 On Thu, 13 Jan 2011 05:43:27 +0200, Walter Bright 
 <newshound2 digitalmars.com> wrote:
 
 The keyboards fail so often I keep a couple spares around.

Let me guess, all cheap rubber-domes? Maybe you should have a look at some professional keyboards. Mechanical keyboards are quite durable, and feel much nicer to type on.

Yup, the $9.99 ones. They also get things spilled on them, why ruin an expensive one? <g>
Jan 12 2011
next sibling parent "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound2 digitalmars.com> wrote in message 
news:igm2um$2omg$1 digitalmars.com...
 Vladimir Panteleev wrote:
 On Thu, 13 Jan 2011 05:43:27 +0200, Walter Bright 
 <newshound2 digitalmars.com> wrote:

 The keyboards fail so often I keep a couple spares around.

Let me guess, all cheap rubber-domes? Maybe you should have a look at some professional keyboards. Mechanical keyboards are quite durable, and feel much nicer to type on.

Yup, the $9.99 ones. They also get things spilled on them, why ruin an expensive one? <g>

I've got a $6 one I've been using for years, and I frequently beat the shit out of it. And I mean literally just pounding on it, not to type, but just to beat :) With all the physical abuse I give this ultra-cheapie thing, I honestly can't believe it still works fine after all these years. "AOpen" gets my approval for keyboards :) (Heh, I actually had to turn it over to check the brand. I had no idea what it was.) I never spill anything on it, though.
Jan 13 2011
prev sibling next sibling parent reply Daniel Gibson <metalcaedes gmail.com> writes:
On 13.01.2011 06:33, Walter Bright wrote:
 Vladimir Panteleev wrote:
 On Thu, 13 Jan 2011 05:43:27 +0200, Walter Bright <newshound2 digitalmars.com>
 wrote:

 The keyboards fail so often I keep a couple spares around.

Let me guess, all cheap rubber-domes? Maybe you should have a look at some professional keyboards. Mechanical keyboards are quite durable, and feel much nicer to type on.

Yup, the $9.99 ones. They also get things spilled on them, why ruin an expensive one? <g>

There are washable keyboards, e.g. http://h30094.www3.hp.com/product/sku/5110581/mfg_partno/VF097AA Cheers, - Daniel
Jan 13 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
Daniel Gibson wrote:
 There are washable keyboards, e.g. 
 http://h30094.www3.hp.com/product/sku/5110581/mfg_partno/VF097AA

I know. But what I do works for me. I happen to like the action on the cheapo keyboards, and the key layout. I'll also throw one in my suitcase for a trip, 'cuz I hate my laptop keyboard. And I don't care if they get lost/destroyed on the trip.
Jan 13 2011
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
Andrej Mitrovic wrote:
 Lol Walter you're like me. I keep buying cheap keyboards all the time.
 I'm almost becoming one of those people that collect things all the
 time (well.. the difference being I throw the old ones in the trash).
 Right now I'm sporting this dirt-cheap Genius keyboard, I've just
 looked up the price and it's 5$. My neighbor gave it to me for free
 because he got two for some reason. You would think a 5$ keyboard
 sucks, but it's pretty sweet actually. The keys have a nice depth, and
 they're real easy to hit. The downside? They've put the freakin' sleep
 button right above the right cursor key. Now *that's* genius, Genius..
 So I had to disable sleep mode. LOL!

My preferred keyboard layout has the \ key right above the Enter key. The problem is those ^%%^&*^*&^&*^ keyboards that have the \ key somewhere else, and the Enter key is extra large and in that spot. So guess what happens? If I want to delete foo\bar.c, I type "del foo" followed by Enter. Yikes! There goes my directory contents! I've done this too many times. I freakin hate those keyboards. I always check to make sure I'm not buying one, though they seem to be most of 'em.
Jan 14 2011
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
Andrej Mitrovic wrote:
 I've found all the pieces but putting them back together
 was a nightmare. Which piece goes where with which other piece and in
 what order..

No prob. I've got some tools in the basement that will take care of that.
Jan 14 2011
prev sibling parent Daniel Gibson <metalcaedes gmail.com> writes:
On 14.01.2011 04:46, Andrej Mitrovic wrote:
 Lol Walter you're like me. I keep buying cheap keyboards all the time.
 I'm almost becoming one of those people that collect things all the
 time (well.. the difference being I throw the old ones in the trash).
 Right now I'm sporting this dirt-cheap Genius keyboard, I've just
 looked up the price and it's 5$. My neighbor gave it to me for free
 because he got two for some reason. You would think a 5$ keyboard
 sucks, but it's pretty sweet actually. The keys have a nice depth, and
 they're real easy to hit. The downside? They've put the freakin' sleep
 button right above the right cursor key. Now *that's* genius, Genius..
 So I had to disable sleep mode. LOL!

Had something like that once, too. I just removed the key from the keyboard ;)
Jan 14 2011
prev sibling parent Stanislav Blinov <blinov loniir.ru> writes:
On 14.01.2011 3:12, Nick Sabalausky wrote:
 "Walter Bright"<newshound2 digitalmars.com>  wrote in message
 news:igm2um$2omg$1 digitalmars.com...
 Vladimir Panteleev wrote:
 On Thu, 13 Jan 2011 05:43:27 +0200, Walter Bright
 <newshound2 digitalmars.com>  wrote:

 The keyboards fail so often I keep a couple spares around.

some professional keyboards. Mechanical keyboards are quite durable, and feel much nicer to type on.

expensive one?<g>

out of it. And I mean literally just pounding on it, not to type, but just to beat :) With all the physical abuse I give this ultra-cheapie thing, I honestly can't believe it still works fine after all these years. "AOpen" gets my approval for keyboards :) (Heh, I actually had to turn it over to check the brand. I had no idea what it was.) I never spill anything on it, though.

got tired and started to tear. It served me for more than 10 years in everything from gaming to writing university reports to programming (pounding, dropping and spilling/sugaring included). And it was an old one - without all those annoying win-keys and stuff. Never got another one that would last at least a year. One of the recent ones died taking with it a USB port on the mobo (or maybe it was vice-versa, I don't know).
Jan 14 2011
prev sibling next sibling parent Sean Kelly <sean invisibleduck.org> writes:
Walter Bright Wrote:
 
 I buy cheap, bottom of the line equipment. I don't overclock them and I make 
 sure there's plenty of airflow around the boxes.

I don't overclock any more after a weird experience I had overclocking an Athlon ages ago. It ran fine except that unzipping something always failed with a CRC error. Before that I expected that an overclocked CPU would either work or fail spectacularly. I'm not willing to risk data silently being corrupted in the background, particularly when even mid-range CPUs these days are more than enough for nearly everything.
Jan 13 2011
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound2 digitalmars.com> wrote in message 
news:iglsge$2evs$1 digitalmars.com...
 Andrej Mitrovic wrote:
 On 1/12/11, Jean Crystof <news news.com> wrote:
 Claiming that low end components have shorter lifespan is ridiculous.

You've never had computer equipment fail on you?

I've had a lot of computer equipment. Failures I've had, ranked in order of most failures to least: keyboards power supplies hard drives fans monitors I've never had a CPU, memory, or mobo failure. Which is really kind of amazing. I did have a 3DFX board once, which failed after a couple years. Never bought another graphics card. The keyboards fail so often I keep a couple spares around. I buy cheap, bottom of the line equipment. I don't overclock them and I make sure there's plenty of airflow around the boxes.

My failure list from most to least would be this:

1. power supply / printer
2. optical drive / floppies (the disks, not the drives)
3. hard drive
4. monitor / mouse / fan

Never really had problems with anything else as far as I can remember. I had a few 3dfx cards back in the day and never had the slightest bit of trouble with any of them.

I used to go through a ton of power supplies until I finally stopped buying the cheap ones. Printers kept giving me constant trouble, but the fairly modern HP I have now seems to work ok (although the OEM software/driver is complete and utter shit, but then OEM software usually is.)
Jan 13 2011
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
Nick Sabalausky wrote:
 My failure list from most to least would be this:
 
 1. power supply / printer
 2. optical drive / floppies (the disks, not the drives)
 3. hard drive
 4. monitor / mouse / fan
 
 Never really had probems with anything else as far as I can remember. I had 
 a few 3dfx cards back in the day and never had the slightest bit of trouble 
 with any of them.
 
 I used to go through a ton of power supplies until I finally stopped buying 
 the cheap ones. Printers kept giving me constant trouble, but the fairly 
 modern HP I have now seems to work ok (although the OEM software/driver is 
 complete and utter shit, but then OEM software usually is.)

My printer problems ended (mostly) when I finally spent the bux and got a laser printer. The (mostly) bit is because neither Windows nor Ubuntu support an HP 2300 printer. Sigh.
Jan 13 2011
parent reply Robert Clipsham <robert octarineparrot.com> writes:
On 14/01/11 03:53, Walter Bright wrote:
 My printer problems ended (mostly) when I finally spent the bux and got
 a laser printer. The (mostly) bit is because neither Windows nor Ubuntu
 support an HP 2300 printer. Sigh.

Now this surprises me - printing has been the least painful thing I've ever encountered; it's the one area where I'd say Linux excels. In OS X or Windows, if I want to access my networked printer there are at least 5 clicks involved - on Linux there was a grand total of 0 - it detected my printer and installed it with no intervention from me, I just clicked print and it worked. Guess that's the problem with hardware though: it could have a few thousand good reviews and you could still manage to get something you run into endless issues with!

--
Robert
http://octarineparrot.com/
Jan 14 2011
parent reply Daniel Gibson <metalcaedes gmail.com> writes:
On 14.01.2011 16:48, Robert Clipsham wrote:
 On 14/01/11 03:53, Walter Bright wrote:
 My printer problems ended (mostly) when I finally spent the bux and got
 a laser printer. The (mostly) bit is because neither Windows nor Ubuntu
 support an HP 2300 printer. Sigh.

Now this surprises me, printing has been the least painless thing I've ever encountered - it's the one area I'd say Linux excels. In OS X or Windows if I want to access my networked printer there's at least 5 clicks involved - on Linux there was a grand total of 0 - it detected my printer and installed it with no intervention from me, I just clicked print and it worked. Guess that's the problem with hardware though, it could have a few thousand good reviews and you could still manage to get something you run into endless issues with!

This really depends on your printer: some have good Linux support and some don't. Postscript support (mostly seen in better laser printers) is probably the most painless (just supply a PPD - if CUPS doesn't have one for your printer already - and you're done). Many newer inkjet printers have Linux support too, but many of them need a proprietary library from the vendor to work. But a few years ago it was a lot worse, especially with cheap inkjets. Many supported only GDI printing, which naturally is best supported on Windows (GDI is a Windows interface).
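(For anyone wondering whether their model is already covered, a minimal sketch of checking CUPS' driver list from a shell - the grep pattern here is just an example model name:)

  # List the driver/PPD entries CUPS knows about and filter for a model
  lpinfo -m | grep -i "laserjet 2300"

  # If a match shows up, the printer can usually be added through the CUPS
  # web interface (http://localhost:631) or with lpadmin, no vendor blob needed.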
Jan 14 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
Daniel Gibson wrote:
 But a few years ago it was a lot worse, especially with cheap inkjets. 
 Many supported only GDI printing which naturally is best supported on 
 Windows (GDI is a windows interface).

Yeah, but I bought an *HP* laserjet, because I thought everyone supported them well. Turns out I probably have the only orphaned HP LJ model.
Jan 14 2011
next sibling parent reply Daniel Gibson <metalcaedes gmail.com> writes:
On 14.01.2011 20:50, Walter Bright wrote:
 Daniel Gibson wrote:
 But a few years ago it was a lot worse, especially with cheap inkjets.
 Many supported only GDI printing which naturally is best supported on
 Windows (GDI is a windows interface).

Yeah, but I bought an *HP* laserjet, because I thought everyone supported them well. Turns out I probably have the only orphaned HP LJ model.

Yes, the HP Laserjets usually have really good support with PCL and sometimes even Postscript. You said you've got a HP (Laserjet?) 2300? On http://www.openprinting.org/printer/HP/HP-LaserJet_2300 it says that printer "works perfectly" and supports PCL 5e, PCL6 and Postscript level 3. Generally http://www.openprinting.org/printers is a really good page to see if a printer has Linux-support and where to get drivers etc. Cheers, - Daniel
Jan 14 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
Daniel Gibson wrote:
 On 14.01.2011 20:50, Walter Bright wrote:
 Daniel Gibson wrote:
 But a few years ago it was a lot worse, especially with cheap inkjets.
 Many supported only GDI printing which naturally is best supported on
 Windows (GDI is a windows interface).

Yeah, but I bought an *HP* laserjet, because I thought everyone supported them well. Turns out I probably have the only orphaned HP LJ model.

Yes, the HP Laserjets usually have really good support with PCL and sometimes even Postscript. You said you've got a HP (Laserjet?) 2300?

Yup. Do you want a picture? <g>
 On http://www.openprinting.org/printer/HP/HP-LaserJet_2300 it says that 
 printer "works perfectly" and supports PCL 5e, PCL6 and Postscript level 3.

Nyuk nyuk nyuk
 Generally http://www.openprinting.org/printers is a really good page to 
 see if a printer has Linux-support and where to get drivers etc.

Jan 14 2011
parent reply Daniel Gibson <metalcaedes gmail.com> writes:
On 14.01.2011 22:54, Walter Bright wrote:
 Daniel Gibson wrote:
 On 14.01.2011 20:50, Walter Bright wrote:
 Daniel Gibson wrote:
 But a few years ago it was a lot worse, especially with cheap inkjets.
 Many supported only GDI printing which naturally is best supported on
 Windows (GDI is a windows interface).

Yeah, but I bought an *HP* laserjet, because I thought everyone supported them well. Turns out I probably have the only orphaned HP LJ model.

Yes, the HP Laserjets usually have really good support with PCL and sometimes even Postscript. You said you've got a HP (Laserjet?) 2300?

Yup. Do you want a picture? <g>

No, I believe you ;)
 On http://www.openprinting.org/printer/HP/HP-LaserJet_2300 it says that
 printer "works perfectly" and supports PCL 5e, PCL6 and Postscript level 3.

Nyuk nyuk nyuk

The hplip versions in Ubuntu 8.04 and in Ubuntu 9.10 support your printer - I don't know about your version, because Ubuntu doesn't list it anymore, but I'd be surprised if it didn't support it as well ;) hplip's docs say that the printer is supported when connected via USB or "Network or JetDirect" (but not the parallel port - but the printer probably doesn't have one anyway). It may be that Ubuntu doesn't install hplip (HP's driver for all kinds of printers - including the LaserJet 2300 ;)) by default. That could be fixed by "sudo apt-get install hplip hpijs-ppds" and then trying to add the printer again (if there's no voodoo to do that automatically). Cheers, - Daniel
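(A minimal sketch of that, assuming the hplip package in your release ships its hp-setup tool - recent versions do, older packages may differ:)

  # Install HP's Linux printing stack plus the hpijs PPDs
  sudo apt-get install hplip hpijs-ppds

  # Let hplip detect and add the printer; -i runs the text-mode wizard
  sudo hp-setup -i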
Jan 14 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
Daniel Gibson wrote:
 The hplip version in Ubuntu 8.04 and in Ubuntu 9.10 support your printer 
 - I don't know about your version,

8.10
 because Ubuntu doesn't list it 
 anymore, but I'd be surprised if it didn't support it as well ;)
 hplips docs say that the printer is supported when connected via USB or 
 "Network or JetDirect" (but not Parallel port, but probably the printer 
 doesn't have one).

The HP 2300D is parallel port. (The "D" stands for duplex, an extra cost option on the 2300.)
 It may be that Ubuntu doesn't install hplip (HPs driver for all kinds of 
 printers - including the LaserJet 2300 ;)) by default.
 That could be fixed by
 "sudo apt-get install hplip hpijs-ppds"
 and then trying to add the printer again (if there's no Voodoo to do 
 that automatically).

How I installed the printer is I just, more or less at random, said it was a different HP laserjet, and then it worked. The duplex doesn't work, though, nor any of the other variety of special features it has.
Jan 14 2011
next sibling parent Daniel Gibson <metalcaedes gmail.com> writes:
On 15.01.2011 01:23, Walter Bright wrote:
 Daniel Gibson wrote:
 The hplip version in Ubuntu 8.04 and in Ubuntu 9.10 support your printer - I
 don't know about your version,

8.10
 because Ubuntu doesn't list it anymore, but I'd be surprised if it didn't
 support it as well ;)
 hplips docs say that the printer is supported when connected via USB or
 "Network or JetDirect" (but not Parallel port, but probably the printer
 doesn't have one).

The HP 2300D is parallel port. (The "D" stands for duplex, an extra cost option on the 2300.)

HP says[1] it has also got USB, if their docs are correct for your version (and the USB port is just somehow hidden) it may be worth a try :) Also, http://www.openprinting.org/printer/HP/HP-LaserJet_2300 links (under "Postscript") a PPD that supports duplex. CUPS supports adding a printer and providing a custom PPD. (In my experience Postscript printers do support the parallel port, you can even just cat a PS file to /dev/lp0 if it has the right format) However, *maybe* performance (especially for pictures) is not as great as with HPs own PCL. As a Bonus: There are generic Postscript driver for Windows as well, so with that PPD your Duplex may even work on Windows :)
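(For example, a rough sketch of adding it to CUPS with that downloaded PPD - the queue name, device URI, and file name below are placeholders, not something tested on this exact model:)

  # Create/replace a queue named lj2300 on the parallel port, using the Postscript PPD
  sudo lpadmin -p lj2300 -E -v parallel:/dev/lp0 -P HP-LaserJet_2300-Postscript.ppd

  # Print a test page double-sided to check whether duplex actually works
  lp -d lj2300 -o sides=two-sided-long-edge testpage.ps

  # Crude check that the printer speaks Postscript at all:
  #   cat testpage.ps > /dev/lp0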
 It may be that Ubuntu doesn't install hplip (HPs driver for all kinds of
 printers - including the LaserJet 2300 ;)) by default.
 That could be fixed by
 "sudo apt-get install hplip hpijs-ppds"
 and then trying to add the printer again (if there's no Voodoo to do that
 automatically).

How I installed the printer is I just, more or less at random, said it was a different HP laserjet, and then it worked. The duplex doesn't work, though, nor any of the other variety of special features it has.

Maybe CUPS didn't list the LJ2300 as supported because (according to that outdated list I found in the Ubuntu 8.04 driver) it isn't supported at the parport. [1] http://h10010.www1.hp.com/wwpc/us/en/sm/WF06b/18972-236251-236263-14638-f51-238800-238808-238809.html
Jan 14 2011
prev sibling parent reply Jean Crystof <news news.com> writes:
Walter Bright Wrote:

 Daniel Gibson wrote:
 The hplip version in Ubuntu 8.04 and in Ubuntu 9.10 support your printer 
 - I don't know about your version,

8.10

This thread sure was interesting. Now what I'd like is for Walter to please try an Nvidia Geforce on Linux if the problems don't go away by upgrading his Ubuntu. Unfortunately that particular ATI graphics driver is constantly changing and it might take 1-2 years to make it work in Ubuntu: http://www.phoronix.com/vr.php?view=15614

The second thing is upgrading the Ubuntu. Complaining about how Linux sucks while using Ubuntu 8.10 is like complaining about how Windows 7 sucks when you're actually using Windows ME or 98. These have totally different software stacks, just to name a few differences:

openoffice 2 vs 3
ext3 vs ext4 filesystem
usb2 vs usb3 nowadays
kde3 vs kde4 (kde4 in 8.10 was badly broken)
gcc 4.3 vs 4.5
old style graphics drivers vs kernel mode setting
faster bootup
thousands of new features and drivers
tens of thousands of bugfixes

and so on. It makes no sense to discuss "Linux". It's constantly changing.
Jan 14 2011
next sibling parent Jean Crystof <news news.com> writes:
Jean Crystof Wrote:

 Walter Bright Wrote:
 
 Daniel Gibson wrote:
 The hplip version in Ubuntu 8.04 and in Ubuntu 9.10 support your printer 
 - I don't know about your version,

8.10

This thread sure was interesting. Now what I'd like is if Walter could please try a Nvidia Geforce on Linux if the problems won't go away by upgrading his Ubuntu. Unfortunately that particular Ati graphics driver is constantly changing and it might take 1-2 years to make it work in Ubuntu: http://www.phoronix.com/vr.php?view=15614 The second thing is upgrading the Ubuntu. Telling how Linux sucks by using Ubuntu 8.10 is like telling how Windows 7 sucks when you're actually using Windows ME or 98. These have totally different software stacks, just to name a few: openoffice 2 vs 3 ext3 vs ext4 filesystem usb2 vs usb3 nowadays kde3 vs kde4 (kde4 in 8.10 was badly broken) gcc 4.3 vs 4.5 old style graphics drivers vs kernel mode switch faster bootup thousands of new features and drivers tens of thousands of bugfixes and so on. It makes no sense to discuss "Linux". It's constantly changing.

I tried to find the package lists for Ubuntu 8.10 (intrepid), but they're not online anymore. Using it is *totally* crazy. Do apt-get update and apt-get upgrade even work anymore? The Ubuntu idea was to provide a simple graphical tool for dist-upgrades. If I had designed it, I wouldn't even let you log in before upgrading. No wonder DMD binaries depended on legacy libraries some time ago. The compiler author should be using VAX or something similar like all dinosaurs do.
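(For what it's worth, a rough sketch of what upgrading an EOL release like 8.10 involves - the old-releases mirror step is needed because the normal archives drop EOL versions, and the exact sequence may vary:)

  # Point apt at the mirror that still hosts end-of-life releases
  sudo sed -i 's/\(archive\|security\).ubuntu.com/old-releases.ubuntu.com/g' /etc/apt/sources.list

  # Bring the installed release fully up to date first
  sudo apt-get update && sudo apt-get upgrade

  # Then step through the release upgrades, one release at a time
  sudo do-release-upgrade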
Jan 14 2011
prev sibling next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/14/11 6:48 PM, Jean Crystof wrote:
 Walter Bright Wrote:

 Daniel Gibson wrote:
 The hplip version in Ubuntu 8.04 and in Ubuntu 9.10 support your printer
 - I don't know about your version,

8.10

This thread sure was interesting. Now what I'd like is if Walter could please try a Nvidia Geforce on Linux if the problems won't go away by upgrading his Ubuntu. Unfortunately that particular Ati graphics driver is constantly changing and it might take 1-2 years to make it work in Ubuntu: http://www.phoronix.com/vr.php?view=15614 The second thing is upgrading the Ubuntu. Telling how Linux sucks by using Ubuntu 8.10 is like telling how Windows 7 sucks when you're actually using Windows ME or 98. These have totally different software stacks, just to name a few: openoffice 2 vs 3 ext3 vs ext4 filesystem usb2 vs usb3 nowadays kde3 vs kde4 (kde4 in 8.10 was badly broken) gcc 4.3 vs 4.5 old style graphics drivers vs kernel mode switch faster bootup thousands of new features and drivers tens of thousands of bugfixes and so on. It makes no sense to discuss "Linux". It's constantly changing.

The darndest thing is I have Ubuntu 8.10 on my laptop with KDE 3.5 on top... and love it. But this all is exciting - I think I'll make the switch, particularly now that I have a working backup solution. Andrei
Jan 14 2011
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
Jean Crystof wrote:
 The second thing is upgrading the Ubuntu. Telling how Linux sucks by using
Ubuntu 8.10

To be fair, it was about the process of upgrading in place to Ubuntu 8.10 that sucked. It broke everything, and made me leery of upgrading again.
Jan 14 2011
prev sibling parent Daniel Gibson <metalcaedes gmail.com> writes:
On 14.01.2011 21:16, Russel Winder wrote:
 On Fri, 2011-01-14 at 11:50 -0800, Walter Bright wrote:
 Daniel Gibson wrote:
 But a few years ago it was a lot worse, especially with cheap inkjets.
 Many supported only GDI printing which naturally is best supported on
 Windows (GDI is a windows interface).

Yeah, but I bought an *HP* laserjet, because I thought everyone supported them well. Turns out I probably have the only orphaned HP LJ model.

I have an HP LJ 4000N and whilst it is perfectly functional, printing systems have decided it is too old to work with properly -- this is a Windows, Linux and Mac OS X problem. Backward compatibility is a three-edged sword.

hplip on Linux should support it when connected via Parallel Port (but, according to a maybe outdated list, not USB or Network/Jetdirect). See also http://www.openprinting.org/printer/HP/HP-LaserJet_4000 :-)
Jan 14 2011
prev sibling next sibling parent Daniel Gibson <metalcaedes gmail.com> writes:
On 14.01.2011 15:21, retard wrote:

 PSUs: Never ever buy the cheap models.

Yup, one should never cheap out on PSUs. Also cheap PSUs usually are less efficient.
 Optical drives: Number 1 reason for breakage, I forget to close the tray
 and kick it off! Currently I don't use internal optical drives anymore.
 There's one external dvd burner. I rarely use it. And it's safe from my
 feet on the table :D

If you don't trash them yourself (:P) optical drives sometimes fail because a rubber band in it that rotates the disk (or something) becomes brittle or worn after some years. These can usually be replaced.
 Hard drives: these always fail, sooner or later. There's nothing you can
 do except RAID and backups (labs.google.com/papers/disk_failures.pdf).
 I've successfully terminated all (except those in use) hard drives so far
 by using them normally.

Not kicking/hitting your PC and cooling them appropriately helps, but in the end modern HDDs die anyway. I've had older (4GB) HDDs run for over 10 years, much of the time even 24/7, without failing.
 Mice: I've always bought Logitech mice. NEVER had any failures. The
 current one is MX 510 (USB). Previous ones used the COM port. The bottom
 of the MX510 shows signs of hardcore use, but the internal parts haven't
 fallen off yet and the LED "eye" works :-D

I often had mouse buttons failing in Logitech mice. Sometimes I removed the corresponding switches in the mouse and soldered in one from another old cheap mouse, which fixed it until it broke again.. Now I'm using Microsoft mice and they seem more reliable so far.
 Fans: If you want reliability, buy fans with ball bearings. They make
 more noise than sleeve bearings. I don't believe in expensive high
 quality fans. Sure, there are differences in the airflow and noise
 levels, but the max reliability won't be any better. The normal PC stores
 don't sell any fans with industrial quality bearings. Like I said before,
 remember to replace the oil http://www.dansdata.com/fanmaint.htm -- I
 still have high quality fans from the 1980s in 24/7 use. The only problem
 is, I couldn't anticipate how much the power consumption grows. The old
 ones are 40-80 mm fans. Now (at least gaming) computers have 120mm or
 140mm or even bigger fans.

Thanks for the tip :-)
Jan 14 2011
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
Thanks for the fan info. I'm going to go oil my fans!
Jan 14 2011
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
"retard" <re tard.com.invalid> wrote in message 
news:igpm5t$26so$1 digitalmars.com...
 Now, I've also bought Canon, HP, and Epson inkjets. What can I say.. The
 printers are cheap. The ink is expensive. They're slow, and result looks
 like shit (not very photo-realistic) compared to the online printing
 services. AND I've "broken" about 8 of them in 15 years. It's way too
 expensive to start buying spare parts (e.g. when the dry ink gets stuck
 in the ink "tray" in Canon printers). Nowadays I print photos using some
 online service. The inkjet printer quality still sucks IMO. Don't buy
 them.

A long time ago we got, for free, an old Okidata printer that some school or company or something was getting rid of. It needed a new, umm, something really really expensive (I forget offhand), so there was a big black streak across each page. And it didn't do color. But I absolutely loved that printer. Aside from the black streak, everything about it worked flawlessly every time. *Never* jammed once, blazing fast, good quality. Used that thing for years.

Eventually we did need something that could print without that streak and we went through a ton of inkjets. Every one of them was total shit until about 2 or 3 years ago we got an HP Photosmart C4200 printer/scanner combo, which isn't as good as the old Okidata, but it's the only inkjet I've ever used that I'd consider "not shit". The software/drivers for it, though, still fall squarely into the "pure shit" category. Oh well... Maybe there are Linux drivers for it that are better...
 PSUs: Never ever buy the cheap models. There's a list of bad
 manufacturers in the net. They make awful shit.

Another problem is that, as places like Sharky Extreme and Tom's Hardware found out while testing, it seems to be common practice for PSU manufacturers to outright lie about the wattage.
 Optical drives: Number 1 reason for breakage, I forget to close the tray
 and kick it off!

Very much related to that: I truly, truly *hate* all software that decides it makes sense to eject the tray directly. And even worse: OSes not having a universal setting for "Disable *all* software-triggered ejects". That option should be standard and default. I've seriously tried to learn how to make Windows rootkits *just* so I could hook into the right dll/function and disable it system-wide once-and-for-all. (Never actually got anywhere with it though, and eventually just gave up.)
 Hard drives: these always fail, sooner or later. There's nothing you can
 do except RAID and backups

And SMART monitors: I've had a total of two HDDs fail, and in both cases I really lucked out. The first one was in my Mac, but it was after I was already getting completely fed up with OS X and Apple, so I didn't really care much - I was mostly back on Windows again by that point. The second failure just happened to be the least important of the three HDDs in my system. I was still pretty upset about it though, so it was a big wakeup call: I *will not* have a primary system anymore that doesn't have a SMART monitoring program, with temperature readouts, always running. And yes, it can't always predict a failure, but sometimes it can, so IMO there's no good reason not to have it. That's actually one of the things I don't like about Linux: nothing like that seems to exist there. Sure, there's a cmd line program you can poll, but that doesn't remotely cut it.
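(For reference, the cmd line program in question is smartctl from smartmontools; a minimal sketch of polling it, and of letting its smartd daemon do the watching instead, with example device names:)

  # One-off health check and attribute dump (including temperature) for one drive
  sudo smartctl -H /dev/sda
  sudo smartctl -A /dev/sda

  # Or have the smartd daemon watch all drives and complain on trouble / high temps:
  # add this line to /etc/smartd.conf, then restart smartd
  DEVICESCAN -a -m root -W 4,45,55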
 Monitors: The CRTs used to break every 3-5 years. Even the high quality
 Sony monitors :-| I've used TFT panels since 2003. The inverter of the
 first 14" TFT broke after 5 years of use. Three others are still working,
 after 1-6 years of use.

I still use CRTs (one big reason being that I hate the idea of only being able to use one resolution), and for a long time I've always had either a dual-monitor setup or dual systems with one monitor on each, so I've had a lot of monitors. But I've only ever had *one* CRT go bad, and I definitely use them for more than 5 years. Also, FWIW, I'm convinced that Sony is *not* as good as people generally think. Maybe they were in the 70's or 80's, I don't know, but they're frequently no better than average. It's common for their high end DVD players to have problems or limitations that the cheap bargain-brands (like CyberHome) don't have. I had an expensive portable Sony CD player and the buttons quickly crapped out, rendering it unusable (not that I care anymore since I have a Toshiba Gigabeat F with the Rockbox firmware - iPod be damned). The PS2 was reigning champ for "most unreliable video game hardware in history" until the 360 stole the title by a landslide. And I've *never* found a pair of Sony headphones that sounded even *remotely* as good as a pair from Koss of comparable price and model. Sony is the Buick/Cadillac/Oldsmobile of consumer electronics, *not* the Lexus/Benz as most people seem to think.
 Mice: I've always bought Logitech mice. NEVER had any failures. The
 current one is MX 510 (USB). Previous ones used the COM port. The bottom
 of the MX510 shows signs of hardcore use, but the internal parts haven't
 fallen off yet and the LED "eye" works :-D

MS and Logitech mice are always the best. I've never come across any other brand that put out anything but garbage (that does include Apple, except that in Apple's case it's because of piss-poor design rather than the piss-poor engineering of all the other non-MS/Logitech brands). I've been using this Logitech Trackball for probably over five years and I absolutely love it: http://www.amazon.com/Logitech-Trackman-Wheel-Optical-Silver/dp/B00005NIMJ/ In fact, I have two of them. The older one has been starting to get a bad connection between the cord and the trackball, but that's probably my fault. And heck, the MS mouse my mom uses has a left button that's been acting up, so nothing's perfect no matter what brand. But MS/Logitech are definitely still worlds ahead of anyone else. (Which is kind of weird because, along with keyboards, mice are the *only* hardware I trust MS with. Every other piece of MS hardware either has reliability problems or, in the case of all their game controllers going all the way back to the Sidewinders in the pre-XBox days, a completely worthless D-Pad.)
Jan 15 2011
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/15/11 2:23 AM, Nick Sabalausky wrote:
 I still use CRTs (one big reason being that I hate the idea of only being
 able to use one resolution)

I'd read some post of Nick and think "hmm, now that's a guy who follows only his own beat" but this has to take the cake. From here on, I wouldn't be surprised if you found good reasons to use whale fat powered candles instead of lightbulbs. Andrei
Jan 15 2011
next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
news:igt2pl$2u6e$1 digitalmars.com...
 On 1/15/11 2:23 AM, Nick Sabalausky wrote:
 I still use CRTs (one big reason being that I hate the idea of only being
 able to use one resolution)

I'd read some post of Nick and think "hmm, now that's a guy who follows only his own beat" but this has to take the cake. From here on, I wouldn't be surprised if you found good reasons to use whale fat powered candles instead of lightbulbs.

Heh :) Well, I can spend no money and stick with my current 21" CRT that already suits my needs (that I only paid $25 for in the first place), or I can spend a hundred or so dollars to lose the ability to have a decent looking picture at more than one resolution and then say "Gee golly whiz! That sure is a really flat panel!!". Whoop-dee-doo. And popularity and trendiness are just non-issues.
Jan 15 2011
next sibling parent reply Daniel Gibson <metalcaedes gmail.com> writes:
On 16.01.2011 04:33, Jonathan M Davis wrote:
 On Saturday 15 January 2011 19:11:26 Nick Sabalausky wrote:
 "Andrei Alexandrescu"<SeeWebsiteForEmail erdani.org>  wrote in message
 news:igt2pl$2u6e$1 digitalmars.com...

 On 1/15/11 2:23 AM, Nick Sabalausky wrote:
 I still use CRTs (one big reason being that I hate the idea of only
 being able to use one resolution)

I'd read some post of Nick and think "hmm, now that's a guy who follows only his own beat" but this has to take the cake. From here on, I wouldn't be surprised if you found good reasons to use whale fat powered candles instead of lightbulbs.

Heh :) Well, I can spend no money and stick with my current 21" CRT that already suits my needs (that I only paid $25 for in the first place), or I can spend a hundred or so dollars to lose the ability to have a decent looking picture at more than one resolution and then say "Gee golly whiz! That sure is a really flat panel!!". Whoop-dee-doo. And popularity and trendyness are just non-issues.

Why would you _want_ more than one resolution? What's the use case? I'd expect that you'd want the highest resolution that you could get and be done with it. - Jonathan M Davis

Maybe for games (if your PC isn't fast enough for full resolution or the game doesn't support it).. but that is no problem at all: flatscreens can interpolate other resolutions, and while the picture may not be good enough for text (like when programming), it *is* good enough for games on decent flatscreens. For non-games usage I never had the urge to change the resolution of my flatscreens. And I really prefer them to any CRT I've ever used. OTOH when he has a good CRT (high resolution, good refresh rate) there may be little reason to replace it, as long as it's working.. apart from the high power consumption and the size maybe. Cheers, - Daniel
Jan 15 2011
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
Daniel Gibson wrote:
 OTOH when he has a good CRT (high resolution, good refresh rate) there 
 may be little reason to replace it, as long as it's working.. apart from 
 the high power consumption and the size maybe.

The latter two issues loomed large for me. I was very glad to upgrade to an LCD.
Jan 15 2011
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Daniel Gibson" <metalcaedes gmail.com> wrote in message 
news:igtq08$2m1c$1 digitalmars.com...
 On 16.01.2011 04:33, Jonathan M Davis wrote:
 On Saturday 15 January 2011 19:11:26 Nick Sabalausky wrote:
 "Andrei Alexandrescu"<SeeWebsiteForEmail erdani.org>  wrote in message
 news:igt2pl$2u6e$1 digitalmars.com...

 On 1/15/11 2:23 AM, Nick Sabalausky wrote:
 I still use CRTs (one big reason being that I hate the idea of only
 being able to use one resolution)

I'd read some post of Nick and think "hmm, now that's a guy who follows only his own beat" but this has to take the cake. From here on, I wouldn't be surprised if you found good reasons to use whale fat powered candles instead of lightbulbs.

Heh :) Well, I can spend no money and stick with my current 21" CRT that already suits my needs (that I only paid $25 for in the first place), or I can spend a hundred or so dollars to lose the ability to have a decent looking picture at more than one resolution and then say "Gee golly whiz! That sure is a really flat panel!!". Whoop-dee-doo. And popularity and trendyness are just non-issues.

Why would you _want_ more than one resolution? What's the use case? I'd expect that you'd want the highest resolution that you could get and be done with it. - Jonathan M Davis

Maybe for games (if your PC isn't fast enough for full resolution or the game doesn't support it).. but that is no problem at all: flatscreens can interpolate other resolutions and while the picture may not be good enough for text (like when programming) and stuff it *is* good enough for games on decent flatscreens.

There's two reasons it's good for games:

1. Like you indicated, to get a better framerate. Framerate is more important in most games than resolution.

2. For games that aren't really designed for multiple resolutions, particularly many 2D ones, and especially older games (which are often some of the best, but they look like shit on an LCD).
 For non-games-usage I never had the urge to change the resolution of my 
 flatscreens. And I really prefer them to any CRT I've ever used.

For non-games, just off-the-top-of-my-head: Bumping up to a higher resolution can be good when dealing with images, or whenever you're doing anything that could use more screen real-estate at the cost of smaller UI elements. And CRTs are more likely to go up to really high resolutions than non-CRTs. For instance, 1600x1200 is common on even the low-end CRT monitors (and that was true even *before* televisions started going HD - which is *still* lower-rez than 1600x1200).

Yea, you can get super high resolution non-CRTs, but they're much more expensive. And even then, you lose the ability to do any real desktop work at a more typical resolution. Which is bad because for many things I do want to limit my resolution so the UI isn't overly-small. And yea, there are certain things you can do to scale up the UI, but I've never seen an OS, Win/Lin/Mac, that actually handled that sort of thing reasonably well. So CRTs give you all that flexibility at a sensible price.

And if I'm doing some work on the computer, and it *is* set at a sensible resolution that works for both the given monitor and the task at hand, I've never noticed a real improvement with LCD versus CRT. Yea, it is a *little* bit better, but I've never noticed any difference while actually *doing* anything on a computer: only when I stop and actually look for differences.

Also, it can be good when mirroring the display to TV-out or, better yet, using the "cinema mode" where any video-playback is sent fullscreen to the TV (which I'll often do), because those things tend to not work very well when the monitor isn't reduced to the same resolution as the TV.
 OTOH when he has a good CRT (high resolution, good refresh rate) there may 
 be little reason to replace it, as long as it's working.. apart from the 
 high power consumption and the size maybe.

I've actually compared the rated power consumption between CRTs and LCDs of similar size and was actually surprised to find that there was little, if any, real difference at all on the sets I compared.
Jan 15 2011
next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Nick Sabalausky" <a a.a> wrote in message 
news:igttbt$16hu$1 digitalmars.com...
 OTOH when he has a good CRT (high resolution, good refresh rate) there 
 may be little reason to replace it, as long as it's working.. apart from 
 the high power consumption and the size maybe.

I've actually compared the rated power consumpsion between CRTs and LCDs of similar size and was actually surprised to find that there was little, if any, real difference at all on the sets I compared.

As for size, well, I have enough space, so at least for me that's a non-issue.
Jan 15 2011
parent Adam Ruppe <destructionator gmail.com> writes:
I stuck with my CRT for a long time. What I really liked about it
was the bright colors. I've never seen an LCD match that.

But, my CRT started to give out. It'd go to a bright line in the
middle and darkness everywhere else at random. It started doing
it just every few hours, then it got to the point where it'd do
it every 20 minutes or so.

I found if I give it a nice pound on the side, it'd go back to
normal for a while. I was content for that for months.

... but the others living with me weren't. *WHACK* OH MY GOD
JUST BUY A NEW ONE ALREADY!


So I gave in and looked for a replacement CRT with the same specs.
But couldn't find one. I gave in and bought an LCD in the same
price range (~$150) with the same resolution (I liked what I had!)

Weighed less, left room on the desk for my keyboard, and best of all,
I haven't had to hit it yet. But colors haven't looked quite the same
since and VGA text mode just looks weird. Alas.
Jan 16 2011
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
Nick Sabalausky wrote:
 And CRTs are more likely to go up to really
 high resolutions than non-CRTs. For instance, 1600x1200 is common on even 
 the low-end CRT monitors (and that was true even *before* televisions 
 started going HD - which is *still* lower-rez than 1600x1200).
 
 Yea, you can get super high resolution non-CRTs, but they're much more 
 expensive.

I have 1900x1200 on LCD, and I think it was around $325. It's a Hanns-G thing, from Amazon. Of course, I don't use it for games. I got thoroughly burned out on that when I had a job in college developing/testing them.
Jan 15 2011
prev sibling next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/15/11 10:47 PM, Nick Sabalausky wrote:
 "Daniel Gibson"<metalcaedes gmail.com>  wrote in message
 news:igtq08$2m1c$1 digitalmars.com...
 There's two reasons it's good for games:

 1. Like you indicated, to get a better framerate. Framerate is more
 important in most games than resolution.

 2. For games that aren't really designed for multiple resolutions,
 particularly many 2D ones, and especially older games (which are often some
 of the best, but they look like shit on an LCD).

It's a legacy issue. Clearly everybody except you is using CRTs for gaming and whatnot. Therefore graphics hardware producers and game vendors are doing what it takes to adapt to a fixed resolution.
 For non-games-usage I never had the urge to change the resolution of my
 flatscreens. And I really prefer them to any CRT I've ever used.

For non-games, just off-the-top-of-my-head: Bumping up to a higher resolution can be good when dealing with images, or whenever you're doing anything that could use more screen real-estate at the cost of smaller UI elements. And CRTs are more likely to go up to really high resolutions than non-CRTs. For instance, 1600x1200 is common on even the low-end CRT monitors (and that was true even *before* televisions started going HD - which is *still* lower-rez than 1600x1200). Yea, you can get super high resolution non-CRTs, but they're much more expensive. And even then, you lose the ability to do any real desktop work at a more typical resolution. Which is bad because for many things I do want to limit my resolution so the UI isn't overly-small. And yea, there are certian things you can do to scale up the UI, but I've never seen an OS, Win/Lin/Mac, that actually handled that sort of thing reasonably well. So CRTs give you all that flexibility at a sensible price.

It's odd how everybody else can put up with LCDs for all kinds of work.
 And if I'm doing some work on the computer, and it *is* set at a sensible
 resolution that works for both the given monitor and the task at hand, I've
 never noticed a real impromevent with LCD versus CRT. Yea, it is a *little*
 bit better, but I've never noticed any difference while actually *doing*
 anything on a computer: only when I stop and actually look for differences.

Meanwhile, you are looking at a gamma gun shooting atcha.
 Also, it can be good when mirroring the display to TV-out or, better yet,
 using the "cinema mode" where any video-playback is sent fullscreen to the
 TV (which I'll often do), because those things tend to not work very well
 when the monitor isn't reduced to the same resolution as the TV.


 OTOH when he has a good CRT (high resolution, good refresh rate) there may
 be little reason to replace it, as long as it's working.. apart from the
 high power consumption and the size maybe.

I've actually compared the rated power consumpsion between CRTs and LCDs of similar size and was actually surprised to find that there was little, if any, real difference at all on the sets I compared.

Absolutely. There's a CRT brand that consumes surprisingly close to an LCD. It's called "Confirmation Bias". Andrei
Jan 16 2011
next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
news:igvhj9$mri$1 digitalmars.com...
 On 1/15/11 10:47 PM, Nick Sabalausky wrote:
 There's two reasons it's good for games:

 1. Like you indicated, to get a better framerate. Framerate is more
 important in most games than resolution.

 2. For games that aren't really designed for multiple resolutions,
 particularly many 2D ones, and especially older games (which are often 
 some
 of the best, but they look like shit on an LCD).

It's a legacy issue. Clearly everybody except you is using CRTs for gaming and whatnot. Therefore graphics hardware producers and game vendors are doing what it takes to adapt to a fixed resolution.

Wow, you really seem to be taking a lot of this personally.

First, I assume you meant "...everybody except you is using non-CRTs..."

Second, how exactly is the modern-day work of graphics hardware producers and game vendors that you speak of going to affect games from more than a few years ago? What?!? You're still watching movies that were filmed in the 80's?!? Dude, you need to upgrade!!!
 It's odd how everybody else can put up with LCDs for all kinds of work.

Strawman. I never said anything remotely resembling "LCDs are unusable." What I've said is that 1. They have certain benefits that get overlooked, and 2. Why should *I* spend the money to replace something that already works fine for me?
 And if I'm doing some work on the computer, and it *is* set at a sensible
 resolution that works for both the given monitor and the task at hand, 
 I've
 never noticed a real impromevent with LCD versus CRT. Yea, it is a 
 *little*
 bit better, but I've never noticed any difference while actually *doing*
 anything on a computer: only when I stop and actually look for 
 differences.

Meanwhile, you are looking at a gamma gun shooting atcha.

You can't see anything at all without electromagnetic radiation shooting into your eyeballs.
 I've actually compared the rated power consumpsion between CRTs and LCDs 
 of
 similar size and was actually surprised to find that there was little, if
 any, real difference at all on the sets I compared.

Absolutely. There's a CRT brand that consumes surprisingly close to an LCD. It's called "Confirmation Bias".

I'm pretty sure I did point out the limitations of my observation: "...on all the sets I compared". And it's pretty obvious I wasn't undertaking a proper extensive study. There's no need for sarcasm.
Jan 16 2011
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/11 2:07 PM, Nick Sabalausky wrote:
 "Andrei Alexandrescu"<SeeWebsiteForEmail erdani.org>  wrote in message
 news:igvhj9$mri$1 digitalmars.com...
 On 1/15/11 10:47 PM, Nick Sabalausky wrote:
 There's two reasons it's good for games:

 1. Like you indicated, to get a better framerate. Framerate is more
 important in most games than resolution.

 2. For games that aren't really designed for multiple resolutions,
 particularly many 2D ones, and especially older games (which are often
 some
 of the best, but they look like shit on an LCD).

It's a legacy issue. Clearly everybody except you is using CRTs for gaming and whatnot. Therefore graphics hardware producers and game vendors are doing what it takes to adapt to a fixed resolution.

Wow, you really seem to be taking a lot of this personally.

Not at all!
 First, I asume you meant "...everybody except you is using non-CRTs..."

 Second, how exacty is the modern-day work of graphics hardware producers and
 game vendors that you speak of going to affect games from more than a few
 years ago? What?!? You're still watching movies that were filmed in the
 80's?!? Dude, you need to upgrade!!!

You have a good point if playing vintage games is important to you.
 It's odd how everybody else can put up with LCDs for all kinds of work.

Strawman. I never said anything remotely resembling "LCDs are unusable." What I've said is that 1. They have certain benefits that get overlooked,

The benefits of CRTs are not being overlooked. They are insignificant or illusory. If they were significant, CRTs would still be in significant use. Donning a flat panel is not a display of social status. Most people need computers to get work done, and they'd use CRTs if CRTs would have them do better work. A 30" 2560x1280 monitor is sitting on my desk. (My employer bought it for me without asking; I "only" had a 26". They thought making me more productive at the cost of a monitor is simple business sense.) My productivity would be seriously impaired if I replaced either monitor with even the best CRT out there.
 and 2. Why should *I* spend the money to replace something that
 already works fine for me?

If it works for you, fine. I doubt you wouldn't be more productive with a larger monitor. But at any rate entering money as an essential part of the equation is (within reason) misguided. This is your livelihood, your core work. Save on groceries, utilities, cars, luxury... but don't "save" on what impacts your real work.
 And if I'm doing some work on the computer, and it *is* set at a sensible
 resolution that works for both the given monitor and the task at hand,
 I've
 never noticed a real improvement with LCD versus CRT. Yea, it is a
 *little*
 bit better, but I've never noticed any difference while actually *doing*
 anything on a computer: only when I stop and actually look for
 differences.

Meanwhile, you are looking at a gamma gun shooting atcha.

You can't see anything at all without electromagnetic radiation shooting into your eyeballs.

Nonono. Gamma = electrons. CRT monitors have what's literally called a gamma gun. It's aimed straight at your eyes.
 Absolutely. There's a CRT brand that consumes surprisingly close to an
 LCD. It's called "Confirmation Bias".

I'm pretty sure I did point out the limitations of my observation: "...on all the sets I compared". And it's pretty obvious I wasn't undertaking a proper extensive study. There's no need for sarcasm.

There is. It would take anyone two minutes of online research to figure out that your comparison is wrong. Andrei
Jan 16 2011
prev sibling next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/11 1:38 PM, Andrei Alexandrescu wrote:
 On 1/15/11 10:47 PM, Nick Sabalausky wrote:
 "Daniel Gibson"<metalcaedes gmail.com> wrote in message
 news:igtq08$2m1c$1 digitalmars.com...
 There's two reasons it's good for games:

 1. Like you indicated, to get a better framerate. Framerate is more
 important in most games than resolution.

 2. For games that aren't really designed for multiple resolutions,
 particularly many 2D ones, and especially older games (which are often
 some
 of the best, but they look like shit on an LCD).

It's a legacy issue. Clearly everybody except you is using CRTs for gaming and whatnot.

s/is using/is not using/ Andrei
Jan 16 2011
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
Andrei Alexandrescu wrote:
 Meanwhile, you are looking at a gamma gun shooting atcha.

I always worried about that. Nobody actually found anything wrong, but still.
Jan 16 2011
next sibling parent "Nick Sabalausky" <a a.a> writes:
"Andrej Mitrovic" <andrej.mitrovich gmail.com> wrote in message 
news:mailman.652.1295210795.4748.digitalmars-d puremagic.com...
 With CRTs I could spend a few hours in front of the PC, but after that
 my eyes would get really tired and I'd have to take a break. Since I
 switched to LCDs I've never had this problem anymore, I could spend a
 day staring at screen if I wanted to. Of course, it's still best to
 take some time off regardless of the screen type.

I use a light-on-dark color scheme. Partly because I like the way it looks, but also partly because it's easier on my eyes. If I were using a scheme with blazing-white everywhere, I can imagine a CRT might be a bit harsh.
 Anyway.. how about that Git thing, then? :D

I'd been holding on to SVN for a while, but that discussion did convince me to give DVCSes an honest try (haven't gotten around to it yet though, but plan to).
Jan 16 2011
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
Andrej Mitrovic wrote:
 With CRTs I could spend a few hours in front of the PC, but after that
 my eyes would get really tired and I'd have to take a break. Since I
 switched to LCDs I've never had this problem anymore, I could spend a
 day staring at screen if I wanted to. Of course, it's still best to
 take some time off regardless of the screen type.

I need reading glasses badly, but fortunately not for reading a screen. I never had eye fatigue problems with it. I did buy a 28" LCD for my desktop, which is so nice that I can no longer use my laptop screen for dev. :-(
 Anyway.. how about that Git thing, then? :D

We'll be moving dmd, phobos, druntime, and the docs to GitHub shortly. The accounts are set up, it's just a matter of getting the svn repositories moved and figuring out how it all works. I know very little about git and GitHub, but the discussions about it here and elsewhere online have thoroughly convinced me (and the other devs) that this is the right move for D.
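For anyone curious about the mechanics, here is a minimal sketch of how an svn-to-git import can be done with git-svn; the URLs and repository names below are placeholders, not the actual D infrastructure, and the --stdlayout flag assumes the usual trunk/branches/tags layout:

    # one-time import of the svn history (placeholder URL, standard layout assumed)
    git svn clone https://svn.example.org/phobos --stdlayout phobos-git
    cd phobos-git

    # push the imported history to an empty GitHub repository (placeholder URL)
    git remote add origin git@github.com:example/phobos.git
    git push -u origin master

Whether this preserves everything one needs (tags, author mapping) depends on the repository, so treat it as a starting point rather than the actual migration procedure.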
Jan 16 2011
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
Jonathan M Davis wrote:
 That will make it _much_ easier to make check-ins while working on other 
 stuff in parallel.

Yes. And there's the large issue that being on github simply makes contributing to the D project more appealing to a wide group of excellent developers.
Jan 16 2011
parent reply Daniel Gibson <metalcaedes gmail.com> writes:
Am 17.01.2011 06:12, schrieb Walter Bright:
 Jonathan M Davis wrote:
 That will make it _much_ easier to make check-ins while working on
 other stuff in parallel.

Yes. And there's the large issue that being on github simply makes contributing to the D project more appealing to a wide group of excellent developers.

How will the licensing issue (forks of the dmd backend are only allowed with your permission) be solved?
Jan 16 2011
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
Daniel Gibson wrote:
 How will the licensing issue (forks of the dmd backend are only allowed 
 with your permission) be solved?

It shouldn't be a problem as long as those forks are for the purpose of developing patches to the main branch, as is done now in svn. I view it like people downloading the source from digitalmars.com. To use the back end to develop a separate compiler, set oneself up as a distributor of dmd, incorporate it into some other product, etc., please ask for permission. Basically, anyone using it has to agree not to sue Symantec or Digital Mars, and conform to: http://www.digitalmars.com/download/dmcpp.html
Jan 16 2011
parent reply Robert Clipsham <robert octarineparrot.com> writes:
On 17/01/11 06:25, Walter Bright wrote:
 Daniel Gibson wrote:
 How will the licensing issue (forks of the dmd backend are only
 allowed with your permission) be solved?

It shouldn't be a problem as long as those forks are for the purpose of developing patches to the main branch, as is done now in svn. I view it like people downloading the source from digitalmars.com. Using the back end to develop a separate compiler, or set oneself up as a distributor of dmd, incorporate it into some other product, etc., please ask for permission. Basically, anyone using it has to agree not to sue Symantec or Digital Mars, and conform to: http://www.digitalmars.com/download/dmcpp.html

Speaking of which, are you able to remove the "The Software was not designed to operate after December 31, 1999" sentence at all, or does that require you to mess around contacting Symantec? Not that anyone reads it, but it is kind of off-putting to see that over a decade later for anyone who does bother reading it :P -- Robert http://octarineparrot.com/
Jan 17 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
Robert Clipsham wrote:
 Speaking of which, are you able to remove the "The Software was not 
 designed to operate after December 31, 1999" sentence at all, or does 
 that require you to mess around contacting symantec? Not that anyone 
 reads it, it is kind of off putting to see that over a decade later 
 though for anyone who bothers reading it :P

Consider it like the DNA we all still carry around for fish gills!
Jan 17 2011
next sibling parent Robert Clipsham <robert octarineparrot.com> writes:
On 17/01/11 20:29, Walter Bright wrote:
 Robert Clipsham wrote:
 Speaking of which, are you able to remove the "The Software was not
 designed to operate after December 31, 1999" sentence at all, or does
 that require you to mess around contacting symantec? Not that anyone
 reads it, it is kind of off putting to see that over a decade later
 though for anyone who bothers reading it :P

Consider it like the DNA we all still carry around for fish gills!

I don't know about you, but I take full advantage of my gills! -- Robert http://octarineparrot.com/
Jan 17 2011
prev sibling parent Robert Clipsham <robert octarineparrot.com> writes:
On 18/01/11 01:09, Brad Roberts wrote:
 On Mon, 17 Jan 2011, Walter Bright wrote:

 Robert Clipsham wrote:
 Speaking of which, are you able to remove the "The Software was not designed
 to operate after December 31, 1999" sentence at all, or does that require
 you to mess around contacting symantec? Not that anyone reads it, it is kind
 of off putting to see that over a decade later though for anyone who bothers
 reading it :P

Consider it like the DNA we all still carry around for fish gills!

In all seriousness, the backend license makes dmd look very strange. It threw the lawyers I consulted for a serious loop. At a casual glance it gives the impression of software that's massively out of date and out of touch with the real world. I know that updating it would likely be very painful, but is it just painful or impossible? Is it something that money could solve? I'd chip in to a fund to replace the license with something less... odd. Later, Brad

Make that a nice open source license and I'm happy to throw some money at it too :> -- Robert http://octarineparrot.com/
Jan 18 2011
prev sibling parent Brad Roberts <braddr slice-2.puremagic.com> writes:
On Mon, 17 Jan 2011, Walter Bright wrote:

 Robert Clipsham wrote:
 Speaking of which, are you able to remove the "The Software was not designed
 to operate after December 31, 1999" sentence at all, or does that require
 you to mess around contacting symantec? Not that anyone reads it, it is kind
 of off putting to see that over a decade later though for anyone who bothers
 reading it :P

Consider it like the DNA we all still carry around for fish gills!

In all seriousness, the backend license makes dmd look very strange. It threw the lawyers I consulted for a serious loop. At a casual glance it gives the impression of software that's massively out of date and out of touch with the real world. I know that updating it would likely be very painful, but is it just painful or impossible? Is it something that money could solve? I'd chip in to a fund to replace the license with something less... odd. Later, Brad
Jan 17 2011
prev sibling parent Johann MacDonagh <johann.macdonagh..no spam..gmail.com> writes:
On 1/16/2011 5:07 PM, Walter Bright wrote:
 We'll be moving dmd, phobos, druntime, and the docs to Github shortly.
 The accounts are set up, it's just a matter of getting the svn
 repositories moved and figuring out how it all works.

 I know very little about git and github, but the discussions about it
 here and elsewhere online have thoroughly convinced me (and the other
 devs) that this is the right move for D.

I'm sure you've already seen this, but Pro Git is probably the best guide for git. http://progit.org/book/ Once you understand what a commit is, what a tree is, what a merge is, what a branch is, etc... it's actually really simple (Chapter 9 in Pro Git). Definitely a radical departure from svn, and a good one for D.
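As a small illustration of those concepts, assuming any git checkout is at hand (commands only, output omitted; nothing here is specific to the D repositories):

    git cat-file -p HEAD              # the commit object: tree, parent(s), author, message
    git cat-file -p 'HEAD^{tree}'     # the tree object: the blobs and sub-trees it points to
    git log --oneline --graph --all   # branches and merges drawn as a graph

Seeing that a branch is just a named pointer to a commit makes most of the day-to-day commands much less mysterious.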
Jan 18 2011
prev sibling parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
On 16/01/2011 19:38, Andrei Alexandrescu wrote:
 On 1/15/11 10:47 PM, Nick Sabalausky wrote:
 "Daniel Gibson"<metalcaedes gmail.com>  wrote in message
 news:igtq08$2m1c$1 digitalmars.com...
 There's two reasons it's good for games:

 1. Like you indicated, to get a better framerate. Framerate is more
 important in most games than resolution.

 2. For games that aren't really designed for multiple resolutions,
 particularly many 2D ones, and especially older games (which are often
 some
 of the best, but they look like shit on an LCD).

It's a legacy issue. Clearly everybody except you is using CRTs for gaming and whatnot. Therefore graphics hardware producers and game vendors are doing what it takes to adapt to a fixed resolution.

Actually, not entirely true, although not for the reasons of old games. Some players of hardcore twitch FPS games (like Quake), especially professional players, still use CRTs, due to the near-zero input lag that LCDs, although having improved in that regard, are still not able to match exactly. But other than that, I really see no reason to stick with CRTs vs a good LCD, yeah. -- Bruno Medeiros - Software Engineer
Jan 28 2011
prev sibling next sibling parent "Nick Sabalausky" <a a.a> writes:
"retard" <re tard.com.invalid> wrote in message 
news:igv3p3$2n4k$2 digitalmars.com...
 Sat, 15 Jan 2011 23:47:09 -0500, Nick Sabalausky wrote:

 Yea, you can get super high resolution non-CRTs, but they're much more
 expensive. And even then, you lose the ability to do any real desktop
 work at a more typical resolution. Which is bad because for many things
 I do want to limit my resolution so the UI isn't overly-small. And yea,
 there are certain things you can do to scale up the UI, but I've never
 seen an OS, Win/Lin/Mac, that actually handled that sort of thing
 reasonably well. So CRTs give you all that flexibility at a sensible
 price.

You mean DPI settings?

I just mean uniformly scaled UI elements. For instance, you can usually adjust a UI's font size, but the results tend to be like selectively scaling up just the nose, mouth and hands on a picture of a human. And then parts of it still end up too small. And, especially on Linux, those sorts of settings don't always get obeyed by all software anyway.
 Also, it can be good when mirroring the display to TV-out or, better
 yet, using the "cinema mode" where any video-playback is sent fullscreen
 to the TV (which I'll often do), because those things tend to not work
 very well when the monitor isn't reduced to the same resolution as the
 TV.

But my TV happily accepts 1920x1080? Sending the same digital signal to both works fine here. YMMV

Mine's an SD...which I suppose I have to defend...Never felt a need to replace it. Never cared whether or not I could see athletes' drops of sweat or individual blades of grass. And I have a lot of SD content that's never going to magically change to HD, and that stuff looks far better on an SD set anyway than on any HD set I've ever seen no matter what fancy delay-introducing filter it had (except maybe the CRT HDTVs that don't exist anymore). Racing games, FPSes and Pikmin are the *only* things for which I have any interest in HD (since, for those, it actually matters if you're able to see small things in the distance). But then I'd be spending money (which I'm very short on), and making all my other SD content look worse, *AND* since I'm into games, it would be absolutely essential to get one without any input->display lag, which is very difficult since 1. The manufacturers only seem to care about movies and 2. From what I've seen, they never seem to actually tell you how much lag there is. So it's a big bother, costs money, and has drawbacks. Maybe someday (like when I get rich and the downsides improve) but not right now.
Jan 16 2011
prev sibling parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
On 16/01/2011 04:47, Nick Sabalausky wrote:
 There's two reasons it's good for games:

 1. Like you indicated, to get a better framerate. Framerate is more
 important in most games than resolution.

This reason was valid, at least at some point in time; for me it actually held me back from transitioning from CRTs to LCDs for a while. But nowadays screen resolutions have stabilized (stopped increasing, in terms of DPI), and graphics cards have improved in power enough that you can play nearly any game at the LCD's native resolution with max framerate, so no worries with this anymore (you may have to tone down the graphics settings a bit in some cases, but that is fine with me).
 2. For games that aren't really designed for multiple resolutions,
 particularly many 2D ones, and especially older games (which are often some
 of the best, but they look like shit on an LCD).

Well, if your LCD supports it, you have the option of not expanding the screen if the output resolution is not the native one. How good or bad that would be depends on the game, I guess. I actually did this some years ago on certain (recent) games for some time, using only 1024x768 of the 1280x1024 native, to get a better framerate. It's not a problem for me for old games, since most of the ones I occasionally play are played in a console emulator. DOS games unfortunately were very hard to play correctly in XP in the first place (especially with SoundBlaster), so they're not a concern for me. PS: here's a nice thread for anyone looking to purchase a new LCD: http://forums.anandtech.com/showthread.php?t=39226 It explains a lot of things about LCD technology, and ranks several LCDs according to intended usage (office work, hardcore gaming, etc.). -- Bruno Medeiros - Software Engineer
Jan 28 2011
prev sibling next sibling parent Lutger Blijdestijn <lutger.blijdestijn gmail.com> writes:
Nick Sabalausky wrote:

 "Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message
 news:igt2pl$2u6e$1 digitalmars.com...
 On 1/15/11 2:23 AM, Nick Sabalausky wrote:
 I still use CRTs (one big reason being that I hate the idea of only
 being able to use one resolution)

I'd read some post of Nick and think "hmm, now that's a guy who follows only his own beat" but this has to take the cake. From here on, I wouldn't be surprised if you found good reasons to use whale fat powered candles instead of lightbulbs.

Heh :) Well, I can spend no money and stick with my current 21" CRT that already suits my needs (that I only paid $25 for in the first place), or I can spend a hundred or so dollars to lose the ability to have a decent looking picture at more than one resolution and then say "Gee golly whiz! That sure is a really flat panel!!". Whoop-dee-doo. And popularity and trendyness are just non-issues.

Actually, nearly all LCDs below the $600-$800 price point (TN panels) have quite inferior color reproduction compared to el cheapo CRTs, at any resolution.
Jan 16 2011
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/15/11 9:11 PM, Nick Sabalausky wrote:
 "Andrei Alexandrescu"<SeeWebsiteForEmail erdani.org>  wrote in message
 news:igt2pl$2u6e$1 digitalmars.com...
 On 1/15/11 2:23 AM, Nick Sabalausky wrote:
 I still use CRTs (one big reason being that I hate the idea of only being
 able to use one resolution)

I'd read some post of Nick and think "hmm, now that's a guy who follows only his own beat" but this has to take the cake. From here on, I wouldn't be surprised if you found good reasons to use whale fat powered candles instead of lightbulbs.

Heh :) Well, I can spend no money and stick with my current 21" CRT that already suits my needs (that I only paid $25 for in the first place),

My last CRT was a 19" from Nokia, 1600x1200, top of the line. Got it for free under the condition that I pick it up myself from a porch, which is as far as its previous owner could move it. I was seriously warned to come with a friend to take it. It weighed 86 lbs. That all worked for me: I was a poor student and happened to have a huge desk at home. I didn't think twice about buying a different monitor when I moved across the country... I wonder how much your 21" CRT weighs.
 or I
 can spend a hundred or so dollars to lose the ability to have a decent
 looking picture at more than one resolution and then say "Gee golly whiz!
 That sure is a really flat panel!!". Whoop-dee-doo. And popularity and
 trendyness are just non-issues.

I think your eyes are more important than your ability to fiddle with resolution. Besides, this whole changing the resolution thing is a consequence of using crappy software. What you want is set the resolution to the maximum and do the rest in software. And guess what - at their maximum, CRT monitors suck compared to flat panels. Heck this is unbelievable... I spend time on the relative merits of flat panels vs. CRTs. I'm outta here. Andrei
Jan 16 2011
parent reply "Nick Sabalausky" <a a.a> writes:
"Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
news:igvc0k$c3o$1 digitalmars.com...
 On 1/15/11 9:11 PM, Nick Sabalausky wrote:
 Heh :)  Well, I can spend no money and stick with my current 21" CRT that
 already suits my needs (that I only paid $25 for in the first place),

My last CRT was a 19" from Nokia, 1600x1200, top of the line. Got it for free under the condition that I pick it up myself from a porch, which is as far as its previous owner could move it. I was seriously warned to come with a friend to take it. It weighed 86 lbs. That all worked for me: I was a poor student and happened to have a huge desk at home. I didn't think twice about buying a different monitor when I moved across the country... I wonder how much your 21" CRT weighs.

No clue. It's my desktop system, so I haven't had a reason to pick up the monitor in years. And the desk seems to handle it just fine.
 or I
 can spend a hundred or so dollars to lose the ability to have a decent
 looking picture at more than one resolution and then say "Gee golly whiz!
 That sure is a really flat panel!!". Whoop-dee-doo. And popularity and
 trendyness are just non-issues.

I think your eyes are more important than your ability to fiddle with resolution.

Everyone always seems to be very vague on that issue. Given real, reliable, non-speculative evidence that CRTs are significantly (and not just negligibly) worse on the eyes, I could certainly be persuaded to replace my CRT when I can actually afford to. Now I'm certainly not saying that such evidence isn't out there, but FWIW, I have yet to come across it.
 Besides, this whole changing the resolution thing is a consequence of 
 using crappy software. What you want is set the resolution to the maximum 
 and do the rest in software. And guess what - at their maximum, CRT 
 monitors suck compared to flat panels.

Agreed, but show me an OS that actually *does* handle that reasonably well. XP doesn't. Win7 doesn't. Ubuntu 9.04 and Kubuntu 10.10 don't. (And I'm definitely not going back to OSX, I've had my fill of that.)
 Heck this is unbelievable... I spend time on the relative merits of flat 
 panels vs. CRTs. I'm outta here.

You're really taking this hard, aren't you?
Jan 16 2011
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/11 2:22 PM, Nick Sabalausky wrote:
 "Andrei Alexandrescu"<SeeWebsiteForEmail erdani.org>  wrote in message
 news:igvc0k$c3o$1 digitalmars.com...
 I think your eyes are more important than your ability to fiddle with
 resolution.

Everyone always seems to be very vague on that issue. Given real, reliable, non-speculative evidence that CRTs are significantly (and not just negligibly) worse on the eyes, I could certainly be persuaded to replace my CRT when I can actually afford to. Now I'm certainly not saying that such evidence isn't out there, but FWIW, I have yet to come across it.

Recent research on the dangers of CRTs to the eyes is difficult to find, for the same reason that recent research on the dangers of steam locomotives is difficult to find. Still, look at what Google suggests when you type "CRT monitor e".
 Besides, this whole changing the resolution thing is a consequence of
 using crappy software. What you want is set the resolution to the maximum
 and do the rest in software. And guess what - at their maximum, CRT
 monitors suck compared to flat panels.

Agreed, but show me an OS that actually *does* handle that reasonably well. XP doesn't. Win7 doesn't. Ubuntu 9.04 and Kubuntu 10.10 don't. (And I'm definitely not going back to OSX, I've had my fill of that.)

I'm happy with the way Ubuntu and OSX handle it.
 Heck this is unbelievable... I spend time on the relative merits of flat
 panels vs. CRTs. I'm outta here.

You're really taking this hard, aren't you?

Apparently I got drawn back into the discussion :o). I'm not as intense about this as one might think, but I do find it surprising that this discussion could possibly occur ever since about 2005. Andrei
Jan 16 2011
next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
news:igvlf8$v20$1 digitalmars.com...
 Apparently I got drawn back into the discussion :o). I'm not as intense 
 about this as one might think, but I do find it surprising that this 
 discussion could possibly occur ever since about 2005.

FWIW, when computer monitors regularly use the pixel density that the newer iPhones currently have, I'd imagine that would easily compensate for scaling artifacts on non-native resolutions enough to get me to find and get one with a small enough delay (assuming I had the $ and needed a new monitor).
Jan 16 2011
parent Walter Bright <newshound2 digitalmars.com> writes:
Nick Sabalausky wrote:
 FWIW, when computer monitors regularly use the pixel density that the newer 
 iPhones currently have, then I'd imagine that would easily compensate for 
 scaling artifacts on non-native resultions enough to get me to find and get 
 one with a small enough delay (assuming I had the $ and needed a new 
 monitor).

I bought the iPod with the retina display. That gizmo has done the impossible - converted me into an Apple fanboi. I absolutely love that display. The weird thing is set it next to an older iPod with the lower res display. They look the same. But I find I can read the retina display without reading glasses, and it's much more fatiguing to do that with the older one. Even though they look the same! You can really see the difference if you look at both using a magnifying glass. I can clearly see the screen door even on my super-dee-duper 1900x1200 monitor, but not at all on the iPod. I've held off on buying an iPad because I want one with a retina display, too (and the camera for video calls).
Jan 16 2011
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
news:igvlf8$v20$1 digitalmars.com...
 On 1/16/11 2:22 PM, Nick Sabalausky wrote:
 "Andrei Alexandrescu"<SeeWebsiteForEmail erdani.org>  wrote in message
 news:igvc0k$c3o$1 digitalmars.com...
 I think your eyes are more important than your ability to fiddle with
 resolution.

Everyone always seems to be very vague on that issue. Given real, reliable, non-speculative evidence that CRTs are significantly (and not just negligibly) worse on the eyes, I could certainly be persuaded to replace my CRT when I can actually afford to. Now I'm certainly not saying that such evidence isn't out there, but FWIW, I have yet to come across it.

Finding recent research on dangers of CRTs on eyes is difficult to find for the same reason finding recent research on the dangers of steam locomotives. Still, look at what Google thinks when you type "CRT monitor e".

It's not as clear-cut as you may think. One of the first results for "CRT monitor eye": http://www.tomshardware.com/forum/52709-3-best-eyes Keep in mind, too, that the vast majority of the reports of CRTs being significantly worse either have no backing references or are so anecdotal and vague that it's impossible to distinguish them from the placebo effect. And there are other variables that rarely get mentioned, like whether they happen to be looking at a CRT with a bad refresh rate or brightness/contrast set too high. I'm not saying that CRTs are definitely as good as or better than LCDs on the eyes, I'm just saying it doesn't seem quite as clear as so many people assume it to be.
Jan 16 2011
parent Walter Bright <newshound2 digitalmars.com> writes:
Nick Sabalausky wrote:
 Keep in mind too, that the vast majority of the reports of CRTs being 
 significantly worse are either have no backing references or are so 
 anecdotal and vague that it's impossible to distinguish from the placebo 
 effect. And there's other variables that rarely get mentioned, like whether 
 they happen to be looking at a CRT with a bad refresh rate or 
 brightness/contrast set too high.

My CRTs would gradually get fuzzier over time. It was so slow you didn't notice until you set them next to a new one.
Jan 16 2011
prev sibling parent "Nick Sabalausky" <a a.a> writes:
"retard" <re tard.com.invalid> wrote in message 
news:ih0b1t$g2g$3 digitalmars.com...
 For example used 17" TFTs cost less than $40.

Continuing to use my 21" CRT costs me nothing.
 Even the prices aren't very competitive. I only remember that all refresh
 rates below 85 Hz caused me headache and eye fatigue. You can't use the
 max resolution at 60 Hz for very long.

I run mine no lower than 85 Hz. It's about 100Hz at the moment. And I never need to run it at the max rez for long. It's just nice to be able to bump it up now and then when I want to. Then it goes back down. And yet people feel the need to bitch about me liking that ability.
 Why should *I* spend the money to replace something that already

You might get more things done by using a bigger screen. Maybe get some money to buy better equipment and stop complaining.

You've got to be kidding me...*other* people start giving *me* crap about what *I* choose to use, and you try to tell me *I'm* the one that needs to stop complaining? I normally try very much to avoid direct personal comments and only attack the arguments not the arguer, but seriously, what the hell is wrong with your head that you could even think of such an enormously idiotic thing to say? Meh, I'm not going to bother with the rest...
Jan 16 2011
prev sibling next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Saturday 15 January 2011 19:11:26 Nick Sabalausky wrote:
 "Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message
 news:igt2pl$2u6e$1 digitalmars.com...
 
 On 1/15/11 2:23 AM, Nick Sabalausky wrote:
 I still use CRTs (one big reason being that I hate the idea of only
 being able to use one resolution)

I'd read some post of Nick and think "hmm, now that's a guy who follows only his own beat" but this has to take the cake. From here on, I wouldn't be surprised if you found good reasons to use whale fat powered candles instead of lightbulbs.

Heh :) Well, I can spend no money and stick with my current 21" CRT that already suits my needs (that I only paid $25 for in the first place), or I can spend a hundred or so dollars to lose the ability to have a decent looking picture at more than one resolution and then say "Gee golly whiz! That sure is a really flat panel!!". Whoop-dee-doo. And popularity and trendyness are just non-issues.

Why would you _want_ more than one resolution? What's the use case? I'd expect that you'd want the highest resolution that you could get and be done with it. - Jonathan M Davis
Jan 15 2011
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
For games? I just switch to software rendering. I get almost the same
quality as a CRT on low resolutions. It's still not perfect, but it's
close.

Soo.. what are you playing that needs low resolutions and high
framerates, Nick? Quake? :D
Jan 15 2011
prev sibling next sibling parent retard <re tard.com.invalid> writes:
Sat, 15 Jan 2011 23:47:09 -0500, Nick Sabalausky wrote:

 Bumping up to a higher resolution can be good when dealing with images,
 or whenever you're doing anything that could use more screen real-estate
 at the cost of smaller UI elements. And CRTs are more likely to go up to
 really high resolutions than non-CRTs. For instance, 1600x1200 is common
 on even the low-end CRT monitors (and that was true even *before*
 televisions started going HD - which is *still* lower-rez than
 1600x1200).

The standard resolution for new flat panels has been 1920x1080 or 1920x1200 for a long time now. The panel size has slowly improved from 12-14" to 21.5" and 24", the price has gone down to about $110-120. Many of the applications have been tuned for 1080p. When I abandoned CRTs, the most common size was 17" or 19". Those monitors indeed supported resolutions up to 1600x1200 or more. However, the best resolution was about 1024x768 or 1280x1024 for 17" monitors and 1280x1024 or a step up for 19" monitors. I also had one 22" or 23" Sony monitor which had the optimal resolution of 1600x1200 or at most one step bigger. It's much less than what the low-end models offer now. It's hard to believe you're using anything larger than 1920x1200 because the legacy graphics cards don't support very high resolutions, especially via DVI. For example, I recently noticed a top of the line Geforce 6 card only supports resolutions up to 2048x1536 at 85 Hz. Guess how it works with a 30" Cinema Display HD at 2560x1600. Another thing is subpixel antialiasing. You can't really do it without a TFT panel and digital video output.
 Yea, you can get super high resolution non-CRTs, but they're much more
 expensive. And even then, you lose the ability to do any real desktop
 work at a more typical resolution. Which is bad because for many things
 I do want to limit my resolution so the UI isn't overly-small. And yea,
 there are certian things you can do to scale up the UI, but I've never
 seen an OS, Win/Lin/Mac, that actually handled that sort of thing
 reasonably well. So CRTs give you all that flexibility at a sensible
 price.

You mean DPI settings?
 Also, it can be good when mirroring the display to TV-out or, better
 yet, using the "cinema mode" where any video-playback is sent fullscreen
 to the TV (which I'll often do), because those things tend to not work
 very well when the monitor isn't reduced to the same resolution as the
 TV.

But my TV happily accepts 1920x1080? Sending the same digital signal to both works fine here. YMMV
 OTOH when he has a good CRT (high resolution, good refresh rate) there
 may be little reason to replace it, as long as it's working.. apart
 from the high power consumption and the size maybe.

 I've actually compared the rated power consumption between CRTs and LCDs of similar size and was actually surprised to find that there was little, if any, real difference at all on the sets I compared.

How much power do the CRTs consume? The max power consumption for LED-backlit panels has gone down considerably, and you never use their max brightness. Typical power consumption of a modern 21.5" panel might stay between 20 and 30 watts when you're just typing text.
Jan 16 2011
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
I need to get a better LCD/LED display one of these days. Right now
I'm sporting a Samsung 2232BW, it's a 22" screen with a native
1680x1050 resolution (16:10). But it has horrible text rendering when
antialiasing is enabled. I've tried a bunch of screen calibration
software, changing DPI settings, but nothing worked. I know it's not
my eyes to blame since antialiased fonts look perfectly fine for me on
a few laptops that I've seen.
Jan 16 2011
prev sibling next sibling parent Russel Winder <russel russel.org.uk> writes:

On Sun, 2011-01-16 at 16:55 +0100, Andrej Mitrovic wrote:
 I need to get a better LCD/LED display one of these days. Right now
 I'm sporting a Samsung 2232BW, it's a 22" screen with a native
 1680x1050 resolution (16:10). But it has horrible text rendering when
 antialiasing is enabled. I've tried a bunch of screen calibration
 software, changing DPI settings, but nothing worked. I know it's not
 my eyes to blame since antialised fonts look perfectly fine for me on
 a few laptops that I've seen.

It may not be the monitor, it may be the operating system settings. In particular, what level of smoothing and hinting do you have set for the fonts on the LCD screen? Somewhat counter-intuitively, font rendering gets worse if you have no hinting or full hinting. It is much better to set "slight hinting", assuming you have sub-pixel smoothing set, of course.
--
Russel.
Dr Russel Winder t: +44 20 7585 2200 voip: sip:russel.winder ekiga.net
41 Buckmaster Road m: +44 7770 465 077 xmpp: russel russel.org.uk
London SW11 1EN, UK w: www.russel.org.uk skype: russel_winder
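For reference, a minimal per-user fontconfig sketch of that advice; the file name (~/.fonts.conf on distributions of that era) and the rgb sub-pixel order are assumptions that may need adjusting for a given setup:

    <?xml version="1.0"?>
    <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
    <fontconfig>
      <!-- antialiasing on, slight hinting, RGB sub-pixel smoothing -->
      <match target="font">
        <edit name="antialias" mode="assign"><bool>true</bool></edit>
        <edit name="hinting"   mode="assign"><bool>true</bool></edit>
        <edit name="hintstyle" mode="assign"><const>hintslight</const></edit>
        <edit name="rgba"      mode="assign"><const>rgb</const></edit>
      </match>
    </fontconfig>

Desktop environments often expose the same knobs in their appearance settings, which end up writing an equivalent configuration.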
Jan 16 2011
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 1/16/11, Russel Winder <russel russel.org.uk> wrote:
 It may not be the monitor, it may be the operating system setting.  In
 particular what level of smoothing and hinting do you have set for the
 fonts on LCD screen?  Somewhat counter-intuitively, font rendering gets
 worse if you have no hinting or you have full hinting.  It is much
 better to set "slight hinting".   Assuming you have sub-pixel smoothing
 set of course.

Yes, I know about those. Linux has arguably more settings to choose from, but it didn't help. There's also RGB>BGR>GBR switches and contrast settings, and the ones you've mentioned like font hinting. It just doesn't seem to work on this screen no matter what I choose. Also, this screen has very poor yellows. When you have a solid yellow picture displayed you can actually see the color having a gradient from a darkish yellow to very bright yellow (almost white) from the top to the bottom of the screen, without even moving your head. But I bought this screen because it was rather cheap at the time and it's pretty good for games, which is what I cared about a few years ago (low input lag + no tearing, no blurry screen when moving rapidly). I've read a few forum posts around the web and it seems other people have problems with this model and antialiasing as well. I'll definitely look into buying a quality screen next time though.
Jan 16 2011
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
With CRTs I could spend a few hours in front of the PC, but after that
my eyes would get really tired and I'd have to take a break. Since I
switched to LCDs I've never had this problem anymore, I could spend a
day staring at screen if I wanted to. Of course, it's still best to
take some time off regardless of the screen type.

Anyway.. how about that Git thing, then? :D
Jan 16 2011
prev sibling next sibling parent so <so so.do> writes:
 You have a good point if playing vintage games is important to you.

He was quite clear on that, I think; this is not like natural selection. I don't know Nick, but like new-generation movies, new-generation games mostly suck. If I had to, I would definitely pick the old ones in both cases.
 The benefits of CRTs are not being overlooked. They are insignificant or  
 illusory. If they were significant, CRTs would still be in significant  
 use. Donning a flat panel is not a display of social status. Most people  
 need computers to get work done, and they'd use CRTs if CRTs would have  
 them do better work.

Well, you can't value things like that; you know better than that. It is not just about how significant or insignificant they are. What about watching the screen from only one angle? What about reading text, or should I say trying to read it? What about colors or refresh rate? Yes, LCDs have their own benefits too, and quite a few of them. You forget the biggest factor, cost, for both the user and, mainly, the producer.
Jan 16 2011
prev sibling next sibling parent retard <re tard.com.invalid> writes:
Sun, 16 Jan 2011 12:34:36 -0800, Walter Bright wrote:

 Andrei Alexandrescu wrote:
 Meanwhile, you are looking at a gamma gun shooting atcha.

I always worried about that. Nobody actually found anything wrong, but still.

It's like the cell phone studies. Whether they're causing brain tumors or not.
Jan 16 2011
prev sibling parent retard <re tard.com.invalid> writes:
Sun, 16 Jan 2011 21:46:25 +0100, Andrej Mitrovic wrote:

 With CRTs I could spend a few hours in front of the PC, but after that
 my eyes would get really tired and I'd have to take a break. Since I
 switched to LCDs I've never had this problem anymore, I could spend a
 day staring at screen if I wanted to. Of course, it's still best to take
 some time off regardless of the screen type.

That's a good point. I've already forgotten how much eye strain the old monitors used to cause.
 
 Anyway.. how about that Git thing, then? :D

:)
Jan 16 2011
prev sibling next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Saturday 15 January 2011 13:13:41 Andrei Alexandrescu wrote:
 On 1/15/11 2:23 AM, Nick Sabalausky wrote:
 I still use CRTs (one big reason being that I hate the idea of only being
 able to use one resolution)

I'd read some post of Nick and think "hmm, now that's a guy who follows only his own beat" but this has to take the cake. From here on, I wouldn't be surprised if you found good reasons to use whale fat powered candles instead of lightbulbs.

But don't you just _hate_ the fact that lightbulbs don't smell? How can you stand that? ;) Yes. That does take the cake. And I want it back, since cake sounds good right now. LOL. This thread has seriously been derailed. I wonder if I should start a new one on the source control issue. I'd _love_ to be able to use git with Phobos and druntime rather than svn, and while I've never used Mercurial and have no clue how it compares to git, it would have to be an improvement over svn. Unfortunately, that topic seems to have not really gone anywhere in this thread. - Jonathan M Davis
Jan 15 2011
prev sibling parent reply "Jérôme M. Berger" <jeberger free.fr> writes:

Nick Sabalausky wrote:
 "retard" <re tard.com.invalid> wrote in message=20
 Hard drives: these always fail, sooner or later. There's nothing you c=


 do except RAID and backups

And SMART monitors: =20 I've had a total of two HDD's fail, and in both cases I really lucked o=

 The first one was in my Mac, but it was after I was already getting=20
 completely fed up with OSX and Apple, so I didn't really care much - I =

 mostly back on Windows again by that point. The second failure just hap=

 to be the least important of the three HDDs in my system. I was still p=

 upset about it though, so it was a big wakeup call: I *will not* have a=

 primary system anymore that doesn't have a SMART monitoring program, wi=

 temperature readouts, always running. And yes, it can't always predict =

 failure, but sometimes it can so IMO there's no good reason not to have=

 That's actually one of the things I don't like about Linux, nothing lik=

 that seems to exist for Linux. Sure, there's a cmd line program you can=

 poll, but that doesn't remotely cut it.
=20

I use smard (same as Linux) but where I am reasonably confident that on Linux it will email me if it detects an error condition, I am not as sure of being notified on Windows (where email is not an option because it is at work and Lotus will not accept email from sources other than those explicitly allowed by the IT admins). Jerome --=20 mailto:jeberger free.fr http://jeberger.free.fr Jabber: jeberger jabber.fr
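For reference, a minimal smartd.conf sketch of the email-on-error setup described above; the device name and address are placeholders, and the -M test directive only sends a one-off test mail at startup to verify delivery:

    # /etc/smartd.conf (placeholder device and address)
    # -a: monitor all SMART attributes, -m: mail this address on trouble,
    # -M test: send a single test mail when smartd starts
    /dev/sda -a -m admin@example.com -M test

On Windows the same smartmontools directives exist, but getting mail actually delivered typically needs extra setup, which is presumably where the uncertainty above comes from.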
Jan 16 2011
parent reply "Nick Sabalausky" <a a.a> writes:
""Jйrфme M. Berger"" <jeberger free.fr> wrote in message 
news:iguask$1dur$1 digitalmars.com...
Simple curiosity: what do you use for SMART monitoring on Windows?
I use smartd (same as Linux) but where I am reasonably confident that
on Linux it will email me if it detects an error condition, I am not
as sure of being notified on Windows (where email is not an option
because it is at work and Lotus will not accept email from sources
other than those explicitly allowed by the IT admins).

Hard Disk Sentinel. I'm not married to it or anything, but it seems to be pretty good.
Jan 16 2011
parent "Jérôme M. Berger" <jeberger free.fr> writes:

Nick Sabalausky wrote:
 ""J=EF=BF=BDr=EF=BF=BDme M. Berger"" <jeberger free.fr> wrote in messag=

 news:iguask$1dur$1 digitalmars.com...
 Simple curiosity: what do you use for SMART monitoring on Windows?
 I use smard (same as Linux) but where I am reasonably confident that
 on Linux it will email me if it detects an error condition, I am not
 as sure of being notified on Windows (where email is not an option
 because it is at work and Lotus will not accept email from sources
 other than those explicitly allowed by the IT admins).

Hard Disk Sentinel. I'm not married to it or anything, but it seems to =

 pretty good.
=20
=20

Jerome --=20 mailto:jeberger free.fr http://jeberger.free.fr Jabber: jeberger jabber.fr
Jan 17 2011
prev sibling parent spir <denis.spir gmail.com> writes:
On 01/13/2011 04:43 AM, Walter Bright wrote:
 Andrej Mitrovic wrote:
 On 1/12/11, Jean Crystof <news news.com> wrote:
 Claiming that low end components have shorter lifespan is ridiculous.

You've never had computer equipment fail on you?

I've had a lot of computer equipment. Failures I've had, ranked in order of most failures to least: keyboards power supplies hard drives fans monitors I've never had a CPU, memory, or mobo failure. Which is really kind of amazing. I did have a 3DFX board once, which failed after a couple years. Never bought another graphics card. The keyboards fail so often I keep a couple spares around. I buy cheap, bottom of the line equipment. I don't overclock them and I make sure there's plenty of airflow around the boxes.

Same for me. Cheap hardware as well, and as standard as possible. I've never had any pure electronic failure (graphics card included)! I would just put fan & power supply before keyboard, and add mouse to the list just below keyboard. My keyboards do not break as often as yours: you must be a brutal guy ;-) An exception is for wireless keyboards and mice, which I quickly abandoned. Denis _________________ vita es estrany spir.wikidot.com
Jan 13 2011
prev sibling next sibling parent Jeff Nowakowski <jeff dilacero.org> writes:
On 01/12/2011 04:11 PM, retard wrote:
 Same thing, can't imagine how a video card could break.

I recently had a cheap video card break. It at least had the decency to break within the warranty period, but I was too lazy to return it :P I decided that the integrated graphics, while slow, were "good enough" for what I was using the machine for.
Jan 12 2011
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
retard wrote:
 There's no reason why they would break. Few months ago I was 
 reconfiguring an old server at work which still used two 16-bit 10 
 megabit ISA network cards. I fetched a kernel upgrade (2.6.27.something). 
 It's a modern kernel which is still maintained and had up-to-date drivers 
 for the 20 year old device! Those devices have no moving parts and are 
 stored inside EMP & UPS protected strong server cases. How the heck could 
 they break?
 
 Same thing, can't imagine how a video card could break. The old ones 
 didn't even have massive cooling solutions, the chips didn't even need a 
 heatsink. The only problem is driver support, but on Linux it mainly gets 
 better over the years.

I paid my way through college hand-making electronics boards for professors and engineers. All semiconductors have a lifetime that is measured by the area under the curve of their temperature over time. The doping in the semiconductor gradually diffuses through the semiconductor; the rate of diffusion increases as the temperature rises. Once the differently doped parts "collide", the semiconductor fails.
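To put rough numbers on that, the standard reliability model (not something spelled out in the post above) treats the dopant diffusion rate as an Arrhenius process, which is why hot parts age disproportionately fast:

    D(T) = D_0 \exp\left(-\frac{E_a}{k_B T}\right)

    \mathrm{AF} = \exp\left[\frac{E_a}{k_B}\left(\frac{1}{T_\mathrm{cool}} - \frac{1}{T_\mathrm{hot}}\right)\right]

Here D is the diffusion coefficient, E_a an activation energy, k_B Boltzmann's constant, and AF the acceleration factor between a cooler and a hotter junction temperature; the "area under the curve" is the integral of such a temperature-dependent rate over the part's life.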
Jan 12 2011
parent Eric Poggel <dnewsgroup2 yage3d.net> writes:
On 1/12/2011 6:41 PM, Walter Bright wrote:
 All semiconductors have a lifetime that is measured by the area under
 the curve of their temperature over time.

Oddly enough, milk has the same behavior.
Jan 28 2011
prev sibling next sibling parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
Notice the smiley face -> :D

Yeah I didn't check the price, it's only 30$. But there's no telling
if that would work either. Also, dirt cheap video cards are almost
certainly going to cause problems. Even if the drivers worked
perfectly, a year down the road things will start breaking down. Cheap
hardware is cheap for a reason.
Jan 11 2011
parent reply "Nick Sabalausky" <a a.a> writes:
"Andrej Mitrovic" <andrej.mitrovich gmail.com> wrote in message 
news:mailman.571.1294806486.4748.digitalmars-d puremagic.com...
 Notice the smiley face -> :D

 Yeah I didn't check the price, it's only 30$. But there's no telling
 if that would work either. Also, dirt cheap video cards are almost
 certainly going to cause problems. Even if the drivers worked
 perfectly, a year down the road things will start breaking down. Cheap
 hardware is cheap for a reason.

Ridiculous. All of the video cards I'm using are ultra-cheap ones that are about 10 years old and they all work fine.
Jan 12 2011
next sibling parent "Nick Sabalausky" <a a.a> writes:
"Nick Sabalausky" <a a.a> wrote in message 
news:igkv8v$2gq$1 digitalmars.com...
 "Andrej Mitrovic" <andrej.mitrovich gmail.com> wrote in message 
 news:mailman.571.1294806486.4748.digitalmars-d puremagic.com...
 Notice the smiley face -> :D

 Yeah I didn't check the price, it's only 30$. But there's no telling
 if that would work either. Also, dirt cheap video cards are almost
 certainly going to cause problems. Even if the drivers worked
 perfectly, a year down the road things will start breaking down. Cheap
 hardware is cheap for a reason.

 Ridiculous. All of the video cards I'm using are ultra-cheap ones that are about 10 years old and they all work fine.

They're cheap because they have lower clock speeds, fewer features, and less memory.
Jan 12 2011
prev sibling next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Wednesday 12 January 2011 13:11:13 retard wrote:
 Wed, 12 Jan 2011 14:22:59 -0500, Nick Sabalausky wrote:
 "Andrej Mitrovic" <andrej.mitrovich gmail.com> wrote in message
 news:mailman.571.1294806486.4748.digitalmars-d puremagic.com...
 
 Notice the smiley face -> :D
 
 Yeah I didn't check the price, it's only 30$. But there's no telling if
 that would work either. Also, dirt cheap video cards are almost
 certainly going to cause problems. Even if the drivers worked
 perfectly, a year down the road things will start breaking down. Cheap
 hardware is cheap for a reason.

 Ridiculous. All of the video cards I'm using are ultra-cheap ones that

There's no reason why they would break. Few months ago I was reconfiguring an old server at work which still used two 16-bit 10 megabit ISA network cards. I fetched a kernel upgrade (2.6.27.something). It's a modern kernel which is still maintained and had up-to-date drivers for the 20 year old device! Those devices have no moving parts and are stored inside EMP & UPS protected strong server cases. How the heck could they break? Same thing, can't imagine how a video card could break. The old ones didn't even have massive cooling solutions, the chips didn't even need a heatsink. The only problem is driver support, but on Linux it mainly gets better over the years.

It depends on a number of factors, including the quality of the card and the conditions that it's being used in. I've had video cards die before. I _think_ that it was due to overheating, but I really don't know. It doesn't really matter. The older the part, the more likely it is to break. The cheaper the part, the more likely it is to break. Sure, the lack of moving parts makes it less likely for a video card to die, but it definitely happens. Computer parts don't last forever, and the lower their quality, the less likely it is that they'll last. By no means does that mean that a cheap video card isn't necessarily going to last for years and function just fine, but it is a risk that a cheap card will be too cheap to last. - Jonathan M Davis
Jan 12 2011
prev sibling next sibling parent reply Ulrik Mikaelsson <ulrik.mikaelsson gmail.com> writes:
Wow. The thread that went "Moving to D"->"Problems with
DMD"->"DVCS"->"WHICH DVCS"->"Linux Problems"->"Driver
Problems/Manufacturer preferences"->"Cheap VS. Expensive". It's a
personally observed record of OT threads, I think.

Anyways, I've refrained from throwing fuel on the thread as long as I
can, I'll bite:

 It depends on a number of factors, including the quality of the card and the
 conditions that it's being used in. I've had video cards die before. I _think_
 that it was due to overheating, but I really don't know. It doesn't really
 matter. The older the part, the more likely it is to break. The cheaper the
 part, the more likely it is to break. Sure, the lack of moving parts makes it
 less likely for a video card to die, but it definitely happens. Computer parts
 don't last forever, and the lower their quality, the less likely it is that
 they'll last. By no means does that mean that a cheap video card isn't
 necessarily going to last for years and function just fine, but it is a risk
that
 a cheap card will be too cheap to last.

HW that cost more is often high-end HW which creates more heat, which _might_ actually shorten the lifetime. On the other hand, low-end HW is often less heat-producing, which _might_ make it last longer. The real difference lies in what level of HW is sold at which clock levels, i.e. manufacturing control procedures. So an expensive low-end card for a hundred bucks might easily outlast a cheap high-end alternative for 4 times the money. Buy quality, not expensive. There is a difference.
Jan 12 2011
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/12/11 2:30 PM, retard wrote:
 Wed, 12 Jan 2011 22:46:46 +0100, Ulrik Mikaelsson wrote:

 Wow. The thread that went "Moving to D"->"Problems with
 DMD"->"DVCS"->"WHICH DVCS"->"Linux Problems"->"Driver
 Problems/Manufacturer preferences"->"Cheap VS. Expensive". It's a
 personally observed record of OT threads, I think.

 Anyways, I've refrained from throwing fuel on the thread as long as I
 can, I'll bite:

 It depends on a number of factors, including the quality of the card
 and the conditions that it's being used in. I've had video cards die
 before. I _think_ that it was due to overheating, but I really don't
 know. It doesn't really matter. The older the part, the more likely it
 is to break. The cheaper the part, the more likely it is to break.
 Sure, the lack of moving parts makes it less likely for a video card to
 die, but it definitely happens. Computer parts don't last forever, and
 the lower their quality, the less likely it is that they'll last. By no
 means does that mean that a cheap video card isn't necessarily going to
 last for years and function just fine, but it is a risk that a cheap
 card will be too cheap to last.

HW that cost more is often high-end HW which creates more heat, which _might_ actually shorten the lifetime. On the other hand, low-end HW is often less heat-producing, which _might_ make it last longer. The real difference lies in what level of HW is sold at which clock levels, i.e. manufacturing control procedures. So an expensive low-end card for a hundred bucks might easily outlast a cheap high-end alternative for 4 times the money. Buy quality, not expensive. There is a difference.

Nicely written, I fully agree with you.

Same here. It's not well understood that heating/cooling cycles, with the corresponding expansion and contraction cycles, are the main reason electronics fail. At an extreme, the green-minded person who turns all CFLs and all computers off at every opportunity ends up producing more expense and more waste than the lazier person who leaves stuff on for longer periods of time. Andrei
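The textbook form of that observation (a standard fatigue model, not Andrei's wording) is the Coffin-Manson relation for thermal cycling, where the number of cycles to failure falls off as a power of the temperature swing:

    N_f = C \, (\Delta T)^{-q}

with q often quoted around 2 for solder joints, so halving the temperature swing of each power cycle roughly quadruples the number of cycles a joint survives; frequent on/off cycling with large swings is therefore harder on the parts than steady operation at a moderate temperature.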
Jan 12 2011
prev sibling next sibling parent retard <re tard.com.invalid> writes:
Wed, 12 Jan 2011 13:22:28 -0800, Jonathan M Davis wrote:

 On Wednesday 12 January 2011 13:11:13 retard wrote:
 Same thing, can't imagine how a video card could break. The old ones
 didn't even have massive cooling solutions, the chips didn't even need
 a heatsink. The only problem is driver support, but on Linux it mainly
 gets better over the years.

It depends on a number of factors, including the quality of the card and the conditions that it's being used in.

Of course.
 I've had video cards die before.
 I _think_ that it was due to overheating, but I really don't know. It
 doesn't really matter.

Modern GPU and CPU parts are of course getting hotter and hotter. They're getting so hot it's a miracle that components such as the capacitors near the cores can handle it. You need better cooling, which means even more parts that can break.
 The older the part, the more likely it is to break.

Not true. http://en.wikipedia.org/wiki/Bathtub_curve
 The cheaper the part, the more likely it is to break.

That might be true if the part is a power supply or a monitor. However, the latest and greatest video cards and CPUs are sold at an extremely high price mainly for hardcore gamers (and 3d modelers -- Quadro & FireGL). This is sometimes purely an intellectual property issue, nothing to do with the physical parts. For example, I've earned several hundred euros by installing soft-mods, that is, upgraded firmware / drivers. Ever heard of the Radeon 9500 -> 9700, 9800SE -> 9800, and lately 6950 -> 6970 mods? I've also modded one PC NVIDIA card to work on Macs (sold at a higher price) and done one Geforce -> Quadro mod. You don't touch the parts at all, just flash the ROM. It would be a miracle if that improved the physical quality of the parts. It does raise the price, though. Another observation: the target audience of the low end NVIDIA cards is usually HTPC and office users. These computers have small cases and require low profile cards. The cards actually have *better* multimedia features (PureVideo) than the high end cards for gamers. These cards are built by the same companies as the larger versions (Asus, MSI, Gigabyte, and so on). Could it just be that by giving the buyer fewer physical parts and less intellectual property in the form of GPU firmware, they can sell at a lower price? There are also these cards with the letters "OC" in their name. The manufacturer has deliberately overclocked the cards beyond their specs. That actually hurts the reliability, but the price is even higher.
Jan 12 2011
prev sibling parent retard <re tard.com.invalid> writes:
Wed, 12 Jan 2011 22:46:46 +0100, Ulrik Mikaelsson wrote:

 Wow. The thread that went "Moving to D"->"Problems with
 DMD"->"DVCS"->"WHICH DVCS"->"Linux Problems"->"Driver
 Problems/Manufacturer preferences"->"Cheap VS. Expensive". It's a
 personally observed record of OT threads, I think.
 
 Anyways, I've refrained from throwing fuel on the thread as long as I
 can, I'll bite:
 
 It depends on a number of factors, including the quality of the card
 and the conditions that it's being used in. I've had video cards die
 before. I _think_ that it was due to overheating, but I really don't
 know. It doesn't really matter. The older the part, the more likely it
 is to break. The cheaper the part, the more likely it is to break.
 Sure, the lack of moving parts makes it less likely for a video card to
 die, but it definitely happens. Computer parts don't last forever, and
 the lower their quality, the less likely it is that they'll last. By no
 means does that mean that a cheap video card isn't necessarily going to
 last for years and function just fine, but it is a risk that a cheap
 card will be too cheap to last.

HW that costs more is often high-end HW which creates more heat, which _might_ actually shorten the lifetime. On the other hand, low-end HW is often less heat-producing, which _might_ make it last longer. The real difference lies in what level of HW is sold at which clock-levels, i.e. manufacturing control procedures. So an expensive low-end card for a hundred bucks might easily outlast a cheap high-end alternative for 4 times the money. Buy quality, not expensive parts. There is a difference.

Nicely written, I fully agree with you.
Jan 12 2011
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 1/12/11, Jean Crystof <news news.com> wrote:
 Claiming that low end components have shorter lifespan is ridiculous.

You've never had computer equipment fail on you?
Jan 12 2011
prev sibling next sibling parent reply Daniel Gibson <metalcaedes gmail.com> writes:
Am 12.01.2011 04:02, schrieb Jean Crystof:
 Walter Bright Wrote:

 My mobo is an ASUS M2A-VM. No graphics cards, or any other cards plugged into
 it. It's hardly weird or wacky or old (it was new at the time I bought it to
 install Ubuntu).

ASUS M2A-VM has 690G chipset. Wikipedia says: http://en.wikipedia.org/wiki/AMD_690_chipset_series#690G "AMD recently dropped support for Windows and Linux drivers made for Radeon X1250 graphics integrated in the 690G chipset, stating that users should use the open-source graphics drivers instead. The latest available AMD Linux driver for the 690G chipset is fglrx version 9.3, so all newer Linux distributions using this chipset are unsupported."

I guess a recent version of the free drivers (as delivered with recent Ubuntu releases) is still much better than the one in Walter's >2 years old Ubuntu. Sure, game performance may not be great, but I guess normal working (even in 1920x1200) and watching youtube videos works.
 Fast forward to this day:
 http://www.phoronix.com/scan.php?page=article&item=amd_driver_q111&num=2

 Benchmark page says: the only available driver for your graphics gives only
about 10-20% of the real performance. Why? ATI sucks on Linux. Don't buy ATI.
Buy Nvidia instead:

No it doesn't. The X1250 uses the same driver as the X1950 which is much more mature and also faster than the free driver for the Radeon HD *** cards (for which a proprietary Catalyst driver is still provided).
 http://geizhals.at/a466974.html

 This is 3rd latest Nvidia GPU generation. How long support lasts? Ubuntu 10.10
still supports all Geforce 2+ which is 10 years old. I foretell Ubuntu 19.04 is
last one supporting this. Use Nvidia and your problems are gone.

I agree that a recent nvidia card may improve things even further.

Cheers,
- Daniel
Jan 12 2011
parent retard <re tard.com.invalid> writes:
Wed, 12 Jan 2011 19:11:22 +0100, Daniel Gibson wrote:

 Am 12.01.2011 04:02, schrieb Jean Crystof:
 Walter Bright Wrote:

 My mobo is an ASUS M2A-VM. No graphics cards, or any other cards
 plugged into it. It's hardly weird or wacky or old (it was new at the
 time I bought it to install Ubuntu).

ASUS M2A-VM has 690G chipset. Wikipedia says: http://en.wikipedia.org/wiki/AMD_690_chipset_series#690G "AMD recently dropped support for Windows and Linux drivers made for Radeon X1250 graphics integrated in the 690G chipset, stating that users should use the open-source graphics drivers instead. The latest available AMD Linux driver for the 690G chipset is fglrx version 9.3, so all newer Linux distributions using this chipset are unsupported."

 I guess a recent version of the free drivers (as delivered with recent Ubuntu releases) is still much better than the one in Walter's >2 years old Ubuntu.

Most likely. After all they're fixing more bugs than creating new ones. :-)

My other guess is, while the open source drivers are far from perfect for hardcore gaming, the basic functionality like setting up a video mode is getting better. Remember the days you needed to type in all internal and external clock frequencies and packed pixel bit counts in xorg.conf ?!
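For anyone who never had to do that, a hand-written monitor section looked roughly like the sketch below. The timing numbers are the standard VESA 1024x768 @ 60 Hz mode line; the sync ranges are made-up placeholders of the kind you had to copy out of the monitor's manual, so treat it as an illustration rather than a drop-in config:

    Section "Monitor"
        Identifier  "CRT0"
        # horizontal sync (kHz) and vertical refresh (Hz) ranges from the manual
        HorizSync   30-96
        VertRefresh 50-160
        # pixel clock (MHz), then horizontal and vertical timings
        Modeline "1024x768" 65.00  1024 1048 1184 1344  768 771 777 806 -hsync -vsync
    EndSection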
 Sure, game performance may not be great, but I guess normal working
 (even in 1920x1200) and watching youtube videos works.

Embedded videos on web pages used to require huge amounts of CPU power when you were upscaling them in fullscreen mode. The reason is that Flash only recently started supporting hardware accelerated video, and only on ***32-bit*** systems equipped with a ***NVIDIA*** card. The same VDPAU libraries are used by the native video players.

I tried to accelerate video playback with my Radeon HD 5770, but it failed badly. Believe it or not, my 3 GHz 4-core Core i7 system with 24 GB of RAM and the fast Radeon HD 5770 was too slow to play 1080p videos at 1920x1080 using the open source drivers. Without hardware acceleration you need a modern high-end dual-core system or faster to run the video, assuming the drivers aren't broken. If you only want to watch youtube videos in windowed mode, you still need a 2+ GHz single-core.

But.. Youtube has switched to HTML5 videos recently. This should take the requirements down a notch. Still, I wouldn't trust integrated graphics that much. They've always been crap.
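For what it's worth, with an NVIDIA card and the proprietary driver the accelerated path in a native player is roughly a one-liner; the file name is just an example, and the vdpau output/codec options are only there if your mplayer build and driver actually support them:

    mplayer -vo vdpau -vc ffh264vdpau,ffmpeg12vdpau some-1080p-video.mkv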
Jan 12 2011
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 1/12/11, Nick Sabalausky <a a.a> wrote:
 Ridiculous. All of the video cards I'm using are ultra-cheap ones that are
 about 10 years old and they all work fine.

I'm saying that if you buy a cheap video card *today* you might not get what you expect. And I'm not talking out of my ass, I've had plenty of experience with faulty hardware and device drivers. The 'quality' depends more on who makes the product than what price tag it has, but you have to look these things up and not buy things on first sight because they're cheap.
Jan 12 2011
prev sibling next sibling parent retard <re tard.com.invalid> writes:
Wed, 12 Jan 2011 14:22:59 -0500, Nick Sabalausky wrote:

 "Andrej Mitrovic" <andrej.mitrovich gmail.com> wrote in message
 news:mailman.571.1294806486.4748.digitalmars-d puremagic.com...
 Notice the smiley face -> :D

 Yeah I didn't check the price, it's only 30$. But there's no telling if
 that would work either. Also, dirt cheap video cards are almost
 certainly going to cause problems. Even if the drivers worked
 perfectly, a year down the road things will start breaking down. Cheap
 hardware is cheap for a reason.

Ridiculous. All of the video cards I'm using are ultra-cheap ones that are about 10 years old and they all work fine.

There's no reason why they would break. A few months ago I was reconfiguring an old server at work which still used two 16-bit 10 megabit ISA network cards. I fetched a kernel upgrade (2.6.27.something). It's a modern kernel which is still maintained and had up-to-date drivers for the 20 year old devices! Those devices have no moving parts and are stored inside EMP & UPS protected strong server cases. How the heck could they break?

Same thing, can't imagine how a video card could break. The old ones didn't even have massive cooling solutions; the chips didn't even need a heatsink. The only problem is driver support, but on Linux it mainly gets better over the years.
Jan 12 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Thu, 13 Jan 2011 05:43:27 +0200, Walter Bright  
<newshound2 digitalmars.com> wrote:

 The keyboards fail so often I keep a couple spares around.

Let me guess, all cheap rubber-domes? Maybe you should have a look at some professional keyboards. Mechanical keyboards are quite durable, and feel much nicer to type on.

--
Best regards,
Vladimir                          mailto:vladimir thecybershadow.net
Jan 12 2011
prev sibling next sibling parent Caligo <iteronvexor gmail.com> writes:

On Wed, Jan 12, 2011 at 11:33 PM, Walter Bright
<newshound2 digitalmars.com>wrote:

 Vladimir Panteleev wrote:

 On Thu, 13 Jan 2011 05:43:27 +0200, Walter Bright <
 newshound2 digitalmars.com> wrote:

  The keyboards fail so often I keep a couple spares around.

Let me guess, all cheap rubber-domes? Maybe you should have a look at some professional keyboards. Mechanical keyboards are quite durable, and feel much nicer to type on.

Yup, the $9.99 ones. They also get things spilled on them, why ruin an expensive one? <g>

http://www.daskeyboard.com/

or

http://steelseries.com/us/products/keyboards/steelseries-7g

expensive, I know, but who cares. You only live once!
Jan 13 2011
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
Lol Walter you're like me. I keep buying cheap keyboards all the time.
I'm almost becoming one of those people that collect things all the
time (well.. the difference being I throw the old ones in the trash).
Right now I'm sporting this dirt-cheap Genius keyboard, I've just
looked up the price and it's 5$. My neighbor gave it to me for free
because he got two for some reason. You would think a 5$ keyboard
sucks, but it's pretty sweet actually. The keys have a nice depth, and
they're real easy to hit. The downside? They've put the freakin' sleep
button right above the right cursor key. Now *that's* genius, Genius..
So I had to disable sleep mode. LOL!

*However*, my trusty Logitech MX518 is standing strong with over 5
years of use. Actually, I did cut the cable by accident once. But I
had a spare 10$ Logitech mouse which happened to have the same
connector that plugs in that little PCI board, so I just swapped the
cables. (yay for hardware design reuse!).
Jan 13 2011
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
I forgot to mention though, do *not* open up a MX518 unless you want
to spend your day figuring out where all the tiny little pieces go.
When I opened it the first time, all the pieces went flying in all
directions. I've found all the pieces but putting them back together
was a nightmare. Which piece goes where with which other piece and in
what order.. Luckily I found a forum where someone else already took
apart and assembled the same mouse, and even took pictures of it.
There was really only this one final frustrating piece that I couldn't
figure out which held the scroll wheel together and made that
"clikclick" sound when you scroll.
Jan 13 2011
prev sibling next sibling parent retard <re tard.com.invalid> writes:
Thu, 13 Jan 2011 19:04:59 -0500, Nick Sabalausky wrote:

 My failure list from most to least would be this:
 
 1. power supply / printer
 2. optical drive / floppies (the disks, not the drives)
 3. hard drive
 4. monitor / mouse / fan

My list is pretty much the same.

I bought a (Toshiba IIRC) dot matrix printer (the price was insane) in the 1980s. It STILL works fine when printing ASCII text, but it's "a bit" noisy and slow. Another thing is, after upgrading from DOS, I haven't found any drivers for printing graphics. On DOS, only some programs had specially crafted drivers for this printer and some had drivers for some other proprietary protocol the printer "emulates" :-)

My second printer was some Canon LBP in the early 90s. STILL works without any problems (still connected to my Ubuntu CUPS server), but it's also relatively slow and physically huge. I used to replace the toner and drums, toner every ~2 years (prints 1500-3000 pages of 5% text) and drum every 5-6 years. We bought it as used from a company. It had been repaired once by the official Canon service. After that, almost 20 years without repair.

I also bought a faster (USB!) laser printer from Brother a couple of years ago. I've replaced the drum once and replaced the toner three times with some cheapo 3rd party stuff. It was a bit risky to buy a set of 10 toner kits along with the printers (even the laser printers are so cheap now), but it was an especially cheap offer and we thought the spare part prices would go up anyway. The amortized printing costs are probably less than 3 cents per page.

Now, I've also bought Canon, HP, and Epson inkjets. What can I say.. The printers are cheap. The ink is expensive. They're slow, and the result looks like shit (not very photo-realistic) compared to the online printing services. AND I've "broken" about 8 of them in 15 years. It's way too expensive to start buying spare parts (e.g. when the dry ink gets stuck in the ink "tray" in Canon printers). Nowadays I print photos using some online service. The inkjet printer quality still sucks IMO. Don't buy them.

PSUs: Never ever buy the cheap models. There's a list of bad manufacturers on the net. They make awful shit. The biggest problem is, if the PSU breaks, it might also break other parts, which makes all PSU failures really expensive. I've bought <ad>Seasonic, Fortron, and Corsair</ad> PSUs since the late 1990s. They work perfectly. If some part fails, it's the PSU fan (or sometimes the fuse when switching the PSU on causes a surge). Fuses are cheap. Fans last much longer if you replace the engine oil every 2-4 years. Scrape off the sticker in the center of the fan and pour in appropriate oil. I'm not kidding! I've got one 300W PSU from 1998 and it still works and the fan is almost as quiet as if it was new.

Optical drives: Number 1 reason for breakage, I forget to close the tray and kick it off! Currently I don't use internal optical drives anymore. There's one external dvd burner. I rarely use it. And it's safe from my feet on the table :D

Hard drives: these always fail, sooner or later. There's nothing you can do except RAID and backups (labs.google.com/papers/disk_failures.pdf). I've successfully terminated all (except those in use) hard drives so far by using them normally.

Monitors: The CRTs used to break every 3-5 years. Even the high quality Sony monitors :-| I've used TFT panels since 2003. The inverter of the first 14" TFT broke after 5 years of use. Three others are still working, after 1-6 years of use.

Mice: I've always bought Logitech mice. NEVER had any failures. The current one is MX 510 (USB). Previous ones used the COM port. The bottom of the MX510 shows signs of hardcore use, but the internal parts haven't fallen off yet and the LED "eye" works :-D

Fans: If you want reliability, buy fans with ball bearings. They make more noise than sleeve bearings. I don't believe in expensive high quality fans. Sure, there are differences in the airflow and noise levels, but the max reliability won't be any better. The normal PC stores don't sell any fans with industrial quality bearings. Like I said before, remember to replace the oil http://www.dansdata.com/fanmaint.htm -- I still have high quality fans from the 1980s in 24/7 use. The only problem is, I couldn't anticipate how much the power consumption grows. The old ones are 40-80 mm fans. Now (at least gaming) computers have 120mm or 140mm or even bigger fans.
Jan 14 2011
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 1/14/11, retard <re tard.com.invalid> wrote:
 Like I said before,
 remember to replace the oil http://www.dansdata.com/fanmaint.htm

I've never thought of this. I did have a couple of failed fans over the years but I always had a bunch of spares from the older equipment which I've replaced. Still, that is a cool tip, thanks! And yes, avoid cheap PSU's or at least get one from a good manufacturer. It's also important to have a PSU that can actually power your PC.
Jan 14 2011
prev sibling next sibling parent Russel Winder <russel russel.org.uk> writes:

On Fri, 2011-01-14 at 11:50 -0800, Walter Bright wrote:
 Daniel Gibson wrote:
 But a few years ago it was a lot worse, especially with cheap inkjets.
 Many supported only GDI printing which naturally is best supported on
 Windows (GDI is a windows interface).

 Yeah, but I bought an *HP* laserjet, because I thought everyone supported
 them well.

 Turns out I probably have the only orphaned HP LJ model.

I have an HP LJ 4000N and whilst it is perfectly functional, printing systems have decided it is too old to work with properly -- this is a Windows, Linux and Mac OS X problem. Backward compatibility is a three-edged sword.

--
Russel.
=======================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel russel.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Jan 14 2011
prev sibling next sibling parent retard <re tard.com.invalid> writes:
Fri, 14 Jan 2011 21:02:38 +0100, Daniel Gibson wrote:

 Am 14.01.2011 20:50, schrieb Walter Bright:
 Daniel Gibson wrote:
 But a few years ago it was a lot worse, especially with cheap inkjets.
 Many supported only GDI printing which naturally is best supported on
 Windows (GDI is a windows interface).

Yeah, but I bought an *HP* laserjet, because I thought everyone supported them well. Turns out I probably have the only orphaned HP LJ model.

Yes, the HP Laserjets usually have really good support with PCL and sometimes even Postscript. You said you've got a HP (Laserjet?) 2300? On http://www.openprinting.org/printer/HP/HP-LaserJet_2300 it says that printer "works perfectly" and supports PCL 5e, PCL6 and Postscript level 3. Generally http://www.openprinting.org/printers is a really good page to see if a printer has Linux-support and where to get drivers etc.

I'm not sure if Walter's Ubuntu version already has this, but the latest Ubuntus automatically install all CUPS supported (USB) printers. I haven't tried this autodetection with parallel or network printers. The "easiest" way to configure CUPS is via the CUPS web interface ( http://localhost:631 ). In some early Ubuntu versions the printer configuration was broken. You had to add yourself to the lpadmin group and whatnot.

My experiences with printers are:

Linux (Ubuntu)
1. Plug in the cable
2. Print

Mac OS X
1. Plug in the cable
2. Print

Windows
1. Plug in the cable
2. Driver wizard appears, fails to install
3. Insert driver cd (preferably download the latest drivers from the internet)
4. Save your work
5. Reboot
6. Close the HP/Canon/whatever ad dialog
7. Restart the programs and load your work
8. Print
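Roughly, the manual route on those older Ubuntu versions looked like the sketch below; the queue name and device URI are placeholders, use whatever lpinfo reports for your own printer:

    # add yourself to the printer admin group (log out and back in afterwards)
    sudo usermod -a -G lpadmin $USER

    # list detected printers and their device URIs
    lpinfo -v

    # create and enable a queue, or do the same in the web interface at http://localhost:631
    lpadmin -p myprinter -E -v usb://HP/LaserJet%202300
    lpstat -p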
Jan 14 2011
prev sibling next sibling parent Gour <gour atmarama.net> writes:

On Fri, 14 Jan 2011 22:40:11 -0800
Walter Bright <newshound2 digitalmars.com> wrote:

 To be fair, it was about the process of upgrading in place to Ubuntu
 8.10 that sucked. It broke everything, and made me leery of upgrading
 again.

<shameful plugin>
/me likes Archlinux - all the hardware works and there is no 'upgrade' process like in Ubuntu 'cause it is a 'rolling release', iow. one can update whenever and as often as one desires. Moreover, it's very simple to build from the source if one wants/needs.
</shameful plugin>

Sincerely,
Gour

--
Gour | Hlapicina, Croatia | GPG key: CDBF17CA
----------------------------------------------------------------
Jan 14 2011
prev sibling next sibling parent Russel Winder <russel russel.org.uk> writes:

On Sat, 2011-01-15 at 00:26 +0100, Daniel Gibson wrote:
[ . . . ]
 hplip on Linux should support it when connected via Parallel Port (but,
 according to a maybe outdated list, not USB or Network/Jetdirect). See also
 http://www.openprinting.org/printer/HP/HP-LaserJet_4000 :-)

The problem is not the spooling per se, Linux, Windows and Mac OS X are all happy to talk to JetDirect; the problem is that the printer only has 7MB of memory and no disc, and operating systems seem now to think that printers have gigabytes of memory and make no allowances.

The worst of it is though that the LJ 4000 has quite an old version of PostScript compared to that in use today, and all the applications and/or drivers that render to PostScript are not willing (or able) to code generate for such an old PostScript interpreter. Together this leads to a huge number of stack fails on print jobs.

--
Russel.
=======================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel russel.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Jan 15 2011
prev sibling next sibling parent retard <re tard.com.invalid> writes:
Sat, 15 Jan 2011 03:23:41 -0500, Nick Sabalausky wrote:

 "retard" <re tard.com.invalid> wrote in message
 PSUs: Never ever buy the cheap models. There's a list of bad
 manufacturers in the net. They make awful shit.

Another problem is that, as places like Sharky Extreme and Tom's Hardware found out while testing, it seems to be common practice for PSU manufacturers to outright lie about the wattage.

That's true. But it's also true that PSU efficiency and power have improved drastically, and their quality overall. In the 1990s it was pretty common that computer stores mostly sold those shady brands with a more or less lethal design. There are lots of reliable brands now. If you're not into gaming, it hardly matters which (good) PSU you buy. They all provide 300+ Watts and your system might consume 70-200 Watts, even under full load.
 Monitors: The CRTs used to break every 3-5 years. Even the high quality
 Sony monitors :-| I've used TFT panels since 2003. The inverter of the
 first 14" TFT broke after 5 years of use. Three others are still
 working, after 1-6 years of use.

I still use CRTs (one big reason being that I hate the idea of only being able to use one resolution), and for a long time I've always had either a dual-monitor setup or dual systems with one monitor on each, so I've had a lot of monitors. But I've only ever had *one* CRT go bad, and I definitely use them for more than 5 years. Also, FWIW, I'm convinced that Sony is *not* as good as people generally think. Maybe they were in the 70's or 80's, I don't know, but they're frequently no better than average.

I've disassembled a couple of CRT monitors. The Sony monitors have had aluminium cased "modules" inside them, so replacing these should be relatively easy. They also had detachable wires between these units. Cheaper monitors have three circuit boards (one for the front panel, one in the back of the tube and one in the bottom). It's usually the board in the bottom of the monitor that breaks, which means that you need to cut all the wires to remove it in cheaper monitors. It's just this high level design that I like in Sony's monitors. Probably other high quality brands like Eizo also do this. Sony may also use bad quality discrete components like capacitors and ICs. I can't say anything about that.
Jan 15 2011
prev sibling next sibling parent retard <re tard.com.invalid> writes:
Sun, 16 Jan 2011 11:56:34 +0100, Lutger Blijdestijn wrote:

 Nick Sabalausky wrote:
 
 "Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message
 news:igt2pl$2u6e$1 digitalmars.com...
 On 1/15/11 2:23 AM, Nick Sabalausky wrote:
 I still use CRTs (one big reason being that I hate the idea of only
 being able to use one resolution)

I'd read some post of Nick and think "hmm, now that's a guy who follows only his own beat" but this has to take the cake. From here on, I wouldn't be surprised if you found good reasons to use whale fat powered candles instead of lightbulbs.

I can keep using something that already suits my needs (that I only paid $25 for in the first place), or I can spend a hundred or so dollars to lose the ability to have a decent looking picture at more than one resolution and then say "Gee golly whiz! That sure is a really flat panel!!". Whoop-dee-doo. And popularity and trendiness are just non-issues.

Actually nearly all lcds below the 600$-800$ price point (tn-panels) have quite inferior color reproduction compared to el cheapo crt's, at any resolution.

There are also occasional special offers on IPS flat panels. The TN panels have also improved. I bought a cheap 21.5" TN panel as my second monitor last year. The viewing angles are really wide, basically about 180 degrees horizontally, a tiny bit less vertically. I couldn't see any effects of dithering noise either. It has a DVI input and a power consumption of about 30 Watts max (I run it in eco mode). Now that both framerate and view angle problems have been more or less solved for TN panels (except in pivot mode), the only remaining problem is the color reproduction. But it only matters when working with photographs.
Jan 16 2011
prev sibling next sibling parent so <so so.do> writes:
 Besides, this whole changing the resolution thing is a consequence of  
 using crappy software. What you want is set the resolution to the  
 maximum and do the rest in software. And guess what - at their maximum,  
 CRT monitors suck compared to flat panels.

This is just... wrong.
Jan 16 2011
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
I found this:
http://stackoverflow.com/questions/315911/git-for-beginners-the-definitive-practical-guide

A bunch of links to SO questions/answers.
Jan 16 2011
prev sibling next sibling parent reply retard <re tard.com.invalid> writes:
Sun, 16 Jan 2011 15:22:13 -0500, Nick Sabalausky wrote:

 Dude, you need to upgrade!!!

The CRTs have a limited lifetime. It's simply a fact that you need to switch to flat panels or something better. They probably won't even manufacture CRTs anymore. It becomes more and more impossible to purchase *unused* CRTs anywhere, at least at a reasonable price. For example, used 17" TFTs cost less than $40. I found pages like this: http://shopper.cnet.com/4566-3175_9-0.html

Even the prices aren't very competitive. I only remember that all refresh rates below 85 Hz caused me headache and eye fatigue. You can't use the max resolution @ 60 Hz for very long.
 Why should *I* spend the money to replace something that already

You might get more things done by using a bigger screen. Maybe get some money to buy better equipment and stop complaining.
 Besides, this whole changing the resolution thing is a consequence of
 using crappy software. What you want is set the resolution to the
 maximum and do the rest in software. And guess what - at their maximum,
 CRT monitors suck compared to flat panels.

well. XP doesn't. Win7 doesn't. Ubuntu 9.04 and Kubuntu 10.10 don't. (And I'm definitely not going back to OSX, I've had my fill of that.)

My monitors have had about the same pixel density over the years: EGA (640x400) or 720x348 (Hercules) / 12", 800x600 / 14", 1024x768 / 15-17", 1280x1024 / 19", 1280x1024 / 17" TFT, 1440x900 / 19", 1920x1080 / 21.5", 2560x1600 / 30".

Thus, there's no need to enlarge all graphical widgets or text. My vision is still ok. What changes is the amount of simultaneously visible area for applications. You're just wasting the expensive screen estate by enlarging everything. You're supposed to run more simultaneous tasks on a larger screen.
 I've actually compared the rated power consumption between CRTs and
 LCDs of
 similar size and was actually surprised to find that there was little,
 if any, real difference at all on the sets I compared.


I'm pretty sure I did point out the limitations of my observation: "...on
all the sets I compared". And it's pretty obvious I wasn't undertaking a
proper extensive study. There's no need for sarcasm.

Your comparison was pointless. You can come up with all kinds of arbitrary comparisons. The TFT panel power consumption probably varies between 20 and 300 Watts. Do you even know how much power your CRT uses?

CRTs used as computer monitors and those used as televisions have different characteristics. CRT TVs have better brightness and contrast, but lower resolution and sharpness than CRT computer monitors. Computer monitors tend to need more power, maybe even twice as much. Also, larger monitors of the same brand tend to use more power. When a CRT monitor gets older, you need more power to illuminate the phosphor, as the amount of phosphor in the small holes of the grille/mask decreases over time.

This isn't the case with TFTs. The backlight brightness and the panel's color handling dictate power consumption. A 15" TFT might need as much power as a 22" TFT using the same panel technology. TFT TVs use more power as they typically provide higher brightness. Same thing if you buy those high quality panels for professional graphics work. The TFT power consumption has also dropped drastically because of AMOLED panels, LED backlights and better dynamic contrast logic. The fluorescent backlights lose some of their brightness (maybe about 30%) before dying, unlike a CRT which totally goes dark. The LED backlights won't suffer from this (at least observably).

My observation is that e.g. in computer classes (30+ computers per room) the air conditioning started to work much better after the upgrade to flat panels. Another upgrade turned the computers into micro-itx thin clients. Now the room doesn't need air conditioning at all.
Jan 16 2011
parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Sunday 16 January 2011 23:17:22 Nick Sabalausky wrote:
 "retard" <re tard.com.invalid> wrote in message
 news:ih0b1t$g2g$3 digitalmars.com...
 
 For example used 17" TFTs cost less than $40.

Continuing to use my 21" CRT costs me nothing.
 Even the prices aren't very competitive. I only remember that all refresh
 rates below 85 Hz caused me headache and eye fatigue. You can't use the
 max resolution @ 60 Hz for very long.

I run mine no lower than 85 Hz. It's about 100Hz at the moment.

I've heard that the eye fatigue at 60 Hz is because it matches the mains frequency driving the light bulbs in the room, so the flickering of the bulbs and the screen match. Keeping the refresh rate above 60 Hz avoids the problem. 100Hz is obviously well above that.
 And I never need to run it at the max rez for long. It's just nice to be
 able to bump it up now and then when I want to. Then it goes back down. And
 yet people feel the need to bitch about me liking that ability.

You can use whatever you want for all I care. It's your computer, your money, and your time. I just don't understand what the point of messing with your resolution is. I've always just set it at the highest possible level that I can. I've currently got 1920 x 1200 on a 24" monitor, but it wouldn't hurt my feelings any to get a higher resolution. I probably won't, simply because I'm more interested in getting a second monitor than a higher resolution, and I don't want to fork out for two monitors to get a dual monitor setup (since I want both monitors to be the same size) when I already have a perfectly good monitor, but I'd still like a higher resolution.

So, the fact that you have and want a CRT and actually want the ability to adjust the resolution baffles me, but I see no reason to try and correct you or complain about it.

- Jonathan M Davis
Jan 16 2011
prev sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Sunday 16 January 2011 14:07:57 Walter Bright wrote:
 Andrej Mitrovic wrote:
 Anyway.. how about that Git thing, then? :D

We'll be moving dmd, phobos, druntime, and the docs to Github shortly. The accounts are set up, it's just a matter of getting the svn repositories moved and figuring out how it all works. I know very little about git and github, but the discussions about it here and elsewhere online have thoroughly convinced me (and the other devs) that this is the right move for D.

Great! That will make it _much_ easier to make check-ins while working on other stuff in parallel. That's a royal pain with svn, and while it's slightly better when using git-svn to talk to an svn repository, it isn't much better, because the git branching stuff doesn't understand that you can't reorder commits to svn, so you can't merge in branches after having committed to the svn repository. But having it be pure git fixes all of that.

So, this is great news. And I don't think that there's anything wrong with being a bit slow about the transition if taking our time means that we get it right, though obviously, the sooner we transition over, the sooner we get the benefits.

- Jonathan M Davis
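For illustration, the day-to-day difference looks roughly like the sketch below; the branch and remote names are made up, not the actual dmd setup. With git-svn you effectively keep history linear, rebasing local work before sending it back, whereas with plain git on GitHub branches can simply be merged and pushed:

    # git-svn: stay linear, rebase local commits onto svn trunk, then commit back
    git svn rebase
    git svn dcommit

    # pure git: work on a branch, merge it whenever it's ready
    git checkout -b fix-issue-1234
    # ... edit, git commit ...
    git checkout master
    git merge fix-issue-1234
    git push origin master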
Jan 16 2011