
digitalmars.D - totally satisfied :D

reply "Jøn" <caapora gmail.com> writes:
The best idea I had today: rename D into :D

* Easier to google
* the :D experience (Satisfied Customers smile)
* :D associates with programming; it is a symbol in e.g. LISP and Ruby
* backwards compatible: no need to update sitename, etc.
* Its predecessor is :C, a very very unhappy language

I hope :D parses in D community.

Jøn


ps :C reminds me of Retort in the Muppet show: use C to have your 
programming experiments blow up on you.
Sep 16 2012
next sibling parent "Brian" <brisvegan1971 gmail.com> writes:
Nice :D  Love it
Sep 16 2012
prev sibling next sibling parent reply "David Nadlinger" <see klickverbot.at> writes:
On Sunday, 16 September 2012 at 21:59:30 UTC, Jøn wrote:
 The best idea I had today: rename D into :D

 * Easier to google

You might be surprised to see that D is the number 1 result for ":D" even today.

David
Sep 17 2012
next sibling parent reply Ali Çehreli <acehreli yahoo.com> writes:
On 09/17/2012 03:08 PM, Nick Sabalausky wrote:
 On Mon, 17 Sep 2012 13:18:51 -0700
 "H. S. Teoh"<hsteoh quickfur.ath.cx>  wrote:

 Any time you hear "smart" and "software" in the same sentence, be
 prepared for something dumb.

Heh, I actually say pretty much the same thing myself very often. Couldn't agree more. If you were around me in person, you'd frequently hear "I hate when (devices|programs) try to be smart." Smart(.*) is a red flag for "badly designed" or "unreliable".

That's actually been an even bigger thing with me lately than ever before, since, because of work, I have a cell phone for the first time now - two actually, an iPhone and an Android - and I absolutely *HATE* both the damn things (with the iPhone being slightly worse). *Everything* about them is just wrong, backwards, idiotic. They even managed to take something as trivial to get right as volume controls and *completely* fuck it up in every imaginable way. And of course, Android aped Apple's idiotic lead on that, as usual.

I have to jump in on this discussion: Those have been exactly my feelings since I got my "smart" phone about two years ago. I cannot believe the lack of usability! :) I have an Android but of course I have played with iPhones as well. Let me tell you: the emperor has no clothes! :)

They have imagined a "phone" where being able to answer a call is completely down to luck if the phone was in your pocket when the call arrived! Chances are, you will touch something on the "smart" screen and reject the call for some random reason like "I am in class." (No, I am not a student or a teacher at this time; but that exact scenario has happened to me multiple times.) Imagine a device where the *entire* screen is touchable, with different areas meaning different things depending on context! The users can only cradle it gently; they can't hold it firmly! Wow! I can't believe how this whole idea took off. Later generations will have a good laugh at these devices.

Thanks for letting me vent. :) Next time I will talk about CalTrain's immature attempts at adopting the Clipper card stupidity and their apparent and obviously obvious :p failure in doing so. Unbelievable amounts of technology, expense, labor, customer inconvenience, citations, etc. just to obviate a system that has been working flawlessly for centuries: a paper ticket. Technology should solve a problem; it should not be forced on people. Ok, apparently that one is out too... :)

Getting back on topic, yes, I like :D

Ali
Sep 17 2012
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
I suppose I have a more pragmatic view, due to my background in non-computer 
engineering.

     It's all like that.

There are a couple of good reasons for that.

1. Not every engineer is a rock star. In fact, very few of them are. I tend to 
snicker at companies that insist they only hire the top 1%. It seems that about 
90% of the engineers out there must be in that top 1% <g>.

2. It costs too much money to do perfect engineering. You wouldn't be able to 
afford those products. Do you have $10,000 to spend on a tablet?


That said, our engineering tools and methodologies are improving all the time 
(what we're doing with D is improving programming methodology), which reduces the 
defect rate. Industry responds to this by heaping on more capability, which 
adds more (subtler) bugs back in.

If you don't think things are getting better all the time, take apart a car 
built in the 1960's, and compare the fit, finish, and problems with that of a 
modern car. Now, I love those old cars, but let's get real. Modern cars are 
enormously better. They're still loaded with problems, but different ones.

(For example, my truck is over 20 years old, and has never had a tuneup. No 
new spark plugs, no new distributor, etc. I just turn the key, and it goes. It's 
quite amazing. My experience with cars from the 1960's is that they required 
continuous work to keep running, even when they were new.)

Software is a lot better, too. It really is.
Sep 18 2012
parent Don Clugston <dac nospam.com> writes:
On 18/09/12 09:29, Walter Bright wrote:
 I suppose I have a more pragmatic view, due to my background in
 non-computer engineering.

      It's all like that.

 There are a couple of good reasons for that.

 1. Not every engineer is a rock star. In fact, very few of them are. I
 tend to snicker at companies that insist they only hire the top 1%. It
 seems that about 90% of the engineers out there must be in that top 1% <g>.

 2. It costs too much money to do perfect engineering. You wouldn't be
 able to afford those products. Do you have $10,000 to spend on a tablet?

 Software is a lot better, too. It really is.

I don't think that's true, except in terms of the range of functionality. The interesting thing is if you compare computer hardware from 1985 with hardware from today. It is dramatically better, in every respect, without exception. But that isn't true of software. TeX is ancient, and yet it's not easy to find *any* recent software of similar quality.

When I was a kid, I used to make games and save them on cassette tapes. The tapes failed pretty often, so my work was lost. Last week, my kids made some games using online Flash/JS-based websites. Due to bugs in those websites, sometimes when they go to save, their work is lost. I find that appalling.
Sep 18 2012
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/17/2012 9:35 PM, Nick Sabalausky wrote:
 And there's even more. Honestly, if I were looking into getting a new car, I
 would consider that stereo *alone* to be a deal-breaker. It's that bad.

Install headers and a cherry-bomb exhaust, and you won't need no steekin' car stereo no more.
 I miss the 80's: Devices worked and idiots didn't use computers.

You've got a selective memory!! A car stereo in the 80's used cassettes. With a cassette, you've got flutter, rewinding, and a player that randomly ate your tapes. You also had tapes scattered about your car, usually encrusted with some substance that may or may not have come from McDonald's or the dog. I was happy a few years back to throw my cassette collection into the garbage.

Oh, and TV sets and VCRs stunk compared to today. The TV shows stunk, too. With Netflix, I rewatched some of those older shows, and was appalled at how bad they were. Try watching an 80's miniseries - gawd, what stinkers.

But I did like 80's fashions much better than today's. The 70's were the worst, and the 80's the best.
Sep 18 2012
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/18/2012 2:08 AM, Nick Sabalausky wrote:
 Install headers and a cherry-bomb exhaust,


Watch "Bullitt".
 Yea, so was I, but then I discovered that we're basically trading
 one set of problems for another, especially with video. Cassettes suck,
 and I'm glad to be done with them, but with discs:

My car stereo takes a USB stick. I specifically picked that model for that reason. CDs in the car suck.
 Some fantastic 80's shows off the top of my head:

 - Soap
 - Hunter
 - Magnum PI
 - Remington Steele
 - Miami Vice

I loved MV in the 80's. It was on netflix, so I started watching it. It was *horrible*! Awful. Cringeworthy.
 - MacGyver
 - Cheers
 - Golden Girls (ok, minus the occasional "After School Special" scenes)
 - Married With Children (the first two or three seasons were in the
    80's)

I do like MwC. Cheers - awful. The rest I never watched. There's nothing, nothing remotely as good as Breaking Bad.
 I once heard someone say the 70's were the hangover from the 60's.
 That's how I feel about the 80's and 90's:

 - Torn jeans? Awesome. Sagging? GTFO.
 - Spandex/leather? Sweet. Flannel? Blech.

I love my flannel. Nothing like it on a cool Seattle day!
 - Flock of Seagulls? Radical. Combover? What is this, "Leave it
    to Beaver"?

I meant the hairstyles for women! My "hairstyle" is a buzz cut.
Sep 18 2012
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/18/2012 2:30 PM, Nick Sabalausky wrote:
 I loved MV in the 80's. It was on netflix, so I started watching it.
 It was *horrible*! Awful. Cringeworthy.

I've heard that the third and fourth seasons went downhill. I saw the whole first season just about a year ago and loved it.

Uh, I watched the pilot. The whole "I gotta beat up my partner to near death in order for us to become buddies" thing.
 There's nothing, nothing remotely as good as Breaking Bad.


You're lucky. The pleasure is all in front of you!
Sep 18 2012
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/18/2012 3:40 PM, Andrej Mitrovic wrote:
 On 9/18/12, Walter Bright <newshound2 digitalmars.com> wrote:
 There's nothing, nothing remotely as good as Breaking Bad.

You're just saying that 'cos your name rhymes with the lead character's name. :p

Walter Bright, Walter White, middle aged, buzz cut, nerdly, ... hmmm.....
Sep 18 2012
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/17/2012 10:29 PM, H. S. Teoh wrote:
 LOL... I agree with the sentiment. My dad has a pair of Apple II's from
 the 80's, and they still work. He does his accounts on them sometimes.
 Compared to a 3-year-old PC of today, which is probably already dying a
 horrible death of HD failures, fan failures, CPU overheating, software
 breakages that's gotten it into a state that requires reformatting and
 reinstalling to fix. Apparently, this is the crowning achievement of 3
 decades of software development.

?? I don't have such problems with my computers, and I tend to run them for 5 years before upgrading. The HD failure rate is about the same as in the 80's. Of course, we no longer have to deal with floppies that get corrupted often. The most common failures I've had are the power supplies; they're still as bad today as in the 80's.
Sep 18 2012
next sibling parent reply Jan Knepper <jan smartsoft.us> writes:
On 09/18/2012 03:48 AM, Walter Bright wrote:
 ?? I don't have such problems with my computers, and I tend to run them
 for 5 years before upgrading. The HD failure rate is about the same as
 in the 80's. Of course, we no longer have to deal with floppies that get
 corrupted often.

 The most common failure I've had are the power supplies, they're still
 as bad today as in the 80's.

Never had a power supply failure... But all my power supplies can handle a lot more than they are used for. The #0 failure I see is HD... :-( I have had quite a few disks die on me in the last 20 years...
Sep 18 2012
parent Jacob Carlborg <doob me.com> writes:
On 2012-09-18 21:58, Jonathan M Davis wrote:

 I have an rsync cronjob back up my home partition nightly so that the chances
 of losing that data are slim (though I don't back up all the rest of my data
 from my many hard drives unfortunately - it would take up too much space).
 It's saved me on a number of occasions from corrupted or lost data even
 _without_ hard drive failures. Regular backups are a must IMHO, though I think
 that most people consider it too much of a hassle to bother with
 unfortunately.

It's dead easy on Mac OS X with the built-in Time Machine. Just select the backup disk and you're done. By default it backs up all HFS+ disks; if you want, you can choose to exclude some. -- /Jacob Carlborg
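For the non-Mac crowd, the rsync-in-cron setup Jonathan describes can be sketched roughly like this (a guess at the shape of such a script, not his actual setup; the paths, schedule, and exclude pattern are placeholders):

```shell
#!/bin/sh
# Hypothetical nightly home-partition backup. A crontab line such as
#   0 3 * * * $HOME/bin/backup-home.sh
# would run it at 3am. SRC/DEST are placeholders - point DEST at a
# real backup disk.
SRC="${SRC:-$HOME/}"
DEST="${DEST:-/mnt/backup/home/}"

# -a: archive mode (recursive; preserves permissions/times/symlinks)
# --delete: mirror deletions so the copy matches the source
# --exclude: keep throwaway cache data out of the backup
rsync -a --delete --exclude='.cache/' "$SRC" "$DEST"
```

Note the trailing slashes: with rsync, `SRC/` copies the *contents* of the directory into DEST rather than nesting the directory inside it.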
Sep 19 2012
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/18/2012 8:36 AM, Sean Kelly wrote:
 On Sep 18, 2012, at 12:48 AM, Walter Bright <newshound2 digitalmars.com>
 wrote:
 The most common failure I've had are the power supplies, they're still as
 bad today as in the 80's.

There are good power supplies, they just don't come in pre-built computers because they're expensive. I think the same could be said of products from any era.

Well, you guys have convinced me. Next time I buy a PS, I'm going to spend more money on it.
Sep 18 2012
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/19/2012 1:37 AM, Mehrdad wrote:
 What exactly do you guys _do_ with your computer that suddenly breaks the power
 supplies?! Maybe I'm just too young to know, but I've never seen a power supply
 break...

The symptoms I had are that it just won't turn on. What do I do? Take the old one out, go to the computer store or the computer recycler, and look for a matching one. The recycler is great for older ones not made anymore; I can get a replacement for $10 or so. Sometimes, I have to modify the case to get it to fit.
Sep 19 2012
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/19/2012 2:47 AM, monarch_dodra wrote:
 I have about 2 laptops at home, whose batteries are left with, literally, 0
 charge. Even a simple split-second power cut, and they will turn off :(

Yeah, my ancient laptop's batteries are good for maybe a second on full charge.
Sep 19 2012
prev sibling next sibling parent Jacob Carlborg <doob me.com> writes:
On 2012-09-18 17:12, H. S. Teoh wrote:

 Reformatting and reinstalling, though, is a matter of course on any
 Windows installation that I've ever seen. I've heard of such things as
 stable Windows installations, but as far as my experience goes those are
 mythical beasts. Things just fail the moment you start doing something
 non-trivial, like anything besides read email, watch youtube, and browse
 the 'Net. I've been spared this pain for the most part 'cos I swore off
 Windows and have been running Linux as my main OS for at least 10 years,
 but I do still get requests for help to fix broken Windows
 installations. Most of the time, the thing's either unfixable (hood is
 welded shut) or not worth the effort to fix 'cos reformat + reinstall is
 faster (shudder).

I had a Windows machine running as an HTPC that I had no problems with. Although the only thing I used it for was to watch movies.
 That's not to say that Linux doesn't have its own problems, of course.
 The libc5 -> libc6 transition is one of the memorable nightmares in its
 history. There have been others. X11 failures can get really ugly (back
 in the days before KVM, a crashed or wedged X server meant your graphics
 card is stuck in graphics mode and the console shows up as random dot
 patterns -- good luck trying to fix the system when you can't see what
 you type). Once I accidentally broke the dynamic linker, and EVERYTHING
 broke, because everything depended on it. The only thing left was a
 single bash shell over SSH (this was on a remote server with no easy
 physical access), and the only commands that didn't fail were built-in
 bash commands like echo. So I had to transfer busybox over by converting
 it into a series of echo commands that reconstituted the binary and
 copy-n-paste it. It's one of those moments where you get so much
 satisfaction from having rescued a dying system singlehandedly with echo
 commands, but it's also one of those things that puts Linux on some
 people's no-way, no-how list.

That's also the beauty of Linux, you could do it. Try doing that on a Windows machine. -- /Jacob Carlborg
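For the curious, that echo/printf rescue can be sketched roughly like this (a reconstruction of the general technique, not H. S. Teoh's actual commands; it assumes bash on the broken box, where printf and >> are shell built-ins, and 'busybox' and the /tmp path are placeholder names):

```shell
# On a *healthy* machine, turn the binary into a pile of printf
# commands made of plain printable text:
od -An -vtx1 busybox | awk '{
    line = "printf \047"             # \047 = a literal single quote
    for (i = 1; i <= NF; i++)
        line = line "\\x" $i         # one \xHH escape per byte
    print line "\047 >> /tmp/busybox"
}' > rebuild.sh

# Each generated line looks like:
#   printf '\x7f\x45\x4c\x46...' >> /tmp/busybox
# Pasted into the dying remote shell, only bash built-ins execute, so
# the broken dynamic linker is never touched. (Setting the execute bit
# afterwards is left to whatever working tool remains on the box.)
```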
Sep 19 2012
prev sibling next sibling parent Jacob Carlborg <doob me.com> writes:
On 2012-09-18 17:36, Sean Kelly wrote:
 On Sep 18, 2012, at 12:48 AM, Walter Bright <newshound2 digitalmars.com> wrote:
 The most common failure I've had are the power supplies, they're still as bad
today as in the 80's.

There are good power supplies, they just don't come in pre-built computers because they're expensive. I think the same could be said of products from any era.

What kind of computers are you guys using? I have never owned a pre-built computer (except for laptops). I always buy my own components and assemble the computer myself. Then I know what I get. -- /Jacob Carlborg
Sep 19 2012
prev sibling parent Timon Gehr <timon.gehr gmx.ch> writes:
On 09/19/2012 11:54 AM, Mehrdad wrote:
 ...

 At least when Windows has the occasional boot problem which I stupidly
 caused, it's _fixable_ and doesn't lie to you about having fixed it!!

The issue is that in one case you know how to fix it and in the other one you do not (and you care less about it because you prefer to think Windows is superior as it is what you use '99% of the time'), not that the problems are inherently (un)fixable.
Sep 19 2012
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/18/2012 12:37 AM, Nick Sabalausky wrote:
 Heh, actually, my 10-year-old 32-bit single-core XP desktop is still
 going strong,

I upgraded to a 6 core 64 bit machine. It really does improve the usability of my computer. My laptop is probably 8 years old, but I keep it for travel use. It does presentations just fine, and nobody would find it worth stealing. And if it breaks or I lose it, I won't miss it. All I've done to it is replace the drive with an SSD. Unfortunately, XP doesn't do well with SSDs, and it's as slow as the old HD. At least it's much quieter than the whining HD.
Sep 18 2012
parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/18/2012 2:16 AM, Nick Sabalausky wrote:
 On Tue, 18 Sep 2012 01:10:07 -0700
 Walter Bright <newshound2 digitalmars.com> wrote:

 On 9/18/2012 12:37 AM, Nick Sabalausky wrote:
 Heh, actually, my 10-year-old 32-bit single-core XP desktop is still
 going strong,

I upgraded to a 6 core 64 bit machine. It really does improve the usability of my computer.

I avoid bloatware, so extra silicon doesn't do nearly as much for me.

Yeah, well, OCRing a book on my laptop takes an hour, but 5 minutes on my desktop. Also, my laptop takes several minutes to boot, my desktop is under 30 seconds. Running the D test suite on the laptop is no longer practical - takes way too long.
Sep 18 2012
prev sibling next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 9/18/12 8:53 AM, Steven Schveighoffer wrote:
 I never ever ever accidentally call someone when the phone is in my
 pocket, because it gets locked when I'm done with it. In fact, I never
 accidentally do *anything* on my iPhone. Never happened with my
 flip-phone either, but certainly the capacitive touch screen has not
 reintroduced that problem for those who are willing to learn how to use
 them.

Yes!
 These rants are absolutely hilarious. It's like saying you hate
 calculators because you can't slide the buttons like on your abacus.

I thought I was alone in thinking so. To me these rants are eerie - I can't recognize in them one single problem I've actually experienced. I do recognize the frustration, though - some systems are indeed designed with next to no regard for usability. Last time I saw that was on a Delta flight. If the UI designer optimized for something, it must have been the maximization of the number of key presses for doing anything.

Andrei
Sep 18 2012
parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/18/2012 6:21 AM, Andrei Alexandrescu wrote:
 Last time I've seen that was on a Delta flight. If the UI
 designer optimized for something, it must have been the maximization of the
 number of key presses for doing anything.

What's miserable is the guy behind you stabbing the touchscreen on the back of your seat with his finger, unable to find anything he wants to watch.
Sep 18 2012
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2012-09-18 09:37, Nick Sabalausky wrote:

 - Oracle (Even if it's not a terrible DBMS, it's certainly overpriced)

You need a SAP system to keep track of the cost of your Oracle system :) -- /Jacob Carlborg
Sep 19 2012
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Sep 18, 2012 at 12:35:45AM -0400, Nick Sabalausky wrote:
[...]
 - Set-top firmware completely fubared, just like you described, and
   the company and tech people just shrugged it off and gave excuses
   that didn't make any sense at all.

They wrote it in ActionScript. So it's a feature, not a bug! :-P
 - Video feeds that I could almost swear must have been MPEG *1*.
   Constant compression artifacts.

Ugh. This reminds me of that nasty online plague known as WebEx. Inferior proprietary non-interoperable video encoding, in a day and age when superior open standards exist. Usable only with a badly designed proprietary player with egregious usability problems. People have complained loud and clear, and the official response is: our engineering team designed this train wreck so we're stuck with it, and we're looking to maybe perhaps someday move to a better format, but that's not on our list of priorities right now.

Yet for whatever reason corporate types just love WebEx. Every meeting and cow-orker's son's birthday party is on WebEx. Ugh. Nowadays I just resort to looking over the cow-orker's shoulders when reviewing WebEx videos instead of defiling my PC with that crap.

[...]
 I miss the 80's: Devices worked and idiots didn't use computers.

LOL... I agree with the sentiment. My dad has a pair of Apple II's from the 80's, and they still work. He does his accounts on them sometimes. Compared to a 3-year-old PC of today, which is probably already dying a horrible death of HD failures, fan failures, CPU overheating, software breakages that have gotten it into a state that requires reformatting and reinstalling to fix. Apparently, this is the crowning achievement of 3 decades of software development. Sigh.

T

-- In theory, software is implemented according to the design that has been carefully worked out beforehand. In practice, design documents are written after the fact to describe the sorry mess that has gone on before.
Sep 17 2012
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Sep 19, 2012 at 05:51:31PM -0400, Nick Sabalausky wrote:
 On Wed, 19 Sep 2012 13:38:32 -0700
 "H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 
 On Wed, Sep 19, 2012 at 08:49:58PM +0200, Mehrdad wrote:
 It's pretty damn hard to convince Linux users that what you're
 trying to do is, in fact, not out of stupidity/ignorance.

It's pretty damn hard to convince Windows zealots that anything but the Windows way is not out of stupidity/ignorance.

Windows zealots are pretty rare, though. Most Windows users accept that it's just an OS, and that it has its problems and downsides. (It'd be pretty hard to be a Windows user and *not* accept that Windows has its problems.)

Haha, true! [...]
 What's "sloppy focus"?

The window focus automatically changes to whatever window the mouse is currently hovering over. Preferably WITHOUT automatically bringing said window to the top. (Good luck making this work on Windows. And once you actually manage to coax Windows to do it, have fun seeing the train wreck that is your applications once you start using them this way.)

Not exactly what you described, but similar: http://ehiti.de/katmouse/ When I point at something and scroll, I expect my *target* to scroll, not whatever the hell random thing I just happened to have clicked on last.

Yeah, another annoyance -- not with Windows specifically but with GUI apps in general -- is the search function more often than not has an invisible cursor from which the next search begins, which may be COMPLETELY unrelated to what you're currently looking at (e.g. if you scrolled the screen after the previous search). Or a new search always starts from the top of the document/page/whatever regardless of where you currently are. This is completely counterintuitive and stupid, and makes it a royal pain esp. when you want to search starting from a specific location.
 I would *HATE* using windows if I didn't have that. Unfortunately, it
 doesn't *always* work on Win7 (usually does, though). Works great on
 XP.
 
 But I agree, trying to do anything the non-Windows way on Windows
 involves stupid PITA hacking, that doesn't always work right, *if*
 it's even possible at all.

Yeah, after attempting to do sloppy focus on Windows, I crawled back into a dark corner and wept silently as I conceded to doing things the Windows way.
 And it's not *just* doing something the non-Windows way, it's even
 specific *versions* of windows: You can't even get things the WinXP
 way on Win7. Sure, *some* things you can, *sometimes*, with obscure
 hacks that don't even always work...

Yeah that's what I meant by "hood welded shut". Although it's probably more like "hood booby-trapped shut, open at your own risk". :-P
 Man, I'm really gonna have to get around to upgrading my laptop from
 Win7 back to XP sometime...Fuck this shit...

"Upgrading back to XP", lol! T -- Windows 95 was a joke, and Windows 98 was the punchline.
Sep 19 2012
prev sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Sep 20, 2012 at 01:15:32AM -0400, Nick Sabalausky wrote:
 On Wed, 19 Sep 2012 13:38:32 -0700
 "H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 
 I prefer to communicate in complete sentences rather than
 point-n-grunt.

That's great :) I like both CLI and GUI, depending on what I'm doing, but that's really good.

Well, I was being a bit over the top. :-P GUIs do have their place, such as when you're dealing with, say, 3D CAD or other inherently graphical tasks. But IMNSHO, GUIs are overused for tasks that they aren't necessarily the best interface for. I cringe every time I have to construct an expression (boolean or otherwise) using a hilariously convoluted GUI dialogue with all manner of drop boxes and other assorted widgets---or worse, a deeply-nested set of menus that you have to wade through for every single term in the expression---when I could've just typed out the expression on the keyboard in less than 1/10 of the time it takes to point-n-grunt it out. Makes me feel like trying to write D code in K&R C, or worse, PHP. :-P (There goes my feeble attempt at bringing this back on topic.)

T

-- The day Microsoft makes something that doesn't suck is probably the day they start making vacuum cleaners... -- Slashdotter
Sep 19 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Mon, 17 Sep 2012 22:29:10 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 
 Yet for whatever reason corporate types just love WebEx. Every meeting
 and cow-orker's son's birthday party is on WebEx. Ugh. Nowadays I just
 resort to looking over the cow-orker's shoulders when reviewing WebEx
 videos instead of defiling my PC with that crap.
 

If corporate types love it, you know it's bad: - Flash - Acrobat Reader - COBOL - PHP - VBScript / VisualBasic (ie, "Cobol of the 90's") - Visual SourceSafe - Lotus Notes - Blackboard (HUGE around colleges, or at least it was when I left) - Oracle (Even if it's not a terrible DBMS, it's certainly overpriced)
 
 [...]
 I miss the 80's: Devices worked and idiots didn't use computers.

LOL... I agree with the sentiment. My dad has a pair of Apple II's from the 80's, and they still work. He does his accounts on them sometimes.

I still have a IIc. The Apple II's, IMNSHO are the *one* worthwhile product line Apple's ever had. Probably b/c it's the only one (to my knowledge) that was more Woz than Jobs. Love those machines. I regret that I never have a chance to use it anymore.
 Compared to a 3-year-old PC of today, which is probably
 already dying a horrible death of HD failures, fan failures, CPU
 overheating, software breakages that's gotten it into a state that
 requires reformatting and reinstalling to fix.

Heh, actually, my 10-year-old 32-bit single-core XP desktop is still going strong, and is in active use (though it has had some upgrades: more RAM, tons of HDD (totalling ~2.5TB), a SATA add-on card, a USB 2.0 add-on, a DVD-burner). Although as of a few months ago it's no longer my primary system since I got a super-cheap 2-core 64-bit laptop - the speed is nice on the rare occasion I deal with video, but the real killer feature is simply that it's a laptop.
Sep 18 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 00:48:09 -0700
Walter Bright <newshound2 digitalmars.com> wrote:

 On 9/17/2012 10:29 PM, H. S. Teoh wrote:
 LOL... I agree with the sentiment. My dad has a pair of Apple II's
 from the 80's, and they still work. He does his accounts on them
 sometimes. Compared to a 3-year-old PC of today, which is probably
 already dying a horrible death of HD failures, fan failures, CPU
 overheating, software breakages that's gotten it into a state that
 requires reformatting and reinstalling to fix. Apparently, this is
 the crowning achievement of 3 decades of software development.

?? I don't have such problems with my computers, and I tend to run them for 5 years before upgrading. The HD failure rate is about the same as in the 80's. Of course, we no longer have to deal with floppies that get corrupted often. The most common failure I've had are the power supplies, they're still as bad today as in the 80's.

I went through a few-years-long period where I was constantly replacing failed power supplies. Then I finally decided to splurge on a GOOD one: huge wattage, very reputable company, and at *least* twice the $$$ I'd ever spent on a power supply before. Never had another power supply problem since. (Knock on wood...)

One important thing to keep in mind (that I've learned from Tom's Hardware and Sharky Extreme) is that power supply manufacturers apparently lie about their wattages as a regular matter of course. Ie, if it says "X Watts", then you're never going to get it to even about 0.9*X without the stupid thing blowing up. So keep that in mind when shopping.

Regarding HDDs, I've sworn I will *never* run a main system again without a GOOD always-on SMART monitor like Hard Disk Sentinel <http://www.hdsentinel.com/>. In fact, that's one of the main reasons I haven't switched my primary OS from Win to Linux yet, because I can't find a good Linux SMART monitor. (Manually running a CLI program - or writing a script to do it - doesn't even remotely count.) Oooh! Actually, now that I've looked up that link, it looks like they do have an early Linux version now. Awesome, I'm gonna have to try that out.
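For reference, Linux does ship the pieces for this, even if the polish Nick wants isn't there: smartctl for one-shot queries and smartd for the always-on part, both from the smartmontools package (the device name and mail address below are placeholders):

```shell
# One-shot overall health verdict (needs root; /dev/sda is a placeholder):
smartctl -H /dev/sda

# Full attribute dump - reallocated and pending sector counts are the
# attributes that tend to predict imminent death:
smartctl -A /dev/sda

# For always-on monitoring, smartd runs as a daemon; a line like this
# in /etc/smartd.conf watches every detected disk and mails on trouble:
#   DEVICESCAN -a -m root@localhost
```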
Sep 18 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 01:10:07 -0700
Walter Bright <newshound2 digitalmars.com> wrote:

 On 9/18/2012 12:37 AM, Nick Sabalausky wrote:
 Heh, actually, my 10-year-old 32-bit single-core XP desktop is still
 going strong,

I upgraded to a 6 core 64 bit machine. It really does improve the usability of my computer.

I avoid bloatware, so extra silicon doesn't do nearly as much for me.
Sep 18 2012
prev sibling next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Tuesday, September 18, 2012 05:16:53 Nick Sabalausky wrote:
 On Tue, 18 Sep 2012 01:10:07 -0700
 
 Walter Bright <newshound2 digitalmars.com> wrote:
 On 9/18/2012 12:37 AM, Nick Sabalausky wrote:
 Heh, actually, my 10-year-old 32-bit single-core XP desktop is still
 going strong,

I upgraded to a 6 core 64 bit machine. It really does improve the usability of my computer.

I avoid bloatware, so extra silicon doesn't do nearly as much for me.

It all depends on what you're doing. Today's machines are overpowered for many common tasks, but for some stuff you can _always_ use more CPU (e.g. transcoding video). - Jonathan M Davis
Sep 18 2012
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Sep 18, 2012 at 02:57:08AM -0700, Jonathan M Davis wrote:
 On Tuesday, September 18, 2012 05:16:53 Nick Sabalausky wrote:
 On Tue, 18 Sep 2012 01:10:07 -0700
 
 Walter Bright <newshound2 digitalmars.com> wrote:
 On 9/18/2012 12:37 AM, Nick Sabalausky wrote:
 Heh, actually, my 10-year-old 32-bit single-core XP desktop is
 still going strong,

I upgraded to a 6 core 64 bit machine. It really does improve the usability of my computer.

I avoid bloatware, so extra silicon doesn't do nearly as much for me.

It all depends on what you're doing. Today's machines are overpowered for many common tasks, but for some stuff you can _always_ use more CPU (e.g. transcoding video).

Yeah, I run povray on complicated auto-generated math models quite a lot, and my recent upgrade to an AMD hexacore made a huge difference. It's the primary reason I upgraded my 10-year-old AMD Barton system. This was a couple o' years ago, and I'm planning to run this system for at least another decade before I upgrade again (I hate the upgrade cycle). OTOH, my primary window manager is ratpoison, so it's not like the hexacore makes any noticeable difference at all on that front, though GUI-dependent people might find hexacore desirable 'cos then they can turn on all the eye-candy in Compiz and still have the system work without lagging. (I used to have a Compiz installation purely for showing off and for Linux propaganda purposes, but it fell into disuse and nowadays I just don't bother.) T -- Once bitten, twice cry...
Sep 18 2012
prev sibling next sibling parent Sean Kelly <sean invisibleduck.org> writes:
On Sep 18, 2012, at 12:48 AM, Walter Bright <newshound2 digitalmars.com> wrote:

 The most common failure I've had are the power supplies, they're still as bad today as in the 80's.

There are good power supplies, they just don't come in pre-built computers because they're expensive. I think the same could be said of products from any era.
Sep 18 2012
prev sibling next sibling parent Sean Kelly <sean invisibleduck.org> writes:
On Sep 18, 2012, at 1:28 AM, Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> wrote:

 One important thing to keep in mind (that I've learned from Tom's
 Hardware and Sharky Extreme) is that power supply manufacturers
 apparently lie about their wattages as a regular matter of course. I.e.,
 if it says "X Watts", then you're never going to get it to even about
 0.9*X without the stupid thing blowing up. So keep that in mind when
 shopping.

They are probably advertising their peak wattage. One of the most valuable lessons I've learned about electrical equipment is that everything is built with an intended usage pattern, and if you exceed that, the part will fail. The really frustrating thing is that it can be very hard to find a version of something rated for continuous use, and when you do, it's incredibly expensive compared to the off-the-shelf version of that thing. I've gone through countless paper shredders because of this, and melted more than one popcorn popper (I roast my own coffee).
 Regarding HDDs, I've sworn I will *never* run a main system again
 without a GOOD always-on SMART monitor like Hard Disk Sentinel
 <http://www.hdsentinel.com/>. In fact, that's one of the main reasons I
 haven't switched my primary OS from Win to Linux yet, because I can't
 find a good Linux SMART monitor. (Manually running a CLI program
 - or writing a script to do it - doesn't even remotely count.) Oooh!
 Actually, now that I've looked up that link, it looks like they do
 have an early Linux version now. Awesome, I'm gonna have to try that
 out.

And I've learned that I will never again run a striped RAID off a mainboard controller, because when the mainboard dies you're SOL.
Sep 18 2012
prev sibling next sibling parent "monarch_dodra" <monarchdodra gmail.com> writes:
On Tuesday, 18 September 2012 at 15:47:30 UTC, Sean Kelly wrote:
 And I've learned that I will never again run a striped RAID off 
 a mainboard controller, because when the mainboard dies you're 
 SOL.

Migrate to ZFS! :D
Sep 18 2012
prev sibling next sibling parent "Chris Nicholson-Sauls" <ibisbasenji gmail.com> writes:
On Tuesday, 18 September 2012 at 08:27:31 UTC, Nick Sabalausky 
wrote:
 Regarding HDDs, I've sworn I will *never* run a main system 
 again
 without a GOOD always-on SMART monitor like Hard Disk Sentinel
 <http://www.hdsentinel.com/>. In fact, that's one of the main 
 reasons I
 haven't switched my primary OS from Win to Linux yet, because I 
 can't
 find a good Linux SMART monitor. (Manually running a CLI program
 - or writing a script to do it - doesn't even remotely count.) 
 Oooh!
 Actually, now that I've looked up that link, it looks like they 
 do
 have an early Linux version now. Awesome, I'm gonna have to try 
 that
 out.

I do believe conky can provide SMART monitoring. http://conky.sourceforge.net/ Although periodically running GSmartCtl (GUI front-end to the command line tool) isn't a bad idea, either, to see the specific details (spin-ups, heat stress, etc) and/or execute the drive's self-test.
Sep 18 2012
prev sibling next sibling parent reply Dmitry Olshansky <dmitry.olsh gmail.com> writes:
On 18-Sep-12 02:39, Xinok wrote:
 On Monday, 17 September 2012 at 07:16:15 UTC, Jonathan M Davis wrote:
 On Monday, September 17, 2012 09:05:48 David Nadlinger wrote:
 On Sunday, 16 September 2012 at 21:59:30 UTC, Jøn wrote:
 The best idea I had today: rename D into :D
 * Easier to google


You might be surprised to see that D is the number 1 result for ":D" even today.

The search results seem to be identical whether you search for D or :D, so the colon seems to be ignored. Of course, the fact that dlang.org comes up first could just be because google tailors its results to you, and we're both people who deal with D (and presumably search for it from time to time) already. - Jonathan M Davis

It's the second result on DuckDuckGo, which *doesn't* tailor its search results. https://duckduckgo.com/?q=d

Nowhere to be found for me. Obviously they also do tailor the results. (It's the first time I've seen duckduckgo, and I followed your exact link.) -- Dmitry Olshansky
Sep 18 2012
parent David Gileadi <gileadis NSPMgmail.com> writes:
On 9/18/12 11:10 AM, Dmitry Olshansky wrote:
 On 18-Sep-12 02:39, Xinok wrote:
 It's the second result on DuckDuckGo, which *doesn't* tailor its search
 results.

 https://duckduckgo.com/?q=d

Nowhere to be found for me. Obviously they also do tailor the results. (it's first time I see duckduckgo, and I followed your exact link)

It must have changed overnight--yesterday I saw it as the second result; today I don't see it either (except when I expand the Computing section).
Sep 18 2012
prev sibling next sibling parent "monarch_dodra" <monarchdodra gmail.com> writes:
On Tuesday, 18 September 2012 at 19:03:40 UTC, Jan Knepper wrote:
 On 09/18/2012 03:48 AM, Walter Bright wrote:
 ?? I don't have such problems with my computers, and I tend to 
 run them
 for 5 years before upgrading. The HD failure rate is about the 
 same as
 in the 80's. Of course, we no longer have to deal with 
 floppies that get
 corrupted often.

 The most common failure I've had are the power supplies, 
 they're still
 as bad today as in the 80's.

Never had a power supply failure... But all my power supplies can handle a lot more than they are used for. The #0 failure I see is HD... :-( I have had the necessary disks die on me in the last 20 years...

Neither have I... in the past 10 years (young dev here). However, I've had 3 SSDs crap out on me in less than a month... out of 3... on 3 different computers. I'm on my fourth now. 4 months running. The worst part about an SSD failure is the utter and total lack of warning. One day, everything is green. The next day, the BIOS can't see it. Game over. I've had friends ask me to "investigate" blue screens and intermittent errors. The HDD was dying, but the data/OS were still salvageable. Not so with an SSD.
Sep 18 2012
prev sibling next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Tuesday, September 18, 2012 21:37:06 monarch_dodra wrote:
 On Tuesday, 18 September 2012 at 19:03:40 UTC, Jan Knepper wrote:
 On 09/18/2012 03:48 AM, Walter Bright wrote:
 ?? I don't have such problems with my computers, and I tend to
 run them
 for 5 years before upgrading. The HD failure rate is about the
 same as
 in the 80's. Of course, we no longer have to deal with
 floppies that get
 corrupted often.
 
 The most common failure I've had are the power supplies,
 they're still
 as bad today as in the 80's.

Never had a power supply failure... But all my power supplies can handle a lot more than they are used for. The #0 failure I see is HD... :-( I have had the necessary disks die on me in the last 20 years...

Neither have I... in the past 10 years (young dev here). However, I've had 3 SSDs crap out on me in less than a month... out of 3... on 3 different computers. I'm on my fourth now. 4 months running. The worst part about an SSD failure is the utter and total lack of warning. One day, everything is green. The next day, the BIOS can't see it. Game over. I've had friends ask me to "investigate" blue screens and intermittent errors. The HDD was dying, but the data/OS were still salvageable. Not so with an SSD.

I have an rsync cronjob back up my home partition nightly so that the chances of losing that data are slim (though I don't back up all the rest of my data from my many hard drives unfortunately - it would take up too much space). It's saved me on a number of occasions from corrupted or lost data even _without_ hard drive failures. Regular backups are a must IMHO, though I think that most people consider it too much of a hassle to bother with unfortunately. - Jonathan M Davis
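A minimal sketch of such a setup, with made-up paths and schedule (the actual cronjob and partitions will differ):

```shell
# Hypothetical crontab entry (installed via `crontab -e`): every night
# at 03:00, mirror /home onto a backup disk. -a preserves permissions,
# ownership, and timestamps; --delete removes files from the copy that
# were deleted from the source, keeping the backup an exact mirror.
0 3 * * *  rsync -a --delete /home/ /mnt/backup/home/
```

Note that a mirror like this protects against drive failure but not against deleting a file and only noticing it days later; rsync's `--link-dest` option is one common way to keep cheap dated snapshots.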
Sep 18 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 19:22:50 +0200
"Chris Nicholson-Sauls" <ibisbasenji gmail.com> wrote:
 
 I do believe conky can provide SMART monitoring.
 http://conky.sourceforge.net/
 
 Although periodically running GSmartCtl (GUI front-end to the 
 command line tool) isn't a bad idea, either, to see the specific 
 details (spin-ups, heat stress, etc) and/or execute the drive's 
 self-test.

See, that's the problem though. A SMART tool that needs to be periodically run manually is next to pointless. I expect a SMART monitor to *always* be running and *actively* notify me when something starts going downhill. Naturally, that's not a substitute for actually looking at the stats now and then, but the "always-on, active notification" is a MUST.
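For what it's worth, on Linux the smartmontools package ships an always-on daemon, smartd, that does exactly this kind of active notification. A hypothetical /etc/smartd.conf fragment (device name and mail target are examples):

```shell
# Hypothetical /etc/smartd.conf for smartmontools' smartd daemon, which
# polls in the background and notifies when attributes degrade:
#
# Track all SMART attributes (-a) on /dev/sda, enable automatic offline
# testing (-o) and attribute autosave (-S), run a short self-test daily
# at 02:00 (-s), and mail root when something starts going downhill (-m).
/dev/sda -a -o on -S on -s S/../.././02 -m root
#
# Or let smartd find and monitor every SMART-capable drive:
# DEVICESCAN -a -m root
```

It's not a GUI like Hard Disk Sentinel, but it does satisfy the "always-on, active notification" requirement.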
Sep 18 2012
prev sibling next sibling parent "Mehrdad" <wfunction hotmail.com> writes:
On Wednesday, 19 September 2012 at 06:11:00 UTC, Walter Bright 
wrote:
 On 9/18/2012 8:36 AM, Sean Kelly wrote:
 On Sep 18, 2012, at 12:48 AM, Walter Bright 
 <newshound2 digitalmars.com>
 wrote:
 The most common failure I've had are the power supplies, 
 they're still as
 bad today as in the 80's.

There are good power supplies, they just don't come in pre-built computers because they're expensive. I think the same could be said of products from any era.

Well, you guys have convinced me. Next time I buy a PS, I'm going to spend more money on it.

What exactly do you guys _do_ with your computer that suddenly breaks the power supplies?! Maybe I'm just too young to know, but I've never seen a power supply break...
Sep 19 2012
prev sibling next sibling parent "Simen Kjaeraas" <simen.kjaras gmail.com> writes:
On Wed, 19 Sep 2012 10:37:44 +0200, Mehrdad <wfunction hotmail.com> wrote:

 On Wednesday, 19 September 2012 at 06:11:00 UTC, Walter Bright wrote:
 On 9/18/2012 8:36 AM, Sean Kelly wrote:
 On Sep 18, 2012, at 12:48 AM, Walter Bright  
 <newshound2 digitalmars.com>
 wrote:
 The most common failure I've had are the power supplies, they're  
 still as
 bad today as in the 80's.

There are good power supplies, they just don't come in pre-built computers because they're expensive. I think the same could be said of products from any era.

Well, you guys have convinced me. Next time I buy a PS, I'm going to spend more money on it.

What exactly do you guys _do_ with your computer that suddenly breaks the power supplies?! Maybe I'm just too young to know, but I've never seen a power supply break...

Use them every day for a regular computer? While I've had no spectacular failures (yet), this has been sufficient to break a PSU or two. Then I decided to spend money getting a quality PSU, and it hasn't had a single problem in 7 years. -- Simen
Sep 19 2012
prev sibling next sibling parent "monarch_dodra" <monarchdodra gmail.com> writes:
On Wednesday, 19 September 2012 at 08:36:46 UTC, Mehrdad wrote:
 What exactly do you guys _do_ with your computer that suddenly 
 breaks the power supplies?! Maybe I'm just too young to know, 
 but I've never seen a power supply break...

I once tried to do some GPU calculations. After several hours, the PSU failed, frying my components. The graphics card was literally ON FIRE (!). Nothing was salvageable. Anyways, *THAT* is how to kill a PSU, and *THAT* is what happens when they fail...
Sep 19 2012
prev sibling next sibling parent "Mehrdad" <wfunction hotmail.com> writes:
On Wednesday, 19 September 2012 at 08:53:33 UTC, monarch_dodra 
wrote:
 On Wednesday, 19 September 2012 at 08:36:46 UTC, Mehrdad wrote:
 What exactly do you guys _do_ with your computer that suddenly 
 breaks the power supplies?! Maybe I'm just too young to know, 
 but I've never seen a power supply break...

 I once tried to do some GPU calculations. After several hours, the PSU failed, frying my components. The graphics card was literally ON FIRE (!). Nothing was salvageable. Anyways, *THAT* is how to kill a PSU, and *THAT* is what happens when they fail...

Dang that's... intense. o.o Are laptop power supplies more durable or something? None of my laptops (or anyone's laptop I know) have had problematic power supplies...
Sep 19 2012
prev sibling next sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Tuesday, 18 September 2012 at 05:26:33 UTC, H. S. Teoh wrote:
 On Tue, Sep 18, 2012 at 12:35:45AM -0400, Nick Sabalausky wrote:
 [...]
 Yet for whatever reason corporate types just love WebEx. Every 
 meeting
 and cow-orker's son's birthday party is on WebEx. Ugh. Nowadays 
 I just
 resort to looking over the cow-orker's shoulders when reviewing 
 WebEx
 videos instead of defiling my PC with that crap.

I am a corporate guy that loves WebEx. If you ever went through the amount of failed attempts in the corporate world starting with NetMeeting, Sametime and a couple of others I already forgot, in the last decade, you can only love how easy and stable it is to use WebEx conferences. -- Paulo
Sep 19 2012
prev sibling next sibling parent "monarch_dodra" <monarchdodra gmail.com> writes:
On Wednesday, 19 September 2012 at 09:19:13 UTC, Mehrdad wrote:
 On Wednesday, 19 September 2012 at 08:53:33 UTC, monarch_dodra 
 wrote:
 On Wednesday, 19 September 2012 at 08:36:46 UTC, Mehrdad wrote:
 What exactly do you guys _do_ with your computer that 
 suddenly breaks the power supplies?! Maybe I'm just too young 
 to know, but I've never seen a power supply break...

 I once tried to do some GPU calculations. After several hours, the PSU failed, frying my components. The graphics card was literally ON FIRE (!). Nothing was salvageable. Anyways, *THAT* is how to kill a PSU, and *THAT* is what happens when they fail...

Dang that's... intense. o.o Are laptop power supplies more durable or something? None of my laptops (or anyone's laptop I know) have had problematic power supplies...

The difference is that a laptop's wattage is nowhere near the wattage of a desktop. This is even truer of modern computers, where desktops consume even more power, whereas laptops consume much less. I've never had a problem with a laptop "PSU block" itself... Not that I can say the same about the batteries. I have two laptops at home whose batteries are left with, literally, 0 charge. Even a simple split-second power cut, and they will turn off :(
Sep 19 2012
prev sibling next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Wednesday, September 19, 2012 11:45:42 Paulo Pinto wrote:
 On Tuesday, 18 September 2012 at 05:26:33 UTC, H. S. Teoh wrote:
 On Tue, Sep 18, 2012 at 12:35:45AM -0400, Nick Sabalausky wrote:
 [...]
 Yet for whatever reason corporate types just love WebEx. Every
 meeting
 and cow-orker's son's birthday party is on WebEx. Ugh. Nowadays
 I just
 resort to looking over the cow-orker's shoulders when reviewing
 WebEx
 videos instead of defiling my PC with that crap.

I am a corporate guy that loves WebEx. If you ever went through the amount of failed attempts in the corporate world starting with NetMeeting, Sametime and a couple of others I already forgot, in the last decade, you can only love how easy and stable it is to use WebEx conferences.

We've taken to using google hangout where I work. It's by no means perfect, but it's far easier to setup and deal with than WebEx. There may be worse things than WebEx, but I'd just as soon never have to deal with it again. To each their own though, I suppose. - Jonathan M Davis
Sep 19 2012
prev sibling next sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Wednesday, 19 September 2012 at 10:53:40 UTC, Jonathan M Davis 
wrote:
 On Wednesday, September 19, 2012 11:45:42 Paulo Pinto wrote:
 On Tuesday, 18 September 2012 at 05:26:33 UTC, H. S. Teoh 
 wrote:
 On Tue, Sep 18, 2012 at 12:35:45AM -0400, Nick Sabalausky 
 wrote:
 [...]
 Yet for whatever reason corporate types just love WebEx. 
 Every
 meeting
 and cow-orker's son's birthday party is on WebEx. Ugh. 
 Nowadays
 I just
 resort to looking over the cow-orker's shoulders when 
 reviewing
 WebEx
 videos instead of defiling my PC with that crap.

I am a corporate guy that loves WebEx. If you ever went through the amount of failed attempts in the corporate world starting with NetMeeting, Sametime and a couple of others I already forgot, in the last decade, you can only love how easy and stable it is to use WebEx conferences.

We've taken to using google hangout where I work. It's by no means perfect, but it's far easier to setup and deal with than WebEx. There may be worse things than WebEx, but I'd just as soon never have to deal with it again. To each their own though, I suppose. - Jonathan M Davis

I have not yet tried it. In most places where I've worked, they only allow that type of tool when you can have some control over the servers where it is hosted. You know, the typical bureaucracy of multi-national companies in the corporate world. :( -- Paulo
Sep 19 2012
prev sibling parent Sean Kelly <sean invisibleduck.org> writes:
On Sep 19, 2012, at 12:40 AM, Jacob Carlborg <doob me.com> wrote:

 On 2012-09-18 17:36, Sean Kelly wrote:
 On Sep 18, 2012, at 12:48 AM, Walter Bright wrote:
 The most common failure I've had are the power supplies, they're still as bad today as in the 80's.

 There are good power supplies, they just don't come in pre-built computers because they're expensive. I think the same could be said of products from any era.

 What kind of computers are you guys using? I have never owned a pre-assembled one. I pick the parts and the store assembles the computer. Then I know what I get.

Same here. If I were to buy a pre-assembled computer I'd probably go to someplace like cyberpowerpc.com, but even then you have to explicitly pick the good PSU because it isn't included by default.
Sep 19 2012
prev sibling next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Monday, September 17, 2012 09:05:48 David Nadlinger wrote:
 On Sunday, 16 September 2012 at 21:59:30 UTC, Jøn wrote:
 The best idea I had today: rename D into :D

 * Easier to google

You might be surprised to see that D is the number 1 result for ":D" even today.

The search results seem to be identical whether you search for D or :D, so the colon seems to be ignored. Of course, the fact that dlang.org comes up first could just be because google tailors its results to you, and we're both people who deal with D (and presumably search for it from time to time) already.

- Jonathan M Davis
Sep 17 2012
prev sibling next sibling parent Andrea Fontana <nospam example.com> writes:

+1

On Mon, 17/09/2012 at 00.00 +0200, "Jøn" wrote:

 The best idea I had today: rename D into :D

 * Easier to google
 * the :D experience (Satisfied Customers smile)
 * :D associates with programming, it is symbol in f.e. LISP and Ruby
 * backwards compatible: no need to update sitename, etc.
 * Its predecessor is :C, a very very unhappy language

 I hope :D parses in D community.

 Jøn


 ps :C reminds me of Retort in the Muppet show: use C to have your programming experiments blow up on you.

Sep 17 2012
prev sibling next sibling parent "F i L" <witte2008 gmail.com> writes:
 * Its predecessor is :C, a very very unhappy language

lawl
Sep 17 2012
prev sibling next sibling parent "David Nadlinger" <see klickverbot.at> writes:
On Monday, 17 September 2012 at 07:16:15 UTC, Jonathan M Davis 
wrote:
 On Monday, September 17, 2012 09:05:48 David Nadlinger wrote:
 On Sunday, 16 September 2012 at 21:59:30 UTC, Jøn wrote:
 The best idea I had today: rename D into :D
 
 * Easier to google

You might be surprised to see that D is the number 1 result for ":D" even today.

The search results seem to be identical whether you search for D or :D, so the colon seems to be ignored.

This was exactly my point – prepending a colon doesn't magically make the name "easier to google" (if the proposal was even meant seriously in the first place).
 Of course, the fact that dlang.org comes up first
 could just be because google taylors its results to you, and 
 we're both people
 who deal with D (and presumably search for it from time to 
 time) already.

I used a completely fresh browser session and was connected via my university network, so unless Google does a really worrying amount of tracking… David
Sep 17 2012
prev sibling next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Monday, September 17, 2012 14:21:45 David Nadlinger wrote:
 I used a completely fresh browser session and was connected via
 my university network, so unless Google does a really worrying
 amount of tracking…

If you're logged in, it'll use that, and it'll probably use your IP, but if you weren't logged in, and your IP wasn't one that you'd had before, then unless they're tracking via cookies, I wouldn't expect them to be tracking you. It's actually possible to do a fairly good job of tracking you if you have javascript and/or flash enabled, because they can get at the list of fonts and whatnot on your computer, making you rather unique very quickly unless you have a _very_ standard setup (there was a paper about it not too long ago that got referenced on slashdot), but I wouldn't expect them to be going to all of that trouble. So, it sounds like it's fairly likely that dlang.org is starting to be the winning hit for searching the letter D, which is a good thing.

- Jonathan M Davis
Sep 17 2012
prev sibling next sibling parent "monarch_dodra" <monarchdodra gmail.com> writes:
On Sunday, 16 September 2012 at 21:59:30 UTC, Jøn wrote:
 The best idea I had today: rename D into :D

 * Easier to google
 * the :D experience (Satisfied Customers smile)
 * :D associates with programming, it is symbol in f.e. LISP and 
 Ruby
 * backwards compatible: no need to update sitename, etc.
 * Its predecessor is :C, a very very unhappy language

 I hope :D parses in D community.

 Jøn


 ps :C reminds me of Retort in the Muppet show: use C to have 
 your programming experiments blow up on you.

I think that is a great idea. Or at least, make it the official logo/mascot, or the unofficial name. It is these kinds of details that give programmers a more personal bond with the language they are coding in. ---- I am a :D coder
Sep 17 2012
prev sibling next sibling parent =?UTF-8?B?IkrDuG4i?= <caapora gmail.com> writes:
On Monday, 17 September 2012 at 07:16:15 UTC, Jonathan M Davis 
wrote:
 On Monday, September 17, 2012 09:05:48 David Nadlinger wrote:
 On Sunday, 16 September 2012 at 21:59:30 UTC, Jøn wrote:
 The best idea I had today: rename D into :D
 
 * Easier to google

You might be surprised to see that D is the number 1 result for ":D" even today.

The search results seem to be identical whether you search for D or :D, so the colon seems to be ignored. Of course, the fact that dlang.org comes up first could just be because google tailors its results to you, and we're both people who deal with D (and presumably search for it from time to time) already. - Jonathan M Davis

Yep, I regretted the Google suggestion soon afterwards, because :D is more common than D in other contexts. But I do think :D is a better brand, more recognizable, and D sure needs a little help in that area. So if the search results do not change... that at least leaves the option open. Anyway, the post was for your entertainment. But from what we entertain, the future springs. So surely D's future is :D ...
Sep 17 2012
prev sibling next sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Mon, 17 Sep 2012 00:16:26 -0700
Jonathan M Davis <jmdavisProg gmx.com> wrote:

 On Monday, September 17, 2012 09:05:48 David Nadlinger wrote:
 On Sunday, 16 September 2012 at 21:59:30 UTC, Jøn wrote:
 The best idea I had today: rename D into :D

 * Easier to google

You might be surprised to see that D is the number 1 result for ":D" even today.

The search results seem to be identical whether you search for D or :D, so the colon seems to be ignored.

Yea, google pathologically ignores anything that isn't strictly alphanumeric, even when you enclose in quotes. Fucking annoying as hell. Especially when you're trying to find something about C++ and the damn thing comes back with a bunch of C# results. That's a real obnoxious trend in computing: Software doing whatever the hell it feels like (usually under the guise of "being helpful") instead of doing what it's fucking told.
Sep 17 2012
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Sep 18, 2012 at 12:48:09AM -0700, Walter Bright wrote:
 On 9/17/2012 10:29 PM, H. S. Teoh wrote:
LOL... I agree with the sentiment. My dad has a pair of Apple II's
from the 80's, and they still work. He does his accounts on them
sometimes.  Compared to a 3-year-old PC of today, which is probably
already dying a horrible death of HD failures, fan failures, CPU
overheating, software breakages that's gotten it into a state that
requires reformatting and reinstalling to fix. Apparently, this is
the crowning achievement of 3 decades of software development.

?? I don't have such problems with my computers, and I tend to run them for 5 years before upgrading. The HD failure rate is about the same as in the 80's. Of course, we no longer have to deal with floppies that get corrupted often. The most common failure I've had are the power supplies, they're still as bad today as in the 80's.

OK, I exaggerated a little. I'm just bitter because once I bought a HD (from a store of questionable repute, I'll confess) that started making clicking sounds 2 months later, and then proceeded to keel over and die in the most horrible way, taking all my data with it. (But I shouldn't be so bitter, though, 'cos RMA gave me a brand new HD.) Another time, my computer started randomly rebooting for no apparent reason -- then I discovered that the power supply was starting to fail. Which caused a series of other failures like fan failures and CPU overheating. But this is all just hardware, which is beside the point. Reformatting and reinstalling, though, is a matter of course on any Windows installation that I've ever seen. I've heard of such things as stable Windows installations, but as far as my experience goes those are mythical beasts. Things just fail the moment you start doing something non-trivial, like anything besides read email, watch youtube, and browse the 'Net. I've been spared this pain for the most part 'cos I swore off Windows and have been running Linux as my main OS for at least 10 years, but I do still get requests for help to fix broken Windows installations. Most of the time, the thing's either unfixable (hood is welded shut) or not worth the effort to fix 'cos reformat + reinstall is faster (shudder). That's not to say that Linux doesn't have its own problems, of course. The libc5 -> libc6 transition is one of the memorable nightmares in its history. There have been others. X11 failures can get really ugly (back in the days before KVM, a crashed or wedged X server meant your graphics card is stuck in graphics mode and the console shows up as random dot patterns -- good luck trying to fix the system when you can't see what you type). Once I accidentally broke the dynamic linker, and EVERYTHING broke, because everything depended on it. 
The only thing left was a single bash shell over SSH (this was on a remote server with no easy physical access), and the only commands that didn't fail were built-in bash commands like echo. So I had to transfer busybox over by converting it into a series of echo commands that reconstituted the binary, and copy-n-pasting them over. It's one of those moments where you get so much satisfaction from having rescued a dying system singlehandedly with echo commands, but it's also one of those things that puts Linux on some people's no-way, no-how list. T -- The right half of the brain controls the left half of the body. This means that only left-handed people are in their right mind. -- Manoj Srivastava
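For the curious, a toy sketch of the trick (not T's actual commands; file names are made up, and piping to sh here stands in for pasting into the remote shell): dump the file as octal bytes locally, wrap each byte in a printf command, and "paste" the commands on the far side to reconstitute an identical copy.

```shell
# Hypothetical reconstruction of the rescue: ship a binary as a stream
# of printf commands that only need a working shell on the far side.
printf 'hello\n' > original   # stand-in for the busybox binary
rm -f rebuilt                 # generated commands append, so start clean

# od emits each byte as 3-digit octal; awk wraps every byte in a
# `printf "%b" "\0NNN"` command; sh "pastes" and runs those commands.
od -An -v -t o1 original |
  awk '{ for (i = 1; i <= NF; i++)
           printf "printf \"%%b\" \"\\0%s\" >> rebuilt\n", $i }' |
  sh

cmp original rebuilt && echo "identical"
```

The real transfer is far slower than scp, of course, but it works when the dynamic linker is gone and nothing but shell builtins will run.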
Sep 18 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 08:12:50 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 
 Reformatting and reinstalling, though, is a matter of course on any
 Windows installation that I've ever seen. I've heard of such things as
 stable Windows installations, but as far as my experience goes those
 are mythical beasts. Things just fail the moment you start doing
 something non-trivial, like anything besides read email, watch
 youtube, and browse the 'Net. I've been spared this pain for the most
 part 'cos I swore off Windows and have been running Linux as my main
 OS for at least 10 years, but I do still get requests for help to fix
 broken Windows installations. Most of the time, the thing's either
 unfixable (hood is welded shut) or not worth the effort to fix 'cos
 reformat + reinstall is faster (shudder).
 

My desktop's XP installation (SP2 even) has been aces for years. And years ago, when I did have to reinstall, it was just because of something stupid I'd done. I've seen plenty of screwed up Win boxes (even Win7), but it's always owned by someone who doesn't even know what a "web browser" is, so I figure chances are it's due to one of two things: A. The user doing something stupid. B. The user not using the web the way I do: with Adblock Plus installed, and JS and Flash disabled by default.
 That's not to say that Linux doesn't have its own problems, of course.
 The libc5 -> libc6 transition is one of the memorable nightmares in
 its history. There have been others. X11 failures can get really ugly
 (back in the days before KVM, a crashed or wedged X server meant your
 graphics card is stuck in graphics mode and the console shows up as
 random dot patterns -- good luck trying to fix the system when you
 can't see what you type).

Oh man, I can't even tell you how many times I've had X suddenly fail to start up with some errors for *no* apparent reason (and once that happens, X *stays* dead, unless you happen to be a Linux guru). Luckily this isn't so common anymore, though; it was mostly about ten years ago. That was one of the main reasons I swore off Linux for years, until just a few years ago when I got back into it.
 Once I accidentally broke the dynamic
 linker, and EVERYTHING broke, because everything depended on it. The
 only thing left was a single bash shell over SSH (this was on a
 remote server with no easy physical access), and the only commands
 that didn't fail were built-in bash commands like echo. So I had to
 transfer busybox over by converting it into a series of echo commands
 that reconstituted the binary and copy-n-paste it. It's one of those
 moments where you get so much satisfaction from having rescued a
 dying system singlehandedly with echo commands, but it's also one of
 those things that puts Linux on some people's no-way, no-how list.
 

Ouch.
Sep 18 2012
prev sibling next sibling parent "Mehrdad" <wfunction hotmail.com> writes:
On Tuesday, 18 September 2012 at 21:19:13 UTC, Nick Sabalausky 
wrote:
 On Tue, 18 Sep 2012 08:12:50 -0700
 "H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 
 Reformatting and reinstalling, though, is a matter of course 
 on any Windows installation that I've ever seen. I've heard of 
 such things as stable Windows installations, but as far as my 
 experience goes those are mythical beasts.

My desktop's XP installation (SP2 even) has been aces for years. And years ago, when I did have to reinstall, it was just because of something stupid I'd done. I've seen plenty of screwed up Win boxes (even Win7), but it's always owned by someone who doesn't even know what a "web browser" is, so I figure chances are it's due to one of two things: A. The user doing something stupid. B. The user not using the web the way I do: with Adblock Plus installed, and JS and Flash disabled by default.

I vote +1 for (A). :) It's not a mythical beast, it's sitting right in front of me! My situation with Windows 7 has been quite stable too. FYI, my Windows is run:
- Without any antimalware software of any kind (I hate them)
- Always with admin privileges (UAC turned off)
- In "Test Mode" (security risk in terms of digital signatures)
- I currently boot 5 OSes:
  - Windows 7 x64, the original which the laptop came with, which I use 99% of the time
  - Windows 8, which I installed a few weeks ago to try it out
  - Windows XP 32-bit and 64-bit for testing stuff
  - Linux (Ubuntu) x64 for when I need it
- I mess with partitions every few weeks
- I hack around with Windows internals quite a bit ;)

Guess which OS is the one that I've reinstalled a bazillion times? Ubuntu. And it _still_ doesn't boot automatically! I tell it to install Grub, and it says OK. I even _force_ it to reinstall Grub, and it says OK, I reinstalled myself. Then I reboot and it goes to the boot screen and just... doesn't boot. I have to type in the boot sequence commands myself. Why? Because a random, unrelated partition on the disk changed and Ubuntu freaked out. At least when Windows has the occasional boot problem which I stupidly caused, it's _fixable_ and doesn't lie to you about having fixed it!!
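For context, "typing in the boot sequence commands myself" at a GRUB 2 prompt goes roughly like this; the partition number and kernel/initrd file names below are hypothetical and differ per install:

```
grub> set root=(hd0,5)
grub> linux /boot/vmlinuz-3.2.0-31-generic root=/dev/sda5
grub> initrd /boot/initrd.img-3.2.0-31-generic
grub> boot
```

From the more limited "grub rescue>" prompt you typically first need something like "set prefix=(hd0,5)/boot/grub" followed by "insmod normal" and "normal" before the commands above are even available.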
Sep 19 2012
prev sibling next sibling parent reply David Gileadi <gileadis NSPMgmail.com> writes:

On 9/16/12 3:00 PM, "Jøn" wrote:
 The best idea I had today: rename D into :D

You asked for it: a mockup.
Sep 17 2012
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/17/2012 1:10 PM, David Gileadi wrote:
 On 9/16/12 3:00 PM, "Jøn" wrote:
 The best idea I had today: rename D into :D

You asked for it: a mockup.

<g>
Sep 17 2012
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/17/2012 3:09 PM, Nick Sabalausky wrote:
 You know, make one of them look like Phobos, and the other Deimos, and
 you may be onto something...

The trouble with cute logos is that they're like hearing the same joke over and over. I'm happy with our current logo. It's simple and elegant.
Sep 17 2012
prev sibling parent Timon Gehr <timon.gehr gmx.ch> writes:
On 09/17/2012 10:10 PM, David Gileadi wrote:
 On 9/16/12 3:00 PM, "Jøn" wrote:
 The best idea I had today: rename D into :D

You asked for it: a mockup.

Nicely done. I prefer that one to the one we have.
Sep 17 2012
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Sep 17, 2012 at 03:57:41PM -0400, Nick Sabalausky wrote:
 On Mon, 17 Sep 2012 00:16:26 -0700
 Jonathan M Davis <jmdavisProg gmx.com> wrote:

 The search results seem to be identical whether you search for D or
 :D, so the colon seems to be ignored.
 

Yea, google pathologically ignores anything that isn't strictly alphanumeric, even when you enclose in quotes. Fucking annoying as hell. Especially when you're trying to find something about C++ and the damn thing comes back with a bunch of C# results. That's a real obnoxious trend in computing: Software doing whatever the hell it feels like (usually under the guise of "being helpful") instead of doing what it's fucking told.

Any time you hear "smart" and "software" in the same sentence, be prepared for something dumb. T -- The best way to destroy a cause is to defend it poorly.
Sep 17 2012
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Sep 18, 2012 at 04:28:08AM -0400, Nick Sabalausky wrote:
[...]
 I went through a few-years-long period where I was constantly
 replacing failed power supplies. Then I finally decided to splurge on
 a GOOD one, huge wattage, very reputable company, and at *least* twice
 the $$$ I'd ever spent on a power supply before.
 
 Never had another power supply problem since. (Knock on wood...)

Yeah, all those cheap PSUs you get from door-crasher sales, all those are crap. They start behaving funny after 2 years (if even that) and randomly shutting off for no reason on the 3rd anniversary. The PSU is one of those things that you *want* to make a good investment in.
 One important thing to keep on mind (that I've learned from Tom's
 Hardware and Sharky Extreme) is that power supply manufacturers
 apparently lie about their wattages as a regular matter of course. Ie,
 if it says "X Watts", then you're never going to get it to even about
 0.9*X without the stupid thing blowing up. So keep that in mind when
 shopping.

Heh. Reminds me of my UPS... I bought it to protect my very expensive PSU from power surges/failures, but guess what? The PSU is still running and the UPS has been dead for 4 years. :-P
 Regarding HDDs, I've sworn I will *never* run a main system again
 without a GOOD always-on SMART monitor like Hard Disk Sentinel
 <http://www.hdsentinel.com/>. In fact, that's one of the main reasons
 I haven't switched my primary OS from Win to Linux yet, because I
 can't find a good Linux SMART monitor. (Manually running a CLI program
 - or writing a script to do it - doesn't even remotely count.) Oooh!
 Actually, now that I've looked up that link, it looks like they do
 have an early Linux version now. Awesome, I'm gonna have to try that
 out.

Sounds like something I *should* be running. I'll have to look into that. T -- You only live once.
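Not speaking for the tool Nick linked, but the bare-bones CLI route being dismissed here would be something like the following sketch, built on smartmontools' smartctl (the device name, poll interval, and alert command are hypothetical placeholders):

```shell
# Pull the overall verdict (PASSED/FAILED) out of `smartctl -H` output.
parse_health() {
    awk -F': *' '/overall-health self-assessment/ { print $2 }'
}

# Poll-based monitor (defined, not started here): check hourly, alert
# on anything other than PASSED. Device and alerting are made up.
smart_watch() {
    while :; do
        verdict=$(smartctl -H /dev/sda | parse_health)
        [ "$verdict" = "PASSED" ] || echo "SMART alert: $verdict" | wall
        sleep 3600
    done
}

# parse_health against a canned line of smartctl output:
echo 'SMART overall-health self-assessment test result: PASSED' | parse_health
```

A dedicated daemon like smartd (also part of smartmontools) does this properly, with per-attribute thresholds and scheduled self-tests, which is presumably closer to what an "always-on" monitor means.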
Sep 18 2012
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Sep 18, 2012 at 08:36:26AM -0700, Sean Kelly wrote:
 On Sep 18, 2012, at 12:48 AM, Walter Bright <newshound2 digitalmars.com> wrote:
 The most common failure I've had are the power supplies, they're
 still as bad today as in the 80's.

There are good power supplies, they just don't come in pre-built computers because they're expensive. I think the same could be said of products from any era.

Yeah, I've learned the hard way not to trust pre-assembled PCs. They may have one or two good components listed in the ad just to hook you, but usually many other parts (that people don't usually pay attention to) are crap. PSUs are one of them. Nowadays I only ever buy parts, and assemble my own PCs. Things tend to last much longer this way. (Same thing goes for software... one thing I really like about Linux is that you can replace parts freely without voiding warranties or violating EULAs or wrestling with straitjacketed software licenses or fighting with gratuitous incompatibilities between software not written by the same people, that sorta thing. And usually OSS software comes with alternatives for everything, should the default one turn out to be crap. (Well OK, sometimes all the alternatives are crap too, but that's another story.)) T -- Doubtless it is a good thing to have an open mind, but a truly open mind should be open at both ends, like the food-pipe, with the capacity for excretion as well as absorption. -- Northrop Frye
Sep 18 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 10:03:13 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:

 On Tue, Sep 18, 2012 at 08:36:26AM -0700, Sean Kelly wrote:
 On Sep 18, 2012, at 12:48 AM, Walter Bright
 <newshound2 digitalmars.com> wrote:
 The most common failure I've had are the power supplies, they're
 still as bad today as in the 80's.

There are good power supplies, they just don't come in pre-built computers because they're expensive. I think the same could be said of products from any era.

Yeah, I've learned the hard way not to trust pre-assembled PCs. They may have one or two good components listed in the ad just to hook you, but usually many other parts (that people don't usually pay attention to) are crap. PSUs are one of them. Nowadays I only ever buy parts, and assemble my own PCs. Things tend to last much longer this way.

I think the last time I bought a fully pre-assembled desktop, it was a 486. I got into the habit of building from parts just because that was the easiest way to get *exactly* what I wanted (Yea, I'm a control freak). And it's not difficult to do either, it's not like building a car from parts (Although my large hands/fingers are admittedly a liability when digging around a PC's internals). I wish it was reasonable to do the same with laptops. Unfortunately the necessary compactness tends to work against that, so you can only go with pre-built, and therefore there's *always* compromises you have to make. I mean, I like my laptop overall, but I could give you a whole laundry list of my annoyances with it. But it was the best I could find (in my price range anyway).
 (Same thing goes for software... one thing I really like about Linux
 is that you can replace parts freely without voiding warranties or
 violating EULAs or wrestling with straitjacketed software licenses or
 fighting with gratuitous incompatibilities between software not
 written by the same people, that sorta thing. And usually OSS
 software comes with alternatives for everything, should the default
 one turn out to be crap. (Well OK, sometimes all the alternatives are
 crap too, but that's another story.))
 

Yup, same here. Like the "Play/Pause" keyboard button on a Win7 machine: Windows insists on taking it over - completely. Not much you can really do about it. And MS doesn't care, so you're SOL. They *could* have offered a simple "Do what when that button is pressed?" setting, but they didn't. But OTOH, sometimes the lack of standardization on Linux can be a pain, and sometimes you can't find a nice alternative (for example, I have yet to find a linux file manager I like, and I've tried LOTS of them).
Sep 18 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 08:25:00 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 
 Heh. Reminds me of my UPS... I bought it to protect my very expensive
 PSU from power surges/failures, but guess what? The PSU is still
 running and the UPS has been dead for 4 years. :-P
 

One thing I learned, the batteries in those things (just like any battery) won't last forever. You're supposed to replace the battery in it roughly every 2 years (IIRC). Personally I find it worth it. I'd never run my desktop without a UPS again - just one random power flicker and the whole thing reboots no matter what I'm in the middle of? I can't be having that. Every time our power fluctuates I'm thinking "Phew, I'm *so* glad I have that UPS, otherwise this thing would be rebooting right now."
Sep 18 2012
prev sibling parent Sean Kelly <sean invisibleduck.org> writes:
On Sep 18, 2012, at 1:33 PM, Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> wrote:

 On Tue, 18 Sep 2012 10:03:13 -0700
 "H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 On Tue, Sep 18, 2012 at 08:36:26AM -0700, Sean Kelly wrote:
 On Sep 18, 2012, at 12:48 AM, Walter Bright
 <newshound2 digitalmars.com> wrote:
 The most common failure I've had are the power supplies, they're
 still as bad today as in the 80's.

There are good power supplies, they just don't come in pre-built computers because they're expensive. I think the same could be said of products from any era.

Yeah, I've learned the hard way not to trust pre-assembled PCs. They may have one or two good components listed in the ad just to hook you, but usually many other parts (that people don't usually pay attention to) are crap. PSUs are one of them. Nowadays I only ever buy parts, and assemble my own PCs. Things tend to last much longer this way.

I think the last time I bought a fully pre-assembled desktop, it was a a 486. I got into the habit of building from parts just because that was the easiest way to get *exactly* what I wanted (Yea, I'm a control freak). And it's not difficult to do either, it's not like building a car from parts (Although my large hands/fingers are admittedly a liability when digging around a PC's internals).

I've never owned a pre-assembled PC. Back when I built my first in 1989 it was because I couldn't afford to buy from Compaq or whoever was around at the time. After that, it was more because I'd upgrade a component at a time. I've considered going to a custom builder recently, but there's still a decent premium on top of the system price. Back in the day, the difficulty was in knowing how to plug everything together, configure IRQs, etc, because nothing was polarized or color-coded, and at best you'd get a page or two of reference material regarding jumpers in Korean. These days it's more in selecting components that are verified to be compatible. Actually putting the machine together is fairly trivial.
Sep 18 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Mon, 17 Sep 2012 13:18:51 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:

 On Mon, Sep 17, 2012 at 03:57:41PM -0400, Nick Sabalausky wrote:
 On Mon, 17 Sep 2012 00:16:26 -0700
 Jonathan M Davis <jmdavisProg gmx.com> wrote:

 The search results seem to be identical whether you search for D
 or :D, so the colon seems to be ignored.
 

Yea, google pathologically ignores anything that isn't strictly alphanumeric, even when you enclose in quotes. Fucking annoying as hell. Especially when you're trying to find something about C++ and the damn thing comes back with a bunch of C# results. That's a real obnoxious trend in computing: Software doing whatever the hell it feels like (usually under the guise of "being helpful") instead of doing what it's fucking told.

Any time you hear "smart" and "software" in the same sentence, be prepared for something dumb.

Heh, I actually say pretty much the same thing myself very often. Couldn't agree more. If you were around me in person, you'd frequently hear "I hate when (devices|programs) try to be smart." Smart(.*) is a red flag for "badly designed" or "unreliable". That's actually been an even bigger thing with me lately than ever before since, because of work, I have a cell phone for the first time now - two actually, an iPhone and an Android - and I absolutely *HATE* both the damn things (with the iPhone being slightly worse). *Everything* about them is just wrong, backwards, idiotic. They even managed to take something as trivial to get right as volume controls and *completely* fuck it up in every imaginable way. And of course, Android aped Apple's idiotic lead on that, as usual. Damn I miss pay phones: I spent less than $5/year on those. Try finding a cell plan that even remotely compares to that. Or one with buttons that are actually usable. Or any fucking buttons at all, for that matter. Meh, now I'm *really* rambling though... ;)
Sep 17 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Mon, 17 Sep 2012 13:10:59 -0700
David Gileadi <gileadis NSPMgmail.com> wrote:

 On 9/16/12 3:00 PM, "Jøn" wrote:
 The best idea I had today: rename D into :D

 You asked for it: a mockup.

You know, make one of them look like Phobos, and the other Deimos, and you may be onto something...
Sep 17 2012
prev sibling next sibling parent "Xinok" <xinok live.com> writes:
On Monday, 17 September 2012 at 07:16:15 UTC, Jonathan M Davis 
wrote:
 On Monday, September 17, 2012 09:05:48 David Nadlinger wrote:
 On Sunday, 16 September 2012 at 21:59:30 UTC, Jøn wrote:
 The best idea I had today: rename D into :D
 
 * Easier to google

You might be surprised to see that D is the number 1 result for ":D" even today.

The search results seem to be identical whether you search for D or :D, so the colon seems to be ignored. Of course, the fact that dlang.org comes up first could just be because google tailors its results to you, and we're both people who deal with D (and presumably search for it from time to time) already. - Jonathan M Davis

It's the second result on DuckDuckGo, which *doesn't* tailor its search results. https://duckduckgo.com/?q=d
Sep 17 2012
prev sibling next sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Mon, 17 Sep 2012 15:35:53 -0700
Ali Çehreli <acehreli yahoo.com> wrote:

 On 09/17/2012 03:08 PM, Nick Sabalausky wrote:
  > On Mon, 17 Sep 2012 13:18:51 -0700
  > "H. S. Teoh"<hsteoh quickfur.ath.cx>  wrote:
  >> Any time you hear "smart" and "software" in the same sentence, be
  >> prepared for something dumb.
  >>
  >
  > Heh, I actually say pretty much the same thing myself very often.
  > Couldn't agree more. If you were around me in person, you'd
  > frequently hear "I hate when (devices|programs) try to be smart."
  > Smart(.*) is a red flag for "badly designed" or "unreliable".
  >
  > That's actually been an even bigger thing with me lately than ever
  > before since, because of work, I have a call phone for the first
  > time now - two actually, an iPhone and an Android - and I
  > absolutely *HATE* both the damn things (with the iPhone being
  > slightly worse). *Everything* about them is just wrong, backwards,
  > idiotic. They even managed to take something as trivial to get
  > right as volume controls and *completely* fuck it up in every
  > imaginable way. And of course, Android aped Apple's idiotic lead
  > on that, as usual.
 I have to jump in on this discussion: Those have been exactly my
 feelings since I've gotten my "smart" phone about two years ago. I
 cannot believe the lack of usability! :) I have an Android but of
 course I have played with iPhones as well. Let me tell you: the
 emperor has no clothes! :)

Finally, someone who's with me on that! I thought I was the only one!
 They have imagined a "phone", where being able to answer the call is
 completely by luck if the phone has been in your pocket when the call
 arrived! Chances are, you will touch something on the "smart" screen
 and reject the call by some random reason like "I am in class." (No,
 I am not a student or a teacher at this time; but that exact scenario
 happened to me multiple times.)

Oh man, I could go on for pages listing the issues I have with them.
 Imagine a device where the *entire* screen is touchable with
 different areas meaning different things depending on context! The
 users can only cradle it gently but they can't hold it firmly! Wow! I
 can't believe how this whole idea took off. Later generations will
 have a good laugh at these devices.

And worse: When you *do* want to interact with it, you can't do so accurately, because it's *completely unresponsive* to anything even remotely accurate like a fingernail or stylus. Not that they even *have* any place to keep a stylus. And the idiotic claim rationalizing that is that capacitive touchscreens are supposedly "more accurate" than resistive. Which is bullshit because a finger can *never* be sanely considered even remotely as accurate as a fingernail or a non-capacitive stylus. Like you said: No clothes on this emperor. Speaking of resistive touchscreens and stylus, that reminds me: I miss the PalmOS devices. I loved my Visor and Zire71. If they hadn't killed them off with that WebOS junk (and if the assholes at Xerox hadn't helped by killing off the *good* version of Grafitti with their goddamn software patents), then I think a modern PalmOS incarnation would have been a fantastic alternative to iOS/Android. PalmOS 6 was looking great, but never materialized due to the one thing that made it so great: It wasn't trying to ape Apple's moronic ideas. Hell, that's why it's impossible to get a good portable music player: They all decided they *had* to ape Apple. Shit, if I wanted a portable music player with minimal storage, proprietary communications, and a non-tactile poorly-designed interface, I'd have actually *gotten* an iPod (either iTouch or pre-iTouch, they're both junk). I *don't* want that Apple-style junk, that's *why* I went looking for non-Apple devices! The best I could find was a Toshiba Gigabeat F hacked up with the Rockbox firmware, but even that could have been a lot better by toning down the Apple-envy (damn touch-sensitive "buttons"). (Incidentally, the Zune 1 would have been *perfect* if MS's insistence on aping Apple's "Don't let anyone access it like the USB HDD it literally is" hadn't single-handedly rendered it useless. Well, and if MS knew how to make non-trivial hardware that didn't break down at the drop of a hat. 
Zune 2 was junk, though.)
 Thanks for letting me vent. :)

Heh. One thing I've learned about myself: I love to complain :) I don't like having things *to* complain about, but when I do...
Sep 17 2012
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Sep 19, 2012 at 12:02:24PM +0200, Timon Gehr wrote:
 On 09/19/2012 11:54 AM, Mehrdad wrote:
...
At least when Windows has the occasional boot problem which I
stupidly caused, it's _fixable_ and doesn't lie to you about having
fixed it!!

The issue is that in one case you know how to fix it and in the other one you do not (and you care less about it because you prefer to think Windows is superior as it is what you use '99% of the time'), not that the problems are inherently (un)fixable.

Yeah, that's one of the things that irks me about Windows culture. It's touted as being "user-friendly" and "easy to use", etc., but actually it requires just as much effort as learning to use Linux. People complain about how Linux is hard to use or things break for no reason, but the same thing happens with Windows -- you either do things the Windows way (which requires that you learn what it is), or you quickly run into a whole bunch of gratuitous incompatibilities and bugs that nobody cares about because you aren't "supposed" to do things that way. (I tried switching the mouse to sloppy focus once... and never dared try it again.) As a programmer, though, I find Windows fundamentally annoying because the hood is welded shut. Sometimes you *know* what's wrong but it refuses to let you fix it, whereas on Linux you can look at the source and figure out how to fix it -- heck, you can modify and recompile the dang *kernel* to make it do what you want, should you be so inclined! You can't even get close to that in Windows. But then again, this is from the POV of a programmer. From the user's POV, none of this matters, it's all just a question of familiarity and preference. I personally find the bash shell far easier and more comfortable to use than any kind of klunky GUI, but most people won't because the prevalence of Windows has made GUIs more familiar to the average user. T -- Turning your clock 15 minutes ahead won't cure lateness---you're just making time go faster!
Sep 19 2012
prev sibling next sibling parent "Mehrdad" <wfunction hotmail.com> writes:
On Wednesday, 19 September 2012 at 17:29:17 UTC, H. S. Teoh wrote:
 On Wed, Sep 19, 2012 at 12:02:24PM +0200, Timon Gehr wrote:
 The issue is that in one case you know how to fix it and in 
 the other one you do not (and you care less about it because 
 you prefer to think Windows is superior as it is what you use 
 '99% of the time'),  not that the problems are inherently 
 (un)fixable.

Yeah, that's one of the things that irks me about Windows culture. It's touted as being "user-friendly" and "easy to use", etc., but actually it requires just as much effort as learning to use Linux. People complain about how Linux is hard to use or things break for no reason, but the same thing happens with Windows -- you either do things the Windows way (which requires that you learn what it is), or you quickly run into a whole bunch of gratuitous incompatibilities and bugs that nobody cares about because you aren't "supposed" to do things that way.

Yeah, they're "fixable" by your definition all right. It's just that when you ask people how, either no one you ask knows how, or they try to convince you that you're an idiot for even thinking about asking. Relevant examples: It's next-to-impossible to go on a forum and ask about fixing a boot-sector GRUB install without some fool coming along and diverting the entire thread into "Why the hell isn't GRUB installed on your MBR?" When you have a (God forbid!) space character in your directory/file names and some program chokes on it? "Stop putting spaces in your file names." When you ask how to make a passwordless account or how to obtain permanent root privileges? "Are you insane?!" When you ask if there is a defragmenter for Linux? Some fool comes along and says "Linux doesn't need defragmentation!!!!!!!!!" When you ask why the fonts are blurry? "It's just different, you're just picky. Get used to it." When you ask why the touchpad is so darn hypersensitive? "Modify the source code." Bottom line: Yeah, there's _always_ a way to fix your problems, if by "fixing the problem" you mean "rewriting the OS". It's pretty damn hard to convince Linux users that what you're trying to do is, in fact, not out of stupidity/ignorance.
 (I tried switching the mouse to sloppy focus once... and never 
 dared try it again.)

What's "sloppy focus"?
Sep 19 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Wed, 19 Sep 2012 20:49:58 +0200
"Mehrdad" <wfunction hotmail.com> wrote:

 On Wednesday, 19 September 2012 at 17:29:17 UTC, H. S. Teoh wrote:
 On Wed, Sep 19, 2012 at 12:02:24PM +0200, Timon Gehr wrote:
 The issue is that in one case you know how to fix it and in 
 the other one you do not (and you care less about it because 
 you prefer to think Windows is superior as it is what you use 
 '99% of the time'),  not that the problems are inherently 
 (un)fixable.

Yeah, that's one of the things that irks me about Windows culture. It's touted as being "user-friendly" and "easy to use", etc., but actually it requires just as much effort as learning to use Linux. People complain about how Linux is hard to use or things break for no reason, but the same thing happens with Windows -- you either do things the Windows way (which requires that you learn what it is), or you quickly run into a whole bunch of gratuitous incompatibilities and bugs that nobody cares about because you aren't "supposed" to do things that way.

Yeah, they're "fixable" by your definition all right. It's just that when you ask people how, either no one you ask knows why, or they try to convince you that you're an idiot for even thinking about asking." Relevant examples: It's next-to-impossible to go on a forum and ask about fixing a boot-sector GRUB install without some fool coming along and diverting the entire thread into "Why the hell isn't GRUB installed on your MBR?" When you have a (God forbid!) space character in your directory/file names and some program chokes on it? "Stop putting spaces in your file names." When you ask how to make a passwordless account or how to obtain permanent root privileges? "Are you insane?!" When you ask if there is a defragmenter for Linux? Some fool comes along and says "Linux doesn't need defragmentation!!!!!!!!!" When you ask why the fonts are blurry? "It's just different, you're just picky. Get used to it." When you ask why the touchpad is so darn hypersensitive? "Modify the source code."

Yea, as much as there is to like about Linux (and I intend to switch to it for my primary system), I've always considered the "culture" surrounding it to be one of Linux's biggest liabilities. You should have seen the shitstorm I had to put up with when inquiring about a text-mode editor (so I could use it through SSH) that worked more like Kate/Gedit and less like VI/Emacs/Nano. Of course, I did make the mistake of *mentioning* the forbidden word: Windows. But still, I mean, grow up people: it's a fucking OS, not a religion. (I even got responses that outright ignored the "text-mode" part and suggested various GUI editors.) There are certainly *good* helpful mature users too, though. It'd be unfair, and patently untrue, for me to say that *all* the Linux culture is screwy like that. But there's too much.
Sep 19 2012
prev sibling parent "Mehrdad" <wfunction hotmail.com> writes:
On Wednesday, 19 September 2012 at 21:30:58 UTC, Nick Sabalausky 
wrote:
 On Wed, 19 Sep 2012 20:49:58 +0200
 "Mehrdad" <wfunction hotmail.com> wrote:

 On Wednesday, 19 September 2012 at 17:29:17 UTC, H. S. Teoh 
 wrote:
 On Wed, Sep 19, 2012 at 12:02:24PM +0200, Timon Gehr wrote:
 The issue is that in one case you know how to fix it and in 
 the other one you do not (and you care less about it 
 because you prefer to think Windows is superior as it is 
 what you use '99% of the time'),  not that the problems are 
 inherently (un)fixable.

Yeah, that's one of the things that irks me about Windows culture. It's touted as being "user-friendly" and "easy to use", etc., but actually it requires just as much effort as learning to use Linux. People complain about how Linux is hard to use or things break for no reason, but the same thing happens with Windows -- you either do things the Windows way (which requires that you learn what it is), or you quickly run into a whole bunch of gratuitous incompatibilities and bugs that nobody cares about because you aren't "supposed" to do things that way.

Yeah, they're "fixable" by your definition all right. It's just that when you ask people how, either no one you ask knows why, or they try to convince you that you're an idiot for even thinking about asking.

Relevant examples:

It's next-to-impossible to go on a forum and ask about fixing a boot-sector GRUB install without some fool coming along and diverting the entire thread into "Why the hell isn't GRUB installed on your MBR?"

When you have a (God forbid!) space character in your directory/file names and some program chokes on it? "Stop putting spaces in your file names."

When you ask how to make a passwordless account or how to obtain permanent root privileges? "Are you insane?!"

When you ask if there is a defragmenter for Linux? Some fool comes along and says "Linux doesn't need defragmentation!!!!!!!!!"

When you ask why the fonts are blurry? "It's just different, you're just picky. Get used to it."

When you ask why the touchpad is so darn hypersensitive? "Modify the source code."

Yea, as much as there is to like about Linux (and I intend to switch to it for my primary system), I've always considered the "culture" surrounding it to be one of Linux's biggest liabilities. You should have seen the shitstorm I had to put up with when inquiring about a text-mode editor (so I could use it through SSH) that worked more like Kate/Gedit and less like VI/Emacs/Nano. Of course, I did make the mistake of *mentioning* the forbidden word: Windows. But still, I mean, grow up people: it's a fucking OS, not a religion.

+1
Sep 19 2012
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Sep 17, 2012 at 07:33:44PM -0400, Nick Sabalausky wrote:
 On Mon, 17 Sep 2012 15:35:53 -0700
 Ali ehreli <acehreli yahoo.com> wrote:

 Imagine a device where the *entire* screen is touchable with
 different areas meaning different things depending on context! The
 users can only cradle it gently but they can't hold it firmly! Wow!
 I can't believe how this whole idea took off. Later generations will
 have a good laugh at these devices.
 

And worse: When you *do* want to interact with it, you can't do so accurately, because it's *completely unresponsive* to anything even remotely accurate like a fingernail or stylus. Not that they even *have* any place to keep a stylus. And the idiotic claim rationalizing that is that capacitive touchscreens are supposedly "more accurate" than resistive. Which is bullshit because a finger can *never* be sanely considered even remotely as accurate as a fingernail or a non-capacitive stylus. Like you said: No clothes on this emperor.

Yeah, I spent the better part of a few *weeks* just to get my finger to land in the right spots for iSilo to open a link correctly. Even now, YEARS later, sometimes I still have to stand there like an idiot tapping the same spot 50 times before it will go, because the link is 2 characters wide, and the stupid software can't figure out that since the finger landed closest to a 5x7 pixel link, the user probably meant to hit that link instead of empty space. Like you said, fingers are totally inaccurate. And a fingernail or stylus doesn't work because of the capacitive surface. [...]
 (Incidentally, the Zune 1 would have been *perfect* if MS's insistence
 on aping Apple's "Don't let anyone access it like the USB HDD it
 literally is" hadn't single-handedly rendered it useless. [...])

Hear, hear! The one thing that irks me the most is this whole "you can't access your own files 'cos we decided that you just can't" nonsense. Like you said, it's essentially a USB HDD. Now I have a bunch of files on my iPod that I accidentally corrupted on my PC, and I can't copy them back because I can't access them from outside! Grrrrrr... And don't get me started on the straitjacketed app store that has the full freedom to kill off Apple competition at a whim. Oh yes, lest you have any illusion that the app store's policies are for "protecting the user", let's face the fact that there is a long history of USEFUL apps that got blocked because they competed with Apple's own inferior offerings. VLC player, for one. Opera Mobile. And countless others. The official reason? They competed with Apple's own offerings. Yes, that's the *official* reason. And if you were lucky enough to install them before they got taken off, you'd quickly realize that they are far superior to what Apple has to offer. In the meantime, totally worthless apps and $0.99 scams that do *nothing* except pocket your dollar are left free to roam. Does this remind you of "DOS ain't done until Lotus won't run"? I was looking forward to getting an Android when my current precambrian non-smart phone finally breathes its last... but it looks like it's just going to be Apple Hell, Version 2 "we just changed the props but the annoying misfeatures are exactly the same as you experienced before (tm)". :-( T -- What do you get if you drop a piano down a mineshaft? A flat minor.
Sep 17 2012
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 9/18/12, Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> wrote:
 Heh. One thing I've learned about myself: I love to complain :) I don't
 like having things *to* complain about, but when I do...

I love reading posts like these. Here's a recent one: http://www.hanselman.com/blog/EverythingsBrokenAndNobodysUpset.aspx
Sep 17 2012
prev sibling next sibling parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 9/18/12, Andrej Mitrovic <andrej.mitrovich gmail.com> wrote:
 On 9/18/12, Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> wrote:
 Heh. One thing I've learned about myself: I love to complain :) I don't
 like having things *to* complain about, but when I do...

I love reading posts like these. Here's a recent one: http://www.hanselman.com/blog/EverythingsBrokenAndNobodysUpset.aspx

Btw who on earth develops set top box software? Granted I've only used two so far (since I switched ISPs and my triple-play service recently), but the software on it is such incredible garbage. How do they manage to create software for a specific device, while knowing all of its characteristics, that lags like hell? I'd really like to see the source code for that. How many cycles could they possibly waste to blit a pre-designed bitmap on the screen (like the main menu)?
Sep 17 2012
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Sep 19, 2012 at 08:49:58PM +0200, Mehrdad wrote:
 On Wednesday, 19 September 2012 at 17:29:17 UTC, H. S. Teoh wrote:
On Wed, Sep 19, 2012 at 12:02:24PM +0200, Timon Gehr wrote:
The issue is that in one case you know how to fix it and in the
other one you do not (and you care less about it because you prefer
to think Windows is superior as it is what you use '99% of the
time'),  not that the problems are inherently (un)fixable.

Yeah, that's one of the things that irks me about Windows culture. It's touted as being "user-friendly" and "easy to use", etc., but actually it requires just as much effort as learning to use Linux. People complain about how Linux is hard to use or things break for no reason, but the same thing happens with Windows -- you either do things the Windows way (which requires that you learn what it is), or you quickly run into a whole bunch of gratuitous incompatibilities and bugs that nobody cares about because you aren't "supposed" to do things that way.

Yeah, they're "fixable" by your definition all right. It's just that when you ask people how, either no one you ask knows why, or they try to convince you that you're an idiot for even thinking about asking.

"How do I use Windows without a GUI?" "What are you, an idiot?!"
 Relevant examples:
 
 It's next-to-impossible to go on a forum and ask about fixing a
 boot-sector GRUB install without some fool coming along and
 diverting the entire thread into "Why the hell isn't GRUB installed
 on your MBR?"
 
 When you have a (God forbid!) space character in your directory/file
 names and some program chokes on it?
 "Stop putting spaces in your file names."
 
 When you ask how to make a passwordless account or how to obtain
 permanent root privileges?
 "Are you insane?!"
 
 When you ask if there is a defragmenter for Linux?
 Some fool comes along and says "Linux doesn't need
 defragmentation!!!!!!!!!"
 
 When you ask why the fonts are blurry?
 "It's just different, you're just picky. Get used to it."
 
 When you ask why the touchpad is so darn hypersensitive?
 "Modify the source code."

"Why can't I do things the Linux way on Windows?" "Because it's not Linux, you fool."
 Bottom line:
 
 Yeah, there's _always_ way to fix your problems, if by "fixing the
 problem" you mean "rewriting the OS".

We have the option of rewriting the OS, or any part thereof. Yes, you may have to (gosh!) spend time learning how the thing works and how to make it do what you want. But at least it's _possible_. You couldn't rewrite Windows even if you knew how. Besides, most of the problems you listed are a result of trying to do things the Windows way on a system that *isn't* Windows. I bet I'll get exactly the same responses if I started asking Windows forums how to make Windows behave like Linux. It all comes down to preference. I can't stand *any* kind of GUI, much less the straitjacketed non-configurable (not without massive breakage) kind of GUI that Windows offers. I do stuff on the shell that no GUI can ever hope to achieve, and I like it that way. I prefer to communicate in complete sentences rather than point-n-grunt. But I don't pretend that everybody else feels the same way. With Linux I can twist it and warp it until X11 behaves like a glorified console. Or like a 3D desktop, if I cared for that sorta thing. Heck, I've even contemplated writing a _4D_ window manager, for that matter. With Windows, I have no choice. I have to use a GUI, and a Windows-style GUI at that. Try to change the way it behaves, and everything breaks. The Windows way is shoved down my throat whether I like it or not. So guess which system I prefer to use?
 It's pretty damn hard to convince Linux users that what you're
 trying to do is, in fact, not out of stupidity/ignorance.

It's pretty damn hard to convince Windows zealots that anything but the Windows way is not out of stupidity/ignorance.
(I tried switching the mouse to sloppy focus once... and never dared
try it again.)

What's "sloppy focus"?

The window focus automatically changes to whatever window the mouse is currently hovering over. Preferably WITHOUT automatically bringing said window to the top. (Good luck making this work on Windows. And once you actually manage to coax Windows to do it, have fun seeing the train wreck that is your applications once you start using them this way.) T -- INTEL = Only half of "intelligence".
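For the record, on a GNOME-based desktop this behaviour can be flipped on from the shell. A minimal sketch, assuming GNOME with `gsettings` available (other window managers expose the same idea under their own config names):

```shell
# Switch the window manager to sloppy focus: focus follows the pointer
# into a window ('mouse' is the stricter variant that also drops focus
# when the pointer leaves all windows).
gsettings set org.gnome.desktop.wm.preferences focus-mode 'sloppy'

# Make sure the newly focused window is NOT raised to the top.
gsettings set org.gnome.desktop.wm.preferences auto-raise false
```

The second key is the important half: sloppy focus with auto-raise enabled gives you exactly the "everything keeps jumping to the front" train wreck described above.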
Sep 19 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Wed, 19 Sep 2012 13:38:32 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:

 On Wed, Sep 19, 2012 at 08:49:58PM +0200, Mehrdad wrote:
 It's pretty damn hard to convince Linux users that what you're
 trying to do is, in fact, not out of stupidity/ignorance.

It's pretty damn hard to convince Windows zealots that anything but the Windows way is not out of stupidity/ignorance.

Windows zealots are pretty rare though. Most Windows users accept that it's just an OS, and that it has its problems and downsides. (It'd be pretty hard to be a Windows user and *not* accept that Windows has its problems.)
 
(I tried switching the mouse to sloppy focus once... and never
dared try it again.)

What's "sloppy focus"?

The window focus automatically changes to whatever window the mouse is currently hovering over. Preferably WITHOUT automatically bringing said window to the top. (Good luck making this work on Windows. And once you actually manage to coax Windows to do it, have fun seeing the train wreck that is your applications once you start using them this way.)

Not exactly what you described, but similar: http://ehiti.de/katmouse/ When I point at something and scroll, I expect my *target* to scroll, not whatever the hell random thing I just happened to have clicked on last. I would *HATE* using Windows if I didn't have that. Unfortunately, it doesn't *always* work on Win7 (usually does, though). Works great on XP. But I agree, trying to do anything the non-Windows way on Windows involves stupid PITA hacking that doesn't always work right, *if* it's even possible at all. And it's not *just* doing something the non-Windows way, it's even specific *versions* of Windows: You can't even get things the WinXP way on Win7. Sure, *some* things you can, *sometimes*, with obscure hacks that don't even always work... Man, I'm really gonna have to get around to upgrading my laptop from Win7 back to XP sometime...Fuck this shit...
Sep 19 2012
prev sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Wed, 19 Sep 2012 13:38:32 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:

 I prefer to communicate in
 complete sentences rather than point-n-grunt.

That's great :) I like both CLI and GUI, depending on what I'm doing, but that's really good.
Sep 19 2012
prev sibling next sibling parent "Mehrdad" <wfunction hotmail.com> writes:
On Monday, 17 September 2012 at 22:14:51 UTC, Walter Bright wrote:
 The trouble with cute logos is like hearing the same joke over 
 and over.

s/cute logos/TCP jokes/
Sep 17 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 03:15:33 +0200
Andrej Mitrovic <andrej.mitrovich gmail.com> wrote:

 On 9/18/12, Andrej Mitrovic <andrej.mitrovich gmail.com> wrote:
 On 9/18/12, Nick Sabalausky <SeeWebsiteToContactMe semitwist.com>
 wrote:
 Heh. One thing I've learned about myself: I love to complain :) I
 don't like having things *to* complain about, but when I do...

I love reading posts like these. Here's a recent one: http://www.hanselman.com/blog/EverythingsBrokenAndNobodysUpset.aspx

Btw who on earth develops set top box software?

Former lab monkeys who survived the brain experiments. Funny, that *one* sentence alone and already I know exactly what you're talking about...
 Granted I've only used
 two so far (since I switched ISPs and my triple-play service
 recently), but the software on it is such incredible garbage. How do
 they manage to create software for a specific device, while knowing
 all of its characteristics, that lags like hell? I'd really like to
 see the source code for that. How many cycles could they possibly
 waste to blit a pre-designed bitmap on the screen (like the main
 menu)?

Yup. A few months ago, we ended up just ditching cable TV entirely:

- Set-top firmware completely fubared, just like you described, and the company and tech people just shrugged it off and gave excuses that didn't make any sense at all.
- Video feeds that I could almost swear must have been MPEG *1*. Constant compression artifacts.
- A/V frequently out-of-sync.
- A/V frequently cutting out entirely (note this was *cable*, not satellite).
- Multiple service visits, only ever fixing a small minority of the issues, and only ever temporarily.
- Roughly $100/mo for nothing but reality shows and dodgy camerawork, all with *overlayed* advertisements.
- The *only* thing I liked was that, as a promo, we were getting NHK for a couple months, which was actually pretty cool, even though I barely know any of the language.
- Oh, and even before any of that even started happening, there was this: http://semitwist.com/articles/article/view/time-warner-cable-cannot-find-my-account

We replaced them (Time Warner in our case) with a $50 converter box and $40 antenna (*one*-time fees), neither of which I ever actually use (I just get DVDs from the library), and I couldn't be happier. And if I ever want more, I can just get Netflix: $8/mo vs the cable company's $100/mo (although Netflix's seeking sucks, and they never offer subtitled non-dubbed alternatives for foreign stuff, which is really annoying when you come across something with bad dubbing, especially since it's internet so there's nothing actually preventing them from offering it). And it's not just cable-boxes, it's almost anything embedded. Like car stereos: I *never* used to have *any* complaint about any car stereo other than "The aftermarket ones are always ugly as hell and look like damn toys". But last time my mom got a new car, a Hyundai Elantra, it came with one of those combo satnav/stereo units. So pretty cool, right? And the satnav part seems to work fine (now that it's been replaced after dying...twice). 
But the stereo is a barely-usable piece of shit. Aside from over-reliance on touch-screen (a *really* dumb fucking idea *in a CAR*), these are *some* of its problems:

- Extremely laggy UI.
- *Every* time you start the car, NO MATTER WHAT, it turns on the radio (or a CD if it was playing one before) and sets it to a default volume. Not "if the radio/CD was already playing when you turned the car off", but "EVERY time" period. She was told this was the car "being helpful". Yea, way to spin an obvious fuckup, people. At least I hope to hell it's a bug and not...<shudder>...deliberate.
- Then you turn it off, but five seconds later, it turns back on and starts playing again. Turn it off that *second* time, *then* it stays off.
- It never remembers your volume level after turning the car off and back on.
- There's no way to see what the volume is set at without changing it.
- You can't change the volume level without turning the car on. (Seriously, what the hell is up with this "war against POTs"? Variable resistors are the *perfect* volume control, and yet now they've become taboo and replaced by shit that doesn't even always work.)
- Satnav/stereo unit CANNOT be turned off without turning the *whole car* off.
- In order to use the satnav/stereo unit, you have to *read and respond to* a prompt that tells you (ready for this?) **NOT** to read and interact with it while driving! Uhh...WTF? I'm betting this one was due to some dumbshit lawyer or politician.

And there's even more. Honestly, if I were looking into getting a new car, I would consider that stereo *alone* to be a deal-breaker. It's that bad. I miss the 80's: Devices worked and idiots didn't use computers.
Sep 17 2012
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Sep 18, 2012 at 03:15:33AM +0200, Andrej Mitrovic wrote:
 On 9/18/12, Andrej Mitrovic <andrej.mitrovich gmail.com> wrote:
 On 9/18/12, Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> wrote:
 Heh. One thing I've learned about myself: I love to complain :) I
 don't like having things *to* complain about, but when I do...

I love reading posts like these. Here's a recent one: http://www.hanselman.com/blog/EverythingsBrokenAndNobodysUpset.aspx


+1. After having worked in the industry for over a decade, I'm becoming increasingly cynical about the state of software today. And seeing it "from the inside" as it were, I realize that it *can* be done better. We have all the tools to make it better. A lot better. But it isn't. For example, I've seen enterprise code that looks like it was written by highschool dropouts. I've seen how said code survives for YEARS in spite of the presence of a code review system, simply because nobody has the time to devote to cleaning things up, or nobody cares to because it is not rewarded. Employers want "positive" contributions -- new features, glitzy GUIs, unreasonable customer feature requests, bloat deemed necessary because the CTO coughed it up one morning after a sleepless night, etc.. Nobody cares about cleaning up what's currently there 'cos it doesn't give anything to the marketing types to sell, and it doesn't have any immediate apparent benefits. The code review process is more concerned about hot-ticket items like security fixes, blatant crashes, or other such important issues like Yahoo messenger not working on the corporate network. Nobody cares about the thousands of little bits of horribly, horribly wrong code, the effect of which isn't obvious because it's been covered over with layer after layer of festering bandages. And even if you *do* make the extra effort to clean things up, the next person comes along and doesn't understand what was done before, and just slobbers all over it (figuratively speaking), turning it into yet another mess. And the result? You get stupidities like strange inconsistencies in software behaviour, bugs that can no longer be fixed 'cos things have started depending on the buggy behaviour, etc.. 
An embedded system that has THREE database engines 'cos the teams in charge of various parts of the system don't talk to each other and/or refuse to consolidate on a single DBMS, resulting in completely needless bloat (I mean, *three* SQL engines on a single embedded system?! Really?!). Or an executable that takes 50GB of memory to link... I made the mistake of attempting to run two builds at the same time, both of which hit this executable around the same time, which caused my PC to lock up for over an hour (locked up for all practical purposes, that is; it was taking 5 *minutes* to respond to a single keystroke as the disk thrashed itself to death. Just don't ask why responding to keystrokes depends on disk I/O). A large part of that laughably huge executable consists of largely copy-n-pasted-n-modified cout<< statements that output HTML and Javascript, and other such boilerplate code. It boggles the mind that a saner, lighter-weight system had not been designed for the task. It has been like this for YEARS, and will likely remain so for the foreseeable future. Is it any surprise that most software today is crap? Sometimes I fear that if I introduce D to certain people, they will just proceed to rewrite the same train wreck that is their current C++ code in D, except now they have so many more ways to shoot themselves (and all of the miserable people who will come after them) in the foot, several times over.
 Btw who on earth develops set top box software? Granted I've only used
 two so far (since I switched ISPs and my triple-play service
 recently), but the software on it is such incredible garbage. How do
 they manage to create software for a specific device, while knowing
 all of its characteristics, that lags like hell? I'd really like to
 see the source code for that. How many cycles could they possibly
 waste to blit a pre-designed bitmap on the screen (like the main
 menu)?

They must have written the software in ActionScript or something. >:-) Either that, or they have 3 SQL engines running on the set top box with 50GB of copy-n-pasted Javascript-outputting code (that gets piped to a VM running a VB version of IE5's rendering engine). :-P T -- LINUX = Lousy Interface for Nefarious Unix Xenophobes.
Sep 17 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Mon, 17 Sep 2012 22:06:56 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 
 +1. After having worked in the industry for over a decade, I'm
 becoming increasingly cynical about the state of software today. And
 seeing it "from the inside" as it were, I realize that it *can* be
 done better. We have all the tools to make it better. A lot better.
 But it isn't.
 

You know, you have a habit of saying things that make me wonder if you're stealing my brainwaves. What you've said up there is *exactly* how I feel about it too, and I've even worded it that way in conversations. Non-techies find out I'm a programmer and assume I like computers and software (which used to be true). But really I'm critical about them *because* I know them so well, so I notice all the screwups and idiocy that most people don't. I like computing's *potential*, though.
 For example, I've seen enterprise code that looks like it was written
 by highschool dropouts.

Actually, one of the best coders I know was a high school dropout. I find that the really bad code I come across is almost always from people whose primary coding experience is college courses.
 I've seen how said code survives for YEARS in
 spite of the presence of a code review system, simply because nobody
 has the time to devote to cleaning things up, or nobody cares to
 because it is not rewarded. Employers want "positive" contributions
 -- new features, glitzy GUIs, unreasonable customer feature requests,
 bloat deemed necessary because the CTO coughed it up one morning
 after a sleepless night, etc.. Nobody cares about cleaning up what's
 currently there 'cos it doesn't give anything to the marketing types
 to sell, and it doesn't have any immediate apparent benefits. The
 code review process is more concerned about hot-ticket items like
 security fixes, blatant crashes, or other such important issues like
 Yahoo messenger not working on the corporate network.  Nobody cares
 about the thousands of little bits of horribly, horribly wrong code,
 the effect of which isn't obvious because it's been covered over with
 layer after layer of festering bandages. And even if you *do* make
 the extra effort to clean things up, the next person comes along and
 doesn't understand what was done before, and just slobbers all over
 it (figuratively speaking), turning it into yet another mess.
 

Exactly. I genuinely believe that understanding that really should be one of the top qualifications of being a software manager. Because if you don't get that, then you really are doomed to being a pointy-hair who causes more harm than good. It's just not *possible* to do better than that without understanding such realities of software dev.
 And the result? You get stupidities like strange inconsistencies in
 software behaviour, bugs that can no longer be fixed 'cos things have
 started depending on the buggy behaviour, etc..

Yea, and then guess what (or rather, who) gets blamed for those problems? That is, instead of the managers who allowed, nay, *expected* the code to remain is such a poor state. And I've seen all that happen.
 Is it any surprise that most software today is crap?

Is it any surprise the vast majority of *good* software is either open-source or otherwise non-commercial? Managed by *programmers*, with no suits and MBAs and hired-liars (ie, salesmen) to muck everything up, and sell features that don't exist without bothering to check with or even tell the dev team, at least not until it's "Hey, the sales team just sold XXXX feature and promised it by YYYY date, so I need you to do that". "What?!?! FUCK YOU!!" (Yes, I've worked at a company where that was standard operating practice. *cough* Main Sequence Technologies *cough*)
 Sometimes I fear
 that if I introduce D to certain people, they will just proceed to
 rewrite the same train wreck that is their current C++ code in D,
 except now they have so many more ways to shoot themselves (and all
 of the miserable people who will come after them) in the foot,
 several times over.
 

At least it's not PHP...which makes it basically impossible NOT to aim directly at your own foot.
 
 Btw who on earth develops set top box software? Granted I've only
 used two so far (since I switched ISPs and my triple-play service
 recently), but the software on it is such incredible garbage. How do
 they manage to create software for a specific device, while knowing
 all of its characteristics, that lags like hell? I'd really like to
 see the source code for that. How many cycles could they possibly
 waste to blit a pre-designed bitmap on the screen (like the main
 menu)?

They must have written the software in ActionScript or something. >:-) Either that, or they have 3 SQL engines running on the set top box with 50GB of copy-n-pasted Javascript-outputting code (that gets piped to a VM running a VB version of IE5's rendering engine). :-P

I would honestly be very, very surprised if there *isn't* something screwy like that going on. You use the thing, and you just *know* there's some "thedailywtf.com"-worthy stuff going on in there. (Speaking of which, I actually had to stop reading that site just because it got to be so damn depressing.)
Sep 18 2012
prev sibling next sibling parent "Mehrdad" <wfunction hotmail.com> writes:
On Tuesday, 18 September 2012 at 08:09:41 UTC, Nick Sabalausky 
wrote:
 Is it any surprise the vast majority of *good* software is 
 either open-source or otherwise non-commercial?

It is? Every time I try to switch from Microsoft Office to Open/LibreOffice, I find them unusable. And those are probably the best alternatives. Every time I try to switch from Windows to Ubuntu, GRUB belches at me, saying it thinks it's THE boot loader and it just cries like a baby about how it wants to install itself on the MBR. And it stops working randomly every once in a while when I put it on the partition boot sector. Funny, the only times the Windows boot loader ever gets messed up is when I try to install Linux. Not when I happen to resize a random partition. And if you tell me GIMP or Inkscape or whatever take the place of Adobe suites I'm just going to laugh. Are they good? Sure. Are they comparable to the commercial versions? Hell no. Google Chrome? It's open-source, but it's driven by commercial interests -- it's driven by the advantages it gives Google in the market, even though it's "free" by itself. Oh, and there's a reason people still use WinRAR instead of 7z, as great as 7-Zip is. (Yes, the icons and toolbars DO make a difference, even if you think that's stupid.) In the programming world -- just look at how popular C# is. It's not popular because it was open-source (although people tried to make Mono) -- it's popular because it's got damn good balance in terms of usability and IDE support. And VS is a lot of $$$ to buy. Nothing open-source/non-commercial about it. Of course, there's good open-source software. No doubt about that. But at the moment I can't think of one that took the place of commercial software because people find it "good" and they find the commercial version "not good". And let's not go into computer games and such...
 Managed by *programmers*

LOL, that's precisely why open-source software has a "steep learning curve", as the creators like to put it. It's a result of programmers not knowing (or caring) about making good UIs, so they just think the users are noobs when they can't use the software.
Sep 18 2012
prev sibling next sibling parent "renoX" <renozyx gmail.com> writes:
Not very good rant,
you write:
 They have imagined a "phone", where being able to answer the 
 call is completely by luck if the phone has been in your pocket 
 when the call arrived! Chances are, you will touch something on 
 the "smart" screen and reject the call by some random reason

I have the *same issue* with a non-tactile phone: when the phone is activated, quite often the keys will be pressed randomly and create something unwanted.

Worse, sometimes the phone will unlock itself while in my pocket, something that I think is more rare for touch phones.

How do you suggest fixing this issue?

RenoX
Sep 18 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 00:41:58 -0700
Walter Bright <newshound2 digitalmars.com> wrote:

 On 9/17/2012 9:35 PM, Nick Sabalausky wrote:
 And there's even more. Honestly, if I were looking into getting a
 new car, I would consider that stereo *alone* to be a deal-breaker.
 It's that bad.

Install headers and a cherry-bomb exhaust,

Heh, I don't even know what those are :P
 and you won't need no
 steekin' car stereo no more.
 
 I miss the 80's: Devices worked and idiots didn't use computers.

You've got a selective memory!! A car stereo in the 80's used cassettes. With a cassette, you've got flutter, rewinding, and a player that randomly ate your tapes. You also had tapes scattered about your car, usually encrusted with some substance that may or may not have come from McDonald's or the dog. I was happy a few years back to throw my cassette collection into the garbage.

Yea, so was I, but then I discovered that we're basically trading one set of problems for another, especially with video. Cassettes suck, and I'm glad to be done with them, but with discs:

- They're less durable. Scratch a cassette? DEEPLY? Like, with a knife? So what? The vulnerable tape is actually *protected*. But ordinary use of a disc, even *without* those jackasses who set discs *on top* of the case instead of *in* it (*cough* both my parents), and it still gets scuffed and will start skipping.

- PUO's. 'Nuff said.

- Inevitable laser burnout.

- Cassette-eating decks? The ultra-popular XBox 360 eats discs. You know those rental discs you get with the BIG circular grooves dug into them guaranteeing it won't play through? It was the laser of someone's 360 that did that.

Yes, cassettes sucked, but discs suck, too.
 Oh, and TV sets and VCRs stunk compared to today. The TV shows stunk,
 too. With netflix, I rewatched some of those older shows, and was
 appalled at how bad they were. Try watching an 80's miniseries -
 gawd, what stinkers.
 

Yea, there was a lot of junk (there's a lot of junk in every decade), but I'd rather watch a bad 80's show than a modern reality show any day. And reality shows are about all there are anymore. Hell, even documentaries are starting to do shitty JJ Abrams style directing.

Some fantastic 80's shows off the top of my head:

- Soap
- Hunter
- Magnum PI
- Remington Steele
- Miami Vice
- MacGyver
- Cheers
- Golden Girls (ok, minus the occasional "After School Special" scenes)
- Married With Children (the first two or three seasons were in the 80's)
 But I did like 80's fashions much better than today's. The 70's were
 the worst, and the 80's the best.
 

I once heard someone say the 70's were the hangover from the 60's. That's how I feel about the 80's and 90's:

- Torn jeans? Awesome. Sagging? GTFO.
- Spandex/leather? Sweet. Flannel? Blech.
- Flock of Seagulls? Radical. Combover? What is this, "Leave it to Beaver"?
Sep 18 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 11:03:03 +0200
"renoX" <renozyx gmail.com> wrote:

 Not very good rant,
 you write:
 They have imagined a "phone", where being able to answer the 
 call is completely by luck if the phone has been in your pocket 
 when the call arrived! Chances are, you will touch something on 
 the "smart" screen and reject the call by some random reason

I have the *same issue* with a non-tactile phone:

I assume you mean non-touchscreen? The iPhone and Androids are as non-tactile as it gets, which is one of the biggest things I hate about them.
 when the phone 
 is activated, quite often the keys will be pressed randomly and 
 create something unwanted.
 
 Worse sometimes the phone will unlock itself while in my pocket, 
 something that I think is more rare for touch phones.
 
 How do you suggest to fix this issue?
 

- "Clamshell" flip-phone.
- A proper physical switch for lock/unlock. The one on the Toshiba Gigabeat F (admittedly a music player, not a phone) works flawlessly.
Sep 18 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 10:39:58 +0200
"Mehrdad" <wfunction hotmail.com> wrote:

 On Tuesday, 18 September 2012 at 08:09:41 UTC, Nick Sabalausky 
 wrote:
 Is it any surprise the vast majority of *good* software is 
 either open-source or otherwise non-commercial?

It is?

Naturally there are exceptions, but that's been my experience more often than not. Of course, I'm *certainly* not going to say that "OSS is *usually* good". And I'm not even saying "OSS is more frequently good than commercial". I'm just saying, when I find a program that I actually like and that doesn't irritate me, it's usually either OSS or freeware.
 Every time I try to switch from Microsoft Office to 
 Open/LibreOffice, I find them unusable. And those are probably 
 the best alternatives.
 

It works well enough for me, but then I don't really do much with them.
 Every time I try to switch from Windows to Ubuntu, GRUB belches 
 at me, saying it thinks it's THE boot loader and it just cries 
 like a baby about how it wants to install itself on the MBR.
 And it stops working randomly every once in a while when I put it 
 on the partition boot sector.
 
 Funny, the only times the Windows boot loader ever gets messed up 
 is when I try to install Linux. Not when I happen to resize a 
 random partition.
 

Oh, god, I learned a long time ago to NEVER mess with dual-booting. It's just never worth it no matter what the OS. Use a VM, or a LiveDisc distro (with USB persistence), but forget dual-boot bootloaders.
 And if you tell me GIMP or Inkscape or whatever take the place of 
 Adobe suites I'm just going to laugh.
 Are they good? Sure.
 Are the comparable with the commercial versions? Hell no.
 

GIMP sucks and Inkscape has its problems, but I've never used an Adobe program that I didn't hate just as much. So it's either be annoyed by GIMP/Inkscape for free, or shell out hundreds if not thousands (PLUS hardware upgrades) for the privilege of being annoyed by Adobe's equally obnoxious bloatware.
 Google Chrome? It's open-source, but it's driven by commercial 
 interests -- it's driven by the advantages it gives Google in the 
 market, even though it's "free" by itself.
 

Ok, this is the one I *really* disagree with: You'll *never* convince me that Chrome is anything but the absolute WORST browser in existence. Wretched, horrid, terrible, awful piece of shit (and yes, it does crash, too), *and* it's to blame for kick-starting the endless trend of absolutely god-awful browser UIs. There is no such thing as a browser with a sane UI anymore, and it's all thanks to Chrome.

I'm not exaggerating when I say I'd sooner go back to *Netscape* than even *allow* Chrome on my computer at all. (And when I need to test on Chrome, I use SRWare Iron instead -- it's the same engine and the same wretched god-awful UI, but without all the "raping my computer".)
 Oh, and there's a reason people still use WinRAR instead of 7z, 
 as great as 7-Zip is. (Yes, the icons and toolbars DO make a 
 difference, even if you think that's stupid.)
 

Actually, I never noticed any difference. I only ever use the shell integration anyway.
 In the programming world -- just look at how popular C# is.
 It's not popular because it was open-source (although people 
 tried to make Mono) -- it's popular because it's got damn good 
 balance in terms of usability and IDE support.
 
 And VS is a lot of $$$ to buy. Nothing open-source/non-commercial 
 about it.
 

I find VS bloated. I like Programmer's Notepad 2. Nothing commercial about it.
 
 Of course, there's good open-source software. No doubt about that.
 
 But at the moment I can't think of one that took the place of 
 commercial software because people find it "good" and they find 
 the commercial version "not good".
 

Disc burning is a good example. These are *great* programs:

- InfraRecorder
- ImgBurn/DVD Decryptor
- DVD Shrink

None of those are commercial. I have yet to find *one* commercial disc burning program that isn't a steaming pile of shit. Nero's been shit for over a decade. Roxio is shit. DVD Fab is shit. It's all shit.

Also, note in my earlier post I didn't say "popular software", I said "good software".
 
 Managed by *programmers*

LOL, that's precisely why open-source software has a "steep learning curve", as the creators like to put it. It's a result of programmers not knowing (or caring) about making good UIs, so they just think the users are noobs when they can't use the software.

There is too much of that, unfortunately. But it's definitely not true of all OSS. And at the same time, most commercial developers have been doing nothing but making their UIs worse and worse and worse. So basically most UIs these days suck, period, commercial or not. When I do find one I like, more often than not it's non-commercial.
Sep 18 2012
prev sibling next sibling parent "renoX" <renozyx gmail.com> writes:
On Tuesday, 18 September 2012 at 09:14:30 UTC, Nick Sabalausky 
wrote:
 On Tue, 18 Sep 2012 11:03:03 +0200
 "renoX" <renozyx gmail.com> wrote:

 Not very good rant,
 you write:
 They have imagined a "phone", where being able to answer the 
 call is completely by luck if the phone has been in your 
 pocket when the call arrived! Chances are, you will touch 
 something on the "smart" screen and reject the call by some 
 random reason

I have the *same issue* with a non-tactile phone:

I assume you mean non-touchscreen?

Yes. [cut]
 How do you suggest to fix this issue?
 

- "Clamshell" flip-phone.

Well, you can buy a case with a hard cover; the result is the same. At least for a phone with a touchscreen the result *should* be the same -- my Nokia (which has no touchscreen) still activates from time to time, even with a hard-cover case, so my next phone will have a touchscreen.

Even better: any mobile part is fragile, but replacing a case is much cheaper than replacing a phone.
 - A proper physical switch for lock/unlock. The one on the 
 Toshiba Gigabeat F (admittedly a music player, not a phone) 
 works flawlessly.

Maybe, but a lot of physical switches are not proper, though.

RenoX
Sep 18 2012
prev sibling next sibling parent "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Tue, 18 Sep 2012 05:15:08 -0400, Nick Sabalausky  
<SeeWebsiteToContactMe semitwist.com> wrote:

 when the phone
 is activated, quite often the keys will be pressed randomly and
 create something unwanted.

 Worse sometimes the phone will unlock itself while in my pocket,
 something that I think is more rare for touch phones.

 How do you suggest to fix this issue?

- "Clamshell" flip-phone.
- A proper physical switch for lock/unlock. The one on the Toshiba Gigabeat F (admittedly a music player, not a phone) works flawlessly.

iPhone has a lock switch. It's on the top. You push it, and the phone is locked. I'm pretty sure almost all Android phones have this feature as well.

I never ever ever accidentally call someone when the phone is in my pocket, because it gets locked when I'm done with it. In fact, I never accidentally do *anything* on my iPhone. Never happened with my flip-phone either, but certainly the capacitive touch screen has not reintroduced that problem for those who are willing to learn how to use them.

These rants are absolutely hilarious. It's like saying you hate calculators because you can't slide the buttons like on your abacus.

-Steve
Sep 18 2012
prev sibling next sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 09:21:38 -0400
Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> wrote:

 On 9/18/12 8:53 AM, Steven Schveighoffer wrote:
 I never ever ever accidentally call someone when the phone is in my
 pocket, because it gets locked when I'm done with it. In fact, I
 never accidentally do *anything* on my iPhone. Never happened with
 my flip-phone either, but certainly the capacitive touch screen has
 not reintroduced that problem for those who are willing to learn
 how to use them.

Yes!
 These rants are absolutely hilarious. It's like saying you hate
 calculators because you can't slide the buttons like on your abacus.

I thought I was alone in thinking so. To me these rants are eerie -- I can't recognize in them one single problem I've actually experienced.

But I'm sure you're aware that just because you haven't had any such problem doesn't mean others haven't.

Honestly, I've never had stray "in my pocket" behaviors on the iPhone or the Android, either. Their lock system *is* effective at that, at least. Actually, it's a little too effective: It's impossible to reach down into my pocket and adjust the volume because it plain refuses to *let* me adjust the volume without taking it out, pushing "Lock" or "Home", sliding the touch-slider, and *then* using the damn volume buttons -- which *still* don't even do what I want most of the time.

And there's a ton of other issues I've had with the devices, like poor accuracy (because my fingers aren't <=1mm in diameter and the damn thing won't even register touches from anything that's actually more accurate).
Sep 18 2012
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Sep 21, 2012 at 03:54:21PM +0200, Paulo Pinto wrote:
[...]
 In big corporations you spend more time taking care of existing
 projects in big teams, than developing stuff from scratch.
 
 In these type of environments you learn to appreciate the verbosity
 of certain programming languages, and keep away from cute hacks.

I have to say, this is very true. When I first got my current job, I was appalled at the verbosity of the C code that I had to work with. C code!! Not Java or any of that stuff. My manager told me to try to conform to the (very verbose) style of the code. So I thought, well, they're paying me to do this, so I'll shut up and cope.

After a few years, I started to like the verbosity (which is saying a lot from a person like me -- I used to code with 2-space indents), because it makes it so darned easy to read, to search, and to spot stupid bugs. Identifier names are predictable, so you could just guess the correct name and you'd be right most of the time. Makes it easy to search for identifier usage in the ~2 million line codebase, because the predictable pattern excludes (almost) all false positives.

However:
 Specially when you take into consideration the quality of work that
 many programming drones are capable of.

Yeah, even the verbosity / consistent style of the code didn't prevent people from doing stupid things with the code. Utterly stupid things. My favorite example is a particular case of checking for IPv6 subnets by converting the subnet and IP address to strings and then using string prefix comparison.

Another example is a bunch of static functions with identical names and identical contents, copy-n-pasted across like 30 modules (or worse, some copies are imperfect buggy versions). It makes you wonder if the guy who wrote it even understands what code factorization means.

Or "bug fixes" that consist of a whole bunch of useless redundant code to "fix" a problem, that adds all sorts of spurious buggy corner cases to the code and *doesn't actually address the cause of the bug at all*. It boggles the mind how something like that made it through code review.

The saddest thing is that people are paying big bucks for this kind of "enterprise" code. It's one of those things that make me never want to pay for *any* kind of software... why waste the money when you can download the OSS version for free? Yeah, a lot of OSS code is crap, but it's not like it's any worse than the crap you pay for.

Sigh.


T

-- 
Маленькие детки - маленькие бедки.
Sep 21 2012
prev sibling next sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Friday, 21 September 2012 at 19:09:48 UTC, H. S. Teoh wrote:
 On Fri, Sep 21, 2012 at 03:54:21PM +0200, Paulo Pinto wrote:
 [...]
 In big corporations you spend more time taking care of existing
 projects in big teams, than developing stuff from scratch.
 
 In these type of environments you learn to appreciate the 
 verbosity
 of certain programming languages, and keep away from cute 
 hacks.

I have to say, this is very true. When I first got my current job, I was appalled at the verbosity of the C code that I had to work with. C code!! Not Java or any of that stuff. My manager told me to try to conform to the (very verbose) style of the code. So I thought, well they're paying me to do this, so I'll shut up and cope. After a few years, I started to like the verbosity (which is saying a lot from a person like me -- I used to code with 2-space indents), because it makes it so darned easy to read, to search, and to spot stupid bugs. Identifier names are predictable, so you could just guess the correct name and you'd be right most of the time. Makes it easy to search for identifier usage in the ~2 million line codebase, because the predictable pattern excludes (almost) all false positives. However:
 Specially when you take into consideration the quality of work 
 that
 many programming drones are capable of.

Yeah, even the verbosity / consistent style of the code didn't prevent people from doing stupid things with the code. Utterly stupid things. My favorite example is a particular case of checking for IPv6 subnets by converting the subnet and IP address to strings and then using string prefix comparison. Another example is a bunch of static functions with identical names and identical contents, copy-n-pasted across like 30 modules (or worse, some copies are imperfect buggy versions). It makes you wonder if the guy who wrote it even understands what code factorization means. Or "bug fixes" that consists of a whole bunch of useless redundant code to "fix" a problem, that adds all sorts of spurious buggy corner cases to the code and *doesn't actually address the cause of the bug at all*. It boggles the mind how something like that made it through code review. The saddest thing is that people are paying big bucks for this kind of "enterprise" code. It's one of those things that make me never want to pay for *any* kind of software... why waste the money when you can download the OSS version for free? Yeah a lot of OSS code is crap, but it's not like it's any worse than the crap you pay for. Sigh. T

Welcome to my world.

As a Fortune 500 outsourcing consulting company employee, I see this type of code every day.

-- 
Paulo
Sep 21 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Fri, 21 Sep 2012 22:13:22 +0200
"Paulo Pinto" <pjmlp progtools.org> wrote:

 On Friday, 21 September 2012 at 19:09:48 UTC, H. S. Teoh wrote:
 The saddest thing is that people are paying big bucks for this 
 kind of
 "enterprise" code. It's one of those things that make me never 
 want to
 pay for *any* kind of software... why waste the money when you 
 can
 download the OSS version for free? Yeah a lot of OSS code is 
 crap, but
 it's not like it's any worse than the crap you pay for.

Welcome to my world. As a Fortune 500 outsourcing consulting company employee, I see this type of code everyday.

I find it depressing to see just how *easy* it is to have dailywtf-worthy material. They anonymized my name as Nate here: http://thedailywtf.com/Articles/We_Have_Met_the_Enemy.aspx

Note also that the "' ...code here" and "' ...more code here" sections were typically HUGE. And that was only scratching the surface of the lunacy that was going on there -- both in and out of the codebase.

I've been sticking to contract stuff now, largely because I really just can't take that sort of insanity anymore (not that I ever could). If I ever needed to go back to 9-5 code, or cubicles, or open-floorplan warrooms, I'd *really* be in trouble.
Sep 21 2012
prev sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Friday, 21 September 2012 at 21:37:23 UTC, Nick Sabalausky 
wrote:
 On Fri, 21 Sep 2012 22:13:22 +0200
 "Paulo Pinto" <pjmlp progtools.org> wrote:

 On Friday, 21 September 2012 at 19:09:48 UTC, H. S. Teoh wrote:
 The saddest thing is that people are paying big bucks for 
 this kind of
 "enterprise" code. It's one of those things that make me 
 never want to
 pay for *any* kind of software... why waste the money when 
 you can
 download the OSS version for free? Yeah a lot of OSS code is 
 crap, but
 it's not like it's any worse than the crap you pay for.

Welcome to my world. As a Fortune 500 outsourcing consulting company employee, I see this type of code everyday.

I find it depressing to see just how *easy* it is to have dailywtf-worthy material. They anonymized my name as Nate here: http://thedailywtf.com/Articles/We_Have_Met_the_Enemy.aspx Note also that the "' ...code here" and "' ...more code here" sections were typically HUGE. And that was only scratching the surface of the lunacy that was going on there - both in and out of the codebase. I've been sticking to contract stuff now, largely because I really just can't take that sort of insanity anymore (not that I ever could). If I ever needed to go back to 9-5 code, or cubicles, or open-floorplan warrooms, I'd *really* be in trouble.

One of the reasons I stay at the company is the job offer situation around my area. Many of the other companies I could work for are the same type, or I would be forced to switch regions for something better.

-- 
Paulo
Sep 21 2012
prev sibling next sibling parent Sean Kelly <sean invisibleduck.org> writes:
On Sep 18, 2012, at 1:09 PM, Walter Bright <newshound2 digitalmars.com> wrote:

 On 9/18/2012 2:08 AM, Nick Sabalausky wrote:

 My car stereo takes a USB stick. I specifically picked that model for that reason.

Mine does bluetooth, so I don't even have to take my phone out of my pocket to listen to music. CDs are terrible and DVDs are worse. Most of the kids' movies we have at home don't even play any more, even though the underside for most isn't terribly scratched.
Sep 18 2012
prev sibling next sibling parent "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Tuesday, September 18, 2012 16:50:18 Nick Sabalausky wrote:
 Actually, it's a little too effective: It's impossible to reach down
 into my pocket and adjust the volume because it plain refuses to *let*
 me adjust the volume without taking it out, pushing "Lock" or "Home",
 sliding the touch-slider, and *then* using the damn volume buttons

Actually, I wish that my Android worked that way. The perfect behavior IMHO would be for no buttons on the phone (including the power button) to do _anything_ other than turn on the screen if the phone's on, unless you unlock it first.

I _hate_ how the volume changes while my phone is in my pocket, or how my phone keeps rebooting just because there's enough pressure on the power button with how it shifts in my pocket.

- Jonathan M Davis
Sep 18 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 13:09:39 -0700
Walter Bright <newshound2 digitalmars.com> wrote:
 
 Yea, so was I, but then I discovered that that we're basically
 trading one set of problems for another, especially with video.
 Casettes suck, and I'm glad to be done with them, but with discs:

My car stereo takes a USB stick. I specifically picked that model for that reason.

I've already decided, next time I look, I'll be looking for a 1/8" Aux input jack (which does seem to be pretty common now).
 CDs in the car suck.
 

I'm not a fan of CDs anymore anyway. Too much disc swapping. Plus poor durability. I've got a 40GB HDD-based portable music player, and I'll *never* go back to anything less (I do want to upgrade the HDD though...and get a new battery, which is gonna be difficult...). Being able to have all (or in my case: "most") of your music in one place is just something you never want to give up once you get used to it.
 
 Some fantastic 80's shows off the top of my head:

 - Soap
 - Hunter
 - Magnum PI
 - Remington Steele
 - Miami Vice

I loved MV in the 80's. It was on netflix, so I started watching it. It was *horrible*! Awful. Cringeworthy.

I've heard that the third and fourth seasons went downhill. I saw the whole first season just about a year ago and loved it.
 Cheers - awful.
 

Really? That's just weird!
 There's nothing, nothing remotely as good as Breaking Bad.
 

Not familiar with it.
Sep 18 2012
prev sibling next sibling parent "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Tue, 18 Sep 2012 16:50:18 -0400, Nick Sabalausky  
<SeeWebsiteToContactMe semitwist.com> wrote:

 Actually, it's a little too effective: It's impossible to reach down
 into my pocket and adjust the volume because it plain refuses to *let*
 me adjust the volume without taking it out, pushing "Lock" or "Home",
 sliding the touch-slider, and *then* using the damn volume buttons -
 which *still* don't even do what I want most of the time.

If you want to adjust the ringer volume, yes. If you want to adjust the volume of something that is currently playing (like a song), it works without having to unlock. I find the silent switch more useful; I don't often change ringer volumes.
 And there's
 a ton of other issues I have had with the devices, like poor accuracy
 (because my fingers aren't <=1mm in diameter and the damn thing won't
 even register touches from anything that's actually more accurate).

There are styli for capacitive screens; they aren't that great, but better than a finger. But there's no place to store them on the phone. I think Samsung has a stylus-based capacitive screen phone called the Galaxy Note.

But I have not had much of a problem with accuracy. In certain cases when I'm browsing the web, I have to zoom in to accurately tap a link. However, the touch screens that I had with my Palm Treo and Windows Mobile 6 phones both sucked at accuracy. I spent so much time "calibrating" them, and even then, I couldn't click on anything near the edges.

On my Windows Mobile phone I completely gave up on using the touch screen at all; I got very good at using the keyboard shortcuts. The only thing I ever used the stylus for was playing solitaire, and even then, I had trained myself to offset my tap locations based on what part of the screen I was on. I literally knew exactly where to tap if I wanted to move whatever card to another pile -- and it wasn't uniform!

-Steve
Sep 18 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 17:42:59 -0400
"Steven Schveighoffer" <schveiguy yahoo.com> wrote:

 On Tue, 18 Sep 2012 16:50:18 -0400, Nick Sabalausky  
 <SeeWebsiteToContactMe semitwist.com> wrote:
 
 Actually, it's a little too effective: It's impossible to reach down
 into my pocket and adjust the volume because it plain refuses to
 *let* me adjust the volume without taking it out, pushing "Lock" or
 "Home", sliding the touch-slider, and *then* using the damn volume
 buttons - which *still* don't even do what I want most of the time.

If you want to adjust the ringer volume, yes. If you want to adjust the volume of something that is currently playing (like a song), it works without having to unlock. I find the silent switch more useful, I don't often change ringer volumes.

What I *really* want is a master volume control. But there is none. At all. And there is no "app for that". For example:

- When I go into a library, I *expect* to have *no sound*, period. And this is what Apple apparently expects you to do: Pull it out, press "Home" or "Lock", slide the slider, double-press "Home", swipe the bottom row to the right, adjust that volume with the touchscreen control, and switch the "ringer/vibrate-only" switch to "vibrate-only". And guess what? Even that *still* doesn't disable all sound. And that's even if you ignore the fact that vibrate isn't actually silent. I don't even take the fucking thing into libraries, I just leave the damn thing in the car. Fuck it. It's not worth it.

- I'm haplessly attempting to peck something out on the miniature non-tactile chicklet-keyboard (which only *sometimes* goes into landscape mode) and notice it's too loud. So I have to go find something that plays sound, ideally music, play it, *then* adjust the fucking volume (otherwise it adjusts the ringer volume instead), then stop the music or whatever it was, then go back to whatever I was doing and *hope* that I like the new volume setting, because if not, I have to do it all over again.

- Luckily, I don't use it to play music (I have a *real* portable music player for that, with a sensible amount of storage). Because if I did, then changing the ringer volume would work like this: Stop the music, change the ringer volume, resume the music. Seriously? Talk about pointless coupling. And then there's the fun times when the stupid thing *thinks* audio is playing, so it won't let you adjust the ringer volume even though no audio is playing. Of course, I constantly need to change the ringer volume because, being mobile, it's constantly either too quiet or too loud.

What a complete, moronic, absolute steaming turd of a device. I'd HAPPILY put up with accidental volume changes just to go back to a master volume pot (and even those can be made in a way that drastically minimizes accidental volume changes). And that's *just* the volume issues alone.

God, I *HATE* the fucking thing. Any time I use it, I just want to hurl the damn thing into the nearest concrete wall as hard as I can. But I can't, because it's not even mine, it's a loaner, and I unfortunately need it for development/testing (or at least *will* need it for such once we pay Apple their Developer Ransom).
 And there's
 a ton of other issues I have had with the devices, like poor
 accuracy (because my fingers aren't <=1mm in diameter and the damn
 thing won't even register touches from anything that's actually
 more accurate).

There are styli for capacitive screens, they aren't that great, but better than a finger. But no place to store them on the phone. I think Samsumg has a stylus-based capacitive screen phone called the Galaxy note.

Right. Basically, a capacitive stylus is a hack solution. And the thing is, too, I already *have* no less than *ten* styli built right into my fingers. But they're incompatible. And so is my knuckle (mostly), which is annoying when my fingers are messy.
 But I have not had much of a problem with accuracy.  In certain cases
 when I'm browsing the web, I have to zoom in to accurately tap a
 link. However, my touch screens that I had with my palm Treo, and
 Windows Mobile 6 phones both sucked at accuracy.  I spent so much
 time "calibrating" them, and even then, I couldn't click on anything
 near the edges.
 

I never had any accuracy problems with my Visor Deluxe or my Zire 71. Granted, they still *could* have been more accurate than they were (even though I never actually found it problematic), but the capacitive devices are far *less* accurate just because of the whole "finger" thing. Most people just don't notice the inaccuracy because they're using something (a big beefy finger) that, unlike a stylus, they intuitively/subconsciously expect to be inaccurate.
 My Windows Mobile phone I completely gave up on using the touch
 screen at all, I got very good at using the keyboard shortcuts.  The
 only thing I ever used the stylus for was playing solitaire, and even
 then, I had trained myself to offset my tap locations based on what
 part of the screen I was on.  I literally knew exactly where to tap
 if I wanted to move whatever card to another pile -- and it wasn't
 uniform!
 

Hmm, yea, I've never actually used any of the WinCE PDAs. I wouldn't know about them.
Sep 18 2012
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 9/18/12, Walter Bright <newshound2 digitalmars.com> wrote:
 There's nothing, nothing remotely as good as Breaking Bad.

You're just saying that 'cos your name rhymes with the lead character's name. :p
Sep 18 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 14:10:03 -0700
Sean Kelly <sean invisibleduck.org> wrote:

 On Sep 18, 2012, at 1:09 PM, Walter Bright
 <newshound2 digitalmars.com> wrote:
 
 On 9/18/2012 2:08 AM, Nick Sabalausky wrote:
 
 My car stereo takes a USB stick. I specifically picked that model
 for that reason. CDs in the car suck.

Mine does bluetooth, so I don't even have to take my phone out of my pocket to listen to music.

My dad's car has bluetooth connected up to his phone. Every time he starts his car and turns the radio on, about half a minute later the radio cuts completely out and there's an entirely useless voice saying "Device Connected" (or something like that, I forget the exact words). Then it switches his radio back on. I can always tell when he's calling me with it, too, because I can't make out a single word he says (and then he compensates by talking louder which just makes it worse). Meh, it's like there's no such thing as good design anymore. Actually, music over bluetooth? Wouldn't even an FM transmitter be better quality? (Well, unless you have one of those newer antennas that can't be retracted.) Bluetooth has *really* bad bandwidth.
 CDs are terrible and DVDs are worse.
 Most of the kids movies we have at home don't even play any more,
 even though the underside for most isn't terribly scratched.

That's why circumventing copy-protection is fucking awesome: Grab "DVD Decrypter", insert disc, press a button, insert blank disc, press a button, and you have something you can actually let the kids have, *and* it's region-free and has all the PUO bullshit removed. (Well, and then there's DVD Shrink if you need to get a DVD9 down to a DVD5 - and the quality is actually surprisingly good.) Fuck the DMCA.
Sep 18 2012
prev sibling next sibling parent Sean Kelly <sean invisibleduck.org> writes:
On Sep 18, 2012, at 3:50 PM, Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> wrote:

 On Tue, 18 Sep 2012 14:10:03 -0700
 Sean Kelly <sean invisibleduck.org> wrote:
 On Sep 18, 2012, at 1:09 PM, Walter Bright
 <newshound2 digitalmars.com> wrote:
 On 9/18/2012 2:08 AM, Nick Sabalausky wrote:
 My car stereo takes a USB stick. I specifically picked that model
 for that reason. CDs in the car suck.

Mine does bluetooth, so I don't even have to take my phone out of my pocket to listen to music.

My dad's car has bluetooth connected up to his phone. Every time he starts his car and turns the radio on, about half a minute later the radio cuts completely out and there's an entirely useless voice saying "Device Connected" (or something like that, I forget the exact words). Then it switches his radio back on. I can always tell when he's calling me with it, too, because I can't make out a single word he says (and then he compensates by talking louder which just makes it worse). Meh, it's like there's no such thing as good design anymore. Actually, music over bluetooth? Wouldn't even an FM transmitter be better quality? (Well, unless you have one of those newer antennas that can't be retracted.) Bluetooth has *really* bad bandwidth.

Bluetooth 3.0 HS does something like 25 Mbit/s. The trick is finding a car stereo that supports the high bitrate for audio. I agree with you about the phone support, but it's because the mic is shoddy rather than anything about Bluetooth. I don't use the phone setup in my car very often for that reason. With noise reduction turned on I just sound like I'm under water, and with it off it's just noisy in general. My car doesn't offer a very quiet ride though, to be fair.
Sep 18 2012
prev sibling next sibling parent reply "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Tue, 18 Sep 2012 18:32:41 -0400, Nick Sabalausky  
<SeeWebsiteToContactMe semitwist.com> wrote:

 On Tue, 18 Sep 2012 17:42:59 -0400
 "Steven Schveighoffer" <schveiguy yahoo.com> wrote:

 On Tue, 18 Sep 2012 16:50:18 -0400, Nick Sabalausky
 <SeeWebsiteToContactMe semitwist.com> wrote:

 Actually, it's a little too effective: It's impossible to reach down
 into my pocket and adjust the volume because it plain refuses to
 *let* me adjust the volume without taking it out, pushing "Lock" or
 "Home", sliding the touch-slider, and *then* using the damn volume
 buttons - which *still* don't even do what I want most of the time.

If you want to adjust the ringer volume, yes. If you want to adjust the volume of something that is currently playing (like a song), it works without having to unlock. I find the silent switch more useful, I don't often change ringer volumes.

What I *really* want is a master volume control. But there is none. At all. And there is no "app for that". For example: - When I go into a library, I *expect* to have *no sound*, period. And this is what Apple apparently expects you to do: Pull it out, press "home" or "lock", slide the slider, double-press "home", swipe the bottom row to the right, adjust that volume with the touchscreen control, and switch the "ringer/vibrate-only" switch to "vibrate-only". And guess what? Even that *still* doesn't disable all sound. And that's even if you ignore the fact that vibrate isn't actually silent. I don't even take the fucking thing into libraries, I just leave the damn thing in the car. Fuck it. It's not worth it.

You can configure silent mode to not vibrate. Then it has the odd effect (if you have vibrate enabled for full-ring mode) of vibrating when you turn it *off* silent. I tried doing that for a while, but I found myself forgetting to revert the switch, and I would miss updates/calls/emails all day without realizing it! A good improvement (to any phone really) would be to have it configure your audio settings according to wifi SSID. That is, if you're connected to "MyLocalLibraryWifi", then set the thing to full silent. My Windows Mobile phone had a cool feature where it would detect when you were supposed to be in a meeting (according to your calendar) and set itself on silent/vibrate.
 - I'm haplessly attempting to peck something out on the miniature
   non-tactile chicklet-keyboard (which only *sometimes* goes into
   landscape mode) and notice it's too loud. So I have to go find
   something that plays sound, ideally music, play it, *then* adjust the
   fucking volume (otherwise it adjusts the ringer volume instead), then
   stop the music or whatever it was, then go back to whatever it was
   that I was doing and *hope* that I like the new volume setting
   because if not, I have to do it all over again.

The keyboard click sound (which you can disable BTW, settings->sounds->keyboard clicks) obeys the ringer volume. But ringer volume cannot be lowered to "off", so you can't get rid of the volume. Unless you put the phone in silent mode, and then you will hear no clicks. I find silent mode pretty much makes everything silent. Apps do not have to obey that setting, but most of them do (all the games I've played do). I don't know what your exact situation is, or the app you are having difficulty with, but I just tested safari, and it definitely obeys the ringer volume. It really sounds like you just should be using the silent switch.
 - Luckily, I don't use it to play music (I have a *real* portable music
   player for that, with a sensible amount of storage). Because if I
   did, then changing the ringer volume would work like this: Stop the
   music, change the ringer volume, resume the music. Seriously? Talk
   about pointless coupling.

Coincidentally, I wanted to do this today. You can change the ringer volume without manually stopping music by going into settings. But it annoyingly stopped playing music temporarily to demonstrate the new ringer volume. Once I exited settings, it automatically resumed playing music. Meh, what are you going to do? Complain I guess :)
 And then there's the fun times when the stupid thing *thinks* audio is
 playing so it won't let you adjust the ringer volume even though no
 audio is playing.

 Of course, I constantly need to change the ringer volume because, being
 mobile, it's constantly either too quiet or too loud.

Well, I guess you fidget more about ringer volume than I do. I usually like the ringer to be on 100%, because I frequently leave it on my desk or somewhere other than my pocket. When I want it to be quiet, it goes into silent mode.
 And that's *just* volume issues alone. God, I *HATE* the fucking thing.
 Any time I use it, I just want to hurl the damn thing into the nearest
 concrete wall as hard as I can. But I can't, because it's not even
 mine, it's a loaner, and I unfortunately need it for
 development/testing (or at least *will* need it for such once we pay
 Apple their Developer Ransom).

Hehe, yeah, that sucks. But it's definitely worth it if you are going to do *any* development, even if you aren't publishing. Just wait until you try to install your app on your phone for the first time -- I have a feeling you will hate that too :)
 And there's
 a ton of other issues I have had with the devices, like poor
 accuracy (because my fingers aren't <=1mm in diameter and the damn
 thing won't even register touches from anything that's actually
 more accurate).

There are styli for capacitive screens, they aren't that great, but better than a finger. But no place to store them on the phone. I think Samsung has a stylus-based capacitive screen phone called the Galaxy Note.

Right. Basically capacitive stylus is a hack solution. And the thing is too, I already *have* no less than *ten* styli built right into my fingers. But they're incompatible. And so is my knuckle (mostly), which is annoying when my fingers are messy.

Again, given my experience with the fragility of the non-capacitive touch screen phones I've had, and the lack of accuracy of them, I'd take capacitive *any day*. My mom is a different story. I talked her into getting an iPhone and she has a difficult time because of her longer nails. I recommend getting this app to practice typing better: http://itunes.apple.com/us/app/taptyping-typing-trainer-suite/id364237969?mt=8 My typing has improved dramatically with some of the techniques they recommend.
 But I have not had much of a problem with accuracy.  In certain cases
 when I'm browsing the web, I have to zoom in to accurately tap a
 link. However, my touch screens that I had with my palm Treo, and
 Windows Mobile 6 phones both sucked at accuracy.  I spent so much
 time "calibrating" them, and even then, I couldn't click on anything
 near the edges.

I never had any accuracy problems with my Visor Deluxe or my Zire 71. Granted, they still *could* have been more accurate than they were (even though I never actually found it problematic), but the capacitive devices are far *less* accurate just because of the whole "finger" thing. Most people just don't notice the inaccuracy because they're using something (big beefy finger) that, unlike a stylus, they intuitively/subconsciously expect to be inaccurate.

Also, the UI is designed around that limitation. For instance, typing on the keyboard pops up a temporary copy of the key so you can see what you are pressing.
 My Windows Mobile phone I completely gave up on using the touch
 screen at all, I got very good at using the keyboard shortcuts.  The
 only thing I ever used the stylus for was playing solitaire, and even
 then, I had trained myself to offset my tap locations based on what
 part of the screen I was on.  I literally knew exactly where to tap
 if I wanted to move whatever card to another pile -- and it wasn't
 uniform!

Hmm, yea, I've never actually used any of the WinCE PDAs. I wouldn't know about them.

It was the same screen as my palm. Same technology anyway. I love how my iPhone will never scratch or deteriorate. I remember a friend whose palm treo was so bad, he had to put so much force on the screen to get anything to happen that his hands would literally shake. -Steve
Sep 18 2012
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Sep 21, 2012 at 05:38:06PM -0400, Nick Sabalausky wrote:
 On Fri, 21 Sep 2012 22:13:22 +0200
 "Paulo Pinto" <pjmlp progtools.org> wrote:
 
 On Friday, 21 September 2012 at 19:09:48 UTC, H. S. Teoh wrote:
 The saddest thing is that people are paying big bucks for this
 kind of "enterprise" code. It's one of those things that make me
 never want to pay for *any* kind of software... why waste the
 money when you can download the OSS version for free? Yeah a lot
 of OSS code is crap, but it's not like it's any worse than the
 crap you pay for.

Welcome to my world. As a Fortune 500 outsourcing consulting company employee, I see this type of code every day.

I find it depressing to see just how *easy* it is to have dailywtf-worthy material. They anonymized my name as Nate here: http://thedailywtf.com/Articles/We_Have_Met_the_Enemy.aspx

LOL... I should submit the ipv6 prefix checking code that does conversion to string. The sad part is that so many of the commenters have no idea that adjacent C literals are concatenated at compile-time. It's a very nice way to put long strings in code and have it nicely indented, something that is sorely lacking in most languages. But regardless, why are they posting if they clearly don't know C that well?!
 Note also that the "' ...code here" and "' ...more code here" sections
 were typically HUGE.

Speaking of 1000-line functions... yeah I routinely work with those monsters. They tend to also have a ridiculously long list of parameters, which over the history of the function have been added one by one as people felt the need for Yet Another Variation on the function's capabilities. Most of those parameters are either meaningless or ignored most of the time (necessitating ridiculously long lists of null/dummy values every single time the function is called), save for one or two exceptional cases when most of the *other* parameters aren't needed. Calling the function with unforeseen combinations of parameters usually triggers a bug caused by unexpected interactions between parameters that were assumed to be independent.
 And that was only scratching the surface of the lunacy that was going
 on there - both in and out of the codebase.

I have seen code whose function names are along the lines of "do_it()" and "do_everything()". As well as "do_main()" and "${program_name}_main()" in addition to "main()".
 I've been sticking to contract stuff now, largely because I really
 just can't take that sort of insanity anymore (not that I ever could).
 If I ever needed to go back to 9-5 code, or cubicles, or
 open-floorplan warrooms, I'd *really* be in trouble.

I really should start doing contract work. Being stuck with the same project and dealing with the same stupid code that never gets fixed is just very taxing on the nerves. T -- Blunt statements really don't have a point.
Sep 21 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Fri, 21 Sep 2012 15:37:46 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 
 The sad part is that so many of the commenters have no idea that
 adjacent C literals are concatenated at compile-time. It's a very nice
 way to put long strings in code and have it nicely indented, something
 that is sorely lacking in most languages. But regardless, why are they
 posting if they clearly don't know C that well?!
 

Heh, actually I didn't even know about it until I learned it from D and then learned that D got it from C (does D still do it, or is that one of those "to be deprecated" things?) But then dealing with strings is something I generally tried to avoid in C anyway ;)
 
 Note also that the "' ...code here" and "' ...more code here"
 sections were typically HUGE.

Speaking of 1000-line functions... yeah I routinely work with those monsters.

*cough* DMD's main() *cough* ;) Although it's actually, surprisingly, not too bad in DMD's case, all things considered. Took me by surprise at first though, I really wasn't expecting it.
 And that was only scratching the surface of the lunacy that was
 going on there - both in and out of the codebase.

I have seen code whose function names are along the lines of "do_it()" and "do_everything()". As well as "do_main()" and "${program_name}_main()" in addition to "main()".

What really gets me is that these are the sorts of things that are harped on in chapter 1 of just about any decent "intro to programming" book. So where did these people even learn to code in the first place? Heck, back in college, I used to be a CS tutor for first semester programming students. Even *they* wrote better code, no exaggeration. (Well, except for the handful of students, and I could always tell which ones they were, who were from the class of Mrs. "Let's Teach OOP *Before* Basic Flow Of Execution". Those poor students couldn't write *any* code, let alone good or bad code. I felt bad for them.)
 
 I've been sticking to contract stuff now, largely because I really
 just can't take that sort of insanity anymore (not that I ever
 could). If I ever needed to go back to 9-5 code, or cubicles, or
 open-floorplan warrooms, I'd *really* be in trouble.

I really should start doing contract work. Being stuck with the same project and dealing with the same stupid code that never gets fixed is just very taxing on the nerves.

Yea, contract has its upsides, although naturally it has its own perils too. Making a living at it is *damn* hard (either that or I'm just REALLY bad at self-employment...but it's probably both), and frankly I'm still trying to figure out how to do it. And you can forget about health care if you're in the US: Non-group premiums on insurance (read: legalized casinos without the neon lights and cocktails) are just as expensive as paying out-of-pocket (remember, the house *always* has the advantage), and that's if you're lucky enough to have never had a gap in coverage. If you have, then your premiums are literally buying you nothing unless you *ahem* "win" and get mangled by a car or get a terminal disease or something. Not to discourage you though. Everything sucks, it's just finding a "suck" that you can live with, y'know ;) Personally, I'm still looking...
Sep 22 2012
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Sep 24, 2012 at 07:52:15PM -0400, Nick Sabalausky wrote:
[...]
 Independently controllable ringer/game/music volumes: Good
 
 Complete *lack* of any way to control *overall* volume: Bad

I have to agree with that. It's OK, and sometimes even useful, to have multiple independent volumes, but it makes no sense to NOT have a master volume that controls everything else. Sometimes you just want to mute the whole dang device, and that should not require fiddling with every single independent volume setting.
 A lot of the videogames I've played have independent adjustable
 SFX/music/voice volumes. I've even happily made use of that. And I'm
 damn glad that the TV *still* has a properly working volume control
 despite that because I make even more use of that.

Yeah I almost never play games with music on, 'cos I generally find the music not to my liking. SFX I sometimes leave on low, though on handhelds I generally turn both off. But the option to only have SFX without music is a plus. I *have* deleted apps before that didn't allow independent settings. [...]
 I feel like I get the best of all worlds.

Yea, but to get that, you have to use OSX as your *primary* environment, and stick with expensive iHardware. Might work for you, but those are all deal-breakers for me.

I find it sad that Apple has left its original philosophy of open protocols and specs so that you can make it interoperate with stuff. For all their flaws, PCs are much more palatable 'cos you can replace parts that you don't like with alternatives. With closed hardware and vendor lock-in, I can't say that Macs are exactly near the top of the list for hardware I'd consider buying. I've had a bad experience with PC laptops already (after 2 years parts started wearing out and I can't replace them 'cos they need specialized tools that vary from vendor to vendor -- no choice but to buy a brand new one though the old one could've continued to work if a few basic parts were replaced) -- I don't feel like I want to repeat that experience. So yeah, this is a deal-breaker for me too. [...]
 The one thing I would rip out of OSX and throw against the wall is
 the mail app.  Its interface and experience is awesome.  But it
 frequently corrupts messages and doesn't properly save outgoing
 mail.  Not good for a mail application.


Ahhh how I love Mutt. ;-)
 I didn't have corruption issues with it, but I did find it to be
 rather gimped and straight-jacketed much like the rest of the system.

I find pretty much all GUI mail apps (both webmail and local MUAs) strait-jacketed. Anything that doesn't let you configure mail headers is unusable to me, and HTML by default gets on my nerves so much it's not even funny. I want my mail to NOT have stupid extraneous headers that are completely unnecessary for what I use mail for, and yes most people don't care, but as the adage goes: easy things should be easy, hard things should be possible. I find in pretty much every GUI mail app that easy things are hard and hard things are impossible. But anyway, I stumbled across this cute little thing just today: http://daringfireball.net/projects/markdown/ I'd love to start a trend for a new kind of email: one in which the message is transmitted as markdown text, and, should the receiver so wish, the receiving end automatically converts that into HTML. This way you can either write directly in plaintext (like I do) or use a GUI front-end for composing messages (like most normal people do), the transmission won't have stupid useless HTML clutter (or worse, JS viruses and other detritus), and the receiver can get all mails in plaintext or HTML according to their choice. AND there is no need for multiple MIME parts; the markdown text can be read directly as plaintext or translated into HTML for people who prefer that. Now, somebody just has to cook up this MUA in D, and make it the killer D app that will take over the world. ;-) T -- When solving a problem, take care that you do not become part of the problem.
Sep 24 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Mon, 24 Sep 2012 18:10:09 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:

 On Mon, Sep 24, 2012 at 07:52:15PM -0400, Nick Sabalausky wrote:
 
 A lot of the videogames I've played have independent adjustable
 SFX/music/voice volumes. I've even happily made use of that. And I'm
 damn glad that the TV *still* has a properly working volume control
 despite that because I make even more use of that.

Yeah I almost never play games with music on, 'cos I generally find the music not to my liking. SFX I sometimes leave on low, though on handhelds I generally turn both off. But the option to only have SFX without music is a plus. I *have* deleted apps before that didn't allow independent settings.

I never used to mute videogame music until they started licensing stuff from the record labels. Like all that "EA Trax" stuff. Blech. Last generation, that was one of the great things about the XBox: custom soundtracks. My brother introduced me to Quarashi's Jinx album which made for a far better soundtrack for THPS2X than the built-in songs. The Tony Hawk games from 3 onward were almost unplayable with the built-in music enabled. Unfortunately, my most frequent use of game audio controls is to fix the piss-poor mixing that's common in a lot of games. When you can't hear important voiceovers because they're quieter than the music or sfx (example: Splinter Cell 3), it's nice to be able to fix that screwup by cranking up the voice volume, and turning everything else down. I've often wished I could turn off the elevator music in Wii Sports Resort without having to mute the whole thing. But of course, all that still doesn't mean I'd ever be willing to give up the TV's "adjust *everything's* volume". Individual controls let you adjust the "mix" ie relative volume relations, and then a master volume is indispensible for normal "I need this thing louder/quieter".
 
 [...]
 I feel like I get the best of all worlds.

Yea, but to get that, you have to use OSX as your *primary* environment, and stick with expensive iHardware. Might work for you, but those are all deal-breakers for me.

I find it sad that Apple has left its original philosophy of open protocols and specs so that you can make it interoperate with stuff.

Absolutely. That's one of my biggest irritations with modern Apple.
 For all their flaws, PCs are much more palatable 'cos you can replace
 parts that you don't like with alternatives. With closed hardware and
 vendor lock-in, I can't say that Macs are exactly near the top of the
 list for hardware I'd consider buying. I've had a bad experience with
 PC laptops already (after 2 years parts starting wearing out and I
 can't replace them 'cos they need specialized tools that vary from
 vendor to vendor -- no choice but to buy a brand new one though the
 old one could've continued to work if a few basic parts were
 replaced) -- I don't feel like I want to repeat that experience. So
 yeah, this is a deal-breaker for me too.
 

Yeah.
 
 [...]
 The one thing I would rip out of OSX and throw against the wall is
 the mail app.  Its interface and experience is awesome.  But it
 frequently corrupts messages and doesn't properly save outgoing
 mail.  Not good for a mail application.


Ahhh how I love Mutt. ;-)

I've been finding Mutt very useful for when I'm ssh'ed into my server to create a temporary throwaway address. Doing "mutt -f /path/to/mailbox" is so much more convenient than setting up a POP3 GUI client. I need to learn how to use mutt better though, as I've just been fumbling around with it. For my usual mailboxes though, I prefer typical GUI desktop clients. Unfortunately, I still haven't been able to find one that I like. Outlook Express has a bunch of problems (no spellcheck, can't send UTF, proprietary storage, etc). Windows Mail won't be an option when I move to Linux or upgrade back to XP. Claws mail is just generally buggy and never does anything in the background (feels almost like it might be purely single-threaded). And I'm not a big fan of Opera and don't really want to use a web browser as my desktop mail client. I think I might actually try moving to Thunderbird even though I'm generally unhappy with Mozilla software/practices, and didn't like it last time I tried (for example, it kept trying to bold/italic/underline parts of text in my *plaintext* views, and the people on the "help" forums just complained that I should shut up and like it - which is consistent with what usually happens when I inquire about customizing parts of Mozilla's so-called "most customizable browser in the world").
 
 I didn't have corruption issues with it, but I did find it to be
 rather gimped and straight-jacketed much like the rest of the
 system.

I find pretty much all GUI mail apps (both webmail and local MUAs) strait-jacketed. Anything that doesn't let you configure mail headers is unusable to me, and HTML by default gets on my nerves so much it's not even funny.

I never care about mail headers (unless I'm debugging something mail-related, which isn't often), but I *ALWAYS* have HTML disabled. I'll never use a mail client that doesn't let me turn HTML off. Not only do I not want to deal with any tracker-images (or god forbid, JS emails), but in my experience "HTML email" just means it's too easy, and far too tempting, for other people to make the stuff they send me really, really ugly ;) "Just the words, ma'am."
Sep 24 2012
prev sibling parent Sean Kelly <sean invisibleduck.org> writes:
On Sep 24, 2012, at 6:55 PM, Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> wrote:

 On Mon, 24 Sep 2012 18:10:09 -0700
 "H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 On Mon, Sep 24, 2012 at 07:52:15PM -0400, Nick Sabalausky wrote:
 A lot of the videogames I've played have independent adjustable
 SFX/music/voice volumes. I've even happily made use of that. And I'm
 damn glad that the TV *still* has a properly working volume control
 despite that because I make even more use of that.

Yeah I almost never play games with music on, 'cos I generally find the music not to my liking. SFX I sometimes leave on low, though on handhelds I generally turn both off. But the option to only have SFX without music is a plus. I *have* deleted apps before that didn't allow independent settings.

I never used to mute videogame music until they started licensing stuff
from the record labels. Like all that "EA Trax" stuff. Blech. Last
 generation, that was one of the great things about the XBox: custom
 soundtracks. My brother introduced me to Quarashi's Jinx album which
 made for a far better soundtrack for THPS2X than the built-in songs.
 The Tony Hawk games from 3 onward were almost unplayable with the
 built-in music enabled.

One really interesting side effect of using licensed music in games is that it can prevent the game from being re-released as a "classic" later on, ported to other platforms, etc, if the licensing deal didn't include a clause for that (which is typically the case). There have been games re-released in the past few years with no music track because the license didn't allow for its inclusion.
Sep 25 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 23:46:35 -0400
"Steven Schveighoffer" <schveiguy yahoo.com> wrote:
 
 You can configure silent mode to not vibrate.  Then it has the odd
 effect (if you have vibrate enabled for full-ring mode) of vibrating
 when you turn it *off* silent.
 
 I tried doing that for a while, but I found myself forgetting to
 revert the switch, and I would miss updates/calls/emails all day
 without realizing it!
 

Thanks for the tip!
 A good improvement (to any phone really) would be to have it
 configure your audio settings according to wifi SSID.  That is, if
 you're connected to "MyLocalLibraryWifi", then set the thing to full
 silent.  My Windows Mobile phone had a cool feature where it would
 detect when you were supposed to be in a meeting (according to your
 calendar) and set itself on silent/vibrate.
 

Now those are some clever ideas.
 The keyboard click sound (which you can disable BTW,  
 settings->sounds->keyboard clicks) obeys the ringer volume.

Ehh? How unintuitive.
 But
 ringer volume cannot be lowered to "off", so you can't get rid of the
 volume. Unless you put the phone in silent mode, and then you will
 hear no clicks.  I find silent mode pretty much makes everything
 silent.  Apps do not have to obey that setting, but most of them do
 (all the games I've played do).
 
 I don't know what your exact situation is, or the app you are having  
 difficulty with, but I just tested safari, and it definitely obeys
 the ringer volume.  It really sounds like you just should be using
 the silent switch.
 

I think the main problem is that the volume rules are just far too convoluted. They took something trivial and hacked it up beyond recognition, and all in the supposed name of "simplicity", go figure.
 
 Well, I guess you fidget more about ringer volume than I do.  I
 usually like the ringer to be on 100%, because I frequently leave it
 on my desk or somewhere other than my pocket.  When I want it to be
 quiet, it goes into silent mode.
 

Well, I *would* fidget with it a lot, but frankly no matter what I do it's always playing something either too loud or too quiet, and I've got better things to do than mess with a screwy interface every time I walk into a different environment. So really it just encourages me to avoid even using it or even bringing the thing anywhere unless I really need it. A stiff, recessed master volume dial that I could reach into my pocket to adjust would pretty much solve the issue, but I guess that just isn't "high tech" enough. Make it holographic so you can't even feel it at all, *then* Apple would probably toss it in. :/
 And that's *just* volume issues alone. God, I *HATE* the fucking
 thing. Any time I use it, I just want to hurl the damn thing into
 the nearest concrete wall as hard as I can. But I can't, because
 it's not even mine, it's a loaner, and I unfortunately need it for
 development/testing (or at least *will* need it for such once we pay
 Apple their Developer Ransom).

Hehe, yeah, that sucks. But it's definitely worth it if you are going to do *any* development, even if you aren't publishing.

If it were my own personal device, I'd just jailbreak it and be done with it. (And then pay the ransom to publish, of course, because what else can you do? Create your own device and compete with Apple under capitalism? Nope, Google tried that idea of "competition" and look what happened: <http://www.nytimes.com/2012/08/25/technology/jury-reaches-decision-in-apple-samsung-patent-trial.html? r=1&ref=technology> )
 Just wait until you try to install your app on your phone for the
 first time -- I have a feeling you will hate that too :)
 

I've done it on the Android already - could be better could be worse. Marmalade's deployment tool is really dodgy when installing to a device, but using Google's ADB directly is pretty reliable, and so is installing from a URL via the device's browser. I'm definitely not looking forward to dealing with iTunes though. I've already used it for syncing the phone, and it's just a big mess. I don't even bother trying to sync it anymore (PalmOS syncing OTOH, was flawless). When the time comes, I'll probably grab copies of "Phone to PC" and/or "Phone Disk" <http://www.macroplant.com/downloads.php>. The demos of those seem to work much better than iTunes, plus they don't treat me like a brain-damaged monkey.
 
 I love how my iPhone will never scratch or deteriorate.

Instead, it'll just get prematurely discontinued ;) But I dunno, I've heard that the iPhones are so brittle that you practically look at them the wrong way and they break. (I wouldn't know - I've got a super heavy-duty case on mine. The device is far too expensive to replace if anything happened to it. Damn thing costs twice as much as my laptop. For a stupid little phone. Go figure.)
Sep 18 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 20:32:29 -0700
Sean Kelly <sean invisibleduck.org> wrote:

 On Sep 18, 2012, at 3:50 PM, Nick Sabalausky
 <SeeWebsiteToContactMe semitwist.com> wrote:
 
 
 Actually, music over bluetooth? Wouldn't even an FM transmitter be
 better quality? (Well, unless you have one of those newer antennas
 that can't be retracted.) Bluetooth has *really* bad bandwidth.

Bluetooth 3.0 HS does something like 25 Mbit/s. The trick is finding a car stereo that supports the high bitrate for audio. I agree with you about the phone support, but it's because the mic is shoddy rather than anything about Bluetooth. I don't use the phone setup in my car very often for that reason. With noise reduction turned on I just sound like I'm under water, and with it off it's just noisy in general. My car doesn't offer a very quiet ride though, to be fair.

I guess my bluetooth info's pretty out-of-date. When I think "bluetooth audio" I still think "Wii remote speaker". Granted, that's a terrible "speaker" period (I usually have it muted), but my understanding was that the bandwidth couldn't have really driven anything much better...at least at the time, I guess.
Sep 18 2012
prev sibling next sibling parent reply "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Wed, 19 Sep 2012 01:34:12 -0400, Nick Sabalausky  
<SeeWebsiteToContactMe semitwist.com> wrote:

 On Tue, 18 Sep 2012 23:46:35 -0400
 "Steven Schveighoffer" <schveiguy yahoo.com> wrote:
 The keyboard click sound (which you can disable BTW,
 settings->sounds->keyboard clicks) obeys the ringer volume.

Ehh? How unintuitive.

I cannot argue that Apple's audio volume isn't too simplistic for its own good. AIUI, they have two "volumes", one for the ringer, and one for playing audio, games, videos, etc. I feel like the volume should be app-specific, and you should be able to allocate new volume categories. Putting keyboard clicks under the ringer volume seems like a kludge. However, it *does* do a good job of remembering volume settings for different audio outputs. For example, it keeps track of your headphone ringer and audio volume separate from your speaker ringer and audio volume.
 I think the main problem is that the volume rules are just far too
 convoluted. They took something trivial and hacked it up beyond
 recognition, and all in the supposed name of "simplicity", go figure.

I think if they simply made the volume buttons control the ringer while locked and not playing music, it would solve the problem. BTW, a cool feature I didn't know for a long time is if you double tap the home button, your audio controls appear on the lock screen (play/pause, next/previous song, and audio volume). But I think you have to unlock to access ringer volume.
 Well, I guess you fidget more about ringer volume than I do.  I
 usually like the ringer to be on 100%, because I frequently leave it
 on my desk or somewhere other than my pocket.  When I want it to be
 quiet, it goes into silent mode.

Well, I *would* fidget with it a lot, but frankly no matter what I do it's always playing something either too loud or too quiet, and I've got better things to do than mess with a screwy interface every time I walk into a different environment. So really it just encourages me to avoid even using it or even bringing the thing anywhere unless I really need it. A stiff, recessed master volume dial that I could reach into my pocket to adjust would pretty much solve the issue, but I guess that just isn't "high tech" enough. Make it holographic so you can't even feel it at all, *then* Apple would probably toss it in. :/

It's more moving parts to break. I wouldn't like it. Just my opinion.
 And that's *just* volume issues alone. God, I *HATE* the fucking
 thing. Any time I use it, I just want to hurl the damn thing into
 the nearest concrete wall as hard as I can. But I can't, because
 it's not even mine, it's a loaner, and I unfortunately need it for
 development/testing (or at least *will* need it for such once we pay
 Apple their Developer Ransom).

Hehe, yeah, that sucks. But it's definitely worth it if you are going to do *any* development, even if you aren't publishing.

If it were my own personal device, I'd just jailbreak it and be done with it. (And then pay the ransom to publish, of course, because what else can you do? Create your own device and compete with Apple under capitalism? Nope, Google tried that idea of "competition" and look what happened: <http://www.nytimes.com/2012/08/25/technology/jury-reaches-decision-in-apple-samsung-patent-trial.html? r=1&ref=technology> )

If you want to develop for only jailbroken phones, you basically alienate most users of iPhone. It's not a viable business model IMO. Yes, it sucks to have to jump through apple's hoops, but having access to millions of users is very much worth it.
 Just wait until you try to install your app on your phone for the
 first time -- I have a feeling you will hate that too :)

I've done it on the Android already - could be better could be worse. Marmalade's deployment tool is really dodgy when installing to a device, but using Google's ADB directly is pretty reliable, and so is installing from a URL via the device's browser. I'm definitely not looking forward to dealing with iTunes though. I've already used it for syncing the phone, and it's just a big mess. I don't even bother trying to sync it anymore (PalmOS syncing OTOH, was flawless). When the time comes, I'll probably grab copies of "Phone to PC" and/or "Phone Disk" <http://www.macroplant.com/downloads.php>. The demos of those seem to work much better than iTunes, plus they don't treat me like a brain-damaged monkey.

Oh, when you develop apps, it's quite easy to install on the phone, you just click "run" from xcode, selecting your device, you don't ever have to start itunes (though itunes will auto-start every time you plug in the phone, but you can disable this in itunes, more annoying is that iPhoto *always* starts, I can't figure out how to stop that). From then on, the app is installed. The issue is setting up all the certificates via xcode and their web portal to get that to work (should only have to do this once). I think the process has streamlined a bit, you used to have to create an app id for each app and select which devices were authorized to install it. Now I think you get a wildcard app id, but you still have to register each device.
 I love how my iPhone will never scratch or deteriorate.

Instead, it'll just get prematurely discontinued ;)

3gs (released june 2009) was still being sold last month, and it is getting ios 6 upgrade. I still have mine and develop with it.
 But I dunno, I've heard that the iPhones are so brittle that you
 practically look at them the wrong way and they break. (I wouldn't
 know - I've got a super heavy-duty case on mine. The device is far too
 expensive to replace if anything happened to it. Damn thing costs twice
 as much as my laptop. For a stupid little phone. Go figure.)

My wife and I have been very careful with ours, but I do see a lot with cracked screens. Interesting thing is they still seem to work! I don't think a cracked/broken screen would ever work with a palm-style touch screen.

Also, starting with iPhone 4s (and iPad 2 I think?) you can buy AppleCare for your device for $99 that covers two accidental breakage incidents (at $49 each) for up to 2 years. This includes cracked screens and water damage. Only catch is you have to buy it within 30 days of activating the phone (or purchasing the iPad if not 3g enabled). Well worth the extra cost when you consider the full retail price! I did it for my 4s, and will do it for all my subsequent iPhones.

-Steve
Sep 19 2012
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sat, Sep 22, 2012 at 03:48:49AM -0400, Nick Sabalausky wrote:
 On Fri, 21 Sep 2012 15:37:46 -0700
 "H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 
 The sad part is that so many of the commenters have no idea that
 adjacent C literals are concatenated at compile-time. It's a very
 nice way to put long strings in code and have it nicely indented,
 something that is sorely lacking in most languages. But regardless,
 why are they posting if they clearly don't know C that well?!
 

Heh, actually I didn't even know about it until I learned it from D and then learned that D got it from C (does D still do it, or is that one of those "to be deprecated" things?)

Heh. I suppose in any language complex enough to be interesting there are always some things that you don't know about until a long time later. :) So maybe I was a bit harsh on the commenters. But still, they should've checked before they posted (but then I'm guilty of that one too).
 But then dealing with strings is something I generally tried to avoid
 in C anyway ;)

Yeah... D is just so much more comfortable to write when dealing with strings. With std.regex in Phobos now, writing string-processing code in D is almost as comfortable as Perl, and probably performs better too.
 Note also that the "' ...code here" and "' ...more code here"
 sections were typically HUGE.

Speaking of 1000-line functions... yeah I routinely work with those monsters.

*cough* DMD's main() *cough* ;) Although it's actually, surprisingly, not too bad in DMD's case, all things considered. Took me by surprise at first though, I really wasn't expecting it.

I haven't followed this rule to the letter all the time, but usually I consider that if a function doesn't read like pseudo-code, then you're doing something wrong. What I mean is, it should read like steps of an algorithm that makes sense when you read it, for example, "initialize data structure X to empty, loop over input items, transform data and store in X, return result" should map to something like:

	Result func(Input input) {
		auto result = X();
		foreach (item; input) {
			auto x = transform(item);
			result.add(x);
		}
		return result;
	}

That is, the detailed steps in transform() shouldn't pollute the main code in func(), but should remain as a separate function. Ditto with X.add(), which may involve a complicated series of steps.

As soon as you start having a whole bunch of code at different levels of abstraction mixed together, you know it's time to split them up into separate functions, 'cos chances are you'll need to use each piece independently one day.

[...]
 I have seen code whose function names are along the lines of
 "do_it()" and "do_everything()". As well as "do_main()" and
 "${program_name}_main()" in addition to "main()".
 

What really gets me is that these are the sorts of things that are harped on in chapter 1 of just about any decent "intro to programming" book. So where did these people even learn to code in the first place?

Probably from a youtube video on how to write your own lame Flash game. :-P OK, I'm being a bit harsh. But it's hard not to be cynical when you've seen the kind of code that passes for "enterprise software" these days.
 Heck, back in college, I used to be a CS tutor for first semester
 programming students. Even *they* wrote better code, no exaggeration.

Totally. I've been a teaching assistant before, and we made sure to drill sound programming practices into the students early, and often. Makes me wonder where all these students went after they graduated, 'cos the people writing code in the workforce don't seem to be the same people who attended these courses. Strange.
 (Well, except for the handful of students, and I could always tell
 which ones they were, who were from the class of Mrs. "Let's Teach OOP
 *Before* Basic Flow Of Execution". Those poor students couldn't write
 *any* code, let alone good or bad code. I felt bad for them.) 

Speaking of students who couldn't code... I used to give out "sympathy marks" for struggling students. Y'know, their code was so bad, like code that obviously didn't compile or work, or code with comments written in a way that suggested the student thought that if they pleaded hard enough verbally the computer might *just* do what they wanted it to do -- they had to get a failing mark, but I tried to find excuses to not give them an outright zero. But one time, after marking a bunch of 15-20 page assignments (complete with intro, description, code, test results, etc.), I came across a submission consisting of a single sheet of paper *hand-written* on a single side. I was boggled for a good moment. It was like... I was trying not to give anyone an outright zero but she gave me no choice, y'know? What was the point of handing in that piece of paper if she wasn't even going to make the effort of using the lab *printer*, for crying out loud.

What's scary is that sometimes I wonder if that girl would've done a better job at the kind of "enterprise" code I see every now and then. At least her inability to code (or use a computer, for that matter) would be obvious, instead of the kind of garbage that passes for code, compiles, and apparently works (and somehow passes code review) but has so many things wrong with it that it makes you face-palm, many, many times.
 I've been sticking to contract stuff now, largely because I really
 just can't take that sort of insanity anymore (not that I ever
 could). If I ever needed to go back to 9-5 code, or cubicles, or
 open-floorplan warrooms, I'd *really* be in trouble.

I really should start doing contract work. Being stuck with the same project and dealing with the same stupid code that never gets fixed is just very taxing on the nerves.

Yea, contract has its upsides, although naturally it has its own perils too. Making a living at it is *damn* hard (either that or I'm just REALLY bad at self-employment...but it's probably both), and frankly I'm still trying to figure out how to do it.

True. I suppose you just have to do what's most popular out there right now. I know someone who does Java stuff, and he's never short on contracts. In fact, he gets to choose his vacations 'cos he has enough options in terms of which contracts he chooses to bid on.

T

-- 
"Maybe" is a strange word. When mom or dad says it it means "yes", but when my big brothers say it it means "no"! -- PJ jr.
Sep 23 2012
prev sibling next sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Monday, 24 September 2012 at 05:45:11 UTC, H. S. Teoh wrote:
 On Sat, Sep 22, 2012 at 03:48:49AM -0400, Nick Sabalausky wrote:
 On Fri, 21 Sep 2012 15:37:46 -0700
 "H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 
 The sad part is that so many of the commenters have no idea 
 that
 adjacent C literals are concatenated at compile-time. It's a 
 very
 nice way to put long strings in code and have it nicely 
 indented,
 something that is sorely lacking in most languages. But 
 regardless,
 why are they posting if they clearly don't know C that well?!
 

Heh, actually I didn't even know about it until I learned it from D and then learned that D got it from C (does D still do it, or is that one of those "to be deprecated" things?)

Heh. I suppose in any language complex enough to be interesting there are always some things that you don't know about until a long time later. :) So maybe I was a bit harsh on the commenters. But still, they should've checked before they posted (but then I'm guilty of that one too).
 But then dealing with strings is something I generally tried 
 to avoid
 in C anyway ;)

Yeah... D is just so much more comfortable to write when dealing with strings. With std.regex in Phobos now, writing string-processing code in D is almost as comfortable as Perl, and probably performs better too.

Having learned Turbo Pascal before I delved into C, the language always felt pre-historic to me. Strings were a joke compared with what Turbo Pascal offered, lack of modules and so forth. After high-school, I only touched C in the university assignments that made use of it, or in legacy code at my first job. I would rather use C++ instead, which gave me back a bit of what I've lost in the Turbo Pascal -> C transition, plus much more.
 [...]
 
 Yea, contract has its upsides, although naturally it has its
 own
 perils too. Making a living at it is *damn* hard (either that 
 or I'm
 just REALLY bad at self-employment...but it's probably both), 
 and
 frankly I'm still trying to figure out how to do it.

True. I suppose you just have to do what's most popular out there right now. I know someone who does Java stuff, and he's never short on contracts. In fact, he gets to choose his vacations 'cos he has enough options in terms of which contracts he chooses to bid on. T

That is why I stopped being religious about technology. If the customer pays for doing a something in language X, operating system Y, that is all that counts for me at the end of the day. Otherwise you are forced to travel a lot, if you only do certain types of technologies. -- Paulo
Sep 23 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Sun, 23 Sep 2012 22:47:58 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:

[...]
 
 As soon as you start having a whole bunch of code at different levels
 of abstraction mixed together, you know it's time to split them up
 into separate functions, 'cos chances are you'll need to use each
 piece independently one day.
 

Yea, I feel the same way: Code should operate at one level of abstraction. At least ideally. I don't always adhere hard-and-fast to it either because sometimes "getting it done" is more important than "getting it perfect" (not to say that excuses *really* bad enterprisey code). And sometimes you *can't* stick to one level due to performance or language limitations (depending on the language). But whenever reasonably possible, the rule of thumb: One level of abstraction per function. Along similar lines, there's another rule-of-thumb that I actually did pick up in college (and oddly enough, from the "Let's teach first-time beginners OOP, ie code architecture, before flow-of-execution" instructor): If a function is more than one screenful, it's probably too long. Now obviously that can't and shouldn't be used as a hard-and-fast rule, but I've found it a sensible rough guideline. Although I'm much more relaxed about it now than I used to be...and even *moreso* now that I'm on one of these annoying half-height short-screens, erm, I mean "widescreen" ;)
 
 [...]
 
 What really gets me is that these are the sorts of things that are
 harped on in chapter 1 of just about any decent "intro to
 programming" book. So where did these people even learn to code in
 the first place?

Probably from a youtube video on how to write your own lame Flash game. :-P OK, I'm being a bit harsh. But it's hard not to be cynical when you've seen the kind of code that passes for "enterprise software" these days.

Maybe it's just because I'm just beyond the snapping point, too, but I don't think it's possible to be too harsh on such code, or anything involving Flash ;)
 (Well, except for the handful of students, and I could always tell
 which ones they were, who were from the class of Mrs. "Let's Teach
 OOP *Before* Basic Flow Of Execution". Those poor students couldn't
 write *any* code, let alone good or bad code. I felt bad for them.) 

Speaking of students who couldn't code... I used to give out "sympathy marks" for struggling students. Y'know, their code was so bad, like code that obviously didn't compile or work, or code with comments written in a way that suggested the student thought that if they pleaded hard enough verbally the computer might *just* do what they wanted it to do -- they had to get a failing mark, but I tried to find excuses to not give them an outright zero. But one time, after marking a bunch of 15-20 page assignments (complete with intro, description, code, test results, etc.), I came across a submission consisting of a single sheet of paper *hand-written* on a single side. I was boggled for a good moment. It was like... I was trying not to give anyone an outright zero but she gave me no choice, y'know? What was the point of handing that piece of paper if she wasn't even going to make the effort of using the lab *printer*, for crying out loud.

Ouch. Yea. I think I actually got a sympathy "pass" in German 101. I barely scraped by with a "D-" overall, but if you went strictly by the numbers and class rules it really should have been an F. I think she knew I was at least trying and struggling though. Ironically, the only reason I even took it was because I was having trouble getting through the school's "4 semesters of the same foreign language" requirement when I went with Japanese. Little did I know German would be *far* harder. "Oh, yea, go with German - it's the closest language to English!" Yea, I think that's actually what made it so hard ;) That, and gendered words.
Sep 24 2012
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Sep 24, 2012 at 09:55:48PM -0400, Nick Sabalausky wrote:
[...]
 The one thing I would rip out of OSX and throw against the wall
 is the mail app.  Its interface and experience is awesome.  But
 it frequently corrupts messages and doesn't properly save
 outgoing mail.  Not good for a mail application.


Ahhh how I love Mutt. ;-)

I've been finding Mutt very useful for when I'm ssh'ed into my server to create a temporary throwaway address. Doing "mutt -f /path/to/mailbox" is so much more convenient than setting up a POP3 GUI client. I need to learn how to use mutt better though, as I've just been fumbling around with it.

Well, mutt's tagline is that it sucks, all MUAs suck, mutt just sucks less. :-)
 For my usual mailboxes though, I prefer typical GUI desktop clients.
 Unfortunately, I still haven't been able to find one that I like.

Maybe you should write one in D. ;-) For one thing, having a MIME library in D would be awesome.
 Outlook Express has a bunch of problems (no spellcheck, can't send
 UTF, proprietary storage, etc). Windows Mail won't be an option when I
 move to Linux or upgrade back to XP. Claws mail is just generally
 buggy and never does anything in the background (feels almost like it
 might be purely single-threaded). And I'm not a big fan of Opera and
 don't really want to use a web browser as my desktop mail client.

I'm a big Opera fan, because Opera lets me configure stuff to work the way I want it to. But I never use it for mail (I don't like using a browser as an MUA, I think that's just feeping creaturism). And recent releases of Opera are starting to show signs of instability and excessive memory consumption, unlike earlier releases, and I'm starting to wonder if I might want to switch to Firefox...
 I think I might actually try moving to Thunderbird even though I'm
 generally unhappy with Mozilla software/practices, and didn't like it
 last time I tried (for example, it kept trying to
 bold/italic/underline parts of text in my *plaintext* views, and the
 people on the "help" forums just complained that I should shut up and
 like it - which is consistent with what usually happens when I inquire
 about customizing parts of Mozilla's so-called "most customizable
 browser in the world").

... but if it's that unconfigurable, then Opera might just be the lesser of two evils. I have to admit that I've tried using Firefox as my primary browser before, and I didn't like it. It's too IE-like for my tastes. [...]
 I find pretty much all GUI mail apps (both webmail and local MUAs)
 strait-jacketed. Anything that doesn't let you configure mail
 headers is unusable to me, and HTML by default gets on my nerves so
 much it's not even funny.

I never care about mail headers (unless I'm debugging something mail-related, which isn't often), but I *ALWAYS* have HTML disabled. I'll never use a mail client that doesn't let me turn HTML off. Not only do I not want to deal with any tracker-images (or god forbid, JS emails), but in my experience "HTML email" just means it's too easy, and far too tempting, for other people to make the stuff they send me really, really ugly ;) "Just the words, ma'am."

That's why I liked Markdown. :) Give users _basic_, logical markup that also just happens to be readable in plaintext that can be sent verbatim over the wire, and can be optionally written/read in HTML. Email doesn't need HTML, the only really necessary stuff is a bit of logical markup for people who find plaintext "too primitive". HTML is overkill.

Not to mention... it's not just the JS or tracker images, but have you ever been asked to make HTML email newsletters that have to look the same across the board? Ever looked at the standards for HTML emails? Haha, fooled you. There is no standard. Every webmail and their neighbour's open relay have their own conventions for HTML email. Nothing is compatible. You can't rely on CSS because many webmails strip CSS and JS. Google Mail strips embedded style tags. Different webmails strip different things, and have different ways of formatting the same thing (often implemented by invasive mangling of the HTML). The result is that people revert to using table-based formatting and *shudder* font tags *shudder* 'cos that's the only way you can get things to even remotely resemble something sane. It's 1995 all over again, two decades later.

T

-- 
Microsoft is to operating systems & security ... what McDonalds is to gourmet cooking.
Sep 25 2012
prev sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 25 Sep 2012 08:10:07 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 On Mon, Sep 24, 2012 at 09:55:48PM -0400, Nick Sabalausky wrote:
 For my usual mailboxes though, I prefer typical GUI desktop clients.
 Unfortunately, I still haven't been able to find one that I like.

Maybe you should write one in D. ;-)

Heh, I'd love to, and I've even had that in mind for some time now (a few years). Problem is there's a *lot* of things I'd like to do, and all on top of other things I *have* to do ;) My current pet project ATM is a blog^H^H^H^H*article* system using vibe.d and Adam's HTML DOM (the combination of which are making development a *breeze*...in the rare cases I actually get to work on it). It won't be as fully-featured as wordpress or tangocms, but at least it'll do what I want, how I want, and won't go anywhere near PHP ;)
 For one thing, having a MIME library in D would be awesome.
 

Maybe I'm just not awake enough yet but: What exactly would it do? Just be a mapping of "file extension" <--> "mime type"?
 
 I'm a big Opera fan, because Opera lets me configure stuff to work the
 way I want it to. But I never use it for mail (I don't like using a
 browser as an MUA, I think that's just feeping creaturism). And recent
 releases of Opera are starting to show signs of instability and
 excessive memory consumption, unlike earlier releases, and I'm
 starting to wonder if I might want to switch to Firefox...
 

Newer Operas also got rid of the "native-ish" theme, which is why I'm not upgrading past v10. It may seem trivial, but skinned apps *really* bug me. I find the UIs in the FF4-onward to be completely intolerable. Even FF3's UI was god-awful, and then they managed to make it worse with 4 by going all "Chrome-envy".
 ... but if it's that unconfigurable, then Opera might just be the
 lesser of two evils. I have to admit that I've tried using Firefox as
 my primary browser before, and I didn't like it. It's too IE-like for
 my tastes.
 

That was probably a long time ago, as FF is basically a Chrome knock-off now. Then again, so is IE now... Speaking of, I wrote a "not-a-blog" post about these browser issues just a few months back: <http://semitwist.com/articles/article/view/the-perfect-browser-is-easy!-yet-it-still-doesn-t-exist...> (Yea, TangoCMS uses loooong urls.)
 The result is that people revert to using table-based formatting and

Hey, I *like* table-based formatting :). Beats the hell out of trying to kluge together sane layouts/flowing with CSS. And nobody's ever going to convince me that HTML isn't the presentation layer.
Sep 25 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Wed, 19 Sep 2012 10:11:50 -0400
"Steven Schveighoffer" <schveiguy yahoo.com> wrote:

 On Wed, 19 Sep 2012 01:34:12 -0400, Nick Sabalausky  
 <SeeWebsiteToContactMe semitwist.com> wrote:
 
 On Tue, 18 Sep 2012 23:46:35 -0400
 "Steven Schveighoffer" <schveiguy yahoo.com> wrote:
 The keyboard click sound (which you can disable BTW,
 settings->sounds->keyboard clicks) obeys the ringer volume.

Ehh? How unintuitive.

I cannot argue that Apple's audio volume isn't too simplistic for its own good. AIUI, they have two "volumes", one for the ringer, and one for playing audio, games, videos, etc.

There's also a separate one for alarms/alerts: http://www.ipodnn.com/articles/12/01/13/user.unaware.that.alarm.going.off.was.his/

And Jobs-only-knows what else. Apple actually thought that was a good idea.

Plus, my understanding is that one of Apple's explicit design principles is that if a user-prompted action is something that's "expected" to make a sound (by whatever *Apple* decides is "expected", naturally), then to hell with the user's volume setting, it should make a sound anyway.

It's just unbelievably convoluted, over-engineered, and as far from "simple" as could possibly be imagined. Basically, you have "volume up" and "volume down", but there's so much damn modality (something Apple *loves*, but which is almost universally bad for UI design) that they work pretty much randomly.
 I think the main problem is that the volume rules are just far too
 convoluted. They took something trivial and hacked it up beyond
 recognition, and all in the supposed name of "simplicity", go
 figure.

I think if they simply made the volume buttons control the ringer while locked and not playing music, it would solve the problem.

I very much disagree. Then when you take it out to use it, everything will *still* be surprisingly too loud (or quiet). Just not when a call comes in...
 BTW, a cool feature I didn't know for a long time is if you double
 tap the home button, your audio controls appear on the lock screen
 (play/pause, next previous song, and audio volume).  But I think you
 have to unlock to access ringer volume.
 

That's good to know (I didn't know). Unfortunately, it still only eliminates one, maybe two, swipes from an already-complex procedure that, on any sensible device, would have been one step: reach down into the pocket to adjust the volume.
 
 It's more moving parts to break.  I wouldn't like it.  Just my
 opinion.
 

How often has anyone ever had a volume POT go bad? I don't think I've *ever* even had it happen. It's a solid, well-established technology.
 If it were my own personal device, I'd just jailbreak it and be done
 with it. (And then pay the ransom to publish, of course, because
 what else can you do? Create your own device and compete with Apple
 under capitalism? Nope, Google tried that idea of "competition" and
 look what happened:
 <http://www.nytimes.com/2012/08/25/technology/jury-reaches-decision-in-apple-samsung-patent-trial.html?
r=1&ref=technology>  
 )

If you want to develop for only jailbroken phones, you basically alienate most users of iPhone. It's not a viable business model IMO. Yes, it sucks to have to jump through apple's hoops, but having access to millions of users is very much worth it.

No, no, no, I'd jailbreak it for *testing*. Like I said, I'd begrudgingly still pay Apple's ransom for publishing, because what other realistic option is there?
 
 Oh, when you develop apps, it's quite easy to install on the phone,
 you just click "run" from xcode, selecting your device, you don't
 ever have to start itunes (though itunes will auto-start every time
 you plug in the phone, but you can disable this in itunes, more
 annoying is that iPhoto *always* starts, I can't figure out how to
 stop that).  From then on, the app is installed.  The issue is
 setting up all the certificates via xcode and their web portal to get
 that to work (should only have to do this once).  I think the process
 has streamlined a bit, you used to have to create an app id for each
 app and select which devices were authorized to install it.  Now I
 think you get a wildcard app id, but you still have to register each
 device.
 

I don't use a mac, and I never will again. I spent about a year or two with OSX last decade and I'll never go back for *any* reason. Liked it at first, but the more I used it the more I hated it.

Fortunately, I'm developing with Marmalade, so I don't have to have a mac at all (not only that, I don't need to touch any Objective-C, either). Now that I've actually had some sleep ;), I remember that Marmalade's deployment tool can code-sign (assuming you paid the ransom for Apple's dev cert) and install direct to the device, so you're right, I don't need iTunes after all.

Apple still requires a mac to submit to the app store, but luckily my "boss" has a mac, and he's going to be doing the submitting anyway. So I don't even have to touch one of those wretched machines at all.
 I love how my iPhone will never scratch or deteriorate.

Instead, it'll just get prematurely discontinued ;)

3gs (released june 2009) was still being sold last month, and it is getting ios 6 upgrade. I still have mine and develop with it.

That's fairly uncharacteristic for Apple though. And it's still only 3 years, that's not much anyway. Yea, for phones it's *considered* a lot, but that's coming from a world where people *expect* you to go throwing away your "old" expensive devices the moment your lock-in contract is up (after only a year or two) so you can immediately jump back into more lock-in, which is insane.
 
 My wife and I have been very careful with ours, but I do see a lot
 with cracked screens.  Interesting thing is they still seem to work!
 I don't think a cracked/broken screen would ever work with a
 palm-style touch screen.
 

Palm screens were better protected anyway, in various ways. And I never saw a busted one (though I don't doubt they existed). Although I did have a scare on my Palm once, when I noticed the touchscreen and all buttons were unresponsive. After a special trip home from work to get it on the charger (and hopefully sync it), I realized what had happened: Turned out that when I had been playing with the screen protector earlier, I'd managed to wedge the corner in between the screen and the casing, so it was registering that as one loooong tap. That was kinda embarrassing :)
Sep 19 2012
prev sibling next sibling parent reply "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Wed, 19 Sep 2012 17:05:35 -0400, Nick Sabalausky  
<SeeWebsiteToContactMe semitwist.com> wrote:

 On Wed, 19 Sep 2012 10:11:50 -0400
 "Steven Schveighoffer" <schveiguy yahoo.com> wrote:

 I cannot argue that Apple's audio volume isn't too simplistic for its
 own good.  AIUI, they have two "volumes", one for the ringer, and one
 for playing audio, games, videos, etc.

There's also a separate one for alarms/alerts: http://www.ipodnn.com/articles/12/01/13/user.unaware.that.alarm.going.off.was.his/

This makes sense. Why would you ever want your alarm clock to "alarm silently"? How would you wake up? This is another case of someone using the wrong tool for the job (for reminders, use the new reminder feature, or use an appointment with an alert; those obey the silent switch).

And the volume is set by the ringer; it's not a separate volume. It's just that it doesn't obey the silent switch. If it did, I'd be pissed, because I frequently turn my phone to silent at night, but expect the alarm to wake me up.
 Apple actually thought that was a good idea.

Because it is.
 Plus, my understanding is that one of Apple's explicit design principles
 is that if a user-prompted action is something that's "expected" to
 make a sound (by whatever *Apple* decides is "expected", naturally),
 then to hell with the user's volume setting, it should make a sound
 anyway.

I don't know any examples of sounds that disobey the silent switch except for the "find my iPhone" alert, and the alarm clock, both of which would be quite foolish to have make no sounds. Really, when you take the silent switch into account, the sound system works adequately for most people.
 It's just unbelievably convoluted, over-engineered, and as far from
 "simple" as could possibly be imagined. Basically, you have "volume up"
 and "volume down", but there's so much damn modality (something Apple
 *loves*, but which is almost universally bad for UI design) that they
 work pretty much randomly.

I think you exaggerate. Just a bit.
 I think if they simply made the volume buttons control the ringer
 while locked and not playing music, it would solve the problem.

I very much disagree. Then when you take it out to use it, everything will *still* be surprisingly too loud (or quiet). Just not when a call comes in...

The ringer volume affects almost all the incidental sounds, the click for keyboard typing, the lock/unlock sounds, alert sounds, alarm volume, etc. The audio volume affects basically music, video, and game sounds.
 BTW, a cool feature I didn't know for a long time is if you double
 tap the home button, your audio controls appear on the lock screen
 (play/pause, next previous song, and audio volume).  But I think you
 have to unlock to access ringer volume.

That's good to know (I didn't know). Unfortunately, it still only eliminates one, maybe two, swipes from an already-complex procedure, that on any sensible device would have been one step: Reach down into the pocket to adjust the volume.

Well, for music/video, the volume buttons *do* work in locked mode.
 It's more moving parts to break.  I wouldn't like it.  Just my
 opinion.

How often has anyone ever had a volume POT go bad? I don't think I've *ever* even had it happen. It's a solid, well-established technology.

I have had several sound systems where the volume knob started misbehaving, due to corrosion, dust, whatever. You can hear it mostly when you turn the knob, and it has a scratchy sound coming from the speakers.
 If you want to develop for only jailbroken phones, you basically
 alienate most users of iPhone.  It's not a viable business model
 IMO.  Yes, it sucks to have to jump through apple's hoops, but having
 access to millions of users is very much worth it.

No, no, no, I'd jailbreak it for *testing*. Like I said, I'd begrudgingly still pay Apple's ransom for publishing, because what other realistic option is there?

I wouldn't do that if it were me. You might find yourself adding features that aren't allowed or available in non-jailbroken phones, and then go to publish, find out your whole design is not feasible.
 Oh, when you develop apps, it's quite easy to install on the phone,
 you just click "run" from xcode, selecting your device, you don't
 ever have to start itunes (though itunes will auto-start every time
 you plug in the phone, but you can disable this in itunes, more
 annoying is that iPhoto *always* starts, I can't figure out how to
 stop that).  From then on, the app is installed.  The issue is
 setting up all the certificates via xcode and their web portal to get
 that to work (should only have to do this once).  I think the process
 has streamlined a bit, you used to have to create an app id for each
 app and select which devices were authorized to install it.  Now I
 think you get a wildcard app id, but you still have to register each
 device.

I don't use a mac, and I never will again. I spent about a year or two with OSX last decade and I'll never go back for *any* reason. Liked it at first, but the more I used it the more I hated it.

It's a required thing for iOS development :)

I have recently experienced the exact opposite. I love my mac, and I would never go back to Windows. Mac + VMWare fusion for running XP and Linux is fucking awesome.
 Fortunately, I'm developing with Marmalade, so I don't have to even
 have a mac at all (not only that, I don't need to touch any Objective-C,
 either). Now that I've actually had some sleep, ;), I remember now that
 since Marmalade's deployment tool can code-sign (assuming you paid the
 ransom for Apple's dev cert) and install direct to the device, so
 you're right, I don't need iTunes after all.

I recently learned Objective-C, and I'd hate to use it without xcode, which is a fantastic IDE. Obj-C is extremely verbose, so without auto-complete, it would be torturous.
 3gs (released june 2009) was still being sold last month, and it is
 getting ios 6 upgrade.  I still have mine and develop with it.

That's fairly uncharacteristic for Apple though. And it's still only 3 years, that's not much anyway. Yea, for phones it's *considered* a lot, but that's coming from a world where people *expect* you to go throwing away your "old" expensive devices the moment your lock-in contract is up (after only a year or two) so you can immediately jump back into more lock-in, which is insane.

I think that model is here to stay, because they have now gone to a model where the two prior generations are available, the previous at half price, and the one two generations back for free (subsidized of course). Given that the release cycle is about once per year, this also means that the one 3 generations back will be supported for at least a year (it would be shit if you got a 3gs, and then when the 5 comes out 2 months later, it's not supported). This is why iOS 6 is on 3gs, but not on the iPad 1 (which was released more recently, but has not been sold for a while).
 My wife and I have been very careful with ours, but I do see a lot
 with cracked screens.  Interesting thing is they still seem to work!
 I don't think a cracked/broken screen would ever work with a
 palm-style touch screen.

Palm screens were better protected anyway, in various ways. And I never saw a busted one (though I don't doubt they existed).

The *screen* wasn't broken; it's just that the plastic starts deteriorating.

Jobs famously had an early iPhone prototype with a plastic screen and pulled it out at a designer meeting and yelled at them, saying "this fucking thing is in with my keys, it's getting all scratched up! we need something better." That's when they started thinking about using glass screens.

Hate him if you want, but he definitely revolutionized mobile technology.
 Although I did have a scare on my Palm once, when I noticed the
 touchscreen and all buttons were unresponsive. After a special
 trip home from work to get it on the charger (and hopefully sync it), I
 realized what had happened: Turned out that when I had been playing
 with the screen protector earlier, I'd managed to wedge the corner in
 between the screen and the casing, so it was registering that as one
 loooong tap. That was kinda embarrassing :)

hehe :) My kids often say the iPad isn't working, and then I have to point out they are holding it with their thumb on the screen. At least those problems are easy to fix :) -Steve
Sep 20 2012
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Sep 25, 2012 at 05:36:48PM -0400, Nick Sabalausky wrote:
 On Tue, 25 Sep 2012 08:10:07 -0700
 "H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 On Mon, Sep 24, 2012 at 09:55:48PM -0400, Nick Sabalausky wrote:
 For my usual mailboxes though, I prefer typical GUI desktop
 clients.  Unfortunately, I still haven't been able to find one
 that I like.

Maybe you should write one in D. ;-)

Heh, I'd love to, and I've even had that in mind for some time now (a few years). Problem is there's a *lot* of things I'd like to do, and all on top of other things I *have* to do ;)

Ah yes. I have that problem too. Too many pet projects, too little time. [...]
 For one thing, having a MIME library in D would be awesome.
 

Maybe I'm just not awake enough yet but: What exactly would it do? Just be a mapping of "file extension" <--> "mime type"?

I must've been half-asleep when I wrote that. I meant a mail-handling library that can handle MIME attachments.
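For a rough sense of the surface area such a library would have to cover, here's a minimal sketch of the build/serialize/parse round trip, using Python's stdlib `email` package as a stand-in (any D equivalent here is hypothetical):

```python
# Sketch of what a MIME mail-handling library must cover, using
# Python's stdlib "email" package as a stand-in.
from email import policy
from email.message import EmailMessage
from email.parser import BytesParser

# Build a message: headers, a text body, and a binary attachment.
msg = EmailMessage()
msg["Subject"] = "report"
msg["From"] = "sender@example.com"
msg["To"] = "recipient@example.com"
msg.set_content("See attached.")
msg.add_attachment(b"\x00\x01\x02", maintype="application",
                   subtype="octet-stream", filename="data.bin")

# Serialize to RFC 5322 wire format, then parse it back.
wire = msg.as_bytes()
parsed = BytesParser(policy=policy.default).parsebytes(wire)

names = [part.get_filename() for part in parsed.iter_attachments()]
print(parsed["Subject"], names)
```

Headers, multipart boundaries, transfer encodings, and attachment iteration are the bulk of the work; a D library would need the same pieces.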
 I'm a big Opera fan, because Opera lets me configure stuff to work
 the way I want it to. But I never use it for mail (I don't like
 using a browser as an MUA, I think that's just feeping creaturism).
 And recent releases of Opera are starting to show signs of
 instability and excessive memory consumption, unlike earlier
 releases, and I'm starting to wonder if I might want to switch to
 Firefox...
 

Newer Operas also got rid of the "native-ish" theme, which is why I'm not upgrading past v10. It may seem trivial, but skinned apps *really* bug me.

Skinned apps don't bug me at all. I tend to like apps where you can delete useless buttons off the UI and turn off toolbars and stuff you never use. As well as configure custom keyboard bindings ('cos I hate having to use the mouse unless it's needed for an *inherently* graphical task, like picking out pixels).
 I find the UIs from FF4 onward to be completely intolerable. Even
 FF3's UI was god-awful, and then they managed to make it worse with 4
 by going all "Chrome-envy".

What I'd _really_ like is a browser *library*, where you get to assemble your own browser from premade parts. Like replace the lousy UI front end with a custom interface.

Applications nowadays suffer from excessive unnecessary integration. Software should be made reusable, dammit. And I don't mean just code reuse on the level of functions. I mean entire software systems that are pluggable and inter-connectible. If there's a browser that has a good back-end renderer but a lousy UI, it should be possible to rip out the UI part and substitute the UI of another browser that has a better UI but a lousy back-end. And if there's a browser that comes with unnecessary bloat like a mail app, it should be possible to outright _delete_ the mail component off the HD and have just the browser part working.

Software these days is just so monolithic and clumsy. We need a new paradigm. [...]
 The result is that people revert to using table-based formatting and

Hey, I *like* table-based formatting :). Beats the hell out of trying to kluge together sane layouts/flowing with CSS. And nobody's ever going to convince me that HTML isn't the presentation layer.

I say trash it all, tables, HTML, everything. Markdown is good enough for email. If you need more than that, go buy a real website and post it there instead of transmitting that crap over SMTP.

T

-- 
This is a tpyo.
Sep 25 2012
prev sibling next sibling parent reply "Adam D. Ruppe" <destructionator gmail.com> writes:
On Tuesday, 25 September 2012 at 22:49:36 UTC, H. S. Teoh wrote:
 I must've been half-asleep when I wrote that. I meant a 
 mail-handling library that can handle MIME attachments.

I did a simple one a while ago. Just pushed it to github:

https://github.com/adamdruppe/misc-stuff-including-D-programming-language-web-stuff/blob/fd7bfd5c250901a8c546b502772f18fe019ed7e9/email.d

Nothing really special, but it works for what I've tried with it so far. I need to do a parser soon too, to replace my current indexOf hacks in my mail-reading apps. Eh, I'll get around to it eventually.
Sep 25 2012
next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 25 Sep 2012 18:05:01 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 On Tue, Sep 25, 2012 at 08:19:00PM -0400, Nick Sabalausky wrote:
 
 A "web browser control" is pretty common, AIUI. I know IE and WebKit
 can be used as controls that you just plop into a window. Then you
 have to add in all the bells and whistles like address bar,
 bookmarking, etc., which all still adds up to a lot of extra work,
 though.

No no, that's still part of the monolithic system. I find that kinda silly, actually. It's a system that lets the user do illogical things like put the "forward" button to the left of the "back" button, followed by the "next" button with "rewind" interspersed between them. It gives you the illusion of control, but hides the fact that you still can't do things like rip out the entire dang UI and stick a totally new one in its place.

Hmm, one of us is still not understanding the other; maybe both.

With what I'm talking about, all of the back/forward/etc. buttons and everything are completely gone. Think Scintilla, but for HTML/CSS instead of text. Ie, imagine you make a trivial HTML page that's nothing but a purple background and no content. Now load it in a web browser. The purple part (and maybe the scroll bars?) is the *only* part that's included. Anything else (forward, back, addr bar, etc.) must be *created* by you, the developer, and made to call whatever API the control exposes to do such things.

Or at least, that's my understanding anyway. Perhaps I'm mistaken. The result does admittedly end up being another monolithic system, though, just with the same layout engine.
 
 But on a more serious note, *all* programs should be written as though
 they're intended for a library. The frontend, be it main() or whatever
 the toolkit substitute for it is, should just be wrappers that call
 the library functions. The key idea behind this is automation and
 scripting, which is something sorely lacking in GUI-centric
 applications. To me, it is stupid that just because a program that
 solves problem P with algorithm X comes with a GUI, you're stuck with
 having to use the badly-designed GUI instead of just plugging into
 algorithm X directly. The whole point of the program is to solve
 problem P, so X should be in a library that you can call directly
 from an external program without having to jump through GUI hoops
 just to get at it.
 

Totally agree. In fact, I specifically tried to make sure, as much as I could, that all the tools in my Goldie <http://semitwist.com/goldie> system were basically just thin front-ends over a D API. (Except for the JsonViewer thing because I didn't write that, I just hacked it up with some extra features.) You can even check all the 'main.d' files in the src repo <https://bitbucket.org/Abscissa/goldie/src/master/src>, they're mostly just thin API wrappers. I generally try to do that with all my tools. Definitely the way to go.
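The pattern in question (tools as thin front-ends over a library) fits in a few lines; here's a generic illustration in Python, with made-up names, not Goldie's actual code:

```python
# Sketch of the "library first, thin frontend" pattern: the real work
# lives in an importable function; main() only handles I/O. All names
# here are illustrative.
import sys

def word_count(text: str) -> int:
    # The actual algorithm: callable from any other program,
    # no I/O, no UI assumptions.
    return len(text.split())

def main(argv: list[str]) -> int:
    # Thin CLI wrapper: argument handling and printing only.
    print(word_count(" ".join(argv[1:])))
    return 0

if __name__ == "__main__":
    # Demo invocation with a fixed argument vector.
    main(["prog", "hello", "world"])
```

Any other program can import `word_count` directly and skip the frontend entirely, which is the whole point.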
 I mean, this is just basic computer science. It's function
 composition. Something that most software of today seems to have
 forgotten.
 

Good way of putting it.
 
 [...]
 The result is that people revert to using table-based
 formatting and

Hey, I *like* table-based formatting :). Beats the hell out of trying to kluge together sane layouts/flowing with CSS. And nobody's ever going to convince me that HTML isn't the presentation layer.

I say trash it all, tables, HTML, everything. Markdown is good enough for email. If you need more than that, go buy a real website and post it there instead of transmitting that crap over SMTP.

Well, I just meant on the web, not email. Death to HTML emails!

LOL... If I had my way, the web would be formatted with LaTeX (or equivalent) instead of that crappy HTML+CSS+JS which makes it so easy for clueless people to create webpages that make your eyes bleed.

I've been meaning to look into latex.
 HTML was never intended to be used for the kinds of stuff people use
 it for these days,

Absolutely. +1
 and while CSS has some nice points, it's also
 insufficient to express some basic page layout concepts, which
 necessitates hacks to make things appear the way you want it to. Yes
 the hacks are clever, but a complete and consistent layout language
 shouldn't need hacks to express basic layout concepts. Like an
 explicit horizontal centering for containers that doesn't need half a
 dozen different shenanigans with auto margins and padding and
 text-align and what-not to accomplish. Or basic container alignment
 between two elements that aren't immediate siblings. Or a true fluid
 layout that doesn't require hard-coding to specific browser
 resolutions (like seriously... you'd have thought CSS should've
 obsoleted that, but no, it's still happening). Ad nauseum.
 

Yup. I do like HTML/CSS for documents: it could be less verbose, it could stand to not suck at diagrams and math formulas, and it has this linking problem <http://semitwist.com/articles/article/view/html-fragment-linking-is-stupid-here-s-the-fix>, but it generally gets the job done reasonably well...for documents...ie, what it was *made* for.

But it's *not*, by any means, a sane UI/general-presentation description format (which is what it's increasingly being used as), nor was it ever designed to be. A modern web browser is no different from having "applications" run inside Acrobat Viewer or MS Word using PDF or DOC for i/o. It's literally the same thing, just using a different format (but, thankfully, with no page breaks or multi-column text, neither of which make sense in electronic form).
Sep 25 2012
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Sep 25, 2012 at 10:29:54PM -0400, Nick Sabalausky wrote:
 On Tue, 25 Sep 2012 18:05:01 -0700
 "H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 On Tue, Sep 25, 2012 at 08:19:00PM -0400, Nick Sabalausky wrote:
 
 A "web browser control" is pretty common, AIUI. I know IE and
 WebKit can be used as controls that you just plop into a window.
 Then you have to add in all the bells and whistles like address
 bar, bookmarking, etc., which all still adds up to a lot of extra
 work, though.

No no, that's still part of the monolithic system. I find that kinda silly, actually. It's a system that lets the user do illogical things like put the "forward" button to the left of the "back" button followed by the "next" button with "rewind" interspersed between them. It gives you the illusion of control, but hides the fact that you still can't do things like rip out the entire dang UI and stick a totally new one in its place.

Hmm, one of us is still not understanding the other, maybe both.

OK it's probably my fault. I'm acquiring this bad habit recently of responding before I read, and writing before I think. My bad.
 With what I'm talking about, all of the back/forward/etc buttons and
 everything are completely gone. Think Scintilla, but for HTML/CSS
 instead of text.
 
 Ie, imagine you make a trivial HTML page that's nothing but a purple
 background and no content. Now load it in a web browser. The purple
 part (and maybe the scroll bars?) is the *only* part that's included.
 Anything else, forward, back, addr bar, etc., must be *created* by
 you, the developer and made to call whatever API the control exposes
 to do such things.
 
 Or at least, that's my understanding anyway. Perhaps I'm mistaken.
 
 The result does admittedly end up being another monolithic system
 though, just with the same layout engine.

Wait, so you're using HTML to make a browser UI? Or was that just an illustration?
 But on a more serious note, *all* programs should be written as
 though they're intended for a library. The frontend, be it main() or
 whatever the toolkit substitute for it is, should just be wrappers
 that call the library functions. The key idea behind this is
 automation and scripting, which is something sorely lacking in
 GUI-centric applications. To me, it is stupid that just because a
 program that solves problem P with algorithm X comes with a GUI,
 you're stuck with having to use the badly-designed GUI instead of
 just plugging into algorithm X directly. The whole point of the
 program is to solve problem P, so X should be in a library that you
 can call directly from an external program without having to jump
 through GUI hoops just to get at it.
 

Totally agree. In fact, I specifically tried to make sure, as much as I could, that all the tools in my Goldie <http://semitwist.com/goldie> system were basically just thin front-ends over a D API. (Except for the JsonViewer thing because I didn't write that, I just hacked it up with some extra features.) You can even check all the 'main.d' files in the src repo <https://bitbucket.org/Abscissa/goldie/src/master/src>, they're mostly just thin API wrappers. I generally try to do that with all my tools. Definitely the way to go.

You're doing better than I am. :-P I whine and groan about it, but then very often my own programs are monolithic monsters. Hopefully once I start having more D projects replace my C/C++ ones, I'll improve in this area. D does make it a lot easier to design software in this way, which is one of the reasons I like it so much in spite of the current implementation flaws. [...]
 LOL... If I had my way, the web would be formatted with LaTeX (or
 equivalent) instead of that crappy HTML+CSS+JS which makes it so
 easy for clueless people to create webpages that make your eyes
 bleed.
 

I've been meaning to look into latex.

And here's a quote for you:

   Those who've learned LaTeX swear by it. Those who are learning LaTeX
   swear *at* it. -- Pete Bleackley

My own take on it is this: LaTeX itself is quite old, and its age is starting to show. There are many things about its implementation details that I don't quite like. Lack of native UTF support is one major flaw (though there are imperfect workarounds). Also some holdovers from the bad old days of trying to squeeze out every last byte from something, causing a zoo of command names that you pretty much have to memorize.

However. The *concepts* that it is based on are rock solid, and its layout engine beats any WYSIAYG app hands down. You cannot beat LaTeX at paragraph layout. It is optimized to prevent "running rivers" of spaces when adjacent lines in a paragraph have inter-word spaces in the wrong places -- the kind of stuff that browsers spew out every day until hardly anybody notices how awful it looks anymore.

And math!! Once you've tasted the power of math layout in LaTeX, you'll want to dash every other math layout format (esp. web-based ones) against the rocks. Presently no combination of browser hacks + MathML + whatever other feeble attempt at math notation, etc., can beat LaTeX at math formatting. You can write insanely complex math expressions *and* have it all come out like it was designed by a professional math typographer. *And* you can do this with just a relatively simple plaintext format that doesn't require clicking through 150 nested menus and hand-picking symbols from a 20-page character map.

You can embed math in your text or have it displayed separately, and in each instance it comes out appropriately formatted. You have symbols that automatically scale (like matrix parentheses that don't require stupid manual and ugly resizing, tall brackets, hats and underscores that stretch over multiple symbols, etc.). It's math notation heaven.
I mean, once I (ab)used math mode's amazing diacritic-placing ability to invent a writing system for an artificial language that involves multiple stacked accents over and under letters, and they always come out right. Try that on a modern-day browser with UTF combining diacritics and your eyes will bleed by the time you get to two accents on a single letter, possibly before that. My LaTeX macros could handle up to 5 diacritics above and below a single letter (*and* with some accents side-by-side while being stacked with others) without losing their composure. I mean, I could've formatted Vietnamese diacritics with math mode alone, if there hadn't already been a Vietnamese port of LaTeX; it's that powerful.

You've no idea how much I wish for the day when the web could even come up to 50% of the formatting power that LaTeX provides. My eyes would bleed so much less when I'm online. [...]
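For a small taste of that plaintext format, here is a minimal, self-contained sketch (standard amsmath; the formulas are arbitrary examples, not from the thread):

```latex
% Auto-scaling delimiters and inline-vs-display formatting from
% plain text (minimal sketch; requires amsmath).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Inline math like $\sum_{k=1}^{n} x_k^2$ stays compact; displayed math
spreads out:
\[
  \widehat{AB} \le \left( \sum_{k=1}^{n} \frac{x_k^2}{1 + e^{-x_k}} \right)^{1/2},
  \qquad
  \begin{pmatrix} a & b \\ c & d \end{pmatrix}^{-1}
  = \frac{1}{ad - bc}
    \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}.
\]
\end{document}
```

The \left(...\right) pair and the pmatrix delimiters scale to their contents automatically, which is the "no manual resizing" point being made above.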
 I do like HTML/CSS for documents, could be less verbose, could use to
 not suck at diagrams and math formulas,

Once you've tasted the power of math layout in LaTeX, you wouldn't want to go near HTML math formulas with a 20-foot sterilized asbestos-lined titanium pole. It's *that* bad by comparison.
 and it has this linking problem
 <http://semitwist.com/articles/article/view/html-fragment-linking-is-stupid-here-s-the-fix>,
 but it generally gets the job done reasonably well...for
 documents...ie, what it was *made* for.

I'm all for better ways of linking than the klunky baroque name= or id= dance, but one problem with overly-specific linking is, if the target page gets updated and stuff moves around, your link breaks. That's a bit too fragile for my tastes. (Though of course, one could argue that the same will happen if the id= gets deleted by the update, but at least if whatever it is you're linking to still exists, chances are the id= will stay, but unmarked elements can come and go at any time.)
 But it's *not*, by any means, a sane UI/general-presentation
 description format (as it's increasingly being used as), nor was it
 ever designed to be.

What never made any sense to me was the use of HTML for what amounts to a GUI (*cough*form elements*cough*JS/CSS popup dialogs*cough*). The markup for these monstrosities makes no sense at all, for the simple reason that these are *user interface widgets*, not document fragments!!!

So you end up with nonsense like the baroque dance of (ab)using id's and what-not to link disparate elements of a radio button set/select/whatever, <select>'s that have to be modified in real-time by JS mostly by rewriting HTML with string fragments (unless you're crazy enough to actually use native DOM operations for creating individual elements, in which case you're probably beyond help), to mimic what a *real* GUI does with a few lines of code that reads much more sensibly than any JS + HTML string fragments nonsense ever will.

Then you have this nonsensical way of simulating popup dialogs by appending <div>s to the end of the "document" and (ab)using CSS to make it appear like a "real" dialog. Which is all fine and dandy until you start scrolling the document, then all of a sudden a random subset of the form elements in the dialog scrolls off the fake dialogue window while the rest stay on screen -- because the developer forgot to apply certain CSS attributes to said form elements so their positioning got out of sync. (I've actually witnessed this first hand. This is no exaggeration. You can argue this is just a careless bug, but my point is, how did any of this convolution even make any sense in the first place? This is a *document* format, not a windowing toolkit, for crying out loud.)

And using a (semi)transparent screen-filling <div> to simulate modal dialogs? Yeah it's clever. It also shows how stupid the whole enterprise is. That's not a "document" element at all. That's a windowing *system* that got transplanted into a *document* format. It's like implementing a flight simulator in an MS Word document. Very clever if you could pull it off, but also equally ludicrous.
 A modern web browser is no different from having "applications" run
 inside Acrobat Viewer or MS Word using PDF or DOC for i/o. It's
 literally the same thing, just using a different format (but,
 thankfully, with no page breaks or multi-column text, neither of which
 make sense in electronic form).

Yeah the biggest beef I have with PDF (and LaTeX, for that matter) is that you can't break out of the page paradigm. It's just a holdover from the dead tree print days, which is quickly becoming obsolete in spite of objections from aging book lovers. It doesn't make sense for electronic documents. This is one thing where HTML is actually better than other generally-more-superior formats.

I'm on the fence about multi-column though. I find that an unfortunately large percentage of websites out there are overly wide, with text that stretches way past your eye's natural comfortable reading width, making reading said text a very tiring experience. OTOH if you just clip the width to a more natural width you have the problem that most screens these days are too lacking in the height department^W^W^W^W^W^W overly endowed in the width department, so you end up with large swaths of empty spaces on either side, which looks awful. Standard support for multi-columns would be a big help here. (Preferably one that's decided by the *browser* instead of hard-coded by some silly HTML hack with tables or what-not.) I think CSS3 has (some) support for this.

T

-- 
If it breaks, you get to keep both pieces. -- Software disclaimer notice
Sep 25 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 25 Sep 2012 23:00:41 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 On Tue, Sep 25, 2012 at 10:29:54PM -0400, Nick Sabalausky wrote:
 With what I'm talking about, all of the back/forward/etc buttons and
 everything are completely gone. Think Scintilla, but for HTML/CSS
 instead of text.
 
 Ie, imagine you make a trivial HTML page that's nothing but a purple
 background and no content. Now load it in a web browser. The purple
 part (and maybe the scroll bars?) is the *only* part that's
 included. Anything else, forward, back, addr bar, etc., must be
 *created* by you, the developer and made to call whatever API the
 control exposes to do such things.
 
 Or at least, that's my understanding anyway. Perhaps I'm mistaken.
 
 The result does admittedly end up being another monolithic system
 though, just with the same layout engine.

Wait, so you're using HTML to make a browser UI?

No...
 Or was that just an illustration?
 

Let me try putting it a different way (Sorry if it sounds patronizing, I'm not trying to be):

Most GUIs are made of common re-usable widgets, right? The "button" widget, the "checkbox" or "radio box" widgets, the "menu bar" widget, the "text box" widget, "image", "list", "grid", "treeview", etc. So then you make a GUI by plopping those widgets into a window, adjusting their exposed properties, and providing actions for stuff like "onClick". Basic stuff, right?

So if I take a text-edit control and plop it onto a window, I haven't recreated the entire GUI and functionality of Windows Notepad or Kate or Gedit or Eclipse or whatever. It's just a text box. No menu bar, no toolbar, no "save button", no "currently opened files list", no status indicators, no nothing. Just a box you can type into. You have to add those widgets in, and add code to make them cause the right things to happen to the text box widget (or as a result of the text box widget).

But then there's fancier widgets, too. Like WebKit (used by Safari, Arora, Chrome IIRC, and likely others), or a similar one from IE, and maybe FF too, I dunno. These are "HTML/CSS Viewer" widgets: You give them HTML/CSS and they show it and, I assume, give you an API for accessing the DOM somehow. I imagine you can probably give them a URL too. They may or may not do other stuff too, like offer an API for "goToPreviousPage()". I'm sure it all varies from one to another, and I honestly don't know how far any of them go because I haven't used them personally. Although I *think* with WebKit it's up to you to wire up a JS engine yourself (just as an example). But what they *don't* do, AIUI, is provide any GUI *at all* for anything besides the final rendered HTML/CSS. (Although I do have a vague recollection that there *is* an IE widget you can use that *does* give you all of that.)

So... ...suppose you made a GUI program, gave it a window, and filled that window with one of those HTML/CSS/Web widgets. It'll show *nothing*. 
Blank page. So in code, you tell it to go here (Go ahead and visit this link for real right now): http://semitwist.com/blank-purple.html (Or maybe your code just downloads that URL and hands the raw HTML to the control. Or something.)

Then...your program will be *nothing* but a big purple rectangle, maybe with scrollbars, and whatever window-frame is added by your system's window manager. That's it. The user won't even be able to type in a URL or go forwards/back without you putting in code and widgets (or keyboard accelerators, or whatever) to let them do that. The widget may or may not have a simple "browserView.goBack();" that you can call, I don't know, but even if it does then YOUR code has to actually bother to call it.

Interestingly, there are add-ons for Firefox that let you use the IE renderer instead of FF's renderer *in* one or more of your FF tabs. There's even some obscure browser out there that lets you choose between IE, FF and some other (WebKit?) for each of its own tabs. Might be called Trident or something like that, IIRC?

At least...that's my understanding of it all, having never actually dealt with browser code myself. I apologize if I happen to be totally wrong ;)
 
 You're doing better than I am. :-P  I whine and groan about it, but
 then very often my own programs are monolithic monsters. Hopefully
 once I start having more D projects replace my C/C++ ones, I'll
 improve in this area. D does make it a lot easier to design software
 in this way, which is one of the reasons I like it so much in spite
 of the current implementation flaws.
 

D is seriously so awesome. I had to go back to C++ for the iOS/Android game I'm doing, and man do I miss D <http://semitwist.com/articles/article/view/top-d-features-i-miss-in-c>. I miss D whenever I do maintenance on my Haxe-based webapp (one of my real-world projects). I miss D like crazy anytime I have to use any language other than D. It's so totally spoiled me :)
 
 And here's a quote for you:
 
 	Those who've learned LaTeX swear by it. Those who are learning
 	LaTeX swear *at* it. -- Pete Bleackley
 

I think I heard that once before. I do like it :)
 My own take on it is this: LaTeX itself is quite old, and its age is
 starting to show. There are many things about its implementation
 details that I don't quite like. Lack of native UTF support is one
 major flaw (though there are imperfect workarounds).  Also some
 holdovers from the bad old days of trying to squeeze out every last
 byte from something, causing a zoo of command names that you pretty
 much have to memorize.

Might be fun to make a front-end for it. Something that spits out raw latex given a modernized equivalent.
 [...LaTeX rox stuff snipped...],

Sounds cool.
 [...]
 I do like HTML/CSS for documents, could be less verbose, could use
 to not suck at diagrams and math formulas,

Once you've tasted the power of math layout in LaTeX, you wouldn't want to go near HTML math formulas with a 20-foot sterilized asbestos-lined titanium pole. It's *that* bad by comparison.

Even now I wouldn't even bother. I'd just use an image, or if not that, then maybe try pre-baked MathML output. But you have convinced me to get around to trying latex when I get a chance.
 
 and it has this linking problem
 <http://semitwist.com/articles/article/view/html-fragment-linking-is-stupid-here-s-the-fix>,
 but it generally gets the job done reasonably well...for
 documents...ie, what it was *made* for.

I'm all for better ways of linking than the klunky baroque name= or id= dance, but one problem with overly-specific linking is, if the target page gets updated and stuff moves around, your link breaks. That's a bit too fragile for my tastes. (Though of course, one could argue that the same will happen if the id= gets deleted by the update, but at least if whatever it is you're linking to still exists, chances are the id= will stay, but unmarked elements can come and go at any time.)

Yea, but at least it's something. And the "alternatives" feature helps, too. What might be a good complement to that would be a text-search fragment: go to the first instance of some short phrase. Again, not perfect, but beats the hell out of "No 'id=' where you need one? You're SOL!" But in any case, the whole damn web is completely fluid anyway, so even without the "id=" fragments that we *do* have, there's always still links breaking. At least a broken fragment still goes to the right page, instead of a 404 or a "server not found" error.
 
 But it's *not*, by any means, a sane UI/general-presentation
 description format (as it's increasingly being used as), nor was it
 ever designed to be.

What never made any sense to me was the use of HTML for what amounts to a GUI (*cough*form elements*cough*JS/CSS popup dialogs*cough*).

*Exactly*

The "JS/CSS popup dialogs" (I call them "pop-ins") are probably the #1 thing that irritates me most on the web. Everything about it is wrong. Not just the technical things you describe, but the whole user experience even when it's *not* buggy. I mean, here we have a *popup* that *can't* be killed by popup blockers, *and* makes the page underneath inaccessible! *And* it breaks the "back" button! Plus, on top of all that, it's completely unnecessary 100% of the time and does *nothing* to improve the user experience.

I had a *cough*fun experience with them recently, too:

I wanted to check the availability of some item at my local library system, but their site insists on showing the availability info in one of those "pop-ins". Ok, annoying normally, but I was out of the house so I was doing this on the iPhone (which took forever due to the barely-usable text-entry on the thing). So I finally get to what I want, get to the "item availability" pop-in, and it's too big to fit on the screen. Ok, to be expected, it *is* a phone. So I try to scroll...and the pop-in *stays in place* as I scroll around the faded-out page underneath. So I can't scroll the pop-in. So I try to zoom out. Oh, it zooms out ok, but the part that was offscreen *stays* offscreen, so that doesn't help either. Go landscape - that just makes it worse because everything scales up to keep the page width the same, so I just lose vertical real estate.

Funny thing is, it works fine (ie without using pop-ins) when JS is off. But I can't turn JS off in iPhone Safari. So I don't know if it's an HTML/CSS/JS problem, a site problem, an iPhone problem, or a combination of all, and honestly I don't even care - it blows, and that's all that matters.
 [...]
 And using a (semi)transparent screen-filling <div> to simulate modal
 dialogs? Yeah it's clever. It also shows how stupid the whole
 enterprise is. That's not a "document" element at all. That's a
 windowing *system* that got transplanted into a *document* format.
 It's like implementing a flight simulator in an MS Word document.
 Very clever if you could pull it off, but also equally ludicrous.
 

Exactly.
 
 I'm on the fence about multi-column though. I find that an
 unfortunately large percentage of websites out there are overly wide,
 with text that stretches way past your eye's natural comfortable
 reading width, making reading said text a very tiring experience.
 OTOH if you just clip the width to a more natural width you have the
 problem that most screens these days are too lacking in the height
 department^W^W^W^W^W^W overly endowed in the width department, so you
 end up with large swaths of empty spaces on either side, which looks
 awful.  Standard support for multi-columns would be a big help here.
 (Preferably one that's decided by the *browser* instead of hard-coded
 by some silly HTML hack with tables or what-not.) I think CSS3 has
 (some) support for this.
 

The problem with that is you're creating excess vertical scrolling. Just to read linearly it's "scroll down, scroll up, scroll down", etc. (Of course, that pain is hugely compounded when the multi-columns are on page-based PDFs, like academic research papers.)

The root problem there is that the need for multi-column on the web is artificially created by manufacturers and consumers who have collectively decided that watching movies is by far the #1 most important thing for anyone to ever be doing on a computer. Hence, "decapitated fat midget" 16:9 screens for everyone! No matter how bad it is for...just about everything *but* movies and certain games. Which, I suspect, is also the main reason we can't have browsers anymore with nice traditional UIs - because they have to be shoe-horned into a movie-oriented half-screen.

Thank god they didn't standardize it all for those 1.85:1 films. Yet.
 If it breaks, you get to keep both pieces. -- Software disclaimer
 notice

Heh, that's awesome :)
Sep 26 2012
prev sibling next sibling parent "Adam D. Ruppe" <destructionator gmail.com> writes:
On Wednesday, 26 September 2012 at 02:29:09 UTC, Nick Sabalausky 
wrote:
 Ie, imagine you make a trivial HTML page that's nothing but a 
 purple background and no content. Now load it in a web browser.
 The purple part (and maybe the scroll bars?) is the *only* part
 that's included.

I think that's how most of the web widgets work. Though a while ago, I was toying with doing a little browser widget in D, the idea being the window is nothing but an html thingy and everything is done by code... including stuff like link behavior, by doing event handlers in your D. But I never really got too much done with it.
Sep 26 2012
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Sep 26, 2012 at 05:19:16AM -0400, Nick Sabalausky wrote:
[...]
 Most GUIs are made of common re-usable widgets, right? The "button"
 widget, the "checkbox" or "radio box" widgets, the "menu bar" widget,
 the "text box" widget, "image", "list", "grid", "treeview", etc. So
 then you make a GUI by plopping those widgets into a window, adjusting
 their exposed properties, and providing actions for stuff like
 "onClick". Basic stuff, right?
 
 So if I take a text-edit control and plop it onto a window, I haven't
 recreated the entire GUI and functionality of Windows Notepad or Kate
 or Gedit or Eclipse or whatever. It's just a text box. No menu bar, no
 toolbar, no "save button", no "currently opened files list", no status
 indicators, no nothing. Just a box you can type into. You have to add
 those widgets in, and add code to make them cause the right things to
 happen to the text box widget (or as a result of the text box widget).

Ah I see. That sounds a bit closer to what I have in mind. [...]
 You're doing better than I am. :-P  I whine and groan about it, but
 then very often my own programs are monolithic monsters. Hopefully
 once I start having more D projects replace my C/C++ ones, I'll
 improve in this area. D does make it a lot easier to design software
 in this way, which is one of the reasons I like it so much in spite
 of the current implementation flaws.
 

D is seriously so awesome. I had to go back to C++ for the iOS/Android game I'm doing, and man do I miss D <http://semitwist.com/articles/article/view/top-d-features-i-miss-in-c>.

I have one objection to your list though: although _for the most part_ AA's can work with any kind of key, there are a lot of bugs in that area. The language itself, of course, in theory supports any kind of key, but the current implementation is honestly a mess. I've tried fixing things but one thing leads to another and nothing short of a total overhaul will completely address all of the problems.

But still, for most common cases, D's built-in AA's are a big factor in convincing me to switch to D. In this day and age, a language that doesn't have AA's of some sort (that doesn't require 3rd party libraries) is simply unforgivable. (*ahem**cough*C++ prior to C++11*cough* Can you believe that prior to C++11, true AA's weren't even a part of the standard library? You had to use non-standard vendor-dependent hashes, or settle for tree-based substitutes, which are NOT the same thing. What kind of idiocy is that?!)
 I miss D whenever I do maintenance on my Haxe-based webapp (one of my
 real-world projects). I miss D like crazy anytime I have to use any
 language other than D. It's so totally spoiled me :)

Haha yeah. In spite of all the current implementation flaws, D is still the best language out there. [...]
 My own take on it is this: LaTeX itself is quite old, and its age is
 starting to show. There are many things about its implementation
 details that I don't quite like. Lack of native UTF support is one
 major flaw (though there are imperfect workarounds).  Also some
 holdovers from the bad old days of trying to squeeze out every last
 byte from something, causing a zoo of command names that you pretty
 much have to memorize.

Might be fun to make a front-end for it. Something that spits out raw latex given a modernized equivalent.

Well, there's a GUI front-end for it (LyX), I don't know if that handles things like native UTF support. I was thinking more of a 21st century rewrite of LaTeX that has modern support like native UTF, revamped syntax to replace anachronisms, etc., but adhering to the original design principles. Sorta like what Knuth & Lamport would've come up with, if they had developed TeX/LaTeX in 2012. [...]
 I do like HTML/CSS for documents, could be less verbose, could use
 to not suck at diagrams and math formulas,

Once you've tasted the power of math layout in LaTeX, you wouldn't want to go near HTML math formulas with a 20-foot sterilized asbestos-lined titanium pole. It's *that* bad by comparison.

Even now I wouldn't even bother. I'd just use an image, or if not that, then maybe try pre-baked MathML output. But you have convinced me to get around to trying latex when I get a chance.

Sites like Wikipedia use LaTeX to generate math formula images by passing embedded <math> tags through LaTeX for formatting. :) Seriously, that's what makes math even remotely tolerable to write in Wikipedia. It's imperfect, though, 'cos the baseline of the formatted text in the image often doesn't line up with the baseline of the surrounding HTML text. And font sizes don't always match up. But it's better than the horror of attempting to write math in HTML. [...]
 What never made any sense to me was the use of HTML for what amounts
 to a GUI (*cough*form elements*cough*JS/CSS popup dialogs*cough*).

*Exactly*

The "JS/CSS popup dialogs" (I call them "pop-ins") are probably the #1 thing that irritates me most on the web. Everything about it is wrong. Not just the technical things you describe, but the whole user experience even when it's *not* buggy. I mean, here we have a *popup* that *can't* be killed by popup blockers, *and* makes the page underneath inaccessible! *And* it breaks the "back" button! Plus, on top of all that, it's completely unnecessary 100% of the time and does *nothing* to improve the user experience.

This is one of the things I like about Opera: I can switch to author mode which I configured to override all CSS with my own. When I hit a site with a CSS popup, either I just close it outright, or if I care enough about the content, I'll either turn off javascript (which is usually the culprit behind the CSS popup) or switch to author mode and read it in what's essentially a poor man's version of plaintext. This also works very well with sites with horrific choices of background/foreground colors (like red on grey or yellow on neon green) that make your eyes bleed, or microscopic font sizes, or b0rken styles that assume specific font/screen pixel sizes that breaks on every system except the author's.
 I had a *cough*fun experience with them recently, too:
 
 I wanted to check the availability of some item at my local library
 system, but their site insists on showing the availability info in one
 of those "pop-ins". Ok, annoying normally, but I was out of the house
 so I was doing this on the iPhone (which took forever due to the
 barely-usable text-entry on the thing). So I finally get to what I
 want, get to the "item availability" pop-in, and it's too big to fit
 on the screen. Ok, to be expected, it *is* a phone. So I try to
 scroll...and the pop-in *stays in place* as I scroll around the
 faded-out page underneath. So I can't scroll the pop-in. So I try to
 zoom out. Oh, it zooms out ok, but the part that was offscreen *stays*
 offscreen, so that doesn't help either. Go landscape - that just makes
 it worse because everything scales up to keep the page width the
 same, so I just lose vertical real estate.
 
 Funny thing is, it works fine (ie without using pop-ins) when JS is
 off. But I can't turn JS off in iPhone Safari.

Argh... iPhone/iPod Safari is one of the worst horrors there are. The UI is simplistic to the point of daimbramage, which makes it unusable for anything but the most trivial of tasks. Nothing is configurable, no privacy settings, can't control Javascript, the maximum number of tabs is ridiculously small, scrolling a long page is really horrible, wide images get clipped with no way to unclip them when using the mobile stylesheet (probably the same bug you describe above), etc.. And Apple has the audacity of forcefully banning all other browsers from the app store, for the simple reason that they are superior browsers, and oh no, we simply can't allow customers to have a superior experience! About the only commendable thing with iPod Safari is the lack of Flash (good riddance!). [...]
 I'm on the fence about multi-column though. I find that an
 unfortunately large percentage of websites out there are overly wide,
 with text that stretches way past your eye's natural comfortable
 reading width, making reading said text a very tiring experience.
 OTOH if you just clip the width to a more natural width you have the
 problem that most screens these days are too lacking in the height
 department^W^W^W^W^W^W overly endowed in the width department, so you
 end up with large swaths of empty spaces on either side, which looks
 awful.  Standard support for multi-columns would be a big help here.
 (Preferably one that's decided by the *browser* instead of hard-coded
 by some silly HTML hack with tables or what-not.) I think CSS3 has
 (some) support for this.
 

The problem with that is you're creating excess vertical scrolling. Just to read linearly it's "scroll down, scroll up, scroll down", etc. (Of course, that pain is hugely compounded when the multi-columns are on page-based PDFs, like academic research papers.)

That's why I said that multicolumn support needs to be natively supported in the browser, NOT hardcoded into the page itself. It should be the browser that decides whether something should be multicolumn, and how tall the columns should be. There's no way the author can possibly account for every possible browser configuration out there to make this kind of decisions.
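(Incidentally, this is roughly how the CSS3 multi-column module works: the author only states a *preferred* column width, and the browser derives the actual column count from the window size. A minimal sketch -- note that 2012-era browsers needed the -moz-/-webkit- prefixed forms of these properties:)

```css
/* The browser fits however many ~30em columns the window allows;
   resize the window and the column count adapts automatically. */
article {
    column-width: 30em;  /* preferred width only; count is up to the browser */
    column-gap: 2em;
}
```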
 The root problem there is that the need for multi-column on the web is
 artificially created by manufacturers and consumers who have
 collectively decided that watching movies is by far the #1 most
 important thing for anyone to ever be doing on a computer. Hence,
 "decapitated fat midget" 16:9 screens for everyone! No matter how bad
 it is for...just about everything *but* movies and certain games.
 Which, I suspect, is also the main reason we can't have browsers
 anymore with nice traditional UIs - because they have to be
 shoe-horned into a movie-oriented half-screen.

I avoid those height-truncated monitors like the plague. I only ever buy monitors with 4:3 aspect ratio. Seriously, if all I wanted to do was to watch movies, I wouldn't be using a PC in the first place.

But still. Sometimes you have a long list of narrow items, and multi-column makes it more readable without excessive scrolling.

Maybe I should start a new trend: side-scrolling webpages with *multi*-columns. :) (Though this probably only makes sense with vertical writing systems, like the vertical variant of Chinese writing. Which is in vertical columns *and* read right-to-left. Bwahahahaha...)

T

-- 
It said to install Windows 2000 or better, so I installed Linux instead.
Sep 26 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Wed, 26 Sep 2012 10:37:10 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 
 I have one objection to your list though: although _for the most part_
 AA's can work with any kind of key, there are a lot of bugs in that
 area. The language itself, of course, in theory supports any kind of
 key, but the current implementation is honestly a mess. I've tried
 fixing things but one thing leads to another and nothing short of a
 total overhaul will completely address all of the problems.
 

There is that. However, it beats the hell out of what I've seen in other languages, like Haxe, where the *value* is generic or templated, but the key is a string. Period. (And then I think Haxe also has an IntHash type now, too, but that's...a signal of not quite *getting* generic code.)
 Can you believe that prior to C++11, true AA's weren't
 even a part of the standard library?

Yes, I can believe that very easily. :/
 
 Well, there's a GUI front-end for it (LyX), I don't know if that
 handles things like native UTF support. I was thinking more of a 21st
 century rewrite of LaTeX that has modern support like native UTF,
 revamped syntax to replace anachronisms, etc., but adhering to the
 original design principles. Sorta like what Knuth & Lamport would've
 come up with, if they had developed TeX/LaTeX in 2012.
 

Right. I mean like what CoffeeScript does for JavaScript. But then I don't know if you'd be able to solve all of latex's issues that way, mostly just syntactic ones.
 Sites like Wikipedia use LaTeX to generate math formula images by
 passing embedded <math> tags through LaTeX for formatting. :)
 Seriously, that's what makes math even remotely tolerable to write in
 Wikipedia. It's imperfect, though, 'cos the baseline of the formatted
 text in the image often doesn't line up with the baseline of the
 surrounding HTML text. And font sizes don't always match up. But it's
 better than the horror of attempting to write math in HTML.
 

Interesting.
 
 Funny thing is, it works fine (ie without using pop-ins) when JS is
 off. But I can't turn JS off in iPhone Safari.

Argh... iPhone/iPod Safari is one of the worst horrors there are. The UI is simplistic to the point of daimbramage, which makes it unusable for anything but the most trivial of tasks. Nothing is configurable, no privacy settings, can't control Javascript, the maximum number of tabs is ridiculously small,

You can't tell it to override all "target=_blank" links (I never make them myself) and always open in the same tab unless *I* say otherwise. That's one of my biggest annoyances with it so far.
 scrolling a long page is really horrible,

Yea. Needs directional buttons. Swipe is overrated and only suitable for minor infrequent uses.
 wide images get clipped with no way to unclip them when using the
 mobile stylesheet (probably the same bug you describe above), etc..
 And Apple has the audacity of forcefully banning all other browsers
 from the app store, for the simple reason that they are superior
 browsers, and oh no, we simply can't allow customers to have a
 superior experience!

I thought Chrome was available for iOS? But if what you say is true, then that's interesting to compare to "evil M$":

Microsoft: Installs their browser by default. Allows any other browser to be installed and set as default. People are pissed. Gates is demonized. DOJ sues.

Apple: Installs their browser by default. Bans other browsers entirely. Everybody's happy and praises Jobs as a great designer and savvy businessman. No lawsuit.
 
 About the only commendable thing with iPod Safari is the lack of Flash
 (good riddance!).
 

Yea, I was always ambivalent about that. On one hand, I felt it was a bone-headed decision and that it should be left up to the user. OTOH, I can get behind almost anything that helps bring an end to Flash. So I've always been torn ;)
 
 The problem with that is you're creating excess vertical scrolling.
 Just to read linearly it's "scroll down, scroll up, scroll down",
 etc. (Of course, that pain is hugely compounded when the
 multi-columns are on page-based PDFs, like academic research
 papers.)

That's why I said that multicolumn support needs to be natively supported in the browser, NOT hardcoded into the page itself. It should be the browser that decides whether something should be multicolumn, and how tall the columns should be. There's no way the author can possibly account for every possible browser configuration out there to make this kind of decisions.

I see. I'm not sure how even the browser would really make it work though, unless maybe you make the whole page scroll horizontally with as many columns as it takes?
 
 The root problem there is that the need for multi-column on the web
 is artificially created by manufacturers and consumers who have
 collectively decided that watching movies is by far the #1 most
 important thing for anyone to ever be doing on a computer. Hence,
 "decapitated fat midget" 16:9 screens for everyone! No matter how
 bad it is for...just about everything *but* movies and certain
 games. Which, I suspect, is also the main reason we can't have
 browsers anymore with nice traditional UIs - because they have to be
 shoe-horned into a movie-oriented half-screen.

I avoid those height-truncated monitors like the plague. I only ever buy monitors with 4:3 aspect ratio. Seriously, if all I wanted to do was to watch movies, I wouldn't be using a PC in the first place.

I would do the same thing. In fact I had sworn I would never get anything wider than 5:4 (and even then I prefer 4:3). Unfortunately, when I was shopping for a laptop, there was *nothing* but 16:9. Not one single model, in or out of my price range. So it was 16:9 or no portability :( At least this has VGA output though (and HDMI, but anything that takes HDMI is going to be 16:9).
 But still. Sometimes you have a long list of narrow items, and
 multi-column makes it more readable without excessive scrolling.
 
 Maybe I should start a new trend: side-scrolling webpages with
 *multi*-columns. :) (Though this probably only makes sense with
 vertical writing systems, like the vertical variant of Chinese
 writing. Which is in vertical columns *and* read right-to-left.
 Bwahahahaha...)
 

Heh :) Traditional Japanese is like that, too. (Not surprising since their writing system is derived from Chinese.) Weird thing is, after studying that, and reading a lot of manga, anytime I see vertical English text, I keep trying to read it right-to-left out of habit :)
Sep 26 2012
prev sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Wednesday, 26 September 2012 at 22:23:00 UTC, Nick Sabalausky 
wrote:
 On Wed, 26 Sep 2012 10:37:10 -0700
 "H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 
 wide images get clipped with no way to unclip them when using 
 the
 mobile stylesheet (probably the same bug you describe above), 
 etc..
 And Apple has the audacity of forcefully banning all other 
 browsers
 from the app store, for the simple reason that they are 
 superior
 browsers, and oh no, we simply can't allow customers to have a
 superior experience!

I thought Chrome was available for iOS? But if what you say is true, then that's interesting to compare to "evil M$":

Microsoft: Installs their browser by default. Allows any other browser to be installed and set as default. People are pissed. Gates is demonized. DOJ sues.

Apple: Installs their browser by default. Bans other browsers entirely. Everybody's happy and praises Jobs as a great designer and savvy businessman. No lawsuit.

You are forbidden from using other rendering engines. So what browsers for iOS do is ship their own network stack, but the rendering has to go via UIWebView. Safari has special rights, being the only application allowed to generate native code via JIT. For me this makes it rather pointless to install any other browser.

Many young geeks only know Apple from Mac OS X onwards, but the new secretive Apple is actually the old Apple. Apple used to have its own standards for everything: NuBus, AppleTalk, QuickDraw 3D, QT, etc. APIs were a mix of C and Pascal code, without any proper POSIX support. Apple only became a bit more friendly to open source after the NeXTStep guys got on board, especially because they needed a quick way out of two failed OS projects. Now that Apple hardware sells like hot pancakes in many countries, they are back to their old selves.

-- Paulo
Sep 26 2012
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Sep 25, 2012 at 06:59:55PM -0400, Nick Sabalausky wrote:
[...]
 There is a general assumption by many applications/websites that
 *everyone* uses facebook.

I know! And it's not just software, it's all business in general. They noticed that it's popular so they think that means "nearly everyone uses it" when the reality is that even as popular as it is, it's still only a *minority* of internet users. Same with twitter.

Huh. I must've been living in a cave. Last I heard, facebook must've hit 90% market saturation, 'cos *everyone* I know has facebook (or pretty close to everyone). I'm not among the 90%, though, and never intend to be. :)

Usually when I run into a website that asks for facebook login, I hit ctrl-W by instinct and don't even flinch. When I run into an app that asks for facebook login, 99% of the time I just delete it without thinking twice.

And it's not just facebook, it's anything that *requires* signing up for some kind of social networking site. Making it an option is OK, I don't care if other people want to post their latest Doodle Jump score to their stream of inanity, all power to them, but *requiring* it to use an app that doesn't logically need it? Plonk. Nagging me about it after I said no the first time? Plonk.

[...]
 Besides, my wife is on facebook, and if any important news happens
 via FB, she'll tell me :)
 

Heh. Similar situation here. My brother and sister are both on it, so I'll catch wind of any family news from FB. My parents, like me, aren't on FB either so they get the same benefit, too, although they usually hear much sooner than I do ;)

Me too. My wife has FB, and that's good enough for me.

Sad to say, though, I got suckered into signing up for Google+. Every now and then (like once a month or less) I post something, but mostly I don't even bother logging on. Most of the stuff on it is pretty inane anyway. Y'know, your typical "what I ate for breakfast", "how many hairs fell into my bathroom sink this morning" and other such content-free posts that are a total waste of time to read. There *are* times when it's useful, like when you're going abroad and have friends who can let you crash in their place for a night or two -- but generally speaking, the signal-to-noise ratio is very low.

T -- That's not a bug; that's a feature!
Sep 25 2012
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 9/25/12 7:24 PM, H. S. Teoh wrote:
 Me too. My wife has FB, and that's good enough for me.

 Sad to say, though, I got suckered into signing up for Google+.

No Facebook but Google+? That's it. You're out. Use Go. Andrei
Sep 25 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 25 Sep 2012 15:52:09 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:

 On Tue, Sep 25, 2012 at 05:36:48PM -0400, Nick Sabalausky wrote:
 
 Newer Operas also got rid of the "native-ish" theme, which is why
 I'm not upgrading past v10. It may seem trivial, but skinned apps
 *really* bug me.

Skinned apps don't bug me at all. I tend to like apps where you can delete useless buttons off the UI and turn off toolbars and stuff you never use. As well as configure custom keyboard bindings

That's not really skinned, that's just customizable (which I agree is good). What I mean by "skinned" is "Blatantly disregards my system settings and poorly re-invents all the standard UI controls." Often that's done so that users like me can re-skin it to make it look and act like *anything* we want...*except* for native and "consistent with the rest of the fucking system".
 
 I find the UIs in the FF4-onward to be completely intolerable. Even
 FF3's UI was god-awful, and then they managed to make it worse with
 4 by going all "Chrome-envy".

What I'd _really_ like, is browser *library*, where you get to assemble your own browser from premade parts.

A "web browser control" is pretty common, AIUI. I know IE and WebKit can be used as controls that you just plop into a window. Then you have to add in all the bells and whistles like address bar, bookmarking, etc., which all still adds up to a lot of extra work, though.
 Like replace the lousy
 UI front end with a custom interface. Applications nowadays suffer
 from excessive unnecessary integration. Software should be made
 reusable, dammit. And I don't mean just code reuse on the level of
 functions. I mean entire software systems that are pluggable and
 inter-connectible. If there's a browser that has a good back-end
 renderer but lousy UI, it should be possible to rip out the UI part
 and substitute it with the UI of another browser that has a better UI
 but lousy back-end. And if there's a browser that comes with
 unnecessary bloat like a mail app, it should be possible to outright
 _delete_ the mail component off the HD and have just the browser part
 working. Software these days is just so monolithic and clumsy. We
 need a new paradigm.
 

More like "need an old paradigm" because it sounds like you're describing the Unix philosophy ;) I'm with you though, that would be nice.
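That pluggable idea can be sketched in a few lines. This is only a toy illustration of the separation being described; every class and method name here is invented, not any real browser's API:

```python
import re

class Renderer:
    """Back-end interface: turns HTML into displayable text."""
    def render(self, html: str) -> str:
        raise NotImplementedError

class TextRenderer(Renderer):
    """A trivial 'engine' that just strips tags."""
    def render(self, html: str) -> str:
        return re.sub(r"<[^>]+>", "", html)

class MinimalUI:
    """Front-end: only knows how to display text it is handed."""
    def show(self, text: str) -> str:
        return "[minimal] " + text

class Browser:
    """Glue code: any renderer + any UI, nothing else welded on."""
    def __init__(self, renderer: Renderer, ui) -> None:
        self.renderer = renderer
        self.ui = ui

    def open(self, html: str) -> str:
        return self.ui.show(self.renderer.render(html))

# Swap either part independently -- a better engine or a better UI
# plugs in without touching the other half:
print(Browser(TextRenderer(), MinimalUI()).open("<p>hello</p>"))
```

The point is that the back-end and front-end meet only at a narrow interface, so either one can be ripped out and replaced, and there's no mail client to delete because none was ever bolted on.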
 
 [...]
 The result is that people revert to using table-based formatting
 and

Hey, I *like* table-based formatting :). Beats the hell out of trying to kluge together sane layouts/flowing with CSS. And nobody's ever going to convince me that HTML isn't the presentation layer.

I say trash it all, tables, HTML, everything. Markdown is good enough for email. If you need more than that, go buy a real website and post it there instead of transmitting that crap over SMTP.

Well, I just meant on the web, not email. Death to HTML emails!
Sep 25 2012
prev sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 25 Sep 2012 16:24:14 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:

 On Tue, Sep 25, 2012 at 06:59:55PM -0400, Nick Sabalausky wrote:
 [...]
 There is a general assumption by many applications/websites that
 *everyone* uses facebook.

I know! And it's not just software, it's all business in general. They noticed that it's popular so they think that means "nearly everyone uses it" when the reality is that even as popular as it is, it's still only a *minority* of internet users. Same with twitter.

Huh. I must've been living in a cave. Last I heard, facebook must've hit 90% market saturation, 'cos *everyone* I know has facebook (or pretty close to everyone). I'm not among the 90%, though, and never intend to be. :)

Usually when I run into a website that asks for facebook login, I hit ctrl-W by instinct and don't even flinch. When I run into an app that asks for facebook login, 99% of the time I just delete it without thinking twice.

And it's not just facebook, it's anything that *requires* signing up for some kind of social networking site. Making it an option is OK, I don't care if other people want to post their latest Doodle Jump score to their stream of inanity, all power to them, but *requiring* it to use an app that doesn't logically need it? Plonk. Nagging me about it after I said no the first time? Plonk.

Maybe I'm the one living in a cave, because I've never come across anything (besides maybe facebook itself) that actually *requires* social networking login. The closest I've seen is Stack Overflow which I don't post to because it requires OpenPhishing, I mean OpenID.
 
 [...]
 Besides, my wife is on facebook, and if any important news happens
 via FB, she'll tell me :)
 

Heh. Similar situation here. My brother and sister are both on it, so I'll catch wind of any family news from FB. My parents, like me, aren't on FB either so they get the same benefit, too, although they usually hear much sooner I do ;)

Me too. My wife has FB, and that's good enough for me.

Sad to say, though, I got suckered into signing up for Google+. Every now and then (like once a month or less) I post something, but mostly I don't even bother logging on. Most of the stuff on it is pretty inane anyway. Y'know, your typical "what I ate for breakfast", "how many hairs fell into my bathroom sink this morning" and other such content-free posts that are a total waste of time to read. There *are* times when it's useful, like when you're going abroad and have friends who can let you crash in their place for a night or two -- but generally speaking, the signal-to-noise ratio is very low.

Yea, that's also why I stubbornly refuse to call my...*ahem*...site with posts and articles, a "blog" (heh, usually I call it a "not-a-blog"). Because to me, what you've described is what a *real* "blog" is. Like LiveJournal.
Sep 25 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Thu, 20 Sep 2012 08:46:00 -0400
"Steven Schveighoffer" <schveiguy yahoo.com> wrote:

 On Wed, 19 Sep 2012 17:05:35 -0400, Nick Sabalausky  
 <SeeWebsiteToContactMe semitwist.com> wrote:
 
 On Wed, 19 Sep 2012 10:11:50 -0400
 "Steven Schveighoffer" <schveiguy yahoo.com> wrote:

 I cannot argue that Apple's audio volume isn't too simplistic for
 its own good.  AIUI, they have two "volumes", one for the ringer,
 and one for playing audio, games, videos, etc.

There's also a separate one for alarms/alerts: http://www.ipodnn.com/articles/12/01/13/user.unaware.that.alarm.going.off.was.his/

This makes sense. Why would you ever want your alarm clock to "alarm silently"?

I don't carry around my alarm clock everywhere I go. Aside from that, if it happens to be set wrong, I damn sure don't want it going off in a library, in a meeting, at the front row of a show, etc.
 How would you wake up?

By using a real alarm clock? Besides, we can trivially both have our own ways thanks to the simple invention of "options". Unfortunately, Apple apparently seems to think somebody's got that patented or something.
 This is another case of
 someone using the wrong tool for the job

Apparently so ;)
 
 I don't know any examples of sounds that disobey the silent switch

There is no silent switch. The switch only affects *some* sounds, and I'm not interested in memorizing which ones just so I can try to avoid the others. The only "silent switch" is the one I use: Just leave the fucking thing in the car.
 except for the "find my iPhone" alert,

That's about the only one that actually does make any sense at all.
 It's just unbelievably convoluted, over-engineered, and as far from
 "simple" as could possibly be imagined. Basically, you have "volume
 up" and "volume down", but there's so much damn modality (something
 Apple *loves*, but it's almost universally bad for UI design) that
 they work pretty much randomly.

I think you exaggerate. Just a bit.

Not really (and note I said "pretty much randomly" not "truly randomly"). Try listing out all the different volume rules (that you're *aware* of - who knows what other hidden quirks there might be), all together, and I think you may be surprised just how much complexity there is.

Then compare that to, for example, a walkman or other portable music player (iTouch doesn't count, it's a PDA) which is 100% predictable and trivially simple right from day one. You never even have to think about it, the volume **just works**, period. The fact that the ijunk has various other uses besides music is immaterial: It could have been simple and easy and worked well, and they instead chose to make it complex.

Not only that, but it would have been trivial to just offer an *option* to turn that "smart" junk off. But then allowing a user to configure their own property to their own liking just wouldn't be very "Apple", now would it?
 BTW, a cool feature I didn't know for a long time is if you double
 tap the home button, your audio controls appear on the lock screen
 (play/pause, next previous song, and audio volume).  But I think
 you have to unlock to access ringer volume.

That's good to know (I didn't know). Unfortunately, it still only eliminates one, maybe two, swipes from an already-complex procedure, that on any sensible device would have been one step: Reach down into the pocket to adjust the volume.

Well, for music/video, the volume buttons *do* work in locked mode.

More complexity and modality! Great.
 How often has anyone ever had a volume POT go bad? I don't think
 I've *ever* even had it happen. It's a solid, well-established
 technology.

I have had several sound systems where the volume knob started misbehaving, due to corrosion, dust, whatever. You can hear it mostly when you turn the knob, and it has a scratchy sound coming from the speakers.

Was that before or after the "three year old" mark?
 I don't use a mac, and I never will again. I spent about a year or
 two with OSX last decade and I'll never go back for *any* reason.
 Liked it at first, but the more I used it the more I hated it.

It's a required thing for iOS development :)

Uhh, like I said, it *isn't*. I've *already* built an iOS package on my Win machine (again, using Marmalade, although I'd guess Corona and Unity are likely the same story), which a co-worker has *already* successfully run on his jailbroken iTouches and iPhone. And the *only* reason they needed to be jailbroken is because we haven't yet paid Apple's ransom for a signing certificate. Once we have that, I can sign the .ipa right here on Win with Marmalade's deployment tool. The *only* thing unfortunately missing without a mac is submission to the Big Brother store.
 I have recently
 experienced the exact opposite.  I love my mac, and I would never go
 back to Windows.

Not trying to "convert" you, just FWIW: You might like Win7. It's very Mac-like out-of-the-box which is exactly why I hate it ;)
 Mac + VMWare fusion for running XP and Linux is
 fucking awesome.
 

Virtualization is indeed awesome :) Personally I prefer VirtualBox though. (Although I worry about it now being under the roof of Oracle.)
 
 I recently learned objective C, and I'd hate to use it without
 xcode, which is a fantastic IDE.  Obj-C is extremely verbose, so
 without auto-complete, it would be torturous.
 

Hmm, I'm glad I don't have to deal with Obj-C then. Sounds like the Java development philosophy. Not that C++ is all that great either, but at least I already know it :/
 
 The *screen* wasn't broken, it's just the plastic starts
 deteriorating. Jobs famously had an early iPhone prototype with a
 plastic screen and pulled it out at a designer meeting and yelled at
 them saying "this fucking thing is in with my keys, it's getting all
 scratched up!  we need something better."  That's when they started
 thinking about using the glass screens.
 

Yea, he never did grow up, did he? Still throwing tantrums all the way up to, what was he, like 60? And he never did learn about such things as "covers", did he?
 Hate him if you want, but he definitely has revolutionized mobile  
 technology.
 

Eh, "revolutionize" is definitely not the word I would use...
 
 My kids often say the iPad isn't working, and then I have to point
 out they are holding it with their thumb on the screen.  At least
 those problems are easy to fix :)
 

Heh :)
Sep 20 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 18 Sep 2012 00:29:11 -0700
Walter Bright <newshound2 digitalmars.com> wrote:
 
 I tend to snicker at companies that insist they only hire the top 1%.
 It seems that about 90% of the engineers out there must be in that
 top 1% <g>.
 

I bet that's marketing-speak for "Our applicant-to-hire ratio is 100:1, and naturally we pick the one we like best instead of the one we like least." (Either that or it's just a claim pulled right out of their ass.)
Sep 20 2012
prev sibling next sibling parent reply "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Thu, 20 Sep 2012 17:16:14 -0400, Nick Sabalausky  
<SeeWebsiteToContactMe semitwist.com> wrote:

 On Thu, 20 Sep 2012 08:46:00 -0400
 "Steven Schveighoffer" <schveiguy yahoo.com> wrote:

 On Wed, 19 Sep 2012 17:05:35 -0400, Nick Sabalausky
 <SeeWebsiteToContactMe semitwist.com> wrote:

 There's also a separate one for alarms/alerts:
  

This makes sense. Why would you ever want your alarm clock to "alarm silently"?

I don't carry around my alarm clock everywhere I go.

You don't have to use it as an alarm clock. An alarm clock is for waking you up. Why would you set it to wake you up in a music performance?
 Aside from that, if it happens to be set wrong, I damn sure don't want
 it going off in a library, in a meeting, at the front row of a show,
 etc.

Can't help you there :) It's *really* hard to set it wrong (just try it). Besides, it doesn't sound like that person was using the right tool for the job. If he's awake at that time, he's using it as a reminder, for which the reminders app is better suited.
 How would you wake up?

By using a real alarm clock?

What if you don't have one? You are camping, sleeping on the couch at a friend's house, etc.
 Besides, we can trivially both have our own ways thanks to the simple
 invention of "options". Unfortunately, Apple apparently seems to think
 somebody's got that patented or something.

Huh? Just don't use it as an alarm clock? Why do you need an option to prevent you from doing that?
 I don't know any examples of sounds that disobey the silent switch

There is no silent switch. The switch only affects *some* sounds, and I'm not interested in memorizing which ones just so I can try to avoid the others.

s/some/nearly all Again, I gave you the *two* incidental sounds it doesn't affect. Sorry you can't be bothered to learn them.
 The only "silent switch" is the one I use: Just leave the fucking thing
 in the car.

That works too, but doesn't warrant rants about how you haven't learned how to use the fucking thing :)
 It's just unbelievably convoluted, over-engineered, and as far from
 "simple" as could possibly be imagined. Basically, you have "volume
 up" and "volume down", but there's so much damn modality (something
 Apple *loves*, but it's almost universally bad for UI design) that
 they work pretty much randomly.

I think you exaggerate. Just a bit.

Not really (and note I said "pretty much randomly" not "truly randomly"). Try listing out all the different volume rules (that you're *aware* of - who knows what other hidden quirks there might be), all together, and I think you may be surprised just how much complexity there is.

1. Ringer volume affects all sounds except for music/video/games.
2. The silent switch will set ringer volume to 0 for all sounds except for find-my-iphone and the alarm clock.
3. If playing a game/video/music, the volume buttons affect that volume; otherwise, they affect ringer volume.

Wow, you are right, three whole rules. That's way more than 1. I stand corrected :)
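For what it's worth, those three rules can be written down as a tiny function. This is a toy model of the behavior as described in this thread, not Apple's actual logic; all names are made up:

```python
# Toy model of the three iPhone volume rules discussed above.
# This is a sketch of the behavior as described here, not real iOS code.

def effective_volume(sound, ringer_volume, media_volume, silent_switch_on):
    """Return the volume a given sound plays at."""
    media_sounds = {"music", "video", "game"}
    silent_exempt = {"find-my-iphone", "alarm"}   # rule 2: these ignore the switch

    if sound in media_sounds:
        return media_volume                       # rule 1: media has its own volume
    if silent_switch_on and sound not in silent_exempt:
        return 0                                  # rule 2: switch mutes ringer sounds
    return ringer_volume                          # rule 1: everything else uses ringer

def volume_button_target(currently_playing_media):
    """Rule 3: which volume the hardware buttons adjust."""
    return "media" if currently_playing_media else "ringer"
```

Three rules, maybe, but note the result depends on which sound it is, what's currently playing, *and* the switch position, which is exactly the modality being complained about.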
 Then compare that to, for example, a walkman or other portable music
 player (iTouch doesn't count, it's a PDA) which is 100% predictable and
 trivially simple right from day one. You never even have to think about
 it, the volume **just works**, period. The fact that the ijunk has
 various other uses besides music is immaterial: It could have been
 simple and easy and worked well, and they instead chose to make it
 complex.

 Not only that, but it would have been trivial to just offer an *option*
 to turn that "smart" junk off. But then allowing a user to configure
 their own property to their own liking just wouldn't be very "Apple",
 now would it?

I detect a possible prejudice against Apple here :)
 Well, for music/video, the volume buttons *do* work in locked mode.

More complexity and modality! Great.

This is the one thing I agree with you on -- the volume buttons should just work in locked mode, following the rules of when the phone is not locked. I can't envision how the volume buttons would accidentally get pressed.
 How often has anyone ever had a volume POT go bad? I don't think
 I've *ever* even had it happen. It's a solid, well-established
 technology.

I have had several sound systems where the volume knob started misbehaving, due to corrosion, dust, whatever. You can hear it mostly when you turn the knob, and it has a scratchy sound coming from the speakers.

Was that before or after the "three year old" mark?

Not sure. I don't have any of these things anymore :) POTs aren't used very much any more.
 The *only* thing unfortunately missing without a mac is submission to
 the Big Brother store.

 I have recently
 experienced the exact opposite.  I love my mac, and I would never go
 back to Windows.

Not trying to "convert" you, just FWIW: You might like Win7. It's very Mac-like out-of-the-box which is exactly why I hate it ;)

No, it's nowhere near the same level. I have Win 7, had it from the day of its release, and while it's WAY better than XP, I'd drop it in a heartbeat if it wasn't so damn expensive to buy an iMac.

For instance, when I want to turn my Mac off, I press the power button, shut down, and when it comes back up, all the applications I was running return in exactly the same state they were in. This is not hibernation, it's a complete shutdown. Every app has built into it the ability to restore its state, because that's one of the things Mac users expect. You can't do that with Windows or even Linux. Ubuntu has tried to make their UI more mac-like, but because the applications are not built to handle the features, it doesn't quite work right.
 Mac + VMWare fusion for running XP and Linux is
 fucking awesome.

Virtualization is indeed awesome :) Personally I prefer VirtualBox though. (Although I worry about it now being under the roof of Oracle.)

VMWare fusion was $50, and runs XP apps just like they were native ones (even gives you a searchable start menu). I actually was forced to use VMWare fusion, because a development project I'm working on includes a VMWare Linux image with the correct SDK/cross compiler. So I didn't really shop around for other VM solutions.
 I recently learned objective C, and I'd hate to use it without
 xcode, which is a fantastic IDE.  Obj-C is extremely verbose, so
 without auto-complete, it would be torturous.

Hmm, I'm glad I don't have to deal with Obj-C then. Sounds like the Java development philosophy. Not that C++ is all that great either, but at least I already know it :/

Objective C isn't actually terrible; I much prefer it to C++. But if I had to develop in it without an IDE, I would hate it. And xcode is very very good at helping you develop with it. I haven't used xcode for anything else, so I'm not sure how good an IDE it is for other languages. Its git integration is very good too, especially for viewing differences.
 The *screen* wasn't broken, it's just the plastic starts
 deteriorating. Jobs famously had an early iPhone prototype with a
 plastic screen and pulled it out at a designer meeting and yelled at
 them saying "this fucking thing is in with my keys, it's getting all
 scratched up!  we need something better."  That's when they started
 thinking about using the glass screens.

Yea, he never did grow up, did he? Still throwing tantrums all the way up to, what was he, like 60? And he never did learn about such things as "covers", did he?

Interesting that's what you see as the defining point of that story :) Especially considering your calm, controlled statements about Apple products... -Steve
Sep 21 2012
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Sep 25, 2012 at 09:42:26PM -0400, Andrei Alexandrescu wrote:
 On 9/25/12 7:24 PM, H. S. Teoh wrote:
Me too. My wife has FB, and that's good enough for me.

Sad to say, though, I got suckered into signing up for Google+.

No Facebook but Google+? That's it. You're out. Use Go.

Heh heh... I was half-expecting you'd show up in this thread after all the anti-FB sentiment, and sure enough you did. ;-)

Seriously, though, no offense intended, but I find FB's privacy policy rather ... lacking for my tastes. That's not to say G+ isn't susceptible to Big Brotherisms, of course, Google being what it is, but at least it gives you the illusion of control, like controlling who sees which posts (which FB imitated after the fact, if you allow me to say so), easier management of friends with the circles system, being able to delete your account data without caveats and jumping through hoops, etc..

But anyway, I hardly ever use my G+ account... Like I said, most of the posts are a waste of time to read, and I don't feel a compelling need to add to the noise or to publish my personal life online. So *shrug*.

T -- Why waste time learning, when ignorance is instantaneous? -- Hobbes, from Calvin & Hobbes
Sep 25 2012
prev sibling parent Sean Kelly <sean invisibleduck.org> writes:
On Sep 25, 2012, at 11:11 PM, H. S. Teoh <hsteoh quickfur.ath.cx> wrote:

 On Tue, Sep 25, 2012 at 09:42:26PM -0400, Andrei Alexandrescu wrote:
 On 9/25/12 7:24 PM, H. S. Teoh wrote:
 Me too. My wife has FB, and that's good enough for me.
  Sad to say, though, I got suckered into signing up for Google+.

 No Facebook but Google+? That's it. You're out. Use Go.

 Heh heh... I was half-expecting you'd show up in this thread after all the anti-FB sentiment, and sure enough you did. ;-)

 Seriously, though, no offense intended, but I find FB's privacy policy rather ... lacking for my tastes. That's not to say G+ isn't susceptible to Big Brotherisms, of course, Google being what it is, but at least it gives you the illusion of control, like controlling who sees which posts (which FB imitated after the fact, if you allow me to say so), easier management of friends with the circles system, being able to delete your account data without caveats and jumping through hoops, etc..

Google+ is an opt-out service rather than an opt-in service if you have a gmail account, so that you have G+ isn't surprising. Personally, I really like the G+ interface. It does exactly what I want simply and succinctly, which is shocking from a Google product as typically I'd hold up their apps as examples of terrible UI designs. Facebook has the community though, so that's what I actually use despite not really liking the interface or anything else.
Sep 26 2012
prev sibling next sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Thursday, 20 September 2012 at 21:15:24 UTC, Nick Sabalausky 
wrote:
 On Thu, 20 Sep 2012 08:46:00 -0400
 "Steven Schveighoffer" <schveiguy yahoo.com> wrote:

 On Wed, 19 Sep 2012 17:05:35 -0400, Nick Sabalausky  
 <SeeWebsiteToContactMe semitwist.com> wrote:
 
 On Wed, 19 Sep 2012 10:11:50 -0400
 "Steven Schveighoffer" <schveiguy yahoo.com> wrote:

 I cannot argue that Apple's audio volume isn't too 
 simplistic for
 its own good.  AIUI, they have two "volumes", one for the 
 ringer,
 and one for playing audio, games, videos, etc.

There's also a separate one for alarms/alerts: http://www.ipodnn.com/articles/12/01/13/user.unaware.that.alarm.going.off.was.his/

This makes sense. Why would you ever want your alarm clock to "alarm silently"?

I don't carry around my alarm clock everywhere I go. Aside from that, if it happens to be set wrong, I damn sure don't want it going off in a library, in a meeting, at the front row of a show, etc.
 How would you wake up?

By using a real alarm clock? Besides, we can trivially both have our own ways thanks to the simple invention of "options". Unfortunately, Apple apparently seems to think somebody's got that patented or something.
 This is another case of
 someone using the wrong tool for the job

Apparently so ;)
 
 I don't know any examples of sounds that disobey the silent 
 switch

There is no silent switch. The switch only affects *some* sounds, and I'm not interested in memorizing which ones just so I can try to avoid the others. The only "silent switch" is the one I use: Just leave the fucking thing in the car.
 except for the "find my iPhone" alert,

That's about the only one that actually does make any sense at all.
 It's just unbelievably convoluted, over-engineered, and as 
 far from
 "simple" as could possibly be imagined. Basically, you have 
 "volume
 up" and "volume down", but there's so much damn modality 
 (something
 Apple *loves*, but it's almost universally bad for UI design)
 that
 they work pretty much randomly.

I think you exaggerate. Just a bit.

Not really (and note I said "pretty much randomly" not "truly randomly"). Try listing out all the different volume rules (that you're *aware* of - who knows what other hidden quirks there might be), all together, and I think you may be surprised just how much complexity there is.

Then compare that to, for example, a walkman or other portable music player (iTouch doesn't count, it's a PDA) which is 100% predictable and trivially simple right from day one. You never even have to think about it, the volume **just works**, period. The fact that the ijunk has various other uses besides music is immaterial: It could have been simple and easy and worked well, and they instead chose to make it complex.

Not only that, but it would have been trivial to just offer an *option* to turn that "smart" junk off. But then allowing a user to configure their own property to their own liking just wouldn't be very "Apple", now would it?
 BTW, a cool feature I didn't know for a long time is if you 
 double
 tap the home button, your audio controls appear on the lock 
 screen
 (play/pause, next previous song, and audio volume).  But I 
 think
 you have to unlock to access ringer volume.

That's good to know (I didn't know). Unfortunately, it still only eliminates one, maybe two, swipes from an already-complex procedure that, on any sensible device, would have been one step: Reach down into the pocket to adjust the volume.

Well, for music/video, the volume buttons *do* work in locked mode.

More complexity and modality! Great.
 How often has anyone ever had a volume POT go bad? I don't 
 think
 I've *ever* even had it happen. It's a solid, 
 well-established
 technology.

I have had several sound systems where the volume knob started misbehaving, due to corrosion, dust, whatever. You can hear it mostly when you turn the knob, and it has a scratchy sound coming from the speakers.

Was that before or after the "three year old" mark?
 I don't use a mac, and I never will again. I spent about a 
 year or
 two with OSX last decade and I'll never go back for *any* 
 reason.
 Liked it at first, but the more I used it the more I hated 
 it.

It's a required thing for iOS development :)

Uhh, like I said, it *isn't*. I've *already* built an iOS package on my Win machine (again, using Marmalade, although I'd guess Corona and Unity are likely the same story), which a co-worker has *already* successfully run on his jailbroken iTouches and iPhone. And the *only* reason they needed to be jailbroken is because we haven't yet paid Apple's ransom for a signing certificate. Once we have that, I can sign the .ipa right here on Win with Marmalade's deployment tool. The *only* thing unfortunately missing without a mac is submission to the Big Brother store.
 I have recently
 experienced the exact opposite.  I love my mac, and I would 
 never go
 back to Windows.

Not trying to "convert" you, just FWIW: You might like Win7. It's very Mac-like out-of-the-box which is exactly why I hate it ;)
 Mac + VMWare fusion for running XP and Linux is
 fucking awesome.
 

Virtualization is indeed awesome :) Personally I prefer VirtualBox though. (Although I worry about it now being under the roof of Oracle.)
 
 I recently learned objective C, and I'd hate to use it without
 xcode, which is a fantastic IDE.  Obj-C is extremely verbose, 
 so
 without auto-complete, it would be torturous.
 

Hmm, I'm glad I don't have to deal with Obj-C then. Sounds like the Java development philosophy. Not that C++ is all that great either, but at least I already know it :/

Sorry if this is a duplicate; somehow my reply was lost, it seems. In big corporations you spend more time taking care of existing projects in big teams than developing stuff from scratch. In these types of environments you learn to appreciate the verbosity of certain programming languages, and keep away from cute hacks. Especially when you take into consideration the quality of work that many programming drones are capable of. -- Paulo
Sep 21 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Fri, 21 Sep 2012 08:24:07 -0400
"Steven Schveighoffer" <schveiguy yahoo.com> wrote:
 
 That works too, but doesn't warrant rants about how you haven't
 learned how to use the fucking thing :)
 

It's *volume* controls, there doesn't need to be *anything* to learn.
 Try listing out all the different volume rules (that you're *aware*
 of - who knows what other hidden quirks there might be), all
 together, and I think you may be surprised just how much complexity
 there is.

1. ringer volume affects all sounds except for music/video/games
2. Silent switch will set ringer volume to 0 for all sounds except for find-my-iphone and alarm clock
3. If playing a game/video/music, the volume buttons affect that volume, otherwise, they affect ringer volume.

Wow, you are right, three whole rules.

And each one with exceptions; the rules as a whole aren't particularly intuitive. And then there's the question of what rules you forgot. I can think of one right now:

4. If you're in the camera app then the volume button takes a picture instead of adjusting volume.
 That's way more than 1.  I stand corrected :)
 

Now compare that to a normal device:

1. The volume control adjusts the volume.

Gee, how horrible to have one trivially intuitive rule and no exceptions. Bottom line, they took something trivial, complicated it, and people hail them as genius visionaries.
 Then compare that to, for example, a walkman or other portable music
 player (iTouch doesn't count, it's a PDA) which is 100% predictable
 and trivially simple right from day one. You never even have to
 think about it, the volume **just works**, period. The fact that
 the ijunk has various other uses besides music is immaterial: It
 could have been simple and easy and worked well, and they instead
 chose to make it complex.

 Not only that, but it would have been trivial to just offer an
 *option* to turn that "smart" junk off. But then allowing a user to
 configure their own property to their own liking just wouldn't be
 very "Apple", now would it?

I detect a possible prejudice against Apple here :)

Heh :) But yea, I *do* take a lot of issue with Apple, partly because as a business they make MS look like the EFF, but also largely because I've dealt with their products, and I really *do* find them to be awful overall.
 Not trying to "convert" you, just FWIW:

 You might like Win7. It's very Mac-like out-of-the-box which is
 exactly why I hate it ;)

No, it's nowhere near the same level. I have Win 7, had it from the day of its release, and while it's WAY better than XP,

Heh, yea I had a feeling. Like I said, Win7 is very Mac-like as far as windows goes. I find it interesting that while I absolutely can't stand Win7 (at least without *heavy* non-standard configuring and some hacks), Mac people OTOH tend to see it as a big improvement over XP. It's Microsoft OSX.
 For instance, when I want to turn my Mac off, I press the power
 button, shut down, and when it comes back up, all the applications I
 was running return in exactly the same state they were in.  This is
 not hibernation, it's a complete shutdown.  Every app has built in
 it, the ability to restore its state.  This is because it's one of
 the things Mac users expect.
 
 You can't do that with Windows or even Linux.  Ubuntu has tried to
 make their UI more mac like, but because the applications are not
 built to handle the features, it doesn't quite work right.
 

Well, we can make any OS look good by picking one nice feature. And personally, I actually like that shutdown serves as a "close all". There's a number of programs that do have settings for roughly "when starting, resume wherever I left off last time". I always end up turning that off because it just means I usually have to close whatever it auto-opened anyway. When I close/exit/etc something, it's generally because I'm done with that task. So auto-resumes just get in my way. The OS is the same thing: If it auto-resumed everything, then I would just have to go closing most of it myself. Makes more work for me in its quest to be "helpful".
 The *screen* wasn't broken, it's just the plastic starts
 deteriorating. Jobs famously had an early iPhone prototype with a
 plastic screen and pulled it out at a designer meeting and yelled
 at them saying "this fucking thing is in with my keys, it's
 getting all scratched up!  we need something better."  That's when
 they started thinking about using the glass screens.

Yea, he never did grow up, did he? Still throwing tantrums all the way up to, what was he, like 60? And he never did learn about such things as "covers", did he?

Interesting that's what you see as the defining point of that story :)

It's a story that always did strike me as odd: Here we have a grown man (one who was *well known* to be unstable, asinine, drug-soaked and frankly, borderline megalomaniacal) that's going around throwing tantrums, and largely because he doesn't understand "cover" or "case" or what obviously happens to plastic when you bash keys against it, and it gets interpreted by millions as "Wow, look how great he was!" I don't get it.
 Especially considering your calm, controlled statements
 about Apple products...
 

Heh, well, like I said my hatred for Apple and Apple products comes from having used them and been around them. I actually *liked* my OSX machine when I first got it. And then it, and the whole Jobs culture, and the way Apple runs their business, successfully turned me against Apple. And now I have this iPhone which, while having even been *useful* when out-of-town when I first got it - due to it essentially being a wirelessly internet-connected PDA - everything else about it just makes me want to smash it into a concrete wall nearly every time I use it. And I've never had that temptation from *any* other device before (hard as that may be to believe ;) )
Sep 21 2012
prev sibling next sibling parent reply "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Fri, 21 Sep 2012 17:22:32 -0400, Nick Sabalausky  
<SeeWebsiteToContactMe semitwist.com> wrote:

 On Fri, 21 Sep 2012 08:24:07 -0400
 "Steven Schveighoffer" <schveiguy yahoo.com> wrote:
 That works too, but doesn't warrant rants about how you haven't
 learned how to use the fucking thing :)

It's *volume* controls, there doesn't need to be *anything* to learn.

OK, so this is what you'd rather have:

1. Want to listen to music in the library on my headphones. But I have the silent switch on to prevent calls, and since there is only one volume, I have to turn it off and turn up the volume. Then a phone call arrives, and I can't make that silent because it's all the same volume, and it bothers everyone else in the library.

2. Want my ringer as high as possible so I can hear it when a call arrives. But I start playing a game, and it BLASTS the initial music, and I have to quickly turn down the volume.

No, I think the current design, while not perfect, is *WAY* better than a single volume. I would rather actually have *more* granularity in volume. Doing it your way means everyone, not just you, needs to fiddle with the volume knob for every single thing they want to do. That may make you happy, but it would piss off the rest of the users :)
 1. ringer volume affects all sounds except for music/video/games
 2. Silent switch will set ringer volume to 0 for all sounds except for
 find-my-iphone and alarm clock
 3. If playing a game/video/music, the volume buttons affect that
 volume, otherwise, they affect ringer volume.

 Wow, you are right, three whole rules.

And each one with exceptions, the rules as a whole aren't particularly intuitive.

They aren't? They make complete sense to me. You even admit that it makes sense to have find my iphone play its alerts as loud as possible. I contend that if you use the alarm clock for what it is for (i.e. waking you up), there is no problem there either. Those are the only exceptions. Besides, you don't have to "memorize" these rules; most of the time, it is what a normal person would expect.
 And then there's the question of what rules you forgot. I can think of
 one right now:

 4. If you're in the camera app then the volume button takes a picture
 instead of adjusting volume.

I admit, I completely forgot about this one. Simply because I rarely use it :) It was a gimmicky feature, and doesn't hurt anything, but I find it unusable, simply because my natural inclination, being a right-handed person, is to rotate the phone left to go into landscape mode. If I want to use the button, my sequence is to rotate left, then realize the button's on the other side, flip 180 degrees, then realize my finger is in front of the lens, etc. I think this is essentially an orthogonal problem because there is no volume control in camera, and that "feature" doesn't interfere with any other use of the phone. When I read about it though, I thought it was a good idea. Interestingly enough, Apple doesn't even *let* you use the volume control for anything but volume in your own apps. Doing it is clunky in any case: you have to take over the volume, disable anything that is playing, then make sure the volume is not at min/max. When you detect the "volume" goes up or down, take action, then reset the volume. Very lame.
 That's way more than 1.  I stand corrected :)

Now compare that to a normal device: 1. The volume control adjusts the volume. Gee, how horrible to have one trivially intuitive rule and no exceptions.

Right, and now I'm stuck in "Nick mode", where I'm constantly worrying about and changing the volume to deal with the current situation. No thanks.
 Bottom line, they took something trivial, complicated it, and people
 hail them as genius visionaries.

s/complicated/improved/ This isn't really genius, nor is it unprecedented (iPhone is not the first to control ringer and game/music volume separately). It's just common sense.
 You might like Win7. It's very Mac-like out-of-the-box which is
 exactly why I hate it ;)

No, it's nowhere near the same level. I have Win 7, had it from the day of its release, and while it's WAY better than XP,

Heh, yea I had a feeling. Like I said, Win7 is very Mac-like as far as windows goes. I find it interesting that while I absolutely can't stand Win7 (at least without *heavy* non-standard configuring and some hacks), Mac people OTOH tend to see it as a big improvement over XP. It's Microsoft OSX.

I wasn't a Mac user until November of last year. And even then I didn't start using it in earnest until February of this year, when my iOS side business picked up. I still used my Linux laptop for almost everything else, and my Win7 machine at home. I barely used the Mac, and that was just to run xcode. About 4 months ago, I had to start developing for an arm-based single-board-computer. The manufacturer provided a fully-configured VMWare Linux image. The only option for the Mac was to try/buy VMWare Fusion (VMWare does not make a free VMWare player for Mac, and it was required that this run on the Mac). So I bought VMWare Fusion. Once I realized I could run all my other Linux development for my day job under a VMWare image, and could run an old copy of XP Pro that I had purchased a long time ago, I could simply use my Mac for all business-related tasks. I decided to try switching; it took about a week to transfer all my stuff over. Loving it ever since. So no, I'm not a Mac person, I'm a Unix/Linux person. But Mac seems to have done Unix better than Linux :) And with VMWare Fusion, I can run MS office (no, not Mac office, which is crap AIUI) when I need it.
 Well, we can make any OS look good by picking one nice feature.

 And personally, I actually like that shutdown serves as a "close all".
 There's a number of programs that do have settings for roughly "when
 starting, resume wherever I left off last time". I always end up
 turning that off because it just means I usually have to close whatever
 it auto-opened anyway. When I close/exit/etc something, it's generally
 because I'm done with that task. So auto-resumes just get in
 my way. OS is the same thing: If it auto-resumed everything, then I
 would just have to go closing most of it myself. Makes more work for
 me in it's quest to be "helpful".

It was an example. But it was one that I noticed right away coming from Ubuntu with Unity. Unity tries to be very Mac-like, but is fighting and strong-arming applications into compliance. It doesn't always work. For example, Netbeans still has a menu bar, even though Unity tries to put the menu bar at the top of the screen. So it ends up with 2 menu bars, the one at the top being empty. And Unity's feature of "searching all menu options" (also a Mac ripoff) doesn't work in those apps. If I had to summarize why I like MacOS better than Windows -- the GUI is a complete GUI, and as good as Windows (unlike Linux), but it does Unix *SOOO* much better than cygwin. I feel like I get the best of all worlds. And don't get me started on the trackpad. I *hated* using my Dell touchpad on my Linux laptop every time after I had been using my Mac trackpad. The one thing I would rip out of OSX and throw against the wall is the mail app. Its interface and experience are awesome. But it frequently corrupts messages and doesn't properly save outgoing mail. Not good for a mail application.
 Interesting that's what you see as the defining point of that
 story :)

It's a story that always did strike me as odd: Here we have a grown man (one who was *well known* to be unstable, asinine, drug-soaked and frankly, borderline megalomaniacal) that's going around throwing tantrums, and largely because he doesn't understand "cover" or "case" or what obviously happens to plastic when you bash keys against it, and it gets interpreted by millions as "Wow, look how great he was!" I don't get it.

Having amassed more money than the US treasury, based on his ideas and hard work, seems to suggest he was pretty successful :) Not that I completely equate money with greatness, but if success of a product is measured by how well it sells, then he was very great. Present company notwithstanding, most people like Apple products and think they are good/best of breed. I think in order to succeed in producing a good product, you have to have a somewhat high opinion of yourself, and have the balls to take risks on designs that may not be popular but, when engineered correctly, produce a superior product. Imagine how D would be if Walter allowed every idea that came across the newsgroup to be implemented. I hated the idea of unshared-by-default, but now I think it's probably the most important improvement to the language so far. Not by itself, but the things it enables.
 Especially considering your calm, controlled statements
 about Apple products...

Heh, well, like I said my hatred for Apple and Apple products comes from having used them and been around them. I actually *liked* my OSX machine when I first got it. And then it, and the whole Jobs culture, and the way Apple runs their business, successfully turned me against Apple. And now I have this iPhone which, while having even been *useful* when out-of-town when I first got it - due to it essentially being a wirelessly internet-connected PDA - everything else about it just makes me want to smash it into a concrete wall nearly every time I use it. And I've never had that temptation from *any* other device before (hard as that may be to believe ;) )

I think if it didn't have a big apple symbol on the back, you would be less inclined to try and destroy it :) Just my opinion. I have brands that I hate due to prior experience too. I'm sure you would be able to find anyone who *hates* a certain brand of car because they bought a lemon from them at one time, even though statistics show there are *always* some bad apples (no pun intended) in otherwise good products. These can be badly designed single products (*cough* Vista) or simply one instance of a product with defective parts. I think humans have a tendency to put too much emphasis on anecdotal experience rather than scientifically detected trends. And I think the sometimes prohibitive costs of some of these gadgets plays a large part -- you aren't likely to go out and buy another $200 iPhone, for instance, if your previous two broke within a year. Even though most people don't have that experience. -Steve
Sep 24 2012
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Sep 26, 2012 at 09:30:43AM -0700, Sean Kelly wrote:
 On Sep 25, 2012, at 11:11 PM, H. S. Teoh <hsteoh quickfur.ath.cx> wrote:
 
 On Tue, Sep 25, 2012 at 09:42:26PM -0400, Andrei Alexandrescu wrote:
 On 9/25/12 7:24 PM, H. S. Teoh wrote:
 Me too. My wife has FB, and that's good enough for me.
 
 Sad to say, though, I got suckered into signing up for Google+.

No Facebook but Google+? That's it. You're out. Use Go.

Heh heh... I was half-expecting you'd show up in this thread after all the anti-FB sentiment, and sure enough you did. ;-) Seriously, though, no offense intended, but I find FB's privacy policy rather ... lacking for my tastes. That's not to say G+ isn't susceptible to Big Brotherisms, of course, Google being what it is, but at least it gives you the illusion of control, like controlling who sees which posts (which FB imitated after the fact, if you allow me to say so), easier management of friends with the circles system, being able to delete your account data without caveats and jumping through hoops, etc..

Google+ is an opt-out service rather than an opt-in service if you have a gmail account, so that you have G+ isn't surprising.

I don't have a gmail account, though. I hate webmails with a passion.
 Personally, I really like the G+ interface.  It does exactly what I
 want simply and succinctly, which is shocking from a Google product as
 typically, I'd hold up their apps as examples of terrible UI designs.

Haha, yeah, I like the G+ interface better than FB, too.
 Facebook has the community though, so that's what I actually use
 despite not really liking the interface or anything else.

True, but I've always liked rooting for the underdog, so I like to promote non-FB to all my contacts. :) T -- Your inconsistency is the only consistent thing about you! -- KD
Sep 26 2012
prev sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Wed, 26 Sep 2012 10:40:13 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:

 On Wed, Sep 26, 2012 at 09:30:43AM -0700, Sean Kelly wrote:
 
 Google+ is an opt-out service rather than an opt-in service if you
 have a gmail account, so that you have G+ isn't surprising.

I don't have a gmail account, though. I hate webmails with a passion.

I have a gmail account as a backup in case my mail server goes down. I only ever access it via POP3/SMTP. (I probably hate webmails every bit as much as you.) I've never even touched G+.
Sep 26 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Mon, 24 Sep 2012 10:02:57 -0400
"Steven Schveighoffer" <schveiguy yahoo.com> wrote:

 On Fri, 21 Sep 2012 17:22:32 -0400, Nick Sabalausky  
 <SeeWebsiteToContactMe semitwist.com> wrote:
 
 On Fri, 21 Sep 2012 08:24:07 -0400
 "Steven Schveighoffer" <schveiguy yahoo.com> wrote:
 That works too, but doesn't warrant rants about how you haven't
 learned how to use the fucking thing :)

It's *volume* controls, there doesn't need to be *anything* to learn.

OK, so this is what you'd rather have:

 
 No, I think the current design, while not perfect, is *WAY* better
 than a single volume.
 

No, that's not it at all. The problem is the *lack* of any master volume control whatsoever, not the existence of finer-grained volume controls. My walkman example was perhaps misleading.
 1. ringer volume affects all sounds except for music/video/games
 2. Silent switch will set ringer volume to 0 for all sounds except for
 find-my-iphone and alarm clock
 3. If playing a game/video/music, the volume buttons affect that
 volume, otherwise, they affect ringer volume.

 Wow, you are right, three whole rules.

And each one with exceptions, the rules as a whole aren't particularly intuitive.

They aren't? They make complete sense to me. You even admit that it makes sense to have find my iphone play its alerts as loud as possible.

No, only the "find iPhone" one. The iPhone has no fucking idea what environment I'm in. I *definitely* don't want it screeching "PAY ATTENTION TO MEEEE!!!!" indiscriminately whenever it damn well feels like it.
 I contend that if you use the alarm clock for what it is for
 (i.e. waking you up), there is no problem there either.  Those are the
 only exceptions.
 

Keep in mind, when I started talking "alarms" I didn't just mean "alarm clock". Pardon if I'm not completely up on official iTerminology.
 Besides, you don't have to "memorize" these rules, most of the time,
 it is what a normal person would expect.
 

What a normal person expects is for turning down a device's volume to...turn down the device's volume. Or for "silent" to actually *be* silent. What a normal person does not expect is for the device to take the user's commands as mere suggestions.
 And then there's the question of what rules you forgot. I can think
 of one right now:

 4. If you're in the camera app then the volume button takes a
 picture instead of adjusting volume.

I admit, I completely forgot about this one. Simply because I rarely use it :) It was a gimmicky feature, and doesn't hurt anything, but I find it unusable, simply because my natural inclination, being a right-handed person, is to rotate the phone left to go into landscape mode. If I want to use the button, my sequence is to rotate left, then realize the button's on the other side, flip 180 degrees, then realize my finger is in front of the lens, etc. I think this is essentially an orthogonal problem because there is no volume control in camera, and that "feature" doesn't interfere with any other use of the phone. When I read about it though, I thought it was a good idea.

I can never remember which way I'm supposed to tilt the stupid thing for landscape photos. It *shouldn't* matter, but then when you go grab your photos (and videos!!) off the device you find the stupid thing decided to ignore the accelerometer and save them upside-down. As for buttons and such, the Zire 71 had a great design for the camera: Slide the face upward and the normally-protected lens is revealed, along with a "shutter" button (no need for modal "volume button" contrivances), *and* it goes directly into the camera program. So basically a real camera instead of a mere camera "app", always trivially accessible, and always the same easy way. And yea, it's a moving part, but it *still* far outlasted the life of the (unfortunately non-replaceable) battery. *That* was brilliant design. I wish apple had copied it. It didn't have an accelerometer (this *was* a decade ago, after all) so it couldn't determine the current "tilt" and auto-rotate photos accordingly (like the iPhone *should* have been able to do), but it had an easy built-in "rotate photo" feature that even iPhone's built-ins won't do (at least not in any realistically discoverable way).
 That's way more than 1.  I stand corrected :)

Now compare that to a normal device: 1. The volume control adjusts the volume. Gee, how horrible to have one trivially intuitive rule and no exceptions.

Right, and now I'm stuck in "Nick mode", where I'm constantly worrying about and changing the volume to deal with the current situation. No thanks.

No, as I said above.
 Bottom line, they took something trivial, complicated it, and people
 hail them as genius visionaries.

s/complicated/improved/ This isn't really genius, nor is it unprecedented (iPhone is not the first to control ringer and game/music volume separately). It's just common sense.

Ok again, clarification:

Independently controllable ringer/game/music volumes: Good
Complete *lack* of any way to control *overall* volume: Bad

A lot of the videogames I've played have independent adjustable SFX/music/voice volumes. I've even happily made use of that. And I'm damn glad that the TV *still* has a properly working volume control despite that, because I make even more use of that.
 So no, I'm not a MAC person, I'm a Unix/Linux person.  But Mac seems
 to have done Unix better than Linux :)

That was never my impression with macs. For example, I'll take even a mediocre linux GUI over Finder/etc any day. I don't understand why mac...*users*...inevitably have such trouble with the idea that someone could actually dislike it when it's (apparently) so objectively wonderful.
 
 It was an example.  But it was one that I noticed right away coming
 from Ubuntu with Unity.  Unity tries to be very MAC-like,

That's why I switched to Debian for my linux stuff instead of upgrading to the newer Ubuntus, and also why I'm not moving to Gnome 3. Too much Apple-envy for my tastes.
 If I had to summarize why I like MacOS better than windows -- the GUI
 is a complete GUI, and as good as Windows (unlike Linux),

See I disagree with that. I like XP's GUI (with luna disabled), but I hate having to use OSX GUIs and OSX-alike GUIs (such as Win7). Linux GUIs are definitely clunky, but when they're not aping Mac or iOS then I can at least get by with them.
 but it does Unix *SOOO* much better than cygwin.

Cygwin's not even worth considering. As far as I'm concerned it may as well not exist. When I do linux it's either a VM or a physical linux box (connected to my primary system with Synergy+, a software KVM that absolutely rules).
 I feel like I get the best of all worlds.

Yea, but to get that, you have to use OSX as your *primary* environment, and stick with expensive iHardware. Might work for you, but those are all deal-breakers for me.
 
 And don't get me started on the trackpad.  I *hated* using my Dell  
 touchpad on my Linux laptop every time after I had been using my Mac  
 trackpad.
 

I always considered trackpads completely useless until I got my current Asus laptop. It's surprisingly usable in a pinch, and in fact I honestly couldn't believe how much they've improved (or that they even managed to improve at all). And yet I still go for my trackball instead whenever possible because it's sooo much better.
 The one thing I would rip out of OSX and throw against the wall is
 the mail app.  Its interface and experience is awesome.  But it
 frequently corrupts messages and doesn't properly save outgoing
 mail.  Not good for a mail application.
 

I didn't have corruption issues with it, but I did find it to be rather gimped and straight-jacketed much like the rest of the system.
 Interesting that's what you see as the defining point of that
 story :)

It's a story that always did strike me as odd: Here we have a grown man (one who was *well known* to be unstable, asinine, drug-soaked and frankly, borderline megalomaniacal) that's going around throwing tantrums, and largely because he doesn't understand "cover" or "case" or what obviously happens to plastic when you bash keys against it, and it gets interpreted by millions as "Wow, look how great he was!" I don't get it.

Having amassed more money than US treasury, based on his ideas and hard work, seems to suggest he was pretty successful :) Not that I completely equate money with greatness, but if success of a product is measured by how well it sells, then he was very great. Present company notwithstanding, most people like apple products and think they are good/best of breed.

He was a salesman. Their job is to sell people on crap. Successfully unloading broken freezers on eskimos and dog shit to...anyone...isn't really deserving of praise or appreciation or anything but condemnation.
 
 I think if it didn't have a big apple symbol on the back, you would
 be less inclined to try and destroy it :)  Just my opinion.
 

I'm sure most people would assume that, particularly since I dislike something that "everyone knows is undeniably great". I know there's no way I can ever convince anyone of this, but I don't do things backwards like that: I hate apple *because* I don't like their products or their business. The other way around makes absolutely no sense. I'd love for apple to start putting out good stuff because...I *like* good stuff. Hell, I love the Apple II. And I loved Sherlock/Watson (part of what got me to try OSX). If I want to see apple destroyed it's because they keep putting out poorly-designed, overpriced, Orwellian bullshit and instead of dismissing it like in the 90's people are actually praising the shit now that it has a glossy finish and the name "Jobs". Oh, and because it sold well :/...which I always found to be a bizarre reason to appreciate anything. *I* think that people wouldn't be so quick to praise Apple's last decade of products if they didn't have "Steve Jobs has returned!", "Designed by Jobs!" attached. (And the iPhone 5 obviously still has a lot of Jobs legacy, esp since it's basically the 4S with higher specs.)
 I have brands that I hate too due to prior experience too.  I'm sure
 you would be able to find anyone who *hates* a certain brand of car
 because they bought a lemon from them at one time, even though
 statistics show there are *always* some bad apples (no pun intended)
 in otherwise good products.  These can be badly designed single
 products (*cough* Vista) or simply one instance of a product with
 defective parts.  I think humans have a tendency to put too much
 emphasis on anecdotal experience rather than scientifically detected
 trends.  And I think the sometimes prohibitive costs of some of
 these gadgets play a large part -- You aren't likely to go out and
 buy another $200 iPhone, for instance, if your previous two broke
 within a year.  Even though most people don't have that experience.
 

So therefore if someone argues against something popular, then it must be due to such a fallacy as that, because what's popular clearly must be good, right? Because those people who do like it must be liking it for purely objective reasons, right? You're arguing that most people are non-objective. If that's so, then the objective viewpoint would be an unpopular one. Kinda like "Apple products suck". Or is it that the "humans are often non-objective" only applies to negative opinions? People are always being objective when they say something positive?
Sep 24 2012
prev sibling next sibling parent "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Mon, 24 Sep 2012 19:52:15 -0400, Nick Sabalausky
<SeeWebsiteToContactMe semitwist.com> wrote:

 On Mon, 24 Sep 2012 10:02:57 -0400
 "Steven Schveighoffer" <schveiguy yahoo.com> wrote:

 On Fri, 21 Sep 2012 17:22:32 -0400, Nick Sabalausky
 <SeeWebsiteToContactMe semitwist.com> wrote:

 On Fri, 21 Sep 2012 08:24:07 -0400
 "Steven Schveighoffer" <schveiguy yahoo.com> wrote:
 That works too, but doesn't warrant rants about how you haven't
 learned how to use the fucking thing :)

It's *volume* controls, there doesn't need to be *anything* to learn.

OK, so this is what you'd rather have:

 No, I think the current design, while not perfect, is *WAY* better
 than a single volume.

No, that's not it at all. The problem is the *lack* of any master volume control whatsoever, not the existence of finer-grained volume controls. My walkman example was perhaps misleading.

There is a master volume control. It has two volumes, on and off, and it's called the silent switch ;)
 1. ringer volume affects all sounds except for music/video/games
 2. Silent switch will set ringer volume to 0 for all sounds except for
 find-my-iphone and alarm clock
 3. If playing a game/video/music, the volume buttons affect that
 volume, otherwise, they affect ringer volume.

 Wow, you are right, three whole rules.

And each one with exceptions, the rules as a whole aren't particularly intuitive.

They aren't? They make complete sense to me. You even admit that it makes sense to have find my iphone play its alerts as loud as possible.

No, only the "find iPhone" one. The iPhone has no fucking idea what environment I'm in. I *definitely* don't want it screeching "PAY ATTENTION TO MEEEE!!!!" indiscriminately whenever it damn well feels like it.

When does it do that?
 I contend that if you use alarm clock what it is for,
 (i.e. waking you up) there is no problem there either.  Those are the
 only exceptions.

Keep in mind, when I started talking "alarms" I didn't just mean "alarm clock". Pardon if I'm not completely up on official iTerminology.

All other alerts are silenced by the silent switch. I don't even know if that's the correct term for that switch. I just discovered through testing that timer has the same feature as alarm. I find that incorrect. If I have the silent switch enabled, the timer should just vibrate. In fact, I don't think there's a way to make the timer "just vibrate" in any way. That's counter-intuitive and I will agree with you on that one.
 Besides, you don't have to "memorize" these rules, most of the time,
 it is what a normal person would expect.

What a normal person does not expect is for the device to take the user's commands as mere suggestions.

At least in the case of alarm clock, the user has said both "wake me up at this time" and "be silent." Apple chose "wake me up". The alternative is that the phone stays silent, and you don't wake up. Much worse.
 4. If you're in the camera app then the volume button takes a
 picture instead of adjusting volume.

I admit, I completely forgot about this one. Simply because I rarely use it :) It was a gimmicky feature, and doesn't hurt anything, but I find it unusable, simply because my natural inclination, being a right-handed person, is to rotate the phone left to go into landscape mode. If I want to use the button, my sequence is to rotate left, then realize the button's on the other side, flip 180 degrees, then realize my finger is in front of the lens, etc. I think this is essentially an orthogonal problem because there is no volume control in camera, and that "feature" doesn't interfere with any other use of the phone. When I read about it though, I thought it was a good idea.

I can never remember which way I'm supposed to tilt the stupid thing for landscape photos. It *shouldn't* matter, but then when you go grab your photos (and videos!!) off the device you find the stupid thing decided to ignore the accelerometer and save them upside-down.

I have seen strange things there, sometimes a photo/video comes in rotated (I see it pass by the Windows photo import preview), but then when I look at the photo in Explorer, it's correctly rotated. I have not seen it show photos or videos incorrectly rotated once downloaded.
 As for buttons and such, the Zire 71 had a great design for the camera:
 Slide the face upward and the normally-protected lens is revealed,
 along with a "shutter" button (no need for modal "volume button"
 contrivances), *and* it goes directly into the camera program. So
 basically a real camera instead of a mere camera "app",
 always trivially accessible, and always the same easy way. And yea,
 it's a moving part, but it *still* far outlasted the life of the
 (unfortunately non-replaceable) battery. *That* was brilliant design. I
 wish apple had copied it.

Hehe, they have something like that, the photo icon on the lock screen slides up to reveal the photo app. Yeah, it's not a hardware button, but it does sound similar. I have to say, this is one of the better improvements, especially with those of us who have kids.
 It didn't have an accelerometer (this *was* a decade ago, after all) so
 it couldn't determine the current "tilt" and auto-rotate photos
 accordingly (like the iPhone *should* have been able to do), but it had
 an easy built-in "rotate photo" feature that even iPhone's built-ins
 won't do (at least not in any realistically discoverable way).

While viewing a photo, tap the screen to bring up the controls. Click "Edit" (upper right corner), then you can rotate the photo. I don't think you can do the same with a video. And I don't agree that an Edit button on the main photo viewing screen isn't realistically discoverable. I will say though, like any UI, you have to get used to the mechanisms that are standard. One of the things that I didn't know for a while is how to get controls to come up. Generally that's a single tap in the middle of the screen. If you didn't know that, it would be difficult to discover.
 Bottom line, they took something trivial, complicated it, and people
 hail them as genius visionaries.

s/complicated/improved/ This isn't really genius, nor is it unprecedented (iPhone is not the first to control ringer and game/music volume separately). It's just common sense.

Ok again, clarification: Independently controllable ringer/game/music volumes: Good

OK, I get it now.
 Complete *lack* of any way to control *overall* volume: Bad

Well, there is the silent switch. Which is a bit blunt, but it effectively is a "master volume" with two levels.
 So no, I'm not a MAC person, I'm a Unix/Linux person.  But Mac seems
 to have done Unix better than Linux :)

That was never my impression with macs. For example, I'll take even a mediocre linux GUI over Finder/etc any day. I don't understand why mac...*users*...inevitably have such trouble with the idea that someone could actually dislike it when it's (apparently) so objectively wonderful.

Finder could be better, but Nautilus sucks. I'd rather use command line than Nautilus. And actually, I did :) However, I think Finder is only usable once you force it to show you all hidden files. It pisses me off royally when an OS decides I don't know enough to allow me to see hidden files. At least on Windows, that was a setting in advanced options. On MAC, you have to use some obscure commands to enable hidden files, which I thought was pretty lame.
 It was an example.  But it was one that I noticed right away coming
 from Ubuntu with Unity.  Unity tries to be very MAC-like,

That's why I switched to Debian for my linux stuff instead of upgrading to the newer Ubuntus, and also why I'm not moving to Gnome 3. Too much Apple-envy for my tastes.

For my VMWare image for work, I chose Linux Mint with the default GUI, and it works pretty well. I like it better than Unity.
 If I had to summarize why I like MacOS better than windows -- the GUI
 is a complete GUI, and as good as Windows (unlike Linux),

See I disagree with that. I like XP's GUI (with luna disabled), but I hate having to use OSX GUIs and OSX-alike GUIs (such as Win7). Linux GUIs are definitely clunky, but when they're not aping Mac or iOS then I can at least get by with them.

You may misunderstand when I say *complete* GUI, I mean you can do everything with the GUI, and everything is seamless. There is no run "system preferences" for some settings, and "Compiz settings" for others, like in Ubuntu. Same as Windows, one place to find everything -- control panel. The style may not fit your tastes, and I can't really argue that point -- it's your taste that matters to you, not mine. But my point is, it is *functional* and can do everything I need it to.
 I feel like I get the best of all worlds.

Yea, but to get that, you have to use OSX as your *primary* environment, and stick with expensive iHardware. Might work for you, but those are all deal-breakers for me.

It's not what I would have chosen (at first), but I wanted to write iOS apps, and Mac is the only way to do it (at least the complete thing, including submission, but I admit I wasn't aware of marmalade). It's one of those things where I reluctantly bought it, and started using it, then was pleasantly surprised with the UI, and finally addicted. I hope Apple doesn't turn to shit, because I'll be upset if I have to give up this experience. But I must say, the expensive hardware (quad-core i7) kicks the pants off of any other machine I've ever used.
 And don't get me started on the trackpad.  I *hated* using my Dell
 touchpad on my Linux laptop every time after I had been using my Mac
 trackpad.

I always considered trackpads completely useless until I got my current Asus laptop. It's surprisingly usable in a pinch, and in fact I honestly couldn't believe how much they've improved (or that they even managed to improve at all). And yet I still go for my trackball instead whenever possible because it's sooo much better.

No, this is a multi-touch pad, not a synaptics touchpad (on most standard laptops). Way different. The best feature is the 2-finger scroll. Don't know how I lived without that! And I've tried Apple's magic mouse, it sucks. The trackpad is awesome.
 The one thing I would rip out of OSX and throw against the wall is
 the mail app.  Its interface and experience is awesome.  But it
 frequently corrupts messages and doesn't properly save outgoing
 mail.  Not good for a mail application.

I didn't have corruption issues with it, but I did find it to be rather gimped and straight-jacketed much like the rest of the system.

ech, I guess the corruption issues have been happening since OSX 10.6. Many posts in the apple forums. I guess mail doesn't get the attention it needs over at Apple. Come to think of it, iCal kinda sucks too, I could live without that.
 Interesting that's what you see as the defining point of that
 story :)

It's a story that always did strike me as odd: Here we have a grown man (one who was *well known* to be unstable, asinine, drug-soaked and frankly, borderline megalomaniacal) who's going around throwing tantrums, largely because he doesn't understand "cover" or "case" or what obviously happens to plastic when you bash keys against it, and it gets interpreted by millions as "Wow, look how great he was!" I don't get it.

Having amassed more money than the US Treasury, based on his ideas and hard work, seems to suggest he was pretty successful :) Not that I completely equate money with greatness, but if success of a product is measured by how well it sells, then he was very great. Present company notwithstanding, most people like apple products and think they are good/best of breed.

He was a salesman. Their job is to sell people on crap.

Wow, have you ever liked anything in your life? A salesperson's job is to sell a product. Whether that product is good or not certainly helps the sale, and not all salespeople just sell no matter what. The best salesman tells you *not* to buy something because it doesn't fit you. This doesn't work when your job is to sell crappy stuff (you will not end up selling anything).
 Successfully unloading broken freezers on eskimos and dog shit
 to...anyone...isn't really deserving of praise or appreciation or
 anything but condemnation.

Oh, I totally agree. Fuck all those salespeople, I just cut out the middle man and go to dogshitfreezers.com. And they think I'm so stupid, how's that commission check now?
 I think if it didn't have a big apple symbol on the back, you would
 be less inclined to try and destroy it :)  Just my opinion.

I'm sure most people would assume that, particularly since I dislike something that "everyone knows is undeniably great". I know there's no way I can ever convince anyone of this, but I don't do things backwards like that: I hate apple *because* I don't like their products or their business. The other way around makes absolutely no sense.

I think we probably are both a couple of pots calling each other kettles, or... something.
 I'd love for apple to start putting out good stuff because...I *like*
 good stuff. Hell, I love the Apple II. And I loved Sherlock/Watson
 (part of what got me to try OSX). If I want to see apple destroyed it's
 because they keep putting out poorly-designed, overpriced, Orwellian
 bullshit and instead of dismissing it like in the 90's people are
 actually praising the shit now that it has a glossy finish and the
 name "Jobs". Oh, and because it sold well :/...which I always found to
 be a bizarre reason to appreciate anything.

 *I* think that people wouldn't be so quick to praise Apple's last
 decade of products if they didn't have "Steve Jobs has returned!",
 "Designed by Jobs!" attached. (And the iPhone 5 obviously still has a
 lot of Jobs legacy, esp since it's basically the 4S with higher specs.)

I think that's very wrong. My reasons for liking apple products are because they are good products. I can explain my history if you want, but I tend to think you won't believe it. Truth is, people who don't like a brand will find a reason to complain, and people who like a brand will find a reason to forgive. It's the same with D and any other product. By all means, I don't think Apple's products are flawless. Just less flawed.
 I have brands that I hate too due to prior experience too.  I'm sure
 you would be able to find anyone who *hates* a certain brand of car
 because they bought a lemon from them at one time, even though
 statistics show there are *always* some bad apples (no pun intended)
 in otherwise good products.  These can be badly designed single
 products (*cough* Vista) or simply one instance of a product with
 defective parts.  I think humans have a tendency to put too much
 emphasis on anecdotal experience rather than scientifically detected
 trends.  And I think the sometimes prohibitive costs of some of
 these gadgets play a large part -- You aren't likely to go out and
 buy another $200 iPhone, for instance, if your previous two broke
 within a year.  Even though most people don't have that experience.

So therefore if someone argues against something popular, then it must be due to such a fallacy as that, because what's popular clearly must be good, right? Because those people who do like it must be liking it for purely objective reasons, right?

No, that's not what I'm saying. I'm saying basing your perception of a new product on your experience with another product from the same brand is not always objective. And that's not always a bad thing -- there's a reason humans learn from their experience. I never said what's "popular" is good, that's BS. I'm saying past experiences bias our decisions (all of us, myself included). I sure as hell will *never* buy another motorola bluetooth headset again.

But this isn't fashion. People don't buy shit electronics that don't work just because they have a brand name. At least not after two consecutive failures. Look at Microsoft. They had extremely good brand recognition, and a huge market share, as well as being basically the only pre-installed OS on most PCs. Yet, along came Vista, and you saw a huge decline in sales for them, because it *SUCKED*. Brand name doesn't help you if your products suck. However, because of their brand, Vista didn't seem to impact the success of Windows 7 (a great product IMO). So while popularity isn't the cause of success, it certainly is a reflection of it.

But let's face it, popularity is a huge market driver. If people you know like something, you tend to trust their opinion. If people you aren't so fond of like something, you may tend to dislike that thing. Saying you don't like something because it's popular (not saying you are saying that) is *still* an opinion driven by popularity!

For example, if you learned that the new iOS 6 has better integration with facebook, a popular (but I'm pretty sure from past posts I've seen from you, a revolting) service, are you a) less likely to like iPhone (and no, I don't mean facebook like), b) more likely, or c) neutral? If you didn't answer c, then you are letting your bias get in the way. Period. I personally will *never* sign up for facebook (sorry Andrei), and therefore will never use facebook integration on my phone.
But it doesn't make me less likely to like iPhone, because it doesn't impact me at all. Now, if iOS suddenly *required* me to use facebook, that would be a problem for me.
 You're arguing that most people are non-objective. If that's so, then
 the objective viewpoint would be an unpopular one. Kinda like "Apple
 products suck". Or is it that the "humans are often non-objective" only
 applies to negative opinions? People are always being objective when
 they say something positive?

Most people *are* non-objective. It's very difficult to have a truly objective view. And you can't really measure everything objectively, especially with something as broad as intuition or ease-of-use. I just saw this *ridiculously* biased "test" of apple iPhone 5 vs. Samsung Galaxy S III on durability. I bet these people thought they were being truly objective too... http://youtu.be/bLW0HrVeoD8 -Steve
Sep 24 2012
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Mon, 24 Sep 2012 21:52:05 -0400
"Steven Schveighoffer" <schveiguy yahoo.com> wrote:
 
 There is a master volume control.  It has two volumes, on and off, and
 it's called the silent switch ;)
 

Calling that a master volume control is a stretch.
 They aren't?  They make complete sense to me.  You even admit that
 it makes sense to have find my iphone play its alerts as loud as
 possible.

No, only the "find iPhone" one. The iPhone has no fucking idea what environment I'm in. I *definitely* don't want it screeching "PAY ATTENTION TO MEEEE!!!!" indiscriminately whenever it damn well feels like it.

When does it do that?

I thought you were just saying that the iPhone plays its alerts as loud as possible?
 
 I just discovered through testing that timer has the same feature as
 alarm.  I find that incorrect.  If I have the silent switch enabled,
 the timer should just vibrate.
 
 In fact, I don't think there's a way to make the timer "just vibrate"
 in any way.  That's counter-intuitive and I will agree with you on
 that one.
 

Yea, see there's just too much "surprise" involved, IMO.
 I have seen strange things there, sometimes a photo/video comes in
 rotated (I see it pass by the Windows photo import preview), but then
 when I look at the photo in Explorer, it's correctly rotated.
 

I'm looking at the photos on my iPhone through Explorer right now and aside from the screenshots, the majority of them are either sideways or upside-down. The bizarre thing is, when I look at them through "Photos" on the device itself, it actually shows them all correctly. Which means that the device *knows* how they're supposed to be but doesn't bother to actually save them correctly.
 I have not seen it show photos or videos incorrectly rotated once
 downloaded.
 

I just copied all of them to my local machine, and they're still rotated wrong. Makes sense though, I wouldn't expect (or want) a file copy to affect content.
 
 Hehe, they have something like that, the photo icon on the lock screen
 slides up to reveal the photo app.  Yeah, it's not a hardware button,
 but it does sound similar.
 

Doesn't protect the lens though, and it doesn't provide a physical button which would obviate the need to hijack the volume button. (It *is* at least a little better than not being able to access the camera from the lock screen at all.) I can't even tell you how many times I've accidentally gone back to the home screen when trying to take a picture. But I'll at least grant that *that* error was due to me being accustomed to my Zire71 (which, when slid open, has the "shutter" button exactly where the iPhone's home button is).
 I have to say, this is one of the better improvements, especially with
 those of us who have kids.
 

Yea, one-size-fits-all design :/ That said, I do like to use "kids" as an argument for having an OS-level "disable software eject" option for optical drives. ;) "Ok, I'll just leave that to burn..." Walk away. It finishes and ejects. Kid waddles by. "Ohh, a pretty shiny object! Should I eat it or flush it?"
 It didn't have an accelerometer (this *was* a decade ago, after
 all) so it couldn't determine the current "tilt" and auto-rotate
 photos accordingly (like the iPhone *should* have been able to do),
 but it had an easy built-in "rotate photo" feature that even
 iPhone's built-ins won't do (at least not in any realistically
 discoverable way).

While viewing a photo, tap the screen to bring up the controls. Click "Edit" (upper right corner), then you can rotate the photo. Don't think you can do the same with a video. Don't think I agree that an Edit button on the main photo viewing screen is not realistically discoverable.

I don't see any rotate there: http://semitwist.com/download/img/shots/IMG_0859.PNG I just see the "Back" button then...umm "Do a Magic Trick?" (WTF?), then I'm guessing maybe "Anti-Red-Eye", and...ok, I'm pretty sure that last one's crop, I remember seeing it in one or two image editing programs.
 I will say though, like any UI, you have to get used to the mechanisms
 that are standard.  One of the things that I didn't know for a while
 is how to get controls to come up.  Generally that's a single tap in
 the middle of the screen.  If you didn't know that, it would be
 difficult to discover.
 

Android has an actual button for "Settings". Much easier to discover (despite not actually saying "settings" - or anything at all, really). And easier to use since it usually brings up a list of real words unlike the contrived hieroglyphs used throughout most of Android and iOS. Or...at least the older Androids did. The damn newer ones replaced the few buttons they used to have with on-screen touch abominations. At least, for the buttons they didn't eliminate outright in their quest to clone the iPhone misfeature-for-misfeature. The settings button might have been one of the ones they killed off entirely, I don't remember offhand.
 So no, I'm not a MAC person, I'm a Unix/Linux person.  But Mac
 seems to have done Unix better than Linux :)

That was never my impression with macs. For example, I'll take even a mediocre linux GUI over Finder/etc any day. I don't understand why mac...*users*...inevitably have such trouble with the idea that someone could actually dislike it when it's (apparently) so objectively wonderful.

Finder could be better, but Nautilus sucks. I'd rather use command line than Nautilus. And actually, I did :)

I agree Nautilus sucks (and back in the day, it was bloated as hell, too). Best one I've found on Linux is Dolphin, and I'm not real big on that either. Out of all of them, Finder is easily still my least favorite though. I actually *liked* one of the views it had (the multi-column one) until I actually started using it firsthand.
 However, I think Finder is only usable once you force it to show you
 all hidden files.  It pisses me off royally when an OS decides I
 don't know enough to allow me to see hidden files.
 

Yea, "Show hidden files" is one of the first things I do when I install a new OS. And "Show my f*** extensions" on windows.
 It was an example.  But it was one that I noticed right away coming
 from Ubuntu with Unity.  Unity tries to be very MAC-like,

That's why I switched to Debian for my linux stuff instead of upgrading to the newer Ubuntus, and also why I'm not moving to Gnome 3. Too much Apple-envy for my tastes.

For my VMWare image for work, I chose Linux Mint with the default GUI, and it works pretty well. I like it better than Unity.

I don't know what Mint uses, but I always thought Unity was a bit of a misstep for Canonical. It's like Canonical pulling a "Metro".
 If I had to summarize why I like MacOS better than windows -- the
 GUI is a complete GUI, and as good as Windows (unlike Linux),

See I disagree with that. I like XP's GUI (with luna disabled), but I hate having to use OSX GUIs and OSX-alike GUIs (such as Win7). Linux GUIs are definitely clunky, but when they're not aping Mac or iOS then I can at least get by with them.

You may misunderstand when I say *complete* GUI, I mean you can do everything with the GUI, and everything is seamless. There is no run "system preferences" for some settings, and "Compiz settings" for others, like in Ubuntu.

Ok, yea, Linux has always been weak in that regard.
 Same as Windows, one place to find everything -- control panel.
 

Well, sort of... :/
 The style may not fit your tastes, and I can't really argue that
 point -- it's your taste that matters to you, not mine.  But my point
 is, it is *functional* and can do everything I need it to.
 

Right, I get that. Fair enough. My point has been that Mac doesn't work for me.
 But I must say, the expensive hardware (quad-core i7) kicks the pants
 off of any other machine I've ever used.
 

I recently moved from a 32-bit single-core XP to a 64-bit dual-core Win7 (don't remember exactly what CPU, but it's Intel and newer/faster than the Core 2 Duo). Video processing is waaay faster, compiling C++ is slightly faster, and everything else I do is...pretty much the same. All of it already ran fine on the old system, so there's not much left for this one to improve on speed-wise.
 And don't get me started on the trackpad.  I *hated* using my Dell
 touchpad on my Linux laptop every time after I had been using my
 Mac trackpad.

I always considered trackpads completely useless until I got my current Asus laptop. It's surprisingly usable in a pinch, and in fact I honestly couldn't believe how much they've improved (or that they even managed to improve at all). And yet I still go for my trackball instead whenever possible because it's sooo much better.

No, this is a multi-touch pad, not a synaptics touchpad (on most standard laptops). Way different. The best feature is the 2-finger scroll. Don't know how I lived without that!

Multitouch is standard on all laptops these days, including mine. In fact, this does 2-finger scroll, too (I did it just now), and has a bunch of other gestures including 3-finger ones, and all totally configurable. Two-finger scroll is ok, but personally I *much* prefer the "circular"-motion scrolling (forget what they call it) - it's actually just about as good as a scroll wheel.
 And I've tried Apple's magic mouse, it sucks.

Is that the one they had five or ten years ago as a "two-button scroll mouse" but was touch-sensitive instead of having actual mouse buttons? I've only come across one person who ever liked it - and it definitely wasn't me.
 The trackpad is awesome.
 

I've used it. It's awesome, just like mine, in the sense that it's a trackpad that's not 100% useless. I'm still not a fan of them though.
 
 ech, I guess the corruption issues have been happening since OSX 10.6.
 Many posts in the apple forums.
 
 I guess mail doesn't get the attention it needs over at Apple.
 
 Come to think of it, iCal kinda sucks too, I could live without that.
 

Apple is very A.D.D. They catch a whiff of something they want to do, go nuts with it (but not to the point of feature-completeness), and more or less forget about everything else. To this day, iTunes still can't play Vorbis like, uhh, every other music player in the world. iTunes used to be their pride and joy, now they just dick around with its button placements once in a while and use it as a convenient dumping ground for anything involving their handheld devices - ie, their latest interest.
 He was a salesman. Their job is to sell people on crap.

Wow, have you ever liked anything in your life?

Megaman's pretty fucking awesome ;) And I loved PalmOS. And Apple II, like I said. Got to drive a Saab 9-3 Turbo once, that was pretty cool. If I started talking about music, movies, videogames and TV shows I liked, I'd be here all night ;)
 Successfully unloading broken freezers on eskimos and dog shit
 to...anyone...isn't really deserving of praise or appreciation or
 anything but condemnation.

Oh, I totally agree. Fuck all those salespeople, I just cut out the middle man and go to dogshitfreezers.com. And they think I'm so stupid, how's that commission check now?

Heh heh :)
 I think if it didn't have a big apple symbol on the back, you would
 be less inclined to try and destroy it :)  Just my opinion.

I'm sure most people would assume that, particularly since I dislike something that "everyone knows is undeniably great". I know there's no way I can ever convince anyone of this, but I don't do things backwards like that: I hate apple *because* I don't like their products or their business. The other way around makes absolutely no sense.

I think we probably are both a couple of pots calling each other kettles, or... something.

Probably ;)
 *I* think that people wouldn't be so quick to praise Apple's last
 decade of products if they didn't have "Steve Jobs has returned!",
 "Designed by Jobs!" attached. (And the iPhone 5 obviously still has
 a lot of Jobs legacy, esp since it's basically the 4S with higher
 specs.)

I think that's very wrong. My reasons for liking apple products are because they are good products. I can explain my history if you want, but I tend to think you won't believe it.

I was just (un)cleverly turning it around there. Didn't actually mean it. Although I don't doubt there *are* people like that out there...
 No, that's not what I'm saying.  I'm saying basing your perception of
 a new product on your experience with another product from the same
 brand is not always objective.  And that's not always a bad thing --
 there's a reason humans learn from their experience.  I never said
 what's "popular" is good, that's BS.  I'm saying past experiences
 bias our decisions (all of us, myself included).  I sure as hell will
 *never* buy another motorola bluetooth headset again.
 

 
 Saying you don't like something because it's popular (not saying you
 are saying that) is *still* an opinion driven by popularity!

All fine, but I don't see how any of it leads you to conclude that I'm dismissing Apple products on account of them being from Apple.
 
 I personally will *never* sign up for facebook (sorry Andrei), and

Bizarrely enough, I likely will, but only because these multiplayer-enabled mobile games (I'm working on one - hopefully it won't suck *too* bad) apparently need (for some definition of "need" ;) ) to support facebook-based login these days. So I gotta be able to test it. Will never use it for anything beyond that though.
Sep 24 2012
"Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Tue, 25 Sep 2012 01:55:54 -0400, Nick Sabalausky  
<SeeWebsiteToContactMe semitwist.com> wrote:

 On Mon, 24 Sep 2012 21:52:05 -0400
 "Steven Schveighoffer" <schveiguy yahoo.com> wrote:
 There is a master volume control.  It has two volumes, on and off, and
 it's called the silent switch ;)

Calling that a master volume control is a stretch.

Yeah I know. But it's about the closest thing you can get to a physical master volume on the iPhone.
 They aren't?  They make complete sense to me.  You even admit that
 it makes sense to have find my iphone play its alerts as loud as
 possible.

No, only the "find iPhone" one. The iPhone has no fucking idea what environment I'm in. I *definitely* don't want it screeching "PAY ATTENTION TO MEEEE!!!!" indiscriminately whenever it damn well feels like it.

When does it do that?

I thought you were just saying that the iPhone plays its alerts as loud as possible?

The only alert which is not played at the set ringer volume that I know of is the find-my-iphone alert (which I think you agree makes sense). All the other alerts (alarm, message notification, timer expired, etc.) play at the ringer volume.
 I just discovered through testing that timer has the same feature as
 alarm.  I find that incorrect.  If I have the silent switch enabled,
 the timer should just vibrate.

 In fact, I don't think there's a way to make the timer "just vibrate"
 in any way.  That's counter-intuitive and I will agree with you on
 that one.

Yea, see there's just too much "surprise" involved, IMO.

To me, that is not a critical issue. I've had an iPhone since June of 2010, and I didn't even realize this until now (and I use my iPhone for pretty much everything). But if you are *looking* for problems, this certainly was not as well thought out as the other sounds.
 I have seen strange things there, sometimes a photo/video comes in
 rotated (I see it pass by the Windows photo import preview), but then
 when I look at the photo in Explorer, it's correctly rotated.

I'm looking at the photos on my iPhone through Explorer right now and aside from the screenshots, the majority of them are either sideways or upside-down.

Wait, did you *download* them? Or are you just browsing via the USB cable? When you download them via the camera import feature of Windows (I think XP has that), it corrects the rotation. I have no idea why it waits until then.
 The bizarre thing is, when I look at them through "Photos" on the
 device itself, it actually shows them all correctly. Which means that
 the device *knows* how they're supposed to be but doesn't bother to
 actually save them correctly.

I don't think the photos are meant to be browsed that way. See this thread here https://discussions.apple.com/message/16514340#16514340 I think explorer must not be using the rotation field (seems odd), but the camera import rotates the picture on import.
 Doesn't protect the lens though, and it doesn't provide a physical
 button which would obviate the need to hijack the volume button. (It
 *is* at least a little better than not being able to access the camera
 from the lock screen at all.)

Weren't you the one advocating a case? And the hijacking of the button, as I said before, is a misfeature. It doesn't really hurt, but it's too poorly positioned to be useful IMO.
 I have to say, this is one of the better improvements, especially with
 those of us who have kids.

Yea, one-size-fits-all design :/

Oh, it was annoying when the kids were doing something cute, and you have to type in your code to unlock, then go find the camera app, wait for it to load (I think they actually improved the load time too) and by that time, it was over. One of the perks of having a camera on your phone is you always have it with you.
 That said, I do like to use "kids" as an argument for having an
 OS-level "disable software eject" option for optical drives. ;)  "Ok,
 I'll just leave that to burn..." Walk away. It finishes and ejects. Kid
 waddles by. "Ohh, a pretty shiny object! Should I eat it or flush it?"

Or use it as a frisbee :) Then you can damage two things at once!
 While viewing a photo, tap the screen to bring up the controls.  Click
 "Edit" (upper right corner), then you can rotate the photo.  Don't
 think you can do the same with a video.

 Don't think I agree that an Edit button on the main photo viewing
 screen is not realistically discoverable.

I don't see any rotate there: http://semitwist.com/download/img/shots/IMG_0859.PNG I just see the "Back" button then...umm "Do a Magic Trick?" (WTF?), then I'm guessing maybe "Anti-Red-Eye", and...ok, I'm pretty sure that last one's crop, I remember seeing it in one or two image editing programs.

The "back button" is the rotate. I agree it's not very well drawn, it should be more like a quarter-turn and less snazzy (just a quarter circle arrow would be better). The button on the top that says "Cancel" is actually the back button. Besides, I don't think rotating that picture will help much ;)
 Android has an actual button for "Settings". Much easier to discover
 (despite not actually saying "settings" - or anything at all, really).
 And easier to use since it usually brings up a list of real words
 unlike the contrived hieroglyphs used throughout most of Android and
 iOS.

 Or...at least the older Androids did. The damn newer ones replaced the
 few buttons it used to have with on-screen touch abominations. At
 least, for the buttons they didn't eliminate outright in their quest to
 clone the iPhone misfeature-for-misfeature. The settings button might
 have been one of the ones they killed off entirely, I don't remember
 offhand.

My brother has an android with dedicated buttons, but they are part of the touch screen (they aren't displayed, they are inlays, but are part of the whole touch sensitive screen). They malfunction sometimes, and it's annoying. He wishes they were real buttons. I can't deny that the home button is overused for things, and it would make more sense to have a dedicated menu button. It's not like there's no room on the bottom of the phone...
 So no, I'm not a MAC person, I'm a Unix/Linux person.  But Mac
 seems to have done Unix better than Linux :)

That was never my impression with macs. For example, I'll take even a mediocre linux GUI over Finder/etc any day. I don't understand why mac...*users*...inevitably have such trouble with the idea that someone could actually dislike it when it's (apparently) so objectively wonderful.

Finder could be better, but Nautilus sucks. I'd rather use command line than Nautilus. And actually, I did :)

I agree Nautilus sucks (and back in the day, it was bloated as hell, too). Best one I've found on Linux is Dolphin, and I'm not real big on that either. Out of all of them, Finder is easily still my least favorite though. I actually *liked* one of the views it had (the multi-column one) until I actually started using it firsthand.

That is the default, and I absolutely love it. However, only with my trackpad, where I can easily scroll horizontally. I really would like to have a folder view on the left though, for copying files like in Windows. You know how you can open the directory you want to copy from, then go find the folder you want to copy to, but not open it, and just drag the files? That is perfect. With Finder, I have to drag the file to "Documents" shortcut, then wait until it pulls that up, then go navigating through subdirectories while holding down the button.
 However, I think Finder is only usable once you force it to show you
 all hidden files.  It pisses me off royally when an OS decides I
 don't know enough to allow me to see hidden files.

Yea, "Show hidden files" is one of the first things I do when I install a new OS. And "Show my f*** extensions" on windows.

Hells yeah! It always strikes me as comical that MS created that "feature" and it created a whole class of openme.txt.exe viruses. Yet instead of just removing that misfeature, they built legions of extra CPU-consuming mail filtering and anti-virus software to prevent people from having any files with multiple extensions, only to piss off people who tried to use .tar.gz files :) It never seemed to bother *anyone* in DOS or Windows 3.1, I think that was a huge design mistake.
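As an aside on why that misfeature bites multi-extension names: most single-extension tooling (Windows included) treats only the final suffix as "the" extension. A quick neutral illustration using Python's standard library (not anything from Windows itself):

```python
import os.path

# Single-extension logic only ever sees the last suffix...
root, ext = os.path.splitext("backup.tar.gz")
print(root, ext)  # backup.tar .gz

# ...so with "hide known extensions" on, "openme.txt.exe" displays as
# the innocuous-looking "openme.txt" even though it is an executable.
shown = os.path.splitext("openme.txt.exe")[0]
print(shown)  # openme.txt
```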
 It was an example.  But it was one that I noticed right away coming
 from Ubuntu with Unity.  Unity tries to be very MAC-like,

That's why I switched to Debian for my linux stuff instead of upgrading to the newer Ubuntus, and also why I'm not moving to Gnome 3. Too much Apple-envy for my tastes.

For my VMWare image for work, I chose Linux Mint with the default GUI, and it works pretty well. I like it better than Unity.

I don't know what Mint uses, but I always thought Unity was a bit of a misstep for Canonical. It's like Canonical pulling a "Metro".

I liked unity at first, and I like the design of it. But it doesn't work right, because apps are not built to use it. That was my point. Looked it up, Mint has two shells, MATE and Cinnamon. I think I settled on MATE, the start menu was like the best of both XP and Win7. See here: http://www.linuxmint.com/pictures/screenshots/katya/menu.png That's a couple versions back, but start menu looks reasonably the same.
 But I must say, the expensive hardware (quad-core i7) kicks the pants
 off of any other machine I've ever used.

I recently moved from a 32-bit single-core XP to a 64-bit dual-core Win7 (don't remember exactly what CPU, but it's Intel and newer/faster than the Core 2 Duo). Video processing is waaay faster, compiling C++ is slightly faster, and everything else I do is...pretty much the same. All of it already ran fine on the old system, so there's not much left for this one to improve on speed-wise.

I think my old laptop was centrino with "hyperthreading" (it was that old). It doesn't hurt that I doubled my mac to 8GB of ram, especially in the VMWare dept. :)
 No, this is a multi-touch pad, not a synaptics touchpad (on most
 standard laptops).  Way different. The best feature is the 2-finger
 scroll.  Don't know how I lived without that!

Multitouch is standard on all laptops these days, including mine. In fact, this does 2-finger scroll, too (I did it just now), and has a bunch of other gestures including 3-finger ones, and all totally configurable.

Oh, that's cool! I didn't know. I know that I've seen HPs where the "buttons" were just drawings on the touchpad. But they sucked, didn't always work right. And then if you wanted to hold down the button while scrolling, didn't work at all. Must be they got it right by copying apple :) Or maybe apple copied them, I don't know.
 Two-finger scroll is ok, but personally I *much* prefer the
 "circular"-motion scrolling (forget what they call it) - it's actually
 just about as good as a scroll wheel.

What I like about the 2-finger scroll is that it goes all 4 directions, it's like panning. And I don't have to move my finger to a certain spot.
 And I've tried Apple's magic mouse, it sucks.

Is that the one they had five or ten years ago as a "two-button scroll mouse" but was touch-sensitive instead of having actual mouse buttons? I've only come across one person who ever liked it - and it definitely wasn't me.

It has no buttons or visible delineations, you have to just "know" that if you click on a certain spot (and you better not have your other fingers down) that it will be the correct mouse button. Not my cup of tea. If you swipe one finger, it scrolls. My biggest gripe is that it was very uncomfortable to hold. And this was after using it for about 30 minutes. Compare that to the trackpad where you click with two fingers down for right-click. In fact, the trackpad supports way more gestures, and gives you a large surface to use. When I do get an iMac (need to save up some more), I will be opting for the trackpad instead of the MM.
 No, that's not what I'm saying.  I'm saying basing your perception of
 a new product on your experience with another product from the same
 brand is not always objective.  And that's not always a bad thing --
 there's a reason humans learn from their experience.  I never said
 what's "popular" is good, that's BS.  I'm saying past experiences
 bias our decisions (all of us, myself included).  I sure as hell will
 *never* buy another motorola bluetooth headset again.

 Saying you don't like something because it's popular (not saying you
 are saying that) is *still* an opinion driven by popularity!

All fine, but I don't see how any of it leads you to conclude that I'm dismissing Apple products on account of them being from Apple.

Your posts seem to always include a general disdain of all things Apple (frankly, all things "new technology"). It's hard to separate the cause from the effect... I apologize if I was too assuming.
 I personally will *never* sign up for facebook (sorry Andrei), and

Bizarrely enough, I likely will, but only because these multiplayer-enabled mobile games (I'm working on one - hopefully it won't suck *too* bad) apparently need (for some definition of "need" ;) ) to support facebook-based login these days. So I gotta be able to test it.

Hehe. I almost always immediately delete an app that won't let me proceed without logging in to facebook. There is no reason for that, unless it, um... is the facebook app :) There is a general assumption by many applications/websites that *everyone* uses facebook. I refuse to pretend that I have 800 "friends". I have friends, I know who they are. I don't need to know what's going on with them every second of the day. Besides, my wife is on facebook, and if any important news happens via FB, she'll tell me :) -Steve
Sep 25 2012
Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
 I'm looking at the photos on my iPhone through Explorer right now
 and aside from the screenshots, the majority of them are either
 sideways or upside-down.

Wait, did you *download* them? Or are you just browsing via the USB cable? When you download them via the camera import feature of Windows (I think XP has that), it corrects the rotation. I have no idea why it waits until then.
 The bizarre thing is, when I look at them through "Photos" on the
 device itself, it actually shows them all correctly. Which means
 that the device *knows* how they're supposed to be but doesn't
 bother to actually save them correctly.

I don't think the photos are meant to be browsed that way. See this thread here https://discussions.apple.com/message/16514340#16514340 I think explorer must not be using the rotation field (seems odd), but the camera import rotates the picture on import.

Ugh, yea, exactly. I can't do a normal file copy? I can't email them? The way apple handled photo orientation is just terrible. Like the one guy said in there, at the very *least*, they should have allowed an option to actually store them rotated since there's obviously so damn much that doesn't support that metadata flag.
 
 The "back button" is the rotate.  I agree it's not very well drawn,
 it should be more like a quarter-turn and less snazzy (just a quarter
 circle arrow would be better).
 

Ugh, geez... I miss words. I didn't mind non-word toolbar buttons on the desktop, because then you have the concept of "hover" which will trigger the words until you learn the icons (and then get annoyed because you usually can't turn off the tooltips once you no longer need them...). Plus toolbar buttons on the desktop aren't so damn abstract. Android's guilty of that too, their wordless icons are just getting more and more abstract (and thus, obscure) with each new version. Yea, they're prettier now, but who cares about pretty when it's not usable?
 Besides, I don't think rotating that picture will help much ;)
 

Very true ;)
 
 My brother has an android with dedicated buttons, but they are part
 of the touch screen (they aren't displayed, they are inlays, but are
 part of the whole touch sensitive screen).

Yea, that's how mine is, too (Nexus S 4G). I prefer the older ones with physical buttons, but at least this is better than the latest ones which merely draw it on the screen (which means they can decide to make the "standard" buttons disappear on a whim...gee, great...) I think they're just trying to "be like apple" and minimize the amount of anything tactile. I can't think of any other sane reason for it.
 Finder is easily still my
 least favorite though. I actually *liked* one of the views it had
 (the multi-column one) until I actually started using it firsthand.

That is the default, and I absolutely love it. However, only with my trackpad, where I can easily scroll horizontally.

With KatMouse (http://ehiti.de/katmouse/ - I won't use Windows without it), I can easily scroll horizontally by pointing at the horiz scroll bar and using the scroll wheel. Not as handy as on a trackpad, but at least I don't have to be using a trackpad to do it ;) I do wish tilting scroll wheels were more common though.
 
 I really would like to have a folder view on the left though, for
 copying files like in Windows.  You know how you can open the
 directory you want to copy from, then go find the folder you want to
 copy to, but not open it, and just drag the files?  That is perfect.
 With Finder, I have to drag the file to "Documents" shortcut, then
 wait until it pulls that up, then go navigating through
 subdirectories while holding down the button.
 

Yea, for me, that was actually one of my biggest issues with finder. I rely on the dual-pane too much to give it up. Finder's tree-view-with-folders-AND-files is sometimes nice (It's common on Linux file managers, too), but without that extra folder-tree on the left, I found I just couldn't be using it for everyday work. It's a non-starter for me.
 Yea, "Show hidden files" is one of the first things I do when I
 install a new OS. And "Show my f*** extensions" on windows.

Hells yeah! It always strikes me as comical that MS created that "feature" and it created a whole class of openme.txt.exe viruses. Yet instead of just removing that misfeature, they built legions of extra CPU-consuming mail filtering and anti-virus software to prevent people from having any files with multiple extensions, only to piss off people who tried to use .tar.gz files :)

Yup :) They seem to think their "Type" field solves the issue, and maybe that works fine for average Joes, but I'm not an average Joe and I don't want to be playing guessing games about "Ok, what's the Microsoft term for a .XXXXX file?" Or "What the hell file type is a 'Configuration settings' again?" And then there's different file types that will have the *same* Microsoft "Type".
 It never seemed to bother *anyone* in DOS or Windows 3.1, I think
 that was a huge design mistake.
 

To be fair though, back then, there were fewer idiots using computers. And I'm not entirely joking. I mean think about, say, the 80's. Who were the most common people using computers? There were plenty of exceptions, but mostly it was people who knew what they were doing. That's because the people who *didn't* know what they were doing would either not buy one, or just let it collect dust. Now everyone uses them, including the "dummies" who previously avoided them.
 Looked it up, Mint has two shells, MATE and Cinnamon.  I think I
 settled on MATE, the start menu was like the best of both XP and Win7.
 

Hmm, I've never heard of either of those. Looking it up, apparently MATE is a fork of the now-abandoned GNOME 2. Yea, that's not too bad. I used GNOME 2 and found it occasionally trying to be more mac-like than I would have preferred, but it wasn't bad overall (although the taskbar seemed a little buggy - it only ever used about 50% of the space available, weird). And it certainly beat the hell out of KDE 4 (even the so-called "good" versions of KDE4 stink) and what I've seen of GNOME 3.
 But I must say, the expensive hardware (quad-core i7) kicks the
 pants off of any other machine I've ever used.

I recently moved from a 32-bit single-core XP to a 64-bit dual-core Win7 (don't remember exactly what CPU, but it's Intel and newer/faster than the Core 2 Duo). Video processing is waaay faster, compiling C++ is slightly faster, and everything else I do is...pretty much the same. All of it already ran fine on the old system, so there's not much left for this one to improve on speed-wise.

I think my old laptop was centrino with "hyperthreading" (it was that old).

Hah! My desktop (ie my primary system until a few months ago) pre-dated hyperthreading. <g> My previous *laptop* was...I think it was about a P2 or so, definitely <1GHz, and it had a PCMCIA slot, parallel/serial ports and even a (yes, *a*) USB 1.1 port ;) It was awesome because it could read DVDs (but couldn't burn anything) and had an *active matrix* display, wow! That laptop's been completely useless for many years now, of course.
 I know that I've seen HPs where
 the "buttons" were just drawings on the touchpad.

Yeech!
 Two-finger scroll is ok, but personally I *much* prefer the
 "circular"-motion scrolling (forget what they call it) - it's
 actually just about as good as a scroll wheel.

What I like about the 2-finger scroll is that it goes all 4 directions, it's like panning. And I don't have to move my finger to a certain spot.

I'm not sure this one does that (although in some apps I can do that by middle-dragging on my trackball - I wish it was all though). But, and maybe I'm being paranoid, I have a very strong suspicion that limitation is due to an apple patent. They *have* been very patent-litigious in recent years, and it doesn't seem like the kind of feature anyone would actually have any trouble getting right.
 And I've tried Apple's magic mouse, it sucks.

Is that the one they had five or ten years ago as a "two-button scroll mouse" but was touch-sensitive instead of having actual mouse buttons? I've only come across one person who ever liked it - and it definitely wasn't me.

It has no buttons or visible delineations, you have to just "know" that if you click on a certain spot (and you better not have your other fingers down) that it will be the correct mouse button. Not my cup of tea. If you swipe one finger, it scrolls. My biggest gripe is that it was very uncomfortable to hold.

Yea, sounds like the one I tried years ago at some apple store. IIRC, you couldn't even rest your fingers on the mouse because that would be a "click". You had to hover *over* the "button".
 
 Your posts seem to always include a general disdain of all things
 Apple (frankly, all things "new technology").

Apple and I do seem to have very different tastes in general.
 It's hard to separate
 the cause from the effect...
 
 I apologize if I was too assuming.
 

Fair enough.
 I personally will *never* sign up for facebook (sorry Andrei), and

Bizarrely enough, I likely will, but only because these multiplayer-enabled mobile games (I'm working on one - hopefully it won't suck *too* bad) apparently need (for some definition of "need" ;) ) to support facebook-based login these days. So I gotta be able to test it.

Hehe. I almost always immediately delete an app that won't let me proceed without logging in to facebook. There is no reason for that, unless it, um... is the facebook app :)

Yea. In our case, we're aiming for the "Words with Friends" model where "Log in with Facebook" is merely an option.
 There is a general
 assumption by many applications/websites that *everyone* uses
 facebook.

I know! And it's not just software, it's all business in general. They noticed that it's popular so they think that means "nearly everyone uses it" when the reality is that even as popular as it is, it's still only a *minority* of internet users. Same with twitter.
 I refuse to pretend that I have 800 "friends".  I have
 friends, I know who they are.  I don't need to know what's going on
 with them every second of the day.
 

Yea, I think at the very least they really botched the wording on that. That's been a pretty common jab made towards facebook. And I can't disagree with it.
 Besides, my wife is on facebook, and if any important news happens
 via FB, she'll tell me :)
 

Heh. Similar situation here. My brother and sister are both on it, so I'll catch wind of any family news from FB. My parents, like me, aren't on FB either so they get the same benefit, too, although they usually hear much sooner than I do ;)
Sep 25 2012
"Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Tue, 25 Sep 2012 18:59:55 -0400, Nick Sabalausky  
<SeeWebsiteToContactMe semitwist.com> wrote:


 I don't think the photos are meant to be browsed that way.  See this
 thread here https://discussions.apple.com/message/16514340#16514340

 I think explorer must not be using the rotation field (seems odd),
 but the camera import rotates the picture on import.

Ugh, yea, exactly. I can't do a normal file copy? I can't email them? The way apple handled photo orientation is just terrible. Like the one guy said in there, at the very *least*, they should have allowed an option to actually store them rotated since there's obviously so damn much that doesn't support that metadata flag.

I think you can do a normal file copy. But it seems like many photo viewing applications (including Explorer apparently, which surprises me) do not support the rotation data, so your file copy won't look right in those. There are probably some applications that support the rotation flag. It kind of makes sense to me. You are getting a raster image from the camera, and obviously the hardware doesn't do the rotation, so to be as efficient as possible, instead of doing a transformation in software, which might also require moving the data to places it doesn't have to go, it simply stores a few bits differently in the image. From that thread, I could see that Apple is not the first nor the only one to do that -- cameras which have accelerometers also do it.
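For the curious, those "few bits" are the standard EXIF Orientation tag (0x0112), a small value in a TIFF-style directory embedded in the JPEG: a viewer that honors it rotates the pixels on display, and one that ignores it (apparently Explorer here) shows the photo sideways. Here's a minimal pure-Python sketch of pulling that tag out of a raw TIFF/EXIF block -- just an illustration of the format, not anything from Apple's or Microsoft's code:

```python
import struct

# EXIF Orientation values -> transform a viewer must apply to display upright
ORIENTATION_MEANING = {
    1: "upright (no transform needed)",
    3: "rotate 180",
    6: "rotate 90 CW",
    8: "rotate 90 CCW",
}

def read_orientation(tiff):
    """Return the EXIF Orientation (tag 0x0112) from a TIFF-format EXIF
    block, or None if the tag is absent."""
    byte_order = {b"II": "<", b"MM": ">"}.get(tiff[:2])  # Intel vs Motorola
    if byte_order is None:
        raise ValueError("not a TIFF/EXIF block")
    magic, ifd_offset = struct.unpack(byte_order + "HI", tiff[2:8])
    if magic != 42:
        raise ValueError("bad TIFF magic number")
    # IFD0 is a 2-byte entry count followed by 12-byte entries
    (count,) = struct.unpack_from(byte_order + "H", tiff, ifd_offset)
    for i in range(count):
        entry = ifd_offset + 2 + i * 12
        tag, typ, n = struct.unpack_from(byte_order + "HHI", tiff, entry)
        if tag == 0x0112:  # Orientation: type SHORT, value inlined in entry
            (value,) = struct.unpack_from(byte_order + "H", tiff, entry + 8)
            return value
    return None
```

In a real JPEG this TIFF block sits inside an APP1 segment; libraries like Pillow will both read the tag and bake the rotation into the pixels (`ImageOps.exif_transpose`), which is presumably what the Windows camera-import step is doing under the hood.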
 The "back button" is the rotate.  I agree it's not very well drawn,
 it should be more like a quarter-turn and less snazzy (just a quarter
 circle arrow would be better).

Ugh, geez... I miss words. I didn't mind non-word toolbar buttons on the desktop, because then you have the concept of "hover" which will trigger the words until you learn the icons (and then get annoyed because you usually can't turn off the tooltips once you no longer need them...).

From my software design class in college, I learned that pictures are actually better *if* they are obviously intuitive. For example, take a large room with 10 light switches. What is easier to understand, a bank of 10 light switches with each one having a label of what it is, or a layout of the room with a light switch placed at the location on the map that it controls in the room? If it can't be obviously intuitive, then use words. This feature can be obviously intuitive. A rectangle on its side, with a rounded arrow rotating to a rectangle on its bottom would be obvious and require no words, for anyone in any language. It's just a failure on whoever designed that icon, and I think it should be fixed.
 Plus toolbar buttons on the desktop aren't so damn abstract. Android's
 guilty of that too, their wordless icons are just getting more and more
 abstract (and thus, obscure) with each new version. Yea, they're
 prettier now, but who cares about pretty when it's not usable?

Abstract/obscure is the *wrong* way to go with icons. Not all operations are easy to make into an icon. But then you will probably have the aesthetics dept screaming at you if you made a toolbar with half icons and half words :)
 My brother has an android with dedicated buttons, but they are part
 of the touch screen (they aren't displayed, they are inlays, but are
 part of the whole touch sensitive screen).

Yea, that's how mine is, too (Nexus S 4G). I prefer the older ones with physical buttons, but at least this is better than the latest ones which merely draw it on the screen (which means they can decide to make the "standard" buttons disappear on a whim...gee, great...) I think they're just trying to "be like apple" and minimize the amount of anything tactile. I can't think of any other sane reason for it.

I can't understand the lack of love for physical buttons these days. There are some things that need real buttons.
 I do wish tilting scroll wheels were more common though.

I had one of those. The issue is, the software has to support it. Not all do.
 Yea, "Show hidden files" is one of the first things I do when I
 install a new OS. And "Show my f*** extensions" on windows.

Hells yeah! It always strikes me as comical that MS created that "feature" and it created a whole class of openme.txt.exe viruses. Yet instead of just removing that misfeature, they built legions of extra CPU-consuming mail filtering and anti-virus software to prevent people from having any files with multiple extensions, only to piss off people who tried to use .tar.gz files :)

Yup :) They seem to think their "Type" field solves the issue, and maybe that works fine for average Joes, but I'm not an average Joe and I don't want to be playing guessing games about "Ok, what's the Microsoft term for a .XXXXX file?" Or "What the hell file type is a 'Configuration settings' again?" And then there's different file types that will have the *same* Microsoft "Type".

No, it's not that! Just *SHOW THE EXTENSION*. I don't understand how they think people's brains are so fragile that they wouldn't be able to handle seeing the extensions. It's like Microsoft thought that was an ugly wart and fought to cover it up at all costs -- including spawning viruses.
 And I'm not entirely joking. I mean think about, say, the 80's. Who
 were the most common people using computers? There were plenty of
 exceptions, but mostly it was people who knew what they were doing.
 That's because the people who *didn't* know what they were doing would
 either not buy one, or just let it collect dust. Now everyone uses them,
 including the "dummies" who previously avoided them.

I actually don't think that is the case. There seems to be this common view that people who aren't computer savvy need icons and GUIs and whatever to be able to use them. If you want to see proof that this is false, go to any Sears store, and buy something, then watch the salesperson (whom I don't consider a tech guru) breeze through the terminal-powered curses interface to enter your order -- using F keys and everything else. I think tech-unsavvy people just take more training, but they certainly can use any interface you give them.
 What I like about the 2-finger scroll is that it goes all 4
 directions, it's like panning.  And I don't have to move my finger to
 a certain spot.

I'm not sure this one does that (although in some apps I can do that by middle-dragging on my trackball - I wish it was all though). But, and maybe I'm being paranoid, I have a very strong suspicion that limitation is due to an apple patent. They *have* been very patent-litigious in recent years, and it doesn't seem like the kind of feature anyone would actually have any trouble getting right.

Meh, if Apple wants to sue someone like HP over PC features, I'm sure HP can shoot back. I don't think that's the issue. Remember, most companies hold patents so that they don't get sued, not so that they sue others. It's probably more of the case that Windows apps just aren't built to handle it.
 Yea, sounds like the one I tried years ago at some apple store. IIRC,
 you couldn't even rest your fingers on the mouse because that would be a
 "click". You had to hover *over* the "button".

Hm... I don't think it has to be configured that way. The whole mouse "clicks" when you push it. But you could configure just a tap on the surface to be a click. In any case, not worth having IMO.
 There is a general
 assumption by many applications/websites that *everyone* uses
 facebook.

I know! And it's not just software, it's all business in general. They noticed that it's popular so they think that means "nearly everyone uses it" when the reality is that even as popular as it is, it's still only a *minority* of internet users. Same with twitter.

I begrudgingly signed up for twitter, so I could send a message to a radio host (who is a twitter fanatic, so I knew he would read it). Since then, I've tweeted a few things, but I'm not crazy about it. At least you aren't expected to "follow" everyone you met for 5 minutes. -Steve
Sep 26 2012
prev sibling next sibling parent Sean Kelly <sean invisibleduck.org> writes:
On Sep 26, 2012, at 7:44 AM, Steven Schveighoffer <schveiguy yahoo.com> wrote:

 On Tue, 25 Sep 2012 18:59:55 -0400, Nick Sabalausky wrote:

 Ugh, yea, exactly. I can't do a normal file copy? I can't email them?
 The way apple handled photo orientation is just terrible. Like the
 guy said in there, at the very *least*, they should have allowed an
 option to actually store them rotated since there's obviously so damn
 much that doesn't support that metadata flag.

 I think you can do a normal file copy. But it seems like many photo
 software does not support the rotation data, so your file copy won't
 look right on those.

 There are probably some applications that support the rotation flag.

 It kind of makes sense to me. You are getting a raster image from the
 camera. To be as efficient as possible, instead of doing a
 transformation in software, which might also require moving the data
 to places it doesn't have to go, it simply stores a few bits different
 in the image.

 From that thread, I could see that Apple is not the first nor only [...]

I think you're talking about the EXIF Orientation tag. Picasa used to use this and other flags so when an image was saved, instead of rewriting the image itself it would attach a bunch of EXIF tags to say how the viewer should display the image. But enough viewers ignored the flags that Picasa added an "export" option to rewrite the actual image as desired sans tags. Browsers seem to ignore these tags as well for some reason, so fixing the display of the image may have to happen at the server side or CSS has to be used to tell the browser how to orient the image. In short, it's kind of a bad situation despite EXIF having been around for ages now.
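For concreteness, here is a minimal pure-Python sketch of what those Orientation values mean: each of the eight EXIF values names a fixed flip/rotation that a compliant viewer must apply to the stored pixels before display. A toy list-of-rows "image" stands in for real pixel data; a real tool would of course go through an image library rather than this.

```python
# Sketch of the EXIF Orientation tag (0x0112) semantics.
# Each value 1-8 names a fixed transform a viewer must apply to the
# stored pixels to show the image upright.

def rotate90cw(img):
    """Rotate a list-of-rows image 90 degrees clockwise."""
    return [list(col) for col in zip(*img[::-1])]

def mirror_h(img):
    """Flip the image left-to-right."""
    return [row[::-1] for row in img]

# Orientation value -> transforms to apply, in order, for display.
EXIF_DISPLAY_OPS = {
    1: [],                                   # already upright
    2: [mirror_h],                           # mirrored horizontally
    3: [rotate90cw, rotate90cw],             # upside down
    4: [rotate90cw, rotate90cw, mirror_h],   # mirrored vertically
    5: [rotate90cw, mirror_h],               # transposed
    6: [rotate90cw],                         # needs 90 degrees CW
    7: [mirror_h, rotate90cw],               # transversed
    8: [rotate90cw, rotate90cw, rotate90cw], # needs 90 degrees CCW
}

def display(img, orientation):
    """Return the pixels as a tag-aware viewer would render them."""
    for op in EXIF_DISPLAY_OPS[orientation]:
        img = op(img)
    return img
```

A viewer that ignores the tag just shows the raw rows, which is exactly the sideways-photo problem above; "baking in" the rotation (Picasa's export, or storing them rotated as suggested) amounts to running `display` once and writing the result back with Orientation reset to 1.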
Sep 26 2012
prev sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Wed, 26 Sep 2012 10:44:34 -0400
"Steven Schveighoffer" <schveiguy yahoo.com> wrote:

 On Tue, 25 Sep 2012 18:59:55 -0400, Nick Sabalausky wrote:
 I miss words. I didn't mind non-word toolbar buttons on the desktop,
 because then you have the concept of "hover" which will trigger the
 words until you learn the icons (and then get annoyed because you
 usually can't turn off the tooltips once you no longer need
 them...).

 From my software design class in college, I learned that pictures are actually better *if* they are obviously intuitive.

Oh I agree. It's just that iOS/Android devices seem to be breeding grounds for obscure non-intuitive ones, from what I've seen. :(
 
 I can't understand the lack of love for physical buttons these
 days. There are some things that need real buttons.
 

That was one of the first things that turned me off of the iOS devices (and the Android ones which followed the same design): My Zire 71 had dedicated directional controls, a dedicated shutter button, and four other re-purposable buttons that defaulted to opening commonly-used programs. I *never* found the existence of those to be a downside. And the only reason the lack of a physical keyboard didn't bother me was that Graffiti was actually pretty good (although not *nearly* as good as the original Graffiti, which was killed off thanks to the patent trolls at Xerox). I didn't want to, and saw absolutely no legitimate reason to, give that up.

What Palm did to make a good handheld interface is start with a touchscreen, which is totally general and repurposable, but frequently less than ideal, AND THEN identify the most common needs, both high- and low-level (ex: "use as a camera", "directional controls") and provide better, even if less general, controls for those. That's how you make a good convergence device. Apple, OTOH, only did that first half and then just stopped. And then Android came and copied that, but with four physical (and non-directional) buttons, which they ended up getting rid of anyway.
 I do wish tilting scroll wheels were more common though.

 I had one of those. The issue is, the software has to support it. Not all do.

Yea, the damn chicken-and-egg. People need to be willing to just break that damn cycle.
 Yea, "Show hidden files" is one of the first things I do when I
 install a new OS. And "Show my f*** extensions" on windows.

 Hells yeah! It always strikes me as comical that MS created that "feature" and it created a whole class of openme.txt.exe viruses. Yet instead of just removing that misfeature, they built legions of extra CPU-consuming mail filtering and anti-virus software to prevent people from having any files with multiple extensions, only to piss off people who tried to use .tar.gz files :)

 Yup :) They seem to think their "Type" field solves the issue, and maybe that works fine for average Joes, but I'm not an average Joe and I don't want to be playing guessing games about "Ok, what's the Microsoft term for a .XXXXX file?" Or "What the hell file type is a 'Configuration settings' again?" And then there's different file types that will have the *same* Microsoft "Type".

 No, it's not that! Just *SHOW THE EXTENSION*. I don't understand how they think people's brains are so fragile that they wouldn't be able to handle seeing the extensions. It's like Microsoft thought that was an ugly wart and fought to cover it up at all costs -- including spawning viruses.

Well, even when showing the extension, you still can't get an actual *column* of the extensions, nor can you sort by them. You can only do that with the type. I did manage to find an add-on that adds an Ext column, but it has a couple little issues (conflict with TortoiseSVN, and ZIP files are always-at-the-top together with folders, instead of sorted alphabetically). I've been meaning to make a little tool I can periodically run that just goes into the registry and sets all of the "Type" names to be the same as the extension they're mapped to. It should be pretty trivial, I just haven't gotten around to it yet.
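That "little tool" is simple enough to sketch. Here's a hypothetical Python version of just the naming logic (the helper name and the sample mappings are made up for illustration): given the extension-to-ProgID associations, compute a Type display name per ProgID so that sorting by Type sorts by extension. On a real Windows box the read and write-back would go through the `winreg` module against HKEY_CLASSES_ROOT.

```python
# Hypothetical sketch of the "make Type equal the extension" tool.
# Pure logic only: given extension -> ProgID associations (which on
# Windows live under HKEY_CLASSES_ROOT), compute the display name that
# Explorer's "Type" column should show for each ProgID.

def type_names_from_extensions(ext_to_progid):
    """Return {progid: display_name}, e.g. {'.txt': 'txtfile'} -> {'txtfile': 'TXT File'}."""
    names = {}
    for ext, progid in ext_to_progid.items():
        # ".tar.gz" -> "TAR.GZ File"; note that when several extensions
        # share one ProgID (e.g. .jpg/.jpeg), the last one seen wins,
        # which is the same many-to-one wrinkle complained about above.
        names[progid] = ext.lstrip(".").upper() + " File"
    return names

# The actual write-back would be roughly (untested, Windows-only):
#   import winreg
#   with winreg.OpenKey(winreg.HKEY_CLASSES_ROOT, progid, 0,
#                       winreg.KEY_SET_VALUE) as key:
#       winreg.SetValueEx(key, None, 0, winreg.REG_SZ, display_name)
```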
 I actually don't think that is the case.  There seems to be this
 common view that people who aren't computer savvy need icons and GUIs
 and whatever to be able to use them.  If you want to see proof that
 this is false, go to any Sears store, and buy something, then watch
 the salesperson (whom I don't consider a tech guru) breeze through
 the terminal-powered curses interface to enter your order -- using F
 keys and everything else.
 
 I think tech-unsavvy people just take more training, but they
 certainly can use any interface you give them.
 

They're definitely able to, yes (and you provide a great example), but the problem is they're not willing to unless it's mandatory to get their wage/salary.
 What I like about the 2-finger scroll is that it goes all 4
 directions, it's like panning.  And I don't have to move my finger
 to a certain spot.

 I'm not sure this one does that (although in some apps I can do that by middle-dragging on my trackball - I wish it was all though). But, and maybe I'm being paranoid, I have a very strong suspicion that limitation is due to an apple patent. They *have* been very patent-litigious in recent years, and it doesn't seem like the kind of feature anyone would actually have any trouble getting right.

 Meh, if Apple wants to sue someone like HP over PC features, I'm sure HP can shoot back. I don't think that's the issue. Remember, most companies hold patents so that they don't get sued, not so that they sue others.

Usually yes, but look at Jobs's famed "going thermo-nuclear" on Android. He was out for blood (figuratively, at least I assume), and it's pretty well established that they were taking their patents on the offensive, contrary to usual industry practice.
 It's probably more of the case that Windows apps just aren't built
 to handle it.
 

I don't think they need to be. As long as they're using the standard OS scrollbars, that's all the driver needs to hook into. It might not work in skinned apps, but that's the inherent problem with skinned apps anyway, they can't always work right.
 Yea, sounds like the one I tried years ago at some apple store.
 IIRC, you couldn't even rest your fingers on the mouse because that
 would be a "click". You had to hover *over* the "button".

 Hm... I don't think it has to be configured that way. The whole mouse "clicks" when you push it. But you could configure just a tap on the surface to be a click. In any case, not worth having IMO.

It's possible we might not even be talking about the same mouse. Like I said, the one I used was at least five years ago.
 
 I begrudgingly signed up for twitter, so I could send a message to a
 radio host (who is a twitter fanatic, so I knew he would read it).
 
 Since then, I've tweeted a few things, but I'm not crazy about it.
 At least you aren't expected to "follow" everyone you met for 5
 minutes.
 

Before I finally switched to linode and VPS-hosting, my last shared web-host at one point made the decision to *only* send out important maintenance notices to twitter. Nevermind that they actually *had* support contact emails from me and the rest of their users. Nevermind that not everyone's interested in following twitter, contrary to popular belief. That pissed me off. Actually that was one of the first in a series of blunders that marked their downfall, IMO. About a year later they went under and got bought out. Right as I was fed up and about to switch to a linode VPS anyway :)
Sep 26 2012