Sounds like Ford's "smart pointer" all over again. What was the smart
pointer theorem? Something like, every programming environment will
eventually degrade in design until it sucks so hard that somebody
invents the smart pointer?
They hint at it, don't they... "since C++ doesn't have reference counts..."
Erm, reference counts? I thought everyone knew that reference counts were insufficiently general to be included as a central language construct. That's downright weird...
I'm more upset by the fact that they felt they needed to introduce language extensions for something that is clearly a system API.
Yes, I know they said "Without a pervasive approach such as GCD, even the best-written application cannot deliver the best possible performance, because it doesn’t have full insight into everything else happening in the system" but I'm just not buying it.
Not to mention, by making the language itself proprietary, they're making it even more difficult to write portable applications. Sure, sometimes you want your software to have "Macness" but much of the time you just want it to work everywhere.
*sigh* Are there C bindings for Qt yet?
I'm more upset by the fact that they felt they needed to introduce language
extensions for something that is clearly a system API.
I was thinking that too. That was the first thing that struck me, but the way it's written, it's not actually part of the language, it's functions that are part of the spec, and the spec is more than just about the language.
Sort of. They're making it kind of grey. The code blocks are part of the language; the threading functions are part of the GCD spec. If they say the GCD spec is an extension to C, that's not exactly entirely true.
It's more semantics for an argument I guess.
But the reality is if you implement GCD (or they make it part of the C standard) then it might as well be considered part of the language since you HAVE to do it to be compliant.
But it's not a syntax thing per se.
Right now, the point is moot. No other system supports this, so if you build something that requires it -- and regardless of the syntax used to implement it, it is inherently so different from traditional threading that you are not going to write a program that can run with either GCD or pthreads -- then you are limited to OS X.
there are a number of techniques to write portable code, even GUI code, that runs on all of Linux, OS X, and Windows. GCD is not one of them.
Tue Sep 22 2009 16:58:36 EDT from fleeb @ Uncensored
User expectations are also a bit different now. No one complains anymore when an application appears on the screen with slightly different chrome than "native" apps.
cut 'n' paste won't work as expected (sometimes...),
focus management inside of the application won't always work as expected...
and so on and so on.
Java apps still feel like aliens. Everywhere. (At least everywhere I've met them...)
Long time Pythoneer Tim Peters succinctly channels the BDFL's
guiding principles for Python's design into 20 aphorisms, only 19
of which have been written down.
The Zen of Python
Beautiful is better than ugly.
Explicit is better than implicit.
Simple is better than complex.
Complex is better than complicated.
Flat is better than nested.
Sparse is better than dense.
Special cases aren't special enough to break the rules.
Although practicality beats purity.
Errors should never pass silently.
Unless explicitly silenced.
In the face of ambiguity, refuse the temptation to guess.
There should be one-- and preferably only one --obvious way to do it.
Although that way may not be obvious at first unless you're Dutch.
Now is better than never.
Although never is often better than *right* now.
If the implementation is hard to explain, it's a bad idea.
If the implementation is easy to explain, it may be a good idea.
Namespaces are one honking great idea -- let's do more of those!
Saw that recently.
Subject: howto start a flamewar
except about editors:
target some version control system written in C, moan about its code flexibility and design errors, rewrite it in Perl because "the code is more readable," and call it cupt instead of apt.
The Zen of Python
Too bad none of that applies to Python.
Forced indentation isn't new to programming, if I remember correctly.
I think some other languages had the same quirk. I think I remember some assemblers that used to be a little twitchy about indentation, and I'm not entirely sure COBOL is free of it.
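For anyone who hasn't run into it, a tiny sketch of what "forced indentation" means in Python: the indentation *is* the block delimiter, so moving a line left or right changes the program's meaning. The function and variable names here are made up for illustration.

```python
def classify(numbers):
    evens = []
    for n in numbers:
        if n % 2 == 0:
            evens.append(n)   # indented under the 'if': runs only for even n
    return evens              # dedented back to function level

# Dedenting the append one level would attach it to the 'for' instead
# of the 'if' -- same tokens, different program. A brace language would
# need an explicit { } change to express that.

print(classify([1, 2, 3, 4]))  # [2, 4]
```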
I'm familiar with the arguments in favor of indentation. And, I understand them. They're valid arguments. I just don't agree with them.
I just take each language with its idiosyncrasies and move on. If the language can accomplish my goals, I live with its problems.
Except Perl, of course.