Objective-C vs. Java


ezkcdude
2005-11-15, 18:19
What are the pros and cons of developing with either of these on the Mac? It seems to me that unless there is a huge performance hit, Java serves as the cross-platform lingua franca these days. For those of us who don't work for Apple or write Mac OS X-specific applications, especially with the looming Intel change, are there any reasons to do Cocoa programming or learn Objective-C? Comments appreciated.

byzantium
2005-11-15, 18:52
I think your question needs to be a bit more specific, but here goes with an answer:

Well the first thing is that you can't develop Cocoa applications with Java - it's been officially dropped as a supported language.

Cocoa is a set of frameworks (and related tools such as Interface Builder) that lets you build applications rapidly. Technologies like Core Data are built on advanced technology like EOF; Java has similar equivalents such as Hibernate, but they're not part of the J2SE specification.

Building native-looking apps with Java isn't that easy. If you need to build desktop applications quickly for the Mac, and you want a first-class native interface as well as access to OS X technologies (QuickTime, Core Image, Core Data, Spotlight), then you really have little choice but to use Cocoa.
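For a flavor of what that looks like, here's a rough, Foundation-only sketch of Objective-C/Cocoa code (just the standard manual retain/release conventions of the moment; nothing in it is specific to any real project):

#import <Foundation/Foundation.h>

int main(void)
{
    // Every Cocoa program of this era sets up an autorelease pool by hand.
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // NSArray, NSString and NSLog all come from Foundation, the non-GUI half of Cocoa.
    NSArray *frameworks = [NSArray arrayWithObjects:@"Foundation", @"AppKit", @"Core Data", nil];
    NSLog(@"Some of the Cocoa frameworks: %@", frameworks);
    NSLog(@"That array holds %lu names.", (unsigned long)[frameworks count]);

    [pool release];
    return 0;
}

Add Interface Builder and the AppKit classes on top of that and you get the rapid-development story I'm describing.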

Java (IMO) is better suited for server-side applications (like web application servers) than for desktop apps.

chucker
2005-11-15, 20:21
The verdict is rather simple really, with over 2 million results for Java sucks (http://www.google.ca/search?q=java+sucks) and only about 32,000 for Objective-C sucks (http://www.google.ca/search?q=objective-c+sucks). IOW, the difference is: they both suck, but one less so.

SCNR. ;)

Brad
2005-11-15, 20:27
SCNR?

Of course, the fact that practically no one outside of Mac OS X developers uses Objective-C would taint your results, chucker. :p

the looming Intel change...will have no effect on Objective-C. Unless you are writing low-level code that is dependent on byte order, you won't be in any way affected by this transition.

The biggest advantage of Java is that it runs (generally) anywhere.
The second biggest advantage of Java is that it has a HUGE library of existing classes.
The biggest disadvantage of Java is that it's slow.
The second biggest disadvantage of Java is that it's more difficult to manipulate memory directly (see the sketch below).

I don't have enough experience with Objective-C to make a judgement call there.
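To show what I mean by manipulating memory directly, here's a rough sketch of the kind of plain C code you can drop into any C (or Objective-C) source file — the Java language itself gives you no way to write this:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    // Grab a raw 1 KB buffer and treat it as bytes.
    unsigned char *buffer = malloc(1024);
    if (buffer == NULL)
        return 1;

    memset(buffer, 0xFF, 1024);        // fill it byte by byte
    unsigned char *p = buffer + 512;   // pointer arithmetic straight into the middle
    *p = 0x2A;                         // poke a single byte directly

    printf("byte at offset 512 = 0x%02X\n", buffer[512]);

    free(buffer);                      // you, not a garbage collector, decide when it dies
    return 0;
}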

Wyatt
2005-11-15, 20:30
SCNR?


Come on, Brad! Google is your friend. :D

SCNR = Sorry, Could Not Resist

Brad
2005-11-15, 20:36
Yeah, I Googled it, but I was hoping someone would have a wittier response to it. :) Something like:

Stressed Crabs Never Rest
Strange Communists Nuked Russia
See? Can't Negotiate with a Republican
Stupid, Crazy, Nonsensical, and Ridiculous
Slap Children, Not Royalty
Something Cold Nudged my Rib

chucker
2005-11-15, 20:39
Of course, the fact that practically no one outside of Mac OS X developers uses Objective-C would taint your results, chucker. :p

I was, as the "SCNR" and the wink smilie after that should suggest, not being serious at all. Obviously, both Objective-C and Java have legitimate reasons to exist.

The biggest advantage of Java is that it runs (generally) anywhere.
The second biggest advantage of Java is that it has a HUGE library of existing classes.
The biggest disadvantage of Java is that it's slow.
The second biggest disadvantage of Java is that it's more difficult to manipulate memory directly.

Yup. Although on the fourth one, it could be argued that it's an advantage. I.e., developers don't have to worry about memory as much as they do with C derivatives.

Kickaha
2005-11-15, 20:52
The verdict is rather simple really, with over 2 million results for Java sucks (http://www.google.ca/search?q=java+sucks) and only about 32,000 for Objective-C sucks (http://www.google.ca/search?q=objective-c+sucks). IOW, the difference is: they both suck, but one less so.

SCNR. ;)

And 514,000 for C++ sucks. :D

Obviously, Obj-C is teh w1n! ;)

Huh, I really would have thought it would be much higher for C++. *shrug*

The issue with Obj-C outside of the Mac isn't the language - gcc has native support for Obj-C by default in the normal distros. It's that *Cocoa* is what people use; Obj-C is just sort of a byproduct of that decision, and Cocoa isn't available anywhere else. GNUstep is getting there, but it's still kind of primitive in many ways. Okay, a lot of ways. (But at only 15k hits for 'gnustep sucks', apparently it's better than even Obj-C. ;) )

But, all that being said, Obj-C is up there with Smalltalk in my personal shrine to programming languages.

ezkcdude
2005-11-15, 21:44
To sum up, it would seem that it's only useful to learn Objective-C if you're going to use Cocoa and write OS X-only applications. If anyone else has a different opinion, speak now or forever hold your piece (sic).

Kickaha
2005-11-15, 21:46
Or unless you want to learn how to grok OO really well. Only Smalltalk is really better for that. C++? Pah. Bastard child. Java? Feh. Developmentally slow child. Go for the gold standard, learn Smalltalk... or, if you want to actually do something fun with it, learn Obj-C. ;)

Remember, languages come and go, but the basic principles are forever.

Well, except FORTRAN. We're still dealing with that albatross.

chucker
2005-11-15, 22:22
Well, there are various bindings for writing Cocoa code in other languages, such as RubyCocoa.

Kickaha
2005-11-15, 22:35
Indeed, and the Python-Cocoa binding, PyObjC, is *great*.

I <3 Python.

Wickers
2005-11-15, 22:38
I <3 Python.

:)

:cool:

ShadowOfGed
2005-11-15, 23:54
The second biggest advantage of Java is that it has a HUGE library of existing classes.

I really beg to differ here. The Java class library, if you want my opinion (I know you didn't ask :p), is a huge mess. There seems to be a lot of redundant functionality, with ClassNamesThatAreAboutFifteenThousandMilesLong. Not to mention there's not always true "backwards compatibility" between major revisions of Java, which seems to defeat the purpose of Java having a stable class library anyway.

And a lot of the buzzword technologies that surround Java? Also seems like a huge mess. I test J2EE apps that run on WebSphere. It drives me absolutely nuts. Maybe I'm a tad biased, but still... :\

I like these other languages because they don't force an enormous set of libraries on anyone; a given user has only the libraries s/he needs. I find Python and Ruby particularly interesting because they're two of the other truly object-oriented languages (where everything is an object, versus C++ where that's not always true).

Thus my anti-Java bias makes me vote for Objective-C. :) :p

AsLan^
2005-11-16, 00:31
I use Java for writing programs and I enjoy it.

I like the wealth of help available for Java, and I like the Java API: after you learn to navigate the spec, it is ridiculously easy to find a class that does what you want, and sometimes a tutorial to go with it.

I don't have any problem with the speed of Java. Little slowdowns here and there can be sorted out by refining your code.

I haven't tried Objective-C yet, but I will someday. It just seems like a waste to me to write a program that can only work on one platform. If I write a program in C or Java, it can at least be compiled on another OS or architecture, possibly with a little debugging. If I write a Cocoa program, well, it's Mac-only, and the code will not be reusable.

I can see using Objective-C/Cocoa for apps that depend on a technology only present in OS X. But for an app that doesn't access any technologies integrated into the operating system, IMHO, Java is a better choice.

Kickaha
2005-11-16, 00:45
But for an app that doesn't access any technologies integrated into the operating system, IMHO, Java is a better choice.


If your app doesn't access any technologies in the OS, then you're free to use ANY language, not just language du jour. In that case, why not use the language that is nicest? :)

Seriously, think about your argument... if you're writing for the Mac, then you're using the OS. If you're writing to Windows, then you're using the OS. If you're writing to KDE or Gnome, then you're using 'the OS'. The only time you *WON'T* be using technologies provided for you, is when you write everything from scratch yourself, in which case you have complete freedom to choose the language. In such cases, going with the herd is about the dumbest thing you could do.

Now what I *think* you meant to say was something along the lines of "If your app isn't using any SPECIFIC technologies that are wedded to THAT platform, then use a cross-platform set of libraries." The languages are all cross-platform - Obj-C compiles just spiffy on any machine gcc installs on, as does C, C++ or Java.

AsLan^
2005-11-16, 00:56
Now what I *think* you meant to say was something along the lines of "If your app isn't using any SPECIFIC technologies that are wedded to THAT platform, then use a cross-platform set of libraries." The languages are all cross-platform - Obj-C compiles just spiffy on any machine gcc installs on, as does C, C++ or Java.

I guess that's something like what I meant to say...

As for Obj-C, isn't Cocoa the only decent API? I saw the screenshots from GNUstep, and I don't think it's quite there yet.

Kickaha
2005-11-16, 01:11
Pretty much, yeah. If you're going to use Obj-C, Cocoa is head and shoulders above anything else... then again, I happen to think it's head and shoulders above most other solutions for any language.

You have to consider, though, that most custom apps are written in a very crude way, and that the cost-effectiveness of buying more hardware to make code development (and above all MAINTENANCE) more efficient is a no-brainer... but one that people seem to balk at.

Silly people.

ezkcdude
2005-11-16, 01:11
The languages are all cross-platform - Obj-C compiles just spiffy on any machine gcc installs on, as does C, C++ or Java.


True, the gcc compiler could be used to create "cross-platform" programs, but it is not optimized for each platform. For example, if you want to make the fastest C program on an Intel processor, you wouldn't use the gcc compiler, you'd use Borland or Visual C++.


If your app doesn't access any technologies in the OS, then you're free to use ANY language, not just language du jour. In that case, why not use the language that is nicest?

This is not practical real-world advice. While it might be a nice academic exercise to use your preferred language, in most corporations, you write code in the language that is "du jour".

Kickaha
2005-11-16, 01:12
Which is, I'm sorry, idiotic. You use the language that's best suited for the task, or you're just wasting your company's money.

It's not an academic exercise, it's how the real world works once you get past the Java=kewl level. Get some experience in industry, and you'll find that there are the most *bizarre* languages out there, finely tuned for particular needs.

Quick, what's the language that still has the most functioning and currently maintained lines of code?

ezkcdude
2005-11-16, 01:23
Cobol?

AsLan^
2005-11-16, 01:24
It's not an academic exercise, it's how the real world works once you get past the Java=kewl level.

This is my one problem with Java: the attitude against Java programmers!

It's like we are second-class citizens of geekdom...

Kickaha
2005-11-16, 01:31
This is my one problem with Java: the attitude against Java programmers!

It's like we are second-class citizens of geekdom...

Sorry, wasn't meant as a slam against you or ezkcdude, what I meant to say was that once you get past a certain "C++ or Java" mentality, you realize that there are literally hundreds of languages out there, most of which have been designed to solve *specific* problems, where the general classes of languages simply don't work well. It's just something that takes exposure over time.

And yes, it's COBOL. Latest estimate I saw had over 70 *BILLION* lines of code. C is second, C++ or Java distantly behind. I think FORTRAN still has the edge over Java, actually.

The point is that there are a lot more languages out there that you need to consider to do your company and your project justice. Many times, C++, Java, C# etc, will be just fine - but you have to know when to break free of the herd.

Heck, want to know what company ruled the Fortune 100 custom-app roost from 1988-1996? NeXT. Companies bought closed proprietary hardware at several times the cost of x86 boxes for their own internal custom apps. Why? Because the development and maintenance costs were orders of magnitude lower. Custom apps, not commercial shrink-wrap apps, still comprise the vast majority of the software systems in existence.

Fun Fact: Guess how much of the average software system's total cost, from idea to termination of the project, is spent getting 1.0 out the door? Think of all the blood, sweat and tears that go into getting a product shipped for the first time.

Try 15%.

85% of the total cost is just maintenance. Anything you can do to help alleviate that will *swamp* the cost of buying new computers or even training folks on a new language or library, if those solutions are that much better. And at 85%, you don't have to be *vastly* better, just incrementally.

If you want to make yourself employable, learn programming language theory, not programming languages. You'll be able to jump from solution to solution effortlessly, and learn a new language in a few hours. My first Python app, after spending about a day learning the ropes, is now about 55,000 lines. Same design, same architecture, same basic algorithms.

chucker
2005-11-16, 01:55
True, the gcc compiler could be used to create "cross-platform" programs, but it is not optimized for each platform. For example, if you want to make the fastest C program on an Intel processor, you wouldn't use the gcc compiler, you'd use Borland or Visual C++.

Or icc, for that matter. ;)

AsLan^
2005-11-16, 01:56
You are of course correct: programming theory will get you a lot further than knowledge of a specific language.

But the thread is "Objective-C vs. Java", so I was arguing my case for Java.

Although I took offense at your comment, don't worry about it, I'm over it :) That being said, I often see derogatory comments online about Java programmers. Perhaps it's because many "Java programmers" are new to the fold and their code is poorly written (from a lack of experience, no doubt), which is further exacerbated by the JVM, which historically has had poor performance.

I'm not sure if you've read this article or not, but it was on Slashdot a few weeks ago; it's about the industry-wide dumbing down of programmers. I enjoyed the article.

Does Visual Studio rot the mind ? (http://charlespetzold.com/etc/DoesVisualStudioRotTheMind.html)

Kickaha
2005-11-16, 02:11
*twitch* Oh dear god, I'm with the author, IntelliSense is evil incarnate.

Unit testing? Forget it. You can't write the test before the code.

He's absolutely right, it enforces a particular way of coding, and worse, of *thinking*... in the small. Tiny little details instead of the larger issues and concepts, the *abstractions* you *should* be thinking about. Maybe Windows will finally collapse under its own stupidity now. :)

To be honest, I think the smackdowns against Java programmers you see are mostly because new programmers run across Java and think it's the KEWLEST TH!NG EVAH and then proceed to try to convert everyone else... when the rest of us have been there, done that, and moved on already. (You have no idea how tired I got of pointing out to zealots of the bytecode system that they were just UCSD P-codes resurrected. Those were in, um, 1974? Not. New.)

Java is a good little language. I like the language, for the most part, even though I think they made some bad decisions here and there. Or, at least, questionable ones. I'm not overly fond of the libraries, since they were modeled after NeXTstep, but didn't have the advantage of a richly dynamic underlying language, so some *ahem* compromises were made. Overall though, the libraries are livable. The *BYTECODES* however... Gosling & Co should be strung up, shot, drawn and quartered, buggered fatally by camels, and boiled in greasy geek hair oil for those. They should have stuck with the original vision for those: toasters. (No, I'm not kidding. Look up Project Oak sometime.)

Unch
2005-11-16, 05:20
I found Obj-C's syntax to be rather confusing compared to Java's. But that might be because I went from C -> a little C++ -> Java.

Dr. Kickaha, please could you pass on your views about Unit testing to my Lecturers? I'm fed up with this stupid buzzword-filled eXtreme Programming crap :-(

And before I forget...

10 PRINT "BASIC is teh r00lz!!!"
20 GOTO 10

ezkcdude
2005-11-16, 09:25
One question nobody has really answered is how Obj-C and Java compare in performance on the Mac. On PCs, Java used to be much slower than C, for example, but now the performance difference is not that great (~5-10%, I have heard). Is it more than that on OS X?

Kickaha
2005-11-16, 09:54
I found Obj-C's syntax to be rather confusing compared to Java's. But that might be because I went from C -> a little C++ -> Java.

It's modeled after Smalltalk, where you have the idea of 'message passing' instead of 'function calling'. If you think about it, having the two use different syntaxes in Obj-C means that you *ALWAYS* know, at a glance, whether you're dealing with an actual object, or a struct/union from C. Not so in C++, where the hoops required to parse things correctly make life harder not only for the human, but for the compiler. If you have to have a mixed OO/procedural model, IMO it's best to have them distinct ala Obj-C.

But it is quite a bit different looking to the novice. :)
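A tiny hedged sketch of what I mean — the message send and the plain C function call sit side by side, and you can't mistake one for the other:

#import <Foundation/Foundation.h>
#include <math.h>

int main(void)
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // Message passing: [receiver selector:argument ...] -- unmistakably an object at work.
    NSString *greeting = [NSString stringWithFormat:@"sqrt(2) is about %f", sqrt(2.0)];

    // ...whereas sqrt() above, and the C-style format string, are ordinary procedural territory.
    NSLog(@"%@ (%lu characters)", greeting, (unsigned long)[greeting length]);

    [pool release];
    return 0;
}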

Dr. Kickaha, please could you pass on your views about Unit testing to my Lecturers? I'm fed up with this stupid buzzword-filled eXtreme Programming crap :-(

You misunderstood me... IntelliSense *prevents* you from doing unit testing, and that's a bad thing. Unit testing is a *GREAT* thing in many cases, and an important tool in your toolbox.

Like any tool though, it can be overused, and you'll see folks touting them as the latest silver bullet.

And before I forget...

10 PRINT "BASIC is teh r00lz!!!"
20 GOTO 10

Naw.

10 POKE(RND(255) * 16384, RND(254) * 255)
20 GOTO 10

;)

bassplayinMacFiend
2005-11-16, 11:11
Naw.

10 POKE(RND(255) * 16384, RND(254) * 255)
20 GOTO 10

;)

Ahhh, reminds me of the good ole days of typing in pages of code from the latest Antic, Compute! or Byte magazine. Of course, BASIC was dog slow so you'd type in code that looked something like:

10 FOR X = 16384 TO 32768
20 READ D : POKE X, D
30 NEXT X
40 DATA 234, 120, 149, 123, 362, ...
50 DATA ...

and on and on, basically POKEing an ASM program into memory, then passing control to location 16384. Talk about debugging fun too!

ShadowOfGed
2005-11-16, 11:51
Ahhh, reminds me of the good ole days of typing in pages of code from the latest Antic, Compute! or Byte magazine. Of course, BASIC was dog slow so you'd type in code that looked something like:

10 FOR X = 16384 TO 32768
20 READ D : POKE X, D
30 NEXT X
40 DATA 234, 120, 149, 123, 362, ...
50 DATA ...

and on and on, basically POKEing an ASM program into memory, then passing control to location 16384. Talk about debugging fun too!

So that's what POKE does. I never used it, I was always a very simple BASIC guy. Um... does that count as redundancy? :lol: ;)

Unch
2005-11-16, 14:46
It's modeled after Smalltalk, where you have the idea of 'message passing' instead of 'function calling'. If you think about it, having the two use different syntaxes in Obj-C means that you *ALWAYS* know, at a glance, whether you're dealing with an actual object, or a struct/union from C. Not so in C++, where the hoops required to parse things correctly make life harder not only for the human, but for the compiler. If you have to have a mixed OO/procedural model, IMO it's best to have them distinct ala Obj-C.

But it is quite a bit different looking to the novice. :)


You mean n00b? ;)

I think what threw me was all the stuff in Apple's Dev Docs about Objective-C, and how unlike C++ it was a true superset, then seeing this totally foreign syntax in there.

(And yes, I know that's what a superset is, but I was still expecting something a little less different.)

I found C++ made more sense because I could relate its syntax to stuff I knew about in C.


You misunderstood me... IntelliSense *prevents* you from doing unit testing, and that's a bad thing. Unit testing is a *GREAT* thing in many cases, and an important tool in your toolbox.

Like any tool though, it can be overused, and you'll see folks touting them as the latest silver bullet.


Just ignore me, I have nothing against Unit Testing (been using JUnit a lot recently). I'm just very tired and very stressed by a project that is taking up much more of my time than it should, just because I'm forced to play bullshit buzzword bingo by a bunch of academics who decided to follow some IT consultant with no evidence for his claims, as if he was the fecking messiah. :grumble:

At least I now know what I don't want to do when I graduate.


Naw.

10 POKE(RND(255) * 16384, RND(254) * 255)
20 GOTO 10

;)

Oooh That looks familiar, but what does it do? :confused:

Kickaha
2005-11-16, 15:05
You mean n00b? ;)

I think what threw me was all the stuff in Apple's Dev Docs about Objective-C, and how unlike C++ it was a true superset, then seeing this totally foreign syntax in there.

(And yes, I know that's what a superset is, but I was still expecting something a little less different.)

I found C++ made more sense because I could relate its syntax to stuff I knew about in C.

*nod* The only danger is that it blurs the line between procedural and OO programming - the two have *very* different mindsets and ways of approaching problem solving, and making them look the same just muddies the conceptual waters more than necessary, IMO. :\

Just ignore me, I have nothing against Unit Testing (been using JUnit a lot recently). I'm just very tired and very stressed by a project that is taking up much more of my time than it should, just because I'm forced to play bullshit buzzword bingo by a bunch of academics who decided to follow some IT consultant with no evidence for his claims, as if he was the fecking messiah. :grumble:

Ooooh, sounds like industry. :\

Yeah, anytime people blindly follow the buzzword trail, you're going to suffer. I'm a skeptic when it comes to the latest and greatest, but I do take what lessons I can from wherever I can get them. When I first saw unit testing, I thought it was BS... and then I tried it. Writing the tests forces you to think about the problem (form requirements) and come up with expected outcomes (produce a conceptual product). The implementation then becomes almost trivial in most cases, because you've thought it through ahead of time. It works *GREAT* if you're dealing with, say, ADTs or library code, but it tends to be about useless if you're coding up a UI. Know when to use it, and when not to, and it's an excellent tool. More to the point, it's an excellent thought workflow to be able to step into as needed.
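Here's a hedged little sketch of that workflow in Obj-C using OCUnit (the SenTestingKit framework that newer Xcode releases bundle); median3() is made up for the example, and the point is that the assertions spell out the expected outcomes before the implementation exists:

#import <SenTestingKit/SenTestingKit.h>

static int median3(int a, int b, int c);   // the requirement we're about to pin down

@interface MedianTests : SenTestCase
@end

@implementation MedianTests
- (void)testMedianOfThree
{
    // Expected outcomes, written before (or alongside) the implementation.
    STAssertEquals(median3(1, 2, 3), 2, @"middle value of an ordered triple");
    STAssertEquals(median3(9, 4, 7), 7, @"argument order shouldn't matter");
    STAssertEquals(median3(5, 5, 1), 5, @"duplicates are fine");
}
@end

// With the tests pinning down the behaviour, the implementation is almost trivial.
static int median3(int a, int b, int c)
{
    if ((a >= b && a <= c) || (a <= b && a >= c)) return a;
    if ((b >= a && b <= c) || (b <= a && b >= c)) return b;
    return c;
}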

At least I now know what I don't want to do when I graduate.

Trust me, industry folks are just as susceptible to the tech du jour as academics. Probably more so, since you have management thrown into the mix. :\

Oooh That looks familiar, but what does it do? :confused:

Randomly hits the first 16K of RAM, and pushes a random value of 0-255 into it. This was my favorite two-line program for taking down various 8-bit machines back in the day, such as TRS-80s and Apple ][s. Walk into Radio Shack, type that in in a couple of seconds, hit RUN, and walk away. Generally it would cause visual and audible havoc as well, as the video and sound systems got sent random crap. As long as the program didn't stomp on itself, it would just keep going until the machine locked hard. :)

GSpotter
2005-11-17, 17:00
The *BYTECODES* however... Gosling & Co should be strung up, shot, drawn and quartered, buggered fatally by camels, and boiled in greasy geek hair oil for those.

In 1997, I attended a Smalltalk conference. During a discussion, one of the speakers (IIRC, a VM guy from Digitalk) was asked about his views on the Java VM. His comment: "undergraduate student work" ...

Radarbob
2005-11-21, 22:20
Remember, languages come and go, but the basic principles are forever.

Well, except FORTRAN. We're still dealing with that albatross.

And COBOL. But, as someone once said, COBOL is a terrible language to develop business apps in. The only one worse is any other language.

Maybe the bottom line is "Learn Java if you want to eat, learn Objective-C if you want to be intellectually superior to lesser OO slugs."

I was jazzed when I first learned C++ (my 1st exposure to OO programming). Java is decent enough. I think that if you want a job you have two major choices - learn Microsoft .Net (read Visual Basic or C#) or Java.

Having only perused the Objective-C manual, my spidey sense is tingling. I sense this language is much, much closer to OO "purity" than Java or C++. Having only read a brief tutorial on Smalltalk, it's a fascinating language. If Objective-C is like that, then I think one really is very near OO nirvana.

Kickaha
2005-11-21, 22:38
Yup yup! Obj-C is modeled very closely on Smalltalk in many ways, but with the ability to 'drop down' to C in a heartbeat. It's perhaps the best of both worlds.

Radarbob
2005-11-21, 23:30
Fun Fact: Guess how much of the average software system's total cost, from idea to termination of the project, is spent getting 1.0 out the door? Think of all the blood, sweat and tears that go into getting a product shipped for the first time.

Try 15%.

85% of the total cost is just maintenance. Anything you can do to help alleviate that will *swamp* the cost of buying new computers or even training folks on a new language or library, if those solutions are that much better. And at 85%, you don't have to be *vastly* better, just incrementally.

If you want to make yourself employable, learn programming language theory, not programming languages. You'll be able to jump from solution to solution effortlessly, and learn a new language in a few hours. My first Python app, after spending about a day learning the ropes, is now about 55,000 lines. Same design, same architecture, same basic algorithms.

Enjoying your comments, Dr. Kickaha...

I see what you're saying about learning language theory, but I suggest one learn how to program. Structured programming has been around for decades but most don't know how. And everyone who understands what we mean by "maximize cohesion and minimize coupling" raise your hand.

Sometimes I strongly suspect that too many think 'the next great language' is going to somehow make code more maintainable. Crap in any language smells the same. We're moving to .NET in our shop, but I'm not getting a warm fuzzy that we're doing things much differently than before: we still have the same coding mindset. All I keep thinking about is how my tendency for building objects in our JavaScript code was criticised as being "complex." I also saw the beginnings of future code crumbling in many little ways. And I think they think copy-and-paste is code reuse! Given that mindset, now that we're using .NET, are we gonna make more maintainable code? Not until we raise the bar on *how* we write code.

I had the eye-opening experience of working on old COBOL code for our "Y2K upgrade" - the oldest was dated 1968, and most was from about 1980-1985. Now, I can see how coding had to be done in a certain, clumsy way using COBOL 75, but poor coding structure made it worse. Page-long, deeply nested IF statements with about a dozen GOTOs kind of stuff. Putting the final printing at the very beginning and calling the function "main." N.S.! So here we were in 1999 and generally coding the same old way. But realizing we were using a COBOL 85 compiler, I bothered to learn it; yet after 3 years on a COBOL 85 compiler, I was the only one in the entire IT department who did. So everyone was using a "new" language, but coding the same old crap.

That said, OO COBOL appears to be horrible - sheer lunacy. So there are limits to a language... but arguing the various modern languages is perhaps a distinction without a difference, given that how we code makes a bigger difference in the context of "85% of our IT dollar is spent after initial delivery". IMHO.

drewprops
2005-11-22, 00:46
I have a box full of Antic magazines under my bed. They had some of the bitchingest covers EVAR. Since I dropped out of programming to take up architecture, then film, I'm way rusty on my codework, but getting back into PHP last year has been so rewarding, even though I'm just doing piddly stuff.

What are Longhorn apps being written in, C#?
Is there a proprietary reason that Apple landed on Objective-C? How does that affect companies like Adobe? Is there anything that could have been done to encourage the development of more software for the Mac by PC software makers? Writing in hooks to the OS has to be done, so there's a law of diminishing returns in making the apps too generic, right?

chucker
2005-11-22, 00:59
What are Longhorn apps being written in, C#?

No different than XP, although WinFX makes .NET usage more enticing.

.NET-based apps can be written in VB.NET, C#, C++.NET and many other languages, such as Boo.

Non-.NET-based apps can be written in C++, VB, whatever.

Is there a proprietary reason that Apple landed on Objective-C?

Yes: NeXT. Both Objective-C and Cocoa (although not called that at the time) originate from NeXT.

How does that affect companies like Adobe?

It doesn't; they use Carbon, not Cocoa. They don't use Objective-C at all. It would make their applications too different code-wise from their Windows counterparts.

Mr Beardsley
2005-11-22, 02:29
For the folks that wonder why people are "anti java", it probably has to do with all the Java coders that are being turned out by schools these days. I taught myself Java then went on to learn Objective-C and Cocoa before I started taking CS classes. My current class is a junior level software engineering course, and nobody knows what the heck a pointer is. Memory management is not even a consideration. The things Java handles for you are nice, but it can lead to some really sloppy programming in the hands of beginners.

As far as Obj-C goes, I love it. I really like the "weird" bracket syntax. Named parameters in methods are a huge time saver. Code completion in Java kind of helps, but I still find that I am in the class documentation all the time just to see which parameter goes where. Good method names in Obj-C make it much quicker to code.

[someObject initWithX:<#x#> Y:<#y#> Z:<#z#> W:<#w#>];


It's really easy to see what needs to go where in that method call. Also, Cocoa is a first-rate framework to develop in. In Java, some parts of the library seem complicated for no good reason. I find myself enjoying the start of Java projects, but later running into some stupid Java-ism that makes me think, "Oh, that's why I don't like Java."
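As a rough, made-up illustration (the Size class here is invented for the example), compare a named-parameter Obj-C call with its positional Java equivalent, something like new Size(300, 150) — quick, which number is the width?

#import <Foundation/Foundation.h>

@interface Size : NSObject
{
    double width;
    double height;
}
- (id)initWithWidth:(double)w height:(double)h;
@end

@implementation Size
- (id)initWithWidth:(double)w height:(double)h
{
    if ((self = [super init])) {
        width  = w;
        height = h;
    }
    return self;
}
@end

int main(void)
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // Every argument is labelled at the call site -- no trip to the docs needed.
    Size *box = [[Size alloc] initWithWidth:300 height:150];
    [box release];

    [pool release];
    return 0;
}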

One potential downside to Cocoa and Obj-C, in my opinion, is memory management. Retain and release aren't horrible, but they seem less than ideal for a framework designed to facilitate speed in developing a solution. There are signs that automatic memory management is coming to Obj-C, and it will be cool to see what Apple will roll into Leopard and Xcode. If they implement a system that works well for most objects most of the time, but still lets you drop down into manual management when you need it, it would be a great addition to development with Cocoa.
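For anyone who hasn't dealt with it, here's a minimal sketch of the retain/release bookkeeping I'm talking about (manual reference counting, which is the only option in Cocoa today):

#import <Foundation/Foundation.h>

int main(void)
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // +alloc hands you an object with a retain count of 1; you own it.
    NSMutableArray *list = [[NSMutableArray alloc] init];

    // Convenience constructors return autoreleased objects owned by the pool.
    NSString *name = [NSString stringWithFormat:@"item %d", 1];
    [list addObject:name];   // the array retains whatever it stores

    [list release];          // balances the alloc; the array releases `name` as it goes away
    [pool release];          // drains the pool, dropping the last reference to `name`
    return 0;
}

Forget one of those releases and you leak; add one too many and you crash — which is exactly the bookkeeping an automatic scheme would take off your hands.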

Kickaha
2005-11-22, 10:50
Enjoying your comments Dr. Kickaha...

I see what you're saying about learning language theory, but I suggest one learn how to program. Structured programming has been around for decades but most don't know how. And everyone who understands what we mean by "maximize cohesion and minimize coupling" raise your hand.

No argument there. I was merely talking about the urge of people to 'learn C++', then 'learn Java', then 'learn C#', when instead they could 'learn OO languages', and then bounce from syntax to syntax at will. :)

What you're referring to I'd lump under 'good design' which is a *WHOLE* other issue. (Although learning good OO principles at the fundamental level of language will definitely help this.)

Sometimes I strongly suspect that too many think 'the next great language' is going to somehow make code more maintainable. Crap in any language smells the same.

... but arguing the various modern languages is perhaps a distinction without a difference, given that how we code makes a bigger difference in the context of "85% of our IT dollar is spent after initial delivery". IMHO.

*applause* Agreed wholeheartedly. The primary problem is that we in academia aren't producing designers, we're cranking out *coders*... who are then being asked to design. Simply learning to program isn't sufficient; a good software engineer has to be able to design a class, consider coupling issues, and see beyond the single-function level. Unfortunately, most don't. I've got a potential solution though... maybe. I'm in the middle of writing it up a bit, but hope to have something available for feedback soon. :}

Unch
2005-11-23, 04:09
*applause* Agreed wholeheartedly. The primary problem is that we in academia aren't producing designers, we're cranking out *coders*... who are then being asked to design.

I can vouch for that: despite the vast amounts of coding we have to do in the 2nd year, design isn't taught until next semester, and even then it's an option rather than compulsory (I'm taking it because I know my design skills are weak). The problem is that some of the other options look "sexier", like e-Commerce and Intelligent Systems, and so many who desperately need this kind of class will end up not taking it.

drewprops
2005-11-23, 06:57
That's interesting that the coding is taught before design principles. I expect that they want you to have the tools first?

I was studying computer sciences in college (Ga Tech) when I sidetracked into architecture. Of course they're two different studies, but at Tech the architecture program's goal was to teach the broad and specific concepts of design instead of the minutia of drafting reflected ceiling plans. Sure, we learned the technical skills of drafting and model building but that wasn't the goal - the goal was creative approaches to problem solving.

Looking back, similar ideas were being applied in the computer sciences program. I distinctly recall one of our professors explaining that the goal wasn't to turn us into computer programmers. Rather, the program was designed to allow us to be plugged into any situation and understand the problem to formulate solutions.

From that point forward that has been my general definition between a college and a trade school.

Kick, your comments about colleges turning out coders instead of designers makes me wonder if that perception is indicative of the schools you've attended or if it's a recognized problem through academia. What happened to teaching the broad concepts like I learned at Ma Tech?

Kickaha
2005-11-23, 12:16
That's interesting that the coding is taught before design principles. I expect that they want you to have the tools first?

More often, that's the order they learned it in... see below.

Also, the misguided thinking is that they want you to have the *context* first. "Students will never be able to grasp the design principles unless we give them something *big* enough to design with." Wellllll, no. What they're missing is that design occurs at *every* level, just with different contexts. What they mean is that they want students to have context for the *latest* design principles and issues... the rest they teach by rote. Which, in my opinion, does a great disservice to the students.

I was studying computer sciences in college (Ga Tech) when I sidetracked into architecture. Of course they're two different studies, but at Tech the architecture program's goal was to teach the broad and specific concepts of design instead of the minutia of drafting reflected ceiling plans. Sure, we learned the technical skills of drafting and model building but that wasn't the goal - the goal was creative approaches to problem solving.

BING BING BING

Software engineering could learn a LOT from architecture in how the field is taught.

Looking back, similar ideas were being applied in the computer sciences program. I distinctly recall one of our professors explaining that the goal wasn't to turn us into computer programmers. Rather, the program was designed to allow us to be plugged into any situation and understand the problem to formulate solutions.

That prof is worth his weight in gold.

From that point forward that has been my general definition between a college and a trade school.

The line is blurring badly.

Kick, your comments about colleges turning out coders instead of designers makes me wonder if that perception is indicative of the schools you've attended or if it's a recognized problem through academia. What happened to teaching the broad concepts like I learned at Ma Tech?

A few things... your experience is, from what I've been able to gather, not the norm. Industry wants people who can use Windows, Java, and Visual Studio, so that's what academia gets pressured into teaching. "Teach me what I need to get a job." is what students ask, not really knowing what it is they need in the first place. (Not just the students, parents too. We've had faculty members get phone calls from parents wanting to know why we're teaching Foo when *obviously* 'everyone' uses Bar, and just what are they wasting their money on? Idiots.) Yes, if that's what they wanted, they should have gone to trade school, but that's the pressure that even good CS schools are getting.

Secondly, there's a more insidious problem inside academia and industry both: a perception that there is this hierarchy of ability and training. Coders (little more than typists) are at the bottom, then programmers, then designers, then architects. Basically, these correspond to the four basic stages of development of software engineering techniques: higher-language (not assembly), structured, object-oriented, new undefined stuff. Each 'revolution' wanted to state absolutely and definitively that they were Not Like What Came Before, so they did everything in their power to highlight the differences. (Every generation rebels against their parents, and software engineers aren't any different.) The problem is, the similarities vastly outnumber the differences, and if you look at the entire history of software engineering as repeated attempts to control complexity through the abstraction of relationships between the highest-level programming entities of the day, it starts to look like a smooth ramp based on some basic principles that can be taught directly. The only problem is, that hurts the widdle egos of the revolutionaries. So you obviously can't do *that*. :P (OMG, you should have heard the condescending yelling I got at a conference from one of the leaders of Aspect-Oriented Programming when I mentioned to him that it seemed to me like a clever injection of the Observer pattern into the runtime... I was *trying* to give him a compliment, but the sheer *idea* that his baby had *ANYTHING* to do with *ANYTHING* that had come before was so frightening to his ego that he decided to lambast me in public. Guess what... he's still wrong.)

So there are forces external and internal that contribute - when the solution is to teach design principles alongside programming basics. Seems pretty simple, but no one's been able to really figure out what's important to teach at that level. I've got a good solid run at it, but don't want to go spreading a pre-print draft of the goods on the net quite yet. :}

drewprops
2005-11-23, 18:47
That professor was Gus Baird (http://wiki.yak.net/77). I just Googled him and found out that he died two years after I graduated Tech. I remember he was writing on the chalkboard during one lecture and the chalk slipped out of his fingers. He scrambled to catch it, as if something terrible might happen if it hit the floor. We were howling with laughter as he raised back up, ashen-faced. He told us, "You wouldn't be laughing if you'd worked with the stuff that I have," which was a reference to some kind of quiet work he'd done for the military. We stopped laughing. Damn, I hate to see that he died. The man was a cipher to me at my fresh-out-of-high-school age. He was always poetic and passionate and mesmerizing and about five steps ahead of you... kind of a real-life Dr. Who.

If you need any architectural metaphors, let me know; sounds like you're plotting a revolution in Common Sense to me~

Kickaha
2005-11-23, 19:01
Little ol' moi? ;)

pmazer
2005-11-23, 21:12
Kick, your comments about colleges turning out coders instead of designers makes me wonder if that perception is indicative of the schools you've attended or if it's a recognized problem through academia. What happened to teaching the broad concepts like I learned at Ma Tech?

As a current first-year CS student at GA Tech, I can tell you that they're still striving to do just that; in fact, I've heard many people quote almost exactly that same line: "We're not creating programmers, we're creating problem solvers." Although Java does seem to be the main language used here (*sigh*), it's not the only language. The class I should be taking next semester, but couldn't due to scheduling, is geared towards lower-level computing and uses assembly and C, and the class I'm actually taking, Introduction to AI, uses Common Lisp.

GSpotter
2005-11-26, 15:21
Coders (little more than typists) are at the bottom, then programmers, then designers, then architects. ... when the solution is to teach design principles alongside programming basics.

This reminds me of an old process pattern: Architect also implements (http://users.rcn.com/jcoplien/Patterns/Process/section16.html).
I see it as two sides of the same coin: coding without design/architecture leads to programs which are a nightmare to maintain, while architectures/designs made up by people who are not involved in the implementation also have their downsides...

I've got a good solid run at it, but don't want to go spreading a pre-print draft of the goods on the net quite yet. :}

Tell me when it's ready.