PowerPC
Programmer
Member
 
Join Date: Nov 2004
 
2005-03-27, 13:18

Quote:
Originally Posted by FireDancer
I guess my point is.....if they could find a way to use both cores with the existing single threaded code....they could keep increasing performance without the need to push past 3.0GHz.

Example - People want a dual PPC970MP @ 3.0 GHz because this is the only way to increase single thread performance beyond the current 2.5GHz PowerMacs....because most code will only use one core.
.
.
.
Since the majority of code out there now is single threaded....I just find it funny that no one has come up with a way to leverage the power of multiple cores with the majority of existing code.....
No, not going to happen -- compiled single-threaded code, by definition, only runs on one core. A single stream of instructions is too serialized and interdependent to allow more than one core to share in its workload at the same time.
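
A minimal sketch of the problem, assuming any C compiler: each iteration below needs the previous iteration's result before it can start, so a second core has nothing it can legally work on.

Code:
#include <stdio.h>

int main(void)
{
    double x = 1.0;
    /* loop-carried dependency: iteration i needs the x from i-1 */
    for (long i = 0; i < 10000000; i++)
        x = x * 1.0000001 + 0.5;
    printf("%f\n", x);
    return 0;
}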

What is going to be necessary is a change in software, and this has been happening slowly over the years. There are a variety of ways in which this happens:

- The single thread in question just takes over a single core and runs as fast as it can, end of story. The rest of the OS and other apps can use the rest of the machine.

- Recompiling a program with a compiler that supports OpenMP and/or auto-vectorization can take some advantage of parallelism. This usually requires some changes to the source code, and the bigger the changes, the bigger the speed-up on multiple cores. In many cases this delivers only very minor speedups; in easily parallelized cases it can be a big win. (A small sketch follows this list.)

- Splitting a program into a couple of major components (i.e. separating the computational engine from the GUI). In a well-designed program this can be relatively straightforward, and it is done fairly often these days to keep the GUI responsive, even on a single-core machine. This will typically use only 2 cores, and one of them (running the GUI) will be under-utilized.

- Use a system software service or 3rd party library that uses parallelism internally. If things like CoreImage are used by the application to do its heavy lifting then the provider of the service can implement use of parallel hardware internally, immediately benefiting all of their clients. As long as the API of that system is designed properly this can be done quite effectively (e.g. the original QuickDraw was not designed well for this, but Quartz is).

- Use a language that directly supports parallelism. Unfortunately these are not currently in favor, although that may change as the average number of cores per machine climbs.

- Redesign the software's core computational algorithms and re-code them to be multi-threaded and vectorized. In some cases there is a straightforward path to this even though it might be a lot of work. In other cases a lot of ingenuity and inventiveness is required to recast the problem in a parallel fashion. And there are some cases where it simply isn't possible (although generally there are fewer of these than most developers think... these last two categories are often lumped into one by less inspired or motivated developers).
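
To illustrate the second item above, here is a minimal OpenMP sketch (assuming a compiler with OpenMP support): one pragma lets independent iterations spread across however many cores are available.

Code:
#include <stdio.h>

#define N 1000000

int main(void)
{
    static double a[N], b[N];
    double sum = 0.0;
    long i;

    /* iterations are independent, so the runtime may split them
       across cores; the reduction clause handles the shared sum */
    #pragma omp parallel for reduction(+:sum)
    for (i = 0; i < N; i++) {
        a[i] = b[i] * 2.0 + 1.0;
        sum += a[i];
    }
    printf("%f\n", sum);
    return 0;
}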

One saving grace is the rule of thumb that says 80-90% of a program's execution time is usually spent in 10-20% of its code. This means that, in most cases, only 10-20% of a program needs to be rewritten or modified & optimized in order to get the biggest improvements. Furthermore, multiple cores mean more computational resources, so the only reason to worry about multi-processing (or vectorization) at all is if your program is computationally limited. A great deal of software is not limited by the speed at which it does its computations; other things limit its performance, such as the user, the disk, the network, the external busses, etc. In these cases there is no point in going multi-threaded, and a multi-core machine just allows more of these things to go on at once (assuming the OS itself isn't forcing serialization).
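
A back-of-envelope way to see what that rule of thumb buys you (a sketch, using the textbook Amdahl's law arithmetic): if the hot 10-20% of the code accounts for 80% of the run time and only that part is made to use a second core, the whole program speeds up by 1 / (0.2 + 0.8/2), or about 1.67x -- not 2x.

Code:
#include <stdio.h>

int main(void)
{
    double p = 0.8;   /* fraction of run time spent in the hot code */
    int cores = 2;
    double speedup = 1.0 / ((1.0 - p) + p / cores);
    printf("speedup: %.2fx\n", speedup);   /* prints 1.67x */
    return 0;
}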
 
Henriok
Member
 
Join Date: Jun 2004
Location: Gothenburg, Sweden
2005-03-27, 15:57

What about compiling software with different processors in mind? How much is there to gain by compiling an application into two binaries, each optimized for the G4 and G5 processors? The G5 and G4 are quite different in a lot of respects, so doing a recompile with two targets might result in performance gains for each architecture. I guess that most Mac software is optimized for the G4 processor, if optimized at all. It can't be optimal to run G4 code on a G5.. what penalties are we looking at there? Or the other way around.. how much can we gain by telling the compiler to build binaries for the G5 instead?
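
A sketch of what that would look like with GCC (an assumption: exact flag spellings vary by compiler version; -mcpu=7450 targets the G4 and -mcpu=970 the G5 in GCC of this era, and Apple's GCC also accepts -mcpu=G4 / -mcpu=G5):

Code:
# one source file, two tuned binaries
gcc -O2 -mcpu=7450 -mtune=7450 -o myapp-g4 myapp.c
gcc -O2 -mcpu=970  -mtune=970  -o myapp-g5 myapp.c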
 
FireDancer
Member
 
Join Date: Mar 2005
 
2005-03-27, 19:32

Thanks, Programmer....a very insightful reply. I have no programming skills, but I always have big questions and (sometimes impossible) ideas.
 
FallenFromTheTree
Member
 
Join Date: Aug 2004
Location: A Stoned Throw From Ground Zero
 
2005-03-29, 00:15

I'm getting more and more curious about what Apple might be up to next week at FOSE, the government and military IT expo here in D.C. Today, I heard a radio promotion on WTOP specifically announcing, up front, that Apple will be there along with other manufacturers and special keynote speakers, including Intel's CEO.

With all the security concerns hovering over Microsoft's Windows OS, this could be the year that Apple "officially" breaks into government IT.

Apple now has the first truly affordable Mac with plenty of guts to work in any network or secure environment. They also have large-scale deployments of Xserve and Xsan, including major players like Oracle, Cisco, and UVA.

The only thing that could make this showing even more impressive would be a surprise sneak-peek debut of OS X Tiger on the first dual-core IBM/Apple-powered PPC workstations.
 
FFL
Fishhead Family Reunited
 
Join Date: May 2004
Location: Slightly Off Center
 
2005-03-29, 00:34

It would be fun to see how the gov/mil IT hordes would react to direct exposure to the RDF.
 
mugwump
Member
 
Join Date: Jul 2004
Location: uh huh
 
2005-03-29, 01:02

It could lead to peace on earth in our lifetime.
 
ldv
Member
 
Join Date: Mar 2005
 
2005-03-29, 16:18

Quote:
Originally Posted by FFL
It would be fun to see how the gov/mil IT hordes would react to direct exposure to the RDF.
The gov/mil hordes are the original inventors of the RDF! Steve goes and meets with the government and military to get the latest psy-ops tricks to use on the Mac community!
 
NMR Guy
New Member
 
Join Date: Jul 2004
Location: NMR Lab
 
2005-03-29, 16:28

Let us re-focus our minds with this link:

Talk of increased 970fx yields
 
ldv
Member
 
Join Date: Mar 2005
 
2005-03-29, 19:42

Quote:
Originally Posted by NMR Guy
Let us re-focus our minds with this link:

Talk of increased 970fx yields
The 970FX is, for the most part, a captive Apple part. Even if yields were 100%, Apple would not drive demand, because of the company's unwillingness to use price elasticity to do so. Apple's conservative approach does give the company a guaranteed revenue stream from upgraders, which it does not want to risk against the possible increase in market share from demand elasticity. Instead Apple will continue to focus on "halo" revenue and leave prices high across most Apple product lines.

The Apple viewpoint is that the first-generation G5 has pretty much gone as far as it will go. The PC computing world is moving to dual core: AMD and/or Intel are shipping dual-core servers, workstations, and desktops in 2Q05. To maintain some sort of technological parity, Apple has to move to dual core as well; thus the emphasis on the dual-core PowerBook and PowerMac computers. Apple's entire line will be moved to 64-bit soon, as the cost of maintaining two separate operating systems, one 32-bit and one 64-bit, is very high.
 
DrGruv
Member
 
Join Date: Jul 2004
 
2005-03-29, 19:49

IBM forecasts seismic shifts in media and entertainment sectors by 2010

[Business, India News] Mumbai, Mar. 29: In a new report titled Media and Entertainment 2010, IBM Business Consulting Services (BCS) has revealed that shifts in technology and consumer consumption will force media companies, particularly in the broadcast and film industries, to redefine their business models over the next 5-7 years.

By 2010, IBM-BCS claims the landscape of the industry will change so dramatically that media companies will have to move to a truly open environment, allowing consumers around-the-clock access to protected media content for variable fees and the ability to largely control their own media and entertainment experiences if they want to survive.

The report highlights the struggle that media companies face, bridging from the historic model of systematic, promotional based, one-way delivery to mass audiences to a world incorporating digital technologies, analytics driven marketing approaches and distribution models leveraging bi-directional relationships. These changes will continue to redefine the economics of the media business, much as has already occurred in the music industry.

“We’re seeing the revolution in the media and entertainment industry and the market environment in India is no different. Those emerging today are embracing technological capabilities and working through regulatory and business issues to embrace entirely new methods of delivering content and related assets that are tailored to the individual choices of business partners and consumers,” said Arvind Mahajan, Partner, Communication Sector, IBM Business Consulting Services.

“By 2010, this transformation will have taken place throughout the industry, and especially in broadcast and film segments. There will be clear winners and losers. The winners will move away from traditional proprietary business models to open standards, will leverage digital technologies to undermine existing economics of the media business and to know their consumers and business partners intimately; they will deliver media to them how, when, and where they want it. Management of digital media capabilities will be key basis of differentiation amongst media companies,” he adds.

Thriving companies in the new environment will allow customers access to information on their own terms. This includes the ability to purchase and download the rights to a book, or other media and have it configured for one or more types of devices, or delivered immediately in traditional hard or soft cover. Consumers will be offered extensions to the initial media offerings, with the ability to order the film of the book, the soundtrack or only one song, the liner notes or a single quotation to use in a variety of formats, from a term paper to a wall poster.

IBM forecasts that by 2010 successful media companies will have many of the following characteristics in common:

(1) Companies will survive not just on creative content, but on creative intelligence about customers, markets, and the value of digital assets.
(2) Users’ opinions, or “buzz”, will be more effectively monitored, helping to shape the content individual consumers experience.
(3) Conglomerates, traditional studios, and publishers will open up their inventories, putting old and new digitized content online in various forms for variable fees. The same song, movie, or other media will cost more, or less, depending on complex variables such as age, sales tracking, or even the rarity of archival content.
(4) Many independent artists and producers will offer their music, short videos, and movies completely free, making money instead from tie-ins, product placements, Webcast concerts and events, and fan merchandise.
(5) Online accounting systems will automatically invoice huge data feeds of digital content ordered by network and cable broadcasters from distributors.
(6) Millions of micro-payments will add up to sizable revenue streams from the sale of new or archived digital content, much of which will never travel to a theater, retail store, or TV station - it will be delivered online.

In order for companies to become truly open media companies, the report outlines specific steps they can take to survive in the new economic and technological environment of the future, including: creating or converting all content to digital formats; being open for delivery in multiple packages, with variable pricing and always-on customer service; opening digital doors to let consumers contribute and produce or author dynamic content; managing openly and communicating in real time through digital infrastructure; leveraging a new depth of business intelligence made possible by digital technology; using partnership strategies that drive efficiency and optimize customer attention, with companies focussing on core competencies; becoming an on-demand business - focused, responsive, and variable; and being resilient to the realities of a more competitive marketplace. (ANI)

http://www.newkerala.com/news-daily/...lnews&id=91537

Could they be talking about Apple and QuickTime 7?

- Michael Droste Itunes Link Stop By: TrumpetStudio.com or SaveThePlanetSong.org Some Main Gear: AT4050, Dual 1.8 G4, Logic, Waves Plat, Waves SSL, Tritone, URS, PSP, Zebra, BFD, RND, Sony Oxford, Altiverb...
 
Enki
Senior Member
 
Join Date: Nov 2004
 
2005-03-29, 19:54

No, they just want the media companies to buy their content and database servers.

The best way to predict the future is to invent it and sell it. Just a little change from Dr. Kay's original statement.
 
rickag
Member
 
Join Date: Jul 2004
 
2005-03-30, 09:02

Quote:
Originally Posted by ldv
...
Apple's entire line will be moved to 64-bit soon as the cost of maintaining two separate operating systems, one 32 bit and one 64 bit, is very high.
I could be wrong, but my understanding is that the PPC from its inception was designed as 64-bit with seamless 32-bit compatibility. I don't think that it will be difficult or costly for Apple to maintain 32- and 64-bit compatibility, but then again I could be wrong.

Just waiting to be included in one of Apple's target markets.
 
CharlesS
Member
 
Join Date: Jul 2004
 
2005-03-30, 13:45

Quote:
Originally Posted by rickag
I could be wrong, but my understanding is that the PPC from its' inception was designed as 64 bit with seamless 32 bit compatibility. I don't think that it will be difficult nor costly for Apple to maintain 32 and 64 bit compatibility, but then again I could be wrong.
Yeah, the fact that Tiger is 64-bit but will run on 32-bit hardware kind of messes up the credibility of ldv's statement. Since when is Apple making two separate operating systems?
 
Enki
Senior Member
 
Join Date: Nov 2004
 
2005-03-30, 18:07

There are some separate libraries for 32-bit-only and 64-bit systems. Think about it: the G4s CAN'T use a G5 64-bit library, so Apple has to manage two OS development branches until they completely end support for G4 & G3 processors. Not likely for quite a few years yet.
 
Henriok
Member
 
Join Date: Jun 2004
Location: Gothenburg, Sweden
2005-03-30, 18:34

IBM wants to help others integrate Cell technology.
More info here..
I really can't see any reason why Apple would turn this technology down..
 
ldv
Member
 
Join Date: Mar 2005
 
2005-03-30, 19:07

Quote:
Originally Posted by Enki
There are some separate libraries for 32-bit-only and 64-bit systems. Think about it: the G4s CAN'T use a G5 64-bit library, so Apple has to manage two OS development branches until they completely end support for G4 & G3 processors. Not likely for quite a few years yet.
The move will happen sooner than you think. Apple had to wait a long time for IBM to get their 970 chip design and process up to snuff. Now that Apple has a competent, cheap 64-bit chip with reasonably good yields, the transition to a pure 64-bit hardware base can happen swiftly.

Apple is even mulling a special rebate if you bring in your old Mac and exchange it for a new 64-bit Mac. This gets people hanging on to OS 9 to move over to OS X, and it increases the 64-bit installed base.

We sometimes forget Apple is not Microsoft. Apple's entire cash reserve is less than half a year's profit for Microsoft. Apple does not have the budget to maintain many operating systems. That is why OS 9 and the old versions of OS X were phased out. We can expect the same for the 32-bit version of the OS. Yes, over a longer period of time, but phased out nonetheless.
 
CharlesS
Member
 
Join Date: Jul 2004
 
2005-04-01, 17:20

Quote:
Originally Posted by Enki
There are some separate libraries for 32-bit-only and 64-bit systems. Think about it: the G4s CAN'T use a G5 64-bit library, so Apple has to manage two OS development branches until they completely end support for G4 & G3 processors. Not likely for quite a few years yet.
But if they were to remove the 32-bit libraries, wouldn't that break old software which linked against those libraries?
 
Enki
Senior Member
 
Join Date: Nov 2004
 
2005-04-01, 20:55

Now you understand why it will be a few years yet before 32-bit OS code could even possibly go away altogether.
 
Henriok
Member
 
Join Date: Jun 2004
Location: Gothenburg, Sweden
2005-04-02, 05:00

The way I understand it, some things (most things?) are done faster and more efficiently with 32-bit code than with an all-the-way 64-bit solution. Since 64-bit POWER/PowerPC technology handles 32-bit code just fine, and the PowerPC ISA was created with this in mind, I really can't see a single reason to do a pure 64-bit processor and a pure 64-bit operating system. The one reason would be to save some space on the hard drive by not doing FAT binaries and double APIs. Hardly something that will be of any concern in the future.
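
For reference, the fat-binary route Henriok mentions looks roughly like this with Apple's tools (a sketch; the multiple -arch flags and lipo are Apple's, though exact usage may vary by tools version):

Code:
# build one Mach-O binary containing both 32- and 64-bit PPC code
gcc -arch ppc -arch ppc64 -o mytool mytool.c
lipo -info mytool    # reports the architectures in the fat file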
 
Programmer
Member
 
Join Date: Nov 2004
 
2005-04-02, 10:43

Henriok is right -- PPC32 is part of PPC64, and supporting it isn't very difficult for the CPU designers, and it costs very little. There is good reason to do it as well, because a 32-bit program will run faster than a 64-bit program unless there is something in the software that actually uses the 64-bitness (i.e. 64-bit integers or pointers), and most software doesn't need 64-bit values or more than 4 GB of memory. You'd better hope it stays that way, too: imagine the day all your programs need more than 4 GB of memory.
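
A minimal sketch of the difference (assuming an LP64-style 64-bit environment, as on 64-bit PowerPC Unix): the same source compiled 32-bit vs. 64-bit doubles the size of every pointer and long, which inflates data structures and cache footprint even when the extra bits do nothing useful.

Code:
#include <stdio.h>

int main(void)
{
    /* 32-bit build: 4 and 4; LP64 64-bit build: 8 and 8 */
    printf("pointer: %u bytes, long: %u bytes\n",
           (unsigned)sizeof(void *), (unsigned)sizeof(long));
    return 0;
}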
 
julesstoop
Veteran Member
 
Join Date: Oct 2004
Location: Leiden, the Netherlands
 
2005-04-02, 11:59

Bill Gates would say that day will never come
 
Snoopy
Member
 
Join Date: Jul 2004
Location: Portland, OR
 
2005-04-02, 12:13

Yeah, the voice of reason. Thanks Henriok and Programmer. I could not see any great benefit to have OS X run only on 64-bit processors, but I didn't have the computer science knowledge to respond. It's my guess that 64-bit Tiger has over 90 percent 32-bit code, and maintaining two versions of the rest can't be that difficult.
 
Programmer
Member
 
Join Date: Nov 2004
 
2005-04-02, 15:55

Quote:
Originally Posted by julesstoop
Bill Gates would say that day will never come
To be fair to Bill, saying "you'll never need more than X" is very different from saying "you'll always need more than X".

If my menu bar clock (for example) one day needs more than 4 GB, I'll leave the platform.
 
CharlesS
Member
 
Join Date: Jul 2004
 
2005-04-02, 23:22

Quote:
Originally Posted by Programmer
If my menu bar clock (for example) one day needs more than 4 GB, I'll leave the platform.
20 years ago, you might have said the same thing about it ever needing more than 4 MB.
 
CharlesS
Member
 
Join Date: Jul 2004
 
2005-04-03, 04:16

You know, though, the big issue here isn't that switching all apps to 64-bit code would be unnecessary (although it would be), it's that doing such a thing would break all existing 32-bit OS X apps, causing everyone to have to buy new versions of all their apps again. So someone who's getting along fine with Office v. X or 2004 would have to buy a $300 upgrade to whatever shiny new version of Office came out that had useless new features no one would ever use or even know about. Even if someone only used Office to write some simple letters and spreadsheets, they'd still have to pay the upgrade prices! Also, old games would be screwed, and any other apps that don't get updated would be screwed.

The thing is, remember a few years ago? You know, that OS 9 -> OS X transition we just got through? Remember all the people who said that, well, if they had to spend massive amounts of money to upgrade all their software anyway, why not just switch to Wintel? And that transition at least had a compatibility layer to run the old apps to some extent. Removing the 32-bit libraries from OS X would render all old OS X apps simply useless. And of course, Classic apps would still work, as they use the OS 9 libraries. It would just be an overall bizarre situation. Apple's user base would revolt.

Apple will not do this. Not now, not tomorrow, not next year, not in 10 years. Apple has never done a thing like this - applications are the one place where Apple is pretty good at backwards compatibility. There are some apps for System 1.0 that still run in the Classic layer on OS X for God's sake! So in summary, a move like this would not only provide next to no benefit and be a massive cluster-f*** to the users, but it would be something completely out of character for Apple to do.

Sorry, ldv, but I don't believe a word you say. By saying this stuff about dual-core PowerBooks, quad-core Power Macs, and all 64-bit lineups, you're just saying what people want to hear. I'd gladly be proven wrong on everything except for the 64-bit only OS, but I'd be quite surprised if it happened.

Last edited by CharlesS : 2005-04-03 at 04:22.
 
Snoopy
Member
 
Join Date: Jul 2004
Location: Portland, OR
 
2005-04-03, 10:26

The 64-bit issue can get confusing. Are we discussing whether both 32- and 64-bit applications will work on OS X? Or are we discussing whether OS X will run on both 32- and 64-bit PPC processors? I believe it is the second issue, which is not as clear-cut as the first. There is no doubt that 32-bit applications will always run on OS X. The question here is whether Apple should restrict OS X to 64-bit processors.
 
CharlesS
Member
 
Join Date: Jul 2004
 
2005-04-03, 12:23

Quote:
Originally Posted by Snoopy
The 64-bit issue can get confusing. Are we discussing whether both 32- and 64-bit applications will work on OS X? Or are we discussing whether OS X will run on both 32- and 64-bit PPC processors? I believe it is the second issue, which is not as clear-cut as the first. There is no doubt that 32-bit applications will always run on OS X. The question here is whether Apple should restrict OS X to 64-bit processors.
The given reason for restricting OS X to 64-bit processors was removal of the 32-bit libraries. Although to be fair, looking back it seems to be Enki who proposed this and not ldv, but ldv soon replied in agreement. Anyway, I still don't understand what he meant by "two separate operating systems." So far, Apple seems to be developing one operating system which is compatible with both 32-bit and 64-bit hardware.

Don't get me wrong, I would love to be proven wrong with regards to ldv's hardware specs. I want to believe... I just can't. I have trouble believing, for example, that Apple would go from not even being able to fit a single-core G5 in a PowerBook and all of a sudden jump all the way to a dual-core G5. If I'm wrong, I'll be delighted, but really, I'd be happy just to see a single G5, or any equivalent chip from Freescale or whomever.
 
Enki
Senior Member
 
Join Date: Nov 2004
 
2005-04-03, 14:28

Be careful whose mouth you put words into. The code within an OS is not the same as the code within an application. And what bitness an OS is coded in does not necessarily preclude anything.

Also, take a post in its proper context, not a manufactured context from later in the thread. Earlier, I was simply making the point that Apple IS maintaining two OS branches, contrary to what you thought; the reason is physical and not going to go away quickly. Nothing more.

The later "could even possibly" statement should confirm that. It doesn't say 32-bit will go away; it just bounds that as a long way off, if it were to happen, given the dependencies on the physical hardware currently shipping. The software market just adds even more immovable inertia against making 32-bit code go away altogether. And it shouldn't go away: there isn't any good reason for using double the number of bits required to do the job unless using 32 is physically precluded. Something we don't have to worry about with the PPC ISA.
 
CharlesS
Member
 
Join Date: Jul 2004
 
2005-04-03, 14:45

Quote:
Originally Posted by Enki
Be careful whose mouth you put words into. The code within an OS is not the same as the code within an application. And what bitness an OS is coded in does not necessarily preclude anything.

Also, take a post in its proper context, not a manufactured context from later in the thread. Earlier, I was simply making the point that Apple IS maintaining two OS branches, contrary to what you thought; the reason is physical and not going to go away quickly. Nothing more.
Uh, how are there two separate branches just because there are 32-bit and 64-bit libraries in the OS? What definition of a branch are you using? I've always understood it to be basically what is said on this page:

http://en.wikipedia.org/wiki/Fork_%28software%29

Is Apple going to be releasing two versions of Tiger, one a Tiger 32-bit Edition and one a Tiger 64-bit Edition? If not, then I don't see how you can say there are two different branches, nor two different operating systems. Having to support two different processor architectures in one operating system, yes. Having to make two different operating systems, well, I don't understand what you're talking about.
 
Brad
Selfish Heathen
 
Join Date: May 2004
Location: Zone of Pain
 
2005-04-03, 14:54

Ugh. I'm sorry guys, but this thread needs to die. The last several pages are ridiculously fragmented and cover a wide array of loosely related subjects, including but not limited to the Cell processor, the PPC 970/970FX/970GX, 32- vs. 64-bitness for Tiger, capabilities of 32- vs. 64-bit libraries, random vague comments from Morpheus, doomsday predictions, Xbox thoughts, PS3 thoughts, some IBM documents, more random baseless speculation, iPods, iTunes, iBooks, PowerBooks, and Steve Jobs' jet.

Yes, there is a lot of interesting information here, but don't worry: it's not going to be deleted or anything like that. It's just going to be locked, and you guys can still use it for reference.

This thread is just too difficult to follow for anyone that hasn't been reading it over the last nine months. Please feel free to start new threads with focused discussions on the various subjects, and try to stay on topic. If you have a new thought that's not wholly related to the subject, start a new thread, for crying out loud!

Cheers.


The quality of this board depends on the quality of the posts. The only way to guarantee thoughtful, informative discussion is to write thoughtful, informative posts. AppleNova is not a real-time chat forum. You have time to compose messages and edit them before and after posting.
 