64-bit support isn't the only technology Apple is leaning on to make Macs faster in the new Snow Leopard. Alongside it, the system will use other tactics to make better use of the power of the processor(s) in your machine, which not only makes the programs you use every day faster, but also makes life easier for the people who develop them.
During the last WWDC, Bertrand Serlet introduced the public to a new technology called Grand Central Dispatch (GCD, for close friends :-P). The name is odd and what it does isn't easy to grasp either, but over the next few lines I'll do my best to explain, in simple terms, how it makes your Mac faster.
Basically, it takes advantage of something that (almost) all Intel Macs have in their CPUs: multiple cores (multi-core). So instead of talking directly about what GCD does, let's first look at what cores are and what they brought to modern computers. Get ready, because this story won't be short either…
Multi-core processors have two or more cores. The main difference between them and the chips that were popular some four years ago is that the operating system treats each core as a separate processor. To make this easier to picture, we can say that a dual-core CPU has two processors on the same physical chip, and so on.
Each core has its own cache, is capable of running more than one task at a time (multitasking) and, in some cases, can even access the system's main memory on its own. All this self-sufficiency is used to divide tasks among the cores, which is why we can't always call one processor two or four times faster than another just by following the increase in its clock frequency.
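As a quick illustration (a minimal Python sketch, not Mac-specific), you can ask the operating system how many logical processors it sees — each core shows up as one:

```python
import os

# The OS reports each core (or hardware thread) as a separate logical CPU.
logical_cpus = os.cpu_count()
print(f"The OS sees {logical_cpus} logical processor(s)")
```

On a dual-core Mac of the era this prints 2, even though there is only one physical chip in the machine.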
Why do today's processors have more than one core?
I started using computers when dual-core processors already existed, but I caught the beginning of this technology and understood early on the industry's main reason for adopting it. When there were only single-core chips, a different tactic was used to increase performance, based on an observation (not to call it a "prophecy") made by a gentleman named Gordon Moore more than 40 years ago. In short, he noted that the number of transistors on a chip would double roughly every two years — which, in its popularized form, came to mean that processors would double in performance every 18 months or so.
At first, this was just an observation by the co-founder of a company that, at the time, barely had a reason to exist (today we know it as Intel :-P). Yet, like so many things, what he said ended up coming true. Still, as the industry approached the present day, concern began to grow around this law.
What happened was that chips doubled in speed through a manufacturing process that basically did one thing: miniaturize components (that is, shrink them while preserving their characteristics) so the frequency (measured in gigahertz) could increase. It turns out that as the frequency goes up, so does the power each processor draws and, with it, the heat it generates, requiring a larger thermal envelope for cooling and ruling the chip out of hardware with tight dimensions (notebooks and small desktops like the Mac mini, for example).
Let's look at some practical examples of this. Suppose a standard CPU has a theoretical speed of 100 and a power consumption of 100:
Increasing its operating frequency (clock) by 20% raises performance by about 13%, but energy consumption can increase by up to 73%, all within the same physical space:
In contrast, decreasing the clock by 20% costs the chip about 13% of its performance, but energy consumption can easily drop to almost half of the original:
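These round numbers come from a simplified scaling model often cited in the industry: dynamic power grows roughly with the cube of the clock (since voltage has to rise along with frequency), while performance grows sublinearly with the clock. A rough Python sketch — the exponents here are illustrative values chosen to reproduce the figures above, not vendor data — works out like this:

```python
def relative_power(clock_ratio):
    # Assumes power ~ V^2 * f with voltage tracking frequency -> power ~ f^3.
    return clock_ratio ** 3

def relative_performance(clock_ratio):
    # Assumes performance grows sublinearly with clock; the exponent (~0.65)
    # is an illustrative value tuned to match the ~13% figure in the text.
    return clock_ratio ** 0.648

for ratio in (1.2, 0.8):  # +20% clock, then -20% clock
    perf = (relative_performance(ratio) - 1) * 100
    power = (relative_power(ratio) - 1) * 100
    print(f"clock {ratio - 1:+.0%}: performance {perf:+.0f}%, power {power:+.0f}%")
```

Raising the clock 20% yields roughly +13% performance for +73% power; lowering it 20% gives up about 13% performance while cutting power to about half.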
Given the current state of processor technology, relying only on clock increases makes the energy consumption and thermal envelope of machines impractical in many situations, rendering investments in better manufacturing processes useless in some cases. However, when those same advances are used to build a CPU with two cores, we see something curious:
Adding a second core to a CPU (with each core running at a slightly lower clock) raises theoretical performance by 73% at a cost of only 2% more energy — and that's before counting the manufacturing improvements chipmakers roll out from time to time. Chips with more than one core thus deliver a lot of speed to applications at relatively low power cost, and that motivated the industry to adopt them in force.
And where does Mac OS X come in?
Using processors with more than one core in Macs has let Apple (and other manufacturers) keep pace with the latest advances in the chip industry. Moore's Law has changed its face, and core count now figures into how computer performance grows. For software experts, however, all this brought a tremendous headache.
Under the old model of CPU performance growth, application developers didn't have to react to new chips at all: clock frequencies went up, machines got faster and applications got faster, without anyone changing a single line of code. With multi-core chips, however, making software more efficient for users no longer means just waiting for new processors — it also means knowing how to use them.
Apple's work in this area began back with Tiger for PowerPC (2004/2005), and since then the company has focused on making it easier for developers to take advantage of dual-core and quad-core processors in their Mac OS X solutions. The first step was to have the system's internal activities automatically run on the second core (assuming a dual-core processor), leaving more processing headroom on the first for the user's applications.
If you open Activity Monitor while reading this article and look at the CPU tab at the bottom, you'll see that Leopard keeps small system tasks running in the background, without getting in your way. With that, the applications you launch from the Dock or Spotlight have more room to respond at full speed.
Still, recent years haven't been easy for developers, who have had to figure out the best ways to take advantage of multiple cores and how to program software for them. Enter a computing paradigm called threading, which lets an application split its operations into threads and run them on multiple cores at the same time.
There's no room to go into detail here, but it's complicated enough that it guarantees nothing about software actually running faster. Apple may chalk this up to "lack of effort", but anyone who develops apps for a living knows how hard threads are to use, and looks for alternatives. GCD is one of them.
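To make the idea of threading concrete, here is a minimal Python sketch (Python standing in for the Objective-C/C code a Mac developer would actually write): the work is split into chunks and handed to a pool of threads. Notice that even this toy case forces the programmer to decide how to partition the work — exactly the burden described above. (In CPython the GIL limits true CPU parallelism for pure-Python code; the point here is the pattern, not a speedup.)

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each thread computes the sum of its own slice of the data.
    return sum(chunk)

data = list(range(1_000_000))
n_threads = 4
size = len(data) // n_threads
chunks = [data[i * size:(i + 1) * size] for i in range(n_threads)]

with ThreadPoolExecutor(max_workers=n_threads) as pool:
    total = sum(pool.map(partial_sum, chunks))

print(total)  # same answer as sum(data), just computed across threads
```

Getting this right in real applications — choosing chunk sizes, avoiding shared-state bugs, keeping every core busy — is where the headache lives.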
Explaining the Grand Central Dispatch
What Grand Central Dispatch does is take away from applications the need to manage their own threads, handing the job of managing the machine's resources over to Mac OS X. Instead of each developer programming their application to make the best use of multiple processor cores, they simply specify how its operations should be organized into queues and blocks, and let Snow Leopard take care of the rest.
The great thing about GCD is that programs consume only the machine resources essential to run as well as possible, and release them when they're no longer needed. Since the new Mac OS X is fully optimized this way, the end result is a much more efficient working environment.
All this management happens dynamically, without requiring action from programmers in most cases. If an application no longer needs a certain number of operations running simultaneously, the idle workers are torn down to make room for other tasks.
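GCD itself is exposed to developers as a C-level API (dispatch queues plus blocks), which there isn't room to show here; but as a loose analogy, Python's executor abstraction captures the shift being described: you submit self-contained units of work to a system-managed queue instead of creating and babysitting threads yourself. All the names below (the `thumbnail` task, the file names) are purely illustrative.

```python
import os
from concurrent.futures import ThreadPoolExecutor

# Instead of the application creating threads, a system-managed pool decides
# how many workers to use based on the machine it's running on.
queue = ThreadPoolExecutor(max_workers=os.cpu_count())

def thumbnail(image_name):
    # A self-contained unit of work (the analogue of a "block"),
    # e.g. generating one thumbnail.
    return f"thumb-{image_name}"

# The program only describes the work; scheduling is the runtime's problem.
futures = [queue.submit(thumbnail, name) for name in ("a.png", "b.png")]
results = [f.result() for f in futures]
print(results)
queue.shutdown()
```

The design payoff is the same one GCD promises: the code above runs unchanged on one core or eight, and the runtime — not the developer — decides how much parallelism the machine can afford.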
To benefit from this in supported applications, you need at least a dual-core Mac. That said, none of these programs will stop working on the first Intel Mac minis with a Core Solo processor, which have a single core: they just won't see the performance gains.
Is it worth trusting the GCD?
This question is aimed more at developers, but it also matters to anyone upgrading to Snow Leopard soon. Because it's a technology that earlier versions of Mac OS X don't support, developers won't feel the need to prepare their applications for it unless most of the current user base updates.
Many people are already tempted by Snow Leopard's low price, but having Grand Central Dispatch distribute your Mac's resources among the system and the programs you use makes everyday tasks much more efficient — and that, by itself, is a good reason to upgrade. Even if the popular software in your day-to-day doesn't take full advantage of GCD at first, it's worth giving it a chance to improve your experience in other areas.
Like 64-bit support, this is the kind of technology that may not show immediate results — which is normal, since Snow Leopard hasn't even been released yet. But the reasons for putting it into practice in this version of Mac OS X also include the many possibilities Apple and other developers should explore in the future, possibilities that will only get off the drawing board if we show confidence at moments like this.
. . .