I have been developing Java applications since the mid-1990s, with a primary focus on J2EE applications for large U.S. corporations. About two years ago, I migrated my entire family away from Windows operating systems to Apple's OS X. Recently I started working with Apple's Cocoa APIs and Objective-C.
This article is the first in a series detailing some of the more important differences between Java and Objective-C. If you have been interested in Cocoa and are coming from Java or another managed language, this compare-and-contrast might be helpful.
The merits of the various approaches to memory management notwithstanding, Java and Objective-C use two very different systems for managing it. Java's system is fully automatic and gives the developer little direct control. Objective-C's system, by contrast, is largely manual and demands the developer's direct attention while the source code is being written.
Java Garbage Collection
Java employs an automatic garbage collector to manage memory inside the virtual machine. Contrary to a common misconception, this collection is not based on reference counting. Instead, the collector determines reachability: starting from a set of root references (local variables on active threads' stacks, static fields, and so on), it traces every object that can still be reached. Any object that cannot be reached from a root is garbage, and its memory can be reclaimed.
As a Java application runs, the garbage collector operates in the background on its own threads. Periodically throughout the life of the application, the collector traces the object graph and identifies objects that are no longer reachable. It is at this point that those objects are removed from the virtual machine and their memory is released.
Note that this memory release does not occur the moment an object becomes unreachable; it occurs at some later point, whenever the collector next runs. This delay, combined with the CPU cost of collection cycles, can cause noticeable pauses (stalls) in a Java application.
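The gap between "unreachable" and "actually reclaimed" is easy to observe with a `java.lang.ref.WeakReference`, which does not keep its referent alive. The sketch below (the `GcDemo` class name is my own) drops the only strong reference to an object, then repeatedly hints at the collector until the weak reference is cleared. Note that `System.gc()` is only a request, not a command, so the loop and sleep are there purely to accommodate the unpredictable timing discussed above:

```java
import java.lang.ref.WeakReference;

public class GcDemo {
    // Returns true once the weak reference has been cleared, meaning the
    // collector actually reclaimed the object. Because collection timing
    // is not guaranteed, we hint and retry rather than assume it is
    // immediate.
    static boolean collected() {
        Object strong = new Object();
        WeakReference<Object> weak = new WeakReference<>(strong);

        strong = null; // now unreachable: *eligible* for collection, not yet collected
        for (int i = 0; i < 50 && weak.get() != null; i++) {
            System.gc(); // a hint to the collector, not a command
            try {
                Thread.sleep(10); // give the collector a chance to run
            } catch (InterruptedException e) {
                break;
            }
        }
        return weak.get() == null;
    }

    public static void main(String[] args) {
        System.out.println("collected: " + collected());
    }
}
```

On a typical desktop JVM the method returns `true` after one or two hints, but a conforming JVM is free to delay collection far longer, which is exactly the unpredictability that causes GC stalls.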