Why Macs have millions of tiny files

#tl;dr THE ARTICLE SAYS THERE IS NO PROBLEM, THE MILLIONS OF TINY FILES ARE ONE OF THE AWESOME BENEFITS OF LINUX/MAC

So, you agree with the article, then.

citation, please.

Those zillions of stray files don't seem to consume much space, but tools like SuperDuper! and Disk Utility "verify disk" need to process each and every one of them in one way or another.

You are probably archiving to a different filesystem; ._filename is the resource fork for filename on a non-HFS volume. I'm not sure if the resource fork gets pulled out the same way upon creating a tarball or compressed archive. I can't even find anything with a resource fork on this recentish clean install.

EDIT: Not to be confused with .DS_Store, which is the Finder's metadata for a folder's default view, window position, and so forth.
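If you're curious how many of these are actually kicking around, a quick count is easy. Here's a minimal sketch (Python 3, just walking whatever directory you point it at; the path handling is mine, nothing Apple-specific) that tallies the AppleDouble ._* files and the .DS_Store files:

```python
#!/usr/bin/env python3
"""Count AppleDouble ("._*") and .DS_Store files under a directory.

Rough sketch: AppleDouble files are how macOS stores resource forks and
extended attributes on non-HFS volumes, and .DS_Store holds the Finder's
per-folder view settings.
"""
import os
import sys

def count_metadata_files(root):
    appledouble = ds_store = 0
    for _dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.startswith("._"):
                appledouble += 1
            elif name == ".DS_Store":
                ds_store += 1
    return appledouble, ds_store

if __name__ == "__main__":
    root = sys.argv[1] if len(sys.argv) > 1 else "."
    ad, ds = count_metadata_files(root)
    print(f"AppleDouble ('._*') files: {ad}")
    print(f".DS_Store files:           {ds}")
```

On an HFS+ volume you'll likely see very few ._* entries; point it at a network share or a FAT-formatted stick and the count tends to climb.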

I've heard that the computer itself is actually made of trillions of tiny molecules. Is that true? Golly, how am I ever going to keep track of all that? Thanks Apple. (sad trombone)

6 Likes

(couldn't find an animated one)

3 Likes

You know what would be great? If you Windows Fanbois weren't so quick to assume that everything Mac was bad. Steve Jobs designed those millions of tiny flies because He knew that millions of tiny flies would offer a better user experience for everyone except Neanderthals who are stuck on M$ Wind0wz. You think you don't want millions of tiny flies, but the moment Windows 10 comes out with millions of tiny flies you're going to be loving them. Well, don't forget to thank Steve Jobs. In heaven.

3 Likes

Each of those flies is a reincarnation of Steve Jobs.

I asked for a citation of where the article author believes the number of files causes slowdown.

But you provided a quote from "Doug":

Doug Eldred writes in with a concern about a form of file bloat – but not about bloated sizes. Rather, the sheer number of items that seem to appear on his drive.

[…]

Doug continues:

Those zillions of stray files don't seem to consume much space, but tools like SuperDuper! and Disk Utility "verify disk" need to process each and every one of them in one way or another.

So, back in context, it's a question that is being asked by a reader, not the author.

And, lo! There is an answer from the author that lays out his opinion on files causing slowdowns:

To my recollection and experience, the number of files shouldn't contribute to any system slowdowns, because they're inert unless needed. [emphasis added]

The article goes on to play devil's advocate and explore whether or not those files could actually cause slowdowns. TL;DR: they don't.

OK, who prefers a monolithic registry file that is impossible to back up, change-control, clean up, or repair after corruption?

3 Likes

Meh, it pays the mortgage.
After fiddling with OSes since CP/M, DOS, the Apple II, etc., I have gotten to the point where, you know, they all suck, just not at the same things. Sadly, I am too familiar with the bowels of the Windows registry.

1 Like

Bundling of applications, including configuration metadata and selected dependencies, to support portability?

It is a NeXT design. Yeah! NeXT!

OS X was, in 1998/9, really in most regards a successor version of NeXTStep, with MacOS compatibility and UI conventions.

This Mac compatibility was partially achieved by the App Container bundle. It also assisted in bridging NeXTStep hardware platforms from the Motorola 680x0 to PowerPC, SPARC, and even Intel. Some of these ideas were repeated in MacOS, where a single "Fat Binary" would run on multiple OS frameworks or even processor platforms, from a single compile by the developer.

NeXT! Yeah!

2 Likes

I really liked the NeXT, though I only got to do cursory playing around with them in the university lab.

System restore?

This behavior is really endearing when the macs start doing it all over your non-HFS fileserver, let me tell you.

2 Likes

Man. I had "unauthorized" access to the Media Lab NeXT Cube for almost two years. The problem? Only a 1 MB disk quota… on 1 GB optical.

This ran TIA for me - as my first regular Internet pipe at home - via shell to the NeXT. 1993.

Agreed. The real question is when to use a database and when to use lots of little files, and without knowing a lot more about their architecture, I'm not going to argue that Spotlight or Time Capsule do things WRONG. But really, HFS+ can hold more than 2 BILLION files in a single directory, so a few million total should be no big deal.

Some trimming of what Spotlight is indexing and how far back you keep backups might go a very long way toward improving performance, regardless of DB vs. filesystem strategy.
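If anyone wants to poke at that before changing anything, here's a read-only sketch (assuming Python 3 on macOS; mdutil and tmutil are the stock Spotlight and Time Machine command-line tools, and the paths here are just placeholders):

```python
#!/usr/bin/env python3
"""Peek at Spotlight indexing status and Time Machine exclusions (macOS).

Read-only sketch: "mdutil -s" reports whether a volume is being indexed,
and "tmutil isexcluded" reports whether a path is excluded from backups.
Nothing is changed by running this.
"""
import os
import subprocess
import sys

def spotlight_status(volume="/"):
    # "mdutil -s <volume>" prints the Spotlight indexing status for that volume.
    return subprocess.run(["mdutil", "-s", volume],
                          capture_output=True, text=True).stdout.strip()

def time_machine_excluded(path):
    # "tmutil isexcluded <path>" prints [Excluded] or [Included] plus the path.
    return subprocess.run(["tmutil", "isexcluded", path],
                          capture_output=True, text=True).stdout.strip()

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else os.path.expanduser("~")
    print(spotlight_status("/"))        # indexing status of the boot volume
    print(time_machine_excluded(path))  # is this path excluded from backups?
```

From there, the actual trimming is a judgment call: Spotlight's Privacy list and tmutil's exclusions are where you'd do it, but what's safe to exclude depends entirely on what you search for and back up.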

The archives are typically ZIP and RAR archives, fwiw.

And what's with finding a buttmess of .goutputstream files on my Linux system? Everything I find on the forums says that it's more a bug than anything else, so it's okay to unload them like I've been doing.
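For what it's worth, those appear to be GLib/GIO leftovers from interrupted "safe save" operations. If unloading them on sight makes anyone nervous, here's a dry-run sketch (Python 3; the one-week age cutoff is arbitrary, and it only prints, never deletes):

```python
#!/usr/bin/env python3
"""List stale .goutputstream-* files under $HOME (dry run, nothing deleted).

GLib/GIO creates these as temporary files during safe-save operations;
leftovers usually mean a save was interrupted, so old ones are worth a look.
"""
import time
from pathlib import Path

MAX_AGE_DAYS = 7  # only report files older than this

def stale_goutputstream_files(root=Path.home(), max_age_days=MAX_AGE_DAYS):
    cutoff = time.time() - max_age_days * 86400
    for path in root.rglob(".goutputstream-*"):
        try:
            if path.is_file() and path.stat().st_mtime < cutoff:
                yield path
        except OSError:
            # File vanished or is unreadable; skip it.
            continue

if __name__ == "__main__":
    for p in stale_goutputstream_files():
        print(p)
```

Once you're happy with what it lists, deleting them by hand (or swapping the print for a path.unlink()) is about as involved as the cleanup needs to get.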

I thought the whole point of the optical drive was to give each user his own space…

http://simson.net/ref/NeXT/brochure_storage.htm