
Wasting time by saving memory

12.16.2012

Overview

Memory and disk space are getting cheaper all the time, but the cost of an hour's development is increasing. Often I see people trying to save memory or disk space that literally isn't worth worrying about.

I have a tendency to do this myself, because I can, not because it is a good use of my time. ;)

Costs for comparison

Cheap memory - You can buy 16 GB of memory for £28.
Expensive memory - You can buy 1 GB in a phone for £320 (the entire cost of the phone).
Cheap disk space - You can buy 3 TB of disk for £120.
Expensive disk space - You can buy 1 TB of the fastest RAID-1 PCI SSD for £2,000.
The living wage in London is £8.55 per hour.

You might say that at your company the hardware is 10x more expensive, but it is also likely that your time costs the company about 10x more.

In any case, this article attempts to demonstrate that there is a tipping point beyond which it no longer makes sense to spend time saving memory, or even thinking about it.

time spent                       cheap memory   expensive memory   cheap disk   expensive disk
a screen refresh (20 ms)              27 KB          150 bytes         1 MB           24 KB
one trivial change (~1 sec)          1.4 MB             7.6 KB        60 MB          1.2 MB
one command (~5 sec)                   7 MB              50 KB       400 MB            6 MB
a line of code (~1 min)               84 MB             460 KB     3,600 MB           72 MB
a small change (~20 min)           1,600 MB               9 MB    72,000 MB          1.4 GB
a significant change (~1 day)         40 GB             0.2 GB     1,700 GB           35 GB
a major change (~2 weeks)            390 GB               2 GB    17,000 GB          340 GB
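As a sanity check, each row follows from simple arithmetic on the prices above. Here is a minimal sketch in Java (the class name is mine, purely for illustration) reproducing the "a line of code" row:

// Back-of-the-envelope check using the article's own figures:
// £28 buys 16 GB of cheap memory, and a minute of time at the
// London living wage costs £8.55 / 60.
public class TippingPoint {
    public static void main(String[] args) {
        double poundsPerGb = 28.0 / 16.0;      // ~£1.75 per GB of cheap memory
        double poundsPerMinute = 8.55 / 60.0;  // ~£0.14 per minute of time
        double mbPerMinute = poundsPerMinute / poundsPerGb * 1024.0;
        // Prints ~83 MB, matching the ~84 MB "a line of code" row above.
        System.out.printf("One minute of time buys ~%.0f MB of cheap memory%n",
                mbPerMinute);
    }
}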

Your mileage may vary, but just today someone asked how to save a few bytes by passing short instead of int as method arguments (Java doesn't save any memory if you do). Even if it did save as much as it might, the time taken to ask the question, let alone implement and test the change, could be worth 10,000,000 times the cost of the memory it could have saved.
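To make the short-versus-int point concrete: the JVM stores boolean, byte, char, short and int values each in a single 32-bit local-variable slot, so narrowing the parameter type changes nothing about the stack frame size. A minimal sketch (the class and method names are hypothetical):

public class ShortVsInt {
    // byte, short, char and int each occupy one 32-bit local-variable
    // slot in the JVM, so both methods have identical frame sizes.
    static int sumInt(int a, int b)       { return a + b; }
    static int sumShort(short a, short b) { return a + b; }

    public static void main(String[] args) {
        // "javap -v ShortVsInt" shows the same stack/locals sizes for both.
        System.out.println(sumInt(1, 2));
        System.out.println(sumShort((short) 1, (short) 2));
    }
}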

In short: don't fall into the trap of a mind-boggling imbalance of scale.
Published at DZone with permission of Peter Lawrey, author and DZone MVB.


Comments

Wal Rus replied on Sun, 2012/12/16 - 1:30am

Computers can't keep up with the rate at which programs get bigger and slower, wasting more and more of my valuable time.

David Gates replied on Sun, 2012/12/16 - 7:49pm in response to: Wal Rus

They can and do, in my experience. I look at the programs I considered bloated and slow on my older workstation-class systems, and even though the new versions of those programs are often bulkier, they range from lightweight to trivial for my modern budget box.

Donald Knuth's famous advice is "We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil."  As available memory grows, so too does the level at which a memory optimization counts as a large efficiency.

Dapeng Liu replied on Sun, 2012/12/16 - 9:42pm

Rich Cook: "Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots. So far, the Universe is winning."

Wal Rus replied on Sun, 2012/12/16 - 10:23pm

I see your point, David. Perhaps an effective feedback loop is the solution: users need to point out the hotspots, and developers need to respond more quickly.
