Amdahl's Law


In computer architecture, Amdahl's law (or Amdahl's argument[1]) is a formula that shows how much faster a task can be completed when you add more resources to the system. The law can be stated as: "the overall performance improvement gained by optimizing a single part of a system is limited by the fraction of time that the improved part is actually used".[2] It is named after computer scientist Gene Amdahl, and was presented at the American Federation of Information Processing Societies (AFIPS) Spring Joint Computer Conference in 1967.
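The statement above corresponds to the usual formula S = 1 / ((1 − p) + p/s), where p is the fraction of runtime spent in the improved part and s is the local speedup of that part. A minimal sketch (the function name and example numbers are illustrative, not from the source):

```python
def amdahl_speedup(p: float, s: float) -> float:
    """Overall speedup when a fraction p of the runtime
    is sped up by a factor s (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / s)

# Example: speeding up 95% of a task by 20x gives only ~10.3x overall,
# because the untouched 5% dominates once the rest is fast.
print(round(amdahl_speedup(0.95, 20), 1))  # -> 10.3
```

Note how the unimproved fraction (1 − p) bounds the result: even an infinite local speedup cannot push the overall speedup past 1 / (1 − p).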

An implication of Amdahl's law is that to speed up real applications, which have both serial and parallel portions, heterogeneous computing techniques are required.
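This implication can be seen by treating the improvement factor as a processor count: the parallel portion scales with more cores, but the serial portion caps the result, so piling on identical cores hits diminishing returns. A small sketch under an assumed 90% parallelizable workload (the fraction is illustrative):

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Speedup on n processors when a fraction p
    of the runtime is parallelizable."""
    return 1.0 / ((1.0 - p) + p / n)

# With p = 0.90, speedup approaches but never reaches 1/(1-0.90) = 10,
# no matter how many processors are added.
for n in (2, 8, 64, 1024):
    print(f"{n:5d} processors -> {amdahl_speedup(0.90, n):.2f}x")
```

Since homogeneous scaling stalls against this ceiling, a practical path to further gains is speeding up the serial portion itself, e.g. with a faster specialized core alongside many simple ones, which is the heterogeneous approach the text refers to.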
