These terms are used to classify resolutions based on horizontal pixel count. "2K" refers to resolutions that have around 2,000 (2K) pixels horizontally. Examples include:
- 1920 × 1080 (16:9)
- 1920 × 1200 (16:10)
- 2048 × 1080 (≈19:10)
- 2048 × 1152 (16:9)
- 2048 × 1536 (4:3)
All of these are 2K resolutions, including both 1920×1080 and 2048×1080. 2560×1440, however, is not a 2K resolution; it is a 2.5K resolution.
"2.5K" refers to resolutions around 2,500 (2.5K) pixels horizontally. For example:
- 2304 × 1440 (16:10)
- 2400 × 1350 (16:9)
- 2560 × 1080 (64:27 / ≈21:9)
- 2560 × 1440 (16:9)
- 2560 × 1600 (16:10)
All of these are examples of 2.5K resolutions.
So why do people call 2560×1440 "2K"?
Because when "4K" was new to the consumer market, people would ask "What's 4K?", and the usual response was "it's four times as many pixels as 1080p". Unfortunately, many people misinterpreted this and assumed that the "4" in "4K" stood for "how many times 1080p" the resolution was. And since 2560×1440 is popularly known as being "twice as many pixels as 1080p" (it's actually about 1.78 times, but close enough), some people started calling it "2K", and others heard that and repeated it.
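The pixel-count arithmetic behind that misconception is easy to check; here is a quick sketch in Python:

```python
# Total pixel counts for each resolution
fhd = 1920 * 1080   # 2,073,600 pixels (1080p)
qhd = 2560 * 1440   # 3,686,400 pixels (1440p)
uhd = 3840 * 2160   # 8,294,400 pixels (4K UHD)

# 2560×1440 is often described as "twice" 1080p, but the real ratio is ~1.78
print(round(qhd / fhd, 2))  # 1.78

# 3840×2160, on the other hand, really is exactly four times 1080p
print(uhd / fhd)            # 4.0
```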
While it’s true that 4K UHD (3840×2160) is four times as many pixels as 1920×1080, that isn’t why it’s called "4K". It’s called 4K because it's approximately 4,000 pixels horizontally. The fact that it’s also 4 × 1080p is just a coincidence, and that pattern doesn’t continue with other resolutions.
For example, the 5K resolution featured in the Retina 5K iMac, 5120×2880, is equivalent to four 2560×1440 screens. If 1440p were "2K" because it's twice as many pixels as 1080p, then shouldn't four of them together be called "8K"? (Well, technically 7K, since as noted above 1440p is about 1.78 times 1080p, not 2 times, but that's beside the point.) We don't call it 7K or 8K. We call it 5K, because it's around 5,000 pixels horizontally. It has nothing to do with "how many times 1080p" the resolution is.
In addition, an actual 8K resolution such as 8K UHD (7680×4320) is equivalent to four 4K UHD screens. A single 4K UHD screen is four times as many pixels as 1080p, so four of those together is sixteen times as many pixels as 1080p. But 7680×4320 isn't called "16K", it’s called "8K", because it’s approximately 8,000 pixels horizontally. Again it doesn't have anything to do with "how many times 1080p" the resolution is.
So although 2560×1440 is around twice as many pixels as 1080p, it is not called "2K", because that isn’t where these names come from. Since 2560×1440 is approximately 2,500 pixels horizontally, it falls into the 2.5K classification.
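The classification described above can be sketched as a rough rule of thumb: round the horizontal pixel count to the nearest 500 and express it in thousands. This is purely illustrative, not an official formula; the real naming is convention-based and has exceptions.

```python
def k_class(width: int) -> str:
    """Approximate the conventional "K" class of a resolution from its
    horizontal pixel count by rounding to the nearest 500 pixels.
    Note: only a rough rule of thumb. It has edge cases, e.g. 7680
    rounds to "7.5K" here even though convention calls it 8K."""
    k = round(width / 500) / 2   # e.g. 2560 -> 2.5
    return f"{k:g}K"

print(k_class(1920))  # 2K
print(k_class(2048))  # 2K
print(k_class(2560))  # 2.5K
print(k_class(3840))  # 4K
print(k_class(4096))  # 4K
print(k_class(5120))  # 5K
```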
Examples of How the Cinematography Industry Uses These Terms
RED Camera:
[Image: resolution format charts from the RED Scarlet-W manual]
RED Scarlet-W Manual
In the charts above, the naming convention is made pretty clear, though it's not without its inconsistencies. For example, every 6:5 format has a far lower horizontal pixel count than its name suggests since these formats are intended to be used with anamorphic lenses, and the images will have a wider horizontal pixel count once they are de-squeezed. There are other minor oddities like 5568×3160 being classified as 6K while 5632×2948 is classified as 5.5K, but this is somewhat expected since this naming convention does not have any "official" set of rules for determining names, it's all just convention-based. In any case, despite the occasional deviation, the main pattern of the naming convention quite clearly follows the horizontal pixel count, and definitely not "how many times 1080p".
Just to sum up some of the more interesting parts of the above charts from the RED Scarlet-W manual:
- 1920×1080 is listed as "2K 16:9 (HD)".
- 2560×1080 is listed as "2.5K 2.4:1". Although it is an "ultrawide" version of 1920×1080 (2K 16:9), calling it "2K ultrawide" is improper usage of the term 2K, as it is a 2.5K resolution, not 2K.
- 2560×1340 is listed as "2.5K Full Frame"; it's safe to say that if 2560×1440 were included on the list, it would be classified as a 2.5K resolution as well. (You might think "1340" is just a typo for "1440", but it's more likely a typo for "1350", which would make it a 256:135 (≈19:10) ratio, consistent with the other full-frame resolutions listed.)
- 3840×2160 and 4096×2160 are both classified as 4K resolutions. 4096×2160 is not "the only" 4K resolution.
- 5120×2160 (ultrawide version of 3840×2160) is listed as "5K 2.4:1". Calling it "4K ultrawide" is improper usage of the term 4K, as it is a 5K resolution, not 4K.
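The aspect-ratio reasoning in the 2560×1340 bullet above can be verified by reducing each ratio with a GCD. A quick sketch:

```python
from math import gcd

def ratio(w: int, h: int) -> str:
    """Reduce a width×height pair to its simplest aspect ratio."""
    g = gcd(w, h)
    return f"{w // g}:{h // g}"

print(ratio(2560, 1350))  # 256:135  (≈19:10)
print(ratio(4096, 2160))  # 256:135  (the same ≈19:10 full-frame ratio)
print(ratio(1920, 1080))  # 16:9
```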
Canon EOS C500
[Image: recording format chart from the Canon EOS C500 documentation]
Note here that 2048×1080 and 1920×1080 both fall under the "2K" categories. 2K definitely does not refer to 2560×1440 or similar resolutions. 4096×2160 and 3840×2160 are also both classified as "4K" resolutions. 4096×2160 is not "the only" 4K resolution.
"True 4K"
Some people will get upset when you call 3840×2160 "4K", and will say:
"3840×2160 isn’t ‘4K’, that’s ‘UHD’! True 4K is 4096×2160!"
And some go as far as saying 4K TVs are a consumer scam because they're not "real 4K". This is nonsense, really. As explained at the top, "4K" isn’t a resolution. It’s a category. The term is used to refer to any resolution approximately 4,000 (4K) pixels horizontally, for example:
- 3840 × 1600 (24:10 / ≈21:9)
- 3840 × 2160 (16:9)
- 3840 × 2400 (16:10)
- 4096 × 2160 (≈19:10)
- 4096 × 2304 (16:9)
- 4096 × 2560 (16:10)
- 4096 × 3072 (4:3)
All of these are examples of 4K resolutions. None of them is "the one true" 4K resolution, because there is no such thing. They are all classified as 4K resolutions, and neither 3840×2160 nor 4096×2160 is more "true" than the other.
In digital cinema, where these terms originate, "4K" is and always has been a generic term that refers to a class of resolutions, not any one specific resolution. You may notice that the idea that 4096×2160 is the "true 4K definition" used in cinema is held only by consumer-level internet commentators, not by anyone actually involved in cinema.
Yes, 4096×2160 is established as a standard resolution by the DCI specification, and they do refer to it as 4K, but that is not a term that they came up with, it's only a generic term. It's no more of a name for 4096×2160 than if you wrote a new standard saying "we're going to establish a standardized 16:9 resolution, 1600×900" and then you had a bunch of people running around on the internet saying "1600×900 is the true 16:9 resolution, it's defined in this standard, 1920×1080 isn't really 16:9!"
4096×2160 is not "the definition" of 4K; it is just one of several standardized 4K resolutions, just as we have several standardized 16:9 resolutions but none of them is "the definition" of 16:9, because 16:9 isn't a resolution, it's a category (in this case, anything with a width-to-height ratio of 16:9 fits in that category). And in both cases, it's not really about which resolutions are established by standards. If you have a resolution with a ratio of 16:9, then it's a 16:9 resolution; it doesn't need a standards document to go with it. The same applies to 4K: any resolution with ≈4,000 horizontal pixels is a 4K resolution, because that's the definition of 4K.
The entire "4096×2160 is true 4K" thing was completely made up by tech news websites when 4K TVs were first starting to appear. Of course, all the major tech websites wanted to write a "4K explained" article, and of course being consumer-level writers, they themselves knew nothing about the established usage of the term "4K" (which had been used for years in cinema at this point).
Long story short, all the "4K explained" articles were written by consumers who knew nothing about cinema, based on a few Google searches for "4K" in which these writers saw that 4096×2160 was mentioned a lot (since it is quite a common resolution), investigated a little, saw that it was a DCI standard, and leapt to the conclusion that "4K" was a unique name referring exclusively to 4096×2160, in the same way that "Full HD" refers to 1920×1080.
Unfortunately their little assumption was completely wrong, and none of them researched enough to understand how the term "4K" was (is) actually used in industry. But boy did it catch on. Mostly, I suspect, because people on the internet like the feeling of knowing things that other people don't know, or feeling that they're doing things (or saying things) "how the pros do it", and believing the 4096×2160 true 4K thing makes them feel as though they have special cinema industry insider knowledge. Sadly, the entire thing was completely made up by consumers. Sorry to say.
"UHD" is not a name for 3840×2160
Secondly, UHD is not a name for 3840×2160. The whole "4096×2160 is 4K, and 3840×2160 is just called UHD" thing is entirely wrong; both of those resolutions are 4K resolutions, and in fact both of them are UHD resolutions as well. UHD is a term created by the CEA as a marketing standard to refer to displays that meet certain requirements. The relevant part of the definition specifies a minimum resolution of at least 3840×2160, with an aspect ratio of 16:9 or wider.
UHD is basically a class of display. Since the definition says at least 3840×2160 and 16:9 or wider, higher resolutions and wider ratios, such as 4096×2160, or even 5120×2880 and above, or ultrawide resolutions, all qualify as "UHD resolution". UHD does not have to be 3840×2160, or even a 4K-class display at all.
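The resolution part of that rule (at least 3840×2160, with a ratio of 16:9 or wider) can be written as a simple predicate. This is a sketch only; the actual CEA definition also covers display requirements beyond resolution, which are omitted here.

```python
def qualifies_as_uhd(width: int, height: int) -> bool:
    """Check only the resolution portion of the UHD definition:
    at least 3840×2160, with an aspect ratio of 16:9 or wider.
    (The real CEA definition includes further display requirements.)"""
    return width >= 3840 and height >= 2160 and width / height >= 16 / 9

print(qualifies_as_uhd(3840, 2160))  # True
print(qualifies_as_uhd(4096, 2160))  # True  (wider than 16:9)
print(qualifies_as_uhd(5120, 2880))  # True  (5K still qualifies)
print(qualifies_as_uhd(2560, 1440))  # False (below 3840×2160)
```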
3840×2160 is established as a standard by the ITU, and that standard never defines "UHD" as a name for 3840×2160. This is, again, something made up by the internet because "4096 is 4K, and 3840 is UHD" is simple and easy to say, so it catches on easily.
This standard does the exact same thing, establishing "4K" and "8K" as shorthand terms for discussing the formats in the context of the standards document. It does not mean 4K is "the name" for 3840×2160, just as the DCI specification's usage of the term does not mean 4K is "the name" for 4096×2160 either. It is just a general term used in the industry for anything ≈4,000 pixels horizontally, but of course may have specific meanings within certain standards documents, which are made clear in the documents themselves and only apply within that respective document.
There is no sense in which 3840×2160 is "just called UHD", or in which 4096×2160 is the "one true" 4K resolution.
Q: "Why are we suddenly using horizontal pixels anyway? Vertical makes much more sense! The '4K' name is just marketing gibberish created by TV companies!"
The "K" shorthand was not created by TV companies. It is borrowed from the cinematography industry, where it has been used commonly for years prior to the introduction of 4K TVs to the consumer market. In cinematography, it makes much more sense to use horizontal resolution to classify images, because movies are often cropped vertically (black bars on top and bottom), so naming resolutions by vertical pixel count would mean the resolution classification of the material would change just based on how much black bar is added, even though the detail level of the image hasn’t changed. Since resolution is used to classify detail level, it doesn’t make any sense to have a classification system that changes designations when the detail level remains the same.
Instead, horizontal resolution is used to classify images, so that images with the same level of detail will be classified in the same group regardless of what aspect ratio has been chosen for the material.
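To see why width-based naming is stable under cropping, consider a DCI 4K frame letterboxed to the 2.39:1 "scope" ratio (4096×1716, a standard DCI crop):

```python
full = (4096, 2160)   # DCI 4K full container
scope = (4096, 1716)  # the same material cropped to the 2.39:1 "scope" ratio

# Width-based naming: both are ~4,000 pixels wide, so both stay "4K".
print(full[0], scope[0])   # 4096 4096

# Height-based naming would relabel the same detail level from
# "2160p" to "1716p" just because the frame was cropped.
print(full[1], scope[1])   # 2160 1716
```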
However, in gaming, images are not cropped vertically when moving to a wider aspect ratio; instead, they are expanded horizontally. Because the content is rendered in real time, it is possible to generate additional new content to fill the extra width, rather than just expanding the existing image to fill the screen and cropping the top and bottom off. In this case, classifying resolutions by width alone isn't all that useful, because the aspect ratio plays a much larger role in how the content appears on the screen.
But since the TV industry is more concerned with cinematic material than it is with gaming, they chose to use the "K" shorthand that is used in cinematography.