4K vs UHD
-
@Nic said:
I'm betting that UHD is more for television broadcast, and is more easily understood by the consumer. HD they know now, and UHD is better because it's "ultra". 4K is already accepted nomenclature among the nerds and gamers, and I think we'll continue to see that used along with specific resolutions like 2160p.
So, you're basically saying that "UHD" and "4K" are the "organic" of nerd culture. I agree.
-
I honestly think having anything other than a straight resolution is a problem. Having names for resolutions and colour settings like "VGA" made sense in the 1980s; not sure it does today. All of those names appear to exist for the purpose of misleading, not informing. And unless you have a certifying body that makes the standards and enforces them, they are just loose terms. HD, UHD... people have been using those things for decades. We were calling things HD in the 80s, and I'm sure they were in the 60s too. But suddenly it meant 480p to some people and 720p to others, and then those were considered lies and only 1080p was HD. Some people thought 720i could be HD; others did not.
None of these terms are useful.
-
@scottalanmiller said:
I honestly think having anything other than a straight resolution is a problem.
Sounds a little homophobic...
-
@scottalanmiller There's a whole lot of terms, and every single one is standardized; it's just that the general public doesn't know what each one means (but isn't that always the case?), and neither do people in marketing for said products. 720p and 720i are both HD, and 1080p and 1080i are FHD (Full High Definition). 4K is actually 4096 x 2160 (roughly a 1.9:1 aspect ratio), which is not the consumer standard for "4K", which is actually 3840 x 2160, and is technically called UHD.
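For reference, the two resolutions mentioned above work out to noticeably different aspect ratios. A quick sketch (the labels are just for illustration):

```python
from math import gcd

# Compare the cinema (DCI) and consumer (UHD) "4K" resolutions.
for label, w, h in [("DCI 4K", 4096, 2160), ("UHD", 3840, 2160)]:
    g = gcd(w, h)
    print(f"{label}: {w}x{h} = {w // g}:{h // g} (~{w / h:.3f}:1)")
```

DCI 4K reduces to 256:135 (about 1.896:1), while consumer UHD is exactly 16:9 (about 1.778:1), so the two are not the same shape, let alone the same pixel count.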
-
I agree Scott - HD and UHD are non-specific marketing terms. HD for consumer broadcast can mean 720p or 1080p, and I'm assuming UHD will have some equivalent fuzziness.
-
@Mike-Ralston said:
@scottalanmiller There's a whole lot of terms, and every single one is standardized, it's just that the general public doesn't know what each one means (but isn't that always the case?), and neither do people in marketing for said products. 720p and 720i are both HD, 1080p and 1080i are FHD (Full High Definition).
Those are later changes to pre-existing terms. HD is not something that can be standardized since the term vastly predates any "after the fact" standardization. This is a misuse of the term standardize. HD, even in the 480 era was old, and was used differently by everyone.
Who, exactly, do you feel owns these terms to standardize them? 4K might have an owner as that is a new term AFAIK, but HD is an old one and does not have an owner.
-
@Mike-Ralston said:
4K is actually 4096 x 2160 (roughly a 1.9:1 aspect ratio), which is not the consumer standard for "4K", which is actually 3840 x 2160, and is technically called UHD.
According to @thecreativeone91, UHD and 4K are different standards. And UHD can't be standardized due to the nature of its name (again, just a loose, long-running description). And he works in commercial video, so he probably knows.
HD is like HA: it's a moving target. You can't legitimately even call 1080p HD today; it is, at best, standard definition. It's not high by any standard, low by many, but it still owns a lot of the mainstream.
-
I thought normal tv programs were still broadcast in 720p due to bandwidth.
-
I agree with Scott; names instead of actual resolutions are no longer useful. I'm sure they're only used because sales can use them to confuse and oversell to consumers.
-
@scottalanmiller I don't know who set the standards, probably FCC, ISO, EGA, etc., but they are indeed standards, and have been since they were conceived... Marketing teams may use them incorrectly, but that doesn't change that HD is a standard, as is HD+, FHD, UHD, UXGA, and so on and so forth.
-
@Dashrender said:
I thought normal tv programs were still broadcast in 720p due to bandwidth.
"Normal TV" is something that has become a backwater. But many, many years ago the last time that I looked into it, even rural programs out here in the middle of nowhere were broadcast higher than that commonly. And in Spain I know that over the air is on 4K.
-
@Mike-Ralston said:
Marketing teams may use them incorrectly, but that doesn't change that HD is a standard, as is HD+, FHD, UHD, UXGA, and so on and so forth.
Agreed
-
@Mike-Ralston said:
@scottalanmiller I don't know who set the standards, probably FCC, ISO, EGA, etc., but they are indeed standards, and have been since they were conceived...
This is completely untrue. The names predate those things. Some, like 4K, might have a ratifying body; HD simply cannot. There is no way to standardize a name of that sort. It predates the FCC. It cannot be standardized, as it is an English term that means something.
-
@Mike-Ralston said:
Marketing teams may use them incorrectly, but that doesn't change that HD is a standard, as is HD+, FHD, UHD, UXGA, and so on and so forth.
Nor does marketing claiming that something is a standard make it so. HD simply predates the standard. We had HD when I was little and no one had dreamed up 480p yet.
-
@Mike-Ralston said:
@scottalanmiller I don't know who set the standards, probably FCC, ISO, EGA, etc., but they are indeed standards, and have been since they were conceived... Marketing teams may use them incorrectly, but that doesn't change that HD is a standard, as is HD+, FHD, UHD, UXGA, and so on and so forth.
According to Wikipedia, even 4K is not a single standard, just a loose group of standards and assumptions. Some people have made standards around it, but someone making "a" standard is not the same as the term being "the" standard. Anything near 4,000 pixels of horizontal resolution counts as 4K according to the page.
So even the terms that are new and could, in theory, be standards, are not:
-
DCI 4K should not be confused with ultra-high-definition television (UHDTV) AKA "UHD-1", which has a resolution of 3840 x 2160 (16:9, or approximately a 1.78:1 aspect ratio). Many manufacturers may advertise their products as UHD 4K, or simply 4K, when the term 4K is traditionally reserved for the cinematic, DCI resolution.[3][4] This often causes great confusion among consumers.
Wikipedia.
It is interesting that the "standard" use of 4K is for something that should be called 2K or just 2160p.
-
2160p/i (what people call UHD), 1080p/i, and 720p/i are all 16:9 resolutions; these specs are made by the NAB for broadcast ("p" is progressive, "i" is interlaced). The term "UHD" is not a spec; it's a marketing term introduced by the CEA (the organization behind CES). Really, all it means is higher resolution than 1080 HD.
Broadcast standards have to be made (NAB) and then approved (FCC). Currently this is an MPEG-2 stream (slightly better than DVD quality). The NAB wants this to change by late 2016 to an H.265 (HEVC) codec for a 2160p (UHD) stream. That would mean your 1080p/720p set would need a converter box to decode the H.265 stream into a signal the TV knows how to display.
Film standards are made by DCI; they aren't approved by anyone (there's no transmission), they just become accepted standards. The two will likely never be the same, for a number of reasons, one of them being incompatible aspect ratios. It's easy to put 16:9 or 4:3 content on a 16:9 TV, or to modify a CinemaScope aspect (there are a few different ones) to fit a 16:9 display. Fitting 16:9 onto a film aspect would waste much more space, as films are shot with a wider aspect than TV.
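The "wasted space" point can be made concrete. A minimal sketch (resolutions chosen purely for illustration) that scales 16:9 content to fit inside a wider DCI 4K frame and reports how much of the frame goes unused:

```python
def fit_and_waste(src_w, src_h, dst_w, dst_h):
    """Scale src to fit inside dst, preserving aspect ratio.
    Returns the fraction of the dst frame left as black bars."""
    scale = min(dst_w / src_w, dst_h / src_h)
    used = (src_w * scale) * (src_h * scale)
    return 1 - used / (dst_w * dst_h)

# 16:9 UHD content pillarboxed into a DCI 4K (~1.9:1) frame
waste = fit_and_waste(3840, 2160, 4096, 2160)
print(f"{waste:.1%} of the frame is black bars")
```

For 16:9 inside DCI 4K the loss is modest (about 6%), but the wider the mismatch between aspects, the larger that wasted fraction grows.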
-
UHD is standardized only in Japan, it looks like, but it is still an English language description that can't be a standard, in general terms.
-
@Dashrender said:
I thought normal tv programs were still broadcast in 720p due to bandwidth.
It depends on the broadcaster; they can easily do 1080p, and many here are. But the secondary channels tend to be 720p. The hardware encoders required are much more costly at 1080 than at 720, just as progressive is more expensive than interlaced.
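The bandwidth pressure behind that choice is easy to see from raw pixel rates alone, before any compression. A rough sketch, assuming 60 Hz formats:

```python
# Raw (uncompressed) pixel throughput per second, ignoring chroma subsampling.
formats = {
    "720p60":  (1280, 720, 60),   # 60 full frames/s
    "1080i60": (1920, 1080, 30),  # interlaced: half the pixels per second of 1080p60
    "1080p60": (1920, 1080, 60),
    "2160p60": (3840, 2160, 60),
}
for name, (w, h, fps) in formats.items():
    print(f"{name}: {w * h * fps / 1e6:.0f} Mpixels/s")
```

1080p60 pushes roughly twice the pixels per second of 1080i60, and 2160p60 is four times 1080p60, which is why encoder cost and channel bandwidth climb so steeply up the ladder.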
-
@scottalanmiller said:
Nor does marketing claiming that something is a standard make it so. HD simply predates the standard. We had HD when I was little and no one had dreamed up 480p yet.
Just because people said "high definition," those were just words people used to describe it. High Definition is now a standard, and since the resolution was developed, it has been called as such. And regardless of WHEN the terms were coined, they're still the industry standard, and the lines are very clear as to what term applies to what resolution. The only time this isn't clear is when people haven't been educated about the proper terms, which is perfectly fine, but the industry still uses accepted standards. Everything in the electronics industry is carefully categorized, and display resolutions are no different. And, as always, marketing is usually not correct, and popular belief tends to be based on what someone saw on Facebook or in a TV advertisement.