4K vs UHD
-
@Mike-Ralston said:
they'll tell you all the same things I am.
Considering I'm friends with a lot of these people, I can tell you they will not tell you the things you said. You are listing off irrelevant standards. VESA standards are low-level standards for how the electronics work, device IDs, etc. They're more relevant to computer video than to TV, though they do play a role. It looks like you are just spitting out something Google told you.
-
@scottalanmiller said:
@Mike-Ralston said:
That is incorrect, HD is SPECIFICALLY 1280 Horizontal by 720 Vertical, Progressive Scan.
That's completely made up. That is in no way an agreed-upon use of English, common knowledge, or an industry-accepted standard of anything. If there is one thing that HD routinely is not, this is it.
Pretty much never do they mean that, unless they are a marketing person trying to pull something over.
-
I have super duper ultra high awesome definition TM!
-
@Mike-Ralston said:
@thecreativeone91 Mmmk, go talk to some people who set the standards by being industry leaders...
To whom have you spoken about this?
-
I think that this sums it up...
-
@thecreativeone91 said:
Granted Hulu does this I think.
Yeah, they suck.
-
@scottalanmiller said:
I think that this sums it up...
stand·ard
/ˈstandərd/adjective
- used or accepted as normal or average.
-
@scottalanmiller said:
@thecreativeone91 said:
Granted Hulu does this I think.
Yeah, they suck.
Slightly worse than the cable companies... at least I am watching the shows I want to watch when I want to watch them. Granted, I'd rather see them match Netflix and charge $9 a month and not show me the commercials.
-
@Mike-Ralston said:
And they understand perfectly that FHDi is 1920 x 1080 Interlaced, it's pretty simple.
What makes that simple, logical or even remotely likely? Have you tried Googling that term? I've never heard of it and Google seems pretty stumped too. I have a feeling that's a new one created here, once Google gets through this thread, maybe it will show up.
1920x1080 interlaced isn't enough to create a useful standard. The terms you are using aren't ones that the people you think need them could use. Those people do have standards, but they are not ones sold to consumers or end users.
And "standards" between business partners don't rely on people coming up with marketing names. They talk in detailed standards at a lower level.
-
@Mike-Ralston said:
stand·ard
/ˈstandərd/adjective
- used or accepted as normal or average.
verb (used with object), standardized, standardizing.
1.
to bring to or make of an established standard size, weight, quality, strength, or the like:
to standardize manufactured parts.
2.
to compare with or test by a standard.
3.
to choose or establish a standard for.

A bit different than "common usage", and even common usage doesn't hold for most of this stuff. 4K, UHD, HD... there is no accepted common usage. You call the standard usage of HD one thing, the store calls it another, I call it another. While I often have less-than-common standards that I use myself (HD means high definition, not a resolution rating), I know that everyone I've spoken to in the last many, many years has accepted HD, outside of that, as 1080p, not 720p. I don't know anyone who would think 720p was the standard for HD by industry or by language.
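For reference, the names being argued over above are commonly (though not officially or universally) mapped to pixel dimensions as follows. This is a sketch of common usage, not a formal standard; the whole thread exists because these labels are ambiguous:

```python
# Common (not officially standardized) consumer names for video rasters.
# "HD" in particular is ambiguous: marketing has used it for both 720p and 1080p.
RESOLUTIONS = {
    "HD / 720p":       (1280, 720),
    "Full HD / 1080p": (1920, 1080),
    "UHD / 2160p":     (3840, 2160),   # what most "4K" consumer TVs actually are
    "DCI 4K":          (4096, 2160),   # the digital-cinema 4K container
}

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {w}x{h} = {w * h / 1e6:.1f} MP")
```

Note that consumer "4K" (UHD) and cinema 4K differ in width, which is part of why "4K vs UHD" is a debate at all.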
-
@Mike-Ralston said:
@thecreativeone91 Maybe broadcast is more locked down in the US, but with Digital Broadcasting, a network can stream at the resolution and aspect ratio that it wants. The TV receiving it will downsample or stretch the image, but that's up to the hardware on the user end. And HD does not refer to Interlaced Scan video, only Progressive Scan. Interlaced is used for TV broadcasting, and the occasional piece of professional equipment, most everything else is progressive scan.
This use of the 'p' was my understanding.
-
@scottalanmiller said:
@Mike-Ralston said:
And they understand perfectly that FHDi is 1920 x 1080 Interlaced, it's pretty simple.
What makes that simple, logical or even remotely likely? Have you tried Googling that term? I've never heard of it and Google seems pretty stumped too. I have a feeling that's a new one created here, once Google gets through this thread, maybe it will show up.
1920x1080 interlaced isn't enough to create a useful standard. The terms you are using aren't ones that the people you think need them could use. Those people do have standards, but they are not ones sold to consumers or end users.
And "standards" between business partners don't rely on people coming up with marketing names. They talk in detailed standards at a lower level.
Sure, if you call it Full HD interlaced, that's what FHDi would mean, so yes, it would mean 1920x1080 interlaced. It's not that simple, though. There are multiple flavors of HD; that only refers to the resolution.
Progressive/interlaced and frame rate are all separate things. You could say that 1920x1080 interlaced is likely going to be 30fps as well, but it does not have to be, nor is it a standard.
-
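The point that resolution, scan type, and frame rate are independent parameters can be sketched as a small data structure (an illustration, not any broadcaster's actual model):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VideoFormat:
    """Resolution, scan type, and frame rate are independent parameters."""
    width: int
    height: int
    progressive: bool   # False = interlaced
    fps: float

    @property
    def label(self) -> str:
        # Conventional shorthand: lines + 'p'/'i' + rate, e.g. "1080i30".
        return f"{self.height}{'p' if self.progressive else 'i'}{self.fps:g}"

# The same 1920x1080 raster can carry very different signals:
print(VideoFormat(1920, 1080, False, 30).label)  # 1080i30 (common broadcast)
print(VideoFormat(1920, 1080, True, 60).label)   # 1080p60
print(VideoFormat(1920, 1080, True, 24).label)   # 1080p24 (film-style)
```

Knowing only "Full HD" pins down just two of the four fields, which is why "FHDi" alone doesn't fully specify a signal.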
Has anyone seen or heard of 4K interlaced yet?
-
@thecreativeone91 said:
@Mike-Ralston said:
@Dashrender There's your answer. Film is done to look good, TV is done to comply to stringent FCC rules.
The FCC only adopts the rules created by the broadcast associations. It would be impossible to deliver film-level quality OTA. You get shipped multiple hard drives that form a RAID to plug directly into the digital projector, both because of the size and the data rates needed for high quality. Film is DPX files, one file per frame (it's actually just a large picture file with no compression). Audio is done separately and synced with timecode.
LOL I heard that - it's basically a giant flip book.. LOL
Though I heard they are streaming or at least downloading that data now, is that not true?
-
@Dashrender said:
@Mike-Ralston said:
@thecreativeone91 Maybe broadcast is more locked down in the US, but with Digital Broadcasting, a network can stream at the resolution and aspect ratio that it wants. The TV receiving it will downsample or stretch the image, but that's up to the hardware on the user end. And HD does not refer to Interlaced Scan video, only Progressive Scan. Interlaced is used for TV broadcasting, and the occasional piece of professional equipment, most everything else is progressive scan.
This use of the 'p' was my understanding.
If you refer to something as 1080p or 720p, then yes, it would be progressive. But it doesn't have to be progressive; there are both 1080i and 1080p. He also contradicted himself, saying here that HD is only progressive and then later stating that it's understood Full HD is 1080i.
-
@Dashrender said:
@thecreativeone91 said:
@Mike-Ralston said:
@Dashrender There's your answer. Film is done to look good, TV is done to comply to stringent FCC rules.
The FCC only adopts the rules created by the broadcast associations. It would be impossible to deliver film-level quality OTA. You get shipped multiple hard drives that form a RAID to plug directly into the digital projector, both because of the size and the data rates needed for high quality. Film is DPX files, one file per frame (it's actually just a large picture file with no compression). Audio is done separately and synced with timecode.
LOL I heard that - it's basically a giant flip book.. LOL
Though I heard they are streaming or at least downloading that data now, is that not true?
Nope, it's all hard drives. The only thing done online is unlocking them; they have to be decrypted per show to make sure theaters don't do unlicensed showings by using them in slots they didn't pay for. That used to be a common thing in the 35mm/S35mm film days, which wasn't that long ago.
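A back-of-the-envelope calculation shows why that content ships on drives rather than over the air. Assuming the common DPX packing of three 10-bit RGB samples into a 32-bit word (4 bytes/pixel) and ignoring the small per-file header:

```python
# Rough sizes for uncompressed 10-bit DPX delivery (one file per frame).
# Assumes 3 x 10-bit RGB packed into 32 bits per pixel; headers ignored.
BYTES_PER_PIXEL = 4

def dpx_numbers(width, height, fps=24, runtime_s=2 * 3600):
    """Return (bytes per frame, MB/s data rate, TB for the full runtime)."""
    frame_bytes = width * height * BYTES_PER_PIXEL
    rate_mb_s = frame_bytes * fps / 1e6
    total_tb = frame_bytes * fps * runtime_s / 1e12
    return frame_bytes, rate_mb_s, total_tb

for name, (w, h) in [("2K DCI", (2048, 1080)), ("4K DCI", (4096, 2160))]:
    frame, rate, total = dpx_numbers(w, h)
    print(f"{name}: {frame / 1e6:.1f} MB/frame, {rate:.0f} MB/s, "
          f"{total:.1f} TB for a 2-hour film")
```

At 4K that works out to roughly 35 MB per frame and several terabytes for a feature, which is well beyond what OTA broadcast bandwidth could carry.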
-
@Dashrender said:
@Mike-Ralston said:
@thecreativeone91 Maybe broadcast is more locked down in the US, but with Digital Broadcasting, a network can stream at the resolution and aspect ratio that it wants. The TV receiving it will downsample or stretch the image, but that's up to the hardware on the user end. And HD does not refer to Interlaced Scan video, only Progressive Scan. Interlaced is used for TV broadcasting, and the occasional piece of professional equipment, most everything else is progressive scan.
This use of the 'p' was my understanding.
Yes, the need for i and p arose because HD did not originally imply either. If you use HD in the English way, there is nothing interlaced that is truly HD. But if you use it the marketing way, you must specify, as low-definition interlacing is common.
-
@scottalanmiller said:
@thecreativeone91 said:
Granted Hulu does this I think.
Yeah, they suck.
Exactly - why would anyone want Hulu? I suppose if you could ditch cable and just do Hulu.. that might be OK.
-
@Dashrender said:
Exactly - why would anyone want Hulu? I suppose if you could ditch cable and just do Hulu.. that might be OK.
Because they have exclusive show rights.
-
@MattSpeller said:
Has anyone seen or heard of 4K interlaced yet?
Hopefully no one will use it. High-end cameras, which are the only ones shooting true 4K, don't offer interlaced because you'd be insane to use it. I would not be surprised to see UHD 2160i at some point, even though interlacing sucks, especially nowadays; it was really meant for the CRT era.
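The reason a hypothetical 2160i would even tempt anyone is the same reason broadcasters adopted interlacing in the first place: at the same image rate, an interlaced signal carries half the raw pixels per second of progressive. A quick illustration of that bandwidth argument:

```python
# Raw pixel rate of progressive vs. interlaced at the same image rate.
# For interlaced video, each field carries only half the lines of the frame.
def pixel_rate(width, height, images_per_sec, progressive=True):
    lines_per_image = height if progressive else height // 2
    return width * lines_per_image * images_per_sec

p60 = pixel_rate(1920, 1080, 60, progressive=True)    # 1080p60
i60 = pixel_rate(1920, 1080, 60, progressive=False)   # 1080i, 60 fields/s
print(p60 / i60)  # 2.0 -- progressive needs twice the raw pixel rate
```

The same halving would apply to a hypothetical UHD 2160i, which is exactly why bandwidth-constrained broadcasters might be tempted, even though interlacing artifacts make it a bad trade on modern displays.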