The tech press has been full of announcements over the last day or two regarding GPMI. It’s a new standard with the backing of a range of Chinese hardware companies, for a high-speed digital video interface to rival HDMI. The Chinese semiconductor company HiSilicon have a whitepaper on the subject (Chinese language, Google Translate link), promising a tremendously higher data rate than HDMI, power delivery well exceeding that of USB-C, and, interestingly, bi-directional data transfer. Is HDMI dead? Probably not, but the next few years will bring us some interesting hardware as its backers respond to this upstart.
Reading through pages of marketing from all over the web on this topic, it appears to be an early part of the push for 8K video content. There’s a small part of us that wonders just how far we can push display resolution beyond that of our eyes before it becomes just a marketing gimmick, but it is true that there is demand for higher-bandwidth interfaces. Reports mention two plug styles: a GPMI-specific one and a USB-C one; we expect the latter to naturally dominate. As for adoption, and whether users might find themselves left behind with the wrong interface, we expect that far from needing to buy new equipment, support will arrive gradually with fallback to existing standards such as DisplayPort over USB-C, such that we hardly notice the transition.
Nearly a decade ago we marked the passing of VGA. We don’t expect to be doing the same for HDMI any time soon in the light of GPMI.
The Licensing Administrations in the U.S. will never allow it. Even the freer DisplayPort is being kept out of AV receivers and TVs. HDMI doesn’t have to be a technically superior standard, just a safe, DRM-riddled, copy-protected one, with the secret keys reserved for those the Licensing Administrations deem worthy.
With how things are going now, it will mean there’s a “US” area with only HDMI and a “rest of the world” area with DP and GPMI screens, plus maybe HDMI-compatible sockets or GPMI-to-HDMI converters which won’t be able to play US-region DVDs.
I second this.
The US can keep HDMI all for itself, along with the high licensing fees no one likes.
And hopefully, there will be some DVI/VGA ports remaining, still.
I never liked the idea that PCs adopt consumer standards.
https://en.wikipedia.org/wiki/HDMI#Licensing
As long as the U.S. produces the content I consume, I fear I must abide by the cruel idiom: “He who pays the piper calls the tune.”
As an honourable mention, the mini-series “Senna” was perhaps the best thing I’ve seen in years, and it was produced by Brazil.
Both your sentiments are morally good and just. Let’s bring Hollywood to “us” and not the U.S.
At very least, DVD isn’t bound to HDMI and HDCP.
It started out at a time when DVD players used S-Video and RGB via SCART.
Later models had VGA ports, too.
So at least on a media level, HDMI is not vital here.
The Blu-ray medium might be different, though, I admit.
In the future it may need media converters, maybe.
Converter boxes or chips that convert the HDCP video stream to another format.
That of course won’t be blessed by the HDMI/BD consortium, but at this point, who cares anymore.
The US and its companies are going to become more and more irrelevant the way things are developing right now.
Viewed very soberly, I mean.
“Senna” was a good series, even if the reason I started watching it was to see Snetterton circuit with its famous Norfolk mountains in the background. (For those who don’t know the area, Norfolk is exceptionally flat.)
I don’t see this happening. It will be one or the other. TV manufacturers don’t like making region-specific TVs. The main reason they pulled back on 8K is EU power-restriction laws, as it takes more power to drive that many pixels. 8K isn’t even noticeable until you get to around a 150″ screen anyway.
HDCP has become meaningless as far as DRM goes. Everyone knows that whatever DRM you invent, the internet will find a way to bypass it.
Yeah, but if you circumvent HDCP systematically as a big company, you will be sued. It’s cheaper to make two models than to be sued.
TV manufacturers make TVs for specific regions as it is, so I don’t see why this is an issue.
Depending on viewing distance, or, if you prefer, the percentage of your FOV the screen occupies, you would notice 8K in the 40″ range as well. But on the whole I do agree there isn’t much point for most folks in going to 8K: 4K is sharp enough, and challenging enough to render already, that taking the next leap up just isn’t worth it.
Though region-specific is something that would be made for the US market, assuming the First Felon of the USA hasn’t broken global trade into the USA. As it’s a huge market of very wealthy folks, it is worth creating what in this case is effectively a downgraded version of the product for the US market. When the EU or USA mandates a higher minimum standard, or a sensible enough connector as they did with USB-C, it becomes the new baseline fitted to everything globally, since it isn’t worth creating two product lines when the economies of scale of making just one are vastly more profitable than the minor added cost (if any) of the higher spec. But going the other way and mandating something terrible as the best you are allowed still leaves the whole rest of the world, which contains many very wealthy people too, who will be willing and able to buy the superior product.
I have a strong dislike for the “entertainment industry”, which has its roots in the pre-internet era. And I’m not the only one. Take for example “Have a Cigar” from the Pink Floyd album “Wish You Were Here”. I also refused to buy a standalone DVD player because of the non-skippable garbage that accuses all users of being pirates. Instead I ripped DVDs to watch content without that garbage.
And the list goes on. I once had a Hauppauge TV card (quite bad video quality due to the “chroma bug”) which could output to a TV (SCART / component; it was a long time ago). That functionality was removed in one of the “driver updates”.
For CDs… Those things cost around EUR 22 for a long time. The cost breakdown was about 25 cents for pressing the CD (maybe 50 cents if you include the box and printed artwork), an additional 50 cents for the artist, and the other 21 EUR for the record companies and distribution. I was also one of the rare people who had a DAT recorder, and had much more music on DAT tapes than I had CDs. (Remember the silly “copy bit” in S/PDIF data.) Later I ripped all my CDs (and the CDs I had previously copied to DAT) and put them on an HDD, about as soon as that was viable to do. In those days the norm was 500 GB HDDs, and I needed two of them for my collection. (Stored as FLAC; I don’t do lossy compression.)
Once I also experimented with a PC CD player which had a digital output on the back and play / pause / stop buttons on its front panel, and plugged it into my stereo set via a Behringer DCX2496 used as a DAC. It worked, but audio quality was atrocious. I never put in the effort to find out exactly why, but I guess they maimed the digital output of that CD player on purpose.
In itself, I don’t mind paying a fair price to watch some videos or listen to music, but the industry is treating everybody as pirates, while at the same time pirating the actual artists. It has been a sick industry for a very long time, and that gave me plenty of justification for not being part of that sort of extortion.
Most of my movies I watched from torrents. That used to be legal here in the Netherlands, but at some point the government gave in to the pressure from the record labels’ lobbying. But the last 10 or so years I don’t do much anymore with either audio or video, due to unrelated circumstances.
A bunch of years ago I tried Netflix. I had to enable some DRM thing in Firefox, which I did not do, and that was the end of my Netflix experience, apart from a lot of unsolicited spam mails I got from Netflix. Luckily I used a throwaway email address to communicate with Netflix.
“I have a strong dislike for the “entertainment industry”, which has its roots in the pre-internet era. And I’m not the only one. Take for example “Have a Cigar” from the Pink Floyd album “Wish You Were Here”. I also refused to buy a standalone DVD player because of the non-skippable garbage that accuses all users of being pirates. Instead I ripped DVDs to watch content without that garbage.”
The Video CD (VCD) and CD-i didn’t have this issue. No nag screens.
At least here in Europe, in early 90s, there was a shortlived market of commercial VCDs.
Not the pirate market of Asia, I mean.
VCDs were rather niche alternatives to VHS cassettes, like Laserdiscs (LDs) used to be.
VCDs were a thing circa 1993 to 1995, I would say.
Prior to the days of Windows 95; rather in the days of Windows 3.1x, Philips CD-i, Kodak PhotoCD.
In the days of 486 PCs with VLB graphics cards and single-speed and double-speed CD-ROM drives.
Some models of the Playstation 1 (yuck) had VCD support, too, I believe.
I know, VCD has a stigma of being crappy.
But if encoded with a commercial encoder and viewed on a CRT monitor, it looks no worse than a VHS.
It also doesn’t suffer from wear-out like VHS does.
One downside: having to flip the disc halfway through the movie, or needing two discs for long movies. I don’t disagree with anything you posted, but it was an issue with the format and user adoption.
The technology itself was quite impressive, especially when it came out. In fact the LaserDisc versions of the original Star Wars trilogy are considered the best source material, because George Lucas added all the CGI stuff to the DVDs and crushed the black levels. Granted, that’s more about how they were remastered for DVD.
do you realize the irony of pirating basically everything you own because you think the industry treats everyone as a pirate?
does the industry realize theirs?
way to shill for the man – you can’t pirate something you own
Format conversion is not pirating.
That’s not pirating, that’s making a backup.
If you’re going to be treated like a criminal for doing things the right way, might as well actually be a criminal and be treated like a human.
The real irony is that piracy is the only way to have your ownership rights respected.
There’s another irony, too:
In some countries, making private copies of films, music etc. is already factored into the fees on blank media (CD-Rs, empty VHS tapes, empty cassettes, flash cards).
So making a copy of something you had acquired before is already being paid for.
So what the film studios say about piracy isn’t only a threat to the consumer, but also a lie.
But again, in some countries. Law is different everywhere.
to quote the site:
Compared with existing interface technologies, GPMI has seven core advantages: bidirectional multi-stream, bidirectional control, high-power power supply, ecological compatibility, ultra-fast transmission, fast wake-up and full-chain security
Full chain security….
And for the other commenters, they also talk about multiple screens and automotive use and industrial use. Clearly it’s not to simply show TV shows in 8K.
Ooooh, “Full chain security” from China!
Hard pass.
Thanks to Donald Trump, the world is becoming aware that dependency on the US has its downsides. Open standards, go for it!
Unfortunately it will not be open standards, but just another set of closed ones. And being China, it is unlikely to ever become open.
You mean MPAA and RIAA, not federal or state institutions.
This is a good thing; we don’t need or want any CCP-funded backdoors or attack vectors here.
I think there was a push for 8K video some 5 to 8 years ago, and not many people cared, because it’s mostly useless. With 4K you already have more than enough pixels for video at “normal viewing distances”. There are differences, but things like higher contrast, higher dynamic range and higher frame rate are more important. Personally, I would only like to see a higher frame rate, so there is finally an end to stuttering rolling panoramas and artificially added motion blur that attempts to disguise the low frame rate. Back then there was also very little content shot natively in 8K. Sure, you can upscale 4K, but what’s the point? That does not give you (much) sharper pictures.
And what’s the bandwidth of this GPMI standard?
HDMI goes up to 48 Gbps and DisplayPort goes up to about 77 Gbps.
According to Wikipedia, DisplayPort already has enough bandwidth for:
“Two 8K (7680 × 4320) displays @ 120 Hz and 10 bpc (30 bit/px, HDR) RGB/Y′CBCR 4:4:4 color (with DSC)”
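As a quick sanity check on that claim, here is a back-of-the-envelope sketch (my own figures, assuming plain 10-bit RGB and ignoring blanking, FEC and link-coding overhead) of why DSC is needed for the two-display case:

```python
# Rough uncompressed bandwidth for two 8K 120 Hz 10-bpc RGB streams.
# Illustrative only: real links add blanking and encoding overhead.
width, height = 7680, 4320        # 8K UHD
refresh_hz = 120
bits_per_pixel = 3 * 10           # RGB, 10 bits per channel

gbps_one = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"one display : {gbps_one:6.1f} Gbit/s")      # ~119.4 Gbit/s
print(f"two displays: {2 * gbps_one:6.1f} Gbit/s")  # ~238.9 Gbit/s
# Both exceed DisplayPort's ~77 Gbit/s payload, hence DSC's roughly 3:1 compression.
```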
DisplayPort is also pretty unique in that its data is packetized. Instead of dumbly pushing through all the data (repeatedly) for all the lines of the display, in a way very similar to the first Nipkow disk, it attempts to put some more brains behind the datastream. It seems quite silly to refresh big areas of the screen at the frame rate without any of the screen content actually changing. If there is any improvement for future standards, I think this is the way to go.
So I’m not sure what this new video standard is going to improve.
Also, I hope that interfaces will finally become more universal. Why should consumers have to worry whether cables and plugs are used for video, Ethernet, connecting a printer, or whatever? It’s just bits of data. The latest USB standards are finally approaching this ideal, but with a row of USB plugs on your laptop, each only supporting a different subset, it also adds confusion. The idea of “just plug the cable into whatever connector it can mate with” is maybe not so bad after all.
My own PC has a 4K monitor with a 107 cm diagonal. Pixel pitch is around 0.24 mm, which is similar to the old 640×480 VGA on a 14 bananas CRT and to the first three or so generations of TFT monitors, all the way up to the time when ridiculously small pixels became a fashion statement. At my normal viewing distance (55 cm) I cannot distinguish individual pixels on a slanted line. I cannot see the “jaggedness” when the line goes from one row of pixels to the next without my reading glasses. (I’m 54 years old now and am starting to use +1 reading glasses for seeing small things, but I don’t need them for my 107 cm PC monitor.) At this moment my eyes are good enough to read text with an 8-pixel “base height” easily, but my eyes have deteriorated just enough that the curved edges of the letters blend into each other. But I admit, when I was younger I could see individual pixels on a monitor with a 0.24 mm pitch.
As far as I am concerned, monitor and video technology has now reached a level that makes it indistinguishable from magic. Maybe the technology can be pushed further, but it would still be magic to me, and with no additional practical benefit.
Magic at its heart has a different meaning, though.
It’s the manifestation of will power, that’s how spells came to be.
It’s all about “mind over matter”, if you will.
About the similarity between high-tech and magic…
Maybe it’s just my grumpy German attitude here, but I think Mr. Clarke simply had the talent to state the obvious, IMHO.
He also liked to sound smarter than he was, maybe.
Who needs 8k video? I don’t even care about 4k video.
Most of what’s on TV or on the streaming services is crap. Putting it in higher resolution just gives you high resolution crap.
Plastic surgeons and booksellers, mostly. The march to hi-def made it possible for viewers to see every wrinkle and “imperfection” on an actor’s face, and to read the titles of the books on politicians’ bookshelves.
+1
Progress, baby! Number go up…
It matters to young people, I think. Not us, maybe, anymore.
I’ve had boomers as visitors in my living room who couldn’t see the huge difference between SD material and a 1080p Blu-ray.
After years of being messed up by consuming low-res, highly compressed content, they must have become numb to registering fine details and nuances.
What does it matter?
The shows are crap: crappy writing, crappy actors, crappy concepts resulting in crappy movies and TV shows.
Why would I want a high resolution view of utter crap? That’s what we get. The movie equivalent of a picture of a dog turd in 4k resolution.
Documentaries. There are some great ones.
At least in my little corner of the world.
I like those Terra X series, for example.
https://en.wikipedia.org/wiki/Terra_X
Some of those made privately, by individuals on YouTube, are great too. And in high resolution.
This channel comes to mind: https://tinyurl.com/ajzytbcu
(I’m not affiliated, I just found this channel while browsing YT channels.)
Lol, as predicted in Fahrenheit 451
Improvements in display technology can also matter to a subset of crotchety old people who like old video games and the way they used to look, but don’t want to keep a CRT around for whatever reason. (Size, weight, reliability, …)
For instance, 480p is enough for basic 240p/480i scanline simulation, good enough when simulating a lower resolution CRT on a higher resolution CRT but not great on a large modern display. At 1080p it starts to make sense to add basic aperture grille or shadow mask effects but they’re pretty lacking when looked at too closely; good enough for my eyes and my screen size and my viewing distance but these all vary wildly. 4k is better still and can start to simulate things like convergence differences across the screen, but there’s still room for improvement (or disimprovement depending on tastes). 8k might just be enough for perfect simulation.
Similarly there is recent interest in using 240Hz displays to simulate CRT beam scanning, by drawing a rolling 1/4 of the screen in 1/4 of the time it would take to draw a 60Hz frame. When paired with a bright OLED, this apparently results in many of the benefits of modern BFI but with less flicker.
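For anyone curious how that works mechanically, here is a minimal sketch of the rolling-scan idea (my own toy numbers: a 1080-line panel and a hard-edged band; real implementations fade the band and compensate for the lost brightness):

```python
# Toy model of rolling-scan CRT simulation on a 240 Hz panel:
# each 60 Hz source frame is shown as four 240 Hz sub-frames,
# each lighting roughly one quarter of the lines while the rest stay dark.
OUTPUT_HZ = 240
SOURCE_HZ = 60
SUBFRAMES = OUTPUT_HZ // SOURCE_HZ   # 4 sub-frames per source frame
LINES = 1080                         # assumed panel height

def lit_band(subframe: int) -> range:
    """Lines that are lit during one 240 Hz refresh."""
    band = LINES // SUBFRAMES
    start = (subframe % SUBFRAMES) * band
    return range(start, start + band)

for i in range(SUBFRAMES):
    b = lit_band(i)
    print(f"sub-frame {i}: lines {b.start}-{b.stop - 1} lit, rest dark")
```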
It’s funny how much digital horsepower is required to simulate analog defects and deficiencies.
Well, yes… The main “problem” is that the two are different technologies.
CRTs are line based or vector based, not pixel based like modern-day displays.
So in order to display older content correctly, CRT technology needs to be simulated.
It can work well, but needs smart algorithms. And good displays.
Like OLED displays, for example, where the pixels glow by themselves like pixels on a CRT would.
Then there’s things like dithering and color gradients.
See, a VHS or Amiga or DOS PC look most natural on a low-res CRT tube.
A tube that has a big dot pitch of about 0.40 mm (roughly).
On such a CRT, all the imperfections are being hidden.
Color bars do converge, causing pretty gradients.
That’s why many people still enjoy watching cartoons or old TV shows on a 1980s color TV.
The convergence errors, the big dot pitch of the CRT mask, the NTSC artifacts etc. make the experience more natural, less sterile.
By contrast, high-end CAD monitors of, say, an SGI Indy Unix graphics workstation were not like that at all.
They were very sharp, just like LCD/TFT monitors of the day.
You saw all the pixelation of low-res material.
However, blacks were true black and refresh rates very high.
The pixels could “glow”, as well, depending on brightness/contrast settings.
Typical limit of a CAD grade CRT was 1600×1200 at 240 Hz refresh rate, some went higher.
Compared to high-end CRTs, modern flatpanels are still kids toys.
A CRT is a little particle accelerator, after all.
Hard to beat with an LC display and a weak CCFL or LED backlight.
I tend to agree, but I’ll make my case for high resolution TV even though the volume of crap I’ll catch for saying it will be deep. 3D TV.
Back when it was a thing, my BiL had an LG set. Big but not humongous, it used polarization to achieve stereoscopic images. The glasses were simple, low cost, comfortable and effective. The image lost resolution because every other line had the opposite polarization. With 4K or 8K resolution I doubt 99% of the population would see the loss. With enough pixels, assuming the polarization screen could keep up, you could even alternate along the horizontal axis.
That would make TV a more realistic, immersive experience. Now all we have to do is convince production studios to use the effect intelligently.
Yaa, I deserve the crap.
As a 3d TV enthusiast I hope you are strapped to a Ludovico chair and forced to watch the locker room scene from Baseketball.
Over and over, unable to duck.
Yes I know they didn’t 3defy that movie.
It’s a puzzle.
Parker and Stone being such cultured gentlemen.
Speaking of 3D: the old shutter glasses should work pretty well on OLED, right?
I mean, you don’t have issues with the screen being polarized and interfering with the LCD in the glasses, and you have very fast frame rates, so you could do 120 Hz and switch between eyes to get a 60 Hz image per eye easily, I would think.
So all you need is a sync signal and some old shutter glasses.
Now the issue is that shutter glasses didn’t ‘shut’ perfectly, but surely we have newer technology now that can do similar things? Although those transparent OLED demos at CES and such used a very non-tech method to make them non-transparent on demand, by simply unrolling a sheet behind them. More 1925 tech than 2025, really. So they seem not to have had an easy and semi-cheap tech to do it. Or did they just not put in enough effort? Can we really not get a ‘transparent or black’ sheet at the press of a button yet?
I hear several manufacturers agreed to implement it.
And they have a spec link for paying members.
And the western standards also require you to be a (paying) member for full specs incidentally.
Again with the misplaced comment placement – that was a reply to john.
And I even tried clearing my cookies first this time. Quite clever how it manages to retain where I last commented and misplace my replies there even when I don’t have a cookie to keep track. High-end advanced crippleware…
And I think this was all coded before AI was there to assist.
So there AI, bet you can’t get code that is that messed up, hah. Amateur. :)
RE: Who needs 8k video? – those who buy larger-than-life mega-sized TVs. 4k is already quite sharp on them 100 inch TVs, and 8k will look like 4k on the 200 inch TV and I speculate that 10k will be for the 400 inch TVs. (waving a flag with “SARCASM” written on it).
IMHO, 4k was already quite advanced enough for my ordinary Sam’s computing needs (multiple displays, etc), and as far as TV goes, the (now) old school 1080 or 720 are usually adequate for most things (local over-the-air free feeds are even less than that, I think 720 is the absolute upper limit; most seem to be roughly half of that, but for the reruns of the old shows that’s all that’s needed anyway).
As far as the quality of the content goes, we are in the midst of general downturn of the megamonsters of the past, IBMs of the media had their heyday in the late 1990s, and it’s over; however we now have youtube to offer alternatives aplenty.
Strongly disagree with everything but 8K not being needed. It’s been shown to only make a difference once you get to around a 150″ screen. UHD isn’t even noticeable over 1080p until 85″+, and that’s IF you have good eyes. HDR, specifically dynamic HDR, has made a bigger difference than going from 1080p to UHD.
For video, the mastering process is the most important part of the end result. Streaming services usually suffer from macro-blocking during action or fast-moving scenes. It’s hard to detect unless you know what to look for or just have sensitive eyes. Higher-bitrate material like UHD discs doesn’t suffer from this. That, and display-led Dolby Vision is much better than LLDV, where the source does the dynamic tone mapping. With display-led, the display does the dynamic tone mapping, and the display knows its capabilities better than, say, an Nvidia Shield or ATV4K, since all streaming services use LLDV. Only UHD discs support display-led Dolby Vision.
This becomes more important as more movies are being mastered at 4000 nits. For years, mastering at 1000 nits was the standard, which means that even if you have a TV that can hit 2500 nits in a 5 to 10 percent window, it doesn’t matter, because the peak brightness in any scene is 1000 nits, even with dynamic HDR metadata like DV or HDR10+ (which doesn’t have much content anyway).
“There’s a small part of us that wonders just how far we can push display resolution beyond that of our eyes without it becoming just a marketing gimmick”
Only assuming the sole use-case for a video interconnect is for a TV at a long viewing distance. Video walls, VR, mapping displays, composite displays (multi-panel assemblies, not the single-cable analog scheme), autostereo displays, lightfield displays, etc. There are plenty of cases where “8K” is far, far below the limits of human vision.
Remember, the 60PPD “retina display” marketing buzzword has little to no basis in reality. Required angular resolution to be visually indistinguishable is a much trickier problem and relies on a lot more factors than just pixel density. Just changing the visual task from “are these two 1px parallel lines touching?” to “is this 1px line aligned to this 1px line next to it?” adds an order of magnitude finer pixel density requirement at the same viewing distance (and all other environmental factors being equal).
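To put a number on that buzzword, here is the basic pixels-per-degree arithmetic (a sketch; the 0.24 mm pitch and 55 cm distance are borrowed from the 107 cm monitor comment further up, purely as illustrative inputs):

```python
import math

# Pixels per degree from pixel pitch and viewing distance.
# Inputs are illustrative, taken from an earlier commenter's setup.
pixel_pitch_mm = 0.24
viewing_distance_mm = 550

mm_per_degree = viewing_distance_mm * math.tan(math.radians(1))
ppd = mm_per_degree / pixel_pitch_mm
print(f"{ppd:.0f} pixels per degree")   # ~40 PPD for that setup
```

Whether ~40 or ~60 PPD is “enough” depends entirely on the visual task, which is exactly the point above.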
Oh come on! Where are the obligatory old standby comments?!?!
https://xkcd.com/927/
Shoulda just used a 555!
This comic gets old, IMHO. It’s very true for Linux, though.
The core problem with HDMI is that it’s a product of US entertainment industry.
As far as I know, there was no international committee or something involved which gave its blessing for HDMI being a standard.
No ANSI or ISO registration, as far as I can tell.
Its sole reason for existing is “Hollywood”: the support of the US film industry, the big studios.
Rather, it used its power to enforce HDMI as a global standard.
And China doesn’t like that. I understand this, they have my sympathy here.
Too much power on one side is not good, it leads to a monopoly.
While the US entertainment industry certainly played a role in lobbying for a standard that favored their needs, insisting on copy protection, etc., the primary originators of HDMI were pretty much all Japanese hardware and media companies – the ones who, at the time, manufactured the vast majority of the video players and the things you could play on them, and wanted a new universal (but not public/free) standard they could use to replace analog video and DVI.
In that sense its genesis is actually quite similar to GPMI.
But yeah, it’s absolutely a standard that was, and is, designed to cater to the needs of a consortium of businesses and industry groups first, and the end-user second.
all i want to know is whether it has finally de-synced the video datastream? my understanding is the competitors all still require the video data to be sent at certain times within the new frame being drawn, as if you were still driving an electron beam at a fixed scanning frequency
Beta-vs-VHS, here we come, now reflecting the political boundaries.
IMHO, instead of forcing everyone to convert to the better/more-expensive next VHS/Beta, why not focus on introducing something better that is 100% backward compatible with some existing standards?
Challenge – replicate what the implementation of color TV did. It was backward compatible with the B&W TVs that were prevalent; invent a new/better standard that will work with the older technologies that are still around (say, figure out how to push 1080 or even 4K over RCA jacks – I am NOT saying that’s how it should be done – I am bringing it up as an example; technically, I think an upgraded S/PDIF may be a better idea: no copper wiring to pick up noise, etc.).
In my other HO, the US is in the midst of unstoppably downgrading its entertainment industry to lower/crappier standards (or none whatsoever), so, naturally, it now can barely compete with what’s produced elsewhere :-[ The way I see it, we’ve long ceded the lead, and watching, say, the Grammys made that quite obvious. I do not give a hoot whether one sells ten gazillion albums or ten dozen; if it is okay/below-okay quality stuff, it is still okay/below-okay level, just properly funneled into the largest profit-maximizing scheme possible. Similarly, dumb people with power won’t automagically become any smarter – they’ll still be dumb people, except with power, which they mostly use to convince the non-believers that the reverse is true. The US entertainment industry was one of the best in the world in the 1980s, but that was 40 years ago; time to wake up and go back to competing with the best, and they are now gaining rather fast. We are ALREADY partially a museum of things that used to work well (representative democracy is one of them, btw, but not only – corporations that fueled progress, not slowed it down in order to milk the cow until it goes dead – and then milk it some more still).
Regardless, Beta-vs-VHS is back in style, and this time it doesn’t look good for the VHS. As far as consumers go, the combined China/India market alone is half of the world population, so it looks quite likely that the US will be left out (again, btw – speed trains, what speed trains? We haven’t finished upgrading the 1980s Acela yet).
“Beta-vs-VHS, here we come, now reflecting the political boundaries.”
.. and Video 2000!
You guys always forget the number three!
https://en.wikipedia.org/wiki/Video_2000
That’s right, I forgot the Video 2000, my bad (and my memory getting feeble/brittle).
That and the failed RCA “vinyl video discs” that were analog and couldn’t accommodate enough video AND had to be flipped like LPs mid-movie.
SDI, which is considered a “professional” standard, uses a single copper cable or fiber and can transmit video and audio over ridiculous lengths. It’s what is used to broadcast live events like sports. Obviously the lack of DRM is why it’s not available to the general public. HDMI is just a headache IMO. Dual-link 12G-SDI can do 24 Gbps, which is more than enough for any UHD content, same as HDMI 2.0 (outside 8K). HDMI 2.1 really is only needed for gaming.
12G-SDI is a video interface standard that allows for the transmission of video data at a significantly higher rate than previous versions like HD-SDI or 3G-SDI. It supports resolutions like 4K at 60 frames per second over a single link and 8K at 30 frames per second over a dual link. 12G-SDI utilizes single-link connections, which simplifies cabling compared to earlier standards.
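As a rough cross-check of those figures, here is a sketch of the active-video payload arithmetic (assuming 10-bit 4:2:2, the usual broadcast format, and ignoring blanking and ancillary data, which is why the nominal link rates are higher):

```python
# Approximate active-video payload for the SDI formats mentioned above.
def payload_gbps(width: int, height: int, fps: int, bits_per_pixel: int = 20) -> float:
    """bits_per_pixel = 20 corresponds to 10-bit 4:2:2 (Y plus alternating Cb/Cr)."""
    return width * height * fps * bits_per_pixel / 1e9

print(f"UHD 60p: {payload_gbps(3840, 2160, 60):.1f} Gbit/s")  # ~10.0, fits single-link 12G-SDI
print(f"8K  30p: {payload_gbps(7680, 4320, 30):.1f} Gbit/s")  # ~19.9, fits dual-link (24 Gbit/s)
```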
Cable churn. Every few years introduce a new receptacle.
guh…i only switched to hdmi itself about 5 years ago and already i’ve had to buy two different sizes of hdmi cable. i don’t even know hdmi well enough to tell micro from mini by looking at it but i’ve bought both…i think. what a mess it is, and what a voyage of discovery too, under my tv
It uses a USB-C connector, same as DisplayPort does as part of Thunderbolt/USB4. I think the goal here is to get it fairly widely adopted and then convince the USB-IF to roll it into the USB standard, like what happened with Thunderbolt and USB4
Is the Chinese-backed standard more compatible with the embedded codes people are finding in their manufactured hardware?
This is good and all, but what about the millions of devices that use HDMI? We could just make an adapter, yes, but it loses some of the quality that GPMI has. And to be honest, I think it might just be a big hoax to steal data, just like TikTok was.
I’m still moderately perturbed that HDMI does not incorporate CC support. Why NOT?
HDMI has been around for a long time, so I don’t think this will sweep the market.
It took at least ten years until SCART and VGA connectors were (almost) gone. The old, ugly three-way USB plugs will haunt us for many years.
It takes long to adopt a new standard.
In the banner image, the DisplayPort cable does not have any apparent method of actuating the locking mechanism, and so could never be unplugged. What are we to believe, that this is some kind of magic DisplayPort cable or something??
Is anyone actually talking about it, or did they pay you to blast this, along with TechRadar and the other similar posts I’ve seen? So far I’ve seen zero technical specs on this standard other than bandwidth and wattage.
I’m personally releasing a standard next year that can carry 2 kW of power and 90 Tb/s of video over two unshielded 40-gauge wires.
It’s gonna be way better than what they say they are doing.
If how they plan to achieve this is out there, please point me in that direction. Otherwise this seems like most of their military claims: nice headline, lacking in substance.
Here’s my take from the few things I’ve seen. Pros: it fits onto standard USB-C receptacles and appears to work well enough as USB-C. Cons: it’s apparently quite proprietary (so, Thunderbolt-like but I’d expect it to be even more impenetrable), cable length’s questionable, and it’s unclear just exactly how PD-compatible it will be (if “not very”, it can go take a hike).
OK but you’re not a consortium of major Chinese tech companies and TV manufacturers
It seems like you answered your own question: you said people are talking about it, and you said they’re talking about it because they’re paid to.
A ‘standard’ like this I probably wouldn’t hear about for ages without HAD covering it, and given that it comes from the place where most electronics are built, it might well become a real standard you see everywhere. It’s well worth knowing what the folks building your devices are thinking before it happens – it gives you the chance to buy the last gasp of the previous tech if you really hate the direction of travel, etc.
HDMI Forum Rejects Open-Source HDMI 2.1 Driver Support Sought By AMD
https://www.phoronix.com/news/HDMI-2.1-OSS-Rejected
HDMI is controlled by a daughter company of Intel…..
This needs a paid advertisement warning.
HiSilicon is Huawei’s IC design subsidiary, I think.
I really like this! Hooking up a tiny laptop to GPU, power, monitor and peripherals with one connector is the dream.
Judging by the reactions to this, people are being people and nasty about this being Chinese, but I’m not surprised.
If this becomes an open, well documented standard, I’m all for it. In general, I’m expecting a lot from Chinese engineering in the near future, I’m seeing a lot of practical, no-nonsense, robust products. If they learn to be better at open standards than we are in the West (HDMI, Thunderbolt, Firewire etc. are great examples of shady practices), we’re all winning.
Will it be current Thunderbolt 4 or 5 standards?