
You are currently reading a thread in /g/ - Technology

Will freesync over hdmi become a thing?
>>
>HDMI

literally jew: the connector
>>
>freesync
literally g-sync: the jew version
>>
>>51669317
How so? It's literally just a branded version of VESA adaptive sync, which is a precursor to both.
>>
>>51669280
amd users can't afford hdmi devices
>>
>>51669331
Except gsync released in 2013
>>
>>51669352
>The technology has been a standard component of VESA’s embedded DisplayPort (eDP™) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync.
nmemeia didn't invent variable refresh
http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/
>>
>>51669317

>doesn't need a proprietary chip
>doesn't cost 200€ extra
>can theoretically be used by both AMD and Nvidia; Nvidia just doesn't want to, since adaptive sync is a DP feature, not an AMD one.

How is that jew exactly?
>>
>>51669331
>>51669377
the idea of variable refresh has been around for a while, but g-sync and adaptive-sync/freesync aren't exactly comparable things.

> g-sync: proprietary product feature using non-standard bidirectional DP signaling between a GeForce GPU and a custom scaler chip (an FPGA, actually) that can do display-internal things like time-compressed panel scans etc.

> VESA adaptive-sync: tweak to DP 1.2 spec that says GPUs can send variably-periodic updates that fall within the window advertised by the display at the connection handshake

> freesync: AMD's GPU/driver stack that can send variably paced frames to compatible adaptive-sync displays

the g-sync platform has a dumber protocol than adaptive-sync (not that it matters since muh proprietary) but appears to have a better display-side scaler implementation than what adaptive-sync monitors are currently using.
that said, literally any scaler ASIC manufacturer can add whatever features they want for more clarity, etc., in the future without impacting the protocol whatsoever.
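
to make the window idea concrete, here's a rough python toy of the adaptive-sync contract described above (the refresh range and every name here are made up for illustration, not any real API):

# display advertises a refresh window, e.g. 40-144 Hz; the spec's whole promise
# is that any frame delivered inside that window gets scanned out immediately
class AdaptiveSyncDisplay:
    def __init__(self, min_hz, max_hz):
        self.min_frame_ms = 1000.0 / max_hz  # earliest a new frame may arrive
        self.max_frame_ms = 1000.0 / min_hz  # latest before the panel must refresh anyway

    def accepts(self, ms_since_last_frame):
        return self.min_frame_ms <= ms_since_last_frame <= self.max_frame_ms

panel = AdaptiveSyncDisplay(min_hz=40, max_hz=144)
print(panel.accepts(10.0))  # True: ~100 fps pacing falls in-window
print(panel.accepts(30.0))  # False: below 40 Hz, the GPU side has to compensate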
>>
>>51669771
>> freesync: AMD's GPU/driver stack that can send variably paced frames to compatible adaptive-sync displays
Isn't it literally just a marketing/certification thing?
>>
>>51669871
there's the branding thing for displays, but part of it is that GPUs basically need an additional hardware timer for the display block compared to older GPUs, and the drivers need hooks added to use the timer for page flipping etc. intelligently.

the wire-level protocol and display handling support are necessary but not sufficient to make worthwhile use of dynamic refreshing in vidya.
the GPU and driver need a minimum amount of added intelligence.
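
if you want to picture what that timer does, here's a toy sketch (tick values and all names invented for illustration, nothing vendor-specific):

# the display block flips to a queued buffer as soon as the render finishes
# inside the allowed window, or repeats the old frame when the deadline hits
class FlipTimer:
    def __init__(self, min_ticks, max_ticks):
        self.min_ticks = min_ticks  # can't flip sooner (display's max refresh rate)
        self.max_ticks = max_ticks  # must output by now (display's min refresh rate)

    def action(self, elapsed_ticks, frame_ready):
        if frame_ready and elapsed_ticks >= self.min_ticks:
            return "flip to new buffer"
        if elapsed_ticks >= self.max_ticks:
            return "rescan old buffer"
        return "wait"

timer = FlipTimer(min_ticks=7, max_ticks=25)              # ~144-40 Hz, in milliseconds
print(timer.action(elapsed_ticks=12, frame_ready=True))   # flip to new buffer
print(timer.action(elapsed_ticks=25, frame_ready=False))  # rescan old buffer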
>>
>>51669871
You're right, it's just an implementation of an open standard in the driver. However, they probably decided to give it the name to show that it competes with G-sync.
>>
>>51669280
yes, AMD said so themselves
https://youtu.be/AXLXPiJpVGs?t=41m8s
>>
>>51670025
>the wire-level protocol and display handling support are necessary but not sufficient to make worthwhile use of dynamic refreshing in vidya.
>the GPU and driver need a minimum amount of added intelligence.
But isn't that literally what's required for vesa adaptive sync?
>>
>>51670160
no, adaptive sync is retarded simple.
it's literally like a one paragraph addition to the spec that says something like:

> if a display tells a GPU that at a given resolution it will accept new frames every X to Y milliseconds, it must accept and display every frame delivered to it within those timing constraints

the spec says nothing about what a display or gpu must do internally in order to meet the requirements and use it intelligently.

moreover, traditional GPU hardware never had frame timing independent of the display output writing (RAMDAC, etc.), so it needs something like the ability to say, "automatically flip output buffer to memory location B after N more clocks, but also start writing display output again any time after M clocks if some buffer completion register is flagged".

It's honestly not a complicated piece of logic at all (even when taking minimum refresh rates into consideration), but the transistors weren't there in older GPUs since there was no reason for them to be.

Likewise, the driver upgrades pretty much amount to setting special hardware registers and maybe handling automatic frame doubling for low-fps situations, but it's still functionality that needed to be added and then thoroughly tested.
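
the frame-doubling arithmetic in toy python, just to show what the driver has to compute (the function name and window numbers are made up):

# if a frame takes longer than the display's longest allowed gap, scan the
# previous frame out at a multiple that lands back inside the window
def paced_interval_ms(frame_ms, min_frame_ms, max_frame_ms):
    if frame_ms <= max_frame_ms:
        return max(frame_ms, min_frame_ms)  # in-window: present as-is
    repeats = 2
    while frame_ms / repeats > max_frame_ms:
        repeats += 1                        # double, triple, ... the refresh
    return frame_ms / repeats

# 50 ms frames (20 fps) on a 40-144 Hz panel -> show each frame twice, 25 ms apart
print(paced_interval_ms(50.0, 1000 / 144, 1000 / 40))  # 25.0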
>>
>>51670406
ah
>>
>>51669280
>Will freesync over hdmi become a thing?

this has been discussed, but it's hard to see how.

the only application that really benefits from fully dynamic refresh is vidya, and TV manufacturers don't care about PC gaming.

you'd basically need MS and Sony pushing this together, for TVs released ca. 2020 for the PS5/xbuhtoo, which seems unlikely.
>>
>>51670534
does 7750 support freesync?
>>
screen tearing was never an issue until 2013

what is this shit. Why would anyone pay 250 euros extra for this gimmick? might as well spend that cash on a more powerful graphics card..
>>
>>51670850
Nope, it's only Rx cards.
>>
>>51670943
7970 is 280X and 7950 is 280.
Current console chips are cut down GCN cards.
>>
>>51670534
just get amd to force it on ms or sony desu
>mfw ps5 and xb2 with "game ready" tvs
>>
>>51670943
>>51670982
>>51670850
I did some research and technically Xbox One can use Freesync. It has a GCN 1.1 Bonaire GPU. PS4 uses Pitcairn, which is GCN 1.0, so no Freesync for PS4.
>>
>MG279Q, Freesync
>600e.

>PG279Q, G-Sync version of the above monitor
>950e

Why are nVidia such jews

more importantly, is it worth it if I don't play FPS games like Quake 3 or CS?
>>
>>51671154
gsync is terrible for those games you listed as it induces input lag and limits your fps to your refresh rate

It's ok for medium to low fps games - but not a gamechanger
>>
>>51671195
so it's good for stuff like The Witcher 3 and GTA V?

Fuck, this shit is going to make me buy a G-Sync monitor soon Christ and COCKSUCK
>>
>>51671225
depends on you really - it's just vsync with less input lag and none of that speeding up and slowing down effect.

have you ever obsessed over having vsync on in games? has tearing ever annoyed you?

it's very much a personal thing and frankly i couldn't give a fuck about tearing. especially not at a 300+ euro premium.
>>
>>51671118
GCN means mostly the ALU array and associated shader controls.

All the bolted-on display block and codec stuff is pretty interchangeable between ASICs, and it's highly unlikely that ps4/xbone support freesync, multiple outputs, etc., since they only needed 1x hdmi 1.4 at best.
>>
File: AMD FreeSync Slide13.jpg (169 KB, 1466x824)
>>51671508
yeah, it's true,
there is a chance Xbone supports it and just doesn't use it, and there is a chance it doesn't. And even if it does, it's up to MS to make it work.

7790 is 260x
>>
Remember: intel said they will support freesync (though iirc none of their current product line does) and what Intel says, goes (as much as I detest it). G-sync has lost.
>>
>>51671639
desu senpai, FS timer is part of the DP display controller sub-block.

there is a zero percent chance that any current console supports it.
>>
>>51671681
G-sync is a gaymen thing.
Freesync also doesn't run on nVidia cards. So in the gaymen sector G-sync is the king.

Everything else will be able to use freesync. Laptops with nVidia Optimus would be an interesting case. I know Intel will use Adaptive-Sync to lower power consumption, not to improve gaming, but paired with an nvidia card, now you see it, now you don't.
>>
>>51671695
way to shoot me down, snepai
>>
>>51671717

>Freesync aslo doesn't run on nVidia cards. So in gaymen sector G-sync is the king.

In the here and now? Sure.
Intel still sell one hell of a lot of chips in systems without a dGPU and that is going to get panel manufacturers onboard. Intel supporting freesync can - and I bet will - kill gsync off, given that the cost of buying the scaler module from Nvidia is a factor for companies when they could just tell Nvidia to fuck off and support basically every AMD chip made as well as whatever Intel have.

I have no doubt there will always exist high end screens with gsync but the writing is on the wall.
>>
>>51671792
People who buy Intel nucs or prebuilts with no dGPU don't drop 1000 bucks on an IPS 144hz 1440p g-sync panel.
They go with the cheapest option. Sometimes getting an IPS screen even if it's worse than a PVA model in the same price category, because IPS. For the same reason they buy nVidia afterwards.

nVidia will rally a following and make a living selling overpriced adapters and locking people in with their 1k$ monitors and other stuff.
Unless AMD can turn it around hardcore in the dGPU market, freesync will be a cheap alternative to g-sync in people's minds, like AMD to nvidia right now, or 6 months ago.
>>
>>51671717
the thing though is that baseline level Adaptive-sync support is already being put into all future scaler ASICs by the main 2-3 manufacturers since it requires virtually zero added circuitry, while g-sync is only available on custom scalers (FPGA for now, someday maybe ASICs) from Nvidia that cost a lot more.

at some point every monitor that's not explicitly g-sync will be supporting at least crap-tier freesync by default.
the thing that AMD needs to worry about is the scaler vendors actually putting in advanced functionality like compressed panel scan-out, advanced frequency-aware color correction, strobing with variable syncing, etc.

g-sync will lose in the marketshare race for displays by default unless Nvidia gives away the IP for free to other scaler vendors, but this won't happen since the entire point is to dominate the enthusiast and then mainstream display market with their scalers just like they have for GPUs.
>>
>>51669377
Does this mean my monitor with displayport I bought in 2012 has some kind of freesync-like ability?
>>
>>51671985
no, only the embedded DisplayPort used in some laptops supported variable refresh that long ago.
>>
>>51671977
well, Freesync is an open standard, so Intel might do their own stuff with it.

The thing with G-sync is, I don't think nVidia wants to get wide adoption with this thing; they want the 980ti/Titan, maybe 980 crowd, they don't care about everyone else, well maybe an odd guy with a 970.

Freesync will become a de-facto industry standard, but that doesn't mean it will become anything big at the enthusiast gaming level. It's closely tied to AMD and AMD drivers, so other companies may want to do something about it.

I'm not saying Freesync will fail, the moment Intel pledged support it became obvious support will spread, considering how easy it is to implement. I'm just not sure it will mean that much for AMD and their gaming marketshare.
>>
>>51672120
the best thing AMD could do for itself is to make a competent Zen+GCN APU next year to push FreeSync to a huge chunk of the market.

Intel has verbally committed to Adaptive-sync, but they didn't exactly promise to have it done in any specific timeframe/product generation.
>>
>>51672220

Has any solid rumour confirmed or denied Zen having integrated graphics? I would imagine the zen cpus would be without graphics, with AMD going balls out on the zen based apus that are lined up for later.
>>
>>51672220
I thought AMD pushed ZEN APUs to 2017.

AMD is a laughing stock in the CPU market, people hate on their CPUs just because they're AMD CPUs. To be fair they have a basis for this. But getting out of this "cheap version of X" image will be hard. AMD's answer was to jack up prices, but I don't think this is the way, especially at the Fury X launch, when it was priced close to the 980Ti but didn't get anywhere near it.
Heat it up slowly, don't boil it straight away.
>>51672242
Initial Zen launch will lack an iGPU, but they will fix it next year.
>>
>>51672242
Zen is just the CPU core+cache design, just like GCN is the shader ALUs+controllers.

people have been speculating for many moons already about various APUs mixing and matching the different cores, pic related.
>>
>>51672297
>raising prices just because nvidia has high prices
lel no

just get a good marketing team for fucksake.
>>
>>51672346
Well, this is how business works: you price your goods as high as the consumer is willing to pay, and then sell them. If they get a card that's equal to or better than the nVidia card, pricing it in a similar fashion may net them good sales with a hefty income.
AMD put their foot in the door by using this basic tactic: make a GPU that's equal to or better than the nVidia GPU in that market, price it a bit lower.

390Xs don't cost $400 to make and transport, and most of the R&D got covered by the 290X.

And yes, their marketing team is kinda shit. They're working on it though, recent restructure + they started pulling staff from tech sites. We'll see. MS pours a ridiculous amount of money into their marketing and their consumer side still sucks. Throwing money at it won't solve the problem
>>
>>51672297
>I thought AMD pushed ZEN APUs to 2017.

Nobody knows anything for sure, but rumor on the street is that Zen CPUs are Q4'16 at the earliest due to GF 14nm woes.

It's possible that Zen APUs could come off TSMC's 16nm line before 2017, but it seems unlikely.
>>
>>51669288
It may be, but DisplayPort is a fucked up connection and a wonky DP cable can actually cause your computer to crash
>>
>>51672515
Why is DP a 'fucked up connection'?
>>
>>51672457
man if Zen is as good as they say, it will put AMD back into the game.
IF DX12 is a go, AMD CPUs will strike back with a vengeance in games.
>>
>>51672569
Zen has a very good chance of being at least competitive with newer Intel cores, but only because Intel has been sitting idle for so long.

However, if they can't get Zen made on 14nm until 2017, AMD is gonna get fucked in short order by Cannonlake due to the 14nm/10nm gap.
>>
>>51672674

Intel keep delaying their 10nm chips. Every time people think it is just around the corner they shift the date back by 6 months.

I would not be surprised if consumer cannonlake is one of the last chips to come out since the real money is in laptops and servers.
>>
>>51671639
doubt it because xbone has no dp
>>
>>51671154
Just get the Dell G-sync. It is $600
>>
>>51671639
>freesync on a console that can't do 1080p 60fps
For what reason?
>>
Since when did enabling vsync induce lag anyway?
I remember playing Half Life and its derivatives with it on just fine, other games around that generation were fine, but suddenly about 5 years ago enabling vsync would cause your input to lag horrendously.
What did they do? Was this all some buildup ploy to make people think they needed new sync technology?
>>
>>51674529

Adaptive sync works best when you have a fluctuating framerate - if the console ran everything at a 60fps lock then there would be no need.
>>
>>51674593

TRIPLE BUFFERING
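
i.e. the render-ahead queue that most games ship as "triple buffering" parks finished frames behind queued ones under vsync. quick toy arithmetic (queue depth and refresh rate are just example numbers):

# with vsync on and 2 frames already queued ahead of yours on a 60 Hz panel,
# your input's frame waits whole refresh intervals before it ever hits the screen
refresh_ms = 1000 / 60
frames_queued_ahead = 2
print(round(frames_queued_ahead * refresh_ms, 1))  # ~33.3 ms of extra input lag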