
The GeForce RTX 20** thread



You get FreeSync on nVidia too, so why should I choose Vega, which performs worse?

 

And besides, FreeSync isn't some magic quick fix for poor performance.

 

What was your point, really, other than advertising for Vega?

You don't get FreeSync over HDMI with Nvidia. As long as you're not playing competitively, I think FreeSync is brilliant. At least I notice quickly if it's turned off.


Well, without AMD there would be no FreeSync.

Without AMD, no price war in the 2-4K price range, where you get relatively good value for money.

Without AMD, probably fewer driver updates.

Without AMD, probably miserable APU performance on CPUs for many years to come, and quad-core until 2025.

Without AMD, probably less innovation on the GPU front, and perhaps half a year to a year longer between new graphics cards.

 

If anything, you should appreciate that even though AMD currently doesn't have graphics cards matching NVIDIA's performance, the alternative would have sucked badly, something like a rotten Santa beard.


You don't get FreeSync over HDMI with Nvidia. As long as you're not playing competitively, I think FreeSync is brilliant. At least I notice quickly if it's turned off.

Why would I want FreeSync over HDMI? I use DP, and most people with a PC use PC monitors, so the fact that a small minority use a TV over HDMI is a pretty uninteresting argument for Vega.

 

Besides, FreeSync doesn't fix low fps; the product doesn't automatically get better just because you enable FreeSync.

 

So for my use I get more out of Nvidia, FreeSync and DP :)


Why would I want FreeSync over HDMI? I use DP, and most people with a PC use PC monitors, so the fact that a small minority use a TV over HDMI is a pretty uninteresting argument for Vega.

 

Besides, FreeSync doesn't fix low fps; the product doesn't automatically get better just because you enable FreeSync.

 

So for my use I get more out of Nvidia, FreeSync and DP :)

FreeSync handles variable fps within the FreeSync range. When I play The Witcher 3 at max settings in 3440x1440 I stay within my monitor's range, and I can say it's a night-and-day difference between FreeSync on and off.
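A minimal sketch of what 'within the range' means in practice, assuming a hypothetical 48-100 Hz FreeSync monitor (not any particular model):

```python
# Variable refresh only tracks the frame rate while fps stays inside the
# monitor's FreeSync range; outside it, the display falls back to a fixed
# refresh rate (or low-framerate compensation, if the monitor supports it).

FREESYNC_MIN_HZ = 48   # assumed lower bound of the range
FREESYNC_MAX_HZ = 100  # assumed upper bound of the range

def refresh_follows_fps(fps: float) -> bool:
    return FREESYNC_MIN_HZ <= fps <= FREESYNC_MAX_HZ

for fps in (40, 60, 85, 120):
    print(f"{fps} fps -> VRR active: {refresh_follows_fps(fps)}")
```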


FreeSync handles variable fps within the FreeSync range. When I play The Witcher 3 at max settings in 3440x1440 I stay within my monitor's range, and I can say it's a night-and-day difference between FreeSync on and off.

Now I can run FreeSync on my monitor with a 2080 Ti as well. So I'm still wondering what advantages Radeon VII with FreeSync offers over nVidia if you're using a PC monitor and not a TV.


AMD is a bit too fond of power, so performance per watt is somewhat debatable. Their GPUs are designed more flexibly than nVidia's, and they usually come with good memory too. Plenty of people appreciate those qualities in AMD, but for efficient gaming use nVidia is probably the preferred choice for most.

 

3 out of 4 choose nVidia over AMD.

AMD's GPUs are really not more flexible than Nvidia's. Nvidia's Turing has dedicated scheduling for each individual SM, while AMD's GCN has at most 4 units handling this, whether you look at an RX 570 or a Radeon VII.

 

Imagine an organization with 2048 employees. To manage it, you've split it into 32 teams of 64 people, and 8 teams share one leader who hands out their tasks.

Now imagine a competing organization of 2048 employees. To manage it, the workforce is likewise split into 32 teams of 64 people, but each team has its own team leader, 8 team leaders report to a department head, and there are 4 department heads in total.
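A minimal sketch of the difference between those two hierarchies, using the numbers from the analogy above (the figures illustrate the analogy, not actual hardware specifications):

```python
# Workers per scheduling unit in the two "organizations" from the analogy.
# Illustrative only: the numbers come from the analogy, not from GPU datasheets.

def workers_per_scheduler(total_workers: int, schedulers: int) -> float:
    return total_workers / schedulers

# Organization A (the GCN-style setup): 2048 workers, 4 shared leaders
print(workers_per_scheduler(2048, 4))   # 512.0 workers per leader

# Organization B (the Turing-style setup): 2048 workers, one leader per 64-person team
print(workers_per_scheduler(2048, 32))  # 64.0 workers per leader
```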


Here are some questions and answers about DLSS:

 

 

NVIDIA DLSS: Your Questions, Answered
By Andrew Edelsten on February 15, 2019 | Featured Stories | DLSS

Hi, I’m Andrew Edelsten, Technical Director of Deep Learning at NVIDIA. I’ve been working here since 2010, and for the last couple of years my team has been working with the folks at NVIDIA Research to create DLSS.

This week, we're excited to launch DLSS for Battlefield™ V and Metro Exodus, following launches in Final Fantasy XV: Windows Edition and 3DMark Port Royal. There have been a lot of questions, and I wanted to get some answers out to you on the most popular ones.

Q: What is DLSS?

A: Deep Learning Super Sampling (DLSS) is an NVIDIA RTX technology that uses the power of AI to boost your frame rates in games with graphically-intensive workloads. With DLSS, gamers can use higher resolutions and settings while still maintaining solid framerates.

Q: How does DLSS work?

A: The DLSS team first extracts many aliased frames from the target game, and then for each one we generate a matching “perfect frame” using either super-sampling or accumulation rendering. These paired frames are fed to NVIDIA’s supercomputer. The supercomputer trains the DLSS model to recognize aliased inputs and generate high quality anti-aliased images that match the “perfect frame” as closely as possible. We then repeat the process, but this time we train the model to generate additional pixels rather than applying AA. This has the effect of increasing the resolution of the input. Combining both techniques enables the GPU to render the full monitor resolution at higher frame rates.
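A framework-free sketch of the paired-frame idea described in that answer. Everything named here (the render_scene callable, the sample counts, the error metric) is a stand-in for illustration, not NVIDIA's actual training pipeline:

```python
import numpy as np

def make_training_pair(render_scene):
    """Pair a cheap aliased frame with an expensive "perfect" reference of the same scene."""
    aliased = render_scene(samples_per_pixel=1)     # jagged input, as the game would produce at runtime
    reference = render_scene(samples_per_pixel=64)  # super-sampled "perfect frame" target
    return aliased, reference

def per_pixel_error(prediction, reference):
    """Training pushes model(aliased) toward the reference by minimising an error like this."""
    return float(np.mean((np.asarray(prediction) - np.asarray(reference)) ** 2))
```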

Q: Where does DLSS provide the biggest benefit? And why isn’t it available for all resolutions?

A: The results of DLSS vary a bit, because each game has different characteristics based on the game engine, complexity of content, and the time spent on training. Our supercomputer never sleeps, and we continue to train and improve our deep learning neural network even after a game’s launch. When we have improvements to performance or image quality ready, we provide them to you via NVIDIA software updates.

 DLSS is designed to boost frame rates at high GPU workloads (i.e. when your framerate is low and your GPU is working to its full capacity without bottlenecks or other limitations). If your game is already running at high frame rates, your GPU’s frame rendering time may be shorter than the DLSS execution time. In this case, DLSS is not available because it would not improve your framerate. However, if your game is heavily utilizing the GPU (e.g. FPS is below ~60), DLSS provides an optimal performance boost. You can crank up your settings to maximize your gains. (Note: 60 FPS is an approximation -- the exact number varies by game and what graphics settings are enabled)

To put it a bit more technically, DLSS requires a fixed amount of GPU time per frame to run the deep neural network. Thus, games that run at lower frame rates (proportionally less fixed workload) or higher resolutions (greater pixel shading savings), benefit more from DLSS. For games running at high frame rates or low resolutions, DLSS may not boost performance. When your GPU’s frame rendering time is shorter than what it takes to execute the DLSS model, we don’t enable DLSS. We only enable DLSS for cases where you will receive a performance gain. DLSS availability is game-specific, and depends on your GPU and selected display resolution.
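That enable rule boils down to a simple comparison. A sketch with made-up numbers (the actual per-frame DLSS cost is not published in the post):

```python
# DLSS spends a roughly fixed amount of GPU time per frame, so it only pays off
# when the frame is expensive enough that shading fewer pixels saves more time
# than the network costs to run.

def dlss_offered(frame_render_time_ms: float, dlss_exec_time_ms: float) -> bool:
    return frame_render_time_ms > dlss_exec_time_ms

print(dlss_offered(22.0, 4.0))  # ~45 fps, heavily GPU-bound -> True, DLSS is offered
print(dlss_offered(6.0, 4.0))   # ~165 fps already           -> False, DLSS is hidden
```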

Q: Some users mentioned blurry frames. Can you explain?

A: DLSS is a new technology and we are working hard to perfect it.

We built DLSS to leverage the Turing architecture’s Tensor Cores and to provide the largest benefit when GPU load is high. To this end, we concentrated on high resolutions during development (where GPU load is highest) with 4K (3840x2160) being the most common training target. Running at 4K is beneficial when it comes to image quality as the number of input pixels is high. Typically for 4K DLSS, we have around 3.5-5.5 million pixels from which to generate the final frame, while at 1920x1080 we only have around 1.0-1.5 million pixels. The less source data, the greater the challenge for DLSS to detect features in the input frame and predict the final frame.
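Quick arithmetic on the pixel counts quoted above; the internal render resolutions below are guesses that happen to land in the stated ranges, not confirmed figures:

```python
def megapixels(width: int, height: int) -> float:
    return width * height / 1e6

print(megapixels(3840, 2160))  # 8.29 MP -> the 4K output frame
print(megapixels(2560, 1440))  # 3.69 MP -> inside the ~3.5-5.5 MP input range quoted for 4K DLSS
print(megapixels(1920, 1080))  # 2.07 MP -> the 1080p output frame
print(megapixels(1600, 900))   # 1.44 MP -> inside the ~1.0-1.5 MP input range quoted for 1080p
```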

We have seen the screenshots and are listening to the community’s feedback about DLSS at lower resolutions, and are focusing on it as a top priority. We are adding more training data and some new techniques to improve quality, and will continue to train the deep neural network so that it improves over time.

Q: Why don’t I just use upscaled TAA instead?

A:  Depending on the resolution, quality settings, and game implementation, some may prefer TAA in one game and DLSS in another.

The game industry has used TAA for many years and we know that it can fall down in certain ways. TAA is generated from multiple frames and can suffer from high-motion ghosting and flickering that DLSS tends to handle better.

Q: When’s the next DLSS update for Battlefield V and Metro Exodus?

A: We are constantly working to improve image quality. Recently we updated the core of DLSS so that you get the latest model updates the moment you launch your game. So make sure you have our latest Game Ready Driver (418.91 or higher) installed.

For Battlefield V, we think DLSS delivers a big improvement in 4K and 2560x1440 performance -- up to 40% -- for the corresponding quality, but also hear the community.  For the next push, we are focusing our testing and training to improve the image quality at 1920x1080 and also for ultrawide monitors (e.g. 3440x1440). The current experience at these resolutions is not where we want it to be.

For Metro Exodus, we’ve got an update coming that improves DLSS sharpness and overall image quality across all resolutions that didn’t make it into day of launch. We’re also training DLSS on a larger cross section of the game, and once these updates are ready you will see another increase in quality. Lastly, we are looking into a few other reported issues, such as with HDR, and will update as soon as we have fixes.  

https://www.nvidia.com/en-us/geforce/news/nvidia-dlss-your-questions-answered/?fbclid=IwAR1QB448tfuHRf8ARcsDbloEjJ17n1vD2Cg02Bc2yXB_U8QW9T0Vl3fIsQM&linkId=100000005126417


"For Battlefield V, we think DLSS delivers a big improvement in 4K and 2560x1440 performance -- up to 40% -- for the corresponding quality, but also hear the community."

 

What do they mean by that last part?

 

 

 


So DLSS is really only useful if you play at 60 fps or below?

DLSS looks like it's only useful if NVIDIA pulls their finger out and makes sure it works properly in the specific game where you want the benefit.

So if you enable standard DLSS in a game NVIDIA doesn't care much about, it probably won't improve much over time. This is exactly where users should have been able to tell the "graphics card" that hey, things ought to look a bit better here, and actually get it changed.

 

I have to say it ought to work the way it's described here:

 

 

 

Deep Learning Super Sampling (DLSS) is an NVIDIA RTX technology that uses the power of deep learning and AI to improve game performance while maintaining visual quality. DLSS helps players achieve smooth frame rates with graphically-intensive settings, boosting framerates when your GPU is under heavy load.

DLSS looks like it's only useful if NVIDIA pulls their finger out and makes sure it works properly in the specific game where you want the benefit.

So if you enable standard DLSS in a game NVIDIA doesn't care much about, it probably won't improve much over time. This is exactly where users should have been able to tell the "graphics card" that hey, things ought to look a bit better here, and actually get it changed.

 

I have to say it ought to work the way it's described here:

 

 

It's not Nvidia that decides which games "get" DLSS and which don't. At the moment it's apparently free, but it's the game developer who has to agree to implement DLSS, and it's not just a matter of "switching it on"...


Funny. With an i5 4690 and a GTX 970, the GPU ran at 100% while the CPU sat at around 50% (Civ 6, which has a fairly steady fps). Now with an RTX 2080 it's the opposite: the GPU idles along at 45-50% while the CPU works at 100%, at 4.4 GHz.

Time for a CPU upgrade, maybe? Or would it not make that big a difference? I've gathered that Civ 6 is mostly CPU-demanding, but there are plenty of games that demand at least as much from the GPU. So what would be the most optimal CPU for a 2080?
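A rough sketch of the rule of thumb behind reading those utilisation numbers (the thresholds are assumptions, not exact cut-offs):

```python
def likely_bottleneck(gpu_util_pct: float, cpu_util_pct: float) -> str:
    if gpu_util_pct >= 95:
        return "GPU-bound"
    if cpu_util_pct >= 90:
        return "CPU-bound"
    return "unclear (frame cap, engine limit, or something else)"

print(likely_bottleneck(100, 50))  # the old i5 4690 + GTX 970 case -> GPU-bound
print(likely_bottleneck(47, 100))  # the RTX 2080 case above        -> CPU-bound
```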

