Nvidia announces January GeForce event, could reveal the RTX 3080 Ti, RTX 3060

midian182

Posts: 6,653   +59
Staff member
Something to look forward to: Nvidia has announced that its 'GeForce RTX: Game On' event takes place on January 12, 2021, at 9 AM PST. The company hasn't revealed what will be on show, but you can expect new Ampere cards to be announced, with the RTX 3080 Ti and RTX 3060 the most likely candidates.

The broadcast lands during the online-only Consumer Electronics Show (CES), which runs from Monday, January 11, to Thursday, January 14. Jeff Fisher, senior vice president of Nvidia's GeForce business, will present the event. He was also on stage last year to unveil the RTX 2060 and Max-Q laptops.

It's rumored that the successor to that RTX 2060—the non-Ti version of the RTX 3060—will be revealed next month. All rumors point to two versions of the card: one with 12GB and one with 6GB. The latter is thought to have started life as the RTX 3050 Ti before being rebranded by Nvidia.

A potentially more exciting announcement could be the RTX 3080 Ti. Recent leaks claim that Nvidia had been planning to launch the card in January but decided to delay the release until February, partly due to Big Navi not being as threatening as it expected.

The RTX 3080 Ti is thought to feature 20GB of GDDR6X, double that of the vanilla RTX 3080. It could cost $999, putting it in direct competition with the Radeon RX 6900 XT, which packs 16GB of GDDR6. Nvidia's offering also has a 320-bit memory bus, 760 GB/s of memory bandwidth, 9,984 CUDA cores (78 streaming multiprocessors), and a 320W TDP, according to reports.
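Those leaked numbers are internally consistent: a back-of-the-envelope sketch, assuming the same 19 Gbps-per-pin GDDR6X signaling as the vanilla RTX 3080 (the per-pin rate is an assumption, not from the report):

```python
# Sanity-check the leaked RTX 3080 Ti memory figures:
# a 320-bit bus at an assumed 19 Gbps per pin.
bus_width_bits = 320
per_pin_gbps = 19  # GDDR6X per-pin data rate (assumed, matches RTX 3080)

# Divide by 8 to convert bits to bytes.
bandwidth_gb_s = bus_width_bits * per_pin_gbps / 8
print(f"{bandwidth_gb_s:.0f} GB/s")  # 760 GB/s
```

That works out to exactly the 760 GB/s figure in the leaks, which lends them some plausibility.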

In addition to announcing the GeForce RTX: Game On broadcast, the company revealed four new multiplayer games now support its latency-lowering Nvidia Reflex. CRSED: F.O.A.D. (formerly Cuisine Royale), Enlisted, Mordhau, and Warface are joining the likes of Fortnite, Valorant, and Call of Duty: Modern Warfare in offering the feature. You can read more about Reflex here.

 

QuantumPhysics

Posts: 4,711   +5,111
I am definitely happier with the way the 3000 series released with the 3070 being roughly equal to the 2080Ti at half the cost.

I truly hope that in the 4000 series, the lowest-end card, be it the 4050 or 4060, will be more powerful than the 3080 Ti.

It would just make so much more sense to completely exceed the previous generation and undercut its price.
 

Theinsanegamer

Posts: 2,391   +3,474
I am definitely happier with the way the 3000 series released with the 3070 being roughly equal to the 2080Ti at half the cost.

I truly hope that in the 4000 series, the lowest-end card, be it the 4050 or 4060, will be more powerful than the 3080 Ti.

It would just make so much more sense to completely exceed the previous generation and undercut its price.
No GPU generation has had that kind of a leap.
 

Dsirius

Posts: 20   +25
TechSpot Elite
How about tech and hardware websites and YouTube reviewers unite and impose an embargo on the infamous Nvidia for their RTX 3080 Ti launch? On launch day, all of you just make a big announcement that Nvidia is banned by reviewers because of their malicious behavior and repeated blackmail against you. Let them taste what they are doing to all of you (right now Nvidia is picking you off one by one). Otherwise, none of you will stand a chance against the greedy and malicious Nvidia corporation in the long term.
 

Lew Zealand

Posts: 1,769   +1,892
TechSpot Elite
Moreover, what's the technical reason that the memory on GPU boards is soldered in as opposed to having a memory slot?

Cooling
Clearance
Latency
Warranty
User error

...for starters. Have a look at a current gen video card and consider as an OEM how you'd manage all these risks with the typical user. You wouldn't. You'd make a video card just like they have been for decades now, a closed unit for all but the most die-hard cooling modders.
 

terzaerian

Posts: 772   +1,102
Cooling
Clearance
Latency
Warranty
User error

...for starters. Have a look at a current gen video card and consider as an OEM how you'd manage all these risks with the typical user. You wouldn't. You'd make a video card just like they have been for decades now, a closed unit for all but the most die-hard cooling modders.
I actually found this discussion later in the day:

https://www.reddit.com/r/AskEngineers/comments/3thjnb
And it sounds like it has the most to do with latency.
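For a sense of scale, here's a rough back-of-the-envelope on why a socket is hard at these speeds. Assuming 19 Gbps per pin (typical GDDR6X, not stated in that thread) and the usual ~0.15 mm/ps signal propagation speed on FR-4 PCB material:

```python
# How much timing slack does one bit leave at GDDR6X speeds?
per_pin_gbps = 19                 # assumed GDDR6X per-pin rate
bit_period_ps = 1e3 / per_pin_gbps   # duration of one bit in picoseconds

# Signals cover roughly 0.15 mm per picosecond on a PCB trace.
prop_speed_mm_per_ps = 0.15
flight_mm = bit_period_ps * prop_speed_mm_per_ps

print(f"bit period: {bit_period_ps:.1f} ps")          # ~52.6 ps
print(f"trace length per bit period: {flight_mm:.1f} mm")  # ~7.9 mm
```

One bit period corresponds to only about 8 mm of trace, so the extra, poorly matched path length a socket and module add can eat a huge fraction of the timing budget. That's a big part of why soldering the chips a few millimeters from the GPU wins.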

That said, apparently some cards from the early 90s from Diamond had open memory slots you could put actual RAM chips in, while some Matrox cards had slots for laptop memory modules.

Which makes me wonder if some obscure Chinese manufacturers somewhere haven't tried this. I saw this on LTT the other day:


You'd think if someone has built a graphics card into a motherboard, maybe something like what I've described is out there?
 

Puiu

Posts: 4,481   +3,317
TechSpot Elite
I actually found this discussion later in the day:

https://www.reddit.com/r/AskEngineers/comments/3thjnb
And it sounds like it has the most to do with latency.

That said, apparently some cards from the early 90s from Diamond had open memory slots you could put actual RAM chips in, while some Matrox cards had slots for laptop memory modules.

Which makes me wonder if some obscure Chinese manufacturers somewhere haven't tried this. I saw this on LTT the other day:


You'd think if someone has built a graphics card into a motherboard, maybe something like what I've described is out there?
I remember upgrading from 1MB VRAM to 2MB VRAM in the 90s on a 686 system :)