Report: Intel Ivy Bridge launch set for April 29, AMD Trinity on May 15

By Jos
Apr 5, 2012
  1. More information on Intel and AMD's planned launch for their respective next-generation processors has come to light thanks to a couple of reports by CPU World and SWEClockers. According to the…

  2. amstech

    amstech TechSpot Enthusiast Posts: 761   +178

    Basically a re-release of chips for Intel to dance to the bank with again.
    Thanks AMD! :p

    The new Intel HD Graphics 4000 + etc is impressive though.
  3. When you consider PCI-E 3.0 support, Ivy Bridge is a HUGGGEE improvement over Sandy Bridge. It's not even close.
  4. Wagan8r

    Wagan8r TechSpot Guru Posts: 572   +37

    Looks like I'll have plenty of options to choose from in the near future. Ivy Bridge and Kepler, or Trinity. I'm thinking the Trinity APUs will be considerably more affordable though.
  5. I guess the i7-3770K will suffice.
  6. Since when has Intel had the idea of making triple-core CPUs?
  7. Engineer - Sir! You need to release a new version of Intel processors to gain more profit.

    CEO - **** it, just replace the name Sandy with Ivy and add 100MHz to the HD graphics, and we're done!

    Engineer - People here are so dumb.

    CEO - So much win
  8. EEatGDL

    EEatGDL TechSpot Booster Posts: 241   +43

    Yeah, and add more graphics cores, increase the transistor count by shrinking the transistors to 22nm, and produce the first true 3D transistors available to consumers. Do you know anything about semiconductors, MOSFETs or anything else that would let you give a more informed opinion? I'm not buying IB for at least the next year (I haven't upgraded my PC in 4 years), so I'm not a fanboy who buys whenever new products arrive; and while it is a minor improvement in specifications and performance, the engineering behind it is quite interesting - 10 years of development.
  9. Sarcasm

    Sarcasm TechSpot Enthusiast Posts: 340   +20

    Really...

    With today's graphics cards, is there any proof that PCI-e 3.0 will have any massive advantages?
  10. dividebyzero

    dividebyzero trainee n00b Posts: 4,705   +590

    Maybe not for the mainstream, but for compute functionality the benefits of higher bandwidth are readily apparent. Any gaming GPU performance increase is likely to be marginal at best in most situations, simply because most of the computation for frame rendering and post-processing is accomplished within the GPU + VRAM. Games that require a higher level of CPU <-> GPU co-operation (large game maps, intensive AI, CPU physics, etc.) would likely start producing tangible differences between PCI-E 2.0 and 3.0 only once top-SKU dual cards make an appearance (HD 7990, GTX 690), and the CPU at the other end of the PCI-E bus is capable of living up to its end of the transactions.
  11. EEatGDL

    EEatGDL TechSpot Booster Posts: 241   +43

    As usual, dividebyzero gives a more objective opinion; as we can see in the link, the difference is not huge, but if you have PCIe 3 you may get a few more frames with the same hardware, which is very welcome - and if you don't, well... you'll still get what you expected.
  12. 3DCGMODELER

    3DCGMODELER TechSpot Enthusiast Posts: 307   +18

    No 6 Cores...
    No 8 Cores....
    No 12 Cores...
    No 16 Cores....
    No 24 Cores...
    .
    Man they need to get going...
    Ya Think..

    :)
  13. MrBungle

    MrBungle TechSpot Booster Posts: 142   +62

    Why? Does anything really tax more than 4 threads these days? I'd rather see them offer a CPU with 4 cores and 2x the IPC than an 8-core based on today's architectures... both would have the same theoretical throughput, but one would be massively faster in just about every application.
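
The trade-off MrBungle describes - equal theoretical throughput, very different real-world speed - can be sketched with Amdahl's law. The serial fractions below are illustrative assumptions, not measurements from any real workload:

```python
# Toy model: a 4-core CPU with 2x IPC vs an 8-core CPU with baseline IPC.
# Both have the same theoretical throughput (4 * 2 = 8 * 1), but once a
# workload has any serial fraction, the faster cores pull ahead (Amdahl's law).

def speedup(serial_fraction, cores, ipc_factor):
    """Speedup vs a 1-core, 1x-IPC baseline under Amdahl's law."""
    parallel = 1.0 - serial_fraction
    return ipc_factor / (serial_fraction + parallel / cores)

for serial in (0.0, 0.25, 0.5):
    fast_quad = speedup(serial, cores=4, ipc_factor=2.0)  # 4 cores, 2x IPC
    wide_octo = speedup(serial, cores=8, ipc_factor=1.0)  # 8 cores, 1x IPC
    print(f"serial={serial:.2f}: 4-core/2xIPC = {fast_quad:.2f}x, "
          f"8-core = {wide_octo:.2f}x")
```

With a fully parallel workload (serial = 0) both come out at 8x the baseline; at a 50% serial fraction the 4-core/2x-IPC part is roughly 3.2x vs 1.8x, which is exactly the point about faster cores winning in "just about every application".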
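
dividebyzero's bandwidth argument further up the thread can be sanity-checked against the published PCI-E signalling rates: 2.0 runs at 5 GT/s with 8b/10b encoding (20% overhead), while 3.0 runs at 8 GT/s with the much leaner 128b/130b encoding. A quick back-of-the-envelope sketch:

```python
# Theoretical per-lane and x16-slot bandwidth for PCIe 2.0 vs 3.0.
# PCIe 2.0: 5 GT/s, 8b/10b encoding  -> 80% of raw rate is payload.
# PCIe 3.0: 8 GT/s, 128b/130b encoding -> ~98.5% of raw rate is payload.

def lane_bandwidth_gbit(transfer_rate_gt, payload_bits, total_bits):
    """Effective data rate per lane in Gbit/s after encoding overhead."""
    return transfer_rate_gt * payload_bits / total_bits

pcie2 = lane_bandwidth_gbit(5.0, 8, 10)     # 4.0 Gbit/s per lane
pcie3 = lane_bandwidth_gbit(8.0, 128, 130)  # ~7.88 Gbit/s per lane

# A full x16 slot, converted to GB/s (8 bits per byte):
x16_pcie2 = pcie2 * 16 / 8   # 8.0 GB/s
x16_pcie3 = pcie3 * 16 / 8   # ~15.75 GB/s
print(f"x16 PCIe 2.0: {x16_pcie2:.2f} GB/s, x16 PCIe 3.0: {x16_pcie3:.2f} GB/s")
```

So the jump is nearly 2x per lane, which matches the thread's conclusion: plenty of headroom for compute and dual-GPU cards, but little of it is exercised by a single mid-range gaming card.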
     

