Cambridge University gets early access to Intel's 60-core Xeon Phi chip

By Shawn Knight · 18 replies
Nov 15, 2012
  1. The latest Intel Xeon Phi chip has arrived, packing a full 60 cores into a single package. It's not expected to be publicly available until sometime early next year, although a few have been granted early access to the technology...
  2. MeladT

    MeladT TS Rookie Posts: 22   +7

    But can it run Crysis?
    hammer2085, Lurker101 and Burty117 like this.
  3. Ranger12

    Ranger12 TS Evangelist Posts: 621   +122

    I'll keep my eyes peeled for a techspot review...
  4. Maybe it's silly question, but I'm serious since I'm a noob here: if I plugged it to my gaming rig, would that (significantly) increase fps when gaming?
  5. ghasmanjr

    ghasmanjr TS Booster Posts: 363   +86

    This joke is entirely too outdated. My tablet can run Crysis. Times have changed.
    BlueDrake likes this.
  6. Cota

    Cota TS Enthusiast Posts: 513   +8

    How about going a new way? Especially when a $100 video card has 20+ times more juice than most high-end CPUs, it has been my dream to use my GPU for all purposes.
  7. VitalyT

    VitalyT Russ-Puss Posts: 3,663   +1,949

    Today's top-end GPU cards passed 4 TFLOPS some time ago when it comes to floating-point operations, and the CPU market is far behind that figure. But that's not the point. GPUs do a lot of graphics-specific work these days, and expecting a CPU to match that is simply unrealistic.
  8. PC nerd

    PC nerd TS Booster Posts: 317   +41

    Crysis is still pretty demanding, even by today's standards.
  9. VitalyT

    VitalyT Russ-Puss Posts: 3,663   +1,949

    And what is it giving back for its demands? :) Not that anyone cares, for that matter :)

    I could write a 2+2 application and make it the most demanding ever too :)
  10. Zeromus

    Zeromus TS Booster Posts: 227   +7

    If Crytek optimized Crysis 1, they would ruin the jokes.
    bielius likes this.
  11. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,264

    Intel's Xeon Phi has no texture units. No TMUs = no graphics visualization. Number crunching only.
  12. LinkedKube

    LinkedKube TechSpot Project Baby Posts: 3,485   +45

    So what if I made the graphics out of nothing but 1's and 0's..............
  13. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,264

    Would likely be more original than Activision's offerings.
    BlueDrake and Burty117 like this.
  14. DKRON

    DKRON TS Guru Posts: 569   +25

    Maybe on the lowest graphics settings; no single card can max out Crysis at 1080p even now.
  15. dcnc123

    dcnc123 TS Enthusiast Posts: 57

    Impressive, Intel..... maybe by 2016 or 2017 we will have 64-core desktops too.
  16. > Impressive, Intel..... maybe by 2016 or 2017 we will have 64-core desktops too.

    That's cool... the only thing left to figure out is what to do with a 64-core processor in a normal computer.

    It's a specialized data-crunching chip, really, similar in a way to specialized GPU chips made for graphics processing; most current types of software just can't make use of such an architecture. And the multitasking benefits have limits as well, especially since the individual cores aren't that powerful.
    There are some programs which can make use of dual/quad-core CPUs, but even those are not particularly efficient beyond two cores from what I hear... maybe if regular software used some adaptive algorithm and automatically divided itself into a number of sub-threads based on the number of cores the CPU has and its current load... or were rewritten from the ground up to separate its operations into a bunch of small independent threads, like an operating system... though is that even necessary/beneficial?

    Any of the techies: is it actually beneficial to break a program up into multiple threads/subprograms, or is it better to keep it all confined in one large process? Obviously it would depend on what kind of CPU is in your system, but I'm asking in general, not in the context of optimizing a program's resource usage for a specific CPU. It just seems like software is being forced to multi-thread because of the types of chips being released, and not the other way around, as in chips being made that way because it's more efficient to write software in many threads.

    Come to think of it, I have Firefox running as one large 3 GB process that I assume shares CPU usage across all its tabs (well, the plugin container is external), but Chrome would normally divide that single large process into a dozen or more smaller ones...
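    [Editor's note: a minimal Python sketch of the "adaptive sub-threads" idea the commenter describes, i.e. sizing the worker pool to however many cores the machine actually has. All names here are illustrative, not from the thread; in CPython, a ProcessPoolExecutor would be needed for real CPU-bound speedups because of the GIL, but threads show the structure.]

    ```python
    import os
    from concurrent.futures import ThreadPoolExecutor

    def crunch(chunk):
        # CPU-bound work on one slice of the data.
        return sum(x * x for x in chunk)

    def parallel_sum_of_squares(data):
        # Size the pool to the cores actually present, so the same code
        # adapts from a dual-core desktop up to a many-core chip.
        workers = os.cpu_count() or 1
        size = max(1, len(data) // workers)
        chunks = [data[i:i + size] for i in range(0, len(data), size)]
        with ThreadPoolExecutor(max_workers=workers) as pool:
            return sum(pool.map(crunch, chunks))

    print(parallel_sum_of_squares(list(range(1000))))  # 332833500
    ```

    The program itself never hard-codes a core count, which is the "adaptive" property being asked about.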
  17. Zeromus

    Zeromus TS Booster Posts: 227   +7

    The reason is that Windows gives each process only so much memory in its linear address space; even 64-bit applications only get so much. By starting multiple processes that offload the work into separate address spaces, you get far more isolation and optimization by breaking apart the memory-allocation-intensive portions of the program's logic. Chrome does this, and also separates rendering, tab management, and plugins. A multi-process Chrome not only breaks through the memory-addressing limitations; isolating plugins also provides more security through process isolation. The same can be said about page management, since plugins or errors that occur in one tab do not crash the browser entirely, just the offending page.
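    [Editor's note: a small Python sketch of the crash-isolation property described above. `run_isolated` is a made-up helper, not Chrome's actual mechanism; it just shows that when "tab" logic runs in its own OS process, a crash there surfaces as a nonzero exit code instead of taking the parent down.]

    ```python
    import subprocess
    import sys

    def run_isolated(code):
        # Run the "tab" logic in a separate OS process. If it crashes,
        # only that child process dies; the parent keeps running and
        # sees the failure as a nonzero exit code.
        proc = subprocess.run([sys.executable, "-c", code],
                              capture_output=True, text=True)
        if proc.returncode != 0:
            return ("crashed", proc.returncode)
        return ("ok", proc.stdout.strip())

    print(run_isolated("print('rendered page')"))   # ('ok', 'rendered page')
    print(run_isolated("raise RuntimeError('bad plugin')")[0])  # 'crashed'
    ```

    Running both "tabs" in one process instead, the second raise would have killed everything, which is exactly the failure mode process-per-tab avoids.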
  18. MeladT

    MeladT TS Rookie Posts: 22   +7

    A joke taken too far LOL
  19. Yes, it would, but Xeons aren't made for gaming. Even with 60 cores and hyper-threading it's a waste of money; it's better to just get a 4th-gen Core i7.
