GPU-Accelerated Virtual Desktops: The Future of Computing?

Jos

Read the full article at:
[newwindow=https://www.techspot.com/article/851-virtual-desktop-gpu-acceleration/]https://www.techspot.com/article/851-virtual-desktop-gpu-acceleration/[/newwindow]

Please leave your feedback here.
 
I don't know if it's the future of computing, but it's certainly one possible future. Personal CPU/GPU servers at home make sense. There's no need for everyone to have their own gaming-level PC if most of the time they just browse the web or use office products.

That said, the other direction computing is going is having a powerful PC in your pocket. In a couple of years, common phones will be more powerful than an Xbox 360, and it should in theory be possible to use them as PCs (when hooked up to a display and keyboard).

The combination isn't bad, though: have a PC server at home and mobile devices to stream from it.
 
The last company I worked for that went VDI went back to desktop PCs in six months.

Why?

One network outage and all of the PCs in the whole building were down, except for IT, which was the only department that didn't go VDI.

They won't be trying that again anytime soon. Lost productivity is a major issue with upper management, lol.
 
Some additional products and information that may be of interest to readers, less focused on hardware-specific solutions.

Windows Server has a technology called RemoteFX, which is GPU agnostic and unique because it leverages the WDDM/WDM GPU parallelization and virtualization features that only Windows NT currently offers. This technology has been shipping since 2006 and has evolved into a very rich solution, especially for enterprise clients that want to run richer 3D-accelerated software, or even newer GPU-heavy frameworks like WinRT, on low-end client workstations, and to do so from simple Windows Server configurations with any brand and mix of GPUs.

RemoteFX is also capable of providing full 3D gaming support, and it can do so by utilizing any GPU resources available to the Windows Server configuration, scaling up and down as GPU demand requires.


The reason this is newsworthy to average readers is that Windows 8.1 Pro also supports RemoteFX technology. This means that users can connect to their Windows 8.1 system from any Remote Desktop client and run 3D and GPU-intensive software. (Very little is ever mentioned of Windows 8.1's positive technical features, so this feature isn't widely known.)


So end users can now use their phone/tablet/device to connect to their Windows 8.1 system and run rich GPU-based software, including games, from older titles like WoW to the latest version of Battlefield. (Prior to reading this article I was playing SWTOR on my Nokia phone by connecting to my gaming desktop.)

Since this is built into Windows 8.1, it is also a free solution that many users already have.
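For anyone who wants to try the host side of this, here is a minimal sketch (Python, just because it's widely available) of the standard registry toggle that allows incoming Remote Desktop connections. It has to be run elevated on the Windows 8.1 Pro (or Server) machine, and it does not cover the firewall rule or any RemoteFX-specific Group Policy settings; those still need to be set separately.

```python
# Minimal sketch: allow incoming Remote Desktop connections on Windows
# by clearing the fDenyTSConnections flag. Run from an elevated prompt.
# Assumes Windows 8.1 Pro or a Server edition; firewall and Group Policy
# settings (including the RemoteFX-specific ones) are not touched here.
import winreg

TS_KEY = r"SYSTEM\CurrentControlSet\Control\Terminal Server"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, TS_KEY, 0,
                    winreg.KEY_SET_VALUE) as key:
    # 0 = allow Remote Desktop connections, 1 = deny them
    winreg.SetValueEx(key, "fDenyTSConnections", 0, winreg.REG_DWORD, 0)

print("Remote Desktop connections enabled "
      "(remember to allow the 'Remote Desktop' firewall rule group too).")
```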
 
You missed one disadvantage - the licensing is a nightmare and in some cases insanely expensive. Some companies want to charge for every user that could potentially use the service rather than the number that actually use it. Others want money for every core in the host machine even if the hosted server is configured to use only half of them.
 
yukka,

Have you dealt with the licensing model for VMware View? Their "concurrent user" model allows you to license just the number of currently connected users.
 
If the bugs get worked out, this could be really awesome!!! I could imagine one machine doing everything, with the ability to keep things separate while gaming, working, or whatever you're doing, without paying the price (in performance terms)!!!

Very interesting article.
 
Like yukka said, the greatest problem with this model is that you trade hardware costs for perpetual license costs and complex license usage agreements. I'm not sure there will ever be a price point where VDI has an advantage over a traditional 3-to-5-year computer life cycle.
 
Ever since I got into virtualization back in 1999 I have always dreamed of having GPU acceleration, but here we are 15 years later, and while things are better, they still suck compared to the progress we have made in every other category.

With RemoteFX, Microsoft did some great things for high-performance graphics over the wire, so I really hope that in the next 3-5 years we see big steps from Nvidia and AMD to push for a solid virtualization extension to their graphics products.
 
I think it will steadily gain more traction. It has a huge number of benefits for business, especially where you just have carbon copy desktops for hundreds of users.

We've actually been trialling a limited VDI deployment at work for system admins like myself. Benefits include being able to remotely access administrative tools out of hours to fix problems, getting the same consistent desktop across multiple platforms, and being able to suspend work midway through and then reconnect to the same session from a different location. It's extremely useful and you do get used to the convenience.

I guess the only showstopper, at least in VMware's case, is that they price their solutions so that the cost comparison against the traditional approach is only ever JUST compelling enough to make it worth looking at - it's never a total "no brainer".

But yeah - we love it, and I can see us rolling it out wider within the establishment.
 
A big problem with this situation is that in a large business, the server would require more memory than is currently feasible for a single server. Say someone is doing video editing, and a typical engineering team of 10 is each using CAD/CAM software (if that's what it's still called), each copy now requiring 8 GB. Firefox would not function on this laptop with 2 GB until I upgraded to 3 GB. Each person simply browsing with HTML5 requires 3 GB a copy? Microsoft went out of their way (in bed with Intel) to make everything very memory intensive with .NET; a three-line Visual C++ program I wrote required something like 300 MB. Not likely, but say each user is playing an 8 GB game. At this time it's impossible, unless it's just a toy.
 
Firefox would not function on this laptop with 2 GB until I upgraded to 3 GB.

Sounds strange. Firefox's RAM usage is typically around 1GB for me with quite a few tabs loaded, and much less for simpler use. Much better than Chrome for example. And what do you mean by "not function"? Not open?
 
This is quite an interesting article, but my interests are much more modest. What does a single user (let's call him "me") need to do in order to establish an instance of a virtual desktop running on the same/primary machine that I'm using as I type this question?

I know there are minimum hardware requirements that I currently don't meet, but I'm planning a build and I'd like to be sure that I select components accordingly. I'll often read comments by developers who mention having one or more virtual desktops running for testing purposes, as this avoids the possibility of corrupting their primary system software.

Any thoughts or good source links for someone (again, let's call him "me") who is a noob to this arena?
 
A big problem with this situation is that in a large business, the server would require more memory than is currently feasible for a single server. Say someone is doing video editing, and a typical engineering team of 10 is each using CAD/CAM software (if that's what it's still called), each copy now requiring 8 GB. Firefox would not function on this laptop with 2 GB until I upgraded to 3 GB. Each person simply browsing with HTML5 requires 3 GB a copy? Microsoft went out of their way (in bed with Intel) to make everything very memory intensive with .NET; a three-line Visual C++ program I wrote required something like 300 MB. Not likely, but say each user is playing an 8 GB game. At this time it's impossible, unless it's just a toy.
Wait, what? I ran Firefox on 1 GB of RAM (not very productively, I agree, but still). A mid/high-end workstation or server with 128 GB of RAM is nothing special, and even then you could have more than one server for virtual desktops...
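Just to put rough numbers on the memory question, here's a back-of-the-envelope sizing sketch. Every per-user figure in it is an assumption borrowed from this thread for illustration, not a measurement from any real deployment.

```python
# Rough, illustrative RAM sizing for a single VDI host.
# All per-user figures are assumptions for the sake of argument.
workloads = {
    # name: (number of users, RAM per user in GB)
    "CAD/engineering": (10, 8),
    "video editing":   (1, 16),
    "web/office":      (30, 2),
}

host_overhead_gb = 8  # hypervisor + management overhead (assumed)

total_gb = host_overhead_gb + sum(n * gb for n, gb in workloads.values())
print(f"Estimated RAM needed: {total_gb} GB")  # 10*8 + 1*16 + 30*2 + 8 = 164 GB

# A single 128 GB host would be tight for this mix, but splitting the
# heavy users onto a second host (or buying a 256 GB box) covers it.
```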
 
This is quite an interesting article, but my interests are much more modest. What does a single user (let's call him "me") need to do in order to establish an instance of a virtual desktop running on the same/primary machine that I'm using as I type this question?

Are you referring to a remote desktop (the kind discussed in this article) or a local virtual machine? Assuming you mean the latter, you need software such as VirtualBox, enough RAM (and disk space) to dedicate to the virtual machine, and a copy of the OS you want to install in that VM.

What do you want it for?
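If you do go the VirtualBox route, the setup can also be scripted with its VBoxManage command-line tool. Here's a minimal sketch (wrapped in Python for convenience); the VM name, OS type, RAM and disk sizes are placeholders, and it assumes VBoxManage is on your PATH.

```python
# Minimal sketch: create an empty VirtualBox VM via the VBoxManage CLI.
# Assumes VirtualBox is installed and VBoxManage is on the PATH.
# The name, OS type, RAM and disk sizes are placeholder values.
import subprocess

VM = "TestVM"

commands = [
    ["VBoxManage", "createvm", "--name", VM, "--ostype", "Ubuntu_64", "--register"],
    ["VBoxManage", "modifyvm", VM, "--memory", "4096", "--cpus", "2", "--vram", "64"],
    ["VBoxManage", "createhd", "--filename", f"{VM}.vdi", "--size", "40960"],  # ~40 GB
    ["VBoxManage", "storagectl", VM, "--name", "SATA", "--add", "sata"],
    ["VBoxManage", "storageattach", VM, "--storagectl", "SATA",
     "--port", "0", "--device", "0", "--type", "hdd", "--medium", f"{VM}.vdi"],
]

for cmd in commands:
    subprocess.run(cmd, check=True)

# After this you would attach an OS install ISO (storageattach with
# --type dvddrive) and boot the VM, or just finish the setup in the GUI.
```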
 
Sounds strange. Firefox's RAM usage is typically around 1GB for me with quite a few tabs loaded, and much less for simpler use. Much better than Chrome for example. And what do you mean by "not function"? Not open?
I'm still using Vista, and Firefox would take a minute between web pages with 2 GB of memory. There are other processes, like the Mozilla maintenance task, that use memory beyond what Firefox itself shows in Task Manager.
 
I'm still using Vista

Ah, that explains it. :)

Really, Vista is a fine OS but a memory hog, so I guess even Firefox, which takes less RAM than other browsers (I've seen Chrome take 1 GB for a single tab), would suffer. Switching to another OS will likely improve your experience a lot in memory-tight situations.
 