PC sales have reversed the decline seen over the past several years of tablet popularity, while workstation sales have reached $7 billion and are expected to climb past $9 billion within the next couple of years. Meanwhile, gaming machine sales are growing at 39 percent. What does all this add up to? A growing need for better graphics performance and more powerful CPUs. Until recently, Nvidia and AMD have effectively treated Intel as the unfavored stepchild of the graphics market. Is that changing?
Performance Testing: Intel Versus Nvidia
In recent performance testing, which pitted the Intel 3.7 GHz E3-1245 v3 CPU directly against its class equal, the Nvidia Quadro K600 AIB, the two came out roughly even, with Intel edging ahead in some key areas, including graphics capability and slightly faster processing times (though the results were close enough that the average user would be unlikely to notice the difference).
In environments running compute-laden CAD software and complex models, every second counts. Will Intel’s performance improvements translate directly into improved market shares? Some evidence indicates this is already happening. You can learn about even more new developments in CAD workstation design at Cadalyst.
Market Shares: Intel Versus Nvidia
Recent market analysis shows a slight decline in overall demand for graphics cards, but most of this is attributed to normal market fluctuations and doesn't necessarily indicate a permanent decline in the overall market. Intel is blessed with a talented group of GPU engineers, and has been for some time.
While AMD's market share dropped more than 18 percent in a single quarter and Nvidia's fell more than 10 percent, Intel's went down less than 8 percent. Much of the support for Intel's products can be attributed to the gaming sector, but the rest is likely a direct result of workstation sales, as PC demand was down significantly for the same quarter.
As early as last year, some were reporting significant improvements in Intel’s products, specifically in the arena of graphics. One of Intel’s best moves was incorporating support for OpenCL and OpenGL, both of which are gaining popularity among CAD professionals, as well as graphic artists, animators, and many graphics specialists in the media. Cadalyst is always here with the latest news and reviews involving CAD hardware and software development.
The Future of CAD Workstation Graphics
Is Intel clearly taking hold of the CAD graphics market? Maybe, maybe not. What is clear is that Intel is no longer content to sit on the sidelines while Nvidia and AMD run the show. CAD professionals who depend on workstations to make a living can expect to see better products coming from all of the top manufacturers in the graphics realm as each try to outdo the others in terms of performance and features.
For CAD users, Cadalyst offers the most complete and up-to-date information about the industry. Visit the Cadalyst website for accurate, current industry news today.
I recently read an article by an Intel product manager on the need for “ECC” (error correction code) memory in CAD workstations. From the article: “Corrupted data can impact every aspect of your business, and worse yet you may not even realize your data has become corrupted. Error-correcting code (ECC) memory detects and corrects the more common kinds of internal data corruption.”
For some reason, this triggered my memory of the sudden-acceleration Toyota Prius incident from 2010. The popular press latched on to the idea that cosmic rays were interfering with the electronics in the Prius. While theoretically possible, the probability of this was astronomically low. It did, however, make for a great story, and the FUD (fear, uncertainty, and doubt) caused Prius prices to plummet temporarily and sales to slow to a crawl.
Back to ECC memory and CAD systems. Is there really a need for ECC memory in CAD or is it just FUD marketing to upsell hardware and make products sound more valuable than they really are? I decided to do a little research.
Who needs ECC memory and what is its role in professional & CAD workstation computing?
Naturally occurring cosmic rays can and do cause problems for computers down here on planet Earth. Certain types of subatomic particles (primarily neutrons) can pierce through buildings and computer components and physically alter the electrical state of electronic components. When one of these particles interacts with a block of system memory, GPU memory, or other binary electronics inside your computer, it can cause a single bit to spontaneously flip to the opposite state. This can lead to an instantaneous error, incorrect application output, and sometimes even a total system crash. However, the theoretical chance of a single-bit error caused by a cosmic ray strike on your PC or workstation's memory is fairly low — only about once every nine years per 8 GB of RAM, according to recent data.
ECC technology — used in both system RAM and devices such as high-end GPUs — can reliably detect and correct these errors, reducing the odds of memory corruption due to single-bit errors to about once every 45 years per 8 GB of RAM. Of course, as with everything else in life, there are tradeoffs: ECC memory is typically up to 10% slower and significantly more expensive than standard non-ECC memory.
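The "detect and correct" behavior described here is typically built on Hamming-style codes (ECC DRAM modules use a SECDED variant over 64-bit words). As a rough illustration only — not the exact scheme in any shipping module — here is a minimal Hamming(7,4) sketch in Python that silently corrects a single flipped bit:

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword.
    Layout: [p1, p2, d1, p3, d2, d3, d4] (positions 1..7)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Detect and correct up to one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # parity check over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # parity check over positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the bad bit; 0 = no error
    if syndrome:
        c[syndrome - 1] ^= 1         # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]

# A cosmic-ray-style single-bit flip is corrected transparently:
word = [1, 0, 1, 1]
code = hamming74_encode(word)
code[4] ^= 1                         # simulate the particle strike
assert hamming74_decode(code) == word
```

Real SECDED memory adds an extra parity bit so that double-bit errors are at least detected (though not corrected), which is why ECC is described as reducing, not eliminating, corruption risk.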
Because the odds of a cosmic ray strike increase in direct proportion to the physical amount of memory (and related components) inside a computer, this is a real concern for large-scale, clustered supercomputing and other environments where computing tasks often include high-precision calculations that can take days or even weeks to complete. Supercomputer clusters, which often contain hundreds or even thousands of connected nodes and terabytes of memory, face much higher odds of cosmic ray strikes — and much higher costs when they occur. Restarting a week-long calculation on a supercomputer can cost a facility tens of thousands of dollars in lost time, electricity, and manpower — not to mention lost productivity.
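Because the error rate scales linearly with installed memory, the rough figure cited earlier (one expected error every nine years per 8 GB) can be extrapolated. This back-of-the-envelope Python sketch — which treats that figure as an assumption, not a measured constant — shows why a terabyte-scale cluster is a different story from a desktop:

```python
YEARS_PER_ERROR_PER_8GB = 9.0  # rough published figure cited above; an assumption

def mean_years_between_errors(ram_gb):
    """Expected error rate scales linearly with the amount of memory installed."""
    return YEARS_PER_ERROR_PER_8GB * 8.0 / ram_gb

# A desktop, a loaded workstation, a fat node, and an 800 TB cluster:
for ram in (8, 32, 1024, 800 * 1024):
    days = mean_years_between_errors(ram) * 365.25
    print(f"{ram:>7} GB RAM -> one expected single-bit error every {days:,.2f} days")
```

At 8 GB the expected interval is years; at hundreds of terabytes it shrinks to well under an hour, which is why ECC is standard in the supercomputing environments described above.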
But even for very beefy PC CAD workstation configurations with loads of RAM on board, you are probably not at imminent risk from cosmic ray strikes and the resulting single-bit errors. Over the course of your work, you are much more likely to endure system crashes or application hangs due to failing components, power fluctuations, and software bugs than due to cosmic ray strikes. Additionally, many applications in the desktop design and engineering space can endure a single-bit error without negatively impacting the computing process or product. For example, if the color or brightness of a single pixel on a display changes due to this type of memory corruption on the system's GPU, nobody will ever notice. There are many such examples of this type of error not really impacting one's everyday work.
That said, many leading technology manufacturers are equipping their high-end products with ECC memory for compute-heavy (especially clustered supercomputing) applications where the benefits of error-correcting memory outweigh the speed and cost drawbacks. AMD, for example, has engineered its new FirePro W9000 and FirePro S9000 ultra-high-end GPU cards to include ECC memory, which the end user can selectively enable for advanced computing purposes where rock-solid stability and protection from cosmic rays are crucial.
Author: Tony DeYoung
Where do you begin your quest for the right workstation? This particular hardware search should start with your software.
Let’s be real: Nobody relies on just one application over the course of a day. We’re all bouncing between disparate tasks and windows. But for the majority of CAD professionals, there is one application — or maybe a couple — that consumes the bulk of your hours at the desk. What’s the app that dominates your day? Got it? Now hit the web site of the software developer and find the minimum and recommended system requirements for your killer app. AutoCAD users can find this information at http://usa.autodesk.com/autocad/system-requirements.
Minimum is the Starting Point Only
In most cases, an application’s minimum requirements set an extremely low standard, as the software vendors begrudgingly must address the least common denominator of the installed base. We don’t recommend you follow these guidelines, but it’s worth making a note of the minimum graphics, system memory and CPU requirements. On the other hand, it’s highly likely that any new workstation on the market today will meet or exceed these numbers.
More interesting is the list of recommended or certified hardware. For SolidWorks, Dassault Systèmes (as of this writing) specifies a minimum of 1 GB RAM, but suggests 6 GB. Well, if you go with 1 GB, you’ll be sorry — even 6 GB isn’t necessarily the best choice, depending on your budget, and especially given the incredible amount of gigabytes/dollar that can be had today.
Similarly, Autodesk isn’t going to stop you from running a PC gamer graphics card, but the company will tell you which cards are optimized for performance and built for reliability when it comes to supporting AutoCAD or Autodesk Inventor.
Increasingly, the only CAD-certified graphics cards are professional-brand NVIDIA Quadro and AMD FirePro. That’s because software developers have consistently seen the fewest bugs and problems with cards that, like the system overall, have been exhaustively tested and tuned for professional workstation applications. In fact, the major CAD software developers will help you address issues related to running a Quadro or FirePro card, but they dedicate no support cycles to fixing bugs on consumer-class hardware.
Last week, I talked about why Intel’s latest generations of graphics-enabled CPUs might make CAD professionals think twice about paying extra dollars for a discrete graphics card on their next workstations.
As I mentioned previously, the low-cost Entry 3D segment has seen steady gains over the years, for a logical reason … as average street prices fall and capabilities climb, the Entry class satisfies more and more of the workstation community. But then right around the start of 2011 — precisely when Sandy Bridge comes out of the chute in workstations like HP's Z210 — Entry 3D shipments start to flatten and then decline (albeit modestly).
Why are Entry 3D sales more indicative than other segments of a possible erosion from integrated Sandy Bridge graphics? Well, if recent buyers were to opt for Sandy Bridge graphics, the discrete card they’d most likely be opting against would be an entry-class product. Those shopping for a mid-range or better card aren’t going to be enticed by CPU-integrated graphics. Such buyers have both the need for performance and the dollars to pay for it. So if Intel’s new push into professional-brand integrated graphics were to have an impact, we would logically see the effects first in Entry 3D. And that appears precisely to be the case, albeit at a far-from-dramatic rate.
Don’t expect the impact of CPU-integrated graphics to be either dramatic or fast-paced. For the near term, while Intel’s “good enough” graphics performance can satisfy a big chunk of the mainstream, it will be an appropriate choice for only the most budget-conscious professionals. Still, the trend line, as it was in mainstream graphics, is pointing just one way: up. Sandy Bridge’s successor, Ivy Bridge, has just recently begun shipping in the market, and it again provides a substantial bump in performance and features over its predecessor.
Give it time, and integrated solutions will eventually hold significant share among CAD pros … not to the extent they do in mainstream PC markets, but significant share nonetheless.
Intel had been promising that its latest generations of graphics-enabled CPUs would make CAD professionals think twice about paying extra dollars for a discrete graphics card on their next workstations. And it appears those promises are holding true … not in dramatic fashion, but valid nonetheless.
The thought of CPU-integrated graphics is a new proposition for buyers of professional-caliber workstations looking to speed their CAD workflows. Prior to Intel's Westmere generation, released in early 2010, virtually every workstation shipped with a professional-brand graphics add-in card installed. The vast majority have been Nvidia Quadro models, with a minority share bearing AMD's FirePro brand.
Westmere’s CPU+GPU combination first raised the question — could integrated graphics perform well enough for CAD duties to allow buyers to save some cash on the add-in card? The answer in 2010 was generally “no.” Performance was not up to snuff, even for entry-class CAD use, and as a result, most workstation OEMs still required the presence of a Quadro or FirePro card in any machine leaving the factory. That choice made sense, as the last thing HP or Dell would want for their professional customers is a poor graphics experience that might turn them off workstations altogether.
But then came 2011 and the launch of the Sandy Bridge generation of die-integrated graphics. With Sandy Bridge, Intel focused its performance improvements on graphics more than anything else. And for the first time, the company began actively marketing its graphics for professional use (the "P" prefix in the P3000 signifying professional grade). The combination of Intel's posture and Sandy Bridge's substantially improved graphics was enough to get OEMs like HP to allow buyers, for the first time, to choose integrated graphics and pass on the graphics add-in card.
Now, Sandy Bridge’s graphics can’t compete head-to-head with Quadro or FirePro … it’s not intended to. What it is intended to do is provide competent graphics for CAD professionals who don’t have the highest demand for performance and whose budgets are especially tight. How did Intel do on its goals? Well, a look in the past few quarters at the add-in card attach rates for low-end systems and the distribution of the add-in cards sold should give a clue.
Anecdotally, OEMs are reporting that, while attach rates remain quite high, they have dropped with Sandy Bridge. And those reports seem to be validated by shipment numbers for professional graphics add-in card segments, specifically the low-cost Entry 3D segment. That segment has seen steady gains over the years, for a logical reason … as average street prices fall and capabilities climb, the Entry class satisfies more and more of the workstation community. But then right around the start of 2011 — precisely when Sandy Bridge comes out of the chute in workstations like HP's Z210 — Entry 3D shipments start to flatten and then decline (albeit modestly).
Next week, I'll continue this discussion by explaining why Entry 3D sales are more indicative than other segments of possible erosion from integrated Sandy Bridge graphics.
The incessant pace of progress and innovation for workstation technology never slows.
Less than a quarter after every major workstation OEM launched a full trio of models based on Intel’s Sandy Bridge-EP (a.k.a. Xeon E5), the industry leader in CPUs has already released its follow-on processor generation, code-named Ivy Bridge. And subsequently, we are now seeing the first Ivy Bridge workstations hitting the market, including Dell’s Precision T1650 and HP’s Z220.
How Does Ivy Bridge Affect the CAD Workstation Market?
What benefits can Ivy Bridge offer to those plying their trade in CAD? Well, there's the usual broad-based boost in performance that any good generational upgrade will provide, as Intel expects a 20 percent improvement in general computation from Ivy Bridge (though of course mileage will vary by application). But this product family offers more appeal than the usual generation-to-generation performance bump. While that appeal extends across applications and usage models, a few special nuggets of technology in this generation will pique the interest of workstation-wielding CAD professionals.
Intel's lead in silicon process manufacturing continues to grow, and Ivy Bridge should prove an ideal vehicle to showcase that lead. Just as competitors are ramping up their 32 nm processes, Intel is jumping a full generation ahead with Ivy Bridge's 22 nm process, which packs millions more transistors into the same silicon area.
That's a win for workstation buyers especially, as they represent a professional community that certainly cares about CPU performance but demands a lot more. First off, a process shrink buys room for more cores, and we'll eventually see Ivy Bridge SKUs with eight or more cores (not at launch, but later in the product lifecycle). Far from being one-trick ponies, today's MCAD professionals have to be jacks-of-all-trades — a competitive market, tight budgets, and tighter schedules all demand it. Drawing is just one piece of the daily workflow, complemented by a host of other critical compute tasks, from simulation to styling. And chores like finite element analysis and computational fluid dynamics multi-thread quite well, making 50 percent more cores a serious weapon for driving computation time down and achieving the ultimate goal: boosting productivity.
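The payoff from extra cores for well-threaded solvers can be sanity-checked with Amdahl's law. This Python sketch assumes a hypothetical solver in which 95 percent of the work parallelizes — the fraction is illustrative, not a measured value for any particular FEA or CFD package:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup for a job whose parallelizable
    fraction is spread evenly across the given number of cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Hypothetical solver: 95% of the runtime multi-threads cleanly.
p = 0.95
for cores in (4, 6, 8, 12):
    print(f"{cores:>2} cores -> {amdahl_speedup(p, cores):.2f}x speedup")
```

Under this assumption, going from eight cores to twelve (50 percent more) still trims roughly a quarter off the solve time, though the serial fraction caps the gains as core counts climb.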
Improved Integrated Graphics
The extra silicon space also allowed Intel to dial up the performance and functionality of its integrated graphics hardware. For example, Ivy Bridge's P4000 GPU incorporates more on-chip graphics engines and supports advanced features like hardware tessellation, a proven tool that can deliver finer, more realistic 3D surfaces in less time. With its range of upgrades, Ivy Bridge can claim the full DirectX 11 support its predecessor could not. And with more of those bigger, faster graphics engines, Intel claims a 30% increase in graphics performance for Ivy Bridge over Sandy Bridge. That means CAD professionals on a budget can now more seriously consider a low-cost CPU-integrated graphics solution like the P4000.
Support for Three Monitors
But looking beyond performance, Ivy Bridge’s graphics is going to provide another big draw for the MCAD professional: native support for three monitors. While the mainstream is now just discovering the benefits of dual monitors, many mechanical designers are already using three: for example, one for drawing, one for simulation and one for visualization. Prior to Ivy Bridge, a desktop with three high-resolution monitors mandated at least one discrete add-in graphics card. But with this generation, a cost-conscious MCAD user could go three-wide and stick with base platform graphics.
MCAD Users: Same Performance, 50% Fewer Watts!
With more cores to speed CAD simulation and ultra-realistic rendering, as well as a 30 percent graphics improvement, Ivy Bridge promises to be a tide that raises all boats, as all workstations — deskside or mobile — will benefit. But one unique advancement debuting in Ivy Bridge is a particular boon to the MCAD pro on the go: its 22 nm technology introduces a new Tri-Gate transistor structure that offers the same performance at 50 percent less power than Sandy Bridge's 32 nm process.
And that's allowing leading vendors HP, Lenovo, Dell, and Fujitsu to introduce new mobile workstation models that dramatically extend battery life at the same performance level, or deliver far more performance with the same battery life. Either way you look at it, it's a win when computation demands are high. And few corners of the computing world demand more performance per watt than mechanical designers trying to accomplish demanding design work on the road.
This post reflects industry analyst Alex Herrera’s views and does not necessarily reflect the opinions, product plans or strategy of either Dell or Intel.