PC sales have reversed the decline seen over the past several years of tablet popularity, while workstation sales have reached $7 billion and are expected to climb to more than $9 billion within the next couple of years. Meanwhile, sales of gaming machines are growing by 39 percent. What does all this add up to? There’s a growing need for better graphics performance and more powerful CPUs. Until recently, Nvidia (as well as AMD) had effectively relegated Intel to also-ran status in graphics. Is that changing?
Performance Testing: Intel Versus Nvidia
In recent performance testing, which pitted Intel’s 3.7 GHz Xeon E3-1245 v3 CPU directly against its class equivalent, the Nvidia Quadro K600 AIB, the two finished more or less even, with Intel pulling ahead in some key areas, including graphics capability and slightly faster processing times (though the results were close enough that the average user would be unlikely to notice the difference).
In environments running compute-laden CAD software and complex models, every second counts. Will Intel’s performance improvements translate directly into improved market share? Some evidence indicates this is already happening. You can learn about even more new developments in CAD workstation design at Cadalyst.
Market Shares: Intel Versus Nvidia
Recent market analysis shows a slight decline in the overall demand for graphics cards. But most of this is attributed to normal market fluctuations, and doesn’t necessarily indicate a permanent decline in the overall market. Intel is blessed with a talented group of GPU engineers, and has been for some time.
While AMD’s market share dropped more than 18 percent in a single quarter, and Nvidia’s fell more than 10 percent, Intel’s went down less than 8 percent. Much of the support for Intel’s products can be attributed to the gaming sector, but the rest is likely a direct result of workstations, as PC demand was down significantly for that same quarter.
As early as last year, some were reporting significant improvements in Intel’s products, specifically in the arena of graphics. One of Intel’s best moves was incorporating support for OpenCL and OpenGL, both of which are gaining popularity among CAD professionals, as well as graphic artists, animators, and many graphics specialists in the media. Cadalyst is always here with the latest news and reviews involving CAD hardware and software development.
The Future of CAD Workstation Graphics
Is Intel clearly taking hold of the CAD graphics market? Maybe, maybe not. What is clear is that Intel is no longer content to sit on the sidelines while Nvidia and AMD run the show. CAD professionals who depend on workstations to make a living can expect to see better products coming from all of the top manufacturers in the graphics realm as each tries to outdo the others in terms of performance and features.
For CAD users, Cadalyst offers the most complete and up-to-date information about CAD available. Visit the Cadalyst website for accurate industry information today.
I recently read an article by an Intel product manager on the need for ECC (error-correcting code) memory in CAD workstations. From the article: “Corrupted data can impact every aspect of your business, and worse yet you may not even realize your data has become corrupted. Error-correcting code (ECC) memory detects and corrects the more common kinds of internal data corruption.”
For some reason this triggered my memory of the sudden-acceleration Toyota Prius incident from 2010. The popular press latched on to the idea that cosmic rays were screwing with the electronics in the Prius. While theoretically possible, the probability of this was astronomically low. It did, however, make for a great story, and the FUD (fear, uncertainty, and doubt) caused Prius prices to temporarily plummet and sales to slow to a crawl.
Back to ECC memory and CAD systems. Is there really a need for ECC memory in CAD or is it just FUD marketing to upsell hardware and make products sound more valuable than they really are? I decided to do a little research.
Who needs ECC memory and what is its role in professional & CAD workstation computing?
Naturally occurring cosmic rays can and do cause problems for computers down here on planet Earth. Certain types of subatomic particles (primarily neutrons) can pierce through buildings and computer components and physically alter the electrical state of electronic components. When one of these particles interacts with a block of system memory, GPU memory, or other binary electronics inside your computer, it can cause a single bit to spontaneously flip to the opposite state. This can lead to an instantaneous error and the potential for incorrect application output and sometimes even a total system crash. However, the theoretical chance of a single-bit error caused by a cosmic ray strike on your PC or workstation’s memory is fairly low — only about once every 9 years per 8GB of RAM, according to recent data.
ECC technology — used in both system RAM and devices such as high-end GPUs — can reliably detect and correct these errors, reducing the odds of memory corruption due to single-bit errors to about once every 45 years per 8GB of RAM. Of course, just like everything else in life, there are tradeoffs: ECC memory is typically up to 10% slower and significantly more expensive than standard non-ECC memory.
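The detect-and-correct trick is worth seeing in miniature. Actual ECC DIMMs use wide SECDED codes, but the classic Hamming(7,4) code illustrates the same principle: three parity bits protect four data bits, and any single flipped bit can be located and repaired. This is a minimal sketch for illustration, not a model of any particular memory controller:

```python
# Minimal Hamming(7,4) sketch: 3 parity bits protect 4 data bits,
# allowing any single flipped bit to be located and corrected.
# Real ECC memory uses wider codes (e.g., 72/64 SECDED), but the
# principle is identical.

def encode(d):                          # d: list of 4 data bits
    c = [0, 0, d[0], 0, d[1], d[2], d[3]]  # positions 1..7 (indices 0..6)
    c[0] = c[2] ^ c[4] ^ c[6]           # p1 covers positions 1, 3, 5, 7
    c[1] = c[2] ^ c[5] ^ c[6]           # p2 covers positions 2, 3, 6, 7
    c[3] = c[4] ^ c[5] ^ c[6]           # p4 covers positions 4, 5, 6, 7
    return c

def decode(c):                          # c: 7-bit codeword, maybe corrupted
    c = c[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]      # recompute each parity check
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s4     # 1-based position of the bad bit
    if syndrome:                        # nonzero syndrome: fix the flip
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]     # extract the 4 data bits

word = encode([1, 0, 1, 1])
word[4] ^= 1                            # simulate a cosmic-ray bit flip
assert decode(word) == [1, 0, 1, 1]     # the error is corrected
```

The "syndrome" is simply the binary position of the corrupted bit, which is why correction is cheap enough to do in hardware on every memory access.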
Because the odds of a cosmic ray strike increase in direct proportion to the physical amount of memory (and related components) inside a computer, this is a real concern for large-scale, clustered supercomputing and other environments where computing tasks often include high-precision calculation sets that can take days or even weeks to complete. In the case of supercomputer clusters, which often contain hundreds or even thousands of connected computer nodes and terabytes of memory, cosmic ray strikes on the system are much more likely — and much more costly. Restarting a week-long calculation on a supercomputer can cost a facility many tens of thousands of dollars in lost time, electricity, and manpower — not to mention lost productivity.
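A back-of-the-envelope sketch makes the scaling concrete. Using the figures cited above (roughly one flip per 9 years per 8GB without ECC, one per 45 years with it) and assuming, as the article does, that the error rate grows linearly with memory size:

```python
# Back-of-the-envelope soft-error estimate using the rates cited in
# the text: ~1 flip per 9 years per 8 GB (non-ECC), ~1 per 45 years
# per 8 GB (ECC). Real-world rates vary widely; this only shows how
# the expected interval shrinks as total memory grows.

NON_ECC_YEARS_PER_FLIP_8GB = 9.0
ECC_YEARS_PER_FLIP_8GB = 45.0

def mean_years_between_errors(ram_gb: float, ecc: bool = False) -> float:
    """Expected years between single-bit errors for a given RAM size."""
    base = ECC_YEARS_PER_FLIP_8GB if ecc else NON_ECC_YEARS_PER_FLIP_8GB
    # Error rate scales linearly with the amount of memory installed.
    return base * 8.0 / ram_gb

# A 32 GB workstation vs. a 10 TB cluster:
print(mean_years_between_errors(32))         # 2.25 years
print(mean_years_between_errors(10 * 1024))  # ~0.007 years, about 2.6 days
```

At workstation scale the interval is measured in years; at cluster scale it collapses to days, which is exactly why ECC is standard in supercomputing and optional on the desktop.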
But even for very beefy PC CAD workstation configurations with loads of RAM on board, you are probably not at imminent risk from problems caused by cosmic ray strikes and the resulting single-bit errors. Over the course of your work, you are much more likely to endure system crashes or application hangs due to failing components, power fluctuations, and software bugs than due to cosmic ray strikes. Additionally, many applications in the desktop design and engineering space can actually endure a single-bit error without negatively impacting the computing process or product. For example, if the color or brightness of a single pixel on a display monitor is changed due to this type of memory corruption on the system’s GPU, nobody will ever see or notice it. There are many such examples of this type of error not really impacting one’s everyday work.
This said, many leading technology manufacturers are enabling their high-end products with ECC memory for compute-heavy (especially clustered supercomputing) applications where the benefits of error-correcting memory outweigh the comparative speed and cost drawbacks. AMD, for example, has engineered its new FirePro W9000 and FirePro S9000 ultra-high-end GPU cards to include ECC memory that can be selectively enabled by the end user for advanced computing purposes where rock-solid stability and protection from space rays are crucial.
Author: Tony DeYoung
Here at CADspeed, we get a lot of questions about buying new hardware for CAD applications. While the answer to, “What CAD hardware should I buy?” varies widely based on the person asking the question, it always starts in the same place: with the requirements of the CAD software you plan to use.
Yet a list of minimum requirements can be, well, only minimally helpful in the quest for the right CAD workstation. Most CAD users need hardware that will not just meet the minimum specifications, but enable them to maximize their productivity.
CAD software developers know this, and they have a vested interest in making sure you get the bang for your software buck. So this series will explore recommended hardware for a variety of common CAD applications from the makers of the applications themselves.
We start this series with Autodesk, creator of 3D design, engineering and entertainment software that includes some of the most commonly used applications in the industry. Autodesk has developed a web site to help users find certified or recommended software for Autodesk applications.
The truth is, however, many CAD users don’t use just one CAD software application. It’s very common to use both AutoCAD and Revit on the same system, for example. The intriguing part of the Autodesk hardware site is you can select multiple products and find the common driver and hardware configurations that will work best for your system.
Certified vs. Recommended
On the Autodesk website, you’ll see two terms that you need to understand: certified and recommended. “Certified” hardware meets Autodesk’s minimum hardware requirements for the applicable Autodesk software product. At least one configuration (e.g., GPU + driver, or CPU + GPU + RAM + HD + BIOS) has passed tests designed to verify that the hardware supports the product’s features.
“Recommended” hardware meets Autodesk’s recommended system requirements for the applicable Autodesk product. At least one configuration has passed tests designed to verify that the hardware supports the product’s features.
A “Recommended” or “Certified” rating is based on the test results for a graphics card and driver or a complete system. Clicking the link for a card or system will reveal the results of the individual component tests.
System and Card Ratings:
- Recommended – Meets Autodesk’s recommended system requirements and has passed all Autodesk certification tests.
- Certified – Meets Autodesk’s minimum system requirements and has passed all Autodesk certification tests.

Component Test Results*:
- Passed – When tested with this configuration, the hardware passed testing.
- Passed with issues – When tested with this configuration, the hardware has some minor problems or features that are not supported.
- Failed – When tested with this configuration, the hardware does not adequately support the product’s features.
- No Results – This configuration has not been tested with the associated product.
* Test results are valid only for the tested combination of hardware and driver. Certified or Recommended status does not guarantee that the graphics hardware will operate acceptably with other drivers or configurations. Driver-specific test results are available for some hardware and can be found by clicking on a product name in the Hardware List.
Other Terms to Understand
Before using the Autodesk Certified Hardware site, you should understand a few other common terms to make sure you are getting the right results.
- Workstation—Graphics hardware designated by the manufacturer as workstation-grade, typically meaning it is designed to work with 3D CAD applications
- Consumer—Graphics hardware designated by the manufacturer for desktop or gaming level use, typically meaning it is not designed or recommended for use with 3D CAD applications
- Mobile—Integrated hardware normally found in laptops
- Workstation Desktop—Desktop system designated by the manufacturer as workstation-grade, typically meaning it is designed to work with 3D CAD applications
- Workstation Laptop—Laptop designated by the manufacturer as workstation-grade, typically meaning it is designed to work with 3D CAD applications
- Consumer Desktop—Desktop system designated by the manufacturer for desktop or gaming level use, typically meaning it is not designed or recommended for use with 3D CAD applications
- Consumer Laptop—Laptop designated by the manufacturer for desktop or gaming level use, typically meaning it is not designed or recommended for use with 3D CAD applications
- Tablet—Touch-screen device with integrated components
The Hardware List page contains only the hardware products that Autodesk has tested for use with certain Autodesk applications. Autodesk tests a variety of hardware, but focuses primarily on hardware the manufacturer has indicated is workstation-grade and designed to work with 3D CAD applications.
Unless otherwise noted, Autodesk hardware certification tests are run on systems containing a single video card with a single monitor attached. Autodesk does not currently run certification tests on systems with multiple graphics cards installed or multiple monitors.
Author: CADspeed Editors
The most compelling reason to install multiple GPUs is to drive multiple high-resolution displays. The secret’s out that “multi-mon” is the single best way to improve your productivity. Anyone who’s gone to two displays (or three — or more!) will tell you they could never go back to one. And more graphics cards can display more pixels across more monitors.
Which Graphics Card Works for You?
That said, you don’t necessarily need to populate two cards to run two monitors, so pay attention to the cards you’re selecting. NVIDIA’s Quadro with nView and Mosaic technology can support two displays across most of the product line. A single high-end AMD FirePro V7900, with its Eyefinity technology, can handle four on its own, thank you very much. As such, if your performance demands have you buying midrange or high-end cards, you might get all the screen real estate you want with one card. But if you’re much hungrier for pixels and screens than you are for polygons per second, you might consider two less-expensive, dual-monitor cards.
On top of multi-monitor support, you can use that extra slot to turn your workstation into a supercomputer. An exaggeration? Not to some. General-purpose computing on GPUs (GPGPU) technology is still evolving, but many of the applications that show the most promise are the ones of most interest to engineers and other CAD users: applications such as computational fluid dynamics (CFD) and finite-element analysis (FEA). Simulation software developers such as ANSYS and Abaqus are porting code to harness GPUs to deliver big speed-ups — in many cases tenfold or even 100-fold increases — over CPU-only computation.
High-end graphics cards usually require more power than the 75 watts supplied by the typical x16 PCI Express interface. Workstation OEMs accommodate their extra needs via auxiliary power cables drawn from the supply. Some high-end and virtually all ultra high-end graphics cards are dual-slot thickness. They insert into one PCI Express x16 connector, but their thickness means an adjacent x16 slot may be blocked and rendered useless.
Make the Right Choice
When purchasing a workstation online, the OEM’s product configurator should let you know if the chosen card or cards will mate to the chosen system, with respect to power supplies and connectors, the number of available PCI Express x16 slots, and whether a dual-slot card has sufficient clearance. For example, when outfitting graphics on a smaller chassis that can’t accommodate two dual-slot cards, chances are the OEM will only offer the option of two entry-level or two mid-range cards, both of which are single slot width.
For that matter, if you’re perusing the latest flavor of entry-level workstation, full-length cards may not have clearance lengthwise. Again, the online configurator should ensure compatibility, so you shouldn’t have to worry about these issues.
A GPU manages how your computer processes and displays graphics and, thanks to parallel processing, is typically more efficient at this than a CPU. The GPUs best optimized for professional graphics-intensive applications, such as CAD, design visualization, and analysis, are found in workstation-caliber AMD FirePro and NVIDIA Quadro graphics cards.
Five Categories of GPUs
Such professional-caliber GPUs come in a variety of flavors for desktop as well as mobile form factors. In the more mature desktop arena, they tend to fall into five categories of add-in cards.
The first category is 2D GPUs. Professional 2D cards can manage some 3D processing, but are not optimized for regular or intensive 3D applications. They generally aren’t well suited for CAD use.
For professional-level CAD work, you’ll want a Quadro or FirePro 3D add-in card. Each of these product lines includes approximately half a dozen models that fall into the remaining four product categories, as defined here by Jon Peddie Research:
- entry-level: $350 or less
- mid-range: $350–$950
- high-end: $950–$1,500
- ultra high-end: $1,500 or more
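The price bands above are just a lookup. As a quick illustration (the function name and the handling of exact boundary prices are my own, not Jon Peddie Research's), a sketch that maps a card's street price to its segment:

```python
# Map a professional GPU's street price (USD) to its market segment,
# per the Jon Peddie Research bands listed above. The function name
# and boundary handling are illustrative assumptions.

def gpu_segment(price: float) -> str:
    if price <= 350:
        return "entry-level"
    elif price <= 950:
        return "mid-range"
    elif price <= 1500:
        return "high-end"
    return "ultra high-end"

print(gpu_segment(299))    # entry-level
print(gpu_segment(1200))   # high-end
```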
There are always exceptions, but most buyers will want to match the performance and capabilities of the GPU with the rest of the system — that is, an entry-caliber card for an entry-caliber workstation. Achieving good balance, where each component hits a performance level that is supported by the rest of the system, is the best way to maximize ROI for your workstation purchase and optimize your productivity.
Fortunately, most workstation OEMs today do this work for you, offering a subset of cards from AMD and NVIDIA that best match the capabilities of the model you’ve chosen.
Optimizing GPU Performance
Most graphics cards — and all performance-oriented models — slide into PCI Express x16 slots in the workstation. Graphics cards can be installed in open slots at the factory when ordering your new system, or anytime later if you buy a card off the shelf. A mid-life upgrade of your system with a latest-generation GPU can provide a cost-effective kick, for example if rendering becomes a bottleneck.
And unlike the machine that’s at your desk today, your new workstation (unless it’s a small-form-factor model) will likely come equipped with at least two PCI Express x16 slots, able to accommodate two cards. Why would you want two (or more)? One reason is that multi-GPU technologies from NVIDIA (SLI) and AMD (CrossFire) allow the pairing of two cards (rendering alternate frames) to boost performance.
Where do you begin your quest for the right workstation? This particular hardware search should start with your software.
Let’s be real: Nobody relies on just one application over the course of a day. We’re all bouncing between disparate tasks and windows. But for the majority of CAD professionals, there is one application — or maybe a couple — that consumes the bulk of your hours at the desk. What’s the app that dominates your day? Got it? Now hit the web site of the software developer and find the minimum and recommended system requirements for your killer app. AutoCAD users can find this information at http://usa.autodesk.com/autocad/system-requirements.
Minimum is the Starting Point Only
In most cases, an application’s minimum requirements set an extremely low standard, as the software vendors begrudgingly must address the least common denominator of the installed base. We don’t recommend you follow these guidelines, but it’s worth making a note of the minimum graphics, system memory and CPU requirements. On the other hand, it’s highly likely that any new workstation on the market today will meet or exceed these numbers.
More interesting is the list of recommended or certified hardware. For SolidWorks, Dassault Systèmes (as of this writing) specifies a minimum of 1 GB RAM, but suggests 6 GB. Well, if you go with 1 GB, you’ll be sorry — and even 6 GB isn’t necessarily the best choice, depending on your budget, and especially given the incredible number of gigabytes per dollar that can be had today.
Similarly, Autodesk isn’t going to stop you from running a PC gamer graphics card, but the company will tell you which cards are optimized for performance and built for reliability when it comes to supporting AutoCAD or Autodesk Inventor.
Increasingly, the only CAD-certified graphics cards are professional-brand NVIDIA Quadro and AMD FirePro. That’s because software developers have consistently seen the fewest bugs and problems with cards that, like the system overall, have been exhaustively tested and tuned for professional workstation applications. In fact, the major CAD software developers will help you address issues related to running a Quadro or FirePro card, but they dedicate no support cycles to fixing bugs on consumer-class hardware.
It’s about time. After a hiatus from its role as a viable alternative to Intel for workstation-class CPUs, AMD is back. Instead of its traditional server/workstation focused Opteron line, this time the company is — quite wisely — choosing to target the market with a combination CPU/GPU part, what AMD refers to as an Accelerated Processing Unit, or APU.
New to the market are two professional-caliber versions of its recent “Trinity” part, workstation-branded as the FirePro A300 and A320. And while having two such parts represents a drop in the workstation bucket, as compared to Intel’s position, any new competition should only help CAD professionals find better products — or at least better deals on those products — in the future.
A New Strategy
While AMD has never given up plying its professional-brand FirePro GPUs in workstations, the same can’t be said for professional-brand CPUs. After a promising start and a firm foothold in the market, AMD’s CPUs are today, for all intents and purposes, absent in workstation platforms.
The company’s Opteron processor began making significant inroads into workstation platforms back in the mid-2000s. With Intel’s offerings at that time looking comparatively poor, Opteron steadily picked up workstation OEMs, by the end of 2003 having all major suppliers in tow with the exception of Dell. That increased OEM presence translated directly to increased market share, up to 4% of the overall market in mid-2006, and to more than 10% of dual-socket workstations shipped.
Then came the steady, inexorable decline, which by the end of 2011 left Opteron without any major OEM on board and virtually no market share. Truth be told, it wasn’t like AMD was ignoring the workstation CPU market out of ignorance or incompetence. Rather, it was a case of triage. The company knows full well it doesn’t have the wherewithal of its chief rival, Intel, and accordingly it’s always had to be careful about which markets it targets and which it doesn’t.
And that raises a question: why does AMD now think it should invest its time and money marketing CPUs for workstations, when it didn’t before? It’s not like Intel’s CPUs are struggling like they were back in 2005. Heck, more than ever, AMD is looking for arenas to sell CPUs that don’t directly compete with Intel. No, AMD’s renewed interest in workstation CPUs has more to do with its competitive positioning in GPUs than CPUs.
Ever since the two CPU makers began building and marketing combination, all-in-one CPU+GPU parts (first with Intel’s Westmere in 2010, followed by AMD’s first Fusion parts), a unique opportunity has fallen into AMD’s lap. As we’ve been pointing out for some time now, AMD finds itself in the rare position where it can make a compelling, competitive case over both its chief rivals, Intel and Nvidia. Intel’s reputation for performance graphics has been poor, and despite the company’s largely successful attempt to boost its graphics profile (with 2011’s Sandy Bridge and 2012’s Ivy Bridge), AMD still owns the undeniable edge over Intel in graphics. Nvidia, meanwhile, which could argue graphics supremacy, doesn’t have x86 technology, making it impossible to compete in the new CPU+GPU segment.
Pitching an ISV-certified, professional-caliber version of Trinity to workstation OEMs can be convincing, especially given which end of the market that part could play in. The dominant and still fastest-growing segment of the workstation market is the Entry class, particularly the low end of that class … precisely where the cost-effectiveness of the integrated part can appeal. The capabilities of parts like Trinity and GPU-integrated Ivy Bridge aren’t record-breaking, but they’re too good for workstation-shopping CAD professionals to ignore … especially those on tight budgets.
And given that Intel virtually owns the market, OEMs like Dell and HP ought to welcome an enthusiastic re-entry by AMD. After all, no business wants to be beholden to one supplier, even if it’s a supplier of essentially infinite volume, like Intel.
What Does It Mean for CAD Professionals?
So after doing some comparison shopping, will you end up with a workstation with neither Intel nor Nvidia inside? Maybe, maybe not. But either way, you’ll be much more likely to get the machine you want at a lower price, regardless of whose brand is on it. Because while Intel’s been doing an impressive job as of late delivering the type of hardware professionals demand, any competition is welcome competition. And that not only benefits OEMs like HP and Dell, it should only help when it comes to keeping down IT costs for CAD.