AMD launched the AMD FirePro A300 Series Accelerated Processing Unit (APU) for entry-level and mainstream desktop workstations. Featuring AMD Eyefinity multi-display technology, the AMD FirePro A300 Series APUs are designed for CAD and media and entertainment (M&E) workflows.
AMD FirePro A300 Series APUs combine CPU and GPU functionality on a single chip, blending the workstation performance and application-certified compatibility required to keep design professionals productive in their work.
“Design professionals demand workstation-class tools that enable productivity and flexibility in their workflow, and the AMD FirePro A300 Series APUs offer workstation integrators and OEMs an exciting new computing platform on which to design and build powerful, entry-level desktop workstation configurations,” said Matt Skynner, corporate vice president and general manager of AMD Graphics.
According to the company, the AMD FirePro A300 Series APUs are the first single-chip processors capable of delivering the workstation-class visual computing performance required for advanced professional design workflows. The introduction of AMD FirePro A300 Series APUs is designed to allow OEMs and workstation integrators (WSIs) greater flexibility, enabling new workstation designs that help save space, are energy efficient, and have low heat and noise levels without compromising true workstation-class performance and reliability.
AMD FirePro A300 Series APUs were developed for the entry-level and mainstream workstation segments, providing a blend of CPU and GPU performance and industry-leading features to keep design professionals efficient:
- Support for AMD Eyefinity Technology for enhanced efficiency and immersive, multi-monitor productivity;
- AMD Turbo Core technology, where CPU and GPU performance are dynamically scaled depending on workload demands, effectively providing a more responsive experience;
- Support for horizontal display resolutions up to 10,240 x 1600 pixels, enabling large desktop spaces across multiple high-resolution display devices for advanced multitasking;
- Support for Discrete Compute Offload (DCO), allowing additional compute capability by using discrete AMD FirePro GPUs in parallel with APU graphics for extended GPGPU performance;
- 30-bit color support to enable image and color fidelity for advanced workflows such as color correction and image processing when using displays capable of 10-bit-per-channel operation;
- Dedicated UVD (Unified Video Decoder) and VCE (Video Codec Engine) media hardware for faster “fixed function” GPU processing of H.264/MPEG-4 files and other motion media formats when using compatible software, freeing up CPU resources for other tasks.
Pricing and Availability
The AMD FirePro A300 Series APUs will be available in systems from a number of workstation integrators starting in August 2012.
AMD FirePro A300 Series APUs

| APU Model | TDP | CPU Cores | CPU Clock (Max/Base) | AMD Stream Processors | GPU Clock | Unlocked |
|---|---|---|---|---|---|---|
| AMD FirePro A300 | 65W | 4 | 4 GHz / 3.4 GHz | 384 | 760 MHz | No |
| AMD FirePro A320 | 100W | 4 | 4.2 GHz / 3.8 GHz | 384 | 800 MHz | Yes |
Author: CADspeed editors
Reality capture is a boom business for the building industry. With roughly 5 million existing commercial buildings in the United States alone, it’s easy to understand why. Laser-scanner-based reality capture is the dominant methodology used today to accurately capture the 3D state of an existing building. However, the typical laser-scan-based point cloud contains hundreds of millions of 3D points, sometimes even billions. With this additional data overhead on top of an already dense Building Information Model, it’s important to optimize your workstation hardware to deliver a productive user experience.
Finding the Bottleneck
Under the hood, Autodesk Revit uses the PCG point cloud engine to rapidly access the 3D points contained in a point cloud and retrieve the points to be displayed in the current Revit View. Because the typical point cloud dataset is so large, a workstation’s RAM is insufficient to serve as the PCG engine’s access medium. Instead, the disk drive is used for access, while a small amount of system RAM and video RAM holds the current Revit View. Thus, the hard drive, rather than system RAM, the CPU, or the GPU, is commonly the limiting factor for point cloud performance.
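To get an intuition for why the drive dominates, you can time chunked reads from a file against walking the same data already resident in RAM. The sketch below is a simplified illustration, not the PCG engine’s actual access pattern; the file size, chunk size, and bytes-per-point figure are arbitrary assumptions for demonstration.

```python
import os
import tempfile
import time

CHUNK = 1 << 20          # 1 MiB per read, a stand-in for a block of packed points
POINT_BYTES = 16         # illustrative: x, y, z floats plus an intensity value

def write_sample(path, n_chunks=64):
    """Create a dummy 'point cloud' file (n_chunks MiB of zero bytes)."""
    with open(path, "wb") as f:
        for _ in range(n_chunks):
            f.write(b"\x00" * CHUNK)

def time_disk_read(path):
    """Time streaming the file chunk by chunk, as a disk-bound engine would."""
    start = time.perf_counter()
    total = 0
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK):
            total += len(chunk)
    return time.perf_counter() - start, total

def time_ram_read(buf):
    """Time walking the same data already held in memory."""
    start = time.perf_counter()
    total = 0
    for off in range(0, len(buf), CHUNK):
        total += len(buf[off:off + CHUNK])
    return time.perf_counter() - start, total

if __name__ == "__main__":
    path = os.path.join(tempfile.gettempdir(), "cloud.bin")
    write_sample(path)
    disk_t, n = time_disk_read(path)
    ram_t, _ = time_ram_read(open(path, "rb").read())
    print(f"{n // POINT_BYTES:,} points: disk {disk_t:.3f}s vs RAM {ram_t:.3f}s")
    os.remove(path)
```

Note that a warm OS page cache will narrow the gap on re-reads; on a first, cold read of a multi-gigabyte cloud, the drive’s throughput sets the pace.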
Learn the Options
With data access a common limiting factor to the performance of the Revit point cloud experience, let’s discuss the options available to deliver the best experience. There are two primary types that are found today: spinning platter and solid-state drives.
- Spinning platter drives are the traditional hard drive technology, and are found in most computers today, as they deliver the best balance of storage capacity, read/write access speed, and cost.
- Solid-state drives (SSDs) are relatively new technology, contain no moving parts, and are generally much faster at reading and writing data than typical spinning platter drives.
In a structured comparison completed by the Revit product team, we found the following results when comparing typical examples of these two drive types:
Reap the Benefits
Based upon this investigation, we would highly recommend that those looking to optimize their Revit workstations for point cloud use install an SSD for at least the local storage of the point cloud data. While you will also achieve additional benefits from running the entire OS on your SSD, a significant performance boost can be achieved through the retrofit of a ~$200 SSD to an existing workstation.
Author: Kyle Bernhardt, Product Line Manager, Autodesk Building Design Suite
BIMx is GRAPHISOFT’s solution to explore, present, communicate and share design. BIMx enables architects and their clients to walk through professionally rendered 3D models with an easy-to-use navigation interface.
BIMx files can be exported from the ArchiCAD BIM software as a self-contained executable file for Mac or PC, or as a BIMx file that runs in the BIMx player app on iOS mobile devices such as the iPhone and iPad.
If you are not familiar with BIMx yet, you can try it now — just download a sample file along with the player environment from the Facebook-integrated BIMx community site.
How Large Can BIMx Models Be?
The maximum size of a model depends on the device where the project will be presented. BIMx uses OpenGL technology, so the video memory is often decisive. Still, due to smart optimization, even mobile devices can run amazingly complex models.
It is important to note that the BIMx file size is not indicative of the model complexity. What really counts is the memory usage of the geometry. When saving a BIMx file, this geometry size is calculated and labeled Small, Medium, Large, or Extra Large.
Small models run on any device. Medium models will most likely run on mobile devices but might be slower to navigate, while Large models will only run on the latest mobile devices such as the iPad 2 and iPhone 4. Extra Large models are not suitable for mobile devices, but will work well on desktops and laptops with powerful video cards.
How Can I Optimize Model Size?
By optimizing your model, you can achieve smoother navigation, especially on lower-spec devices. Optimization means lowering the size of memory needed to run your model. You can achieve this in three ways:
- Lowering the polygon count of the model
- Reducing the number and size of textures used
- Exporting the model without global illumination.
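As a back-of-the-envelope aid, the sketch below estimates the geometry memory a model might need from its polygon count and texture dimensions, then buckets the result into the Small/Medium/Large/Extra Large labels the exporter uses. Every constant here (bytes per polygon, bytes per pixel, bucket thresholds) is an illustrative assumption, not GRAPHISOFT’s actual formula.

```python
# Illustrative sketch: estimate and classify BIMx geometry memory.
# All constants are assumptions for illustration, not BIMx's real formula.

BYTES_PER_POLYGON = 3 * 32       # assume ~3 vertices/polygon, ~32 bytes per vertex
TEXTURE_BYTES_PER_PIXEL = 4      # uncompressed RGBA in video memory

# Hypothetical size buckets, in MiB of geometry memory.
BUCKETS = [(64, "Small"), (256, "Medium"), (512, "Large")]

def estimate_memory_mib(polygons, textures):
    """Estimate memory in MiB; `textures` is a list of (width, height) in pixels."""
    geo = polygons * BYTES_PER_POLYGON
    tex = sum(w * h * TEXTURE_BYTES_PER_PIXEL for w, h in textures)
    return (geo + tex) / (1024 * 1024)

def classify(mib):
    """Map an estimated memory figure onto the exporter's size labels."""
    for limit, label in BUCKETS:
        if mib <= limit:
            return label
    return "Extra Large"

if __name__ == "__main__":
    # A 500k-polygon model with ten 1024x1024 textures.
    mib = estimate_memory_mib(500_000, [(1024, 1024)] * 10)
    print(f"~{mib:.0f} MiB -> {classify(mib)}")
```

The point of the exercise is the shape of the math: both the polygon count and the texture pixel area feed the total linearly, which is why the three optimization levers above all pay off directly.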
Lowering Polygon Count
With the help of ArchiCAD’s PolyCount Add-On (one of the Goodies add-ons; see ArchiCAD Downloads under ArchiCAD’s Help menu), you can keep track of the overall polygon count of your model.
You can reduce the number of polygons by:
- Filtering elements — turn off layers of building elements that you don’t necessarily want to show in your model. Use the marquee tool to crop the model if you only want to show parts of it.
- Reducing the complexity of objects — many library objects have settings for level of detail. Curved elements also have resolution settings. Lower resolution means fewer polygons.
- Leaving out unnecessary details — plant, car, and people objects are often very complex, so look for low-polygon versions of them. Door knobs, faucets, and taps are often very complex even though their model dimensions are small.
Reducing Textures

The number and size of textures can greatly inflate the model size. Here are some tricks for optimizing textures:
- Use low-res, compressed images (e.g., .JPG files) as texture maps. With an image editor you can reduce a texture map to a size that still looks good enough in 3D but results in a smaller file.
- Use as few textures as possible. Make sure that similar materials use the same texture map file.
- Don’t apply textures to elements whose model dimensions are so small that the texture doesn’t really improve the overall image quality.
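One way to see why pixel dimensions matter more than file size: GPUs generally store textures uncompressed, so a small .JPG on disk expands to raw RGBA in video memory, and halving a texture’s dimensions quarters its memory cost. A minimal sketch, with the usual 4/3 factor approximating mipmap overhead:

```python
def texture_vram_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate video memory for an uncompressed texture.
    A full mipmap chain adds roughly one third (1 + 1/4 + 1/16 + ... = 4/3)."""
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mipmaps else base

if __name__ == "__main__":
    full = texture_vram_bytes(2048, 2048)
    half = texture_vram_bytes(1024, 1024)
    print(f"2048x2048: {full / 2**20:.1f} MiB, 1024x1024: {half / 2**20:.1f} MiB")
```

Because memory scales with the square of the dimensions, downscaling a handful of large texture maps is often the single cheapest optimization available.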
Export Without Global Illumination
Global Illumination is an optional setting at model export that adds more realistic lighting to the model but uses considerable hardware resources. In the BIMx desktop viewer, you can check exactly how much video RAM it requires (see Figure 2). If a model proves too heavy with Global Illumination turned on, re-export the model without this setting.
Author: Gergely (Greg) Kmethy, director of customer support at GRAPHISOFT
This series focuses on helping our readers understand what CAD workstations cost and how much they are going to have to spend to find a machine that meets their CAD production needs. The first part focused on entry-level systems. This post will discuss mid-range ($2,500 to $7,000) and high-end (more than $7,000) systems.
Mid-Range and High-End
Stepping up to the mid-range and high-end, you’ll typically find dual-socket Intel Xeon processors along with full tower enclosures to handle more slots and drive bays. Spring for a dual-socket system and you’ll get twice as many CPU cores, twice as much memory bandwidth, and twice the memory capacity.
Some OEMs are going to great lengths to show off the enhanced speed of processors and the increased capacity of both graphics cards (for multi-monitor or high-performance computing support) and storage. For example, BOXX’s top-end 4800 and 8500 series workstations feature overclocked CPUs — that is, an Intel Core i7-2600K (Sandy Bridge) processor running at 4.5 GHz instead of its stock 3.4 GHz. These workstations also provide as many as eight drive bays and an incredible seven PCI Express slots, allowing users to install up to 18 TB of total storage and house seven single-width or four dual-slot graphics cards.
But there’s more to be had at the upper end of the market, as vendors are taking a page from Apple’s book and investing an impressive amount of time and money to engineer hardware aesthetics and ergonomics, resulting in advances such as tool-less and (almost) cable-less designs; carefully designed air flow; and custom, workstation-specific, high-efficiency power supplies.
Start with Your Base Requirements
So do you really need a mid-range to high-end workstation, or will an entry-level CAD workstation do? The place to start is the base requirements for your CAD software of choice; then plan a system purchase accordingly. Consider those requirements the baseline, and leave yourself some room to grow for software upgrades.
Also, if you are doing any 3D modeling, look for a faster and more capable processor, more RAM, more free hard disk space beyond what installation requires, and a graphics display adapter capable of at least 1,280 x 1,024 resolution in true color. The graphics card should have 128 MB or more of memory, support for Pixel Shader 3.0 or greater, and Microsoft Direct3D capability. (Again, consider these a starting point.)
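The “start from the base requirements, then add headroom” advice can be made concrete with a simple checklist comparison. The snippet below is a sketch using the figures quoted above (1,280 x 1,024 display, 128 MB of video memory) plus hypothetical RAM and disk baselines; substitute your CAD vendor’s actual published requirements.

```python
# Hypothetical baseline drawn from the figures quoted in the article;
# replace with your CAD vendor's published requirements.
BASELINE = {
    "ram_gb": 8,
    "free_disk_gb": 20,
    "resolution": (1280, 1024),
    "vram_mb": 128,
}

def meets_baseline(machine, baseline=BASELINE, headroom=1.5):
    """Return a list of shortfalls; an empty list means the machine qualifies.
    `headroom` pads RAM and disk so there is room to grow for upgrades."""
    problems = []
    if machine["ram_gb"] < baseline["ram_gb"] * headroom:
        problems.append("RAM below baseline plus headroom")
    if machine["free_disk_gb"] < baseline["free_disk_gb"] * headroom:
        problems.append("free disk space below baseline plus headroom")
    if (machine["resolution"][0] < baseline["resolution"][0]
            or machine["resolution"][1] < baseline["resolution"][1]):
        problems.append("display resolution below baseline")
    if machine["vram_mb"] < baseline["vram_mb"]:
        problems.append("video memory below baseline")
    return problems

if __name__ == "__main__":
    candidate = {"ram_gb": 16, "free_disk_gb": 100,
                 "resolution": (1920, 1200), "vram_mb": 1024}
    print(meets_baseline(candidate) or "meets baseline with headroom")
```

Running a candidate configuration through a check like this before purchase is a quick way to spot the one undersized component that would otherwise bottleneck an otherwise capable machine.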
What’s the difference between a workstation and a consumer-grade PC, and why should you care? Well, ten to fifteen years ago, no one had trouble distinguishing between one and the other. Workstations were very expensive, high-performance, proprietary, 3D-equipped RISC or UNIX boxes. PCs were lower-cost, lower-quality toys that couldn’t handle 3D.
But all that has changed.
Economy of Scale
Spurred on by technological advances funded by the huge economies of scale in the broader PC markets, workstation OEMs such as HP, Sun and SGI got out of the component-making business, leaving that to independent hardware vendors (IHVs) such as Intel, AMD and NVIDIA. As a result, workstations today share technology with PCs and enjoy the economy-of-scale benefits that come with mass-market production.
That raises the question: If the guts of the PC and the guts of the workstation are the same, why pay a premium for the latter? Interestingly, those exorbitant workstation premiums of the past are long gone. Yes, you can still spend your entire system budget on a single high-end graphics card, but today’s entry-level system — which more than 80% of desktop workstation buyers choose (according to Jon Peddie Research) — can sell for only about $100 more than a similarly configured PC.
Independent Software Vendor Certification
Although you don’t have to pay much of a premium for a workstation, there are compelling reasons to do so. There’s a whole laundry list of benefits to be had, but at a minimum you’ll get independent software vendor (ISV) certification, meaning your CAD software developer has tested the hardware and vouches for its reliability, and in most cases, you’ll get a professional graphics card as well.
“It is important that CAD users select an ISV-certified workstation to help ensure that the demanding applications they depend on run smoothly, right out of the box,” said Greg Weir, director of Precision Workstation Product and ISV Marketing at Dell. “[ISV-certified hardware] comes with supported drivers to help eliminate issues and increase performance after the point of sale. This intense level of testing and development between an OEM and the ISV only comes with workstations.”
Not All Graphics Cards are Created Equal
In contrast to the graphics cards sought by gamers, professional graphics processing units (GPUs) enable special rendering modes unique to CAD in general, and often to your specific application as well. Drivers from AMD and NVIDIA optimize the quality and performance for common tasks such as rendering AutoCAD Smooth lines and Gooch shaders. Try to render the same visuals on noncertified, gamer-class hardware, and AutoCAD will turn off hardware acceleration, dropping your rendering to a relative crawl.
Many entry-level workstation models incorporate integrated graphics processing — that is, no discrete graphics card. Although in our opinion this option is not adequate for most CAD applications, it does offer improved graphics performance compared with a standard PC. According to Wes Shimanek, workstation product manager at Intel, “If you have been buying a PC to do CAD, you’ll want to rethink that investment and consider [a workstation]. This system offers you better performance for similar dollars to the PC you have been using.”
Optimizing hardware for SolidWorks is essential for getting the most out of this heavy-hitting CAD application, as we’ve discussed on CADspeed previously. So we were thrilled when the SolidWorks team addressed this very issue recently on their forums.
The key to getting the most out of SolidWorks, or any CAD application for that matter, is ensuring your hardware can handle the workload. Remember that your situation is unique. In simple terms, two users using the same software on the same system may have very different perspectives on their workload efficiency if one is using 3D rendering and the other is not. Consider your needs first and foremost.
On the flip side, if you know you need new hardware, simply buying the most expensive machine may not pay off in the long run either. Think in terms of your productivity while shopping for a new workstation to get the most for your budget, hopefully with a little room to grow for those inevitable upgrades.
That said, here’s a summary of the recommendations straight from SolidWorks themselves.
RAM (Random-Access Memory)
The amount of RAM you need depends less on SolidWorks and more on the number of applications you run at the same time, plus the size and complexity of your SolidWorks parts, assemblies and drawings. SolidWorks recommends you have enough RAM to work with your common applications (i.e., Microsoft Office, email, etc.) and load your SolidWorks documents at the same time.
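That guidance amounts to a simple sum: operating system plus everyday applications plus your open SolidWorks data, padded with a safety margin. A rough calculator along those lines is sketched below; every default figure is an illustrative assumption, not a SolidWorks number — plug in measurements from your own sessions.

```python
def recommended_ram_gb(os_gb=2.0, apps_gb=2.0, sw_docs_gb=3.0, margin=1.5):
    """Rule-of-thumb RAM estimate: everything resident at once, padded by `margin`.
    All defaults are illustrative assumptions, not SolidWorks figures."""
    return (os_gb + apps_gb + sw_docs_gb) * margin

if __name__ == "__main__":
    print(f"Suggested RAM: at least {recommended_ram_gb():.1f} GB")
```

The margin matters more than the exact component figures: running close to physical RAM capacity pushes the OS into swapping to disk, which is exactly the slowdown extra memory is meant to prevent.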
CPU (Central Processing Unit)

Processor speed is another key factor in selecting the right hardware. It’s hard to sort through all the different options, so we recommend testing a system with your actual models. SolidWorks also offers a helpful Performance Test, a standardized test for determining the performance of your major system components (i.e., CPU, I/O, video) when working with SolidWorks datasets. Even better, when you complete the SolidWorks Performance Test, you have the option to share your score with others. This gives you, and other community members, a sense of where a system stands relative to others. Nice!
Note that SolidWorks and some of its add-ons (PhotoView 360) have some multithreaded capabilities, so the application can use the second processor or multiple cores. But SolidWorks says that rebuilds are single threaded and therefore rebuilds generally will not be faster with multiple CPUs or cores.
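That rebuild caveat is Amdahl’s law in miniature: if the serial rebuild phase dominates your day, extra cores buy surprisingly little. The sketch below computes the textbook formula; the 0.8 serial fraction in the example is an illustrative input, not a measured SolidWorks figure.

```python
def amdahl_speedup(serial_fraction, cores):
    """Overall speedup when only (1 - serial_fraction) of the work parallelizes."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

if __name__ == "__main__":
    # If 80% of a session is single-threaded rebuild time, even 8 cores
    # speed the whole session up by only about 1.2x.
    for cores in (2, 4, 8):
        print(f"{cores} cores -> {amdahl_speedup(0.8, cores):.2f}x")
```

The practical takeaway matches SolidWorks’ advice: for rebuild-heavy work, spend the budget on a faster clock rather than more cores; for rendering with PhotoView 360, the extra cores earn their keep.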
Hard Drive

The size of your hard drive or solid-state drive should be based on the disk space you need. Take a look at all your system’s components: operating system, applications, and documents. If you work primarily on a network, your needs may differ from those of users who primarily use their local drive. Don’t forget to develop a backup plan for your data, if you don’t already have one. (You do have one, right?)
Graphics Card

The very nature of CAD software requires a good workstation-level graphics card and driver. You are probably going to need at least a mid-range card, if not a high-end card, depending on the type of CAD work you do. For graphics cards, we recommend starting with the SolidWorks Certified Graphics Cards and Systems list, because SolidWorks has done the testing for you.
Can’t get enough about hardware configurations for SolidWorks? Check out this great post from SolidWorks on their forums. Or learn more about the minimum requirements for SolidWorks.